Import Hippolyzer
2  .coveragerc  Normal file
@@ -0,0 +1,2 @@
[run]
omit =
6  .hgignore → .gitignore  vendored
@@ -1,8 +1,10 @@
#use glob syntax
syntax: glob

.hgignore
*.pyc
build/*
pyogp.lib.base.egg-info/*
*.egg-info
dist/*
.doctrees
docs/html/*
.coverage
@@ -1,20 +0,0 @@
Pyogp is an open source collaboration between Linden Lab and the
|
||||
Architecture Working Group (AWG) to support the development
|
||||
of interoperable virtual worlds through the Open Grid Protocol.
|
||||
|
||||
Links
|
||||
-----
|
||||
Project Documentation - http://wiki.secondlife.com/wiki/Pyogp
|
||||
Linden Lab - http://lindenlab.com
|
||||
Open Grid Protocol - http://wiki.secondlife.com/wiki/Open_Grid_Protocol
|
||||
AWG - http://wiki.secondlife.com/wiki/Architecture_Working_Group
|
||||
|
||||
Contributors
|
||||
------------
|
||||
Aaron Terrell (Enus Linden), Linden Lab
|
||||
Christian Schultz (Tao Takashi), COM.lounge
|
||||
Everett Kotler (Kotler Linden), Linden Lab
|
||||
Infinity Linden, Linden Lab
|
||||
Joshua Bell (Josh Linden), Linden Lab
|
||||
Lawson English (Saijanai Kuhn), Linden Lab
|
||||
Timothy Loughlin (Locklainn Linden), Linden Lab
|
||||
@@ -1,4 +0,0 @@
build
dist
*.egg-info
*.pyc
289  LICENSE.txt
@@ -1,176 +1,165 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
GNU LESSER GENERAL PUBLIC LICENSE
Version 3, 29 June 2007

TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.

1. Definitions.

"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
This version of the GNU Lesser General Public License incorporates
the terms and conditions of version 3 of the GNU General Public
License, supplemented by the additional permissions listed below.

"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
0. Additional Definitions.

"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
As used herein, "this License" refers to version 3 of the GNU Lesser
General Public License, and the "GNU GPL" refers to version 3 of the GNU
General Public License.

"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"The Library" refers to a covered work governed by this License,
other than an Application or a Combined Work as defined below.

"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
An "Application" is any work that makes use of an interface provided
by the Library, but which is not otherwise based on the Library.
Defining a subclass of a class defined by the Library is deemed a mode
of using an interface provided by the Library.

"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
A "Combined Work" is a work produced by combining or linking an
Application with the Library. The particular version of the Library
with which the Combined Work was made is also called the "Linked
Version".

"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
The "Minimal Corresponding Source" for a Combined Work means the
Corresponding Source for the Combined Work, excluding any source code
for portions of the Combined Work that, considered in isolation, are
based on the Application, and not on the Linked Version.

"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
The "Corresponding Application Code" for a Combined Work means the
object code and/or source code for the Application, including any data
and utility programs needed for reproducing the Combined Work from the
Application, but excluding the System Libraries of the Combined Work.

"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
1. Exception to Section 3 of the GNU GPL.

"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
You may convey a covered work under sections 3 and 4 of this License
without being bound by section 3 of the GNU GPL.

2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
2. Conveying Modified Versions.

3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
If you modify a copy of the Library, and, in your modifications, a
facility refers to a function or data to be supplied by an Application
that uses the facility (other than as an argument passed when the
facility is invoked), then you may convey a copy of the modified
version:

4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
a) under this License, provided that you make a good faith effort to
ensure that, in the event an Application does not supply the
function or data, the facility still operates, and performs
whatever part of its purpose remains meaningful, or

(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
b) under the GNU GPL, with none of the additional permissions of
this License applicable to that copy.

(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
3. Object Code Incorporating Material from Library Header Files.

(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
The object code form of an Application may incorporate material from
a header file that is part of the Library. You may convey such object
code under terms of your choice, provided that, if the incorporated
material is not limited to numerical parameters, data structure
layouts and accessors, or small macros, inline functions and templates
(ten or fewer lines in length), you do both of the following:

(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
a) Give prominent notice with each copy of the object code that the
Library is used in it and that the Library and its use are
covered by this License.

You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
b) Accompany the object code with a copy of the GNU GPL and this license
document.

5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
4. Combined Works.

6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
You may convey a Combined Work under terms of your choice that,
taken together, effectively do not restrict modification of the
portions of the Library contained in the Combined Work and reverse
engineering for debugging such modifications, if you also do each of
the following:

7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
a) Give prominent notice with each copy of the Combined Work that
the Library is used in it and that the Library and its use are
covered by this License.

8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
b) Accompany the Combined Work with a copy of the GNU GPL and this license
document.

9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
c) For a Combined Work that displays copyright notices during
execution, include the copyright notice for the Library among
these notices, as well as a reference directing the user to the
copies of the GNU GPL and this license document.

END OF TERMS AND CONDITIONS
d) Do one of the following:

0) Convey the Minimal Corresponding Source under the terms of this
License, and the Corresponding Application Code in a form
suitable for, and under terms that permit, the user to
recombine or relink the Application with a modified version of
the Linked Version to produce a modified Combined Work, in the
manner specified by section 6 of the GNU GPL for conveying
Corresponding Source.

1) Use a suitable shared library mechanism for linking with the
Library. A suitable mechanism is one that (a) uses at run time
a copy of the Library already present on the user's computer
system, and (b) will operate properly with a modified version
of the Library that is interface-compatible with the Linked
Version.

e) Provide Installation Information, but only if you would otherwise
be required to provide such information under section 6 of the
GNU GPL, and only to the extent that such information is
necessary to install and execute a modified version of the
Combined Work produced by recombining or relinking the
Application with a modified version of the Linked Version. (If
you use option 4d0, the Installation Information must accompany
the Minimal Corresponding Source and Corresponding Application
Code. If you use option 4d1, you must provide the Installation
Information in the manner specified by section 6 of the GNU GPL
for conveying Corresponding Source.)

5. Combined Libraries.

You may place library facilities that are a work based on the
Library side by side in a single library together with other library
facilities that are not Applications and are not covered by this
License, and convey such a combined library under terms of your
choice, if you do both of the following:

a) Accompany the combined library with a copy of the same work based
on the Library, uncombined with any other library facilities,
conveyed under the terms of this License.

b) Give prominent notice with the combined library that part of it
is a work based on the Library, and explaining where to find the
accompanying uncombined form of the same work.

6. Revised Versions of the GNU Lesser General Public License.

The Free Software Foundation may publish revised and/or new versions
of the GNU Lesser General Public License from time to time. Such new
versions will be similar in spirit to the present version, but may
differ in detail to address new problems or concerns.

Each version is given a distinguishing version number. If the
Library as you received it specifies that a certain numbered version
of the GNU Lesser General Public License "or any later version"
applies to it, you have the option of following the terms and
conditions either of that published version or of any later version
published by the Free Software Foundation. If the Library as you
received it does not specify a version number of the GNU Lesser
General Public License, you may choose any version of the GNU Lesser
General Public License ever published by the Free Software Foundation.

If the Library as you received it specifies that a proxy can decide
whether future versions of the GNU Lesser General Public License shall
apply, that proxy's public statement of acceptance of any version is
permanent authorization for you to choose that version for the
Library.
211  NOTICE.md  Normal file
@@ -0,0 +1,211 @@
Hippolyzer is based on PyOGP, Copyright 2009 Linden Research Inc.

Original contributors text and original license text below:

------

```
Pyogp is an open source collaboration between Linden Lab and the
Architecture Working Group (AWG) to support the development
of interoperable virtual worlds through the Open Grid Protocol.

Links
-----
Project Documentation - http://wiki.secondlife.com/wiki/Pyogp
Linden Lab - http://lindenlab.com
Open Grid Protocol - http://wiki.secondlife.com/wiki/Open_Grid_Protocol
AWG - http://wiki.secondlife.com/wiki/Architecture_Working_Group

Contributors
------------
Aaron Terrell (Enus Linden), Linden Lab
Christian Schultz (Tao Takashi), COM.lounge
Everett Kotler (Kotler Linden), Linden Lab
Infinity Linden, Linden Lab
Joshua Bell (Josh Linden), Linden Lab
Lawson English (Saijanai Kuhn), Linden Lab
Timothy Loughlin (Locklainn Linden), Linden Lab
```

---------

PyOGP's Original License Text:

```
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/

TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

1. Definitions.

"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.

"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.

"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.

"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.

"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.

"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.

"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).

"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.

"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."

"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.

2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.

3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.

4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:

(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and

(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and

(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and

(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.

You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.

5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.

6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.

7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.

8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.

9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.

END OF TERMS AND CONDITIONS
```
346  README.md  Normal file
@@ -0,0 +1,346 @@
# Hippolyzer

[Hippolyzer](http://wiki.secondlife.com/wiki/Hippo) is a fork of Linden Lab's abandoned
[PyOGP library](http://wiki.secondlife.com/wiki/PyOGP),
targeting modern Python 3, with a focus on debugging issues in Second Life-compatible
servers and clients. There is a secondary focus on mocking up new features without requiring a
modified server or client.

Wherever reasonable, readability and testability are prioritized over performance.

Almost all code from PyOGP has been either rewritten or replaced. Major changes from
upstream include making sure messages always correctly round-trip, and the addition
of a debugging proxy similar to ye olde WinGridProxy.
|
||||
|
||||
It supports hot-reloaded addon scripts that can rewrite, inject or drop messages.
|
||||
Also included are tools for working with SL-specific assets like the internal animation format,
|
||||
and the internal mesh format.
|
||||
|
||||
It's quick and easy to bash together a script that does something useful if you're familiar
|
||||
with low-level SL details. See the [Local Animation addon example](https://github.com/SaladDais/Hippolyzer/blob/master/addon_examples/local_anim.py).
|
||||
|
||||

|
||||
|
||||
## Setup

* Python 3.8 or above is **required**. If you're unable to upgrade your system Python package due to
  being on a stable distro, you can use [pyenv](https://github.com/pyenv/pyenv) to create
  a self-contained Python install with the appropriate version.
* [Create a clean Python 3 virtualenv](https://packaging.python.org/guides/installing-using-pip-and-virtual-environments/#creating-a-virtual-environment)
  with `python -m venv <virtualenv_dir>`
* Activate the virtualenv by `source`ing the appropriate activation script
* * Under Linux this would be something like `source <virtualenv_dir>/bin/activate`
* (Recommended) Install known-good dependencies with `pip install -r requirements.txt`
* Run `python setup.py develop`
## Proxy

A proxy is provided with both a CLI and a Qt-based interface. The proxy application wraps a
custom SOCKS 5 UDP proxy, as well as an HTTP proxy based on [mitmproxy](https://mitmproxy.org/).

Multiple clients are supported at a time, and UDP messages may be injected in either
direction. The proxy UI was inspired by the Message Log and Message Builder present in
the [Alchemy](https://github.com/AlchemyViewer/Alchemy) viewer.
### Proxy Setup

* Run the proxy with `python hippolyzer/apps/proxy_gui.py <addon_paths...>`
* * For example, `python hippolyzer/apps/proxy_gui.py addon_examples/bezoscape.py`
* * Addons can also be loaded through the `File -> Manage Addons` menu
* * Addons are hot-reloaded scripts that can inspect, modify, and drop messages as
    they pass through the proxy. Some examples are provided in [`addon_examples`](https://github.com/SaladDais/Hippolyzer/tree/master/addon_examples)
* Install the proxy's HTTPS certificate by going to `File -> Install HTTPS Certs`
* * You can also install it with `python hippolyzer/apps/proxy.py --setup-ca <path to your viewer settings dir>`.
    On Linux that would be `~/.firestorm_x64/` if you're using Firestorm.
* * This can be sidestepped by disabling certificate validation in the viewer's debug settings, but that is not recommended.
* Start the viewer and configure it to use `127.0.0.1:9061` as a SOCKS proxy and `127.0.0.1:9062` as
  an HTTP proxy. You **must** select the option in the viewer to use the HTTP proxy for all HTTP
  traffic, or logins will fail.
* If you want to reduce HTTP proxy lag you _can_ have asset requests bypass the HTTP proxy by setting the
  `no_proxy` env var appropriately. For example, `no_proxy="asset-cdn.glb.agni.lindenlab.com" ./firestorm` or
  `setx /m "no_proxy" "asset-cdn.glb.agni.lindenlab.com"` on Windows.
* Log in!

![proxy screenshot]()
### Filtering

By default, the proxy's display filter is configured to ignore many high-frequency messages.
The filter field allows filtering on the presence of specific blocks or the values of
variables.

For example, to find either chat messages mentioning "foo" or any message referencing `125214`
in an ID field, you could use `ChatFrom*.ChatData.Message~="foo" || *.*.*ID==125214`. To find all
ObjectUpdates related to object ID `125214` you could do
`*ObjectUpdate*.ObjectData.*ID==125214 || *ObjectUpdate*.ObjectData.Data.*ID==125214`,
which parses through both templated fields and fields inside the binary `Data` fields of compressed and
terse object updates.

Messages also have metadata attached that can be matched on. To match all kinds of ObjectUpdates
related to the most recently selected object at the time the update was logged, you could use a filter like
`Meta.ObjectUpdateIDs ~= Meta.SelectedLocal`.

Similarly, if you have multiple active sessions and are only interested in messages related to a specific
agent's session, you can do `(Meta.AgentID == None || Meta.AgentID == "d929385f-41e3-4a34-a04e-f1fc39f24f12") && ...`.

Vectors can also be compared. This will match any ObjectUpdate variant that occurs within a certain range:
`(*ObjectUpdate*.ObjectData.*Data.Position > (110, 50, 100) && *ObjectUpdate*.ObjectData.*Data.Position < (115, 55, 105))`
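Conceptually, the wildcard parts of a filter behave like shell-style globs matched against the `Message.Block.Variable` path. Below is a rough, self-contained sketch of that matching only; it is not the proxy's actual filter engine, which also handles comparison operators, `Meta` fields, and binary `Data` parsing:

```python
from fnmatch import fnmatchcase


def field_matches(pattern: str, message: str, block: str, var: str) -> bool:
    """Match a Message.Block.Variable path against a glob like '*ObjectUpdate*.ObjectData.*ID'."""
    msg_pat, block_pat, var_pat = pattern.split(".")
    return (fnmatchcase(message, msg_pat)
            and fnmatchcase(block, block_pat)
            and fnmatchcase(var, var_pat))


# The leading/trailing '*' is what lets one pattern cover terse and compressed variants
print(field_matches("*ObjectUpdate*.ObjectData.*ID",
                    "ImprovedTerseObjectUpdate", "ObjectData", "FullID"))  # True
print(field_matches("ChatFrom*.ChatData.Message",
                    "AgentUpdate", "AgentData", "Far"))  # False
```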
### Logging

Decoded messages are displayed in the log pane. Clicking one will show the request and
response for HTTP messages, and a human-friendly form for UDP messages. Some messages and
fields have [special packers defined](https://github.com/SaladDais/Hippolyzer/blob/master/hippolyzer/lib/proxy/templates.py)
that give a more human-readable form of enum or binary fields, with the original form beside or below it.

For example, an `AgentUpdate` message may show up in the log pane like:
```
OUT AgentUpdate
# 15136: <PacketFlags.ZEROCODED: 128>

[AgentData]
AgentID = [[AGENT_ID]]
SessionID = [[SESSION_ID]]
BodyRotation = (0.0, 0.0, 0.06852579861879349, 0.9976493446715918)
HeadRotation = (-0.0, -0.0, 0.05799926817417145, 0.998316625570896)
# Many flag fields are unpacked as tuples with the original value next to them
State =| ('EDITING',) #16
CameraCenter = <120.69703674316406, 99.8336181640625, 59.547847747802734>
CameraAtAxis = <0.9625586271286011, 0.11959066987037659, -0.243267223238945>
CameraLeftAxis = <-0.12329451739788055, 0.992370069026947, 0.0>
CameraUpAxis = <0.24141110479831696, 0.029993515461683273, 0.9699592590332031>
Far = 88.0
ControlFlags =| ('YAW_POS', 'NUDGE_AT_POS') #524544
Flags =| ('HIDE_TITLE',) #1
```

and an `ObjectImage` for setting a prim's texture may look like
```
OUT ObjectImage
# 3849: <PacketFlags.0: 0>

[AgentData]
AgentID = [[AGENT_ID]]
SessionID = [[SESSION_ID]]
[ObjectData]
ObjectLocalID = 700966
MediaURL = b''
TextureEntry =| {'Textures': {None: '89556747-24cb-43ed-920b-47caed15465f'}, \
'Color': {None: b'\xff\xff\xff\xff'}, \
'ScalesS': {None: 1.0}, \
'ScalesT': {None: 1.0}, \
'OffsetsS': {None: 0}, \
'OffsetsT': {None: 0}, \
'Rotation': {None: 0}, \
'BasicMaterials': {None: {'Bump': 0, 'FullBright': False, 'Shiny': 'MEDIUM'}}, \
'MediaFlags': {None: {'WebPage': False, 'TexGen': 'DEFAULT', '_Unused': 0}}, \
'Glow': {None: 0}, \
'Materials': {None: '00000000-0000-0000-0000-000000000000'}}
#TextureEntry = b'\x89UgG$\xcbC\xed\x92\x0bG\xca\xed\x15F_\x00\x00\x00\x00\x00\x00\x00\x00\x80?\x00\x00\x00\x80?\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x80\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
```
All unpackers also provide equivalent packers that work with the message builder.
The scripting interface uses the same packers as the logging interface, but with a different
representation. Clicking the "Copy repr()" button will give you a version of the message that you can paste into
an addon's script.
### Building Messages

The proxy GUI includes a message builder similar to Alchemy's, allowing you to build arbitrary messages or
resend messages from the message log window. Both UDP and Caps messages may be sent.

For example, here's a message that will drop a physical cube on your head:
```
OUT ObjectAdd

[AgentData]
# [[]] in a field value indicates a simple replacement
# provided by the proxy
AgentID = [[AGENT_ID]]
SessionID = [[SESSION_ID]]
GroupID = [[NULL_KEY]]
[ObjectData]
# =| means we should use the field's special packer mode.
# We treat PCode as an enum, so we'll convert from its string name to its int value
PCode =| 'PRIMITIVE'
Material = 3
# With =| you may represent flags as a tuple of strings rather than an int.
# The only allowed flags in ObjectAdd are USE_PHYSICS (1) and CREATE_SELECTED (2)
AddFlags =| ('USE_PHYSICS',)
PathCurve = 16
ProfileCurve = 1
PathBegin = 0
PathEnd = 0
PathScaleX = 100
PathScaleY = 100
PathShearX = 0
PathShearY = 0
PathTwist = 0
PathTwistBegin = 0
PathRadiusOffset = 0
PathTaperX = 0
PathTaperY = 0
PathRevolutions = 0
PathSkew = 0
ProfileBegin = 0
ProfileEnd = 0
ProfileHollow = 0
BypassRaycast = 1
# =$ indicates an eval()ed field; this will result in a vector 3m above the agent.
RayStart =$ AGENT_POS + Vector3(0, 0, 3)
# We can reference whatever we put in `RayStart` by accessing `block`
RayEnd =$ block["RayStart"]
RayTargetID = [[NULL_KEY]]
RayEndIsIntersection = 0
Scale = <0.5, 0.5, 0.5>
Rotation = <0.0, 0.0, 0.0, 1.0>
State = 0
```
The repeat spinner at the bottom of the window lets you send a message multiple times.
An `i` variable is put into the eval context and can be used to vary messages across repeats.
With repeat set to three:

```
OUT ChatFromViewer

[AgentData]
AgentID = [[AGENT_ID]]
SessionID = [[SESSION_ID]]
[ChatData]
# Simple templated f-string
Message =$ f'foo {i * 2}'
Type =| 'NORMAL'
Channel = 0
```

will print

```
User: foo 0
User: foo 2
User: foo 4
```
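Conceptually, each repeat re-evaluates the `=$` expression with `i` bound to the current iteration index, as in this standalone sketch (the real builder's eval context also includes names like `AGENT_POS` and `block`):

```python
# Evaluate the field's expression once per repeat, with `i` injected into the namespace
expr = "f'foo {i * 2}'"
values = [eval(expr, {"i": i}) for i in range(3)]
print(values)  # ['foo 0', 'foo 2', 'foo 4']
```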
HTTP requests may be sent through the same window, with equivalent syntax for replacements
and `eval()` within the request body, if requested. As an example, sending a chat message
through the `UntrustedSimulatorMessage` cap would look like:
```
POST [[UntrustedSimulatorMessage]] HTTP/1.1
Content-Type: application/llsd+xml
Accept: application/llsd+xml
X-Hippo-Directives: 1

<llsd>
  <map>
    <key>message</key>
    <string>ChatFromViewer</string>
    <key>body</key>
    <map>
      <key>AgentData</key>
      <array>
        <map>
          <key>AgentID</key>
          <uuid><!HIPPOREPL[[AGENT_ID]]></uuid>
          <key>SessionID</key>
          <uuid><!HIPPOREPL[[SESSION_ID]]></uuid>
        </map>
      </array>
      <key>ChatData</key>
      <array>
        <map>
          <key>Channel</key>
          <integer>0</integer>
          <key>Message</key>
          <string>test <!HIPPOEVAL[[
            base64.b64encode(hex(1 + 1).encode("utf8"))
          ]]></string>
          <key>Type</key>
          <integer>1</integer>
        </map>
      </array>
    </map>
  </map>
</llsd>
```
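For reference, the `HIPPOEVAL` body above is ordinary Python; evaluated on its own it produces the base64 text that the proxy substitutes into the `Message` field:

```python
import base64

# Same expression as the HIPPOEVAL directive: hex(1 + 1) is "0x2",
# and its UTF-8 bytes base64-encode to "MHgy"
result = base64.b64encode(hex(1 + 1).encode("utf8"))
print(result.decode("ascii"))  # MHgy
```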
## Addon commands

By default, channel 524 is a special channel used for commands handled by addons'
`handle_command` hooks. For example, an addon that supplies a `foo` command taking one string parameter
can be invoked by typing `/524 foo something` in chat.

`/524 help` will give you a list of all commands offered by currently loaded addons.
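A minimal command-handling addon might look like the sketch below. This is only a sketch modeled on the scripts in `addon_examples`; the exact way `handle_command` declares and parses parameters should be checked against `hippolyzer/lib/proxy/commands.py`:

```python
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.commands import handle_command
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session


class GreeterAddon(BaseAddon):
    # Invoked via "/524 greet <something>" in chat; parameter handling
    # details here are assumptions, not the verified signature.
    @handle_command()
    async def greet(self, session: Session, region: ProxiedRegion, message: str):
        """Greet whatever was passed as the parameter"""
        print(f"Hello, {message}!")


addons = [GreeterAddon()]
```

Like the bundled examples, the module exposes an `addons` list so the proxy can hot-reload it.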
## Useful Extensions
|
||||
|
||||
These are quick and dirty, but should be viewer features. I'm not a viewer developer, so they're here.
|
||||
If you are a viewer developer, please put them in a viewer.
|
||||
|
||||
* Local Animation - Allows loading and playing animations in LL's internal format from disk, replaying
|
||||
when the animation changes on disk. Mostly useful for animators that want quick feedback
|
||||
* Local Mesh - Allows specifying a target object to apply a mesh preview to. When a local mesh target
|
||||
is specified, hitting the "calculate upload cost" button in the mesh uploader will instead
|
||||
apply the mesh to the local mesh target. It works on attachments too. Useful for testing rigs before a
|
||||
final, real upload.
|
||||
|
||||
## Potential Changes

* Make package-able for PyPI
* GitHub action to build binary packages and pull together a licenses bundle
* AISv3 wrapper?
* Higher-level wrappers for common things? I don't really need these, so only if people want to write them.
* Highlight the matched portion of a message in the log view, if applicable
* * Remember deep filters and return a map of them, have the message formatter return text ranges?
* Move things out of `templates.py`; right now most binary serialization stuff lives there
  because it's more convenient for me to hot-reload.
* Ability to add menus?
## License

[LGPLv3](https://www.gnu.org/licenses/lgpl-3.0.en.html). If you have a good reason why, I might dual-license.
## For Client Developers

This section is mostly useful if you're developing a new SL-compatible client from scratch. Clients based
on LL's code will work out of the box.

### Adding proxy support to a new client

Hippolyzer's proxy application actually combines two proxies: a [SOCKS 5](https://tools.ietf.org/html/rfc1928)
UDP proxy and an HTTP proxy.
To have your client's traffic proxied through Hippolyzer, the general flow is:

* Open a TCP connection to Hippolyzer's SOCKS 5 proxy port
* * This should be done once per logical user session, as Hippolyzer assumes a 1:1 mapping of SOCKS
    connections to SL sessions
* Send a UDP associate command without authentication
* The proxy will respond with a host / port pair that UDP messages may be sent through
* At this point you will no longer need to use the TCP connection, but it must be kept
  alive until you want to break the UDP association
* Whenever you send a UDP packet to a remote host, you'll need to instead send it to the host / port
  from the UDP associate response. A SOCKS 5 header must be prepended to the data indicating the ultimate
  destination of the packet
* Any received UDP packets will also have a SOCKS 5 header indicating the real source IP and port
* * When in doubt, check `socks_proxy.py`, `packets.py`, and the SOCKS 5 RFC for more info on how to deal with SOCKS.
* All HTTP requests must be sent through Hippolyzer's HTTP proxy port.
* * You may not need to do any extra plumbing to get this to work if your chosen HTTP client
    respects the `HTTP_PROXY` environment variable.
* All HTTPS connections will be encrypted with the proxy's TLS key. You'll need to either add it to whatever
  CA bundle your client uses or disable certificate validation when a proxy is used.
* * mitmproxy does its own certificate validation, so disabling it in your client is OK.
* The proxy needs to use content sniffing to figure out which requests are login requests,
  so make sure your request would pass `MITMProxyEventManager._is_login_request()`
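The datagram wrapping step above follows the standard SOCKS 5 UDP request header from RFC 1928. A minimal sketch for the IPv4, unfragmented case (the host and port values are illustrative only):

```python
import socket
import struct


def socks5_wrap_udp(dest_host: str, dest_port: int, payload: bytes) -> bytes:
    """Prepend an RFC 1928 UDP request header: RSV(2) FRAG(1) ATYP(1) DST.ADDR DST.PORT."""
    header = (struct.pack("!HBB", 0, 0, 1)     # RSV=0, FRAG=0, ATYP=1 (IPv4)
              + socket.inet_aton(dest_host)    # 4-byte destination address
              + struct.pack("!H", dest_port))  # 2-byte destination port
    return header + payload


wrapped = socks5_wrap_udp("203.0.113.7", 13000, b"datagram body")
print(len(wrapped) - len(b"datagram body"))  # 10 header bytes for IPv4
```

Received datagrams carry the same 10-byte header, so the real source address is parsed off the front before handing the payload to the message deserializer.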
### Should I use this library to make an SL client in Python?

No. If you just want to write a client in Python, you should instead look at using
[libremetaverse](https://github.com/cinderblocks/libremetaverse/) via pythonnet.
I removed the client-related code inherited from PyOGP because libremetaverse's was simply better.

<https://github.com/CasperTech/node-metaverse/> also looks like a good, modern wrapper if you
prefer TypeScript.
13
README.txt
@@ -1,13 +0,0 @@
Introduction
============
The pyogp.lib.base package provides the basic networking, data parsing, and
data parsing capabilities for maintaining a session with a Second Life grid.

Dependencies
============
uuid
elementtree
llbase
WebOb
wsgiref
eventlet==0.8.14
56
addon_examples/backwards.py
Normal file
@@ -0,0 +1,56 @@
"""
|
||||
All buttons make you go backwards.
|
||||
|
||||
Except for backward, which makes you go left.
|
||||
"""
|
||||
|
||||
from hippolyzer.lib.proxy.message import ProxiedMessage
|
||||
from hippolyzer.lib.proxy.addon_utils import BaseAddon
|
||||
from hippolyzer.lib.proxy.region import ProxiedRegion
|
||||
from hippolyzer.lib.proxy.sessions import Session
|
||||
from hippolyzer.lib.proxy.templates import AgentControlFlags
|
||||
|
||||
|
||||
NUDGE_MASK = sum(x for x in AgentControlFlags if "NUDGE" in x.name)
|
||||
FAST_MASK = sum(x for x in AgentControlFlags if "FAST" in x.name)
|
||||
DIR_MASK = sum(x for x in AgentControlFlags if
|
||||
any(x.name.endswith(y) for y in ("_POS", "_NEG")))
|
||||
BACK_MASK = (AgentControlFlags.AT_NEG | AgentControlFlags.NUDGE_AT_NEG)
|
||||
|
||||
|
||||
class BackwardsAddon(BaseAddon):
|
||||
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: ProxiedMessage):
|
||||
if message.name == "AgentUpdate":
|
||||
agent_data_block = message["AgentData"][0]
|
||||
flags: AgentControlFlags = agent_data_block.deserialize_var("ControlFlags")
|
||||
# Don't want these at all.
|
||||
flags &= ~(AgentControlFlags.TURN_LEFT | AgentControlFlags.TURN_RIGHT)
|
||||
|
||||
any_nudge = bool(flags & NUDGE_MASK)
|
||||
any_fast = bool(flags & FAST_MASK)
|
||||
dir_vals = flags & DIR_MASK
|
||||
|
||||
going_back = bool(flags & BACK_MASK)
|
||||
other_dir_vals = dir_vals & ~BACK_MASK
|
||||
|
||||
new_flags = AgentControlFlags(0)
|
||||
# back -> left
|
||||
if going_back:
|
||||
if any_nudge:
|
||||
new_flags |= AgentControlFlags.NUDGE_LEFT_POS
|
||||
else:
|
||||
new_flags |= AgentControlFlags.LEFT_POS
|
||||
if any_fast:
|
||||
new_flags |= AgentControlFlags.FAST_LEFT
|
||||
# anything else -> back
|
||||
if other_dir_vals:
|
||||
if any_nudge:
|
||||
new_flags |= AgentControlFlags.NUDGE_AT_NEG
|
||||
else:
|
||||
new_flags |= AgentControlFlags.AT_NEG
|
||||
if any_fast:
|
||||
new_flags |= AgentControlFlags.FAST_AT
|
||||
agent_data_block["ControlFlags"] = new_flags
|
||||
|
||||
|
||||
addons = [BackwardsAddon()]
|
||||
97
addon_examples/bezoscape.py
Normal file
@@ -0,0 +1,97 @@
"""
|
||||
Make all object textures Jeff Bezos
|
||||
|
||||
Helpful for migrating your region to AWS
|
||||
"""
|
||||
|
||||
import copy
|
||||
import ctypes
|
||||
import random
|
||||
import secrets
|
||||
|
||||
from hippolyzer.lib.base.datatypes import UUID
|
||||
from hippolyzer.lib.proxy.addon_utils import BaseAddon, SessionProperty
|
||||
from hippolyzer.lib.proxy.message import ProxiedMessage
|
||||
from hippolyzer.lib.proxy.region import ProxiedRegion
|
||||
from hippolyzer.lib.proxy.sessions import Session
|
||||
|
||||
|
||||
BEZOS_UUIDS = [UUID(x) for x in [
|
||||
"b8b8dcf9-758a-4539-ba63-793a01407236",
|
||||
"0010533a-cd41-44de-9a74-ab2125cbef8f",
|
||||
]]
|
||||
|
||||
|
||||
def _modify_crc(crc_tweak: int, crc_val: int):
|
||||
return ctypes.c_uint32(crc_val ^ crc_tweak).value
|
||||
|
||||
|
||||
def _bezosify_te(local_id, parsed_te):
|
||||
parsed_te = copy.copy(parsed_te)
|
||||
parsed_te.Textures = {None: BEZOS_UUIDS[local_id % len(BEZOS_UUIDS)]}
|
||||
return parsed_te
|
||||
|
||||
|
||||
class BezosifyAddon(BaseAddon):
|
||||
bezos_crc_xor: int = SessionProperty()
|
||||
|
||||
def handle_session_init(self, session: Session):
|
||||
# We want CRCs that are stable for the duration of the session, but will
|
||||
# cause a cache miss for objects cached before this session. Generate a
|
||||
# random value to XOR all CRCs with
|
||||
self.bezos_crc_xor = secrets.randbits(32)
|
||||
|
||||
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: ProxiedMessage):
|
||||
if message.name == "ObjectUpdateCached":
|
||||
for block in message["ObjectData"]:
|
||||
# Cached only really has a CRC, this will force the cache miss.
|
||||
block["CRC"] = _modify_crc(self.bezos_crc_xor, block["CRC"])
|
||||
elif message.name == "ObjectUpdate":
|
||||
for block in message["ObjectData"]:
|
||||
block["CRC"] = _modify_crc(self.bezos_crc_xor, block["CRC"])
|
||||
parsed_te = block.deserialize_var("TextureEntry")
|
||||
if not parsed_te:
|
||||
continue
|
||||
|
||||
parsed_te = _bezosify_te(block["ID"], parsed_te)
|
||||
block.serialize_var("TextureEntry", parsed_te)
|
||||
elif message.name == "ImprovedTerseObjectUpdate":
|
||||
for block in message["ObjectData"]:
|
||||
parsed_te = block.deserialize_var("TextureEntry")
|
||||
if not parsed_te:
|
||||
continue
|
||||
update_data = block.deserialize_var("Data")
|
||||
|
||||
parsed_te = _bezosify_te(update_data["ID"], parsed_te)
|
||||
block.serialize_var("TextureEntry", parsed_te)
|
||||
elif message.name == "AvatarAppearance":
|
||||
for block in message["ObjectData"]:
|
||||
parsed_te = block.deserialize_var("TextureEntry")
|
||||
if not parsed_te:
|
||||
continue
|
||||
# Need an integer ID to choose a bezos texture, just use
|
||||
# the last byte of the sender's UUID.
|
||||
sender_id = message["Sender"]["ID"]
|
||||
parsed_te = _bezosify_te(sender_id.bytes[-1], parsed_te)
|
||||
block.serialize_var("TextureEntry", parsed_te)
|
||||
elif message.name == "ObjectUpdateCompressed":
|
||||
for block in message["ObjectData"]:
|
||||
update_data = block.deserialize_var("Data")
|
||||
if not update_data:
|
||||
continue
|
||||
update_data["CRC"] = _modify_crc(self.bezos_crc_xor, update_data["CRC"])
|
||||
if not update_data.get("TextureEntry"):
|
||||
continue
|
||||
|
||||
update_data["TextureEntry"] = _bezosify_te(
|
||||
update_data["ID"],
|
||||
update_data["TextureEntry"],
|
||||
)
|
||||
block.serialize_var("Data", update_data)
|
||||
elif message.name == "RegionHandshake":
|
||||
for field_name, val in message["RegionInfo"][0].items():
|
||||
if field_name.startswith("Terrain") and isinstance(val, UUID):
|
||||
message["RegionInfo"][field_name] = random.choice(BEZOS_UUIDS)
|
||||
|
||||
|
||||
addons = [BezosifyAddon()]
|
||||
267
addon_examples/blueish_object_list.py
Normal file
@@ -0,0 +1,267 @@
"""
|
||||
Addon demonstrating a Qt GUI, use of the object manager and associated addon hooks
|
||||
|
||||
Displays a list of all objects that are mostly blue on at least one face based
|
||||
on prim colors.
|
||||
"""
|
||||
from __future__ import annotations
|
||||
|
||||
import asyncio
|
||||
import enum
|
||||
import os.path
|
||||
from typing import *
|
||||
|
||||
from PySide2 import QtCore, QtGui, QtWidgets
|
||||
|
||||
from hippolyzer.lib.base.datatypes import Vector3
|
||||
from hippolyzer.lib.base.message.message import Block
|
||||
from hippolyzer.lib.base.objects import Object
|
||||
from hippolyzer.lib.base.ui_helpers import loadUi
|
||||
from hippolyzer.lib.proxy.addons import AddonManager
|
||||
from hippolyzer.lib.proxy.addon_utils import BaseAddon
|
||||
from hippolyzer.lib.proxy.commands import handle_command
|
||||
from hippolyzer.lib.proxy.packets import Direction
|
||||
from hippolyzer.lib.proxy.message import ProxiedMessage
|
||||
from hippolyzer.lib.proxy.region import ProxiedRegion
|
||||
from hippolyzer.lib.proxy.sessions import Session
|
||||
from hippolyzer.lib.proxy.task_scheduler import TaskLifeScope
|
||||
from hippolyzer.lib.proxy.templates import PCode
|
||||
|
||||
|
||||
def _is_color_blueish(color: bytes) -> bool:
|
||||
# Eh this is pretty transparent.
|
||||
if color[3] < 128:
|
||||
return False
|
||||
|
||||
# pretty low value, more black than anything
|
||||
if color[2] < 50:
|
||||
return False
|
||||
|
||||
# Blue channel makes up at least 70% of the value
|
||||
return (color[2] / sum(color[:3])) > 0.7
|
||||
|
||||
|
||||
def _is_object_blueish(obj: Object):
|
||||
if obj.PCode != PCode.PRIMITIVE:
|
||||
return False
|
||||
for color in obj.TextureEntry.Color.values():
|
||||
if _is_color_blueish(color):
|
||||
return True
|
||||
return False
|
||||
|
||||
|
||||
class BlueishObjectListGUIAddon(BaseAddon):
|
||||
def __init__(self):
|
||||
# Instance attribute because we only expect one to exist at a time,
|
||||
# and it should never persist.
|
||||
self.blueish_model: Optional[BlueishObjectModel] = None
|
||||
|
||||
# Cancel the coroutine associated with this command if the region, session or addon
|
||||
# changes for any reason. Only one allowed at once across all sessions.
|
||||
@handle_command(
|
||||
single_instance=True,
|
||||
lifetime=TaskLifeScope.SESSION | TaskLifeScope.REGION | TaskLifeScope.ADDON
|
||||
)
|
||||
async def track_blueish(self, session: Session, region: ProxiedRegion):
|
||||
"""Open a window that tracks blueish objects in the region"""
|
||||
parent = AddonManager.UI.main_window_handle()
|
||||
if parent is None:
|
||||
raise RuntimeError("Must be run under the GUI proxy")
|
||||
|
||||
win = BlueishObjectWindow(parent, session)
|
||||
win.objectHighlightClicked.connect(self._highlight_object) # type: ignore
|
||||
win.objectTeleportClicked.connect(self._teleport_to_object) # type: ignore
|
||||
win.show()
|
||||
try:
|
||||
self.blueish_model = win.model
|
||||
self._scan_all_objects(session, region)
|
||||
await win.closing
|
||||
self.blueish_model = None
|
||||
except:
|
||||
# Task got killed or something exploded, close the window ourselves
|
||||
self.blueish_model = None
|
||||
win.close()
|
||||
raise
|
||||
|
||||
def _highlight_object(self, session: Session, obj: Object):
|
||||
session.main_region.circuit.send_message(ProxiedMessage(
|
||||
"ForceObjectSelect",
|
||||
Block("Header", ResetList=False),
|
||||
Block("Data", LocalID=obj.LocalID),
|
||||
direction=Direction.IN,
|
||||
))
|
||||
|
||||
def _teleport_to_object(self, session: Session, obj: Object):
|
||||
session.main_region.circuit.send_message(ProxiedMessage(
|
||||
"TeleportLocationRequest",
|
||||
Block("AgentData", AgentID=session.agent_id, SessionID=session.id),
|
||||
Block(
|
||||
"Info",
|
||||
RegionHandle=session.main_region.handle,
|
||||
Position=obj.RegionPosition,
|
||||
LookAt=Vector3(0.0, 0.0, 0.0)
|
||||
),
|
||||
))
|
||||
|
||||
def _scan_all_objects(self, _session: Session, region: ProxiedRegion):
|
||||
self.blueish_model.clear()
|
||||
|
||||
for obj in region.objects.all_objects:
|
||||
if _is_object_blueish(obj):
|
||||
self.blueish_model.addObject(obj)
|
||||
|
||||
obj_list = self.blueish_model.objects
|
||||
region.objects.request_object_properties([o for o in obj_list if o.Name is None])
|
||||
|
||||
# Make sure we request any objects we didn't know about before,
|
||||
# they'll get picked up in the update handler.
|
||||
region.objects.request_missing_objects()
|
||||
|
||||
def handle_object_updated(self, session: Session, region: ProxiedRegion,
|
||||
obj: Object, updated_props: Set[str]):
|
||||
if self.blueish_model is None:
|
||||
return
|
||||
|
||||
if _is_object_blueish(obj):
|
||||
if obj not in self.blueish_model:
|
||||
if obj.Name is None:
|
||||
region.objects.request_object_properties(obj)
|
||||
self.blueish_model.addObject(obj)
|
||||
else:
|
||||
# mark the object as updated in the model,
|
||||
# fields may have changed.
|
||||
self.blueish_model.updateObject(obj)
|
||||
else:
|
||||
if obj in self.blueish_model:
|
||||
self.blueish_model.removeObject(obj)
|
||||
|
||||
def handle_object_killed(self, session: Session, region: ProxiedRegion, obj: Object):
|
||||
if self.blueish_model is None:
|
||||
return
|
||||
if obj in self.blueish_model:
|
||||
self.blueish_model.removeObject(obj)
|
||||
|
||||
|
||||
class BlueishModelHeader(enum.IntEnum):
|
||||
Name = 0
|
||||
Position = enum.auto()
|
||||
|
||||
|
||||
class BlueishObjectModel(QtCore.QAbstractTableModel):
|
||||
def __init__(self, parent):
|
||||
super().__init__(parent)
|
||||
self.objects: List[Object] = []
|
||||
|
||||
def __contains__(self, item):
|
||||
return item in self.objects
|
||||
|
||||
def addObject(self, obj: Object):
|
||||
if obj in self.objects:
|
||||
self.updateObject(obj)
|
||||
return
|
||||
|
||||
num_objs = len(self.objects)
|
||||
self.beginInsertRows(QtCore.QModelIndex(), num_objs, num_objs)
|
||||
self.objects.append(obj)
|
||||
self.endInsertRows()
|
||||
|
||||
def removeObject(self, obj: Object):
|
||||
try:
|
||||
obj_idx = self.objects.index(obj)
|
||||
except ValueError:
|
||||
return
|
||||
|
||||
self.beginRemoveRows(QtCore.QModelIndex(), obj_idx, obj_idx)
|
||||
self.objects.remove(obj)
|
||||
self.endRemoveRows()
|
||||
|
||||
def updateObject(self, obj: Object):
|
||||
try:
|
||||
obj_idx = self.objects.index(obj)
|
||||
except ValueError:
|
||||
return
|
||||
top_left = self.createIndex(obj_idx, 1)
|
||||
bottom_right = self.createIndex(obj_idx, self.columnCount())
|
||||
        self.dataChanged.emit(top_left, bottom_right)

    def rowCount(self, parent=None, *args, **kwargs):
        return len(self.objects)

    def columnCount(self, parent: QtCore.QModelIndex = None) -> int:
        return len(BlueishModelHeader)

    def data(self, index, role=None):
        if not index.isValid():
            return None
        obj = self.objects[index.row()]
        if role == QtCore.Qt.UserRole:
            return obj
        if role != QtCore.Qt.DisplayRole:
            return None

        col = index.column()
        val = None

        if col == BlueishModelHeader.Name:
            val = obj.Name or ""
        elif col == BlueishModelHeader.Position:
            try:
                val = str(obj.RegionPosition)
            except ValueError:
                # If the object is orphaned we may not be able to figure
                # out the region pos
                val = "Unknown"

        return val

    def headerData(self, col, orientation, role=None):
        if orientation == QtCore.Qt.Horizontal and role == QtCore.Qt.DisplayRole:
            return BlueishModelHeader(col).name

    def clear(self):
        self.beginResetModel()
        self.objects = []
        self.endResetModel()


BLUEISH_UI_PATH = os.path.join(os.path.dirname(__file__), "blueish_object_list.ui")


class BlueishObjectWindow(QtWidgets.QMainWindow):
    objectHighlightClicked = QtCore.Signal(Session, Object)
    objectTeleportClicked = QtCore.Signal(Session, Object)

    tableView: QtWidgets.QTableView

    def __init__(self, parent, session: Session):
        self.closing = asyncio.Future()
        super().__init__(parent=parent)
        loadUi(BLUEISH_UI_PATH, self)
        self.model = BlueishObjectModel(self)
        self.session = session
        self.tableView.setModel(self.model)
        self.tableView.horizontalHeader().resizeSection(BlueishModelHeader.Name, 150)
        self.tableView.horizontalHeader().setStretchLastSection(True)
        self.buttonHighlight.clicked.connect(self._highlightClicked)
        self.buttonTeleport.clicked.connect(self._teleportClicked)

    def closeEvent(self, event: QtGui.QCloseEvent):
        if not self.closing.done():
            self.closing.set_result(True)
        super().closeEvent(event)

    def _highlightClicked(self):
        self._emitForSelectedObject(self.objectHighlightClicked)

    def _teleportClicked(self):
        self._emitForSelectedObject(self.objectTeleportClicked)

    def _emitForSelectedObject(self, signal: QtCore.Signal):
        object_indexes = self.tableView.selectionModel().selectedIndexes()
        if not object_indexes:
            return
        obj = object_indexes[0].data(QtCore.Qt.UserRole)
        signal.emit(self.session, obj)  # type: ignore


addons = [BlueishObjectListGUIAddon()]
95
addon_examples/blueish_object_list.ui
Normal file
@@ -0,0 +1,95 @@
<?xml version="1.0" encoding="UTF-8"?>
<ui version="4.0">
 <class>BlueishObjectWindow</class>
 <widget class="QMainWindow" name="BlueishObjectWindow">
  <property name="geometry">
   <rect>
    <x>0</x>
    <y>0</y>
    <width>851</width>
    <height>696</height>
   </rect>
  </property>
  <property name="windowTitle">
   <string>Blueish Object List</string>
  </property>
  <property name="locale">
   <locale language="English" country="UnitedStates"/>
  </property>
  <widget class="QWidget" name="centralwidget">
   <layout class="QGridLayout" name="gridLayout">
    <item row="1" column="0">
     <layout class="QHBoxLayout" name="horizontalLayout">
      <property name="leftMargin">
       <number>0</number>
      </property>
      <property name="topMargin">
       <number>0</number>
      </property>
      <item>
       <spacer name="horizontalSpacer">
        <property name="orientation">
         <enum>Qt::Horizontal</enum>
        </property>
        <property name="sizeHint" stdset="0">
         <size>
          <width>40</width>
          <height>20</height>
         </size>
        </property>
       </spacer>
      </item>
      <item>
       <widget class="QPushButton" name="buttonTeleport">
        <property name="text">
         <string>Teleport</string>
        </property>
       </widget>
      </item>
      <item>
       <widget class="QPushButton" name="buttonHighlight">
        <property name="text">
         <string>Highlight</string>
        </property>
       </widget>
      </item>
     </layout>
    </item>
    <item row="0" column="0">
     <layout class="QVBoxLayout" name="verticalLayout">
      <item>
       <widget class="QTableView" name="tableView">
        <property name="sizePolicy">
         <sizepolicy hsizetype="Expanding" vsizetype="Expanding">
          <horstretch>2</horstretch>
          <verstretch>0</verstretch>
         </sizepolicy>
        </property>
        <property name="alternatingRowColors">
         <bool>false</bool>
        </property>
        <property name="selectionMode">
         <enum>QAbstractItemView::SingleSelection</enum>
        </property>
        <property name="selectionBehavior">
         <enum>QAbstractItemView::SelectRows</enum>
        </property>
        <property name="wordWrap">
         <bool>false</bool>
        </property>
        <property name="cornerButtonEnabled">
         <bool>false</bool>
        </property>
        <attribute name="verticalHeaderVisible">
         <bool>false</bool>
        </attribute>
       </widget>
      </item>
     </layout>
    </item>
   </layout>
  </widget>
 </widget>
 <resources/>
 <connections/>
</ui>
33
addon_examples/caps_example.py
Normal file
@@ -0,0 +1,33 @@
"""
Example of how to make simple Caps requests
"""
import aiohttp

from hippolyzer.lib.proxy.addon_utils import BaseAddon, show_message
from hippolyzer.lib.proxy.commands import handle_command
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session


class CapsExampleAddon(BaseAddon):
    @handle_command()
    async def test_caps(self, _session: Session, region: ProxiedRegion):
        caps_client = region.caps_client
        # We can pass in a ClientSession if we want to do keep-alive across requests
        async with aiohttp.ClientSession() as aio_sess:
            async with caps_client.get("SimulatorFeatures", session=aio_sess) as resp:
                await resp.read_llsd()
        # Or we can have one created for us just for this request
        async with caps_client.get("SimulatorFeatures") as resp:
            show_message(await resp.read_llsd())

        # POSTing LLSD works
        req = caps_client.post("AgentPreferences", llsd={
            "hover_height": 0.5,
        })
        # Request object can be built, then awaited
        async with req as resp:
            show_message(await resp.read_llsd())


addons = [CapsExampleAddon()]
14
addon_examples/counter.py
Normal file
@@ -0,0 +1,14 @@
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session


def handle_lludp_message(session: Session, region: ProxiedRegion, message: ProxiedMessage):
    # addon_ctx will persist across addon reloads, use for storing data that
    # needs to survive across calls to this function
    ctx = session.addon_ctx
    if message.name == "ChatFromViewer":
        chat = message["ChatData"]["Message"]
        if chat == "COUNT":
            ctx["chat_counter"] = ctx.get("chat_counter", 0) + 1
            message["ChatData"]["Message"] = str(ctx["chat_counter"])
31
addon_examples/custom_meta_filter.py
Normal file
@@ -0,0 +1,31 @@
"""
Example of custom meta tags, useful for complex expressions that wouldn't work
well in the message log filter language.

Tags messages where someone said "hello", and records who they said hello to.

If you said "hello Someone", that message would be shown in the log pane when
filtering with `Meta.Greeted == "Someone"`, or just `Meta.Greeted` to match any
message with a greeting.
"""

from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session


class CustomMetaExampleAddon(BaseAddon):
    def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: ProxiedMessage):
        if not message.name.startswith("ChatFrom"):
            return

        chat = message["ChatData"]["Message"]
        if not chat:
            return

        if chat.lower().startswith("hello "):
            message.meta["Greeted"] = chat.split(" ", 1)[1]


addons = [CustomMetaExampleAddon()]
122
addon_examples/deformer_helper.py
Normal file
@@ -0,0 +1,122 @@
"""
Helper for making deformer anims. This could have a GUI, I guess.
"""
import dataclasses
from typing import *

from hippolyzer.lib.base.datatypes import Vector3, Quaternion, UUID
from hippolyzer.lib.base.llanim import Joint, Animation, PosKeyframe, RotKeyframe
from hippolyzer.lib.proxy.addon_utils import show_message, BaseAddon, SessionProperty
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.commands import handle_command, Parameter
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session

import local_anim
# We require any addons from local_anim to be loaded, and we want
# our addon to be reloaded whenever local_anim changes.
AddonManager.hot_reload(local_anim, require_addons_loaded=True)


@dataclasses.dataclass
class DeformerJoint:
    pos: Optional[Vector3] = None
    rot: Optional[Quaternion] = None


def build_deformer(joints: Dict[str, DeformerJoint]) -> bytes:
    anim = Animation(
        major_version=1,
        minor_version=0,
        base_priority=5,
        duration=1.0,
        loop_out_point=1.0,
        loop=True,
    )

    for joint_name, joint in joints.items():
        if not any((joint.pos, joint.rot)):
            continue
        anim.joints[joint_name] = Joint(
            priority=5,
            rot_keyframes=[RotKeyframe(time=0.0, rot=joint.rot)] if joint.rot else [],
            pos_keyframes=[PosKeyframe(time=0.0, pos=joint.pos)] if joint.pos else [],
        )
    return anim.to_bytes()


class DeformerAddon(BaseAddon):
    deform_joints: Dict[str, DeformerJoint] = SessionProperty(dict)

    @handle_command()
    async def save_deformer(self, _session: Session, _region: ProxiedRegion):
        filename = await AddonManager.UI.save_file(filter_str="SL Anim (*.anim)")
        if not filename:
            return
        with open(filename, "wb") as f:
            f.write(build_deformer(self.deform_joints))

    # `sep=None` makes `coord` greedy, taking the rest of the message
    @handle_command(
        joint_name=str,
        coord_type=str,
        coord=Parameter(Vector3.parse, sep=None),
    )
    async def set_deformer_joint(self, session: Session, region: ProxiedRegion,
                                 joint_name: str, coord_type: str, coord: Vector3):
        """
        Set a coordinate for a joint in the deformer

        Example:
            set_deformer_joint mNeck pos <0, 0, 0.5>
            set_deformer_joint mNeck rot <0, 180, 0>
        """
        joint_data = self.deform_joints.setdefault(joint_name, DeformerJoint())

        if coord_type == "pos":
            joint_data.pos = coord
        elif coord_type == "rot":
            joint_data.rot = Quaternion.from_euler(*coord, degrees=True)
        else:
            show_message(f"Unknown deformer component {coord_type}")
            return
        self._reapply_deformer(session, region)

    @handle_command()
    async def stop_deforming(self, session: Session, region: ProxiedRegion):
        """Disable any active deformer; you may have to reset your skeleton manually"""
        self.deform_joints.clear()
        self._reapply_deformer(session, region)

    def _reapply_deformer(self, session: Session, region: ProxiedRegion):
        anim_data = None
        if self.deform_joints:
            anim_data = build_deformer(self.deform_joints)
        local_anim.LocalAnimAddon.apply_local_anim(session, region, "deformer_addon", anim_data)

    def handle_rlv_command(self, session: Session, region: ProxiedRegion, source: UUID,
                           cmd: str, options: List[str], param: str):
        # An object in-world can also tell the client how to deform itself via
        # RLV-style commands.

        # We only handle commands
        if param != "force":
            return

        if cmd == "stop_deforming":
            self.deform_joints.clear()
        elif cmd == "deform_joints":
            self.deform_joints.clear()
            for joint_data in options:
                joint_split = joint_data.split("|")
                pos = Vector3(*joint_split[1].split("/")) if joint_split[1] else None
                rot = Quaternion(*joint_split[2].split("/")) if joint_split[2] else None
                self.deform_joints[joint_split[0]] = DeformerJoint(pos=pos, rot=rot)
        else:
            return

        self._reapply_deformer(session, region)
        return True


addons = [DeformerAddon()]
101
addon_examples/find_packet_bugs.py
Normal file
@@ -0,0 +1,101 @@
"""
Test client handling of messages with malformed bodies

Serializes the message, but mutates parts of the message body before the ACKs.

You don't want to use this unless you're a viewer developer trying to fix bugs.
There's a 95% chance it will crash your viewer, and maybe make you teleport random
places. Definitely don't test it while logged in with an account with access to
anything important.
"""
import copy
import datetime as dt
import logging
import random

from hippolyzer.lib.base.message.msgtypes import PacketLayout
from hippolyzer.lib.base.message.udpserializer import UDPMessageSerializer
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.proxy.packets import Direction
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session

LOG = logging.getLogger(__name__)


class PacketMutationAddon(BaseAddon):
    def __init__(self):
        self.serializer = UDPMessageSerializer()

    def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: ProxiedMessage):
        # Only inbound messages, don't fiddle with the sim.
        if message.direction != Direction.IN:
            return
        # Messing with these may kill your circuit
        if message.name in ("PacketAck", "StartPingCheck", "CompletePingCheck"):
            return

        # Give login time to complete
        if session.started_at + dt.timedelta(seconds=10) > dt.datetime.now():
            return

        # Do it randomly
        if random.random() < 0.5:
            return

        # We need to take this message because we're going to
        # send our own re-serialized version. Don't use take_message()
        # because we're going to keep packet_id and acks the same.
        prepared = copy.deepcopy(message)
        # This message only had ACKs that were dropped.
        if not region.circuit.prepare_message(prepared):
            return

        serialized = bytearray(self.serializer.serialize(prepared))

        # Figure out where the ACKs will be so we don't mess those up
        acks_size = 0
        if prepared.acks:
            acks_size = 1 + (len(prepared.acks) * 4)

        # Give enough space for the message name, and ignore the ACKs at the end.
        # The message name can be 5 bytes if zerocoded.
        body_slice = slice(PacketLayout.PHL_NAME + 5 + message.offset, -1 - acks_size)
        body_view = serialized[body_slice]
        # The message is too small and we're left with nothing.
        if not body_view:
            return

        # Can be switched out with _flip_body_bytes() or something
        changed_body = self._truncate_body(body_view)
        if changed_body is None:
            return
        serialized[body_slice] = changed_body

        # Send out the raw mutated datagram
        region.circuit.send_datagram(serialized, message.direction)
        # Tell the proxy that we already sent the message and to short-circuit
        return True

    def _truncate_body(self, body_view: bytearray):
        # Don't want to mess with bodies this short.
        if len(body_view) < 4:
            return
        # Slice off the last bit of the body
        del body_view[int(len(body_view) * 0.7):-1]
        return body_view

    def _flip_body_bytes(self, body_view: bytearray):
        # Don't want to mess with bodies this short.
        if len(body_view) < 4:
            return

        # randomly flip bytes up to 19 bytes away from the end
        for i in range(-19, 0):
            if random.random() < 0.3:
                body_view[i] = (~body_view[i]) & 0xFF
        return body_view


addons = [PacketMutationAddon()]
31
addon_examples/greetings.py
Normal file
@@ -0,0 +1,31 @@
from hippolyzer.lib.base.datatypes import Vector3
from hippolyzer.lib.proxy.addon_utils import send_chat, BaseAddon, show_message
from hippolyzer.lib.proxy.commands import handle_command
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session


class GreetingAddon(BaseAddon):
    @handle_command()
    async def greetings(self, session: Session, region: ProxiedRegion):
        """Greet everyone around you"""
        agent_obj = region.objects.lookup_fullid(session.agent_id)
        if not agent_obj:
            show_message("Don't have an agent object?")
            return

        # Note that this will only have avatars close-ish to your camera. The sim sends
        # KillObjects for avatars that get too far away.
        other_agents = [o for o in region.objects.all_avatars if o.FullID != agent_obj.FullID]

        if not other_agents:
            show_message("No other agents?")
            return

        for other_agent in other_agents:
            dist = Vector3.dist(agent_obj.Position, other_agent.Position)
            if dist >= 19.0:
                continue
            nv = other_agent.NameValue.to_dict()
            send_chat(f"Greetings, {nv['FirstName']} {nv['LastName']}!")


addons = [GreetingAddon()]
29
addon_examples/hide_lookat.py
Normal file
@@ -0,0 +1,29 @@
"""
Drop outgoing packets that might leak what you're looking at, similar to Firestorm
"""

from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.proxy.packets import Direction
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
from hippolyzer.lib.proxy.templates import ViewerEffectType


BLOCKED_EFFECTS = (
    ViewerEffectType.EFFECT_LOOKAT,
    ViewerEffectType.EFFECT_BEAM,
    ViewerEffectType.EFFECT_POINTAT,
    ViewerEffectType.EFFECT_EDIT,
)


def handle_lludp_message(_session: Session, region: ProxiedRegion, msg: ProxiedMessage):
    if msg.name == "ViewerEffect" and msg.direction == Direction.OUT:
        new_blocks = [b for b in msg["Effect"] if b["Type"] not in BLOCKED_EFFECTS]
        if new_blocks:
            msg["Effect"] = new_blocks
        else:
            # drop `ViewerEffect` entirely if left with no blocks
            region.circuit.drop_message(msg)
        # Short-circuit any other addons processing this message
        return True
136
addon_examples/horror_animator.py
Normal file
@@ -0,0 +1,136 @@
"""
Body horror local animation mutator

Demonstrates programmatic modification / generation of animations

It will make you look absurd, obscene.
"""
import copy

import mitmproxy.http

from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.llanim import Animation
from hippolyzer.lib.proxy.addon_utils import AssetAliasTracker, BaseAddon, GlobalProperty
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session, SessionManager
from hippolyzer.lib.proxy.vfs import STATIC_VFS


JOINT_REPLS = {
    "Left": "Right",
    "Right": "Left",
    "LEFT": "RIGHT",
    "RIGHT": "LEFT",
}


def _change_joint_name(joint_name: str):
    for orig, repl in JOINT_REPLS.items():
        if orig in joint_name:
            return joint_name.replace(orig, repl)
    return joint_name


def _mutate_anim_bytes(anim_bytes: bytes):
    anim = Animation.from_bytes(anim_bytes)
    new_joints = {}
    for name, joint in anim.joints.items():
        new_joints[_change_joint_name(name)] = joint
    anim.joints = new_joints
    for constraint in anim.constraints:
        constraint.source_volume = _change_joint_name(constraint.source_volume)
        constraint.target_volume = _change_joint_name(constraint.target_volume)
    return anim.to_bytes()


class HorrorAnimatorAddon(BaseAddon):
    horror_anim_tracker: AssetAliasTracker = GlobalProperty(AssetAliasTracker)

    def handle_init(self, session_manager: SessionManager):
        # We've reloaded, so make sure assets get new aliases
        self.horror_anim_tracker.invalidate_aliases()

    def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: ProxiedMessage):
        tracker = self.horror_anim_tracker

        if message.name == "AvatarAnimation":
            # Only do this for the current user
            if message["Sender"]["ID"] != session.agent_id:
                return
            # Replace inbound anim IDs with alias IDs so we can force a cache
            # miss and replace the contents
            for block in message["AnimationList"][:]:
                anim_id = block["AnimID"]
                # Many of the anims in the static VFS have special meanings and the viewer
                # does different things based on the presence or absence of their IDs
                # in the motion list. Make sure those motions come through as usual, but
                # also add an alias so we can override the motions with an edited
                # version of the motion.
                if block["AnimID"] in STATIC_VFS:
                    new_block = copy.deepcopy(block)
                    new_block["AnimID"] = tracker.get_alias_uuid(anim_id)
                    message["AnimationList"].append(new_block)
                else:
                    block["AnimID"] = tracker.get_alias_uuid(anim_id)
        elif message.name == "AgentAnimation":
            # Make sure to remove any alias IDs from our outbound anim requests
            for block in message["AnimationList"]:
                orig_id = tracker.get_orig_uuid(block["AnimID"])
                if orig_id:
                    block["AnimID"] = orig_id

    def handle_http_request(self, session_manager: SessionManager, flow: HippoHTTPFlow):
        if not flow.cap_data.asset_server_cap:
            return

        anim_id = flow.request.query.get("animatn_id")
        if not anim_id:
            return

        orig_anim_id = self.horror_anim_tracker.get_orig_uuid(UUID(anim_id))
        if not orig_anim_id:
            return

        flow.request.query["animatn_id"] = str(orig_anim_id)

        flow.can_stream = False
        flow.metadata["horror_anim"] = True

        if orig_anim_id in STATIC_VFS:
            # These animations are only in the static VFS and won't be served
            # by the asset server. Read the anim out of the static VFS and
            # send the response back immediately
            block = STATIC_VFS[orig_anim_id]
            anim_data = STATIC_VFS.read_block(block)
            flow.response = mitmproxy.http.HTTPResponse.make(
                200,
                _mutate_anim_bytes(anim_data),
                {
                    "Content-Type": "binary/octet-stream",
                    "Connection": "close",
                }
            )
            return True

        # Partial requests for an anim wouldn't make any sense
        flow.request.headers.pop("Range", None)

    def handle_http_response(self, session_manager: SessionManager, flow: HippoHTTPFlow):
        if not flow.metadata.get("horror_anim"):
            return

        if flow.response.status_code not in (200, 206):
            return

        flow.response.content = _mutate_anim_bytes(flow.response.content)
        # Not a range anymore, update the headers and status.
        flow.response.headers.pop("Content-Range", None)
        flow.response.status_code = 200

        return True


addons = [HorrorAnimatorAddon()]
165
addon_examples/local_anim.py
Normal file
@@ -0,0 +1,165 @@
"""
Local animations

/524 load_local_anim
assuming you loaded something.anim
/524 start_local_anim something
/524 stop_local_anim something

If you want to trigger the animation from an object to simulate llStartAnimation():
llOwnerSay("@start_local_anim:something=force");
"""

import asyncio
import os
import pathlib
from typing import *

from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Block
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.addon_utils import BaseAddon, SessionProperty
from hippolyzer.lib.proxy.commands import handle_command
from hippolyzer.lib.proxy.http_asset_repo import HTTPAssetRepo
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session


def _get_mtime(path: str):
    try:
        return os.stat(path).st_mtime
    except OSError:
        return None


class LocalAnimAddon(BaseAddon):
    # name -> path, only for anims actually from files
    local_anim_paths: Dict[str, str] = SessionProperty(dict)
    # name -> mtime or None. Only for anims from files.
    local_anim_mtimes: Dict[str, Optional[float]] = SessionProperty(dict)
    # name -> current asset ID (changes each play)
    local_anim_playing_ids: Dict[str, UUID] = SessionProperty(dict)

    def handle_session_init(self, session: Session):
        self._schedule_task(self._try_reload_anims(session))

    @handle_command()
    async def load_local_anim(self, _session: Session, _region: ProxiedRegion):
        """Load a local animation file into the list of local anims"""
        filename = await AddonManager.UI.open_file(filter_str="SL Anim (*.anim)")
        if filename:
            p = pathlib.Path(filename)
            self.local_anim_paths[p.stem] = filename

    @handle_command(anim_name=str)
    async def start_local_anim(self, session: Session, region: ProxiedRegion, anim_name):
        """
        Start a named local animation

        Assuming you loaded an animation named something.anim:
            start_local_anim something
        """
        self.apply_local_anim_from_file(session, region, anim_name)

    @handle_command(anim_name=str)
    async def stop_local_anim(self, session: Session, region: ProxiedRegion, anim_name):
        """Stop a named local animation"""
        self.apply_local_anim(session, region, anim_name, new_data=None)

    async def _try_reload_anims(self, session: Session):
        while True:
            region = session.main_region
            if not region:
                await asyncio.sleep(2.0)
                continue

            # Loop over local anims we loaded
            for anim_name in self.local_anim_paths:
                anim_id = self.local_anim_playing_ids.get(anim_name)
                if not anim_id:
                    continue
                # is playing right now, check if there's a newer version
                self.apply_local_anim_from_file(session, region, anim_name, only_if_changed=True)
            await asyncio.sleep(2.0)

    def handle_rlv_command(self, session: Session, region: ProxiedRegion, source: UUID,
                           cmd: str, options: List[str], param: str):
        # We only handle commands
        if param != "force":
            return

        if cmd == "stop_local_anim":
            self.apply_local_anim(session, region, options[0], new_data=None)
            return True
        elif cmd == "start_local_anim":
            self.apply_local_anim_from_file(session, region, options[0])
            return True

    @classmethod
    def apply_local_anim(cls, session: Session, region: ProxiedRegion,
                         anim_name: str, new_data: Optional[bytes] = None):
        asset_repo: HTTPAssetRepo = session.session_manager.asset_repo
        next_id: Optional[UUID] = None
        new_msg = ProxiedMessage(
            "AgentAnimation",
            Block(
                "AgentData",
                AgentID=session.agent_id,
                SessionID=session.id,
            ),
        )

        # Stop any old version of the anim that might be playing first
        cur_id = cls.local_anim_playing_ids.get(anim_name)
        if cur_id:
            new_msg.add_block(Block(
                "AnimationList",
                AnimID=cur_id,
                StartAnim=False,
            ))

        if new_data is not None:
            # Create a temp asset ID for the new version and send out a start request
            next_id = asset_repo.create_asset(new_data, one_shot=True)
            new_msg.add_block(Block(
                "AnimationList",
                AnimID=next_id,
                StartAnim=True,
            ))
            cls.local_anim_playing_ids[anim_name] = next_id
        else:
            # No data means just stop the anim
            cls.local_anim_playing_ids.pop(anim_name, None)

        region.circuit.send_message(new_msg)
        print(f"Changing {anim_name} to {next_id}")

    @classmethod
    def apply_local_anim_from_file(cls, session: Session, region: ProxiedRegion,
                                   anim_name: str, only_if_changed=False):
        anim_path = cls.local_anim_paths.get(anim_name)
        anim_data = None
        if anim_path:
            old_mtime = cls.local_anim_mtimes.get(anim_name)
            mtime = _get_mtime(anim_path)
            if only_if_changed and old_mtime == mtime:
                return

            cls.local_anim_mtimes[anim_name] = mtime
            # The file might not even exist anymore if mtime is `None`;
            # the anim will automatically stop if that happens.
            if mtime:
                if only_if_changed:
                    print(f"Re-applying {anim_name}")
                else:
                    print(f"Playing {anim_name}")

                with open(anim_path, "rb") as f:
                    anim_data = f.read()
        else:
            print(f"Unknown anim {anim_name!r}")
        cls.apply_local_anim(session, region, anim_name, new_data=anim_data)


addons = [LocalAnimAddon()]
279
addon_examples/local_mesh.py
Normal file
@@ -0,0 +1,279 @@
"""
Allows specifying a target object to apply a mesh preview to. When a local mesh target
is specified, hitting the "calculate upload cost" button in the mesh uploader will instead
apply the mesh to the local mesh target. It works on attachments too. Useful for testing rigs
before a final, real upload.

Select an object and do /524 set_local_mesh_target, then go through the mesh upload flow.
Mesh pieces will be mapped to your object based on object name. Note that if you're using Blender
these will be based on the name of your _geometry nodes_ and not the objects themselves.

The object you select as a mesh target must contain a mesh prim. The mesh objects you use as a local
mesh target should have at least as many faces as the mesh you want to apply to it, or you won't
be able to set textures on those faces correctly.

When you're done with local mesh and want to allow regular mesh upload again, do
/524 disable_local_mesh

Does not attempt to apply textures uploaded with the mesh.
"""
from __future__ import annotations

import ctypes
import secrets
from typing import *

import mitmproxy
from mitmproxy.http import HTTPFlow

from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.datatypes import *
from hippolyzer.lib.base.mesh import LLMeshSerializer, MeshAsset
from hippolyzer.lib.base import serialization as se
from hippolyzer.lib.base.objects import Object
from hippolyzer.lib.proxy import addon_ctx
from hippolyzer.lib.proxy.addon_utils import show_message, BaseAddon, GlobalProperty, SessionProperty
from hippolyzer.lib.proxy.commands import handle_command
from hippolyzer.lib.proxy.http_asset_repo import HTTPAssetRepo
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session, SessionManager
from hippolyzer.lib.proxy.templates import ExtraParamType


def _modify_crc(crc_tweak, crc_val):
    return ctypes.c_uint32(crc_val ^ crc_tweak).value


def _mangle_mesh_list(mesh_list: List[bytes], manglers: List[Callable]) -> List[bytes]:
    if not mesh_list or not manglers:
        return mesh_list
    new_mesh_list = []
    mesh_ser = LLMeshSerializer()
    for mesh_bytes in mesh_list:
        reader = se.BufferReader("!", mesh_bytes)
        mesh: MeshAsset = reader.read(mesh_ser)
        for mangler in manglers:
            mesh = mangler(mesh)
        writer = se.BufferWriter("!")
        writer.write(mesh_ser, mesh)
        new_mesh_list.append(writer.copy_buffer())
    return new_mesh_list


class MeshUploadInterceptingAddon(BaseAddon):
    mesh_crc_tweak: int = GlobalProperty(default=secrets.randbits(32))
    mesh_manglers: List[Callable[[MeshAsset], MeshAsset]] = GlobalProperty(list)
    # LocalIDs being targeted for local mesh
    local_mesh_target_locals: List[int] = SessionProperty(list)
    # Name -> mesh index mapping
    local_mesh_mapping: Dict[str, int] = SessionProperty(dict)
    # originally supplied mesh bytes, indexed by mesh index.
    # mostly used for re-mangling if mesh manglers changed.
    local_mesh_orig_bytes: List[bytes] = SessionProperty(list)
    # Above, but for the local asset IDs
    local_mesh_asset_ids: List[UUID] = SessionProperty(list)

    def handle_init(self, session_manager):
        # Other plugins can add to this list to apply transforms to mesh
        # whenever it's uploaded.
        self.remangle_local_mesh(session_manager)

    @handle_command()
|
||||
async def set_local_mesh_target(self, session: Session, region: ProxiedRegion):
|
||||
"""Set the currently selected object as the target for local mesh"""
|
||||
parent_object = region.objects.lookup_localid(session.selected.object_local)
|
||||
linkset_objects = [parent_object] + parent_object.Children
|
||||
|
||||
old_locals = self.local_mesh_target_locals
|
||||
self.local_mesh_target_locals = [
|
||||
x.LocalID
|
||||
for x in linkset_objects
|
||||
if ExtraParamType.MESH in x.ExtraParams
|
||||
]
|
||||
|
||||
if old_locals:
|
||||
# Return the old objects to normal
|
||||
self.mesh_crc_tweak = secrets.randbits(32)
|
||||
region.objects.request_objects(old_locals)
|
||||
|
||||
if not self.local_mesh_target_locals:
|
||||
show_message("There must be at least one mesh object in the linkset!")
|
||||
return
|
||||
|
||||
# We'll need the name for all of these to pick which mesh asset to
|
||||
# apply to them.
|
||||
region.objects.request_object_properties(self.local_mesh_target_locals)
|
||||
show_message(f"Targeting {self.local_mesh_target_locals}")
|
||||
|
||||
@handle_command()
|
||||
async def disable_local_mesh(self, session: Session, region: ProxiedRegion):
|
||||
"""Disable local mesh mode, allowing mesh upload and returning targets to normal"""
|
||||
# Put the target objects back to normal and kill the temp assets
|
||||
old_locals = tuple(self.local_mesh_target_locals)
|
||||
self.local_mesh_target_locals.clear()
|
||||
asset_repo: HTTPAssetRepo = session.session_manager.asset_repo
|
||||
for asset_id in self.local_mesh_asset_ids:
|
||||
del asset_repo[asset_id]
|
||||
self.local_mesh_asset_ids.clear()
|
||||
self.local_mesh_asset_ids.clear()
|
||||
self.local_mesh_mapping.clear()
|
||||
if old_locals:
|
||||
region.objects.request_objects(old_locals)
|
||||
show_message(f"Cleared target {old_locals}")
|
||||
|
||||
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: ProxiedMessage):
|
||||
# Replace any mesh asset IDs in tracked objects with our local assets
|
||||
if not self.local_mesh_target_locals:
|
||||
return
|
||||
if not self.local_mesh_asset_ids:
|
||||
return
|
||||
|
||||
if message.name == "ObjectUpdate":
|
||||
for block in message["ObjectData"]:
|
||||
if block["ID"] not in self.local_mesh_target_locals:
|
||||
continue
|
||||
block["CRC"] = _modify_crc(self.mesh_crc_tweak, block["CRC"])
|
||||
parsed_params = block.deserialize_var("ExtraParams")
|
||||
if not parsed_params:
|
||||
continue
|
||||
obj = region.objects.lookup_localid(block["ID"])
|
||||
if not obj:
|
||||
return
|
||||
parsed_params[ExtraParamType.MESH]["Asset"] = self._pick_mesh_asset(obj)
|
||||
block.serialize_var("ExtraParams", parsed_params)
|
||||
elif message.name == "ObjectUpdateCompressed":
|
||||
for block in message["ObjectData"]:
|
||||
update_data = block.deserialize_var("Data")
|
||||
if not update_data:
|
||||
continue
|
||||
if update_data["ID"] not in self.local_mesh_target_locals:
|
||||
continue
|
||||
update_data["CRC"] = _modify_crc(self.mesh_crc_tweak, update_data["CRC"])
|
||||
if not update_data.get("ExtraParams"):
|
||||
continue
|
||||
|
||||
obj = region.objects.lookup_localid(update_data["ID"])
|
||||
if not obj:
|
||||
return
|
||||
extra_params = update_data["ExtraParams"]
|
||||
extra_params[ExtraParamType.MESH]["Asset"] = self._pick_mesh_asset(obj)
|
||||
block.serialize_var("Data", update_data)
|
||||
|
||||
def _pick_mesh_asset(self, obj: Object) -> UUID:
|
||||
mesh_idx = self.local_mesh_mapping.get(obj.Name, 0)
|
||||
# Use whatever the first mesh was if we don't have a match on name.
|
||||
return self.local_mesh_asset_ids[mesh_idx]
|
||||
|
||||
def handle_http_request(self, session_manager: SessionManager, flow: HippoHTTPFlow):
|
||||
cap_data = flow.cap_data
|
||||
if not cap_data:
|
||||
return
|
||||
if cap_data.cap_name == "NewFileAgentInventory":
|
||||
# Might be an upload cost calculation request for mesh, includes the actual mesh data.
|
||||
payload = llsd.parse_xml(flow.request.content)
|
||||
if "asset_resources" not in payload:
|
||||
return
|
||||
|
||||
orig_mesh_list = payload["asset_resources"].get("mesh_list")
|
||||
if not orig_mesh_list:
|
||||
return
|
||||
|
||||
# Replace the mesh instances in the payload with versions run through our mangler
|
||||
new_mesh_list = _mangle_mesh_list(orig_mesh_list, self.mesh_manglers)
|
||||
payload["asset_resources"]["mesh_list"] = new_mesh_list
|
||||
|
||||
# We have local mesh instances, re-use the data sent along with the upload cost
|
||||
# request to apply the mesh to our local mesh objects intead.
|
||||
if self.local_mesh_target_locals:
|
||||
region: ProxiedRegion = cap_data.region()
|
||||
asset_repo: HTTPAssetRepo = session_manager.asset_repo
|
||||
# Apply the new mesh to any local mesh targets
|
||||
self._replace_local_mesh(region, asset_repo, new_mesh_list)
|
||||
# Keep the original bytes around in case manglers get reloaded
|
||||
# and we want to re-run them
|
||||
self.local_mesh_orig_bytes = orig_mesh_list
|
||||
instances = payload["asset_resources"]["instance_list"]
|
||||
# To figure out what mesh index applies to what object name
|
||||
self.local_mesh_mapping = {x["mesh_name"]: x["mesh"] for x in instances}
|
||||
|
||||
# Fake a response, we don't want to actually send off the request.
|
||||
flow.response = mitmproxy.http.HTTPResponse.make(
|
||||
200,
|
||||
b"",
|
||||
{
|
||||
"Content-Type": "text/plain",
|
||||
"Connection": "close",
|
||||
}
|
||||
)
|
||||
show_message("Applying local mesh")
|
||||
elif self.mesh_manglers:
|
||||
flow.request.content = llsd.format_xml(payload)
|
||||
show_message("Mangled upload cost request")
|
||||
elif cap_data.cap_name == "NewFileAgentInventoryUploader":
|
||||
# Don't bother looking at this if we have no manglers
|
||||
if self.mesh_manglers:
|
||||
return
|
||||
# Depending on what asset is being uploaded the body may not even be LLSD.
|
||||
if not flow.request.content or b"mesh_list" not in flow.request.content:
|
||||
return
|
||||
payload = llsd.parse_xml(flow.request.content)
|
||||
if not payload.get("mesh_list"):
|
||||
return
|
||||
|
||||
payload["mesh_list"] = _mangle_mesh_list(payload["mesh_list"], self.mesh_manglers)
|
||||
flow.request.content = llsd.format_xml(payload)
|
||||
show_message("Mangled upload request")
|
||||
|
||||
def handle_object_updated(self, session: Session, region: ProxiedRegion,
|
||||
obj: Object, updated_props: Set[str]):
|
||||
if obj.LocalID not in self.local_mesh_target_locals:
|
||||
return
|
||||
if "Name" not in updated_props or obj.Name is None:
|
||||
return
|
||||
# A local mesh target has a new name, which mesh we need to apply
|
||||
# to the object may have changed.
|
||||
self.mesh_crc_tweak = secrets.randbits(32)
|
||||
region.objects.request_objects(obj.LocalID)
|
||||
|
||||
@classmethod
|
||||
def _replace_local_mesh(cls, region: ProxiedRegion, asset_repo, mesh_list: List[bytes]) -> None:
|
||||
cls.mesh_crc_tweak = secrets.randbits(32)
|
||||
|
||||
for asset_id in cls.local_mesh_asset_ids:
|
||||
del asset_repo[asset_id]
|
||||
cls.local_mesh_asset_ids.clear()
|
||||
for mesh_blob in mesh_list:
|
||||
cls.local_mesh_asset_ids.append(asset_repo.create_asset(mesh_blob))
|
||||
# Ask for a full update so we can clobber the mesh param
|
||||
# Janky hack around the fact that we don't know how to build
|
||||
# them from scratch yet.
|
||||
region.objects.request_objects(cls.local_mesh_target_locals)
|
||||
|
||||
@classmethod
|
||||
def remangle_local_mesh(cls, session_manager: SessionManager):
|
||||
# We want CRCs that are stable for the duration of the session, but will
|
||||
# cause a cache miss for objects cached before this session. Generate a
|
||||
# random value to XOR all CRCs with
|
||||
# We need to regen this when we force a re-mangle to indicate that the
|
||||
# viewer should pay attention to the incoming ObjectUpdate
|
||||
cls.mesh_crc_tweak = secrets.randbits(32)
|
||||
|
||||
asset_repo: HTTPAssetRepo = session_manager.asset_repo
|
||||
# Mesh manglers are global, so we need to re-mangle mesh for all sessions
|
||||
for session in session_manager.sessions:
|
||||
# Push the context of this session onto the stack so we can access
|
||||
# session-scoped properties
|
||||
with addon_ctx.push(new_session=session, new_region=session.main_region):
|
||||
if not cls.local_mesh_target_locals:
|
||||
continue
|
||||
orig_bytes = cls.local_mesh_orig_bytes
|
||||
if not orig_bytes:
|
||||
continue
|
||||
show_message("Remangling mesh", session=session)
|
||||
mesh_list = _mangle_mesh_list(orig_bytes, cls.mesh_manglers)
|
||||
cls._replace_local_mesh(session.main_region, asset_repo, mesh_list)
|
||||
|
||||
|
||||
addons = [MeshUploadInterceptingAddon()]
|
||||
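The CRC tweak used above works because XOR with a fixed value is its own inverse: one application forces a viewer cache miss, and applying the same tweak again restores the original CRC. A standalone sketch of that property (the helper is reproduced here so the example runs on its own):

```python
import ctypes
import secrets


def modify_crc(crc_tweak: int, crc_val: int) -> int:
    # XOR, wrapped to an unsigned 32-bit value as the wire format expects
    return ctypes.c_uint32(crc_val ^ crc_tweak).value


tweak = secrets.randbits(32)
crc = 0xDEADBEEF
tweaked = modify_crc(tweak, crc)
# A fresh random tweak almost certainly changes the CRC, busting the
# viewer's object cache; XORing with the same tweak again round-trips.
assert modify_crc(tweak, tweaked) == crc
```

This is why the addons regenerate the tweak whenever they want the viewer to discard what it has cached: a new random value makes every tweaked CRC disagree with the cached one.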
73
addon_examples/mesh_mangler.py
Normal file
@@ -0,0 +1,73 @@
"""
Example mesh mangler addon, to be used with the local mesh addon.

You can edit this live to apply various transforms to local mesh.
If there are no live local mesh instances, the transforms will be
applied to the mesh before upload.

I personally use manglers to strip the bounding-box materials you need
to add to give a mesh an arbitrary center of rotation / scaling.
"""

from hippolyzer.lib.base.mesh import MeshAsset
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.sessions import SessionManager

import local_mesh
AddonManager.hot_reload(local_mesh, require_addons_loaded=True)


def _reorient_coord(coord, orientation):
    coords = []
    for axis in orientation:
        axis_idx = abs(axis) - 1
        coords.append(coord[axis_idx] if axis >= 0 else 1.0 - coord[axis_idx])
    if coord.__class__ in (list, tuple):
        return coord.__class__(coords)
    return coord.__class__(*coords)


def _reorient_coord_list(coord_list, orientation):
    return [_reorient_coord(x, orientation) for x in coord_list]


def reorient_mesh(orientation):
    # Returns a callable that will change `mesh` to match `orientation`.
    # X=1, Y=2, Z=3; a negative value flips that axis.
    def _reorienter(mesh: MeshAsset):
        for material in mesh.iter_lod_materials():
            # We don't need to use positions_(to/from)_domain here since we're just naively
            # flipping the axes around.
            material["Position"] = _reorient_coord_list(material["Position"], orientation)
            # Are you even supposed to do this to the normals?
            material["Normal"] = _reorient_coord_list(material["Normal"], orientation)
        return mesh
    return _reorienter


OUR_MANGLERS = [
    # Negate the X and Y axes on any mesh we upload or create temp assets for
    reorient_mesh((-1, -2, 3)),
]


class MeshManglerExampleAddon(BaseAddon):
    def handle_init(self, session_manager: SessionManager):
        # Add our manglers into the list
        local_mesh_addon = local_mesh.MeshUploadInterceptingAddon
        local_mesh_addon.mesh_manglers.extend(OUR_MANGLERS)
        # Tell the local mesh plugin that the mangler list changed, and to re-apply
        local_mesh_addon.remangle_local_mesh(session_manager)

    def handle_unload(self, session_manager: SessionManager):
        # Clean up our manglers before we go away
        local_mesh_addon = local_mesh.MeshUploadInterceptingAddon
        mangler_list = local_mesh_addon.mesh_manglers
        for mangler in OUR_MANGLERS:
            if mangler in mangler_list:
                mangler_list.remove(mangler)
        local_mesh_addon.remangle_local_mesh(session_manager)


addons = [MeshManglerExampleAddon()]
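The signed-axis convention used by `reorient_mesh` (X=1, Y=2, Z=3, negative to flip) can be exercised on plain tuples; `_reorient_coord` is reproduced below so the sketch stands alone. Note the flip is `1.0 - v`, a mirror within the [0, 1] domain the LLMesh positions live in, not a simple sign negation:

```python
def reorient_coord(coord, orientation):
    # orientation is a permutation of signed axis indices: X=1, Y=2, Z=3.
    # A negative entry mirrors that axis within the [0, 1] domain.
    coords = []
    for axis in orientation:
        axis_idx = abs(axis) - 1
        coords.append(coord[axis_idx] if axis >= 0 else 1.0 - coord[axis_idx])
    return tuple(coords)


# Swap the X and Y axes:
assert reorient_coord((0.25, 0.75, 0.5), (2, 1, 3)) == (0.75, 0.25, 0.5)
# Mirror X and Y, as the example's (-1, -2, 3) mangler does:
assert reorient_coord((0.25, 0.75, 0.5), (-1, -2, 3)) == (0.75, 0.25, 0.5)
```

The same permutation is applied to every vertex of every LOD material, which is why a single orientation tuple is enough to describe the whole transform.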
263
addon_examples/monochrome.py
Normal file
@@ -0,0 +1,263 @@
"""
Make most object textures monochrome.

Avoids affecting materials and profile pictures and the like by
replacing textures in TEs with an alias and only mutating requests
for those alias IDs.

Demonstrates a multiprocessing / queue consumer pattern for addons that
need to do expensive, potentially blocking work in the background.

This will make your CPU fan go crazy, so enjoy that.
"""

import copy
import ctypes
import multiprocessing
import queue
import secrets
import statistics
import time
import traceback

import glymur
import numpy as np
from mitmproxy.http import HTTPFlow

from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.jp2_utils import BufferedJp2k
from hippolyzer.lib.base.multiprocessing_utils import ParentProcessWatcher
from hippolyzer.lib.proxy.addon_utils import AssetAliasTracker, BaseAddon, GlobalProperty
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session, SessionManager
from hippolyzer.lib.proxy.templates import TextureEntry


glymur.set_option('lib.num_threads', 4)


def _modify_crc(crc_tweak: int, crc_val: int):
    return ctypes.c_uint32(crc_val ^ crc_tweak).value


class MonochromeAddon(BaseAddon):
    NUM_CONSUMERS = 6
    mono_crc_xor: int = GlobalProperty()
    mono_tracker: AssetAliasTracker = GlobalProperty(AssetAliasTracker)

    def __init__(self):
        # These are global and should die whenever the addon reloads,
        # so they can live on the instance rather than in addon context
        self.mono_addon_shutdown_signal = multiprocessing.Event()
        self.image_resp_queue = multiprocessing.Queue()

    def handle_init(self, session_manager: SessionManager):
        to_proxy_queue = session_manager.flow_context.to_proxy_queue
        for _ in range(self.NUM_CONSUMERS):
            multiprocessing.Process(
                target=_process_image_queue,
                args=(self.mono_addon_shutdown_signal, self.image_resp_queue, to_proxy_queue),
                daemon=True,
            ).start()

        # We want CRCs that are stable for the duration of the addon's life, but will
        # cause a cache miss for objects cached before. Generate a random value
        # to XOR all CRCs with.
        self.mono_crc_xor = secrets.randbits(32)
        self.mono_tracker.invalidate_aliases()

    def handle_session_init(self, session: Session):
        # We loaded while this session was active, re-request all objects we
        # know about so we can process them
        if session.main_region:
            object_manager = session.main_region.objects
            object_manager.request_missing_objects()
            object_manager.request_objects([o.LocalID for o in object_manager.all_objects])

    def handle_unload(self, session_manager: SessionManager):
        # Tell queue consumers to shut down
        self.mono_addon_shutdown_signal.set()

    def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: ProxiedMessage):
        tracker = self.mono_tracker
        if message.name == "ObjectUpdateCached":
            for block in message["ObjectData"]:
                # Cached only really has a CRC, this will force the cache miss.
                block["CRC"] = _modify_crc(self.mono_crc_xor, block["CRC"])
        elif message.name == "ObjectUpdate":
            for block in message["ObjectData"]:
                block["CRC"] = _modify_crc(self.mono_crc_xor, block["CRC"])
                parsed_te = block.deserialize_var("TextureEntry")
                if not parsed_te:
                    continue

                parsed_te = self._make_te_monochrome(tracker, parsed_te)
                block.serialize_var("TextureEntry", parsed_te)
        elif message.name == "ImprovedTerseObjectUpdate":
            for block in message["ObjectData"]:
                parsed_te = block.deserialize_var("TextureEntry")
                if not parsed_te:
                    continue
                parsed_te = self._make_te_monochrome(tracker, parsed_te)
                block.serialize_var("TextureEntry", parsed_te)
        elif message.name == "AvatarAppearance":
            for block in message["ObjectData"]:
                parsed_te = block.deserialize_var("TextureEntry")
                if not parsed_te:
                    continue
                # Have to hook AppearanceService for this to work.
                parsed_te = self._make_te_monochrome(tracker, parsed_te)
                block.serialize_var("TextureEntry", parsed_te)
        elif message.name == "ObjectUpdateCompressed":
            for block in message["ObjectData"]:
                update_data = block.deserialize_var("Data")
                if not update_data:
                    continue
                update_data["CRC"] = _modify_crc(self.mono_crc_xor, update_data["CRC"])
                if not update_data.get("TextureEntry"):
                    continue

                update_data["TextureEntry"] = self._make_te_monochrome(
                    tracker,
                    update_data["TextureEntry"],
                )
                block.serialize_var("Data", update_data)
        elif message.name == "RegionHandshake":
            for field_name, val in message["RegionInfo"][0].items():
                if field_name.startswith("Terrain") and isinstance(val, UUID):
                    message["RegionInfo"][0][field_name] = tracker.get_alias_uuid(val)

    @staticmethod
    def _make_te_monochrome(tracker: AssetAliasTracker, parsed_te: TextureEntry):
        # Need a deepcopy because TEs are owned by the ObjectManager
        # and we don't want to change the canonical view.
        parsed_te = copy.deepcopy(parsed_te)
        for k, v in parsed_te.Textures.items():
            # Replace textures with their alias to bust the viewer cache
            parsed_te.Textures[k] = tracker.get_alias_uuid(v)
        for k, v in parsed_te.Color.items():
            # Convert face colors to monochrome, keeping the alpha byte the same
            avg_val = int(round(statistics.mean((v[0], v[1], v[2]))))
            parsed_te.Color[k] = bytes((avg_val,) * 3 + (v[3],))
        return parsed_te

    def handle_http_request(self, session_manager: SessionManager, flow: HippoHTTPFlow):
        cap_data = flow.cap_data
        if not cap_data:
            return
        is_appearance = cap_data.cap_name == "AppearanceService"
        if not (cap_data.asset_server_cap or is_appearance):
            return

        if is_appearance:
            # Baked layers come from the sim-local appearance service, not the asset server.
            # Its request URIs look a little different.
            texture_id = flow.request.url.split('/')[-1]
        else:
            texture_id = flow.request.query.get("texture_id")

        if not texture_id:
            return

        orig_texture_id = self.mono_tracker.get_orig_uuid(UUID(texture_id))
        if not orig_texture_id:
            return

        # The request was for a fake texture ID we created, rewrite the request to
        # request the real asset and mark the flow for modification once we receive
        # the image.
        if is_appearance:
            split_url = flow.request.url.split('/')
            split_url[-1] = str(orig_texture_id)
            flow.request.url = '/'.join(split_url)
        else:
            flow.request.query["texture_id"] = str(orig_texture_id)

        flow.can_stream = False
        flow.metadata["make_monochrome"] = True
        # We can't parse a partial J2C. This is probably a little inefficient since we're
        # liable to have multiple in-flight requests for different ranges of a file at any
        # given time, and we'll have to re-encode multiple times. Meh.
        flow.request.headers.pop("Range", None)

    def handle_http_response(self, session_manager: SessionManager, flow: HippoHTTPFlow):
        if not flow.metadata.get("make_monochrome"):
            return

        if flow.response.status_code not in (200, 206):
            return

        # Don't send the callback to the proxy immediately, our queue consumer is
        # messing with the image data and will send the callback itself.
        flow.take()
        # Put the serialized HTTP flow on the queue for one of the consumers to pick up
        self.image_resp_queue.put(flow.get_state())
        return True


def _process_image_queue(
        shutdown_signal: multiprocessing.Event,
        image_resp_queue: multiprocessing.Queue,
        to_proxy_queue: multiprocessing.Queue,
):
    watcher = ParentProcessWatcher(shutdown_signal)
    while not watcher.check_shutdown_needed():
        try:
            flow_state = image_resp_queue.get(False)
        except queue.Empty:
            # OK to sleep since we're in our own process
            time.sleep(0.01)
            continue

        # Use HTTPFlow since we don't have a session manager and don't need
        # to understand any Hippolyzer-specific fields
        flow: HTTPFlow = HTTPFlow.from_state(flow_state)
        try:
            flow.response.content = _make_jp2_monochrome(flow.response.content)
            # Not a range anymore, update the headers and status.
            flow.response.headers.pop("Content-Range", None)
            flow.response.status_code = 200
        except Exception:
            # Just log the exception and pass the image through unmodified
            traceback.print_exc()

        # Put our modified response directly on the mitmproxy response queue,
        # no point sending it back to the addon.
        to_proxy_queue.put(("callback", flow.id, flow.get_state()))


def _make_jp2_monochrome(jp2_data: bytes) -> bytes:
    j = BufferedJp2k(jp2_data)
    # Fewer than three components? Probably monochrome already.
    if len(j.shape) < 3 or j.shape[2] < 3:
        return jp2_data

    # Downscale if it'll be a huge image, compression is slow.
    if any(c >= 1024 for c in j.shape[0:2]):
        pixels = j[::4, ::4]
    elif any(c >= 512 for c in j.shape[0:2]):
        pixels = j[::2, ::2]
    else:
        pixels = j[::]

    for row in pixels:
        for col in row:
            # Simple average-value monochrome conversion.
            # Don't touch any component after the third. Those
            # have alpha and baked layer related data.
            col[0:3] = np.mean(col[0:3])

    # RGB, we can convert this to a single monochrome channel since
    # we don't have to worry about messing with alpha.
    if pixels.shape[2] == 3:
        pixels = pixels[:, :, 0:1]
        # Inform glymur of the new array shape
        j.shape = (pixels.shape[0], pixels.shape[1], 1)

    j[::] = pixels
    return bytes(j)


addons = [MonochromeAddon()]
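The per-pixel loop in `_make_jp2_monochrome` computes a simple channel average while leaving any channels past the third alone. On a plain numpy array the same conversion can be done in one vectorized step; this sketch is independent of glymur and the function name is illustrative:

```python
import numpy as np


def make_monochrome(pixels: np.ndarray) -> np.ndarray:
    # Average the first three channels, leave any extra channels
    # (alpha, bake-related data) untouched, mirroring the addon's loop.
    out = pixels.copy()
    mean = out[:, :, 0:3].mean(axis=2, keepdims=True)
    out[:, :, 0:3] = mean.astype(out.dtype)
    return out


rgba = np.array([[[10, 20, 30, 255]]], dtype=np.uint8)
mono = make_monochrome(rgba)
# mean of (10, 20, 30) is 20; the alpha byte is preserved
```

The addon keeps the element-wise loop because glymur's array is written back slice-by-slice, but the underlying math is just this channel mean.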
78
addon_examples/objectupdate_blame.py
Normal file
@@ -0,0 +1,78 @@
"""
ObjectUpdate blame tracker, to figure out which objects are spamming updates.

Assumes that you've received a full ObjectUpdate for everything (meaning the proxy
object tracker knows about it) and that you have received an ObjectProperties for
everything you want the name of. You can force a full ObjectUpdate for everything
by relogging with an empty object cache. Doing the "precache_objects" command
before you start tracking can help too.
"""
from typing import *

from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.objects import Object
from hippolyzer.lib.proxy.addon_utils import BaseAddon, show_message, SessionProperty
from hippolyzer.lib.proxy.commands import handle_command
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
from hippolyzer.lib.proxy.templates import PCode


class ObjectUpdateBlameAddon(BaseAddon):
    update_blame_counter: Counter[UUID] = SessionProperty(Counter)
    # Whether we're currently tallying updates for this session. Named
    # differently from the command methods so it doesn't clobber them
    # in the class namespace.
    update_blame_enabled: bool = SessionProperty(False)

    @handle_command()
    async def precache_objects(self, _session: Session, region: ProxiedRegion):
        """
        Make the proxy's object tracker request any missing objects

        Should be done before tracking update blame to make sure the proxy
        knows about any objects that are cached in the client but not by the proxy
        """
        region.objects.request_missing_objects()

    @handle_command()
    async def object_cache_miss_stats(self, _session: Session, region: ProxiedRegion):
        show_message(len(region.objects.missing_locals))

    @handle_command()
    async def track_update_blame(self, _session: Session, _region: ProxiedRegion):
        self.update_blame_enabled = True

    @handle_command()
    async def untrack_update_blame(self, _session: Session, _region: ProxiedRegion):
        self.update_blame_enabled = False

    @handle_command()
    async def clear_update_blame(self, _session: Session, _region: ProxiedRegion):
        self.update_blame_counter.clear()

    @handle_command()
    async def dump_update_blame(self, _session: Session, region: ProxiedRegion):
        print("ObjectUpdate blame:")
        for obj_id, count in self.update_blame_counter.most_common(50):
            obj = region.objects.lookup_fullid(obj_id)
            name = obj.Name if obj and obj.Name else "<Unknown>"
            print(f"{obj_id} ({name!r}): {count}")

    def handle_object_updated(self, session: Session, region: ProxiedRegion,
                              obj: Object, updated_props: Set[str]):
        if not self.update_blame_enabled:
            return
        if region != session.main_region:
            return
        # Log this as related to the parent object unless the parent is an avatar
        if obj.Parent and obj.Parent.PCode != PCode.AVATAR:
            obj = obj.Parent

        if obj.PCode != PCode.PRIMITIVE:
            return

        self.update_blame_counter[obj.FullID] += 1
        # Ask the region for the object name if we don't know it
        if obj.Name is None:
            region.objects.request_object_properties(obj)


addons = [ObjectUpdateBlameAddon()]
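The blame counter above is a plain `collections.Counter` keyed by object ID, with `most_common` providing the dump ordering. A minimal sketch of the same tallying pattern, using hypothetical object names in place of UUIDs:

```python
from collections import Counter

update_blame: Counter = Counter()

# Tally simulated ObjectUpdates against their source objects
for obj_id in ["chair", "spammy-resizer", "door", "spammy-resizer", "spammy-resizer"]:
    update_blame[obj_id] += 1

# most_common() sorts by update count, worst offender first
worst, count = update_blame.most_common(1)[0]
```

Keeping the counter in a `SessionProperty` means each viewer session gets its own tally and the counts survive hot-reloads of the addon module.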
48
addon_examples/payday.py
Normal file
@@ -0,0 +1,48 @@
"""
Do the money dance whenever someone in the sim pays you directly
"""

from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Block
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.proxy.addon_utils import send_chat, BaseAddon
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
from hippolyzer.lib.proxy.templates import MoneyTransactionType, PCode, ChatType


class PaydayAddon(BaseAddon):
    def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: ProxiedMessage):
        if message.name != "MoneyBalanceReply":
            return
        transaction_block = message["TransactionInfo"][0]
        # Check for a direct user -> user transfer
        if transaction_block["TransactionType"] != MoneyTransactionType.GIFT:
            return

        # Check the transfer was to us, not from us
        if transaction_block["DestID"] != session.agent_id:
            return
        sender = transaction_block["SourceID"]
        if sender == session.agent_id:
            return

        # Check if they're likely to be in the sim
        sender_obj = region.objects.lookup_fullid(sender)
        if not sender_obj or sender_obj.PCode != PCode.AVATAR:
            return

        amount = transaction_block['Amount']
        send_chat(
            f"Thanks for the L${amount} secondlife:///app/agent/{sender}/completename !",
            chat_type=ChatType.SHOUT,
        )
        # Do the traditional money dance.
        session.main_region.circuit.send_message(ProxiedMessage(
            "AgentAnimation",
            Block("AgentData", AgentID=session.agent_id, SessionID=session.id),
            Block("AnimationList", AnimID=UUID("928cae18-e31d-76fd-9cc9-2f55160ff818"), StartAnim=True),
        ))


addons = [PaydayAddon()]
52
addon_examples/profiler.py
Normal file
@@ -0,0 +1,52 @@
|
||||
"""
|
||||
Debug performance issues in the proxy
|
||||
/524 start_profiling
|
||||
/524 stop_profiling
|
||||
"""
|
||||
|
||||
import cProfile
|
||||
from typing import *
|
||||
|
||||
from hippolyzer.lib.proxy.addons import AddonManager
|
||||
from hippolyzer.lib.proxy.addon_utils import BaseAddon
|
||||
from hippolyzer.lib.proxy.commands import handle_command
|
||||
from hippolyzer.lib.proxy.region import ProxiedRegion
|
||||
from hippolyzer.lib.proxy.sessions import Session, SessionManager
|
||||
|
||||
|
||||
class ProfilingAddon(BaseAddon):
|
||||
def __init__(self):
|
||||
# We don't want this to surive module reloads so it can be an
|
||||
# instance attribute rather than on session_manager.addon_ctx
|
||||
self.profile: Optional[cProfile.Profile] = None
|
||||
|
||||
def handle_unload(self, session_manager: SessionManager):
|
||||
if self.profile is not None:
|
||||
self.profile.disable()
|
||||
self.profile = None
|
||||
|
||||
@handle_command()
|
||||
async def start_profiling(self, _session: Session, _region: ProxiedRegion):
|
||||
"""Start a cProfile session"""
|
||||
if self.profile is not None:
|
||||
self.profile.disable()
|
||||
self.profile = cProfile.Profile()
|
||||
self.profile.enable()
|
||||
print("Started profiling")
|
||||
|
||||
@handle_command()
|
||||
async def stop_profiling(self, _session: Session, _region: ProxiedRegion):
|
||||
"""Stop profiling and save to file"""
|
||||
if self.profile is None:
|
||||
return
|
||||
self.profile.disable()
|
||||
profile = self.profile
|
||||
self.profile = None
|
||||
|
||||
print("Finished profiling")
|
||||
profile_path = await AddonManager.UI.save_file(caption="Save Profile")
|
||||
if profile_path:
|
||||
profile.dump_stats(profile_path)
|
||||
|
||||
|
||||
addons = [ProfilingAddon()]
|
||||
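The same cProfile start/stop/dump flow that the addon wires to commands can be exercised standalone. A minimal sketch (the `busy_work` workload is made up for illustration; the addon writes stats to a file with `dump_stats()`, while this sketch inspects the live `Profile` object via `pstats`):

```python
import cProfile
import io
import pstats


def busy_work(n: int) -> int:
    # Placeholder workload standing in for proxy message handling
    return sum(i * i for i in range(n))


profile = cProfile.Profile()
profile.enable()       # what start_profiling does
busy_work(100_000)
profile.disable()      # what stop_profiling does

# pstats can read the live Profile object; the addon instead calls
# profile.dump_stats(path) and the file can be loaded later the same way.
stream = io.StringIO()
stats = pstats.Stats(profile, stream=stream)
stats.sort_stats("cumulative").print_stats(5)
```

After running, `stream.getvalue()` contains the top entries by cumulative time, with `busy_work` among them.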
54
addon_examples/properties.py
Normal file
@@ -0,0 +1,54 @@
"""
Demonstrates how addon state can be tied to sessions and survive
across reloads using the GlobalProperty and SessionProperty ClassVars
"""

from hippolyzer.lib.proxy.commands import handle_command, Parameter
from hippolyzer.lib.proxy.addon_utils import BaseAddon, SessionProperty, GlobalProperty, send_chat
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session


class PropertyHelloWorldAddon(BaseAddon):
    # How to say hello. The value is shared across sessions and will be the
    # same regardless of which session is active when it's accessed.
    # "hello_greeting" is added to session_manager.addon_ctx's dict and will
    # survive reloads. Be mindful of conflicting with other addons' variables:
    # there isn't currently any namespacing, so other addons' variables can
    # be accessed manually through the session_manager.addon_ctx dict.
    hello_greeting: str = GlobalProperty(default="Hello")
    # Who to say hello to. The value is specific to each session and will
    # differ depending on which session is having its event handled when
    # the property is accessed.
    # "hello_person" is added to session.addon_ctx's dict and will survive reloads.
    hello_person: str = SessionProperty(default="World")

    def __init__(self):
        # Tied to the addon instance.
        # Shared across sessions and will die if the addon is reloaded.
        self.hello_punctuation = "!"

    @handle_command(greeting=Parameter(str, sep=None))
    async def set_hello_greeting(self, _session: Session, _region: ProxiedRegion, greeting: str):
        """Set the greeting to use when saying hello"""
        self.hello_greeting = greeting

    @handle_command(person=Parameter(str, sep=None))
    async def set_hello_person(self, _session: Session, _region: ProxiedRegion, person: str):
        """Set the person to say hello to"""
        self.hello_person = person

    @handle_command(punctuation=Parameter(str, sep=None))
    async def set_hello_punctuation(self, _session: Session, _region: ProxiedRegion, punctuation: str):
        """Set the punctuation to use when saying hello"""
        self.hello_punctuation = punctuation

    @handle_command()
    async def say_hello(self, _session: Session, _region: ProxiedRegion):
        """Say hello using the configured hello variables"""
        # These aren't instance properties; they can be accessed via the class as well.
        hello_person = PropertyHelloWorldAddon.hello_person
        send_chat(f"{self.hello_greeting} {hello_person}{self.hello_punctuation}")


addons = [PropertyHelloWorldAddon()]
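Conceptually, GlobalProperty and SessionProperty are descriptors backed by an external context dict, which is why their values survive module reloads while plain instance attributes like `hello_punctuation` do not. A simplified, hypothetical sketch of that mechanism (`CtxProperty` and `DemoAddon` are illustrative names, not Hippolyzer's actual implementation):

```python
class CtxProperty:
    """Descriptor that stores values in an external dict rather than on
    the instance, so state outlives reloads of the declaring class."""

    def __init__(self, default):
        self.default = default

    def __set_name__(self, owner, name):
        self.name = name

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        return obj.ctx.setdefault(self.name, self.default)

    def __set__(self, obj, value):
        obj.ctx[self.name] = value


class DemoAddon:
    greeting = CtxProperty("Hello")  # analogous to GlobalProperty

    def __init__(self, ctx):
        self.ctx = ctx  # stands in for session_manager.addon_ctx


shared_ctx = {}
addon = DemoAddon(shared_ctx)
assert addon.greeting == "Hello"  # default until first write
addon.greeting = "Howdy"

# A "reloaded" addon instance sharing the same ctx still sees the value
assert DemoAddon(shared_ctx).greeting == "Howdy"
```

Swapping the backing dict per session would give SessionProperty-style behavior under the same descriptor protocol.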
41
addon_examples/repl.py
Normal file
@@ -0,0 +1,41 @@
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session


class REPLExampleAddon(BaseAddon):
    def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: ProxiedMessage):
        if message.name == "ChatFromViewer":
            chat_msg = message["ChatData"]["Message"]
            if not chat_msg:
                return
            # Intercept chat messages containing "hippolyzer_test" as an example
            if "hippolyzer_test" in chat_msg:
                if AddonManager.have_active_repl():
                    # Already intercepting, don't touch it
                    return
                # Take ownership of the message so it won't be sent by the
                # usual machinery.
                _new_msg = message.take()
                # The repl will have access to `_new_msg` and can send it with
                # `region.circuit.send_message()` after it's modified.
                AddonManager.spawn_repl()
                return True
            if "hippolyzer_async_test" in chat_msg:
                if AddonManager.have_active_repl():
                    # Already intercepting, don't touch it
                    return

                async def _coro():
                    foo = 4
                    # spawn_repl() can be `await`ed; changing foo
                    # in the repl will change what's printed on exit.
                    await AddonManager.spawn_repl()
                    print("foo is", foo)

                self._schedule_task(_coro())


addons = [REPLExampleAddon()]
128
addon_examples/serialization_sanity_checker.py
Normal file
@@ -0,0 +1,128 @@
"""
Validates that serialize(deserialize(packet)) == packet for any packet
that passes through the proxy. Useful for ensuring that serializers don't
change the meaning of a message, and that all of the viewer's quirks are
faithfully reproduced.
"""

import copy
import itertools
import logging
from typing import *

from hippolyzer.lib.base.message.msgtypes import PacketLayout
from hippolyzer.lib.base import serialization as se
from hippolyzer.lib.base.message.udpdeserializer import UDPMessageDeserializer
from hippolyzer.lib.base.message.udpserializer import UDPMessageSerializer
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.proxy.packets import ProxiedUDPPacket
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import SessionManager, Session

LOG = logging.getLogger(__name__)


class SerializationSanityChecker(BaseAddon):
    def __init__(self):
        self.serializer = UDPMessageSerializer()
        self.deserializer = UDPMessageDeserializer()

    def handle_proxied_packet(self, session_manager: SessionManager, packet: ProxiedUDPPacket,
                              session: Optional[Session], region: Optional[ProxiedRegion],
                              message: Optional[ProxiedMessage]):
        # Well, this doesn't even parse as a message; can't do anything about it.
        if message is None:
            LOG.error(f"Received unparseable message from {packet.src_addr!r}: {packet.data!r}")
            return
        try:
            message.ensure_parsed()
        except:
            LOG.exception(f"Exception during {message.name} message validation pre-parsing")
            return

        try:
            # We already know the message won't match if the serializers don't roundtrip.
            if message and self._roundtrip_var_serializers(message):
                ser = self.serializer.serialize(message)
                # LL's ObjectUpdate specifically randomly uses inefficient zero-coding
                # which is hard to reproduce. It means the same thing when decompressed,
                # so just expand both and compare. Technically this incorrectly expands the
                # acks too, but shouldn't matter because they should be the same in both.
                if message.name == "ObjectUpdate" and message.zerocoded:
                    orig_body = self.deserializer.zero_code_expand(packet.data[PacketLayout.PHL_NAME:])
                    ser_body = self.deserializer.zero_code_expand(ser[PacketLayout.PHL_NAME:])
                    matches = orig_body == ser_body
                else:
                    matches = packet.data == ser

                if not matches:
                    direction = "Out" if packet.outgoing else "In"
                    LOG.error("%s: %d %s\n%r != %r" %
                              (direction, message.packet_id, message.name, packet.data, ser))
        except:
            LOG.exception(f"Exception during message validation:\n{message!r}")

    def _roundtrip_var_serializers(self, message: ProxiedMessage):
        for block in itertools.chain(*message.blocks.values()):
            for var_name in block.vars.keys():
                orig_val = block[var_name]
                try:
                    orig_serializer = block.get_serializer(var_name)
                except KeyError:
                    # Don't have a serializer, on to the next field
                    continue
                # Need to copy the serializer since we're going to replace a member function
                serializer: se.BaseSubfieldSerializer = copy.copy(orig_serializer)

                # Keep track of what got serialized at what position
                member_positions = []

                def _serialize_template(template, val):
                    writer = se.MemberTrackingBufferWriter(serializer.ENDIANNESS)
                    writer.write(template, val)
                    member_positions.clear()
                    member_positions.extend(writer.member_positions)
                    return writer.copy_buffer()

                serializer._serialize_template = _serialize_template
                try:
                    deser = serializer.deserialize(block, orig_val)
                except:
                    LOG.error(f"Exploded in deserializer for {message.name}.{block.name}.{var_name}")
                    raise

                # For now we consider returning UNSERIALIZABLE to be acceptable.
                # We should probably consider raising instead of returning that.
                if deser is se.UNSERIALIZABLE:
                    continue

                try:
                    new_val = serializer.serialize(block, deser)
                except:
                    LOG.error(f"Exploded in serializer for {message.name}.{block.name}.{var_name}")
                    raise

                if orig_val != new_val:
                    # OpenSim will put an extra NUL at the end of TEs with material fields,
                    # whereas the viewer and SL just use EOF rather than an explicit NUL to
                    # signal the end of the exception cases for the last field in a TE.
                    # OpenSim's behaviour isn't incorrect, but we're not going to reproduce it.
                    if var_name == "TextureEntry" and orig_val[:-1] == new_val and orig_val[-1] == 0:
                        continue
                    LOG.error("%d %s.%s.%s\n%r != %r" %
                              (message.packet_id, message.name, block.name, var_name, orig_val, new_val))
                    # This was templated, so we can dig into which member mismatched
                    if member_positions:
                        # Find the mismatch index
                        i = 0
                        bytes_zipped = itertools.zip_longest(orig_val, new_val, fillvalue=object())
                        for i, (old_byte, new_byte) in enumerate(bytes_zipped):
                            if old_byte != new_byte:
                                break
                        LOG.error(f"Mismatch at {i}, {member_positions!r}")
                    return False
        return True


addons = [SerializationSanityChecker()]
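The mismatch-index search in `_roundtrip_var_serializers` leans on `itertools.zip_longest` with a fresh `object()` sentinel: the sentinel never compares equal to a byte, so a length difference also registers as a mismatch at the shorter buffer's end. Isolated as a helper (a sketch; `first_mismatch` is not part of Hippolyzer):

```python
import itertools


def first_mismatch(old: bytes, new: bytes) -> int:
    """Return the index of the first differing byte, or -1 if equal.

    object() as fillvalue never equals an int byte, so a buffer that is
    merely a prefix of the other still reports a mismatch at its end.
    """
    zipped = itertools.zip_longest(old, new, fillvalue=object())
    for i, (a, b) in enumerate(zipped):
        if a != b:
            return i
    return -1


print(first_mismatch(b"\x01\x02\x03", b"\x01\xff\x03"))  # → 1
print(first_mismatch(b"\x01\x02", b"\x01\x02\x03"))      # → 2
```

The addon then pairs that index against `member_positions` to name the template member that failed to roundtrip.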
34
addon_examples/shield.py
Normal file
@@ -0,0 +1,34 @@
"""Block potentially bad things"""
from hippolyzer.lib.proxy.addon_utils import BaseAddon, show_message
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.proxy.packets import Direction
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
from hippolyzer.lib.proxy.templates import IMDialogType

SUSPICIOUS_PACKETS = {"RequestXfer", "TransferRequest", "UUIDNameRequest",
                      "UUIDGroupNameRequest", "OpenCircuit"}
REGULAR_IM_DIALOGS = (IMDialogType.TYPING_START, IMDialogType.TYPING_STOP, IMDialogType.NOTHING_SPECIAL)


class ShieldAddon(BaseAddon):
    def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: ProxiedMessage):
        if message.direction != Direction.IN:
            return
        if message.name in SUSPICIOUS_PACKETS:
            show_message(f"Blocked suspicious {message.name} packet")
            region.circuit.drop_message(message)
            return True
        if message.name == "ImprovedInstantMessage":
            msg_block = message["MessageBlock"][0]
            if msg_block["Dialog"] not in REGULAR_IM_DIALOGS:
                return
            from_agent = message["AgentData"]["AgentID"]
            if from_agent == session.agent_id:
                expected_id = from_agent
            else:
                expected_id = from_agent ^ session.agent_id
            msg_block["ID"] = expected_id


addons = [ShieldAddon()]
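The `from_agent ^ session.agent_id` line above relies on the convention that a one-to-one IM session ID is the XOR of the two participants' agent UUIDs, which makes it symmetric and self-inverting. Hippolyzer's UUID type overloads `^` directly; with the stdlib `uuid` module the same computation looks like this (a sketch; `im_session_id` is an illustrative helper, not a Hippolyzer API):

```python
import uuid


def im_session_id(agent_a: uuid.UUID, agent_b: uuid.UUID) -> uuid.UUID:
    # One-to-one IM session IDs are conventionally agent_a XOR agent_b
    return uuid.UUID(int=agent_a.int ^ agent_b.int)


a = uuid.UUID("11111111-1111-1111-1111-111111111111")
b = uuid.UUID("22222222-2222-2222-2222-222222222222")
sid = im_session_id(a, b)
```

Because XOR is symmetric, either party computes the same session ID, and XORing the session ID with one agent's UUID recovers the other agent's UUID — which is exactly what the shield exploits to rewrite a forged `ID` to its expected value.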
28
addon_examples/spongecase.py
Normal file
@@ -0,0 +1,28 @@
import itertools

from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session


def _to_spongecase(val):
    # Give alternating casing for each character
    spongecased = itertools.zip_longest(val[::2].upper(), val[1::2].lower(), fillvalue="")
    # Join them back together
    return "".join(itertools.chain(*spongecased))


def handle_lludp_message(session: Session, _region: ProxiedRegion, message: ProxiedMessage):
    ctx = session.addon_ctx
    ctx.setdefault("spongecase", False)
    if message.name == "ChatFromViewer":
        chat = message["ChatData"]["Message"]
        if chat == "spongeon":
            ctx["spongecase"] = True
        elif chat == "spongeoff":
            ctx["spongecase"] = False

        if ctx["spongecase"]:
            if not chat or message["ChatData"]["Channel"] != 0:
                return
            message["ChatData"]["Message"] = _to_spongecase(chat)
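The slicing-and-interleaving trick in `_to_spongecase` (upper-case the even-index slice, lower-case the odd-index slice, then zip them back together) can be checked in isolation:

```python
import itertools


def to_spongecase(val: str) -> str:
    # val[::2] is every even-index character, val[1::2] every odd-index one;
    # zip_longest with fillvalue="" handles odd-length strings cleanly
    pairs = itertools.zip_longest(val[::2].upper(), val[1::2].lower(), fillvalue="")
    return "".join(itertools.chain(*pairs))


print(to_spongecase("hello world"))  # → HeLlO WoRlD
```

Note the space at index 5 is simply passed through unchanged by `upper()`/`lower()`, so punctuation and whitespace survive intact.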
50
addon_examples/task_example.py
Normal file
@@ -0,0 +1,50 @@
import asyncio
import datetime as dt
from typing import *

from hippolyzer.lib.proxy.addon_utils import send_chat, BaseAddon, SessionProperty
from hippolyzer.lib.proxy.commands import handle_command
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session


class TaskExampleAddon(BaseAddon):
    chat_loop_task: Optional[asyncio.Task] = SessionProperty(None)
    chat_loop_count: Optional[int] = SessionProperty(None)
    chat_loop_started: Optional[dt.datetime] = SessionProperty(None)

    @handle_command()
    async def start_chat_task(self, session: Session, _region: ProxiedRegion):
        """Start a task that sends chat in a loop, demonstrating task scheduling"""
        # Already doing a chat loop
        if self.chat_loop_task and not self.chat_loop_task.done():
            return
        self.chat_loop_started = dt.datetime.now()
        self.chat_loop_count = 0
        # Don't need to clean this up on session shutdown because _schedule_task()
        # binds tasks to session lifetime by default
        self.chat_loop_task = self._schedule_task(self._chat_loop(session))

    @handle_command()
    async def stop_chat_task(self, _session: Session, _region: ProxiedRegion):
        """Stop the chat task if one was active"""
        if self.chat_loop_task and not self.chat_loop_task.done():
            self.chat_loop_task.cancel()
        self.chat_loop_task = None

    async def _chat_loop(self, session: Session, sleep_time=5.0):
        while True:
            region = session.main_region
            if not region:
                await asyncio.sleep(sleep_time)
                continue
            send_chat(
                f"Loop {self.chat_loop_count}, started "
                f"{dt.datetime.now() - self.chat_loop_started} ago",
                session=session
            )
            self.chat_loop_count += 1
            await asyncio.sleep(sleep_time)


addons = [TaskExampleAddon()]
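The start/cancel lifecycle above maps onto plain asyncio primitives; a self-contained sketch of the same pattern, with `asyncio.create_task` standing in for Hippolyzer's `_schedule_task` and a shared counter standing in for the chat output (names here are illustrative):

```python
import asyncio


async def chat_loop(counter: list, sleep_time: float = 0.01) -> None:
    # Stand-in for the addon's _chat_loop: do one unit of work, sleep, repeat
    while True:
        counter[0] += 1
        await asyncio.sleep(sleep_time)


async def main() -> int:
    counter = [0]
    task = asyncio.create_task(chat_loop(counter))  # start_chat_task
    await asyncio.sleep(0.05)                       # let the loop run a bit
    task.cancel()                                   # stop_chat_task
    try:
        await task
    except asyncio.CancelledError:
        # Cancellation surfaces here; the addon's version is cleaned up
        # automatically when the owning session ends
        pass
    return counter[0]


iterations = asyncio.run(main())
```

The `task.done()` guard in the addon prevents double-starting; `cancel()` raises `CancelledError` inside the loop at its next `await`, which is why awaiting the cancelled task must tolerate that exception.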
71
addon_examples/transfer_example.py
Normal file
@@ -0,0 +1,71 @@
"""
Example of how to request a Transfer
"""
from typing import *

from hippolyzer.lib.base.legacy_inv import InventoryModel, InventoryItem
from hippolyzer.lib.base.message.message import Block
from hippolyzer.lib.proxy.addon_utils import BaseAddon, show_message
from hippolyzer.lib.proxy.commands import handle_command
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
from hippolyzer.lib.proxy.templates import (
    AssetType,
    EstateAssetType,
    TransferRequestParamsSimEstate,
    TransferRequestParamsSimInvItem,
    TransferSourceType,
    XferFilePath,
)


class TransferExampleAddon(BaseAddon):
    @handle_command()
    async def get_covenant(self, _session: Session, region: ProxiedRegion):
        """Get the current region's covenant"""
        transfer = await region.transfer_manager.request(
            source_type=TransferSourceType.SIM_ESTATE,
            params=TransferRequestParamsSimEstate(
                EstateAssetType=EstateAssetType.COVENANT,
            ),
        )
        show_message(transfer.reassemble_chunks().decode("utf8"))

    @handle_command()
    async def get_first_script(self, session: Session, region: ProxiedRegion):
        """Get the contents of the first script in the selected object"""
        # Ask for the object inventory so we can find a script
        region.circuit.send_message(ProxiedMessage(
            'RequestTaskInventory',
            Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
            Block('InventoryData', LocalID=session.selected.object_local),
        ))
        inv_message = await region.message_handler.wait_for('ReplyTaskInventory', timeout=5.0)

        # Xfer the inventory file and look for a script
        xfer = await region.xfer_manager.request(
            file_name=inv_message["InventoryData"]["Filename"], file_path=XferFilePath.CACHE)
        inv_model = InventoryModel.from_bytes(xfer.reassemble_chunks())
        first_script: Optional[InventoryItem] = None
        for item in inv_model.items.values():
            if item.type == "lsltext":
                first_script = item
                break
        if not first_script:
            show_message("No scripts in object?")
            return

        # Ask for the actual script contents
        transfer = await region.transfer_manager.request(
            source_type=TransferSourceType.SIM_INV_ITEM,
            params=TransferRequestParamsSimInvItem(
                OwnerID=first_script.permissions.owner_id,
                TaskID=inv_model.root.node_id,
                ItemID=first_script.item_id,
                AssetType=AssetType.LSL_TEXT,
            ),
        )
        show_message(transfer.reassemble_chunks().decode("utf8"))


addons = [TransferExampleAddon()]
109
addon_examples/uploader.py
Normal file
@@ -0,0 +1,109 @@
"""
Example of how to upload assets; assumes assets are already encoded
in the appropriate format.

/524 upload_asset <asset type>
"""
import pprint
from pathlib import Path
from typing import *

import aiohttp

from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Block
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.addon_utils import ais_item_to_inventory_data, show_message, BaseAddon
from hippolyzer.lib.proxy.commands import handle_command, Parameter
from hippolyzer.lib.proxy.packets import Direction
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
from hippolyzer.lib.proxy.templates import AssetType


class UploaderAddon(BaseAddon):
    @handle_command(
        asset_type=Parameter(lambda x: AssetType[x.upper()]),
        flags=Parameter(int, optional=True),
    )
    async def upload_asset(self, _session: Session, region: ProxiedRegion,
                           asset_type: AssetType, flags: Optional[int] = None):
        """Upload a raw asset with optional flags"""
        inv_type = asset_type.inventory_type
        file = await AddonManager.UI.open_file()
        if not file:
            return
        file = Path(file)
        if not file.exists():
            show_message(f"{file} does not exist")
            return
        name = file.stem

        with open(file, "rb") as f:
            file_body = f.read()

        params = {
            "asset_type": asset_type.human_name,
            "description": "(No Description)",
            "everyone_mask": 0,
            "group_mask": 0,
            "folder_id": UUID(),  # Puts it in the default folder, I guess. Undocumented.
            "inventory_type": inv_type.human_name,
            "name": name,
            "next_owner_mask": 581632,
        }
        if flags is not None:
            params['flags'] = flags

        caps = region.caps_client
        async with aiohttp.ClientSession() as sess:
            async with caps.post('NewFileAgentInventory', llsd=params, session=sess) as resp:
                parsed = await resp.read_llsd()
            if "uploader" not in parsed:
                show_message(f"Upload error!: {parsed!r}")
                return
            print("Got upload URL, uploading...")

            async with caps.post(parsed["uploader"], data=file_body, session=sess) as resp:
                upload_parsed = await resp.read_llsd()

        if "new_inventory_item" not in upload_parsed:
            show_message(f"Got weird upload resp: {pprint.pformat(upload_parsed)}")
            return

        await self._force_inv_update(region, upload_parsed['new_inventory_item'])

    @handle_command(item_id=UUID)
    async def force_inv_update(self, _session: Session, region: ProxiedRegion, item_id: UUID):
        """Force an inventory update for a given item id"""
        await self._force_inv_update(region, item_id)

    async def _force_inv_update(self, region: ProxiedRegion, item_id: UUID):
        session = region.session()
        ais_req_data = {
            "items": [
                {
                    "owner_id": session.agent_id,
                    "item_id": item_id,
                }
            ]
        }
        async with region.caps_client.post('FetchInventory2', llsd=ais_req_data) as resp:
            ais_item = (await resp.read_llsd())["items"][0]

        message = ProxiedMessage(
            "UpdateCreateInventoryItem",
            Block(
                "AgentData",
                AgentID=session.agent_id,
                SimApproved=1,
                TransactionID=UUID.random(),
            ),
            ais_item_to_inventory_data(ais_item),
            direction=Direction.IN,
        )
        region.circuit.send_message(message)


addons = [UploaderAddon()]
64
addon_examples/xfer_example.py
Normal file
@@ -0,0 +1,64 @@
"""
Example of how to request an Xfer
"""
from hippolyzer.lib.base.legacy_inv import InventoryModel
from hippolyzer.lib.base.message.message import Block
from hippolyzer.lib.proxy.addon_utils import BaseAddon, show_message
from hippolyzer.lib.proxy.commands import handle_command
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
from hippolyzer.lib.proxy.templates import XferFilePath


class XferExampleAddon(BaseAddon):
    @handle_command()
    async def get_mute_list(self, session: Session, region: ProxiedRegion):
        """Fetch the current user's mute list"""
        region.circuit.send_message(ProxiedMessage(
            'MuteListRequest',
            Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
            Block("MuteData", MuteCRC=0),
        ))

        # Wait for any MuteListUpdate, dropping it before it reaches the viewer
        update_msg = await region.message_handler.wait_for('MuteListUpdate', timeout=5.0)
        mute_file_name = update_msg["MuteData"]["Filename"]
        if not mute_file_name:
            show_message("Nobody muted?")
            return

        xfer = await region.xfer_manager.request(
            file_name=mute_file_name, file_path=XferFilePath.CACHE)
        show_message(xfer.reassemble_chunks().decode("utf8"))

    @handle_command()
    async def get_task_inventory(self, session: Session, region: ProxiedRegion):
        """Get the inventory of the currently selected object"""
        region.circuit.send_message(ProxiedMessage(
            'RequestTaskInventory',
            # If no session is passed in we'll use the active session when the coro was created
            Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
            Block('InventoryData', LocalID=session.selected.object_local),
        ))

        inv_message = await region.message_handler.wait_for('ReplyTaskInventory', timeout=5.0)

        # The Xfer doesn't need to be immediately awaited; multiple signals can be waited on.
        xfer = region.xfer_manager.request(
            file_name=inv_message["InventoryData"]["Filename"], file_path=XferFilePath.CACHE)

        # Wait until we have the first packet so we can tell the expected length.
        # The difference in time is obvious for large inventories, and we can cancel
        # mid-request if we want.
        show_message(f"Inventory is {await xfer.size_known} bytes")

        # Wait for the rest of the body to be done
        await xfer

        inv_model = InventoryModel.from_bytes(xfer.reassemble_chunks())
        item_names = [item.name for item in inv_model.items.values()]
        show_message(item_names)


addons = [XferExampleAddon()]
@@ -1,17 +0,0 @@
This checkout contains the most recently compiled version of the documentation in docs/html/.

To rebuild the sphinx doc set:

Get sphinx!!!

Either use your virtualenv, or your native python install, and run:
easy_install -U Sphinx

Then, from the docs dir:

1. python refresh.py

refresh.py stages the sphinx .rst files, and then runs 'sphinx-build -a -c source/configure/ source/ html/'

The docs/html/ directory will contain the fully compiled documentation set.
Please check in updated docs if you add functionality.
@@ -1,4 +0,0 @@
# Sphinx build info version 1
# This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done.
config:
tags:
Binary file not shown.
Some files were not shown because too many files have changed in this diff