157 Commits

Author SHA1 Message Date
Salad Dais
220a02543e v0.11.1 2022-07-20 20:38:17 +00:00
Salad Dais
8ac47c2397 Fix use of dynamically imported globals in REPL 2022-07-20 20:30:41 +00:00
Salad Dais
d384978322 UpdateType -> ObjectUpdateType 2022-07-20 20:26:50 +00:00
Salad Dais
f02a479834 Add get_task_inventory_cap.py addon example
An example of mocking out actually useful behavior for the viewer.
Better (faster!) task inventory fetching API.
2022-07-20 09:20:27 +00:00
Salad Dais
b5e8b36173 Add more enum and flag defs to templates.py 2022-07-20 06:35:04 +00:00
Salad Dais
08a39f4df7 Make object update handling more robust 2022-07-20 06:35:04 +00:00
Salad Dais
61ec51beec Add demo autoattacher addon example 2022-07-19 23:48:40 +00:00
Salad Dais
9adbdcdcc8 Add a couple more flag definitions to templates.py 2022-07-19 09:49:43 +00:00
Salad Dais
e7b05f72ca Dequantize TimeDilation message var 2022-07-19 05:57:19 +00:00
Salad Dais
75f2f363a4 Handle TE glow field quantization 2022-07-18 22:29:37 +00:00
Salad Dais
cc1bb9ac1d Give MediaFlags and BasicMaterials sensible default values 2022-07-18 22:08:06 +00:00
Salad Dais
d498d1f2c8 v0.11.0 2022-07-18 08:53:24 +00:00
Salad Dais
8c0635bb2a Add classmethod for rebuilding TEs into a TECollection 2022-07-18 06:37:20 +00:00
Salad Dais
309dbeeb52 Add TextureEntry.st_to_uv() to convert between coords 2022-07-18 00:34:56 +00:00
Salad Dais
4cc87bf81e Add a default value for TextureEntryCollection.realize() num_faces 2022-07-17 01:09:22 +00:00
Salad Dais
f34bb42dcb TextureEntry -> TextureEntryCollection, improve .realize()
The "TextureEntry" name from the message template is kind of a
misnomer, the field actually includes multiple TextureEntries.
2022-07-17 00:45:20 +00:00
Salad Dais
59ec99809a Correct TE rotation quantization
Literally everything has its own special float quantization. Argh.
2022-07-16 23:17:34 +00:00
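The quantization fixes above all follow the same fixed-point pattern, just with a different value range per field. A minimal sketch of U16 (de)quantization, assuming a [-π, π] range for rotation purely for illustration (this is not the proxy's actual serializer):

```python
import math

def quantize_u16(val: float, lower: float, upper: float) -> int:
    """Pack a float in [lower, upper] into an unsigned 16-bit integer."""
    val = min(max(val, lower), upper)
    return round((val - lower) / (upper - lower) * 0xFFFF)

def dequantize_u16(packed: int, lower: float, upper: float) -> float:
    """Recover the approximate float from its 16-bit representation."""
    return lower + (packed / 0xFFFF) * (upper - lower)

# Assumed range for illustration: a full revolution in radians.
packed = quantize_u16(math.pi / 2, -math.pi, math.pi)
assert abs(dequantize_u16(packed, -math.pi, math.pi) - math.pi / 2) < 1e-4
```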
Salad Dais
4b963f96d2 Add TextureEntry.realize() to ease indexing into specific faces 2022-07-14 03:10:11 +00:00
Salad Dais
58db8f66de Correct type signatures for TextureEntry 2021-07-10 17:58:13 +00:00
Salad Dais
95623eba58 More InventoryModel fixes 2022-07-10 01:55:34 +00:00
Salad Dais
8dba0617bd Make injecting inventory EQ events easier 2022-07-09 04:21:44 +00:00
Salad Dais
289073be8e Add InventoryModel diffing 2022-07-09 02:48:23 +00:00
Salad Dais
f3c8015366 Support mutable InventoryModels 2022-07-08 22:06:14 +00:00
Salad Dais
99e8118458 Support HIPPO XML directives in injected EQ events 2022-07-05 14:24:35 +00:00
Salad Dais
80745cfd1c Add TextureEntry.unwrap() to ease working with potentially lazy TEs 2022-07-05 03:08:52 +00:00
Salad Dais
92a06bccaf Dequantize OffsetS and OffsetT in TextureEntrys 2022-07-05 02:08:53 +00:00
Salad Dais
fde9ddf4d9 Initial work to support in-flight EQ response pre-emption 2022-07-04 17:57:05 +00:00
Salad Dais
03a56c9982 Auto-load certain symbols in REPL, add docs for REPL 2022-06-27 01:49:27 +00:00
Salad Dais
d07a0df0fd WIP LLMesh -> Collada
First half of the LLMesh -> Collada -> LLMesh transform for #24
2022-06-24 13:15:20 +00:00
Salad Dais
848397fe63 Fix windows build workflow 2022-06-24 07:36:51 +00:00
Salad Dais
0f9246c5c6 Use github.ref_name instead of github.ref 2022-06-24 02:32:50 +00:00
Salad Dais
2e7f887970 v0.10.0 2022-06-24 01:54:37 +00:00
Salad Dais
ef9df6b058 Update Windows bundling action to add artifact to release 2022-06-24 01:12:21 +00:00
Salad Dais
baae0f6d6e Fix TupleCoord negation 2022-06-21 07:15:49 +00:00
Salad Dais
0f369b682d Upgrade to mitmproxy 8.0
Not 8.1 since that drops Python 3.8 support. Closes #26
2022-06-20 15:15:57 +00:00
Salad Dais
1f1e4de254 Add addon for testing object manager conformance against viewer
Closes #18
2022-06-20 12:38:11 +00:00
Salad Dais
75ddc0a5ba Be smarter about object cache miss autorequests 2022-06-20 12:33:12 +00:00
Salad Dais
e4cb168138 Clear up last few event loop warnings 2022-06-20 12:31:08 +00:00
Salad Dais
63aebba754 Clear up some event loop deprecation warnings 2022-06-20 05:55:01 +00:00
Salad Dais
8cf1a43d59 Better defaults when parsing ObjectUpdateCompressed
This helps our view of the cache better match the viewer's VOCache
2022-06-20 03:23:46 +00:00
Salad Dais
bbc8813b61 Add unary minus for TupleCoords 2022-06-19 04:33:20 +00:00
Salad Dais
5b51dbd30f Add workaround instructions for most recent Firestorm release
Closes #25
2022-05-13 23:52:50 +00:00
Salad Dais
295c7972e7 Use windows-2019 runner instead of windows-latest
windows-latest has some weird ACL changes that cause the cx_Freeze
packaging steps to fail.
2022-05-13 23:39:37 +00:00
Salad Dais
b034661c38 Revert "Temporarily stop generating lib_licenses.txt automatically"
This reverts commit f12fd95ee1.
2022-05-13 23:39:09 +00:00
Salad Dais
f12fd95ee1 Temporarily stop generating lib_licenses.txt automatically
Something is busted with pip-licenses in CI. Not sure why, but
it's only needed for Windows builds anyway.
2022-03-12 19:15:59 +00:00
Salad Dais
bc33313fc7 v0.9.0 2022-03-12 18:40:38 +00:00
Salad Dais
affc7fcf89 Clarify comment in proxy object manager 2022-03-05 11:03:28 +00:00
Salad Dais
b8f1593a2c Allow filtering on HTTP status code 2022-03-05 10:50:09 +00:00
Salad Dais
7879f4e118 Split up mitmproxy integration test a bit 2022-03-05 10:49:55 +00:00
Salad Dais
4ba611ae01 Only apply local mesh to selected links 2022-02-28 07:32:46 +00:00
Salad Dais
82ff6d9c64 Add more TeleportFlags 2022-02-28 07:32:22 +00:00
Salad Dais
f603ea6186 Better handle timeouts that have missing cap_data metadata 2021-12-18 20:43:10 +00:00
Salad Dais
fcf6a4568b Better handling for proxied HTTP requests that timeout 2021-12-17 19:27:20 +00:00
Salad Dais
2ad6cc1b51 Better handle broken 'LLSD' responses 2021-12-17 00:18:51 +00:00
Salad Dais
025f7d31f2 Make sure .queued is cleared if message take()n twice 2021-12-15 20:17:54 +00:00
Salad Dais
9fdb281e4a Create example addon for simulating packet loss 2021-12-13 06:12:43 +00:00
Salad Dais
11e28bde2a Allow filtering message log on HTTP headers 2021-12-11 15:08:45 +00:00
Salad Dais
1faa6f977c Update docs on send() and send_reliable() 2021-12-10 13:41:20 +00:00
Salad Dais
6866e7397f Clean up cap registration API 2021-12-10 13:22:54 +00:00
Salad Dais
fa0b3a5340 Mark all Messages synthetic unless they came off the wire 2021-12-10 07:30:02 +00:00
Salad Dais
16c808bce8 Match viewer resend behaviour 2021-12-10 07:04:36 +00:00
Salad Dais
ec4b2d0770 Move last of the explicit direction params 2021-12-10 06:50:07 +00:00
Salad Dais
3b610fdfd1 Add awaitable send_reliable() 2021-12-09 05:30:35 +00:00
Salad Dais
8b93c5eefa Rename send_message() to send() 2021-12-09 05:30:12 +00:00
Salad Dais
f4bb9eae8f Fix __contains__ for JankStringyBytes 2021-12-09 03:48:29 +00:00
Salad Dais
ecb14197cf Make message log filter highlight every matched field
Previously only the first match was being highlighted.
2021-12-09 01:14:09 +00:00
Salad Dais
95fd58e25a Begin PySide6 cleanup 2021-12-09 00:02:48 +00:00
Salad Dais
afc333ab49 Improve highlighting of matched fields in message log 2021-12-08 23:50:16 +00:00
Salad Dais
eb6406bca4 Fix ACK collection logic for injected reliable messages 2021-12-08 22:29:29 +00:00
Salad Dais
d486aa130d Add support for specifying flags in message builder 2021-12-08 21:10:06 +00:00
Salad Dais
d66d5226a2 Initial implementation of reliable injected packets
See #17. Not yet tested for real.
2021-12-08 04:49:45 +00:00
Salad Dais
d86da70eeb v0.8.0 2021-12-07 07:16:25 +00:00
Salad Dais
aa0b4b63a9 Update cx_freeze script to handle PySide6 2021-12-07 07:16:25 +00:00
Salad Dais
5f479e46b4 Automatically offer to install the HTTPS certs on first run 2021-12-07 07:16:25 +00:00
Salad Dais
1e55d5a9d8 Continue handling HTTP flows if flow logging fails
If flow beautification for display throws then we don't want
to bypass other handling of the flow.

This fixes a login failure due to SL's login XML-RPC endpoint
returning a Content-Type of "application/llsd+xml\r\n" when it's
actually "application/xml".
2021-12-06 17:01:13 +00:00
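The fix described in that commit boils down to a guard around the display-side logging, so a beautification failure can't short-circuit the rest of the pipeline. A minimal sketch; the helper names here are hypothetical, not real Hippolyzer APIs:

```python
import logging

def log_beautified_flow(flow) -> None:
    # Hypothetical display-side helper: pretty-print the flow for the log
    # pane. Content-Type sniffing would happen here, and it may raise.
    print(f"beautified: {flow!r}")

def process_flow(flow) -> None:
    # Hypothetical stand-in for the actual proxy-side flow handling.
    pass

def handle_http_flow(flow) -> None:
    try:
        log_beautified_flow(flow)
    except Exception:
        # A bogus Content-Type (or any other display-only failure) must not
        # bypass real handling of the flow, e.g. login processing.
        logging.exception("Failed to beautify flow for display")
    process_flow(flow)
```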
Salad Dais
077a95b5e7 Migrate to PySide6 to support Python 3.10
Update Glymur too
2021-12-06 13:37:31 +00:00
Salad Dais
4f1399cf66 Add note about LinHippoAutoProxy 2021-12-06 12:26:16 +00:00
Salad Dais
9590b30e66 Add note about Python 3.10 support 2021-12-05 20:25:06 +00:00
Salad Dais
34f3ee4c3e Move mtime wrapper to helpers 2021-12-05 18:14:26 +00:00
Salad Dais
7d655543f5 Don't reserialize responses as pretty LLSD-XML
Certain LLSD parsers don't like the empty text nodes it adds around
the root element of the document. Yuck.
2021-12-05 18:12:53 +00:00
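For reference, the difference between the two serializations looks like this. A sketch assuming the standard llsd formatters are re-exported by hippolyzer's llsd module:

```python
from hippolyzer.lib.base import llsd

data = {"region_id": 42}
# Compact form: no whitespace around the root <llsd> element.
print(llsd.format_xml(data))
# Pretty form: indentation adds whitespace text nodes that some strict
# LLSD parsers reject. (format_pretty_xml assumed available here.)
print(llsd.format_pretty_xml(data))
```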
Salad Dais
5de3ed0d5e Add support for LLSD inventory representations 2021-12-03 05:59:58 +00:00
Salad Dais
74c3287cc0 Add base addon for creating proxy-only caps based on ASGI apps 2021-12-02 06:04:29 +00:00
Salad Dais
3a7f8072a0 Initial implementation of proxy-provided caps
Useful for mocking out a cap while developing the viewer-side
pieces of it.
2021-12-02 03:22:47 +00:00
dependabot[bot]
5fa91580eb Bump mitmproxy from 7.0.2 to 7.0.3 (#21)
Bumps [mitmproxy](https://github.com/mitmproxy/mitmproxy) from 7.0.2 to 7.0.3.
- [Release notes](https://github.com/mitmproxy/mitmproxy/releases)
- [Changelog](https://github.com/mitmproxy/mitmproxy/blob/main/CHANGELOG.md)
- [Commits](https://github.com/mitmproxy/mitmproxy/compare/v7.0.2...v7.0.3)

---
updated-dependencies:
- dependency-name: mitmproxy
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-11-30 05:30:06 -04:00
Salad Dais
d8fbb55438 Improve LLUDP integration tests 2021-11-30 09:25:31 +00:00
Salad Dais
99eb4fed74 Fix _reorient_coord to work correctly for normals again 2021-11-30 09:24:49 +00:00
Salad Dais
6b78b841df Fix range of mesh normals 2021-11-23 01:36:14 +00:00
Salad Dais
dae852db69 Fix filter dialog 2021-11-19 04:30:36 +00:00
Salad Dais
0c0de2bcbc v0.7.1 2021-09-04 07:27:20 +00:00
Salad Dais
9f2d2f2194 Pin recordclass version, use requirements.txt for windows build
recordclass had some breaking changes in 0.15
2021-09-04 07:12:45 +00:00
Salad Dais
c6e0a400a9 v0.7.0 2021-08-10 01:16:20 +00:00
Salad Dais
d01122d542 Call correct method to raise new message log window 2021-08-10 01:11:21 +00:00
Salad Dais
690d6b51b8 Upgrade to mitmproxy 7.0.2
Our fix for `Flow.set_state()` has been upstreamed
2021-08-09 22:16:23 +00:00
Salad Dais
2437a8b14f Add a framework for simple local anim creation, tail animator 2021-08-05 21:08:18 +00:00
Salad Dais
afa601fffe Support session-specific viewer cache directories 2021-08-02 18:23:13 +00:00
Salad Dais
874feff471 Fix incorrect reference to mitmproxy class 2021-08-01 12:16:10 +00:00
Salad Dais
05c53bba9f Add CapsClient to BaseClientSession 2021-08-01 06:39:04 +00:00
Salad Dais
578f1d8c4e Add setting to disable all proxy object autorequests
Will help with #18 by not changing object request behaviour when
running through the proxy.
2021-08-01 06:37:33 +00:00
Salad Dais
7d8e18440a Add local anim mangler support with example
Analogous to local mesh mangler support.
2021-07-31 11:56:17 +00:00
Salad Dais
66e112dd52 Add basic message log import / export feature
Closes #20
2021-07-30 03:13:33 +00:00
Salad Dais
02ac022ab3 Add export formats for message log entries 2021-07-30 01:06:29 +00:00
Salad Dais
33ce74754e Fix mirror_target_agent check in http hooks 2021-07-30 01:06:29 +00:00
Salad Dais
74dd6b977c Add extended to_dict() format for Message class
This will allow proper import / export of message logs.
2021-07-29 10:26:42 +00:00
Salad Dais
387652731a Add Message Mirror example addon 2021-07-29 09:43:20 +00:00
Salad Dais
e4601fd879 Support multiple Message Log windows
Closes #19
2021-07-29 01:00:57 +00:00
Salad Dais
6eb25f96d9 Support logging to a hierarchy of message loggers
Necessary to eventually support multiple message log windows
2021-07-27 02:35:03 +00:00
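Conceptually, the hierarchy amounts to a root logger fanning each entry out to per-window child loggers. A simplified stand-in for the real classes, just to illustrate the shape:

```python
from typing import Any, List

class FanOutLogger:  # simplified stand-in for the real message logger
    def __init__(self) -> None:
        self.children: List["FanOutLogger"] = []
        self.entries: List[Any] = []

    def log(self, entry: Any) -> None:
        self.entries.append(entry)
        # Every attached child (one per message log window) sees the entry too.
        for child in self.children:
            child.log(entry)

root = FanOutLogger()
window = FanOutLogger()
root.children.append(window)
root.log("OUT ObjectAdd")
assert window.entries == ["OUT ObjectAdd"]
```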
Salad Dais
22b9eeb5cb Better handling of optional command parameters 2021-07-22 23:59:55 +00:00
Salad Dais
0dbedcb2f5 Improve coverage 2021-07-22 23:58:17 +00:00
Salad Dais
7d9712c16e Fix message dropping and queueing corner cases 2021-07-22 05:08:47 +00:00
Salad Dais
82663c0fc2 Add parse_bool helper function for command parameters 2021-07-21 06:39:29 +00:00
Salad Dais
9fb4884470 Extend TlsLayer.tls_start_server instead of monkeypatching OpenSSL funcs
We have a more elegant way of unsetting `X509_CHECK_FLAG_NEVER_CHECK_SUBJECT`
now that mitmproxy 7.0 is out.

See https://github.com/mitmproxy/mitmproxy/pull/4688
2021-07-19 20:17:31 +00:00
Salad Dais
cf69c42f67 Rework HTTP proxying code to work with mitmproxy 7.0.0 2021-07-18 07:02:45 +00:00
Salad Dais
be658b9026 v0.6.3
Cutting a release before working on mitmproxy upgrade
2021-07-18 06:57:40 +00:00
Salad Dais
c505941595 Improve test for TE serialization 2021-07-18 06:33:55 +00:00
Salad Dais
96f471d6b7 Add initial support for Message-specific Block subclasses 2021-07-07 12:49:32 +00:00
Salad Dais
4238016767 Change readme wording
:)
2021-07-07 12:49:32 +00:00
Salad Dais
a35a67718d Add default_value to MessageTemplateVariable 2021-07-01 21:25:51 +00:00
Salad Dais
c2981b107a Remove CodeQL scanning
Maybe later, doesn't seem to do anything useful out of the box.
2021-06-28 06:00:42 -03:00
Salad Dais
851375499a Add CodeQL scanning 2021-06-28 05:44:02 -03:00
Salad Dais
d064ecd466 Don't raise when reading a new avatar_name_cache.xml 2021-06-25 18:45:42 +00:00
Salad Dais
fda37656c9 Reduce boilerplate for mesh mangling addons
Makes it less annoying to compose separate addons with different manglers
2021-06-24 05:29:23 +00:00
Salad Dais
49a9c6f28f Workaround for failed teleports due to EventQueue timeouts
Closes #16
2021-06-23 16:43:09 +00:00
Salad Dais
050ac5e3a9 v0.6.2 2021-06-19 03:06:39 +00:00
Salad Dais
fe0d3132e4 Update shield addon 2021-06-18 20:49:31 +00:00
Salad Dais
d7f18e05be Fix typo 2021-06-18 20:49:20 +00:00
Salad Dais
9bf4240411 Allow tagging UDPPackets with arbitrary metadata
The metadata should propagate to any Messages deserialized
from the packet as well.
2021-06-18 20:31:15 +00:00
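The propagation described there amounts to copying packet-level tags onto every message parsed from the packet. A simplified stand-in (the real UDPPacket and Message classes are much richer):

```python
from dataclasses import dataclass, field
from typing import Any, Dict

@dataclass
class FakePacket:  # simplified stand-in for the real UDPPacket
    data: bytes
    meta: Dict[str, Any] = field(default_factory=dict)

@dataclass
class FakeMessage:  # simplified stand-in for the real Message
    name: str
    meta: Dict[str, Any] = field(default_factory=dict)

def deserialize(packet: FakePacket) -> FakeMessage:
    msg = FakeMessage(name="ChatFromSimulator")
    msg.meta.update(packet.meta)  # packet tags survive deserialization
    return msg

pkt = FakePacket(data=b"...", meta={"dropped": True})
assert deserialize(pkt).meta["dropped"] is True
```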
Salad Dais
76df9a0424 Streamline template dictionary use 2021-06-17 21:28:22 +00:00
Salad Dais
a91bc67a43 v0.6.1 2021-06-16 14:27:26 +00:00
Salad Dais
48180b85d1 Export proxy test utils for use in addon test suites 2021-06-15 18:48:05 +00:00
Salad Dais
77d3bf2fe1 Make ObjectCacheChain handle invalid caches properly 2021-06-14 14:17:21 +00:00
Salad Dais
d8ec9ee77a Add hooks to allow swapping out transports 2021-06-14 13:48:30 +00:00
Salad Dais
0b46b95f81 Minor API changes 2021-06-14 13:33:17 +00:00
Salad Dais
73e66c56e5 Clarify addon state management example addon 2021-06-13 12:06:04 +00:00
Salad Dais
fd2a4d8dce Remove incorrect comment from JPEG2000 test 2021-06-13 10:23:18 +00:00
Salad Dais
2209ebdd0c Add unit tests for JPEG2000 utils 2021-06-13 10:20:18 +00:00
Salad Dais
ccfb641cc2 Add pixel artist example addon 2021-06-12 15:44:26 +00:00
Salad Dais
220d8ddf65 Add confirmation helper for InteractionManager API 2021-06-12 15:15:34 +00:00
Salad Dais
235bc8e09e Change TextureEntry type signatures to play nicer with type checker 2021-06-12 15:15:03 +00:00
Salad Dais
41fd67577a Add ability to wait on object-related events 2021-06-12 10:43:16 +00:00
Salad Dais
8347b341f5 Give default values for TextureEntry fields 2021-06-12 10:26:52 +00:00
Salad Dais
9d5599939e Add MCode enum definition 2021-06-12 08:54:34 +00:00
Salad Dais
1fd6decf91 Add integration tests for addon (un)loading 2021-06-11 19:44:53 +00:00
Salad Dais
4ddc6aa852 Remove unloaded addon scripts from sys.modules 2021-06-11 19:44:35 +00:00
Salad Dais
ab89f6bc14 Add integration test for asset server wrapper cap 2021-06-11 17:53:55 +00:00
Salad Dais
cb8c1cfe91 Only generate lowercase hostnames in register_wrapper_cap()
Hostnames are case insensitive and passing a URL through urlparse()
will always give you a lowercase domain name.
2021-06-11 17:52:03 +00:00
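This is easy to demonstrate: `urlparse()` lowercases the `hostname` attribute while `netloc` keeps the original case, so a wrapper cap registered under a mixed-case hostname would never match:

```python
from urllib.parse import urlparse

parsed = urlparse("https://Asset-Server.Example.INVALID:12043/cap/foo")
print(parsed.hostname)  # asset-server.example.invalid (always lowercase)
print(parsed.netloc)    # Asset-Server.Example.INVALID:12043 (case preserved)
```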
Salad Dais
52679bf708 HTTPAssetRepo: Don't throw when trying to serve invalid UUID 2021-06-11 17:51:45 +00:00
Salad Dais
a21c0439e9 Test for mitmproxy handling HTTPS requests as well 2021-06-10 23:32:38 +00:00
Salad Dais
216ffb3777 Add integration test for mitmproxy interception 2021-06-10 23:22:59 +00:00
Salad Dais
d4c30d998d Allow handling Firestorm Bridge responses, use to guess avatar Z pos 2021-06-09 02:02:09 +00:00
Salad Dais
003f37c3d3 Auto-request unknown objects when an avatar sits on them
We need to know about an avatar's parent to get their exact position
due to the Object.Position field always being relative to the parent.
2021-06-08 23:44:08 +00:00
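Put differently: a seated avatar's region position is only computable once the parent is known, since it's the parent's position plus the avatar's offset rotated by the parent's rotation. A hand-rolled sketch of that math (Hippolyzer's own vector/quaternion types would normally do this):

```python
from typing import Tuple

Vec3 = Tuple[float, float, float]
Quat = Tuple[float, float, float, float]  # (x, y, z, w), assumed unit-length

def rotate(v: Vec3, q: Quat) -> Vec3:
    """Rotate v by q: v' = v + 2 * (q.xyz x (q.xyz x v + q.w * v))."""
    x, y, z = v
    qx, qy, qz, qw = q
    tx = qy * z - qz * y + qw * x
    ty = qz * x - qx * z + qw * y
    tz = qx * y - qy * x + qw * z
    return (
        x + 2 * (qy * tz - qz * ty),
        y + 2 * (qz * tx - qx * tz),
        z + 2 * (qx * ty - qy * tx),
    )

def region_position(parent_pos: Vec3, parent_rot: Quat, offset: Vec3) -> Vec3:
    """Child's region position = parent position + rotated relative offset."""
    rx, ry, rz = rotate(offset, parent_rot)
    return (parent_pos[0] + rx, parent_pos[1] + ry, parent_pos[2] + rz)
```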
Salad Dais
d64a07c04c Better guard to prevent accidental lazy serializable hydration 2021-06-08 18:57:57 +00:00
Salad Dais
82b156813b Add more name accessors to Avatar class 2021-06-08 18:57:24 +00:00
Salad Dais
b71da8f5a4 Add option to automatically request missing cached objects 2021-06-08 18:41:44 +00:00
Salad Dais
5618bcbac1 Add new persistent (Proxy)Settings object, use to pass down settings 2021-06-08 16:55:19 +00:00
Salad Dais
24abc36df2 Correct AgentState enum definition 2021-06-07 12:56:39 +00:00
Salad Dais
9ceea8324a Fix templates.py reloading by importing importlib 2021-06-07 12:56:21 +00:00
Salad Dais
29653c350f Bundle addon examples with Windows build 2021-06-07 11:40:45 +00:00
109 changed files with 5411 additions and 1324 deletions

View File

@@ -1,5 +1,6 @@
[run]
omit =
concurrency = multiprocessing
[report]
exclude_lines =
pragma: no cover
@@ -7,3 +8,5 @@ exclude_lines =
if typing.TYPE_CHECKING:
def __repr__
raise AssertionError
assert False
pass

View File

@@ -2,18 +2,23 @@
# onto the release after it gets created. Don't want actions with repo write.
name: Bundle Windows EXE
on:
# Only trigger on release creation
release:
types:
- created
workflow_dispatch:
env:
target_tag: ${{ github.ref_name }}
jobs:
build:
runs-on: windows-latest
runs-on: windows-2019
permissions:
contents: write
strategy:
matrix:
python-version: [3.9]
@@ -29,18 +34,29 @@ jobs:
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install -r requirements.txt
pip install -e .
pip install cx_freeze
- name: Bundle with cx_Freeze
shell: bash
run: |
python setup_cxfreeze.py build_exe
pip install pip-licenses
pip-licenses --format=plain-vertical --with-license-file --no-license-path --output-file=lib_licenses.txt
python setup_cxfreeze.py finalize_cxfreeze
# Should only be one, but we don't know what it's named
mv ./dist/*.zip hippolyzer-windows-${{ env.target_tag }}.zip
- name: Upload the artifact
uses: actions/upload-artifact@v2
with:
name: hippolyzer-gui-windows-${{ github.sha }}
path: ./dist/**
name: hippolyzer-windows-${{ github.sha }}
path: ./hippolyzer-windows-${{ env.target_tag }}.zip
- uses: ncipollo/release-action@v1.10.0
with:
artifacts: hippolyzer-windows-${{ env.target_tag }}.zip
tag: ${{ env.target_tag }}
token: ${{ secrets.GITHUB_TOKEN }}
allowUpdates: true

View File

@@ -8,7 +8,7 @@ jobs:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: [3.8, 3.9]
python-version: ["3.8", "3.10"]
steps:
- uses: actions/checkout@v2
@@ -23,6 +23,7 @@ jobs:
python -m pip install --upgrade pip
pip install -r requirements.txt
pip install -r requirements-test.txt
sudo apt-get install libopenjp2-7
- name: Run Flake8
run: |
flake8 .

View File

@@ -2,7 +2,7 @@
![Python Test Status](https://github.com/SaladDais/Hippolyzer/workflows/Run%20Python%20Tests/badge.svg) [![codecov](https://codecov.io/gh/SaladDais/Hippolyzer/branch/master/graph/badge.svg?token=HCTFA4RAXX)](https://codecov.io/gh/SaladDais/Hippolyzer)
[Hippolyzer](http://wiki.secondlife.com/wiki/Hippo) is a fork of Linden Lab's abandoned
[Hippolyzer](http://wiki.secondlife.com/wiki/Hippo) is a revival of Linden Lab's
[PyOGP library](http://wiki.secondlife.com/wiki/PyOGP)
targeting modern Python 3, with a focus on debugging issues in Second Life-compatible
servers and clients. There is a secondary focus on mocking up new features without requiring a
@@ -83,6 +83,28 @@ SOCKS 5 works correctly on these platforms, so you can just configure it through
the `no_proxy` env var appropriately. For ex. `no_proxy="asset-cdn.glb.agni.lindenlab.com" ./firestorm`.
* Log in!
##### Firestorm
The proxy selection dialog in the most recent Firestorm release is non-functional, as
https://bitbucket.org/lindenlab/viewer/commits/454c7f4543688126b2fa5c0560710f5a1733702e was not pulled in.
As a workaround, you can go to `Debug -> Show Debug Settings` and enter the following values:
| Name | Value |
|---------------------|-----------|
| HttpProxyType | Web |
| BrowserProxyAddress | 127.0.0.1 |
| BrowserProxyEnabled | TRUE |
| BrowserProxyPort | 9062 |
| Socks5ProxyEnabled | TRUE |
| Socks5ProxyHost | 127.0.0.1 |
| Socks5ProxyPort | 9061 |
Or, if you're on Linux, you can also use [LinHippoAutoProxy](https://github.com/SaladDais/LinHippoAutoProxy).
Connections from the in-viewer browser will likely _not_ be run through Hippolyzer when using either of
these workarounds.
### Filtering
By default, the proxy's display filter is configured to ignore many high-frequency messages.
@@ -224,7 +246,7 @@ OUT ObjectAdd
```
The repeat spinner at the bottom of the window lets you send a message multiple times.
an `i` variable is put into the eval context and can be used to vary messages accros repeats.
an `i` variable is put into the eval context and can be used to vary messages across repeats.
With repeat set to two:
```
@@ -311,6 +333,22 @@ If you are a viewer developer, please put them in a viewer.
apply the mesh to the local mesh target. It works on attachments too. Useful for testing rigs before a
final, real upload.
## REPL
A quick and dirty REPL is also included for when you want to do ad-hoc introspection of proxy state.
It can be launched at any time by typing `/524 spawn_repl` in chat.
![Screenshot of REPL](https://github.com/SaladDais/Hippolyzer/blob/master/static/repl_screenshot.png?raw=true)
The REPL is fully async aware and allows awaiting events without blocking:
```python
>>> from hippolyzer.lib.client.object_manager import ObjectUpdateType
>>> evt = await session.objects.events.wait_for((ObjectUpdateType.OBJECT_UPDATE,), timeout=2.0)
>>> evt.updated
{'Position'}
```
## Potential Changes
* AISv3 wrapper?
@@ -375,6 +413,12 @@ To have your client's traffic proxied through Hippolyzer the general flow is:
* The proxy needs to use content sniffing to figure out which requests are login requests,
so make sure your request would pass `MITMProxyEventManager._is_login_request()`
#### Do I have to do all that?
You might be able to automate some of it on Linux by using
[LinHippoAutoProxy](https://github.com/SaladDais/LinHippoAutoProxy). If you're on Windows or MacOS the
above is your only option.
### Should I use this library to make an SL client in Python?
No. If you just want to write a client in Python, you should instead look at using

View File

@@ -9,7 +9,7 @@ from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
class PropertyHelloWorldAddon(BaseAddon):
class AddonStateHelloWorldAddon(BaseAddon):
# How to say hello, value shared across sessions and will be the same
# regardless of which session is active when accessed.
# "hello_greeting" is added to session_manager.addon_ctx's dict and will survive reloads
@@ -28,7 +28,11 @@ class PropertyHelloWorldAddon(BaseAddon):
# Shared across sessions and will die if the addon is reloaded
self.hello_punctuation = "!"
@handle_command(greeting=Parameter(str, sep=None))
@handle_command(
# Use the longer-form `Parameter()` for declaring this because
# this field should be greedy and take the rest of the message (no separator.)
greeting=Parameter(str, sep=None),
)
async def set_hello_greeting(self, _session: Session, _region: ProxiedRegion, greeting: str):
"""Set the person to say hello to"""
self.hello_greeting = greeting
@@ -38,7 +42,10 @@ class PropertyHelloWorldAddon(BaseAddon):
"""Set the person to say hello to"""
self.hello_person = person
@handle_command(punctuation=Parameter(str, sep=None))
@handle_command(
# Punctuation should have no whitespace, so using a simple parameter is OK.
punctuation=str,
)
async def set_hello_punctuation(self, _session: Session, _region: ProxiedRegion, punctuation: str):
"""Set the punctuation to use for saying hello"""
self.hello_punctuation = punctuation
@@ -47,8 +54,8 @@ class PropertyHelloWorldAddon(BaseAddon):
async def say_hello(self, _session: Session, _region: ProxiedRegion):
"""Say hello using the configured hello variables"""
# These aren't instance properties, they can be accessed via the class as well.
hello_person = PropertyHelloWorldAddon.hello_person
hello_person = AddonStateHelloWorldAddon.hello_person
send_chat(f"{self.hello_greeting} {hello_person}{self.hello_punctuation}")
addons = [PropertyHelloWorldAddon()]
addons = [AddonStateHelloWorldAddon()]

View File

@@ -0,0 +1,32 @@
"""
Example anim mangler addon, to be used with local anim addon.
You can edit this live to apply various transforms to local anims,
as well as any uploaded anims. Any changes will be reflected in currently
playing local anims.
This example modifies any position keys of an animation's mHipRight joint.
"""
from hippolyzer.lib.base.llanim import Animation
from hippolyzer.lib.proxy.addons import AddonManager
import local_anim
AddonManager.hot_reload(local_anim, require_addons_loaded=True)
def offset_right_hip(anim: Animation):
hip_joint = anim.joints.get("mHipRight")
if hip_joint:
for pos_frame in hip_joint.pos_keyframes:
pos_frame.pos.Z *= 2.5
pos_frame.pos.X *= 5.0
return anim
class ExampleAnimManglerAddon(local_anim.BaseAnimManglerAddon):
ANIM_MANGLERS = [
offset_right_hip,
]
addons = [ExampleAnimManglerAddon()]

View File

@@ -11,7 +11,7 @@ import enum
import os.path
from typing import *
from PySide2 import QtCore, QtGui, QtWidgets
from PySide6 import QtCore, QtGui, QtWidgets
from hippolyzer.lib.base.datatypes import Vector3
from hippolyzer.lib.base.message.message import Block, Message
@@ -80,7 +80,7 @@ class BlueishObjectListGUIAddon(BaseAddon):
raise
def _highlight_object(self, session: Session, obj: Object):
session.main_region.circuit.send_message(Message(
session.main_region.circuit.send(Message(
"ForceObjectSelect",
Block("Header", ResetList=False),
Block("Data", LocalID=obj.LocalID),
@@ -88,7 +88,7 @@ class BlueishObjectListGUIAddon(BaseAddon):
))
def _teleport_to_object(self, session: Session, obj: Object):
session.main_region.circuit.send_message(Message(
session.main_region.circuit.send(Message(
"TeleportLocationRequest",
Block("AgentData", AgentID=session.agent_id, SessionID=session.id),
Block(

View File

@@ -0,0 +1,158 @@
"""
Detect receipt of a marketplace order for a demo, and auto-attach the most appropriate object
"""
import asyncio
import re
from typing import List, Tuple, Dict, Optional, Sequence
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Message, Block
from hippolyzer.lib.base.templates import InventoryType, Permissions, FolderType
from hippolyzer.lib.proxy.addon_utils import BaseAddon, show_message
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
MARKETPLACE_TRANSACTION_ID = UUID('ffffffff-ffff-ffff-ffff-ffffffffffff')
class DemoAutoAttacher(BaseAddon):
def handle_eq_event(self, session: Session, region: ProxiedRegion, event: dict):
if event["message"] != "BulkUpdateInventory":
return
# Check that this update even possibly came from the marketplace
if event["body"]["AgentData"][0]["TransactionID"] != MARKETPLACE_TRANSACTION_ID:
return
# Make sure that the transaction targeted our real received items folder
folders = event["body"]["FolderData"]
received_folder = folders[0]
if received_folder["Name"] != "Received Items":
return
skel = session.login_data['inventory-skeleton']
actual_received = [x for x in skel if x['type_default'] == FolderType.INBOX]
assert actual_received
if UUID(actual_received[0]['folder_id']) != received_folder["FolderID"]:
show_message(f"Strange received folder ID spoofing? {folders!r}")
return
if not re.match(r".*\bdemo\b.*", folders[1]["Name"], flags=re.I):
return
# Alright, so we have a demo... thing from the marketplace. What now?
items = event["body"]["ItemData"]
object_items = [x for x in items if x["InvType"] == InventoryType.OBJECT]
if not object_items:
return
self._schedule_task(self._attach_best_object(session, region, object_items))
async def _attach_best_object(self, session: Session, region: ProxiedRegion, object_items: List[Dict]):
own_body_type = await self._guess_own_body(session, region)
show_message(f"Trying to find demo for {own_body_type}")
guess_patterns = self.BODY_CLOTHING_PATTERNS.get(own_body_type)
to_attach = []
if own_body_type and guess_patterns:
matching_items = self._get_matching_items(object_items, guess_patterns)
if matching_items:
# Only take the first one
to_attach.append(matching_items[0])
if not to_attach:
# Don't know what body's being used or couldn't figure out what item
# would work best with our body. Just attach the first object in the folder.
to_attach.append(object_items[0])
# Also attach whatever HUDs, maybe we need them.
for hud in self._get_matching_items(object_items, ("hud",)):
if hud not in to_attach:
to_attach.append(hud)
region.circuit.send(Message(
'RezMultipleAttachmentsFromInv',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
Block('HeaderData', CompoundMsgID=UUID.random(), TotalObjects=len(to_attach), FirstDetachAll=0),
*[Block(
'ObjectData',
ItemID=o["ItemID"],
OwnerID=session.agent_id,
# 128 = "add", uses whatever attachmentpt was defined on the object
AttachmentPt=128,
ItemFlags_=(),
GroupMask_=(),
EveryoneMask_=(),
NextOwnerMask_=(Permissions.COPY | Permissions.MOVE),
Name=o["Name"],
Description=o["Description"],
) for o in to_attach]
))
def _get_matching_items(self, items: List[dict], patterns: Sequence[str]):
# Loop over patterns to search for our body type, in order of preference
matched = []
for guess_pattern in patterns:
# Check each item for that pattern
for item in items:
if re.match(rf".*\b{guess_pattern}\b.*", item["Name"], re.I):
matched.append(item)
return matched
# We scan the agent's attached objects to guess what kind of body they use
BODY_PREFIXES = {
"-Belleza- Jake ": "jake",
"-Belleza- Freya ": "freya",
"-Belleza- Isis ": "isis",
"-Belleza- Venus ": "venus",
"[Signature] Gianni Body": "gianni",
"[Signature] Geralt Body": "geralt",
"Maitreya Mesh Body - Lara": "maitreya",
"Slink Physique Hourglass Petite": "hg_petite",
"Slink Physique Mesh Body Hourglass": "hourglass",
"Slink Physique Original Petite": "phys_petite",
"Slink Physique Mesh Body Original": "physique",
"[BODY] Legacy (f)": "legacy_f",
"[BODY] Legacy (m)": "legacy_m",
"[Signature] Alice Body": "sig_alice",
"Slink Physique MALE Mesh Body": "slink_male",
"AESTHETIC - [Mesh Body]": "aesthetic",
}
# Different bodies' clothes have different naming conventions according to different merchants.
# These are common naming patterns we use to choose objects to attach, in order of preference.
BODY_CLOTHING_PATTERNS: Dict[str, Tuple[str, ...]] = {
"jake": ("jake", "belleza"),
"freya": ("freya", "belleza"),
"isis": ("isis", "belleza"),
"venus": ("venus", "belleza"),
"gianni": ("gianni", "signature", "sig"),
"geralt": ("geralt", "signature", "sig"),
"hg_petite": ("hourglass petite", "hg petite", "hourglass", "hg", "slink"),
"hourglass": ("hourglass", "hg", "slink"),
"phys_petite": ("physique petite", "phys petite", "physique", "phys", "slink"),
"physique": ("physique", "phys", "slink"),
"legacy_f": ("legacy",),
"legacy_m": ("legacy",),
"sig_alice": ("alice", "signature"),
"slink_male": ("physique", "slink"),
"aesthetic": ("aesthetic",),
}
async def _guess_own_body(self, session: Session, region: ProxiedRegion) -> Optional[str]:
agent_obj = region.objects.lookup_fullid(session.agent_id)
if not agent_obj:
return None
# We probably won't know the names for all of our attachments, request them.
# Could be obviated by looking at the COF, not worth it for this.
try:
await asyncio.wait(region.objects.request_object_properties(agent_obj.Children), timeout=0.5)
except asyncio.TimeoutError:
# We expect that we just won't ever receive some property requests, that's fine
pass
for prefix, body_type in self.BODY_PREFIXES.items():
for obj in agent_obj.Children:
if not obj.Name:
continue
if obj.Name.startswith(prefix):
return body_type
return None
addons = [DemoAutoAttacher()]

View File

@@ -0,0 +1,100 @@
"""
Loading task inventory doesn't actually need to be slow.
By using a cap instead of the slow xfer path and sending the LLSD inventory
model we get 15x speedups even when mocking things behind the scenes by using
a hacked up version of xfer. See turbo_object_inventory.py
"""
import asyncio
import asgiref.wsgi
from typing import *
from flask import Flask, Response, request
from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.inventory import InventoryModel
from hippolyzer.lib.base.message.message import Message, Block
from hippolyzer.lib.base.templates import XferFilePath
from hippolyzer.lib.proxy import addon_ctx
from hippolyzer.lib.proxy.webapp_cap_addon import WebAppCapAddon
app = Flask("GetTaskInventoryCapApp")
@app.route('/', methods=["POST"])
async def get_task_inventory():
# Should always have the current region, the cap handler is bound to one.
# Just need to pull it from the `addon_ctx` module's global.
region = addon_ctx.region.get()
session = addon_ctx.session.get()
obj_id = UUID(request.args["task_id"])
obj = region.objects.lookup_fullid(obj_id)
if not obj:
return Response(f"Couldn't find {obj_id}", status=404, mimetype="text/plain")
request_msg = Message(
'RequestTaskInventory',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
Block('InventoryData', LocalID=obj.LocalID),
)
# Keep around a dict of chunks we saw previously in case we have to restart
# an Xfer due to missing chunks. We don't expect chunks to change across Xfers
# so this can be used to recover from dropped SendXferPackets in subsequent attempts
existing_chunks: Dict[int, bytes] = {}
for _ in range(3):
# Any previous requests will have triggered a delete of the inventory file
# by marking it complete on the server-side. Re-send our RequestTaskInventory
# to make sure there's a fresh copy.
region.circuit.send(request_msg.take())
inv_message = await region.message_handler.wait_for(
('ReplyTaskInventory',),
predicate=lambda x: x["InventoryData"]["TaskID"] == obj.FullID,
timeout=5.0,
)
# No task inventory, send the reply as-is
file_name = inv_message["InventoryData"]["Filename"]
if not file_name:
return Response("", status=204)
if inv_message["InventoryData"]["Serial"] == int(request.args.get("last_serial", None)):
# Nothing has changed since the version of the inventory they say they have, say so.
return Response("", status=304)
xfer = region.xfer_manager.request(
file_name=file_name,
file_path=XferFilePath.CACHE,
turbo=True,
)
xfer.chunks.update(existing_chunks)
try:
await xfer
except asyncio.TimeoutError:
# We likely failed the request due to missing chunks, store
# the chunks that we _did_ get for the next attempt.
existing_chunks.update(xfer.chunks)
continue
inv_model = InventoryModel.from_str(xfer.reassemble_chunks().decode("utf8"))
return Response(
llsd.format_notation({
"inventory": inv_model.to_llsd(),
"inv_serial": inv_message["InventoryData"]["Serial"],
}),
headers={"Content-Type": "application/llsd+notation"},
)
raise asyncio.TimeoutError("Failed to get inventory after 3 tries")
class GetTaskInventoryCapExampleAddon(WebAppCapAddon):
# A cap URL with this name will be tied to each region when
# the sim is first connected to. The URL will be returned to the
# viewer in the Seed if the viewer requests it by name.
CAP_NAME = "GetTaskInventoryExample"
# Any asgi app should be fine.
APP = asgiref.wsgi.WsgiToAsgi(app)
addons = [GetTaskInventoryCapExampleAddon()]

View File

@@ -105,7 +105,7 @@ class HorrorAnimatorAddon(BaseAddon):
# send the response back immediately
block = STATIC_VFS[orig_anim_id]
anim_data = STATIC_VFS.read_block(block)
flow.response = mitmproxy.http.HTTPResponse.make(
flow.response = mitmproxy.http.Response.make(
200,
_mutate_anim_bytes(anim_data),
{

View File

@@ -5,42 +5,56 @@ Local animations
assuming you loaded something.anim
/524 start_local_anim something
/524 stop_local_anim something
/524 save_local_anim something
If you want to trigger the animation from an object to simulate llStartAnimation():
llOwnerSay("@start_local_anim:something=force");
Also includes a concept of "anim manglers" similar to the "mesh manglers" of the
local mesh addon. This is useful if you want to test making procedural changes
to animations before uploading them. The manglers will be applied to any uploaded
animations as well.
May also be useful if you need to make ad-hoc changes to a bunch of animations on
bulk upload, like changing priority or removing a joint.
"""
import asyncio
import os
import pathlib
from abc import abstractmethod
from typing import *
from hippolyzer.lib.base import serialization as se
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.helpers import get_mtime
from hippolyzer.lib.base.llanim import Animation
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.proxy import addon_ctx
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.addon_utils import BaseAddon, SessionProperty
from hippolyzer.lib.proxy.addon_utils import BaseAddon, SessionProperty, GlobalProperty, show_message
from hippolyzer.lib.proxy.commands import handle_command
from hippolyzer.lib.proxy.http_asset_repo import HTTPAssetRepo
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
def _get_mtime(path: str):
try:
return os.stat(path).st_mtime
except:
return None
from hippolyzer.lib.proxy.sessions import Session, SessionManager
class LocalAnimAddon(BaseAddon):
# name -> path, only for anims actually from files
local_anim_paths: Dict[str, str] = SessionProperty(dict)
# name -> anim bytes
local_anim_bytes: Dict[str, bytes] = SessionProperty(dict)
# name -> mtime or None. Only for anims from files.
local_anim_mtimes: Dict[str, Optional[float]] = SessionProperty(dict)
# name -> current asset ID (changes each play)
local_anim_playing_ids: Dict[str, UUID] = SessionProperty(dict)
anim_manglers: List[Callable[[Animation], Animation]] = GlobalProperty(list)
def handle_init(self, session_manager: SessionManager):
self.remangle_local_anims(session_manager)
def handle_session_init(self, session: Session):
# Reload anims and reload any manglers if we have any
self._schedule_task(self._try_reload_anims(session))
@handle_command()
@@ -66,11 +80,23 @@ class LocalAnimAddon(BaseAddon):
"""Stop a named local animation"""
self.apply_local_anim(session, region, anim_name, new_data=None)
@handle_command(anim_name=str)
async def save_local_anim(self, _session: Session, _region: ProxiedRegion, anim_name: str):
"""Save a named local anim to disk"""
anim_bytes = self.local_anim_bytes.get(anim_name)
if not anim_bytes:
return
filename = await AddonManager.UI.save_file(filter_str="SL Anim (*.anim)", default_suffix="anim")
if not filename:
return
with open(filename, "wb") as f:
f.write(anim_bytes)
async def _try_reload_anims(self, session: Session):
while True:
region = session.main_region
if not region:
await asyncio.sleep(2.0)
await asyncio.sleep(1.0)
continue
# Loop over local anims we loaded
@@ -80,7 +106,7 @@ class LocalAnimAddon(BaseAddon):
continue
# is playing right now, check if there's a newer version
self.apply_local_anim_from_file(session, region, anim_name, only_if_changed=True)
await asyncio.sleep(2.0)
await asyncio.sleep(1.0)
def handle_rlv_command(self, session: Session, region: ProxiedRegion, source: UUID,
cmd: str, options: List[str], param: str):
@@ -127,11 +153,13 @@ class LocalAnimAddon(BaseAddon):
StartAnim=True,
))
cls.local_anim_playing_ids[anim_name] = next_id
cls.local_anim_bytes[anim_name] = new_data
else:
# No data means just stop the anim
cls.local_anim_playing_ids.pop(anim_name, None)
cls.local_anim_bytes.pop(anim_name, None)
region.circuit.send_message(new_msg)
region.circuit.send(new_msg)
print(f"Changing {anim_name} to {next_id}")
@classmethod
@@ -141,7 +169,7 @@ class LocalAnimAddon(BaseAddon):
anim_data = None
if anim_path:
old_mtime = cls.local_anim_mtimes.get(anim_name)
mtime = _get_mtime(anim_path)
mtime = get_mtime(anim_path)
if only_if_changed and old_mtime == mtime:
return
@@ -156,9 +184,94 @@ class LocalAnimAddon(BaseAddon):
with open(anim_path, "rb") as f:
anim_data = f.read()
anim_data = cls._mangle_anim(anim_data)
else:
print(f"Unknown anim {anim_name!r}")
cls.apply_local_anim(session, region, anim_name, new_data=anim_data)
@classmethod
def _mangle_anim(cls, anim_data: bytes) -> bytes:
if not cls.anim_manglers:
return anim_data
reader = se.BufferReader("<", anim_data)
spec = se.Dataclass(Animation)
anim = reader.read(spec)
for mangler in cls.anim_manglers:
anim = mangler(anim)
writer = se.BufferWriter("<")
writer.write(spec, anim)
return writer.copy_buffer()
@classmethod
def remangle_local_anims(cls, session_manager: SessionManager):
# Anim manglers are global, so we need to re-mangle anims for all sessions
for session in session_manager.sessions:
# Push the context of this session onto the stack so we can access
# session-scoped properties
with addon_ctx.push(new_session=session, new_region=session.main_region):
cls.local_anim_mtimes.clear()
def handle_http_request(self, session_manager: SessionManager, flow: HippoHTTPFlow):
if flow.name == "NewFileAgentInventoryUploader":
# Don't bother looking at this if we have no manglers
if not self.anim_manglers:
return
# This is kind of a crappy match but these magic bytes shouldn't match anything that SL
# allows as an upload type but animations.
if not flow.request.content or not flow.request.content.startswith(b"\x01\x00\x00\x00"):
return
# Replace the uploaded anim with the mangled version
flow.request.content = self._mangle_anim(flow.request.content)
show_message("Mangled upload request")
class BaseAnimManglerAddon(BaseAddon):
"""Base class for addons that mangle uploaded or file-based local animations"""
ANIM_MANGLERS: List[Callable[[Animation], Animation]]
def handle_init(self, session_manager: SessionManager):
# Add our manglers into the list
LocalAnimAddon.anim_manglers.extend(self.ANIM_MANGLERS)
LocalAnimAddon.remangle_local_anims(session_manager)
def handle_unload(self, session_manager: SessionManager):
# Clean up our manglers before we go away
mangler_list = LocalAnimAddon.anim_manglers
for mangler in self.ANIM_MANGLERS:
if mangler in mangler_list:
mangler_list.remove(mangler)
LocalAnimAddon.remangle_local_anims(session_manager)
class BaseAnimHelperAddon(BaseAddon):
"""
Base class for local creation of procedural animations
Animation generated by build_anim() gets applied to all active sessions
"""
ANIM_NAME: str
def handle_session_init(self, session: Session):
self._reapply_anim(session, session.main_region)
def handle_session_closed(self, session: Session):
LocalAnimAddon.apply_local_anim(session, session.main_region, self.ANIM_NAME, None)
def handle_unload(self, session_manager: SessionManager):
for session in session_manager.sessions:
# TODO: Nasty. Since we need to access session-local attrs we need to set the
# context even though we also explicitly pass session and region.
# Need to rethink the LocalAnimAddon API.
with addon_ctx.push(session, session.main_region):
LocalAnimAddon.apply_local_anim(session, session.main_region, self.ANIM_NAME, None)
@abstractmethod
def build_anim(self) -> Animation:
pass
def _reapply_anim(self, session: Session, region: ProxiedRegion):
LocalAnimAddon.apply_local_anim(session, region, self.ANIM_NAME, self.build_anim().to_bytes())
addons = [LocalAnimAddon()]

View File

@@ -81,17 +81,16 @@ class MeshUploadInterceptingAddon(BaseAddon):
@handle_command()
async def set_local_mesh_target(self, session: Session, region: ProxiedRegion):
"""Set the currently selected object as the target for local mesh"""
parent_object = region.objects.lookup_localid(session.selected.object_local)
if not parent_object:
"""Set the currently selected objects as the target for local mesh"""
selected_links = [region.objects.lookup_localid(l_id) for l_id in session.selected.object_locals]
selected_links = [o for o in selected_links if o is not None]
if not selected_links:
show_message("Nothing selected")
return
linkset_objects = [parent_object] + parent_object.Children
old_locals = self.local_mesh_target_locals
self.local_mesh_target_locals = [
x.LocalID
for x in linkset_objects
for x in selected_links
if ExtraParamType.MESH in x.ExtraParams
]
@@ -201,7 +200,7 @@ class MeshUploadInterceptingAddon(BaseAddon):
self.local_mesh_mapping = {x["mesh_name"]: x["mesh"] for x in instances}
# Fake a response, we don't want to actually send off the request.
flow.response = mitmproxy.http.HTTPResponse.make(
flow.response = mitmproxy.http.Response.make(
200,
b"",
{
@@ -280,4 +279,23 @@ class MeshUploadInterceptingAddon(BaseAddon):
cls._replace_local_mesh(session.main_region, asset_repo, mesh_list)
class BaseMeshManglerAddon(BaseAddon):
"""Base class for addons that mangle uploaded or local mesh"""
MESH_MANGLERS: List[Callable[[MeshAsset], MeshAsset]]
def handle_init(self, session_manager: SessionManager):
# Add our manglers into the list
MeshUploadInterceptingAddon.mesh_manglers.extend(self.MESH_MANGLERS)
# Tell the local mesh plugin that the mangler list changed, and to re-apply
MeshUploadInterceptingAddon.remangle_local_mesh(session_manager)
def handle_unload(self, session_manager: SessionManager):
# Clean up our manglers before we go away
mangler_list = MeshUploadInterceptingAddon.mesh_manglers
for mangler in self.MESH_MANGLERS:
if mangler in mangler_list:
mangler_list.remove(mangler)
MeshUploadInterceptingAddon.remangle_local_mesh(session_manager)
addons = [MeshUploadInterceptingAddon()]

View File

@@ -11,25 +11,28 @@ to add to give a mesh an arbitrary center of rotation / scaling.
from hippolyzer.lib.base.mesh import MeshAsset
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.sessions import SessionManager
import local_mesh
AddonManager.hot_reload(local_mesh, require_addons_loaded=True)
def _reorient_coord(coord, orientation):
def _reorient_coord(coord, orientation, normals=False):
coords = []
for axis in orientation:
axis_idx = abs(axis) - 1
coords.append(coord[axis_idx] if axis >= 0 else 1.0 - coord[axis_idx])
if normals:
# Normals have a static domain from -1.0 to 1.0, just negate.
new_coord = coord[axis_idx] if axis >= 0 else -coord[axis_idx]
else:
new_coord = coord[axis_idx] if axis >= 0 else 1.0 - coord[axis_idx]
coords.append(new_coord)
if coord.__class__ in (list, tuple):
return coord.__class__(coords)
return coord.__class__(*coords)
def _reorient_coord_list(coord_list, orientation):
return [_reorient_coord(x, orientation) for x in coord_list]
def _reorient_coord_list(coord_list, orientation, normals=False):
return [_reorient_coord(x, orientation, normals) for x in coord_list]
def reorient_mesh(orientation):
@@ -37,37 +40,23 @@ def reorient_mesh(orientation):
# X=1, Y=2, Z=3
def _reorienter(mesh: MeshAsset):
for material in mesh.iter_lod_materials():
if "Position" not in material:
# Must be a NoGeometry LOD
continue
# We don't need to use positions_(to/from)_domain here since we're just naively
# flipping the axes around.
material["Position"] = _reorient_coord_list(material["Position"], orientation)
# Are you even supposed to do this to the normals?
material["Normal"] = _reorient_coord_list(material["Normal"], orientation)
material["Normal"] = _reorient_coord_list(material["Normal"], orientation, normals=True)
return mesh
return _reorienter
OUR_MANGLERS = [
# Negate the X and Y axes on any mesh we upload or create temp
reorient_mesh((-1, -2, 3)),
]
class ExampleMeshManglerAddon(local_mesh.BaseMeshManglerAddon):
MESH_MANGLERS = [
# Negate the X and Y axes on any mesh we upload or create temp
reorient_mesh((-1, -2, 3)),
]
class MeshManglerExampleAddon(BaseAddon):
def handle_init(self, session_manager: SessionManager):
# Add our manglers into the list
local_mesh_addon = local_mesh.MeshUploadInterceptingAddon
local_mesh_addon.mesh_manglers.extend(OUR_MANGLERS)
# Tell the local mesh plugin that the mangler list changed, and to re-apply
local_mesh_addon.remangle_local_mesh(session_manager)
def handle_unload(self, session_manager: SessionManager):
# Clean up our manglers before we go away
local_mesh_addon = local_mesh.MeshUploadInterceptingAddon
mangler_list = local_mesh_addon.mesh_manglers
for mangler in OUR_MANGLERS:
if mangler in mangler_list:
mangler_list.remove(mangler)
local_mesh_addon.remangle_local_mesh(session_manager)
addons = [MeshManglerExampleAddon()]
addons = [ExampleMeshManglerAddon()]

View File

@@ -0,0 +1,244 @@
"""
Message Mirror
Re-routes messages through the circuit of another agent running through this proxy,
rewriting the messages to use the credentials tied to that circuit.
Useful if you need to quickly QA authorization checks on a message handler or script.
Or if you want to chat as two people at once. Whatever.
Also shows some advanced ways of managing / rerouting Messages and HTTP flows.
Fiddle with the values of `SEND_NORMALLY` and `MIRROR` to change how and which
messages get moved to other circuits.
Usage: /524 mirror_to <mirror_agent_uuid>
To Disable: /524 mirror_to
"""
import weakref
from typing import Optional
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.message.template_dict import DEFAULT_TEMPLATE_DICT
from hippolyzer.lib.base.network.transport import Direction
from hippolyzer.lib.proxy.addon_utils import BaseAddon, SessionProperty, show_message
from hippolyzer.lib.proxy.commands import handle_command, Parameter, parse_bool
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.caps import CapData, CapType
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session, SessionManager
# Things that make no sense to mirror, or will make everything explode if mirrored.
SEND_NORMALLY = {
'StartPingCheck', 'CompletePingCheck', 'PacketAck', 'SimulatorViewerTimeMessage', 'SimStats',
'SoundTrigger', 'EventQueueGet', 'GetMesh', 'GetMesh2', 'ParcelDwellRequest', 'ViewerEffect', 'ViewerStats',
'ParcelAccessListRequest', 'FirestormBridge', 'AvatarRenderInfo', 'ParcelPropertiesRequest', 'GetObjectCost',
'RequestMultipleObjects', 'GetObjectPhysicsData', 'GetExperienceInfo', 'RequestTaskInventory', 'AgentRequestSit',
'MuteListRequest', 'UpdateMuteListEntry', 'RemoveMuteListEntry', 'RequestImage',
'AgentThrottle', 'UseCircuitCode', 'AgentWearablesRequest', 'AvatarPickerRequest', 'CloseCircuit',
'CompleteAgentMovement', 'RegionHandshakeReply', 'LogoutRequest', 'ParcelPropertiesRequest',
'ParcelPropertiesRequestByID', 'MapBlockRequest', 'MapLayerRequest', 'MapItemRequest', 'MapNameRequest',
'ParcelAccessListRequest', 'AvatarPropertiesRequest', 'DirFindQuery',
'SetAlwaysRun', 'GetDisplayNames', 'ViewerMetrics', 'AgentResume', 'AgentPause',
'ViewerAsset', 'GetTexture', 'UUIDNameRequest', 'AgentUpdate', 'AgentAnimation',
# Would just be confusing for everyone
'ImprovedInstantMessage',
# Xfer system isn't authed to begin with, and duping Xfers can lead to premature file deletion. Skip.
'RequestXfer', 'ConfirmXferPacket', 'AbortXfer', 'SendXferPacket',
}
# Messages that _must_ be sent normally, but are worth mirroring onto the target session to see how
# they would respond
MIRROR = {
'RequestObjectPropertiesFamily', 'ObjectSelect', 'RequestObjectProperties', 'TransferRequest',
'RequestMultipleObjects', 'RequestTaskInventory', 'FetchInventory2', 'ScriptDialogReply',
'ObjectDeselect', 'GenericMessage', 'ChatFromViewer'
}
for msg_name in DEFAULT_TEMPLATE_DICT.message_templates.keys():
# There are a lot of these.
if msg_name.startswith("Group") and msg_name.endswith("Request"):
MIRROR.add(msg_name)
class MessageMirrorAddon(BaseAddon):
mirror_target_agent: Optional[UUID] = SessionProperty(None)
mirror_use_target_session: bool = SessionProperty(True)
mirror_use_target_agent: bool = SessionProperty(True)
@handle_command(target_agent=Parameter(UUID, optional=True))
async def mirror_to(self, session: Session, _region, target_agent: Optional[UUID] = None):
"""
Send this session's outbound messages over another proxied agent's circuit
"""
if target_agent:
if target_agent == session.agent_id:
show_message("Can't mirror our own session")
target_agent = None
elif not any(s.agent_id == target_agent for s in session.session_manager.sessions):
show_message(f"No active proxied session for agent {target_agent}")
target_agent = None
self.mirror_target_agent = target_agent
if target_agent:
show_message(f"Mirroring to {target_agent}")
else:
show_message("Message mirroring disabled")
@handle_command(enabled=parse_bool)
async def set_mirror_use_target_session(self, _session, _region, enabled):
"""Replace the original session ID with the target session's ID when mirroring"""
self.mirror_use_target_session = enabled
@handle_command(enabled=parse_bool)
async def set_mirror_use_target_agent(self, _session, _region, enabled):
"""Replace the original agent ID with the target agent's ID when mirroring"""
self.mirror_use_target_agent = enabled
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
if message.direction != Direction.OUT:
return
if not self.mirror_target_agent:
return
if message.name in SEND_NORMALLY:
return
target_session = None
for poss_session in session.session_manager.sessions:
if poss_session.agent_id == self.mirror_target_agent:
target_session = poss_session
if not target_session:
print("Couldn't find target session?")
return
target_region = None
for poss_region in target_session.regions:
if poss_region.circuit_addr == region.circuit_addr:
target_region = poss_region
if not target_region:
print("Couldn't find equivalent target region?")
return
# Send the message normally first if we're mirroring
if message.name in MIRROR:
region.circuit.send(message)
# We're going to send the message on a new circuit, we need to take
# it so we get a new packet ID and clean ACKs
message = message.take()
self._lludp_fixups(target_session, message)
target_region.circuit.send(message)
return True
def _lludp_fixups(self, target_session: Session, message: Message):
if "AgentData" in message:
agent_block = message["AgentData"][0]
if "AgentID" in agent_block and self.mirror_use_target_agent:
agent_block["AgentID"] = target_session.agent_id
if "SessionID" in agent_block and self.mirror_use_target_session:
agent_block["SessionID"] = target_session.id
if message.name == "TransferRequest":
transfer_block = message["TransferInfo"][0]
# This is a duplicated message so we need to give it a new ID
transfer_block["TransferID"] = UUID.random()
params = transfer_block.deserialize_var("Params")
# This kind of Transfer might not even use agent credentials
if self.mirror_use_target_agent and hasattr(params, 'AgentID'):
params.AgentID = target_session.agent_id
if self.mirror_use_target_session and hasattr(params, 'SessionID'):
params.SessionID = target_session.id
transfer_block.serialize_var("Params", params)
def handle_http_request(self, session_manager: SessionManager, flow: HippoHTTPFlow):
# Already mirrored, ignore.
if flow.is_replay:
return
cap_data = flow.cap_data
if not cap_data:
return
if cap_data.cap_name in SEND_NORMALLY:
return
if cap_data.asset_server_cap:
return
# Likely doesn't have an exact equivalent in the target session, this is a temporary
# cap like an uploader URL or a stats URL.
if cap_data.type == CapType.TEMPORARY:
return
session: Optional[Session] = cap_data.session and cap_data.session()
if not session:
return
region: Optional[ProxiedRegion] = cap_data.region and cap_data.region()
if not region:
return
# Session-scoped, so we need to know if we have a session before checking
if not self.mirror_target_agent:
return
target_session: Optional[Session] = None
for poss_session in session.session_manager.sessions:
if poss_session.agent_id == self.mirror_target_agent:
target_session = poss_session
if not target_session:
return
caps_source = target_session
target_region: Optional[ProxiedRegion] = None
if region:
target_region = None
for poss_region in target_session.regions:
if poss_region.circuit_addr == region.circuit_addr:
target_region = poss_region
if not target_region:
print("No region in cap?")
return
caps_source = target_region
new_base_url = caps_source.cap_urls.get(cap_data.cap_name)
if not new_base_url:
print("No equiv cap?")
return
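# If we're mirroring, replay a copy so the original request still goes through
# untouched; otherwise, retarget the original flow at the target session's cap.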
if cap_data.cap_name in MIRROR:
flow = flow.copy()
# Have the cap data reflect the new URL we're pointing at
flow.metadata["cap_data"] = CapData(
cap_name=cap_data.cap_name,
region=weakref.ref(target_region) if target_region else None,
session=weakref.ref(target_session),
base_url=new_base_url,
)
# Tack any params onto the new base URL for the cap
new_url = new_base_url + flow.request.url[len(cap_data.base_url):]
flow.request.url = new_url
if cap_data.cap_name in MIRROR:
self._replay_flow(flow, session.session_manager)
def _replay_flow(self, flow: HippoHTTPFlow, session_manager: SessionManager):
# Work around a mitmproxy bug: changing the URL updates the Host header, which
# may cause it to drop the port even when it shouldn't. Fix the Host header.
if flow.request.port not in (80, 443) and ":" not in flow.request.host_header:
flow.request.host_header = f"{flow.request.host}:{flow.request.port}"
# Should get repopulated when it goes back through the MITM addon
flow.metadata.pop("cap_data_ser", None)
flow.metadata.pop("cap_data", None)
proxy_queue = session_manager.flow_context.to_proxy_queue
proxy_queue.put_nowait(("replay", None, flow.get_state()))
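# The mitmproxy process picks this up and re-issues the request as a replay,
# which handle_http_request() above then ignores via the flow.is_replay check.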
addons = [MessageMirrorAddon()]

View File

@@ -0,0 +1,49 @@
"""
Example of proxy-provided caps
Useful for mocking out a cap that isn't actually implemented by the server
while developing the viewer-side pieces of it.
Implements a cap that accepts an `obj_id` UUID query parameter and returns
the name of the object.
"""
import asyncio
import asgiref.wsgi
from flask import Flask, Response, request
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.proxy import addon_ctx
from hippolyzer.lib.proxy.webapp_cap_addon import WebAppCapAddon
app = Flask("GetObjectNameCapApp")
@app.route('/')
async def get_object_name():
# We should always have the current region, since the cap handler is bound to one.
# Just pull it from the `addon_ctx` module's global.
obj_mgr = addon_ctx.region.get().objects
obj_id = UUID(request.args['obj_id'])
obj = obj_mgr.lookup_fullid(obj_id)
if not obj:
return Response(f"Couldn't find {obj_id!r}", status=404, mimetype="text/plain")
try:
await asyncio.wait_for(obj_mgr.request_object_properties(obj)[0], 1.0)
except asyncio.TimeoutError:
return Response(f"Timed out requesting {obj_id!r}'s properties", status=500, mimetype="text/plain")
return Response(obj.Name, mimetype="text/plain")
class MockProxyCapExampleAddon(WebAppCapAddon):
# A cap URL with this name will be tied to each region when
# the sim is first connected to. The URL will be returned to the
# viewer in the Seed if the viewer requests it by name.
CAP_NAME = "GetObjectNameExample"
# Any asgi app should be fine.
APP = asgiref.wsgi.WsgiToAsgi(app)
addons = [MockProxyCapExampleAddon()]
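Once the proxy hands this cap to the viewer, you can also poke at it by hand.
A minimal sketch (both the cap URL and object ID below are placeholders; the
real URL is whatever the proxy returned for GetObjectNameExample in the
region's Seed response):

import urllib.request

cap_url = "https://example.invalid/caps/GetObjectNameExample"  # placeholder
obj_id = "a2e76fcd-9360-4f6d-a924-000000000003"  # any rezzed object's FullID
with urllib.request.urlopen(f"{cap_url}/?obj_id={obj_id}") as resp:
    print(resp.read().decode("utf8"))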

View File

@@ -27,7 +27,7 @@ from mitmproxy.http import HTTPFlow
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.jp2_utils import BufferedJp2k
from hippolyzer.lib.base.multiprocessing_utils import ParentProcessWatcher
from hippolyzer.lib.base.templates import TextureEntry
from hippolyzer.lib.base.templates import TextureEntryCollection
from hippolyzer.lib.proxy.addon_utils import AssetAliasTracker, BaseAddon, GlobalProperty, AddonProcess
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.base.message.message import Message
@@ -148,7 +148,7 @@ class MonochromeAddon(BaseAddon):
message["RegionInfo"][field_name] = tracker.get_alias_uuid(val)
@staticmethod
def _make_te_monochrome(tracker: AssetAliasTracker, parsed_te: TextureEntry):
def _make_te_monochrome(tracker: AssetAliasTracker, parsed_te: TextureEntryCollection):
# Need a deepcopy because TEs are owned by the ObjectManager
# and we don't want to change the canonical view.
parsed_te = copy.deepcopy(parsed_te)

View File

@@ -0,0 +1,111 @@
"""
Check object manager state against region ViewerObject cache
We can't simply compare every object we've tracked against every object in the
VOCache and report mismatches, due to the VOCache's weird eviction criteria and
the fact that certain cacheable objects are never added to it.
Off the top of my head, animesh objects get explicit KillObjects at extreme
view distances, same as avatars, but will still be present in the cache even
though they will not be in gObjectList.
"""
import asyncio
import logging
from typing import *
from hippolyzer.lib.base.objects import normalize_object_update_compressed_data
from hippolyzer.lib.base.templates import ObjectUpdateFlags, PCode
from hippolyzer.lib.proxy.addon_utils import BaseAddon, GlobalProperty
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import SessionManager, Session
from hippolyzer.lib.proxy.vocache import is_valid_vocache_dir, RegionViewerObjectCacheChain
LOG = logging.getLogger(__name__)
class ObjectManagementValidator(BaseAddon):
base_cache_path: Optional[str] = GlobalProperty(None)
orig_auto_request: Optional[bool] = GlobalProperty(None)
def handle_init(self, session_manager: SessionManager):
if self.orig_auto_request is None:
self.orig_auto_request = session_manager.settings.ALLOW_AUTO_REQUEST_OBJECTS
session_manager.settings.ALLOW_AUTO_REQUEST_OBJECTS = False
async def _choose_cache_path():
while not self.base_cache_path:
cache_dir = await AddonManager.UI.open_dir("Choose the base cache directory")
if not cache_dir:
return
if not is_valid_vocache_dir(cache_dir):
continue
self.base_cache_path = cache_dir
if not self.base_cache_path:
self._schedule_task(_choose_cache_path(), session_scoped=False)
def handle_unload(self, session_manager: SessionManager):
session_manager.settings.ALLOW_AUTO_REQUEST_OBJECTS = self.orig_auto_request
def handle_session_init(self, session: Session):
# Use only the specified cache path for the vocache
session.cache_dir = self.base_cache_path
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
if message.name != "DisableSimulator":
return
# Send it off to the client without handling it normally;
# we need to defer region teardown in the proxy
region.circuit.send(message)
self._schedule_task(self._check_cache_before_region_teardown(region))
return True
async def _check_cache_before_region_teardown(self, region: ProxiedRegion):
await asyncio.sleep(0.5)
print("Ok, checking cache differences")
try:
# Index will have been rewritten, so re-read it.
region_cache_chain = RegionViewerObjectCacheChain.for_region(
handle=region.handle,
cache_id=region.cache_id,
cache_dir=self.base_cache_path
)
if not region_cache_chain.region_caches:
print(f"no caches for {region!r}?")
return
all_full_ids = set()
for obj in region.objects.all_objects:
cacheable = True
orig_obj = obj
# Walk along the ancestry checking for things that would make the tree non-cacheable
while obj is not None:
if obj.UpdateFlags & ObjectUpdateFlags.TEMPORARY_ON_REZ:
cacheable = False
if obj.PCode == PCode.AVATAR:
cacheable = False
obj = obj.Parent
if cacheable:
all_full_ids.add(orig_obj.FullID)
for key in all_full_ids:
obj = region.objects.lookup_fullid(key)
cached_data = region_cache_chain.lookup_object_data(obj.LocalID, obj.CRC)
if not cached_data:
continue
orig_dict = obj.to_dict()
parsed_data = normalize_object_update_compressed_data(cached_data)
updated = obj.update_properties(parsed_data)
# Can't compare this yet
updated -= {"TextureEntry"}
if updated:
print(key)
for attr in updated:
print("\t", attr, orig_dict[attr], parsed_data[attr])
finally:
# Ok to teardown region in the proxy now
region.mark_dead()
addons = [ObjectManagementValidator()]

View File

@@ -4,7 +4,7 @@ Do the money dance whenever someone in the sim pays you directly
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.templates import MoneyTransactionType, PCode, ChatType
from hippolyzer.lib.base.templates import MoneyTransactionType, ChatType
from hippolyzer.lib.proxy.addon_utils import send_chat, BaseAddon
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
@@ -27,8 +27,8 @@ class PaydayAddon(BaseAddon):
return
# Check if they're likely to be in the sim
sender_obj = region.objects.lookup_fullid(sender)
if not sender_obj or sender_obj.PCode != PCode.AVATAR:
sender_obj = region.objects.lookup_avatar(sender)
if not sender_obj:
return
amount = transaction_block['Amount']
@@ -37,7 +37,7 @@ class PaydayAddon(BaseAddon):
chat_type=ChatType.SHOUT,
)
# Do the traditional money dance.
session.main_region.circuit.send_message(Message(
session.main_region.circuit.send(Message(
"AgentAnimation",
Block("AgentData", AgentID=session.agent_id, SessionID=session.id),
Block("AnimationList", AnimID=UUID("928cae18-e31d-76fd-9cc9-2f55160ff818"), StartAnim=True),

View File

@@ -0,0 +1,161 @@
"""
Import a small image (like a Nintendo sprite) and create it out of cube prims
Inefficient and doesn't even do line fill; expect it to take up to `width * height`
prims for whatever image you import!
"""
import asyncio
import struct
from typing import *
from PySide6.QtGui import QImage
from hippolyzer.lib.base.datatypes import UUID, Vector3, Quaternion
from hippolyzer.lib.base.helpers import to_chunks
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.templates import ObjectUpdateFlags, PCode, MCode, MultipleObjectUpdateFlags, \
TextureEntryCollection, JUST_CREATED_FLAGS
from hippolyzer.lib.client.object_manager import ObjectEvent, ObjectUpdateType
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.commands import handle_command
from hippolyzer.lib.base.network.transport import Direction
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
PRIM_SCALE = 0.2
class PixelArtistAddon(BaseAddon):
@handle_command()
async def import_pixel_art(self, session: Session, region: ProxiedRegion):
"""
Import a small image (like a Nintendo sprite) and create it out of cube prims
"""
filename = await AddonManager.UI.open_file(
"Open an image",
filter_str="Images (*.png *.jpg *.jpeg *.bmp)",
)
if not filename:
return
img = QImage()
with open(filename, "rb") as f:
img.loadFromData(f.read(), format=None)
img = img.convertToFormat(QImage.Format_RGBA8888)
height = img.height()
width = img.width()
pixels: List[Optional[bytes]] = []
needed_prims = 0
for y in range(height):
for x in range(width):
color: int = img.pixel(x, y)
# This will be ARGB, SL wants RGBA
alpha = (color & 0xFF000000) >> 24
color = color & 0x00FFFFFF
if alpha > 20:
# Repack RGBA to the bytes format we use for colors
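# e.g. QRgb 0xFF00CC33 (ARGB) masks to color=0x00CC33, alpha=0xFF, and
# (0x00CC33 << 8) | 0xFF == 0x00CC33FF packs to b"\x00\xcc\x33\xff" (RGBA)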
pixels.append(struct.pack("!I", (color << 8) | alpha))
needed_prims += 1
else:
# Pretty transparent, skip it
pixels.append(None)
if not await AddonManager.UI.confirm("Confirm prim use", f"This will take {needed_prims} prims"):
return
agent_obj = region.objects.lookup_fullid(session.agent_id)
agent_pos = agent_obj.RegionPosition
created_prims = []
# Watch for any newly created prims; this is basically what the viewer does to
# find prims that it just created with the build tool.
with session.objects.events.subscribe_async(
(ObjectUpdateType.OBJECT_UPDATE,),
predicate=lambda e: e.object.UpdateFlags & JUST_CREATED_FLAGS and "LocalID" in e.updated
) as get_events:
# Create a pool of prims to use for building the pixel art
for _ in range(needed_prims):
# TODO: We don't track the land group or user's active group, so
# "anyone can build" must be on for rezzing to work.
group_id = UUID()
region.circuit.send(Message(
'ObjectAdd',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id, GroupID=group_id),
Block(
'ObjectData',
PCode=PCode.PRIMITIVE,
Material=MCode.WOOD,
AddFlags=ObjectUpdateFlags.CREATE_SELECTED,
PathCurve=16,
ProfileCurve=1,
PathScaleX=100,
PathScaleY=100,
BypassRaycast=1,
RayStart=agent_obj.RegionPosition + Vector3(0, 0, 2),
RayEnd=agent_obj.RegionPosition + Vector3(0, 0, 2),
RayTargetID=UUID(),
RayEndIsIntersection=0,
Scale=Vector3(PRIM_SCALE, PRIM_SCALE, PRIM_SCALE),
Rotation=Quaternion(0.0, 0.0, 0.0, 1.0),
fill_missing=True,
),
))
# Don't spam a ton of creates at once
await asyncio.sleep(0.02)
# Read any creation events that queued up while we were creating the objects
# so we can figure out the newly-created objects' IDs
for _ in range(needed_prims):
evt: ObjectEvent = await asyncio.wait_for(get_events(), 1.0)
created_prims.append(evt.object)
# The drawing origin starts at the top left; it should be positioned just above
# the avatar on Z and centered on Y.
top_left = Vector3(0, (width * PRIM_SCALE) * -0.5, (height * PRIM_SCALE) + 2.0) + agent_pos
positioning_blocks = []
prim_idx = 0
for i, pixel_color in enumerate(pixels):
# Transparent, skip
if pixel_color is None:
continue
x = i % width
y = i // width
obj = created_prims[prim_idx]
# Set a blank texture on all faces
te = TextureEntryCollection()
te.Textures[None] = UUID('5748decc-f629-461c-9a36-a35a221fe21f')
# Set the prim color to the color from the pixel
te.Color[None] = pixel_color
# Set the prim texture and color
region.circuit.send(Message(
'ObjectImage',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
Block('ObjectData', ObjectLocalID=obj.LocalID, MediaURL=b'', TextureEntry_=te),
direction=Direction.OUT,
))
# Save the repositioning data for later since it uses a different message,
# but positions can be set in batches.
positioning_blocks.append(Block(
'ObjectData',
ObjectLocalID=obj.LocalID,
Type=MultipleObjectUpdateFlags.POSITION,
Data_={'POSITION': top_left + Vector3(0, x * PRIM_SCALE, y * -PRIM_SCALE)},
))
await asyncio.sleep(0.01)
# We actually used a prim for this, so increment the index
prim_idx += 1
# Move the "pixels" to their correct position in chunks
for chunk in to_chunks(positioning_blocks, 25):
region.circuit.send(Message(
'MultipleObjectUpdate',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
*chunk,
direction=Direction.OUT,
))
await asyncio.sleep(0.01)
addons = [PixelArtistAddon()]
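Once loaded, the importer is kicked off from the viewer's chat bar over the proxy's command channel, using the same `/524 <command>` convention the tail animator example (below) uses for `save_local_anim`: `/524 import_pixel_art`.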

View File

@@ -70,7 +70,7 @@ class RecapitatorAddon(BaseAddon):
async def _proxy_bodypart_upload(self, session: Session, region: ProxiedRegion, message: Message):
asset_block = message["AssetBlock"]
# Asset will already be in the viewer's VFS as the expected asset ID, calculate it.
asset_id = session.tid_to_assetid(asset_block["TransactionID"])
asset_id = session.transaction_to_assetid(asset_block["TransactionID"])
success = False
try:
# Xfer the asset from the viewer if it wasn't small enough to fit in AssetData
@@ -116,7 +116,7 @@ class RecapitatorAddon(BaseAddon):
except:
logging.exception("Exception while recapitating")
# Tell the viewer about the status of its original upload
region.circuit.send_message(Message(
region.circuit.send(Message(
"AssetUploadComplete",
Block("AssetBlock", UUID=asset_id, Type=asset_block["Type"], Success=success),
direction=Direction.IN,

View File

@@ -29,10 +29,11 @@ class SerializationSanityChecker(BaseAddon):
self.deserializer = UDPMessageDeserializer()
def handle_proxied_packet(self, session_manager: SessionManager, packet: UDPPacket,
session: Optional[Session], region: Optional[ProxiedRegion],
message: Optional[Message]):
session: Optional[Session], region: Optional[ProxiedRegion]):
# Well this doesn't even parse as a message, can't do anything about it.
if message is None:
try:
message = self.deserializer.deserialize(packet.data)
except:
LOG.error(f"Received unparseable message from {packet.src_addr!r}: {packet.data!r}")
return
try:

View File

@@ -6,7 +6,13 @@ from hippolyzer.lib.base.network.transport import Direction
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
SUSPICIOUS_PACKETS = {"TransferRequest", "UUIDNameRequest", "UUIDGroupNameRequest", "OpenCircuit"}
SUSPICIOUS_PACKETS = {
"TransferRequest",
"UUIDNameRequest",
"UUIDGroupNameRequest",
"OpenCircuit",
"AddCircuitCode",
}
REGULAR_IM_DIALOGS = (IMDialogType.TYPING_START, IMDialogType.TYPING_STOP, IMDialogType.NOTHING_SPECIAL)

View File

@@ -0,0 +1,22 @@
import random
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
class SimulatePacketLossAddon(BaseAddon):
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
# Messing with these may kill your circuit
if message.name in {"PacketAck", "StartPingCheck", "CompletePingCheck", "UseCircuitCode",
"CompleteAgentMovement", "AgentMovementComplete"}:
return
# Simulate 30% packet loss
if random.random() > 0.7:
# Do nothing, drop this packet on the floor
return True
return
addons = [SimulatePacketLossAddon()]

View File

@@ -0,0 +1,55 @@
"""
Tail animation generator
Demonstrates programmatic generation of local motions using BaseAnimHelperAddon
You can use this to create an animation with a script, fiddle with it until it
looks right, then finally save it with /524 save_local_anim <ANIM_NAME>.
The built animation is automatically applied to all active sessions when loaded,
and is re-generated whenever the script is edited. Unloading the script stops
the animations.
"""
from hippolyzer.lib.base.anim_utils import shift_keyframes, smooth_rot
from hippolyzer.lib.base.datatypes import Quaternion
from hippolyzer.lib.base.llanim import Animation, Joint
from hippolyzer.lib.proxy.addons import AddonManager
import local_anim
AddonManager.hot_reload(local_anim, require_addons_loaded=True)
class TailAnimator(local_anim.BaseAnimHelperAddon):
# Should be unique
ANIM_NAME = "tail_anim"
def build_anim(self) -> Animation:
anim = Animation(
base_priority=5,
duration=5.0,
loop_out_point=5.0,
loop=True,
)
# Iterate along tail joints 1 through 6
for joint_num in range(1, 7):
# Give joints further along the tail a wider range of motion
start_rot = Quaternion.from_euler(0.2, -0.3, 0.15 * joint_num)
end_rot = Quaternion.from_euler(-0.2, -0.3, -0.15 * joint_num)
rot_keyframes = [
# Tween between start_rot and end_rot, using smooth interpolation.
# SL's keyframes only allow linear interpolation, which doesn't look great
# for natural motions. `smooth_rot()` gets around that by generating
# smooth inter frames for SL to linearly interpolate between.
*smooth_rot(start_rot, end_rot, inter_frames=10, time=0.0, duration=2.5),
*smooth_rot(end_rot, start_rot, inter_frames=10, time=2.5, duration=2.5),
]
anim.joints[f"mTail{joint_num}"] = Joint(
priority=5,
# Each joint's frames should be ahead of the previous joint's by 2 frames
rot_keyframes=shift_keyframes(rot_keyframes, joint_num * 2),
)
return anim
addons = [TailAnimator()]

View File

@@ -3,7 +3,7 @@ Example of how to request a Transfer
"""
from typing import *
from hippolyzer.lib.base.legacy_inv import InventoryModel, InventoryItem
from hippolyzer.lib.base.inventory import InventoryModel, InventoryItem
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.templates import (
AssetType,
@@ -35,19 +35,19 @@ class TransferExampleAddon(BaseAddon):
async def get_first_script(self, session: Session, region: ProxiedRegion):
"""Get the contents of the first script in the selected object"""
# Ask for the object inventory so we can find a script
region.circuit.send_message(Message(
region.circuit.send(Message(
'RequestTaskInventory',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
Block('InventoryData', LocalID=session.selected.object_local),
))
inv_message = await region.message_handler.wait_for('ReplyTaskInventory', timeout=5.0)
inv_message = await region.message_handler.wait_for(('ReplyTaskInventory',), timeout=5.0)
# Xfer the inventory file and look for a script
xfer = await region.xfer_manager.request(
file_name=inv_message["InventoryData"]["Filename"], file_path=XferFilePath.CACHE)
inv_model = InventoryModel.from_bytes(xfer.reassemble_chunks())
first_script: Optional[InventoryItem] = None
for item in inv_model.items.values():
for item in inv_model.all_items:
if item.type == "lsltext":
first_script = item
if not first_script:

View File

@@ -64,12 +64,12 @@ class TurboObjectInventoryAddon(BaseAddon):
# Any previous requests will have triggered a delete of the inventory file
# by marking it complete on the server-side. Re-send our RequestTaskInventory
# to make sure there's a fresh copy.
region.circuit.send_message(request_msg.take())
inv_message = await region.message_handler.wait_for('ReplyTaskInventory', timeout=5.0)
region.circuit.send(request_msg.take())
inv_message = await region.message_handler.wait_for(('ReplyTaskInventory',), timeout=5.0)
# No task inventory, send the reply as-is
file_name = inv_message["InventoryData"]["Filename"]
if not file_name:
region.circuit.send_message(inv_message)
region.circuit.send(inv_message)
return
xfer = region.xfer_manager.request(
@@ -87,7 +87,7 @@ class TurboObjectInventoryAddon(BaseAddon):
continue
# Send the original ReplyTaskInventory to the viewer so it knows the file is ready
region.circuit.send_message(inv_message)
region.circuit.send(inv_message)
proxied_xfer = Xfer(data=xfer.reassemble_chunks())
# Wait for the viewer to request the inventory file

View File

@@ -102,7 +102,7 @@ class UploaderAddon(BaseAddon):
ais_item_to_inventory_data(ais_item),
direction=Direction.IN
)
region.circuit.send_message(message)
region.circuit.send(message)
addons = [UploaderAddon()]

View File

@@ -2,7 +2,7 @@
Example of how to request an Xfer
"""
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.legacy_inv import InventoryModel
from hippolyzer.lib.base.inventory import InventoryModel
from hippolyzer.lib.base.templates import XferFilePath, AssetType, InventoryType, WearableType
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.proxy.addon_utils import BaseAddon, show_message
@@ -15,14 +15,14 @@ class XferExampleAddon(BaseAddon):
@handle_command()
async def get_mute_list(self, session: Session, region: ProxiedRegion):
"""Fetch the current user's mute list"""
region.circuit.send_message(Message(
region.circuit.send(Message(
'MuteListRequest',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
Block("MuteData", MuteCRC=0),
))
# Wait for any MuteListUpdate, dropping it before it reaches the viewer
update_msg = await region.message_handler.wait_for('MuteListUpdate', timeout=5.0)
update_msg = await region.message_handler.wait_for(('MuteListUpdate',), timeout=5.0)
mute_file_name = update_msg["MuteData"]["Filename"]
if not mute_file_name:
show_message("Nobody muted?")
@@ -35,14 +35,14 @@ class XferExampleAddon(BaseAddon):
@handle_command()
async def get_task_inventory(self, session: Session, region: ProxiedRegion):
"""Get the inventory of the currently selected object"""
region.circuit.send_message(Message(
region.circuit.send(Message(
'RequestTaskInventory',
# If no session is passed in we'll use the active session when the coro was created
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
Block('InventoryData', LocalID=session.selected.object_local),
))
inv_message = await region.message_handler.wait_for('ReplyTaskInventory', timeout=5.0)
inv_message = await region.message_handler.wait_for(('ReplyTaskInventory',), timeout=5.0)
# Xfer doesn't need to be immediately awaited, multiple signals can be waited on.
xfer = region.xfer_manager.request(
@@ -57,7 +57,7 @@ class XferExampleAddon(BaseAddon):
await xfer
inv_model = InventoryModel.from_bytes(xfer.reassemble_chunks())
item_names = [item.name for item in inv_model.items.values()]
item_names = [item.name for item in inv_model.all_items]
show_message(item_names)
@handle_command()
@@ -98,7 +98,7 @@ textures 1
data=asset_data,
transaction_id=transaction_id
)
region.circuit.send_message(Message(
region.circuit.send(Message(
'CreateInventoryItem',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
Block(

View File

@@ -2,7 +2,7 @@ import enum
import logging
import typing
from PySide2 import QtCore, QtGui
from PySide6 import QtCore, QtGui
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.message_logger import FilteringMessageLogger
@@ -19,9 +19,9 @@ class MessageLogHeader(enum.IntEnum):
class MessageLogModel(QtCore.QAbstractTableModel, FilteringMessageLogger):
def __init__(self, parent=None):
def __init__(self, parent=None, maxlen=2000):
QtCore.QAbstractTableModel.__init__(self, parent)
FilteringMessageLogger.__init__(self)
FilteringMessageLogger.__init__(self, maxlen=maxlen)
def _begin_insert(self, insert_idx: int):
self.beginInsertRows(QtCore.QModelIndex(), insert_idx, insert_idx)

View File

@@ -7,6 +7,7 @@ import sys
import time
from typing import Optional
import mitmproxy.ctx
import mitmproxy.exceptions
from hippolyzer.lib.base import llsd
@@ -20,6 +21,7 @@ from hippolyzer.lib.proxy.lludp_proxy import SLSOCKS5Server
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import SessionManager, Session
from hippolyzer.lib.proxy.settings import ProxySettings
LOG = logging.getLogger(__name__)
@@ -42,7 +44,7 @@ class SelectionManagerAddon(BaseAddon):
LOG.debug(f"Don't know about selected {local_id}, requesting object")
needed_objects.add(local_id)
if needed_objects:
if needed_objects and session.session_manager.settings.ALLOW_AUTO_REQUEST_OBJECTS:
region.objects.request_objects(needed_objects)
# ParcelDwellRequests are sent whenever "about land" is opened. This gives us a
# decent mechanism for selecting parcels.
@@ -85,14 +87,17 @@ class REPLAddon(BaseAddon):
def run_http_proxy_process(proxy_host, http_proxy_port, flow_context: HTTPFlowContext):
mitm_loop = asyncio.new_event_loop()
asyncio.set_event_loop(mitm_loop)
mitmproxy_master = create_http_proxy(proxy_host, http_proxy_port, flow_context)
mitmproxy_master.start_server()
gc.freeze()
mitm_loop.run_forever()
async def mitmproxy_loop():
mitmproxy_master = create_http_proxy(proxy_host, http_proxy_port, flow_context)
gc.freeze()
await mitmproxy_master.run()
asyncio.run(mitmproxy_loop())
def start_proxy(extra_addons: Optional[list] = None, extra_addon_paths: Optional[list] = None,
session_manager=None, proxy_host=None):
def start_proxy(session_manager: SessionManager, extra_addons: Optional[list] = None,
extra_addon_paths: Optional[list] = None, proxy_host=None):
extra_addons = extra_addons or []
extra_addon_paths = extra_addon_paths or []
extra_addons.append(SelectionManagerAddon())
@@ -103,14 +108,13 @@ def start_proxy(extra_addons: Optional[list] = None, extra_addon_paths: Optional
root_log.setLevel(logging.INFO)
logging.basicConfig()
loop = asyncio.get_event_loop()
loop = asyncio.get_event_loop_policy().get_event_loop()
udp_proxy_port = int(os.environ.get("HIPPO_UDP_PORT", 9061))
http_proxy_port = int(os.environ.get("HIPPO_HTTP_PORT", 9062))
udp_proxy_port = session_manager.settings.SOCKS_PROXY_PORT
http_proxy_port = session_manager.settings.HTTP_PROXY_PORT
if proxy_host is None:
proxy_host = os.environ.get("HIPPO_BIND_HOST", "127.0.0.1")
proxy_host = session_manager.settings.PROXY_BIND_ADDR
session_manager = session_manager or SessionManager()
flow_context = session_manager.flow_context
session_manager.name_cache.load_viewer_caches()
@@ -119,7 +123,7 @@ def start_proxy(extra_addons: Optional[list] = None, extra_addon_paths: Optional
if sys.argv[1] == "--setup-ca":
try:
mitmproxy_master = create_http_proxy(proxy_host, http_proxy_port, flow_context)
except mitmproxy.exceptions.ServerException:
except mitmproxy.exceptions.MitmproxyException:
# Proxy already running, create the master so we don't try to bind to a port
mitmproxy_master = create_proxy_master(proxy_host, http_proxy_port, flow_context)
setup_ca(sys.argv[2], mitmproxy_master)
@@ -131,6 +135,9 @@ def start_proxy(extra_addons: Optional[list] = None, extra_addon_paths: Optional
daemon=True,
)
http_proc.start()
# These need to be set for mitmproxy's ASGIApp serving code to work.
mitmproxy.ctx.master = None
mitmproxy.ctx.log = logging.getLogger("mitmproxy log")
server = SLSOCKS5Server(session_manager)
coro = asyncio.start_server(server.handle_connection, proxy_host, udp_proxy_port)
@@ -186,7 +193,7 @@ def _windows_timeout_killer(pid: int):
def main():
multiprocessing.set_start_method("spawn")
start_proxy()
start_proxy(SessionManager(ProxySettings()))
if __name__ == "__main__":

View File

@@ -1,5 +1,6 @@
import asyncio
import base64
import dataclasses
import email
import functools
import html
@@ -16,8 +17,8 @@ import urllib.parse
from typing import *
import multidict
from qasync import QEventLoop
from PySide2 import QtCore, QtWidgets, QtGui
from qasync import QEventLoop, asyncSlot
from PySide6 import QtCore, QtWidgets, QtGui
from hippolyzer.apps.model import MessageLogModel, MessageLogHeader, RegionListModel
from hippolyzer.apps.proxy import start_proxy
@@ -33,17 +34,20 @@ from hippolyzer.lib.base.message.message_formatting import (
SpannedString,
)
from hippolyzer.lib.base.message.msgtypes import MsgType
from hippolyzer.lib.base.message.template_dict import TemplateDictionary
from hippolyzer.lib.base.message.template_dict import DEFAULT_TEMPLATE_DICT
from hippolyzer.lib.base.settings import SettingDescriptor
from hippolyzer.lib.base.ui_helpers import loadUi
import hippolyzer.lib.base.serialization as se
from hippolyzer.lib.base.network.transport import Direction, WrappingUDPTransport
from hippolyzer.lib.base.network.transport import Direction, SocketUDPTransport
from hippolyzer.lib.proxy.addons import BaseInteractionManager, AddonManager
from hippolyzer.lib.proxy.ca_utils import setup_ca_everywhere
from hippolyzer.lib.proxy.caps_client import ProxyCapsClient
from hippolyzer.lib.proxy.http_proxy import create_proxy_master, HTTPFlowContext
from hippolyzer.lib.proxy.message_logger import LLUDPMessageLogEntry, AbstractMessageLogEntry
from hippolyzer.lib.proxy.message_logger import LLUDPMessageLogEntry, AbstractMessageLogEntry, WrappingMessageLogger, \
import_log_entries, export_log_entries
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session, SessionManager
from hippolyzer.lib.proxy.settings import ProxySettings
from hippolyzer.lib.proxy.templates import CAP_TEMPLATES
LOG = logging.getLogger(__name__)
@@ -58,7 +62,7 @@ def show_error_message(error_msg, parent=None):
error_dialog = QtWidgets.QErrorMessage(parent=parent)
# No obvious way to set this to plaintext, yuck...
error_dialog.showMessage(html.escape(error_msg))
error_dialog.exec_()
error_dialog.exec()
error_dialog.raise_()
@@ -66,11 +70,11 @@ class GUISessionManager(SessionManager, QtCore.QObject):
regionAdded = QtCore.Signal(ProxiedRegion)
regionRemoved = QtCore.Signal(ProxiedRegion)
def __init__(self, model):
SessionManager.__init__(self)
def __init__(self, settings):
SessionManager.__init__(self, settings)
QtCore.QObject.__init__(self)
self.all_regions = []
self.message_logger = model
self.message_logger = WrappingMessageLogger()
def checkRegions(self):
new_regions = itertools.chain(*[s.regions for s in self.sessions])
@@ -85,13 +89,13 @@ class GUISessionManager(SessionManager, QtCore.QObject):
self.all_regions = new_regions
class GUIInteractionManager(BaseInteractionManager, QtCore.QObject):
def __init__(self, parent):
class GUIInteractionManager(BaseInteractionManager):
def __init__(self, parent: QtWidgets.QWidget):
BaseInteractionManager.__init__(self)
QtCore.QObject.__init__(self, parent=parent)
self._parent = parent
def main_window_handle(self) -> Any:
return self.parent()
return self._parent
def _dialog_async_exec(self, dialog: QtWidgets.QDialog):
future = asyncio.Future()
@@ -99,12 +103,16 @@ class GUIInteractionManager(BaseInteractionManager, QtCore.QObject):
dialog.open()
return future
async def _file_dialog(self, caption: str, directory: str, filter_str: str, mode: QtWidgets.QFileDialog.FileMode) \
-> Tuple[bool, QtWidgets.QFileDialog]:
dialog = QtWidgets.QFileDialog(self.parent(), caption=caption, directory=directory, filter=filter_str)
async def _file_dialog(
self, caption: str, directory: str, filter_str: str, mode: QtWidgets.QFileDialog.FileMode,
default_suffix: str = '',
) -> Tuple[bool, QtWidgets.QFileDialog]:
dialog = QtWidgets.QFileDialog(self._parent, caption=caption, directory=directory, filter=filter_str)
dialog.setFileMode(mode)
if mode == QtWidgets.QFileDialog.FileMode.AnyFile:
dialog.setAcceptMode(QtWidgets.QFileDialog.AcceptMode.AcceptSave)
if default_suffix:
dialog.setDefaultSuffix(default_suffix)
res = await self._dialog_async_exec(dialog)
return res, dialog
@@ -132,14 +140,46 @@ class GUIInteractionManager(BaseInteractionManager, QtCore.QObject):
return None
return dialog.selectedFiles()[0]
async def save_file(self, caption: str = '', directory: str = '', filter_str: str = '') -> Optional[str]:
async def save_file(self, caption: str = '', directory: str = '', filter_str: str = '',
default_suffix: str = '') -> Optional[str]:
res, dialog = await self._file_dialog(
caption, directory, filter_str, QtWidgets.QFileDialog.FileMode.AnyFile
caption, directory, filter_str, QtWidgets.QFileDialog.FileMode.AnyFile, default_suffix,
)
if not res or not dialog.selectedFiles():
return None
return dialog.selectedFiles()[0]
async def confirm(self, title: str, caption: str) -> bool:
msg = QtWidgets.QMessageBox(
QtWidgets.QMessageBox.Icon.Question,
title,
caption,
QtWidgets.QMessageBox.Ok | QtWidgets.QMessageBox.Cancel,
self._parent,
)
fut = asyncio.Future()
msg.finished.connect(lambda r: fut.set_result(r))
msg.open()
return (await fut) == QtWidgets.QMessageBox.Ok
class GUIProxySettings(ProxySettings):
    """Persistent settings backed by QSettings"""

    FIRST_RUN: bool = SettingDescriptor(True)
def __init__(self, settings: QtCore.QSettings):
super().__init__()
self._settings_obj = settings
def get_setting(self, name: str) -> Any:
val: Any = self._settings_obj.value(name, defaultValue=dataclasses.MISSING)
if val is dataclasses.MISSING:
return val
return json.loads(val)
def set_setting(self, name: str, val: Any):
self._settings_obj.setValue(name, json.dumps(val))
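    # Everything round-trips through JSON, so non-string settings like the
    # FILTERS dict survive QSettings storage intact. A rough sketch of the
    # behavior, assuming SettingDescriptor proxies attribute access through
    # get_setting() and set_setting() (which is what the rest of the GUI
    # relies on):
    #
    #   settings = GUIProxySettings(QtCore.QSettings("SaladDais", "hippolyzer"))
    #   settings.FIRST_RUN = False  # stored as the JSON string "false"
    #   assert settings.FIRST_RUN is False
    #   settings.FILTERS = {"Objects": "ObjectUpdate*"}  # hypothetical preset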
def nonFatalExceptions(f):
@functools.wraps(f)
@@ -154,7 +194,35 @@ def nonFatalExceptions(f):
return _wrapper
class ProxyGUI(QtWidgets.QMainWindow):
def buildReplacements(session: Session, region: ProxiedRegion):
if not session or not region:
return {}
selected = session.selected
agent_object = region.objects.lookup_fullid(session.agent_id)
selected_local = selected.object_local
selected_object = None
if selected_local:
# We may or may not have an object for this
selected_object = region.objects.lookup_localid(selected_local)
return {
"SELECTED_LOCAL": selected_local,
"SELECTED_FULL": selected_object.FullID if selected_object else None,
"SELECTED_PARCEL_LOCAL": selected.parcel_local,
"SELECTED_PARCEL_FULL": selected.parcel_full,
"SELECTED_SCRIPT_ITEM": selected.script_item,
"SELECTED_TASK_ITEM": selected.task_item,
"AGENT_ID": session.agent_id,
"AGENT_LOCAL": agent_object.LocalID if agent_object else None,
"SESSION_ID": session.id,
"AGENT_POS": agent_object.Position if agent_object else None,
"NULL_KEY": UUID(),
"RANDOM_KEY": UUID.random,
"CIRCUIT_CODE": session.circuit_code,
"REGION_HANDLE": region.handle,
}
class MessageLogWindow(QtWidgets.QMainWindow):
DEFAULT_IGNORE = "StartPingCheck CompletePingCheck PacketAck SimulatorViewerTimeMessage SimStats " \
"AgentUpdate AgentAnimation AvatarAnimation ViewerEffect CoarseLocationUpdate LayerData " \
"CameraConstraint ObjectUpdateCached RequestMultipleObjects ObjectUpdate ObjectUpdateCompressed " \
@@ -168,44 +236,57 @@ class ProxyGUI(QtWidgets.QMainWindow):
textRequest: QtWidgets.QTextEdit
def __init__(self):
super().__init__()
def __init__(
self, settings: GUIProxySettings, session_manager: GUISessionManager,
log_live_messages: bool, parent: Optional[QtWidgets.QWidget] = None,
):
super().__init__(parent=parent)
loadUi(MAIN_WINDOW_UI_PATH, self)
self.settings = QtCore.QSettings("SaladDais", "hippolyzer")
if parent:
self.setWindowTitle("Message Log")
self.menuBar.setEnabled(False) # type: ignore
self.menuBar.hide() # type: ignore
self._selectedEntry: Optional[AbstractMessageLogEntry] = None
self.model = MessageLogModel(parent=self.tableView)
self.settings = settings
self.sessionManager = session_manager
if log_live_messages:
self.model = MessageLogModel(parent=self.tableView)
session_manager.message_logger.loggers.append(self.model)
else:
self.model = MessageLogModel(parent=self.tableView, maxlen=None)
self.tableView.setModel(self.model)
self.model.rowsAboutToBeInserted.connect(self.beforeInsert)
self.model.rowsInserted.connect(self.afterInsert)
self.tableView.selectionModel().selectionChanged.connect(self._messageSelected)
self.checkBeautify.clicked.connect(self._showSelectedMessage)
self.checkPause.clicked.connect(self._setPaused)
self._setFilter(self.DEFAULT_FILTER)
self.setFilter(self.DEFAULT_FILTER)
self.btnClearLog.clicked.connect(self.model.clear)
self.lineEditFilter.editingFinished.connect(self._setFilter)
self.lineEditFilter.editingFinished.connect(self.setFilter)
self.btnMessageBuilder.clicked.connect(self._sendToMessageBuilder)
self.btnCopyRepr.clicked.connect(self._copyRepr)
self.actionInstallHTTPSCerts.triggered.connect(self._installHTTPSCerts)
self.actionInstallHTTPSCerts.triggered.connect(self.installHTTPSCerts)
self.actionManageAddons.triggered.connect(self._manageAddons)
self.actionManageFilters.triggered.connect(self._manageFilters)
self.actionOpenMessageBuilder.triggered.connect(self._openMessageBuilder)
self.actionProxyRemotelyAccessible.setChecked(
self.settings.value("RemotelyAccessible", False, type=bool))
self.actionUseViewerObjectCache.setChecked(
self.settings.value("UseViewerObjectCache", False, type=bool))
self.actionProxyRemotelyAccessible.setChecked(self.settings.REMOTELY_ACCESSIBLE)
self.actionUseViewerObjectCache.setChecked(self.settings.USE_VIEWER_OBJECT_CACHE)
self.actionRequestMissingObjects.setChecked(self.settings.AUTOMATICALLY_REQUEST_MISSING_OBJECTS)
self.actionProxyRemotelyAccessible.triggered.connect(self._setProxyRemotelyAccessible)
self.actionUseViewerObjectCache.triggered.connect(self._setUseViewerObjectCache)
self.actionRequestMissingObjects.triggered.connect(self._setRequestMissingObjects)
self.actionOpenNewMessageLogWindow.triggered.connect(self._openNewMessageLogWindow)
self.actionImportLogEntries.triggered.connect(self._importLogEntries)
self.actionExportLogEntries.triggered.connect(self._exportLogEntries)
self._filterMenu = QtWidgets.QMenu()
self._populateFilterMenu()
self.toolButtonFilter.setMenu(self._filterMenu)
self.sessionManager = GUISessionManager(self.model)
self.interactionManager = GUIInteractionManager(self)
AddonManager.UI = self.interactionManager
self._shouldScrollOnInsert = True
self.tableView.horizontalHeader().resizeSection(MessageLogHeader.Host, 80)
self.tableView.horizontalHeader().resizeSection(MessageLogHeader.Method, 60)
@@ -214,32 +295,38 @@ class ProxyGUI(QtWidgets.QMainWindow):
self.textResponse.hide()
def closeEvent(self, event) -> None:
loggers = self.sessionManager.message_logger.loggers
if self.model in loggers:
loggers.remove(self.model)
super().closeEvent(event)
def _populateFilterMenu(self):
def _addFilterAction(text, filter_str):
filter_action = QtWidgets.QAction(text, self)
filter_action.triggered.connect(lambda: self._setFilter(filter_str))
filter_action = QtGui.QAction(text, self)
filter_action.triggered.connect(lambda: self.setFilter(filter_str))
self._filterMenu.addAction(filter_action)
self._filterMenu.clear()
_addFilterAction("Default", self.DEFAULT_FILTER)
filters = self.getFilterDict()
filters = self.settings.FILTERS
for preset_name, preset_filter in filters.items():
_addFilterAction(preset_name, preset_filter)
def getFilterDict(self):
return json.loads(str(self.settings.value("Filters", "{}")))
return self.settings.FILTERS
def setFilterDict(self, val: dict):
self.settings.setValue("Filters", json.dumps(val))
self.settings.FILTERS = val
self._populateFilterMenu()
def _manageFilters(self):
dialog = FilterDialog(self)
dialog.exec_()
dialog.exec()
@nonFatalExceptions
def _setFilter(self, filter_str=None):
def setFilter(self, filter_str=None):
if filter_str is None:
filter_str = self.lineEditFilter.text()
else:
@@ -271,23 +358,22 @@ class ProxyGUI(QtWidgets.QMainWindow):
return
req = entry.request(
beautify=self.checkBeautify.isChecked(),
replacements=self.buildReplacements(entry.session, entry.region),
replacements=buildReplacements(entry.session, entry.region),
)
highlight_range = None
if isinstance(req, SpannedString):
match_result = self.model.filter.match(entry)
# Match result was a tuple indicating what matched
if isinstance(match_result, tuple):
highlight_range = req.spans.get(match_result)
self.textRequest.setPlainText(req)
if highlight_range:
cursor = self.textRequest.textCursor()
cursor.setPosition(highlight_range[0], QtGui.QTextCursor.MoveAnchor)
cursor.setPosition(highlight_range[1], QtGui.QTextCursor.KeepAnchor)
highlight_format = QtGui.QTextBlockFormat()
highlight_format.setBackground(QtCore.Qt.yellow)
cursor.setBlockFormat(highlight_format)
# The string has a map of fields and their associated positions within the
# string; use that to highlight any individual fields the filter matched on.
if isinstance(req, SpannedString):
for field in self.model.filter.match(entry, short_circuit=False).fields:
field_span = req.spans.get(field)
if not field_span:
continue
cursor = self.textRequest.textCursor()
cursor.setPosition(field_span[0], QtGui.QTextCursor.MoveAnchor)
cursor.setPosition(field_span[1], QtGui.QTextCursor.KeepAnchor)
highlight_format = QtGui.QTextBlockFormat()
highlight_format.setBackground(QtCore.Qt.yellow)
cursor.setBlockFormat(highlight_format)
resp = entry.response(beautify=self.checkBeautify.isChecked())
if resp:
@@ -311,7 +397,7 @@ class ProxyGUI(QtWidgets.QMainWindow):
win.show()
msg = self._selectedEntry
beautify = self.checkBeautify.isChecked()
replacements = self.buildReplacements(msg.session, msg.region)
replacements = buildReplacements(msg.session, msg.region)
win.setMessageText(msg.request(beautify=beautify, replacements=replacements))
@nonFatalExceptions
@@ -327,37 +413,43 @@ class ProxyGUI(QtWidgets.QMainWindow):
win = MessageBuilderWindow(self, self.sessionManager)
win.show()
def buildReplacements(self, session: Session, region: ProxiedRegion):
if not session or not region:
return {}
selected = session.selected
agent_object = region.objects.lookup_fullid(session.agent_id)
selected_local = selected.object_local
selected_object = None
if selected_local:
# We may or may not have an object for this
selected_object = region.objects.lookup_localid(selected_local)
return {
"SELECTED_LOCAL": selected_local,
"SELECTED_FULL": selected_object.FullID if selected_object else None,
"SELECTED_PARCEL_LOCAL": selected.parcel_local,
"SELECTED_PARCEL_FULL": selected.parcel_full,
"SELECTED_SCRIPT_ITEM": selected.script_item,
"SELECTED_TASK_ITEM": selected.task_item,
"AGENT_ID": session.agent_id,
"AGENT_LOCAL": agent_object.LocalID if agent_object else None,
"SESSION_ID": session.id,
"AGENT_POS": agent_object.Position if agent_object else None,
"NULL_KEY": UUID(),
"RANDOM_KEY": UUID.random,
"CIRCUIT_CODE": session.circuit_code,
"REGION_HANDLE": region.handle,
}
def _openNewMessageLogWindow(self):
win: QtWidgets.QMainWindow = MessageLogWindow(
self.settings, self.sessionManager, log_live_messages=True, parent=self)
win.setFilter(self.lineEditFilter.text())
win.show()
win.activateWindow()
def _installHTTPSCerts(self):
@asyncSlot()
async def _importLogEntries(self):
log_file = await AddonManager.UI.open_file(
caption="Import Log Entries", filter_str="Hippolyzer Logs (*.hippolog)"
)
if not log_file:
return
win = MessageLogWindow(self.settings, self.sessionManager, log_live_messages=False, parent=self)
win.setFilter(self.lineEditFilter.text())
with open(log_file, "rb") as f:
entries = import_log_entries(f.read())
for entry in entries:
win.model.add_log_entry(entry)
win.show()
win.activateWindow()
@asyncSlot()
async def _exportLogEntries(self):
log_file = await AddonManager.UI.save_file(
caption="Export Log Entries", filter_str="Hippolyzer Logs (*.hippolog)", default_suffix="hippolog",
)
if not log_file:
return
with open(log_file, "wb") as f:
f.write(export_log_entries(self.model))
def installHTTPSCerts(self):
msg = QtWidgets.QMessageBox()
msg.setText("This will install the proxy's HTTPS certificate in the config dir"
" of any installed viewers, continue?")
msg.setText("Would you like to install the proxy's HTTPS certificate in the config dir"
" of any installed viewers so that HTTPS connections will work?")
yes_btn = msg.addButton("Yes", QtWidgets.QMessageBox.NoRole)
msg.addButton("No", QtWidgets.QMessageBox.NoRole)
msg.exec()
@@ -376,24 +468,26 @@ class ProxyGUI(QtWidgets.QMainWindow):
msg.exec()
def _setProxyRemotelyAccessible(self, checked: bool):
self.settings.setValue("RemotelyAccessible", checked)
self.sessionManager.settings.REMOTELY_ACCESSIBLE = checked
msg = QtWidgets.QMessageBox()
msg.setText("Remote accessibility setting changes will take effect on next run")
msg.exec()
def _setUseViewerObjectCache(self, checked: bool):
self.settings.setValue("UseViewerObjectCache", checked)
self.sessionManager.use_viewer_object_cache = checked
self.sessionManager.settings.USE_VIEWER_OBJECT_CACHE = checked
def _setRequestMissingObjects(self, checked: bool):
self.sessionManager.settings.AUTOMATICALLY_REQUEST_MISSING_OBJECTS = checked
def _manageAddons(self):
dialog = AddonDialog(self)
dialog.exec_()
dialog.exec()
def getAddonList(self) -> List[str]:
return json.loads(str(self.settings.value("Addons", "[]")))
return self.sessionManager.settings.ADDON_SCRIPTS
def setAddonList(self, val: List[str]):
self.settings.setValue("Addons", json.dumps(val))
self.sessionManager.settings.ADDON_SCRIPTS = val
BANNED_HEADERS = ("content-length", "host")
@@ -431,7 +525,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
def __init__(self, parent, session_manager):
super().__init__(parent=parent)
loadUi(MESSAGE_BUILDER_UI_PATH, self)
self.templateDict = TemplateDictionary()
self.templateDict = DEFAULT_TEMPLATE_DICT
self.llsdSerializer = LLSDMessageSerializer()
self.sessionManager: SessionManager = session_manager
self.regionModel = RegionListModel(self, self.sessionManager)
@@ -476,7 +570,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
else:
self.comboUntrusted.addItem(message_name)
cap_names = sorted(set(itertools.chain(*[r.caps.keys() for r in self.regionModel.regions])))
cap_names = sorted(set(itertools.chain(*[r.cap_urls.keys() for r in self.regionModel.regions])))
for cap_name in cap_names:
if cap_name.endswith("ProxyWrapper"):
continue
@@ -507,7 +601,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
break
self.textRequest.setPlainText(
f"""{method} [[{cap_name}]]{path}{params} HTTP/1.1
# {region.caps.get(cap_name, "<unknown URI>")}
# {region.cap_urls.get(cap_name, "<unknown URI>")}
{headers}
{body}"""
)
@@ -560,24 +654,9 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
if var.name in ("TaskID", "ObjectID"):
return VerbatimHumanVal("[[SELECTED_FULL]]")
if var.type.is_int:
return 0
elif var.type.is_float:
return 0.0
elif var.type == MsgType.MVT_LLUUID:
return UUID()
elif var.type == MsgType.MVT_BOOL:
return False
elif var.type == MsgType.MVT_VARIABLE:
return ""
elif var.type in (MsgType.MVT_LLVector3, MsgType.MVT_LLVector3d, MsgType.MVT_LLQuaternion):
return VerbatimHumanVal("(0.0, 0.0, 0.0)")
elif var.type == MsgType.MVT_LLVector4:
return VerbatimHumanVal("(0.0, 0.0, 0.0, 0.0)")
elif var.type == MsgType.MVT_FIXED:
return b"\x00" * var.size
elif var.type == MsgType.MVT_IP_ADDR:
return "0.0.0.0"
default_val = var.default_value
if default_val is not None:
return default_val
return VerbatimHumanVal("")
@nonFatalExceptions
@@ -585,7 +664,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
session, region = self._getTarget()
msg_text = self.textRequest.toPlainText()
replacements = self.parent().buildReplacements(session, region)
replacements = buildReplacements(session, region)
if re.match(r"\A\s*(in|out)\s+", msg_text, re.I):
sender_func = self._sendLLUDPMessage
@@ -617,13 +696,11 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
msg = HumanMessageSerializer.from_human_string(msg_text, replacements, env, safe=False)
if self.checkLLUDPViaCaps.isChecked():
if msg.direction == Direction.IN:
region.eq_manager.queue_event(
self.llsdSerializer.serialize(msg, as_dict=True)
)
region.eq_manager.inject_message(msg)
else:
self._sendHTTPRequest(
"POST",
region.caps["UntrustedSimulatorMessage"],
region.cap_urls["UntrustedSimulatorMessage"],
{"Content-Type": "application/llsd+xml", "Accept": "application/llsd+xml"},
self.llsdSerializer.serialize(msg),
)
@@ -631,19 +708,26 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
transport = None
off_circuit = self.checkOffCircuit.isChecked()
if off_circuit:
transport = WrappingUDPTransport(socket.socket(socket.AF_INET, socket.SOCK_DGRAM))
region.circuit.send_message(msg, transport=transport)
transport = SocketUDPTransport(socket.socket(socket.AF_INET, socket.SOCK_DGRAM))
region.circuit.send(msg, transport=transport)
if off_circuit:
transport.close()
def _sendEQMessage(self, session, region: Optional[ProxiedRegion], msg_text: str, _replacements: dict):
def _sendEQMessage(self, session, region: Optional[ProxiedRegion], msg_text: str, replacements: dict):
if not session or not region:
raise RuntimeError("Need a valid session and region to send EQ event")
message_line, _, body = (x.strip() for x in msg_text.partition("\n"))
message_name = message_line.rsplit(" ", 1)[-1]
region.eq_manager.queue_event({
env = self._buildEnv(session, region)
def directive_handler(m):
return self._handleHTTPDirective(env, replacements, False, m)
body = re.sub(rb"<!HIPPO(\w+)\[\[(.*?)]]>", directive_handler, body.encode("utf8"), flags=re.S)
region.eq_manager.inject_event({
"message": message_name,
"body": llsd.parse_xml(body.encode("utf8")),
"body": llsd.parse_xml(body),
})
def _sendHTTPMessage(self, session, region, msg_text: str, replacements: dict):
@@ -667,7 +751,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
cap_name = match.group(1)
cap_url = session.global_caps.get(cap_name)
if not cap_url:
cap_url = region.caps.get(cap_name)
cap_url = region.cap_urls.get(cap_name)
if not cap_url:
raise ValueError("Don't have a Cap for %s" % cap_name)
uri = cap_url + match.group(2)
@@ -707,7 +791,10 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
val = subfield_eval(contents.decode("utf8").strip(), globals_={**env, **replacements})
val = _coerce_to_bytes(val)
elif directive == b"REPL":
val = _coerce_to_bytes(replacements[contents.decode("utf8").strip()])
repl = replacements[contents.decode("utf8").strip()]
if callable(repl):
repl = repl()
val = _coerce_to_bytes(repl)
else:
raise ValueError(f"Unknown directive {directive}")
@@ -719,7 +806,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
return val
def _sendHTTPRequest(self, method, uri, headers, body):
caps_client = ProxyCapsClient()
caps_client = ProxyCapsClient(self.sessionManager.settings)
async def _send_request():
req = caps_client.request(method, uri, headers=headers, data=body)
@@ -734,7 +821,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
class AddonDialog(QtWidgets.QDialog):
listAddons: QtWidgets.QListWidget
def __init__(self, parent: ProxyGUI):
def __init__(self, parent: MessageLogWindow):
super().__init__()
loadUi(ADDON_DIALOG_UI_PATH, self)
@@ -785,7 +872,7 @@ class AddonDialog(QtWidgets.QDialog):
class FilterDialog(QtWidgets.QDialog):
listFilters: QtWidgets.QListWidget
def __init__(self, parent: ProxyGUI):
def __init__(self, parent: MessageLogWindow):
super().__init__()
loadUi(FILTER_DIALOG_UI_PATH, self)
@@ -829,18 +916,22 @@ def gui_main():
app = QtWidgets.QApplication(sys.argv)
loop = QEventLoop(app)
asyncio.set_event_loop(loop)
window = ProxyGUI()
settings = GUIProxySettings(QtCore.QSettings("SaladDais", "hippolyzer"))
session_manager = GUISessionManager(settings)
window = MessageLogWindow(settings, session_manager, log_live_messages=True)
AddonManager.UI = GUIInteractionManager(window)
timer = QtCore.QTimer(app)
timer.timeout.connect(window.sessionManager.checkRegions)
timer.start(100)
signal.signal(signal.SIGINT, lambda *args: QtWidgets.QApplication.quit())
window.show()
remote_access = window.settings.value("RemotelyAccessible", False, type=bool)
use_vocache = window.settings.value("UseViewerObjectCache", False, type=bool)
window.sessionManager.use_viewer_object_cache = use_vocache
http_host = None
if remote_access:
if window.sessionManager.settings.REMOTELY_ACCESSIBLE:
http_host = "0.0.0.0"
if settings.FIRST_RUN:
settings.FIRST_RUN = False
# Automatically offer to install the HTTPS certs on first run.
window.installHTTPSCerts()
start_proxy(
session_manager=window.sessionManager,
extra_addon_paths=window.getAddonList(),

View File

@@ -256,6 +256,10 @@
<bool>true</bool>
</property>
<addaction name="actionOpenMessageBuilder"/>
<addaction name="actionOpenNewMessageLogWindow"/>
<addaction name="separator"/>
<addaction name="actionImportLogEntries"/>
<addaction name="actionExportLogEntries"/>
<addaction name="separator"/>
<addaction name="actionInstallHTTPSCerts"/>
<addaction name="actionManageAddons"/>
@@ -263,6 +267,7 @@
<addaction name="separator"/>
<addaction name="actionProxyRemotelyAccessible"/>
<addaction name="actionUseViewerObjectCache"/>
<addaction name="actionRequestMissingObjects"/>
</widget>
<addaction name="menuFile"/>
</widget>
@@ -311,6 +316,32 @@
<string>Can help make the proxy aware of certain objects, but can cause slowdowns</string>
</property>
</action>
<action name="actionRequestMissingObjects">
<property name="checkable">
<bool>true</bool>
</property>
<property name="text">
<string>Automatically Request Missing Objects</string>
</property>
<property name="toolTip">
<string>Force the proxy to request objects that it doesn't know about due to cache misses</string>
</property>
</action>
<action name="actionOpenNewMessageLogWindow">
<property name="text">
<string>Open New Message Log Window</string>
</property>
</action>
<action name="actionImportLogEntries">
<property name="text">
<string>Import Log Entries</string>
</property>
</action>
<action name="actionExportLogEntries">
<property name="text">
<string>Export Log Entries</string>
</property>
</action>
</widget>
<resources/>
<connections/>

View File

@@ -0,0 +1,91 @@
"""
Assorted utilities to make creating animations from scratch easier
"""
import copy
from typing import List, Union
from hippolyzer.lib.base.datatypes import Vector3, Quaternion
from hippolyzer.lib.base.llanim import PosKeyframe, RotKeyframe
def smooth_step(t: float):
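    """Classic Hermite smoothstep: 3t^2 - 2t^3, with t clamped to [0, 1]."""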
t = max(0.0, min(1.0, t))
return t * t * (3 - 2 * t)
def rot_interp(r0: Quaternion, r1: Quaternion, t: float):
"""
Bad quaternion interpolation (component-wise lerp)
TODO: This is definitely not correct, yet it seems to work OK? Implement slerp.
"""
# Ignore W
r0 = r0.data(3)
r1 = r1.data(3)
return Quaternion(*map(lambda pair: ((pair[0] * (1.0 - t)) + (pair[1] * t)), zip(r0, r1)))
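# For when the TODO above gets picked up: a straight slerp that could replace
# rot_interp(). Untested sketch; assumes Quaternion.data() with no argument
# returns all four (X, Y, Z, W) components and that the constructor accepts
# the same four back.
def slerp_rot(r0: Quaternion, r1: Quaternion, t: float) -> Quaternion:
    import math  # stdlib only; would move up with the other imports
    a = tuple(r0.data())
    b = list(r1.data())
    dot = sum(x * y for x, y in zip(a, b))
    if dot < 0.0:
        # q and -q represent the same rotation; flip one to take the short arc
        dot, b = -dot, [-x for x in b]
    if dot > 0.9995:
        # Nearly parallel: plain lerp + renormalize avoids dividing by ~zero
        res = [x + t * (y - x) for x, y in zip(a, b)]
        norm = math.sqrt(sum(x * x for x in res))
        return Quaternion(*(x / norm for x in res))
    theta = math.acos(dot)
    sin_theta = math.sin(theta)
    s0 = math.sin((1.0 - t) * theta) / sin_theta
    s1 = math.sin(t * theta) / sin_theta
    return Quaternion(*(s0 * x + s1 * y for x, y in zip(a, b)))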
def unique_frames(frames: List[Union[PosKeyframe, RotKeyframe]]):
"""Drop frames where time and coordinate are exact duplicates of another frame"""
new_frames = []
for frame in frames:
# TODO: fudge factor for float comparison instead
if frame not in new_frames:
new_frames.append(frame)
return new_frames
def shift_keyframes(frames: List[Union[PosKeyframe, RotKeyframe]], num: int):
"""
Shift keyframes around by `num` frames
Assumes keyframes occur at a set cadence, and that the first and last keyframes
are at the same coord.
"""
# Get rid of duplicate frames
frames = unique_frames(frames)
pop_idx = -1
insert_idx = 0
if num < 0:
insert_idx = len(frames) - 1
pop_idx = 0
num = -num
old_times = [f.time for f in frames]
new_frames = frames.copy()
# Drop last, duped frame. We'll copy the first frame to replace it later
new_frames.pop(-1)
for _ in range(num):
new_frames.insert(insert_idx, new_frames.pop(pop_idx))
# Put first frame back on the end
new_frames.append(copy.copy(new_frames[0]))
assert len(old_times) == len(new_frames)
assert new_frames[0] == new_frames[-1]
# Make the times of the shifted keyframes match up with the previous timeline
for old_time, new_frame in zip(old_times, new_frames):
new_frame.time = old_time
return new_frames
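# shift_keyframes() example: values [A, B, C, A] at times [0, 1, 2, 3] with
# num=1 come back as [C, A, B, C] at the same times; every pose lands one
# frame later and the loop stays closed.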
def smooth_pos(start: Vector3, end: Vector3, inter_frames: int, time: float, duration: float) -> List[PosKeyframe]:
"""Generate keyframes to smoothly interpolate between two positions"""
frames = [PosKeyframe(time=time, pos=start)]
for i in range(0, inter_frames):
t = (i + 1) / (inter_frames + 1)
smooth_t = smooth_step(t)
pos = Vector3(smooth_t, smooth_t, smooth_t).interpolate(start, end)
frames.append(PosKeyframe(time=time + (t * duration), pos=pos))
return frames + [PosKeyframe(time=time + duration, pos=end)]
def smooth_rot(start: Quaternion, end: Quaternion, inter_frames: int, time: float, duration: float)\
-> List[RotKeyframe]:
"""Generate keyframes to smoothly interpolate between two rotations"""
frames = [RotKeyframe(time=time, rot=start)]
for i in range(0, inter_frames):
t = (i + 1) / (inter_frames + 1)
smooth_t = smooth_step(t)
frames.append(RotKeyframe(time=time + (t * duration), rot=rot_interp(start, end, smooth_t)))
return frames + [RotKeyframe(time=time + duration, rot=end)]
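# Usage sketch with hypothetical values: build a smooth up-and-down hover loop,
# then start it three keyframes later. smooth_pos() emits frames at a uniform
# cadence, and the loop starts and ends at the same coord, which is exactly
# what shift_keyframes() assumes.
def example_hover_loop() -> List[PosKeyframe]:
    up = smooth_pos(Vector3(0, 0, 0), Vector3(0, 0, 0.5), inter_frames=8, time=0.0, duration=1.0)
    down = smooth_pos(Vector3(0, 0, 0.5), Vector3(0, 0, 0), inter_frames=8, time=1.0, duration=1.0)
    # The duplicate frame at the 1.0s seam gets dropped by unique_frames()
    return shift_keyframes(up + down, 3)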

View File

@@ -0,0 +1,306 @@
# This currently implements basic LLMesh -> Collada.
#
# TODO:
# * inverse, Collada -> LLMesh (for simple cases, maybe using impasse rather than pycollada)
# * round-tripping tests, LLMesh->Collada->LLMesh
# * * Can't really test using Collada->LLMesh->Collada because Collada->LLMesh is almost always
# going to be lossy due to how SL represents vertex data and materials compared to what
# Collada allows.
# * Eventually scrap this and just use GLTF instead once we know we have the semantics correct
# * * Collada was just easier to bootstrap given that it's the only officially supported input format
# * * Collada tooling sucks and even LL is moving away from it
# * * Ensuring LLMesh->Collada and LLMesh->GLTF conversion don't differ semantically is easy via assimp.
import collections
import os.path
import secrets
import statistics
import sys
from typing import Dict, List, Iterable, Optional
import collada
import collada.source
from collada import E
from lxml import etree
import numpy as np
import transformations
from hippolyzer.lib.base.helpers import get_resource_filename
from hippolyzer.lib.base.serialization import BufferReader
from hippolyzer.lib.base.mesh import LLMeshSerializer, MeshAsset, positions_from_domain, SkinSegmentDict
DIR = os.path.dirname(os.path.realpath(__file__))
def mesh_to_collada(ll_mesh: MeshAsset, include_skin=True) -> collada.Collada:
dae = collada.Collada()
axis = collada.asset.UP_AXIS.Z_UP
dae.assetInfo.upaxis = axis
scene = collada.scene.Scene("scene", [llmesh_to_node(ll_mesh, dae, include_skin=include_skin)])
dae.scenes.append(scene)
dae.scene = scene
return dae
def llmesh_to_node(ll_mesh: MeshAsset, dae: collada.Collada, uniq=None,
include_skin=True, node_transform: Optional[np.ndarray] = None) -> collada.scene.Node:
if node_transform is None:
node_transform = np.identity(4)
should_skin = False
skin_seg = ll_mesh.segments.get('skin')
bind_shape_matrix = None
if include_skin and skin_seg:
bind_shape_matrix = np.array(skin_seg["bind_shape_matrix"]).reshape((4, 4))
should_skin = True
# Transform from the skin will be applied on the controller, not the node
node_transform = np.identity(4)
if not uniq:
uniq = secrets.token_urlsafe(4)
geom_nodes = []
node_name = f"mainnode{uniq}"
# TODO: do the other LODs?
for submesh_num, submesh in enumerate(ll_mesh.segments["high_lod"]):
# Make sure none of our IDs collide with those of other nodes
sub_uniq = uniq + str(submesh_num)
range_xyz = positions_from_domain(submesh["Position"], submesh["PositionDomain"])
xyz = np.array([x.data() for x in range_xyz])
range_uv = positions_from_domain(submesh['TexCoord0'], submesh['TexCoord0Domain'])
uv = np.array([x.data() for x in range_uv]).flatten()
norms = np.array([x.data() for x in submesh["Normal"]])
effect = collada.material.Effect(
id=f"effect{sub_uniq}",
params=[],
specular=(0.0, 0.0, 0.0, 0.0),
reflectivity=(0.0, 0.0, 0.0, 0.0),
emission=(0.0, 0.0, 0.0, 0.0),
ambient=(0.0, 0.0, 0.0, 0.0),
reflective=0.0,
shadingtype="blinn",
shininess=0.0,
diffuse=(0.0, 0.0, 0.0),
)
mat = collada.material.Material(f"material{sub_uniq}", f"material{sub_uniq}", effect)
dae.materials.append(mat)
dae.effects.append(effect)
vert_src = collada.source.FloatSource(f"verts-array{sub_uniq}", xyz.flatten(), ("X", "Y", "Z"))
norm_src = collada.source.FloatSource(f"norms-array{sub_uniq}", norms.flatten(), ("X", "Y", "Z"))
# UV maps have to have the same name or they'll behave weirdly when objects are merged.
uv_src = collada.source.FloatSource("uvs-array", np.array(uv), ("U", "V"))
geom = collada.geometry.Geometry(dae, f"geometry{sub_uniq}", "geometry", [vert_src, norm_src, uv_src])
input_list = collada.source.InputList()
input_list.addInput(0, 'VERTEX', f'#verts-array{sub_uniq}', set="0")
input_list.addInput(0, 'NORMAL', f'#norms-array{sub_uniq}', set="0")
input_list.addInput(0, 'TEXCOORD', '#uvs-array', set="0")
tri_idxs = np.array(submesh["TriangleList"]).flatten()
matnode = collada.scene.MaterialNode(f"materialref{sub_uniq}", mat, inputs=[])
tri_set = geom.createTriangleSet(tri_idxs, input_list, f'materialref{sub_uniq}')
geom.primitives.append(tri_set)
dae.geometries.append(geom)
if should_skin:
joint_names = np.array(skin_seg['joint_names'], dtype=object)
joints_source = collada.source.NameSource(f"joint-names{sub_uniq}", joint_names, ("JOINT",))
# PyCollada has a bug where it doesn't set the source URI correctly. Fix it.
accessor = joints_source.xmlnode.find(f"{dae.tag('technique_common')}/{dae.tag('accessor')}")
if not accessor.get('source').startswith('#'):
accessor.set('source', f"#{accessor.get('source')}")
flattened_bind_poses = []
# LLMesh matrices are row-major, convert to col-major for Collada.
for bind_pose in skin_seg['inverse_bind_matrix']:
flattened_bind_poses.append(np.array(bind_pose).reshape((4, 4)).flatten('F'))
flattened_bind_poses = np.array(flattened_bind_poses)
inv_bind_source = _create_mat4_source(f"bind-poses{sub_uniq}", flattened_bind_poses, "TRANSFORM")
weight_joint_idxs = []
weights = []
vert_weight_counts = []
cur_weight_idx = 0
for vert_weights in submesh['Weights']:
vert_weight_counts.append(len(vert_weights))
for vert_weight in vert_weights:
weights.append(vert_weight.weight)
weight_joint_idxs.append(vert_weight.joint_idx)
weight_joint_idxs.append(cur_weight_idx)
cur_weight_idx += 1
weights_source = collada.source.FloatSource(f"skin-weights{sub_uniq}", np.array(weights), ("WEIGHT",))
# We need to make a controller for each material since materials are essentially distinct meshes
# in SL, with their own distinct sets of weights and vertex data.
controller_node = E.controller(
E.skin(
E.bind_shape_matrix(' '.join(str(x) for x in bind_shape_matrix.flatten('F'))),
joints_source.xmlnode,
inv_bind_source.xmlnode,
weights_source.xmlnode,
E.joints(
E.input(semantic="JOINT", source=f"#joint-names{sub_uniq}"),
E.input(semantic="INV_BIND_MATRIX", source=f"#bind-poses{sub_uniq}")
),
E.vertex_weights(
E.input(semantic="JOINT", source=f"#joint-names{sub_uniq}", offset="0"),
E.input(semantic="WEIGHT", source=f"#skin-weights{sub_uniq}", offset="1"),
E.vcount(' '.join(str(x) for x in vert_weight_counts)),
E.v(' '.join(str(x) for x in weight_joint_idxs)),
count=str(len(submesh['Weights']))
),
source=f"#geometry{sub_uniq}"
),
id=f"Armature-{sub_uniq}",
name=node_name
)
controller = collada.controller.Controller.load(dae, {}, controller_node)
dae.controllers.append(controller)
geom_node = collada.scene.ControllerNode(controller, [matnode])
else:
geom_node = collada.scene.GeometryNode(geom, [matnode])
geom_nodes.append(geom_node)
node = collada.scene.Node(
node_name,
children=geom_nodes,
transforms=[collada.scene.MatrixTransform(np.array(node_transform.flatten('F')))],
)
if should_skin:
# We need a skeleton per _mesh asset_ because you could have incongruous skeletons
# within the same linkset.
skel_root = load_skeleton_nodes()
transform_skeleton(skel_root, dae, skin_seg)
skel = collada.scene.Node.load(dae, skel_root, {})
skel.children.append(node)
skel.id = f"Skel-{uniq}"
skel.save()
node = skel
return node
def load_skeleton_nodes() -> etree.ElementBase:
# TODO: this sucks. Can't we construct nodes with the appropriate transformation
# matrices from the data in `avatar_skeleton.xml`?
skel_path = get_resource_filename("lib/base/data/male_collada_joints.xml")
with open(skel_path, 'r') as f:
return etree.fromstring(f.read())
def transform_skeleton(skel_root: etree.ElementBase, dae: collada.Collada, skin_seg: SkinSegmentDict,
include_unreferenced_bones=False):
"""Update skeleton XML nodes to account for joint translations in the mesh"""
# TODO: Use translation component only.
joint_nodes: Dict[str, collada.scene.Node] = {}
for skel_node in skel_root.iter():
# XPath is loathsome, so this is easier.
if skel_node.tag != dae.tag('node') or skel_node.get('type') != 'JOINT':
continue
joint_nodes[skel_node.get('name')] = collada.scene.Node.load(dae, skel_node, {})
for joint_name, matrix in zip(skin_seg['joint_names'], skin_seg.get('alt_inverse_bind_matrix', [])):
joint_node = joint_nodes[joint_name]
joint_node.matrix = np.array(matrix).reshape((4, 4)).flatten('F')
# Update the underlying XML element with the new transform matrix
joint_node.save()
if not include_unreferenced_bones:
needed_hierarchy = set()
for skel_node in joint_nodes.values():
skel_node = skel_node.xmlnode
if skel_node.get('name') in skin_seg['joint_names']:
# Add this joint and any ancestors to the list of needed joints
while skel_node is not None:
needed_hierarchy.add(skel_node.get('name'))
skel_node = skel_node.getparent()
for skel_node in joint_nodes.values():
skel_node = skel_node.xmlnode
if skel_node.get('name') not in needed_hierarchy:
skel_node.getparent().remove(skel_node)
pelvis_offset = skin_seg.get('pelvis_offset')
# TODO: should we even do this here? It's not present in the collada, just
# something that's specified in the uploader before conversion to LLMesh.
if pelvis_offset and 'mPelvis' in joint_nodes:
pelvis_node = joint_nodes['mPelvis']
# Column-major!
pelvis_node.matrix[3][2] += pelvis_offset
pelvis_node.save()
def _create_mat4_source(name: str, data: np.ndarray, semantic: str):
# PyCollada has no way to make a source with a float4x4 semantic. Do it a bad way.
# Note that collada demands column-major matrices whereas LLSD mesh has them row-major!
source = collada.source.FloatSource(name, data, tuple(f"M{x}" for x in range(16)))
accessor = source.xmlnode[1][0]
for child in list(accessor):
accessor.remove(child)
accessor.append(E.param(name=semantic, type="float4x4"))
return source
def fix_weird_bind_matrices(skin_seg: SkinSegmentDict):
"""
Fix weird-looking bind matrices to have normal scaling
Not sure why these even happen (weird mesh authoring programs?)
We sometimes get enormous inverse bind matrices (each component 10k+) and tiny
bind shape matrix components. This detects inverse bind matrices
with weird scales and tries to set them to what they "should" be without
the weird inverted scaling.
"""
axis_counters = [collections.Counter() for _ in range(3)]
for joint_inv in skin_seg['inverse_bind_matrix']:
joint_mat = np.array(joint_inv).reshape((4, 4))
joint_scale = transformations.decompose_matrix(joint_mat)[0]
for axis_counter, axis_val in zip(axis_counters, joint_scale):
axis_counter[axis_val] += 1
most_common_inv_scale = []
for axis_counter in axis_counters:
most_common_inv_scale.append(axis_counter.most_common(1)[0][0])
if abs(1.0 - statistics.fmean(most_common_inv_scale)) > 1.0:
# The magnitude of the scales in the inverse bind matrices looks very strange.
# The bind matrix itself is probably messed up as well, try to fix it.
skin_seg['bind_shape_matrix'] = fix_llsd_matrix_scale(skin_seg['bind_shape_matrix'], most_common_inv_scale)
if joint_positions := skin_seg.get('alt_inverse_bind_matrix', None):
fix_matrix_list_scale(joint_positions, most_common_inv_scale)
rev_scale = tuple(1.0 / x for x in most_common_inv_scale)
fix_matrix_list_scale(skin_seg['inverse_bind_matrix'], rev_scale)
def fix_matrix_list_scale(source: List[List[float]], scale_fixup: Iterable[float]):
for i, alt_inv_matrix in enumerate(source):
source[i] = fix_llsd_matrix_scale(alt_inv_matrix, scale_fixup)
def fix_llsd_matrix_scale(source: List[float], scale_fixup: Iterable[float]):
matrix = np.array(source).reshape((4, 4))
decomposed = list(transformations.decompose_matrix(matrix))
# Need to handle both the scale and translation matrices
for idx in (0, 3):
decomposed[idx] = tuple(x * y for x, y in zip(decomposed[idx], scale_fixup))
return list(transformations.compose_matrix(*decomposed).flatten('C'))
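# Worked example with hypothetical numbers for fix_weird_bind_matrices() above:
# if every joint's inverse bind matrix decomposes to a scale of ~100 per axis,
# most_common_inv_scale becomes (100, 100, 100). The bind shape matrix (and any
# alt inverse bind matrices) get their scale and translation multiplied by 100,
# while the inverse bind matrices are scaled by the reciprocal (0.01), leaving
# the composed skinning transform unchanged.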
def main():
# Take an llmesh file as an argument and spit out basename-converted.dae
with open(sys.argv[1], "rb") as f:
reader = BufferReader("<", f.read())
mesh = mesh_to_collada(reader.read(LLMeshSerializer(parse_segment_contents=True)))
mesh.write(sys.argv[1].rsplit(".", 1)[0] + "-converted.dae")
if __name__ == "__main__":
main()

View File

@@ -0,0 +1,485 @@
<!-- from http://wiki.secondlife.com/wiki/Project_Bento_Resources_and_Information collada -->
<node id="Avatar" name="Avatar" type="NODE" xmlns="http://www.collada.org/2005/11/COLLADASchema">
<translate sid="location">0 0 0</translate>
<rotate sid="rotationZ">0 0 1 0</rotate>
<rotate sid="rotationY">0 1 0 0</rotate>
<rotate sid="rotationX">1 0 0 0</rotate>
<scale sid="scale">1 1 1</scale>
<node id="mPelvis" name="mPelvis" sid="mPelvis" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 1.067 0 0 0 1</matrix>
<node id="PELVIS" name="PELVIS" sid="PELVIS" type="JOINT">
<matrix sid="transform">1 0 0 -0.01 0 1 0 0 0 0 1 -0.02 0 0 0 1</matrix>
</node>
<node id="BUTT" name="BUTT" sid="BUTT" type="JOINT">
<matrix sid="transform">1 0 0 -0.06 0 1 0 0 0 0 1 -0.1 0 0 0 1</matrix>
</node>
<node id="mSpine1" name="mSpine1" sid="mSpine1" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0.084 0 0 0 1</matrix>
<node id="mSpine2" name="mSpine2" sid="mSpine2" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 -0.084 0 0 0 1</matrix>
<node id="mTorso" name="mTorso" sid="mTorso" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0.084 0 0 0 1</matrix>
<node id="BELLY" name="BELLY" sid="BELLY" type="JOINT">
<matrix sid="transform">1 0 0 0.028 0 1 0 0 0 0 1 0.04 0 0 0 1</matrix>
</node>
<node id="LEFT_HANDLE" name="LEFT_HANDLE" sid="LEFT_HANDLE" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0.1 0 0 1 0.058 0 0 0 1</matrix>
</node>
<node id="RIGHT_HANDLE" name="RIGHT_HANDLE" sid="RIGHT_HANDLE" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 -0.1 0 0 1 0.058 0 0 0 1</matrix>
</node>
<node id="LOWER_BACK" name="LOWER_BACK" sid="LOWER_BACK" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0.023 0 0 0 1</matrix>
</node>
<node id="mSpine3" name="mSpine3" sid="mSpine3" type="JOINT">
<matrix sid="transform">1 0 0 -0.015 0 1 0 0 0 0 1 0.205 0 0 0 1</matrix>
<node id="mSpine4" name="mSpine4" sid="mSpine4" type="JOINT">
<matrix sid="transform">1 0 0 0.015 0 1 0 0 0 0 1 -0.205 0 0 0 1</matrix>
<node id="mChest" name="mChest" sid="mChest" type="JOINT">
<matrix sid="transform">1 0 0 -0.015 0 1 0 0 0 0 1 0.205 0 0 0 1</matrix>
<node id="CHEST" name="CHEST" sid="CHEST" type="JOINT">
<matrix sid="transform">1 0 0 0.028 0 1 0 0 0 0 1 0.07 0 0 0 1</matrix>
</node>
<node id="LEFT_PEC" name="LEFT_PEC" sid="LEFT_PEC" type="JOINT">
<matrix sid="transform">1 0 0 0.119 0 1 0 0.082 0 0 1 0.042 0 0 0 1</matrix>
</node>
<node id="RIGHT_PEC" name="RIGHT_PEC" sid="RIGHT_PEC" type="JOINT">
<matrix sid="transform">1 0 0 0.119 0 1 0 -0.082 0 0 1 0.042 0 0 0 1</matrix>
</node>
<node id="UPPER_BACK" name="UPPER_BACK" sid="UPPER_BACK" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0.017 0 0 0 1</matrix>
</node>
<node id="mNeck" name="mNeck" sid="mNeck" type="JOINT">
<matrix sid="transform">1 0 0 -0.01 0 1 0 0 0 0 1 0.251 0 0 0 1</matrix>
<node id="NECK" name="NECK" sid="NECK" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0.02 0 0 0 1</matrix>
</node>
<node id="mHead" name="mHead" sid="mHead" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0.076 0 0 0 1</matrix>
<node id="HEAD" name="HEAD" sid="HEAD" type="JOINT">
<matrix sid="transform">1 0 0 0.02 0 1 0 0 0 0 1 0.07 0 0 0 1</matrix>
</node>
<node id="mSkull" name="mSkull" sid="mSkull" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0.079 0 0 0 1</matrix>
</node>
<node id="mEyeRight" name="mEyeRight" sid="mEyeRight" type="JOINT">
<matrix sid="transform">1 0 0 0.098 0 1 0 -0.036 0 0 1 0.079 0 0 0 1</matrix>
</node>
<node id="mEyeLeft" name="mEyeLeft" sid="mEyeLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.098 0 1 0 0.036 0 0 1 0.079 0 0 0 1</matrix>
</node>
<node id="mFaceRoot" name="mFaceRoot" sid="mFaceRoot" type="JOINT">
<matrix sid="transform">1 0 0 0.025 0 1 0 0 0 0 1 0.045 0 0 0 1</matrix>
<node id="mFaceEyeAltRight" name="mFaceEyeAltRight" sid="mFaceEyeAltRight" type="JOINT">
<matrix sid="transform">1 0 0 0.073 0 1 0 -0.036 0 0 1 0.034 0 0 0 1</matrix>
</node>
<node id="mFaceEyeAltLeft" name="mFaceEyeAltLeft" sid="mFaceEyeAltLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.073 0 1 0 0.036 0 0 1 0.034 0 0 0 1</matrix>
</node>
<node id="mFaceForeheadLeft" name="mFaceForeheadLeft" sid="mFaceForeheadLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.061 0 1 0 0.035 0 0 1 0.083 0 0 0 1</matrix>
</node>
<node id="mFaceForeheadRight" name="mFaceForeheadRight" sid="mFaceForeheadRight" type="JOINT">
<matrix sid="transform">1 0 0 0.061 0 1 0 -0.035 0 0 1 0.083 0 0 0 1</matrix>
</node>
<node id="mFaceEyebrowOuterLeft" name="mFaceEyebrowOuterLeft" sid="mFaceEyebrowOuterLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.064 0 1 0 0.051 0 0 1 0.048 0 0 0 1</matrix>
</node>
<node id="mFaceEyebrowCenterLeft" name="mFaceEyebrowCenterLeft" sid="mFaceEyebrowCenterLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.07 0 1 0 0.043 0 0 1 0.056 0 0 0 1</matrix>
</node>
<node id="mFaceEyebrowInnerLeft" name="mFaceEyebrowInnerLeft" sid="mFaceEyebrowInnerLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.075 0 1 0 0.022 0 0 1 0.051 0 0 0 1</matrix>
</node>
<node id="mFaceEyebrowOuterRight" name="mFaceEyebrowOuterRight" sid="mFaceEyebrowOuterRight" type="JOINT">
<matrix sid="transform">1 0 0 0.064 0 1 0 -0.051 0 0 1 0.048 0 0 0 1</matrix>
</node>
<node id="mFaceEyebrowCenterRight" name="mFaceEyebrowCenterRight" sid="mFaceEyebrowCenterRight" type="JOINT">
<matrix sid="transform">1 0 0 0.07 0 1 0 -0.043 0 0 1 0.056 0 0 0 1</matrix>
</node>
<node id="mFaceEyebrowInnerRight" name="mFaceEyebrowInnerRight" sid="mFaceEyebrowInnerRight" type="JOINT">
<matrix sid="transform">1 0 0 0.075 0 1 0 -0.022 0 0 1 0.051 0 0 0 1</matrix>
</node>
<node id="mFaceEyeLidUpperLeft" name="mFaceEyeLidUpperLeft" sid="mFaceEyeLidUpperLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.073 0 1 0 0.036 0 0 1 0.034 0 0 0 1</matrix>
</node>
<node id="mFaceEyeLidLowerLeft" name="mFaceEyeLidLowerLeft" sid="mFaceEyeLidLowerLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.073 0 1 0 0.036 0 0 1 0.034 0 0 0 1</matrix>
</node>
<node id="mFaceEyeLidUpperRight" name="mFaceEyeLidUpperRight" sid="mFaceEyeLidUpperRight" type="JOINT">
<matrix sid="transform">1 0 0 0.073 0 1 0 -0.036 0 0 1 0.034 0 0 0 1</matrix>
</node>
<node id="mFaceEyeLidLowerRight" name="mFaceEyeLidLowerRight" sid="mFaceEyeLidLowerRight" type="JOINT">
<matrix sid="transform">1 0 0 0.073 0 1 0 -0.036 0 0 1 0.034 0 0 0 1</matrix>
</node>
<node id="mFaceEar1Left" name="mFaceEar1Left" sid="mFaceEar1Left" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0.08 0 0 1 0.002 0 0 0 1</matrix>
<node id="mFaceEar2Left" name="mFaceEar2Left" sid="mFaceEar2Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.019 0 1 0 0.018 0 0 1 0.025 0 0 0 1</matrix>
</node>
</node>
<node id="mFaceEar1Right" name="mFaceEar1Right" sid="mFaceEar1Right" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 -0.08 0 0 1 0.002 0 0 0 1</matrix>
<node id="mFaceEar2Right" name="mFaceEar2Right" sid="mFaceEar2Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.019 0 1 0 -0.018 0 0 1 0.025 0 0 0 1</matrix>
</node>
</node>
<node id="mFaceNoseLeft" name="mFaceNoseLeft" sid="mFaceNoseLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.086 0 1 0 0.015 0 0 1 -0.004 0 0 0 1</matrix>
</node>
<node id="mFaceNoseCenter" name="mFaceNoseCenter" sid="mFaceNoseCenter" type="JOINT">
<matrix sid="transform">1 0 0 0.102 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mFaceNoseRight" name="mFaceNoseRight" sid="mFaceNoseRight" type="JOINT">
<matrix sid="transform">1 0 0 0.086 0 1 0 -0.015 0 0 1 -0.004 0 0 0 1</matrix>
</node>
<node id="mFaceCheekLowerLeft" name="mFaceCheekLowerLeft" sid="mFaceCheekLowerLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.05 0 1 0 0.034 0 0 1 -0.031 0 0 0 1</matrix>
</node>
<node id="mFaceCheekUpperLeft" name="mFaceCheekUpperLeft" sid="mFaceCheekUpperLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.07 0 1 0 0.034 0 0 1 -0.005 0 0 0 1</matrix>
</node>
<node id="mFaceCheekLowerRight" name="mFaceCheekLowerRight" sid="mFaceCheekLowerRight" type="JOINT">
<matrix sid="transform">1 0 0 0.05 0 1 0 -0.034 0 0 1 -0.031 0 0 0 1</matrix>
</node>
<node id="mFaceCheekUpperRight" name="mFaceCheekUpperRight" sid="mFaceCheekUpperRight" type="JOINT">
<matrix sid="transform">1 0 0 0.07 0 1 0 -0.034 0 0 1 -0.005 0 0 0 1</matrix>
</node>
<node id="mFaceJaw" name="mFaceJaw" sid="mFaceJaw" type="JOINT">
<matrix sid="transform">1 0 0 -0.001 0 1 0 0 0 0 1 -0.015 0 0 0 1</matrix>
<node id="mFaceChin" name="mFaceChin" sid="mFaceChin" type="JOINT">
<matrix sid="transform">1 0 0 0.074 0 1 0 0 0 0 1 -0.054 0 0 0 1</matrix>
</node>
<node id="mFaceTeethLower" name="mFaceTeethLower" sid="mFaceTeethLower" type="JOINT">
<matrix sid="transform">1 0 0 0.021 0 1 0 0 0 0 1 -0.039 0 0 0 1</matrix>
<node id="mFaceLipLowerLeft" name="mFaceLipLowerLeft" sid="mFaceLipLowerLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.045 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mFaceLipLowerRight" name="mFaceLipLowerRight" sid="mFaceLipLowerRight" type="JOINT">
<matrix sid="transform">1 0 0 0.045 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mFaceLipLowerCenter" name="mFaceLipLowerCenter" sid="mFaceLipLowerCenter" type="JOINT">
<matrix sid="transform">1 0 0 0.045 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mFaceTongueBase" name="mFaceTongueBase" sid="mFaceTongueBase" type="JOINT">
<matrix sid="transform">1 0 0 0.039 0 1 0 0 0 0 1 0.005 0 0 0 1</matrix>
<node id="mFaceTongueTip" name="mFaceTongueTip" sid="mFaceTongueTip" type="JOINT">
<matrix sid="transform">1 0 0 0.022 0 1 0 0 0 0 1 0.007 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
<node id="mFaceJawShaper" name="mFaceJawShaper" sid="mFaceJawShaper" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mFaceForeheadCenter" name="mFaceForeheadCenter" sid="mFaceForeheadCenter" type="JOINT">
<matrix sid="transform">1 0 0 0.069 0 1 0 0 0 0 1 0.065 0 0 0 1</matrix>
</node>
<node id="mFaceNoseBase" name="mFaceNoseBase" sid="mFaceNoseBase" type="JOINT">
<matrix sid="transform">1 0 0 0.094 0 1 0 0 0 0 1 -0.016 0 0 0 1</matrix>
</node>
<node id="mFaceTeethUpper" name="mFaceTeethUpper" sid="mFaceTeethUpper" type="JOINT">
<matrix sid="transform">1 0 0 0.02 0 1 0 0 0 0 1 -0.03 0 0 0 1</matrix>
<node id="mFaceLipUpperLeft" name="mFaceLipUpperLeft" sid="mFaceLipUpperLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.045 0 1 0 0 0 0 1 -0.003 0 0 0 1</matrix>
</node>
<node id="mFaceLipUpperRight" name="mFaceLipUpperRight" sid="mFaceLipUpperRight" type="JOINT">
<matrix sid="transform">1 0 0 0.045 0 1 0 0 0 0 1 -0.003 0 0 0 1</matrix>
</node>
<node id="mFaceLipCornerLeft" name="mFaceLipCornerLeft" sid="mFaceLipCornerLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.028 0 1 0 -0.019 0 0 1 -0.01 0 0 0 1</matrix>
</node>
<node id="mFaceLipCornerRight" name="mFaceLipCornerRight" sid="mFaceLipCornerRight" type="JOINT">
<matrix sid="transform">1 0 0 0.028 0 1 0 0.019 0 0 1 -0.01 0 0 0 1</matrix>
</node>
<node id="mFaceLipUpperCenter" name="mFaceLipUpperCenter" sid="mFaceLipUpperCenter" type="JOINT">
<matrix sid="transform">1 0 0 0.045 0 1 0 0 0 0 1 -0.003 0 0 0 1</matrix>
</node>
</node>
<node id="mFaceEyecornerInnerLeft" name="mFaceEyecornerInnerLeft" sid="mFaceEyecornerInnerLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.075 0 1 0 0.017 0 0 1 0.032 0 0 0 1</matrix>
</node>
<node id="mFaceEyecornerInnerRight" name="mFaceEyecornerInnerRight" sid="mFaceEyecornerInnerRight" type="JOINT">
<matrix sid="transform">1 0 0 0.075 0 1 0 -0.017 0 0 1 0.032 0 0 0 1</matrix>
</node>
<node id="mFaceNoseBridge" name="mFaceNoseBridge" sid="mFaceNoseBridge" type="JOINT">
<matrix sid="transform">1 0 0 0.091 0 1 0 0 0 0 1 0.02 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
<node id="mCollarLeft" name="mCollarLeft" sid="mCollarLeft" type="JOINT">
<matrix sid="transform">1 0 0 -0.021 0 1 0 0.085 0 0 1 0.165 0 0 0 1</matrix>
<node id="L_CLAVICLE" name="L_CLAVICLE" sid="L_CLAVICLE" type="JOINT">
<matrix sid="transform">1 0 0 0.02 0 1 0 0 0 0 1 0.02 0 0 0 1</matrix>
</node>
<node id="mShoulderLeft" name="mShoulderLeft" sid="mShoulderLeft" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0.079 0 0 1 0 0 0 0 1</matrix>
<node id="L_UPPER_ARM" name="L_UPPER_ARM" sid="L_UPPER_ARM" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0.12 0 0 1 0.01 0 0 0 1</matrix>
</node>
<node id="mElbowLeft" name="mElbowLeft" sid="mElbowLeft" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0.248 0 0 1 0 0 0 0 1</matrix>
<node id="L_LOWER_ARM" name="L_LOWER_ARM" sid="L_LOWER_ARM" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0.1 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mWristLeft" name="mWristLeft" sid="mWristLeft" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0.205 0 0 1 0 0 0 0 1</matrix>
<node id="L_HAND" name="L_HAND" sid="L_HAND" type="JOINT">
<matrix sid="transform">1 0 0 0.01 0 1 0 0.05 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mHandMiddle1Left" name="mHandMiddle1Left" sid="mHandMiddle1Left" type="JOINT">
<matrix sid="transform">1 0 0 0.013 0 1 0 0.101 0 0 1 0.015 0 0 0 1</matrix>
<node id="mHandMiddle2Left" name="mHandMiddle2Left" sid="mHandMiddle2Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.001 0 1 0 0.04 0 0 1 -0.006 0 0 0 1</matrix>
<node id="mHandMiddle3Left" name="mHandMiddle3Left" sid="mHandMiddle3Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.001 0 1 0 0.049 0 0 1 -0.008 0 0 0 1</matrix>
</node>
</node>
</node>
<node id="mHandIndex1Left" name="mHandIndex1Left" sid="mHandIndex1Left" type="JOINT">
<matrix sid="transform">1 0 0 0.038 0 1 0 0.097 0 0 1 0.015 0 0 0 1</matrix>
<node id="mHandIndex2Left" name="mHandIndex2Left" sid="mHandIndex2Left" type="JOINT">
<matrix sid="transform">1 0 0 0.017 0 1 0 0.036 0 0 1 -0.006 0 0 0 1</matrix>
<node id="mHandIndex3Left" name="mHandIndex3Left" sid="mHandIndex3Left" type="JOINT">
<matrix sid="transform">1 0 0 0.014 0 1 0 0.032 0 0 1 -0.006 0 0 0 1</matrix>
</node>
</node>
</node>
<node id="mHandRing1Left" name="mHandRing1Left" sid="mHandRing1Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.01 0 1 0 0.099 0 0 1 0.009 0 0 0 1</matrix>
<node id="mHandRing2Left" name="mHandRing2Left" sid="mHandRing2Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.013 0 1 0 0.038 0 0 1 -0.008 0 0 0 1</matrix>
<node id="mHandRing3Left" name="mHandRing3Left" sid="mHandRing3Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.013 0 1 0 0.04 0 0 1 -0.009 0 0 0 1</matrix>
</node>
</node>
</node>
<node id="mHandPinky1Left" name="mHandPinky1Left" sid="mHandPinky1Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.031 0 1 0 0.095 0 0 1 0.003 0 0 0 1</matrix>
<node id="mHandPinky2Left" name="mHandPinky2Left" sid="mHandPinky2Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.024 0 1 0 0.025 0 0 1 -0.006 0 0 0 1</matrix>
<node id="mHandPinky3Left" name="mHandPinky3Left" sid="mHandPinky3Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.015 0 1 0 0.018 0 0 1 -0.004 0 0 0 1</matrix>
</node>
</node>
</node>
<node id="mHandThumb1Left" name="mHandThumb1Left" sid="mHandThumb1Left" type="JOINT">
<matrix sid="transform">1 0 0 0.031 0 1 0 0.026 0 0 1 0.004 0 0 0 1</matrix>
<node id="mHandThumb2Left" name="mHandThumb2Left" sid="mHandThumb2Left" type="JOINT">
<matrix sid="transform">1 0 0 0.028 0 1 0 0.032 0 0 1 -0.001 0 0 0 1</matrix>
<node id="mHandThumb3Left" name="mHandThumb3Left" sid="mHandThumb3Left" type="JOINT">
<matrix sid="transform">1 0 0 0.023 0 1 0 0.031 0 0 1 -0.001 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
</node>
</node>
</node>
<node id="mCollarRight" name="mCollarRight" sid="mCollarRight" type="JOINT">
<matrix sid="transform">1 0 0 -0.021 0 1 0 -0.085 0 0 1 0.165 0 0 0 1</matrix>
<node id="R_CLAVICLE" name="R_CLAVICLE" sid="R_CLAVICLE" type="JOINT">
<matrix sid="transform">1 0 0 0.02 0 1 0 0 0 0 1 0.02 0 0 0 1</matrix>
</node>
<node id="mShoulderRight" name="mShoulderRight" sid="mShoulderRight" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 -0.079 0 0 1 0 0 0 0 1</matrix>
<node id="R_UPPER_ARM" name="R_UPPER_ARM" sid="R_UPPER_ARM" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 -0.12 0 0 1 0.01 0 0 0 1</matrix>
</node>
<node id="mElbowRight" name="mElbowRight" sid="mElbowRight" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 -0.248 0 0 1 0 0 0 0 1</matrix>
<node id="R_LOWER_ARM" name="R_LOWER_ARM" sid="R_LOWER_ARM" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 -0.1 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mWristRight" name="mWristRight" sid="mWristRight" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 -0.205 0 0 1 0 0 0 0 1</matrix>
<node id="R_HAND" name="R_HAND" sid="R_HAND" type="JOINT">
<matrix sid="transform">1 0 0 0.01 0 1 0 -0.05 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mHandMiddle1Right" name="mHandMiddle1Right" sid="mHandMiddle1Right" type="JOINT">
<matrix sid="transform">1 0 0 0.013 0 1 0 -0.101 0 0 1 0.015 0 0 0 1</matrix>
<node id="mHandMiddle2Right" name="mHandMiddle2Right" sid="mHandMiddle2Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.001 0 1 0 -0.04 0 0 1 -0.006 0 0 0 1</matrix>
<node id="mHandMiddle3Right" name="mHandMiddle3Right" sid="mHandMiddle3Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.001 0 1 0 -0.049 0 0 1 -0.008 0 0 0 1</matrix>
</node>
</node>
</node>
<node id="mHandIndex1Right" name="mHandIndex1Right" sid="mHandIndex1Right" type="JOINT">
<matrix sid="transform">1 0 0 0.038 0 1 0 -0.097 0 0 1 0.015 0 0 0 1</matrix>
<node id="mHandIndex2Right" name="mHandIndex2Right" sid="mHandIndex2Right" type="JOINT">
<matrix sid="transform">1 0 0 0.017 0 1 0 -0.036 0 0 1 -0.006 0 0 0 1</matrix>
<node id="mHandIndex3Right" name="mHandIndex3Right" sid="mHandIndex3Right" type="JOINT">
<matrix sid="transform">1 0 0 0.014 0 1 0 -0.032 0 0 1 -0.006 0 0 0 1</matrix>
</node>
</node>
</node>
<node id="mHandRing1Right" name="mHandRing1Right" sid="mHandRing1Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.01 0 1 0 -0.099 0 0 1 0.009 0 0 0 1</matrix>
<node id="mHandRing2Right" name="mHandRing2Right" sid="mHandRing2Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.013 0 1 0 -0.038 0 0 1 -0.008 0 0 0 1</matrix>
<node id="mHandRing3Right" name="mHandRing3Right" sid="mHandRing3Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.013 0 1 0 -0.04 0 0 1 -0.009 0 0 0 1</matrix>
</node>
</node>
</node>
<node id="mHandPinky1Right" name="mHandPinky1Right" sid="mHandPinky1Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.031 0 1 0 -0.095 0 0 1 0.003 0 0 0 1</matrix>
<node id="mHandPinky2Right" name="mHandPinky2Right" sid="mHandPinky2Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.024 0 1 0 -0.025 0 0 1 -0.006 0 0 0 1</matrix>
<node id="mHandPinky3Right" name="mHandPinky3Right" sid="mHandPinky3Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.015 0 1 0 -0.018 0 0 1 -0.004 0 0 0 1</matrix>
</node>
</node>
</node>
<node id="mHandThumb1Right" name="mHandThumb1Right" sid="mHandThumb1Right" type="JOINT">
<matrix sid="transform">1 0 0 0.031 0 1 0 -0.026 0 0 1 0.004 0 0 0 1</matrix>
<node id="mHandThumb2Right" name="mHandThumb2Right" sid="mHandThumb2Right" type="JOINT">
<matrix sid="transform">1 0 0 0.028 0 1 0 -0.032 0 0 1 -0.001 0 0 0 1</matrix>
<node id="mHandThumb3Right" name="mHandThumb3Right" sid="mHandThumb3Right" type="JOINT">
<matrix sid="transform">1 0 0 0.023 0 1 0 -0.031 0 0 1 -0.001 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
</node>
</node>
</node>
<node id="mWingsRoot" name="mWingsRoot" sid="mWingsRoot" type="JOINT">
<matrix sid="transform">1 0 0 -0.014 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
<node id="mWing1Left" name="mWing1Left" sid="mWing1Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.099 0 1 0 0.105 0 0 1 0.181 0 0 0 1</matrix>
<node id="mWing2Left" name="mWing2Left" sid="mWing2Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.168 0 1 0 0.169 0 0 1 0.067 0 0 0 1</matrix>
<node id="mWing3Left" name="mWing3Left" sid="mWing3Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.181 0 1 0 0.183 0 0 1 0 0 0 0 1</matrix>
<node id="mWing4Left" name="mWing4Left" sid="mWing4Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.171 0 1 0 0.173 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mWing4FanLeft" name="mWing4FanLeft" sid="mWing4FanLeft" type="JOINT">
<matrix sid="transform">1 0 0 -0.171 0 1 0 0.173 0 0 1 0 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
<node id="mWing1Right" name="mWing1Right" sid="mWing1Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.099 0 1 0 -0.105 0 0 1 0.181 0 0 0 1</matrix>
<node id="mWing2Right" name="mWing2Right" sid="mWing2Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.168 0 1 0 -0.169 0 0 1 0.067 0 0 0 1</matrix>
<node id="mWing3Right" name="mWing3Right" sid="mWing3Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.181 0 1 0 -0.183 0 0 1 0 0 0 0 1</matrix>
<node id="mWing4Right" name="mWing4Right" sid="mWing4Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.171 0 1 0 -0.173 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mWing4FanRight" name="mWing4FanRight" sid="mWing4FanRight" type="JOINT">
<matrix sid="transform">1 0 0 -0.171 0 1 0 -0.173 0 0 1 0 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
</node>
</node>
</node>
</node>
</node>
</node>
</node>
<node id="mHipRight" name="mHipRight" sid="mHipRight" type="JOINT">
<matrix sid="transform">1 0 0 0.034 0 1 0 -0.129 0 0 1 -0.041 0 0 0 1</matrix>
<node id="R_UPPER_LEG" name="R_UPPER_LEG" sid="R_UPPER_LEG" type="JOINT">
<matrix sid="transform">1 0 0 -0.02 0 1 0 0.05 0 0 1 -0.22 0 0 0 1</matrix>
</node>
<node id="mKneeRight" name="mKneeRight" sid="mKneeRight" type="JOINT">
<matrix sid="transform">1 0 0 -0.001 0 1 0 0.049 0 0 1 -0.491 0 0 0 1</matrix>
<node id="R_LOWER_LEG" name="R_LOWER_LEG" sid="R_LOWER_LEG" type="JOINT">
<matrix sid="transform">1 0 0 -0.02 0 1 0 0 0 0 1 -0.2 0 0 0 1</matrix>
</node>
<node id="mAnkleRight" name="mAnkleRight" sid="mAnkleRight" type="JOINT">
<matrix sid="transform">1 0 0 -0.029 0 1 0 0 0 0 1 -0.468 0 0 0 1</matrix>
<node id="R_FOOT" name="R_FOOT" sid="R_FOOT" type="JOINT">
<matrix sid="transform">1 0 0 0.077 0 1 0 0 0 0 1 -0.041 0 0 0 1</matrix>
</node>
<node id="mFootRight" name="mFootRight" sid="mFootRight" type="JOINT">
<matrix sid="transform">1 0 0 0.112 0 1 0 0 0 0 1 -0.061 0 0 0 1</matrix>
<node id="mToeRight" name="mToeRight" sid="mToeRight" type="JOINT">
<matrix sid="transform">1 0 0 0.109 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
</node>
<node id="mHipLeft" name="mHipLeft" sid="mHipLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.034 0 1 0 0.127 0 0 1 -0.041 0 0 0 1</matrix>
<node id="L_UPPER_LEG" name="L_UPPER_LEG" sid="L_UPPER_LEG" type="JOINT">
<matrix sid="transform">1 0 0 -0.02 0 1 0 -0.05 0 0 1 -0.22 0 0 0 1</matrix>
</node>
<node id="mKneeLeft" name="mKneeLeft" sid="mKneeLeft" type="JOINT">
<matrix sid="transform">1 0 0 -0.001 0 1 0 -0.046 0 0 1 -0.491 0 0 0 1</matrix>
<node id="L_LOWER_LEG" name="L_LOWER_LEG" sid="L_LOWER_LEG" type="JOINT">
<matrix sid="transform">1 0 0 -0.02 0 1 0 0 0 0 1 -0.2 0 0 0 1</matrix>
</node>
<node id="mAnkleLeft" name="mAnkleLeft" sid="mAnkleLeft" type="JOINT">
<matrix sid="transform">1 0 0 -0.029 0 1 0 0.001 0 0 1 -0.468 0 0 0 1</matrix>
<node id="L_FOOT" name="L_FOOT" sid="L_FOOT" type="JOINT">
<matrix sid="transform">1 0 0 0.077 0 1 0 0 0 0 1 -0.041 0 0 0 1</matrix>
</node>
<node id="mFootLeft" name="mFootLeft" sid="mFootLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.112 0 1 0 0 0 0 1 -0.061 0 0 0 1</matrix>
<node id="mToeLeft" name="mToeLeft" sid="mToeLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.109 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
</node>
<node id="mTail1" name="mTail1" sid="mTail1" type="JOINT">
<matrix sid="transform">1 0 0 -0.116 0 1 0 0 0 0 1 0.047 0 0 0 1</matrix>
<node id="mTail2" name="mTail2" sid="mTail2" type="JOINT">
<matrix sid="transform">1 0 0 -0.197 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
<node id="mTail3" name="mTail3" sid="mTail3" type="JOINT">
<matrix sid="transform">1 0 0 -0.168 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
<node id="mTail4" name="mTail4" sid="mTail4" type="JOINT">
<matrix sid="transform">1 0 0 -0.142 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
<node id="mTail5" name="mTail5" sid="mTail5" type="JOINT">
<matrix sid="transform">1 0 0 -0.112 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
<node id="mTail6" name="mTail6" sid="mTail6" type="JOINT">
<matrix sid="transform">1 0 0 -0.094 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
</node>
</node>
<node id="mGroin" name="mGroin" sid="mGroin" type="JOINT">
<matrix sid="transform">1 0 0 0.064 0 1 0 0 0 0 1 -0.097 0 0 0 1</matrix>
</node>
<node id="mHindLimbsRoot" name="mHindLimbsRoot" sid="mHindLimbsRoot" type="JOINT">
<matrix sid="transform">1 0 0 -0.2 0 1 0 0 0 0 1 0.084 0 0 0 1</matrix>
<node id="mHindLimb1Left" name="mHindLimb1Left" sid="mHindLimb1Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.204 0 1 0 0.129 0 0 1 -0.125 0 0 0 1</matrix>
<node id="mHindLimb2Left" name="mHindLimb2Left" sid="mHindLimb2Left" type="JOINT">
<matrix sid="transform">1 0 0 0.002 0 1 0 -0.046 0 0 1 -0.491 0 0 0 1</matrix>
<node id="mHindLimb3Left" name="mHindLimb3Left" sid="mHindLimb3Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.03 0 1 0 -0.003 0 0 1 -0.468 0 0 0 1</matrix>
<node id="mHindLimb4Left" name="mHindLimb4Left" sid="mHindLimb4Left" type="JOINT">
<matrix sid="transform">1 0 0 0.112 0 1 0 0 0 0 1 -0.061 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
<node id="mHindLimb1Right" name="mHindLimb1Right" sid="mHindLimb1Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.204 0 1 0 -0.129 0 0 1 -0.125 0 0 0 1</matrix>
<node id="mHindLimb2Right" name="mHindLimb2Right" sid="mHindLimb2Right" type="JOINT">
<matrix sid="transform">1 0 0 0.002 0 1 0 0.046 0 0 1 -0.491 0 0 0 1</matrix>
<node id="mHindLimb3Right" name="mHindLimb3Right" sid="mHindLimb3Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.03 0 1 0 0.003 0 0 1 -0.468 0 0 0 1</matrix>
<node id="mHindLimb4Right" name="mHindLimb4Right" sid="mHindLimb4Right" type="JOINT">
<matrix sid="transform">1 0 0 0.112 0 1 0 0 0 0 1 -0.061 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
</node>
</node>
</node>

View File

@@ -18,6 +18,8 @@ You should have received a copy of the GNU Lesser General Public License
along with this program; if not, write to the Free Software Foundation,
Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""
from __future__ import annotations
import ast
import enum
import hashlib
@@ -58,6 +60,9 @@ class TupleCoord(recordclass.datatuple, _IterableStub): # type: ignore
def __abs__(self):
return self.__class__(*(abs(x) for x in self))
def __neg__(self):
return self.__class__(*(-x for x in self))
def __add__(self, other):
return self.__class__(*(x + y for x, y in zip(self, other)))
@@ -244,6 +249,7 @@ class Quaternion(TupleCoord):
class UUID(uuid.UUID):
_NULL_UUID_STR = '00000000-0000-0000-0000-000000000000'
ZERO: UUID
__slots__ = ()
def __init__(self, val: Union[uuid.UUID, str, None] = None, bytes=None, int=None):
@@ -268,12 +274,16 @@ class UUID(uuid.UUID):
return self.__class__(int=self.int ^ other.int)
UUID.ZERO = UUID()
class JankStringyBytes(bytes):
"""
Treat bytes as UTF-8 if used in a string context
Sinful, but a necessary evil for now, since templates don't specify what's
binary and what's a string.
binary and what's a string. There are also certain fields where the value
may be either binary _or_ a string, depending on the context.
"""
__slots__ = ()
@@ -288,12 +298,28 @@ class JankStringyBytes(bytes):
def __ne__(self, other):
return not self.__eq__(other)
def __contains__(self, item):
if isinstance(item, str):
return item in str(self)
return item in bytes(self)
class RawBytes(bytes):
__slots__ = ()
_T = TypeVar("_T")
class Pretty(Generic[_T]):
"""Wrapper for var values so Messages will know to serialize"""
__slots__ = ("value",)
def __init__(self, value: _T):
self.value: _T = value
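# Sketch of intended use (the exact Message API is assumed, not shown here):
#   msg["ObjectData"]["Rotation"] = Pretty(Quaternion(0, 0, 0, 1))
# Wrapping the value tells the serializer to convert the human-readable form
# back to its wire representation instead of expecting pre-serialized bytes.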
class StringEnum(str, enum.Enum):
def __str__(self):
return self.value
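# Because StringEnum also subclasses str, members compare equal to their raw
# values (a member whose value is "high_lod" passes an `== "high_lod"` check),
# and str() renders the bare value thanks to __str__ above.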
@@ -333,5 +359,5 @@ class TaggedUnion(recordclass.datatuple): # type: ignore
__all__ = [
"Vector3", "Vector4", "Vector2", "Quaternion", "TupleCoord",
"UUID", "RawBytes", "StringEnum", "JankStringyBytes", "TaggedUnion",
"IntEnum", "IntFlag", "flags_to_pod"
"IntEnum", "IntFlag", "flags_to_pod", "Pretty"
]

View File

@@ -2,6 +2,8 @@ from __future__ import annotations
import codecs
import functools
import os
import pkg_resources
import re
import weakref
@@ -139,3 +141,16 @@ def bytes_escape(val: bytes) -> bytes:
def get_resource_filename(resource_filename: str):
return pkg_resources.resource_filename("hippolyzer", resource_filename)
def to_chunks(chunkable: Sequence[_T], chunk_size: int) -> Generator[Sequence[_T], None, None]:
while chunkable:
yield chunkable[:chunk_size]
chunkable = chunkable[chunk_size:]
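# e.g. list(to_chunks([1, 2, 3, 4, 5], 2)) == [[1, 2], [3, 4], [5]]
# Works on any sliceable sequence, including bytes and str.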
def get_mtime(path):
try:
return os.stat(path).st_mtime
except OSError:
return None

View File

@@ -7,8 +7,9 @@ from __future__ import annotations
import dataclasses
import datetime as dt
import itertools
import logging
import struct
import typing
import weakref
from io import StringIO
from typing import *
@@ -33,6 +34,17 @@ LOG = logging.getLogger(__name__)
_T = TypeVar("_T")
class SchemaFlagField(SchemaHexInt):
"""Like a hex int, but must be serialized as bytes in LLSD due to being a U32"""
@classmethod
def from_llsd(cls, val: Any) -> int:
return struct.unpack("!I", val)[0]
@classmethod
def to_llsd(cls, val: int) -> Any:
return struct.pack("!I", val)
def _yield_schema_tokens(reader: StringIO):
in_bracket = False
# empty str == EOF in Python
@@ -76,7 +88,7 @@ class InventoryBase(SchemaBase):
if schema_name != cls.SCHEMA_NAME:
raise ValueError(f"Expected schema name {schema_name!r} to be {cls.SCHEMA_NAME!r}")
fields = cls._fields_dict()
fields = cls._get_fields_dict()
obj_dict = {}
for key, val in tok_iter:
if key in fields:
@@ -100,7 +112,7 @@ class InventoryBase(SchemaBase):
def to_writer(self, writer: StringIO):
writer.write(f"\t{self.SCHEMA_NAME}\t0\n")
writer.write("\t{\n")
for field_name, field in self._fields_dict().items():
for field_name, field in self._get_fields_dict().items():
spec = field.metadata.get("spec")
# Not meant to be serialized
if not spec:
@@ -120,10 +132,14 @@ class InventoryBase(SchemaBase):
writer.write("\t}\n")
class InventoryDifferences(typing.NamedTuple):
changed: List[InventoryNodeBase]
removed: List[InventoryNodeBase]
class InventoryModel(InventoryBase):
def __init__(self):
self.containers: Dict[UUID, InventoryContainerBase] = {}
self.items: Dict[UUID, InventoryItem] = {}
self.nodes: Dict[UUID, InventoryNodeBase] = {}
self.root: Optional[InventoryContainerBase] = None
@classmethod
@@ -133,48 +149,113 @@ class InventoryModel(InventoryBase):
if key == "inv_object":
obj = InventoryObject.from_reader(reader)
if obj is not None:
model.add_container(obj)
model.add(obj)
elif key == "inv_category":
cat = InventoryCategory.from_reader(reader)
if cat is not None:
model.add_container(cat)
model.add(cat)
elif key == "inv_item":
item = InventoryItem.from_reader(reader)
if item is not None:
model.add_item(item)
model.add(item)
else:
LOG.warning("Unknown key {0}".format(key))
model.reparent_nodes()
return model
@classmethod
def from_llsd(cls, llsd_val: List[Dict]) -> InventoryModel:
model = cls()
for obj_dict in llsd_val:
if InventoryCategory.ID_ATTR in obj_dict:
if (obj := InventoryCategory.from_llsd(obj_dict)) is not None:
model.add(obj)
elif InventoryObject.ID_ATTR in obj_dict:
if (obj := InventoryObject.from_llsd(obj_dict)) is not None:
model.add(obj)
elif InventoryItem.ID_ATTR in obj_dict:
if (obj := InventoryItem.from_llsd(obj_dict)) is not None:
model.add(obj)
else:
LOG.warning(f"Unknown object type {obj_dict!r}")
return model
@property
def ordered_nodes(self) -> Iterable[InventoryNodeBase]:
yield from self.all_containers
yield from self.all_items
@property
def all_containers(self) -> Iterable[InventoryContainerBase]:
for node in self.nodes.values():
if isinstance(node, InventoryContainerBase):
yield node
@property
def all_items(self) -> Iterable[InventoryItem]:
for node in self.nodes.values():
if not isinstance(node, InventoryContainerBase):
yield node
def __eq__(self, other):
if not isinstance(other, InventoryModel):
return False
return set(self.nodes.values()) == set(other.nodes.values())
def to_writer(self, writer: StringIO):
for container in self.containers.values():
container.to_writer(writer)
for item in self.items.values():
item.to_writer(writer)
for node in self.ordered_nodes:
node.to_writer(writer)
def add_container(self, container: InventoryContainerBase):
self.containers[container.node_id] = container
container.model = weakref.proxy(self)
def to_llsd(self):
return list(node.to_llsd() for node in self.ordered_nodes)
def add_item(self, item: InventoryItem):
self.items[item.item_id] = item
item.model = weakref.proxy(self)
def add(self, node: InventoryNodeBase):
if node.node_id in self.nodes:
raise KeyError(f"{node.node_id} already exists in the inventory model")
def reparent_nodes(self):
self.root = None
for container in self.containers.values():
container.children.clear()
if container.parent_id == UUID():
self.root = container
for obj in itertools.chain(self.items.values(), self.containers.values()):
if not obj.parent_id or obj.parent_id == UUID():
continue
parent_container = self.containers.get(obj.parent_id)
if not parent_container:
LOG.warning("{0} had an invalid parent {1}".format(obj, obj.parent_id))
continue
parent_container.children.append(obj)
self.nodes[node.node_id] = node
if isinstance(node, InventoryContainerBase):
if node.parent_id == UUID.ZERO:
self.root = node
node.model = weakref.proxy(self)
def unlink(self, node: InventoryNodeBase) -> Sequence[InventoryNodeBase]:
"""Unlink a node and its descendants from the tree, returning the removed nodes"""
assert node.model == self
if node == self.root:
self.root = None
unlinked = [node]
if isinstance(node, InventoryContainerBase):
for child in node.children:
unlinked.extend(self.unlink(child))
self.nodes.pop(node.node_id, None)
node.model = None
return unlinked
def get_differences(self, other: InventoryModel) -> InventoryDifferences:
# Includes modified things with the same ID
changed_in_other = []
removed_in_other = []
other_keys = set(other.nodes.keys())
our_keys = set(self.nodes.keys())
# Removed
for key in our_keys - other_keys:
removed_in_other.append(self.nodes[key])
# Updated
for key in other_keys.intersection(our_keys):
other_node = other.nodes[key]
if other_node != self.nodes[key]:
changed_in_other.append(other_node)
# Added
for key in other_keys - our_keys:
changed_in_other.append(other.nodes[key])
return InventoryDifferences(
changed=changed_in_other,
removed=removed_in_other,
)
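# Usage sketch: diff a cached model against a freshly fetched one to decide
# what to upsert and what to drop:
#   diff = cached.get_differences(fetched)
#   for node in diff.changed: ...  # added to, or modified in, `fetched`
#   for node in diff.removed: ...  # present in `cached` but gone from `fetched`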
@dataclasses.dataclass
@@ -204,16 +285,27 @@ class InventorySaleInfo(InventoryBase):
class InventoryNodeBase(InventoryBase):
ID_ATTR: ClassVar[str]
name: str
parent_id: Optional[UUID] = schema_field(SchemaUUID)
model: Optional[InventoryModel] = dataclasses.field(default=None, init=False)
model: Optional[InventoryModel] = dataclasses.field(
default=None, init=False, hash=False, compare=False, repr=False
)
@property
def node_id(self) -> UUID:
return getattr(self, self.ID_ATTR)
@node_id.setter
def node_id(self, val: UUID):
setattr(self, self.ID_ATTR, val)
@property
def parent(self):
return self.model.containers.get(self.parent_id)
def parent(self) -> Optional[InventoryContainerBase]:
return self.model.nodes.get(self.parent_id)
def unlink(self) -> Sequence[InventoryNodeBase]:
return self.model.unlink(self)
@classmethod
def _obj_from_dict(cls, obj_dict):
@@ -224,12 +316,58 @@ class InventoryNodeBase(InventoryBase):
return None
return super()._obj_from_dict(obj_dict)
def __hash__(self):
return hash(self.node_id)
def __iter__(self) -> Iterator[InventoryNodeBase]:
return iter(())
def __contains__(self, item) -> bool:
return item in tuple(self)
@dataclasses.dataclass
class InventoryContainerBase(InventoryNodeBase):
type: str = schema_field(SchemaStr)
name: str = schema_field(SchemaMultilineStr)
children: List[InventoryNodeBase] = dataclasses.field(default_factory=list, init=False)
@property
def children(self) -> Sequence[InventoryNodeBase]:
return tuple(
x for x in self.model.nodes.values()
if x.parent_id == self.node_id
)
def __getitem__(self, item: Union[int, str]) -> InventoryNodeBase:
if isinstance(item, int):
return self.children[item]
for child in self.children:
if child.name == item:
return child
raise KeyError(f"{item!r} not found in children")
def __iter__(self) -> Iterator[InventoryNodeBase]:
return iter(self.children)
def get_or_create_subcategory(self, name: str) -> InventoryCategory:
for child in self:
if child.name == name and isinstance(child, InventoryCategory):
return child
child = InventoryCategory(
name=name,
cat_id=UUID.random(),
parent_id=self.node_id,
type="category",
pref_type="-1",
owner_id=getattr(self, 'owner_id', UUID.ZERO),
version=1,
)
self.model.add(child)
return child
# So autogenerated __hash__ doesn't kill our inherited one
__hash__ = InventoryNodeBase.__hash__
@dataclasses.dataclass
@@ -239,17 +377,21 @@ class InventoryObject(InventoryContainerBase):
obj_id: UUID = schema_field(SchemaUUID)
__hash__ = InventoryNodeBase.__hash__
@dataclasses.dataclass
class InventoryCategory(InventoryContainerBase):
ID_ATTR: ClassVar[str] = "cat_id"
SCHEMA_NAME: ClassVar[str] = "inv_object"
SCHEMA_NAME: ClassVar[str] = "inv_category"
cat_id: UUID = schema_field(SchemaUUID)
pref_type: str = schema_field(SchemaStr)
pref_type: str = schema_field(SchemaStr, llsd_name="preferred_type")
owner_id: UUID = schema_field(SchemaUUID)
version: int = schema_field(SchemaInt)
__hash__ = InventoryNodeBase.__hash__
@dataclasses.dataclass
class InventoryItem(InventoryNodeBase):
@@ -259,15 +401,17 @@ class InventoryItem(InventoryNodeBase):
item_id: UUID = schema_field(SchemaUUID)
type: str = schema_field(SchemaStr)
inv_type: str = schema_field(SchemaStr)
flags: int = schema_field(SchemaHexInt)
flags: int = schema_field(SchemaFlagField)
name: str = schema_field(SchemaMultilineStr)
desc: str = schema_field(SchemaMultilineStr)
creation_date: dt.datetime = schema_field(SchemaDate)
creation_date: dt.datetime = schema_field(SchemaDate, llsd_name="created_at")
permissions: InventoryPermissions = schema_field(InventoryPermissions)
sale_info: InventorySaleInfo = schema_field(InventorySaleInfo)
asset_id: Optional[UUID] = schema_field(SchemaUUID, default=None)
shadow_id: Optional[UUID] = schema_field(SchemaUUID, default=None)
__hash__ = InventoryNodeBase.__hash__
@property
def true_asset_id(self) -> UUID:
if self.asset_id is not None:

View File

@@ -1,7 +1,6 @@
import os
import tempfile
from io import BytesIO
from typing import *
import defusedxml.ElementTree
from glymur import jp2box, Jp2k
@@ -10,12 +9,6 @@ from glymur import jp2box, Jp2k
jp2box.ET = defusedxml.ElementTree
SL_DEFAULT_ENCODE = {
"cratios": (1920.0, 480.0, 120.0, 30.0, 10.0),
"irreversible": True,
}
class BufferedJp2k(Jp2k):
"""
For manipulating JP2K from within a binary buffer.
@@ -24,12 +17,7 @@ class BufferedJp2k(Jp2k):
based on filename, so this is the least brittle approach.
"""
def __init__(self, contents: bytes, encode_kwargs: Optional[Dict] = None):
if encode_kwargs is None:
self.encode_kwargs = SL_DEFAULT_ENCODE.copy()
else:
self.encode_kwargs = encode_kwargs
def __init__(self, contents: bytes):
stream = BytesIO(contents)
self.temp_file = tempfile.NamedTemporaryFile(delete=False)
stream.seek(0)
@@ -44,11 +32,12 @@ class BufferedJp2k(Jp2k):
os.remove(self.temp_file.name)
self.temp_file = None
def _write(self, img_array, verbose=False, **kwargs):
# Glymur normally only lets you control encode params when a write happens within
# the constructor. Keep around the encode params from the constructor and pass
# them to successive write calls.
return super()._write(img_array, verbose=False, **self.encode_kwargs, **kwargs)
def _populate_cparams(self, img_array):
if self._cratios is None:
self._cratios = (1920.0, 480.0, 120.0, 30.0, 10.0)
if self._irreversible is None:
self._irreversible = True
return super()._populate_cparams(img_array)
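# These fallbacks mirror the SL-style defaults that used to live in
# SL_DEFAULT_ENCODE: a five-layer progressive stream with compression ratios
# from 1920:1 down to 10:1, using the irreversible (lossy) wavelet transform.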
def __bytes__(self):
with open(self.temp_file.name, "rb") as f:

View File

@@ -31,6 +31,14 @@ class SchemaFieldSerializer(abc.ABC, Generic[_T]):
def serialize(cls, val: _T) -> str:
pass
@classmethod
def from_llsd(cls, val: Any) -> _T:
return val
@classmethod
def to_llsd(cls, val: _T) -> Any:
return val
class SchemaDate(SchemaFieldSerializer[dt.datetime]):
@classmethod
@@ -41,6 +49,14 @@ class SchemaDate(SchemaFieldSerializer[dt.datetime]):
def serialize(cls, val: dt.datetime) -> str:
return str(calendar.timegm(val.utctimetuple()))
@classmethod
def from_llsd(cls, val: Any) -> dt.datetime:
return dt.datetime.utcfromtimestamp(val)
@classmethod
def to_llsd(cls, val: dt.datetime):
return calendar.timegm(val.utctimetuple())
class SchemaHexInt(SchemaFieldSerializer[int]):
@classmethod
@@ -95,10 +111,11 @@ class SchemaUUID(SchemaFieldSerializer[UUID]):
def schema_field(spec: Type[Union[SchemaBase, SchemaFieldSerializer]], *, default=dataclasses.MISSING, init=True,
repr=True, hash=None, compare=True) -> dataclasses.Field: # noqa
repr=True, hash=None, compare=True, llsd_name=None) -> dataclasses.Field: # noqa
"""Describe a field in the inventory schema and the shape of its value"""
return dataclasses.field(
metadata={"spec": spec}, default=default, init=init, repr=repr, hash=hash, compare=compare
metadata={"spec": spec, "llsd_name": llsd_name}, default=default,
init=init, repr=repr, hash=hash, compare=compare,
)
@@ -121,8 +138,14 @@ def parse_schema_line(line: str):
@dataclasses.dataclass
class SchemaBase(abc.ABC):
@classmethod
def _fields_dict(cls):
return {f.name: f for f in dataclasses.fields(cls)}
def _get_fields_dict(cls, llsd=False):
fields_dict = {}
for field in dataclasses.fields(cls):
field_name = field.name
if llsd:
field_name = field.metadata.get("llsd_name") or field_name
fields_dict[field_name] = field
return fields_dict
@classmethod
def from_str(cls, text: str):
@@ -137,6 +160,30 @@ class SchemaBase(abc.ABC):
def from_bytes(cls, data: bytes):
return cls.from_str(data.decode("utf8"))
@classmethod
def from_llsd(cls, inv_dict: Dict):
fields = cls._get_fields_dict(llsd=True)
obj_dict = {}
for key, val in inv_dict.items():
if key in fields:
field: dataclasses.Field = fields[key]
key = field.name
spec = field.metadata.get("spec")
# Not a real key, an internal var on our dataclass
if not spec:
LOG.warning(f"Internal key {key!r}")
continue
# some kind of nested structure like sale_info
if issubclass(spec, SchemaBase):
obj_dict[key] = spec.from_llsd(val)
elif issubclass(spec, SchemaFieldSerializer):
obj_dict[key] = spec.from_llsd(val)
else:
raise ValueError(f"Unsupported spec for {key!r}, {spec!r}")
else:
LOG.warning(f"Unknown key {key!r}")
return cls._obj_from_dict(obj_dict)
def to_bytes(self) -> bytes:
return self.to_str().encode("utf8")
@@ -146,6 +193,28 @@ class SchemaBase(abc.ABC):
writer.seek(0)
return writer.read()
def to_llsd(self):
obj_dict = {}
for field_name, field in self._get_fields_dict(llsd=True).items():
spec = field.metadata.get("spec")
# Not meant to be serialized
if not spec:
continue
val = getattr(self, field.name)
if val is None:
continue
# Some kind of nested structure like sale_info
if isinstance(val, SchemaBase):
val = val.to_llsd()
elif issubclass(spec, SchemaFieldSerializer):
val = spec.to_llsd(val)
else:
raise ValueError(f"Bad inventory spec {spec!r}")
obj_dict[field_name] = val
return obj_dict
@abc.abstractmethod
def to_writer(self, writer: StringIO):
pass

View File

@@ -270,8 +270,8 @@ LOD_SEGMENT_SERIALIZER = SegmentSerializer({
# Each position represents a single vert.
"Position": se.Collection(None, se.Vector3U16(0.0, 1.0)),
"TexCoord0": se.Collection(None, se.Vector2U16(0.0, 1.0)),
# Normals have a static domain between -1 and 1
"Normal": se.Collection(None, se.Vector3U16(0.0, 1.0)),
# Normals have a static domain between -1 and 1, so just use that.
"Normal": se.Collection(None, se.Vector3U16(-1.0, 1.0)),
"Weights": se.Collection(None, VertexWeights)
})
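# Presumably each U16 component dequantizes linearly into its domain: a normal
# component n maps to -1.0 + (n / 65535) * 2.0, so 0 -> -1.0 and 65535 -> 1.0.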

View File

@@ -1,6 +1,9 @@
from __future__ import annotations
import abc
import asyncio
import copy
import dataclasses
import datetime as dt
import logging
from typing import *
@@ -13,6 +16,14 @@ from .msgtypes import PacketFlags
from .udpserializer import UDPMessageSerializer
@dataclasses.dataclass
class ReliableResendInfo:
last_resent: dt.datetime
message: Message
completed: asyncio.Future = dataclasses.field(default_factory=asyncio.Future)
tries_left: int = 10
class Circuit:
def __init__(self, near_host: Optional[ADDR_TUPLE], far_host: ADDR_TUPLE, transport):
self.near_host: Optional[ADDR_TUPLE] = near_host
@@ -22,6 +33,8 @@ class Circuit:
self.serializer = UDPMessageSerializer()
self.last_packet_at = dt.datetime.now()
self.packet_id_base = 0
self.unacked_reliable: Dict[Tuple[Direction, int], ReliableResendInfo] = {}
self.resend_every: float = 3.0
def _send_prepared_message(self, message: Message, transport=None):
try:
@@ -46,22 +59,69 @@ class Circuit:
raise RuntimeError(f"Trying to re-send finalized {message!r}")
message.packet_id = self.packet_id_base
self.packet_id_base += 1
if not message.acks:
message.send_flags &= PacketFlags.ACK
if message.acks:
message.send_flags |= PacketFlags.ACK
else:
message.send_flags &= ~PacketFlags.ACK
# If it was queued, it's not anymore
message.queued = False
message.finalized = True
def send_message(self, message: Message, transport=None):
def send(self, message: Message, transport=None) -> UDPPacket:
if self.prepare_message(message):
# If the message originates from us then we're responsible for resends.
if message.reliable and message.synthetic:
self.unacked_reliable[(message.direction, message.packet_id)] = ReliableResendInfo(
last_resent=dt.datetime.now(),
message=message,
)
return self._send_prepared_message(message, transport)
# Temporary alias
send_message = send
def send_reliable(self, message: Message, transport=None) -> asyncio.Future:
"""send() wrapper that always sends reliably and allows `await`ing ACK receipt"""
if not message.synthetic:
raise ValueError("Not able to send non-synthetic message reliably!")
message.send_flags |= PacketFlags.RELIABLE
self.send(message, transport)
return self.unacked_reliable[(message.direction, message.packet_id)].completed
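Since `send_reliable()` hands back the `completed` future, callers can await delivery confirmation directly. A usage sketch, assuming an established `circuit` and an event loop that is also pumping `resend_unacked()`:

    async def send_and_confirm(circuit: Circuit, msg: Message):
        try:
            # Resolves when the peer ACKs; the resend machinery sets a
            # TimeoutError once tries_left is exhausted.
            await asyncio.wait_for(circuit.send_reliable(msg), timeout=60.0)
        except (TimeoutError, asyncio.TimeoutError):
            logging.warning("No ACK for %s, giving up", msg.name)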
def collect_acks(self, message: Message):
effective_acks = list(message.acks)
if message.name == "PacketAck":
effective_acks.extend(x["ID"] for x in message["Packets"])
for ack in effective_acks:
resend_info = self.unacked_reliable.pop((~message.direction, ack), None)
if resend_info:
resend_info.completed.set_result(None)
def resend_unacked(self):
for resend_info in list(self.unacked_reliable.values()):
# Not time to attempt a resend yet
if dt.datetime.now() - resend_info.last_resent < dt.timedelta(seconds=self.resend_every):
continue
msg = copy.copy(resend_info.message)
resend_info.tries_left -= 1
# We were on our last try and we never received an ack
if not resend_info.tries_left:
logging.warning(f"Giving up on unacked {msg.packet_id}")
del self.unacked_reliable[(msg.direction, msg.packet_id)]
resend_info.completed.set_exception(TimeoutError("Exceeded resend limit"))
continue
resend_info.last_resent = dt.datetime.now()
msg.send_flags |= PacketFlags.RESENT
self._send_prepared_message(msg)
def send_acks(self, to_ack: Sequence[int], direction=Direction.OUT, packet_id=None):
logging.debug("%r acking %r" % (direction, to_ack))
# TODO: maybe tack this onto `.acks` for next message?
message = Message('PacketAck', *[Block('Packets', ID=x) for x in to_ack])
message.packet_id = packet_id
message.direction = direction
message.injected = True
self.send_message(message)
self.send(message)
def __repr__(self):
return "<%s %r : %r>" % (self.__class__.__name__, self.near_host, self.host)
@@ -77,4 +137,4 @@ class ConnectionHolder(abc.ABC):
lifetime of a session (due to region restarts, etc.)
"""
circuit: Optional[Circuit]
message_handler: MessageHandler[Message]
message_handler: MessageHandler[Message, str]

View File

@@ -5,14 +5,13 @@ from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.message.data_packer import LLSDDataPacker
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.message.template import MessageTemplateVariable
from hippolyzer.lib.base.message.template_dict import TemplateDictionary
from hippolyzer.lib.base.message.template_dict import TemplateDictionary, DEFAULT_TEMPLATE_DICT
VAR_PAIR = Tuple[dict, MessageTemplateVariable]
class LLSDMessageSerializer:
DEFAULT_TEMPLATE = TemplateDictionary()
DEFAULT_TEMPLATE = DEFAULT_TEMPLATE_DICT
def __init__(self, message_template=None, message_cls: Type[Message] = Message):
if message_template is not None:

View File

@@ -22,6 +22,7 @@ from __future__ import annotations
import copy
import enum
import importlib
import itertools
import logging
import os
@@ -31,6 +32,7 @@ from typing import *
from hippolyzer.lib.base.datatypes import *
import hippolyzer.lib.base.serialization as se
import hippolyzer.lib.base.templates as templates
from hippolyzer.lib.base.datatypes import Pretty
from hippolyzer.lib.base.message.msgtypes import PacketFlags
from hippolyzer.lib.base.network.transport import Direction, ADDR_TUPLE
@@ -61,11 +63,12 @@ class Block:
Block expects a name, and kwargs for variables (var_name = value)
"""
__slots__ = ('name', 'size', 'vars', 'message_name', '_ser_cache', 'fill_missing',)
PARENT_MESSAGE_NAME: ClassVar[Optional[str]] = None
def __init__(self, name, /, *, fill_missing=False, **kwargs):
self.name = name
self.size = 0
self.message_name: Optional[str] = None
self.message_name: Optional[str] = self.PARENT_MESSAGE_NAME
self.vars: Dict[str, VAR_TYPE] = {}
self._ser_cache: Dict[str, Any] = {}
self.fill_missing = fill_missing
@@ -82,6 +85,9 @@ class Block:
return self.vars[name]
def __setitem__(self, key, value):
if isinstance(value, Pretty):
return self.serialize_var(key, value.value)
# These don't pickle well since they're likely to get hot-reloaded
if isinstance(value, (enum.IntEnum, enum.IntFlag)):
value = int(value)
@@ -180,9 +186,9 @@ class MsgBlockList(List["Block"]):
class Message:
__slots__ = ("name", "send_flags", "_packet_id", "acks", "body_boundaries", "queued",
__slots__ = ("name", "send_flags", "packet_id", "acks", "body_boundaries", "queued",
"offset", "raw_extra", "raw_body", "deserializer", "_blocks", "finalized",
"direction", "meta", "injected", "dropped", "sender")
"direction", "meta", "synthetic", "dropped", "sender")
def __init__(self, name, *args, packet_id=None, flags=0, acks=None, direction=None):
# TODO: Do this on a timer or something.
@@ -190,7 +196,7 @@ class Message:
self.name = name
self.send_flags = flags
self._packet_id: Optional[int] = packet_id # aka, sequence number
self.packet_id: Optional[int] = packet_id # aka, sequence number
self.acks = acks if acks is not None else tuple()
self.body_boundaries = (-1, -1)
@@ -207,22 +213,12 @@ class Message:
self.queued: bool = False
self._blocks: BLOCK_DICT = {}
self.meta = {}
self.injected = False
self.synthetic = packet_id is None
self.dropped = False
self.sender: Optional[ADDR_TUPLE] = None
self.add_blocks(args)
@property
def packet_id(self) -> Optional[int]:
return self._packet_id
@packet_id.setter
def packet_id(self, val: Optional[int]):
self._packet_id = val
# Changing packet ID clears the finalized flag
self.finalized = False
def add_blocks(self, block_list):
# can have a list of blocks if it is multiple or variable
for block in block_list:
@@ -295,7 +291,7 @@ class Message:
if self.raw_body and self.deserializer():
self.deserializer().parse_message_body(self)
def to_dict(self):
def to_dict(self, extended=False):
""" A dict representation of a message.
This is the form used for templated messages sent via EQ.
@@ -311,6 +307,18 @@ class Message:
new_vars[var_name] = val
dict_blocks.append(new_vars)
if extended:
base_repr.update({
"packet_id": self.packet_id,
"meta": self.meta.copy(),
"dropped": self.dropped,
"synthetic": self.synthetic,
"direction": self.direction.name,
"send_flags": int(self.send_flags),
"extra": self.extra,
"acks": self.acks,
})
return base_repr
@classmethod
@@ -320,6 +328,17 @@ class Message:
msg.create_block_list(block_type)
for block in blocks:
msg.add_block(Block(block_type, **block))
if 'packet_id' in dict_val:
# extended format
msg.packet_id = dict_val['packet_id']
msg.meta = dict_val['meta']
msg.dropped = dict_val['dropped']
msg.synthetic = dict_val['synthetic']
msg.direction = Direction[dict_val['direction']]
msg.send_flags = dict_val['send_flags']
msg.extra = dict_val['extra']
msg.acks = dict_val['acks']
return msg
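With the extended keys present, `to_dict(extended=True)` and `from_dict()` form a round trip for proxied messages; a small sketch, assuming `msg` is a fully parsed Message:

    serialized = msg.to_dict(extended=True)   # adds packet_id, flags, acks, etc.
    restored = Message.from_dict(serialized)  # "packet_id" key triggers extended path
    assert restored.packet_id == msg.packet_id
    assert restored.synthetic == msg.synthetic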
def invalidate_caches(self):
@@ -358,12 +377,16 @@ class Message:
message_copy = copy.deepcopy(self)
# Set the queued flag so the original will be dropped and acks will be sent
self.queued = True
if not self.finalized:
self.queued = True
# Original was dropped so let's make sure we have clean acks and packet id
message_copy.acks = tuple()
message_copy.send_flags &= ~PacketFlags.ACK
message_copy.packet_id = None
message_copy.dropped = False
message_copy.finalized = False
message_copy.queued = False
return message_copy
def to_summary(self):

View File

@@ -62,9 +62,16 @@ class HumanMessageSerializer:
continue
if first_line:
direction, message_name = line.split(" ", 1)
first_split = [x for x in line.split(" ") if x]
direction, message_name = first_split[:2]
options = [x.strip("[]") for x in first_split[2:]]
msg = Message(message_name)
msg.direction = Direction[direction.upper()]
for option in options:
if option in PacketFlags.__members__:
msg.send_flags |= PacketFlags[option]
elif re.match(r"^\d+$", option):
msg.send_flags |= int(option)
first_line = False
continue
@@ -137,9 +144,17 @@ class HumanMessageSerializer:
if msg.direction is not None:
string += f'{msg.direction.name} '
string += msg.name
flags = msg.send_flags
for poss_flag in iter(PacketFlags):
if flags & poss_flag:
flags &= ~poss_flag
string += f" [{poss_flag.name}]"
# Make sure flags with unknown meanings don't get lost
if flags:
string += f" [{int(flags)}]"
if msg.packet_id is not None:
string += f'\n# {msg.packet_id}: {PacketFlags(msg.send_flags)!r}'
string += f'{", DROPPED" if msg.dropped else ""}{", INJECTED" if msg.injected else ""}'
string += f'\n# ID: {msg.packet_id}'
string += f'{", DROPPED" if msg.dropped else ""}{", SYNTHETIC" if msg.synthetic else ""}'
if msg.extra:
string += f'\n# EXTRA: {msg.extra!r}'
string += '\n\n'
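For reference, the option syntax the parser above accepts (and the serializer now emits) looks roughly like this; the message body is omitted and the exact layout is illustrative. Known flags round-trip by name, and the residual `[4]` shows how a flag with no known meaning survives as a bare integer:

    OUT ChatFromViewer [RELIABLE] [4]
    # ID: 12345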

View File

@@ -28,28 +28,28 @@ from hippolyzer.lib.base.events import Event
LOG = logging.getLogger(__name__)
_T = TypeVar("_T")
_K = TypeVar("_K", bound=Hashable)
MESSAGE_HANDLER = Callable[[_T], Any]
PREDICATE = Callable[[_T], bool]
MESSAGE_NAMES = Union[str, Iterable[str]]
MESSAGE_NAMES = Iterable[_K]
class MessageHandler(Generic[_T]):
def __init__(self):
self.handlers: Dict[str, Event] = {}
class MessageHandler(Generic[_T, _K]):
def __init__(self, take_by_default: bool = True):
self.handlers: Dict[_K, Event] = {}
self.take_by_default = take_by_default
def register(self, message_name: str) -> Event:
def register(self, message_name: _K) -> Event:
LOG.debug('Creating a monitor for %s' % message_name)
return self.handlers.setdefault(message_name, Event())
def subscribe(self, message_name: str, handler: MESSAGE_HANDLER) -> Event:
def subscribe(self, message_name: _K, handler: MESSAGE_HANDLER) -> Event:
notifier = self.register(message_name)
notifier.subscribe(handler)
return notifier
def _subscribe_all(self, message_names: MESSAGE_NAMES, handler: MESSAGE_HANDLER,
predicate: Optional[PREDICATE] = None) -> List[Event]:
if isinstance(message_names, str):
message_names = (message_names,)
notifiers = [self.register(name) for name in message_names]
for n in notifiers:
n.subscribe(handler, predicate=predicate)
@@ -57,7 +57,7 @@ class MessageHandler(Generic[_T]):
@contextlib.contextmanager
def subscribe_async(self, message_names: MESSAGE_NAMES, predicate: Optional[PREDICATE] = None,
take: bool = True) -> ContextManager[Callable[[], Awaitable[_T]]]:
take: Optional[bool] = None) -> ContextManager[Callable[[], Awaitable[_T]]]:
"""
Subscribe to a set of messages matching predicate while within a block
@@ -69,6 +69,8 @@ class MessageHandler(Generic[_T]):
If a subscriber is just an observer that will never drop or modify a message, take=False
may be used and messages will be sent as usual.
"""
if take is None:
take = self.take_by_default
msg_queue = asyncio.Queue()
def _handler_wrapper(message: _T):
@@ -91,8 +93,8 @@ class MessageHandler(Generic[_T]):
for n in notifiers:
n.unsubscribe(_handler_wrapper)
def wait_for(self, message_names: MESSAGE_NAMES,
predicate: Optional[PREDICATE] = None, timeout=None, take=True) -> Awaitable[_T]:
def wait_for(self, message_names: MESSAGE_NAMES, predicate: Optional[PREDICATE] = None,
timeout: Optional[float] = None, take: Optional[bool] = None) -> Awaitable[_T]:
"""
Wait for a single instance of one of message_names matching predicate
@@ -101,16 +103,18 @@ class MessageHandler(Generic[_T]):
sequence of packets, since multiple packets may come in after the future has already
been marked completed, causing some to be missed.
"""
if isinstance(message_names, str):
message_names = (message_names,)
if take is None:
take = self.take_by_default
notifiers = [self.register(name) for name in message_names]
fut = asyncio.get_event_loop().create_future()
loop = asyncio.get_event_loop_policy().get_event_loop()
fut = loop.create_future()
timeout_task = None
async def _canceller():
await asyncio.sleep(timeout)
fut.set_exception(asyncio.exceptions.TimeoutError("Timed out waiting for packet"))
if not fut.done():
fut.set_exception(asyncio.exceptions.TimeoutError("Timed out waiting for packet"))
for n in notifiers:
n.unsubscribe(_handler)
@@ -123,7 +127,8 @@ class MessageHandler(Generic[_T]):
# Whatever was awaiting this future now owns this message
if take:
message = message.take()
fut.set_result(message)
if not fut.done():
fut.set_result(message)
# Make sure to unregister this handler for all message types
for n in notifiers:
n.unsubscribe(_handler)
@@ -132,7 +137,7 @@ class MessageHandler(Generic[_T]):
notifier.subscribe(_handler, predicate=predicate)
return fut
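A usage sketch of the reworked generic handler, assuming a `MessageHandler[Message, str]` named `handler` that is being fed messages elsewhere (block and field names are illustrative):

    async def log_next_chat(handler: MessageHandler[Message, str]):
        msg = await handler.wait_for(
            ("ChatFromSimulator",),  # names are always passed as an iterable now
            predicate=lambda m: m["ChatData"]["ChatType"] != 0,
            timeout=10.0,
            take=False,              # observe only; omit to use take_by_default
        )
        return msg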
def is_handled(self, message_name: str):
def is_handled(self, message_name: _K):
return message_name in self.handlers
def handle(self, message: _T):
@@ -140,7 +145,7 @@ class MessageHandler(Generic[_T]):
# Always try to call wildcard handlers
self._handle_type('*', message)
def _handle_type(self, name: str, message: _T):
def _handle_type(self, name: _K, message: _T):
handler = self.handlers.get(name)
if not handler:
return

View File

@@ -22,6 +22,7 @@ Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
import typing
from .msgtypes import MsgType, MsgBlockType
from ..datatypes import UUID
class MessageTemplateVariable:
@@ -61,6 +62,32 @@ class MessageTemplateVariable:
self._probably_text = self._probably_text and self.name != "NameValue"
return self._probably_text
@property
def default_value(self):
if self.type.is_int:
return 0
elif self.type.is_float:
return 0.0
elif self.type == MsgType.MVT_LLUUID:
return UUID()
elif self.type == MsgType.MVT_BOOL:
return False
elif self.type == MsgType.MVT_VARIABLE:
if self.probably_binary:
return b""
if self.probably_text:
return ""
return b""
elif self.type in (MsgType.MVT_LLVector3, MsgType.MVT_LLVector3d, MsgType.MVT_LLQuaternion):
return 0.0, 0.0, 0.0
elif self.type == MsgType.MVT_LLVector4:
return 0.0, 0.0, 0.0, 0.0
elif self.type == MsgType.MVT_FIXED:
return b"\x00" * self.size
elif self.type == MsgType.MVT_IP_ADDR:
return "0.0.0.0"
return None
class MessageTemplateBlock:
def __init__(self, name):

View File

@@ -27,25 +27,35 @@ from .template import MessageTemplate
from .template_parser import MessageTemplateParser
DEFAULT_PARSER = MessageTemplateParser(msg_tmpl)
class TemplateDictionary:
"""the dictionary with all known templates"""
def __init__(self, template_list=None, message_template=None):
if template_list is None:
if message_template is None:
parser = MessageTemplateParser(msg_tmpl)
parser = DEFAULT_PARSER
else:
parser = MessageTemplateParser(message_template)
template_list = parser.message_templates
self.template_list: typing.List[MessageTemplate] = template_list
self.template_list: typing.List[MessageTemplate] = []
# maps name to template
self.message_templates = {}
# maps (freq,num) to template
self.message_dict = {}
self.load_templates(template_list)
def load_templates(self, template_list):
self.template_list.clear()
self.template_list.extend(template_list)
self.message_templates.clear()
self.message_dict.clear()
self.build_dictionaries(template_list)
self.build_message_ids()
@@ -99,3 +109,6 @@ class TemplateDictionary:
def __iter__(self):
return iter(self.template_list)
DEFAULT_TEMPLATE_DICT = TemplateDictionary()

View File

@@ -26,7 +26,7 @@ from logging import getLogger
from hippolyzer.lib.base.datatypes import JankStringyBytes
from hippolyzer.lib.base.settings import Settings
from .template import MessageTemplateVariable
from .template_dict import TemplateDictionary
from .template_dict import DEFAULT_TEMPLATE_DICT
from .msgtypes import MsgType, MsgBlockType, PacketLayout
from .data_packer import TemplateDataPacker
from .message import Message, Block
@@ -62,13 +62,13 @@ def _parse_msg_num(reader: se.BufferReader):
class UDPMessageDeserializer:
DEFAULT_TEMPLATE = TemplateDictionary()
DEFAULT_TEMPLATE = DEFAULT_TEMPLATE_DICT
def __init__(self, settings=None):
self.settings = settings or Settings()
self.template_dict = self.DEFAULT_TEMPLATE
def deserialize(self, msg_buff: bytes):
def deserialize(self, msg_buff: bytes) -> Message:
msg = self._parse_message_header(msg_buff)
if not self.settings.ENABLE_DEFERRED_PACKET_PARSING:
try:
@@ -85,6 +85,7 @@ class UDPMessageDeserializer:
reader = se.BufferReader("!", data)
msg: Message = Message("Placeholder")
msg.synthetic = False
msg.send_flags = reader.read(se.U8)
msg.packet_id = reader.read(se.U32)

View File

@@ -26,7 +26,7 @@ from .data_packer import TemplateDataPacker
from .message import Message, MsgBlockList
from .msgtypes import MsgType, MsgBlockType
from .template import MessageTemplateVariable, MessageTemplateBlock
from .template_dict import TemplateDictionary
from .template_dict import TemplateDictionary, DEFAULT_TEMPLATE_DICT
from hippolyzer.lib.base import exc
from hippolyzer.lib.base import serialization as se
from hippolyzer.lib.base.datatypes import RawBytes
@@ -35,7 +35,7 @@ logger = getLogger('message.udpserializer')
class UDPMessageSerializer:
DEFAULT_TEMPLATE = TemplateDictionary(None)
DEFAULT_TEMPLATE = DEFAULT_TEMPLATE_DICT
def __init__(self, message_template=None):
if message_template is not None:

View File

@@ -30,6 +30,7 @@ class UDPPacket:
self.dst_addr = dst_addr
self.data = data
self.direction = direction
self.meta = {}
@property
def outgoing(self):
@@ -58,7 +59,7 @@ class AbstractUDPTransport(abc.ABC):
pass
class WrappingUDPTransport(AbstractUDPTransport):
class SocketUDPTransport(AbstractUDPTransport):
def __init__(self, transport: Union[asyncio.DatagramTransport, socket.socket]):
super().__init__()
self.transport = transport

View File

@@ -45,8 +45,8 @@ class Object(recordclass.datatuple): # type: ignore
State: Optional[int] = None
FullID: Optional[UUID] = None
CRC: Optional[int] = None
PCode: Optional[int] = None
Material: Optional[int] = None
PCode: Optional[tmpls.PCode] = None
Material: Optional[tmpls.MCode] = None
ClickAction: Optional[int] = None
Scale: Optional[Vector3] = None
ParentID: Optional[int] = None
@@ -71,7 +71,7 @@ class Object(recordclass.datatuple): # type: ignore
ProfileBegin: Optional[int] = None
ProfileEnd: Optional[int] = None
ProfileHollow: Optional[int] = None
TextureEntry: Optional[tmpls.TextureEntry] = None
TextureEntry: Optional[tmpls.TextureEntryCollection] = None
TextureAnim: Optional[tmpls.TextureAnim] = None
NameValue: Optional[Any] = None
Data: Optional[Any] = None
@@ -182,14 +182,14 @@ class Object(recordclass.datatuple): # type: ignore
old_val = getattr(self, key, dataclasses.MISSING)
# Don't check equality if we're using a lazy proxy,
# parsing is deferred until we actually use it.
if isinstance(val, lazy_object_proxy.Proxy):
if any(isinstance(x, lazy_object_proxy.Proxy) for x in (old_val, val)):
# TODO: be smarter about this. Can we store the raw bytes and
# compare those if it's an unparsed object?
if old_val is not val:
updated_properties.add(key)
is_updated = old_val is not val
else:
if old_val != val:
updated_properties.add(key)
is_updated = old_val != val
if is_updated:
updated_properties.add(key)
setattr(self, key, val)
return updated_properties
@@ -270,6 +270,9 @@ def normalize_object_update_compressed_data(data: bytes):
# Only used for determining which sections are present
del compressed["Flags"]
# Unlike other ObjectUpdate types, a null value in an ObjectUpdateCompressed
# always means that there is no value, not that the value hasn't changed
# from the client's view. Use the default value when that happens.
ps_block = compressed.pop("PSBlockNew", None)
if ps_block is None:
ps_block = compressed.pop("PSBlock", None)
@@ -278,6 +281,20 @@ def normalize_object_update_compressed_data(data: bytes):
compressed.pop("PSBlock", None)
if compressed["NameValue"] is None:
compressed["NameValue"] = NameValueCollection()
if compressed["Text"] is None:
compressed["Text"] = b""
compressed["TextColor"] = b""
if compressed["MediaURL"] is None:
compressed["MediaURL"] = b""
if compressed["AngularVelocity"] is None:
compressed["AngularVelocity"] = Vector3()
if compressed["SoundFlags"] is None:
compressed["SoundFlags"] = 0
compressed["SoundGain"] = 0.0
compressed["SoundRadius"] = 0.0
compressed["Sound"] = UUID()
if compressed["TextureEntry"] is None:
compressed["TextureEntry"] = tmpls.TextureEntryCollection()
object_data = {
"PSBlock": ps_block.value,
@@ -286,9 +303,9 @@ def normalize_object_update_compressed_data(data: bytes):
"LocalID": compressed.pop("ID"),
**compressed,
}
if object_data["TextureEntry"] is None:
object_data.pop("TextureEntry")
# Don't clobber OwnerID in case the object has a proper one.
# Don't clobber OwnerID in case the object has a proper one from
# a previous ObjectProperties. OwnerID isn't expected to be populated
# on ObjectUpdates unless an attached sound is playing.
if object_data["OwnerID"] == UUID():
del object_data["OwnerID"]
return object_data

View File

@@ -19,81 +19,48 @@ along with this program; if not, write to the Free Software Foundation,
Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""
from __future__ import annotations
import dataclasses
from typing import *
_T = TypeVar("_T")
class SettingDescriptor(Generic[_T]):
__slots__ = ("name", "default")
def __init__(self, default: Union[Callable[[], _T], _T]):
self.default = default
self.name: Optional[str] = None
def __set_name__(self, owner: Settings, name: str):
self.name = name
def _make_default(self) -> _T:
if callable(self.default):
return self.default()
return self.default
def __get__(self, obj: Settings, owner: Optional[Type] = None) -> _T:
val: Union[_T, dataclasses.MISSING] = obj.get_setting(self.name)
if val is dataclasses.MISSING:
val = self._make_default()
return val
def __set__(self, obj: Settings, value: _T) -> None:
obj.set_setting(self.name, value)
class Settings:
def __init__(self, quiet_logging=False, spammy_logging=False, log_tests=True):
""" some lovely configurable settings
These are applied application wide, and can be
overridden at any time in a specific instance
quiet_logging overrides spammy_logging
"""
self.quiet_logging = quiet_logging
self.spammy_logging = spammy_logging
# toggle handling udp packets
self.HANDLE_PACKETS = True
self.HANDLE_OUTGOING_PACKETS = False
# toggle parsing all/handled packets
self.ENABLE_DEFERRED_PACKET_PARSING = True
# ~~~~~~~~~~~~~~~~~~
# Logging behaviors
# ~~~~~~~~~~~~~~~~~~
# being a test tool, and an immature one at that,
# enable fine granularity in the logging, but
# make sure we can tone it down as well
self.LOG_VERBOSE = True
self.ENABLE_BYTES_TO_HEX_LOGGING = False
self.ENABLE_CAPS_LOGGING = True
self.ENABLE_CAPS_LLSD_LOGGING = False
self.ENABLE_EQ_LOGGING = True
self.ENABLE_UDP_LOGGING = True
self.ENABLE_OBJECT_LOGGING = True
self.LOG_SKIPPED_PACKETS = True
self.ENABLE_HOST_LOGGING = True
self.LOG_COROUTINE_SPAWNS = True
self.PROXY_LOGGING = False
# allow disabling logging of certain packets
self.DISABLE_SPAMMERS = True
self.UDP_SPAMMERS = ['PacketAck', 'AgentUpdate']
# toggle handling a region's event queue
self.ENABLE_REGION_EVENT_QUEUE = True
# how many seconds to wait between polling
# a region's event queue
self.REGION_EVENT_QUEUE_POLL_INTERVAL = 1
if self.spammy_logging:
self.ENABLE_BYTES_TO_HEX_LOGGING = True
self.ENABLE_CAPS_LLSD_LOGGING = True
self.DISABLE_SPAMMERS = False
# override the defaults
if self.quiet_logging:
self.LOG_VERBOSE = False
self.ENABLE_BYTES_TO_HEX_LOGGING = False
self.ENABLE_CAPS_LOGGING = False
self.ENABLE_CAPS_LLSD_LOGGING = False
self.ENABLE_EQ_LOGGING = False
self.ENABLE_UDP_LOGGING = False
self.LOG_SKIPPED_PACKETS = False
self.ENABLE_OBJECT_LOGGING = False
self.ENABLE_HOST_LOGGING = False
self.LOG_COROUTINE_SPAWNS = False
self.DISABLE_SPAMMERS = True
# ~~~~~~~~~~~~~~~~~~~~~~
# Test related settings
# ~~~~~~~~~~~~~~~~~~~~~~
if log_tests:
self.ENABLE_LOGGING_IN_TESTS = True
else:
self.ENABLE_LOGGING_IN_TESTS = False
ENABLE_DEFERRED_PACKET_PARSING: bool = SettingDescriptor(True)
def __init__(self):
self._settings: Dict[str, Any] = {}
def get_setting(self, name: str) -> Any:
return self._settings.get(name, dataclasses.MISSING)
def set_setting(self, name: str, val: Any):
self._settings[name] = val
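A short sketch of the descriptor pattern the rewrite introduces; `EXAMPLE_TIMEOUT` is hypothetical, everything else is from the code above:

    class MySettings(Settings):
        EXAMPLE_TIMEOUT: float = SettingDescriptor(5.0)

    s = MySettings()
    assert s.ENABLE_DEFERRED_PACKET_PARSING is True  # unset, falls back to default
    s.EXAMPLE_TIMEOUT = 1.5                          # stored via set_setting()
    assert s.EXAMPLE_TIMEOUT == 1.5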

View File

@@ -3,16 +3,18 @@ Serialization templates for structures used in LLUDP and HTTP bodies.
"""
import abc
import collections
import dataclasses
import enum
import importlib
import logging
import math
import zlib
from typing import *
import hippolyzer.lib.base.serialization as se
from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.datatypes import UUID, IntEnum, IntFlag
from hippolyzer.lib.base.datatypes import UUID, IntEnum, IntFlag, Vector3
from hippolyzer.lib.base.namevalue import NameValuesSerializer
try:
@@ -144,6 +146,50 @@ class InventoryType(IntEnum):
}.get(lower, lower)
class FolderType(IntEnum):
TEXTURE = 0
SOUND = 1
CALLINGCARD = 2
LANDMARK = 3
CLOTHING = 5
OBJECT = 6
NOTECARD = 7
# We'd really like to change this to 9 since AT_CATEGORY is 8,
# but "My Inventory" has been type 8 for a long time.
ROOT_INVENTORY = 8
LSL_TEXT = 10
BODYPART = 13
TRASH = 14
SNAPSHOT_CATEGORY = 15
LOST_AND_FOUND = 16
ANIMATION = 20
GESTURE = 21
FAVORITE = 23
ENSEMBLE_START = 26
ENSEMBLE_END = 45
# This range is reserved for special clothing folder types.
CURRENT_OUTFIT = 46
OUTFIT = 47
MY_OUTFITS = 48
MESH = 49
# "received items" for MP
INBOX = 50
OUTBOX = 51
BASIC_ROOT = 52
MARKETPLACE_LISTINGS = 53
MARKETPLACE_STOCK = 54
# Note: We actually *never* create folders with that type. This is used for icon override only.
MARKETPLACE_VERSION = 55
SETTINGS = 56
# Firestorm folders, may not actually exist
FIRESTORM = 57
PHOENIX = 58
RLV = 59
# Opensim folders
MY_SUITCASE = 100
NONE = -1
@se.enum_field_serializer("AgentIsNowWearing", "WearableData", "WearableType")
@se.enum_field_serializer("AgentWearablesUpdate", "WearableData", "WearableType")
@se.enum_field_serializer("CreateInventoryItem", "InventoryBlock", "WearableType")
@@ -177,6 +223,7 @@ def _register_permissions_flags(message_name, block_name):
@se.flag_field_serializer("ObjectPermissions", "ObjectData", "Mask")
@_register_permissions_flags("ObjectProperties", "ObjectData")
@_register_permissions_flags("ObjectPropertiesFamily", "ObjectData")
@_register_permissions_flags("UpdateCreateInventoryItem", "InventoryData")
@_register_permissions_flags("UpdateTaskInventory", "InventoryData")
@_register_permissions_flags("CreateInventoryItem", "InventoryBlock")
@@ -201,11 +248,74 @@ class Permissions(IntFlag):
RESERVED = 1 << 31
@se.enum_field_serializer("ObjectSaleInfo", "ObjectData", "SaleType")
@se.enum_field_serializer("ObjectProperties", "ObjectData", "SaleType")
@se.enum_field_serializer("ObjectPropertiesFamily", "ObjectData", "SaleType")
@se.enum_field_serializer("ObjectBuy", "ObjectData", "SaleType")
@se.enum_field_serializer("RezScript", "InventoryBlock", "SaleType")
@se.enum_field_serializer("RezObject", "InventoryData", "SaleType")
@se.enum_field_serializer("UpdateTaskInventory", "InventoryData", "SaleType")
@se.enum_field_serializer("UpdateCreateInventoryItem", "InventoryData", "SaleType")
class SaleInfo(IntEnum):
NOT = 0
ORIGINAL = 1
COPY = 2
CONTENTS = 3
@se.flag_field_serializer("ParcelInfoReply", "Data", "Flags")
class ParcelInfoFlags(IntFlag):
MATURE = 1 << 0
# You should never see adult without mature
ADULT = 1 << 1
GROUP_OWNED = 1 << 2
@se.flag_field_serializer("MapItemRequest", "AgentData", "Flags")
@se.flag_field_serializer("MapNameRequest", "AgentData", "Flags")
@se.flag_field_serializer("MapBlockRequest", "AgentData", "Flags")
@se.flag_field_serializer("MapItemReply", "AgentData", "Flags")
@se.flag_field_serializer("MapNameReply", "AgentData", "Flags")
@se.flag_field_serializer("MapBlockReply", "AgentData", "Flags")
class MapImageFlags(IntFlag):
# No clue, honestly. I guess there's potentially different image types you could request.
LAYER = 1 << 1
@se.enum_field_serializer("MapBlockReply", "Data", "Access")
class SimAccess(IntEnum):
# Treated as 'unknown', usually ends up being SIM_ACCESS_PG
MIN = 0
PG = 13
MATURE = 21
ADULT = 42
DOWN = 254
@se.enum_field_serializer("MapItemRequest", "RequestData", "ItemType")
@se.enum_field_serializer("MapItemReply", "RequestData", "ItemType")
class MapItemType(IntEnum):
TELEHUB = 0x01
PG_EVENT = 0x02
MATURE_EVENT = 0x03
# No longer supported, 2009-03-02 KLW
DEPRECATED_POPULAR = 0x04
DEPRECATED_AGENT_COUNT = 0x05
AGENT_LOCATIONS = 0x06
LAND_FOR_SALE = 0x07
CLASSIFIED = 0x08
ADULT_EVENT = 0x09
LAND_FOR_SALE_ADULT = 0x0a
@se.flag_field_serializer("RezObject", "RezData", "ItemFlags")
@se.flag_field_serializer("RezMultipleAttachmentsFromInv", "ObjectData", "ItemFlags")
@se.flag_field_serializer("RezObject", "InventoryData", "Flags")
@se.flag_field_serializer("RezScript", "InventoryBlock", "Flags")
@se.flag_field_serializer("UpdateCreateInventoryItem", "InventoryData", "Flags")
@se.flag_field_serializer("UpdateTaskInventory", "InventoryData", "Flags")
@se.flag_field_serializer("ChangeInventoryItemFlags", "InventoryData", "Flags")
class InventoryItemFlags(IntFlag):
# The asset has only one reference in the system. If the
# inventory item is deleted, or the assetid updated, then we
@@ -232,7 +342,8 @@ class InventoryItemFlags(IntFlag):
OBJECT_HAS_MULTIPLE_ITEMS = 0x200000
@property
def attachment_point(self):
def subtype(self):
"""Subtype of the given item type, could be an attachment point or setting type, etc."""
return self & 0xFF
@@ -742,10 +853,26 @@ class PCode(IntEnum):
TREE = 255
@se.enum_field_serializer("ObjectUpdate", "ObjectData", "Material")
@se.enum_field_serializer("ObjectAdd", "ObjectData", "Material")
@se.enum_field_serializer("ObjectMaterial", "ObjectData", "Material")
class MCode(IntEnum):
# Seems like this is normally stored in a U8 with the high nybble masked off?
# What's in the high nybble, anything?
STONE = 0
METAL = 1
GLASS = 2
WOOD = 3
FLESH = 4
PLASTIC = 5
RUBBER = 6
LIGHT = 7
@se.flag_field_serializer("ObjectUpdate", "ObjectData", "UpdateFlags")
@se.flag_field_serializer("ObjectUpdateCompressed", "ObjectData", "UpdateFlags")
@se.flag_field_serializer("ObjectUpdateCached", "ObjectData", "UpdateFlags")
@se.flag_field_serializer("ObjectAdd", "ObjectData", "AddFlags")
@se.flag_field_serializer("ObjectDuplicate", "SharedData", "DuplicateFlags")
class ObjectUpdateFlags(IntFlag):
USE_PHYSICS = 1 << 0
CREATE_SELECTED = 1 << 1
@@ -781,6 +908,9 @@ class ObjectUpdateFlags(IntFlag):
ZLIB_COMPRESSED_REPRECATED = 1 << 31
JUST_CREATED_FLAGS = (ObjectUpdateFlags.CREATE_SELECTED | ObjectUpdateFlags.OBJECT_YOU_OWNER)
class AttachmentStateAdapter(se.Adapter):
# Encoded attachment point ID for attached objects
# nibbles are swapped around because old attachment nums only used to live
@@ -801,7 +931,7 @@ class AttachmentStateAdapter(se.Adapter):
@se.flag_field_serializer("AgentUpdate", "AgentData", "State")
class AgentState(IntFlag):
TYPING = 1 << 3
TYPING = 1 << 2
EDITING = 1 << 4
@@ -825,6 +955,15 @@ class ObjectStateSerializer(se.AdapterSubfieldSerializer):
ORIG_INLINE = True
@se.subfield_serializer("ObjectUpdate", "RegionData", "TimeDilation")
@se.subfield_serializer("ObjectUpdateCompressed", "RegionData", "TimeDilation")
@se.subfield_serializer("ObjectUpdateCached", "RegionData", "TimeDilation")
@se.subfield_serializer("ImprovedTerseObjectUpdate", "RegionData", "TimeDilation")
class TimeDilationSerializer(se.AdapterSubfieldSerializer):
ADAPTER = se.QuantizedFloat(se.U16, 0.0, 1.0, False)
ORIG_INLINE = True
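The dequantization being registered here is a plain linear mapping over the U16 range; roughly (a sketch of the math, not of QuantizedFloat's internals):

    def dequantize_time_dilation(wire: int) -> float:
        # U16 wire value -> float in [0.0, 1.0]
        return wire / 65535.0

    assert dequantize_time_dilation(65535) == 1.0  # full speed, no dilation
    assert dequantize_time_dilation(0) == 0.0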
@se.subfield_serializer("ImprovedTerseObjectUpdate", "ObjectData", "Data")
class ImprovedTerseObjectUpdateDataSerializer(se.SimpleSubfieldSerializer):
TEMPLATE = se.Template({
@@ -847,12 +986,12 @@ class ShineLevel(IntEnum):
HIGH = 3
@dataclasses.dataclass
@dataclasses.dataclass(unsafe_hash=True)
class BasicMaterials:
# Meaning is technically implementation-dependent, these are in LL data files
Bump: int = se.bitfield_field(bits=5)
FullBright: bool = se.bitfield_field(bits=1, adapter=se.BoolAdapter())
Shiny: int = se.bitfield_field(bits=2, adapter=se.IntEnum(ShineLevel))
Bump: int = se.bitfield_field(bits=5, default=0)
FullBright: bool = se.bitfield_field(bits=1, adapter=se.BoolAdapter(), default=False)
Shiny: int = se.bitfield_field(bits=2, adapter=se.IntEnum(ShineLevel), default=0)
BUMP_SHINY_FULLBRIGHT = se.BitfieldDataclass(BasicMaterials, se.U8)
@@ -866,12 +1005,12 @@ class TexGen(IntEnum):
CYLINDRICAL = 0x6
@dataclasses.dataclass
@dataclasses.dataclass(unsafe_hash=True)
class MediaFlags:
WebPage: bool = se.bitfield_field(bits=1, adapter=se.BoolAdapter())
TexGen: "TexGen" = se.bitfield_field(bits=2, adapter=se.IntEnum(TexGen))
WebPage: bool = se.bitfield_field(bits=1, adapter=se.BoolAdapter(), default=False)
TexGen: "TexGen" = se.bitfield_field(bits=2, adapter=se.IntEnum(TexGen), default=TexGen.DEFAULT)
# Probably unused but show it just in case
_Unused: int = se.bitfield_field(bits=5)
_Unused: int = se.bitfield_field(bits=5, default=0)
# Not shifted so enum definitions can match indra
@@ -1007,31 +1146,147 @@ class TEExceptionField(se.SerializableBase):
return dict
def _te_dataclass_field(spec: se.SERIALIZABLE_TYPE, first=False, optional=False):
return se.dataclass_field(TEExceptionField(spec, first=first, optional=optional))
def _te_field(spec: se.SERIALIZABLE_TYPE, first=False, optional=False,
default_factory=dataclasses.MISSING, default=dataclasses.MISSING):
if default_factory is not dataclasses.MISSING:
new_default_factory = lambda: {None: default_factory()}
elif default is not None:
new_default_factory = lambda: {None: default}
else:
new_default_factory = dataclasses.MISSING
return se.dataclass_field(
TEExceptionField(spec, first=first, optional=optional),
default_factory=new_default_factory,
)
_T = TypeVar("_T")
TE_FIELD_TYPE = Dict[Optional[Sequence[int]], _T]
_TE_FIELD_KEY = Optional[Sequence[int]]
# If this seems weird it's because it is. TE offsets are S16s with `0` as the actual 0
# point, and LL divides by `0x7FFF` to convert back to float. Negative S16s can
# actually go to -0x8000 due to two's complement, creating a larger range for negatives.
TE_S16_COORD = se.QuantizedFloat(se.S16, -1.000030518509476, 1.0, False)
class PackedTERotation(se.QuantizedFloat):
"""Another weird one, packed TE rotations have their own special quantization"""
def __init__(self):
super().__init__(se.S16, math.pi * -2, math.pi * 2, zero_median=False)
self.step_mag = 1.0 / (se.U16.max_val + 1)
def _float_to_quantized(self, val: float, lower: float, upper: float):
val = math.fmod(val, upper)
val = super()._float_to_quantized(val, lower, upper)
if val == se.S16.max_val + 1:
val = self.prim_min
return val
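Concretely, decoding a TE offset is a divide by 0x7FFF, and two's complement lets negative values slightly overshoot -1.0, which is exactly where TE_S16_COORD's odd lower bound comes from. A sketch:

    def te_offset_to_float(wire: int) -> float:
        # S16 wire value -> float; 0x7FFF maps to exactly 1.0
        return wire / 0x7FFF

    assert te_offset_to_float(0x7FFF) == 1.0
    assert te_offset_to_float(0) == 0.0
    # -0x8000 lands just past -1.0: the -1.000030518509476 lower bound above
    assert abs(te_offset_to_float(-0x8000) - (-1.000030518509476)) < 1e-12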
@dataclasses.dataclass
class TextureEntry:
Textures: TE_FIELD_TYPE[UUID] = _te_dataclass_field(se.UUID, first=True)
"""Representation of a TE for a single face. Not sent over the wire."""
Textures: UUID = UUID('89556747-24cb-43ed-920b-47caed15465f')
Color: bytes = b"\xff\xff\xff\xff"
ScalesS: float = 1.0
ScalesT: float = 1.0
OffsetsS: float = 0.0
OffsetsT: float = 0.0
# In radians
Rotation: float = 0.0
MediaFlags: "MediaFlags" = dataclasses.field(default_factory=MediaFlags)
BasicMaterials: "BasicMaterials" = dataclasses.field(default_factory=BasicMaterials)
Glow: float = 0.0
Materials: UUID = UUID.ZERO
def st_to_uv(self, st_coord: Vector3) -> Vector3:
"""Convert OpenGL ST coordinates to UV coordinates, accounting for mapping"""
uv = Vector3(st_coord.X - 0.5, st_coord.Y - 0.5)
cos_rot = math.cos(self.Rotation)
sin_rot = math.sin(self.Rotation)
uv = Vector3(
X=uv.X * cos_rot + uv.Y * sin_rot,
Y=-uv.X * sin_rot + uv.Y * cos_rot
)
uv *= Vector3(self.ScalesS, self.ScalesT)
return uv + Vector3(self.OffsetsS + 0.5, self.OffsetsT + 0.5)
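A quick sanity check of the conversion, assuming a default-constructed face (identity mapping apart from the 0.5-centered rotation and scale handling):

    te = TextureEntry()
    uv = te.st_to_uv(Vector3(0.25, 0.75))
    # With default scale/offset/rotation, ST and UV coincide
    assert abs(uv.X - 0.25) < 1e-6 and abs(uv.Y - 0.75) < 1e-6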
# Max number of TEs possible according to llprimitive (but not really true!)
# Useful if you don't know how many faces / TEs an object really has because it's mesh
# or something.
MAX_TES = 45
@dataclasses.dataclass
class TextureEntryCollection:
Textures: Dict[_TE_FIELD_KEY, UUID] = _te_field(
# Plywood texture
se.UUID, first=True, default=UUID('89556747-24cb-43ed-920b-47caed15465f'))
# Bytes are inverted so fully opaque white is \x00\x00\x00\x00
Color: TE_FIELD_TYPE[bytes] = _te_dataclass_field(Color4(invert_bytes=True))
ScalesS: TE_FIELD_TYPE[float] = _te_dataclass_field(se.F32)
ScalesT: TE_FIELD_TYPE[float] = _te_dataclass_field(se.F32)
OffsetsS: TE_FIELD_TYPE[int] = _te_dataclass_field(se.S16)
OffsetsT: TE_FIELD_TYPE[int] = _te_dataclass_field(se.S16)
Rotation: TE_FIELD_TYPE[int] = _te_dataclass_field(se.S16)
BasicMaterials: TE_FIELD_TYPE["BasicMaterials"] = _te_dataclass_field(BUMP_SHINY_FULLBRIGHT)
MediaFlags: TE_FIELD_TYPE["MediaFlags"] = _te_dataclass_field(MEDIA_FLAGS)
Glow: TE_FIELD_TYPE[int] = _te_dataclass_field(se.U8)
Materials: TE_FIELD_TYPE[UUID] = _te_dataclass_field(se.UUID, optional=True)
Color: Dict[_TE_FIELD_KEY, bytes] = _te_field(Color4(invert_bytes=True), default=b"\xff\xff\xff\xff")
ScalesS: Dict[_TE_FIELD_KEY, float] = _te_field(se.F32, default=1.0)
ScalesT: Dict[_TE_FIELD_KEY, float] = _te_field(se.F32, default=1.0)
OffsetsS: Dict[_TE_FIELD_KEY, float] = _te_field(TE_S16_COORD, default=0.0)
OffsetsT: Dict[_TE_FIELD_KEY, float] = _te_field(TE_S16_COORD, default=0.0)
Rotation: Dict[_TE_FIELD_KEY, float] = _te_field(PackedTERotation(), default=0.0)
BasicMaterials: Dict[_TE_FIELD_KEY, "BasicMaterials"] = _te_field(
BUMP_SHINY_FULLBRIGHT, default_factory=BasicMaterials,
)
MediaFlags: Dict[_TE_FIELD_KEY, "MediaFlags"] = _te_field(MEDIA_FLAGS, default_factory=MediaFlags)
Glow: Dict[_TE_FIELD_KEY, float] = _te_field(se.QuantizedFloat(se.U8, 0.0, 1.0), default=0.0)
Materials: Dict[_TE_FIELD_KEY, UUID] = _te_field(se.UUID, optional=True, default=UUID.ZERO)
def unwrap(self):
"""Return `self` regardless of whether this is lazy wrapped object or not"""
return self
def realize(self, num_faces: int = MAX_TES) -> List[TextureEntry]:
"""
Turn the "default" vs "exception cases" wire format TE representation to per-face lookups
Makes it easier to get all TE details associated with a specific face
"""
as_dicts = [dict() for _ in range(num_faces)]
for field in dataclasses.fields(self):
key = field.name
vals = getattr(self, key)
# First, give all faces the default value for this key
for te in as_dicts:
te[key] = vals[None]
# Walk over the exception cases and replace the default value
for face_nums, val in vals.items():
# Default case already handled
if face_nums is None:
continue
for face_num in face_nums:
if face_num >= num_faces:
raise ValueError(f"Bad value for num_faces? {face_num} >= {num_faces}")
as_dicts[face_num][key] = val
return [TextureEntry(**x) for x in as_dicts]
@classmethod
def from_tes(cls, tes: List[TextureEntry]) -> "TextureEntryCollection":
instance = cls()
if not tes:
return instance
for field in dataclasses.fields(cls):
te_vals: Dict[Any, List[int]] = collections.defaultdict(list)
for i, te in enumerate(tes):
# Group values by what face they occur on
te_vals[getattr(te, field.name)].append(i)
# Make most common value the "default", everything else is an exception
sorted_vals = sorted(te_vals.items(), key=lambda x: len(x[1]), reverse=True)
default_val = sorted_vals.pop(0)[0]
te_vals = {None: default_val}
for val, face_nums in sorted_vals:
te_vals[tuple(face_nums)] = val
setattr(instance, field.name, te_vals)
return instance
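The two methods are intended to round-trip: `realize()` expands the wire format's default/exception structure into one `TextureEntry` per face, and `from_tes()` re-packs a face list, electing the most common value as the new default. A sketch for an 8-face prim:

    coll = TextureEntryCollection()
    faces = coll.realize(num_faces=8)  # eight TextureEntry objects, all defaults
    faces[3].Glow = 0.5                # tweak a single face

    repacked = TextureEntryCollection.from_tes(faces)
    assert repacked.Glow[None] == 0.0  # majority value becomes the default
    assert repacked.Glow[(3,)] == 0.5  # face 3 is now an exception case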
TE_SERIALIZER = se.Dataclass(TextureEntry)
TE_SERIALIZER = se.Dataclass(TextureEntryCollection)
@se.subfield_serializer("ObjectUpdate", "ObjectData", "TextureEntry")
@@ -1326,7 +1581,7 @@ class ObjectUpdateCompressedDataSerializer(se.SimpleSubfieldSerializer):
# point if an object has its parent set to an avatar.
"State": ObjectStateAdapter(se.U8),
"CRC": se.U32,
"Material": se.U8,
"Material": se.IntEnum(MCode, se.U8),
"ClickAction": se.U8,
"Scale": se.Vector3,
"Position": se.Vector3,
@@ -1523,6 +1778,7 @@ class DeRezObjectDestination(IntEnum):
@se.flag_field_serializer("SimStats", "RegionInfo", "RegionFlagsExtended")
@se.flag_field_serializer("RegionInfo", "RegionInfo", "RegionFlags")
@se.flag_field_serializer("RegionInfo", "RegionInfo3", "RegionFlagsExtended")
@se.flag_field_serializer("MapBlockReply", "Data", "RegionFlags")
class RegionFlags(IntFlag):
ALLOW_DAMAGE = 1 << 0
ALLOW_LANDMARK = 1 << 1
@@ -1568,6 +1824,7 @@ class RegionHandshakeReplyFlags(IntFlag):
@se.flag_field_serializer("TeleportStart", "Info", "TeleportFlags")
@se.flag_field_serializer("TeleportProgress", "Info", "TeleportFlags")
@se.flag_field_serializer("TeleportFinish", "Info", "TeleportFlags")
@se.flag_field_serializer("TeleportLocal", "Info", "TeleportFlags")
@se.flag_field_serializer("TeleportLureRequest", "Info", "TeleportFlags")
class TeleportFlags(IntFlag):
SET_HOME_TO_TARGET = 1 << 0 # newbie leaving prelude (starter area)
@@ -1586,6 +1843,133 @@ class TeleportFlags(IntFlag):
IS_FLYING = 1 << 13
SHOW_RESET_HOME = 1 << 14
FORCE_REDIRECT = 1 << 15
VIA_GLOBAL_COORDS = 1 << 16
WITHIN_REGION = 1 << 17
@se.flag_field_serializer("AvatarPropertiesReply", "PropertiesData", "Flags")
class AvatarPropertiesFlags(IntFlag):
ALLOW_PUBLISH = 1 << 0 # whether profile is externally visible or not
MATURE_PUBLISH = 1 << 1 # profile is "mature"
IDENTIFIED = 1 << 2 # whether avatar has provided payment info
TRANSACTED = 1 << 3 # whether avatar has actively used payment info
ONLINE = 1 << 4 # the online status of this avatar, if known.
AGEVERIFIED = 1 << 5 # whether avatar has been age-verified
@se.flag_field_serializer("AvatarGroupsReply", "GroupData", "GroupPowers")
@se.flag_field_serializer("AvatarGroupDataUpdate", "GroupData", "GroupPowers")
@se.flag_field_serializer("AvatarDataUpdate", "AgentDataData", "GroupPowers")
class GroupPowerFlags(IntFlag):
MEMBER_INVITE = 1 << 1 # Invite member
MEMBER_EJECT = 1 << 2 # Eject member from group
MEMBER_OPTIONS = 1 << 3 # Toggle "Open enrollment" and change "Signup Fee"
MEMBER_VISIBLE_IN_DIR = 1 << 47
# Roles
ROLE_CREATE = 1 << 4 # Create new roles
ROLE_DELETE = 1 << 5 # Delete roles
ROLE_PROPERTIES = 1 << 6 # Change Role Names, Titles, and Descriptions (Of roles the user is in, only, or any role in group?)
ROLE_ASSIGN_MEMBER_LIMITED = 1 << 7 # Assign Member to a Role that the assigner is in
ROLE_ASSIGN_MEMBER = 1 << 8 # Assign Member to Role
ROLE_REMOVE_MEMBER = 1 << 9 # Remove Member from Role
ROLE_CHANGE_ACTIONS = 1 << 10 # Change actions a role can perform
# Group Identity
GROUP_CHANGE_IDENTITY = 1 << 11 # Charter, insignia, 'Show In Group List', 'Publish on the web', 'Mature', all 'Show Member In Group Profile' checkboxes
# Parcel Management
LAND_DEED = 1 << 12 # Deed Land and Buy Land for Group
LAND_RELEASE = 1 << 13 # Release Land (to Gov. Linden)
LAND_SET_SALE_INFO = 1 << 14 # Set for sale info (Toggle "For Sale", Set Price, Set Target, Toggle "Sell objects with the land")
LAND_DIVIDE_JOIN = 1 << 15 # Divide and Join Parcels
# Parcel Identity
LAND_FIND_PLACES = 1 << 17 # Toggle "Show in Find Places" and Set Category.
LAND_CHANGE_IDENTITY = 1 << 18 # Change Parcel Identity: Parcel Name, Parcel Description, Snapshot, 'Publish on the web', and 'Mature' checkbox
LAND_SET_LANDING_POINT = 1 << 19 # Set Landing Point
# Parcel Settings
LAND_CHANGE_MEDIA = 1 << 20 # Change Media Settings
LAND_EDIT = 1 << 21 # Toggle Edit Land
LAND_OPTIONS = 1 << 22 # Toggle Set Home Point, Fly, Outside Scripts, Create/Edit Objects, Landmark, and Damage checkboxes
# Parcel Powers
LAND_ALLOW_EDIT_LAND = 1 << 23 # Bypass Edit Land Restriction
LAND_ALLOW_FLY = 1 << 24 # Bypass Fly Restriction
LAND_ALLOW_CREATE = 1 << 25 # Bypass Create/Edit Objects Restriction
LAND_ALLOW_LANDMARK = 1 << 26 # Bypass Landmark Restriction
LAND_ALLOW_SET_HOME = 1 << 28 # Bypass Set Home Point Restriction
LAND_ALLOW_HOLD_EVENT = 1 << 41 # Allowed to hold events on group-owned land
LAND_ALLOW_ENVIRONMENT = 1 << 46 # Allowed to change the environment
# Parcel Access
LAND_MANAGE_ALLOWED = 1 << 29 # Manage Allowed List
LAND_MANAGE_BANNED = 1 << 30 # Manage Banned List
LAND_MANAGE_PASSES = 1 << 31 # Change Sell Pass Settings
LAND_ADMIN = 1 << 32 # Eject and Freeze Users on the land
# Parcel Content
LAND_RETURN_GROUP_SET = 1 << 33 # Return objects on parcel that are set to group
LAND_RETURN_NON_GROUP = 1 << 34 # Return objects on parcel that are not set to group
LAND_RETURN_GROUP_OWNED = 1 << 48 # Return objects on parcel that are owned by the group
LAND_GARDENING = 1 << 35 # Parcel Gardening - plant and move linden trees
# Object Management
OBJECT_DEED = 1 << 36 # Deed Object
OBJECT_MANIPULATE = 1 << 38 # Manipulate Group Owned Objects (Move, Copy, Mod)
OBJECT_SET_SALE = 1 << 39 # Set Group Owned Object for Sale
# Accounting
ACCOUNTING_ACCOUNTABLE = 1 << 40 # Pay Group Liabilities and Receive Group Dividends
# Notices
NOTICES_SEND = 1 << 42 # Send Notices
NOTICES_RECEIVE = 1 << 43 # Receive Notices and View Notice History
# Proposals
# TODO: _DEPRECATED suffix as part of vote removal - DEV-24856:
PROPOSAL_START = 1 << 44 # Start Proposal
# TODO: _DEPRECATED suffix as part of vote removal - DEV-24856:
PROPOSAL_VOTE = 1 << 45 # Vote on Proposal
# Group chat moderation related
SESSION_JOIN = 1 << 16 # can join session
SESSION_VOICE = 1 << 27 # can hear/talk
SESSION_MODERATOR = 1 << 37 # can mute people's session
EXPERIENCE_ADMIN = 1 << 49 # has admin rights to any experiences owned by this group
EXPERIENCE_CREATOR = 1 << 50 # can sign scripts for experiences owned by this group
# Group Banning
GROUP_BAN_ACCESS = 1 << 51 # Allows access to ban / un-ban agents from a group.
@se.flag_field_serializer("RequestObjectPropertiesFamily", "ObjectData", "RequestFlags")
@se.flag_field_serializer("ObjectPropertiesFamily", "ObjectData", "RequestFlags")
class ObjectPropertiesFamilyRequestFlags(IntFlag):
BUG_REPORT = 1 << 0
COMPLAINT_REPORT = 1 << 1
OBJECT_PAY = 1 << 2
@se.enum_field_serializer("RequestImage", "RequestImage", "Type")
class RequestImageType(IntEnum):
NORMAL = 0
AVATAR_BAKE = 1
@se.enum_field_serializer("ImageData", "ImageID", "Codec")
class ImageCodec(IntEnum):
INVALID = 0
RGB = 1
J2C = 2
BMP = 3
TGA = 4
JPEG = 5
DXT = 6
PNG = 7
@se.http_serializer("RenderMaterials")

View File

@@ -94,7 +94,7 @@ class TransferManager:
if params_dict.get("SessionID", dataclasses.MISSING) is None:
params.SessionID = self._session_id
self._connection_holder.circuit.send_message(Message(
self._connection_holder.circuit.send(Message(
'TransferRequest',
Block(
'TransferInfo',

View File

@@ -1,5 +1,5 @@
from PySide2.QtCore import QMetaObject
from PySide2.QtUiTools import QUiLoader
from PySide6.QtCore import QMetaObject
from PySide6.QtUiTools import QUiLoader
class UiLoader(QUiLoader):

View File

@@ -13,7 +13,7 @@ from xml.etree.ElementTree import parse as parse_etree
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.helpers import get_resource_filename
from hippolyzer.lib.base.legacy_inv import InventorySaleInfo, InventoryPermissions
from hippolyzer.lib.base.inventory import InventorySaleInfo, InventoryPermissions
from hippolyzer.lib.base.legacy_schema import SchemaBase, parse_schema_line, SchemaParsingError
from hippolyzer.lib.base.templates import WearableType

View File

@@ -110,7 +110,7 @@ class XferManager:
direction: Direction = Direction.OUT,
) -> Xfer:
xfer_id = xfer_id if xfer_id is not None else random.getrandbits(64)
self._connection_holder.circuit.send_message(Message(
self._connection_holder.circuit.send(Message(
'RequestXfer',
Block(
'XferID',
@@ -174,7 +174,7 @@ class XferManager:
to_ack = range(xfer.next_ackable, ack_max)
xfer.next_ackable = ack_max
for ack_id in to_ack:
self._connection_holder.circuit.send_message(Message(
self._connection_holder.circuit.send(Message(
"ConfirmXferPacket",
Block("XferID", ID=xfer.xfer_id, Packet=ack_id),
direction=xfer.direction,
@@ -216,7 +216,7 @@ class XferManager:
else:
inline_data = data
self._connection_holder.circuit.send_message(Message(
self._connection_holder.circuit.send(Message(
"AssetUploadRequest",
Block(
"AssetBlock",
@@ -246,7 +246,7 @@ class XferManager:
def complete_predicate(complete_msg: Message):
return complete_msg["AssetBlock"]["UUID"] == asset_id
msg = await message_handler.wait_for('AssetUploadComplete', predicate=complete_predicate)
msg = await message_handler.wait_for(('AssetUploadComplete',), predicate=complete_predicate)
if msg["AssetBlock"]["Success"] == 1:
fut.set_result(asset_id)
else:
@@ -263,7 +263,7 @@ class XferManager:
):
message_handler = self._connection_holder.message_handler
request_msg = await message_handler.wait_for(
'RequestXfer', predicate=request_predicate, timeout=5.0)
('RequestXfer',), predicate=request_predicate, timeout=5.0)
xfer.xfer_id = request_msg["XferID"]["ID"]
packet_id = 0
@@ -272,7 +272,7 @@ class XferManager:
chunk = xfer.chunks.pop(packet_id)
# EOF if there are no chunks left
packet_val = XferPacket(PacketID=packet_id, IsEOF=not bool(xfer.chunks))
self._connection_holder.circuit.send_message(Message(
self._connection_holder.circuit.send(Message(
"SendXferPacket",
Block("XferID", ID=xfer.xfer_id, Packet_=packet_val),
Block("DataPacket", Data=chunk),
@@ -282,5 +282,5 @@ class XferManager:
# Don't care about the value, just want to know it was confirmed.
if wait_for_confirm:
await message_handler.wait_for(
"ConfirmXferPacket", predicate=xfer.is_our_message, timeout=5.0)
("ConfirmXferPacket",), predicate=xfer.is_our_message, timeout=5.0)
packet_id += 1

View File

@@ -39,7 +39,7 @@ class NameCache:
def create_subscriptions(
self,
message_handler: MessageHandler[Message],
message_handler: MessageHandler[Message, str],
):
message_handler.subscribe("UUIDNameReply", self._handle_uuid_name_reply)

View File

@@ -16,6 +16,7 @@ from typing import *
from hippolyzer.lib.base.datatypes import UUID, Vector3
from hippolyzer.lib.base.helpers import proxify
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.message.message_handler import MessageHandler
from hippolyzer.lib.base.objects import (
normalize_object_update,
normalize_terse_object_update,
@@ -23,6 +24,7 @@ from hippolyzer.lib.base.objects import (
normalize_object_update_compressed,
Object, handle_to_global_pos,
)
from hippolyzer.lib.base.settings import Settings
from hippolyzer.lib.client.namecache import NameCache, NameCacheEntry
from hippolyzer.lib.client.state import BaseClientSession, BaseClientRegion
from hippolyzer.lib.base.templates import PCode, ObjectStateSerializer
@@ -32,11 +34,12 @@ LOG = logging.getLogger(__name__)
OBJECT_OR_LOCAL = Union[Object, int]
class UpdateType(enum.IntEnum):
class ObjectUpdateType(enum.IntEnum):
OBJECT_UPDATE = enum.auto()
PROPERTIES = enum.auto()
FAMILY = enum.auto()
COSTS = enum.auto()
KILL = enum.auto()
class ClientObjectManager:
@@ -63,11 +66,11 @@ class ClientObjectManager:
return self.state.missing_locals
def clear(self):
self.state.clear()
if self._region.handle is not None:
# We're tracked by the world object manager, tell it to untrack
# any objects that we owned
self._world_objects.clear_region_objects(self._region.handle)
self.state.clear()
def lookup_localid(self, localid: int) -> Optional[Object]:
return self.state.lookup_localid(localid)
@@ -110,18 +113,18 @@ class ClientObjectManager:
while ids_to_req:
blocks = [
Block("AgentData", AgentID=session.agent_id, SessionID=session.id),
*[Block("ObjectData", ObjectLocalID=x) for x in ids_to_req[:100]],
*[Block("ObjectData", ObjectLocalID=x) for x in ids_to_req[:255]],
]
# Selecting causes ObjectProperties to be sent
self._region.circuit.send_message(Message("ObjectSelect", blocks))
self._region.circuit.send_message(Message("ObjectDeselect", blocks))
ids_to_req = ids_to_req[100:]
self._region.circuit.send(Message("ObjectSelect", blocks))
self._region.circuit.send(Message("ObjectDeselect", blocks))
ids_to_req = ids_to_req[255:]
futures = []
for local_id in local_ids:
if local_id in unselected_ids:
# Need to wait until we get our reply
fut = self.state.register_future(local_id, UpdateType.PROPERTIES)
fut = self.state.register_future(local_id, ObjectUpdateType.PROPERTIES)
else:
# This was selected so we should already have up to date info
fut = asyncio.Future()
@@ -147,28 +150,47 @@ class ClientObjectManager:
ids_to_req = local_ids
while ids_to_req:
self._region.circuit.send_message(Message(
self._region.circuit.send(Message(
"RequestMultipleObjects",
Block("AgentData", AgentID=session.agent_id, SessionID=session.id),
*[Block("ObjectData", CacheMissType=0, ID=x) for x in ids_to_req[:100]],
*[Block("ObjectData", CacheMissType=0, ID=x) for x in ids_to_req[:255]],
))
ids_to_req = ids_to_req[100:]
ids_to_req = ids_to_req[255:]
futures = []
for local_id in local_ids:
futures.append(self.state.register_future(local_id, UpdateType.OBJECT_UPDATE))
futures.append(self.state.register_future(local_id, ObjectUpdateType.OBJECT_UPDATE))
return futures
class ObjectEvent:
__slots__ = ("object", "updated", "update_type")
object: Object
updated: Set[str]
update_type: ObjectUpdateType
def __init__(self, obj: Object, updated: Set[str], update_type: ObjectUpdateType):
self.object = obj
self.updated = updated
self.update_type = update_type
@property
def name(self) -> ObjectUpdateType:
return self.update_type
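The new `events` handler reuses `MessageHandler`, keyed by `ObjectUpdateType` instead of message name; a subscription sketch, assuming a constructed manager instance named `world_objects` (the class follows below):

    def on_properties(evt: ObjectEvent):
        # evt.updated holds the names of the properties that changed
        print(evt.object.FullID, sorted(evt.updated))

    world_objects.events.subscribe(ObjectUpdateType.PROPERTIES, on_properties)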
class ClientWorldObjectManager:
"""Manages Objects for a session's whole world"""
def __init__(self, session: BaseClientSession, name_cache: Optional[NameCache]):
def __init__(self, session: BaseClientSession, settings: Settings, name_cache: Optional[NameCache]):
self._session: BaseClientSession = session
self._settings = settings
self.name_cache = name_cache or NameCache()
self.events: MessageHandler[ObjectEvent, ObjectUpdateType] = MessageHandler(take_by_default=False)
self._fullid_lookup: Dict[UUID, Object] = {}
self._avatars: Dict[UUID, Avatar] = {}
self._avatar_objects: Dict[UUID, Object] = {}
self._region_managers: Dict[int, ClientObjectManager] = {}
self.name_cache = name_cache or NameCache()
message_handler = self._session.message_handler
message_handler.subscribe("ObjectUpdate", self._handle_object_update)
message_handler.subscribe("ImprovedTerseObjectUpdate",
@@ -215,13 +237,12 @@ class ClientWorldObjectManager:
self._region_managers[handle] = proxify(self._session.region_by_handle(handle).objects)
def clear_region_objects(self, handle: int):
"""Signal that a region object manager is being cleared"""
region_mgr = self._region_managers.get(handle)
if region_mgr is None:
return
# Make sure they're gone from our lookup table first
for obj in region_mgr.all_objects:
del self._fullid_lookup[obj.FullID]
"""Handle signal that a region object manager was just cleared"""
# Make sure they're gone from our lookup table
for obj in tuple(self._fullid_lookup.values()):
if obj.RegionHandle == handle:
del self._fullid_lookup[obj.FullID]
self._rebuild_avatar_objects()
def _get_region_manager(self, handle: int) -> Optional[ClientObjectManager]:
return self._region_managers.get(handle)
@@ -274,7 +295,7 @@ class ClientWorldObjectManager:
self._rebuild_avatar_objects()
self._region_managers.clear()
def _update_existing_object(self, obj: Object, new_properties: dict, update_type: UpdateType):
def _update_existing_object(self, obj: Object, new_properties: dict, update_type: ObjectUpdateType):
old_parent_id = obj.ParentID
new_parent_id = new_properties.get("ParentID", obj.ParentID)
old_local_id = obj.LocalID
@@ -333,7 +354,7 @@ class ClientWorldObjectManager:
if obj.PCode == PCode.AVATAR:
self._avatar_objects[obj.FullID] = obj
self._rebuild_avatar_objects()
self._run_object_update_hooks(obj, set(obj.to_dict().keys()), UpdateType.OBJECT_UPDATE)
self._run_object_update_hooks(obj, set(obj.to_dict().keys()), ObjectUpdateType.OBJECT_UPDATE)
def _kill_object_by_local_id(self, region_state: RegionObjectsState, local_id: int):
obj = region_state.lookup_localid(local_id)
@@ -385,7 +406,7 @@ class ClientWorldObjectManager:
# our view of the world then we want to move it to this region.
obj = self.lookup_fullid(object_data["FullID"])
if obj:
self._update_existing_object(obj, object_data, UpdateType.OBJECT_UPDATE)
self._update_existing_object(obj, object_data, ObjectUpdateType.OBJECT_UPDATE)
else:
if region_state is None:
continue
@@ -409,7 +430,7 @@ class ClientWorldObjectManager:
# Need the Object as context because decoding state requires PCode.
state_deserializer = ObjectStateSerializer.deserialize
object_data["State"] = state_deserializer(ctx_obj=obj, val=object_data["State"])
self._update_existing_object(obj, object_data, UpdateType.OBJECT_UPDATE)
self._update_existing_object(obj, object_data, ObjectUpdateType.OBJECT_UPDATE)
else:
if region_state:
region_state.missing_locals.add(object_data["LocalID"])
@@ -437,7 +458,7 @@ class ClientWorldObjectManager:
self._update_existing_object(obj, {
"UpdateFlags": update_flags,
"RegionHandle": handle,
}, UpdateType.OBJECT_UPDATE)
}, ObjectUpdateType.OBJECT_UPDATE)
continue
cached_obj_data = self._lookup_cache_entry(handle, block["ID"], block["CRC"])
@@ -452,15 +473,17 @@ class ClientWorldObjectManager:
missing_locals.add(block["ID"])
if region_state:
region_state.missing_locals.update(missing_locals)
self._handle_object_update_cached_misses(handle, missing_locals)
if missing_locals:
self._handle_object_update_cached_misses(handle, missing_locals)
msg.meta["ObjectUpdateIDs"] = tuple(seen_locals)
def _handle_object_update_cached_misses(self, region_handle: int, local_ids: Set[int]):
def _handle_object_update_cached_misses(self, region_handle: int, missing_locals: Set[int]):
"""Handle an ObjectUpdateCached that referenced some un-cached local IDs"""
region_mgr = self._get_region_manager(region_handle)
region_mgr.request_objects(local_ids)
region_mgr.request_objects(missing_locals)
# noinspection PyUnusedLocal
def _lookup_cache_entry(self, handle: int, local_id: int, crc: int) -> Optional[bytes]:
def _lookup_cache_entry(self, region_handle: int, local_id: int, crc: int) -> Optional[bytes]:
return None
def _handle_object_update_compressed(self, msg: Message):
@@ -474,7 +497,7 @@ class ClientWorldObjectManager:
LOG.warning(f"Got ObjectUpdateCompressed for unknown region {handle}: {object_data!r}")
obj = self.lookup_fullid(object_data["FullID"])
if obj:
self._update_existing_object(obj, object_data, UpdateType.OBJECT_UPDATE)
self._update_existing_object(obj, object_data, ObjectUpdateType.OBJECT_UPDATE)
else:
if region_state is None:
continue
@@ -491,7 +514,7 @@ class ClientWorldObjectManager:
obj = self.lookup_fullid(block["ObjectID"])
if obj:
seen_locals.append(obj.LocalID)
self._update_existing_object(obj, object_properties, UpdateType.PROPERTIES)
self._update_existing_object(obj, object_properties, ObjectUpdateType.PROPERTIES)
else:
LOG.debug(f"Received {packet.name} for unknown {block['ObjectID']}")
packet.meta["ObjectUpdateIDs"] = tuple(seen_locals)
@@ -538,17 +561,18 @@ class ClientWorldObjectManager:
LOG.debug(f"Received ObjectCost for unknown {object_id}")
continue
obj.ObjectCosts.update(object_costs)
self._run_object_update_hooks(obj, {"ObjectCosts"}, UpdateType.COSTS)
self._run_object_update_hooks(obj, {"ObjectCosts"}, ObjectUpdateType.COSTS)
def _run_object_update_hooks(self, obj: Object, updated_props: Set[str], update_type: UpdateType):
def _run_object_update_hooks(self, obj: Object, updated_props: Set[str], update_type: ObjectUpdateType):
region_state = self._get_region_state(obj.RegionHandle)
region_state.resolve_futures(obj, update_type)
if obj.PCode == PCode.AVATAR and "NameValue" in updated_props:
if obj.NameValue:
self.name_cache.update(obj.FullID, obj.NameValue.to_dict())
self.events.handle(ObjectEvent(obj, updated_props, update_type))
def _run_kill_object_hooks(self, obj: Object):
pass
self.events.handle(ObjectEvent(obj, set(), ObjectUpdateType.KILL))
def _rebuild_avatar_objects(self):
# Get all avatars known through coarse locations and which region the location was in
@@ -574,6 +598,9 @@ class ClientWorldObjectManager:
coarse_handle, coarse_location = coarse_pair
av.CoarseLocation = coarse_location
av.RegionHandle = coarse_handle
# If we have a real value for Z then throw away any stale guesses
if av.CoarseLocation.Z != math.inf:
av.GuessedZ = None
if av_obj:
av.Object = av_obj
av.RegionHandle = av_obj.RegionHandle
@@ -752,7 +779,7 @@ class RegionObjectsState:
del self._orphans[parent_id]
return removed
def register_future(self, local_id: int, future_type: UpdateType) -> asyncio.Future[Object]:
def register_future(self, local_id: int, future_type: ObjectUpdateType) -> asyncio.Future[Object]:
fut = asyncio.Future()
fut_key = (local_id, future_type)
local_futs = self._object_futures.get(fut_key, [])
@@ -761,7 +788,7 @@ class RegionObjectsState:
fut.add_done_callback(local_futs.remove)
return fut
def resolve_futures(self, obj: Object, update_type: UpdateType):
def resolve_futures(self, obj: Object, update_type: ObjectUpdateType):
futures = self._object_futures.get((obj.LocalID, update_type), [])
for fut in futures[:]:
fut.set_result(obj)
@@ -799,6 +826,7 @@ class Avatar:
# to fill in the Z axis if it's infinite
self.CoarseLocation = coarse_location
self.Valid = True
self.GuessedZ: Optional[float] = None
self._resolved_name = resolved_name
@property
@@ -814,6 +842,9 @@ class Avatar:
if self.Object and self.Object.AncestorsKnown:
return self.Object.RegionPosition
if self.CoarseLocation is not None:
if self.CoarseLocation.Z == math.inf and self.GuessedZ is not None:
coarse = self.CoarseLocation
return Vector3(coarse.X, coarse.Y, self.GuessedZ)
return self.CoarseLocation
raise ValueError(f"Avatar {self.FullID} has no known position")
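For context on GuessedZ: CoarseLocationUpdate packs each avatar's Z into a single byte at 4m resolution, and the maximum value is a "this high or higher" sentinel that the manager models as math.inf. A hedged sketch of that dequantization (the exact sentinel and scale are assumptions here, as is Vector3's home in hippolyzer.lib.base.datatypes):
import math
from hippolyzer.lib.base.datatypes import Vector3

def dequantize_coarse(x: int, y: int, z: int) -> Vector3:
    # 255 means "1020m or above"; that inf is what GuessedZ papers over.
    real_z = math.inf if z == 255 else z * 4.0
    return Vector3(x, y, real_z)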
@@ -833,6 +864,18 @@ class Avatar:
return None
return self._resolved_name.preferred_name
@property
def DisplayName(self) -> Optional[str]:
if not self._resolved_name:
return None
return self._resolved_name.display_name
@property
def LegacyName(self) -> Optional[str]:
if not self._resolved_name:
return None
return self._resolved_name.legacy_name
def __repr__(self):
loc_str = str(self.RegionPosition) if self.LocationType != LocationType.NONE else "?"
return f"<{self.__class__.__name__} {self.FullID} {self.Name!r} @ {loc_str}>"


@@ -10,6 +10,7 @@ from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.circuit import ConnectionHolder
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.message.message_handler import MessageHandler
from hippolyzer.lib.base.network.caps_client import CapsClient
from hippolyzer.lib.base.network.transport import ADDR_TUPLE
if TYPE_CHECKING:
@@ -18,10 +19,11 @@ if TYPE_CHECKING:
class BaseClientRegion(ConnectionHolder, abc.ABC):
"""Represents a client's view of a remote region"""
handle: Optional[int]
# Actually a weakref
session: Callable[[], BaseClientSession]
objects: ClientObjectManager
caps_client: CapsClient
class BaseClientSession(abc.ABC):
@@ -29,7 +31,7 @@ class BaseClientSession(abc.ABC):
id: UUID
agent_id: UUID
secure_session_id: UUID
message_handler: MessageHandler[Message]
message_handler: MessageHandler[Message, str]
regions: Sequence[BaseClientRegion]
region_by_handle: Callable[[int], Optional[BaseClientRegion]]
region_by_circuit_addr: Callable[[ADDR_TUPLE], Optional[BaseClientRegion]]


@@ -73,17 +73,17 @@ def show_message(text, session=None) -> None:
direction=Direction.IN,
)
if session:
session.main_region.circuit.send_message(message)
session.main_region.circuit.send(message)
else:
for session in AddonManager.SESSION_MANAGER.sessions:
session.main_region.circuit.send_message(copy.copy(message))
session.main_region.circuit.send(copy.copy(message))
def send_chat(message: Union[bytes, str], channel=0, chat_type=ChatType.NORMAL, session=None):
session = session or addon_ctx.session.get(None) or None
if not session:
raise RuntimeError("Tried to send chat without session")
session.main_region.circuit.send_message(Message(
session.main_region.circuit.send(Message(
"ChatFromViewer",
Block(
"AgentData",
@@ -128,6 +128,17 @@ def ais_item_to_inventory_data(ais_item: dict):
)
def ais_folder_to_inventory_data(ais_folder: dict):
return Block(
"FolderData",
FolderID=ais_folder["cat_id"],
ParentID=ais_folder["parent_id"],
CallbackID=0,
Type=ais_folder["preferred_type"],
Name=ais_folder["name"],
)
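A hedged usage sketch: the key names mirror the ones read above, while the UUIDs and folder name are made up.
block = ais_folder_to_inventory_data({
    "cat_id": UUID("0c7f50b4-5f7d-4642-8e0f-2a0a01b9f7e4"),
    "parent_id": UUID("a56eb0a4-2c3f-40e8-b6d8-6c18c8e4a8d0"),
    "preferred_type": -1,  # no preferred folder type
    "name": "New Folder",
})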
class BaseAddon(abc.ABC):
def _schedule_task(self, coro: Coroutine, session=None,
region_scoped=False, session_scoped=True, addon_scoped=True):
@@ -181,13 +192,18 @@ class BaseAddon(abc.ABC):
def handle_region_changed(self, session: Session, region: ProxiedRegion):
pass
def handle_region_registered(self, session: Session, region: ProxiedRegion):
pass
def handle_circuit_created(self, session: Session, region: ProxiedRegion):
pass
def handle_rlv_command(self, session: Session, region: ProxiedRegion, source: UUID,
cmd: str, options: List[str], param: str):
pass
def handle_proxied_packet(self, session_manager: SessionManager, packet: UDPPacket,
session: Optional[Session], region: Optional[ProxiedRegion],
message: Optional[Message]):
session: Optional[Session], region: Optional[ProxiedRegion]):
pass


@@ -16,28 +16,22 @@ from types import ModuleType
from typing import *
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.helpers import get_mtime
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.network.transport import UDPPacket
from hippolyzer.lib.proxy import addon_ctx
from hippolyzer.lib.proxy.task_scheduler import TaskLifeScope, TaskScheduler
if TYPE_CHECKING:
from hippolyzer.lib.proxy.commands import CommandDetails, WrappedCommandCallable
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.proxy.objects import Object
from hippolyzer.lib.base.network.transport import UDPPacket
from hippolyzer.lib.proxy.object_manager import Object
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session, SessionManager
LOG = logging.getLogger(__name__)
def _get_mtime(path):
try:
return os.stat(path).st_mtime
except:
return None
class BaseInteractionManager:
@abc.abstractmethod
async def open_dir(self, caption: str = '', directory: str = '', filter_str: str = '') -> Optional[str]:
@@ -52,13 +46,27 @@ class BaseInteractionManager:
pass
@abc.abstractmethod
async def save_file(self, caption: str = '', directory: str = '', filter_str: str = '') -> Optional[str]:
async def save_file(self, caption: str = '', directory: str = '', filter_str: str = '',
default_suffix: str = '') -> Optional[str]:
pass
@abc.abstractmethod
async def confirm(self, title: str, caption: str) -> bool:
pass
def main_window_handle(self) -> Any:
return None
# Used to initialize a REPL environment with commonly desired helpers
REPL_INITIALIZER = r"""
from hippolyzer.lib.base.datatypes import *
from hippolyzer.lib.base.templates import *
from hippolyzer.lib.base.message.message import Block, Message, Direction
from hippolyzer.lib.proxy.addon_utils import send_chat, show_message
"""
class AddonManager:
COMMAND_CHANNEL = 524
@@ -97,9 +105,14 @@ class AddonManager:
@classmethod
def shutdown(cls):
to_pop = []
for mod in cls.FRESH_ADDON_MODULES.values():
to_pop.append(mod)
cls._call_module_hooks(mod, "handle_unload", cls.SESSION_MANAGER)
cls.SCHEDULER.shutdown()
for mod in to_pop:
if isinstance(mod, ModuleType):
sys.modules.pop(mod.__name__, None)
@classmethod
def have_active_repl(cls):
@@ -129,6 +142,16 @@ class AddonManager:
if _locals is None:
_locals = stack.frame.f_locals
init_globals = {}
exec(REPL_INITIALIZER, init_globals, None)
# We're modifying the globals of the caller, so be careful not to let things
# we imported for the REPL initializer clobber things that already exist in
# the caller's globals.
# Making our own mutable copy of the globals dict, mutating that and then passing it
# to embed() is not an option due to https://github.com/prompt-toolkit/ptpython/issues/279
for global_name, global_val in init_globals.items():
if global_name not in _globals:
_globals[global_name] = global_val
async def _wrapper():
coro: Coroutine = ptpython.repl.embed( # noqa: the type signature lies
globals=_globals,
@@ -169,6 +192,7 @@ class AddonManager:
old_mod = cls.FRESH_ADDON_MODULES.pop(specs[0].name, None)
if old_mod:
cls._unload_module(old_mod)
sys.modules.pop(old_mod.__name__, None)
if reload:
cls._reload_addons()
@@ -176,7 +200,7 @@ class AddonManager:
def _check_hotreloads(cls):
"""Mark addons that rely on changed files for reloading"""
for filename, importers in cls.HOTRELOAD_IMPORTERS.items():
mtime = _get_mtime(filename)
mtime = get_mtime(filename)
if not mtime or mtime == cls.FILE_MTIMES.get(filename, None):
continue
@@ -205,7 +229,7 @@ class AddonManager:
# Mark the caller as having imported (and being dependent on) `module`
stack = inspect.stack()[1]
cls.HOTRELOAD_IMPORTERS[imported_file].add(stack.filename)
cls.FILE_MTIMES[imported_file] = _get_mtime(imported_file)
cls.FILE_MTIMES[imported_file] = get_mtime(imported_file)
importing_spec = next((s for s in cls.BASE_ADDON_SPECS if s.origin == stack.filename), None)
imported_spec = next((s for s in cls.BASE_ADDON_SPECS if s.origin == imported_file), None)
@@ -253,7 +277,7 @@ class AddonManager:
for spec in cls.BASE_ADDON_SPECS[:]:
had_mod = spec.name in cls.FRESH_ADDON_MODULES
try:
mtime = _get_mtime(spec.origin)
mtime = get_mtime(spec.origin)
mtime_changed = mtime != cls.FILE_MTIMES.get(spec.origin, None)
if not mtime_changed and had_mod:
continue
@@ -277,8 +301,8 @@ class AddonManager:
# Make sure module initialization happens after any pending task cancellations
# due to module unloading.
asyncio.get_event_loop().call_soon(cls._init_module, mod)
loop = asyncio.get_event_loop_policy().get_event_loop()
loop.call_soon(cls._init_module, mod)
except Exception as e:
if had_mod:
logging.exception("Exploded trying to reload addon %s" % spec.name)
@@ -516,9 +540,19 @@ class AddonManager:
with addon_ctx.push(session, region):
return cls._call_all_addon_hooks("handle_region_changed", session, region)
@classmethod
def handle_region_registered(cls, session: Session, region: ProxiedRegion):
with addon_ctx.push(session, region):
return cls._call_all_addon_hooks("handle_region_registered", session, region)
@classmethod
def handle_circuit_created(cls, session: Session, region: ProxiedRegion):
with addon_ctx.push(session, region):
return cls._call_all_addon_hooks("handle_circuit_created", session, region)
@classmethod
def handle_proxied_packet(cls, session_manager: SessionManager, packet: UDPPacket,
session: Optional[Session], region: Optional[ProxiedRegion],
message: Optional[Message]):
return cls._call_all_addon_hooks("handle_proxied_packet", session_manager,
packet, session, region, message)
session: Optional[Session], region: Optional[ProxiedRegion]):
with addon_ctx.push(session, region):
return cls._call_all_addon_hooks("handle_proxied_packet", session_manager,
packet, session, region)


@@ -0,0 +1,93 @@
from __future__ import annotations
import enum
import typing
from weakref import ref
from typing import *
if TYPE_CHECKING:
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session, SessionManager
def is_asset_server_cap_name(cap_name):
return cap_name and (
cap_name.startswith("GetMesh")
or cap_name.startswith("GetTexture")
or cap_name.startswith("ViewerAsset")
)
class CapType(enum.Enum):
NORMAL = enum.auto()
TEMPORARY = enum.auto()
WRAPPER = enum.auto()
PROXY_ONLY = enum.auto()
@property
def fake(self) -> bool:
return self == CapType.PROXY_ONLY or self == CapType.WRAPPER
class SerializedCapData(typing.NamedTuple):
cap_name: typing.Optional[str] = None
region_addr: typing.Optional[str] = None
session_id: typing.Optional[str] = None
base_url: typing.Optional[str] = None
type: str = "NORMAL"
def __bool__(self):
return bool(self.cap_name or self.session_id)
@property
def asset_server_cap(self):
return is_asset_server_cap_name(self.cap_name)
class CapData(NamedTuple):
cap_name: Optional[str] = None
# Actually they're weakrefs but the type sigs suck.
region: Optional[Callable[[], Optional[ProxiedRegion]]] = None
session: Optional[Callable[[], Optional[Session]]] = None
base_url: Optional[str] = None
type: CapType = CapType.NORMAL
def __bool__(self):
return bool(self.cap_name or self.session)
def serialize(self) -> "SerializedCapData":
return SerializedCapData(
cap_name=self.cap_name,
region_addr=str(self.region().circuit_addr) if self.region and self.region() else None,
session_id=str(self.session().id) if self.session and self.session() else None,
base_url=self.base_url,
type=self.type.name,
)
@classmethod
def deserialize(
cls,
ser_cap_data: "SerializedCapData",
session_mgr: Optional[SessionManager],
) -> "CapData":
cap_session = None
cap_region = None
if session_mgr and ser_cap_data.session_id:
for session in session_mgr.sessions:
if ser_cap_data.session_id == str(session.id):
cap_session = session
if cap_session and ser_cap_data.region_addr:
for region in cap_session.regions:
if ser_cap_data.region_addr == str(region.circuit_addr):
cap_region = region
return cls(
cap_name=ser_cap_data.cap_name,
region=ref(cap_region) if cap_region else None,
session=ref(cap_session) if cap_session else None,
base_url=ser_cap_data.base_url,
type=CapType[ser_cap_data.type],
)
@property
def asset_server_cap(self) -> bool:
return is_asset_server_cap_name(self.cap_name)
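A minimal round-trip sketch: CapData holds weakrefs, so only the string-only SerializedCapData crosses the process boundary to mitmproxy and back. Here region, session and session_mgr stand in for live proxy objects:
cap = CapData("GetTexture", region=ref(region), session=ref(session))
ser = cap.serialize()                 # plain strings, safe to pickle
restored = CapData.deserialize(ser, session_mgr)
assert restored.asset_server_cap      # "GetTexture*" is an asset cap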


@@ -1,25 +1,26 @@
from __future__ import annotations
import os
import re
import sys
from typing import *
from hippolyzer.lib.base.network.caps_client import CapsClient, CAPS_DICT
from hippolyzer.lib.proxy.settings import ProxySettings
if TYPE_CHECKING:
from hippolyzer.lib.proxy.region import ProxiedRegion
class ProxyCapsClient(CapsClient):
def __init__(self, region: Optional[ProxiedRegion] = None):
def __init__(self, settings: ProxySettings, region: Optional[ProxiedRegion] = None):
super().__init__(None)
self._region = region
self._settings = settings
def _get_caps(self) -> Optional[CAPS_DICT]:
if not self._region:
return None
return self._region.caps
return self._region.cap_urls
def _request_fixups(self, cap_or_url: str, headers: Dict, proxy: Optional[bool], ssl: Any):
# We want to proxy this through Hippolyzer
@@ -27,9 +28,9 @@ class ProxyCapsClient(CapsClient):
# We go through the proxy by default, tack on a header letting mitmproxy know the
# request came from us so we can tag the request as injected. The header will be popped
# off before passing through to the server.
headers["X-Hippo-Injected"] = "1"
# TODO: Have a setting for this
proxy_port = int(os.environ.get("HIPPO_HTTP_PORT", 9062))
if "X-Hippo-Injected" not in headers:
headers["X-Hippo-Injected"] = "1"
proxy_port = self._settings.HTTP_PROXY_PORT
proxy = f"http://127.0.0.1:{proxy_port}"
# TODO: set up the SSLContext to validate mitmproxy's cert
ssl = ssl or False


@@ -25,7 +25,7 @@ class ProxiedCircuit(Circuit):
except:
logging.exception(f"Failed to serialize: {message.to_dict()!r}")
raise
if self.logging_hook and message.injected:
if self.logging_hook and message.synthetic:
self.logging_hook(message)
return self.send_datagram(serialized, message.direction, transport=transport)
@@ -34,44 +34,46 @@ class ProxiedCircuit(Circuit):
return self.out_injections, self.in_injections
return self.in_injections, self.out_injections
def prepare_message(self, message: Message, direction=None):
def prepare_message(self, message: Message):
if message.finalized:
raise RuntimeError(f"Trying to re-send finalized {message!r}")
direction = direction or getattr(message, 'direction')
fwd_injections, reverse_injections = self._get_injections(direction)
if message.queued:
# This is due to be dropped, nothing should be sending the original
raise RuntimeError(f"Trying to send original of queued {message!r}")
fwd_injections, reverse_injections = self._get_injections(message.direction)
message.finalized = True
# Injected, let's gen an ID
if message.packet_id is None:
message.packet_id = fwd_injections.gen_injectable_id()
message.injected = True
else:
message.synthetic = True
# This message wasn't injected by the proxy so we need to rewrite packet IDs
# to account for IDs the real creator of the packet couldn't have known about.
elif not message.synthetic:
# was_dropped needs the unmodified packet ID
if fwd_injections.was_dropped(message.packet_id) and message.name != "PacketAck":
logging.warning("Attempting to re-send previously dropped %s:%s, did we ack?" %
(message.packet_id, message.name))
message.packet_id = fwd_injections.get_effective_id(message.packet_id)
fwd_injections.track_seen(message.packet_id)
message.finalized = True
if not message.injected:
# This message wasn't injected by the proxy so we need to rewrite packet IDs
# to account for IDs the other parties couldn't have known about.
message.acks = tuple(
reverse_injections.get_original_id(x) for x in message.acks
if not reverse_injections.was_injected(x)
)
if message.name == "PacketAck":
if not self._rewrite_packet_ack(message, reverse_injections):
logging.debug(f"Dropping {direction} ack for injected packets!")
if not self._rewrite_packet_ack(message, reverse_injections) and not message.acks:
logging.debug(f"Dropping {message.direction} ack for injected packets!")
# Let caller know this shouldn't be sent at all, it's strictly ACKs for
# injected packets.
return False
elif message.name == "StartPingCheck":
self._rewrite_start_ping_check(message, fwd_injections)
if not message.acks:
if message.acks:
message.send_flags |= PacketFlags.ACK
else:
message.send_flags &= ~PacketFlags.ACK
return True
@@ -97,15 +99,18 @@ class ProxiedCircuit(Circuit):
new_id = fwd_injections.get_effective_id(orig_id)
if orig_id != new_id:
logging.debug("Rewrote oldest unacked %s -> %s" % (orig_id, new_id))
# Get a list of unacked IDs for the direction this StartPingCheck is heading
fwd_unacked = (a for (d, a) in self.unacked_reliable.keys() if d == message.direction)
# Use the proxy's oldest unacked ID if it's older than the client's
new_id = min((new_id, *fwd_unacked))
message["PingID"]["OldestUnacked"] = new_id
def drop_message(self, message: Message, orig_direction=None):
def drop_message(self, message: Message):
if message.finalized:
raise RuntimeError(f"Trying to drop finalized {message!r}")
if message.packet_id is None:
return
orig_direction = orig_direction or message.direction
fwd_injections, reverse_injections = self._get_injections(orig_direction)
fwd_injections, reverse_injections = self._get_injections(message.direction)
fwd_injections.mark_dropped(message.packet_id)
message.dropped = True
@@ -113,7 +118,7 @@ class ProxiedCircuit(Circuit):
# Was sent reliably, tell the other end that we saw it and to shut up.
if message.reliable:
self.send_acks([message.packet_id], ~orig_direction)
self.send_acks([message.packet_id], ~message.direction)
# This packet had acks for the other end, send them in a separate PacketAck
effective_acks = tuple(
@@ -121,7 +126,7 @@ class ProxiedCircuit(Circuit):
if not reverse_injections.was_injected(x)
)
if effective_acks:
self.send_acks(effective_acks, orig_direction, packet_id=message.packet_id)
self.send_acks(effective_acks, message.direction, packet_id=message.packet_id)
class InjectionTracker:


@@ -26,6 +26,10 @@ class CommandDetails(NamedTuple):
lifetime: Optional[TaskLifeScope] = None
def parse_bool(val: str) -> bool:
return val.lower() in ('on', 'true', '1', '1.0', 'yes')
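For reference, a couple of illustrative checks against the accepted spellings:
assert parse_bool("On") and parse_bool("YES") and parse_bool("1.0")
assert not parse_bool("off") and not parse_bool("maybe")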
def handle_command(command_name: Optional[str] = None, /, *, lifetime: Optional[TaskLifeScope] = None,
single_instance: bool = False, **params: Union[Parameter, callable]):
"""
@@ -61,13 +65,13 @@ def handle_command(command_name: Optional[str] = None, /, *, lifetime: Optional[
# Greedy, takes the rest of the message
if param.sep is None:
param_val = message
message = None
message = ""
else:
message = message.lstrip(param.sep)
if not message:
if param.optional:
break
raise KeyError(f"Missing parameter {param_name}")
if not param.optional:
raise KeyError(f"Missing parameter {param_name}")
continue
param_val, _, message = message.partition(param.sep) # type: ignore
param_vals[param_name] = param.parser(param_val)
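A hedged sketch of declaring a command under these parsing rules; Parameter's constructor is assumed from the attributes the loop reads (parser, sep, optional), and sep=None marks a trailing greedy parameter:
class GreeterAddon(BaseAddon):
    @handle_command(message=Parameter(str, sep=None))
    async def greet(self, _session: Session, _region: ProxiedRegion, message: str):
        # `message` received everything after the command name.
        send_chat(f"Hello, {message}!")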


@@ -48,13 +48,17 @@ class HTTPAssetRepo(collections.UserDict):
asset_id = None
for name, val in flow.request.query.items():
if name.endswith("_id"):
asset_id = UUID(val)
try:
asset_id = UUID(val)
break
except ValueError:
pass
if not asset_id or asset_id not in self.data:
return False
asset = self[asset_id]
flow.response = http.HTTPResponse.make(
flow.response = http.Response.make(
content=asset.data,
headers={
"Content-Type": "application/octet-stream",


@@ -14,11 +14,13 @@ import defusedxml.xmlrpc
import mitmproxy.http
from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.llsd_msg_serializer import LLSDMessageSerializer
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.region import ProxiedRegion, CapType
from hippolyzer.lib.proxy.sessions import SessionManager, CapData, Session
from hippolyzer.lib.proxy.caps import CapData, CapType
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import SessionManager, Session
from hippolyzer.lib.proxy.http_proxy import HTTPFlowContext
@@ -81,16 +83,19 @@ class MITMProxyEventManager:
finally:
# If someone has taken this request out of the regular callback flow,
# they'll manually send a callback at some later time.
if not flow.taken:
self.to_proxy_queue.put(("callback", flow.id, flow.get_state()))
if not flow.taken and not flow.resumed:
# Addon hasn't taken ownership of this flow, send it back to mitmproxy
# ourselves.
flow.resume()
def _handle_request(self, flow: HippoHTTPFlow):
url = flow.request.url
cap_data = self.session_manager.resolve_cap(url)
flow.cap_data = cap_data
# Don't do anything special with the proxy's own requests,
# we only pass it through for logging purposes.
if flow.request_injected:
# Don't do anything special with the proxy's own requests unless the requested
# URL can only be handled by the proxy. Ideally we only pass the request through
# for logging purposes.
if flow.request_injected and (not cap_data or not cap_data.type.fake):
return
# The local asset repo gets first bite at the apple
@@ -102,7 +107,7 @@ class MITMProxyEventManager:
AddonManager.handle_http_request(flow)
if cap_data and cap_data.cap_name.endswith("ProxyWrapper"):
orig_cap_name = cap_data.cap_name.rsplit("ProxyWrapper", 1)[0]
orig_cap_url = cap_data.region().caps[orig_cap_name]
orig_cap_url = cap_data.region().cap_urls[orig_cap_name]
split_orig_url = urllib.parse.urlsplit(orig_cap_url)
orig_cap_host = split_orig_url[1]
@@ -119,7 +124,7 @@ class MITMProxyEventManager:
if not flow.can_stream or self._asset_server_proxied:
flow.request.url = redir_url
else:
flow.response = mitmproxy.http.HTTPResponse.make(
flow.response = mitmproxy.http.Response.make(
307,
# Can't provide explanation in the body because this results in failing Range requests under
# mitmproxy that return garbage data. Chances are there's weird interactions
@@ -133,9 +138,41 @@ class MITMProxyEventManager:
)
elif cap_data and cap_data.asset_server_cap:
# Both the wrapper request and the actual asset server request went through
# the proxy
# the proxy. Don't bother trying the redirect strategy anymore.
self._asset_server_proxied = True
logging.warning("noproxy not used, switching to URI rewrite strategy")
elif cap_data and cap_data.cap_name == "EventQueueGet":
# HACK: The sim's EQ acking mechanism doesn't seem to actually work.
# If the client drops the connection due to a timeout before we can
# proxy back the response, then it will be lost forever. Keep around
# the last EQ response we got so we can re-send it if the client repeats
# its previous request.
req_ack_id = llsd.parse_xml(flow.request.content)["ack"]
eq_manager = cap_data.region().eq_manager
cached_resp = eq_manager.get_cached_poll_response(req_ack_id)
if cached_resp:
logging.warning("Had to serve a cached EventQueueGet due to client desync")
flow.response = mitmproxy.http.Response.make(
200,
llsd.format_xml(cached_resp),
{
"Content-Type": "application/llsd+xml",
# So we can differentiate these in the log
"X-Hippo-Fake-EQ": "1",
"Connection": "close",
},
)
elif cap_data and cap_data.cap_name == "Seed":
# Drop any proxy-only caps from the seed request we send to the server,
# add those cap names as metadata so we know to send their urls in the response
parsed_seed: List[str] = llsd.parse_xml(flow.request.content)
flow.metadata['needed_proxy_caps'] = []
for known_cap_name, (known_cap_type, known_cap_url) in cap_data.region().caps.items():
if known_cap_type == CapType.PROXY_ONLY and known_cap_name in parsed_seed:
parsed_seed.remove(known_cap_name)
flow.metadata['needed_proxy_caps'].append(known_cap_name)
if flow.metadata['needed_proxy_caps']:
flow.request.content = llsd.format_xml(parsed_seed)
elif not cap_data:
if self._is_login_request(flow):
# Not strictly a Cap, but makes it easier to filter on.
@@ -144,7 +181,7 @@ class MITMProxyEventManager:
if cap_data and cap_data.type == CapType.PROXY_ONLY:
# A proxy addon was supposed to respond itself, but it didn't.
if not flow.taken and not flow.response_injected:
flow.response = mitmproxy.http.HTTPResponse.make(
flow.response = mitmproxy.http.Response.make(
500,
b"Proxy didn't handle proxy-only Cap correctly",
{
@@ -175,75 +212,105 @@ class MITMProxyEventManager:
def _handle_response(self, flow: HippoHTTPFlow):
message_logger = self.session_manager.message_logger
if message_logger:
message_logger.log_http_response(flow)
try:
message_logger.log_http_response(flow)
except:
logging.exception("Failed while logging HTTP flow")
# Don't handle responses for requests injected by the proxy
if flow.request_injected:
return
if AddonManager.handle_http_response(flow):
# Don't process responses for requests or responses injected by the proxy.
# We already processed it, it came from us!
if flow.request_injected or flow.response_injected:
return
status = flow.response.status_code
cap_data: Optional[CapData] = flow.metadata["cap_data"]
if cap_data:
if status != 200:
if status == 200 and cap_data and cap_data.cap_name == "FirestormBridge":
# Fake FirestormBridge cap based on a bridge-like response coming from
# a non-browser HTTP request. Figure out what session it belongs to
# so it can be handled in the session and region HTTP MessageHandlers
agent_id_str = flow.response.headers.get("X-SecondLife-Owner-Key", "")
if not agent_id_str:
return
agent_id = UUID(agent_id_str)
for session in self.session_manager.sessions:
if session.pending:
continue
if session.agent_id == agent_id:
# Enrich the flow with the session and region info
cap_data = CapData(
cap_name="FirestormBridge",
region=weakref.ref(session.main_region),
session=weakref.ref(session),
)
flow.cap_data = cap_data
break
if cap_data.cap_name == "LoginRequest":
self._handle_login_flow(flow)
if AddonManager.handle_http_response(flow):
return
if status != 200 or not cap_data:
return
if cap_data.cap_name == "LoginRequest":
self._handle_login_flow(flow)
return
try:
session = cap_data.session and cap_data.session()
if not session:
return
session.http_message_handler.handle(flow)
region = cap_data.region and cap_data.region()
if not region:
return
region.http_message_handler.handle(flow)
if cap_data.cap_name == "Seed":
parsed = llsd.parse_xml(flow.response.content)
logging.debug("Got seed cap for %r : %r" % (cap_data, parsed))
region.update_caps(parsed)
# On LL's grid these URIs aren't unique across sessions or regions,
# so we get request attribution by replacing them with a unique
# alias URI.
logging.debug("Replacing GetMesh caps with wrapped versions")
wrappable_caps = {"GetMesh2", "GetMesh", "GetTexture", "ViewerAsset"}
for cap_name in wrappable_caps:
if cap_name in parsed:
parsed[cap_name] = region.register_wrapper_cap(cap_name)
# Send the client the URLs for any proxy-only caps it requested
for cap_name in flow.metadata['needed_proxy_caps']:
parsed[cap_name] = region.cap_urls[cap_name]
flow.response.content = llsd.format_xml(parsed)
elif cap_data.cap_name == "EventQueueGet":
parsed_eq_resp = llsd.parse_xml(flow.response.content)
if parsed_eq_resp:
old_events = parsed_eq_resp["events"]
new_events = []
for event in old_events:
if not self._handle_eq_event(cap_data.session(), region, event):
new_events.append(event)
# Add on any fake events that've been queued by addons
eq_manager = cap_data.region().eq_manager
new_events.extend(eq_manager.take_injected_events())
parsed_eq_resp["events"] = new_events
# Empty event list is an error, need to return undef instead.
if old_events and not new_events:
parsed_eq_resp = None
# HACK: see note in above request handler for EventQueueGet
req_ack_id = llsd.parse_xml(flow.request.content)["ack"]
eq_manager.cache_last_poll_response(req_ack_id, parsed_eq_resp)
flow.response.content = llsd.format_xml(parsed_eq_resp)
elif cap_data.cap_name in self.UPLOAD_CREATING_CAPS:
if not region:
return
region.http_message_handler.handle(flow)
if cap_data.cap_name == "Seed":
parsed = llsd.parse_xml(flow.response.content)
logging.debug("Got seed cap for %r : %r" % (cap_data, parsed))
region.update_caps(parsed)
# On LL's grid these URIs aren't unique across sessions or regions,
# so we get request attribution by replacing them with a unique
# alias URI.
logging.debug("Replacing GetMesh caps with wrapped versions")
wrappable_caps = {"GetMesh2", "GetMesh", "GetTexture", "ViewerAsset"}
for cap_name in wrappable_caps:
if cap_name in parsed:
parsed[cap_name] = region.register_wrapper_cap(cap_name)
flow.response.content = llsd.format_pretty_xml(parsed)
elif cap_data.cap_name == "EventQueueGet":
parsed_eq_resp = llsd.parse_xml(flow.response.content)
if parsed_eq_resp:
old_events = parsed_eq_resp["events"]
new_events = []
for event in old_events:
if not self._handle_eq_event(cap_data.session(), region, event):
new_events.append(event)
# Add on any fake events that've been queued by addons
eq_manager = cap_data.region().eq_manager
new_events.extend(eq_manager.take_events())
parsed_eq_resp["events"] = new_events
if old_events and not new_events:
# Need at least one event or the viewer will refuse to ack!
new_events.append({"message": "NOP", "body": {}})
flow.response.content = llsd.format_pretty_xml(parsed_eq_resp)
elif cap_data.cap_name in self.UPLOAD_CREATING_CAPS:
if not region:
return
parsed = llsd.parse_xml(flow.response.content)
if "uploader" in parsed:
region.register_temporary_cap(cap_data.cap_name + "Uploader", parsed["uploader"])
except:
logging.exception("OOPS, blew up in HTTP proxy!")
parsed = llsd.parse_xml(flow.response.content)
if "uploader" in parsed:
region.register_cap(cap_data.cap_name + "Uploader", parsed["uploader"], CapType.TEMPORARY)
except:
logging.exception("OOPS, blew up in HTTP proxy!")
def _handle_login_flow(self, flow: HippoHTTPFlow):
resp = xmlrpc.client.loads(flow.response.content)[0][0] # type: ignore


@@ -1,13 +1,18 @@
from __future__ import annotations
import copy
import multiprocessing
import weakref
from typing import *
from typing import Optional
import mitmproxy.http
from mitmproxy.http import HTTPFlow
from hippolyzer.lib.proxy.caps import CapData
if TYPE_CHECKING:
from hippolyzer.lib.proxy.sessions import CapData, SessionManager
from hippolyzer.lib.proxy.sessions import SessionManager
class HippoHTTPFlow:
@@ -17,24 +22,26 @@ class HippoHTTPFlow:
Hides the nastiness of writing to flow.metadata so we can pass
state back and forth between the two proxies
"""
__slots__ = ("flow",)
__slots__ = ("flow", "callback_queue", "resumed", "taken")
def __init__(self, flow: HTTPFlow):
def __init__(self, flow: HTTPFlow, callback_queue: Optional[multiprocessing.Queue] = None):
self.flow: HTTPFlow = flow
self.resumed = False
self.taken = False
self.callback_queue = weakref.ref(callback_queue) if callback_queue else None
meta = self.flow.metadata
meta.setdefault("taken", False)
meta.setdefault("can_stream", True)
meta.setdefault("response_injected", False)
meta.setdefault("request_injected", False)
meta.setdefault("cap_data", None)
meta.setdefault("cap_data", CapData())
meta.setdefault("from_browser", False)
@property
def request(self) -> mitmproxy.http.HTTPRequest:
def request(self) -> mitmproxy.http.Request:
return self.flow.request
@property
def response(self) -> Optional[mitmproxy.http.HTTPResponse]:
def response(self) -> Optional[mitmproxy.http.Response]:
return self.flow.response
@property
@@ -42,7 +49,7 @@ class HippoHTTPFlow:
return self.flow.id
@response.setter
def response(self, val: Optional[mitmproxy.http.HTTPResponse]):
def response(self, val: Optional[mitmproxy.http.Response]):
self.flow.metadata["response_injected"] = True
self.flow.response = val
@@ -88,12 +95,27 @@ class HippoHTTPFlow:
def take(self) -> HippoHTTPFlow:
"""Don't automatically pass this flow back to mitmproxy"""
self.metadata["taken"] = True
# TODO: Having to explicitly take / release Flows to use them in an async
# context is kind of janky. The HTTP callback handling code should probably
# be made totally async, including the addon hooks. Would coroutine per-callback
# be expensive?
assert not self.taken and not self.resumed
self.taken = True
return self
@property
def taken(self) -> bool:
return self.metadata["taken"]
def resume(self):
"""Release the HTTP flow back to the normal processing flow"""
assert self.callback_queue
assert not self.resumed
self.taken = False
self.resumed = True
self.callback_queue().put(("callback", self.flow.id, self.get_state()))
def preempt(self):
# Must be some flow that we previously resumed, we're racing
# the result from the server end.
assert not self.taken and self.resumed
self.callback_queue().put(("preempt", self.flow.id, self.get_state()))
@property
def is_replay(self) -> bool:
@@ -113,15 +135,18 @@ class HippoHTTPFlow:
return state
@classmethod
def from_state(cls, flow_state: Dict, session_manager: SessionManager) -> HippoHTTPFlow:
def from_state(cls, flow_state: Dict, session_manager: Optional[SessionManager]) -> HippoHTTPFlow:
flow: Optional[HTTPFlow] = HTTPFlow.from_state(flow_state)
assert flow is not None
cap_data_ser = flow.metadata.get("cap_data_ser")
callback_queue = None
if session_manager:
callback_queue = session_manager.flow_context.to_proxy_queue
if cap_data_ser is not None:
flow.metadata["cap_data"] = session_manager.deserialize_cap_data(cap_data_ser)
flow.metadata["cap_data"] = CapData.deserialize(cap_data_ser, session_manager)
else:
flow.metadata["cap_data"] = None
return cls(flow)
return cls(flow, callback_queue)
def copy(self) -> HippoHTTPFlow:
# HACK: flow.copy() expects the flow to be fully JSON serializable, but


@@ -1,5 +1,4 @@
import asyncio
import functools
import logging
import multiprocessing
import os
@@ -8,6 +7,7 @@ import sys
import queue
import typing
import uuid
import weakref
import mitmproxy.certs
import mitmproxy.ctx
@@ -15,42 +15,30 @@ import mitmproxy.log
import mitmproxy.master
import mitmproxy.options
import mitmproxy.proxy
from mitmproxy.addons import core, clientplayback
from mitmproxy.addons import core, clientplayback, proxyserver, next_layer, disable_h2c
from mitmproxy.http import HTTPFlow
from mitmproxy.proxy.layers import tls
import OpenSSL
from hippolyzer.lib.base.helpers import get_resource_filename
from hippolyzer.lib.base.multiprocessing_utils import ParentProcessWatcher
orig_sethostflags = OpenSSL.SSL._lib.X509_VERIFY_PARAM_set_hostflags # noqa
@functools.wraps(orig_sethostflags)
def _sethostflags_wrapper(param, flags):
# Since 2000 the recommendation per RFCs has been to only check SANs and not the CN field.
# Most browsers do this, as does mitmproxy. The viewer does not, and the sim certs have no SAN
# field. Just monkeypatch out this flag since mitmproxy's internals are in flux and there's
# no good way to stop setting this flag currently.
return orig_sethostflags(
param,
flags & (~OpenSSL.SSL._lib.X509_CHECK_FLAG_NEVER_CHECK_SUBJECT) # noqa
)
OpenSSL.SSL._lib.X509_VERIFY_PARAM_set_hostflags = _sethostflags_wrapper # noqa
from hippolyzer.lib.proxy.caps import SerializedCapData
class SLCertStore(mitmproxy.certs.CertStore):
def get_cert(self, commonname: typing.Optional[bytes], sans: typing.List[bytes], *args):
cert, privkey, chain = super().get_cert(commonname, sans, *args)
x509: OpenSSL.crypto.X509 = cert.x509
def get_cert(self, commonname: typing.Optional[str], sans: typing.List[str], *args, **kwargs):
entry = super().get_cert(commonname, sans, *args, **kwargs)
cert, privkey, chain = entry.cert, entry.privatekey, entry.chain_file
x509 = cert.to_pyopenssl()
# The cert must have a subject key ID or the viewer will reject it.
for i in range(0, x509.get_extension_count()):
ext = x509.get_extension(i)
# This cert already has a subject key id, pass through.
if ext.get_short_name() == b"subjectKeyIdentifier":
return cert, privkey, chain
return entry
# Need to add a subject key ID onto this cert or the viewer will reject it.
# The viewer doesn't actually use the subject key ID for its intended purpose,
# so a random, unique value is fine.
x509.add_extensions([
OpenSSL.crypto.X509Extension(
b"subjectKeyIdentifier",
@@ -58,17 +46,24 @@ class SLCertStore(mitmproxy.certs.CertStore):
uuid.uuid4().hex.encode("utf8"),
),
])
x509.sign(privkey, "sha256") # type: ignore
return cert, privkey, chain
x509.sign(OpenSSL.crypto.PKey.from_cryptography_key(privkey), "sha256") # type: ignore
new_entry = mitmproxy.certs.CertStoreEntry(
mitmproxy.certs.Cert.from_pyopenssl(x509), privkey, chain
)
# Replace the cert that was created in the base `get_cert()` with our modified cert
self.certs[(commonname, tuple(sans))] = new_entry
self.expire_queue.pop(-1)
self.expire(new_entry)
return new_entry
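A quick, hedged sanity check of the patched store; cert_store stands in for a configured SLCertStore and the hostname is illustrative:
entry = cert_store.get_cert("sim.example.invalid", [])
x509 = entry.cert.to_pyopenssl()
ext_names = [x509.get_extension(i).get_short_name()
             for i in range(x509.get_extension_count())]
assert b"subjectKeyIdentifier" in ext_names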
class SLProxyConfig(mitmproxy.proxy.ProxyConfig):
def configure(self, options, updated) -> None:
super().configure(options, updated)
class SLTlsConfig(mitmproxy.addons.tlsconfig.TlsConfig):
def running(self):
super().running()
old_cert_store = self.certstore
# Replace the cert store with one that knows how to add
# a subject key ID extension.
self.certstore = SLCertStore( # noqa
self.certstore = SLCertStore(
default_privatekey=old_cert_store.default_privatekey,
default_ca=old_cert_store.default_ca,
default_chain_file=old_cert_store.default_chain_file,
@@ -76,12 +71,25 @@ class SLProxyConfig(mitmproxy.proxy.ProxyConfig):
)
self.certstore.certs = old_cert_store.certs
def tls_start_server(self, tls_start: tls.TlsData):
super().tls_start_server(tls_start)
# Since 2000 the recommendation per RFCs has been to only check SANs and not the CN field.
# Most browsers do this, as does mitmproxy. The viewer does not, and the sim certs have no SAN
# field. Set the host verification flags to remove the flag that disallows falling back to
# checking the CN (X509_CHECK_FLAG_NEVER_CHECK_SUBJECT).
param = OpenSSL.SSL._lib.SSL_get0_param(tls_start.ssl_conn._ssl) # noqa
# get_hostflags() doesn't seem to be exposed, just set the usual flags without
# the problematic `X509_CHECK_FLAG_NEVER_CHECK_SUBJECT` flag.
flags = OpenSSL.SSL._lib.X509_CHECK_FLAG_NO_PARTIAL_WILDCARDS # noqa
OpenSSL.SSL._lib.X509_VERIFY_PARAM_set_hostflags(param, flags) # noqa
class HTTPFlowContext:
def __init__(self):
self.from_proxy_queue = multiprocessing.Queue()
self.to_proxy_queue = multiprocessing.Queue()
self.shutdown_signal = multiprocessing.Event()
self.mitmproxy_ready = multiprocessing.Event()
class IPCInterceptionAddon:
@@ -91,12 +99,13 @@ class IPCInterceptionAddon:
flow which is merged in and resumed.
"""
def __init__(self, flow_context: HTTPFlowContext):
self.intercepted_flows: typing.Dict[str, HTTPFlow] = {}
self.mitmproxy_ready = flow_context.mitmproxy_ready
self.flows: weakref.WeakValueDictionary[str, HTTPFlow] = weakref.WeakValueDictionary()
self.from_proxy_queue: multiprocessing.Queue = flow_context.from_proxy_queue
self.to_proxy_queue: multiprocessing.Queue = flow_context.to_proxy_queue
self.shutdown_signal: multiprocessing.Event = flow_context.shutdown_signal
def log(self, entry: mitmproxy.log.LogEntry):
def add_log(self, entry: mitmproxy.log.LogEntry):
if entry.level == "debug":
logging.debug(entry.msg)
elif entry.level in ("alert", "info"):
@@ -111,6 +120,8 @@ class IPCInterceptionAddon:
def running(self):
# register to pump the events or something here
asyncio.create_task(self._pump_callbacks())
# Tell the main process mitmproxy is ready to handle requests
self.mitmproxy_ready.set()
async def _pump_callbacks(self):
watcher = ParentProcessWatcher(self.shutdown_signal)
@@ -124,11 +135,13 @@ class IPCInterceptionAddon:
await asyncio.sleep(0.001)
continue
if event_type == "callback":
orig_flow = self.intercepted_flows.pop(flow_id)
orig_flow = self.flows[flow_id]
orig_flow.set_state(flow_state)
# Remove the taken flag from the flow if present; the flow by definition
# isn't take()n anymore once it's been passed back to the proxy.
orig_flow.metadata.pop("taken", None)
elif event_type == "preempt":
orig_flow = self.flows.get(flow_id)
if orig_flow:
orig_flow.intercept()
orig_flow.set_state(flow_state)
elif event_type == "replay":
flow: HTTPFlow = HTTPFlow.from_state(flow_state)
# mitmproxy won't replay intercepted flows, this is an old flow so
@@ -150,8 +163,8 @@ class IPCInterceptionAddon:
from_browser = "Mozilla" in flow.request.headers.get("User-Agent", "")
flow.metadata["from_browser"] = from_browser
# Only trust the "injected" header if not from a browser
was_injected = flow.request.headers.pop("X-Hippo-Injected", False)
if was_injected and not from_browser:
was_injected = flow.request.headers.pop("X-Hippo-Injected", "")
if was_injected == "1" and not from_browser:
flow.metadata["request_injected"] = True
# Does this request need the stupid hack around aiohttp's windows proactor bug
@@ -162,13 +175,13 @@ class IPCInterceptionAddon:
def _queue_flow_interception(self, event_type: str, flow: HTTPFlow):
flow.intercept()
self.intercepted_flows[flow.id] = flow
self.flows[flow.id] = flow
self.from_proxy_queue.put((event_type, flow.get_state()), True)
def responseheaders(self, flow: HTTPFlow):
# The response was injected by an earlier handler,
# so we don't want to touch it anymore.
if flow.metadata["response_injected"]:
if flow.metadata.get("response_injected"):
return
# Someone fucked up and put a mimetype in Content-Encoding.
@@ -179,7 +192,10 @@ class IPCInterceptionAddon:
flow.response.headers["Content-Encoding"] = "identity"
def response(self, flow: HTTPFlow):
if flow.metadata["response_injected"]:
cap_data: typing.Optional[SerializedCapData] = flow.metadata.get("cap_data")
if flow.metadata.get("response_injected") and cap_data and cap_data.asset_server_cap:
# Don't bother intercepting asset server requests where we injected a response.
# We don't want to log them and they don't need any more processing by user hooks.
return
self._queue_flow_interception("response", flow)
@@ -187,10 +203,10 @@ class IPCInterceptionAddon:
class SLMITMAddon(IPCInterceptionAddon):
def responseheaders(self, flow: HTTPFlow):
super().responseheaders(flow)
cap_data: typing.Optional[SerializedCapData] = flow.metadata["cap_data_ser"]
cap_data: typing.Optional[SerializedCapData] = flow.metadata.get("cap_data_ser")
# Request came from the proxy itself, don't touch it.
if flow.metadata["request_injected"]:
if flow.metadata.get("request_injected"):
return
# This is an asset server response that we're not interested in intercepting.
@@ -199,7 +215,7 @@ class SLMITMAddon(IPCInterceptionAddon):
# Can't stream if we injected our own response or we were asked not to stream
if not flow.metadata["response_injected"] and flow.metadata["can_stream"]:
flow.response.stream = True
elif not cap_data and not flow.metadata["from_browser"]:
elif not cap_data and not flow.metadata.get("from_browser"):
object_name = flow.response.headers.get("X-SecondLife-Object-Name", "")
# Meh. Add some fake Cap data for this so it can be matched on.
if object_name.startswith("#Firestorm LSL Bridge"):
@@ -212,13 +228,13 @@ class SLMITMMaster(mitmproxy.master.Master):
self.addons.add(
core.Core(),
clientplayback.ClientPlayback(),
SLMITMAddon(flow_context)
disable_h2c.DisableH2C(),
proxyserver.Proxyserver(),
next_layer.NextLayer(),
SLTlsConfig(),
SLMITMAddon(flow_context),
)
def start_server(self):
self.start()
asyncio.ensure_future(self.running())
def create_proxy_master(host, port, flow_context: HTTPFlowContext): # pragma: no cover
opts = mitmproxy.options.Options()
@@ -241,30 +257,4 @@ def create_proxy_master(host, port, flow_context: HTTPFlowContext): # pragma: n
def create_http_proxy(bind_host, port, flow_context: HTTPFlowContext): # pragma: no cover
master = create_proxy_master(bind_host, port, flow_context)
pconf = SLProxyConfig(master.options)
server = mitmproxy.proxy.server.ProxyServer(pconf)
master.server = server
return master
def is_asset_server_cap_name(cap_name):
return cap_name and (
cap_name.startswith("GetMesh")
or cap_name.startswith("GetTexture")
or cap_name.startswith("ViewerAsset")
)
class SerializedCapData(typing.NamedTuple):
cap_name: typing.Optional[str] = None
region_addr: typing.Optional[str] = None
session_id: typing.Optional[str] = None
base_url: typing.Optional[str] = None
type: str = "NORMAL"
def __bool__(self):
return bool(self.cap_name or self.session_id)
@property
def asset_server_cap(self):
return is_asset_server_cap_name(self.cap_name)


@@ -1,3 +1,4 @@
import asyncio
import logging
import weakref
from typing import Optional, Tuple
@@ -5,7 +6,6 @@ from typing import Optional, Tuple
from hippolyzer.lib.base.message.message_dot_xml import MessageDotXML
from hippolyzer.lib.base.message.udpdeserializer import UDPMessageDeserializer
from hippolyzer.lib.base.message.udpserializer import UDPMessageSerializer
from hippolyzer.lib.base.settings import Settings
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.base.network.transport import UDPPacket
from hippolyzer.lib.base.message.message import Message
@@ -26,17 +26,28 @@ class SLSOCKS5Server(SOCKS5Server):
return lambda: InterceptingLLUDPProxyProtocol(source_addr, self.session_manager)
class BaseLLUDPProxyProtocol(UDPProxyProtocol):
def __init__(self, source_addr: Tuple[str, int]):
class InterceptingLLUDPProxyProtocol(UDPProxyProtocol):
def __init__(self, source_addr: Tuple[str, int], session_manager: SessionManager):
super().__init__(source_addr)
self.settings = Settings()
self.settings.ENABLE_DEFERRED_PACKET_PARSING = True
self.settings.HANDLE_PACKETS = False
self.session_manager: SessionManager = session_manager
self.serializer = UDPMessageSerializer()
self.deserializer = UDPMessageDeserializer(
settings=self.settings,
settings=self.session_manager.settings,
)
self.message_xml = MessageDotXML()
self.session: Optional[Session] = None
loop = asyncio.get_event_loop_policy().get_event_loop()
self.resend_task = loop.create_task(self.attempt_resends())
async def attempt_resends(self):
while True:
await asyncio.sleep(0.1)
if self.session is None:
continue
for region in self.session.regions:
if not region.circuit or not region.circuit.is_alive:
continue
region.circuit.resend_unacked()
def _ensure_message_allowed(self, msg: Message):
if not self.message_xml.validate_udp_msg(msg.name):
@@ -45,37 +56,22 @@ class BaseLLUDPProxyProtocol(UDPProxyProtocol):
)
raise PermissionError(f"UDPBanned message {msg.name}")
class InterceptingLLUDPProxyProtocol(BaseLLUDPProxyProtocol):
def __init__(self, source_addr: Tuple[str, int], session_manager: SessionManager):
super().__init__(source_addr)
self.session_manager: SessionManager = session_manager
self.session: Optional[Session] = None
def _handle_proxied_packet(self, packet: UDPPacket):
message: Optional[Message] = None
def handle_proxied_packet(self, packet: UDPPacket):
region: Optional[ProxiedRegion] = None
# Try to do an initial region lookup so we have it for handle_proxied_packet()
if self.session:
region = self.session.region_by_circuit_addr(packet.far_addr)
deserialize_exc = None
try:
message = self.deserializer.deserialize(packet.data)
message.direction = packet.direction
message.sender = packet.src_addr
except Exception as e:
# Hang onto this since handle_proxied_packet doesn't need a parseable
# message. If that hook doesn't handle the packet then re-raise.
deserialize_exc = e
# The proxied packet handler is allowed to mutate `packet.data` before
# the message gets parsed.
if AddonManager.handle_proxied_packet(self.session_manager, packet,
self.session, region, message):
# Swallow any error raised by above message deserialization, it was handled.
self.session, region):
return
if deserialize_exc is not None:
# handle_proxied_packet() didn't deal with the error, so it's fatal.
raise deserialize_exc
message = self.deserializer.deserialize(packet.data)
message.direction = packet.direction
message.sender = packet.src_addr
message.meta.update(packet.meta)
assert message is not None
# Check for UDP bans on inbound messages
@@ -116,6 +112,9 @@ class InterceptingLLUDPProxyProtocol(BaseLLUDPProxyProtocol):
LOG.error("No circuit for %r, dropping packet!" % (packet.far_addr,))
return
# Process any ACKs for messages we injected first
region.circuit.collect_acks(message)
if message.name == "AgentMovementComplete":
self.session.main_region = region
if region.handle is None:
@@ -125,7 +124,7 @@ class InterceptingLLUDPProxyProtocol(BaseLLUDPProxyProtocol):
if message.name == "RegionHandshake":
region.cache_id = message["RegionInfo"]["CacheID"]
self.session.objects.track_region_objects(region.handle)
if self.session_manager.use_viewer_object_cache:
if self.session_manager.settings.USE_VIEWER_OBJECT_CACHE:
try:
region.objects.load_cache()
except:
@@ -148,7 +147,7 @@ class InterceptingLLUDPProxyProtocol(BaseLLUDPProxyProtocol):
# This message is owned by an async handler, drop it so it doesn't get
# sent with the normal flow.
if message.queued and not message.dropped:
if message.queued:
region.circuit.drop_message(message)
# Shouldn't mutate the message past this point, so log it now.
@@ -163,8 +162,9 @@ class InterceptingLLUDPProxyProtocol(BaseLLUDPProxyProtocol):
elif message.name == "RegionHandshake":
region.name = str(message["RegionInfo"][0]["SimName"])
if not message.dropped:
region.circuit.send_message(message)
# Send the message if it wasn't explicitly dropped or sent before
if not message.finalized:
region.circuit.send(message)
def close(self):
super().close()
@@ -172,3 +172,4 @@ class InterceptingLLUDPProxyProtocol(BaseLLUDPProxyProtocol):
AddonManager.handle_session_closed(self.session)
self.session_manager.close_session(self.session)
self.session = None
self.resend_task.cancel()


@@ -3,7 +3,7 @@ import ast
import typing
from arpeggio import Optional, ZeroOrMore, EOF, \
ParserPython, PTNodeVisitor, visit_parse_tree, RegExMatch
ParserPython, PTNodeVisitor, visit_parse_tree, RegExMatch, OneOrMore
def literal():
@@ -12,7 +12,7 @@ def literal():
# https://stackoverflow.com/questions/14366401/#comment79795017_14366904
RegExMatch(r'''b?(\"\"\"|\'\'\'|\"|\')((?<!\\)(\\\\)*\\\1|.)*?\1'''),
# base16
RegExMatch(r'0x\d+'),
RegExMatch(r'0x[0-9a-fA-F]+'),
# base10 int or float.
RegExMatch(r'\d+(\.\d+)?'),
"None",
@@ -26,7 +26,9 @@ def literal():
def identifier():
return RegExMatch(r'[a-zA-Z*]([a-zA-Z0-9_*]+)?')
# Identifiers are allowed to have "-". It's not a special character
# in our grammar, and we expect them to show up some places, like header names.
return RegExMatch(r'[a-zA-Z*]([a-zA-Z0-9_*-]+)?')
def field_specifier():
@@ -42,7 +44,7 @@ def unary_expression():
def meta_field_specifier():
return "Meta", ".", identifier
return "Meta", OneOrMore(".", identifier)
def enum_field_specifier():
@@ -69,12 +71,17 @@ def message_filter():
return expression, EOF
MATCH_RESULT = typing.Union[bool, typing.Tuple]
class MatchResult(typing.NamedTuple):
result: bool
fields: typing.List[typing.Tuple]
def __bool__(self):
return self.result
class BaseFilterNode(abc.ABC):
@abc.abstractmethod
def match(self, msg) -> MATCH_RESULT:
def match(self, msg, short_circuit=True) -> MatchResult:
raise NotImplementedError()
@property
@@ -104,18 +111,36 @@ class BinaryFilterNode(BaseFilterNode, abc.ABC):
class UnaryNotFilterNode(UnaryFilterNode):
def match(self, msg) -> MATCH_RESULT:
return not self.node.match(msg)
def match(self, msg, short_circuit=True) -> MatchResult:
# Should we pass fields up here? Maybe not.
return MatchResult(not self.node.match(msg, short_circuit), [])
class OrFilterNode(BinaryFilterNode):
def match(self, msg) -> MATCH_RESULT:
return self.left_node.match(msg) or self.right_node.match(msg)
def match(self, msg, short_circuit=True) -> MatchResult:
left_match = self.left_node.match(msg, short_circuit)
if left_match and short_circuit:
return MatchResult(True, left_match.fields)
right_match = self.right_node.match(msg, short_circuit)
if right_match and short_circuit:
return MatchResult(True, right_match.fields)
if left_match or right_match:
# Fine since fields should be empty when result=False
return MatchResult(True, left_match.fields + right_match.fields)
return MatchResult(False, [])
class AndFilterNode(BinaryFilterNode):
def match(self, msg) -> MATCH_RESULT:
return self.left_node.match(msg) and self.right_node.match(msg)
def match(self, msg, short_circuit=True) -> MatchResult:
left_match = self.left_node.match(msg, short_circuit)
if not left_match:
return MatchResult(False, [])
right_match = self.right_node.match(msg, short_circuit)
if not right_match:
return MatchResult(False, [])
return MatchResult(True, left_match.fields + right_match.fields)
class MessageFilterNode(BaseFilterNode):
@@ -124,15 +149,15 @@ class MessageFilterNode(BaseFilterNode):
self.operator = operator
self.value = value
def match(self, msg) -> MATCH_RESULT:
return msg.matches(self)
def match(self, msg, short_circuit=True) -> MatchResult:
return msg.matches(self, short_circuit)
@property
def children(self):
return self.selector, self.operator, self.value
class MetaFieldSpecifier(str):
class MetaFieldSpecifier(tuple):
pass
@@ -158,7 +183,7 @@ class MessageFilterVisitor(PTNodeVisitor):
return LiteralValue(ast.literal_eval(node.value))
def visit_meta_field_specifier(self, _node, children):
return MetaFieldSpecifier(children[0])
return MetaFieldSpecifier(children)
def visit_enum_field_specifier(self, _node, children):
return EnumFieldSpecifier(*children)
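A short usage sketch of the reworked matching, assuming the grammar's == operator and some existing log entry object; MatchResult lets callers collect every matching field instead of stopping at the first:

# Sketch: a multi-level Meta filter, now possible with the OneOrMore change above
filt = compile_filter('Meta.RespHeaders.Content-Type == "application/llsd+xml"')
result = filt.match(entry, short_circuit=False)  # entry: an AbstractMessageLogEntry
if result:                # MatchResult.__bool__ proxies .result
    print(result.fields)  # (msg_name, block_name, block_num, var_name) keys for highlighting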

View File

@@ -1,8 +1,11 @@
from __future__ import annotations
import abc
import ast
import collections
import copy
import fnmatch
import gzip
import io
import logging
import pickle
@@ -13,16 +16,16 @@ import weakref
from defusedxml import minidom
from hippolyzer.lib.base import serialization as se, llsd
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.datatypes import TaggedUnion, UUID, TupleCoord
from hippolyzer.lib.base.helpers import bytes_escape
from hippolyzer.lib.base.message.message_formatting import HumanMessageSerializer
from hippolyzer.lib.proxy.message_filter import MetaFieldSpecifier, compile_filter, BaseFilterNode, MessageFilterNode, \
EnumFieldSpecifier
from hippolyzer.lib.proxy.region import CapType
EnumFieldSpecifier, MatchResult
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.caps import CapType, SerializedCapData
if typing.TYPE_CHECKING:
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
@@ -30,24 +33,42 @@ LOG = logging.getLogger(__name__)
class BaseMessageLogger:
paused: bool
def log_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
pass
if self.paused:
return False
return self.add_log_entry(LLUDPMessageLogEntry(message, region, session))
def log_http_response(self, flow: HippoHTTPFlow):
pass
if self.paused:
return False
# These are huge, let's not log them for now.
if flow.cap_data and flow.cap_data.asset_server_cap:
return False
return self.add_log_entry(HTTPMessageLogEntry(flow))
def log_eq_event(self, session: Session, region: ProxiedRegion, event: dict):
if self.paused:
return False
return self.add_log_entry(EQMessageLogEntry(event, region, session))
@abc.abstractmethod
def add_log_entry(self, entry: AbstractMessageLogEntry):
pass
class FilteringMessageLogger(BaseMessageLogger):
def __init__(self):
def __init__(self, maxlen=2000):
BaseMessageLogger.__init__(self)
self._raw_entries = collections.deque(maxlen=2000)
self._raw_entries = collections.deque(maxlen=maxlen)
self._filtered_entries: typing.List[AbstractMessageLogEntry] = []
self._paused = False
self.paused = False
self.filter: BaseFilterNode = compile_filter("")
def __iter__(self) -> typing.Iterator[AbstractMessageLogEntry]:
return iter(self._filtered_entries)
def set_filter(self, filter_str: str):
self.filter = compile_filter(filter_str)
self._begin_reset()
@@ -61,25 +82,7 @@ class FilteringMessageLogger(BaseMessageLogger):
self._end_reset()
def set_paused(self, paused: bool):
self._paused = paused
def log_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
if self._paused:
return
self._add_log_entry(LLUDPMessageLogEntry(message, region, session))
def log_http_response(self, flow: HippoHTTPFlow):
if self._paused:
return
# These are huge, let's not log them for now.
if flow.cap_data and flow.cap_data.asset_server_cap:
return
self._add_log_entry(HTTPMessageLogEntry(flow))
def log_eq_event(self, session: Session, region: ProxiedRegion, event: dict):
if self._paused:
return
self._add_log_entry(EQMessageLogEntry(event, region, session))
self.paused = paused
# Hooks that Qt models will want to implement
def _begin_insert(self, insert_idx: int):
@@ -94,25 +97,21 @@ class FilteringMessageLogger(BaseMessageLogger):
def _end_reset(self):
pass
def _add_log_entry(self, entry: AbstractMessageLogEntry):
def add_log_entry(self, entry: AbstractMessageLogEntry):
try:
# Paused, throw it away.
if self._paused:
return
if self.paused:
return False
self._raw_entries.append(entry)
if self.filter.match(entry):
next_idx = len(self._filtered_entries)
self._begin_insert(next_idx)
self._filtered_entries.append(entry)
self._end_insert()
entry.cache_summary()
# In the common case we don't need to keep around the serialization
# caches anymore. If the filter changes, the caches will be repopulated
# as necessary.
entry.freeze()
return True
except Exception:
LOG.exception("Failed to filter queued message")
return False
def clear(self):
self._begin_reset()
@@ -121,7 +120,27 @@ class FilteringMessageLogger(BaseMessageLogger):
self._end_reset()
class AbstractMessageLogEntry:
class WrappingMessageLogger(BaseMessageLogger):
def __init__(self):
self.loggers: typing.List[BaseMessageLogger] = []
@property
def paused(self):
return all(x.paused for x in self.loggers)
def add_log_entry(self, entry: AbstractMessageLogEntry):
logged = False
for logger in self.loggers:
if logger.add_log_entry(entry):
logged = True
# At least one logger ended up keeping the message around, so let's
# cache the summary before we freeze the message.
if logged:
entry.cache_summary()
entry.freeze()
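A minimal wiring sketch for the wrapper; the message_logger assignment assumes the SessionManager attribute shown later in this diff:

wrapper = WrappingMessageLogger()
wrapper.loggers.append(FilteringMessageLogger(maxlen=500))  # e.g. a GUI-backed logger
wrapper.loggers.append(FilteringMessageLogger())            # e.g. an export buffer
session_manager.message_logger = wrapper  # entries freeze once, after all loggers see them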
class AbstractMessageLogEntry(abc.ABC):
region: typing.Optional[ProxiedRegion]
session: typing.Optional[Session]
name: str
@@ -129,7 +148,7 @@ class AbstractMessageLogEntry:
__slots__ = ["_region", "_session", "_region_name", "_agent_id", "_summary", "meta"]
def __init__(self, region, session):
def __init__(self, region: typing.Optional[ProxiedRegion], session: typing.Optional[Session]):
if region and not isinstance(region, weakref.ReferenceType):
region = weakref.ref(region)
if session and not isinstance(session, weakref.ReferenceType):
@@ -159,6 +178,45 @@ class AbstractMessageLogEntry:
"SelectedFull": self._current_selected_full(),
}
def to_dict(self) -> dict:
meta = self.meta.copy()
def _dehydrate_meta_uuid(key: str):
if meta[key]:
meta[key] = str(meta[key])
_dehydrate_meta_uuid("AgentID")
_dehydrate_meta_uuid("SelectedFull")
_dehydrate_meta_uuid("SessionID")
return {
"type": self.type,
"region_name": self.region_name,
"agent_id": str(self.agent_id) if self.agent_id is not None else None,
"summary": self.summary,
"meta": meta,
}
@classmethod
@abc.abstractmethod
def from_dict(cls, val: dict):
pass
def apply_dict(self, val: dict) -> None:
self._region_name = val['region_name']
self._agent_id = UUID(val['agent_id']) if val['agent_id'] else None
self._summary = val['summary']
meta = val['meta'].copy()
def _hydrate_meta_uuid(key: str):
if meta[key]:
meta[key] = UUID(meta[key])
_hydrate_meta_uuid("AgentID")
_hydrate_meta_uuid("SelectedFull")
_hydrate_meta_uuid("SessionID")
self.meta.update(meta)
def freeze(self):
pass
@@ -177,7 +235,7 @@ class AbstractMessageLogEntry:
obj = self.region.objects.lookup_localid(selected_local)
return obj and obj.FullID
def _get_meta(self, name: str):
def _get_meta(self, name: str) -> typing.Any:
# Slight difference in semantics. Filters are meant to return the same
# thing no matter when they're run, so SelectedLocal and friends resolve
# to the selected items _at the time the message was logged_. To handle
@@ -250,7 +308,9 @@ class AbstractMessageLogEntry:
def _val_matches(self, operator, val, expected):
if isinstance(expected, MetaFieldSpecifier):
expected = self._get_meta(str(expected))
if len(expected) != 1:
raise ValueError(f"Can only support single-level Meta specifiers, not {expected!r}")
expected = self._get_meta(str(expected[0]))
if not isinstance(expected, (int, float, bytes, str, type(None), tuple)):
if callable(expected):
expected = expected()
@@ -304,12 +364,18 @@ class AbstractMessageLogEntry:
if matcher.value or matcher.operator:
return False
return self._packet_root_matches(matcher.selector[0])
if len(matcher.selector) == 2 and matcher.selector[0] == "Meta":
return self._val_matches(matcher.operator, self._get_meta(matcher.selector[1]), matcher.value)
if matcher.selector[0] == "Meta":
if len(matcher.selector) == 2:
return self._val_matches(matcher.operator, self._get_meta(matcher.selector[1]), matcher.value)
elif len(matcher.selector) == 3:
meta_dict = self._get_meta(matcher.selector[1])
if not meta_dict or not hasattr(meta_dict, 'get'):
return False
return self._val_matches(matcher.operator, meta_dict.get(matcher.selector[2]), matcher.value)
return None
def matches(self, matcher: "MessageFilterNode"):
return self._base_matches(matcher) or False
def matches(self, matcher: "MessageFilterNode", short_circuit=True) -> "MatchResult":
return MatchResult(self._base_matches(matcher) or False, [])
@property
def seq(self):
@@ -330,6 +396,14 @@ class AbstractMessageLogEntry:
xmlified = re.sub(rb" <key>", b"<key>", xmlified)
return xmlified.decode("utf8", errors="replace")
@staticmethod
def _format_xml(content):
beautified = minidom.parseString(content).toprettyxml(indent=" ")
# kill blank lines. will break cdata sections. meh.
beautified = re.sub(r'\n\s*\n', '\n', beautified, flags=re.MULTILINE)
return re.sub(r'<([\w]+)>\s*</\1>', r'<\1></\1>',
beautified, flags=re.MULTILINE)
class HTTPMessageLogEntry(AbstractMessageLogEntry):
__slots__ = ["flow"]
@@ -342,7 +416,7 @@ class HTTPMessageLogEntry(AbstractMessageLogEntry):
super().__init__(region, session)
# This was a request the proxy made through itself
self.meta["Injected"] = flow.request_injected
self.meta["Synthetic"] = flow.request_injected
@property
def type(self):
@@ -418,13 +492,17 @@ class HTTPMessageLogEntry(AbstractMessageLogEntry):
if not beautified:
content_type = self._guess_content_type(message)
if content_type.startswith("application/llsd"):
beautified = self._format_llsd(llsd.parse(message.content))
try:
beautified = self._format_llsd(llsd.parse(message.content))
except llsd.LLSDParseError:
# Sometimes LL sends plain XML with a Content-Type of application/llsd+xml.
# Try to detect that case and work around it
if content_type == "application/llsd+xml" and message.content.startswith(b'<'):
beautified = self._format_xml(message.content)
else:
raise
elif any(content_type.startswith(x) for x in ("application/xml", "text/xml")):
beautified = minidom.parseString(message.content).toprettyxml(indent=" ")
# kill blank lines. will break cdata sections. meh.
beautified = re.sub(r'\n\s*\n', '\n', beautified, flags=re.MULTILINE)
beautified = re.sub(r'<([\w]+)>\s*</\1>', r'<\1></\1>',
beautified, flags=re.MULTILINE)
beautified = self._format_xml(message.content)
except:
LOG.exception("Failed to beautify message")
@@ -483,6 +561,40 @@ class HTTPMessageLogEntry(AbstractMessageLogEntry):
return "application/xml"
return content_type
def _get_meta(self, name: str) -> typing.Any:
lower_name = name.lower()
if lower_name == "url":
return self.flow.request.url
elif lower_name == "reqheaders":
return self.flow.request.headers
elif lower_name == "respheaders":
return self.flow.response.headers
elif lower_name == "host":
return self.flow.request.host.lower()
elif lower_name == "status":
return self.flow.response.status_code
return super()._get_meta(name)
def to_dict(self):
val = super().to_dict()
val['flow'] = self.flow.get_state()
cap_data = val['flow'].get('metadata', {}).get('cap_data_ser')
if cap_data is not None:
# Have to convert this from a namedtuple to a dict to make
# it importable
cap_dict = cap_data._asdict() # noqa
val['flow']['metadata']['cap_data_ser'] = cap_dict
return val
@classmethod
def from_dict(cls, val: dict):
cap_data = val['flow'].get('metadata', {}).get('cap_data_ser')
if cap_data:
val['flow']['metadata']['cap_data_ser'] = SerializedCapData(**cap_data)
ev = cls(HippoHTTPFlow.from_state(val['flow'], None))
ev.apply_dict(val)
return ev
class EQMessageLogEntry(AbstractMessageLogEntry):
__slots__ = ["event"]
@@ -510,6 +622,17 @@ class EQMessageLogEntry(AbstractMessageLogEntry):
self._summary = llsd.format_notation(self.event["body"]).decode("utf8")[:500]
return self._summary
def to_dict(self) -> dict:
val = super().to_dict()
val['event'] = llsd.format_notation(self.event)
return val
@classmethod
def from_dict(cls, val: dict):
ev = cls(llsd.parse_notation(val['event']), None, None)
ev.apply_dict(val)
return ev
class LLUDPMessageLogEntry(AbstractMessageLogEntry):
__slots__ = ["_message", "_name", "_direction", "_frozen_message", "_seq", "_deserializer"]
@@ -524,7 +647,7 @@ class LLUDPMessageLogEntry(AbstractMessageLogEntry):
super().__init__(region, session)
_MESSAGE_META_ATTRS = {
"Injected", "Dropped", "Extra", "Resent", "Zerocoded", "Acks", "Reliable",
"Synthetic", "Dropped", "Extra", "Resent", "Zerocoded", "Acks", "Reliable",
}
def _get_meta(self, name: str):
@@ -582,20 +705,21 @@ class LLUDPMessageLogEntry(AbstractMessageLogEntry):
def request(self, beautify=False, replacements=None):
return HumanMessageSerializer.to_human_string(self.message, replacements, beautify)
def matches(self, matcher):
def matches(self, matcher, short_circuit=True) -> "MatchResult":
base_matched = self._base_matches(matcher)
if base_matched is not None:
return base_matched
return MatchResult(base_matched, [])
if not self._packet_root_matches(matcher.selector[0]):
return False
return MatchResult(False, [])
message = self.message
selector_len = len(matcher.selector)
# name, block_name, var_name(, subfield_name)?
if selector_len not in (3, 4):
return False
return MatchResult(False, [])
found_field_keys = []
for block_name in message.blocks:
if not fnmatch.fnmatchcase(block_name, matcher.selector[1]):
continue
@@ -604,13 +728,13 @@ class LLUDPMessageLogEntry(AbstractMessageLogEntry):
if not fnmatch.fnmatchcase(var_name, matcher.selector[2]):
continue
# So we know where the match happened
span_key = (message.name, block_name, block_num, var_name)
field_key = (message.name, block_name, block_num, var_name)
if selector_len == 3:
# We're just matching on the var existing, not having any particular value
if matcher.value is None:
return span_key
if self._val_matches(matcher.operator, block[var_name], matcher.value):
return span_key
found_field_keys.append(field_key)
elif self._val_matches(matcher.operator, block[var_name], matcher.value):
found_field_keys.append(field_key)
# Need to invoke a special unpacker
elif selector_len == 4:
try:
@@ -621,15 +745,21 @@ class LLUDPMessageLogEntry(AbstractMessageLogEntry):
if isinstance(deserialized, TaggedUnion):
deserialized = deserialized.value
if not isinstance(deserialized, dict):
return False
continue
for key in deserialized.keys():
if fnmatch.fnmatchcase(str(key), matcher.selector[3]):
if matcher.value is None:
return span_key
if self._val_matches(matcher.operator, deserialized[key], matcher.value):
return span_key
# Short-circuiting the individual subfield checks is fine since
# we only highlight whole fields anyway.
found_field_keys.append(field_key)
break
elif self._val_matches(matcher.operator, deserialized[key], matcher.value):
found_field_keys.append(field_key)
break
return False
if short_circuit and found_field_keys:
return MatchResult(True, found_field_keys)
return MatchResult(bool(found_field_keys), found_field_keys)
@property
def summary(self):
@@ -642,3 +772,30 @@ class LLUDPMessageLogEntry(AbstractMessageLogEntry):
if self._message:
self._seq = self._message.packet_id
return self._seq
def to_dict(self):
val = super().to_dict()
val['message'] = llsd.format_notation(self.message.to_dict(extended=True))
return val
@classmethod
def from_dict(cls, val: dict):
ev = cls(Message.from_dict(llsd.parse_notation(val['message'])), None, None)
ev.apply_dict(val)
return ev
def export_log_entries(entries: typing.Iterable[AbstractMessageLogEntry]) -> bytes:
return gzip.compress(repr([e.to_dict() for e in entries]).encode("utf8"))
_TYPE_CLASSES = {
"HTTP": HTTPMessageLogEntry,
"LLUDP": LLUDPMessageLogEntry,
"EQ": EQMessageLogEntry,
}
def import_log_entries(data: bytes) -> typing.List[AbstractMessageLogEntry]:
entries = ast.literal_eval(gzip.decompress(data).decode("utf8"))
return [_TYPE_CLASSES[e['type']].from_dict(e) for e in entries]
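Round-trip sketch for the export/import helpers above, assuming a populated FilteringMessageLogger (iterating it yields the filtered entries); the file name is made up:

blob = export_log_entries(message_logger)
with open("session.hippolog", "wb") as f:
    f.write(blob)
entries = import_log_entries(blob)  # back to typed HTTP / LLUDP / EQ log entry objects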

View File

@@ -17,8 +17,8 @@ if TYPE_CHECKING:
class ProxyNameCache(NameCache):
def create_subscriptions(
self,
message_handler: MessageHandler[Message],
http_message_handler: Optional[MessageHandler[HippoHTTPFlow]] = None,
message_handler: MessageHandler[Message, str],
http_message_handler: Optional[MessageHandler[HippoHTTPFlow, str]] = None,
):
super().create_subscriptions(message_handler)
if http_message_handler is not None:
@@ -32,6 +32,9 @@ class ProxyNameCache(NameCache):
with open(namecache_file, "rb") as f:
namecache_bytes = f.read()
agents = llsd.parse_xml(namecache_bytes)["agents"]
# Can be `None` if the file was just created
if not agents:
continue
for agent_id, agent_data in agents.items():
# Don't set display name if they just have the default
display_name = None

View File

@@ -1,18 +1,23 @@
from __future__ import annotations
import asyncio
import logging
from typing import *
from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.templates import PCode
from hippolyzer.lib.client.namecache import NameCache
from hippolyzer.lib.client.object_manager import (
ClientObjectManager,
UpdateType, ClientWorldObjectManager,
ObjectUpdateType, ClientWorldObjectManager,
)
from hippolyzer.lib.base.objects import Object
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.settings import ProxySettings
from hippolyzer.lib.proxy.vocache import RegionViewerObjectCacheChain
if TYPE_CHECKING:
@@ -31,50 +36,115 @@ class ProxyObjectManager(ClientObjectManager):
def __init__(
self,
region: ProxiedRegion,
use_vo_cache: bool = False
may_use_vo_cache: bool = False
):
super().__init__(region)
self.use_vo_cache = use_vo_cache
self.may_use_vo_cache = may_use_vo_cache
self.cache_loaded = False
self.object_cache = RegionViewerObjectCacheChain([])
self._cache_miss_timer: Optional[asyncio.TimerHandle] = None
self.queued_cache_misses: Set[int] = set()
region.message_handler.subscribe(
"RequestMultipleObjects",
self._handle_request_multiple_objects,
)
def load_cache(self):
if not self.use_vo_cache or self.cache_loaded:
if not self.may_use_vo_cache or self.cache_loaded:
return
handle = self._region.handle
if not handle:
LOG.warning(f"Tried to load cache for {self._region} without a handle")
return
self.cache_loaded = True
self.object_cache = RegionViewerObjectCacheChain.for_region(handle, self._region.cache_id)
self.object_cache = RegionViewerObjectCacheChain.for_region(
handle=handle,
cache_id=self._region.cache_id,
cache_dir=self._region.session().cache_dir,
)
def request_missed_cached_objects_soon(self, report_only=False):
if self._cache_miss_timer:
self._cache_miss_timer.cancel()
# Basically debounce. Will only trigger 0.2 seconds after the last time it's invoked to
# deal with the initial flood of ObjectUpdateCached and the natural lag time between that
# and the viewer's RequestMultipleObjects messages.
loop = asyncio.get_event_loop_policy().get_event_loop()
self._cache_miss_timer = loop.call_later(0.2, self._request_missed_cached_objects, report_only)
def _request_missed_cached_objects(self, report_only: bool):
self._cache_miss_timer = None
if not self.queued_cache_misses:
# All the queued cache misses ended up being satisfied without us
having to request them, so there's no need to fire off a request.
return
if report_only:
print(f"Would have automatically requested {self.queued_cache_misses!r}")
else:
self.request_objects(self.queued_cache_misses)
self.queued_cache_misses.clear()
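The requester above is a plain call_later() debounce; the same pattern in isolation, with hypothetical names:

import asyncio
from typing import Callable, Optional

class Debounced:
    """Run a callback once, `delay` seconds after the last trigger() call."""
    def __init__(self, callback: Callable[[], None], delay: float = 0.2):
        self._callback = callback
        self._delay = delay
        self._handle: Optional[asyncio.TimerHandle] = None

    def trigger(self):
        if self._handle:
            self._handle.cancel()  # restart the countdown on every call
        self._handle = asyncio.get_event_loop().call_later(self._delay, self._fire)

    def _fire(self):
        self._handle = None
        self._callback()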
def clear(self):
super().clear()
self.object_cache = RegionViewerObjectCacheChain([])
self.cache_loaded = False
self.queued_cache_misses.clear()
if self._cache_miss_timer:
self._cache_miss_timer.cancel()
self._cache_miss_timer = None
def _is_localid_selected(self, localid: int):
return localid in self._region.session().selected.object_locals
def _handle_request_multiple_objects(self, msg: Message):
# Remove any queued cache misses that the viewer just requested for itself
self.queued_cache_misses -= {b["ID"] for b in msg["ObjectData"]}
class ProxyWorldObjectManager(ClientWorldObjectManager):
_session: Session
_settings: ProxySettings
def __init__(self, session: Session, name_cache: Optional[NameCache]):
super().__init__(session, name_cache)
def __init__(self, session: Session, settings: ProxySettings, name_cache: Optional[NameCache]):
super().__init__(session, settings, name_cache)
session.http_message_handler.subscribe(
"GetObjectCost",
self._handle_get_object_cost
)
session.http_message_handler.subscribe(
"FirestormBridge",
self._handle_firestorm_bridge_request,
)
def _handle_object_update_cached_misses(self, region_handle: int, local_ids: Set[int]):
# Don't do anything automatically. People have to manually ask for
# missed objects to be fetched.
pass
def _handle_object_update_cached_misses(self, region_handle: int, missing_locals: Set[int]):
region_mgr: Optional[ProxyObjectManager] = self._get_region_manager(region_handle)
if not self._settings.ALLOW_AUTO_REQUEST_OBJECTS:
if self._settings.USE_VIEWER_OBJECT_CACHE:
region_mgr.queued_cache_misses |= missing_locals
region_mgr.request_missed_cached_objects_soon(report_only=True)
elif self._settings.AUTOMATICALLY_REQUEST_MISSING_OBJECTS:
# Schedule these local IDs to be requested soon if the viewer doesn't request
# them itself. Ideally we could just mutate the CRC of the ObjectUpdateCached
# to force a CRC cache miss in the viewer, but that appears to cause the viewer
# to drop the resulting ObjectUpdateCompressed when the CRC doesn't match?
# It was causing all objects to go missing even though the ObjectUpdateCompressed
# was received.
region_mgr: Optional[ProxyObjectManager] = self._get_region_manager(region_handle)
region_mgr.queued_cache_misses |= missing_locals
region_mgr.request_missed_cached_objects_soon()
def _run_object_update_hooks(self, obj: Object, updated_props: Set[str], update_type: UpdateType):
def _run_object_update_hooks(self, obj: Object, updated_props: Set[str], update_type: ObjectUpdateType):
super()._run_object_update_hooks(obj, updated_props, update_type)
region = self._session.region_by_handle(obj.RegionHandle)
if self._settings.ALLOW_AUTO_REQUEST_OBJECTS:
if obj.PCode == PCode.AVATAR and "ParentID" in updated_props:
if obj.ParentID and not region.objects.lookup_localid(obj.ParentID):
# If an avatar just sat on an object we don't know about, add it to the queued
# cache misses and request it if the viewer doesn't. This should happen
# regardless of the auto-request missing objects setting because otherwise we
# have no way to get a sitting agent's true region location, even if it's ourselves.
region.objects.queued_cache_misses.add(obj.ParentID)
region.objects.request_missed_cached_objects_soon()
AddonManager.handle_object_updated(self._session, region, obj, updated_props)
def _run_kill_object_hooks(self, obj: Object):
@@ -82,10 +152,34 @@ class ProxyWorldObjectManager(ClientWorldObjectManager):
region = self._session.region_by_handle(obj.RegionHandle)
AddonManager.handle_object_killed(self._session, region, obj)
def _lookup_cache_entry(self, handle: int, local_id: int, crc: int) -> Optional[bytes]:
region_mgr: Optional[ProxyObjectManager] = self._get_region_manager(handle)
def _lookup_cache_entry(self, region_handle: int, local_id: int, crc: int) -> Optional[bytes]:
region_mgr: Optional[ProxyObjectManager] = self._get_region_manager(region_handle)
return region_mgr.object_cache.lookup_object_data(local_id, crc)
def _handle_get_object_cost(self, flow: HippoHTTPFlow):
parsed = llsd.parse_xml(flow.response.content)
self._process_get_object_cost_response(parsed)
def _handle_firestorm_bridge_request(self, flow: HippoHTTPFlow):
"""
Pull guessed avatar Z offsets from Firestorm Bridge requests
CoarseLocationUpdate packets can only represent heights up to 1024, so
viewers typically use an LSL bridge to get avatar heights beyond that range
and combine them with the X and Y coords from CoarseLocationUpdate packets.
"""
if not flow.request.content.startswith(b'<llsd><string>getZOffsets|'):
return
parsed: str = llsd.parse_xml(flow.response.content)
if not parsed:
return
# av_1_id, 1025.001, av_2_id, 3000.0, ...
split = parsed.split(", ")
for av_id, z_offset in zip(split[0::2], split[1::2]):
av_id = UUID(av_id)
z_offset = float(z_offset)
av = self.lookup_avatar(av_id)
if not av:
continue
av.GuessedZ = z_offset
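For reference, the bridge reply handled above is a flat comma-separated string; a standalone parse of a made-up payload:

from hippolyzer.lib.base.datatypes import UUID

reply = ("12345678-1234-1234-1234-123456789abc, 1025.001, "
         "87654321-4321-4321-4321-cba987654321, 3000.0")
parts = reply.split(", ")
offsets = {UUID(a): float(z) for a, z in zip(parts[0::2], parts[1::2])}
# -> two avatar IDs mapped to guessed Z heights above the 1024 limit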

View File

@@ -1,6 +1,5 @@
from __future__ import annotations
import enum
import logging
import hashlib
import uuid
@@ -12,12 +11,14 @@ import multidict
from hippolyzer.lib.base.datatypes import Vector3, UUID
from hippolyzer.lib.base.helpers import proxify
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.message.llsd_msg_serializer import LLSDMessageSerializer
from hippolyzer.lib.base.message.message import Message, Block
from hippolyzer.lib.base.message.message_handler import MessageHandler
from hippolyzer.lib.base.objects import handle_to_global_pos
from hippolyzer.lib.client.state import BaseClientRegion
from hippolyzer.lib.proxy.caps_client import ProxyCapsClient
from hippolyzer.lib.proxy.circuit import ProxiedCircuit
from hippolyzer.lib.proxy.caps import CapType
from hippolyzer.lib.proxy.object_manager import ProxyObjectManager
from hippolyzer.lib.base.transfer_manager import TransferManager
from hippolyzer.lib.base.xfer_manager import XferManager
@@ -27,13 +28,6 @@ if TYPE_CHECKING:
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
class CapType(enum.Enum):
NORMAL = enum.auto()
TEMPORARY = enum.auto()
WRAPPER = enum.auto()
PROXY_ONLY = enum.auto()
class CapsMultiDict(multidict.MultiDict[Tuple[CapType, str]]):
# TODO: Make a view object for this that's just name -> URL
# deriving from MultiMapping[_T] so we don't have to do
@@ -49,7 +43,7 @@ class CapsMultiDict(multidict.MultiDict[Tuple[CapType, str]]):
class ProxiedRegion(BaseClientRegion):
def __init__(self, circuit_addr, seed_cap: str, session, handle=None):
def __init__(self, circuit_addr, seed_cap: str, session: Session, handle=None):
# A client may make a Seed request twice, and may get back two (valid!) sets of
# Cap URIs. We need to be able to look up both, so MultiDict is necessary.
self.handle: Optional[int] = handle
@@ -58,16 +52,18 @@ class ProxiedRegion(BaseClientRegion):
self.cache_id: Optional[UUID] = None
self.circuit: Optional[ProxiedCircuit] = None
self.circuit_addr = circuit_addr
self._caps = CapsMultiDict()
self.caps = CapsMultiDict()
# Reverse lookup for URL -> cap data
self._caps_url_lookup: Dict[str, Tuple[CapType, str]] = {}
if seed_cap:
self._caps["Seed"] = (CapType.NORMAL, seed_cap)
self.caps["Seed"] = (CapType.NORMAL, seed_cap)
self.session: Callable[[], Session] = weakref.ref(session)
self.message_handler: MessageHandler[Message] = MessageHandler()
self.http_message_handler: MessageHandler[HippoHTTPFlow] = MessageHandler()
self.message_handler: MessageHandler[Message, str] = MessageHandler()
self.http_message_handler: MessageHandler[HippoHTTPFlow, str] = MessageHandler()
self.eq_manager = EventQueueManager(self)
self.caps_client = ProxyCapsClient(proxify(self))
self.objects: ProxyObjectManager = ProxyObjectManager(self, use_vo_cache=True)
settings = session.session_manager.settings
self.caps_client = ProxyCapsClient(settings, proxify(self))
self.objects: ProxyObjectManager = ProxyObjectManager(self, may_use_vo_cache=True)
self.xfer_manager = XferManager(proxify(self), self.session().secure_session_id)
self.transfer_manager = TransferManager(proxify(self), session.agent_id, session.id)
self._recalc_caps()
@@ -83,8 +79,8 @@ class ProxiedRegion(BaseClientRegion):
self._name = val
@property
def caps(self):
return multidict.MultiDict((x, y[1]) for x, y in self._caps.items())
def cap_urls(self) -> multidict.MultiDict[str, str]:
return multidict.MultiDict((x, y[1]) for x, y in self.caps.items())
@property
def global_pos(self) -> Vector3:
@@ -101,12 +97,12 @@ class ProxiedRegion(BaseClientRegion):
def update_caps(self, caps: Mapping[str, str]):
for cap_name, cap_url in caps.items():
if isinstance(cap_url, str) and cap_url.startswith('http'):
self._caps.add(cap_name, (CapType.NORMAL, cap_url))
self.caps.add(cap_name, (CapType.NORMAL, cap_url))
self._recalc_caps()
def _recalc_caps(self):
self._caps_url_lookup.clear()
for name, cap_info in self._caps.items():
for name, cap_info in self.caps.items():
cap_type, cap_url = cap_info
self._caps_url_lookup[cap_url] = (cap_type, name)
@@ -115,32 +111,35 @@ class ProxiedRegion(BaseClientRegion):
Wrap an existing, non-unique cap with a unique URL
caps like ViewerAsset may be the same globally and wouldn't let us infer
which session / region the request was related to without a wrapper
which session / region the request was related to without a wrapper URL
that we inject into the seed response sent to the viewer.
"""
parsed = list(urllib.parse.urlsplit(self._caps[name][1]))
seed_id = self._caps["Seed"][1].split("/")[-1].encode("utf8")
parsed = list(urllib.parse.urlsplit(self.caps[name][1]))
seed_id = self.caps["Seed"][1].split("/")[-1].encode("utf8")
# Give it a unique domain tied to the current Seed URI
parsed[1] = f"{name}-{hashlib.sha256(seed_id).hexdigest()[:16]}.hippo-proxy.localhost"
parsed[1] = f"{name.lower()}-{hashlib.sha256(seed_id).hexdigest()[:16]}.hippo-proxy.localhost"
# Force the URL to HTTP, we're going to handle the request ourselves so it doesn't need
# to be secure. This should save on expensive TLS context setup for each req.
parsed[0] = "http"
wrapper_url = urllib.parse.urlunsplit(parsed)
self._caps.add(name + "ProxyWrapper", (CapType.WRAPPER, wrapper_url))
self._recalc_caps()
# Register it with "ProxyWrapper" appended so we don't shadow the real cap URL
# in our own view of the caps
self.register_cap(name + "ProxyWrapper", wrapper_url, CapType.WRAPPER)
return wrapper_url
def register_proxy_cap(self, name: str):
"""
Register a cap to be completely handled by the proxy
"""
cap_url = f"https://caps.hippo-proxy.localhost/cap/{uuid.uuid4()!s}"
self._caps.add(name, (CapType.PROXY_ONLY, cap_url))
self._recalc_caps()
"""Register a cap to be completely handled by the proxy"""
if name in self.caps:
# If we have an existing cap then we should just use that.
cap_data = self.caps[name]
if cap_data[0] == CapType.PROXY_ONLY:
return cap_data[1]
cap_url = f"http://{uuid.uuid4()!s}.caps.hippo-proxy.localhost"
self.register_cap(name, cap_url, CapType.PROXY_ONLY)
return cap_url
def register_temporary_cap(self, name: str, cap_url: str):
"""Register a Cap that only has meaning the first time it's used"""
self._caps.add(name, (CapType.TEMPORARY, cap_url))
def register_cap(self, name: str, cap_url: str, cap_type: CapType = CapType.NORMAL):
self.caps.add(name, (cap_type, cap_url))
self._recalc_caps()
def resolve_cap(self, url: str, consume=True) -> Optional[Tuple[str, str, CapType]]:
@@ -149,9 +148,9 @@ class ProxiedRegion(BaseClientRegion):
cap_type, name = self._caps_url_lookup[cap_url]
if cap_type == CapType.TEMPORARY and consume:
# Resolving a temporary cap pops it out of the dict
temporary_caps = self._caps.popall(name)
temporary_caps = self.caps.popall(name)
temporary_caps.remove((cap_type, cap_url))
self._caps.extend((name, x) for x in temporary_caps)
self.caps.extend((name, x) for x in temporary_caps)
self._recalc_caps()
return name, cap_url, cap_type
return None
@@ -161,6 +160,7 @@ class ProxiedRegion(BaseClientRegion):
if self.circuit:
self.circuit.is_alive = False
self.objects.clear()
self.eq_manager.clear()
def __repr__(self):
return "<%s %s>" % (self.__class__.__name__, self.name)
@@ -171,11 +171,44 @@ class EventQueueManager:
# TODO: Per-EQ InjectionTracker so we can inject fake responses on 499
self._queued_events = []
self._region = weakref.proxy(region)
self._last_ack: Optional[int] = None
self._last_payload: Optional[Any] = None
self.llsd_message_serializer = LLSDMessageSerializer()
def queue_event(self, event: dict):
def inject_message(self, message: Message):
self.inject_event(self.llsd_message_serializer.serialize(message, True))
def inject_event(self, event: dict):
self._queued_events.append(event)
if self._region:
circuit: ProxiedCircuit = self._region.circuit
session: Session = self._region.session()
# Inject an outbound PlacesQuery message so we can trigger an inbound PlacesReply
# over the EQ. That will allow us to shove our own event onto the response once it comes in,
# otherwise we have to wait until the EQ legitimately returns 200 due to a new event.
# May or may not work in OpenSim.
circuit.send_message(Message(
'PlacesQuery',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id, QueryID=UUID()),
Block('TransactionData', TransactionID=UUID()),
Block('QueryData', QueryText=b'', QueryFlags=64, Category=-1, SimName=b''),
))
def take_events(self):
def take_injected_events(self):
events = self._queued_events
self._queued_events = []
return events
def cache_last_poll_response(self, req_ack: int, payload: Any):
self._last_ack = req_ack
self._last_payload = payload
def get_cached_poll_response(self, req_ack: Optional[int]) -> Optional[Any]:
if self._last_ack == req_ack:
return self._last_payload
return None
def clear(self):
self._queued_events.clear()
self._last_ack = None
self._last_payload = None
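Usage sketch for the injection API above; the event name and body are made up, but follow the EQ's message/body shape:

region.eq_manager.inject_event({
    "message": "ExampleEvent",     # hypothetical event name
    "body": {"example_field": 1},
})
# Or inject a template Message and let the LLSD serializer build the event:
region.eq_manager.inject_message(some_msg)  # some_msg: a Message instance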

View File

@@ -13,12 +13,15 @@ from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.message.message_handler import MessageHandler
from hippolyzer.lib.client.state import BaseClientSession
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.circuit import ProxiedCircuit
from hippolyzer.lib.proxy.http_asset_repo import HTTPAssetRepo
from hippolyzer.lib.proxy.http_proxy import HTTPFlowContext, is_asset_server_cap_name, SerializedCapData
from hippolyzer.lib.proxy.http_proxy import HTTPFlowContext
from hippolyzer.lib.proxy.caps import is_asset_server_cap_name, CapData, CapType
from hippolyzer.lib.proxy.namecache import ProxyNameCache
from hippolyzer.lib.proxy.object_manager import ProxyWorldObjectManager
from hippolyzer.lib.proxy.region import ProxiedRegion, CapType
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.settings import ProxySettings
if TYPE_CHECKING:
from hippolyzer.lib.proxy.message_logger import BaseMessageLogger
@@ -27,7 +30,7 @@ if TYPE_CHECKING:
class Session(BaseClientSession):
def __init__(self, session_id, secure_session_id, agent_id, circuit_code,
login_data=None, session_manager: Optional[SessionManager] = None):
session_manager: Optional[SessionManager], login_data=None):
self.login_data = login_data or {}
self.pending = True
self.id: UUID = session_id
@@ -41,9 +44,11 @@ class Session(BaseClientSession):
self.selected: SelectionModel = SelectionModel()
self.regions: List[ProxiedRegion] = []
self.started_at = datetime.datetime.now()
self.message_handler: MessageHandler[Message] = MessageHandler()
self.http_message_handler: MessageHandler[HippoHTTPFlow] = MessageHandler()
self.objects = ProxyWorldObjectManager(self, session_manager.name_cache)
self.message_handler: MessageHandler[Message, str] = MessageHandler()
self.http_message_handler: MessageHandler[HippoHTTPFlow, str] = MessageHandler()
self.objects = ProxyWorldObjectManager(self, session_manager.settings, session_manager.name_cache)
# Base path of a newview type cache directory for this session
self.cache_dir: Optional[str] = None
self._main_region = None
@property
@@ -59,8 +64,8 @@ class Session(BaseClientSession):
secure_session_id=UUID(login_data["secure_session_id"]),
agent_id=UUID(login_data["agent_id"]),
circuit_code=int(login_data["circuit_code"]),
login_data=login_data,
session_manager=session_manager,
login_data=login_data,
)
appearance_service = login_data.get("agent_appearance_service")
map_image_service = login_data.get("map-server-url")
@@ -94,12 +99,12 @@ class Session(BaseClientSession):
for region in self.regions:
if region.circuit_addr == circuit_addr:
if seed_url and region.caps.get("Seed") != seed_url:
if seed_url and region.cap_urls.get("Seed") != seed_url:
region.update_caps({"Seed": seed_url})
if handle:
region.handle = handle
return region
if seed_url and region.caps.get("Seed") == seed_url:
if seed_url and region.cap_urls.get("Seed") == seed_url:
return region
if not circuit_addr:
@@ -108,6 +113,7 @@ class Session(BaseClientSession):
logging.info("Registering region for %r" % (circuit_addr,))
region = ProxiedRegion(circuit_addr, seed_url, self, handle=handle)
self.regions.append(region)
AddonManager.handle_region_registered(self, region)
return region
def region_by_circuit_addr(self, circuit_addr) -> Optional[ProxiedRegion]:
@@ -135,6 +141,7 @@ class Session(BaseClientSession):
)
region.circuit = ProxiedCircuit(
near_addr, circuit_addr, transport, logging_hook=logging_hook)
AddonManager.handle_circuit_created(self, region)
return True
if region.circuit and region.circuit.is_alive:
# Whatever, already open
@@ -160,7 +167,7 @@ class Session(BaseClientSession):
return CapData(cap_name, ref(region), ref(self), base_url, cap_type)
return None
def tid_to_assetid(self, transaction_id: UUID):
def transaction_to_assetid(self, transaction_id: UUID):
return UUID.combine(transaction_id, self.secure_session_id)
def __repr__(self):
@@ -168,7 +175,8 @@ class Session(BaseClientSession):
class SessionManager:
def __init__(self):
def __init__(self, settings: ProxySettings):
self.settings: ProxySettings = settings
self.sessions: List[Session] = []
self.shutdown_signal = multiprocessing.Event()
self.flow_context = HTTPFlowContext()
@@ -176,7 +184,6 @@ class SessionManager:
self.message_logger: Optional[BaseMessageLogger] = None
self.addon_ctx: Dict[str, Any] = {}
self.name_cache = ProxyNameCache()
self.use_viewer_object_cache: bool = False
def create_session(self, login_data) -> Session:
session = Session.from_login_data(login_data, self)
@@ -208,50 +215,6 @@ class SessionManager:
return cap_data
return CapData()
def deserialize_cap_data(self, ser_cap_data: "SerializedCapData") -> "CapData":
cap_session = None
cap_region = None
if ser_cap_data.session_id:
for session in self.sessions:
if ser_cap_data.session_id == str(session.id):
cap_session = session
if cap_session and ser_cap_data.region_addr:
for region in cap_session.regions:
if ser_cap_data.region_addr == str(region.circuit_addr):
cap_region = region
return CapData(
cap_name=ser_cap_data.cap_name,
region=ref(cap_region) if cap_region else None,
session=ref(cap_session) if cap_session else None,
base_url=ser_cap_data.base_url,
type=CapType[ser_cap_data.type],
)
class CapData(NamedTuple):
cap_name: Optional[str] = None
# Actually they're weakrefs but the type sigs suck.
region: Optional[Callable[[], Optional[ProxiedRegion]]] = None
session: Optional[Callable[[], Optional[Session]]] = None
base_url: Optional[str] = None
type: CapType = CapType.NORMAL
def __bool__(self):
return bool(self.cap_name or self.session)
def serialize(self) -> "SerializedCapData":
return SerializedCapData(
cap_name=self.cap_name,
region_addr=str(self.region().circuit_addr) if self.region and self.region() else None,
session_id=str(self.session().id) if self.session and self.session() else None,
base_url=self.base_url,
type=self.type.name,
)
@property
def asset_server_cap(self) -> bool:
return is_asset_server_cap_name(self.cap_name)
@dataclasses.dataclass
class SelectionModel:

View File

@@ -0,0 +1,36 @@
import os
from typing import *
from hippolyzer.lib.base.settings import Settings, SettingDescriptor
_T = TypeVar("_T")
class EnvSettingDescriptor(SettingDescriptor):
"""A setting that prefers to pull its value from the environment"""
__slots__ = ("_env_name", "_env_callable")
def __init__(self, default: Union[Callable[[], _T], _T], env_name: str, spec: Callable[[str], _T]):
super().__init__(default)
self._env_name = env_name
self._env_callable = spec
def __get__(self, obj, owner=None) -> _T:
val = os.getenv(self._env_name)
if val is not None:
return self._env_callable(val)
return super().__get__(obj, owner)
class ProxySettings(Settings):
SOCKS_PROXY_PORT: int = EnvSettingDescriptor(9061, "HIPPO_UDP_PORT", int)
HTTP_PROXY_PORT: int = EnvSettingDescriptor(9062, "HIPPO_HTTP_PORT", int)
PROXY_BIND_ADDR: str = EnvSettingDescriptor("127.0.0.1", "HIPPO_BIND_HOST", str)
REMOTELY_ACCESSIBLE: bool = SettingDescriptor(False)
USE_VIEWER_OBJECT_CACHE: bool = SettingDescriptor(False)
# Whether having the proxy do automatic internal object requests is allowed at all
ALLOW_AUTO_REQUEST_OBJECTS: bool = SettingDescriptor(True)
# Whether the proxy should automatically request missing objects if the viewer doesn't request them itself.
AUTOMATICALLY_REQUEST_MISSING_OBJECTS: bool = SettingDescriptor(False)
ADDON_SCRIPTS: List[str] = SettingDescriptor(list)
FILTERS: Dict[str, str] = SettingDescriptor(dict)
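A quick sketch of the descriptor's precedence (environment beats the declared default):

import os

os.environ["HIPPO_HTTP_PORT"] = "9999"
settings = ProxySettings()
assert settings.HTTP_PROXY_PORT == 9999  # pulled from HIPPO_HTTP_PORT
del os.environ["HIPPO_HTTP_PORT"]
assert settings.HTTP_PROXY_PORT == 9062  # falls back to the declared default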

View File

@@ -207,12 +207,12 @@ class UDPProxyProtocol(asyncio.DatagramProtocol):
)
try:
self._handle_proxied_packet(src_packet)
self.handle_proxied_packet(src_packet)
except:
logging.exception("Barfed while handling UDP packet!")
raise
def _handle_proxied_packet(self, packet):
def handle_proxied_packet(self, packet):
self.transport.send_packet(packet)
def close(self):

View File

@@ -63,8 +63,14 @@ class TaskScheduler:
def shutdown(self):
for task_data, task in self.tasks:
task.cancel()
await_all = asyncio.gather(*(task for task_data, task in self.tasks))
asyncio.get_event_loop().run_until_complete(await_all)
try:
event_loop = asyncio.get_running_loop()
await_all = asyncio.gather(*(task for task_data, task in self.tasks))
event_loop.run_until_complete(await_all)
except RuntimeError:
pass
self.tasks.clear()
def _task_done(self, task: asyncio.Task):
for task_details in reversed(self.tasks):

View File

@@ -0,0 +1,83 @@
import asyncio
import unittest
from typing import Any, Optional, List, Tuple
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.message.udpserializer import UDPMessageSerializer
from hippolyzer.lib.base.network.transport import UDPPacket, AbstractUDPTransport, ADDR_TUPLE
from hippolyzer.lib.proxy.lludp_proxy import InterceptingLLUDPProxyProtocol
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import SessionManager
from hippolyzer.lib.proxy.settings import ProxySettings
from hippolyzer.lib.proxy.transport import SOCKS5UDPTransport
class BaseProxyTest(unittest.IsolatedAsyncioTestCase):
async def asyncSetUp(self) -> None:
self.client_addr = ("127.0.0.1", 1)
self.region_addr = ("127.0.0.1", 3)
self.circuit_code = 1234
self.session_manager = SessionManager(ProxySettings())
self.session = self.session_manager.create_session({
"session_id": UUID.random(),
"secure_session_id": UUID.random(),
"agent_id": UUID.random(),
"circuit_code": self.circuit_code,
"sim_ip": self.region_addr[0],
"sim_port": self.region_addr[1],
"region_x": 0,
"region_y": 123,
"seed_capability": "https://test.localhost:4/foo",
})
self.transport = MockTransport()
self.protocol = InterceptingLLUDPProxyProtocol(
self.client_addr, self.session_manager)
self.protocol.transport = self.transport
self.serializer = UDPMessageSerializer()
self.session.objects.track_region_objects(123)
def tearDown(self) -> None:
self.protocol.close()
async def _wait_drained(self):
await asyncio.sleep(0.001)
def _setup_default_circuit(self):
self._setup_region_circuit(self.session.regions[-1])
self.session.main_region = self.session.regions[-1]
def _setup_region_circuit(self, region: ProxiedRegion):
# Not going to send a UseCircuitCode, so have to pretend we already did the
# client -> region NAT hole-punching
self.protocol.session = self.session
self.protocol.far_to_near_map[region.circuit_addr] = self.client_addr
self.session_manager.claim_session(self.session.id)
self.session.open_circuit(self.client_addr, region.circuit_addr,
self.protocol.transport)
def _msg_to_packet(self, msg: Message, src, dst) -> UDPPacket:
return UDPPacket(src_addr=src, dst_addr=dst, data=self.serializer.serialize(msg),
direction=msg.direction)
def _msg_to_datagram(self, msg: Message, src, dst, socks_header=True):
packet = self._msg_to_packet(msg, src, dst)
return SOCKS5UDPTransport.serialize(packet, force_socks_header=socks_header)
class MockTransport(AbstractUDPTransport):
def sendto(self, data: Any, addr: Optional[ADDR_TUPLE] = ...) -> None:
pass
def abort(self) -> None:
pass
def close(self) -> None:
pass
def __init__(self):
super().__init__()
self.packets: List[Tuple[bytes, Tuple[str, int]]] = []
def send_packet(self, packet: UDPPacket) -> None:
self.packets.append((packet.data, packet.dst_addr))
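A sketch of a test built on this harness; PacketAck is used as a trivially valid message, and the Block import is assumed:

from hippolyzer.lib.base.message.message import Block, Message

class TestForwarding(BaseProxyTest):
    async def test_client_packet_is_forwarded(self):
        self._setup_default_circuit()
        msg = Message("PacketAck", Block("Packets", ID=1))
        datagram = self._msg_to_datagram(msg, src=self.client_addr, dst=self.region_addr)
        self.protocol.datagram_received(datagram, self.client_addr)
        await self._wait_drained()
        self.assertTrue(self.transport.packets)  # the proxy re-sent it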

View File

@@ -1,10 +1,10 @@
import socket
import struct
from hippolyzer.lib.base.network.transport import WrappingUDPTransport, UDPPacket
from hippolyzer.lib.base.network.transport import SocketUDPTransport, UDPPacket
class SOCKS5UDPTransport(WrappingUDPTransport):
class SOCKS5UDPTransport(SocketUDPTransport):
HEADER_STRUCT = struct.Struct("!HBB4sH")
@classmethod

View File

@@ -58,6 +58,7 @@ from __future__ import annotations
import io
import logging
import pathlib
from pathlib import Path
from typing import *
@@ -82,6 +83,7 @@ class ViewerObjectCache:
@classmethod
def from_path(cls, base_path: Union[str, Path]):
base_path = pathlib.Path(base_path)
cache = cls(base_path)
with open(cache.base_path / "object.cache", "rb") as fh:
reader = se.BufferReader("<", fh.read())
@@ -143,6 +145,10 @@ class ViewerObjectCacheEntry(recordclass.datatuple): # type: ignore
data: bytes
def is_valid_vocache_dir(cache_dir):
return (pathlib.Path(cache_dir) / "objectcache" / "object.cache").exists()
class RegionViewerObjectCache:
"""Parser and container for .slc files"""
def __init__(self, cache_id: UUID, entries: List[ViewerObjectCacheEntry]):
@@ -201,7 +207,7 @@ class RegionViewerObjectCacheChain:
return None
@classmethod
def for_region(cls, handle: int, cache_id: UUID):
def for_region(cls, handle: int, cache_id: UUID, cache_dir: Optional[str] = None):
"""
Get a cache chain for a specific region, called on region connection
@@ -209,10 +215,17 @@ class RegionViewerObjectCacheChain:
so we have to try every region object cache file for every viewer installed.
"""
caches = []
for cache_dir in iter_viewer_cache_dirs():
if not (cache_dir / "objectcache" / "object.cache").exists():
if cache_dir is None:
cache_dirs = iter_viewer_cache_dirs()
else:
cache_dirs = [pathlib.Path(cache_dir)]
for cache_dir in cache_dirs:
if not is_valid_vocache_dir(cache_dir):
continue
caches.append(ViewerObjectCache.from_path(cache_dir / "objectcache"))
cache = ViewerObjectCache.from_path(cache_dir / "objectcache")
if cache:
caches.append(cache)
regions = []
for cache in caches:
region = cache.read_region(handle)

View File

@@ -0,0 +1,42 @@
import abc
from mitmproxy.addons import asgiapp
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session, SessionManager
async def serve(app, flow: HippoHTTPFlow):
"""Serve a request based on a Hippolyzer HTTP flow using a provided app"""
await asgiapp.serve(app, flow.flow)
# Send the modified flow object back to mitmproxy
flow.resume()
class WebAppCapAddon(BaseAddon, abc.ABC):
"""
Addon that provides a cap via an ASGI webapp
Handles all registration of the cap URL and routing of the request.
"""
CAP_NAME: str
APP: any
def handle_region_registered(self, session: Session, region: ProxiedRegion):
# Register a fake URL for our cap. This will add the cap URL to the Seed
# response that gets sent back to the client if that cap name was requested.
region.register_proxy_cap(self.CAP_NAME)
def handle_session_init(self, session: Session):
for region in session.regions:
region.register_proxy_cap(self.CAP_NAME)
def handle_http_request(self, session_manager: SessionManager, flow: HippoHTTPFlow):
if flow.cap_data.cap_name != self.CAP_NAME:
return
# This request may take a while to generate a response for, so take it out of the normal
# HTTP handling flow and handle it in an async task.
# TODO: Make all HTTP handling hooks async so this isn't necessary
self._schedule_task(serve(self.APP, flow.take()))
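A minimal sketch of an addon built on this class; the cap name is made up, and a raw ASGI callable avoids assuming any web framework:

async def hello_app(scope, receive, send):
    # Tiny ASGI app: 200 text/plain for anything routed to our cap
    await send({"type": "http.response.start", "status": 200,
                "headers": [(b"content-type", b"text/plain")]})
    await send({"type": "http.response.body", "body": b"hello from hippolyzer"})

class HelloCapAddon(WebAppCapAddon):
    CAP_NAME = "HelloProxyCap"     # hypothetical cap name
    APP = staticmethod(hello_app)  # staticmethod so self.APP resolves to the bare callable

addons = [HelloCapAddon()]  # module-level addon list, as in the addon_examples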

View File

@@ -1,69 +1,68 @@
aiohttp==3.7.4.post0
aiohttp==3.8.1
aiosignal==1.2.0
appdirs==1.4.4
Arpeggio==1.10.2
asgiref==3.3.4
async-timeout==3.0.1
attrs==20.3.0
black==21.4b2
asgiref==3.4.1
async-timeout==4.0.1
attrs==21.2.0
blinker==1.4
Brotli==1.0.9
certifi==2020.12.5
cffi==1.14.5
chardet==4.0.0
click==7.1.2
cryptography==3.3.2
certifi==2021.10.8
cffi==1.15.0
charset-normalizer==2.0.9
click==8.0.3
cryptography==36.0.2
defusedxml==0.7.1
Flask==1.1.2
Glymur==0.9.3
Flask==2.0.2
frozenlist==1.2.0
Glymur==0.9.6
h11==0.12.0
h2==4.0.0
h2==4.1.0
hpack==4.0.0
hyperframe==6.0.1
idna==2.10
itsdangerous==1.1.0
jedi==0.18.0
Jinja2==2.11.3
itsdangerous==2.0.1
jedi==0.18.1
Jinja2==3.0.3
kaitaistruct==0.9
lazy-object-proxy==1.6.0
ldap3==2.8.1
llbase==1.2.10
lxml==4.6.3
MarkupSafe==1.1.1
mitmproxy==6.0.2
msgpack==1.0.2
multidict==5.1.0
mypy-extensions==0.4.3
numpy==1.20.2
parso==0.8.2
ldap3==2.9.1
llbase==1.2.11
lxml==4.6.4
MarkupSafe==2.0.1
mitmproxy==8.0.0
msgpack==1.0.3
multidict==5.2.0
numpy==1.21.4
parso==0.8.3
passlib==1.7.4
pathspec==0.8.1
prompt-toolkit==3.0.18
protobuf==3.14.0
ptpython==3.0.17
prompt-toolkit==3.0.23
protobuf==3.18.1
ptpython==3.0.20
publicsuffix2==2.20191221
pyasn1==0.4.8
pycparser==2.20
Pygments==2.8.1
pyOpenSSL==20.0.1
pycparser==2.21
pycollada==0.7.2
Pygments==2.10.0
pyOpenSSL==22.0.0
pyparsing==2.4.7
pyperclip==1.8.2
PySide2==5.15.2
qasync==0.15.0
PySide6==6.2.2
qasync==0.22.0
recordclass==0.14.3
regex==2021.4.4
requests==2.25.1
ruamel.yaml==0.16.13
ruamel.yaml.clib==0.2.2
shiboken2==5.15.2
six==1.15.0
sortedcontainers==2.3.0
toml==0.10.2
requests==2.26.0
ruamel.yaml==0.17.16
ruamel.yaml.clib==0.2.6
shiboken6==6.2.2
six==1.16.0
sortedcontainers==2.4.0
tornado==6.1
typing-extensions==3.7.4.3
urllib3==1.26.5
transformations==2021.6.6
typing-extensions==4.0.1
urllib3==1.26.7
urwid==2.1.2
wcwidth==0.2.5
Werkzeug==1.0.1
Werkzeug==2.0.2
wsproto==1.0.0
yarl==1.6.3
zstandard==0.14.1
yarl==1.7.2
zstandard==0.15.2

View File

@@ -9,4 +9,4 @@ universal = 1
[flake8]
max-line-length = 160
exclude = build/*, .eggs/*
ignore = F405, F403, E501, F841, E722, W503, E741
ignore = F405, F403, E501, F841, E722, W503, E741, E731

View File

@@ -25,7 +25,7 @@ from setuptools import setup, find_packages
here = path.abspath(path.dirname(__file__))
version = '0.6.0'
version = '0.11.1'
with open(path.join(here, 'README.md')) as readme_fh:
readme = readme_fh.read()
@@ -44,6 +44,7 @@ setup(
"Programming Language :: Python :: 3 :: Only",
"Programming Language :: Python :: 3.8",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: Implementation :: CPython",
"Topic :: System :: Networking :: Monitoring",
"Topic :: Software Development :: Libraries :: Python Modules",
@@ -66,6 +67,7 @@ setup(
'lib/base/data/static_data.db2',
'lib/base/data/static_index.db2',
'lib/base/data/avatar_lad.xml',
'lib/base/data/male_collada_joints.xml',
'lib/base/data/avatar_skeleton.xml',
'lib/base/data/LICENSE-artwork.txt',
],
@@ -82,21 +84,24 @@ setup(
'llbase>=1.2.5',
'defusedxml',
'aiohttp<4.0.0',
'recordclass',
'recordclass<0.15',
'lazy-object-proxy',
'arpeggio',
# requests breaks with newer idna
'idna<3,>=2.5',
# 7.x will be a major change.
'mitmproxy<7.0.0',
'mitmproxy>=8.0.0,<8.1',
# For REPLs
'ptpython<4.0',
# JP2 codec
'Glymur<1.0',
'Glymur<0.9.7',
'numpy<2.0',
# These could be in extras_require if you don't want a GUI.
'pyside2<6.0',
'pyside6',
'qasync',
# Needed for mesh format conversion tooling
'pycollada',
'transformations',
],
tests_require=[
"pytest",

View File

@@ -9,20 +9,21 @@ from cx_Freeze import setup, Executable
# We don't need any of these and they make the archive huge.
TO_DELETE = [
"lib/PySide2/Qt3DRender.pyd",
"lib/PySide2/Qt53DRender.dll",
"lib/PySide2/Qt5Charts.dll",
"lib/PySide2/Qt5Location.dll",
"lib/PySide2/Qt5Pdf.dll",
"lib/PySide2/Qt5Quick.dll",
"lib/PySide2/Qt5WebEngineCore.dll",
"lib/PySide2/QtCharts.pyd",
"lib/PySide2/QtMultimedia.pyd",
"lib/PySide2/QtOpenGLFunctions.pyd",
"lib/PySide2/QtOpenGLFunctions.pyi",
"lib/PySide2/d3dcompiler_47.dll",
"lib/PySide2/opengl32sw.dll",
"lib/PySide2/translations",
"lib/PySide6/Qt6DRender.pyd",
"lib/PySide6/Qt63DRender.dll",
"lib/PySide6/Qt6Charts.dll",
"lib/PySide6/Qt6Location.dll",
"lib/PySide6/Qt6Pdf.dll",
"lib/PySide6/Qt6Quick.dll",
"lib/PySide6/Qt6WebEngineCore.dll",
"lib/PySide6/QtCharts.pyd",
"lib/PySide6/QtMultimedia.pyd",
"lib/PySide6/QtOpenGLFunctions.pyd",
"lib/PySide6/QtOpenGLFunctions.pyi",
"lib/PySide6/d3dcompiler_47.dll",
"lib/PySide6/opengl32sw.dll",
"lib/PySide6/lupdate.exe",
"lib/PySide6/translations",
"lib/aiohttp/_find_header.c",
"lib/aiohttp/_frozenlist.c",
"lib/aiohttp/_helpers.c",
@@ -82,6 +83,7 @@ class FinalizeCXFreezeCommand(Command):
pass
for to_copy in COPY_TO_ZIP:
shutil.copy(BASE_DIR / to_copy, path / to_copy)
shutil.copytree(BASE_DIR / "addon_examples", path / "addon_examples")
zip_path = BASE_DIR / "dist" / path.name
shutil.make_archive(zip_path, "zip", path)
@@ -111,7 +113,7 @@ executables = [
setup(
name="hippolyzer_gui",
version="0.6.0",
version="0.9.0",
description="Hippolyzer GUI",
options=options,
executables=executables,

static/repl_screenshot.png Normal file (binary image, 42 KiB; not shown)

View File

@@ -50,4 +50,4 @@ class TestCapsClient(unittest.IsolatedAsyncioTestCase):
with self.assertRaises(KeyError):
with self.caps_client.get("BadCap"):
pass
assert False

View File

@@ -134,3 +134,15 @@ class TestDatatypes(unittest.TestCase):
val = llsd.parse_binary(llsd.format_binary(orig))
self.assertIsInstance(val, UUID)
self.assertEqual(orig, val)
def test_jank_stringy_bytes(self):
val = JankStringyBytes(b"foo\x00")
self.assertTrue("o" in val)
self.assertTrue(b"o" in val)
self.assertFalse(b"z" in val)
self.assertFalse("z" in val)
self.assertEqual("foo", val)
self.assertEqual(b"foo\x00", val)
self.assertNotEqual(b"foo", val)
self.assertEqual(b"foo", JankStringyBytes(b"foo"))
self.assertEqual("foo", JankStringyBytes(b"foo"))

tests/base/test_jp2.py Normal file
View File

@@ -0,0 +1,40 @@
import os.path
import unittest
import glymur
from glymur.codestream import CMEsegment
from hippolyzer.lib.base.jp2_utils import BufferedJp2k
BASE_PATH = os.path.dirname(os.path.abspath(__file__))
@unittest.skipIf(glymur.jp2k.opj2.OPENJP2 is None, "OpenJPEG library missing")
class TestJP2Utils(unittest.TestCase):
@classmethod
def setUpClass(cls) -> None:
with open(os.path.join(BASE_PATH, "test_resources", "plywood.j2c"), "rb") as f:
cls.j2c_bytes = f.read()
def test_load_j2c(self):
j = BufferedJp2k(contents=self.j2c_bytes)
j.parse()
# Last segment in the header is the comment section
com: CMEsegment = j.codestream.segment[-1]
self.assertEqual("CME", com.marker_id)
# In this case the comment is the encoder version
self.assertEqual(b'Kakadu-3.0.3', com.ccme)
def test_read_j2c_data(self):
j = BufferedJp2k(self.j2c_bytes)
pixels = j[::]
self.assertEqual((512, 512, 3), pixels.shape)
def test_save_j2c_data(self):
j = BufferedJp2k(self.j2c_bytes)
pixels = j[::]
j[::] = pixels
new_j2c_bytes = bytes(j)
self.assertNotEqual(self.j2c_bytes, new_j2c_bytes)
# Glymur will have replaced the CME section with its own
self.assertIn(b"Created by OpenJPEG", new_j2c_bytes)

View File

@@ -1,7 +1,8 @@
import copy
import unittest
from hippolyzer.lib.base.datatypes import *
from hippolyzer.lib.base.legacy_inv import InventoryModel
from hippolyzer.lib.base.inventory import InventoryModel
from hippolyzer.lib.base.wearables import Wearable, VISUAL_PARAMS
SIMPLE_INV = """\tinv_object\t0
@@ -44,22 +45,123 @@ SIMPLE_INV = """\tinv_object\t0
class TestLegacyInv(unittest.TestCase):
def setUp(self) -> None:
self.model = InventoryModel.from_str(SIMPLE_INV)
def test_parse(self):
- model = InventoryModel.from_str(SIMPLE_INV)
- self.assertTrue(UUID('f4d91477-def1-487a-b4f3-6fa201c17376') in model.containers)
- self.assertIsNotNone(model.root)
+ self.assertTrue(UUID('f4d91477-def1-487a-b4f3-6fa201c17376') in self.model.nodes)
+ self.assertIsNotNone(self.model.root)
def test_serialize(self):
- model = InventoryModel.from_str(SIMPLE_INV)
- new_model = InventoryModel.from_str(model.to_str())
- self.assertEqual(model, new_model)
+ self.model = InventoryModel.from_str(SIMPLE_INV)
+ new_model = InventoryModel.from_str(self.model.to_str())
+ self.assertEqual(self.model, new_model)
def test_item_access(self):
- model = InventoryModel.from_str(SIMPLE_INV)
- item = model.items[UUID('dd163122-946b-44df-99f6-a6030e2b9597')]
+ item = self.model.nodes[UUID('dd163122-946b-44df-99f6-a6030e2b9597')]
self.assertEqual(item.name, "New Script")
self.assertEqual(item.sale_info.sale_type, "not")
- self.assertEqual(item.model, model)
+ self.assertEqual(item.model, self.model)
def test_access_children(self):
root = self.model.root
item = tuple(self.model.ordered_nodes)[1]
self.assertEqual((item,), root.children)
def test_access_parent(self):
root = self.model.root
item = tuple(self.model.ordered_nodes)[1]
self.assertEqual(root, item.parent)
self.assertEqual(None, root.parent)
def test_unlink(self):
self.assertEqual(1, len(self.model.root.children))
item = tuple(self.model.ordered_nodes)[1]
self.assertEqual([item], item.unlink())
self.assertEqual(0, len(self.model.root.children))
self.assertEqual(None, item.model)
def test_relink(self):
item = tuple(self.model.ordered_nodes)[1]
for unlinked in item.unlink():
self.model.add(unlinked)
self.assertEqual(self.model, item.model)
self.assertEqual(1, len(self.model.root.children))
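Read together, test_unlink and test_relink pin down the detach/reattach contract. A usage sketch in the same module's terms (the node UUID comes from SIMPLE_INV above):

model = InventoryModel.from_str(SIMPLE_INV)
item = model.nodes[UUID('dd163122-946b-44df-99f6-a6030e2b9597')]
detached = item.unlink()     # returns every node removed; item.model is now None
for node in detached:
    model.add(node)          # reattach; root.children includes the item again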
def test_eq_excludes_model(self):
item = tuple(self.model.ordered_nodes)[1]
item_copy = copy.copy(item)
item_copy.model = None
self.assertEqual(item, item_copy)
def test_llsd_serialization(self):
self.assertEqual(
self.model.to_llsd(),
[
{
'name': 'Contents',
'obj_id': UUID('f4d91477-def1-487a-b4f3-6fa201c17376'),
'parent_id': UUID('00000000-0000-0000-0000-000000000000'),
'type': 'category'
},
{
'asset_id': UUID('00000000-0000-0000-0000-000000000000'),
'created_at': 1587367239,
'desc': '2020-04-20 04:20:39 lsl2 script',
'flags': b'\x00\x00\x00\x00',
'inv_type': 'script',
'item_id': UUID('dd163122-946b-44df-99f6-a6030e2b9597'),
'name': 'New Script',
'parent_id': UUID('f4d91477-def1-487a-b4f3-6fa201c17376'),
'permissions': {
'base_mask': 2147483647,
'creator_id': UUID('a2e76fcd-9360-4f6d-a924-000000000003'),
'everyone_mask': 0,
'group_id': UUID('00000000-0000-0000-0000-000000000000'),
'group_mask': 0,
'last_owner_id': UUID('a2e76fcd-9360-4f6d-a924-000000000003'),
'next_owner_mask': 581632,
'owner_id': UUID('a2e76fcd-9360-4f6d-a924-000000000003'),
'owner_mask': 2147483647
},
'sale_info': {
'sale_price': 10,
'sale_type': 'not'
},
'type': 'lsltext'
}
]
)
def test_llsd_legacy_equality(self):
new_model = InventoryModel.from_llsd(self.model.to_llsd())
self.assertEqual(self.model, new_model)
new_model.root.name = "foo"
self.assertNotEqual(self.model, new_model)
def test_difference_added(self):
new_model = InventoryModel.from_llsd(self.model.to_llsd())
diff = self.model.get_differences(new_model)
self.assertEqual([], diff.changed)
self.assertEqual([], diff.removed)
new_model.root.name = "foo"
diff = self.model.get_differences(new_model)
self.assertEqual([new_model.root], diff.changed)
self.assertEqual([], diff.removed)
item = new_model.root.children[0]
item.unlink()
diff = self.model.get_differences(new_model)
self.assertEqual([new_model.root], diff.changed)
self.assertEqual([item], diff.removed)
new_item = copy.copy(item)
new_item.node_id = UUID.random()
new_model.add(new_item)
diff = self.model.get_differences(new_model)
self.assertEqual([new_model.root, new_item], diff.changed)
self.assertEqual([item], diff.removed)
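In plainer terms, the diffing test above exercises three cases: a mutated node shows up in diff.changed, an unlinked node in diff.removed, and a copy re-added under a fresh node_id is also reported as changed. A minimal sketch of the happy path, using only attributes the test itself touches:

base = InventoryModel.from_str(SIMPLE_INV)
other = InventoryModel.from_llsd(base.to_llsd())
other.root.name = "Renamed"
diff = base.get_differences(other)
assert diff.changed == [other.root]   # only the renamed folder differs
assert diff.removed == []             # nothing was deleted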
GIRL_NEXT_DOOR_SHAPE = """LLWearable version 22

View File

@@ -46,7 +46,6 @@ class TestMessage(unittest.TestCase):
self.serial = UDPMessageSerializer()
settings = Settings()
settings.ENABLE_DEFERRED_PACKET_PARSING = True
- settings.HANDLE_PACKETS = False
self.deserial = UDPMessageDeserializer(settings=settings)
def test_block(self):
@@ -147,6 +146,12 @@ class TestMessage(unittest.TestCase):
new_msg = Message.from_dict(self.chat_msg.to_dict())
self.assertEqual(pickle.dumps(self.chat_msg), pickle.dumps(new_msg))
def test_todict_extended(self):
self.chat_msg.packet_id = 5
new_msg = Message.from_dict(self.chat_msg.to_dict(extended=True))
self.assertEqual(5, new_msg.packet_id)
self.assertEqual(pickle.dumps(self.chat_msg), pickle.dumps(new_msg))
def test_todict_multiple_blocks(self):
chat_msg = self.chat_msg
# If we dupe the ChatData block it should survive to_dict()
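A plausible reading of the new test_todict_extended above: extended=True also captures wire-level metadata such as packet_id, so a Message can be reconstructed exactly rather than just structurally. In sketch form:

msg_dict = chat_msg.to_dict(extended=True)   # includes packet_id and friends
clone = Message.from_dict(msg_dict)
assert clone.packet_id == chat_msg.packet_id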
@@ -170,7 +175,7 @@ class TestMessage(unittest.TestCase):
class TestMessageHandlers(unittest.IsolatedAsyncioTestCase):
def setUp(self) -> None:
- self.message_handler: MessageHandler[Message] = MessageHandler()
+ self.message_handler: MessageHandler[Message, str] = MessageHandler()
def _fake_received_message(self, msg: Message):
self.message_handler.handle(msg)
@@ -204,7 +209,7 @@ class TestMessageHandlers(unittest.IsolatedAsyncioTestCase):
self.assertEqual(len(foo_handlers), 0)
async def test_subscription_no_take(self):
with self.message_handler.subscribe_async("Foo", take=False) as get_msg:
with self.message_handler.subscribe_async(("Foo",), take=False) as get_msg:
msg = Message("Foo", Block("Bar", Baz=1, Biz=1))
self._fake_received_message(msg)
# Should not copy
@@ -213,7 +218,7 @@ class TestMessageHandlers(unittest.IsolatedAsyncioTestCase):
self.assertFalse(msg.queued)
async def test_wait_for(self):
fut = self.message_handler.wait_for("Foo", timeout=0.001, take=False)
fut = self.message_handler.wait_for(("Foo",), timeout=0.001, take=False)
foo_handlers = self.message_handler.handlers['Foo']
# We are subscribed
self.assertEqual(len(foo_handlers), 1)
@@ -227,7 +232,7 @@ class TestMessageHandlers(unittest.IsolatedAsyncioTestCase):
self.assertEqual(len(foo_handlers), 0)
async def test_wait_for_take(self):
- fut = self.message_handler.wait_for("Foo", timeout=0.001)
+ fut = self.message_handler.wait_for(("Foo",), timeout=0.001)
foo_handlers = self.message_handler.handlers['Foo']
# We are subscribed
self.assertEqual(len(foo_handlers), 1)
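The common thread in these hunks is an API change: subscribe_async() and wait_for() now take a tuple of message names instead of a bare string, which lets one subscription cover several message types. A sketch of the new call shape (the message names are merely illustrative LLUDP messages):

async def next_chat_or_im(handler: MessageHandler[Message, str]) -> Message:
    # Resolves with whichever matching message arrives first.
    return await handler.wait_for(
        ("ChatFromSimulator", "ImprovedInstantMessage"), timeout=5.0)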
@@ -295,3 +300,14 @@ class HumanReadableMessageTests(unittest.TestCase):
with self.assertRaises(ValueError):
HumanMessageSerializer.from_human_string(val)
def test_flags(self):
val = """
OUT FooMessage [ZEROCODED] [RELIABLE] [1]
[SomeBlock]
foo = 1
"""
msg = HumanMessageSerializer.from_human_string(val)
self.assertEqual(HumanMessageSerializer.to_human_string(msg).strip(), val.strip())
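Our reading of the header line this test round-trips: direction, message name, zero or more bracketed flags, then a bracketed packet id, so "OUT FooMessage [ZEROCODED] [RELIABLE] [1]" is an outbound, zero-coded, reliable message with packet id 1. Reusing the test's own string, the round trip is just:

msg = HumanMessageSerializer.from_human_string(val)
round_tripped = HumanMessageSerializer.to_human_string(msg)
assert round_tripped.strip() == val.strip()   # flags and packet id survive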

Binary file not shown.

View File

@@ -791,7 +791,3 @@ class SubfieldSerializationTests(BaseSerializationTest):
self.assertEqual(ser.serialize(None, FooFlags.FOO), 1)
self.assertEqual(ser.serialize(None, 3), 3)
self.assertEqual(ser.serialize(None, 7), 7)
if __name__ == "__main__":
unittest.main()

View File

@@ -32,32 +32,6 @@ class TestEvents(unittest.TestCase):
def test_base_settings(self):
settings = Settings()
self.assertEqual(settings.quiet_logging, False)
- self.assertEqual(settings.HANDLE_PACKETS, True)
- self.assertEqual(settings.LOG_VERBOSE, True)
- self.assertEqual(settings.ENABLE_BYTES_TO_HEX_LOGGING, False)
- self.assertEqual(settings.ENABLE_CAPS_LOGGING, True)
- self.assertEqual(settings.ENABLE_CAPS_LLSD_LOGGING, False)
- self.assertEqual(settings.ENABLE_EQ_LOGGING, True)
- self.assertEqual(settings.ENABLE_UDP_LOGGING, True)
- self.assertEqual(settings.ENABLE_OBJECT_LOGGING, True)
- self.assertEqual(settings.LOG_SKIPPED_PACKETS, True)
- self.assertEqual(settings.ENABLE_HOST_LOGGING, True)
- self.assertEqual(settings.LOG_COROUTINE_SPAWNS, True)
- self.assertEqual(settings.DISABLE_SPAMMERS, True)
- self.assertEqual(settings.UDP_SPAMMERS, ['PacketAck', 'AgentUpdate'])
- def test_quiet_settings(self):
- settings = Settings(True)
- self.assertEqual(settings.quiet_logging, True)
- self.assertEqual(settings.HANDLE_PACKETS, True)
- self.assertEqual(settings.LOG_VERBOSE, False)
- self.assertEqual(settings.ENABLE_BYTES_TO_HEX_LOGGING, False)
- self.assertEqual(settings.ENABLE_CAPS_LOGGING, False)
- self.assertEqual(settings.ENABLE_CAPS_LLSD_LOGGING, False)
- self.assertEqual(settings.ENABLE_EQ_LOGGING, False)
- self.assertEqual(settings.ENABLE_UDP_LOGGING, False)
- self.assertEqual(settings.ENABLE_OBJECT_LOGGING, False)
- self.assertEqual(settings.LOG_SKIPPED_PACKETS, False)
- self.assertEqual(settings.ENABLE_HOST_LOGGING, False)
- self.assertEqual(settings.LOG_COROUTINE_SPAWNS, False)
+ self.assertEqual(settings.ENABLE_DEFERRED_PACKET_PARSING, True)
settings.ENABLE_DEFERRED_PACKET_PARSING = False
self.assertEqual(settings.ENABLE_DEFERRED_PACKET_PARSING, False)

View File

@@ -23,12 +23,13 @@ from hippolyzer.lib.base.xfer_manager import XferManager
class MockHandlingCircuit(ProxiedCircuit):
- def __init__(self, handler: MessageHandler[Message]):
+ def __init__(self, handler: MessageHandler[Message, str]):
super().__init__(("127.0.0.1", 1), ("127.0.0.1", 2), None)
self.handler = handler
def _send_prepared_message(self, message: Message, transport=None):
- asyncio.get_event_loop().call_soon(self.handler.handle, message)
+ loop = asyncio.get_event_loop_policy().get_event_loop()
+ loop.call_soon(self.handler.handle, message)
class MockConnectionHolder(ConnectionHolder):
@@ -42,8 +43,8 @@ class BaseTransferTests(unittest.IsolatedAsyncioTestCase):
LARGE_PAYLOAD = b"foobar" * 500
def setUp(self) -> None:
- self.server_message_handler: MessageHandler[Message] = MessageHandler()
- self.client_message_handler: MessageHandler[Message] = MessageHandler()
+ self.server_message_handler: MessageHandler[Message, str] = MessageHandler()
+ self.client_message_handler: MessageHandler[Message, str] = MessageHandler()
# The client side should send messages to the server side's message handler
# and vice-versa
self.client_circuit = MockHandlingCircuit(self.server_message_handler)
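The loop-acquisition change in the first hunk tracks an asyncio deprecation: calling asyncio.get_event_loop() with no running loop emits a DeprecationWarning on Python 3.10+, while going through the policy object does not. The pattern in isolation:

import asyncio

loop = asyncio.get_event_loop_policy().get_event_loop()  # no DeprecationWarning
loop.call_soon(print, "queued onto the test loop")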
@@ -60,7 +61,7 @@ class XferManagerTests(BaseTransferTests):
self.received_bytes: Optional[bytes] = None
async def _handle_vfile_upload(self):
- msg = await self.server_message_handler.wait_for('AssetUploadRequest', timeout=0.01)
+ msg = await self.server_message_handler.wait_for(('AssetUploadRequest',), timeout=0.01)
asset_block = msg["AssetBlock"]
transaction_id = asset_block["TransactionID"]
asset_id = UUID.combine(transaction_id, self.secure_session_id)
@@ -70,7 +71,7 @@ class XferManagerTests(BaseTransferTests):
manager = XferManager(self.server_connection)
xfer = await manager.request(vfile_id=asset_id, vfile_type=AssetType.BODYPART)
self.received_bytes = xfer.reassemble_chunks()
- self.server_circuit.send_message(Message(
+ self.server_circuit.send(Message(
"AssetUploadComplete",
Block("AssetBlock", UUID=asset_id, Type=asset_block["Type"], Success=True),
direction=Direction.IN,
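Stripped of the mock harness, the client-side flow this test drives is short. A sketch with stand-in names for the connection and asset; only request() and reassemble_chunks() are attested by the test itself:

manager = XferManager(connection)        # any ConnectionHolder
xfer = await manager.request(vfile_id=asset_id, vfile_type=AssetType.BODYPART)
asset_bytes = xfer.reassemble_chunks()   # full payload, chunks in order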
@@ -102,14 +103,14 @@ class TestTransferManager(BaseTransferTests):
)
async def _handle_covenant_download(self):
- msg = await self.server_message_handler.wait_for('TransferRequest', timeout=0.01)
+ msg = await self.server_message_handler.wait_for(('TransferRequest',), timeout=0.01)
self.assertEqual(TransferSourceType.SIM_ESTATE, msg["TransferInfo"]["SourceType"])
tid = msg["TransferInfo"]["TransferID"]
params: TransferRequestParamsSimEstate = msg["TransferInfo"][0].deserialize_var("Params")
self.assertEqual(EstateAssetType.COVENANT, params.EstateAssetType)
data = self.LARGE_PAYLOAD
- self.server_circuit.send_message(Message(
+ self.server_circuit.send(Message(
'TransferInfo',
Block(
'TransferInfo',
@@ -125,7 +126,7 @@ class TestTransferManager(BaseTransferTests):
while True:
chunk = data[:1000]
data = data[1000:]
- self.server_circuit.send_message(Message(
+ self.server_circuit.send(Message(
'TransferPacket',
Block(
'TransferData',

View File

@@ -1,77 +0,0 @@
import asyncio
from typing import *
import unittest
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.udpserializer import UDPMessageSerializer
from hippolyzer.lib.base.network.transport import AbstractUDPTransport, UDPPacket, ADDR_TUPLE
from hippolyzer.lib.proxy.lludp_proxy import InterceptingLLUDPProxyProtocol
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import SessionManager
from hippolyzer.lib.proxy.transport import SOCKS5UDPTransport
class MockTransport(AbstractUDPTransport):
def sendto(self, data: Any, addr: Optional[ADDR_TUPLE] = ...) -> None:
pass
def abort(self) -> None:
pass
def close(self) -> None:
pass
def __init__(self):
super().__init__()
self.packets: List[Tuple[bytes, Tuple[str, int]]] = []
def send_packet(self, packet: UDPPacket) -> None:
self.packets.append((packet.data, packet.dst_addr))
class BaseProxyTest(unittest.IsolatedAsyncioTestCase):
def setUp(self) -> None:
self.client_addr = ("127.0.0.1", 1)
self.region_addr = ("127.0.0.1", 3)
self.circuit_code = 1234
self.session_manager = SessionManager()
self.session = self.session_manager.create_session({
"session_id": UUID.random(),
"secure_session_id": UUID.random(),
"agent_id": UUID.random(),
"circuit_code": self.circuit_code,
"sim_ip": self.region_addr[0],
"sim_port": self.region_addr[1],
"region_x": 0,
"region_y": 123,
"seed_capability": "https://test.localhost:4/foo",
})
self.transport = MockTransport()
self.protocol = InterceptingLLUDPProxyProtocol(
self.client_addr, self.session_manager)
self.protocol.transport = self.transport
self.serializer = UDPMessageSerializer()
self.session.objects.track_region_objects(123)
async def _wait_drained(self):
await asyncio.sleep(0.001)
def _setup_default_circuit(self):
self._setup_region_circuit(self.session.regions[-1])
self.session.main_region = self.session.regions[-1]
def _setup_region_circuit(self, region: ProxiedRegion):
# Not going to send a UseCircuitCode, so have to pretend we already did the
# client -> region NAT hole-punching
self.protocol.session = self.session
self.protocol.far_to_near_map[region.circuit_addr] = self.client_addr
self.session_manager.claim_session(self.session.id)
self.session.open_circuit(self.client_addr, region.circuit_addr,
self.protocol.transport)
def _msg_to_datagram(self, msg: Message, src, dst, direction, socks_header=True):
serialized = self.serializer.serialize(msg)
packet = UDPPacket(src_addr=src, dst_addr=dst, data=serialized,
direction=direction)
return SOCKS5UDPTransport.serialize(packet, force_socks_header=socks_header)

Some files were not shown because too many files have changed in this diff.