157 Commits

Author SHA1 Message Date
Salad Dais
5ef9b5354a v0.12.0 2022-08-18 15:13:02 +00:00
Salad Dais
34ca7d54be Support formatting SL's busted login endpoint responses 2022-08-18 14:40:33 +00:00
Salad Dais
cb316f1992 Only load the newest version of an agent's inventory cache
This isn't entirely correct, but without a cross-platform way to
map the requesting viewer to its specific cache directory, this
is the least annoying thing we can do.
2022-08-18 14:39:49 +00:00
Salad Dais
da05a6cf1f Begin reshuffling inventory management code 2022-08-18 14:30:42 +00:00
Salad Dais
f06c31e225 Greatly improve matrix handling logic in collada code 2022-08-18 14:29:28 +00:00
Salad Dais
b4e5596ca2 Add more utils for converting between quat and euler 2022-08-08 00:38:09 +00:00
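These helpers reduce to the standard angle identities. As a reference, a sketch of the euler-to-quaternion direction, assuming the common Z-Y-X (yaw-pitch-roll) convention; the library's actual helper names and conventions may differ:

```python
import math
from typing import Tuple

def euler_to_quat(roll: float, pitch: float, yaw: float) -> Tuple[float, float, float, float]:
    """Convert X-Y-Z euler angles (radians) to an (x, y, z, w) quaternion."""
    cr, sr = math.cos(roll / 2), math.sin(roll / 2)
    cp, sp = math.cos(pitch / 2), math.sin(pitch / 2)
    cy, sy = math.cos(yaw / 2), math.sin(yaw / 2)
    return (
        sr * cp * cy - cr * sp * sy,  # x
        cr * sp * cy + sr * cp * sy,  # y
        cr * cp * sy - sr * sp * cy,  # z
        cr * cp * cy + sr * sp * sy,  # w
    )
```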
Salad Dais
49a54ce099 Fix anim mangler exceptions causing reload to fail 2022-08-07 04:42:06 +00:00
Salad Dais
0349fd9078 Fix RLV command parser to better match RLV's actual behavior 2022-08-02 08:18:28 +00:00
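The grammar being matched is roughly `@behaviour:option1;option2=param`, with multiple comma-separated commands allowed per message. A sketch of a parser in that shape (illustrative only; the proxy's real parser is part of its RLV handling):

```python
from typing import List, NamedTuple

class RLVCommand(NamedTuple):
    behaviour: str
    options: List[str]
    param: str

def parse_rlv_chat(chat: str) -> List[RLVCommand]:
    """Parse '@cmd1:opt1;opt2=param,cmd2=param' into commands."""
    assert chat.startswith("@")
    commands = []
    for chunk in chat[1:].split(","):
        body, _, param = chunk.partition("=")
        behaviour, _, options = body.partition(":")
        commands.append(RLVCommand(behaviour, options.split(";") if options else [], param))
    return commands
```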
Salad Dais
118ef2813a Fix new flake8 lint errors 2022-08-01 01:41:15 +00:00
Salad Dais
256f74b71a Add InventoryManager to proxy Session object 2022-07-31 18:31:56 +00:00
Salad Dais
4a84453ca4 Add start of proxy inventory manager 2022-07-31 16:54:57 +00:00
Salad Dais
34316cb166 Fix LLSD notation serialization with embedded newline 2022-07-30 14:39:48 +00:00
Salad Dais
0f7d35cdca Handle HTTP messages with missing (not just empty) body 2022-07-30 00:37:35 +00:00
Salad Dais
2ee8a6f008 Clean up typing to appease the linter 2022-07-28 18:26:05 +00:00
Salad Dais
848a6745c0 v0.11.3 2022-07-28 03:55:22 +00:00
Salad Dais
0cbbedd27b Make assignments on BaseAddon class objects work as expected
The descriptors were being silently clobbered for a while now, and
I never noticed. Oops!
2022-07-28 03:39:53 +00:00
Salad Dais
e951a5b5c3 Make datetime objects (de)serialize in binary LLSD more accurately
Fixes some precision issues with LLBase's LLSD serialization stuff
where the microseconds component was dropped. May still get some
off-by-one serialization differences due to rounding.
2022-07-27 22:42:58 +00:00
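The underlying issue: binary LLSD encodes a date as a float64 of seconds since the Unix epoch, so microseconds only survive as the fractional part and can shift by one in a round-trip. A sketch assuming that standard encoding (a 'd' tag plus a big-endian double):

```python
import datetime as dt
import struct

def format_binary_date(value: dt.datetime) -> bytes:
    # 'd' tag + big-endian float64 of epoch seconds; microseconds
    # become the fractional part. Assumes a naive UTC datetime.
    secs = value.replace(tzinfo=dt.timezone.utc).timestamp()
    return b"d" + struct.pack("!d", secs)

def parse_binary_date(data: bytes) -> dt.datetime:
    (secs,) = struct.unpack("!d", data[1:9])
    # A float64 keeps microsecond resolution for current epoch values,
    # but rounding can still land the round-trip one microsecond off.
    return dt.datetime.fromtimestamp(secs, dt.timezone.utc)
```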
Salad Dais
68bf3ba4a2 More comments in mesh module 2022-07-27 22:21:42 +00:00
Salad Dais
5b4f8f03dc Use same compression ratio for LLSD as indra 2022-07-27 22:16:31 +00:00
Salad Dais
d7c2215cbc Remove special Firestorm section from readme
The new Firestorm release added proxy configuration back in.
2022-07-27 02:50:06 +00:00
Salad Dais
629e59d3f9 Add option to upload mesh deformer directly 2022-07-26 04:13:15 +00:00
Salad Dais
8f68bc219e Split up deformer helper a little 2022-07-26 03:44:32 +00:00
Salad Dais
ba296377de Save mesh deformers as files rather than uploading directly 2022-07-26 02:12:54 +00:00
Salad Dais
e34927a996 Improve AssetUploader API, make uploader example addon use it 2022-07-26 00:11:37 +00:00
Salad Dais
3c6a917550 Add command to deformer_helper addon that uploads mesh deformers
Sometimes these are preferable to deformer anims.
2022-07-25 23:11:15 +00:00
Salad Dais
dbae2acf27 Add basic AssetUploader class
Should make it less annoying to upload procedurally generated mesh
outside of local mesh mode.
2022-07-25 22:08:28 +00:00
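Judging by how the deformer helper further down this page drives it, the API is a two-phase flow: initiate the upload to get a token quoting the L$ cost, then complete it. Roughly (the wrapper function here is hypothetical; the `asset_uploader` calls are the ones used in the diff below):

```python
async def upload_generated_mesh(region, mesh_bytes: bytes):
    # Phase 1: the server quotes a price and hands back an upload token
    token = await region.asset_uploader.initiate_mesh_upload("my_mesh", mesh_bytes)
    print(f"Upload will cost L${token.linden_cost}")
    # Phase 2: actually spend the L$ and create the asset
    await region.asset_uploader.complete_upload(token)
```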
Salad Dais
722e8eeabf v0.11.2 2022-07-24 09:02:02 +00:00
Salad Dais
a6a26a9999 Make sure module unload hooks always run
Fixes the anim and mesh mangler addons not getting their manglers unregistered
2022-07-24 08:57:47 +00:00
Salad Dais
a6328d5aee Update get_task_inventory_cap example 2022-07-22 04:04:13 +00:00
Salad Dais
4e76ebe7cf Fix get_task_inventory_cap example 2022-07-21 21:44:32 +00:00
Salad Dais
c0a26ffb57 Send proxy-created Messages reliably where appropriate 2022-07-21 21:44:06 +00:00
Salad Dais
7dfb10cb51 Make TextureEntry deserialization lazy in the ObjectUpdate case too 2022-07-21 08:05:25 +00:00
Salad Dais
de33906db5 Add a couple more enum defs 2022-07-21 08:05:17 +00:00
Salad Dais
605337b280 Remove erroneous comment 2022-07-20 21:30:03 +00:00
Salad Dais
235cd4929f Update message template to add new messages / blocks 2022-07-20 21:23:28 +00:00
Salad Dais
220a02543e v0.11.1 2022-07-20 20:38:17 +00:00
Salad Dais
8ac47c2397 Fix use of dynamically imported globals in REPL 2022-07-20 20:30:41 +00:00
Salad Dais
d384978322 UpdateType -> ObjectUpdateType 2022-07-20 20:26:50 +00:00
Salad Dais
f02a479834 Add get_task_inventory_cap.py addon example
An example of mocking out actually useful behavior for the viewer:
a better (faster!) task inventory fetching API.
2022-07-20 09:20:27 +00:00
Salad Dais
b5e8b36173 Add more enum and flag defs to templates.py 2022-07-20 06:35:04 +00:00
Salad Dais
08a39f4df7 Make object update handling more robust 2022-07-20 06:35:04 +00:00
Salad Dais
61ec51beec Add demo autoattacher addon example 2022-07-19 23:48:40 +00:00
Salad Dais
9adbdcdcc8 Add a couple more flag definitions to templates.py 2022-07-19 09:49:43 +00:00
Salad Dais
e7b05f72ca Dequantize TimeDilation message var 2022-07-19 05:57:19 +00:00
Salad Dais
75f2f363a4 Handle TE glow field quantization 2022-07-18 22:29:37 +00:00
Salad Dais
cc1bb9ac1d Give MediaFlags and BasicMaterials sensible default values 2022-07-18 22:08:06 +00:00
Salad Dais
d498d1f2c8 v0.11.0 2022-07-18 08:53:24 +00:00
Salad Dais
8c0635bb2a Add classmethod for rebuilding TEs into a TECollection 2022-07-18 06:37:20 +00:00
Salad Dais
309dbeeb52 Add TextureEntry.st_to_uv() to convert between coords 2022-07-18 00:34:56 +00:00
Salad Dais
4cc87bf81e Add a default value for TextureEntryCollection.realize() num_faces 2022-07-17 01:09:22 +00:00
Salad Dais
f34bb42dcb TextureEntry -> TextureEntryCollection, improve .realize()
The "TextureEntry" name from the message template is kind of a
misnomer, the field actually includes multiple TextureEntries.
2022-07-17 00:45:20 +00:00
Salad Dais
59ec99809a Correct TE rotation quantization
Literally everything has its own special float quantization. Argh.
2022-07-16 23:17:34 +00:00
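Every one of these quantized fields comes down to the same pack/unpack pair, differing only in integer width and float range; the per-field ranges are what the commits above keep correcting. A generic sketch:

```python
def quantize_u16(value: float, lower: float, upper: float) -> int:
    """Pack a float into an unsigned 16-bit int over [lower, upper]."""
    clamped = min(max(value, lower), upper)
    return round((clamped - lower) / (upper - lower) * 0xFFFF)

def dequantize_u16(packed: int, lower: float, upper: float) -> float:
    """Recover the (lossy) float from its 16-bit packed form."""
    return lower + (packed / 0xFFFF) * (upper - lower)
```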
Salad Dais
4b963f96d2 Add TextureEntry.realize() to ease indexing into specific faces 2022-07-14 03:10:11 +00:00
Salad Dais
58db8f66de Correct type signatures for TextureEntry 2022-07-10 17:58:13 +00:00
Salad Dais
95623eba58 More InventoryModel fixes 2022-07-10 01:55:34 +00:00
Salad Dais
8dba0617bd Make injecting inventory EQ events easier 2022-07-09 04:21:44 +00:00
Salad Dais
289073be8e Add InventoryModel diffing 2022-07-09 02:48:23 +00:00
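No details of the diffing API are shown here; conceptually it is a keyed comparison of two inventory snapshots, along these lines (illustrative only, the real InventoryModel interface is richer):

```python
def diff_inventory(old: dict, new: dict):
    """Compare two {node_id: node} snapshots."""
    added = [n for nid, n in new.items() if nid not in old]
    removed = [n for nid, n in old.items() if nid not in new]
    changed = [n for nid, n in new.items() if nid in old and old[nid] != n]
    return added, removed, changed
```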
Salad Dais
f3c8015366 Support mutable InventoryModels 2022-07-08 22:06:14 +00:00
Salad Dais
99e8118458 Support HIPPO XML directives in injected EQ events 2022-07-05 14:24:35 +00:00
Salad Dais
80745cfd1c Add TextureEntry.unwrap() to ease working with potentially lazy TEs 2022-07-05 03:08:52 +00:00
Salad Dais
92a06bccaf Dequantize OffsetS and OffsetT in TextureEntrys 2022-07-05 02:08:53 +00:00
Salad Dais
fde9ddf4d9 Initial work to support in-flight EQ response pre-emption 2022-07-04 17:57:05 +00:00
Salad Dais
03a56c9982 Auto-load certain symbols in REPL, add docs for REPL 2022-06-27 01:49:27 +00:00
Salad Dais
d07a0df0fd WIP LLMesh -> Collada
First half of the LLMesh -> Collada -> LLMesh transform for #24
2022-06-24 13:15:20 +00:00
Salad Dais
848397fe63 Fix windows build workflow 2022-06-24 07:36:51 +00:00
Salad Dais
0f9246c5c6 Use github.ref_name instead of github.ref 2022-06-24 02:32:50 +00:00
Salad Dais
2e7f887970 v0.10.0 2022-06-24 01:54:37 +00:00
Salad Dais
ef9df6b058 Update Windows bundling action to add artifact to release 2022-06-24 01:12:21 +00:00
Salad Dais
baae0f6d6e Fix TupleCoord negation 2022-06-21 07:15:49 +00:00
Salad Dais
0f369b682d Upgrade to mitmproxy 8.0
Not 8.1 since that drops Python 3.8 support. Closes #26
2022-06-20 15:15:57 +00:00
Salad Dais
1f1e4de254 Add addon for testing object manager conformance against viewer
Closes #18
2022-06-20 12:38:11 +00:00
Salad Dais
75ddc0a5ba Be smarter about object cache miss autorequests 2022-06-20 12:33:12 +00:00
Salad Dais
e4cb168138 Clear up last few event loop warnings 2022-06-20 12:31:08 +00:00
Salad Dais
63aebba754 Clear up some event loop deprecation warnings 2022-06-20 05:55:01 +00:00
Salad Dais
8cf1a43d59 Better defaults when parsing ObjectUpdateCompressed
This helps our view of the cache better match the viewer's VOCache
2022-06-20 03:23:46 +00:00
Salad Dais
bbc8813b61 Add unary minus for TupleCoords 2022-06-19 04:33:20 +00:00
Salad Dais
5b51dbd30f Add workaround instructions for most recent Firestorm release
Closes #25
2022-05-13 23:52:50 +00:00
Salad Dais
295c7972e7 Use windows-2019 runner instead of windows-latest
windows-latest has some weird ACL changes that cause the cx_Freeze
packaging steps to fail.
2022-05-13 23:39:37 +00:00
Salad Dais
b034661c38 Revert "Temporarily stop generating lib_licenses.txt automatically"
This reverts commit f12fd95ee1.
2022-05-13 23:39:09 +00:00
Salad Dais
f12fd95ee1 Temporarily stop generating lib_licenses.txt automatically
Something is busted with pip-licenses in CI. Not sure why, but
it's only needed for Windows builds anyway.
2022-03-12 19:15:59 +00:00
Salad Dais
bc33313fc7 v0.9.0 2022-03-12 18:40:38 +00:00
Salad Dais
affc7fcf89 Clarify comment in proxy object manager 2022-03-05 11:03:28 +00:00
Salad Dais
b8f1593a2c Allow filtering on HTTP status code 2022-03-05 10:50:09 +00:00
Salad Dais
7879f4e118 Split up mitmproxy integration test a bit 2022-03-05 10:49:55 +00:00
Salad Dais
4ba611ae01 Only apply local mesh to selected links 2022-02-28 07:32:46 +00:00
Salad Dais
82ff6d9c64 Add more TeleportFlags 2022-02-28 07:32:22 +00:00
Salad Dais
f603ea6186 Better handle timeouts that have missing cap_data metadata 2021-12-18 20:43:10 +00:00
Salad Dais
fcf6a4568b Better handling for proxied HTTP requests that time out 2021-12-17 19:27:20 +00:00
Salad Dais
2ad6cc1b51 Better handle broken 'LLSD' responses 2021-12-17 00:18:51 +00:00
Salad Dais
025f7d31f2 Make sure .queued is cleared if message take()n twice 2021-12-15 20:17:54 +00:00
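A rough sketch of the contract being fixed here, not the real Message implementation: take() hands back a cleanly sendable copy, and queue state must not survive repeated takes:

```python
import copy

def take(self):
    msg = copy.copy(self)
    msg.packet_id = None  # the circuit assigns a fresh ID on send
    msg.acks = ()         # don't replay ACKs from the original send
    msg.queued = False    # must be cleared even on a second take()
    self.queued = False
    return msg
```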
Salad Dais
9fdb281e4a Create example addon for simulating packet loss 2021-12-13 06:12:43 +00:00
Salad Dais
11e28bde2a Allow filtering message log on HTTP headers 2021-12-11 15:08:45 +00:00
Salad Dais
1faa6f977c Update docs on send() and send_reliable() 2021-12-10 13:41:20 +00:00
Salad Dais
6866e7397f Clean up cap registration API 2021-12-10 13:22:54 +00:00
Salad Dais
fa0b3a5340 Mark all Messages synthetic unless they came off the wire 2021-12-10 07:30:02 +00:00
Salad Dais
16c808bce8 Match viewer resend behaviour 2021-12-10 07:04:36 +00:00
Salad Dais
ec4b2d0770 Move last of the explicit direction params 2021-12-10 06:50:07 +00:00
Salad Dais
3b610fdfd1 Add awaitable send_reliable() 2021-12-09 05:30:35 +00:00
Salad Dais
8b93c5eefa Rename send_message() to send() 2021-12-09 05:30:12 +00:00
Salad Dais
f4bb9eae8f Fix __contains__ for JankStringyBytes 2021-12-09 03:48:29 +00:00
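The class name suggests a bytes wrapper that tolerates str operands; presumably the fix made `in` behave like the other mixed str/bytes operators. A minimal sketch of the idea:

```python
class JankStringyBytes(bytes):
    """bytes that also accept str operands (sketch only)."""
    def __contains__(self, item) -> bool:
        if isinstance(item, str):
            item = item.encode("utf8")
        return super().__contains__(item)

assert "bar" in JankStringyBytes(b"foobar")
```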
Salad Dais
ecb14197cf Make message log filter highlight every matched field
Previously only the first match was being highlighted.
2021-12-09 01:14:09 +00:00
Salad Dais
95fd58e25a Begin PySide6 cleanup 2021-12-09 00:02:48 +00:00
Salad Dais
afc333ab49 Improve highlighting of matched fields in message log 2021-12-08 23:50:16 +00:00
Salad Dais
eb6406bca4 Fix ACK collection logic for injected reliable messages 2021-12-08 22:29:29 +00:00
Salad Dais
d486aa130d Add support for specifying flags in message builder 2021-12-08 21:10:06 +00:00
Salad Dais
d66d5226a2 Initial implementation of reliable injected packets
See #17. Not yet tested for real.
2021-12-08 04:49:45 +00:00
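The scheme referenced by #17 is the standard LLUDP one: flag the packet RELIABLE, then retransmit until its packet ID is ACKed or retries run out. A sketch, with `wait_for_ack` standing in for the real circuit API and the flag names assumed to mirror the LLUDP header flags:

```python
import asyncio
from hippolyzer.lib.base.message.msgtypes import PacketFlags

async def send_reliable(circuit, message, retries: int = 3, timeout: float = 1.0):
    message.flags |= PacketFlags.RELIABLE
    for _ in range(retries):
        circuit.send(message)
        try:
            return await asyncio.wait_for(circuit.wait_for_ack(message.packet_id), timeout)
        except asyncio.TimeoutError:
            message.flags |= PacketFlags.RESENT  # mark retransmissions
    raise asyncio.TimeoutError(f"{message.name} was never ACKed")
```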
Salad Dais
d86da70eeb v0.8.0 2021-12-07 07:16:25 +00:00
Salad Dais
aa0b4b63a9 Update cx_freeze script to handle PySide6 2021-12-07 07:16:25 +00:00
Salad Dais
5f479e46b4 Automatically offer to install the HTTPS certs on first run 2021-12-07 07:16:25 +00:00
Salad Dais
1e55d5a9d8 Continue handling HTTP flows if flow logging fails
If flow beautification for display throws then we don't want
to bypass other handling of the flow.

This fixes a login failure due to SL's login XML-RPC endpoint
returning a Content-Type of "application/llsd+xml\r\n" when it's
actually "application/xml".
2021-12-06 17:01:13 +00:00
Salad Dais
077a95b5e7 Migrate to PySide6 to support Python 3.10
Update Glymur too
2021-12-06 13:37:31 +00:00
Salad Dais
4f1399cf66 Add note about LinHippoAutoProxy 2021-12-06 12:26:16 +00:00
Salad Dais
9590b30e66 Add note about Python 3.10 support 2021-12-05 20:25:06 +00:00
Salad Dais
34f3ee4c3e Move mtime wrapper to helpers 2021-12-05 18:14:26 +00:00
Salad Dais
7d655543f5 Don't reserialize responses as pretty LLSD-XML
Certain LLSD parsers don't like the empty text nodes it adds around
the root element of the document. Yuck.
2021-12-05 18:12:53 +00:00
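Concretely, using the document's own `llsd.format_xml`:

```python
from hippolyzer.lib.base import llsd

# Compact serialization: no whitespace text nodes between elements.
compact = llsd.format_xml({"region_id": 42})
# An indenting pretty-printer would instead wrap the root <map> in
# whitespace text nodes, which some LLSD parsers reject outright.
```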
Salad Dais
5de3ed0d5e Add support for LLSD inventory representations 2021-12-03 05:59:58 +00:00
Salad Dais
74c3287cc0 Add base addon for creating proxy-only caps based on ASGI apps 2021-12-02 06:04:29 +00:00
Salad Dais
3a7f8072a0 Initial implementation of proxy-provided caps
Useful for mocking out a cap while developing the viewer-side
pieces of it.
2021-12-02 03:22:47 +00:00
dependabot[bot]
5fa91580eb Bump mitmproxy from 7.0.2 to 7.0.3 (#21)
Bumps [mitmproxy](https://github.com/mitmproxy/mitmproxy) from 7.0.2 to 7.0.3.
- [Release notes](https://github.com/mitmproxy/mitmproxy/releases)
- [Changelog](https://github.com/mitmproxy/mitmproxy/blob/main/CHANGELOG.md)
- [Commits](https://github.com/mitmproxy/mitmproxy/compare/v7.0.2...v7.0.3)

---
updated-dependencies:
- dependency-name: mitmproxy
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-11-30 05:30:06 -04:00
Salad Dais
d8fbb55438 Improve LLUDP integration tests 2021-11-30 09:25:31 +00:00
Salad Dais
99eb4fed74 Fix _reorient_coord to work correctly for normals again 2021-11-30 09:24:49 +00:00
Salad Dais
6b78b841df Fix range of mesh normals 2021-11-23 01:36:14 +00:00
Salad Dais
dae852db69 Fix filter dialog 2021-11-19 04:30:36 +00:00
Salad Dais
0c0de2bcbc v0.7.1 2021-09-04 07:27:20 +00:00
Salad Dais
9f2d2f2194 Pin recordclass version, use requirements.txt for windows build
recordclass had some breaking changes in 0.15
2021-09-04 07:12:45 +00:00
Salad Dais
c6e0a400a9 v0.7.0 2021-08-10 01:16:20 +00:00
Salad Dais
d01122d542 Call correct method to raise new message log window 2021-08-10 01:11:21 +00:00
Salad Dais
690d6b51b8 Upgrade to mitmproxy 7.0.2
Our fix for `Flow.set_state()` has been upstreamed
2021-08-09 22:16:23 +00:00
Salad Dais
2437a8b14f Add a framework for simple local anim creation, tail animator 2021-08-05 21:08:18 +00:00
Salad Dais
afa601fffe Support session-specific viewer cache directories 2021-08-02 18:23:13 +00:00
Salad Dais
874feff471 Fix incorrect reference to mitmproxy class 2021-08-01 12:16:10 +00:00
Salad Dais
05c53bba9f Add CapsClient to BaseClientSession 2021-08-01 06:39:04 +00:00
Salad Dais
578f1d8c4e Add setting to disable all proxy object autorequests
Will help with #18 by not changing object request behaviour when
running through the proxy.
2021-08-01 06:37:33 +00:00
Salad Dais
7d8e18440a Add local anim mangler support with example
Analogous to local mesh mangler support.
2021-07-31 11:56:17 +00:00
Salad Dais
66e112dd52 Add basic message log import / export feature
Closes #20
2021-07-30 03:13:33 +00:00
Salad Dais
02ac022ab3 Add export formats for message log entries 2021-07-30 01:06:29 +00:00
Salad Dais
33ce74754e Fix mirror_target_agent check in http hooks 2021-07-30 01:06:29 +00:00
Salad Dais
74dd6b977c Add extended to_dict() format for Message class
This will allow proper import / export of message logs.
2021-07-29 10:26:42 +00:00
Salad Dais
387652731a Add Message Mirror example addon 2021-07-29 09:43:20 +00:00
Salad Dais
e4601fd879 Support multiple Message Log windows
Closes #19
2021-07-29 01:00:57 +00:00
Salad Dais
6eb25f96d9 Support logging to a hierarchy of message loggers
Necessary to eventually support multiple message log windows
2021-07-27 02:35:03 +00:00
Salad Dais
22b9eeb5cb Better handling of optional command parameters 2021-07-22 23:59:55 +00:00
Salad Dais
0dbedcb2f5 Improve coverage 2021-07-22 23:58:17 +00:00
Salad Dais
7d9712c16e Fix message dropping and queueing corner cases 2021-07-22 05:08:47 +00:00
Salad Dais
82663c0fc2 Add parse_bool helper function for command parameters 2021-07-21 06:39:29 +00:00
Salad Dais
9fb4884470 Extend TlsLayer.tls_start_server instead of monkeypatching OpenSSL funcs
We have a more elegant way of unsetting `X509_CHECK_FLAG_NEVER_CHECK_SUBJECT`
now that mitmproxy 7.0 is out.

See https://github.com/mitmproxy/mitmproxy/pull/4688
2021-07-19 20:17:31 +00:00
Salad Dais
cf69c42f67 Rework HTTP proxying code to work with mitmproxy 7.0.0 2021-07-18 07:02:45 +00:00
Salad Dais
be658b9026 v0.6.3
Cutting a release before working on mitmproxy upgrade
2021-07-18 06:57:40 +00:00
Salad Dais
c505941595 Improve test for TE serialization 2021-07-18 06:33:55 +00:00
Salad Dais
96f471d6b7 Add initial support for Message-specific Block subclasses 2021-07-07 12:49:32 +00:00
Salad Dais
4238016767 Change readme wording
:)
2021-07-07 12:49:32 +00:00
Salad Dais
a35a67718d Add default_value to MessageTemplateVariable 2021-07-01 21:25:51 +00:00
Salad Dais
c2981b107a Remove CodeQL scanning
Maybe later, doesn't seem to do anything useful out of the box.
2021-06-28 06:00:42 -03:00
Salad Dais
851375499a Add CodeQL scanning 2021-06-28 05:44:02 -03:00
Salad Dais
d064ecd466 Don't raise when reading a new avatar_name_cache.xml 2021-06-25 18:45:42 +00:00
Salad Dais
fda37656c9 Reduce boilerplate for mesh mangling addons
Makes it less annoying to compose separate addons with different manglers
2021-06-24 05:29:23 +00:00
Salad Dais
49a9c6f28f Workaround for failed teleports due to EventQueue timeouts
Closes #16
2021-06-23 16:43:09 +00:00
102 changed files with 5389 additions and 1161 deletions

View File

@@ -8,3 +8,5 @@ exclude_lines =
     if typing.TYPE_CHECKING:
     def __repr__
     raise AssertionError
+    assert False
+    pass

View File

@@ -2,18 +2,23 @@
 # onto the release after it gets created. Don't want actions with repo write.
 name: Bundle Windows EXE
 on:
   # Only trigger on release creation
   release:
     types:
       - created
+  workflow_dispatch:
+env:
+  target_tag: ${{ github.ref_name }}
 jobs:
   build:
-    runs-on: windows-latest
+    runs-on: windows-2019
+    permissions:
+      contents: write
     strategy:
       matrix:
         python-version: [3.9]
@@ -29,18 +34,29 @@ jobs:
       - name: Install dependencies
        run: |
           python -m pip install --upgrade pip
+          pip install -r requirements.txt
           pip install -e .
           pip install cx_freeze
       - name: Bundle with cx_Freeze
         shell: bash
         run: |
           python setup_cxfreeze.py build_exe
+          pip install pip-licenses
+          pip-licenses --format=plain-vertical --with-license-file --no-license-path --output-file=lib_licenses.txt
           python setup_cxfreeze.py finalize_cxfreeze
+          # Should only be one, but we don't know what it's named
+          mv ./dist/*.zip hippolyzer-windows-${{ env.target_tag }}.zip
       - name: Upload the artifact
         uses: actions/upload-artifact@v2
         with:
-          name: hippolyzer-gui-windows-${{ github.sha }}
-          path: ./dist/**
+          name: hippolyzer-windows-${{ github.sha }}
+          path: ./hippolyzer-windows-${{ env.target_tag }}.zip
+      - uses: ncipollo/release-action@v1.10.0
+        with:
+          artifacts: hippolyzer-windows-${{ env.target_tag }}.zip
+          tag: ${{ env.target_tag }}
+          token: ${{ secrets.GITHUB_TOKEN }}
+          allowUpdates: true

View File

@@ -8,7 +8,7 @@ jobs:
     runs-on: ubuntu-latest
     strategy:
       matrix:
-        python-version: [3.8, 3.9]
+        python-version: ["3.8", "3.10"]
     steps:
       - uses: actions/checkout@v2

View File

@@ -2,7 +2,7 @@
 ![Python Test Status](https://github.com/SaladDais/Hippolyzer/workflows/Run%20Python%20Tests/badge.svg) [![codecov](https://codecov.io/gh/SaladDais/Hippolyzer/branch/master/graph/badge.svg?token=HCTFA4RAXX)](https://codecov.io/gh/SaladDais/Hippolyzer)
 
-[Hippolyzer](http://wiki.secondlife.com/wiki/Hippo) is a fork of Linden Lab's abandoned
+[Hippolyzer](http://wiki.secondlife.com/wiki/Hippo) is a revival of Linden Lab's
 [PyOGP library](http://wiki.secondlife.com/wiki/PyOGP)
 targeting modern Python 3, with a focus on debugging issues in Second Life-compatible
 servers and clients. There is a secondary focus on mocking up new features without requiring a
@@ -83,6 +83,10 @@ SOCKS 5 works correctly on these platforms, so you can just configure it through
 the `no_proxy` env var appropriately. For ex. `no_proxy="asset-cdn.glb.agni.lindenlab.com" ./firestorm`.
 * Log in!
 
+Or, if you're on Linux, you can instead use [LinHippoAutoProxy](https://github.com/SaladDais/LinHippoAutoProxy)
+to launch your viewer, which will configure everything for you. Note that connections from the in-viewer browser will
+likely _not_ be run through Hippolyzer when using LinHippoAutoProxy.
+
 ### Filtering
 
 By default, the proxy's display filter is configured to ignore many high-frequency messages.
@@ -311,6 +315,22 @@ If you are a viewer developer, please put them in a viewer.
 apply the mesh to the local mesh target. It works on attachments too. Useful for testing rigs before a
 final, real upload.
 
+## REPL
+
+A quick and dirty REPL is also included for when you want to do ad-hoc introspection of proxy state.
+It can be launched at any time by typing `/524 spawn_repl` in chat.
+
+![Screenshot of REPL](https://github.com/SaladDais/Hippolyzer/blob/master/static/repl_screenshot.png?raw=true)
+
+The REPL is fully async aware and allows awaiting events without blocking:
+
+```python
+>>> from hippolyzer.lib.client.object_manager import ObjectUpdateType
+>>> evt = await session.objects.events.wait_for((ObjectUpdateType.OBJECT_UPDATE,), timeout=2.0)
+>>> evt.updated
+{'Position'}
+```
+
 ## Potential Changes
 
 * AISv3 wrapper?
@@ -375,6 +395,12 @@ To have your client's traffic proxied through Hippolyzer the general flow is:
 * The proxy needs to use content sniffing to figure out which requests are login requests,
   so make sure your request would pass `MITMProxyEventManager._is_login_request()`
 
+#### Do I have to do all that?
+
+You might be able to automate some of it on Linux by using
+[LinHippoAutoProxy](https://github.com/SaladDais/LinHippoAutoProxy). If you're on Windows or MacOS the
+above is your only option.
+
 ### Should I use this library to make an SL client in Python?
 
 No. If you just want to write a client in Python, you should instead look at using

View File

@@ -0,0 +1,32 @@
+"""
+Example anim mangler addon, to be used with local anim addon.
+
+You can edit this live to apply various transforms to local anims,
+as well as any uploaded anims. Any changes will be reflected in currently
+playing local anims.
+
+This example modifies any position keys of an animation's mHipRight joint.
+"""
+from hippolyzer.lib.base.llanim import Animation
+from hippolyzer.lib.proxy.addons import AddonManager
+
+import local_anim
+
+AddonManager.hot_reload(local_anim, require_addons_loaded=True)
+
+
+def offset_right_hip(anim: Animation):
+    hip_joint = anim.joints.get("mHipRight")
+    if hip_joint:
+        for pos_frame in hip_joint.pos_keyframes:
+            pos_frame.pos.Z *= 2.5
+            pos_frame.pos.X *= 5.0
+    return anim
+
+
+class ExampleAnimManglerAddon(local_anim.BaseAnimManglerAddon):
+    ANIM_MANGLERS = [
+        offset_right_hip,
+    ]
+
+
+addons = [ExampleAnimManglerAddon()]

View File

@@ -11,7 +11,7 @@ import enum
 import os.path
 from typing import *
 
-from PySide2 import QtCore, QtGui, QtWidgets
+from PySide6 import QtCore, QtGui, QtWidgets
 
 from hippolyzer.lib.base.datatypes import Vector3
 from hippolyzer.lib.base.message.message import Block, Message
@@ -80,7 +80,7 @@ class BlueishObjectListGUIAddon(BaseAddon):
             raise
 
     def _highlight_object(self, session: Session, obj: Object):
-        session.main_region.circuit.send_message(Message(
+        session.main_region.circuit.send(Message(
             "ForceObjectSelect",
             Block("Header", ResetList=False),
             Block("Data", LocalID=obj.LocalID),
@@ -88,7 +88,7 @@ class BlueishObjectListGUIAddon(BaseAddon):
         ))
 
     def _teleport_to_object(self, session: Session, obj: Object):
-        session.main_region.circuit.send_message(Message(
+        session.main_region.circuit.send(Message(
             "TeleportLocationRequest",
             Block("AgentData", AgentID=session.agent_id, SessionID=session.id),
             Block(

View File

@@ -4,8 +4,13 @@ Helper for making deformer anims. This could have a GUI I guess.
 import dataclasses
 from typing import *
 
+import numpy as np
+import transformations
+
 from hippolyzer.lib.base.datatypes import Vector3, Quaternion, UUID
 from hippolyzer.lib.base.llanim import Joint, Animation, PosKeyframe, RotKeyframe
+from hippolyzer.lib.base.mesh import MeshAsset, SegmentHeaderDict, SkinSegmentDict, LLMeshSerializer
+from hippolyzer.lib.base.serialization import BufferWriter
 from hippolyzer.lib.proxy.addon_utils import show_message, BaseAddon, SessionProperty
 from hippolyzer.lib.proxy.addons import AddonManager
 from hippolyzer.lib.proxy.commands import handle_command, Parameter
@@ -45,6 +50,58 @@ def build_deformer(joints: Dict[str, DeformerJoint]) -> bytes:
     return anim.to_bytes()
 
 
+def build_mesh_deformer(joints: Dict[str, DeformerJoint]) -> bytes:
+    skin_seg = SkinSegmentDict(
+        joint_names=[],
+        bind_shape_matrix=identity_mat4(),
+        inverse_bind_matrix=[],
+        alt_inverse_bind_matrix=[],
+        pelvis_offset=0.0,
+        lock_scale_if_joint_position=False
+    )
+    for joint_name, joint in joints.items():
+        # We can only represent joint translations, ignore this joint if it doesn't have any.
+        if not joint.pos:
+            continue
+        skin_seg['joint_names'].append(joint_name)
+        # Inverse bind matrix isn't actually used, so we can just give it a placeholder value of the
+        # identity mat4. This might break things in weird ways because the matrix isn't actually sensible.
+        skin_seg['inverse_bind_matrix'].append(identity_mat4())
+        # Create a flattened mat4 that only has a translation component of our joint pos.
+        # The viewer ignores any other component of these matrices so no point putting shear
+        # or perspective or whatever :)
+        joint_mat4 = pos_to_mat4(joint.pos)
+        # Ask the viewer to override this joint's usual parent-relative position with our matrix
+        skin_seg['alt_inverse_bind_matrix'].append(joint_mat4)
+
+    # Make a dummy mesh and shove our skin segment onto it. None of the tris are rigged, so the
+    # viewer will freak out and refuse to display the tri, only the joint translations will be used.
+    # Supposedly a mesh with a `skin` segment but no weights on the material should just result in an
+    # effectively unrigged material, but that's not the case. Oh well.
+    mesh = MeshAsset.make_triangle()
+    mesh.header['skin'] = SegmentHeaderDict(offset=0, size=0)
+    mesh.segments['skin'] = skin_seg
+    writer = BufferWriter("!")
+    writer.write(LLMeshSerializer(), mesh)
+    return writer.copy_buffer()
+
+
+def identity_mat4() -> List[float]:
+    """
+    Return an "Identity" mat4
+
+    Effectively represents a transform of no rot, no translation, no shear, no perspective
+    and scaling by 1.0 on every axis.
+    """
+    return list(np.identity(4).flatten('F'))
+
+
+def pos_to_mat4(pos: Vector3) -> List[float]:
+    """Convert a position Vector3 to a Translation Mat4"""
+    return list(transformations.compose_matrix(translate=tuple(pos)).flatten('F'))
+
+
 class DeformerAddon(BaseAddon):
     deform_joints: Dict[str, DeformerJoint] = SessionProperty(dict)
@@ -118,5 +175,41 @@ class DeformerAddon(BaseAddon):
             self._reapply_deformer(session, region)
         return True
 
+    @handle_command()
+    async def save_deformer_as_mesh(self, _session: Session, _region: ProxiedRegion):
+        """
+        Export the deformer as a crafted rigged mesh rather than an animation
+
+        Mesh deformers have the advantage that they don't cause your joints to "stick"
+        like animations do when using animations with pos keyframes.
+        """
+        filename = await AddonManager.UI.save_file(filter_str="LL Mesh (*.llmesh)")
+        if not filename:
+            return
+        with open(filename, "wb") as f:
+            f.write(build_mesh_deformer(self.deform_joints))
+
+    @handle_command()
+    async def upload_deformer_as_mesh(self, _session: Session, region: ProxiedRegion):
+        """Same as save_deformer_as_mesh, but uploads the mesh directly to SL."""
+        mesh_bytes = build_mesh_deformer(self.deform_joints)
+        try:
+            # Send off mesh to calculate upload cost
+            upload_token = await region.asset_uploader.initiate_mesh_upload("deformer", mesh_bytes)
+        except Exception as e:
+            show_message(e)
+            raise
+        if not await AddonManager.UI.confirm("Upload", f"Spend {upload_token.linden_cost}L on upload?"):
+            return
+        # Do the actual upload
+        try:
+            await region.asset_uploader.complete_upload(upload_token)
+        except Exception as e:
+            show_message(e)
+            raise
+
 
 addons = [DeformerAddon()]

View File

@@ -0,0 +1,158 @@
+"""
+Detect receipt of a marketplace order for a demo, and auto-attach the most appropriate object
+"""
+import asyncio
+import re
+from typing import List, Tuple, Dict, Optional, Sequence
+
+from hippolyzer.lib.base.datatypes import UUID
+from hippolyzer.lib.base.message.message import Message, Block
+from hippolyzer.lib.base.templates import InventoryType, Permissions, FolderType
+from hippolyzer.lib.proxy.addon_utils import BaseAddon, show_message
+from hippolyzer.lib.proxy.region import ProxiedRegion
+from hippolyzer.lib.proxy.sessions import Session
+
+MARKETPLACE_TRANSACTION_ID = UUID('ffffffff-ffff-ffff-ffff-ffffffffffff')
+
+
+class DemoAutoAttacher(BaseAddon):
+    def handle_eq_event(self, session: Session, region: ProxiedRegion, event: dict):
+        if event["message"] != "BulkUpdateInventory":
+            return
+        # Check that this update even possibly came from the marketplace
+        if event["body"]["AgentData"][0]["TransactionID"] != MARKETPLACE_TRANSACTION_ID:
+            return
+        # Make sure that the transaction targeted our real received items folder
+        folders = event["body"]["FolderData"]
+        received_folder = folders[0]
+        if received_folder["Name"] != "Received Items":
+            return
+        skel = session.login_data['inventory-skeleton']
+        actual_received = [x for x in skel if x['type_default'] == FolderType.INBOX]
+        assert actual_received
+        if UUID(actual_received[0]['folder_id']) != received_folder["FolderID"]:
+            show_message(f"Strange received folder ID spoofing? {folders!r}")
+            return
+        if not re.match(r".*\bdemo\b.*", folders[1]["Name"], flags=re.I):
+            return
+        # Alright, so we have a demo... thing from the marketplace. What now?
+        items = event["body"]["ItemData"]
+        object_items = [x for x in items if x["InvType"] == InventoryType.OBJECT]
+        if not object_items:
+            return
+        self._schedule_task(self._attach_best_object(session, region, object_items))
+
+    async def _attach_best_object(self, session: Session, region: ProxiedRegion, object_items: List[Dict]):
+        own_body_type = await self._guess_own_body(session, region)
+        show_message(f"Trying to find demo for {own_body_type}")
+        guess_patterns = self.BODY_CLOTHING_PATTERNS.get(own_body_type)
+        to_attach = []
+        if own_body_type and guess_patterns:
+            matching_items = self._get_matching_items(object_items, guess_patterns)
+            if matching_items:
+                # Only take the first one
+                to_attach.append(matching_items[0])
+        if not to_attach:
+            # Don't know what body's being used or couldn't figure out what item
+            # would work best with our body. Just attach the first object in the folder.
+            to_attach.append(object_items[0])
+        # Also attach whatever HUDs, maybe we need them.
+        for hud in self._get_matching_items(object_items, ("hud",)):
+            if hud not in to_attach:
+                to_attach.append(hud)
+        region.circuit.send(Message(
+            'RezMultipleAttachmentsFromInv',
+            Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
+            Block('HeaderData', CompoundMsgID=UUID.random(), TotalObjects=len(to_attach), FirstDetachAll=0),
+            *[Block(
+                'ObjectData',
+                ItemID=o["ItemID"],
+                OwnerID=session.agent_id,
+                # 128 = "add", uses whatever attachmentpt was defined on the object
+                AttachmentPt=128,
+                ItemFlags_=(),
+                GroupMask_=(),
+                EveryoneMask_=(),
+                NextOwnerMask_=(Permissions.COPY | Permissions.MOVE),
+                Name=o["Name"],
+                Description=o["Description"],
+            ) for o in to_attach]
+        ))
+
+    def _get_matching_items(self, items: List[dict], patterns: Sequence[str]):
+        # Loop over patterns to search for our body type, in order of preference
+        matched = []
+        for guess_pattern in patterns:
+            # Check each item for that pattern
+            for item in items:
+                if re.match(rf".*\b{guess_pattern}\b.*", item["Name"], re.I):
+                    matched.append(item)
+        return matched
+
+    # We scan the agent's attached objects to guess what kind of body they use
+    BODY_PREFIXES = {
+        "-Belleza- Jake ": "jake",
+        "-Belleza- Freya ": "freya",
+        "-Belleza- Isis ": "isis",
+        "-Belleza- Venus ": "venus",
+        "[Signature] Gianni Body": "gianni",
+        "[Signature] Geralt Body": "geralt",
+        "Maitreya Mesh Body - Lara": "maitreya",
+        "Slink Physique Hourglass Petite": "hg_petite",
+        "Slink Physique Mesh Body Hourglass": "hourglass",
+        "Slink Physique Original Petite": "phys_petite",
+        "Slink Physique Mesh Body Original": "physique",
+        "[BODY] Legacy (f)": "legacy_f",
+        "[BODY] Legacy (m)": "legacy_m",
+        "[Signature] Alice Body": "sig_alice",
+        "Slink Physique MALE Mesh Body": "slink_male",
+        "AESTHETIC - [Mesh Body]": "aesthetic",
+    }
+
+    # Different bodies' clothes have different naming conventions according to different merchants.
+    # These are common naming patterns we use to choose objects to attach, in order of preference.
+    BODY_CLOTHING_PATTERNS: Dict[str, Tuple[str, ...]] = {
+        "jake": ("jake", "belleza"),
+        "freya": ("freya", "belleza"),
+        "isis": ("isis", "belleza"),
+        "venus": ("venus", "belleza"),
+        "gianni": ("gianni", "signature", "sig"),
+        "geralt": ("geralt", "signature", "sig"),
+        "hg_petite": ("hourglass petite", "hg petite", "hourglass", "hg", "slink"),
+        "hourglass": ("hourglass", "hg", "slink"),
+        "phys_petite": ("physique petite", "phys petite", "physique", "phys", "slink"),
+        "physique": ("physique", "phys", "slink"),
+        "legacy_f": ("legacy",),
+        "legacy_m": ("legacy",),
+        "sig_alice": ("alice", "signature"),
+        "slink_male": ("physique", "slink"),
+        "aesthetic": ("aesthetic",),
+    }
+
+    async def _guess_own_body(self, session: Session, region: ProxiedRegion) -> Optional[str]:
+        agent_obj = region.objects.lookup_fullid(session.agent_id)
+        if not agent_obj:
+            return None
+        # We probably won't know the names for all of our attachments, request them.
+        # Could be obviated by looking at the COF, not worth it for this.
+        try:
+            await asyncio.wait(region.objects.request_object_properties(agent_obj.Children), timeout=0.5)
+        except asyncio.TimeoutError:
+            # We expect that we just won't ever receive some property requests, that's fine
+            pass
+        for prefix, body_type in self.BODY_PREFIXES.items():
+            for obj in agent_obj.Children:
+                if not obj.Name:
+                    continue
+                if obj.Name.startswith(prefix):
+                    return body_type
+        return None
+
+
+addons = [DemoAutoAttacher()]

View File

@@ -0,0 +1,119 @@
+"""
+Loading task inventory doesn't actually need to be slow.
+
+By using a cap instead of the slow xfer path and sending the LLSD inventory
+model, we get 15x speedups even when mocking things behind the scenes by using
+a hacked up version of xfer. See turbo_object_inventory.py
+"""
+import asyncio
+
+import asgiref.wsgi
+from typing import *
+from flask import Flask, Response, request
+
+from hippolyzer.lib.base import llsd
+from hippolyzer.lib.base.datatypes import UUID
+from hippolyzer.lib.base.inventory import InventoryModel, InventoryObject
+from hippolyzer.lib.base.message.message import Message, Block
+from hippolyzer.lib.base.templates import XferFilePath
+from hippolyzer.lib.proxy import addon_ctx
+from hippolyzer.lib.proxy.webapp_cap_addon import WebAppCapAddon
+
+app = Flask("GetTaskInventoryCapApp")
+
+
+@app.route('/', methods=["GET"])
+async def get_task_inventory():
+    # Should always have the current region, the cap handler is bound to one.
+    # Just need to pull it from the `addon_ctx` module's global.
+    region = addon_ctx.region.get()
+    session = addon_ctx.session.get()
+    obj_id = UUID(request.args["task_id"])
+    obj = region.objects.lookup_fullid(obj_id)
+    if not obj:
+        return Response(f"Couldn't find {obj_id}", status=404, mimetype="text/plain")
+
+    request_msg = Message(
+        'RequestTaskInventory',
+        Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
+        Block('InventoryData', LocalID=obj.LocalID),
+    )
+    # Keep around a dict of chunks we saw previously in case we have to restart
+    # an Xfer due to missing chunks. We don't expect chunks to change across Xfers,
+    # so this can be used to recover from dropped SendXferPackets in subsequent attempts.
+    existing_chunks: Dict[int, bytes] = {}
+    for _ in range(3):
+        # Any previous requests will have triggered a delete of the inventory file
+        # by marking it complete on the server-side. Re-send our RequestTaskInventory
+        # to make sure there's a fresh copy.
+        region.circuit.send(request_msg.take())
+        inv_message = await region.message_handler.wait_for(
+            ('ReplyTaskInventory',),
+            predicate=lambda x: x["InventoryData"]["TaskID"] == obj.FullID,
+            timeout=5.0,
+        )
+        # No task inventory, send the reply as-is
+        file_name = inv_message["InventoryData"]["Filename"]
+        if not file_name:
+            # The "Contents" folder always has to be there; if we don't put it here
+            # then the viewer will have to lie about it being there itself.
+            return Response(
+                llsd.format_xml({
+                    "inventory": [
+                        InventoryObject(
+                            name="Contents",
+                            parent_id=UUID.ZERO,
+                            type="category",
+                            obj_id=obj_id,
+                        ).to_llsd()
+                    ],
+                    "inv_serial": inv_message["InventoryData"]["Serial"],
+                }),
+                headers={"Content-Type": "application/llsd+xml"},
+                status=200,
+            )
+
+        last_serial = request.args.get("last_serial", None)
+        if last_serial:
+            last_serial = int(last_serial)
+            if inv_message["InventoryData"]["Serial"] == last_serial:
+                # Nothing has changed since the version of the inventory they say they have, say so.
+                return Response("", status=304)
+
+        xfer = region.xfer_manager.request(
+            file_name=file_name,
+            file_path=XferFilePath.CACHE,
+            turbo=True,
+        )
+        xfer.chunks.update(existing_chunks)
+        try:
+            await xfer
+        except asyncio.TimeoutError:
+            # We likely failed the request due to missing chunks, store
+            # the chunks that we _did_ get for the next attempt.
+            existing_chunks.update(xfer.chunks)
+            continue
+        inv_model = InventoryModel.from_str(xfer.reassemble_chunks().decode("utf8"))
+        return Response(
+            llsd.format_xml({
+                "inventory": inv_model.to_llsd(),
+                "inv_serial": inv_message["InventoryData"]["Serial"],
+            }),
+            headers={"Content-Type": "application/llsd+xml"},
+        )
+    raise asyncio.TimeoutError("Failed to get inventory after 3 tries")
+
+
+class GetTaskInventoryCapExampleAddon(WebAppCapAddon):
+    # A cap URL with this name will be tied to each region when
+    # the sim is first connected to. The URL will be returned to the
+    # viewer in the Seed if the viewer requests it by name.
+    CAP_NAME = "GetTaskInventoryExample"
+    # Any ASGI app should be fine.
+    APP = asgiref.wsgi.WsgiToAsgi(app)
+
+
+addons = [GetTaskInventoryCapExampleAddon()]

View File

@@ -105,7 +105,7 @@ class HorrorAnimatorAddon(BaseAddon):
             # send the response back immediately
             block = STATIC_VFS[orig_anim_id]
             anim_data = STATIC_VFS.read_block(block)
-            flow.response = mitmproxy.http.HTTPResponse.make(
+            flow.response = mitmproxy.http.Response.make(
                 200,
                 _mutate_anim_bytes(anim_data),
                 {

View File

@@ -5,42 +5,58 @@ Local animations
 assuming you loaded something.anim
 
 /524 start_local_anim something
 /524 stop_local_anim something
+/524 save_local_anim something
 
 If you want to trigger the animation from an object to simulate llStartAnimation():
 
 llOwnerSay("@start_local_anim:something=force");
+
+Also includes a concept of "anim manglers" similar to the "mesh manglers" of the
+local mesh addon. This is useful if you want to test making procedural changes
+to animations before uploading them. The manglers will be applied to any uploaded
+animations as well.
+
+May also be useful if you need to make ad-hoc changes to a bunch of animations on
+bulk upload, like changing priority or removing a joint.
 """
 import asyncio
-import os
+import logging
 import pathlib
+from abc import abstractmethod
 from typing import *
 
+from hippolyzer.lib.base import serialization as se
 from hippolyzer.lib.base.datatypes import UUID
+from hippolyzer.lib.base.helpers import get_mtime
+from hippolyzer.lib.base.llanim import Animation
 from hippolyzer.lib.base.message.message import Block, Message
+from hippolyzer.lib.base.message.msgtypes import PacketFlags
+from hippolyzer.lib.proxy import addon_ctx
 from hippolyzer.lib.proxy.addons import AddonManager
-from hippolyzer.lib.proxy.addon_utils import BaseAddon, SessionProperty
+from hippolyzer.lib.proxy.addon_utils import BaseAddon, SessionProperty, GlobalProperty, show_message
 from hippolyzer.lib.proxy.commands import handle_command
+from hippolyzer.lib.proxy.http_asset_repo import HTTPAssetRepo
+from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
 from hippolyzer.lib.proxy.region import ProxiedRegion
-from hippolyzer.lib.proxy.sessions import Session
-
-
-def _get_mtime(path: str):
-    try:
-        return os.stat(path).st_mtime
-    except:
-        return None
+from hippolyzer.lib.proxy.sessions import Session, SessionManager
 
 
 class LocalAnimAddon(BaseAddon):
     # name -> path, only for anims actually from files
     local_anim_paths: Dict[str, str] = SessionProperty(dict)
+    # name -> anim bytes
+    local_anim_bytes: Dict[str, bytes] = SessionProperty(dict)
     # name -> mtime or None. Only for anims from files.
     local_anim_mtimes: Dict[str, Optional[float]] = SessionProperty(dict)
     # name -> current asset ID (changes each play)
     local_anim_playing_ids: Dict[str, UUID] = SessionProperty(dict)
+    anim_manglers: List[Callable[[Animation], Animation]] = GlobalProperty(list)
+
+    def handle_init(self, session_manager: SessionManager):
+        self.remangle_local_anims(session_manager)
 
     def handle_session_init(self, session: Session):
+        # Reload anims and reload any manglers if we have any
         self._schedule_task(self._try_reload_anims(session))
 
     @handle_command()
@@ -66,11 +82,23 @@
         """Stop a named local animation"""
         self.apply_local_anim(session, region, anim_name, new_data=None)
 
+    @handle_command(anim_name=str)
+    async def save_local_anim(self, _session: Session, _region: ProxiedRegion, anim_name: str):
+        """Save a named local anim to disk"""
+        anim_bytes = self.local_anim_bytes.get(anim_name)
+        if not anim_bytes:
+            return
+        filename = await AddonManager.UI.save_file(filter_str="SL Anim (*.anim)", default_suffix="anim")
+        if not filename:
+            return
+        with open(filename, "wb") as f:
+            f.write(anim_bytes)
+
     async def _try_reload_anims(self, session: Session):
         while True:
             region = session.main_region
             if not region:
-                await asyncio.sleep(2.0)
+                await asyncio.sleep(1.0)
                 continue
             # Loop over local anims we loaded
@@ -79,8 +107,11 @@
                 if not anim_id:
                     continue
                 # is playing right now, check if there's a newer version
-                self.apply_local_anim_from_file(session, region, anim_name, only_if_changed=True)
-            await asyncio.sleep(2.0)
+                try:
+                    self.apply_local_anim_from_file(session, region, anim_name, only_if_changed=True)
+                except Exception:
+                    logging.exception("Exploded while replaying animation")
+            await asyncio.sleep(1.0)
 
     def handle_rlv_command(self, session: Session, region: ProxiedRegion, source: UUID,
                            cmd: str, options: List[str], param: str):
@@ -107,6 +138,7 @@
                 AgentID=session.agent_id,
                 SessionID=session.id,
             ),
+            flags=PacketFlags.RELIABLE,
         )
 
         # Stop any old version of the anim that might be playing first
@@ -127,11 +159,13 @@
                     StartAnim=True,
                 ))
             cls.local_anim_playing_ids[anim_name] = next_id
+            cls.local_anim_bytes[anim_name] = new_data
         else:
            # No data means just stop the anim
            cls.local_anim_playing_ids.pop(anim_name, None)
+           cls.local_anim_bytes.pop(anim_name, None)
-        region.circuit.send_message(new_msg)
+        region.circuit.send(new_msg)
         print(f"Changing {anim_name} to {next_id}")
 
     @classmethod
@@ -141,11 +175,10 @@
         anim_data = None
         if anim_path:
             old_mtime = cls.local_anim_mtimes.get(anim_name)
-            mtime = _get_mtime(anim_path)
+            mtime = get_mtime(anim_path)
             if only_if_changed and old_mtime == mtime:
                 return
-            cls.local_anim_mtimes[anim_name] = mtime
             # file might not even exist anymore if mtime is `None`,
             # anim will automatically stop if that happens.
             if mtime:
@@ -156,9 +189,95 @@
                 with open(anim_path, "rb") as f:
                     anim_data = f.read()
+                anim_data = cls._mangle_anim(anim_data)
+                cls.local_anim_mtimes[anim_name] = mtime
         else:
             print(f"Unknown anim {anim_name!r}")
         cls.apply_local_anim(session, region, anim_name, new_data=anim_data)
 
+    @classmethod
+    def _mangle_anim(cls, anim_data: bytes) -> bytes:
+        if not cls.anim_manglers:
+            return anim_data
+        reader = se.BufferReader("<", anim_data)
+        spec = se.Dataclass(Animation)
+        anim = reader.read(spec)
+        for mangler in cls.anim_manglers:
+            anim = mangler(anim)
+        writer = se.BufferWriter("<")
+        writer.write(spec, anim)
+        return writer.copy_buffer()
+
+    @classmethod
+    def remangle_local_anims(cls, session_manager: SessionManager):
+        # Anim manglers are global, so we need to re-mangle anims for all sessions
+        for session in session_manager.sessions:
+            # Push the context of this session onto the stack so we can access
+            # session-scoped properties
+            with addon_ctx.push(new_session=session, new_region=session.main_region):
+                cls.local_anim_mtimes.clear()
+
+    def handle_http_request(self, session_manager: SessionManager, flow: HippoHTTPFlow):
+        if flow.name == "NewFileAgentInventoryUploader":
+            # Don't bother looking at this if we have no manglers
+            if not self.anim_manglers:
+                return
+            # This is kind of a crappy match but these magic bytes shouldn't match anything that SL
+            # allows as an upload type but animations.
+            if not flow.request.content or not flow.request.content.startswith(b"\x01\x00\x00\x00"):
+                return
+            # Replace the uploaded anim with the mangled version
+            flow.request.content = self._mangle_anim(flow.request.content)
+            show_message("Mangled upload request")
+
+
+class BaseAnimManglerAddon(BaseAddon):
+    """Base class for addons that mangle uploaded or file-based local animations"""
+    ANIM_MANGLERS: List[Callable[[Animation], Animation]]
+
+    def handle_init(self, session_manager: SessionManager):
+        # Add our manglers into the list
+        LocalAnimAddon.anim_manglers.extend(self.ANIM_MANGLERS)
+        LocalAnimAddon.remangle_local_anims(session_manager)
+
+    def handle_unload(self, session_manager: SessionManager):
+        # Clean up our manglers before we go away
+        mangler_list = LocalAnimAddon.anim_manglers
+        for mangler in self.ANIM_MANGLERS:
+            if mangler in mangler_list:
+                mangler_list.remove(mangler)
+        LocalAnimAddon.remangle_local_anims(session_manager)
+
+
+class BaseAnimHelperAddon(BaseAddon):
+    """
+    Base class for local creation of procedural animations
+
+    Animation generated by build_anim() gets applied to all active sessions
+    """
+    ANIM_NAME: str
+
+    def handle_session_init(self, session: Session):
+        self._reapply_anim(session, session.main_region)
+
+    def handle_session_closed(self, session: Session):
+        LocalAnimAddon.apply_local_anim(session, session.main_region, self.ANIM_NAME, None)
+
+    def handle_unload(self, session_manager: SessionManager):
+        for session in session_manager.sessions:
+            # TODO: Nasty. Since we need to access session-local attrs we need to set the
+            #  context even though we also explicitly pass session and region.
+            #  Need to rethink the LocalAnimAddon API.
+            with addon_ctx.push(session, session.main_region):
+                LocalAnimAddon.apply_local_anim(session, session.main_region, self.ANIM_NAME, None)
+
+    @abstractmethod
+    def build_anim(self) -> Animation:
+        pass
+
+    def _reapply_anim(self, session: Session, region: ProxiedRegion):
+        LocalAnimAddon.apply_local_anim(session, region, self.ANIM_NAME, self.build_anim().to_bytes())
+
+
 addons = [LocalAnimAddon()]

View File

@@ -81,17 +81,16 @@
     @handle_command()
     async def set_local_mesh_target(self, session: Session, region: ProxiedRegion):
-        """Set the currently selected object as the target for local mesh"""
-        parent_object = region.objects.lookup_localid(session.selected.object_local)
-        if not parent_object:
+        """Set the currently selected objects as the target for local mesh"""
+        selected_links = [region.objects.lookup_localid(l_id) for l_id in session.selected.object_locals]
+        selected_links = [o for o in selected_links if o is not None]
+        if not selected_links:
             show_message("Nothing selected")
             return
-        linkset_objects = [parent_object] + parent_object.Children
         old_locals = self.local_mesh_target_locals
         self.local_mesh_target_locals = [
             x.LocalID
-            for x in linkset_objects
+            for x in selected_links
             if ExtraParamType.MESH in x.ExtraParams
         ]
@@ -201,7 +200,7 @@
         self.local_mesh_mapping = {x["mesh_name"]: x["mesh"] for x in instances}
         # Fake a response, we don't want to actually send off the request.
-        flow.response = mitmproxy.http.HTTPResponse.make(
+        flow.response = mitmproxy.http.Response.make(
            200,
            b"",
            {
@@ -280,4 +279,23 @@
             cls._replace_local_mesh(session.main_region, asset_repo, mesh_list)
 
 
+class BaseMeshManglerAddon(BaseAddon):
+    """Base class for addons that mangle uploaded or local mesh"""
+    MESH_MANGLERS: List[Callable[[MeshAsset], MeshAsset]]
+
+    def handle_init(self, session_manager: SessionManager):
+        # Add our manglers into the list
+        MeshUploadInterceptingAddon.mesh_manglers.extend(self.MESH_MANGLERS)
+        # Tell the local mesh plugin that the mangler list changed, and to re-apply
+        MeshUploadInterceptingAddon.remangle_local_mesh(session_manager)
+
+    def handle_unload(self, session_manager: SessionManager):
+        # Clean up our manglers before we go away
+        mangler_list = MeshUploadInterceptingAddon.mesh_manglers
+        for mangler in self.MESH_MANGLERS:
+            if mangler in mangler_list:
+                mangler_list.remove(mangler)
+        MeshUploadInterceptingAddon.remangle_local_mesh(session_manager)
+
+
 addons = [MeshUploadInterceptingAddon()]

View File

@@ -11,25 +11,28 @@ to add to give a mesh an arbitrary center of rotation / scaling.
 from hippolyzer.lib.base.mesh import MeshAsset
 from hippolyzer.lib.proxy.addons import AddonManager
 from hippolyzer.lib.proxy.addon_utils import BaseAddon
-from hippolyzer.lib.proxy.sessions import SessionManager
 
 import local_mesh
 
 AddonManager.hot_reload(local_mesh, require_addons_loaded=True)
 
 
-def _reorient_coord(coord, orientation):
+def _reorient_coord(coord, orientation, normals=False):
     coords = []
     for axis in orientation:
         axis_idx = abs(axis) - 1
-        coords.append(coord[axis_idx] if axis >= 0 else 1.0 - coord[axis_idx])
+        if normals:
+            # Normals have a static domain from -1.0 to 1.0, just negate.
+            new_coord = coord[axis_idx] if axis >= 0 else -coord[axis_idx]
+        else:
+            new_coord = coord[axis_idx] if axis >= 0 else 1.0 - coord[axis_idx]
+        coords.append(new_coord)
     if coord.__class__ in (list, tuple):
         return coord.__class__(coords)
     return coord.__class__(*coords)
 
 
-def _reorient_coord_list(coord_list, orientation):
-    return [_reorient_coord(x, orientation) for x in coord_list]
+def _reorient_coord_list(coord_list, orientation, normals=False):
+    return [_reorient_coord(x, orientation, normals) for x in coord_list]
 
 
 def reorient_mesh(orientation):
@@ -37,37 +40,23 @@
     # X=1, Y=2, Z=3
     def _reorienter(mesh: MeshAsset):
         for material in mesh.iter_lod_materials():
+            if "Position" not in material:
+                # Must be a NoGeometry LOD
+                continue
             # We don't need to use positions_(to/from)_domain here since we're just naively
             # flipping the axes around.
             material["Position"] = _reorient_coord_list(material["Position"], orientation)
             # Are you even supposed to do this to the normals?
-            material["Normal"] = _reorient_coord_list(material["Normal"], orientation)
+            material["Normal"] = _reorient_coord_list(material["Normal"], orientation, normals=True)
         return mesh
     return _reorienter
 
 
-OUR_MANGLERS = [
-    # Negate the X and Y axes on any mesh we upload or create temp
-    reorient_mesh((-1, -2, 3)),
-]
+class ExampleMeshManglerAddon(local_mesh.BaseMeshManglerAddon):
+    MESH_MANGLERS = [
+        # Negate the X and Y axes on any mesh we upload or create temp
+        reorient_mesh((-1, -2, 3)),
+    ]
 
 
-class MeshManglerExampleAddon(BaseAddon):
-    def handle_init(self, session_manager: SessionManager):
-        # Add our manglers into the list
-        local_mesh_addon = local_mesh.MeshUploadInterceptingAddon
-        local_mesh_addon.mesh_manglers.extend(OUR_MANGLERS)
-        # Tell the local mesh plugin that the mangler list changed, and to re-apply
-        local_mesh_addon.remangle_local_mesh(session_manager)
-
-    def handle_unload(self, session_manager: SessionManager):
-        # Clean up our manglers before we go away
-        local_mesh_addon = local_mesh.MeshUploadInterceptingAddon
-        mangler_list = local_mesh_addon.mesh_manglers
-        for mangler in OUR_MANGLERS:
-            if mangler in mangler_list:
-                mangler_list.remove(mangler)
-        local_mesh_addon.remangle_local_mesh(session_manager)
-
-
-addons = [MeshManglerExampleAddon()]
+addons = [ExampleMeshManglerAddon()]

View File

@@ -0,0 +1,244 @@
"""
Message Mirror
Re-routes messages through the circuit of another agent running through this proxy,
rewriting the messages to use the credentials tied to that circuit.
Useful if you need to quickly QA authorization checks on a message handler or script.
Or if you want to chat as two people at once. Whatever.
Also shows some advanced ways of managing / rerouting Messages and HTTP flows.
Fiddle with the values of `SEND_NORMALLY` and `MIRROR` to change how and which
messages get moved to other circuits.
Usage: /524 mirror_to <mirror_agent_uuid>
To Disable: /524 mirror_to
"""
import weakref
from typing import Optional
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.message.template_dict import DEFAULT_TEMPLATE_DICT
from hippolyzer.lib.base.network.transport import Direction
from hippolyzer.lib.proxy.addon_utils import BaseAddon, SessionProperty, show_message
from hippolyzer.lib.proxy.commands import handle_command, Parameter, parse_bool
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.caps import CapData, CapType
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session, SessionManager
# Things that make no sense to mirror, or will make everything explode if mirrored.
SEND_NORMALLY = {
'StartPingCheck', 'CompletePingCheck', 'PacketAck', 'SimulatorViewerTimeMessage', 'SimStats',
'SoundTrigger', 'EventQueueGet', 'GetMesh', 'GetMesh2', 'ParcelDwellRequest', 'ViewerEffect', 'ViewerStats',
'ParcelAccessListRequest', 'FirestormBridge', 'AvatarRenderInfo', 'ParcelPropertiesRequest', 'GetObjectCost',
'RequestMultipleObjects', 'GetObjectPhysicsData', 'GetExperienceInfo', 'RequestTaskInventory', 'AgentRequestSit',
'MuteListRequest', 'UpdateMuteListEntry', 'RemoveMuteListEntry', 'RequestImage',
'AgentThrottle', 'UseCircuitCode', 'AgentWearablesRequest', 'AvatarPickerRequest', 'CloseCircuit',
'CompleteAgentMovement', 'RegionHandshakeReply', 'LogoutRequest', 'ParcelPropertiesRequest',
'ParcelPropertiesRequestByID', 'MapBlockRequest', 'MapLayerRequest', 'MapItemRequest', 'MapNameRequest',
'ParcelAccessListRequest', 'AvatarPropertiesRequest', 'DirFindQuery',
'SetAlwaysRun', 'GetDisplayNames', 'ViewerMetrics', 'AgentResume', 'AgentPause',
'ViewerAsset', 'GetTexture', 'UUIDNameRequest', 'AgentUpdate', 'AgentAnimation'
# Would just be confusing for everyone
'ImprovedInstantMessage',
# Xfer system isn't authed to begin with, and duping Xfers can lead to premature file deletion. Skip.
'RequestXfer', 'ConfirmXferPacket', 'AbortXfer', 'SendXferPacket',
}
# Messages that _must_ be sent normally, but are worth mirroring onto the target session to see how
# they would respond
MIRROR = {
'RequestObjectPropertiesFamily', 'ObjectSelect', 'RequestObjectProperties', 'TransferRequest',
'RequestMultipleObjects', 'RequestTaskInventory', 'FetchInventory2', 'ScriptDialogReply',
'ObjectDeselect', 'GenericMessage', 'ChatFromViewer'
}
for msg_name in DEFAULT_TEMPLATE_DICT.message_templates.keys():
# There are a lot of these.
if msg_name.startswith("Group") and msg_name.endswith("Request"):
MIRROR.add(msg_name)
class MessageMirrorAddon(BaseAddon):
mirror_target_agent: Optional[UUID] = SessionProperty(None)
mirror_use_target_session: bool = SessionProperty(True)
mirror_use_target_agent: bool = SessionProperty(True)
@handle_command(target_agent=Parameter(UUID, optional=True))
async def mirror_to(self, session: Session, _region, target_agent: Optional[UUID] = None):
"""
Send this session's outbound messages over another proxied agent's circuit
"""
if target_agent:
if target_agent == session.agent_id:
show_message("Can't mirror our own session")
target_agent = None
elif not any(s.agent_id == target_agent for s in session.session_manager.sessions):
show_message(f"No active proxied session for agent {target_agent}")
target_agent = None
self.mirror_target_agent = target_agent
if target_agent:
show_message(f"Mirroring to {target_agent}")
else:
show_message("Message mirroring disabled")
@handle_command(enabled=parse_bool)
async def set_mirror_use_target_session(self, _session, _region, enabled):
"""Replace the original session ID with the target session's ID when mirroring"""
self.mirror_use_target_session = enabled
@handle_command(enabled=parse_bool)
async def set_mirror_use_target_agent(self, _session, _region, enabled):
"""Replace the original agent ID with the target agent's ID when mirroring"""
self.mirror_use_target_agent = enabled
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
if message.direction != Direction.OUT:
return
if not self.mirror_target_agent:
return
if message.name in SEND_NORMALLY:
return
target_session = None
for poss_session in session.session_manager.sessions:
if poss_session.agent_id == self.mirror_target_agent:
target_session = poss_session
if not target_session:
print("Couldn't find target session?")
return
target_region = None
for poss_region in target_session.regions:
if poss_region.circuit_addr == region.circuit_addr:
target_region = poss_region
if not target_region:
print("Couldn't find equivalent target region?")
return
# Send the message normally first if we're mirroring
if message.name in MIRROR:
region.circuit.send(message)
# Since we're going to send the message on a new circuit, we need to take
# it so that it gets a fresh packet ID and clean ACKs
message = message.take()
self._lludp_fixups(target_session, message)
target_region.circuit.send(message)
return True
def _lludp_fixups(self, target_session: Session, message: Message):
if "AgentData" in message:
agent_block = message["AgentData"][0]
if "AgentID" in agent_block and self.mirror_use_target_agent:
agent_block["AgentID"] = target_session.agent_id
if "SessionID" in agent_block and self.mirror_use_target_session:
agent_block["SessionID"] = target_session.id
if message.name == "TransferRequest":
transfer_block = message["TransferInfo"][0]
# This is a duplicated message so we need to give it a new ID
transfer_block["TransferID"] = UUID.random()
params = transfer_block.deserialize_var("Params")
# This kind of Transfer might not even use agent credentials
if self.mirror_use_target_agent and hasattr(params, 'AgentID'):
params.AgentID = target_session.agent_id
if self.mirror_use_target_session and hasattr(params, 'SessionID'):
params.SessionID = target_session.id
transfer_block.serialize_var("Params", params)
def handle_http_request(self, session_manager: SessionManager, flow: HippoHTTPFlow):
# Already mirrored, ignore.
if flow.is_replay:
return
cap_data = flow.cap_data
if not cap_data:
return
if cap_data.cap_name in SEND_NORMALLY:
return
if cap_data.asset_server_cap:
return
# Likely doesn't have an exact equivalent in the target session; this is a temporary
# cap like an uploader URL or a stats URL.
if cap_data.type == CapType.TEMPORARY:
return
session: Optional[Session] = cap_data.session and cap_data.session()
if not session:
return
region: Optional[ProxiedRegion] = cap_data.region and cap_data.region()
if not region:
return
# Session-scoped, so we need to know if we have a session before checking
if not self.mirror_target_agent:
return
target_session: Optional[Session] = None
for poss_session in session.session_manager.sessions:
if poss_session.agent_id == self.mirror_target_agent:
target_session = poss_session
if not target_session:
return
caps_source = target_session
target_region: Optional[ProxiedRegion] = None
if region:
target_region = None
for poss_region in target_session.regions:
if poss_region.circuit_addr == region.circuit_addr:
target_region = poss_region
if not target_region:
print("No region in cap?")
return
caps_source = target_region
new_base_url = caps_source.cap_urls.get(cap_data.cap_name)
if not new_base_url:
print("No equiv cap?")
return
if cap_data.cap_name in MIRROR:
flow = flow.copy()
# Have the cap data reflect the new URL we're pointing at
flow.metadata["cap_data"] = CapData(
cap_name=cap_data.cap_name,
region=weakref.ref(target_region) if target_region else None,
session=weakref.ref(target_session),
base_url=new_base_url,
)
# Tack any params onto the new base URL for the cap
new_url = new_base_url + flow.request.url[len(cap_data.base_url):]
flow.request.url = new_url
if cap_data.cap_name in MIRROR:
self._replay_flow(flow, session.session_manager)
def _replay_flow(self, flow: HippoHTTPFlow, session_manager: SessionManager):
# Work around a mitmproxy bug: changing the URL updates the Host header, which may
# cause it to drop the port even when it shouldn't. Fix the Host header.
if flow.request.port not in (80, 443) and ":" not in flow.request.host_header:
flow.request.host_header = f"{flow.request.host}:{flow.request.port}"
# Should get repopulated when it goes back through the MITM addon
flow.metadata.pop("cap_data_ser", None)
flow.metadata.pop("cap_data", None)
proxy_queue = session_manager.flow_context.to_proxy_queue
proxy_queue.put_nowait(("replay", None, flow.get_state()))
addons = [MessageMirrorAddon()]
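The two sets above drive a three-way routing decision in handle_lludp_message() when mirroring is enabled. A minimal sketch restating the effective policy (for clarity only, not part of the addon):

def classify(msg_name: str) -> str:
    # Mirrors the branch order in MessageMirrorAddon.handle_lludp_message()
    if msg_name in SEND_NORMALLY:
        return "original circuit only (never mirrored)"
    if msg_name in MIRROR:
        return "original circuit, plus a fixed-up copy on the target circuit"
    return "swallowed, then re-sent only on the target circuit"

In use, you'd enable mirroring from the viewer with the mirror_to command (e.g. /524 mirror_to <agent uuid>, assuming the same /524 command channel the other example addons use).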

View File

@@ -0,0 +1,49 @@
"""
Example of proxy-provided caps
Useful for mocking out a cap that isn't actually implemented by the server
while developing the viewer-side pieces of it.
Implements a cap that accepts an `obj_id` UUID query parameter and returns
the name of the object.
"""
import asyncio
import asgiref.wsgi
from flask import Flask, Response, request
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.proxy import addon_ctx
from hippolyzer.lib.proxy.webapp_cap_addon import WebAppCapAddon
app = Flask("GetObjectNameCapApp")
@app.route('/')
async def get_object_name():
# We should always have the current region here, since the cap handler is bound to one.
# Just need to pull it from the `addon_ctx` module's global.
obj_mgr = addon_ctx.region.get().objects
obj_id = UUID(request.args['obj_id'])
obj = obj_mgr.lookup_fullid(obj_id)
if not obj:
return Response(f"Couldn't find {obj_id!r}", status=404, mimetype="text/plain")
try:
await asyncio.wait_for(obj_mgr.request_object_properties(obj)[0], 1.0)
except asyncio.TimeoutError:
return Response(f"Timed out requesting {obj_id!r}'s properties", status=500, mimetype="text/plain")
return Response(obj.Name, mimetype="text/plain")
class MockProxyCapExampleAddon(WebAppCapAddon):
# A cap URL with this name will be tied to each region when
# the sim is first connected to. The URL will be returned to the
# viewer in the Seed if the viewer requests it by name.
CAP_NAME = "GetObjectNameExample"
# Any asgi app should be fine.
APP = asgiref.wsgi.WsgiToAsgi(app)
addons = [MockProxyCapExampleAddon()]
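Once the viewer has fetched GetObjectNameExample from the Seed, the proxied cap can also be exercised by hand. A minimal sketch, assuming you've copied the cap URL out of the message log (the URL and UUID below are illustrative):

import asyncio

import aiohttp


async def fetch_object_name(cap_url: str, obj_id: str) -> str:
    # GET <cap_url>?obj_id=<uuid>, matching the query parameter get_object_name() reads
    async with aiohttp.ClientSession() as sess:
        async with sess.get(cap_url, params={"obj_id": obj_id}) as resp:
            return await resp.text()

# print(asyncio.run(fetch_object_name("<cap URL from the log>", "<object uuid>")))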

View File

@@ -27,7 +27,7 @@ from mitmproxy.http import HTTPFlow
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.jp2_utils import BufferedJp2k
from hippolyzer.lib.base.multiprocessing_utils import ParentProcessWatcher
from hippolyzer.lib.base.templates import TextureEntry
from hippolyzer.lib.base.templates import TextureEntryCollection
from hippolyzer.lib.proxy.addon_utils import AssetAliasTracker, BaseAddon, GlobalProperty, AddonProcess
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.base.message.message import Message
@@ -148,7 +148,7 @@ class MonochromeAddon(BaseAddon):
message["RegionInfo"][field_name] = tracker.get_alias_uuid(val)
@staticmethod
def _make_te_monochrome(tracker: AssetAliasTracker, parsed_te: TextureEntry):
def _make_te_monochrome(tracker: AssetAliasTracker, parsed_te: TextureEntryCollection):
# Need a deepcopy because TEs are owned by the ObjectManager
# and we don't want to change the canonical view.
parsed_te = copy.deepcopy(parsed_te)

View File

@@ -0,0 +1,111 @@
"""
Check object manager state against region ViewerObject cache
We can't just compare every object we've tracked against every object in the
VOCache and report mismatches, due to the VOCache's weird eviction criteria and
the fact that certain cacheable objects are never added to the VOCache at all.
Off the top of my head, animesh objects get explicit KillObjects at extreme
view distances the same as avatars do, but will still be present in the cache
even though they will not be in gObjectList.
"""
import asyncio
import logging
from typing import *
from hippolyzer.lib.base.objects import normalize_object_update_compressed_data
from hippolyzer.lib.base.templates import ObjectUpdateFlags, PCode
from hippolyzer.lib.proxy.addon_utils import BaseAddon, GlobalProperty
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import SessionManager, Session
from hippolyzer.lib.proxy.vocache import is_valid_vocache_dir, RegionViewerObjectCacheChain
LOG = logging.getLogger(__name__)
class ObjectManagementValidator(BaseAddon):
base_cache_path: Optional[str] = GlobalProperty(None)
orig_auto_request: Optional[bool] = GlobalProperty(None)
def handle_init(self, session_manager: SessionManager):
if self.orig_auto_request is None:
self.orig_auto_request = session_manager.settings.ALLOW_AUTO_REQUEST_OBJECTS
session_manager.settings.ALLOW_AUTO_REQUEST_OBJECTS = False
async def _choose_cache_path():
while not self.base_cache_path:
cache_dir = await AddonManager.UI.open_dir("Choose the base cache directory")
if not cache_dir:
return
if not is_valid_vocache_dir(cache_dir):
continue
self.base_cache_path = cache_dir
if not self.base_cache_path:
self._schedule_task(_choose_cache_path(), session_scoped=False)
def handle_unload(self, session_manager: SessionManager):
session_manager.settings.ALLOW_AUTO_REQUEST_OBJECTS = self.orig_auto_request
def handle_session_init(self, session: Session):
# Use only the specified cache path for the vocache
session.cache_dir = self.base_cache_path
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
if message.name != "DisableSimulator":
return
# Send it off to the client without handling it normally;
# we need to defer region teardown in the proxy
region.circuit.send(message)
self._schedule_task(self._check_cache_before_region_teardown(region))
return True
async def _check_cache_before_region_teardown(self, region: ProxiedRegion):
await asyncio.sleep(0.5)
print("Ok, checking cache differences")
try:
# Index will have been rewritten, so re-read it.
region_cache_chain = RegionViewerObjectCacheChain.for_region(
handle=region.handle,
cache_id=region.cache_id,
cache_dir=self.base_cache_path
)
if not region_cache_chain.region_caches:
print(f"no caches for {region!r}?")
return
all_full_ids = set()
for obj in region.objects.all_objects:
cacheable = True
orig_obj = obj
# Walk along the ancestry checking for things that would make the tree non-cacheable
while obj is not None:
if obj.UpdateFlags & ObjectUpdateFlags.TEMPORARY_ON_REZ:
cacheable = False
if obj.PCode == PCode.AVATAR:
cacheable = False
obj = obj.Parent
if cacheable:
all_full_ids.add(orig_obj.FullID)
for key in all_full_ids:
obj = region.objects.lookup_fullid(key)
cached_data = region_cache_chain.lookup_object_data(obj.LocalID, obj.CRC)
if not cached_data:
continue
orig_dict = obj.to_dict()
parsed_data = normalize_object_update_compressed_data(cached_data)
updated = obj.update_properties(parsed_data)
# Can't compare this yet
updated -= {"TextureEntry"}
if updated:
print(key)
for attr in updated:
print("\t", attr, orig_dict[attr], parsed_data[attr])
finally:
# Ok to teardown region in the proxy now
region.mark_dead()
addons = [ObjectManagementValidator()]
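The cache-chain lookup above can also be driven standalone to spot-check one object, given a region's handle and cache ID plus an object's LocalID and CRC from a live session. A sketch using only the calls shown above:

from hippolyzer.lib.base.objects import normalize_object_update_compressed_data
from hippolyzer.lib.proxy.vocache import RegionViewerObjectCacheChain


def lookup_cached_object(cache_dir: str, handle: int, cache_id, local_id: int, crc: int):
    chain = RegionViewerObjectCacheChain.for_region(
        handle=handle, cache_id=cache_id, cache_dir=cache_dir)
    if not chain.region_caches:
        return None  # No cache files on disk for this region
    cached_data = chain.lookup_object_data(local_id, crc)
    if not cached_data:
        return None  # Not cached, or the CRC didn't match
    # Same normalization step the validator applies before diffing fields
    return normalize_object_update_compressed_data(cached_data)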

View File

@@ -37,7 +37,7 @@ class PaydayAddon(BaseAddon):
chat_type=ChatType.SHOUT,
)
# Do the traditional money dance.
session.main_region.circuit.send_message(Message(
session.main_region.circuit.send(Message(
"AgentAnimation",
Block("AgentData", AgentID=session.agent_id, SessionID=session.id),
Block("AnimationList", AnimID=UUID("928cae18-e31d-76fd-9cc9-2f55160ff818"), StartAnim=True),

View File

@@ -9,13 +9,14 @@ import asyncio
import struct
from typing import *
from PySide2.QtGui import QImage
from PySide6.QtGui import QImage
from hippolyzer.lib.base.datatypes import UUID, Vector3, Quaternion
from hippolyzer.lib.base.helpers import to_chunks
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.templates import ObjectUpdateFlags, PCode, MCode, MultipleObjectUpdateFlags, TextureEntry
from hippolyzer.lib.client.object_manager import ObjectEvent, UpdateType
from hippolyzer.lib.base.templates import ObjectUpdateFlags, PCode, MCode, MultipleObjectUpdateFlags, \
TextureEntryCollection, JUST_CREATED_FLAGS
from hippolyzer.lib.client.object_manager import ObjectEvent, ObjectUpdateType
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.commands import handle_command
@@ -24,7 +25,6 @@ from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
JUST_CREATED_FLAGS = (ObjectUpdateFlags.CREATE_SELECTED | ObjectUpdateFlags.OBJECT_YOU_OWNER)
PRIM_SCALE = 0.2
@@ -42,7 +42,7 @@ class PixelArtistAddon(BaseAddon):
return
img = QImage()
with open(filename, "rb") as f:
img.loadFromData(f.read(), aformat=None)
img.loadFromData(f.read(), format=None)
img = img.convertToFormat(QImage.Format_RGBA8888)
height = img.height()
width = img.width()
@@ -72,7 +72,7 @@ class PixelArtistAddon(BaseAddon):
# Watch for any newly created prims, this is basically what the viewer does to find
# prims that it just created with the build tool.
with session.objects.events.subscribe_async(
(UpdateType.OBJECT_UPDATE,),
(ObjectUpdateType.OBJECT_UPDATE,),
predicate=lambda e: e.object.UpdateFlags & JUST_CREATED_FLAGS and "LocalID" in e.updated
) as get_events:
# Create a pool of prims to use for building the pixel art
@@ -80,7 +80,7 @@ class PixelArtistAddon(BaseAddon):
# TODO: We don't track the land group or user's active group, so
# "anyone can build" must be on for rezzing to work.
group_id = UUID()
region.circuit.send_message(Message(
region.circuit.send(Message(
'ObjectAdd',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id, GroupID=group_id),
Block(
@@ -124,12 +124,12 @@ class PixelArtistAddon(BaseAddon):
y = i // width
obj = created_prims[prim_idx]
# Set a blank texture on all faces
te = TextureEntry()
te = TextureEntryCollection()
te.Textures[None] = UUID('5748decc-f629-461c-9a36-a35a221fe21f')
# Set the prim color to the color from the pixel
te.Color[None] = pixel_color
# Set the prim texture and color
region.circuit.send_message(Message(
region.circuit.send(Message(
'ObjectImage',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
Block('ObjectData', ObjectLocalID=obj.LocalID, MediaURL=b'', TextureEntry_=te),
@@ -149,7 +149,7 @@ class PixelArtistAddon(BaseAddon):
# Move the "pixels" to their correct position in chunks
for chunk in to_chunks(positioning_blocks, 25):
region.circuit.send_message(Message(
region.circuit.send(Message(
'MultipleObjectUpdate',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
*chunk,

View File

@@ -116,7 +116,7 @@ class RecapitatorAddon(BaseAddon):
except:
logging.exception("Exception while recapitating")
# Tell the viewer about the status of its original upload
region.circuit.send_message(Message(
region.circuit.send(Message(
"AssetUploadComplete",
Block("AssetBlock", UUID=asset_id, Type=asset_block["Type"], Success=success),
direction=Direction.IN,

View File

@@ -0,0 +1,22 @@
import random
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
class SimulatePacketLossAddon(BaseAddon):
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
# Messing with these may kill your circuit
if message.name in {"PacketAck", "StartPingCheck", "CompletePingCheck", "UseCircuitCode",
"CompleteAgentMovement", "AgentMovementComplete"}:
return
# Simulate 30% packet loss
if random.random() > 0.7:
# Do nothing, drop this packet on the floor
return True
return
addons = [SimulatePacketLossAddon()]
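A related experiment is simulating latency rather than loss: swallow each message and re-send a take()n copy after a delay. A sketch, under the assumption that a delayed re-send via region.circuit.send() behaves like a normal proxied send:

import asyncio
import random

class SimulateLatencyAddon(BaseAddon):
    def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
        # Leave circuit-critical messages alone, same as above
        if message.name in {"PacketAck", "StartPingCheck", "CompletePingCheck", "UseCircuitCode",
                            "CompleteAgentMovement", "AgentMovementComplete"}:
            return

        async def _delayed_send(msg: Message):
            # Add 100-300ms of artificial delay
            await asyncio.sleep(random.uniform(0.1, 0.3))
            region.circuit.send(msg)

        # take() the message so the copy gets a fresh packet ID and clean ACKs
        self._schedule_task(_delayed_send(message.take()))
        return True  # Swallow the original; the delayed copy replaces it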

View File

@@ -0,0 +1,55 @@
"""
Tail animation generator
Demonstrates programmatic generation of local motions using BaseAnimHelperAddon
You can use this to create an animation with a script, fiddle with it until it
looks right, then finally save it with /524 save_local_anim <ANIM_NAME>.
The built animation is automatically applied to all active sessions when loaded,
and is re-generated whenever the script is edited. Unloading the script stops
the animations.
"""
from hippolyzer.lib.base.anim_utils import shift_keyframes, smooth_rot
from hippolyzer.lib.base.datatypes import Quaternion
from hippolyzer.lib.base.llanim import Animation, Joint
from hippolyzer.lib.proxy.addons import AddonManager
import local_anim
AddonManager.hot_reload(local_anim, require_addons_loaded=True)
class TailAnimator(local_anim.BaseAnimHelperAddon):
# Should be unique
ANIM_NAME = "tail_anim"
def build_anim(self) -> Animation:
anim = Animation(
base_priority=5,
duration=5.0,
loop_out_point=5.0,
loop=True,
)
# Iterate along tail joints 1 through 6
for joint_num in range(1, 7):
# Give joints further along the tail a wider range of motion
start_rot = Quaternion.from_euler(0.2, -0.3, 0.15 * joint_num)
end_rot = Quaternion.from_euler(-0.2, -0.3, -0.15 * joint_num)
rot_keyframes = [
# Tween between start_rot and end_rot, using smooth interpolation.
# SL's keyframes only allow linear interpolation which doesn't look great
# for natural motions. `smooth_rot()` gets around that by generating
# smooth inter frames for SL to linearly interpolate between.
*smooth_rot(start_rot, end_rot, inter_frames=10, time=0.0, duration=2.5),
*smooth_rot(end_rot, start_rot, inter_frames=10, time=2.5, duration=2.5),
]
anim.joints[f"mTail{joint_num}"] = Joint(
priority=5,
# Each joint's frames should be ahead of the previous joint's by 2 frames
rot_keyframes=shift_keyframes(rot_keyframes, joint_num * 2),
)
return anim
addons = [TailAnimator()]
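For a quick offline sanity check you can build the animation without a proxy session at all; this sketch only touches build_anim() and the Joint fields set above:

if __name__ == "__main__":
    anim = TailAnimator().build_anim()
    for joint_name, joint in anim.joints.items():
        times = [kf.time for kf in joint.rot_keyframes]
        # shift_keyframes() re-times the shifted frames, so every joint shares the
        # same timeline; it's the rotations that are phase-offset between joints.
        print(joint_name, f"{len(times)} rot keyframes over {times[-1]:.1f}s")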

View File

@@ -3,7 +3,7 @@ Example of how to request a Transfer
"""
from typing import *
from hippolyzer.lib.base.legacy_inv import InventoryModel, InventoryItem
from hippolyzer.lib.base.inventory import InventoryModel, InventoryItem
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.templates import (
AssetType,
@@ -35,7 +35,7 @@ class TransferExampleAddon(BaseAddon):
async def get_first_script(self, session: Session, region: ProxiedRegion):
"""Get the contents of the first script in the selected object"""
# Ask for the object inventory so we can find a script
region.circuit.send_message(Message(
region.circuit.send(Message(
'RequestTaskInventory',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
Block('InventoryData', LocalID=session.selected.object_local),
@@ -47,7 +47,7 @@ class TransferExampleAddon(BaseAddon):
file_name=inv_message["InventoryData"]["Filename"], file_path=XferFilePath.CACHE)
inv_model = InventoryModel.from_bytes(xfer.reassemble_chunks())
first_script: Optional[InventoryItem] = None
for item in inv_model.items.values():
for item in inv_model.all_items:
if item.type == "lsltext":
first_script = item
if not first_script:

View File

@@ -64,12 +64,12 @@ class TurboObjectInventoryAddon(BaseAddon):
# Any previous requests will have triggered a delete of the inventory file
# by marking it complete on the server-side. Re-send our RequestTaskInventory
to make sure there's a fresh copy.
region.circuit.send_message(request_msg.take())
region.circuit.send(request_msg.take())
inv_message = await region.message_handler.wait_for(('ReplyTaskInventory',), timeout=5.0)
# No task inventory, send the reply as-is
file_name = inv_message["InventoryData"]["Filename"]
if not file_name:
region.circuit.send_message(inv_message)
region.circuit.send(inv_message)
return
xfer = region.xfer_manager.request(
@@ -87,7 +87,7 @@ class TurboObjectInventoryAddon(BaseAddon):
continue
# Send the original ReplyTaskInventory to the viewer so it knows the file is ready
region.circuit.send_message(inv_message)
region.circuit.send(inv_message)
proxied_xfer = Xfer(data=xfer.reassemble_chunks())
# Wait for the viewer to request the inventory file

View File

@@ -2,21 +2,15 @@
Example of how to upload assets; assumes the assets are already encoded
in the appropriate format.
/524 upload <asset type>
/524 upload_asset <asset type>
"""
import pprint
from pathlib import Path
from typing import *
import aiohttp
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.templates import AssetType
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.addon_utils import ais_item_to_inventory_data, show_message, BaseAddon
from hippolyzer.lib.proxy.addon_utils import show_message, BaseAddon
from hippolyzer.lib.proxy.commands import handle_command, Parameter
from hippolyzer.lib.base.network.transport import Direction
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
@@ -29,7 +23,6 @@ class UploaderAddon(BaseAddon):
async def upload_asset(self, _session: Session, region: ProxiedRegion,
asset_type: AssetType, flags: Optional[int] = None):
"""Upload a raw asset with optional flags"""
inv_type = asset_type.inventory_type
file = await AddonManager.UI.open_file()
if not file:
return
@@ -42,67 +35,29 @@ class UploaderAddon(BaseAddon):
with open(file, "rb") as f:
file_body = f.read()
params = {
"asset_type": asset_type.human_name,
"description": "(No Description)",
"everyone_mask": 0,
"group_mask": 0,
"folder_id": UUID(), # Puts it in the default folder, I guess. Undocumented.
"inventory_type": inv_type.human_name,
"name": name,
"next_owner_mask": 581632,
}
if flags is not None:
params['flags'] = flags
try:
if asset_type == AssetType.MESH:
# Kicking off a mesh upload works a little differently internally
upload_token = await region.asset_uploader.initiate_mesh_upload(
name, file_body, flags=flags
)
else:
upload_token = await region.asset_uploader.initiate_asset_upload(
name, asset_type, file_body, flags=flags,
)
except Exception as e:
show_message(e)
raise
caps = region.caps_client
async with aiohttp.ClientSession() as sess:
async with caps.post('NewFileAgentInventory', llsd=params, session=sess) as resp:
parsed = await resp.read_llsd()
if "uploader" not in parsed:
show_message(f"Upload error!: {parsed!r}")
return
print("Got upload URL, uploading...")
if not await AddonManager.UI.confirm("Upload", f"Spend {upload_token.linden_cost}L on upload?"):
return
async with caps.post(parsed["uploader"], data=file_body, session=sess) as resp:
upload_parsed = await resp.read_llsd()
if "new_inventory_item" not in upload_parsed:
show_message(f"Got weird upload resp: {pprint.pformat(upload_parsed)}")
return
await self._force_inv_update(region, upload_parsed['new_inventory_item'])
@handle_command(item_id=UUID)
async def force_inv_update(self, _session: Session, region: ProxiedRegion, item_id: UUID):
"""Force an inventory update for a given item id"""
await self._force_inv_update(region, item_id)
async def _force_inv_update(self, region: ProxiedRegion, item_id: UUID):
session = region.session()
ais_req_data = {
"items": [
{
"owner_id": session.agent_id,
"item_id": item_id,
}
]
}
async with region.caps_client.post('FetchInventory2', llsd=ais_req_data) as resp:
ais_item = (await resp.read_llsd())["items"][0]
message = Message(
"UpdateCreateInventoryItem",
Block(
"AgentData",
AgentID=session.agent_id,
SimApproved=1,
TransactionID=UUID.random(),
),
ais_item_to_inventory_data(ais_item),
direction=Direction.IN
)
region.circuit.send_message(message)
# Do the actual upload
try:
await region.asset_uploader.complete_upload(upload_token)
except Exception as e:
show_message(e)
raise
addons = [UploaderAddon()]
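For programmatic use of the same two-phase API (quote, confirm, then pay), a minimal sketch reusing only calls shown above; flags handling is elided:

async def upload_raw(region: ProxiedRegion, name: str, asset_type: AssetType, body: bytes):
    # Phase 1: get an upload token; token.linden_cost carries the quoted price
    token = await region.asset_uploader.initiate_asset_upload(name, asset_type, body, flags=None)
    if not await AddonManager.UI.confirm("Upload", f"Spend {token.linden_cost}L on upload?"):
        return
    # Phase 2: the actual (charged) upload
    await region.asset_uploader.complete_upload(token)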

View File

@@ -2,7 +2,7 @@
Example of how to request an Xfer
"""
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.legacy_inv import InventoryModel
from hippolyzer.lib.base.inventory import InventoryModel
from hippolyzer.lib.base.templates import XferFilePath, AssetType, InventoryType, WearableType
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.proxy.addon_utils import BaseAddon, show_message
@@ -15,7 +15,7 @@ class XferExampleAddon(BaseAddon):
@handle_command()
async def get_mute_list(self, session: Session, region: ProxiedRegion):
"""Fetch the current user's mute list"""
region.circuit.send_message(Message(
region.circuit.send(Message(
'MuteListRequest',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
Block("MuteData", MuteCRC=0),
@@ -35,7 +35,7 @@ class XferExampleAddon(BaseAddon):
@handle_command()
async def get_task_inventory(self, session: Session, region: ProxiedRegion):
"""Get the inventory of the currently selected object"""
region.circuit.send_message(Message(
region.circuit.send(Message(
'RequestTaskInventory',
# If no session is passed in we'll use the active session when the coro was created
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
@@ -57,7 +57,7 @@ class XferExampleAddon(BaseAddon):
await xfer
inv_model = InventoryModel.from_bytes(xfer.reassemble_chunks())
item_names = [item.name for item in inv_model.items.values()]
item_names = [item.name for item in inv_model.all_items]
show_message(item_names)
@handle_command()
@@ -98,7 +98,7 @@ textures 1
data=asset_data,
transaction_id=transaction_id
)
region.circuit.send_message(Message(
region.circuit.send(Message(
'CreateInventoryItem',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
Block(

View File

@@ -2,7 +2,7 @@ import enum
import logging
import typing
from PySide2 import QtCore, QtGui
from PySide6 import QtCore, QtGui
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.message_logger import FilteringMessageLogger
@@ -19,9 +19,9 @@ class MessageLogHeader(enum.IntEnum):
class MessageLogModel(QtCore.QAbstractTableModel, FilteringMessageLogger):
def __init__(self, parent=None):
def __init__(self, parent=None, maxlen=2000):
QtCore.QAbstractTableModel.__init__(self, parent)
FilteringMessageLogger.__init__(self)
FilteringMessageLogger.__init__(self, maxlen=maxlen)
def _begin_insert(self, insert_idx: int):
self.beginInsertRows(QtCore.QModelIndex(), insert_idx, insert_idx)

View File

@@ -7,6 +7,7 @@ import sys
import time
from typing import Optional
import mitmproxy.ctx
import mitmproxy.exceptions
from hippolyzer.lib.base import llsd
@@ -43,7 +44,7 @@ class SelectionManagerAddon(BaseAddon):
LOG.debug(f"Don't know about selected {local_id}, requesting object")
needed_objects.add(local_id)
if needed_objects:
if needed_objects and session.session_manager.settings.ALLOW_AUTO_REQUEST_OBJECTS:
region.objects.request_objects(needed_objects)
# ParcelDwellRequests are sent whenever "about land" is opened. This gives us a
# decent mechanism for selecting parcels.
@@ -86,11 +87,13 @@ class REPLAddon(BaseAddon):
def run_http_proxy_process(proxy_host, http_proxy_port, flow_context: HTTPFlowContext):
mitm_loop = asyncio.new_event_loop()
asyncio.set_event_loop(mitm_loop)
mitmproxy_master = create_http_proxy(proxy_host, http_proxy_port, flow_context)
mitmproxy_master.start_server()
gc.freeze()
flow_context.mitmproxy_ready.set()
mitm_loop.run_forever()
async def mitmproxy_loop():
mitmproxy_master = create_http_proxy(proxy_host, http_proxy_port, flow_context)
gc.freeze()
await mitmproxy_master.run()
asyncio.run(mitmproxy_loop())
def start_proxy(session_manager: SessionManager, extra_addons: Optional[list] = None,
@@ -105,7 +108,7 @@ def start_proxy(session_manager: SessionManager, extra_addons: Optional[list] =
root_log.setLevel(logging.INFO)
logging.basicConfig()
loop = asyncio.get_event_loop()
loop = asyncio.get_event_loop_policy().get_event_loop()
udp_proxy_port = session_manager.settings.SOCKS_PROXY_PORT
http_proxy_port = session_manager.settings.HTTP_PROXY_PORT
@@ -120,7 +123,7 @@ def start_proxy(session_manager: SessionManager, extra_addons: Optional[list] =
if sys.argv[1] == "--setup-ca":
try:
mitmproxy_master = create_http_proxy(proxy_host, http_proxy_port, flow_context)
except mitmproxy.exceptions.ServerException:
except mitmproxy.exceptions.MitmproxyException:
# Proxy already running, create the master so we don't try to bind to a port
mitmproxy_master = create_proxy_master(proxy_host, http_proxy_port, flow_context)
setup_ca(sys.argv[2], mitmproxy_master)
@@ -132,6 +135,9 @@ def start_proxy(session_manager: SessionManager, extra_addons: Optional[list] =
daemon=True,
)
http_proc.start()
# These need to be set for mitmproxy's ASGIApp serving code to work.
mitmproxy.ctx.master = None
mitmproxy.ctx.log = logging.getLogger("mitmproxy log")
server = SLSOCKS5Server(session_manager)
coro = asyncio.start_server(server.handle_connection, proxy_host, udp_proxy_port)

View File

@@ -17,8 +17,8 @@ import urllib.parse
from typing import *
import multidict
from qasync import QEventLoop
from PySide2 import QtCore, QtWidgets, QtGui
from qasync import QEventLoop, asyncSlot
from PySide6 import QtCore, QtWidgets, QtGui
from hippolyzer.apps.model import MessageLogModel, MessageLogHeader, RegionListModel
from hippolyzer.apps.proxy import start_proxy
@@ -35,6 +35,7 @@ from hippolyzer.lib.base.message.message_formatting import (
)
from hippolyzer.lib.base.message.msgtypes import MsgType
from hippolyzer.lib.base.message.template_dict import DEFAULT_TEMPLATE_DICT
from hippolyzer.lib.base.settings import SettingDescriptor
from hippolyzer.lib.base.ui_helpers import loadUi
import hippolyzer.lib.base.serialization as se
from hippolyzer.lib.base.network.transport import Direction, SocketUDPTransport
@@ -42,7 +43,8 @@ from hippolyzer.lib.proxy.addons import BaseInteractionManager, AddonManager
from hippolyzer.lib.proxy.ca_utils import setup_ca_everywhere
from hippolyzer.lib.proxy.caps_client import ProxyCapsClient
from hippolyzer.lib.proxy.http_proxy import create_proxy_master, HTTPFlowContext
from hippolyzer.lib.proxy.message_logger import LLUDPMessageLogEntry, AbstractMessageLogEntry
from hippolyzer.lib.proxy.message_logger import LLUDPMessageLogEntry, AbstractMessageLogEntry, WrappingMessageLogger, \
import_log_entries, export_log_entries
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session, SessionManager
from hippolyzer.lib.proxy.settings import ProxySettings
@@ -60,7 +62,7 @@ def show_error_message(error_msg, parent=None):
error_dialog = QtWidgets.QErrorMessage(parent=parent)
# No obvious way to set this to plaintext, yuck...
error_dialog.showMessage(html.escape(error_msg))
error_dialog.exec_()
error_dialog.exec()
error_dialog.raise_()
@@ -68,11 +70,11 @@ class GUISessionManager(SessionManager, QtCore.QObject):
regionAdded = QtCore.Signal(ProxiedRegion)
regionRemoved = QtCore.Signal(ProxiedRegion)
def __init__(self, settings, model):
def __init__(self, settings):
SessionManager.__init__(self, settings)
QtCore.QObject.__init__(self)
self.all_regions = []
self.message_logger = model
self.message_logger = WrappingMessageLogger()
def checkRegions(self):
new_regions = itertools.chain(*[s.regions for s in self.sessions])
@@ -87,13 +89,13 @@ class GUISessionManager(SessionManager, QtCore.QObject):
self.all_regions = new_regions
class GUIInteractionManager(BaseInteractionManager, QtCore.QObject):
def __init__(self, parent):
class GUIInteractionManager(BaseInteractionManager):
def __init__(self, parent: QtWidgets.QWidget):
BaseInteractionManager.__init__(self)
QtCore.QObject.__init__(self, parent=parent)
self._parent = parent
def main_window_handle(self) -> Any:
return self.parent()
return self._parent
def _dialog_async_exec(self, dialog: QtWidgets.QDialog):
future = asyncio.Future()
@@ -101,12 +103,16 @@ class GUIInteractionManager(BaseInteractionManager, QtCore.QObject):
dialog.open()
return future
async def _file_dialog(self, caption: str, directory: str, filter_str: str, mode: QtWidgets.QFileDialog.FileMode) \
-> Tuple[bool, QtWidgets.QFileDialog]:
dialog = QtWidgets.QFileDialog(self.parent(), caption=caption, directory=directory, filter=filter_str)
async def _file_dialog(
self, caption: str, directory: str, filter_str: str, mode: QtWidgets.QFileDialog.FileMode,
default_suffix: str = '',
) -> Tuple[bool, QtWidgets.QFileDialog]:
dialog = QtWidgets.QFileDialog(self._parent, caption=caption, directory=directory, filter=filter_str)
dialog.setFileMode(mode)
if mode == QtWidgets.QFileDialog.FileMode.AnyFile:
dialog.setAcceptMode(QtWidgets.QFileDialog.AcceptMode.AcceptSave)
if default_suffix:
dialog.setDefaultSuffix(default_suffix)
res = await self._dialog_async_exec(dialog)
return res, dialog
@@ -134,9 +140,10 @@ class GUIInteractionManager(BaseInteractionManager, QtCore.QObject):
return None
return dialog.selectedFiles()[0]
async def save_file(self, caption: str = '', directory: str = '', filter_str: str = '') -> Optional[str]:
async def save_file(self, caption: str = '', directory: str = '', filter_str: str = '',
default_suffix: str = '') -> Optional[str]:
res, dialog = await self._file_dialog(
caption, directory, filter_str, QtWidgets.QFileDialog.FileMode.AnyFile
caption, directory, filter_str, QtWidgets.QFileDialog.FileMode.AnyFile, default_suffix,
)
if not res or not dialog.selectedFiles():
return None
@@ -148,7 +155,7 @@ class GUIInteractionManager(BaseInteractionManager, QtCore.QObject):
title,
caption,
QtWidgets.QMessageBox.Ok | QtWidgets.QMessageBox.Cancel,
self.parent(),
self._parent,
)
fut = asyncio.Future()
msg.finished.connect(lambda r: fut.set_result(r))
@@ -156,6 +163,24 @@ class GUIInteractionManager(BaseInteractionManager, QtCore.QObject):
return (await fut) == QtWidgets.QMessageBox.Ok
class GUIProxySettings(ProxySettings):
FIRST_RUN: bool = SettingDescriptor(True)
"""Persistent settings backed by QSettings"""
def __init__(self, settings: QtCore.QSettings):
super().__init__()
self._settings_obj = settings
def get_setting(self, name: str) -> Any:
val: Any = self._settings_obj.value(name, defaultValue=dataclasses.MISSING)
if val is dataclasses.MISSING:
return val
return json.loads(val)
def set_setting(self, name: str, val: Any):
self._settings_obj.setValue(name, json.dumps(val))
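A sketch of the round-trip this gives you (values are stored as JSON strings under the setting's name; dataclasses.MISSING is the "nothing stored" sentinel):

# Hypothetical usage against the same QSettings store used in gui_main() below:
settings = GUIProxySettings(QtCore.QSettings("SaladDais", "hippolyzer"))
settings.set_setting("FIRST_RUN", False)           # persisted as the JSON text "false"
assert settings.get_setting("FIRST_RUN") is False  # parsed back to a Python bool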
def nonFatalExceptions(f):
@functools.wraps(f)
def _wrapper(self, *args, **kwargs):
@@ -169,7 +194,35 @@ def nonFatalExceptions(f):
return _wrapper
class ProxyGUI(QtWidgets.QMainWindow):
def buildReplacements(session: Session, region: ProxiedRegion):
if not session or not region:
return {}
selected = session.selected
agent_object = region.objects.lookup_fullid(session.agent_id)
selected_local = selected.object_local
selected_object = None
if selected_local:
# We may or may not have an object for this
selected_object = region.objects.lookup_localid(selected_local)
return {
"SELECTED_LOCAL": selected_local,
"SELECTED_FULL": selected_object.FullID if selected_object else None,
"SELECTED_PARCEL_LOCAL": selected.parcel_local,
"SELECTED_PARCEL_FULL": selected.parcel_full,
"SELECTED_SCRIPT_ITEM": selected.script_item,
"SELECTED_TASK_ITEM": selected.task_item,
"AGENT_ID": session.agent_id,
"AGENT_LOCAL": agent_object.LocalID if agent_object else None,
"SESSION_ID": session.id,
"AGENT_POS": agent_object.Position if agent_object else None,
"NULL_KEY": UUID(),
"RANDOM_KEY": UUID.random,
"CIRCUIT_CODE": session.circuit_code,
"REGION_HANDLE": region.handle,
}
class MessageLogWindow(QtWidgets.QMainWindow):
DEFAULT_IGNORE = "StartPingCheck CompletePingCheck PacketAck SimulatorViewerTimeMessage SimStats " \
"AgentUpdate AgentAnimation AvatarAnimation ViewerEffect CoarseLocationUpdate LayerData " \
"CameraConstraint ObjectUpdateCached RequestMultipleObjects ObjectUpdate ObjectUpdateCompressed " \
@@ -183,26 +236,39 @@ class ProxyGUI(QtWidgets.QMainWindow):
textRequest: QtWidgets.QTextEdit
def __init__(self):
super().__init__()
def __init__(
self, settings: GUIProxySettings, session_manager: GUISessionManager,
log_live_messages: bool, parent: Optional[QtWidgets.QWidget] = None,
):
super().__init__(parent=parent)
loadUi(MAIN_WINDOW_UI_PATH, self)
if parent:
self.setWindowTitle("Message Log")
self.menuBar.setEnabled(False) # type: ignore
self.menuBar.hide() # type: ignore
self._selectedEntry: Optional[AbstractMessageLogEntry] = None
self.settings = GUIProxySettings(QtCore.QSettings("SaladDais", "hippolyzer"))
self.model = MessageLogModel(parent=self.tableView)
self.settings = settings
self.sessionManager = session_manager
if log_live_messages:
self.model = MessageLogModel(parent=self.tableView)
session_manager.message_logger.loggers.append(self.model)
else:
self.model = MessageLogModel(parent=self.tableView, maxlen=None)
self.tableView.setModel(self.model)
self.model.rowsAboutToBeInserted.connect(self.beforeInsert)
self.model.rowsInserted.connect(self.afterInsert)
self.tableView.selectionModel().selectionChanged.connect(self._messageSelected)
self.checkBeautify.clicked.connect(self._showSelectedMessage)
self.checkPause.clicked.connect(self._setPaused)
self._setFilter(self.DEFAULT_FILTER)
self.setFilter(self.DEFAULT_FILTER)
self.btnClearLog.clicked.connect(self.model.clear)
self.lineEditFilter.editingFinished.connect(self._setFilter)
self.lineEditFilter.editingFinished.connect(self.setFilter)
self.btnMessageBuilder.clicked.connect(self._sendToMessageBuilder)
self.btnCopyRepr.clicked.connect(self._copyRepr)
self.actionInstallHTTPSCerts.triggered.connect(self._installHTTPSCerts)
self.actionInstallHTTPSCerts.triggered.connect(self.installHTTPSCerts)
self.actionManageAddons.triggered.connect(self._manageAddons)
self.actionManageFilters.triggered.connect(self._manageFilters)
self.actionOpenMessageBuilder.triggered.connect(self._openMessageBuilder)
@@ -213,15 +279,14 @@ class ProxyGUI(QtWidgets.QMainWindow):
self.actionProxyRemotelyAccessible.triggered.connect(self._setProxyRemotelyAccessible)
self.actionUseViewerObjectCache.triggered.connect(self._setUseViewerObjectCache)
self.actionRequestMissingObjects.triggered.connect(self._setRequestMissingObjects)
self.actionOpenNewMessageLogWindow.triggered.connect(self._openNewMessageLogWindow)
self.actionImportLogEntries.triggered.connect(self._importLogEntries)
self.actionExportLogEntries.triggered.connect(self._exportLogEntries)
self._filterMenu = QtWidgets.QMenu()
self._populateFilterMenu()
self.toolButtonFilter.setMenu(self._filterMenu)
self.sessionManager = GUISessionManager(self.settings, self.model)
self.interactionManager = GUIInteractionManager(self)
AddonManager.UI = self.interactionManager
self._shouldScrollOnInsert = True
self.tableView.horizontalHeader().resizeSection(MessageLogHeader.Host, 80)
self.tableView.horizontalHeader().resizeSection(MessageLogHeader.Method, 60)
@@ -230,10 +295,16 @@ class ProxyGUI(QtWidgets.QMainWindow):
self.textResponse.hide()
def closeEvent(self, event) -> None:
loggers = self.sessionManager.message_logger.loggers
if self.model in loggers:
loggers.remove(self.model)
super().closeEvent(event)
def _populateFilterMenu(self):
def _addFilterAction(text, filter_str):
filter_action = QtWidgets.QAction(text, self)
filter_action.triggered.connect(lambda: self._setFilter(filter_str))
filter_action = QtGui.QAction(text, self)
filter_action.triggered.connect(lambda: self.setFilter(filter_str))
self._filterMenu.addAction(filter_action)
self._filterMenu.clear()
@@ -243,16 +314,19 @@ class ProxyGUI(QtWidgets.QMainWindow):
for preset_name, preset_filter in filters.items():
_addFilterAction(preset_name, preset_filter)
def getFilterDict(self):
return self.settings.FILTERS
def setFilterDict(self, val: dict):
self.settings.FILTERS = val
self._populateFilterMenu()
def _manageFilters(self):
dialog = FilterDialog(self)
dialog.exec_()
dialog.exec()
@nonFatalExceptions
def _setFilter(self, filter_str=None):
def setFilter(self, filter_str=None):
if filter_str is None:
filter_str = self.lineEditFilter.text()
else:
@@ -284,23 +358,22 @@ class ProxyGUI(QtWidgets.QMainWindow):
return
req = entry.request(
beautify=self.checkBeautify.isChecked(),
replacements=self.buildReplacements(entry.session, entry.region),
replacements=buildReplacements(entry.session, entry.region),
)
highlight_range = None
if isinstance(req, SpannedString):
match_result = self.model.filter.match(entry)
# Match result was a tuple indicating what matched
if isinstance(match_result, tuple):
highlight_range = req.spans.get(match_result)
self.textRequest.setPlainText(req)
if highlight_range:
cursor = self.textRequest.textCursor()
cursor.setPosition(highlight_range[0], QtGui.QTextCursor.MoveAnchor)
cursor.setPosition(highlight_range[1], QtGui.QTextCursor.KeepAnchor)
highlight_format = QtGui.QTextBlockFormat()
highlight_format.setBackground(QtCore.Qt.yellow)
cursor.setBlockFormat(highlight_format)
# The string has a map of fields and their associated positions within the string;
# use that to highlight any individual fields the filter matched on.
if isinstance(req, SpannedString):
for field in self.model.filter.match(entry, short_circuit=False).fields:
field_span = req.spans.get(field)
if not field_span:
continue
cursor = self.textRequest.textCursor()
cursor.setPosition(field_span[0], QtGui.QTextCursor.MoveAnchor)
cursor.setPosition(field_span[1], QtGui.QTextCursor.KeepAnchor)
highlight_format = QtGui.QTextBlockFormat()
highlight_format.setBackground(QtCore.Qt.yellow)
cursor.setBlockFormat(highlight_format)
resp = entry.response(beautify=self.checkBeautify.isChecked())
if resp:
@@ -324,7 +397,7 @@ class ProxyGUI(QtWidgets.QMainWindow):
win.show()
msg = self._selectedEntry
beautify = self.checkBeautify.isChecked()
replacements = self.buildReplacements(msg.session, msg.region)
replacements = buildReplacements(msg.session, msg.region)
win.setMessageText(msg.request(beautify=beautify, replacements=replacements))
@nonFatalExceptions
@@ -340,37 +413,43 @@ class ProxyGUI(QtWidgets.QMainWindow):
win = MessageBuilderWindow(self, self.sessionManager)
win.show()
def buildReplacements(self, session: Session, region: ProxiedRegion):
if not session or not region:
return {}
selected = session.selected
agent_object = region.objects.lookup_fullid(session.agent_id)
selected_local = selected.object_local
selected_object = None
if selected_local:
# We may or may not have an object for this
selected_object = region.objects.lookup_localid(selected_local)
return {
"SELECTED_LOCAL": selected_local,
"SELECTED_FULL": selected_object.FullID if selected_object else None,
"SELECTED_PARCEL_LOCAL": selected.parcel_local,
"SELECTED_PARCEL_FULL": selected.parcel_full,
"SELECTED_SCRIPT_ITEM": selected.script_item,
"SELECTED_TASK_ITEM": selected.task_item,
"AGENT_ID": session.agent_id,
"AGENT_LOCAL": agent_object.LocalID if agent_object else None,
"SESSION_ID": session.id,
"AGENT_POS": agent_object.Position if agent_object else None,
"NULL_KEY": UUID(),
"RANDOM_KEY": UUID.random,
"CIRCUIT_CODE": session.circuit_code,
"REGION_HANDLE": region.handle,
}
def _openNewMessageLogWindow(self):
win: QtWidgets.QMainWindow = MessageLogWindow(
self.settings, self.sessionManager, log_live_messages=True, parent=self)
win.setFilter(self.lineEditFilter.text())
win.show()
win.activateWindow()
def _installHTTPSCerts(self):
@asyncSlot()
async def _importLogEntries(self):
log_file = await AddonManager.UI.open_file(
caption="Import Log Entries", filter_str="Hippolyzer Logs (*.hippolog)"
)
if not log_file:
return
win = MessageLogWindow(self.settings, self.sessionManager, log_live_messages=False, parent=self)
win.setFilter(self.lineEditFilter.text())
with open(log_file, "rb") as f:
entries = import_log_entries(f.read())
for entry in entries:
win.model.add_log_entry(entry)
win.show()
win.activateWindow()
@asyncSlot()
async def _exportLogEntries(self):
log_file = await AddonManager.UI.save_file(
caption="Export Log Entries", filter_str="Hippolyzer Logs (*.hippolog)", default_suffix="hippolog",
)
if not log_file:
return
with open(log_file, "wb") as f:
f.write(export_log_entries(self.model))
def installHTTPSCerts(self):
msg = QtWidgets.QMessageBox()
msg.setText("This will install the proxy's HTTPS certificate in the config dir"
" of any installed viewers, continue?")
msg.setText("Would you like to install the proxy's HTTPS certificate in the config dir"
" of any installed viewers so that HTTPS connections will work?")
yes_btn = msg.addButton("Yes", QtWidgets.QMessageBox.NoRole)
msg.addButton("No", QtWidgets.QMessageBox.NoRole)
msg.exec()
@@ -402,7 +481,7 @@ class ProxyGUI(QtWidgets.QMainWindow):
def _manageAddons(self):
dialog = AddonDialog(self)
dialog.exec_()
dialog.exec()
def getAddonList(self) -> List[str]:
return self.sessionManager.settings.ADDON_SCRIPTS
@@ -491,7 +570,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
else:
self.comboUntrusted.addItem(message_name)
cap_names = sorted(set(itertools.chain(*[r.caps.keys() for r in self.regionModel.regions])))
cap_names = sorted(set(itertools.chain(*[r.cap_urls.keys() for r in self.regionModel.regions])))
for cap_name in cap_names:
if cap_name.endswith("ProxyWrapper"):
continue
@@ -522,7 +601,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
break
self.textRequest.setPlainText(
f"""{method} [[{cap_name}]]{path}{params} HTTP/1.1
# {region.caps.get(cap_name, "<unknown URI>")}
# {region.cap_urls.get(cap_name, "<unknown URI>")}
{headers}
{body}"""
)
@@ -575,24 +654,9 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
if var.name in ("TaskID", "ObjectID"):
return VerbatimHumanVal("[[SELECTED_FULL]]")
if var.type.is_int:
return 0
elif var.type.is_float:
return 0.0
elif var.type == MsgType.MVT_LLUUID:
return UUID()
elif var.type == MsgType.MVT_BOOL:
return False
elif var.type == MsgType.MVT_VARIABLE:
return ""
elif var.type in (MsgType.MVT_LLVector3, MsgType.MVT_LLVector3d, MsgType.MVT_LLQuaternion):
return VerbatimHumanVal("(0.0, 0.0, 0.0)")
elif var.type == MsgType.MVT_LLVector4:
return VerbatimHumanVal("(0.0, 0.0, 0.0, 0.0)")
elif var.type == MsgType.MVT_FIXED:
return b"\x00" * var.size
elif var.type == MsgType.MVT_IP_ADDR:
return "0.0.0.0"
default_val = var.default_value
if default_val is not None:
return default_val
return VerbatimHumanVal("")
@nonFatalExceptions
@@ -600,7 +664,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
session, region = self._getTarget()
msg_text = self.textRequest.toPlainText()
replacements = self.parent().buildReplacements(session, region)
replacements = buildReplacements(session, region)
if re.match(r"\A\s*(in|out)\s+", msg_text, re.I):
sender_func = self._sendLLUDPMessage
@@ -632,13 +696,11 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
msg = HumanMessageSerializer.from_human_string(msg_text, replacements, env, safe=False)
if self.checkLLUDPViaCaps.isChecked():
if msg.direction == Direction.IN:
region.eq_manager.queue_event(
self.llsdSerializer.serialize(msg, as_dict=True)
)
region.eq_manager.inject_message(msg)
else:
self._sendHTTPRequest(
"POST",
region.caps["UntrustedSimulatorMessage"],
region.cap_urls["UntrustedSimulatorMessage"],
{"Content-Type": "application/llsd+xml", "Accept": "application/llsd+xml"},
self.llsdSerializer.serialize(msg),
)
@@ -647,18 +709,25 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
off_circuit = self.checkOffCircuit.isChecked()
if off_circuit:
transport = SocketUDPTransport(socket.socket(socket.AF_INET, socket.SOCK_DGRAM))
region.circuit.send_message(msg, transport=transport)
region.circuit.send(msg, transport=transport)
if off_circuit:
transport.close()
def _sendEQMessage(self, session, region: Optional[ProxiedRegion], msg_text: str, _replacements: dict):
def _sendEQMessage(self, session, region: Optional[ProxiedRegion], msg_text: str, replacements: dict):
if not session or not region:
raise RuntimeError("Need a valid session and region to send EQ event")
message_line, _, body = (x.strip() for x in msg_text.partition("\n"))
message_name = message_line.rsplit(" ", 1)[-1]
region.eq_manager.queue_event({
env = self._buildEnv(session, region)
def directive_handler(m):
return self._handleHTTPDirective(env, replacements, False, m)
body = re.sub(rb"<!HIPPO(\w+)\[\[(.*?)]]>", directive_handler, body.encode("utf8"), flags=re.S)
region.eq_manager.inject_event({
"message": message_name,
"body": llsd.parse_xml(body.encode("utf8")),
"body": llsd.parse_xml(body),
})
def _sendHTTPMessage(self, session, region, msg_text: str, replacements: dict):
@@ -682,7 +751,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
cap_name = match.group(1)
cap_url = session.global_caps.get(cap_name)
if not cap_url:
cap_url = region.caps.get(cap_name)
cap_url = region.cap_urls.get(cap_name)
if not cap_url:
raise ValueError("Don't have a Cap for %s" % cap_name)
uri = cap_url + match.group(2)
@@ -722,7 +791,10 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
val = subfield_eval(contents.decode("utf8").strip(), globals_={**env, **replacements})
val = _coerce_to_bytes(val)
elif directive == b"REPL":
val = _coerce_to_bytes(replacements[contents.decode("utf8").strip()])
repl = replacements[contents.decode("utf8").strip()]
if callable(repl):
repl = repl()
val = _coerce_to_bytes(repl)
else:
raise ValueError(f"Unknown directive {directive}")
@@ -749,7 +821,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
class AddonDialog(QtWidgets.QDialog):
listAddons: QtWidgets.QListWidget
def __init__(self, parent: ProxyGUI):
def __init__(self, parent: MessageLogWindow):
super().__init__()
loadUi(ADDON_DIALOG_UI_PATH, self)
@@ -800,7 +872,7 @@ class AddonDialog(QtWidgets.QDialog):
class FilterDialog(QtWidgets.QDialog):
listFilters: QtWidgets.QListWidget
def __init__(self, parent: ProxyGUI):
def __init__(self, parent: MessageLogWindow):
super().__init__()
loadUi(FILTER_DIALOG_UI_PATH, self)
@@ -838,29 +910,16 @@ class FilterDialog(QtWidgets.QDialog):
self.listFilters.takeItem(idx)
class GUIProxySettings(ProxySettings):
"""Persistent settings backed by QSettings"""
def __init__(self, settings: QtCore.QSettings):
super().__init__()
self._settings_obj = settings
def get_setting(self, name: str) -> Any:
val: Any = self._settings_obj.value(name, defaultValue=dataclasses.MISSING)
if val is dataclasses.MISSING:
return val
return json.loads(val)
def set_setting(self, name: str, val: Any):
self._settings_obj.setValue(name, json.dumps(val))
def gui_main():
multiprocessing.set_start_method('spawn')
QtCore.QCoreApplication.setAttribute(QtCore.Qt.AA_ShareOpenGLContexts)
app = QtWidgets.QApplication(sys.argv)
loop = QEventLoop(app)
asyncio.set_event_loop(loop)
window = ProxyGUI()
settings = GUIProxySettings(QtCore.QSettings("SaladDais", "hippolyzer"))
session_manager = GUISessionManager(settings)
window = MessageLogWindow(settings, session_manager, log_live_messages=True)
AddonManager.UI = GUIInteractionManager(window)
timer = QtCore.QTimer(app)
timer.timeout.connect(window.sessionManager.checkRegions)
timer.start(100)
@@ -869,6 +928,10 @@ def gui_main():
http_host = None
if window.sessionManager.settings.REMOTELY_ACCESSIBLE:
http_host = "0.0.0.0"
if settings.FIRST_RUN:
settings.FIRST_RUN = False
# Automatically offer to install the HTTPS certs on first run.
window.installHTTPSCerts()
start_proxy(
session_manager=window.sessionManager,
extra_addon_paths=window.getAddonList(),

View File

@@ -256,6 +256,10 @@
<bool>true</bool>
</property>
<addaction name="actionOpenMessageBuilder"/>
<addaction name="actionOpenNewMessageLogWindow"/>
<addaction name="separator"/>
<addaction name="actionImportLogEntries"/>
<addaction name="actionExportLogEntries"/>
<addaction name="separator"/>
<addaction name="actionInstallHTTPSCerts"/>
<addaction name="actionManageAddons"/>
@@ -323,6 +327,21 @@
<string>Force the proxy to request objects that it doesn't know about due to cache misses</string>
</property>
</action>
<action name="actionOpenNewMessageLogWindow">
<property name="text">
<string>Open New Message Log Window</string>
</property>
</action>
<action name="actionImportLogEntries">
<property name="text">
<string>Import Log Entries</string>
</property>
</action>
<action name="actionExportLogEntries">
<property name="text">
<string>Export Log Entries</string>
</property>
</action>
</widget>
<resources/>
<connections/>

View File

@@ -0,0 +1,91 @@
"""
Assorted utilities to make creating animations from scratch easier
"""
import copy
from typing import List, Union
from hippolyzer.lib.base.datatypes import Vector3, Quaternion
from hippolyzer.lib.base.llanim import PosKeyframe, RotKeyframe
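# smooth_step() below is the classic cubic Hermite ease-in/ease-out, 3t^2 - 2t^3, clamped to [0, 1].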
def smooth_step(t: float):
t = max(0.0, min(1.0, t))
return t * t * (3 - 2 * t)
def rot_interp(r0: Quaternion, r1: Quaternion, t: float):
"""
Bad quaternion interpolation
TODO: This is definitely not correct, yet it seems to work OK? Implement slerp.
"""
# Ignore W
r0 = r0.data(3)
r1 = r1.data(3)
return Quaternion(*map(lambda pair: ((pair[0] * (1.0 - t)) + (pair[1] * t)), zip(r0, r1)))
def unique_frames(frames: List[Union[PosKeyframe, RotKeyframe]]):
"""Drop frames where time and coordinate are exact duplicates of another frame"""
new_frames = []
for frame in frames:
# TODO: fudge factor for float comparison instead
if frame not in new_frames:
new_frames.append(frame)
return new_frames
def shift_keyframes(frames: List[Union[PosKeyframe, RotKeyframe]], num: int):
"""
Shift keyframes around by `num` frames
Assumes keyframes occur at a set cadence, and that the first and last keyframes are at the same coordinate.
"""
# Get rid of duplicate frames
frames = unique_frames(frames)
pop_idx = -1
insert_idx = 0
if num < 0:
insert_idx = len(frames) - 1
pop_idx = 0
num = -num
old_times = [f.time for f in frames]
new_frames = frames.copy()
# Drop last, duped frame. We'll copy the first frame to replace it later
new_frames.pop(-1)
for _ in range(num):
new_frames.insert(insert_idx, new_frames.pop(pop_idx))
# Put first frame back on the end
new_frames.append(copy.copy(new_frames[0]))
assert len(old_times) == len(new_frames)
assert new_frames[0] == new_frames[-1]
# Make the times of the shifted keyframes match up with the previous timeline
for old_time, new_frame in zip(old_times, new_frames):
new_frame.time = old_time
return new_frames
def smooth_pos(start: Vector3, end: Vector3, inter_frames: int, time: float, duration: float) -> List[PosKeyframe]:
"""Generate keyframes to smoothly interpolate between two positions"""
frames = [PosKeyframe(time=time, pos=start)]
for i in range(0, inter_frames):
t = (i + 1) / (inter_frames + 1)
smooth_t = smooth_step(t)
pos = Vector3(smooth_t, smooth_t, smooth_t).interpolate(start, end)
frames.append(PosKeyframe(time=time + (t * duration), pos=pos))
return frames + [PosKeyframe(time=time + duration, pos=end)]
def smooth_rot(start: Quaternion, end: Quaternion, inter_frames: int, time: float, duration: float)\
-> List[RotKeyframe]:
"""Generate keyframes to smoothly interpolate between two rotations"""
frames = [RotKeyframe(time=time, rot=start)]
for i in range(0, inter_frames):
t = (i + 1) / (inter_frames + 1)
smooth_t = smooth_step(t)
frames.append(RotKeyframe(time=time + (t * duration), rot=rot_interp(start, end, smooth_t)))
return frames + [RotKeyframe(time=time + duration, rot=end)]
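A small worked example of how these helpers compose (mirroring the tail addon's usage; the values are illustrative):

if __name__ == "__main__":
    start = Quaternion.from_euler(0.0, 0.0, 0.3)
    end = Quaternion.from_euler(0.0, 0.0, -0.3)
    # A 2-second out-and-back loop with 4 generated in-between frames per leg
    frames = [
        *smooth_rot(start, end, inter_frames=4, time=0.0, duration=1.0),
        *smooth_rot(end, start, inter_frames=4, time=1.0, duration=1.0),
    ]
    # Rotate the cycle forward by two frames while keeping the original timeline,
    # the same trick the tail example uses to stagger sibling joints.
    shifted = shift_keyframes(frames, 2)
    print([round(f.time, 2) for f in shifted])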

View File

@@ -0,0 +1,319 @@
# This currently implements basic LLMesh -> Collada.
#
# TODO:
# * inverse, Collada -> LLMesh (for simple cases, maybe using impasse rather than pycollada)
# * round-tripping tests, LLMesh->Collada->LLMesh
# * * Can't really test using Collada->LLMesh->Collada because Collada->LLMesh is almost always
# going to be lossy due to how SL represents vertex data and materials compared to what
# Collada allows.
# * Eventually scrap this and just use GLTF instead once we know we have the semantics correct
# * * Collada was just easier to bootstrap given that it's the only officially supported input format
# * * Collada tooling sucks and even LL is moving away from it
# * * Ensuring LLMesh->Collada and LLMesh->GLTF conversion don't differ semantically is easy via assimp.
import logging
import os.path
import secrets
import sys
from typing import Dict, List, Optional, Union, Sequence
import collada
import collada.source
from collada import E
from lxml import etree
import numpy as np
import transformations
from hippolyzer.lib.base.datatypes import Vector3
from hippolyzer.lib.base.helpers import get_resource_filename
from hippolyzer.lib.base.serialization import BufferReader
from hippolyzer.lib.base.mesh import LLMeshSerializer, MeshAsset, positions_from_domain, SkinSegmentDict
LOG = logging.getLogger(__name__)
DIR = os.path.dirname(os.path.realpath(__file__))
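# Note on conventions, inferred from the flatten orders below: the flat 16-float
# matrices in LLSD skin blocks are column-major (Fortran order), while Collada's
# <matrix> text expects row-major, hence order='F' on read and order='C' on write.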
def llsd_to_mat4(mat: Union[np.ndarray, Sequence[float]]) -> np.ndarray:
return np.array(mat).reshape((4, 4), order='F')
def mat4_to_llsd(mat: np.ndarray) -> List[float]:
return list(mat.flatten(order='F'))
def mat4_to_collada(mat: np.ndarray) -> np.ndarray:
return mat.flatten(order='C')
def mesh_to_collada(ll_mesh: MeshAsset, include_skin=True) -> collada.Collada:
dae = collada.Collada()
axis = collada.asset.UP_AXIS.Z_UP
dae.assetInfo.upaxis = axis
scene = collada.scene.Scene("scene", [llmesh_to_node(ll_mesh, dae, include_skin=include_skin)])
dae.scenes.append(scene)
dae.scene = scene
return dae
def llmesh_to_node(ll_mesh: MeshAsset, dae: collada.Collada, uniq=None,
include_skin=True, node_transform: Optional[np.ndarray] = None) -> collada.scene.Node:
if node_transform is None:
node_transform = np.identity(4)
should_skin = False
skin_seg = ll_mesh.segments.get('skin')
bind_shape_matrix = None
if include_skin and skin_seg:
bind_shape_matrix = llsd_to_mat4(skin_seg["bind_shape_matrix"])
should_skin = True
# Transform from the skin will be applied on the controller, not the node
node_transform = np.identity(4)
if not uniq:
uniq = secrets.token_urlsafe(4)
geom_nodes = []
node_name = f"mainnode{uniq}"
# TODO: do the other LODs?
for submesh_num, submesh in enumerate(ll_mesh.segments["high_lod"]):
# Make sure none of our IDs collide with those of other nodes
sub_uniq = uniq + str(submesh_num)
range_xyz = positions_from_domain(submesh["Position"], submesh["PositionDomain"])
xyz = np.array([x.data() for x in range_xyz])
range_uv = positions_from_domain(submesh['TexCoord0'], submesh['TexCoord0Domain'])
uv = np.array([x.data() for x in range_uv]).flatten()
norms = np.array([x.data() for x in submesh["Normal"]])
effect = collada.material.Effect(
id=f"effect{sub_uniq}",
params=[],
specular=(0.0, 0.0, 0.0, 0.0),
reflectivity=(0.0, 0.0, 0.0, 0.0),
emission=(0.0, 0.0, 0.0, 0.0),
ambient=(0.0, 0.0, 0.0, 0.0),
reflective=0.0,
shadingtype="blinn",
shininess=0.0,
diffuse=(0.0, 0.0, 0.0),
)
mat = collada.material.Material(f"material{sub_uniq}", f"material{sub_uniq}", effect)
dae.materials.append(mat)
dae.effects.append(effect)
vert_src = collada.source.FloatSource(f"verts-array{sub_uniq}", xyz.flatten(), ("X", "Y", "Z"))
norm_src = collada.source.FloatSource(f"norms-array{sub_uniq}", norms.flatten(), ("X", "Y", "Z"))
# UV maps have to have the same name or they'll behave weirdly when objects are merged.
uv_src = collada.source.FloatSource("uvs-array", np.array(uv), ("U", "V"))
geom = collada.geometry.Geometry(dae, f"geometry{sub_uniq}", "geometry", [vert_src, norm_src, uv_src])
input_list = collada.source.InputList()
input_list.addInput(0, 'VERTEX', f'#verts-array{sub_uniq}', set="0")
input_list.addInput(0, 'NORMAL', f'#norms-array{sub_uniq}', set="0")
input_list.addInput(0, 'TEXCOORD', '#uvs-array', set="0")
tri_idxs = np.array(submesh["TriangleList"]).flatten()
matnode = collada.scene.MaterialNode(f"materialref{sub_uniq}", mat, inputs=[])
tri_set = geom.createTriangleSet(tri_idxs, input_list, f'materialref{sub_uniq}')
geom.primitives.append(tri_set)
dae.geometries.append(geom)
if should_skin:
joint_names = np.array(skin_seg['joint_names'], dtype=object)
joints_source = collada.source.NameSource(f"joint-names{sub_uniq}", joint_names, ("JOINT",))
# PyCollada has a bug where it doesn't set the source URI correctly. Fix it.
accessor = joints_source.xmlnode.find(f"{dae.tag('technique_common')}/{dae.tag('accessor')}")
if not accessor.get('source').startswith('#'):
accessor.set('source', f"#{accessor.get('source')}")
flattened_bind_poses = []
for bind_pose in skin_seg['inverse_bind_matrix']:
flattened_bind_poses.append(mat4_to_collada(llsd_to_mat4(bind_pose)))
flattened_bind_poses = np.array(flattened_bind_poses)
inv_bind_source = _create_mat4_source(f"bind-poses{sub_uniq}", flattened_bind_poses, "TRANSFORM")
weight_joint_idxs = []
weights = []
vert_weight_counts = []
cur_weight_idx = 0
for vert_weights in submesh['Weights']:
vert_weight_counts.append(len(vert_weights))
for vert_weight in vert_weights:
weights.append(vert_weight.weight)
weight_joint_idxs.append(vert_weight.joint_idx)
weight_joint_idxs.append(cur_weight_idx)
cur_weight_idx += 1
weights_source = collada.source.FloatSource(f"skin-weights{sub_uniq}", np.array(weights), ("WEIGHT",))
# We need to make a controller for each material since materials are essentially distinct meshes
# in SL, with their own distinct sets of weights and vertex data.
controller_node = E.controller(
E.skin(
E.bind_shape_matrix(' '.join(str(x) for x in mat4_to_collada(bind_shape_matrix))),
joints_source.xmlnode,
inv_bind_source.xmlnode,
weights_source.xmlnode,
E.joints(
E.input(semantic="JOINT", source=f"#joint-names{sub_uniq}"),
E.input(semantic="INV_BIND_MATRIX", source=f"#bind-poses{sub_uniq}")
),
E.vertex_weights(
E.input(semantic="JOINT", source=f"#joint-names{sub_uniq}", offset="0"),
E.input(semantic="WEIGHT", source=f"#skin-weights{sub_uniq}", offset="1"),
E.vcount(' '.join(str(x) for x in vert_weight_counts)),
E.v(' '.join(str(x) for x in weight_joint_idxs)),
count=str(len(submesh['Weights']))
),
source=f"#geometry{sub_uniq}"
),
id=f"Armature-{sub_uniq}",
name=node_name
)
controller = collada.controller.Controller.load(dae, {}, controller_node)
dae.controllers.append(controller)
geom_node = collada.scene.ControllerNode(controller, [matnode])
else:
geom_node = collada.scene.GeometryNode(geom, [matnode])
geom_nodes.append(geom_node)
node = collada.scene.Node(
node_name,
children=geom_nodes,
transforms=[collada.scene.MatrixTransform(mat4_to_collada(node_transform))],
)
if should_skin:
# We need a skeleton per _mesh asset_ because you could have incongruous skeletons
# within the same linkset.
skel_root = load_skeleton_nodes()
transform_skeleton(skel_root, dae, skin_seg)
skel = collada.scene.Node.load(dae, skel_root, {})
skel.children.append(node)
skel.id = f"Skel-{uniq}"
skel.save()
node = skel
return node
def load_skeleton_nodes() -> etree.ElementBase:
# TODO: this sucks. Can't we construct nodes with the appropriate transformation
# matrices from the data in `avatar_skeleton.xml`?
skel_path = get_resource_filename("lib/base/data/male_collada_joints.xml")
with open(skel_path, 'r') as f:
return etree.fromstring(f.read())
def transform_skeleton(skel_root: etree.ElementBase, dae: collada.Collada, skin_seg: SkinSegmentDict,
include_unreferenced_bones=False):
"""Update skeleton XML nodes to account for joint translations in the mesh"""
# TODO: Use translation component only.
joint_nodes: Dict[str, collada.scene.Node] = {}
for skel_node in skel_root.iter():
# xpath is loathsome so this is easier.
if skel_node.tag != dae.tag('node') or skel_node.get('type') != 'JOINT':
continue
joint_nodes[skel_node.get('name')] = collada.scene.Node.load(dae, skel_node, {})
for joint_name, matrix in zip(skin_seg['joint_names'], skin_seg.get('alt_inverse_bind_matrix', [])):
joint_node = joint_nodes[joint_name]
joint_decomp = transformations.decompose_matrix(llsd_to_mat4(matrix))
joint_node.matrix = mat4_to_collada(transformations.compose_matrix(translate=joint_decomp[3]))
# Update the underlying XML element with the new transform matrix
joint_node.save()
if not include_unreferenced_bones:
needed_hierarchy = set()
for skel_node in joint_nodes.values():
skel_node = skel_node.xmlnode
if skel_node.get('name') in skin_seg['joint_names']:
# Add this joint and any ancestors to the list of needed joints
while skel_node is not None:
needed_hierarchy.add(skel_node.get('name'))
skel_node = skel_node.getparent()
for skel_node in joint_nodes.values():
skel_node = skel_node.xmlnode
if skel_node.get('name') not in needed_hierarchy:
skel_node.getparent().remove(skel_node)
pelvis_offset = skin_seg.get('pelvis_offset')
# TODO: should we even do this here? It's not present in the collada, just
# something that's specified in the uploader before conversion to LLMesh.
if pelvis_offset and 'mPelvis' in joint_nodes:
pelvis_node = joint_nodes['mPelvis']
# Column-major!
pelvis_node.matrix[3][2] += pelvis_offset
pelvis_node.save()
def _create_mat4_source(name: str, data: np.ndarray, semantic: str):
# PyCollada has no way to make a source with a float4x4 semantic. Do it a bad way.
# Note that collada demands column-major matrices whereas LLSD mesh has them row-major!
source = collada.source.FloatSource(name, data, tuple(f"M{x}" for x in range(16)))
accessor = source.xmlnode[1][0]
for child in list(accessor):
accessor.remove(child)
accessor.append(E.param(name=semantic, type="float4x4"))
return source
def fix_weird_bind_matrices(skin_seg: SkinSegmentDict):
"""
Fix weird-looking bind matrices to have normal scaling and rotations
Not sure why these even happen (weird mesh authoring programs?)
Sometimes get enormous inverse bind matrices (each component 10k+) and tiny
bind shape matrix components. This detects inverse bind shape matrices
with weird scales and tries to set them to what they "should" be without
the weird inverted scaling.
"""
scale_fixup = Vector3(1, 1, 1)
angle_fixup = Vector3(0, 0, 0)
have_fixups = False
# Totally non-scientific method of detecting odd bind matrices based on squinting very,
# very hard at a random sample of assets.
for joint_name, joint_inv in zip(skin_seg['joint_names'], skin_seg['inverse_bind_matrix']):
if not joint_name.startswith("m"):
# We can't make very good guesses based on collision volume scales and rotations,
# skip anything but the "m" joints.
continue
joint_mat = llsd_to_mat4(joint_inv)
joint_scale, _, joint_angle, _, _ = transformations.decompose_matrix(joint_mat)
# If the scale component of an mJointName joint isn't roughly <1,1,1>, we likely have
# scaling applied to the inverse bind matrices rather than the bind matrix. Figure out
# what the fixup should be so that we can reverse it.
if abs(3.0 - sum(joint_scale)) > 0.5:
scale_fixup = Vector3(1, 1, 1) / Vector3(*joint_scale)
have_fixups = True
# I wouldn't expect mJointName joints to be rotated at all in their inverse bind matrices.
# Is this a rotation that should've been applied to the bind shape matrix instead?
# In any event, all joints are likely rotated by this amount, so calculate the inverse.
if abs(sum(joint_angle)) > 0.05:
angle_fixup = -Vector3(*joint_angle)
have_fixups = True
if have_fixups:
LOG.warning("Detected weird matrices in mesh!", scale_fixup, angle_fixup)
# The magnitude of the scales in the inverse bind matrices look very strange.
# The bind matrix itself is probably messed up as well, try to fix it.
# TODO: DON'T MESS WITH INVERSE TRANSLATION!!!! Only bind shape gets its translation scaled.
# TODO: put this back in, the previous logic was totally wrong-headed..
pass
def main():
# Take an llmesh file as an argument and spit out basename-converted.dae
with open(sys.argv[1], "rb") as f:
reader = BufferReader("<", f.read())
dae = mesh_to_collada(reader.read(LLMeshSerializer(parse_segment_contents=True)))
dae.write(sys.argv[1].rsplit(".", 1)[0] + "-converted.dae")
if __name__ == "__main__":
main()
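Run as a script, this acts as a tiny one-shot converter. With a hypothetical module and input name:
# python llmesh_to_collada.py couch.llmesh   ->   writes couch-converted.dae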

View File

@@ -0,0 +1,485 @@
<!-- from http://wiki.secondlife.com/wiki/Project_Bento_Resources_and_Information collada -->
<node id="Avatar" name="Avatar" type="NODE" xmlns="http://www.collada.org/2005/11/COLLADASchema">
<translate sid="location">0 0 0</translate>
<rotate sid="rotationZ">0 0 1 0</rotate>
<rotate sid="rotationY">0 1 0 0</rotate>
<rotate sid="rotationX">1 0 0 0</rotate>
<scale sid="scale">1 1 1</scale>
<node id="mPelvis" name="mPelvis" sid="mPelvis" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 1.067 0 0 0 1</matrix>
<node id="PELVIS" name="PELVIS" sid="PELVIS" type="JOINT">
<matrix sid="transform">1 0 0 -0.01 0 1 0 0 0 0 1 -0.02 0 0 0 1</matrix>
</node>
<node id="BUTT" name="BUTT" sid="BUTT" type="JOINT">
<matrix sid="transform">1 0 0 -0.06 0 1 0 0 0 0 1 -0.1 0 0 0 1</matrix>
</node>
<node id="mSpine1" name="mSpine1" sid="mSpine1" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0.084 0 0 0 1</matrix>
<node id="mSpine2" name="mSpine2" sid="mSpine2" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 -0.084 0 0 0 1</matrix>
<node id="mTorso" name="mTorso" sid="mTorso" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0.084 0 0 0 1</matrix>
<node id="BELLY" name="BELLY" sid="BELLY" type="JOINT">
<matrix sid="transform">1 0 0 0.028 0 1 0 0 0 0 1 0.04 0 0 0 1</matrix>
</node>
<node id="LEFT_HANDLE" name="LEFT_HANDLE" sid="LEFT_HANDLE" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0.1 0 0 1 0.058 0 0 0 1</matrix>
</node>
<node id="RIGHT_HANDLE" name="RIGHT_HANDLE" sid="RIGHT_HANDLE" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 -0.1 0 0 1 0.058 0 0 0 1</matrix>
</node>
<node id="LOWER_BACK" name="LOWER_BACK" sid="LOWER_BACK" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0.023 0 0 0 1</matrix>
</node>
<node id="mSpine3" name="mSpine3" sid="mSpine3" type="JOINT">
<matrix sid="transform">1 0 0 -0.015 0 1 0 0 0 0 1 0.205 0 0 0 1</matrix>
<node id="mSpine4" name="mSpine4" sid="mSpine4" type="JOINT">
<matrix sid="transform">1 0 0 0.015 0 1 0 0 0 0 1 -0.205 0 0 0 1</matrix>
<node id="mChest" name="mChest" sid="mChest" type="JOINT">
<matrix sid="transform">1 0 0 -0.015 0 1 0 0 0 0 1 0.205 0 0 0 1</matrix>
<node id="CHEST" name="CHEST" sid="CHEST" type="JOINT">
<matrix sid="transform">1 0 0 0.028 0 1 0 0 0 0 1 0.07 0 0 0 1</matrix>
</node>
<node id="LEFT_PEC" name="LEFT_PEC" sid="LEFT_PEC" type="JOINT">
<matrix sid="transform">1 0 0 0.119 0 1 0 0.082 0 0 1 0.042 0 0 0 1</matrix>
</node>
<node id="RIGHT_PEC" name="RIGHT_PEC" sid="RIGHT_PEC" type="JOINT">
<matrix sid="transform">1 0 0 0.119 0 1 0 -0.082 0 0 1 0.042 0 0 0 1</matrix>
</node>
<node id="UPPER_BACK" name="UPPER_BACK" sid="UPPER_BACK" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0.017 0 0 0 1</matrix>
</node>
<node id="mNeck" name="mNeck" sid="mNeck" type="JOINT">
<matrix sid="transform">1 0 0 -0.01 0 1 0 0 0 0 1 0.251 0 0 0 1</matrix>
<node id="NECK" name="NECK" sid="NECK" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0.02 0 0 0 1</matrix>
</node>
<node id="mHead" name="mHead" sid="mHead" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0.076 0 0 0 1</matrix>
<node id="HEAD" name="HEAD" sid="HEAD" type="JOINT">
<matrix sid="transform">1 0 0 0.02 0 1 0 0 0 0 1 0.07 0 0 0 1</matrix>
</node>
<node id="mSkull" name="mSkull" sid="mSkull" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0.079 0 0 0 1</matrix>
</node>
<node id="mEyeRight" name="mEyeRight" sid="mEyeRight" type="JOINT">
<matrix sid="transform">1 0 0 0.098 0 1 0 -0.036 0 0 1 0.079 0 0 0 1</matrix>
</node>
<node id="mEyeLeft" name="mEyeLeft" sid="mEyeLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.098 0 1 0 0.036 0 0 1 0.079 0 0 0 1</matrix>
</node>
<node id="mFaceRoot" name="mFaceRoot" sid="mFaceRoot" type="JOINT">
<matrix sid="transform">1 0 0 0.025 0 1 0 0 0 0 1 0.045 0 0 0 1</matrix>
<node id="mFaceEyeAltRight" name="mFaceEyeAltRight" sid="mFaceEyeAltRight" type="JOINT">
<matrix sid="transform">1 0 0 0.073 0 1 0 -0.036 0 0 1 0.034 0 0 0 1</matrix>
</node>
<node id="mFaceEyeAltLeft" name="mFaceEyeAltLeft" sid="mFaceEyeAltLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.073 0 1 0 0.036 0 0 1 0.034 0 0 0 1</matrix>
</node>
<node id="mFaceForeheadLeft" name="mFaceForeheadLeft" sid="mFaceForeheadLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.061 0 1 0 0.035 0 0 1 0.083 0 0 0 1</matrix>
</node>
<node id="mFaceForeheadRight" name="mFaceForeheadRight" sid="mFaceForeheadRight" type="JOINT">
<matrix sid="transform">1 0 0 0.061 0 1 0 -0.035 0 0 1 0.083 0 0 0 1</matrix>
</node>
<node id="mFaceEyebrowOuterLeft" name="mFaceEyebrowOuterLeft" sid="mFaceEyebrowOuterLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.064 0 1 0 0.051 0 0 1 0.048 0 0 0 1</matrix>
</node>
<node id="mFaceEyebrowCenterLeft" name="mFaceEyebrowCenterLeft" sid="mFaceEyebrowCenterLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.07 0 1 0 0.043 0 0 1 0.056 0 0 0 1</matrix>
</node>
<node id="mFaceEyebrowInnerLeft" name="mFaceEyebrowInnerLeft" sid="mFaceEyebrowInnerLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.075 0 1 0 0.022 0 0 1 0.051 0 0 0 1</matrix>
</node>
<node id="mFaceEyebrowOuterRight" name="mFaceEyebrowOuterRight" sid="mFaceEyebrowOuterRight" type="JOINT">
<matrix sid="transform">1 0 0 0.064 0 1 0 -0.051 0 0 1 0.048 0 0 0 1</matrix>
</node>
<node id="mFaceEyebrowCenterRight" name="mFaceEyebrowCenterRight" sid="mFaceEyebrowCenterRight" type="JOINT">
<matrix sid="transform">1 0 0 0.07 0 1 0 -0.043 0 0 1 0.056 0 0 0 1</matrix>
</node>
<node id="mFaceEyebrowInnerRight" name="mFaceEyebrowInnerRight" sid="mFaceEyebrowInnerRight" type="JOINT">
<matrix sid="transform">1 0 0 0.075 0 1 0 -0.022 0 0 1 0.051 0 0 0 1</matrix>
</node>
<node id="mFaceEyeLidUpperLeft" name="mFaceEyeLidUpperLeft" sid="mFaceEyeLidUpperLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.073 0 1 0 0.036 0 0 1 0.034 0 0 0 1</matrix>
</node>
<node id="mFaceEyeLidLowerLeft" name="mFaceEyeLidLowerLeft" sid="mFaceEyeLidLowerLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.073 0 1 0 0.036 0 0 1 0.034 0 0 0 1</matrix>
</node>
<node id="mFaceEyeLidUpperRight" name="mFaceEyeLidUpperRight" sid="mFaceEyeLidUpperRight" type="JOINT">
<matrix sid="transform">1 0 0 0.073 0 1 0 -0.036 0 0 1 0.034 0 0 0 1</matrix>
</node>
<node id="mFaceEyeLidLowerRight" name="mFaceEyeLidLowerRight" sid="mFaceEyeLidLowerRight" type="JOINT">
<matrix sid="transform">1 0 0 0.073 0 1 0 -0.036 0 0 1 0.034 0 0 0 1</matrix>
</node>
<node id="mFaceEar1Left" name="mFaceEar1Left" sid="mFaceEar1Left" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0.08 0 0 1 0.002 0 0 0 1</matrix>
<node id="mFaceEar2Left" name="mFaceEar2Left" sid="mFaceEar2Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.019 0 1 0 0.018 0 0 1 0.025 0 0 0 1</matrix>
</node>
</node>
<node id="mFaceEar1Right" name="mFaceEar1Right" sid="mFaceEar1Right" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 -0.08 0 0 1 0.002 0 0 0 1</matrix>
<node id="mFaceEar2Right" name="mFaceEar2Right" sid="mFaceEar2Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.019 0 1 0 -0.018 0 0 1 0.025 0 0 0 1</matrix>
</node>
</node>
<node id="mFaceNoseLeft" name="mFaceNoseLeft" sid="mFaceNoseLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.086 0 1 0 0.015 0 0 1 -0.004 0 0 0 1</matrix>
</node>
<node id="mFaceNoseCenter" name="mFaceNoseCenter" sid="mFaceNoseCenter" type="JOINT">
<matrix sid="transform">1 0 0 0.102 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mFaceNoseRight" name="mFaceNoseRight" sid="mFaceNoseRight" type="JOINT">
<matrix sid="transform">1 0 0 0.086 0 1 0 -0.015 0 0 1 -0.004 0 0 0 1</matrix>
</node>
<node id="mFaceCheekLowerLeft" name="mFaceCheekLowerLeft" sid="mFaceCheekLowerLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.05 0 1 0 0.034 0 0 1 -0.031 0 0 0 1</matrix>
</node>
<node id="mFaceCheekUpperLeft" name="mFaceCheekUpperLeft" sid="mFaceCheekUpperLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.07 0 1 0 0.034 0 0 1 -0.005 0 0 0 1</matrix>
</node>
<node id="mFaceCheekLowerRight" name="mFaceCheekLowerRight" sid="mFaceCheekLowerRight" type="JOINT">
<matrix sid="transform">1 0 0 0.05 0 1 0 -0.034 0 0 1 -0.031 0 0 0 1</matrix>
</node>
<node id="mFaceCheekUpperRight" name="mFaceCheekUpperRight" sid="mFaceCheekUpperRight" type="JOINT">
<matrix sid="transform">1 0 0 0.07 0 1 0 -0.034 0 0 1 -0.005 0 0 0 1</matrix>
</node>
<node id="mFaceJaw" name="mFaceJaw" sid="mFaceJaw" type="JOINT">
<matrix sid="transform">1 0 0 -0.001 0 1 0 0 0 0 1 -0.015 0 0 0 1</matrix>
<node id="mFaceChin" name="mFaceChin" sid="mFaceChin" type="JOINT">
<matrix sid="transform">1 0 0 0.074 0 1 0 0 0 0 1 -0.054 0 0 0 1</matrix>
</node>
<node id="mFaceTeethLower" name="mFaceTeethLower" sid="mFaceTeethLower" type="JOINT">
<matrix sid="transform">1 0 0 0.021 0 1 0 0 0 0 1 -0.039 0 0 0 1</matrix>
<node id="mFaceLipLowerLeft" name="mFaceLipLowerLeft" sid="mFaceLipLowerLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.045 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mFaceLipLowerRight" name="mFaceLipLowerRight" sid="mFaceLipLowerRight" type="JOINT">
<matrix sid="transform">1 0 0 0.045 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mFaceLipLowerCenter" name="mFaceLipLowerCenter" sid="mFaceLipLowerCenter" type="JOINT">
<matrix sid="transform">1 0 0 0.045 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mFaceTongueBase" name="mFaceTongueBase" sid="mFaceTongueBase" type="JOINT">
<matrix sid="transform">1 0 0 0.039 0 1 0 0 0 0 1 0.005 0 0 0 1</matrix>
<node id="mFaceTongueTip" name="mFaceTongueTip" sid="mFaceTongueTip" type="JOINT">
<matrix sid="transform">1 0 0 0.022 0 1 0 0 0 0 1 0.007 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
<node id="mFaceJawShaper" name="mFaceJawShaper" sid="mFaceJawShaper" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mFaceForeheadCenter" name="mFaceForeheadCenter" sid="mFaceForeheadCenter" type="JOINT">
<matrix sid="transform">1 0 0 0.069 0 1 0 0 0 0 1 0.065 0 0 0 1</matrix>
</node>
<node id="mFaceNoseBase" name="mFaceNoseBase" sid="mFaceNoseBase" type="JOINT">
<matrix sid="transform">1 0 0 0.094 0 1 0 0 0 0 1 -0.016 0 0 0 1</matrix>
</node>
<node id="mFaceTeethUpper" name="mFaceTeethUpper" sid="mFaceTeethUpper" type="JOINT">
<matrix sid="transform">1 0 0 0.02 0 1 0 0 0 0 1 -0.03 0 0 0 1</matrix>
<node id="mFaceLipUpperLeft" name="mFaceLipUpperLeft" sid="mFaceLipUpperLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.045 0 1 0 0 0 0 1 -0.003 0 0 0 1</matrix>
</node>
<node id="mFaceLipUpperRight" name="mFaceLipUpperRight" sid="mFaceLipUpperRight" type="JOINT">
<matrix sid="transform">1 0 0 0.045 0 1 0 0 0 0 1 -0.003 0 0 0 1</matrix>
</node>
<node id="mFaceLipCornerLeft" name="mFaceLipCornerLeft" sid="mFaceLipCornerLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.028 0 1 0 -0.019 0 0 1 -0.01 0 0 0 1</matrix>
</node>
<node id="mFaceLipCornerRight" name="mFaceLipCornerRight" sid="mFaceLipCornerRight" type="JOINT">
<matrix sid="transform">1 0 0 0.028 0 1 0 0.019 0 0 1 -0.01 0 0 0 1</matrix>
</node>
<node id="mFaceLipUpperCenter" name="mFaceLipUpperCenter" sid="mFaceLipUpperCenter" type="JOINT">
<matrix sid="transform">1 0 0 0.045 0 1 0 0 0 0 1 -0.003 0 0 0 1</matrix>
</node>
</node>
<node id="mFaceEyecornerInnerLeft" name="mFaceEyecornerInnerLeft" sid="mFaceEyecornerInnerLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.075 0 1 0 0.017 0 0 1 0.032 0 0 0 1</matrix>
</node>
<node id="mFaceEyecornerInnerRight" name="mFaceEyecornerInnerRight" sid="mFaceEyecornerInnerRight" type="JOINT">
<matrix sid="transform">1 0 0 0.075 0 1 0 -0.017 0 0 1 0.032 0 0 0 1</matrix>
</node>
<node id="mFaceNoseBridge" name="mFaceNoseBridge" sid="mFaceNoseBridge" type="JOINT">
<matrix sid="transform">1 0 0 0.091 0 1 0 0 0 0 1 0.02 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
<node id="mCollarLeft" name="mCollarLeft" sid="mCollarLeft" type="JOINT">
<matrix sid="transform">1 0 0 -0.021 0 1 0 0.085 0 0 1 0.165 0 0 0 1</matrix>
<node id="L_CLAVICLE" name="L_CLAVICLE" sid="L_CLAVICLE" type="JOINT">
<matrix sid="transform">1 0 0 0.02 0 1 0 0 0 0 1 0.02 0 0 0 1</matrix>
</node>
<node id="mShoulderLeft" name="mShoulderLeft" sid="mShoulderLeft" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0.079 0 0 1 0 0 0 0 1</matrix>
<node id="L_UPPER_ARM" name="L_UPPER_ARM" sid="L_UPPER_ARM" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0.12 0 0 1 0.01 0 0 0 1</matrix>
</node>
<node id="mElbowLeft" name="mElbowLeft" sid="mElbowLeft" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0.248 0 0 1 0 0 0 0 1</matrix>
<node id="L_LOWER_ARM" name="L_LOWER_ARM" sid="L_LOWER_ARM" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0.1 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mWristLeft" name="mWristLeft" sid="mWristLeft" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0.205 0 0 1 0 0 0 0 1</matrix>
<node id="L_HAND" name="L_HAND" sid="L_HAND" type="JOINT">
<matrix sid="transform">1 0 0 0.01 0 1 0 0.05 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mHandMiddle1Left" name="mHandMiddle1Left" sid="mHandMiddle1Left" type="JOINT">
<matrix sid="transform">1 0 0 0.013 0 1 0 0.101 0 0 1 0.015 0 0 0 1</matrix>
<node id="mHandMiddle2Left" name="mHandMiddle2Left" sid="mHandMiddle2Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.001 0 1 0 0.04 0 0 1 -0.006 0 0 0 1</matrix>
<node id="mHandMiddle3Left" name="mHandMiddle3Left" sid="mHandMiddle3Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.001 0 1 0 0.049 0 0 1 -0.008 0 0 0 1</matrix>
</node>
</node>
</node>
<node id="mHandIndex1Left" name="mHandIndex1Left" sid="mHandIndex1Left" type="JOINT">
<matrix sid="transform">1 0 0 0.038 0 1 0 0.097 0 0 1 0.015 0 0 0 1</matrix>
<node id="mHandIndex2Left" name="mHandIndex2Left" sid="mHandIndex2Left" type="JOINT">
<matrix sid="transform">1 0 0 0.017 0 1 0 0.036 0 0 1 -0.006 0 0 0 1</matrix>
<node id="mHandIndex3Left" name="mHandIndex3Left" sid="mHandIndex3Left" type="JOINT">
<matrix sid="transform">1 0 0 0.014 0 1 0 0.032 0 0 1 -0.006 0 0 0 1</matrix>
</node>
</node>
</node>
<node id="mHandRing1Left" name="mHandRing1Left" sid="mHandRing1Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.01 0 1 0 0.099 0 0 1 0.009 0 0 0 1</matrix>
<node id="mHandRing2Left" name="mHandRing2Left" sid="mHandRing2Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.013 0 1 0 0.038 0 0 1 -0.008 0 0 0 1</matrix>
<node id="mHandRing3Left" name="mHandRing3Left" sid="mHandRing3Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.013 0 1 0 0.04 0 0 1 -0.009 0 0 0 1</matrix>
</node>
</node>
</node>
<node id="mHandPinky1Left" name="mHandPinky1Left" sid="mHandPinky1Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.031 0 1 0 0.095 0 0 1 0.003 0 0 0 1</matrix>
<node id="mHandPinky2Left" name="mHandPinky2Left" sid="mHandPinky2Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.024 0 1 0 0.025 0 0 1 -0.006 0 0 0 1</matrix>
<node id="mHandPinky3Left" name="mHandPinky3Left" sid="mHandPinky3Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.015 0 1 0 0.018 0 0 1 -0.004 0 0 0 1</matrix>
</node>
</node>
</node>
<node id="mHandThumb1Left" name="mHandThumb1Left" sid="mHandThumb1Left" type="JOINT">
<matrix sid="transform">1 0 0 0.031 0 1 0 0.026 0 0 1 0.004 0 0 0 1</matrix>
<node id="mHandThumb2Left" name="mHandThumb2Left" sid="mHandThumb2Left" type="JOINT">
<matrix sid="transform">1 0 0 0.028 0 1 0 0.032 0 0 1 -0.001 0 0 0 1</matrix>
<node id="mHandThumb3Left" name="mHandThumb3Left" sid="mHandThumb3Left" type="JOINT">
<matrix sid="transform">1 0 0 0.023 0 1 0 0.031 0 0 1 -0.001 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
</node>
</node>
</node>
<node id="mCollarRight" name="mCollarRight" sid="mCollarRight" type="JOINT">
<matrix sid="transform">1 0 0 -0.021 0 1 0 -0.085 0 0 1 0.165 0 0 0 1</matrix>
<node id="R_CLAVICLE" name="R_CLAVICLE" sid="R_CLAVICLE" type="JOINT">
<matrix sid="transform">1 0 0 0.02 0 1 0 0 0 0 1 0.02 0 0 0 1</matrix>
</node>
<node id="mShoulderRight" name="mShoulderRight" sid="mShoulderRight" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 -0.079 0 0 1 0 0 0 0 1</matrix>
<node id="R_UPPER_ARM" name="R_UPPER_ARM" sid="R_UPPER_ARM" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 -0.12 0 0 1 0.01 0 0 0 1</matrix>
</node>
<node id="mElbowRight" name="mElbowRight" sid="mElbowRight" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 -0.248 0 0 1 0 0 0 0 1</matrix>
<node id="R_LOWER_ARM" name="R_LOWER_ARM" sid="R_LOWER_ARM" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 -0.1 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mWristRight" name="mWristRight" sid="mWristRight" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 -0.205 0 0 1 0 0 0 0 1</matrix>
<node id="R_HAND" name="R_HAND" sid="R_HAND" type="JOINT">
<matrix sid="transform">1 0 0 0.01 0 1 0 -0.05 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mHandMiddle1Right" name="mHandMiddle1Right" sid="mHandMiddle1Right" type="JOINT">
<matrix sid="transform">1 0 0 0.013 0 1 0 -0.101 0 0 1 0.015 0 0 0 1</matrix>
<node id="mHandMiddle2Right" name="mHandMiddle2Right" sid="mHandMiddle2Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.001 0 1 0 -0.04 0 0 1 -0.006 0 0 0 1</matrix>
<node id="mHandMiddle3Right" name="mHandMiddle3Right" sid="mHandMiddle3Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.001 0 1 0 -0.049 0 0 1 -0.008 0 0 0 1</matrix>
</node>
</node>
</node>
<node id="mHandIndex1Right" name="mHandIndex1Right" sid="mHandIndex1Right" type="JOINT">
<matrix sid="transform">1 0 0 0.038 0 1 0 -0.097 0 0 1 0.015 0 0 0 1</matrix>
<node id="mHandIndex2Right" name="mHandIndex2Right" sid="mHandIndex2Right" type="JOINT">
<matrix sid="transform">1 0 0 0.017 0 1 0 -0.036 0 0 1 -0.006 0 0 0 1</matrix>
<node id="mHandIndex3Right" name="mHandIndex3Right" sid="mHandIndex3Right" type="JOINT">
<matrix sid="transform">1 0 0 0.014 0 1 0 -0.032 0 0 1 -0.006 0 0 0 1</matrix>
</node>
</node>
</node>
<node id="mHandRing1Right" name="mHandRing1Right" sid="mHandRing1Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.01 0 1 0 -0.099 0 0 1 0.009 0 0 0 1</matrix>
<node id="mHandRing2Right" name="mHandRing2Right" sid="mHandRing2Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.013 0 1 0 -0.038 0 0 1 -0.008 0 0 0 1</matrix>
<node id="mHandRing3Right" name="mHandRing3Right" sid="mHandRing3Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.013 0 1 0 -0.04 0 0 1 -0.009 0 0 0 1</matrix>
</node>
</node>
</node>
<node id="mHandPinky1Right" name="mHandPinky1Right" sid="mHandPinky1Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.031 0 1 0 -0.095 0 0 1 0.003 0 0 0 1</matrix>
<node id="mHandPinky2Right" name="mHandPinky2Right" sid="mHandPinky2Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.024 0 1 0 -0.025 0 0 1 -0.006 0 0 0 1</matrix>
<node id="mHandPinky3Right" name="mHandPinky3Right" sid="mHandPinky3Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.015 0 1 0 -0.018 0 0 1 -0.004 0 0 0 1</matrix>
</node>
</node>
</node>
<node id="mHandThumb1Right" name="mHandThumb1Right" sid="mHandThumb1Right" type="JOINT">
<matrix sid="transform">1 0 0 0.031 0 1 0 -0.026 0 0 1 0.004 0 0 0 1</matrix>
<node id="mHandThumb2Right" name="mHandThumb2Right" sid="mHandThumb2Right" type="JOINT">
<matrix sid="transform">1 0 0 0.028 0 1 0 -0.032 0 0 1 -0.001 0 0 0 1</matrix>
<node id="mHandThumb3Right" name="mHandThumb3Right" sid="mHandThumb3Right" type="JOINT">
<matrix sid="transform">1 0 0 0.023 0 1 0 -0.031 0 0 1 -0.001 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
</node>
</node>
</node>
<node id="mWingsRoot" name="mWingsRoot" sid="mWingsRoot" type="JOINT">
<matrix sid="transform">1 0 0 -0.014 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
<node id="mWing1Left" name="mWing1Left" sid="mWing1Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.099 0 1 0 0.105 0 0 1 0.181 0 0 0 1</matrix>
<node id="mWing2Left" name="mWing2Left" sid="mWing2Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.168 0 1 0 0.169 0 0 1 0.067 0 0 0 1</matrix>
<node id="mWing3Left" name="mWing3Left" sid="mWing3Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.181 0 1 0 0.183 0 0 1 0 0 0 0 1</matrix>
<node id="mWing4Left" name="mWing4Left" sid="mWing4Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.171 0 1 0 0.173 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mWing4FanLeft" name="mWing4FanLeft" sid="mWing4FanLeft" type="JOINT">
<matrix sid="transform">1 0 0 -0.171 0 1 0 0.173 0 0 1 0 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
<node id="mWing1Right" name="mWing1Right" sid="mWing1Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.099 0 1 0 -0.105 0 0 1 0.181 0 0 0 1</matrix>
<node id="mWing2Right" name="mWing2Right" sid="mWing2Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.168 0 1 0 -0.169 0 0 1 0.067 0 0 0 1</matrix>
<node id="mWing3Right" name="mWing3Right" sid="mWing3Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.181 0 1 0 -0.183 0 0 1 0 0 0 0 1</matrix>
<node id="mWing4Right" name="mWing4Right" sid="mWing4Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.171 0 1 0 -0.173 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mWing4FanRight" name="mWing4FanRight" sid="mWing4FanRight" type="JOINT">
<matrix sid="transform">1 0 0 -0.171 0 1 0 -0.173 0 0 1 0 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
</node>
</node>
</node>
</node>
</node>
</node>
</node>
<node id="mHipRight" name="mHipRight" sid="mHipRight" type="JOINT">
<matrix sid="transform">1 0 0 0.034 0 1 0 -0.129 0 0 1 -0.041 0 0 0 1</matrix>
<node id="R_UPPER_LEG" name="R_UPPER_LEG" sid="R_UPPER_LEG" type="JOINT">
<matrix sid="transform">1 0 0 -0.02 0 1 0 0.05 0 0 1 -0.22 0 0 0 1</matrix>
</node>
<node id="mKneeRight" name="mKneeRight" sid="mKneeRight" type="JOINT">
<matrix sid="transform">1 0 0 -0.001 0 1 0 0.049 0 0 1 -0.491 0 0 0 1</matrix>
<node id="R_LOWER_LEG" name="R_LOWER_LEG" sid="R_LOWER_LEG" type="JOINT">
<matrix sid="transform">1 0 0 -0.02 0 1 0 0 0 0 1 -0.2 0 0 0 1</matrix>
</node>
<node id="mAnkleRight" name="mAnkleRight" sid="mAnkleRight" type="JOINT">
<matrix sid="transform">1 0 0 -0.029 0 1 0 0 0 0 1 -0.468 0 0 0 1</matrix>
<node id="R_FOOT" name="R_FOOT" sid="R_FOOT" type="JOINT">
<matrix sid="transform">1 0 0 0.077 0 1 0 0 0 0 1 -0.041 0 0 0 1</matrix>
</node>
<node id="mFootRight" name="mFootRight" sid="mFootRight" type="JOINT">
<matrix sid="transform">1 0 0 0.112 0 1 0 0 0 0 1 -0.061 0 0 0 1</matrix>
<node id="mToeRight" name="mToeRight" sid="mToeRight" type="JOINT">
<matrix sid="transform">1 0 0 0.109 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
</node>
<node id="mHipLeft" name="mHipLeft" sid="mHipLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.034 0 1 0 0.127 0 0 1 -0.041 0 0 0 1</matrix>
<node id="L_UPPER_LEG" name="L_UPPER_LEG" sid="L_UPPER_LEG" type="JOINT">
<matrix sid="transform">1 0 0 -0.02 0 1 0 -0.05 0 0 1 -0.22 0 0 0 1</matrix>
</node>
<node id="mKneeLeft" name="mKneeLeft" sid="mKneeLeft" type="JOINT">
<matrix sid="transform">1 0 0 -0.001 0 1 0 -0.046 0 0 1 -0.491 0 0 0 1</matrix>
<node id="L_LOWER_LEG" name="L_LOWER_LEG" sid="L_LOWER_LEG" type="JOINT">
<matrix sid="transform">1 0 0 -0.02 0 1 0 0 0 0 1 -0.2 0 0 0 1</matrix>
</node>
<node id="mAnkleLeft" name="mAnkleLeft" sid="mAnkleLeft" type="JOINT">
<matrix sid="transform">1 0 0 -0.029 0 1 0 0.001 0 0 1 -0.468 0 0 0 1</matrix>
<node id="L_FOOT" name="L_FOOT" sid="L_FOOT" type="JOINT">
<matrix sid="transform">1 0 0 0.077 0 1 0 0 0 0 1 -0.041 0 0 0 1</matrix>
</node>
<node id="mFootLeft" name="mFootLeft" sid="mFootLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.112 0 1 0 0 0 0 1 -0.061 0 0 0 1</matrix>
<node id="mToeLeft" name="mToeLeft" sid="mToeLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.109 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
</node>
<node id="mTail1" name="mTail1" sid="mTail1" type="JOINT">
<matrix sid="transform">1 0 0 -0.116 0 1 0 0 0 0 1 0.047 0 0 0 1</matrix>
<node id="mTail2" name="mTail2" sid="mTail2" type="JOINT">
<matrix sid="transform">1 0 0 -0.197 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
<node id="mTail3" name="mTail3" sid="mTail3" type="JOINT">
<matrix sid="transform">1 0 0 -0.168 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
<node id="mTail4" name="mTail4" sid="mTail4" type="JOINT">
<matrix sid="transform">1 0 0 -0.142 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
<node id="mTail5" name="mTail5" sid="mTail5" type="JOINT">
<matrix sid="transform">1 0 0 -0.112 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
<node id="mTail6" name="mTail6" sid="mTail6" type="JOINT">
<matrix sid="transform">1 0 0 -0.094 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
</node>
</node>
<node id="mGroin" name="mGroin" sid="mGroin" type="JOINT">
<matrix sid="transform">1 0 0 0.064 0 1 0 0 0 0 1 -0.097 0 0 0 1</matrix>
</node>
<node id="mHindLimbsRoot" name="mHindLimbsRoot" sid="mHindLimbsRoot" type="JOINT">
<matrix sid="transform">1 0 0 -0.2 0 1 0 0 0 0 1 0.084 0 0 0 1</matrix>
<node id="mHindLimb1Left" name="mHindLimb1Left" sid="mHindLimb1Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.204 0 1 0 0.129 0 0 1 -0.125 0 0 0 1</matrix>
<node id="mHindLimb2Left" name="mHindLimb2Left" sid="mHindLimb2Left" type="JOINT">
<matrix sid="transform">1 0 0 0.002 0 1 0 -0.046 0 0 1 -0.491 0 0 0 1</matrix>
<node id="mHindLimb3Left" name="mHindLimb3Left" sid="mHindLimb3Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.03 0 1 0 -0.003 0 0 1 -0.468 0 0 0 1</matrix>
<node id="mHindLimb4Left" name="mHindLimb4Left" sid="mHindLimb4Left" type="JOINT">
<matrix sid="transform">1 0 0 0.112 0 1 0 0 0 0 1 -0.061 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
<node id="mHindLimb1Right" name="mHindLimb1Right" sid="mHindLimb1Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.204 0 1 0 -0.129 0 0 1 -0.125 0 0 0 1</matrix>
<node id="mHindLimb2Right" name="mHindLimb2Right" sid="mHindLimb2Right" type="JOINT">
<matrix sid="transform">1 0 0 0.002 0 1 0 0.046 0 0 1 -0.491 0 0 0 1</matrix>
<node id="mHindLimb3Right" name="mHindLimb3Right" sid="mHindLimb3Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.03 0 1 0 0.003 0 0 1 -0.468 0 0 0 1</matrix>
<node id="mHindLimb4Right" name="mHindLimb4Right" sid="mHindLimb4Right" type="JOINT">
<matrix sid="transform">1 0 0 0.112 0 1 0 0 0 0 1 -0.061 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
</node>
</node>
</node>

View File

@@ -18,6 +18,8 @@ You should have received a copy of the GNU Lesser General Public License
along with this program; if not, write to the Free Software Foundation,
Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""
from __future__ import annotations
import ast
import enum
import hashlib
@@ -27,6 +29,7 @@ import math
from typing import *
import recordclass
import transformations
logger = getLogger('hippolyzer.lib.base.datatypes')
@@ -58,6 +61,9 @@ class TupleCoord(recordclass.datatuple, _IterableStub): # type: ignore
def __abs__(self):
return self.__class__(*(abs(x) for x in self))
def __neg__(self):
return self.__class__(*(-x for x in self))
def __add__(self, other):
return self.__class__(*(x + y for x, y in zip(self, other)))
@@ -215,6 +221,15 @@ class Quaternion(TupleCoord):
)
return super().__mul__(other)
@classmethod
def from_transformations(cls, coord) -> Quaternion:
"""Convert to W (S) last form"""
return cls(coord[1], coord[2], coord[3], coord[0])
def to_transformations(self) -> Tuple[float, float, float, float]:
"""Convert to W (S) first form for use with the transformations lib"""
return self.W, self.X, self.Y, self.Z
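A round-trip sketch (Gohlke's transformations library deals in W-first quaternions; `math` is imported above):
t_quat = transformations.quaternion_from_euler(0.0, 0.0, math.pi / 2)  # (w, x, y, z)
q = Quaternion.from_transformations(t_quat)  # stored as (x, y, z, w)
roll, pitch, yaw = transformations.euler_from_quaternion(q.to_transformations())
assert abs(yaw - math.pi / 2) < 1e-6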
@classmethod
def from_euler(cls, roll, pitch, yaw, degrees=False):
if degrees:
@@ -236,6 +251,9 @@ class Quaternion(TupleCoord):
return cls(X=x, Y=y, Z=z, W=w)
def to_euler(self) -> Vector3:
return Vector3(*transformations.euler_from_quaternion(self.to_transformations()))
def data(self, wanted_components=None):
if wanted_components == 3:
return self.X, self.Y, self.Z
@@ -244,6 +262,7 @@ class Quaternion(TupleCoord):
class UUID(uuid.UUID):
_NULL_UUID_STR = '00000000-0000-0000-0000-000000000000'
ZERO: UUID
__slots__ = ()
def __init__(self, val: Union[uuid.UUID, str, None] = None, bytes=None, int=None):
@@ -268,12 +287,16 @@ class UUID(uuid.UUID):
return self.__class__(int=self.int ^ other.int)
UUID.ZERO = UUID()
class JankStringyBytes(bytes):
"""
Treat bytes as UTF8 if used in string context
Sinful, but a necessary evil for now since templates don't specify what's
binary and what's a string.
binary and what's a string. There are also certain fields where the value
may be either binary _or_ a string, depending on the context.
"""
__slots__ = ()
@@ -288,12 +311,28 @@ class JankStringyBytes(bytes):
def __ne__(self, other):
return not self.__eq__(other)
def __contains__(self, item):
if isinstance(item, str):
return item in str(self)
return item in bytes(self)
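A minimal sketch of the dual behavior:
jb = JankStringyBytes(b"hello")
assert "ell" in jb   # str operand: containment checks the UTF-8 decoding
assert b"ell" in jb  # bytes operand: containment checks the raw bytes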
class RawBytes(bytes):
__slots__ = ()
pass
_T = TypeVar("_T")
class Pretty(Generic[_T]):
"""Wrapper for var values so Messages will know to serialize"""
__slots__ = ("value",)
def __init__(self, value: _T):
self.value: _T = value
class StringEnum(str, enum.Enum):
def __str__(self):
return self.value
@@ -333,5 +372,5 @@ class TaggedUnion(recordclass.datatuple): # type: ignore
__all__ = [
"Vector3", "Vector4", "Vector2", "Quaternion", "TupleCoord",
"UUID", "RawBytes", "StringEnum", "JankStringyBytes", "TaggedUnion",
"IntEnum", "IntFlag", "flags_to_pod"
"IntEnum", "IntFlag", "flags_to_pod", "Pretty"
]

View File

@@ -2,6 +2,9 @@ from __future__ import annotations
import codecs
import functools
import os
import lazy_object_proxy
import pkg_resources
import re
import weakref
@@ -17,7 +20,7 @@ def _with_patched_multidict(f):
# There's no way to tell pprint "hey, this is a dict,
# this is how you access its items." A lot of the formatting logic
# is in the module-level `_safe_repr()` which we don't want to mess with.
# Instead, pretend our MultiDict has dict's __repr__ and while we're inside
# Instead, pretend our MultiDict has dict's __repr__ while we're inside
# calls to pprint. Hooray.
orig_repr = MultiDict.__repr__
if orig_repr is dict.__repr__:
@@ -65,6 +68,9 @@ class HippoPrettyPrinter(PrettyPrinter):
return f"({reprs})"
def pformat(self, obj: object, *args, **kwargs) -> str:
# Unwrap lazy object proxies before pprinting them
if isinstance(obj, lazy_object_proxy.Proxy):
obj = obj.__wrapped__
if isinstance(obj, (bytes, str)):
return self._str_format(obj)
return self._base_pformat(obj, *args, **kwargs)
@@ -145,3 +151,10 @@ def to_chunks(chunkable: Sequence[_T], chunk_size: int) -> Generator[_T, None, N
while chunkable:
yield chunkable[:chunk_size]
chunkable = chunkable[chunk_size:]
def get_mtime(path):
try:
return os.stat(path).st_mtime
except OSError:
return None
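e.g. a missing path yields None rather than raising:
assert get_mtime("/nonexistent/path") is None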

View File

@@ -7,8 +7,9 @@ from __future__ import annotations
import dataclasses
import datetime as dt
import itertools
import logging
import struct
import typing
import weakref
from io import StringIO
from typing import *
@@ -33,6 +34,17 @@ LOG = logging.getLogger(__name__)
_T = TypeVar("_T")
class SchemaFlagField(SchemaHexInt):
"""Like a hex int, but must be serialized as bytes in LLSD due to being a U32"""
@classmethod
def from_llsd(cls, val: Any) -> int:
return struct.unpack("!I", val)[0]
@classmethod
def to_llsd(cls, val: int) -> Any:
return struct.pack("!I", val)
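A round-trip sketch: the U32 travels as four big-endian bytes in LLSD.
assert SchemaFlagField.to_llsd(0x80) == b"\x00\x00\x00\x80"
assert SchemaFlagField.from_llsd(b"\x00\x00\x00\x80") == 0x80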
def _yield_schema_tokens(reader: StringIO):
in_bracket = False
# empty str == EOF in Python
@@ -76,7 +88,7 @@ class InventoryBase(SchemaBase):
if schema_name != cls.SCHEMA_NAME:
raise ValueError(f"Expected schema name {schema_name!r} to be {cls.SCHEMA_NAME!r}")
fields = cls._fields_dict()
fields = cls._get_fields_dict()
obj_dict = {}
for key, val in tok_iter:
if key in fields:
@@ -100,11 +112,13 @@ class InventoryBase(SchemaBase):
def to_writer(self, writer: StringIO):
writer.write(f"\t{self.SCHEMA_NAME}\t0\n")
writer.write("\t{\n")
for field_name, field in self._fields_dict().items():
for field_name, field in self._get_fields_dict().items():
spec = field.metadata.get("spec")
# Not meant to be serialized
if not spec:
continue
if field.metadata.get("llsd_only"):
continue
val = getattr(self, field_name)
if val is None:
@@ -120,10 +134,14 @@ class InventoryBase(SchemaBase):
writer.write("\t}\n")
class InventoryDifferences(typing.NamedTuple):
changed: List[InventoryNodeBase]
removed: List[InventoryNodeBase]
class InventoryModel(InventoryBase):
def __init__(self):
self.containers: Dict[UUID, InventoryContainerBase] = {}
self.items: Dict[UUID, InventoryItem] = {}
self.nodes: Dict[UUID, InventoryNodeBase] = {}
self.root: Optional[InventoryContainerBase] = None
@classmethod
@@ -133,48 +151,117 @@ class InventoryModel(InventoryBase):
if key == "inv_object":
obj = InventoryObject.from_reader(reader)
if obj is not None:
model.add_container(obj)
model.add(obj)
elif key == "inv_category":
cat = InventoryCategory.from_reader(reader)
if cat is not None:
model.add_container(cat)
model.add(cat)
elif key == "inv_item":
item = InventoryItem.from_reader(reader)
if item is not None:
model.add_item(item)
model.add(item)
else:
LOG.warning("Unknown key {0}".format(key))
model.reparent_nodes()
return model
@classmethod
def from_llsd(cls, llsd_val: List[Dict]) -> InventoryModel:
model = cls()
for obj_dict in llsd_val:
for inv_type in INVENTORY_TYPES:
if inv_type.ID_ATTR in obj_dict:
if (obj := inv_type.from_llsd(obj_dict)) is not None:
model.add(obj)
break
LOG.warning(f"Unknown object type {obj_dict!r}")
return model
@property
def ordered_nodes(self) -> Iterable[InventoryNodeBase]:
yield from self.all_containers
yield from self.all_items
@property
def all_containers(self) -> Iterable[InventoryContainerBase]:
for node in self.nodes.values():
if isinstance(node, InventoryContainerBase):
yield node
@property
def all_items(self) -> Iterable[InventoryItem]:
for node in self.nodes.values():
if not isinstance(node, InventoryContainerBase):
yield node
def __eq__(self, other):
if not isinstance(other, InventoryModel):
return False
return set(self.nodes.values()) == set(other.nodes.values())
def to_writer(self, writer: StringIO):
for container in self.containers.values():
container.to_writer(writer)
for item in self.items.values():
item.to_writer(writer)
for node in self.ordered_nodes:
node.to_writer(writer)
def add_container(self, container: InventoryContainerBase):
self.containers[container.node_id] = container
container.model = weakref.proxy(self)
def to_llsd(self):
return list(node.to_llsd() for node in self.ordered_nodes)
def add_item(self, item: InventoryItem):
self.items[item.item_id] = item
item.model = weakref.proxy(self)
def add(self, node: InventoryNodeBase):
if node.node_id in self.nodes:
raise KeyError(f"{node.node_id} already exists in the inventory model")
def reparent_nodes(self):
self.root = None
for container in self.containers.values():
container.children.clear()
if container.parent_id == UUID():
self.root = container
for obj in itertools.chain(self.items.values(), self.containers.values()):
if not obj.parent_id or obj.parent_id == UUID():
continue
parent_container = self.containers.get(obj.parent_id)
if not parent_container:
LOG.warning("{0} had an invalid parent {1}".format(obj, obj.parent_id))
continue
parent_container.children.append(obj)
self.nodes[node.node_id] = node
if isinstance(node, InventoryContainerBase):
if node.parent_id == UUID.ZERO:
self.root = node
node.model = weakref.proxy(self)
def unlink(self, node: InventoryNodeBase, single_only: bool = False) -> Sequence[InventoryNodeBase]:
"""Unlink a node and its descendants from the tree, returning the removed nodes"""
assert node.model == self
if node == self.root:
self.root = None
unlinked = [node]
if isinstance(node, InventoryContainerBase) and not single_only:
for child in node.children:
unlinked.extend(self.unlink(child))
self.nodes.pop(node.node_id, None)
node.model = None
return unlinked
def get_differences(self, other: InventoryModel) -> InventoryDifferences:
# Includes modified things with the same ID
changed_in_other = []
removed_in_other = []
other_keys = set(other.nodes.keys())
our_keys = set(self.nodes.keys())
# Removed
for key in our_keys - other_keys:
removed_in_other.append(self.nodes[key])
# Updated
for key in other_keys.intersection(our_keys):
other_node = other.nodes[key]
if other_node != self.nodes[key]:
changed_in_other.append(other_node)
# Added
for key in other_keys - our_keys:
changed_in_other.append(other.nodes[key])
return InventoryDifferences(
changed=changed_in_other,
removed=removed_in_other,
)
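A trivial usage sketch; two empty models diff to nothing:
old_model, new_model = InventoryModel(), InventoryModel()
diffs = old_model.get_differences(new_model)
assert diffs.changed == [] and diffs.removed == []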
def __getitem__(self, item: UUID) -> InventoryNodeBase:
return self.nodes[item]
def __contains__(self, item: UUID):
return item in self.nodes
def get(self, item: UUID) -> Optional[InventoryNodeBase]:
return self.nodes.get(item)
@dataclasses.dataclass
@@ -190,6 +277,9 @@ class InventoryPermissions(InventoryBase):
owner_id: UUID = schema_field(SchemaUUID)
last_owner_id: UUID = schema_field(SchemaUUID)
group_id: UUID = schema_field(SchemaUUID)
# Nothing actually cares about this, but it could be there.
# It's kind of redundant since it just means owner_id == NULL_KEY && group_id != NULL_KEY.
is_owner_group: int = schema_field(SchemaInt, default=0, llsd_only=True)
@dataclasses.dataclass
@@ -204,16 +294,27 @@ class InventorySaleInfo(InventoryBase):
class InventoryNodeBase(InventoryBase):
ID_ATTR: ClassVar[str]
name: str
parent_id: Optional[UUID] = schema_field(SchemaUUID)
model: Optional[InventoryModel] = dataclasses.field(default=None, init=False)
model: Optional[InventoryModel] = dataclasses.field(
default=None, init=False, hash=False, compare=False, repr=False
)
@property
def node_id(self) -> UUID:
return getattr(self, self.ID_ATTR)
@node_id.setter
def node_id(self, val: UUID):
setattr(self, self.ID_ATTR, val)
@property
def parent(self):
return self.model.containers.get(self.parent_id)
def parent(self) -> Optional[InventoryContainerBase]:
return self.model.nodes.get(self.parent_id)
def unlink(self) -> Sequence[InventoryNodeBase]:
return self.model.unlink(self)
@classmethod
def _obj_from_dict(cls, obj_dict):
@@ -224,12 +325,58 @@ class InventoryNodeBase(InventoryBase):
return None
return super()._obj_from_dict(obj_dict)
def __hash__(self):
return hash(self.node_id)
def __iter__(self) -> Iterator[InventoryNodeBase]:
return iter(())
def __contains__(self, item) -> bool:
return item in tuple(self)
@dataclasses.dataclass
class InventoryContainerBase(InventoryNodeBase):
type: str = schema_field(SchemaStr)
name: str = schema_field(SchemaMultilineStr)
children: List[InventoryNodeBase] = dataclasses.field(default_factory=list, init=False)
@property
def children(self) -> Sequence[InventoryNodeBase]:
return tuple(
x for x in self.model.nodes.values()
if x.parent_id == self.node_id
)
def __getitem__(self, item: Union[int, str]) -> InventoryNodeBase:
if isinstance(item, int):
return self.children[item]
for child in self.children:
if child.name == item:
return child
raise KeyError(f"{item!r} not found in children")
def __iter__(self) -> Iterator[InventoryNodeBase]:
return iter(self.children)
def get_or_create_subcategory(self, name: str) -> InventoryCategory:
for child in self:
if child.name == name and isinstance(child, InventoryCategory):
return child
child = InventoryCategory(
name=name,
cat_id=UUID.random(),
parent_id=self.node_id,
type="category",
pref_type="-1",
owner_id=getattr(self, 'owner_id', UUID.ZERO),
version=1,
)
self.model.add(child)
return child
# So autogenerated __hash__ doesn't kill our inherited one
__hash__ = InventoryNodeBase.__hash__
@dataclasses.dataclass
@@ -239,17 +386,22 @@ class InventoryObject(InventoryContainerBase):
obj_id: UUID = schema_field(SchemaUUID)
__hash__ = InventoryNodeBase.__hash__
@dataclasses.dataclass
class InventoryCategory(InventoryContainerBase):
ID_ATTR: ClassVar[str] = "cat_id"
SCHEMA_NAME: ClassVar[str] = "inv_object"
SCHEMA_NAME: ClassVar[str] = "inv_category"
VERSION_NONE: ClassVar[int] = -1
cat_id: UUID = schema_field(SchemaUUID)
pref_type: str = schema_field(SchemaStr)
pref_type: str = schema_field(SchemaStr, llsd_name="preferred_type")
owner_id: UUID = schema_field(SchemaUUID)
version: int = schema_field(SchemaInt)
__hash__ = InventoryNodeBase.__hash__
@dataclasses.dataclass
class InventoryItem(InventoryNodeBase):
@@ -259,17 +411,22 @@ class InventoryItem(InventoryNodeBase):
item_id: UUID = schema_field(SchemaUUID)
type: str = schema_field(SchemaStr)
inv_type: str = schema_field(SchemaStr)
flags: int = schema_field(SchemaHexInt)
flags: int = schema_field(SchemaFlagField)
name: str = schema_field(SchemaMultilineStr)
desc: str = schema_field(SchemaMultilineStr)
creation_date: dt.datetime = schema_field(SchemaDate)
creation_date: dt.datetime = schema_field(SchemaDate, llsd_name="created_at")
permissions: InventoryPermissions = schema_field(InventoryPermissions)
sale_info: InventorySaleInfo = schema_field(InventorySaleInfo)
asset_id: Optional[UUID] = schema_field(SchemaUUID, default=None)
shadow_id: Optional[UUID] = schema_field(SchemaUUID, default=None)
__hash__ = InventoryNodeBase.__hash__
@property
def true_asset_id(self) -> UUID:
if self.asset_id is not None:
return self.asset_id
return self.shadow_id ^ MAGIC_ID
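Since XOR is its own inverse, masking with MAGIC_ID round-trips. A quick sketch (MAGIC_ID as imported in this module, `^` as defined on UUID in datatypes):
u = UUID.random()
assert (u ^ MAGIC_ID) ^ MAGIC_ID == u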
INVENTORY_TYPES: Tuple[Type[InventoryNodeBase], ...] = (InventoryCategory, InventoryObject, InventoryItem)

View File

@@ -1,7 +1,6 @@
import os
import tempfile
from io import BytesIO
from typing import *
import defusedxml.ElementTree
from glymur import jp2box, Jp2k
@@ -10,12 +9,6 @@ from glymur import jp2box, Jp2k
jp2box.ET = defusedxml.ElementTree
SL_DEFAULT_ENCODE = {
"cratios": (1920.0, 480.0, 120.0, 30.0, 10.0),
"irreversible": True,
}
class BufferedJp2k(Jp2k):
"""
For manipulating JP2K from within a binary buffer.
@@ -24,12 +17,7 @@ class BufferedJp2k(Jp2k):
based on filename, so this is the least brittle approach.
"""
def __init__(self, contents: bytes, encode_kwargs: Optional[Dict] = None):
if encode_kwargs is None:
self.encode_kwargs = SL_DEFAULT_ENCODE.copy()
else:
self.encode_kwargs = encode_kwargs
def __init__(self, contents: bytes):
stream = BytesIO(contents)
self.temp_file = tempfile.NamedTemporaryFile(delete=False)
stream.seek(0)
@@ -44,11 +32,12 @@ class BufferedJp2k(Jp2k):
os.remove(self.temp_file.name)
self.temp_file = None
def _write(self, img_array, verbose=False, **kwargs):
# Glymur normally only lets you control encode params when a write happens within
# the constructor. Keep around the encode params from the constructor and pass
# them to successive write calls.
return super()._write(img_array, verbose=False, **self.encode_kwargs, **kwargs)
def _populate_cparams(self, img_array):
if self._cratios is None:
self._cratios = (1920.0, 480.0, 120.0, 30.0, 10.0)
if self._irreversible is None:
self._irreversible = True
return super()._populate_cparams(img_array)
def __bytes__(self):
with open(self.temp_file.name, "rb") as f:

View File

@@ -31,6 +31,14 @@ class SchemaFieldSerializer(abc.ABC, Generic[_T]):
def serialize(cls, val: _T) -> str:
pass
@classmethod
def from_llsd(cls, val: Any) -> _T:
return val
@classmethod
def to_llsd(cls, val: _T) -> Any:
return val
class SchemaDate(SchemaFieldSerializer[dt.datetime]):
@classmethod
@@ -41,6 +49,14 @@ class SchemaDate(SchemaFieldSerializer[dt.datetime]):
def serialize(cls, val: dt.datetime) -> str:
return str(calendar.timegm(val.utctimetuple()))
@classmethod
def from_llsd(cls, val: Any) -> dt.datetime:
return dt.datetime.utcfromtimestamp(val)
@classmethod
def to_llsd(cls, val: dt.datetime):
return calendar.timegm(val.utctimetuple())
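Round-trip sketch (epoch seconds, UTC; sub-second precision is dropped):
when = dt.datetime(2009, 2, 13, 23, 31, 30)
assert SchemaDate.to_llsd(when) == 1234567890
assert SchemaDate.from_llsd(1234567890) == when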
class SchemaHexInt(SchemaFieldSerializer[int]):
@classmethod
@@ -95,10 +111,11 @@ class SchemaUUID(SchemaFieldSerializer[UUID]):
def schema_field(spec: Type[Union[SchemaBase, SchemaFieldSerializer]], *, default=dataclasses.MISSING, init=True,
repr=True, hash=None, compare=True) -> dataclasses.Field: # noqa
repr=True, hash=None, compare=True, llsd_name=None, llsd_only=False) -> dataclasses.Field: # noqa
"""Describe a field in the inventory schema and the shape of its value"""
return dataclasses.field(
metadata={"spec": spec}, default=default, init=init, repr=repr, hash=hash, compare=compare
return dataclasses.field( # noqa
metadata={"spec": spec, "llsd_name": llsd_name, "llsd_only": llsd_only}, default=default,
init=init, repr=repr, hash=hash, compare=compare,
)
@@ -121,8 +138,14 @@ def parse_schema_line(line: str):
@dataclasses.dataclass
class SchemaBase(abc.ABC):
@classmethod
def _fields_dict(cls):
return {f.name: f for f in dataclasses.fields(cls)}
def _get_fields_dict(cls, llsd=False):
fields_dict = {}
for field in dataclasses.fields(cls):
field_name = field.name
if llsd:
field_name = field.metadata.get("llsd_name") or field_name
fields_dict[field_name] = field
return fields_dict
@classmethod
def from_str(cls, text: str):
@@ -137,6 +160,30 @@ class SchemaBase(abc.ABC):
def from_bytes(cls, data: bytes):
return cls.from_str(data.decode("utf8"))
@classmethod
def from_llsd(cls, inv_dict: Dict):
fields = cls._get_fields_dict(llsd=True)
obj_dict = {}
for key, val in inv_dict.items():
if key in fields:
field: dataclasses.Field = fields[key]
key = field.name
spec = field.metadata.get("spec")
# Not a real key, an internal var on our dataclass
if not spec:
LOG.warning(f"Internal key {key!r}")
continue
# some kind of nested structure like sale_info
if issubclass(spec, SchemaBase):
obj_dict[key] = spec.from_llsd(val)
elif issubclass(spec, SchemaFieldSerializer):
obj_dict[key] = spec.from_llsd(val)
else:
raise ValueError(f"Unsupported spec for {key!r}, {spec!r}")
else:
LOG.warning(f"Unknown key {key!r}")
return cls._obj_from_dict(obj_dict)
def to_bytes(self) -> bytes:
return self.to_str().encode("utf8")
@@ -146,6 +193,28 @@ class SchemaBase(abc.ABC):
writer.seek(0)
return writer.read()
def to_llsd(self):
obj_dict = {}
for field_name, field in self._get_fields_dict(llsd=True).items():
spec = field.metadata.get("spec")
# Not meant to be serialized
if not spec:
continue
val = getattr(self, field.name)
if val is None:
continue
# Some kind of nested structure like sale_info
if isinstance(val, SchemaBase):
val = val.to_llsd()
elif issubclass(spec, SchemaFieldSerializer):
val = spec.to_llsd(val)
else:
raise ValueError(f"Bad inventory spec {spec!r}")
obj_dict[field_name] = val
return obj_dict
@abc.abstractmethod
def to_writer(self, writer: StringIO):
pass

View File

@@ -46,22 +46,103 @@ class HippoLLSDNotationFormatter(llbase.llsd.LLSDNotationFormatter, HippoLLSDBas
def __init__(self):
super().__init__()
def STRING(self, v):
# llbase's notation LLSD encoder isn't suitable for generating line-delimited
# LLSD because the string formatter leaves \n unencoded, unlike indra's llcommon.
# Add our own escaping rule.
return super().STRING(v).replace(b"\n", b"\\n")
def format_notation(val: typing.Any):
return HippoLLSDNotationFormatter().format(val)
def format_binary(val: typing.Any, with_header=True):
val = llbase.llsd.format_binary(val)
if not with_header:
return val.split(b"\n", 1)[1]
val = _format_binary_recurse(val)
if with_header:
return b'<?llsd/binary?>\n' + val
return val
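Header handling in a nutshell (sketch):
data = format_binary({"a": 1})
assert data.startswith(b"<?llsd/binary?>\n")
assert format_binary({"a": 1}, with_header=False) == data.split(b"\n", 1)[1]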
# This is copied almost wholesale from https://bitbucket.org/lindenlab/llbase/src/master/llbase/llsd.py
# With a few minor changes to make serialization round-trip correctly. It's evil.
def _format_binary_recurse(something) -> bytes:
"""Binary formatter workhorse."""
def _format_list(list_something):
array_builder = [b'[' + struct.pack('!i', len(list_something))]
for item in list_something:
array_builder.append(_format_binary_recurse(item))
array_builder.append(b']')
return b''.join(array_builder)
if something is None:
return b'!'
elif isinstance(something, LLSD):
return _format_binary_recurse(something.thing)
elif isinstance(something, bool):
if something:
return b'1'
else:
return b'0'
elif is_integer(something):
try:
return b'i' + struct.pack('!i', something)
except (OverflowError, struct.error) as exc:
raise LLSDSerializationError(str(exc), something)
elif isinstance(something, float):
try:
return b'r' + struct.pack('!d', something)
except SystemError as exc:
raise LLSDSerializationError(str(exc), something)
elif isinstance(something, uuid.UUID):
return b'u' + something.bytes
elif isinstance(something, binary):
return b'b' + struct.pack('!i', len(something)) + something
elif is_string(something):
if is_unicode(something):
something = something.encode("utf8")
return b's' + struct.pack('!i', len(something)) + something
elif isinstance(something, uri):
return b'l' + struct.pack('!i', len(something)) + something.encode("utf8")
elif isinstance(something, datetime.datetime):
return b'd' + struct.pack('<d', something.timestamp())
elif isinstance(something, datetime.date):
seconds_since_epoch = calendar.timegm(something.timetuple())
return b'd' + struct.pack('<d', seconds_since_epoch)
elif isinstance(something, (list, tuple)):
return _format_list(something)
elif isinstance(something, dict):
map_builder = [b'{' + struct.pack('!i', len(something))]
for key, value in something.items():
if isinstance(key, str):
key = key.encode("utf8")
map_builder.append(b'k' + struct.pack('!i', len(key)) + key)
map_builder.append(_format_binary_recurse(value))
map_builder.append(b'}')
return b''.join(map_builder)
else:
try:
return _format_list(list(something))
except TypeError:
raise LLSDSerializationError(
"Cannot serialize unknown type: %s (%s)" %
(type(something), something))
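For reference, a hedged sketch of the byte layout the recursive formatter above emits (lengths are network-order i32s, per the struct.pack calls):

from hippolyzer.lib.base import llsd

buf = llsd.format_binary({"num": 1}, with_header=False)
# '{' + map length, then 'k' + key length + key, 'i' + packed int, '}'
assert buf == b'{\x00\x00\x00\x01k\x00\x00\x00\x03numi\x00\x00\x00\x01}'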
class HippoLLSDBinaryParser(llbase.llsd.LLSDBinaryParser):
def __init__(self):
super().__init__()
self._dispatch[ord('u')] = lambda: UUID(bytes=self._getc(16))
self._dispatch[ord('d')] = self._parse_date
def _parse_date(self):
seconds = struct.unpack("<d", self._getc(8))[0]
try:
return datetime.datetime.fromtimestamp(seconds, tz=datetime.timezone.utc)
except OverflowError as exc:
# A garbage seconds value can cause fromtimestamp() to raise
# OverflowError: timestamp out of range for platform time_t
self._error(exc, -8)
def _parse_string(self):
# LLSD's C++ API lets you stuff binary in a string field even though it's only
@@ -89,7 +170,7 @@ def parse_notation(data: bytes):
def zip_llsd(val: typing.Any):
return zlib.compress(format_binary(val, with_header=False))
return zlib.compress(format_binary(val, with_header=False), level=zlib.Z_BEST_COMPRESSION)
def unzip_llsd(data: bytes):

View File

@@ -26,6 +26,50 @@ class MeshAsset:
segments: MeshSegmentDict = dataclasses.field(default_factory=dict)
raw_segments: Dict[str, bytes] = dataclasses.field(default_factory=dict)
@classmethod
def make_triangle(cls) -> MeshAsset:
"""Make an asset representing an un-rigged single-sided mesh triangle"""
inst = cls()
inst.header = {
"version": 1,
"high_lod": {"offset": 0, "size": 0},
"physics_mesh": {"offset": 0, "size": 0},
"physics_convex": {"offset": 0, "size": 0},
}
base_lod: LODSegmentDict = {
'Normal': [
Vector3(-0.0, -0.0, -1.0),
Vector3(-0.0, -0.0, -1.0),
Vector3(-0.0, -0.0, -1.0)
],
'PositionDomain': {'Max': [0.5, 0.5, 0.0], 'Min': [-0.5, -0.5, 0.0]},
'Position': [
Vector3(0.0, 0.0, 0.0),
Vector3(1.0, 0.0, 0.0),
Vector3(0.5, 1.0, 0.0)
],
'TexCoord0Domain': {'Max': [1.0, 1.0], 'Min': [0.0, 0.0]},
'TexCoord0': [
Vector2(0.0, 0.0),
Vector2(1.0, 0.0),
Vector2(0.5, 1.0)
],
'TriangleList': [[0, 1, 2]],
}
inst.segments['physics_mesh'] = [deepcopy(base_lod)]
inst.segments['high_lod'] = [deepcopy(base_lod)]
convex_segment: PhysicsConvexSegmentDict = {
'BoundingVerts': [
Vector3(-0.0, 1.0, -1.0),
Vector3(-1.0, -1.0, -1.0),
Vector3(1.0, -1.0, -1.0)
],
'Max': [0.5, 0.5, 0.0],
'Min': [-0.5, -0.5, 0.0]
}
inst.segments['physics_convex'] = convex_segment
return inst
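A hedged usage sketch, borrowing the LLMeshSerializer pattern the uploader elsewhere in this changeset uses:

import hippolyzer.lib.base.serialization as se
from hippolyzer.lib.base.mesh import MeshAsset, LLMeshSerializer

mesh = MeshAsset.make_triangle()
writer = se.BufferWriter("!")
writer.write(LLMeshSerializer(), mesh)  # packs the header and compressed segments
mesh_bytes = writer.copy_buffer()       # suitable for AssetUploader.initiate_mesh_upload()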
def iter_lods(self) -> Generator[List[LODSegmentDict], None, None]:
for lod_name, lod_val in self.segments.items():
if lod_name.endswith("_lod"):
@@ -135,20 +179,26 @@ class VertexWeight(recordclass.datatuple): # type: ignore
class SkinSegmentDict(TypedDict, total=False):
"""Rigging information"""
joint_names: List[str]
# model -> world transform matrix for model
# model -> world transform mat4 for model
bind_shape_matrix: List[float]
# world -> joint local transform matrices
# world -> joint local transform mat4s
inverse_bind_matrix: List[List[float]]
# offset matrices for joints, translation-only.
# Not sure what these are relative to, base joint or model <0,0,0>.
# Transform mat4s for the joint nodes themselves.
# The matrices may have scale or other components, but only the
# translation component will be used by the viewer.
# All translations are relative to the joint's parent.
alt_inverse_bind_matrix: List[List[float]]
lock_scale_if_joint_position: bool
pelvis_offset: float
class PhysicsConvexSegmentDict(DomainDict, total=False):
"""Data for convex hull collisions, populated by the client"""
# Min / Max domain vals are inline, unlike for LODs
"""
Data for convex hull collisions, populated by the client
Min / Max pos domain vals are inline, unlike for LODs, so this inherits from DomainDict
"""
# Indices into the Positions list
HullList: List[int]
# -1.0 - 1.0, dequantized from binary field of U16s
Positions: List[Vector3]
@@ -158,13 +208,13 @@ class PhysicsConvexSegmentDict(DomainDict, total=False):
class PhysicsHavokSegmentDict(TypedDict, total=False):
"""Cached data for Havok collisions, populated by sim and not used by client."""
HullMassProps: MassPropsDict
MOPP: MOPPDict
MeshDecompMassProps: MassPropsDict
HullMassProps: HavokMassPropsDict
MOPP: HavokMOPPDict
MeshDecompMassProps: HavokMassPropsDict
WeldingData: bytes
class MassPropsDict(TypedDict, total=False):
class HavokMassPropsDict(TypedDict, total=False):
# Vec, center of mass
CoM: List[float]
# 9 floats, Mat3?
@@ -173,7 +223,7 @@ class MassPropsDict(TypedDict, total=False):
volume: float
class MOPPDict(TypedDict, total=False):
class HavokMOPPDict(TypedDict, total=False):
"""Memory Optimized Partial Polytope"""
BuildType: int
MoppData: bytes
@@ -270,8 +320,8 @@ LOD_SEGMENT_SERIALIZER = SegmentSerializer({
# Each position represents a single vert.
"Position": se.Collection(None, se.Vector3U16(0.0, 1.0)),
"TexCoord0": se.Collection(None, se.Vector2U16(0.0, 1.0)),
# Normals have a static domain between -1 and 1
"Normal": se.Collection(None, se.Vector3U16(0.0, 1.0)),
# Normals have a static domain between -1 and 1, so just use that.
"Normal": se.Collection(None, se.Vector3U16(-1.0, 1.0)),
"Weights": se.Collection(None, VertexWeights)
})
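The domain fix matters because each quantized U16 component maps linearly onto [lower, upper]; a hedged sketch of the dequantization rule assumed here:

def dequantize_u16(q: int, lower: float, upper: float) -> float:
    # Illustrative only: 0 -> lower, 0xFFFF -> upper
    return lower + (q / 0xFFFF) * (upper - lower)

dequantize_u16(0x8000, -1.0, 1.0)  # ~0.0, a flat normal component
dequantize_u16(0x8000, 0.0, 1.0)   # ~0.5, which is why the old (0, 1) domain bent normals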

View File

@@ -1,6 +1,9 @@
from __future__ import annotations
import abc
import asyncio
import copy
import dataclasses
import datetime as dt
import logging
from typing import *
@@ -13,6 +16,14 @@ from .msgtypes import PacketFlags
from .udpserializer import UDPMessageSerializer
@dataclasses.dataclass
class ReliableResendInfo:
last_resent: dt.datetime
message: Message
completed: asyncio.Future = dataclasses.field(default_factory=asyncio.Future)
tries_left: int = 10
class Circuit:
def __init__(self, near_host: Optional[ADDR_TUPLE], far_host: ADDR_TUPLE, transport):
self.near_host: Optional[ADDR_TUPLE] = near_host
@@ -22,6 +33,8 @@ class Circuit:
self.serializer = UDPMessageSerializer()
self.last_packet_at = dt.datetime.now()
self.packet_id_base = 0
self.unacked_reliable: Dict[Tuple[Direction, int], ReliableResendInfo] = {}
self.resend_every: float = 3.0
def _send_prepared_message(self, message: Message, transport=None):
try:
@@ -46,22 +59,69 @@ class Circuit:
raise RuntimeError(f"Trying to re-send finalized {message!r}")
message.packet_id = self.packet_id_base
self.packet_id_base += 1
if not message.acks:
message.send_flags &= PacketFlags.ACK
if message.acks:
message.send_flags |= PacketFlags.ACK
else:
message.send_flags &= ~PacketFlags.ACK
# If it was queued, it's not anymore
message.queued = False
message.finalized = True
def send_message(self, message: Message, transport=None):
def send(self, message: Message, transport=None) -> UDPPacket:
if self.prepare_message(message):
# If the message originates from us then we're responsible for resends.
if message.reliable and message.synthetic:
self.unacked_reliable[(message.direction, message.packet_id)] = ReliableResendInfo(
last_resent=dt.datetime.now(),
message=message,
)
return self._send_prepared_message(message, transport)
# Temporary alias
send_message = send
def send_reliable(self, message: Message, transport=None) -> asyncio.Future:
"""send() wrapper that always sends reliably and allows `await`ing ACK receipt"""
if not message.synthetic:
raise ValueError("Not able to send non-synthetic message reliably!")
message.send_flags |= PacketFlags.RELIABLE
self.send(message, transport)
return self.unacked_reliable[(message.direction, message.packet_id)].completed
def collect_acks(self, message: Message):
effective_acks = list(message.acks)
if message.name == "PacketAck":
effective_acks.extend(x["ID"] for x in message["Packets"])
for ack in effective_acks:
resend_info = self.unacked_reliable.pop((~message.direction, ack), None)
if resend_info:
resend_info.completed.set_result(None)
def resend_unacked(self):
for resend_info in list(self.unacked_reliable.values()):
# Not time to attempt a resend yet
if dt.datetime.now() - resend_info.last_resent < dt.timedelta(seconds=self.resend_every):
continue
msg = copy.copy(resend_info.message)
resend_info.tries_left -= 1
# We were on our last try and we never received an ack
if not resend_info.tries_left:
logging.warning(f"Giving up on unacked {msg.packet_id}")
del self.unacked_reliable[(msg.direction, msg.packet_id)]
resend_info.completed.set_exception(TimeoutError("Exceeded resend limit"))
continue
resend_info.last_resent = dt.datetime.now()
msg.send_flags |= PacketFlags.RESENT
self._send_prepared_message(msg)
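A hedged sketch of the new reliable-send flow from calling code, assuming something pumps resend_unacked() on a timer (block contents illustrative):

async def poke(circuit: Circuit, agent_id: UUID, session_id: UUID):
    msg = Message("AgentPause", Block("AgentData", AgentID=agent_id, SessionID=session_id, SerialNum=1))
    try:
        # Resolves once the far side ACKs; fails after ~10 resend attempts
        await circuit.send_reliable(msg)
    except TimeoutError:
        logging.warning("Circuit never ACKed our AgentPause")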
def send_acks(self, to_ack: Sequence[int], direction=Direction.OUT, packet_id=None):
logging.debug("%r acking %r" % (direction, to_ack))
# TODO: maybe tack this onto `.acks` for next message?
message = Message('PacketAck', *[Block('Packets', ID=x) for x in to_ack])
message.packet_id = packet_id
message.direction = direction
message.injected = True
self.send_message(message)
self.send(message)
def __repr__(self):
return "<%s %r : %r>" % (self.__class__.__name__, self.near_host, self.host)

View File

@@ -3002,6 +3002,16 @@ version 2.0
RegionInfo3 Variable
{ RegionFlagsExtended U64 }
}
{
RegionInfo5 Variable
{ ChatWhisperRange F32 }
{ ChatNormalRange F32 }
{ ChatShoutRange F32 }
{ ChatWhisperOffset F32 }
{ ChatNormalOffset F32 }
{ ChatShoutOffset F32 }
{ ChatFlags U32 }
}
}
// GodUpdateRegionInfo
@@ -5792,6 +5802,28 @@ version 2.0
}
}
// LargeGenericMessage
// Similar to the above messages, but can handle larger payloads and serialized
// LLSD. Uses HTTP transport
{
LargeGenericMessage Low 430 NotTrusted Unencoded UDPDeprecated
{
AgentData Single
{ AgentID LLUUID }
{ SessionID LLUUID }
{ TransactionID LLUUID }
}
{
MethodData Single
{ Method Variable 1 }
{ Invoice LLUUID }
}
{
ParamList Variable
{ Parameter Variable 2 }
}
}
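A hedged sketch of building this message with the library's Block / Message API (field values illustrative; being UDPDeprecated, it would travel over HTTP rather than the circuit):

msg = Message(
    'LargeGenericMessage',
    Block('AgentData', AgentID=agent_id, SessionID=session_id, TransactionID=UUID()),
    Block('MethodData', Method=b'somemethod', Invoice=UUID()),
    Block('ParamList', Parameter=b'first param'),
)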
// ***************************************************************************
// Requests for possessions, acquisition, money, etc
// ***************************************************************************

View File

@@ -32,6 +32,7 @@ from typing import *
from hippolyzer.lib.base.datatypes import *
import hippolyzer.lib.base.serialization as se
import hippolyzer.lib.base.templates as templates
from hippolyzer.lib.base.datatypes import Pretty
from hippolyzer.lib.base.message.msgtypes import PacketFlags
from hippolyzer.lib.base.network.transport import Direction, ADDR_TUPLE
@@ -62,11 +63,12 @@ class Block:
Block expects a name, and kwargs for variables (var_name = value)
"""
__slots__ = ('name', 'size', 'vars', 'message_name', '_ser_cache', 'fill_missing',)
PARENT_MESSAGE_NAME: ClassVar[Optional[str]] = None
def __init__(self, name, /, *, fill_missing=False, **kwargs):
self.name = name
self.size = 0
self.message_name: Optional[str] = None
self.message_name: Optional[str] = self.PARENT_MESSAGE_NAME
self.vars: Dict[str, VAR_TYPE] = {}
self._ser_cache: Dict[str, Any] = {}
self.fill_missing = fill_missing
@@ -83,6 +85,9 @@ class Block:
return self.vars[name]
def __setitem__(self, key, value):
if isinstance(value, Pretty):
return self.serialize_var(key, value.value)
# These don't pickle well since they're likely to get hot-reloaded
if isinstance(value, (enum.IntEnum, enum.IntFlag)):
value = int(value)
@@ -181,9 +186,9 @@ class MsgBlockList(List["Block"]):
class Message:
__slots__ = ("name", "send_flags", "_packet_id", "acks", "body_boundaries", "queued",
__slots__ = ("name", "send_flags", "packet_id", "acks", "body_boundaries", "queued",
"offset", "raw_extra", "raw_body", "deserializer", "_blocks", "finalized",
"direction", "meta", "injected", "dropped", "sender")
"direction", "meta", "synthetic", "dropped", "sender")
def __init__(self, name, *args, packet_id=None, flags=0, acks=None, direction=None):
# TODO: Do this on a timer or something.
@@ -191,7 +196,7 @@ class Message:
self.name = name
self.send_flags = flags
self._packet_id: Optional[int] = packet_id # aka, sequence number
self.packet_id: Optional[int] = packet_id # aka, sequence number
self.acks = acks if acks is not None else tuple()
self.body_boundaries = (-1, -1)
@@ -208,22 +213,12 @@ class Message:
self.queued: bool = False
self._blocks: BLOCK_DICT = {}
self.meta = {}
self.injected = False
self.synthetic = packet_id is None
self.dropped = False
self.sender: Optional[ADDR_TUPLE] = None
self.add_blocks(args)
@property
def packet_id(self) -> Optional[int]:
return self._packet_id
@packet_id.setter
def packet_id(self, val: Optional[int]):
self._packet_id = val
# Changing packet ID clears the finalized flag
self.finalized = False
def add_blocks(self, block_list):
# can have a list of blocks if it is multiple or variable
for block in block_list:
@@ -296,7 +291,7 @@ class Message:
if self.raw_body and self.deserializer():
self.deserializer().parse_message_body(self)
def to_dict(self):
def to_dict(self, extended=False):
""" A dict representation of a message.
This is the form used for templated messages sent via EQ.
@@ -312,6 +307,18 @@ class Message:
new_vars[var_name] = val
dict_blocks.append(new_vars)
if extended:
base_repr.update({
"packet_id": self.packet_id,
"meta": self.meta.copy(),
"dropped": self.dropped,
"synthetic": self.synthetic,
"direction": self.direction.name,
"send_flags": int(self.send_flags),
"extra": self.extra,
"acks": self.acks,
})
return base_repr
@classmethod
@@ -321,6 +328,17 @@ class Message:
msg.create_block_list(block_type)
for block in blocks:
msg.add_block(Block(block_type, **block))
if 'packet_id' in dict_val:
# extended format
msg.packet_id = dict_val['packet_id']
msg.meta = dict_val['meta']
msg.dropped = dict_val['dropped']
msg.synthetic = dict_val['synthetic']
msg.direction = Direction[dict_val['direction']]
msg.send_flags = dict_val['send_flags']
msg.extra = dict_val['extra']
msg.acks = dict_val['acks']
return msg
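A hedged round-trip sketch of the extended dict format (the classmethod's name is cut off by the hunk header above, so from_dict here is a hypothetical name):

d = msg.to_dict(extended=True)  # now carries packet_id, meta, direction, flags, acks...
clone = Message.from_dict(d)    # hypothetical name; restores the extras when packet_id is present
assert clone.synthetic == msg.synthetic and clone.direction == msg.direction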
def invalidate_caches(self):
@@ -359,12 +377,16 @@ class Message:
message_copy = copy.deepcopy(self)
# Set the queued flag so the original will be dropped and acks will be sent
self.queued = True
if not self.finalized:
self.queued = True
# Original was dropped so let's make sure we have clean acks and packet id
message_copy.acks = tuple()
message_copy.send_flags &= ~PacketFlags.ACK
message_copy.packet_id = None
message_copy.dropped = False
message_copy.finalized = False
message_copy.queued = False
return message_copy
def to_summary(self):

View File

@@ -62,9 +62,16 @@ class HumanMessageSerializer:
continue
if first_line:
direction, message_name = line.split(" ", 1)
first_split = [x for x in line.split(" ") if x]
direction, message_name = first_split[:2]
options = [x.strip("[]") for x in first_split[2:]]
msg = Message(message_name)
msg.direction = Direction[direction.upper()]
for option in options:
if option in PacketFlags.__members__:
msg.send_flags |= PacketFlags[option]
elif re.match(r"^\d+$", option):
msg.send_flags |= int(option)
first_line = False
continue
@@ -137,9 +144,17 @@ class HumanMessageSerializer:
if msg.direction is not None:
string += f'{msg.direction.name} '
string += msg.name
flags = msg.send_flags
for poss_flag in iter(PacketFlags):
if flags & poss_flag:
flags &= ~poss_flag
string += f" [{poss_flag.name}]"
# Make sure flags with unknown meanings don't get lost
if flags:
string += f" [{int(flags)}]"
if msg.packet_id is not None:
string += f'\n# {msg.packet_id}: {PacketFlags(msg.send_flags)!r}'
string += f'{", DROPPED" if msg.dropped else ""}{", INJECTED" if msg.injected else ""}'
string += f'\n# ID: {msg.packet_id}'
string += f'{", DROPPED" if msg.dropped else ""}{", SYNTHETIC" if msg.synthetic else ""}'
if msg.extra:
string += f'\n# EXTRA: {msg.extra!r}'
string += '\n\n'
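With these changes, a dump of a synthetic reliable message might look like the following (name and ID illustrative), and the bracketed flags now survive a re-parse:

OUT SomeMessage [RELIABLE]
# ID: 12345, SYNTHETIC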

View File

@@ -107,12 +107,14 @@ class MessageHandler(Generic[_T, _K]):
take = self.take_by_default
notifiers = [self.register(name) for name in message_names]
fut = asyncio.get_event_loop().create_future()
loop = asyncio.get_event_loop_policy().get_event_loop()
fut = loop.create_future()
timeout_task = None
async def _canceller():
await asyncio.sleep(timeout)
fut.set_exception(asyncio.exceptions.TimeoutError("Timed out waiting for packet"))
if not fut.done():
fut.set_exception(asyncio.exceptions.TimeoutError("Timed out waiting for packet"))
for n in notifiers:
n.unsubscribe(_handler)
@@ -125,7 +127,8 @@ class MessageHandler(Generic[_T, _K]):
# Whatever was awaiting this future now owns this message
if take:
message = message.take()
fut.set_result(message)
if not fut.done():
fut.set_result(message)
# Make sure to unregister this handler for all message types
for n in notifiers:
n.unsubscribe(_handler)
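Hedged usage of the now race-proof future, assuming the surrounding method is the handler's wait_for() (message name and timeout illustrative):

try:
    msg = await message_handler.wait_for(("ObjectUpdate",), timeout=5.0)
except asyncio.TimeoutError:
    msg = None  # the canceller can no longer race a concurrent set_result()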

View File

@@ -22,6 +22,7 @@ Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
import typing
from .msgtypes import MsgType, MsgBlockType
from ..datatypes import UUID
class MessageTemplateVariable:
@@ -61,6 +62,32 @@ class MessageTemplateVariable:
self._probably_text = self._probably_text and self.name != "NameValue"
return self._probably_text
@property
def default_value(self):
if self.type.is_int:
return 0
elif self.type.is_float:
return 0.0
elif self.type == MsgType.MVT_LLUUID:
return UUID()
elif self.type == MsgType.MVT_BOOL:
return False
elif self.type == MsgType.MVT_VARIABLE:
if self.probably_binary:
return b""
if self.probably_text:
return ""
return b""
elif self.type in (MsgType.MVT_LLVector3, MsgType.MVT_LLVector3d, MsgType.MVT_LLQuaternion):
return 0.0, 0.0, 0.0
elif self.type == MsgType.MVT_LLVector4:
return 0.0, 0.0, 0.0, 0.0
elif self.type == MsgType.MVT_FIXED:
return b"\x00" * self.size
elif self.type == MsgType.MVT_IP_ADDR:
return "0.0.0.0"
return None
class MessageTemplateBlock:
def __init__(self, name):

View File

@@ -68,7 +68,7 @@ class UDPMessageDeserializer:
self.settings = settings or Settings()
self.template_dict = self.DEFAULT_TEMPLATE
def deserialize(self, msg_buff: bytes):
def deserialize(self, msg_buff: bytes) -> Message:
msg = self._parse_message_header(msg_buff)
if not self.settings.ENABLE_DEFERRED_PACKET_PARSING:
try:
@@ -85,6 +85,7 @@ class UDPMessageDeserializer:
reader = se.BufferReader("!", data)
msg: Message = Message("Placeholder")
msg.synthetic = False
msg.send_flags = reader.read(se.U8)
msg.packet_id = reader.read(se.U32)

View File

@@ -71,7 +71,7 @@ class Object(recordclass.datatuple): # type: ignore
ProfileBegin: Optional[int] = None
ProfileEnd: Optional[int] = None
ProfileHollow: Optional[int] = None
TextureEntry: Optional[tmpls.TextureEntry] = None
TextureEntry: Optional[tmpls.TextureEntryCollection] = None
TextureAnim: Optional[tmpls.TextureAnim] = None
NameValue: Optional[Any] = None
Data: Optional[Any] = None
@@ -270,6 +270,9 @@ def normalize_object_update_compressed_data(data: bytes):
# Only used for determining which sections are present
del compressed["Flags"]
# Unlike other ObjectUpdate types, a null value in an ObjectUpdateCompressed
# always means that there is no value, not that the value hasn't changed
# from the client's view. Use the default value when that happens.
ps_block = compressed.pop("PSBlockNew", None)
if ps_block is None:
ps_block = compressed.pop("PSBlock", None)
@@ -278,6 +281,20 @@ def normalize_object_update_compressed_data(data: bytes):
compressed.pop("PSBlock", None)
if compressed["NameValue"] is None:
compressed["NameValue"] = NameValueCollection()
if compressed["Text"] is None:
compressed["Text"] = b""
compressed["TextColor"] = b""
if compressed["MediaURL"] is None:
compressed["MediaURL"] = b""
if compressed["AngularVelocity"] is None:
compressed["AngularVelocity"] = Vector3()
if compressed["SoundFlags"] is None:
compressed["SoundFlags"] = 0
compressed["SoundGain"] = 0.0
compressed["SoundRadius"] = 0.0
compressed["Sound"] = UUID()
if compressed["TextureEntry"] is None:
compressed["TextureEntry"] = tmpls.TextureEntryCollection()
object_data = {
"PSBlock": ps_block.value,
@@ -286,9 +303,9 @@ def normalize_object_update_compressed_data(data: bytes):
"LocalID": compressed.pop("ID"),
**compressed,
}
if object_data["TextureEntry"] is None:
object_data.pop("TextureEntry")
# Don't clobber OwnerID in case the object has a proper one.
# Don't clobber OwnerID in case the object has a proper one from
# a previous ObjectProperties. OwnerID isn't expected to be populated
# on ObjectUpdates unless an attached sound is playing.
if object_data["OwnerID"] == UUID():
del object_data["OwnerID"]
return object_data

View File

@@ -27,6 +27,14 @@ class _Unserializable:
return False
class MissingType:
"""Simple sentinel type like dataclasses._MISSING_TYPE"""
pass
MISSING = MissingType()
UNSERIALIZABLE = _Unserializable()
_T = TypeVar("_T")
@@ -288,7 +296,7 @@ class SerializableBase(abc.ABC):
@classmethod
def default_value(cls) -> Any:
# None may be a valid default, so return MISSING as a sentinel val
return dataclasses.MISSING
return MISSING
class Adapter(SerializableBase, abc.ABC):
@@ -328,18 +336,18 @@ class ForwardSerializable(SerializableBase):
def __init__(self, func: Callable[[], SERIALIZABLE_TYPE]):
super().__init__()
self._func = func
self._wrapped = dataclasses.MISSING
self._wrapped: Union[MissingType, SERIALIZABLE_TYPE] = MISSING
def _ensure_evaled(self):
if self._wrapped is dataclasses.MISSING:
if self._wrapped is MISSING:
self._wrapped = self._func()
def __getattr__(self, attr):
return getattr(self._wrapped, attr)
def default_value(self) -> Any:
if self._wrapped is dataclasses.MISSING:
return dataclasses.MISSING
if self._wrapped is MISSING:
return MISSING
return self._wrapped.default_value()
def serialize(self, val, writer: BufferWriter, ctx: Optional[ParseContext]):
@@ -357,10 +365,10 @@ class Template(SerializableBase):
def __init__(self, template_spec: Dict[str, SERIALIZABLE_TYPE], skip_missing=False):
self._template_spec = template_spec
self._skip_missing = skip_missing
self._size = dataclasses.MISSING
self._size = MISSING
def calc_size(self):
if self._size is not dataclasses.MISSING:
if self._size is not MISSING:
return self._size
sum_bytes = 0
for _, field_type in self._template_spec.items():
@@ -1196,9 +1204,9 @@ class ContextMixin(Generic[_T]):
def _choose_option(self, ctx: Optional[ParseContext]) -> _T:
idx = self._fun(ctx)
if idx not in self._options:
if dataclasses.MISSING not in self._options:
if MISSING not in self._options:
raise KeyError(f"{idx!r} not found in {self._options!r}")
idx = dataclasses.MISSING
idx = MISSING
return self._options[idx]
@@ -1339,6 +1347,12 @@ class TypedBytesBase(SerializableBase, abc.ABC):
return self._spec.default_value()
class TypedBytesGreedy(TypedBytesBase):
def __init__(self, spec, empty_is_none=False, check_trailing_bytes=True, lazy=False):
self._bytes_tmpl = BytesGreedy()
super().__init__(spec, empty_is_none, check_trailing_bytes, lazy=lazy)
class TypedByteArray(TypedBytesBase):
def __init__(self, len_spec, spec, empty_is_none=False, check_trailing_bytes=True, lazy=False):
self._bytes_tmpl = ByteArray(len_spec)
@@ -1436,7 +1450,7 @@ class StringEnumAdapter(Adapter):
class FixedPoint(SerializableBase):
def __init__(self, ser_spec, int_bits, frac_bits, signed=False):
# Should never be used due to how this handles signs :/
assert(not ser_spec.is_signed)
assert (not ser_spec.is_signed)
self._ser_spec: SerializablePrimitive = ser_spec
self._signed = signed
@@ -1446,7 +1460,7 @@ class FixedPoint(SerializableBase):
self._min_val = ((1 << int_bits) * -1) if signed else 0
self._max_val = 1 << int_bits
assert(required_bits == (ser_spec.calc_size() * 8))
assert (required_bits == (ser_spec.calc_size() * 8))
def deserialize(self, reader: Reader, ctx):
fixed_val = float(self._ser_spec.deserialize(reader, ctx))
@@ -1476,8 +1490,8 @@ def _make_undefined_raiser():
return f
def dataclass_field(spec: Union[SERIALIZABLE_TYPE, Callable], *, default=dataclasses.MISSING,
default_factory=dataclasses.MISSING, init=True, repr=True, # noqa
def dataclass_field(spec: Union[SERIALIZABLE_TYPE, Callable], *, default: Any = dataclasses.MISSING,
default_factory: Any = dataclasses.MISSING, init=True, repr=True, # noqa
hash=None, compare=True) -> dataclasses.Field: # noqa
enrich_factory = False
# Lambda, need to defer evaluation of spec until it's actually used.
@@ -1498,7 +1512,7 @@ def dataclass_field(spec: Union[SERIALIZABLE_TYPE, Callable], *, default=datacla
metadata={"spec": spec}, default=default, default_factory=default_factory, init=init,
repr=repr, hash=hash, compare=compare
)
# Need to stuff this on so it knows which field went unspecified.
# Need to stuff this on, so it knows which field went unspecified.
if enrich_factory:
default_factory.field = field
return field

View File

@@ -3,23 +3,17 @@ Serialization templates for structures used in LLUDP and HTTP bodies.
"""
import abc
import collections
import dataclasses
import enum
import importlib
import logging
import math
import zlib
from typing import *
import hippolyzer.lib.base.serialization as se
from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.datatypes import UUID, IntEnum, IntFlag
from hippolyzer.lib.base.datatypes import UUID, IntEnum, IntFlag, Vector3
from hippolyzer.lib.base.namevalue import NameValuesSerializer
try:
importlib.reload(se) # type: ignore
except:
logging.exception("Failed to reload serialization lib")
@se.enum_field_serializer("RequestXfer", "XferID", "VFileType")
@se.enum_field_serializer("AssetUploadRequest", "AssetBlock", "Type")
@@ -141,9 +135,54 @@ class InventoryType(IntEnum):
lower = self.name.lower()
return {
"callingcard": "callcard",
"none": "-1",
}.get(lower, lower)
class FolderType(IntEnum):
TEXTURE = 0
SOUND = 1
CALLINGCARD = 2
LANDMARK = 3
CLOTHING = 5
OBJECT = 6
NOTECARD = 7
# We'd really like to change this to 9 since AT_CATEGORY is 8,
# but "My Inventory" has been type 8 for a long time.
ROOT_INVENTORY = 8
LSL_TEXT = 10
BODYPART = 13
TRASH = 14
SNAPSHOT_CATEGORY = 15
LOST_AND_FOUND = 16
ANIMATION = 20
GESTURE = 21
FAVORITE = 23
ENSEMBLE_START = 26
ENSEMBLE_END = 45
# This range is reserved for special clothing folder types.
CURRENT_OUTFIT = 46
OUTFIT = 47
MY_OUTFITS = 48
MESH = 49
# "received items" for MP
INBOX = 50
OUTBOX = 51
BASIC_ROOT = 52
MARKETPLACE_LISTINGS = 53
MARKETPLACE_STOCK = 54
# Note: We actually *never* create folders with that type. This is used for icon override only.
MARKETPLACE_VERSION = 55
SETTINGS = 56
# Firestorm folders, may not actually exist
FIRESTORM = 57
PHOENIX = 58
RLV = 59
# Opensim folders
MY_SUITCASE = 100
NONE = -1
@se.enum_field_serializer("AgentIsNowWearing", "WearableData", "WearableType")
@se.enum_field_serializer("AgentWearablesUpdate", "WearableData", "WearableType")
@se.enum_field_serializer("CreateInventoryItem", "InventoryBlock", "WearableType")
@@ -177,6 +216,7 @@ def _register_permissions_flags(message_name, block_name):
@se.flag_field_serializer("ObjectPermissions", "ObjectData", "Mask")
@_register_permissions_flags("ObjectProperties", "ObjectData")
@_register_permissions_flags("ObjectPropertiesFamily", "ObjectData")
@_register_permissions_flags("UpdateCreateInventoryItem", "InventoryData")
@_register_permissions_flags("UpdateTaskInventory", "InventoryData")
@_register_permissions_flags("CreateInventoryItem", "InventoryBlock")
@@ -201,11 +241,74 @@ class Permissions(IntFlag):
RESERVED = 1 << 31
@se.enum_field_serializer("ObjectSaleInfo", "ObjectData", "SaleType")
@se.enum_field_serializer("ObjectProperties", "ObjectData", "SaleType")
@se.enum_field_serializer("ObjectPropertiesFamily", "ObjectData", "SaleType")
@se.enum_field_serializer("ObjectBuy", "ObjectData", "SaleType")
@se.enum_field_serializer("RezScript", "InventoryBlock", "SaleType")
@se.enum_field_serializer("RezObject", "InventoryData", "SaleType")
@se.enum_field_serializer("UpdateTaskInventory", "InventoryData", "SaleType")
@se.enum_field_serializer("UpdateCreateInventoryItem", "InventoryData", "SaleType")
class SaleInfo(IntEnum):
NOT = 0
ORIGINAL = 1
COPY = 2
CONTENTS = 3
@se.flag_field_serializer("ParcelInfoReply", "Data", "Flags")
class ParcelInfoFlags(IntFlag):
MATURE = 1 << 0
# You should never see adult without mature
ADULT = 1 << 1
GROUP_OWNED = 1 << 2
@se.flag_field_serializer("MapItemRequest", "AgentData", "Flags")
@se.flag_field_serializer("MapNameRequest", "AgentData", "Flags")
@se.flag_field_serializer("MapBlockRequest", "AgentData", "Flags")
@se.flag_field_serializer("MapItemReply", "AgentData", "Flags")
@se.flag_field_serializer("MapNameReply", "AgentData", "Flags")
@se.flag_field_serializer("MapBlockReply", "AgentData", "Flags")
class MapImageFlags(IntFlag):
# No clue, honestly. I guess there's potentially different image types you could request.
LAYER = 1 << 1
@se.enum_field_serializer("MapBlockReply", "Data", "Access")
@se.enum_field_serializer("RegionInfo", "RegionInfo", "SimAccess")
class SimAccess(IntEnum):
# Treated as 'unknown', usually ends up being SIM_ACCESS_PG
MIN = 0
PG = 13
MATURE = 21
ADULT = 42
DOWN = 254
@se.enum_field_serializer("MapItemRequest", "RequestData", "ItemType")
@se.enum_field_serializer("MapItemReply", "RequestData", "ItemType")
class MapItemType(IntEnum):
TELEHUB = 0x01
PG_EVENT = 0x02
MATURE_EVENT = 0x03
# No longer supported, 2009-03-02 KLW
DEPRECATED_POPULAR = 0x04
DEPRECATED_AGENT_COUNT = 0x05
AGENT_LOCATIONS = 0x06
LAND_FOR_SALE = 0x07
CLASSIFIED = 0x08
ADULT_EVENT = 0x09
LAND_FOR_SALE_ADULT = 0x0a
@se.flag_field_serializer("RezObject", "RezData", "ItemFlags")
@se.flag_field_serializer("RezMultipleAttachmentsFromInv", "ObjectData", "ItemFlags")
@se.flag_field_serializer("RezObject", "InventoryData", "Flags")
@se.flag_field_serializer("RezScript", "InventoryBlock", "Flags")
@se.flag_field_serializer("UpdateCreateInventoryItem", "InventoryData", "Flags")
@se.flag_field_serializer("UpdateTaskInventory", "InventoryData", "Flags")
@se.flag_field_serializer("ChangeInventoryItemFlags", "InventoryData", "Flags")
class InventoryItemFlags(IntFlag):
# The asset has only one reference in the system. If the
# inventory item is deleted, or the assetid updated, then we
@@ -232,7 +335,8 @@ class InventoryItemFlags(IntFlag):
OBJECT_HAS_MULTIPLE_ITEMS = 0x200000
@property
def attachment_point(self):
def subtype(self):
"""Subtype of the given item type, could be an attachment point or setting type, etc."""
return self & 0xFF
@@ -248,10 +352,10 @@ class PermissionType(IntEnum):
@se.enum_field_serializer("TransferRequest", "TransferInfo", "SourceType")
class TransferSourceType(IntEnum):
UNKNOWN = 0
FILE = enum.auto()
ASSET = enum.auto()
SIM_INV_ITEM = enum.auto()
SIM_ESTATE = enum.auto()
FILE = 1
ASSET = 2
SIM_INV_ITEM = 3
SIM_ESTATE = 4
class EstateAssetType(IntEnum):
@@ -314,15 +418,15 @@ class TransferParamsSerializer(se.EnumSwitchedSubfieldSerializer):
@se.enum_field_serializer("TransferInfo", "TransferInfo", "ChannelType")
class TransferChannelType(IntEnum):
UNKNOWN = 0
MISC = enum.auto()
ASSET = enum.auto()
MISC = 1
ASSET = 2
@se.enum_field_serializer("TransferInfo", "TransferInfo", "TargetType")
class TransferTargetType(IntEnum):
UNKNOWN = 0
FILE = enum.auto()
VFILE = enum.auto()
FILE = 1
VFILE = 2
@se.enum_field_serializer("TransferInfo", "TransferInfo", "Status")
@@ -429,45 +533,45 @@ class SendXferPacketIDSerializer(se.AdapterSubfieldSerializer):
@se.enum_field_serializer("ViewerEffect", "Effect", "Type")
class ViewerEffectType(IntEnum):
TEXT = 0
ICON = enum.auto()
CONNECTOR = enum.auto()
FLEXIBLE_OBJECT = enum.auto()
ANIMAL_CONTROLS = enum.auto()
LOCAL_ANIMATION_OBJECT = enum.auto()
CLOTH = enum.auto()
EFFECT_BEAM = enum.auto()
EFFECT_GLOW = enum.auto()
EFFECT_POINT = enum.auto()
EFFECT_TRAIL = enum.auto()
EFFECT_SPHERE = enum.auto()
EFFECT_SPIRAL = enum.auto()
EFFECT_EDIT = enum.auto()
EFFECT_LOOKAT = enum.auto()
EFFECT_POINTAT = enum.auto()
EFFECT_VOICE_VISUALIZER = enum.auto()
NAME_TAG = enum.auto()
EFFECT_BLOB = enum.auto()
ICON = 1
CONNECTOR = 2
FLEXIBLE_OBJECT = 3
ANIMAL_CONTROLS = 4
LOCAL_ANIMATION_OBJECT = 5
CLOTH = 6
EFFECT_BEAM = 7
EFFECT_GLOW = 8
EFFECT_POINT = 9
EFFECT_TRAIL = 10
EFFECT_SPHERE = 11
EFFECT_SPIRAL = 12
EFFECT_EDIT = 13
EFFECT_LOOKAT = 14
EFFECT_POINTAT = 15
EFFECT_VOICE_VISUALIZER = 16
NAME_TAG = 17
EFFECT_BLOB = 18
class LookAtTarget(IntEnum):
NONE = 0
IDLE = enum.auto()
AUTO_LISTEN = enum.auto()
FREELOOK = enum.auto()
RESPOND = enum.auto()
HOVER = enum.auto()
CONVERSATION = enum.auto()
SELECT = enum.auto()
FOCUS = enum.auto()
MOUSELOOK = enum.auto()
CLEAR = enum.auto()
IDLE = 1
AUTO_LISTEN = 2
FREELOOK = 3
RESPOND = 4
HOVER = 5
CONVERSATION = 6
SELECT = 7
FOCUS = 8
MOUSELOOK = 9
CLEAR = 10
class PointAtTarget(IntEnum):
NONE = 0
SELECT = enum.auto()
GRAB = enum.auto()
CLEAR = enum.auto()
SELECT = 1
GRAB = 2
CLEAR = 3
@se.subfield_serializer("ViewerEffect", "Effect", "TypeData")
@@ -761,6 +865,7 @@ class MCode(IntEnum):
@se.flag_field_serializer("ObjectUpdateCompressed", "ObjectData", "UpdateFlags")
@se.flag_field_serializer("ObjectUpdateCached", "ObjectData", "UpdateFlags")
@se.flag_field_serializer("ObjectAdd", "ObjectData", "AddFlags")
@se.flag_field_serializer("ObjectDuplicate", "SharedData", "DuplicateFlags")
class ObjectUpdateFlags(IntFlag):
USE_PHYSICS = 1 << 0
CREATE_SELECTED = 1 << 1
@@ -796,6 +901,9 @@ class ObjectUpdateFlags(IntFlag):
ZLIB_COMPRESSED_REPRECATED = 1 << 31
JUST_CREATED_FLAGS = (ObjectUpdateFlags.CREATE_SELECTED | ObjectUpdateFlags.OBJECT_YOU_OWNER)
class AttachmentStateAdapter(se.Adapter):
# Encoded attachment point ID for attached objects
# nibbles are swapped around because old attachment nums only used to live
@@ -828,7 +936,7 @@ class ObjectStateAdapter(se.ContextAdapter):
PCode.AVATAR: se.IntFlag(AgentState),
PCode.PRIMITIVE: AttachmentStateAdapter(None),
# Other cases are probably just a number (tree species ID or something.)
dataclasses.MISSING: se.IdentityAdapter(),
se.MISSING: se.IdentityAdapter(),
}
)
@@ -840,6 +948,15 @@ class ObjectStateSerializer(se.AdapterSubfieldSerializer):
ORIG_INLINE = True
@se.subfield_serializer("ObjectUpdate", "RegionData", "TimeDilation")
@se.subfield_serializer("ObjectUpdateCompressed", "RegionData", "TimeDilation")
@se.subfield_serializer("ObjectUpdateCached", "RegionData", "TimeDilation")
@se.subfield_serializer("ImprovedTerseObjectUpdate", "RegionData", "TimeDilation")
class TimeDilationSerializer(se.AdapterSubfieldSerializer):
ADAPTER = se.QuantizedFloat(se.U16, 0.0, 1.0, False)
ORIG_INLINE = True
@se.subfield_serializer("ImprovedTerseObjectUpdate", "ObjectData", "Data")
class ImprovedTerseObjectUpdateDataSerializer(se.SimpleSubfieldSerializer):
TEMPLATE = se.Template({
@@ -862,12 +979,12 @@ class ShineLevel(IntEnum):
HIGH = 3
@dataclasses.dataclass
@dataclasses.dataclass(unsafe_hash=True)
class BasicMaterials:
# Meaning is technically implementation-dependent, these are in LL data files
Bump: int = se.bitfield_field(bits=5)
FullBright: bool = se.bitfield_field(bits=1, adapter=se.BoolAdapter())
Shiny: int = se.bitfield_field(bits=2, adapter=se.IntEnum(ShineLevel))
Bump: int = se.bitfield_field(bits=5, default=0)
FullBright: bool = se.bitfield_field(bits=1, adapter=se.BoolAdapter(), default=False)
Shiny: int = se.bitfield_field(bits=2, adapter=se.IntEnum(ShineLevel), default=0)
BUMP_SHINY_FULLBRIGHT = se.BitfieldDataclass(BasicMaterials, se.U8)
@@ -881,12 +998,12 @@ class TexGen(IntEnum):
CYLINDRICAL = 0x6
@dataclasses.dataclass
@dataclasses.dataclass(unsafe_hash=True)
class MediaFlags:
WebPage: bool = se.bitfield_field(bits=1, adapter=se.BoolAdapter())
TexGen: "TexGen" = se.bitfield_field(bits=2, adapter=se.IntEnum(TexGen))
WebPage: bool = se.bitfield_field(bits=1, adapter=se.BoolAdapter(), default=False)
TexGen: "TexGen" = se.bitfield_field(bits=2, adapter=se.IntEnum(TexGen), default=TexGen.DEFAULT)
# Probably unused but show it just in case
_Unused: int = se.bitfield_field(bits=5)
_Unused: int = se.bitfield_field(bits=5, default=0)
# Not shifted so enum definitions can match indra
@@ -1022,9 +1139,15 @@ class TEExceptionField(se.SerializableBase):
return dict
_T = TypeVar("_T")
_TE_FIELD_KEY = Optional[Sequence[int]]
_TE_DICT = Dict[_TE_FIELD_KEY, _T]
def _te_field(spec: se.SERIALIZABLE_TYPE, first=False, optional=False,
default_factory=dataclasses.MISSING, default=dataclasses.MISSING):
if default_factory is not dataclasses.MISSING:
default_factory: Union[se.MissingType, Callable[[], _T]] = se.MISSING,
default: Union[se.MissingType, _T] = se.MISSING):
if default_factory is not se.MISSING:
new_default_factory = lambda: {None: default_factory()}
elif default is not se.MISSING:
new_default_factory = lambda: {None: default}
@@ -1036,34 +1159,130 @@ def _te_field(spec: se.SERIALIZABLE_TYPE, first=False, optional=False,
)
_T = TypeVar("_T")
_TE_FIELD_KEY = Optional[Sequence[int]]
# If this seems weird it's because it is. TE offsets are S16s with `0` as the actual 0
# point, and LL divides by `0x7FFF` to convert back to float. Negative S16s can
# actually go to -0x8000 due to two's complement, creating a larger range for negatives.
TE_S16_COORD = se.QuantizedFloat(se.S16, -1.000030518509476, 1.0, False)
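Concretely, dividing by 0x7FFF the way LL does:

0x7FFF / 0x7FFF   #  1.0, the upper bound
-0x8000 / 0x7FFF  # -1.000030518509476, hence the odd lower bound above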
class PackedTERotation(se.QuantizedFloat):
"""Another weird one, packed TE rotations have their own special quantization"""
def __init__(self):
super().__init__(se.S16, math.pi * -2, math.pi * 2, zero_median=False)
self.step_mag = 1.0 / (se.U16.max_val + 1)
def _float_to_quantized(self, val: float, lower: float, upper: float):
val = math.fmod(val, upper)
val = super()._float_to_quantized(val, lower, upper)
if val == se.S16.max_val + 1:
val = self.prim_min
return val
@dataclasses.dataclass
class TextureEntry:
Textures: Dict[_TE_FIELD_KEY, UUID] = _te_field(
"""Representation of a TE for a single face. Not sent over the wire."""
Textures: UUID = UUID('89556747-24cb-43ed-920b-47caed15465f')
Color: bytes = b"\xff\xff\xff\xff"
ScalesS: float = 1.0
ScalesT: float = 1.0
OffsetsS: float = 0.0
OffsetsT: float = 0.0
# In radians
Rotation: float = 0.0
MediaFlags: "MediaFlags" = dataclasses.field(default_factory=MediaFlags)
BasicMaterials: "BasicMaterials" = dataclasses.field(default_factory=BasicMaterials)
Glow: float = 0.0
Materials: UUID = UUID.ZERO
def st_to_uv(self, st_coord: Vector3) -> Vector3:
"""Convert OpenGL ST coordinates to UV coordinates, accounting for mapping"""
uv = Vector3(st_coord.X - 0.5, st_coord.Y - 0.5)
cos_rot = math.cos(self.Rotation)
sin_rot = math.sin(self.Rotation)
uv = Vector3(
X=uv.X * cos_rot + uv.Y * sin_rot,
Y=-uv.X * sin_rot + uv.Y * cos_rot
)
uv *= Vector3(self.ScalesS, self.ScalesT)
return uv + Vector3(self.OffsetsS + 0.5, self.OffsetsT + 0.5)
# Max number of TEs possible according to llprimitive (but not really true!)
# Useful if you don't know how many faces / TEs an object really has because it's mesh
# or something.
MAX_TES = 45
@dataclasses.dataclass
class TextureEntryCollection:
Textures: _TE_DICT[UUID] = _te_field(
# Plywood texture
se.UUID, first=True, default=UUID('89556747-24cb-43ed-920b-47caed15465f'))
# Bytes are inverted so fully opaque white is \x00\x00\x00\x00
Color: Dict[_TE_FIELD_KEY, bytes] = _te_field(Color4(invert_bytes=True), default=b"\xff\xff\xff\xff")
ScalesS: Dict[_TE_FIELD_KEY, float] = _te_field(se.F32, default=1.0)
ScalesT: Dict[_TE_FIELD_KEY, float] = _te_field(se.F32, default=1.0)
OffsetsS: Dict[_TE_FIELD_KEY, int] = _te_field(se.S16, default=0)
OffsetsT: Dict[_TE_FIELD_KEY, int] = _te_field(se.S16, default=0)
Rotation: Dict[_TE_FIELD_KEY, int] = _te_field(se.S16, default=0)
BasicMaterials: Dict[_TE_FIELD_KEY, "BasicMaterials"] = _te_field(
BUMP_SHINY_FULLBRIGHT, default_factory=lambda: BasicMaterials(Bump=0, FullBright=False, Shiny=0),
Color: _TE_DICT[bytes] = _te_field(Color4(invert_bytes=True), default=b"\xff\xff\xff\xff")
ScalesS: _TE_DICT[float] = _te_field(se.F32, default=1.0)
ScalesT: _TE_DICT[float] = _te_field(se.F32, default=1.0)
OffsetsS: _TE_DICT[float] = _te_field(TE_S16_COORD, default=0.0)
OffsetsT: _TE_DICT[float] = _te_field(TE_S16_COORD, default=0.0)
Rotation: _TE_DICT[float] = _te_field(PackedTERotation(), default=0.0)
BasicMaterials: _TE_DICT["BasicMaterials"] = _te_field(
BUMP_SHINY_FULLBRIGHT, default_factory=BasicMaterials,
)
MediaFlags: Dict[_TE_FIELD_KEY, "MediaFlags"] = _te_field(
MEDIA_FLAGS,
default_factory=lambda: MediaFlags(WebPage=False, TexGen=TexGen.DEFAULT, _Unused=0),
)
Glow: Dict[_TE_FIELD_KEY, int] = _te_field(se.U8, default=0)
Materials: Dict[_TE_FIELD_KEY, UUID] = _te_field(se.UUID, optional=True, default=UUID())
MediaFlags: _TE_DICT["MediaFlags"] = _te_field(MEDIA_FLAGS, default_factory=MediaFlags)
Glow: _TE_DICT[float] = _te_field(se.QuantizedFloat(se.U8, 0.0, 1.0), default=0.0)
Materials: _TE_DICT[UUID] = _te_field(se.UUID, optional=True, default=UUID.ZERO)
def unwrap(self):
"""Return `self` regardless of whether this is lazy wrapped object or not"""
return self
def realize(self, num_faces: int = MAX_TES) -> List[TextureEntry]:
"""
Turn the "default" vs "exception cases" wire format TE representation to per-face lookups
Makes it easier to get all TE details associated with a specific face
"""
as_dicts = [dict() for _ in range(num_faces)]
for field in dataclasses.fields(self):
key = field.name
vals = getattr(self, key)
# First, give all faces the default value for this key
for te in as_dicts:
te[key] = vals[None]
# Walk over the exception cases and replace the default value
for face_nums, val in vals.items():
# Default case already handled
if face_nums is None:
continue
for face_num in face_nums:
if face_num >= num_faces:
raise ValueError(f"Bad value for num_faces? {face_num} >= {num_faces}")
as_dicts[face_num][key] = val
return [TextureEntry(**x) for x in as_dicts]
@classmethod
def from_tes(cls, tes: List[TextureEntry]) -> "TextureEntryCollection":
instance = cls()
if not tes:
return instance
for field in dataclasses.fields(cls):
te_vals: Dict[Any, List[int]] = collections.defaultdict(list)
for i, te in enumerate(tes):
# Group values by what face they occur on
te_vals[getattr(te, field.name)].append(i)
# Make the most common value the "default"; everything else is an exception
sorted_vals = sorted(te_vals.items(), key=lambda x: len(x[1]), reverse=True)
default_val = sorted_vals.pop(0)[0]
te_vals = {None: default_val}
for val, face_nums in sorted_vals:
te_vals[tuple(face_nums)] = val
setattr(instance, field.name, te_vals)
return instance
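A hedged round-trip sketch (face count illustrative):

tec = TextureEntryCollection()
faces = tec.realize(num_faces=8)  # eight default-valued TextureEntry objects
faces[3].Glow = 1.0               # make one face differ
tec2 = TextureEntryCollection.from_tes(faces)
# tec2.Glow is now {None: 0.0, (3,): 1.0}: common value as default, one exception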
TE_SERIALIZER = se.Dataclass(TextureEntry)
TE_SERIALIZER = se.Dataclass(TextureEntryCollection)
@se.subfield_serializer("ObjectUpdate", "ObjectData", "TextureEntry")
@@ -1072,7 +1291,7 @@ TE_SERIALIZER = se.Dataclass(TextureEntry)
@se.subfield_serializer("ObjectImage", "ObjectData", "TextureEntry")
class TextureEntrySubfieldSerializer(se.SimpleSubfieldSerializer):
EMPTY_IS_NONE = True
TEMPLATE = TE_SERIALIZER
TEMPLATE = se.TypedBytesGreedy(TE_SERIALIZER, empty_is_none=True, lazy=True)
DATA_PACKER_TE_TEMPLATE = se.TypedByteArray(
@@ -1510,28 +1729,28 @@ class NameValueSerializer(se.SimpleSubfieldSerializer):
@se.enum_field_serializer("SetFollowCamProperties", "CameraProperty", "Type")
class CameraPropertyType(IntEnum):
PITCH = 0
FOCUS_OFFSET = enum.auto()
FOCUS_OFFSET_X = enum.auto()
FOCUS_OFFSET_Y = enum.auto()
FOCUS_OFFSET_Z = enum.auto()
POSITION_LAG = enum.auto()
FOCUS_LAG = enum.auto()
DISTANCE = enum.auto()
BEHINDNESS_ANGLE = enum.auto()
BEHINDNESS_LAG = enum.auto()
POSITION_THRESHOLD = enum.auto()
FOCUS_THRESHOLD = enum.auto()
ACTIVE = enum.auto()
POSITION = enum.auto()
POSITION_X = enum.auto()
POSITION_Y = enum.auto()
POSITION_Z = enum.auto()
FOCUS = enum.auto()
FOCUS_X = enum.auto()
FOCUS_Y = enum.auto()
FOCUS_Z = enum.auto()
POSITION_LOCKED = enum.auto()
FOCUS_LOCKED = enum.auto()
FOCUS_OFFSET = 1
FOCUS_OFFSET_X = 2
FOCUS_OFFSET_Y = 3
FOCUS_OFFSET_Z = 4
POSITION_LAG = 5
FOCUS_LAG = 6
DISTANCE = 7
BEHINDNESS_ANGLE = 8
BEHINDNESS_LAG = 9
POSITION_THRESHOLD = 10
FOCUS_THRESHOLD = 11
ACTIVE = 12
POSITION = 13
POSITION_X = 14
POSITION_Y = 15
POSITION_Z = 16
FOCUS = 17
FOCUS_X = 18
FOCUS_Y = 19
FOCUS_Z = 20
POSITION_LOCKED = 21
FOCUS_LOCKED = 22
@se.enum_field_serializer("DeRezObject", "AgentBlock", "Destination")
@@ -1555,6 +1774,7 @@ class DeRezObjectDestination(IntEnum):
@se.flag_field_serializer("SimStats", "RegionInfo", "RegionFlagsExtended")
@se.flag_field_serializer("RegionInfo", "RegionInfo", "RegionFlags")
@se.flag_field_serializer("RegionInfo", "RegionInfo3", "RegionFlagsExtended")
@se.flag_field_serializer("MapBlockReply", "Data", "RegionFlags")
class RegionFlags(IntFlag):
ALLOW_DAMAGE = 1 << 0
ALLOW_LANDMARK = 1 << 1
@@ -1600,6 +1820,7 @@ class RegionHandshakeReplyFlags(IntFlag):
@se.flag_field_serializer("TeleportStart", "Info", "TeleportFlags")
@se.flag_field_serializer("TeleportProgress", "Info", "TeleportFlags")
@se.flag_field_serializer("TeleportFinish", "Info", "TeleportFlags")
@se.flag_field_serializer("TeleportLocal", "Info", "TeleportFlags")
@se.flag_field_serializer("TeleportLureRequest", "Info", "TeleportFlags")
class TeleportFlags(IntFlag):
SET_HOME_TO_TARGET = 1 << 0 # newbie leaving prelude (starter area)
@@ -1618,6 +1839,190 @@ class TeleportFlags(IntFlag):
IS_FLYING = 1 << 13
SHOW_RESET_HOME = 1 << 14
FORCE_REDIRECT = 1 << 15
VIA_GLOBAL_COORDS = 1 << 16
WITHIN_REGION = 1 << 17
@se.flag_field_serializer("AvatarPropertiesReply", "PropertiesData", "Flags")
class AvatarPropertiesFlags(IntFlag):
ALLOW_PUBLISH = 1 << 0 # whether profile is externally visible or not
MATURE_PUBLISH = 1 << 1 # profile is "mature"
IDENTIFIED = 1 << 2 # whether avatar has provided payment info
TRANSACTED = 1 << 3 # whether avatar has actively used payment info
ONLINE = 1 << 4 # the online status of this avatar, if known.
AGEVERIFIED = 1 << 5 # whether avatar has been age-verified
@se.flag_field_serializer("AvatarGroupsReply", "GroupData", "GroupPowers")
@se.flag_field_serializer("AvatarGroupDataUpdate", "GroupData", "GroupPowers")
@se.flag_field_serializer("AvatarDataUpdate", "AgentDataData", "GroupPowers")
class GroupPowerFlags(IntFlag):
MEMBER_INVITE = 1 << 1 # Invite member
MEMBER_EJECT = 1 << 2 # Eject member from group
MEMBER_OPTIONS = 1 << 3 # Toggle "Open enrollment" and change "Signup Fee"
MEMBER_VISIBLE_IN_DIR = 1 << 47
# Roles
ROLE_CREATE = 1 << 4 # Create new roles
ROLE_DELETE = 1 << 5 # Delete roles
ROLE_PROPERTIES = 1 << 6 # Change Role Names, Titles, and Descriptions
ROLE_ASSIGN_MEMBER_LIMITED = 1 << 7 # Assign Member to a Role that the assigner is in
ROLE_ASSIGN_MEMBER = 1 << 8 # Assign Member to Role
ROLE_REMOVE_MEMBER = 1 << 9 # Remove Member from Role
ROLE_CHANGE_ACTIONS = 1 << 10 # Change actions a role can perform
# Group Identity
GROUP_CHANGE_IDENTITY = 1 << 11 # Charter, insignia, 'Show In Group List', 'Publish on the web', 'Mature', etc.
# Parcel Management
LAND_DEED = 1 << 12 # Deed Land and Buy Land for Group
LAND_RELEASE = 1 << 13 # Release Land (to Gov. Linden)
# Set for sale info (Toggle "For Sale", Set Price, Set Target, Toggle "Sell objects with the land")
LAND_SET_SALE_INFO = 1 << 14
LAND_DIVIDE_JOIN = 1 << 15 # Divide and Join Parcels
# Parcel Identity
LAND_FIND_PLACES = 1 << 17 # Toggle "Show in Find Places" and Set Category.
# Change Parcel Identity: Parcel Name, Parcel Description, Snapshot, 'Publish on the web', and 'Mature' checkbox
LAND_CHANGE_IDENTITY = 1 << 18
LAND_SET_LANDING_POINT = 1 << 19 # Set Landing Point
# Parcel Settings
LAND_CHANGE_MEDIA = 1 << 20 # Change Media Settings
LAND_EDIT = 1 << 21 # Toggle Edit Land
# Toggle Set Home Point, Fly, Outside Scripts, Create/Edit Objects, Landmark, and Damage checkboxes
LAND_OPTIONS = 1 << 22
# Parcel Powers
LAND_ALLOW_EDIT_LAND = 1 << 23 # Bypass Edit Land Restriction
LAND_ALLOW_FLY = 1 << 24 # Bypass Fly Restriction
LAND_ALLOW_CREATE = 1 << 25 # Bypass Create/Edit Objects Restriction
LAND_ALLOW_LANDMARK = 1 << 26 # Bypass Landmark Restriction
LAND_ALLOW_SET_HOME = 1 << 28 # Bypass Set Home Point Restriction
LAND_ALLOW_HOLD_EVENT = 1 << 41 # Allowed to hold events on group-owned land
LAND_ALLOW_ENVIRONMENT = 1 << 46 # Allowed to change the environment
# Parcel Access
LAND_MANAGE_ALLOWED = 1 << 29 # Manage Allowed List
LAND_MANAGE_BANNED = 1 << 30 # Manage Banned List
LAND_MANAGE_PASSES = 1 << 31 # Change Sell Pass Settings
LAND_ADMIN = 1 << 32 # Eject and Freeze Users on the land
# Parcel Content
LAND_RETURN_GROUP_SET = 1 << 33 # Return objects on parcel that are set to group
LAND_RETURN_NON_GROUP = 1 << 34 # Return objects on parcel that are not set to group
LAND_RETURN_GROUP_OWNED = 1 << 48 # Return objects on parcel that are owned by the group
LAND_GARDENING = 1 << 35 # Parcel Gardening - plant and move linden trees
# Object Management
OBJECT_DEED = 1 << 36 # Deed Object
OBJECT_MANIPULATE = 1 << 38 # Manipulate Group Owned Objects (Move, Copy, Mod)
OBJECT_SET_SALE = 1 << 39 # Set Group Owned Object for Sale
# Accounting
ACCOUNTING_ACCOUNTABLE = 1 << 40 # Pay Group Liabilities and Receive Group Dividends
# Notices
NOTICES_SEND = 1 << 42 # Send Notices
NOTICES_RECEIVE = 1 << 43 # Receive Notices and View Notice History
# Proposals
# TODO: _DEPRECATED suffix as part of vote removal - DEV-24856:
PROPOSAL_START = 1 << 44 # Start Proposal
# TODO: _DEPRECATED suffix as part of vote removal - DEV-24856:
PROPOSAL_VOTE = 1 << 45 # Vote on Proposal
# Group chat moderation related
SESSION_JOIN = 1 << 16 # can join session
SESSION_VOICE = 1 << 27 # can hear/talk
SESSION_MODERATOR = 1 << 37 # can mute people's session
EXPERIENCE_ADMIN = 1 << 49 # has admin rights to any experiences owned by this group
EXPERIENCE_CREATOR = 1 << 50 # can sign scripts for experiences owned by this group
# Group Banning
GROUP_BAN_ACCESS = 1 << 51 # Allows access to ban / un-ban agents from a group.
@se.flag_field_serializer("RequestObjectPropertiesFamily", "ObjectData", "RequestFlags")
@se.flag_field_serializer("ObjectPropertiesFamily", "ObjectData", "RequestFlags")
class ObjectPropertiesFamilyRequestFlags(IntFlag):
BUG_REPORT = 1 << 0
COMPLAINT_REPORT = 1 << 1
OBJECT_PAY = 1 << 2
@se.enum_field_serializer("RequestImage", "RequestImage", "Type")
class RequestImageType(IntEnum):
NORMAL = 0
AVATAR_BAKE = 1
@se.enum_field_serializer("ImageData", "ImageID", "Codec")
class ImageCodec(IntEnum):
INVALID = 0
RGB = 1
J2C = 2
BMP = 3
TGA = 4
JPEG = 5
DXT = 6
PNG = 7
@se.enum_field_serializer("LayerData", "LayerID", "Type")
class LayerDataType(IntEnum):
LAND_LAYER_CODE = ord('L')
WIND_LAYER_CODE = ord('7')
CLOUD_LAYER_CODE = ord('8')
WATER_LAYER_CODE = ord('W')
# <FS:CR> Aurora Sim
# Extended land layer for Aurora Sim
AURORA_LAND_LAYER_CODE = ord('M')
AURORA_WATER_LAYER_CODE = ord('X')
AURORA_WIND_LAYER_CODE = ord('9')
AURORA_CLOUD_LAYER_CODE = ord(':')
@se.enum_field_serializer("ModifyLand", "ModifyBlock", "Action")
class ModifyLandAction(IntEnum):
LEVEL = 0
RAISE = 1
LOWER = 2
SMOOTH = 3
NOISE = 4
REVERT = 5
@se.flag_field_serializer("RevokePermissions", "Data", "ObjectPermissions")
@se.flag_field_serializer("ScriptQuestion", "Data", "Questions")
@se.flag_field_serializer("ScriptAnswerYes", "Data", "Questions")
class ScriptPermissions(IntFlag):
# "1" itself seems to be unused?
TAKE_MONEY = 1 << 1
TAKE_CONTROLS = 1 << 2
# Doesn't seem to be used?
REMAP_CONTROLS = 1 << 3
TRIGGER_ANIMATIONS = 1 << 4
ATTACH = 1 << 5
# Doesn't seem to be used?
RELEASE_OWNERSHIP = 1 << 6
CHANGE_LINKS = 1 << 7
# Object joints don't exist anymore
CHANGE_JOINTS = 1 << 8
# Change its own permissions? Doesn't seem to be used.
CHANGE_PERMISSIONS = 1 << 9
TRACK_CAMERA = 1 << 10
CONTROL_CAMERA = 1 << 11
TELEPORT = 1 << 12
JOIN_EXPERIENCE = 1 << 13
MANAGE_ESTATE_ACCESS = 1 << 14
ANIMATION_OVERRIDE = 1 << 15
RETURN_OBJECTS = 1 << 16
FORCE_SIT = 1 << 17
CHANGE_ENVIRONMENT = 1 << 18
@se.http_serializer("RenderMaterials")

View File

@@ -10,6 +10,7 @@ from typing import *
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.message.circuit import ConnectionHolder
from hippolyzer.lib.base.message.msgtypes import PacketFlags
from hippolyzer.lib.base.templates import (
TransferRequestParamsBase,
TransferChannelType,
@@ -94,7 +95,7 @@ class TransferManager:
if params_dict.get("SessionID", dataclasses.MISSING) is None:
params.SessionID = self._session_id
self._connection_holder.circuit.send_message(Message(
self._connection_holder.circuit.send(Message(
'TransferRequest',
Block(
'TransferInfo',
@@ -104,6 +105,7 @@ class TransferManager:
Priority=priority,
Params_=params,
),
flags=PacketFlags.RELIABLE,
))
transfer = Transfer(transfer_id)
asyncio.create_task(self._pump_transfer_replies(transfer))

View File

@@ -1,5 +1,5 @@
from PySide2.QtCore import QMetaObject
from PySide2.QtUiTools import QUiLoader
from PySide6.QtCore import QMetaObject
from PySide6.QtUiTools import QUiLoader
class UiLoader(QUiLoader):

View File

@@ -13,7 +13,7 @@ from xml.etree.ElementTree import parse as parse_etree
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.helpers import get_resource_filename
from hippolyzer.lib.base.legacy_inv import InventorySaleInfo, InventoryPermissions
from hippolyzer.lib.base.inventory import InventorySaleInfo, InventoryPermissions
from hippolyzer.lib.base.legacy_schema import SchemaBase, parse_schema_line, SchemaParsingError
from hippolyzer.lib.base.templates import WearableType

View File

@@ -11,7 +11,7 @@ from typing import *
from hippolyzer.lib.base.datatypes import UUID, RawBytes
from hippolyzer.lib.base.message.data_packer import TemplateDataPacker
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.message.msgtypes import MsgType
from hippolyzer.lib.base.message.msgtypes import MsgType, PacketFlags
from hippolyzer.lib.base.network.transport import Direction
from hippolyzer.lib.base.message.circuit import ConnectionHolder
from hippolyzer.lib.base.templates import XferPacket, XferFilePath, AssetType, XferError
@@ -110,7 +110,7 @@ class XferManager:
direction: Direction = Direction.OUT,
) -> Xfer:
xfer_id = xfer_id if xfer_id is not None else random.getrandbits(64)
self._connection_holder.circuit.send_message(Message(
self._connection_holder.circuit.send(Message(
'RequestXfer',
Block(
'XferID',
@@ -174,10 +174,11 @@ class XferManager:
to_ack = range(xfer.next_ackable, ack_max)
xfer.next_ackable = ack_max
for ack_id in to_ack:
self._connection_holder.circuit.send_message(Message(
self._connection_holder.circuit.send_reliable(Message(
"ConfirmXferPacket",
Block("XferID", ID=xfer.xfer_id, Packet=ack_id),
direction=xfer.direction,
flags=PacketFlags.RELIABLE,
))
xfer.chunks[packet_id.PacketID] = packet_data
@@ -216,7 +217,7 @@ class XferManager:
else:
inline_data = data
self._connection_holder.circuit.send_message(Message(
self._connection_holder.circuit.send(Message(
"AssetUploadRequest",
Block(
"AssetBlock",
@@ -225,7 +226,8 @@ class XferManager:
Tempfile=temp_file,
StoreLocal=store_local,
AssetData=inline_data,
)
),
flags=PacketFlags.RELIABLE
))
fut = asyncio.Future()
asyncio.create_task(self._pump_asset_upload(xfer, transaction_id, fut))
@@ -272,12 +274,13 @@ class XferManager:
chunk = xfer.chunks.pop(packet_id)
# EOF if there are no chunks left
packet_val = XferPacket(PacketID=packet_id, IsEOF=not bool(xfer.chunks))
self._connection_holder.circuit.send_message(Message(
self._connection_holder.circuit.send(Message(
"SendXferPacket",
Block("XferID", ID=xfer.xfer_id, Packet_=packet_val),
Block("DataPacket", Data=chunk),
# Send this towards the sender of the RequestXfer
direction=~request_msg.direction,
flags=PacketFlags.RELIABLE,
))
# Don't care about the value, just want to know it was confirmed.
if wait_for_confirm:

View File

@@ -0,0 +1,127 @@
from typing import NamedTuple, Union, Optional
import hippolyzer.lib.base.serialization as se
from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.mesh import MeshAsset, LLMeshSerializer
from hippolyzer.lib.base.templates import AssetType
from hippolyzer.lib.client.state import BaseClientRegion
class UploadError(Exception):
pass
class UploadToken(NamedTuple):
linden_cost: int
uploader_url: str
payload: bytes
class AssetUploader:
def __init__(self, region: BaseClientRegion):
self._region = region
async def initiate_asset_upload(self, name: str, asset_type: AssetType,
body: bytes, flags: Optional[int] = None) -> UploadToken:
payload = {
"asset_type": asset_type.human_name,
"description": "(No Description)",
"everyone_mask": 0,
"group_mask": 0,
"folder_id": UUID.ZERO, # Puts it in the default folder, I guess. Undocumented.
"inventory_type": asset_type.inventory_type.human_name,
"name": name,
"next_owner_mask": 581632,
}
if flags is not None:
payload['flags'] = flags
resp_payload = await self._make_newfileagentinventory_req(payload)
return UploadToken(resp_payload["upload_price"], resp_payload["uploader"], body)
async def _make_newfileagentinventory_req(self, payload: dict):
async with self._region.caps_client.post("NewFileAgentInventory", llsd=payload) as resp:
resp.raise_for_status()
resp_payload = await resp.read_llsd()
# Need to sniff the resp payload for this because SL sends a 200 status code on error
if "error" in resp_payload:
raise UploadError(resp_payload)
return resp_payload
async def complete_upload(self, token: UploadToken) -> dict:
async with self._region.caps_client.post(token.uploader_url, data=token.payload) as resp:
resp.raise_for_status()
resp_payload = await resp.read_llsd()
# The actual upload endpoints return 200 on error, have to sniff the payload to figure
# out if it actually failed...
if "error" in resp_payload:
raise UploadError(resp_payload)
await self._handle_upload_complete(resp_payload)
return resp_payload
async def _handle_upload_complete(self, resp_payload: dict):
"""
Generic hook called when any asset upload completes.
Could trigger an AIS fetch to send the viewer details about the item we just created,
assuming we were in proxy context.
"""
pass
# The mesh upload flow is a little special, so it gets its own methods
async def initiate_mesh_upload(self, name: str, mesh: Union[bytes, MeshAsset],
flags: Optional[int] = None) -> UploadToken:
"""
Very basic LL-serialized mesh uploader
Currently only handles a single mesh with a single face and no associated textures.
"""
if isinstance(mesh, MeshAsset):
writer = se.BufferWriter("!")
writer.write(LLMeshSerializer(), mesh)
mesh = writer.copy_buffer()
asset_resources = self._build_asset_resources(name, mesh)
payload = {
'asset_resources': asset_resources,
'asset_type': 'mesh',
'description': '(No Description)',
'everyone_mask': 0,
'folder_id': UUID.ZERO,
'group_mask': 0,
'inventory_type': 'object',
'name': name,
'next_owner_mask': 581632,
'texture_folder_id': UUID.ZERO
}
if flags is not None:
payload['flags'] = flags
resp_payload = await self._make_newfileagentinventory_req(payload)
upload_body = llsd.format_xml(asset_resources)
return UploadToken(resp_payload["upload_price"], resp_payload["uploader"], upload_body)
def _build_asset_resources(self, name: str, mesh: bytes) -> dict:
return {
'instance_list': [
{
'face_list': [
{
'diffuse_color': [1.0, 1.0, 1.0, 1.0],
'fullbright': False
}
],
'material': 3,
'mesh': 0,
'mesh_name': name,
'physics_shape_type': 2,
'position': [0.0, 0.0, 0.0],
'rotation': [0.7071067690849304, 0.0, 0.0, 0.7071067690849304],
'scale': [1.0, 1.0, 1.0]
}
],
'mesh_list': [mesh],
'metric': 'MUT_Unspecified',
'texture_list': []
}
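
Putting the two-step flow together: initiate_asset_upload() does the NewFileAgentInventory POST and returns an UploadToken, then complete_upload() POSTs the token's payload to the uploader URL and raises UploadError if the grid smuggled an error into a 200 response. A hedged usage sketch, assuming `region` is a connected BaseClientRegion, `anim_bytes` holds a valid .anim asset, and AssetType.ANIMATION exists in templates:

uploader = AssetUploader(region)
token = await uploader.initiate_asset_upload("my anim", AssetType.ANIMATION, anim_bytes)
print(f"Upload will cost L${token.linden_cost}")
resp = await uploader.complete_upload(token)
new_item_id = resp["new_inventory_item"]  # key per the proxy uploader below

Mesh goes through initiate_mesh_upload() instead, which accepts either raw bytes or a MeshAsset and builds the asset_resources wrapper for you.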

View File

@@ -0,0 +1,192 @@
from __future__ import annotations
import gzip
import logging
import secrets
from pathlib import Path
from typing import Union, List, Tuple, Set
from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.inventory import InventoryModel, InventoryCategory, InventoryItem
from hippolyzer.lib.base.message.message import Block
from hippolyzer.lib.client.state import BaseClientSession
LOG = logging.getLogger(__name__)
class InventoryManager:
def __init__(self, session: BaseClientSession):
self._session = session
self.model: InventoryModel = InventoryModel()
self._load_skeleton()
def _load_skeleton(self):
assert not self.model.nodes
skel_cats: List[dict] = self._session.login_data.get('inventory-skeleton', [])
for skel_cat in skel_cats:
self.model.add(InventoryCategory(
name=skel_cat["name"],
cat_id=UUID(skel_cat["folder_id"]),
parent_id=UUID(skel_cat["parent_id"]),
# Don't use the version from the skeleton; this flags the inventory as needing
# completion from the inventory cache. This matches indra's behavior.
version=InventoryCategory.VERSION_NONE,
type="category",
pref_type=skel_cat.get("type_default", -1),
owner_id=self._session.agent_id,
))
def load_cache(self, path: Union[str, Path]):
# Per indra, rough flow for loading inv on login is:
# 1. Look at inventory skeleton from login response
# 2. Pre-populate model with categories from the skeleton, including their versions
# 3. Read the inventory cache, tracking categories and items separately
# 4. Walk the list of categories in our cache. If the cat exists in the skeleton and the versions
# match, then we may load the category and its descendants from cache.
# 5. Any categories in the skeleton but not in the cache, or those with mismatched versions must be fetched.
# The viewer does this by setting the local version of the cats to -1 and forcing a descendent fetch
# over AIS.
#
# By the time you call this function, you should have already loaded the inventory skeleton
# into the model and set its inventory category versions to VERSION_NONE.
skel_cats: List[dict] = self._session.login_data['inventory-skeleton']
# UUID -> version map for inventory skeleton
skel_versions = {UUID(cat["folder_id"]): cat["version"] for cat in skel_cats}
LOG.info(f"Parsing inv cache at {path}")
cached_categories, cached_items = self._parse_cache(path)
LOG.info(f"Done parsing inv cache at {path}")
loaded_cat_ids: Set[UUID] = set()
for cached_cat in cached_categories:
existing_cat: InventoryCategory = self.model.get(cached_cat.cat_id) # noqa
# Don't clobber an existing cat unless it just has a placeholder version,
# maybe from loading the skeleton?
if existing_cat and existing_cat.version != InventoryCategory.VERSION_NONE:
continue
# Cached cat isn't the same as what the inv server says it should be, can't use it.
if cached_cat.version != skel_versions.get(cached_cat.cat_id):
continue
if existing_cat:
# Remove the category so that we can replace it, but leave any children in place
self.model.unlink(existing_cat, single_only=True)
self.model.add(cached_cat)
# Any items in this category in our cache file are usable and should be added
loaded_cat_ids.add(cached_cat.cat_id)
for cached_item in cached_items:
# The skeleton doesn't have any items, so if we run into any items they should be exactly the
# same as what we're trying to add. No point clobbering.
if cached_item.item_id in self.model:
continue
# The parent category didn't have a cache hit against the inventory skeleton, can't add!
if cached_item.parent_id not in loaded_cat_ids:
continue
self.model.add(cached_item)
def _parse_cache(self, path: Union[str, Path]) -> Tuple[List[InventoryCategory], List[InventoryItem]]:
categories: List[InventoryCategory] = []
items: List[InventoryItem] = []
# Parse our cached items and categories out of the compressed inventory cache
first_line = True
with gzip.open(path, "rb") as f:
# Line-delimited LLSD notation!
for line in f.readlines():
# TODO: Parsing of invcache is dominated by `parse_notation()`. It's stupidly inefficient.
node_llsd = llsd.parse_notation(line)
if first_line:
# First line is the file header
first_line = False
if node_llsd['inv_cache_version'] != 2:
raise ValueError(f"Unknown cache version: {node_llsd!r}")
continue
if InventoryCategory.ID_ATTR in node_llsd:
if (cat_node := InventoryCategory.from_llsd(node_llsd)) is not None:
categories.append(cat_node)
elif InventoryItem.ID_ATTR in node_llsd:
if (item_node := InventoryItem.from_llsd(node_llsd)) is not None:
items.append(item_node)
else:
LOG.warning(f"Unknown node type in inv cache: {node_llsd!r}")
return categories, items
# Thankfully we have 9 billion different ways to represent inventory data.
def ais_item_to_inventory_data(ais_item: dict) -> Block:
return Block(
"InventoryData",
ItemID=ais_item["item_id"],
FolderID=ais_item["parent_id"],
CallbackID=0,
CreatorID=ais_item["permissions"]["creator_id"],
OwnerID=ais_item["permissions"]["owner_id"],
GroupID=ais_item["permissions"]["group_id"],
BaseMask=ais_item["permissions"]["base_mask"],
OwnerMask=ais_item["permissions"]["owner_mask"],
GroupMask=ais_item["permissions"]["group_mask"],
EveryoneMask=ais_item["permissions"]["everyone_mask"],
NextOwnerMask=ais_item["permissions"]["next_owner_mask"],
GroupOwned=0,
AssetID=ais_item["asset_id"],
Type=ais_item["type"],
InvType=ais_item["inv_type"],
Flags=ais_item["flags"],
SaleType=ais_item["sale_info"]["sale_type"],
SalePrice=ais_item["sale_info"]["sale_price"],
Name=ais_item["name"],
Description=ais_item["desc"],
CreationDate=ais_item["created_at"],
# Meaningless here
CRC=secrets.randbits(32),
)
def inventory_data_to_ais_item(inventory_data: Block) -> dict:
return dict(
item_id=inventory_data["ItemID"],
parent_id=inventory_data["FolderID"],
permissions=dict(
creator_id=inventory_data["CreatorID"],
owner_id=inventory_data["OwnerID"],
group_id=inventory_data["GroupID"],
base_mask=inventory_data["BaseMask"],
owner_mask=inventory_data["OwnerMask"],
group_mask=inventory_data["GroupMask"],
everyone_mask=inventory_data["EveryoneMask"],
next_owner_mask=inventory_data["NextOwnerMask"],
),
asset_id=inventory_data["AssetID"],
type=inventory_data["Type"],
inv_type=inventory_data["InvType"],
flags=inventory_data["Flags"],
sale_info=dict(
sale_type=inventory_data["SaleType"],
sale_price=inventory_data["SalePrice"],
),
name=inventory_data["Name"],
desc=inventory_data["Description"],
created_at=inventory_data["CreationDate"],
)
def ais_folder_to_inventory_data(ais_folder: dict) -> Block:
return Block(
"FolderData",
FolderID=ais_folder["cat_id"],
ParentID=ais_folder["parent_id"],
CallbackID=0,
Type=ais_folder["preferred_type"],
Name=ais_folder["name"],
)
def inventory_data_to_ais_folder(inventory_data: Block) -> dict:
return dict(
cat_id=inventory_data["FolderID"],
parent_id=inventory_data["ParentID"],
preferred_type=inventory_data["Type"],
name=inventory_data["Name"],
)
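
The cache file these helpers sit alongside is just a gzipped stream of line-delimited LLSD notation: a header line carrying inv_cache_version, then one category or item per line. A sketch of reading one directly, mirroring _parse_cache() above (the file name is hypothetical):

import gzip
from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.inventory import InventoryCategory, InventoryItem

with gzip.open("someagent.inv.llsd.gz", "rb") as f:
    header, *entries = f.readlines()
assert llsd.parse_notation(header)["inv_cache_version"] == 2
for line in entries:
    node = llsd.parse_notation(line)
    if InventoryCategory.ID_ATTR in node:
        print("category:", node.get("name"))
    elif InventoryItem.ID_ATTR in node:
        print("item:", node.get("name"))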

View File

@@ -17,6 +17,7 @@ from hippolyzer.lib.base.datatypes import UUID, Vector3
from hippolyzer.lib.base.helpers import proxify
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.message.message_handler import MessageHandler
from hippolyzer.lib.base.message.msgtypes import PacketFlags
from hippolyzer.lib.base.objects import (
normalize_object_update,
normalize_terse_object_update,
@@ -34,7 +35,7 @@ LOG = logging.getLogger(__name__)
OBJECT_OR_LOCAL = Union[Object, int]
class UpdateType(enum.IntEnum):
class ObjectUpdateType(enum.IntEnum):
OBJECT_UPDATE = enum.auto()
PROPERTIES = enum.auto()
FAMILY = enum.auto()
@@ -116,15 +117,15 @@ class ClientObjectManager:
*[Block("ObjectData", ObjectLocalID=x) for x in ids_to_req[:255]],
]
# Selecting causes ObjectProperties to be sent
self._region.circuit.send_message(Message("ObjectSelect", blocks))
self._region.circuit.send_message(Message("ObjectDeselect", blocks))
self._region.circuit.send(Message("ObjectSelect", blocks, flags=PacketFlags.RELIABLE))
self._region.circuit.send(Message("ObjectDeselect", blocks, flags=PacketFlags.RELIABLE))
ids_to_req = ids_to_req[255:]
futures = []
for local_id in local_ids:
if local_id in unselected_ids:
# Need to wait until we get our reply
fut = self.state.register_future(local_id, UpdateType.PROPERTIES)
fut = self.state.register_future(local_id, ObjectUpdateType.PROPERTIES)
else:
# This was selected so we should already have up to date info
fut = asyncio.Future()
@@ -150,16 +151,17 @@ class ClientObjectManager:
ids_to_req = local_ids
while ids_to_req:
self._region.circuit.send_message(Message(
self._region.circuit.send(Message(
"RequestMultipleObjects",
Block("AgentData", AgentID=session.agent_id, SessionID=session.id),
*[Block("ObjectData", CacheMissType=0, ID=x) for x in ids_to_req[:255]],
flags=PacketFlags.RELIABLE,
))
ids_to_req = ids_to_req[255:]
futures = []
for local_id in local_ids:
futures.append(self.state.register_future(local_id, UpdateType.OBJECT_UPDATE))
futures.append(self.state.register_future(local_id, ObjectUpdateType.OBJECT_UPDATE))
return futures
@@ -168,15 +170,15 @@ class ObjectEvent:
object: Object
updated: Set[str]
update_type: UpdateType
update_type: ObjectUpdateType
def __init__(self, obj: Object, updated: Set[str], update_type: UpdateType):
def __init__(self, obj: Object, updated: Set[str], update_type: ObjectUpdateType):
self.object = obj
self.updated = updated
self.update_type = update_type
@property
def name(self) -> UpdateType:
def name(self) -> ObjectUpdateType:
return self.update_type
@@ -186,7 +188,7 @@ class ClientWorldObjectManager:
self._session: BaseClientSession = session
self._settings = settings
self.name_cache = name_cache or NameCache()
self.events: MessageHandler[ObjectEvent, UpdateType] = MessageHandler(take_by_default=False)
self.events: MessageHandler[ObjectEvent, ObjectUpdateType] = MessageHandler(take_by_default=False)
self._fullid_lookup: Dict[UUID, Object] = {}
self._avatars: Dict[UUID, Avatar] = {}
self._avatar_objects: Dict[UUID, Object] = {}
@@ -295,7 +297,7 @@ class ClientWorldObjectManager:
self._rebuild_avatar_objects()
self._region_managers.clear()
def _update_existing_object(self, obj: Object, new_properties: dict, update_type: UpdateType):
def _update_existing_object(self, obj: Object, new_properties: dict, update_type: ObjectUpdateType):
old_parent_id = obj.ParentID
new_parent_id = new_properties.get("ParentID", obj.ParentID)
old_local_id = obj.LocalID
@@ -354,7 +356,7 @@ class ClientWorldObjectManager:
if obj.PCode == PCode.AVATAR:
self._avatar_objects[obj.FullID] = obj
self._rebuild_avatar_objects()
self._run_object_update_hooks(obj, set(obj.to_dict().keys()), UpdateType.OBJECT_UPDATE)
self._run_object_update_hooks(obj, set(obj.to_dict().keys()), ObjectUpdateType.OBJECT_UPDATE)
def _kill_object_by_local_id(self, region_state: RegionObjectsState, local_id: int):
obj = region_state.lookup_localid(local_id)
@@ -406,7 +408,7 @@ class ClientWorldObjectManager:
# our view of the world then we want to move it to this region.
obj = self.lookup_fullid(object_data["FullID"])
if obj:
self._update_existing_object(obj, object_data, UpdateType.OBJECT_UPDATE)
self._update_existing_object(obj, object_data, ObjectUpdateType.OBJECT_UPDATE)
else:
if region_state is None:
continue
@@ -430,7 +432,7 @@ class ClientWorldObjectManager:
# Need the Object as context because decoding state requires PCode.
state_deserializer = ObjectStateSerializer.deserialize
object_data["State"] = state_deserializer(ctx_obj=obj, val=object_data["State"])
self._update_existing_object(obj, object_data, UpdateType.OBJECT_UPDATE)
self._update_existing_object(obj, object_data, ObjectUpdateType.OBJECT_UPDATE)
else:
if region_state:
region_state.missing_locals.add(object_data["LocalID"])
@@ -458,7 +460,7 @@ class ClientWorldObjectManager:
self._update_existing_object(obj, {
"UpdateFlags": update_flags,
"RegionHandle": handle,
}, UpdateType.OBJECT_UPDATE)
}, ObjectUpdateType.OBJECT_UPDATE)
continue
cached_obj_data = self._lookup_cache_entry(handle, block["ID"], block["CRC"])
@@ -497,7 +499,7 @@ class ClientWorldObjectManager:
LOG.warning(f"Got ObjectUpdateCompressed for unknown region {handle}: {object_data!r}")
obj = self.lookup_fullid(object_data["FullID"])
if obj:
self._update_existing_object(obj, object_data, UpdateType.OBJECT_UPDATE)
self._update_existing_object(obj, object_data, ObjectUpdateType.OBJECT_UPDATE)
else:
if region_state is None:
continue
@@ -514,7 +516,7 @@ class ClientWorldObjectManager:
obj = self.lookup_fullid(block["ObjectID"])
if obj:
seen_locals.append(obj.LocalID)
self._update_existing_object(obj, object_properties, UpdateType.PROPERTIES)
self._update_existing_object(obj, object_properties, ObjectUpdateType.PROPERTIES)
else:
LOG.debug(f"Received {packet.name} for unknown {block['ObjectID']}")
packet.meta["ObjectUpdateIDs"] = tuple(seen_locals)
@@ -561,9 +563,9 @@ class ClientWorldObjectManager:
LOG.debug(f"Received ObjectCost for unknown {object_id}")
continue
obj.ObjectCosts.update(object_costs)
self._run_object_update_hooks(obj, {"ObjectCosts"}, UpdateType.COSTS)
self._run_object_update_hooks(obj, {"ObjectCosts"}, ObjectUpdateType.COSTS)
def _run_object_update_hooks(self, obj: Object, updated_props: Set[str], update_type: UpdateType):
def _run_object_update_hooks(self, obj: Object, updated_props: Set[str], update_type: ObjectUpdateType):
region_state = self._get_region_state(obj.RegionHandle)
region_state.resolve_futures(obj, update_type)
if obj.PCode == PCode.AVATAR and "NameValue" in updated_props:
@@ -572,7 +574,7 @@ class ClientWorldObjectManager:
self.events.handle(ObjectEvent(obj, updated_props, update_type))
def _run_kill_object_hooks(self, obj: Object):
self.events.handle(ObjectEvent(obj, set(), UpdateType.KILL))
self.events.handle(ObjectEvent(obj, set(), ObjectUpdateType.KILL))
def _rebuild_avatar_objects(self):
# Get all avatars known through coarse locations and which region the location was in
@@ -779,7 +781,7 @@ class RegionObjectsState:
del self._orphans[parent_id]
return removed
def register_future(self, local_id: int, future_type: UpdateType) -> asyncio.Future[Object]:
def register_future(self, local_id: int, future_type: ObjectUpdateType) -> asyncio.Future[Object]:
fut = asyncio.Future()
fut_key = (local_id, future_type)
local_futs = self._object_futures.get(fut_key, [])
@@ -788,7 +790,7 @@ class RegionObjectsState:
fut.add_done_callback(local_futs.remove)
return fut
def resolve_futures(self, obj: Object, update_type: UpdateType):
def resolve_futures(self, obj: Object, update_type: ObjectUpdateType):
futures = self._object_futures.get((obj.LocalID, update_type), [])
for fut in futures[:]:
fut.set_result(obj)
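
After the rename, the futures API reads cleanly: a caller registers interest in a (local_id, ObjectUpdateType) pair, and resolve_futures() completes the future on the next matching update. A minimal sketch using the names from this module, assuming `region_state` is the RegionObjectsState for the object's region:

async def wait_for_properties(region_state: RegionObjectsState, local_id: int) -> Object:
    # Completed by resolve_futures() when the next matching ObjectProperties arrives
    fut = region_state.register_future(local_id, ObjectUpdateType.PROPERTIES)
    return await fut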

View File

@@ -10,6 +10,7 @@ from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.circuit import ConnectionHolder
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.message.message_handler import MessageHandler
from hippolyzer.lib.base.network.caps_client import CapsClient
from hippolyzer.lib.base.network.transport import ADDR_TUPLE
if TYPE_CHECKING:
@@ -18,10 +19,11 @@ if TYPE_CHECKING:
class BaseClientRegion(ConnectionHolder, abc.ABC):
"""Represents a client's view of a remote region"""
# Actually a weakref
handle: Optional[int]
# Actually a weakref
session: Callable[[], BaseClientSession]
objects: ClientObjectManager
caps_client: CapsClient
class BaseClientSession(abc.ABC):
@@ -34,3 +36,4 @@ class BaseClientSession(abc.ABC):
region_by_handle: Callable[[int], Optional[BaseClientRegion]]
region_by_circuit_addr: Callable[[ADDR_TUPLE], Optional[BaseClientRegion]]
objects: ClientWorldObjectManager
login_data: Dict[str, Any]

View File

@@ -1,11 +1,13 @@
from __future__ import annotations
from typing import *
import abc
import copy
import dataclasses
import multiprocessing
import pickle
import secrets
import warnings
from typing import *
from hippolyzer.lib.base.datatypes import UUID, Vector3
from hippolyzer.lib.base.message.message import Block, Message
@@ -14,10 +16,11 @@ from hippolyzer.lib.proxy import addon_ctx
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.base.network.transport import UDPPacket, Direction
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import SessionManager, Session
from hippolyzer.lib.proxy.task_scheduler import TaskLifeScope
from hippolyzer.lib.base.templates import ChatSourceType, ChatType
if TYPE_CHECKING:
from hippolyzer.lib.proxy.sessions import SessionManager, Session
from hippolyzer.lib.proxy.region import ProxiedRegion
class AssetAliasTracker:
@@ -73,17 +76,17 @@ def show_message(text, session=None) -> None:
direction=Direction.IN,
)
if session:
session.main_region.circuit.send_message(message)
session.main_region.circuit.send(message)
else:
for session in AddonManager.SESSION_MANAGER.sessions:
session.main_region.circuit.send_message(copy.copy(message))
session.main_region.circuit.send(copy.copy(message))
def send_chat(message: Union[bytes, str], channel=0, chat_type=ChatType.NORMAL, session=None):
session = session or addon_ctx.session.get(None) or None
if not session:
raise RuntimeError("Tried to send chat without session")
session.main_region.circuit.send_message(Message(
session.main_region.circuit.send(Message(
"ChatFromViewer",
Block(
"AgentData",
@@ -99,36 +102,34 @@ def send_chat(message: Union[bytes, str], channel=0, chat_type=ChatType.NORMAL,
))
def ais_item_to_inventory_data(ais_item: dict):
return Block(
"InventoryData",
ItemID=ais_item["item_id"],
FolderID=ais_item["parent_id"],
CallbackID=0,
CreatorID=ais_item["permissions"]["creator_id"],
OwnerID=ais_item["permissions"]["owner_id"],
GroupID=ais_item["permissions"]["group_id"],
BaseMask=ais_item["permissions"]["base_mask"],
OwnerMask=ais_item["permissions"]["owner_mask"],
GroupMask=ais_item["permissions"]["group_mask"],
EveryoneMask=ais_item["permissions"]["everyone_mask"],
NextOwnerMask=ais_item["permissions"]["next_owner_mask"],
GroupOwned=0,
AssetID=ais_item["asset_id"],
Type=ais_item["type"],
InvType=ais_item["inv_type"],
Flags=ais_item["flags"],
SaleType=ais_item["sale_info"]["sale_type"],
SalePrice=ais_item["sale_info"]["sale_price"],
Name=ais_item["name"],
Description=ais_item["desc"],
CreationDate=ais_item["created_at"],
# Meaningless here
CRC=secrets.randbits(32),
)
class MetaBaseAddon(abc.ABCMeta):
"""
Metaclass for BaseAddon that prevents class member assignments from clobbering descriptors
Without this, things like:
class Foo(BaseAddon):
bar: int = GlobalProperty(0)
Foo.bar = 2
Won't work as you expect!
"""
def __setattr__(self, key: str, value):
# TODO: Keep track of AddonProperties in __new__ or something?
try:
existing = object.__getattribute__(self, key)
except AttributeError:
# If the attribute doesn't exist then it's fine to use the base setattr.
super().__setattr__(key, value)
return
if existing and isinstance(existing, BaseAddonProperty):
existing.__set__(self, value)
return
super().__setattr__(key, value)
class BaseAddon(abc.ABC):
class BaseAddon(metaclass=MetaBaseAddon):
def _schedule_task(self, coro: Coroutine, session=None,
region_scoped=False, session_scoped=True, addon_scoped=True):
session = session or addon_ctx.session.get(None) or None
@@ -181,6 +182,9 @@ class BaseAddon(abc.ABC):
def handle_region_changed(self, session: Session, region: ProxiedRegion):
pass
def handle_region_registered(self, session: Session, region: ProxiedRegion):
pass
def handle_circuit_created(self, session: Session, region: ProxiedRegion):
pass
@@ -194,7 +198,7 @@ class BaseAddon(abc.ABC):
_T = TypeVar("_T")
_U = TypeVar("_U", Session, SessionManager)
_U = TypeVar("_U", "Session", "SessionManager")
class BaseAddonProperty(abc.ABC, Generic[_T, _U]):
@@ -243,7 +247,7 @@ class BaseAddonProperty(abc.ABC, Generic[_T, _U]):
self._get_context_obj().addon_ctx[self.name] = value
class SessionProperty(BaseAddonProperty[_T, Session]):
class SessionProperty(BaseAddonProperty[_T, "Session"]):
"""
Property tied to the current session context
@@ -253,7 +257,7 @@ class SessionProperty(BaseAddonProperty[_T, Session]):
return addon_ctx.session.get()
class GlobalProperty(BaseAddonProperty[_T, SessionManager]):
class GlobalProperty(BaseAddonProperty[_T, "SessionManager"]):
"""
Property tied to the global SessionManager context
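
Spelling out the docstring's example: with MetaBaseAddon in place, assigning to a class-level property goes through the descriptor instead of rebinding the class attribute. Note the descriptors store into the active addon context, so the assignment needs a pushed context (a live SessionManager, for GlobalProperty) to actually succeed:

class Foo(BaseAddon):
    bar: int = GlobalProperty(0)

# Routed through GlobalProperty.__set__() rather than silently
# replacing the descriptor with the plain int 2
Foo.bar = 2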

View File

@@ -16,6 +16,7 @@ from types import ModuleType
from typing import *
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.helpers import get_mtime
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.network.transport import UDPPacket
from hippolyzer.lib.proxy import addon_ctx
@@ -31,13 +32,6 @@ if TYPE_CHECKING:
LOG = logging.getLogger(__name__)
def _get_mtime(path):
try:
return os.stat(path).st_mtime
except:
return None
class BaseInteractionManager:
@abc.abstractmethod
async def open_dir(self, caption: str = '', directory: str = '', filter_str: str = '') -> Optional[str]:
@@ -52,7 +46,8 @@ class BaseInteractionManager:
pass
@abc.abstractmethod
async def save_file(self, caption: str = '', directory: str = '', filter_str: str = '') -> Optional[str]:
async def save_file(self, caption: str = '', directory: str = '', filter_str: str = '',
default_suffix: str = '') -> Optional[str]:
pass
@abc.abstractmethod
@@ -63,6 +58,15 @@ class BaseInteractionManager:
return None
# Used to initialize a REPL environment with commonly desired helpers
REPL_INITIALIZER = r"""
from hippolyzer.lib.base.datatypes import *
from hippolyzer.lib.base.templates import *
from hippolyzer.lib.base.message.message import Block, Message, Direction
from hippolyzer.lib.proxy.addon_utils import send_chat, show_message
"""
class AddonManager:
COMMAND_CHANNEL = 524
@@ -138,6 +142,16 @@ class AddonManager:
if _locals is None:
_locals = stack.frame.f_locals
init_globals = {}
exec(REPL_INITIALIZER, init_globals, None)
# We're modifying the globals of the caller, so be careful that things we import
# for the REPL initializer don't clobber things that already exist in the caller's globals.
# Making our own mutable copy of the globals dict, mutating that and then passing it
# to embed() is not an option due to https://github.com/prompt-toolkit/ptpython/issues/279
for global_name, global_val in init_globals.items():
if global_name not in _globals:
_globals[global_name] = global_val
async def _wrapper():
coro: Coroutine = ptpython.repl.embed( # noqa: the type signature lies
globals=_globals,
@@ -186,7 +200,7 @@ class AddonManager:
def _check_hotreloads(cls):
"""Mark addons that rely on changed files for reloading"""
for filename, importers in cls.HOTRELOAD_IMPORTERS.items():
mtime = _get_mtime(filename)
mtime = get_mtime(filename)
if not mtime or mtime == cls.FILE_MTIMES.get(filename, None):
continue
@@ -215,7 +229,7 @@ class AddonManager:
# Mark the caller as having imported (and being dependent on) `module`
stack = inspect.stack()[1]
cls.HOTRELOAD_IMPORTERS[imported_file].add(stack.filename)
cls.FILE_MTIMES[imported_file] = _get_mtime(imported_file)
cls.FILE_MTIMES[imported_file] = get_mtime(imported_file)
importing_spec = next((s for s in cls.BASE_ADDON_SPECS if s.origin == stack.filename), None)
imported_spec = next((s for s in cls.BASE_ADDON_SPECS if s.origin == imported_file), None)
@@ -261,9 +275,12 @@ class AddonManager:
new_addons = {}
for spec in cls.BASE_ADDON_SPECS[:]:
previous_mod = cls.FRESH_ADDON_MODULES.get(spec.name)
# Whether we've EVER successfully loaded this module.
# There may be a `None` entry in the dict if that's the case.
had_mod = spec.name in cls.FRESH_ADDON_MODULES
try:
mtime = _get_mtime(spec.origin)
mtime = get_mtime(spec.origin)
mtime_changed = mtime != cls.FILE_MTIMES.get(spec.origin, None)
if not mtime_changed and had_mod:
continue
@@ -275,20 +292,21 @@ class AddonManager:
# Keep module loaded even if file went away.
continue
if previous_mod:
cls._unload_module(previous_mod)
logging.info("(Re)compiling addon %s" % spec.origin)
old_mod = cls.FRESH_ADDON_MODULES.get(spec.name)
mod = importlib.util.module_from_spec(spec)
sys.modules[spec.name] = mod
spec.loader.exec_module(mod)
cls.FILE_MTIMES[spec.origin] = mtime
cls._unload_module(old_mod)
new_addons[spec.name] = mod
# Make sure module initialization happens after any pending task cancellations
# due to module unloading.
asyncio.get_event_loop().call_soon(cls._init_module, mod)
loop = asyncio.get_event_loop_policy().get_event_loop()
loop.call_soon(cls._init_module, mod)
except Exception as e:
if had_mod:
logging.exception("Exploded trying to reload addon %s" % spec.name)
@@ -414,22 +432,34 @@ class AddonManager:
chat_type: int = message["ChatData"]["ChatType"]
# RLV-style OwnerSay?
if chat and chat.startswith("@") and chat_type == 8:
# RLV-style command, `@<cmd>(:<option1>;<option2>)?(=<param>)?`
options, _, param = chat.rpartition("=")
cmd, _, options = options.lstrip("@").partition(":")
options = options.split(";")
source = message["ChatData"]["SourceID"]
try:
with addon_ctx.push(session, region):
handled = cls._call_all_addon_hooks("handle_rlv_command",
session, region, source, cmd, options, param)
if handled:
region.circuit.drop_message(message)
return True
except:
LOG.exception(f"Failed while handling command {chat!r}")
if not cls._SWALLOW_ADDON_EXCEPTIONS:
raise
# RLV allows putting multiple commands into one message, so we blindly split on ",".
chat = chat.lstrip("@")
all_cmds_handled = True
for command_str in chat.split(","):
if not command_str:
continue
# RLV-style command, `@<cmd>(:<option1>;<option2>)?(=<param>)?`
options, _, param = command_str.partition("=")
cmd, _, options = options.partition(":")
# TODO: Not always correct, commands can specify their own parsing for the option field
options = options.split(";") if options else []
source = message["ChatData"]["SourceID"]
try:
with addon_ctx.push(session, region):
handled = cls._call_all_addon_hooks("handle_rlv_command",
session, region, source, cmd, options, param)
if handled:
region.circuit.drop_message(message)
else:
all_cmds_handled = False
except:
LOG.exception(f"Failed while handling command {command_str!r}")
all_cmds_handled = False
if not cls._SWALLOW_ADDON_EXCEPTIONS:
raise
# Drop the chat message if all commands it contained were handled by an addon
if all_cmds_handled:
return True
with addon_ctx.push(session, region):
return cls._call_all_addon_hooks("handle_lludp_message", session, region, message)
@@ -526,6 +556,11 @@ class AddonManager:
with addon_ctx.push(session, region):
return cls._call_all_addon_hooks("handle_region_changed", session, region)
@classmethod
def handle_region_registered(cls, session: Session, region: ProxiedRegion):
with addon_ctx.push(session, region):
return cls._call_all_addon_hooks("handle_region_registered", session, region)
@classmethod
def handle_circuit_created(cls, session: Session, region: ProxiedRegion):
with addon_ctx.push(session, region):
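
The rewritten parser batches commands the way RLV does: one message, commands separated by ",", each still shaped like `@<cmd>(:<option1>;<option2>)?(=<param>)?`. Tracing the split logic above on a synthetic message:

chat = "@detach:chest=force,getstatus=2222"
for command_str in chat.lstrip("@").split(","):
    options, _, param = command_str.partition("=")
    cmd, _, options = options.partition(":")
    options = options.split(";") if options else []
    print(cmd, options, param)
# detach ['chest'] force
# getstatus [] 2222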

View File

@@ -0,0 +1,39 @@
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Message, Block
from hippolyzer.lib.base.network.transport import Direction
from hippolyzer.lib.client.asset_uploader import AssetUploader
from hippolyzer.lib.client.inventory_manager import ais_item_to_inventory_data
class ProxyAssetUploader(AssetUploader):
async def _handle_upload_complete(self, resp_payload: dict):
# Check if this is a failure response first, raising if it is
await super()._handle_upload_complete(resp_payload)
# Fetch enough data from AIS to tell the viewer about the new inventory item
session = self._region.session()
item_id = resp_payload["new_inventory_item"]
ais_req_data = {
"items": [
{
"owner_id": session.agent_id,
"item_id": item_id,
}
]
}
async with self._region.caps_client.post('FetchInventory2', llsd=ais_req_data) as resp:
ais_item = (await resp.read_llsd())["items"][0]
# Got it, ship it off to the viewer
message = Message(
"UpdateCreateInventoryItem",
Block(
"AgentData",
AgentID=session.agent_id,
SimApproved=1,
TransactionID=UUID.random(),
),
ais_item_to_inventory_data(ais_item),
direction=Direction.IN
)
self._region.circuit.send(message)

View File

@@ -0,0 +1,93 @@
from __future__ import annotations
import enum
import typing
from weakref import ref
from typing import *
if TYPE_CHECKING:
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session, SessionManager
def is_asset_server_cap_name(cap_name):
return cap_name and (
cap_name.startswith("GetMesh")
or cap_name.startswith("GetTexture")
or cap_name.startswith("ViewerAsset")
)
class CapType(enum.Enum):
NORMAL = enum.auto()
TEMPORARY = enum.auto()
WRAPPER = enum.auto()
PROXY_ONLY = enum.auto()
@property
def fake(self) -> bool:
return self == CapType.PROXY_ONLY or self == CapType.WRAPPER
class SerializedCapData(typing.NamedTuple):
cap_name: typing.Optional[str] = None
region_addr: typing.Optional[str] = None
session_id: typing.Optional[str] = None
base_url: typing.Optional[str] = None
type: str = "NORMAL"
def __bool__(self):
return bool(self.cap_name or self.session_id)
@property
def asset_server_cap(self):
return is_asset_server_cap_name(self.cap_name)
class CapData(NamedTuple):
cap_name: Optional[str] = None
# Actually they're weakrefs but the type sigs suck.
region: Optional[Callable[[], Optional[ProxiedRegion]]] = None
session: Optional[Callable[[], Optional[Session]]] = None
base_url: Optional[str] = None
type: CapType = CapType.NORMAL
def __bool__(self):
return bool(self.cap_name or self.session)
def serialize(self) -> "SerializedCapData":
return SerializedCapData(
cap_name=self.cap_name,
region_addr=str(self.region().circuit_addr) if self.region and self.region() else None,
session_id=str(self.session().id) if self.session and self.session() else None,
base_url=self.base_url,
type=self.type.name,
)
@classmethod
def deserialize(
cls,
ser_cap_data: "SerializedCapData",
session_mgr: Optional[SessionManager],
) -> "CapData":
cap_session = None
cap_region = None
if session_mgr and ser_cap_data.session_id:
for session in session_mgr.sessions:
if ser_cap_data.session_id == str(session.id):
cap_session = session
if cap_session and ser_cap_data.region_addr:
for region in cap_session.regions:
if ser_cap_data.region_addr == str(region.circuit_addr):
cap_region = region
return cls(
cap_name=ser_cap_data.cap_name,
region=ref(cap_region) if cap_region else None,
session=ref(cap_session) if cap_session else None,
base_url=ser_cap_data.base_url,
type=CapType[ser_cap_data.type],
)
@property
def asset_server_cap(self) -> bool:
return is_asset_server_cap_name(self.cap_name)
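
CapData holds weakrefs that can't cross the process boundary, so serialize() flattens them to plain strings and deserialize() re-resolves them against whatever sessions are still alive. A round-trip sketch with no live session manager:

from hippolyzer.lib.proxy.caps import CapData, CapType

cap = CapData(cap_name="GetTexture", base_url="http://example.invalid/cap")
ser = cap.serialize()
assert ser.asset_server_cap  # same name-based check CapData exposes
restored = CapData.deserialize(ser, session_mgr=None)
assert restored.cap_name == "GetTexture" and restored.type == CapType.NORMAL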

View File

@@ -20,7 +20,7 @@ class ProxyCapsClient(CapsClient):
def _get_caps(self) -> Optional[CAPS_DICT]:
if not self._region:
return None
return self._region.caps
return self._region.cap_urls
def _request_fixups(self, cap_or_url: str, headers: Dict, proxy: Optional[bool], ssl: Any):
# We want to proxy this through Hippolyzer
@@ -28,7 +28,8 @@ class ProxyCapsClient(CapsClient):
# We go through the proxy by default, tack on a header letting mitmproxy know the
# request came from us so we can tag the request as injected. The header will be popped
# off before passing through to the server.
headers["X-Hippo-Injected"] = "1"
if "X-Hippo-Injected" not in headers:
headers["X-Hippo-Injected"] = "1"
proxy_port = self._settings.HTTP_PROXY_PORT
proxy = f"http://127.0.0.1:{proxy_port}"
# TODO: set up the SSLContext to validate mitmproxy's cert

View File

@@ -25,7 +25,7 @@ class ProxiedCircuit(Circuit):
except:
logging.exception(f"Failed to serialize: {message.to_dict()!r}")
raise
if self.logging_hook and message.injected:
if self.logging_hook and message.synthetic:
self.logging_hook(message)
return self.send_datagram(serialized, message.direction, transport=transport)
@@ -34,44 +34,46 @@ class ProxiedCircuit(Circuit):
return self.out_injections, self.in_injections
return self.in_injections, self.out_injections
def prepare_message(self, message: Message, direction=None):
def prepare_message(self, message: Message):
if message.finalized:
raise RuntimeError(f"Trying to re-send finalized {message!r}")
direction = direction or getattr(message, 'direction')
fwd_injections, reverse_injections = self._get_injections(direction)
if message.queued:
# This is due to be dropped, nothing should be sending the original
raise RuntimeError(f"Trying to send original of queued {message!r}")
fwd_injections, reverse_injections = self._get_injections(message.direction)
message.finalized = True
# Injected, let's gen an ID
if message.packet_id is None:
message.packet_id = fwd_injections.gen_injectable_id()
message.injected = True
else:
message.synthetic = True
# This message wasn't injected by the proxy so we need to rewrite packet IDs
# to account for IDs the real creator of the packet couldn't have known about.
elif not message.synthetic:
# was_dropped needs the unmodified packet ID
if fwd_injections.was_dropped(message.packet_id) and message.name != "PacketAck":
logging.warning("Attempting to re-send previously dropped %s:%s, did we ack?" %
(message.packet_id, message.name))
message.packet_id = fwd_injections.get_effective_id(message.packet_id)
fwd_injections.track_seen(message.packet_id)
message.finalized = True
if not message.injected:
# This message wasn't injected by the proxy so we need to rewrite packet IDs
# to account for IDs the other parties couldn't have known about.
message.acks = tuple(
reverse_injections.get_original_id(x) for x in message.acks
if not reverse_injections.was_injected(x)
)
if message.name == "PacketAck":
if not self._rewrite_packet_ack(message, reverse_injections):
logging.debug(f"Dropping {direction} ack for injected packets!")
if not self._rewrite_packet_ack(message, reverse_injections) and not message.acks:
logging.debug(f"Dropping {message.direction} ack for injected packets!")
# Let caller know this shouldn't be sent at all, it's strictly ACKs for
# injected packets.
return False
elif message.name == "StartPingCheck":
self._rewrite_start_ping_check(message, fwd_injections)
if not message.acks:
if message.acks:
message.send_flags |= PacketFlags.ACK
else:
message.send_flags &= ~PacketFlags.ACK
return True
@@ -97,15 +99,18 @@ class ProxiedCircuit(Circuit):
new_id = fwd_injections.get_effective_id(orig_id)
if orig_id != new_id:
logging.debug("Rewrote oldest unacked %s -> %s" % (orig_id, new_id))
# Get a list of unacked IDs for the direction this StartPingCheck is heading
fwd_unacked = (a for (d, a) in self.unacked_reliable.keys() if d == message.direction)
# Use the proxy's oldest unacked ID if it's older than the client's
new_id = min((new_id, *fwd_unacked))
message["PingID"]["OldestUnacked"] = new_id
def drop_message(self, message: Message, orig_direction=None):
def drop_message(self, message: Message):
if message.finalized:
raise RuntimeError(f"Trying to drop finalized {message!r}")
if message.packet_id is None:
return
orig_direction = orig_direction or message.direction
fwd_injections, reverse_injections = self._get_injections(orig_direction)
fwd_injections, reverse_injections = self._get_injections(message.direction)
fwd_injections.mark_dropped(message.packet_id)
message.dropped = True
@@ -113,7 +118,7 @@ class ProxiedCircuit(Circuit):
# Was sent reliably, tell the other end that we saw it and to shut up.
if message.reliable:
self.send_acks([message.packet_id], ~orig_direction)
self.send_acks([message.packet_id], ~message.direction)
# This packet had acks for the other end, send them in a separate PacketAck
effective_acks = tuple(
@@ -121,7 +126,7 @@ class ProxiedCircuit(Circuit):
if not reverse_injections.was_injected(x)
)
if effective_acks:
self.send_acks(effective_acks, orig_direction, packet_id=message.packet_id)
self.send_acks(effective_acks, message.direction, packet_id=message.packet_id)
class InjectionTracker:

View File

@@ -26,6 +26,10 @@ class CommandDetails(NamedTuple):
lifetime: Optional[TaskLifeScope] = None
def parse_bool(val: str) -> bool:
return val.lower() in ('on', 'true', '1', '1.0', 'yes')
def handle_command(command_name: Optional[str] = None, /, *, lifetime: Optional[TaskLifeScope] = None,
single_instance: bool = False, **params: Union[Parameter, callable]):
"""
@@ -61,13 +65,13 @@ def handle_command(command_name: Optional[str] = None, /, *, lifetime: Optional[
# Greedy, takes the rest of the message
if param.sep is None:
param_val = message
message = None
message = ""
else:
message = message.lstrip(param.sep)
if not message:
if param.optional:
break
raise KeyError(f"Missing parameter {param_name}")
if not param.optional:
raise KeyError(f"Missing parameter {param_name}")
continue
param_val, _, message = message.partition(param.sep) # type: ignore
param_vals[param_name] = param.parser(param_val)
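
parse_bool() accepts the usual truthy spellings and treats anything else as false:

assert parse_bool("ON") and parse_bool("yes") and parse_bool("1.0")
assert not parse_bool("off") and not parse_bool("2")

The parsing fixes above matter in two ways: a greedy parameter now leaves `message` as an empty string instead of None, so later parameters hit the normal empty-message path rather than crashing, and a required parameter that follows a missing optional one now raises KeyError instead of being silently skipped by the old `break`.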

View File

@@ -58,7 +58,7 @@ class HTTPAssetRepo(collections.UserDict):
return False
asset = self[asset_id]
flow.response = http.HTTPResponse.make(
flow.response = http.Response.make(
content=asset.data,
headers={
"Content-Type": "application/octet-stream",

View File

@@ -18,8 +18,9 @@ from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.llsd_msg_serializer import LLSDMessageSerializer
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.region import ProxiedRegion, CapType
from hippolyzer.lib.proxy.sessions import SessionManager, CapData, Session
from hippolyzer.lib.proxy.caps import CapData, CapType
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import SessionManager, Session
from hippolyzer.lib.proxy.http_proxy import HTTPFlowContext
@@ -82,16 +83,19 @@ class MITMProxyEventManager:
finally:
# If someone has taken this request out of the regular callback flow,
# they'll manually send a callback at some later time.
if not flow.taken:
self.to_proxy_queue.put(("callback", flow.id, flow.get_state()))
if not flow.taken and not flow.resumed:
# Addon hasn't taken ownership of this flow; send it back to mitmproxy
# ourselves.
flow.resume()
def _handle_request(self, flow: HippoHTTPFlow):
url = flow.request.url
cap_data = self.session_manager.resolve_cap(url)
flow.cap_data = cap_data
# Don't do anything special with the proxy's own requests,
# we only pass it through for logging purposes.
if flow.request_injected:
# Don't do anything special with the proxy's own requests unless the requested
# URL can only be handled by the proxy. Ideally we only pass the request through
# for logging purposes.
if flow.request_injected and (not cap_data or not cap_data.type.fake):
return
# The local asset repo gets first bite at the apple
@@ -103,7 +107,7 @@ class MITMProxyEventManager:
AddonManager.handle_http_request(flow)
if cap_data and cap_data.cap_name.endswith("ProxyWrapper"):
orig_cap_name = cap_data.cap_name.rsplit("ProxyWrapper", 1)[0]
orig_cap_url = cap_data.region().caps[orig_cap_name]
orig_cap_url = cap_data.region().cap_urls[orig_cap_name]
split_orig_url = urllib.parse.urlsplit(orig_cap_url)
orig_cap_host = split_orig_url[1]
@@ -120,7 +124,7 @@ class MITMProxyEventManager:
if not flow.can_stream or self._asset_server_proxied:
flow.request.url = redir_url
else:
flow.response = mitmproxy.http.HTTPResponse.make(
flow.response = mitmproxy.http.Response.make(
307,
# Can't provide explanation in the body because this results in failing Range requests under
# mitmproxy that return garbage data. Chances are there's weird interactions
@@ -134,9 +138,41 @@ class MITMProxyEventManager:
)
elif cap_data and cap_data.asset_server_cap:
# Both the wrapper request and the actual asset server request went through
# the proxy
# the proxy. Don't bother trying the redirect strategy anymore.
self._asset_server_proxied = True
logging.warning("noproxy not used, switching to URI rewrite strategy")
elif cap_data and cap_data.cap_name == "EventQueueGet":
# HACK: The sim's EQ acking mechanism doesn't seem to actually work.
# If the client drops the connection due to timeout before we can
# proxy back the response then it will be lost forever. Keep around
# the last EQ response we got so we can re-send it if the client repeats
# its previous request.
req_ack_id = llsd.parse_xml(flow.request.content)["ack"]
eq_manager = cap_data.region().eq_manager
cached_resp = eq_manager.get_cached_poll_response(req_ack_id)
if cached_resp:
logging.warning("Had to serve a cached EventQueueGet due to client desync")
flow.response = mitmproxy.http.Response.make(
200,
llsd.format_xml(cached_resp),
{
"Content-Type": "application/llsd+xml",
# So we can differentiate these in the log
"X-Hippo-Fake-EQ": "1",
"Connection": "close",
},
)
elif cap_data and cap_data.cap_name == "Seed":
# Drop any proxy-only caps from the seed request we send to the server,
# add those cap names as metadata so we know to send their urls in the response
parsed_seed: List[str] = llsd.parse_xml(flow.request.content)
flow.metadata['needed_proxy_caps'] = []
for known_cap_name, (known_cap_type, known_cap_url) in cap_data.region().caps.items():
if known_cap_type == CapType.PROXY_ONLY and known_cap_name in parsed_seed:
parsed_seed.remove(known_cap_name)
flow.metadata['needed_proxy_caps'].append(known_cap_name)
if flow.metadata['needed_proxy_caps']:
flow.request.content = llsd.format_xml(parsed_seed)
elif not cap_data:
if self._is_login_request(flow):
# Not strictly a Cap, but makes it easier to filter on.
@@ -145,7 +181,7 @@ class MITMProxyEventManager:
if cap_data and cap_data.type == CapType.PROXY_ONLY:
# A proxy addon was supposed to respond itself, but it didn't.
if not flow.taken and not flow.response_injected:
flow.response = mitmproxy.http.HTTPResponse.make(
flow.response = mitmproxy.http.Response.make(
500,
b"Proxy didn't handle proxy-only Cap correctly",
{
@@ -176,10 +212,14 @@ class MITMProxyEventManager:
def _handle_response(self, flow: HippoHTTPFlow):
message_logger = self.session_manager.message_logger
if message_logger:
message_logger.log_http_response(flow)
try:
message_logger.log_http_response(flow)
except:
logging.exception("Failed while logging HTTP flow")
# Don't handle responses for requests injected by the proxy
if flow.request_injected:
# Don't process responses for requests or responses injected by the proxy.
# We already processed it, it came from us!
if flow.request_injected or flow.response_injected:
return
status = flow.response.status_code
@@ -240,7 +280,10 @@ class MITMProxyEventManager:
for cap_name in wrappable_caps:
if cap_name in parsed:
parsed[cap_name] = region.register_wrapper_cap(cap_name)
flow.response.content = llsd.format_pretty_xml(parsed)
# Send the client the URLs for any proxy-only caps it requested
for cap_name in flow.metadata['needed_proxy_caps']:
parsed[cap_name] = region.cap_urls[cap_name]
flow.response.content = llsd.format_xml(parsed)
elif cap_data.cap_name == "EventQueueGet":
parsed_eq_resp = llsd.parse_xml(flow.response.content)
if parsed_eq_resp:
@@ -251,18 +294,21 @@ class MITMProxyEventManager:
new_events.append(event)
# Add on any fake events that've been queued by addons
eq_manager = cap_data.region().eq_manager
new_events.extend(eq_manager.take_events())
new_events.extend(eq_manager.take_injected_events())
parsed_eq_resp["events"] = new_events
# Empty event list is an error, need to return undef instead.
if old_events and not new_events:
# Need at least one event or the viewer will refuse to ack!
new_events.append({"message": "NOP", "body": {}})
flow.response.content = llsd.format_pretty_xml(parsed_eq_resp)
parsed_eq_resp = None
# HACK: see note in above request handler for EventQueueGet
req_ack_id = llsd.parse_xml(flow.request.content)["ack"]
eq_manager.cache_last_poll_response(req_ack_id, parsed_eq_resp)
flow.response.content = llsd.format_xml(parsed_eq_resp)
elif cap_data.cap_name in self.UPLOAD_CREATING_CAPS:
if not region:
return
parsed = llsd.parse_xml(flow.response.content)
if "uploader" in parsed:
region.register_temporary_cap(cap_data.cap_name + "Uploader", parsed["uploader"])
region.register_cap(cap_data.cap_name + "Uploader", parsed["uploader"], CapType.TEMPORARY)
except:
logging.exception("OOPS, blew up in HTTP proxy!")

View File

@@ -1,13 +1,18 @@
from __future__ import annotations
import copy
import multiprocessing
import weakref
from typing import *
from typing import Optional
import mitmproxy.http
from mitmproxy.http import HTTPFlow
from hippolyzer.lib.proxy.caps import CapData
if TYPE_CHECKING:
from hippolyzer.lib.proxy.sessions import CapData, SessionManager
from hippolyzer.lib.proxy.sessions import SessionManager
class HippoHTTPFlow:
@@ -17,24 +22,26 @@ class HippoHTTPFlow:
Hides the nastiness of writing to flow.metadata so we can pass
state back and forth between the two proxies
"""
__slots__ = ("flow",)
__slots__ = ("flow", "callback_queue", "resumed", "taken")
def __init__(self, flow: HTTPFlow):
def __init__(self, flow: HTTPFlow, callback_queue: Optional[multiprocessing.Queue] = None):
self.flow: HTTPFlow = flow
self.resumed = False
self.taken = False
self.callback_queue = weakref.ref(callback_queue) if callback_queue else None
meta = self.flow.metadata
meta.setdefault("taken", False)
meta.setdefault("can_stream", True)
meta.setdefault("response_injected", False)
meta.setdefault("request_injected", False)
meta.setdefault("cap_data", None)
meta.setdefault("cap_data", CapData())
meta.setdefault("from_browser", False)
@property
def request(self) -> mitmproxy.http.HTTPRequest:
def request(self) -> mitmproxy.http.Request:
return self.flow.request
@property
def response(self) -> Optional[mitmproxy.http.HTTPResponse]:
def response(self) -> Optional[mitmproxy.http.Response]:
return self.flow.response
@property
@@ -42,7 +49,7 @@ class HippoHTTPFlow:
return self.flow.id
@response.setter
def response(self, val: Optional[mitmproxy.http.HTTPResponse]):
def response(self, val: Optional[mitmproxy.http.Response]):
self.flow.metadata["response_injected"] = True
self.flow.response = val
@@ -88,12 +95,27 @@ class HippoHTTPFlow:
def take(self) -> HippoHTTPFlow:
"""Don't automatically pass this flow back to mitmproxy"""
self.metadata["taken"] = True
# TODO: Having to explicitly take / release Flows to use them in an async
# context is kind of janky. The HTTP callback handling code should probably
# be made totally async, including the addon hooks. Would a coroutine per
# callback be expensive?
assert not self.taken and not self.resumed
self.taken = True
return self
@property
def taken(self) -> bool:
return self.metadata["taken"]
def resume(self):
"""Release the HTTP flow back to the normal processing flow"""
assert self.callback_queue
assert not self.resumed
self.taken = False
self.resumed = True
self.callback_queue().put(("callback", self.flow.id, self.get_state()))
def preempt(self):
# Must be some flow that we previously resumed; we're racing
# the result from the server end.
assert not self.taken and self.resumed
self.callback_queue().put(("preempt", self.flow.id, self.get_state()))
@property
def is_replay(self) -> bool:
@@ -113,15 +135,18 @@ class HippoHTTPFlow:
return state
@classmethod
def from_state(cls, flow_state: Dict, session_manager: SessionManager) -> HippoHTTPFlow:
def from_state(cls, flow_state: Dict, session_manager: Optional[SessionManager]) -> HippoHTTPFlow:
flow: Optional[HTTPFlow] = HTTPFlow.from_state(flow_state)
assert flow is not None
cap_data_ser = flow.metadata.get("cap_data_ser")
callback_queue = None
if session_manager:
callback_queue = session_manager.flow_context.to_proxy_queue
if cap_data_ser is not None:
flow.metadata["cap_data"] = session_manager.deserialize_cap_data(cap_data_ser)
flow.metadata["cap_data"] = CapData.deserialize(cap_data_ser, session_manager)
else:
flow.metadata["cap_data"] = None
return cls(flow)
return cls(flow, callback_queue)
def copy(self) -> HippoHTTPFlow:
# HACK: flow.copy() expects the flow to be fully JSON serializable, but
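
The flow-ownership contract is now explicit: a hook calls take() to keep the flow from being auto-resumed when the hook returns, then must call resume() exactly once to push it back to mitmproxy over the callback queue. A hedged sketch of deferring a response from an addon; the hook name and signature here are assumptions for illustration, not part of the API shown above:

import asyncio

class DelayResponseAddon(BaseAddon):
    def handle_http_response(self, session_manager, flow: HippoHTTPFlow):
        if flow.cap_data and flow.cap_data.cap_name == "GetTexture":
            flow.take()  # we own the flow now; no auto-resume
            self._schedule_task(self._resume_later(flow))

    async def _resume_later(self, flow: HippoHTTPFlow):
        await asyncio.sleep(1.0)  # stand-in for real async work
        flow.resume()  # hand the flow back to mitmproxy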

View File

@@ -1,5 +1,4 @@
import asyncio
import functools
import logging
import multiprocessing
import os
@@ -8,6 +7,7 @@ import sys
import queue
import typing
import uuid
import weakref
import mitmproxy.certs
import mitmproxy.ctx
@@ -15,42 +15,30 @@ import mitmproxy.log
import mitmproxy.master
import mitmproxy.options
import mitmproxy.proxy
from mitmproxy.addons import core, clientplayback
from mitmproxy.addons import core, clientplayback, proxyserver, next_layer, disable_h2c
from mitmproxy.http import HTTPFlow
from mitmproxy.proxy.layers import tls
import OpenSSL
from hippolyzer.lib.base.helpers import get_resource_filename
from hippolyzer.lib.base.multiprocessing_utils import ParentProcessWatcher
orig_sethostflags = OpenSSL.SSL._lib.X509_VERIFY_PARAM_set_hostflags # noqa
@functools.wraps(orig_sethostflags)
def _sethostflags_wrapper(param, flags):
# Since 2000 the recommendation per RFCs has been to only check SANs and not the CN field.
# Most browsers do this, as does mitmproxy. The viewer does not, and the sim certs have no SAN
# field. Just monkeypatch out this flag since mitmproxy's internals are in flux and there's
# no good way to stop setting this flag currently.
return orig_sethostflags(
param,
flags & (~OpenSSL.SSL._lib.X509_CHECK_FLAG_NEVER_CHECK_SUBJECT) # noqa
)
OpenSSL.SSL._lib.X509_VERIFY_PARAM_set_hostflags = _sethostflags_wrapper # noqa
from hippolyzer.lib.proxy.caps import SerializedCapData
class SLCertStore(mitmproxy.certs.CertStore):
def get_cert(self, commonname: typing.Optional[bytes], sans: typing.List[bytes], *args):
cert, privkey, chain = super().get_cert(commonname, sans, *args)
x509: OpenSSL.crypto.X509 = cert.x509
def get_cert(self, commonname: typing.Optional[str], sans: typing.List[str], *args, **kwargs):
entry = super().get_cert(commonname, sans, *args, **kwargs)
cert, privkey, chain = entry.cert, entry.privatekey, entry.chain_file
x509 = cert.to_pyopenssl()
# The cert must have a subject key ID or the viewer will reject it.
for i in range(0, x509.get_extension_count()):
ext = x509.get_extension(i)
# This cert already has a subject key id, pass through.
if ext.get_short_name() == b"subjectKeyIdentifier":
return cert, privkey, chain
return entry
# Need to add a subject key ID onto this cert or the viewer will reject it.
# The viewer doesn't actually use the subject key ID for its intended purpose,
# so a random, unique value is fine.
x509.add_extensions([
OpenSSL.crypto.X509Extension(
b"subjectKeyIdentifier",
@@ -58,17 +46,24 @@ class SLCertStore(mitmproxy.certs.CertStore):
uuid.uuid4().hex.encode("utf8"),
),
])
x509.sign(privkey, "sha256") # type: ignore
return cert, privkey, chain
x509.sign(OpenSSL.crypto.PKey.from_cryptography_key(privkey), "sha256") # type: ignore
new_entry = mitmproxy.certs.CertStoreEntry(
mitmproxy.certs.Cert.from_pyopenssl(x509), privkey, chain
)
# Replace the cert that was created in the base `get_cert()` with our modified cert
self.certs[(commonname, tuple(sans))] = new_entry
self.expire_queue.pop(-1)
self.expire(new_entry)
return new_entry
class SLProxyConfig(mitmproxy.proxy.ProxyConfig):
def configure(self, options, updated) -> None:
super().configure(options, updated)
class SLTlsConfig(mitmproxy.addons.tlsconfig.TlsConfig):
def running(self):
super().running()
old_cert_store = self.certstore
# Replace the cert store with one that knows how to add
# a subject key ID extension.
self.certstore = SLCertStore( # noqa
self.certstore = SLCertStore(
default_privatekey=old_cert_store.default_privatekey,
default_ca=old_cert_store.default_ca,
default_chain_file=old_cert_store.default_chain_file,
@@ -76,6 +71,18 @@ class SLProxyConfig(mitmproxy.proxy.ProxyConfig):
)
self.certstore.certs = old_cert_store.certs
def tls_start_server(self, tls_start: tls.TlsData):
super().tls_start_server(tls_start)
# Since 2000 the recommendation per RFCs has been to only check SANs and not the CN field.
# Most browsers do this, as does mitmproxy. The viewer does not, and the sim certs have no SAN
# field. Set the host verification flags to remove the flag that disallows falling back to
# checking the CN (X509_CHECK_FLAG_NEVER_CHECK_SUBJECT).
param = OpenSSL.SSL._lib.SSL_get0_param(tls_start.ssl_conn._ssl) # noqa
# get_hostflags() doesn't seem to be exposed, just set the usual flags without
# the problematic `X509_CHECK_FLAG_NEVER_CHECK_SUBJECT` flag.
flags = OpenSSL.SSL._lib.X509_CHECK_FLAG_NO_PARTIAL_WILDCARDS # noqa
OpenSSL.SSL._lib.X509_VERIFY_PARAM_set_hostflags(param, flags) # noqa
class HTTPFlowContext:
def __init__(self):
@@ -92,12 +99,13 @@ class IPCInterceptionAddon:
flow which is merged in and resumed.
"""
def __init__(self, flow_context: HTTPFlowContext):
self.intercepted_flows: typing.Dict[str, HTTPFlow] = {}
self.mitmproxy_ready = flow_context.mitmproxy_ready
self.flows: weakref.WeakValueDictionary[str, HTTPFlow] = weakref.WeakValueDictionary()
self.from_proxy_queue: multiprocessing.Queue = flow_context.from_proxy_queue
self.to_proxy_queue: multiprocessing.Queue = flow_context.to_proxy_queue
self.shutdown_signal: multiprocessing.Event = flow_context.shutdown_signal
def log(self, entry: mitmproxy.log.LogEntry):
def add_log(self, entry: mitmproxy.log.LogEntry):
if entry.level == "debug":
logging.debug(entry.msg)
elif entry.level in ("alert", "info"):
@@ -112,6 +120,8 @@ class IPCInterceptionAddon:
def running(self):
# register to pump the events or something here
asyncio.create_task(self._pump_callbacks())
# Tell the main process mitmproxy is ready to handle requests
self.mitmproxy_ready.set()
async def _pump_callbacks(self):
watcher = ParentProcessWatcher(self.shutdown_signal)
@@ -125,11 +135,13 @@ class IPCInterceptionAddon:
await asyncio.sleep(0.001)
continue
if event_type == "callback":
orig_flow = self.intercepted_flows.pop(flow_id)
orig_flow = self.flows[flow_id]
orig_flow.set_state(flow_state)
# Remove the taken flag from the flow if present; by definition the flow
# isn't take()n anymore once it's been passed back to the proxy.
orig_flow.metadata.pop("taken", None)
elif event_type == "preempt":
orig_flow = self.flows.get(flow_id)
if orig_flow:
orig_flow.intercept()
orig_flow.set_state(flow_state)
elif event_type == "replay":
flow: HTTPFlow = HTTPFlow.from_state(flow_state)
# mitmproxy won't replay intercepted flows, this is an old flow so
@@ -151,8 +163,8 @@ class IPCInterceptionAddon:
from_browser = "Mozilla" in flow.request.headers.get("User-Agent", "")
flow.metadata["from_browser"] = from_browser
# Only trust the "injected" header if not from a browser
was_injected = flow.request.headers.pop("X-Hippo-Injected", False)
if was_injected and not from_browser:
was_injected = flow.request.headers.pop("X-Hippo-Injected", "")
if was_injected == "1" and not from_browser:
flow.metadata["request_injected"] = True
# Does this request need the stupid hack around aiohttp's windows proactor bug
@@ -163,13 +175,13 @@ class IPCInterceptionAddon:
def _queue_flow_interception(self, event_type: str, flow: HTTPFlow):
flow.intercept()
self.intercepted_flows[flow.id] = flow
self.flows[flow.id] = flow
self.from_proxy_queue.put((event_type, flow.get_state()), True)
def responseheaders(self, flow: HTTPFlow):
# The response was injected in an earlier handler,
# we don't want to touch this anymore.
if flow.metadata["response_injected"]:
if flow.metadata.get("response_injected"):
return
# Someone fucked up and put a mimetype in Content-Encoding.
@@ -180,7 +192,10 @@ class IPCInterceptionAddon:
flow.response.headers["Content-Encoding"] = "identity"
def response(self, flow: HTTPFlow):
if flow.metadata["response_injected"]:
cap_data: typing.Optional[SerializedCapData] = flow.metadata.get("cap_data")
if flow.metadata.get("response_injected") and cap_data and cap_data.asset_server_cap:
# Don't bother intercepting asset server requests where we injected a response.
# We don't want to log them and they don't need any more processing by user hooks.
return
self._queue_flow_interception("response", flow)
@@ -188,10 +203,10 @@ class IPCInterceptionAddon:
class SLMITMAddon(IPCInterceptionAddon):
def responseheaders(self, flow: HTTPFlow):
super().responseheaders(flow)
cap_data: typing.Optional[SerializedCapData] = flow.metadata["cap_data_ser"]
cap_data: typing.Optional[SerializedCapData] = flow.metadata.get("cap_data_ser")
# Request came from the proxy itself, don't touch it.
if flow.metadata["request_injected"]:
if flow.metadata.get("request_injected"):
return
# This is an asset server response that we're not interested in intercepting.
@@ -200,7 +215,7 @@ class SLMITMAddon(IPCInterceptionAddon):
# Can't stream if we injected our own response or we were asked not to stream
if not flow.metadata["response_injected"] and flow.metadata["can_stream"]:
flow.response.stream = True
elif not cap_data and not flow.metadata["from_browser"]:
elif not cap_data and not flow.metadata.get("from_browser"):
object_name = flow.response.headers.get("X-SecondLife-Object-Name", "")
# Meh. Add some fake Cap data for this so it can be matched on.
if object_name.startswith("#Firestorm LSL Bridge"):
@@ -213,13 +228,13 @@ class SLMITMMaster(mitmproxy.master.Master):
self.addons.add(
core.Core(),
clientplayback.ClientPlayback(),
SLMITMAddon(flow_context)
disable_h2c.DisableH2C(),
proxyserver.Proxyserver(),
next_layer.NextLayer(),
SLTlsConfig(),
SLMITMAddon(flow_context),
)
def start_server(self):
self.start()
asyncio.ensure_future(self.running())
def create_proxy_master(host, port, flow_context: HTTPFlowContext): # pragma: no cover
opts = mitmproxy.options.Options()
@@ -242,30 +257,4 @@ def create_proxy_master(host, port, flow_context: HTTPFlowContext): # pragma: n
def create_http_proxy(bind_host, port, flow_context: HTTPFlowContext): # pragma: no cover
master = create_proxy_master(bind_host, port, flow_context)
pconf = SLProxyConfig(master.options)
server = mitmproxy.proxy.server.ProxyServer(pconf)
master.server = server
return master
def is_asset_server_cap_name(cap_name):
return cap_name and (
cap_name.startswith("GetMesh")
or cap_name.startswith("GetTexture")
or cap_name.startswith("ViewerAsset")
)
class SerializedCapData(typing.NamedTuple):
cap_name: typing.Optional[str] = None
region_addr: typing.Optional[str] = None
session_id: typing.Optional[str] = None
base_url: typing.Optional[str] = None
type: str = "NORMAL"
def __bool__(self):
return bool(self.cap_name or self.session_id)
@property
def asset_server_cap(self):
return is_asset_server_cap_name(self.cap_name)

View File

@@ -0,0 +1,28 @@
import datetime as dt
from hippolyzer.lib.base.helpers import get_mtime
from hippolyzer.lib.client.inventory_manager import InventoryManager
from hippolyzer.lib.client.state import BaseClientSession
from hippolyzer.lib.proxy.viewer_settings import iter_viewer_cache_dirs
class ProxyInventoryManager(InventoryManager):
def __init__(self, session: BaseClientSession):
super().__init__(session)
newest_cache = None
newest_timestamp = dt.datetime(year=1970, month=1, day=1, tzinfo=dt.timezone.utc)
# Look for the newest version of the cached inventory and use that.
# Not foolproof, but close enough if we're not sure what viewer is being used.
for cache_dir in iter_viewer_cache_dirs():
inv_cache_path = cache_dir / (str(session.agent_id) + ".inv.llsd.gz")
if inv_cache_path.exists():
mod = get_mtime(inv_cache_path)
if not mod:
continue
mod_ts = dt.datetime.fromtimestamp(mod, dt.timezone.utc)
if mod_ts <= newest_timestamp:
continue
newest_cache = inv_cache_path
# Keep the timestamp up to date so older caches found later are skipped
newest_timestamp = mod_ts
if newest_cache:
self.load_cache(newest_cache)
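The same newest-file-wins selection can be expressed with max() over (mtime, path) pairs; a sketch using the helpers above (agent_id is a stand-in for session.agent_id):
candidates = (d / f"{agent_id}.inv.llsd.gz" for d in iter_viewer_cache_dirs())
stamped = [(get_mtime(p), p) for p in candidates if p.exists()]
stamped = [(mtime, p) for mtime, p in stamped if mtime]
if stamped:
    newest_cache = max(stamped)[1]  # ties broken by path ordering, fine here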

View File

@@ -1,3 +1,4 @@
import asyncio
import logging
import weakref
from typing import Optional, Tuple
@@ -35,6 +36,18 @@ class InterceptingLLUDPProxyProtocol(UDPProxyProtocol):
)
self.message_xml = MessageDotXML()
self.session: Optional[Session] = None
loop = asyncio.get_event_loop_policy().get_event_loop()
self.resend_task = loop.create_task(self.attempt_resends())
async def attempt_resends(self):
while True:
await asyncio.sleep(0.1)
if self.session is None:
continue
for region in self.session.regions:
if not region.circuit or not region.circuit.is_alive:
continue
region.circuit.resend_unacked()
def _ensure_message_allowed(self, msg: Message):
if not self.message_xml.validate_udp_msg(msg.name):
@@ -99,6 +112,9 @@ class InterceptingLLUDPProxyProtocol(UDPProxyProtocol):
LOG.error("No circuit for %r, dropping packet!" % (packet.far_addr,))
return
# Process any ACKs for messages we injected first
region.circuit.collect_acks(message)
if message.name == "AgentMovementComplete":
self.session.main_region = region
if region.handle is None:
@@ -131,7 +147,7 @@ class InterceptingLLUDPProxyProtocol(UDPProxyProtocol):
# This message is owned by an async handler, drop it so it doesn't get
# sent with the normal flow.
if message.queued and not message.dropped:
if message.queued:
region.circuit.drop_message(message)
# Shouldn't mutate the message past this point, so log it now.
@@ -146,8 +162,9 @@ class InterceptingLLUDPProxyProtocol(UDPProxyProtocol):
elif message.name == "RegionHandshake":
region.name = str(message["RegionInfo"][0]["SimName"])
if not message.dropped:
region.circuit.send_message(message)
# Send the message if it wasn't explicitly dropped or sent before
if not message.finalized:
region.circuit.send(message)
def close(self):
super().close()
@@ -155,3 +172,4 @@ class InterceptingLLUDPProxyProtocol(UDPProxyProtocol):
AddonManager.handle_session_closed(self.session)
self.session_manager.close_session(self.session)
self.session = None
self.resend_task.cancel()

View File

@@ -3,7 +3,7 @@ import ast
import typing
from arpeggio import Optional, ZeroOrMore, EOF, \
ParserPython, PTNodeVisitor, visit_parse_tree, RegExMatch
ParserPython, PTNodeVisitor, visit_parse_tree, RegExMatch, OneOrMore
def literal():
@@ -12,7 +12,7 @@ def literal():
# https://stackoverflow.com/questions/14366401/#comment79795017_14366904
RegExMatch(r'''b?(\"\"\"|\'\'\'|\"|\')((?<!\\)(\\\\)*\\\1|.)*?\1'''),
# base16
RegExMatch(r'0x\d+'),
RegExMatch(r'0x[0-9a-fA-F]+'),
# base10 int or float.
RegExMatch(r'\d+(\.\d+)?'),
"None",
@@ -26,7 +26,9 @@ def literal():
def identifier():
return RegExMatch(r'[a-zA-Z*]([a-zA-Z0-9_*]+)?')
# Identifiers are allowed to have "-". It's not a special character
# in our grammar, and we expect them to show up in some places, like header names.
return RegExMatch(r'[a-zA-Z*]([a-zA-Z0-9_*-]+)?')
def field_specifier():
@@ -42,7 +44,7 @@ def unary_expression():
def meta_field_specifier():
return "Meta", ".", identifier
return "Meta", OneOrMore(".", identifier)
def enum_field_specifier():
@@ -69,12 +71,17 @@ def message_filter():
return expression, EOF
MATCH_RESULT = typing.Union[bool, typing.Tuple]
class MatchResult(typing.NamedTuple):
result: bool
fields: typing.List[typing.Tuple]
def __bool__(self):
return self.result
class BaseFilterNode(abc.ABC):
@abc.abstractmethod
def match(self, msg) -> MATCH_RESULT:
def match(self, msg, short_circuit=True) -> MatchResult:
raise NotImplementedError()
@property
@@ -104,18 +111,36 @@ class BinaryFilterNode(BaseFilterNode, abc.ABC):
class UnaryNotFilterNode(UnaryFilterNode):
def match(self, msg) -> MATCH_RESULT:
return not self.node.match(msg)
def match(self, msg, short_circuit=True) -> MatchResult:
# Should we pass fields up here? Maybe not.
return MatchResult(not self.node.match(msg, short_circuit), [])
class OrFilterNode(BinaryFilterNode):
def match(self, msg) -> MATCH_RESULT:
return self.left_node.match(msg) or self.right_node.match(msg)
def match(self, msg, short_circuit=True) -> MatchResult:
left_match = self.left_node.match(msg, short_circuit)
if left_match and short_circuit:
return MatchResult(True, left_match.fields)
right_match = self.right_node.match(msg, short_circuit)
if right_match and short_circuit:
return MatchResult(True, right_match.fields)
if left_match or right_match:
# Fine since fields should be empty when result=False
return MatchResult(True, left_match.fields + right_match.fields)
return MatchResult(False, [])
class AndFilterNode(BinaryFilterNode):
def match(self, msg) -> MATCH_RESULT:
return self.left_node.match(msg) and self.right_node.match(msg)
def match(self, msg, short_circuit=True) -> MatchResult:
left_match = self.left_node.match(msg, short_circuit)
if not left_match:
return MatchResult(False, [])
right_match = self.right_node.match(msg, short_circuit)
if not right_match:
return MatchResult(False, [])
return MatchResult(True, left_match.fields + right_match.fields)
class MessageFilterNode(BaseFilterNode):
@@ -124,15 +149,15 @@ class MessageFilterNode(BaseFilterNode):
self.operator = operator
self.value = value
def match(self, msg) -> MATCH_RESULT:
return msg.matches(self)
def match(self, msg, short_circuit=True) -> MatchResult:
return msg.matches(self, short_circuit)
@property
def children(self):
return self.selector, self.operator, self.value
class MetaFieldSpecifier(str):
class MetaFieldSpecifier(tuple):
pass
@@ -158,7 +183,7 @@ class MessageFilterVisitor(PTNodeVisitor):
return LiteralValue(ast.literal_eval(node.value))
def visit_meta_field_specifier(self, _node, children):
return MetaFieldSpecifier(children[0])
return MetaFieldSpecifier(children)
def visit_enum_field_specifier(self, _node, children):
return EnumFieldSpecifier(*children)
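For context, a hypothetical round trip through the compiled filter (the filter string assumes the grammar above; entry stands in for any log entry):
from hippolyzer.lib.proxy.message_filter import compile_filter

flt = compile_filter("Meta.Synthetic == 1")
match = flt.match(entry, short_circuit=False)
if match:  # MatchResult is truthy iff .result is True
    print(match.fields)  # field keys to highlight; may be empty for Meta matches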

View File

@@ -1,8 +1,11 @@
from __future__ import annotations
import abc
import ast
import collections
import copy
import fnmatch
import gzip
import io
import logging
import pickle
@@ -13,16 +16,16 @@ import weakref
from defusedxml import minidom
from hippolyzer.lib.base import serialization as se, llsd
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.datatypes import TaggedUnion, UUID, TupleCoord
from hippolyzer.lib.base.helpers import bytes_escape
from hippolyzer.lib.base.message.message_formatting import HumanMessageSerializer
from hippolyzer.lib.proxy.message_filter import MetaFieldSpecifier, compile_filter, BaseFilterNode, MessageFilterNode, \
EnumFieldSpecifier
from hippolyzer.lib.proxy.region import CapType
EnumFieldSpecifier, MatchResult
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.caps import CapType, SerializedCapData
if typing.TYPE_CHECKING:
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
@@ -30,24 +33,42 @@ LOG = logging.getLogger(__name__)
class BaseMessageLogger:
paused: bool
def log_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
pass
if self.paused:
return False
return self.add_log_entry(LLUDPMessageLogEntry(message, region, session))
def log_http_response(self, flow: HippoHTTPFlow):
pass
if self.paused:
return False
# These are huge, let's not log them for now.
if flow.cap_data and flow.cap_data.asset_server_cap:
return False
return self.add_log_entry(HTTPMessageLogEntry(flow))
def log_eq_event(self, session: Session, region: ProxiedRegion, event: dict):
if self.paused:
return False
return self.add_log_entry(EQMessageLogEntry(event, region, session))
@abc.abstractmethod
def add_log_entry(self, entry: AbstractMessageLogEntry):
pass
class FilteringMessageLogger(BaseMessageLogger):
def __init__(self):
def __init__(self, maxlen=2000):
BaseMessageLogger.__init__(self)
self._raw_entries = collections.deque(maxlen=2000)
self._raw_entries = collections.deque(maxlen=maxlen)
self._filtered_entries: typing.List[AbstractMessageLogEntry] = []
self._paused = False
self.paused = False
self.filter: BaseFilterNode = compile_filter("")
def __iter__(self) -> typing.Iterator[AbstractMessageLogEntry]:
return iter(self._filtered_entries)
def set_filter(self, filter_str: str):
self.filter = compile_filter(filter_str)
self._begin_reset()
@@ -61,25 +82,7 @@ class FilteringMessageLogger(BaseMessageLogger):
self._end_reset()
def set_paused(self, paused: bool):
self._paused = paused
def log_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
if self._paused:
return
self._add_log_entry(LLUDPMessageLogEntry(message, region, session))
def log_http_response(self, flow: HippoHTTPFlow):
if self._paused:
return
# These are huge, let's not log them for now.
if flow.cap_data and flow.cap_data.asset_server_cap:
return
self._add_log_entry(HTTPMessageLogEntry(flow))
def log_eq_event(self, session: Session, region: ProxiedRegion, event: dict):
if self._paused:
return
self._add_log_entry(EQMessageLogEntry(event, region, session))
self.paused = paused
# Hooks that Qt models will want to implement
def _begin_insert(self, insert_idx: int):
@@ -94,25 +97,21 @@ class FilteringMessageLogger(BaseMessageLogger):
def _end_reset(self):
pass
def _add_log_entry(self, entry: AbstractMessageLogEntry):
def add_log_entry(self, entry: AbstractMessageLogEntry):
try:
# Paused, throw it away.
if self._paused:
return
if self.paused:
return False
self._raw_entries.append(entry)
if self.filter.match(entry):
next_idx = len(self._filtered_entries)
self._begin_insert(next_idx)
self._filtered_entries.append(entry)
self._end_insert()
entry.cache_summary()
# In the common case we don't need to keep around the serialization
# caches anymore. If the filter changes, the caches will be repopulated
# as necessary.
entry.freeze()
return True
except Exception:
LOG.exception("Failed to filter queued message")
return False
def clear(self):
self._begin_reset()
@@ -121,7 +120,27 @@ class FilteringMessageLogger(BaseMessageLogger):
self._end_reset()
class AbstractMessageLogEntry:
class WrappingMessageLogger(BaseMessageLogger):
def __init__(self):
self.loggers: typing.List[BaseMessageLogger] = []
@property
def paused(self):
return all(x.paused for x in self.loggers)
def add_log_entry(self, entry: AbstractMessageLogEntry):
logged = False
for logger in self.loggers:
if logger.add_log_entry(entry):
logged = True
# At least one logger ended up keeping the message around, so let's
# cache the summary before we freeze the message.
if logged:
entry.cache_summary()
entry.freeze()
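A sketch of wiring the wrapper up (the logger composition and the session/region objects here are hypothetical):
wrapper = WrappingMessageLogger()
wrapper.loggers.append(FilteringMessageLogger(maxlen=500))   # e.g. a GUI-backed model
wrapper.loggers.append(FilteringMessageLogger(maxlen=5000))  # e.g. a capture buffer
# Entries are only summarized and frozen if at least one logger kept them around.
wrapper.log_eq_event(session, region, {"message": "Example", "body": {}})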
class AbstractMessageLogEntry(abc.ABC):
region: typing.Optional[ProxiedRegion]
session: typing.Optional[Session]
name: str
@@ -129,7 +148,7 @@ class AbstractMessageLogEntry:
__slots__ = ["_region", "_session", "_region_name", "_agent_id", "_summary", "meta"]
def __init__(self, region, session):
def __init__(self, region: typing.Optional[ProxiedRegion], session: typing.Optional[Session]):
if region and not isinstance(region, weakref.ReferenceType):
region = weakref.ref(region)
if session and not isinstance(session, weakref.ReferenceType):
@@ -159,6 +178,45 @@ class AbstractMessageLogEntry:
"SelectedFull": self._current_selected_full(),
}
def to_dict(self) -> dict:
meta = self.meta.copy()
def _dehydrate_meta_uuid(key: str):
if meta[key]:
meta[key] = str(meta[key])
_dehydrate_meta_uuid("AgentID")
_dehydrate_meta_uuid("SelectedFull")
_dehydrate_meta_uuid("SessionID")
return {
"type": self.type,
"region_name": self.region_name,
"agent_id": str(self.agent_id) if self.agent_id is not None else None,
"summary": self.summary,
"meta": meta,
}
@classmethod
@abc.abstractmethod
def from_dict(cls, val: dict):
pass
def apply_dict(self, val: dict) -> None:
self._region_name = val['region_name']
self._agent_id = UUID(val['agent_id']) if val['agent_id'] else None
self._summary = val['summary']
meta = val['meta'].copy()
def _hydrate_meta_uuid(key: str):
if meta[key]:
meta[key] = UUID(meta[key])
_hydrate_meta_uuid("AgentID")
_hydrate_meta_uuid("SelectedFull")
_hydrate_meta_uuid("SessionID")
self.meta.update(meta)
def freeze(self):
pass
@@ -177,7 +235,7 @@ class AbstractMessageLogEntry:
obj = self.region.objects.lookup_localid(selected_local)
return obj and obj.FullID
def _get_meta(self, name: str):
def _get_meta(self, name: str) -> typing.Any:
# Slight difference in semantics. Filters are meant to return the same
# thing no matter when they're run, so SelectedLocal and friends resolve
# to the selected items _at the time the message was logged_. To handle
@@ -250,7 +308,9 @@ class AbstractMessageLogEntry:
def _val_matches(self, operator, val, expected):
if isinstance(expected, MetaFieldSpecifier):
expected = self._get_meta(str(expected))
if len(expected) != 1:
raise ValueError(f"Can only support single-level Meta specifiers, not {expected!r}")
expected = self._get_meta(str(expected[0]))
if not isinstance(expected, (int, float, bytes, str, type(None), tuple)):
if callable(expected):
expected = expected()
@@ -304,12 +364,18 @@ class AbstractMessageLogEntry:
if matcher.value or matcher.operator:
return False
return self._packet_root_matches(matcher.selector[0])
if len(matcher.selector) == 2 and matcher.selector[0] == "Meta":
return self._val_matches(matcher.operator, self._get_meta(matcher.selector[1]), matcher.value)
if matcher.selector[0] == "Meta":
if len(matcher.selector) == 2:
return self._val_matches(matcher.operator, self._get_meta(matcher.selector[1]), matcher.value)
elif len(matcher.selector) == 3:
meta_dict = self._get_meta(matcher.selector[1])
if not meta_dict or not hasattr(meta_dict, 'get'):
return False
return self._val_matches(matcher.operator, meta_dict.get(matcher.selector[2]), matcher.value)
return None
def matches(self, matcher: "MessageFilterNode"):
return self._base_matches(matcher) or False
def matches(self, matcher: "MessageFilterNode", short_circuit=True) -> "MatchResult":
return MatchResult(self._base_matches(matcher) or False, [])
@property
def seq(self):
@@ -330,6 +396,14 @@ class AbstractMessageLogEntry:
xmlified = re.sub(rb" <key>", b"<key>", xmlified)
return xmlified.decode("utf8", errors="replace")
@staticmethod
def _format_xml(content):
beautified = minidom.parseString(content).toprettyxml(indent=" ")
# Kill blank lines. Will break CDATA sections. Meh.
beautified = re.sub(r'\n\s*\n', '\n', beautified, flags=re.MULTILINE)
return re.sub(r'<(\w+)>\s*</\1>', r'<\1></\1>',
beautified, flags=re.MULTILINE)
class HTTPMessageLogEntry(AbstractMessageLogEntry):
__slots__ = ["flow"]
@@ -342,7 +416,7 @@ class HTTPMessageLogEntry(AbstractMessageLogEntry):
super().__init__(region, session)
# This was a request the proxy made through itself
self.meta["Injected"] = flow.request_injected
self.meta["Synthetic"] = flow.request_injected
@property
def type(self):
@@ -418,13 +492,17 @@ class HTTPMessageLogEntry(AbstractMessageLogEntry):
if not beautified:
content_type = self._guess_content_type(message)
if content_type.startswith("application/llsd"):
beautified = self._format_llsd(llsd.parse(message.content))
try:
beautified = self._format_llsd(llsd.parse(message.content))
except llsd.LLSDParseError:
# Sometimes LL sends plain XML with a Content-Type of application/llsd+xml.
# Try to detect that case and work around it
if content_type == "application/llsd+xml" and message.content.startswith(b'<'):
beautified = self._format_xml(message.content)
else:
raise
elif any(content_type.startswith(x) for x in ("application/xml", "text/xml")):
beautified = minidom.parseString(message.content).toprettyxml(indent=" ")
# kill blank lines. will break cdata sections. meh.
beautified = re.sub(r'\n\s*\n', '\n', beautified, flags=re.MULTILINE)
beautified = re.sub(r'<([\w]+)>\s*</\1>', r'<\1></\1>',
beautified, flags=re.MULTILINE)
beautified = self._format_xml(message.content)
except:
LOG.exception("Failed to beautify message")
@@ -444,7 +522,7 @@ class HTTPMessageLogEntry(AbstractMessageLogEntry):
buf.write(bytes(headers).decode("utf8", errors="replace"))
buf.write("\r\n")
buf.write(message_body)
buf.write(message_body or "")
return buf.getvalue()
def request(self, beautify=False, replacements=None):
@@ -471,6 +549,12 @@ class HTTPMessageLogEntry(AbstractMessageLogEntry):
return self._summary
def _guess_content_type(self, message):
# SL's login service lies and says that its XML-RPC response is LLSD+XML.
# It is not, and it blows up the parser. It's been broken ever since the
# login rewrite and a fix is likely not forthcoming. I'm sick of seeing
# the traceback, so just hack around it.
if self.name == "LoginRequest":
return "application/xml"
content_type = message.headers.get("Content-Type", "")
if not message.content or content_type.startswith("application/llsd"):
return content_type
@@ -483,6 +567,40 @@ class HTTPMessageLogEntry(AbstractMessageLogEntry):
return "application/xml"
return content_type
def _get_meta(self, name: str) -> typing.Any:
lower_name = name.lower()
if lower_name == "url":
return self.flow.request.url
elif lower_name == "reqheaders":
return self.flow.request.headers
elif lower_name == "respheaders":
return self.flow.response.headers
elif lower_name == "host":
return self.flow.request.host.lower()
elif lower_name == "status":
return self.flow.response.status_code
return super()._get_meta(name)
def to_dict(self):
val = super().to_dict()
val['flow'] = self.flow.get_state()
cap_data = val['flow'].get('metadata', {}).get('cap_data_ser')
if cap_data is not None:
# Have to convert this from a namedtuple to a dict to make
# it importable
cap_dict = cap_data._asdict() # noqa
val['flow']['metadata']['cap_data_ser'] = cap_dict
return val
@classmethod
def from_dict(cls, val: dict):
cap_data = val['flow'].get('metadata', {}).get('cap_data_ser')
if cap_data:
val['flow']['metadata']['cap_data_ser'] = SerializedCapData(**cap_data)
ev = cls(HippoHTTPFlow.from_state(val['flow'], None))
ev.apply_dict(val)
return ev
class EQMessageLogEntry(AbstractMessageLogEntry):
__slots__ = ["event"]
@@ -510,6 +628,17 @@ class EQMessageLogEntry(AbstractMessageLogEntry):
self._summary = llsd.format_notation(self.event["body"]).decode("utf8")[:500]
return self._summary
def to_dict(self) -> dict:
val = super().to_dict()
val['event'] = llsd.format_notation(self.event)
return val
@classmethod
def from_dict(cls, val: dict):
ev = cls(llsd.parse_notation(val['event']), None, None)
ev.apply_dict(val)
return ev
class LLUDPMessageLogEntry(AbstractMessageLogEntry):
__slots__ = ["_message", "_name", "_direction", "_frozen_message", "_seq", "_deserializer"]
@@ -524,7 +653,7 @@ class LLUDPMessageLogEntry(AbstractMessageLogEntry):
super().__init__(region, session)
_MESSAGE_META_ATTRS = {
"Injected", "Dropped", "Extra", "Resent", "Zerocoded", "Acks", "Reliable",
"Synthetic", "Dropped", "Extra", "Resent", "Zerocoded", "Acks", "Reliable",
}
def _get_meta(self, name: str):
@@ -582,20 +711,21 @@ class LLUDPMessageLogEntry(AbstractMessageLogEntry):
def request(self, beautify=False, replacements=None):
return HumanMessageSerializer.to_human_string(self.message, replacements, beautify)
def matches(self, matcher):
def matches(self, matcher, short_circuit=True) -> "MatchResult":
base_matched = self._base_matches(matcher)
if base_matched is not None:
return base_matched
return MatchResult(base_matched, [])
if not self._packet_root_matches(matcher.selector[0]):
return False
return MatchResult(False, [])
message = self.message
selector_len = len(matcher.selector)
# name, block_name, var_name(, subfield_name)?
if selector_len not in (3, 4):
return False
return MatchResult(False, [])
found_field_keys = []
for block_name in message.blocks:
if not fnmatch.fnmatchcase(block_name, matcher.selector[1]):
continue
@@ -604,13 +734,13 @@ class LLUDPMessageLogEntry(AbstractMessageLogEntry):
if not fnmatch.fnmatchcase(var_name, matcher.selector[2]):
continue
# So we know where the match happened
span_key = (message.name, block_name, block_num, var_name)
field_key = (message.name, block_name, block_num, var_name)
if selector_len == 3:
# We're just matching on the var existing, not having any particular value
if matcher.value is None:
return span_key
if self._val_matches(matcher.operator, block[var_name], matcher.value):
return span_key
found_field_keys.append(field_key)
elif self._val_matches(matcher.operator, block[var_name], matcher.value):
found_field_keys.append(field_key)
# Need to invoke a special unpacker
elif selector_len == 4:
try:
@@ -621,15 +751,21 @@ class LLUDPMessageLogEntry(AbstractMessageLogEntry):
if isinstance(deserialized, TaggedUnion):
deserialized = deserialized.value
if not isinstance(deserialized, dict):
return False
continue
for key in deserialized.keys():
if fnmatch.fnmatchcase(str(key), matcher.selector[3]):
if matcher.value is None:
return span_key
if self._val_matches(matcher.operator, deserialized[key], matcher.value):
return span_key
# Short-circuiting the per-subfield checks is fine since
# we only highlight whole fields anyway.
found_field_keys.append(field_key)
break
elif self._val_matches(matcher.operator, deserialized[key], matcher.value):
found_field_keys.append(field_key)
break
return False
if short_circuit and found_field_keys:
return MatchResult(True, found_field_keys)
return MatchResult(bool(found_field_keys), found_field_keys)
@property
def summary(self):
@@ -642,3 +778,30 @@ class LLUDPMessageLogEntry(AbstractMessageLogEntry):
if self._message:
self._seq = self._message.packet_id
return self._seq
def to_dict(self):
val = super().to_dict()
val['message'] = llsd.format_notation(self.message.to_dict(extended=True))
return val
@classmethod
def from_dict(cls, val: dict):
ev = cls(Message.from_dict(llsd.parse_notation(val['message'])), None, None)
ev.apply_dict(val)
return ev
def export_log_entries(entries: typing.Iterable[AbstractMessageLogEntry]) -> bytes:
return gzip.compress(repr([e.to_dict() for e in entries]).encode("utf8"))
_TYPE_CLASSES = {
"HTTP": HTTPMessageLogEntry,
"LLUDP": LLUDPMessageLogEntry,
"EQ": EQMessageLogEntry,
}
def import_log_entries(data: bytes) -> typing.List[AbstractMessageLogEntry]:
entries = ast.literal_eval(gzip.decompress(data).decode("utf8"))
return [_TYPE_CLASSES[e['type']].from_dict(e) for e in entries]
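Export is a simple round trip: repr() out, gzip, ast.literal_eval() back in, so importing a log never evaluates arbitrary code the way unpickling could. A usage sketch (file name made up):
entries = list(message_logger)  # e.g. a FilteringMessageLogger
with open("session.hippolog", "wb") as fh:
    fh.write(export_log_entries(entries))

with open("session.hippolog", "rb") as fh:
    restored = import_log_entries(fh.read())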

View File

@@ -32,6 +32,9 @@ class ProxyNameCache(NameCache):
with open(namecache_file, "rb") as f:
namecache_bytes = f.read()
agents = llsd.parse_xml(namecache_bytes)["agents"]
# Can be `None` if the file was just created
if not agents:
continue
for agent_id, agent_data in agents.items():
# Don't set display name if they just have the default
display_name = None

View File

@@ -11,7 +11,7 @@ from hippolyzer.lib.base.templates import PCode
from hippolyzer.lib.client.namecache import NameCache
from hippolyzer.lib.client.object_manager import (
ClientObjectManager,
UpdateType, ClientWorldObjectManager,
ObjectUpdateType, ClientWorldObjectManager,
)
from hippolyzer.lib.base.objects import Object
@@ -57,20 +57,31 @@ class ProxyObjectManager(ClientObjectManager):
LOG.warning(f"Tried to load cache for {self._region} without a handle")
return
self.cache_loaded = True
self.object_cache = RegionViewerObjectCacheChain.for_region(handle, self._region.cache_id)
self.object_cache = RegionViewerObjectCacheChain.for_region(
handle=handle,
cache_id=self._region.cache_id,
cache_dir=self._region.session().cache_dir,
)
def request_missed_cached_objects_soon(self):
def request_missed_cached_objects_soon(self, report_only=False):
if self._cache_miss_timer:
self._cache_miss_timer.cancel()
# Basically debounce. Will only trigger 0.2 seconds after the last time it's invoked to
# deal with the initial flood of ObjectUpdateCached and the natural lag time between that
# and the viewer's RequestMultipleObjects messages
self._cache_miss_timer = asyncio.get_event_loop().call_later(
0.2, self._request_missed_cached_objects)
loop = asyncio.get_event_loop_policy().get_event_loop()
self._cache_miss_timer = loop.call_later(0.2, self._request_missed_cached_objects, report_only)
def _request_missed_cached_objects(self):
def _request_missed_cached_objects(self, report_only: bool):
self._cache_miss_timer = None
self.request_objects(self.queued_cache_misses)
if not self.queued_cache_misses:
# All the queued cache misses ended up being satisfied without us
# having to request them, no need to fire off a request.
return
if report_only:
print(f"Would have automatically requested {self.queued_cache_misses!r}")
else:
self.request_objects(self.queued_cache_misses)
self.queued_cache_misses.clear()
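The cancel-and-reschedule above is a plain debounce; the same pattern standalone (names hypothetical, not Hippolyzer code):
import asyncio

class Debounced:
    """Call fn once triggers have been quiet for delay seconds."""
    def __init__(self, delay: float, fn):
        self._delay = delay
        self._fn = fn
        self._handle = None

    def trigger(self):
        if self._handle:
            self._handle.cancel()  # push the deadline back
        loop = asyncio.get_event_loop_policy().get_event_loop()
        self._handle = loop.call_later(self._delay, self._fn)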
def clear(self):
@@ -106,7 +117,12 @@ class ProxyWorldObjectManager(ClientWorldObjectManager):
)
def _handle_object_update_cached_misses(self, region_handle: int, missing_locals: Set[int]):
if self._settings.AUTOMATICALLY_REQUEST_MISSING_OBJECTS:
region_mgr: Optional[ProxyObjectManager] = self._get_region_manager(region_handle)
if not self._settings.ALLOW_AUTO_REQUEST_OBJECTS:
if self._settings.USE_VIEWER_OBJECT_CACHE:
region_mgr.queued_cache_misses |= missing_locals
region_mgr.request_missed_cached_objects_soon(report_only=True)
elif self._settings.AUTOMATICALLY_REQUEST_MISSING_OBJECTS:
# Schedule these local IDs to be requested soon if the viewer doesn't request
# them itself. Ideally we could just mutate the CRC of the ObjectUpdateCached
# to force a CRC cache miss in the viewer, but that appears to cause the viewer
@@ -117,17 +133,18 @@ class ProxyWorldObjectManager(ClientWorldObjectManager):
region_mgr.queued_cache_misses |= missing_locals
region_mgr.request_missed_cached_objects_soon()
def _run_object_update_hooks(self, obj: Object, updated_props: Set[str], update_type: UpdateType):
def _run_object_update_hooks(self, obj: Object, updated_props: Set[str], update_type: ObjectUpdateType):
super()._run_object_update_hooks(obj, updated_props, update_type)
region = self._session.region_by_handle(obj.RegionHandle)
if obj.PCode == PCode.AVATAR and "ParentID" in updated_props:
if obj.ParentID and not region.objects.lookup_localid(obj.ParentID):
# If an avatar just sat on an object we don't know about, add it to the queued
# cache misses and request if if the viewer doesn't. This should happen
# regardless of the auto-request object setting because otherwise we have no way
# to get a sitting agent's true region location, even if it's ourself.
region.objects.queued_cache_misses.add(obj.ParentID)
region.objects.request_missed_cached_objects_soon()
if self._settings.ALLOW_AUTO_REQUEST_OBJECTS:
if obj.PCode == PCode.AVATAR and "ParentID" in updated_props:
if obj.ParentID and not region.objects.lookup_localid(obj.ParentID):
# If an avatar just sat on an object we don't know about, add it to the queued
# cache misses and request it if the viewer doesn't. This should happen
# regardless of the auto-request missing objects setting because otherwise we
# have no way to get a sitting agent's true region location, even if it's ourselves.
region.objects.queued_cache_misses.add(obj.ParentID)
region.objects.request_missed_cached_objects_soon()
AddonManager.handle_object_updated(self._session, region, obj, updated_props)
def _run_kill_object_hooks(self, obj: Object):

View File

@@ -1,6 +1,5 @@
from __future__ import annotations
import enum
import logging
import hashlib
import uuid
@@ -12,28 +11,24 @@ import multidict
from hippolyzer.lib.base.datatypes import Vector3, UUID
from hippolyzer.lib.base.helpers import proxify
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.message.llsd_msg_serializer import LLSDMessageSerializer
from hippolyzer.lib.base.message.message import Message, Block
from hippolyzer.lib.base.message.message_handler import MessageHandler
from hippolyzer.lib.base.objects import handle_to_global_pos
from hippolyzer.lib.client.state import BaseClientRegion
from hippolyzer.lib.proxy.caps_client import ProxyCapsClient
from hippolyzer.lib.proxy.circuit import ProxiedCircuit
from hippolyzer.lib.proxy.caps import CapType
from hippolyzer.lib.proxy.object_manager import ProxyObjectManager
from hippolyzer.lib.base.transfer_manager import TransferManager
from hippolyzer.lib.base.xfer_manager import XferManager
from hippolyzer.lib.proxy.asset_uploader import ProxyAssetUploader
if TYPE_CHECKING:
from hippolyzer.lib.proxy.sessions import Session
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
class CapType(enum.Enum):
NORMAL = enum.auto()
TEMPORARY = enum.auto()
WRAPPER = enum.auto()
PROXY_ONLY = enum.auto()
class CapsMultiDict(multidict.MultiDict[Tuple[CapType, str]]):
# TODO: Make a view object for this that's just name -> URL
# deriving from MultiMapping[_T] so we don't have to do
@@ -58,10 +53,11 @@ class ProxiedRegion(BaseClientRegion):
self.cache_id: Optional[UUID] = None
self.circuit: Optional[ProxiedCircuit] = None
self.circuit_addr = circuit_addr
self._caps = CapsMultiDict()
self.caps = CapsMultiDict()
# Reverse lookup for URL -> cap data
self._caps_url_lookup: Dict[str, Tuple[CapType, str]] = {}
if seed_cap:
self._caps["Seed"] = (CapType.NORMAL, seed_cap)
self.caps["Seed"] = (CapType.NORMAL, seed_cap)
self.session: Callable[[], Session] = weakref.ref(session)
self.message_handler: MessageHandler[Message, str] = MessageHandler()
self.http_message_handler: MessageHandler[HippoHTTPFlow, str] = MessageHandler()
@@ -71,6 +67,7 @@ class ProxiedRegion(BaseClientRegion):
self.objects: ProxyObjectManager = ProxyObjectManager(self, may_use_vo_cache=True)
self.xfer_manager = XferManager(proxify(self), self.session().secure_session_id)
self.transfer_manager = TransferManager(proxify(self), session.agent_id, session.id)
self.asset_uploader = ProxyAssetUploader(proxify(self))
self._recalc_caps()
@property
@@ -84,8 +81,8 @@ class ProxiedRegion(BaseClientRegion):
self._name = val
@property
def caps(self):
return multidict.MultiDict((x, y[1]) for x, y in self._caps.items())
def cap_urls(self) -> multidict.MultiDict[str, str]:
return multidict.MultiDict((x, y[1]) for x, y in self.caps.items())
@property
def global_pos(self) -> Vector3:
@@ -102,12 +99,12 @@ class ProxiedRegion(BaseClientRegion):
def update_caps(self, caps: Mapping[str, str]):
for cap_name, cap_url in caps.items():
if isinstance(cap_url, str) and cap_url.startswith('http'):
self._caps.add(cap_name, (CapType.NORMAL, cap_url))
self.caps.add(cap_name, (CapType.NORMAL, cap_url))
self._recalc_caps()
def _recalc_caps(self):
self._caps_url_lookup.clear()
for name, cap_info in self._caps.items():
for name, cap_info in self.caps.items():
cap_type, cap_url = cap_info
self._caps_url_lookup[cap_url] = (cap_type, name)
@@ -116,32 +113,35 @@ class ProxiedRegion(BaseClientRegion):
Wrap an existing, non-unique cap with a unique URL
Caps like ViewerAsset may be the same globally, which wouldn't let us infer
which session / region the request was related to without a wrapper
which session / region the request was related to without a wrapper URL
that we inject into the seed response sent to the viewer.
"""
parsed = list(urllib.parse.urlsplit(self._caps[name][1]))
seed_id = self._caps["Seed"][1].split("/")[-1].encode("utf8")
parsed = list(urllib.parse.urlsplit(self.caps[name][1]))
seed_id = self.caps["Seed"][1].split("/")[-1].encode("utf8")
# Give it a unique domain tied to the current Seed URI
parsed[1] = f"{name.lower()}-{hashlib.sha256(seed_id).hexdigest()[:16]}.hippo-proxy.localhost"
# Force the URL to HTTP, we're going to handle the request ourselves so it doesn't need
# to be secure. This should save on expensive TLS context setup for each req.
parsed[0] = "http"
wrapper_url = urllib.parse.urlunsplit(parsed)
self._caps.add(name + "ProxyWrapper", (CapType.WRAPPER, wrapper_url))
self._recalc_caps()
# Register it with "ProxyWrapper" appended so we don't shadow the real cap URL
# in our own view of the caps
self.register_cap(name + "ProxyWrapper", wrapper_url, CapType.WRAPPER)
return wrapper_url
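A worked sketch of the hostname derivation above (the Seed URL is made up):
import hashlib

seed_url = "https://simhost.example:12043/cap/00000000-0000-0000-0000-000000000000"
seed_id = seed_url.split("/")[-1].encode("utf8")
host = f"viewerasset-{hashlib.sha256(seed_id).hexdigest()[:16]}.hippo-proxy.localhost"
# Stable per Seed, so a request hitting this host maps back to exactly one
# session/region without needing TLS or a unique path.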
def register_proxy_cap(self, name: str):
"""
Register a cap to be completely handled by the proxy
"""
cap_url = f"https://caps.hippo-proxy.localhost/cap/{uuid.uuid4()!s}"
self._caps.add(name, (CapType.PROXY_ONLY, cap_url))
self._recalc_caps()
"""Register a cap to be completely handled by the proxy"""
if name in self.caps:
# If we have an existing cap then we should just use that.
cap_data = self.caps[name]
# The tuple is (CapType, URL): check the type, return the URL
if cap_data[0] == CapType.PROXY_ONLY:
return cap_data[1]
cap_url = f"http://{uuid.uuid4()!s}.caps.hippo-proxy.localhost"
self.register_cap(name, cap_url, CapType.PROXY_ONLY)
return cap_url
def register_temporary_cap(self, name: str, cap_url: str):
"""Register a Cap that only has meaning the first time it's used"""
self._caps.add(name, (CapType.TEMPORARY, cap_url))
def register_cap(self, name: str, cap_url: str, cap_type: CapType = CapType.NORMAL):
self.caps.add(name, (cap_type, cap_url))
self._recalc_caps()
def resolve_cap(self, url: str, consume=True) -> Optional[Tuple[str, str, CapType]]:
@@ -150,9 +150,9 @@ class ProxiedRegion(BaseClientRegion):
cap_type, name = self._caps_url_lookup[cap_url]
if cap_type == CapType.TEMPORARY and consume:
# Resolving a temporary cap pops it out of the dict
temporary_caps = self._caps.popall(name)
temporary_caps = self.caps.popall(name)
temporary_caps.remove((cap_type, cap_url))
self._caps.extend((name, x) for x in temporary_caps)
self.caps.extend((name, x) for x in temporary_caps)
self._recalc_caps()
return name, cap_url, cap_type
return None
@@ -162,6 +162,7 @@ class ProxiedRegion(BaseClientRegion):
if self.circuit:
self.circuit.is_alive = False
self.objects.clear()
self.eq_manager.clear()
def __repr__(self):
return "<%s %s>" % (self.__class__.__name__, self.name)
@@ -172,11 +173,44 @@ class EventQueueManager:
# TODO: Per-EQ InjectionTracker so we can inject fake responses on 499
self._queued_events = []
self._region = weakref.proxy(region)
self._last_ack: Optional[int] = None
self._last_payload: Optional[Any] = None
self.llsd_message_serializer = LLSDMessageSerializer()
def queue_event(self, event: dict):
def inject_message(self, message: Message):
self.inject_event(self.llsd_message_serializer.serialize(message, True))
def inject_event(self, event: dict):
self._queued_events.append(event)
if self._region:
circuit: ProxiedCircuit = self._region.circuit
session: Session = self._region.session()
# Inject an outbound PlacesQuery message so we can trigger an inbound PlacesReply
# over the EQ. That will allow us to shove our own event onto the response once it comes in;
# otherwise we have to wait until the EQ legitimately returns 200 due to a new event.
# May or may not work in OpenSim.
circuit.send_message(Message(
'PlacesQuery',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id, QueryID=UUID()),
Block('TransactionData', TransactionID=UUID()),
Block('QueryData', QueryText=b'', QueryFlags=64, Category=-1, SimName=b''),
))
def take_events(self):
def take_injected_events(self):
events = self._queued_events
self._queued_events = []
return events
def cache_last_poll_response(self, req_ack: int, payload: Any):
self._last_ack = req_ack
self._last_payload = payload
def get_cached_poll_response(self, req_ack: Optional[int]) -> Optional[Any]:
if self._last_ack == req_ack:
return self._last_payload
return None
def clear(self):
self._queued_events.clear()
self._last_ack = None
self._last_payload = None
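A sketch of addon-side usage (region is a connected ProxiedRegion; the event name and body are made up):
# Ride along on the next poll response as a raw EQ event...
region.eq_manager.inject_event({"message": "ExampleEvent", "body": {"key": "value"}})
# ...or hand over a Message and let the LLSD message serializer shape it.
region.eq_manager.inject_message(Message("ExampleMessage"))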

View File

@@ -10,16 +10,19 @@ from typing import *
from weakref import ref
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.helpers import proxify
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.message.message_handler import MessageHandler
from hippolyzer.lib.client.state import BaseClientSession
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.circuit import ProxiedCircuit
from hippolyzer.lib.proxy.http_asset_repo import HTTPAssetRepo
from hippolyzer.lib.proxy.http_proxy import HTTPFlowContext, is_asset_server_cap_name, SerializedCapData
from hippolyzer.lib.proxy.http_proxy import HTTPFlowContext
from hippolyzer.lib.proxy.caps import is_asset_server_cap_name, CapData, CapType
from hippolyzer.lib.proxy.inventory_manager import ProxyInventoryManager
from hippolyzer.lib.proxy.namecache import ProxyNameCache
from hippolyzer.lib.proxy.object_manager import ProxyWorldObjectManager
from hippolyzer.lib.proxy.region import ProxiedRegion, CapType
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.settings import ProxySettings
if TYPE_CHECKING:
@@ -46,6 +49,9 @@ class Session(BaseClientSession):
self.message_handler: MessageHandler[Message, str] = MessageHandler()
self.http_message_handler: MessageHandler[HippoHTTPFlow, str] = MessageHandler()
self.objects = ProxyWorldObjectManager(self, session_manager.settings, session_manager.name_cache)
self.inventory = ProxyInventoryManager(proxify(self))
# Base path of a newview type cache directory for this session
self.cache_dir: Optional[str] = None
self._main_region = None
@property
@@ -96,12 +102,12 @@ class Session(BaseClientSession):
for region in self.regions:
if region.circuit_addr == circuit_addr:
if seed_url and region.caps.get("Seed") != seed_url:
if seed_url and region.cap_urls.get("Seed") != seed_url:
region.update_caps({"Seed": seed_url})
if handle:
region.handle = handle
return region
if seed_url and region.caps.get("Seed") == seed_url:
if seed_url and region.cap_urls.get("Seed") == seed_url:
return region
if not circuit_addr:
@@ -110,6 +116,7 @@ class Session(BaseClientSession):
logging.info("Registering region for %r" % (circuit_addr,))
region = ProxiedRegion(circuit_addr, seed_url, self, handle=handle)
self.regions.append(region)
AddonManager.handle_region_registered(self, region)
return region
def region_by_circuit_addr(self, circuit_addr) -> Optional[ProxiedRegion]:
@@ -211,50 +218,6 @@ class SessionManager:
return cap_data
return CapData()
def deserialize_cap_data(self, ser_cap_data: "SerializedCapData") -> "CapData":
cap_session = None
cap_region = None
if ser_cap_data.session_id:
for session in self.sessions:
if ser_cap_data.session_id == str(session.id):
cap_session = session
if cap_session and ser_cap_data.region_addr:
for region in cap_session.regions:
if ser_cap_data.region_addr == str(region.circuit_addr):
cap_region = region
return CapData(
cap_name=ser_cap_data.cap_name,
region=ref(cap_region) if cap_region else None,
session=ref(cap_session) if cap_session else None,
base_url=ser_cap_data.base_url,
type=CapType[ser_cap_data.type],
)
class CapData(NamedTuple):
cap_name: Optional[str] = None
# Actually they're weakrefs but the type sigs suck.
region: Optional[Callable[[], Optional[ProxiedRegion]]] = None
session: Optional[Callable[[], Optional[Session]]] = None
base_url: Optional[str] = None
type: CapType = CapType.NORMAL
def __bool__(self):
return bool(self.cap_name or self.session)
def serialize(self) -> "SerializedCapData":
return SerializedCapData(
cap_name=self.cap_name,
region_addr=str(self.region().circuit_addr) if self.region and self.region() else None,
session_id=str(self.session().id) if self.session and self.session() else None,
base_url=self.base_url,
type=self.type.name,
)
@property
def asset_server_cap(self) -> bool:
return is_asset_server_cap_name(self.cap_name)
@dataclasses.dataclass
class SelectionModel:

View File

@@ -28,6 +28,9 @@ class ProxySettings(Settings):
PROXY_BIND_ADDR: str = EnvSettingDescriptor("127.0.0.1", "HIPPO_BIND_HOST", str)
REMOTELY_ACCESSIBLE: bool = SettingDescriptor(False)
USE_VIEWER_OBJECT_CACHE: bool = SettingDescriptor(False)
# Whether the proxy is allowed to automatically request objects internally at all
ALLOW_AUTO_REQUEST_OBJECTS: bool = SettingDescriptor(True)
# Whether the proxy should request any directly referenced objects the viewer didn't know about.
AUTOMATICALLY_REQUEST_MISSING_OBJECTS: bool = SettingDescriptor(False)
ADDON_SCRIPTS: List[str] = SettingDescriptor(list)
FILTERS: Dict[str, str] = SettingDescriptor(dict)

View File

@@ -63,8 +63,14 @@ class TaskScheduler:
def shutdown(self):
for task_data, task in self.tasks:
task.cancel()
await_all = asyncio.gather(*(task for task_data, task in self.tasks))
asyncio.get_event_loop().run_until_complete(await_all)
try:
event_loop = asyncio.get_running_loop()
await_all = asyncio.gather(*(task for task_data, task in self.tasks))
event_loop.run_until_complete(await_all)
except RuntimeError:
pass
self.tasks.clear()
def _task_done(self, task: asyncio.Task):
for task_details in reversed(self.tasks):

View File

@@ -14,7 +14,7 @@ from hippolyzer.lib.proxy.transport import SOCKS5UDPTransport
class BaseProxyTest(unittest.IsolatedAsyncioTestCase):
def setUp(self) -> None:
async def asyncSetUp(self) -> None:
self.client_addr = ("127.0.0.1", 1)
self.region_addr = ("127.0.0.1", 3)
self.circuit_code = 1234
@@ -37,6 +37,9 @@ class BaseProxyTest(unittest.IsolatedAsyncioTestCase):
self.serializer = UDPMessageSerializer()
self.session.objects.track_region_objects(123)
def tearDown(self) -> None:
self.protocol.close()
async def _wait_drained(self):
await asyncio.sleep(0.001)

View File

@@ -58,6 +58,7 @@ from __future__ import annotations
import io
import logging
import pathlib
from pathlib import Path
from typing import *
@@ -82,6 +83,7 @@ class ViewerObjectCache:
@classmethod
def from_path(cls, base_path: Union[str, Path]):
base_path = pathlib.Path(base_path)
cache = cls(base_path)
with open(cache.base_path / "object.cache", "rb") as fh:
reader = se.BufferReader("<", fh.read())
@@ -143,6 +145,10 @@ class ViewerObjectCacheEntry(recordclass.datatuple): # type: ignore
data: bytes
def is_valid_vocache_dir(cache_dir):
return (pathlib.Path(cache_dir) / "objectcache" / "object.cache").exists()
class RegionViewerObjectCache:
"""Parser and container for .slc files"""
def __init__(self, cache_id: UUID, entries: List[ViewerObjectCacheEntry]):
@@ -201,7 +207,7 @@ class RegionViewerObjectCacheChain:
return None
@classmethod
def for_region(cls, handle: int, cache_id: UUID):
def for_region(cls, handle: int, cache_id: UUID, cache_dir: Optional[str] = None):
"""
Get a cache chain for a specific region, called on region connection
@@ -209,8 +215,13 @@ class RegionViewerObjectCacheChain:
so we have to try every region object cache file for every viewer installed.
"""
caches = []
for cache_dir in iter_viewer_cache_dirs():
if not (cache_dir / "objectcache" / "object.cache").exists():
if cache_dir is None:
cache_dirs = iter_viewer_cache_dirs()
else:
cache_dirs = [pathlib.Path(cache_dir)]
for cache_dir in cache_dirs:
if not is_valid_vocache_dir(cache_dir):
continue
cache = ViewerObjectCache.from_path(cache_dir / "objectcache")
if cache:

View File

@@ -0,0 +1,42 @@
import abc
import typing
from mitmproxy.addons import asgiapp
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session, SessionManager
async def serve(app, flow: HippoHTTPFlow):
"""Serve a request based on a Hippolyzer HTTP flow using a provided app"""
await asgiapp.serve(app, flow.flow)
# Send the modified flow object back to mitmproxy
flow.resume()
class WebAppCapAddon(BaseAddon, abc.ABC):
"""
Addon that provides a cap via an ASGI webapp
Handles all registration of the cap URL and routing of the request.
"""
CAP_NAME: str
APP: typing.Any
def handle_region_registered(self, session: Session, region: ProxiedRegion):
# Register a fake URL for our cap. This will add the cap URL to the Seed
# response that gets sent back to the client if that cap name was requested.
region.register_proxy_cap(self.CAP_NAME)
def handle_session_init(self, session: Session):
for region in session.regions:
region.register_proxy_cap(self.CAP_NAME)
def handle_http_request(self, session_manager: SessionManager, flow: HippoHTTPFlow):
if flow.cap_data.cap_name != self.CAP_NAME:
return
# This request may take a while to generate a response for; take it out of the normal
# HTTP handling flow and handle it in an async task.
# TODO: Make all HTTP handling hooks async so this isn't necessary
self._schedule_task(serve(self.APP, flow.take()))
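A hypothetical subclass serving a trivial ASGI app on a made-up cap name:
async def _hello_app(scope, receive, send):
    # Minimal ASGI app: answer every HTTP request with 200 "hello".
    assert scope["type"] == "http"
    await send({
        "type": "http.response.start",
        "status": 200,
        "headers": [(b"content-type", b"text/plain")],
    })
    await send({"type": "http.response.body", "body": b"hello"})

class HelloCapAddon(WebAppCapAddon):
    CAP_NAME = "HelloProxyCap"
    APP = _hello_app

addons = [HelloCapAddon()]  # the usual addon-script registration list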

View File

@@ -1,69 +1,68 @@
aiohttp==3.7.4.post0
aiohttp==3.8.1
aiosignal==1.2.0
appdirs==1.4.4
Arpeggio==1.10.2
asgiref==3.3.4
async-timeout==3.0.1
attrs==20.3.0
black==21.4b2
asgiref==3.4.1
async-timeout==4.0.1
attrs==21.2.0
blinker==1.4
Brotli==1.0.9
certifi==2020.12.5
cffi==1.14.5
chardet==4.0.0
click==7.1.2
cryptography==3.3.2
certifi==2021.10.8
cffi==1.15.0
charset-normalizer==2.0.9
click==8.0.3
cryptography==36.0.2
defusedxml==0.7.1
Flask==1.1.2
Glymur==0.9.3
Flask==2.0.2
frozenlist==1.2.0
Glymur==0.9.6
h11==0.12.0
h2==4.0.0
h2==4.1.0
hpack==4.0.0
hyperframe==6.0.1
idna==2.10
itsdangerous==1.1.0
jedi==0.18.0
Jinja2==2.11.3
itsdangerous==2.0.1
jedi==0.18.1
Jinja2==3.0.3
kaitaistruct==0.9
lazy-object-proxy==1.6.0
ldap3==2.8.1
llbase==1.2.10
lxml==4.6.3
MarkupSafe==1.1.1
mitmproxy==6.0.2
msgpack==1.0.2
multidict==5.1.0
mypy-extensions==0.4.3
numpy==1.20.2
parso==0.8.2
ldap3==2.9.1
llbase==1.2.11
lxml==4.6.4
MarkupSafe==2.0.1
mitmproxy==8.0.0
msgpack==1.0.3
multidict==5.2.0
numpy==1.21.4
parso==0.8.3
passlib==1.7.4
pathspec==0.8.1
prompt-toolkit==3.0.18
protobuf==3.14.0
ptpython==3.0.17
prompt-toolkit==3.0.23
protobuf==3.18.1
ptpython==3.0.20
publicsuffix2==2.20191221
pyasn1==0.4.8
pycparser==2.20
Pygments==2.8.1
pyOpenSSL==20.0.1
pycparser==2.21
pycollada==0.7.2
Pygments==2.10.0
pyOpenSSL==22.0.0
pyparsing==2.4.7
pyperclip==1.8.2
PySide2==5.15.2
qasync==0.15.0
PySide6==6.2.2
qasync==0.22.0
recordclass==0.14.3
regex==2021.4.4
requests==2.25.1
ruamel.yaml==0.16.13
ruamel.yaml.clib==0.2.2
shiboken2==5.15.2
six==1.15.0
sortedcontainers==2.3.0
toml==0.10.2
requests==2.26.0
ruamel.yaml==0.17.16
ruamel.yaml.clib==0.2.6
shiboken6==6.2.2
six==1.16.0
sortedcontainers==2.4.0
tornado==6.1
typing-extensions==3.7.4.3
urllib3==1.26.5
transformations==2021.6.6
typing-extensions==4.0.1
urllib3==1.26.7
urwid==2.1.2
wcwidth==0.2.5
Werkzeug==1.0.1
Werkzeug==2.0.2
wsproto==1.0.0
yarl==1.6.3
zstandard==0.14.1
yarl==1.7.2
zstandard==0.15.2

View File

@@ -25,7 +25,7 @@ from setuptools import setup, find_packages
here = path.abspath(path.dirname(__file__))
version = '0.6.2'
version = '0.12.0'
with open(path.join(here, 'README.md')) as readme_fh:
readme = readme_fh.read()
@@ -44,6 +44,7 @@ setup(
"Programming Language :: Python :: 3 :: Only",
"Programming Language :: Python :: 3.8",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: Implementation :: CPython",
"Topic :: System :: Networking :: Monitoring",
"Topic :: Software Development :: Libraries :: Python Modules",
@@ -66,6 +67,7 @@ setup(
'lib/base/data/static_data.db2',
'lib/base/data/static_index.db2',
'lib/base/data/avatar_lad.xml',
'lib/base/data/male_collada_joints.xml',
'lib/base/data/avatar_skeleton.xml',
'lib/base/data/LICENSE-artwork.txt',
],
@@ -82,21 +84,24 @@ setup(
'llbase>=1.2.5',
'defusedxml',
'aiohttp<4.0.0',
'recordclass',
'recordclass<0.15',
'lazy-object-proxy',
'arpeggio',
# requests breaks with newer idna
'idna<3,>=2.5',
# 7.x will be a major change.
'mitmproxy<7.0.0',
'mitmproxy>=8.0.0,<8.1',
# For REPLs
'ptpython<4.0',
# JP2 codec
'Glymur<1.0',
'Glymur<0.9.7',
'numpy<2.0',
# These could be in extras_require if you don't want a GUI.
'pyside2<6.0',
'pyside6',
'qasync',
# Needed for mesh format conversion tooling
'pycollada',
'transformations',
],
tests_require=[
"pytest",

View File

@@ -9,20 +9,21 @@ from cx_Freeze import setup, Executable
# We don't need any of these and they make the archive huge.
TO_DELETE = [
"lib/PySide2/Qt3DRender.pyd",
"lib/PySide2/Qt53DRender.dll",
"lib/PySide2/Qt5Charts.dll",
"lib/PySide2/Qt5Location.dll",
"lib/PySide2/Qt5Pdf.dll",
"lib/PySide2/Qt5Quick.dll",
"lib/PySide2/Qt5WebEngineCore.dll",
"lib/PySide2/QtCharts.pyd",
"lib/PySide2/QtMultimedia.pyd",
"lib/PySide2/QtOpenGLFunctions.pyd",
"lib/PySide2/QtOpenGLFunctions.pyi",
"lib/PySide2/d3dcompiler_47.dll",
"lib/PySide2/opengl32sw.dll",
"lib/PySide2/translations",
"lib/PySide6/Qt6DRender.pyd",
"lib/PySide6/Qt63DRender.dll",
"lib/PySide6/Qt6Charts.dll",
"lib/PySide6/Qt6Location.dll",
"lib/PySide6/Qt6Pdf.dll",
"lib/PySide6/Qt6Quick.dll",
"lib/PySide6/Qt6WebEngineCore.dll",
"lib/PySide6/QtCharts.pyd",
"lib/PySide6/QtMultimedia.pyd",
"lib/PySide6/QtOpenGLFunctions.pyd",
"lib/PySide6/QtOpenGLFunctions.pyi",
"lib/PySide6/d3dcompiler_47.dll",
"lib/PySide6/opengl32sw.dll",
"lib/PySide6/lupdate.exe",
"lib/PySide6/translations",
"lib/aiohttp/_find_header.c",
"lib/aiohttp/_frozenlist.c",
"lib/aiohttp/_helpers.c",
@@ -112,7 +113,7 @@ executables = [
setup(
name="hippolyzer_gui",
version="0.6.2",
version="0.9.0",
description="Hippolyzer GUI",
options=options,
executables=executables,

BIN static/repl_screenshot.png (new binary file, 42 KiB; contents not shown)

View File

@@ -50,4 +50,4 @@ class TestCapsClient(unittest.IsolatedAsyncioTestCase):
with self.assertRaises(KeyError):
with self.caps_client.get("BadCap"):
pass
assert False

View File

@@ -79,6 +79,20 @@ class TestDatatypes(unittest.TestCase):
quat = Quaternion(X=128.0, Y=128.0, Z=22.0)
self.assertEqual(quat, (128.0, 128.0, 22.0, 0.0))
def test_quaternion_euler_roundtrip(self):
orig_vec = Vector3(0.0, -1.0, 2.0)
quat = Quaternion.from_euler(*orig_vec)
for orig_comp, new_comp in zip(orig_vec, quat.to_euler()):
self.assertAlmostEqual(orig_comp, new_comp)
def test_quaternion_transformations(self):
quat = Quaternion(0.4034226801113349, -0.2590347239999257, 0.7384602626041288, 0.4741598817790379)
expected_trans = (0.4741598817790379, 0.4034226801113349, -0.2590347239999257, 0.7384602626041288)
trans_quat = quat.to_transformations()
self.assertSequenceEqual(expected_trans, trans_quat)
new_quat = Quaternion.from_transformations(trans_quat)
self.assertEqual(quat, new_quat)
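The conversion under test is just a component reorder between this library's (X, Y, Z, W) layout and the transformations package's (w, x, y, z) layout; as a standalone sketch (not the library code):
def to_transformations(q):    # q as (x, y, z, w)
    x, y, z, w = q
    return (w, x, y, z)

def from_transformations(q):  # q as (w, x, y, z)
    w, x, y, z = q
    return (x, y, z, w)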
def test_uuid_from_bytes(self):
tmp_uuid = uuid.UUID('2b7f7a6e-32c5-dbfd-e2c7-926d1a9f0aca')
tmp_uuid2 = uuid.UUID('1dd5efe2-faaf-1864-5ac9-bc61c5d8d7ea')
@@ -134,3 +148,18 @@ class TestDatatypes(unittest.TestCase):
val = llsd.parse_binary(llsd.format_binary(orig))
self.assertIsInstance(val, UUID)
self.assertEqual(orig, val)
def test_str_llsd_serialization(self):
self.assertEqual(b"'foo\\nbar'", llsd.format_notation("foo\nbar"))
def test_jank_stringy_bytes(self):
val = JankStringyBytes(b"foo\x00")
self.assertTrue("o" in val)
self.assertTrue(b"o" in val)
self.assertFalse(b"z" in val)
self.assertFalse("z" in val)
self.assertEqual("foo", val)
self.assertEqual(b"foo\x00", val)
self.assertNotEqual(b"foo", val)
self.assertEqual(b"foo", JankStringyBytes(b"foo"))
self.assertEqual("foo", JankStringyBytes(b"foo"))

View File

@@ -1,7 +1,8 @@
import copy
import unittest
from hippolyzer.lib.base.datatypes import *
from hippolyzer.lib.base.legacy_inv import InventoryModel
from hippolyzer.lib.base.inventory import InventoryModel
from hippolyzer.lib.base.wearables import Wearable, VISUAL_PARAMS
SIMPLE_INV = """\tinv_object\t0
@@ -44,22 +45,124 @@ SIMPLE_INV = """\tinv_object\t0
class TestLegacyInv(unittest.TestCase):
def setUp(self) -> None:
self.model = InventoryModel.from_str(SIMPLE_INV)
def test_parse(self):
model = InventoryModel.from_str(SIMPLE_INV)
self.assertTrue(UUID('f4d91477-def1-487a-b4f3-6fa201c17376') in model.containers)
self.assertIsNotNone(model.root)
self.assertTrue(UUID('f4d91477-def1-487a-b4f3-6fa201c17376') in self.model.nodes)
self.assertIsNotNone(self.model.root)
def test_serialize(self):
model = InventoryModel.from_str(SIMPLE_INV)
new_model = InventoryModel.from_str(model.to_str())
self.assertEqual(model, new_model)
self.model = InventoryModel.from_str(SIMPLE_INV)
new_model = InventoryModel.from_str(self.model.to_str())
self.assertEqual(self.model, new_model)
def test_item_access(self):
model = InventoryModel.from_str(SIMPLE_INV)
item = model.items[UUID('dd163122-946b-44df-99f6-a6030e2b9597')]
item = self.model.nodes[UUID('dd163122-946b-44df-99f6-a6030e2b9597')]
self.assertEqual(item.name, "New Script")
self.assertEqual(item.sale_info.sale_type, "not")
self.assertEqual(item.model, model)
self.assertEqual(item.model, self.model)
def test_access_children(self):
root = self.model.root
item = tuple(self.model.ordered_nodes)[1]
self.assertEqual((item,), root.children)
def test_access_parent(self):
root = self.model.root
item = tuple(self.model.ordered_nodes)[1]
self.assertEqual(root, item.parent)
self.assertEqual(None, root.parent)
def test_unlink(self):
self.assertEqual(1, len(self.model.root.children))
item = tuple(self.model.ordered_nodes)[1]
self.assertEqual([item], item.unlink())
self.assertEqual(0, len(self.model.root.children))
self.assertEqual(None, item.model)
def test_relink(self):
item = tuple(self.model.ordered_nodes)[1]
for unlinked in item.unlink():
self.model.add(unlinked)
self.assertEqual(self.model, item.model)
self.assertEqual(1, len(self.model.root.children))
def test_eq_excludes_model(self):
item = tuple(self.model.ordered_nodes)[1]
item_copy = copy.copy(item)
item_copy.model = None
self.assertEqual(item, item_copy)
def test_llsd_serialization(self):
self.assertEqual(
self.model.to_llsd(),
[
{
'name': 'Contents',
'obj_id': UUID('f4d91477-def1-487a-b4f3-6fa201c17376'),
'parent_id': UUID('00000000-0000-0000-0000-000000000000'),
'type': 'category'
},
{
'asset_id': UUID('00000000-0000-0000-0000-000000000000'),
'created_at': 1587367239,
'desc': '2020-04-20 04:20:39 lsl2 script',
'flags': b'\x00\x00\x00\x00',
'inv_type': 'script',
'item_id': UUID('dd163122-946b-44df-99f6-a6030e2b9597'),
'name': 'New Script',
'parent_id': UUID('f4d91477-def1-487a-b4f3-6fa201c17376'),
'permissions': {
'base_mask': 2147483647,
'creator_id': UUID('a2e76fcd-9360-4f6d-a924-000000000003'),
'everyone_mask': 0,
'group_id': UUID('00000000-0000-0000-0000-000000000000'),
'group_mask': 0,
'last_owner_id': UUID('a2e76fcd-9360-4f6d-a924-000000000003'),
'next_owner_mask': 581632,
'owner_id': UUID('a2e76fcd-9360-4f6d-a924-000000000003'),
'owner_mask': 2147483647,
'is_owner_group': 0,
},
'sale_info': {
'sale_price': 10,
'sale_type': 'not'
},
'type': 'lsltext'
}
]
)
def test_llsd_legacy_equality(self):
new_model = InventoryModel.from_llsd(self.model.to_llsd())
self.assertEqual(self.model, new_model)
new_model.root.name = "foo"
self.assertNotEqual(self.model, new_model)
def test_difference_added(self):
new_model = InventoryModel.from_llsd(self.model.to_llsd())
diff = self.model.get_differences(new_model)
self.assertEqual([], diff.changed)
self.assertEqual([], diff.removed)
new_model.root.name = "foo"
diff = self.model.get_differences(new_model)
self.assertEqual([new_model.root], diff.changed)
self.assertEqual([], diff.removed)
item = new_model.root.children[0]
item.unlink()
diff = self.model.get_differences(new_model)
self.assertEqual([new_model.root], diff.changed)
self.assertEqual([item], diff.removed)
new_item = copy.copy(item)
new_item.node_id = UUID.random()
new_model.add(new_item)
diff = self.model.get_differences(new_model)
self.assertEqual([new_model.root, new_item], diff.changed)
self.assertEqual([item], diff.removed)
GIRL_NEXT_DOOR_SHAPE = """LLWearable version 22

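The unlink/relink tests above outline InventoryModel's node-graph contract: nodes are addressable by UUID, unlink() detaches a node and returns everything it removed, and add() reattaches. A short usage sketch of that contract, reusing SIMPLE_INV's UUIDs from this file:

model = InventoryModel.from_str(SIMPLE_INV)
script = model.nodes[UUID('dd163122-946b-44df-99f6-a6030e2b9597')]

detached = script.unlink()       # returns the node plus any descendants
assert script.model is None      # unlinked nodes lose their model reference
assert not model.root.children

for node in detached:            # re-adding restores the parent/child links
    model.add(node)
assert model.root.children == (script,)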
View File

@@ -62,3 +62,8 @@ class TestMesh(unittest.TestCase):
mat_list = list(mesh.iter_lod_materials())
self.assertEqual(4, len(mat_list))
self.assertIsInstance(mat_list[0], dict)
def test_make_default_triangle(self):
tri = MeshAsset.make_triangle()
self.assertEqual(0.5, tri.segments['high_lod'][0]['Position'][2].X)
self.assertEqual(1, tri.header['version'])

View File

@@ -146,6 +146,12 @@ class TestMessage(unittest.TestCase):
new_msg = Message.from_dict(self.chat_msg.to_dict())
self.assertEqual(pickle.dumps(self.chat_msg), pickle.dumps(new_msg))
def test_todict_extended(self):
self.chat_msg.packet_id = 5
new_msg = Message.from_dict(self.chat_msg.to_dict(extended=True))
self.assertEqual(5, new_msg.packet_id)
self.assertEqual(pickle.dumps(self.chat_msg), pickle.dumps(new_msg))
def test_todict_multiple_blocks(self):
chat_msg = self.chat_msg
# If we dupe the ChatData block it should survive to_dict()
@@ -294,3 +300,14 @@ class HumanReadableMessageTests(unittest.TestCase):
with self.assertRaises(ValueError):
HumanMessageSerializer.from_human_string(val)
def test_flags(self):
val = """
OUT FooMessage [ZEROCODED] [RELIABLE] [1]
[SomeBlock]
foo = 1
"""
msg = HumanMessageSerializer.from_human_string(val)
self.assertEqual(HumanMessageSerializer.to_human_string(msg).strip(), val.strip())

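test_todict_extended shows the difference between the two serialization modes: to_dict() covers the wire-visible fields, while extended=True appears to also carry transport metadata such as packet_id. A tiny usage sketch along the lines of these tests (the ChatData block layout here is illustrative):

msg = Message("ChatFromViewer", Block("ChatData", Channel=0), packet_id=5)
restored = Message.from_dict(msg.to_dict(extended=True))
assert restored.packet_id == 5   # survives only in extended mode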
View File

@@ -791,7 +791,3 @@ class SubfieldSerializationTests(BaseSerializationTest):
self.assertEqual(ser.serialize(None, FooFlags.FOO), 1)
self.assertEqual(ser.serialize(None, 3), 3)
self.assertEqual(ser.serialize(None, 7), 7)
if __name__ == "__main__":
unittest.main()

View File

@@ -28,7 +28,8 @@ class MockHandlingCircuit(ProxiedCircuit):
self.handler = handler
def _send_prepared_message(self, message: Message, transport=None):
asyncio.get_event_loop().call_soon(self.handler.handle, message)
loop = asyncio.get_event_loop_policy().get_event_loop()
loop.call_soon(self.handler.handle, message)
class MockConnectionHolder(ConnectionHolder):
@@ -70,7 +71,7 @@ class XferManagerTests(BaseTransferTests):
manager = XferManager(self.server_connection)
xfer = await manager.request(vfile_id=asset_id, vfile_type=AssetType.BODYPART)
self.received_bytes = xfer.reassemble_chunks()
self.server_circuit.send_message(Message(
self.server_circuit.send(Message(
"AssetUploadComplete",
Block("AssetBlock", UUID=asset_id, Type=asset_block["Type"], Success=True),
direction=Direction.IN,
@@ -109,7 +110,7 @@ class TestTransferManager(BaseTransferTests):
self.assertEqual(EstateAssetType.COVENANT, params.EstateAssetType)
data = self.LARGE_PAYLOAD
self.server_circuit.send_message(Message(
self.server_circuit.send(Message(
'TransferInfo',
Block(
'TransferInfo',
@@ -125,7 +126,7 @@ class TestTransferManager(BaseTransferTests):
while True:
chunk = data[:1000]
data = data[1000:]
self.server_circuit.send_message(Message(
self.server_circuit.send(Message(
'TransferPacket',
Block(
'TransferData',

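The MockHandlingCircuit change above swaps a bare asyncio.get_event_loop() for a lookup through the event loop policy; most likely this sidesteps the DeprecationWarning newer Python versions emit when get_event_loop() is called with no loop running. The same pattern in isolation:

import asyncio

# Fetch (or lazily create) the current thread's loop without going through
# the deprecated no-running-loop path of asyncio.get_event_loop()
loop = asyncio.get_event_loop_policy().get_event_loop()
loop.call_soon(print, "queued callback")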
View File

@@ -33,10 +33,11 @@ class MockAddon(BaseAddon):
PARENT_ADDON_SOURCE = """
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.addon_utils import BaseAddon, GlobalProperty
class ParentAddon(BaseAddon):
baz = None
quux: int = GlobalProperty(0)
@classmethod
def foo(cls):
@@ -62,8 +63,8 @@ addons = [ChildAddon()]
class AddonIntegrationTests(BaseProxyTest):
def setUp(self) -> None:
super().setUp()
async def asyncSetUp(self) -> None:
await super().asyncSetUp()
self.addon = MockAddon()
AddonManager.init([], self.session_manager, [self.addon], swallow_addon_exceptions=False)
self.temp_dir = TemporaryDirectory(prefix="addon_test_sources")
@@ -136,3 +137,16 @@ class AddonIntegrationTests(BaseProxyTest):
AddonManager.unload_addon_from_path(str(self.parent_path), reload=True)
await asyncio.sleep(0.001)
self.assertNotIn('hippolyzer.user_addon_parent_addon', sys.modules)
async def test_global_property_access_and_set(self):
with open(self.parent_path, "w") as f:
f.write(PARENT_ADDON_SOURCE)
AddonManager.load_addon_from_path(str(self.parent_path), reload=True)
# Wait for the init hooks to run
await asyncio.sleep(0.001)
self.assertFalse("quux" in self.session_manager.addon_ctx)
parent_addon_mod = AddonManager.FRESH_ADDON_MODULES['hippolyzer.user_addon_parent_addon']
self.assertEqual(0, parent_addon_mod.ParentAddon.quux)
self.assertEqual(0, self.session_manager.addon_ctx["quux"])
parent_addon_mod.ParentAddon.quux = 1
self.assertEqual(1, self.session_manager.addon_ctx["quux"])

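The last test asserts something a plain descriptor cannot deliver: ParentAddon.quux = 1 is an assignment on the class itself, which silently replaces an ordinary descriptor rather than invoking its __set__ (that only fires for instance assignments). Intercepting class-level assignment takes a metaclass. A self-contained sketch of the pattern, with a plain dict standing in for session_manager.addon_ctx; hippolyzer's GlobalProperty may differ in detail:

ADDON_CTX: dict = {}  # stand-in for session_manager.addon_ctx

class _GlobalPropMeta(type):
    # Route class-level assignments through any global property on the class
    def __setattr__(cls, name, value):
        if isinstance(cls.__dict__.get(name), GlobalProp):
            ADDON_CTX[name] = value
        else:
            super().__setattr__(name, value)

class GlobalProp:
    def __init__(self, default):
        self.default = default

    def __set_name__(self, owner, name):
        self.name = name

    def __get__(self, obj, objtype=None):
        # Reads seed the shared context with the default, matching the
        # assertion order in the test above
        return ADDON_CTX.setdefault(self.name, self.default)

class Addon(metaclass=_GlobalPropMeta):
    quux: int = GlobalProp(0)

assert Addon.quux == 0 and ADDON_CTX["quux"] == 0
Addon.quux = 1
assert ADDON_CTX["quux"] == 1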
View File

@@ -6,9 +6,8 @@ import multiprocessing
from urllib.parse import urlparse
import aioresponses
from mitmproxy.net import http
from mitmproxy.test import tflow, tutils
from mitmproxy.http import HTTPFlow
from mitmproxy.http import HTTPFlow, Headers
from yarl import URL
from hippolyzer.apps.proxy import run_http_proxy_process
@@ -17,8 +16,7 @@ from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.http_event_manager import MITMProxyEventManager
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.http_proxy import SerializedCapData
from hippolyzer.lib.proxy.message_logger import FilteringMessageLogger
from hippolyzer.lib.proxy.caps import SerializedCapData
from hippolyzer.lib.proxy.sessions import SessionManager
from hippolyzer.lib.proxy.test_utils import BaseProxyTest
@@ -31,15 +29,9 @@ class MockAddon(BaseAddon):
flow.metadata["touched_addon"] = True
class SimpleMessageLogger(FilteringMessageLogger):
@property
def entries(self):
return self._filtered_entries
class HTTPIntegrationTests(BaseProxyTest):
def setUp(self) -> None:
super().setUp()
async def asyncSetUp(self) -> None:
await super().asyncSetUp()
self.addon = MockAddon()
AddonManager.init([], self.session_manager, [self.addon])
self.flow_context = self.session_manager.flow_context
@@ -88,7 +80,7 @@ class HTTPIntegrationTests(BaseProxyTest):
fake_flow = tflow.tflow(
req=tutils.treq(host="example.com", content=b'<llsd><string>getZOffsets|'),
resp=tutils.tresp(
headers=http.Headers((
headers=Headers((
(b"X-SecondLife-Object-Name", b"#Firestorm LSL Bridge v99999"),
(b"X-SecondLife-Owner-Key", str(self.session.agent_id).encode("utf8")),
)),
@@ -132,8 +124,8 @@ class HTTPIntegrationTests(BaseProxyTest):
class TestCapsClient(BaseProxyTest):
def setUp(self) -> None:
super().setUp()
async def asyncSetUp(self) -> None:
await super().asyncSetUp()
self._setup_default_circuit()
self.caps_client = self.session.main_region.caps_client
@@ -149,29 +141,30 @@ class TestCapsClient(BaseProxyTest):
class TestMITMProxy(BaseProxyTest):
def setUp(self) -> None:
super().setUp()
async def asyncSetUp(self) -> None:
await super().asyncSetUp()
self._setup_default_circuit()
self.caps_client = self.session.main_region.caps_client
def test_mitmproxy_works(self):
proxy_port = 9905
self.session_manager.settings.HTTP_PROXY_PORT = proxy_port
http_proc = multiprocessing.Process(
self.http_proc = multiprocessing.Process(
target=run_http_proxy_process,
args=("127.0.0.1", proxy_port, self.session_manager.flow_context),
daemon=True,
)
http_proc.start()
self.http_proc.start()
self.session_manager.flow_context.mitmproxy_ready.wait(1.0)
http_event_manager = MITMProxyEventManager(self.session_manager, self.session_manager.flow_context)
self.http_event_manager = MITMProxyEventManager(
self.session_manager,
self.session_manager.flow_context
)
def test_mitmproxy_works(self):
async def _request_example_com():
# Pump callbacks from mitmproxy
asyncio.create_task(http_event_manager.run())
asyncio.create_task(self.http_event_manager.run())
try:
async with self.caps_client.get("http://example.com/", timeout=0.5) as resp:
self.assertIn(b"Example Domain", await resp.read())
@@ -181,4 +174,4 @@ class TestMITMProxy(BaseProxyTest):
# Tell the event pump and mitmproxy they need to shut down
self.session_manager.flow_context.shutdown_signal.set()
asyncio.run(_request_example_com())
http_proc.join()
self.http_proc.join()

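The import shuffle at the top of this file tracks mitmproxy's reorganized public API: recent releases expose Headers from mitmproxy.http rather than mitmproxy.net.http. For reference, construction takes (name, value) byte pairs and lookups are case-insensitive:

from mitmproxy.http import Headers

headers = Headers(((b"X-SecondLife-Object-Name", b"Bridge"),))
assert headers["x-secondlife-object-name"] == "Bridge"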
View File

@@ -12,7 +12,6 @@ from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.message.udpdeserializer import UDPMessageDeserializer
from hippolyzer.lib.base.objects import Object
import hippolyzer.lib.base.serialization as se
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.message_logger import FilteringMessageLogger, LLUDPMessageLogEntry
@@ -48,8 +47,8 @@ class SimpleMessageLogger(FilteringMessageLogger):
class LLUDPIntegrationTests(BaseProxyTest):
def setUp(self) -> None:
super().setUp()
async def asyncSetUp(self) -> None:
await super().asyncSetUp()
self.addon = MockAddon()
self.deserializer = UDPMessageDeserializer()
AddonManager.init([], self.session_manager, [self.addon])
@@ -205,8 +204,8 @@ class LLUDPIntegrationTests(BaseProxyTest):
self.protocol.datagram_received(obj_update, self.region_addr)
await self._wait_drained()
entries = message_logger.entries
self.assertEqual(len(entries), 1)
self.assertEqual(entries[0].name, "ObjectUpdateCompressed")
self.assertEqual(1, len(entries))
self.assertEqual("ObjectUpdateCompressed", entries[0].name)
async def test_filtering_logged_messages(self):
message_logger = SimpleMessageLogger()
@@ -223,8 +222,8 @@ class LLUDPIntegrationTests(BaseProxyTest):
await self._wait_drained()
message_logger.set_filter("ObjectUpdateCompressed")
entries = message_logger.entries
self.assertEqual(len(entries), 1)
self.assertEqual(entries[0].name, "ObjectUpdateCompressed")
self.assertEqual(1, len(entries))
self.assertEqual("ObjectUpdateCompressed", entries[0].name)
async def test_logging_taken_message(self):
message_logger = SimpleMessageLogger()
@@ -262,11 +261,6 @@ class LLUDPIntegrationTests(BaseProxyTest):
# Don't have a serializer, onto the next field
continue
deser = serializer.deserialize(block, orig_val)
# For now we consider returning UNSERIALIZABLE to be acceptable.
# We should probably consider raising instead of returning that.
if deser is se.UNSERIALIZABLE:
continue
new_val = serializer.serialize(block, deser)
if orig_val != new_val:
raise AssertionError(f"{block.name}.{var_name} didn't reserialize correctly,"

View File

@@ -26,7 +26,13 @@ class ExampleCommandHandler:
y=str,
)
async def own_name(self, _session, _region, y):
self.bar = y
pass
@handle_command(
x=Parameter(str, optional=True),
)
async def optional(self, _session, _region, x=42):
self.bar = x
class TestCommandHandlers(unittest.IsolatedAsyncioTestCase):
@@ -47,9 +53,20 @@ class TestCommandHandlers(unittest.IsolatedAsyncioTestCase):
async def test_own_name(self):
self.assertEqual(self.handler.own_name.command.name, "own_name")
async def test_missing_param(self):
with self.assertRaises(KeyError):
await self.handler.foo(None, None, "")
async def test_optional_param(self):
await self.handler.optional(None, None, "foo") # type: ignore
self.assertEqual(self.handler.bar, "foo")
await self.handler.optional(None, None, "") # type: ignore
# Should have picked up the default value
self.assertEqual(self.handler.bar, 42)
async def test_bad_command(self):
with self.assertRaises(ValueError):
class _BadCommandHandler:
@handle_command("foobaz")
def bad_command(self, session, region):
pass
assert False

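Taken together, these tests document the handle_command contract: parameter specs are keyword-only (a positional spec like "foobaz" raises ValueError at class-definition time), a missing required parameter raises KeyError, and an empty value for an optional Parameter falls back to the handler's own default. A hypothetical handler written against those rules:

@handle_command(
    path=str,
    mode=Parameter(str, optional=True),
)
async def save_file(self, _session, _region, path, mode="wb"):
    # 'mode' falls back to "wb" whenever the caller omits it
    ...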
View File

@@ -1,13 +1,14 @@
from mitmproxy.test import tflow, tutils
from hippolyzer.lib.proxy.caps import CapType
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.message_logger import HTTPMessageLogEntry
from hippolyzer.lib.proxy.test_utils import BaseProxyTest
class TestHTTPFlows(BaseProxyTest):
def setUp(self) -> None:
super().setUp()
async def asyncSetUp(self) -> None:
await super().asyncSetUp()
self.region = self.session.register_region(
("127.0.0.1", 2),
"https://test.localhost:4/foo",
@@ -18,7 +19,7 @@ class TestHTTPFlows(BaseProxyTest):
"ViewerAsset": "http://assets.example.com",
})
def test_request_formatting(self):
async def test_request_formatting(self):
req = tutils.treq(host="example.com", port=80)
fake_flow = tflow.tflow(req=req, resp=tutils.tresp())
flow = HippoHTTPFlow.from_state(fake_flow.get_state(), self.session_manager)
@@ -32,7 +33,7 @@ content-length: 7\r
\r
content""")
def test_binary_request_formatting(self):
async def test_binary_request_formatting(self):
req = tutils.treq(host="example.com", port=80)
fake_flow = tflow.tflow(req=req, resp=tutils.tresp())
flow = HippoHTTPFlow.from_state(fake_flow.get_state(), self.session_manager)
@@ -46,7 +47,7 @@ X-Hippo-Escaped-Body: 1\r
\r
c\\x00ntent""")
def test_llsd_response_formatting(self):
async def test_llsd_response_formatting(self):
fake_flow = tflow.tflow(req=tutils.treq(), resp=tutils.tresp())
flow = HippoHTTPFlow.from_state(fake_flow.get_state(), self.session_manager)
# Half the time LLSD is sent with a random Content-Type and no PI indicating
@@ -63,7 +64,7 @@ content-length: 33\r
</llsd>
""")
def test_flow_state_serde(self):
async def test_flow_state_serde(self):
fake_flow = tflow.tflow(req=tutils.treq(host="example.com"), resp=tutils.tresp())
flow = HippoHTTPFlow.from_state(fake_flow.get_state(), self.session_manager)
# Make sure cap resolution works correctly
@@ -72,7 +73,7 @@ content-length: 33\r
new_flow = HippoHTTPFlow.from_state(flow_state, self.session_manager)
self.assertIs(self.session, new_flow.cap_data.session())
def test_http_asset_repo(self):
async def test_http_asset_repo(self):
asset_repo = self.session_manager.asset_repo
asset_id = asset_repo.create_asset(b"foobar", one_shot=True)
req = tutils.treq(host="assets.example.com", path=f"/?animatn_id={asset_id}")
@@ -83,9 +84,9 @@ content-length: 33\r
self.assertTrue(asset_repo.try_serve_asset(flow))
self.assertEqual(b"foobar", flow.response.content)
def test_temporary_cap_resolution(self):
self.region.register_temporary_cap("TempExample", "http://not.example.com")
self.region.register_temporary_cap("TempExample", "http://not2.example.com")
async def test_temporary_cap_resolution(self):
self.region.register_cap("TempExample", "http://not.example.com", CapType.TEMPORARY)
self.region.register_cap("TempExample", "http://not2.example.com", CapType.TEMPORARY)
# Resolving the cap should consume it
cap_data = self.session_manager.resolve_cap("http://not.example.com")
self.assertEqual(cap_data.cap_name, "TempExample")

View File

@@ -2,13 +2,14 @@ import unittest
from mitmproxy.test import tflow, tutils
from hippolyzer.lib.base.datatypes import Vector3
from hippolyzer.lib.base.datatypes import Vector3, UUID
from hippolyzer.lib.base.message.message import Block, Message as Message
from hippolyzer.lib.base.message.udpdeserializer import UDPMessageDeserializer
from hippolyzer.lib.base.settings import Settings
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.http_proxy import SerializedCapData
from hippolyzer.lib.proxy.message_logger import LLUDPMessageLogEntry, HTTPMessageLogEntry
from hippolyzer.lib.proxy.caps import SerializedCapData
from hippolyzer.lib.proxy.message_logger import LLUDPMessageLogEntry, HTTPMessageLogEntry, export_log_entries, \
import_log_entries
from hippolyzer.lib.proxy.message_filter import compile_filter
from hippolyzer.lib.proxy.sessions import SessionManager
from hippolyzer.lib.proxy.settings import ProxySettings
@@ -24,7 +25,7 @@ OBJECT_UPDATE = b'\xc0\x00\x00\x00Q\x00\x0c\x00\x01\xea\x03\x00\x02\xe6\x03\x00\
b'\x88\x00"'
class MessageFilterTests(unittest.TestCase):
class MessageFilterTests(unittest.IsolatedAsyncioTestCase):
def _filter_matches(self, filter_str, message):
compiled = compile_filter(filter_str)
return compiled.match(message)
@@ -118,7 +119,18 @@ class MessageFilterTests(unittest.TestCase):
self.assertTrue(self._filter_matches("ObjectUpdate.ObjectData.ObjectData.Position > (88, 41, 25)", entry))
self.assertTrue(self._filter_matches("ObjectUpdate.ObjectData.ObjectData.Position < (90, 43, 27)", entry))
def test_http_flow(self):
def test_import_export_message(self):
msg = LLUDPMessageLogEntry(Message(
"Foo",
Block("Bar", Baz=1, Quux=UUID.random(), Foo=0xFFffFFffFF)
), None, None)
msg.freeze()
msg = import_log_entries(export_log_entries([msg]))[0]
self.assertTrue(self._filter_matches("Foo.Bar.Baz == 1", msg))
# Make sure numbers outside the 32-bit range come through
self.assertTrue(self._filter_matches("Foo.Bar.Foo == 0xFFffFFffFF", msg))
async def test_http_flow(self):
session_manager = SessionManager(ProxySettings())
fake_flow = tflow.tflow(req=tutils.treq(), resp=tutils.tresp())
fake_flow.metadata["cap_data_ser"] = SerializedCapData(
@@ -129,6 +141,21 @@ class MessageFilterTests(unittest.TestCase):
self.assertTrue(self._filter_matches("FakeCap", entry))
self.assertFalse(self._filter_matches("NotFakeCap", entry))
async def test_http_header_filter(self):
session_manager = SessionManager(ProxySettings())
fake_flow = tflow.tflow(req=tutils.treq(), resp=tutils.tresp())
fake_flow.request.headers["Cookie"] = 'foo="bar"'
flow = HippoHTTPFlow.from_state(fake_flow.get_state(), session_manager)
entry = HTTPMessageLogEntry(flow)
# The header map is case-insensitive!
self.assertTrue(self._filter_matches('Meta.ReqHeaders.cookie ~= "foo"', entry))
self.assertFalse(self._filter_matches('Meta.ReqHeaders.foobar ~= "foo"', entry))
if __name__ == "__main__":
unittest.main()
async def test_export_import_http_flow(self):
fake_flow = tflow.tflow(req=tutils.treq(), resp=tutils.tresp())
fake_flow.metadata["cap_data_ser"] = SerializedCapData(
cap_name="FakeCap",
)
flow = HippoHTTPFlow.from_state(fake_flow.get_state(), None)
new_entry = import_log_entries(export_log_entries([HTTPMessageLogEntry(flow)]))[0]
self.assertEqual("FakeCap", new_entry.name)

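In passing, the new tests also pin down the filter language: fields are addressed as Message.Block.Field, integer literals may be hex and exceed 32 bits, vector fields appear to compare componentwise, and ~= appears to do substring-style matching against HTTP metadata. Every example below comes straight from assertions in this file:

compile_filter("Foo.Bar.Baz == 1")
compile_filter("Foo.Bar.Foo == 0xFFffFFffFF")
compile_filter("ObjectUpdate.ObjectData.ObjectData.Position > (88, 41, 25)")
compile_filter('Meta.ReqHeaders.cookie ~= "foo"')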
View File

@@ -17,19 +17,19 @@ class MockedProxyCircuit(ProxiedCircuit):
self.in_injections = InjectionTracker(0, maxlen=10)
def _send_prepared_message(self, msg: Message, transport=None):
self.sent_simple.append((msg.packet_id, msg.name, msg.direction, msg.injected, msg.acks))
self.sent_simple.append((msg.packet_id, msg.name, msg.direction, msg.synthetic, msg.acks))
self.sent_msgs.append(msg)
class PacketIDTests(unittest.TestCase):
class PacketIDTests(unittest.IsolatedAsyncioTestCase):
def setUp(self) -> None:
self.circuit = MockedProxyCircuit()
def _send_message(self, msg, outgoing=True):
msg.direction = Direction.OUT if outgoing else Direction.IN
return self.circuit.send_message(msg)
return self.circuit.send(msg)
def test_basic(self):
async def test_basic(self):
self._send_message(Message('ChatFromViewer', packet_id=1))
self._send_message(Message('ChatFromViewer', packet_id=2))
@@ -38,7 +38,7 @@ class PacketIDTests(unittest.TestCase):
(2, "ChatFromViewer", Direction.OUT, False, ()),
))
def test_inject(self):
async def test_inject(self):
self._send_message(Message('ChatFromViewer', packet_id=1))
self._send_message(Message('ChatFromViewer'))
self._send_message(Message('ChatFromViewer', packet_id=2))
@@ -49,7 +49,7 @@ class PacketIDTests(unittest.TestCase):
(3, "ChatFromViewer", Direction.OUT, False, ()),
))
def test_max_injected(self):
async def test_max_injected(self):
self._send_message(Message('ChatFromViewer', packet_id=1))
for _ in range(5):
self._send_message(Message('ChatFromViewer'))
@@ -74,7 +74,7 @@ class PacketIDTests(unittest.TestCase):
# Make sure we're still able to get the original ID
self.assertEqual(self.circuit.out_injections.get_original_id(15), 3)
def test_inject_hole_in_sequence(self):
async def test_inject_hole_in_sequence(self):
self._send_message(Message('ChatFromViewer', packet_id=1))
self._send_message(Message('ChatFromViewer'))
self._send_message(Message('ChatFromViewer', packet_id=4))
@@ -87,7 +87,7 @@ class PacketIDTests(unittest.TestCase):
(6, "ChatFromViewer", Direction.OUT, True, ()),
))
def test_inject_misordered(self):
async def test_inject_misordered(self):
self._send_message(Message('ChatFromViewer', packet_id=2))
self._send_message(Message('ChatFromViewer'))
self._send_message(Message('ChatFromViewer', packet_id=1))
@@ -98,7 +98,7 @@ class PacketIDTests(unittest.TestCase):
(1, "ChatFromViewer", Direction.OUT, False, ()),
])
def test_inject_multiple(self):
async def test_inject_multiple(self):
self._send_message(Message('ChatFromViewer', packet_id=1))
self._send_message(Message('ChatFromViewer'))
self._send_message(Message('ChatFromViewer'))
@@ -115,7 +115,7 @@ class PacketIDTests(unittest.TestCase):
(6, "ChatFromViewer", Direction.OUT, True, ()),
])
def test_packet_ack_field_converted(self):
async def test_packet_ack_field_converted(self):
self._send_message(Message('ChatFromViewer', packet_id=1))
self._send_message(Message('ChatFromViewer'))
self._send_message(Message('ChatFromViewer'))
@@ -139,7 +139,7 @@ class PacketIDTests(unittest.TestCase):
(6, "ChatFromViewer", Direction.OUT, True, ()),
])
def test_packet_ack_proxied_message_converted(self):
async def test_packet_ack_proxied_message_converted(self):
self._send_message(Message('ChatFromViewer', packet_id=1))
self._send_message(Message('ChatFromViewer'))
self._send_message(Message('ChatFromViewer'))
@@ -176,12 +176,9 @@ class PacketIDTests(unittest.TestCase):
self.assertEqual(self.circuit.sent_msgs[5]["Packets"][0]["ID"], 2)
def test_drop_proxied_message(self):
async def test_drop_proxied_message(self):
self._send_message(Message('ChatFromViewer', packet_id=1))
self.circuit.drop_message(
Message('ChatFromViewer', packet_id=2, flags=PacketFlags.RELIABLE),
Direction.OUT,
)
self.circuit.drop_message(Message('ChatFromViewer', packet_id=2, flags=PacketFlags.RELIABLE))
self._send_message(Message('ChatFromViewer', packet_id=3))
self.assertSequenceEqual(self.circuit.sent_simple, [
@@ -191,12 +188,9 @@ class PacketIDTests(unittest.TestCase):
])
self.assertEqual(self.circuit.sent_msgs[1]["Packets"][0]["ID"], 2)
def test_unreliable_proxied_message(self):
async def test_unreliable_proxied_message(self):
self._send_message(Message('ChatFromViewer', packet_id=1))
self.circuit.drop_message(
Message('ChatFromViewer', packet_id=2),
Direction.OUT,
)
self.circuit.drop_message(Message('ChatFromViewer', packet_id=2))
self._send_message(Message('ChatFromViewer', packet_id=3))
self.assertSequenceEqual(self.circuit.sent_simple, [
@@ -204,15 +198,12 @@ class PacketIDTests(unittest.TestCase):
(3, "ChatFromViewer", Direction.OUT, False, ()),
])
def test_dropped_proxied_message_acks_sent(self):
async def test_dropped_proxied_message_acks_sent(self):
self._send_message(Message('ChatFromViewer', packet_id=1))
self._send_message(Message('ChatFromViewer', packet_id=2))
self._send_message(Message('ChatFromViewer', packet_id=3))
self._send_message(Message('ChatFromSimulator'), outgoing=False)
self.circuit.drop_message(
Message('ChatFromViewer', packet_id=4, acks=(4,)),
Direction.OUT,
)
self.circuit.drop_message(Message('ChatFromViewer', packet_id=4, acks=(4,)))
self._send_message(Message('ChatFromViewer', packet_id=5))
self.assertSequenceEqual(self.circuit.sent_simple, [
@@ -229,21 +220,81 @@ class PacketIDTests(unittest.TestCase):
# We injected an incoming packet, so "4" is really "3"
self.assertEqual(self.circuit.sent_msgs[4]["Packets"][0]["ID"], 3)
def test_resending_or_dropping(self):
self.circuit.send_message(Message('ChatFromViewer', packet_id=1))
async def test_resending_or_dropping(self):
self.circuit.send(Message('ChatFromViewer', packet_id=1))
to_drop = Message('ChatFromViewer', packet_id=2, flags=PacketFlags.RELIABLE)
self.circuit.drop_message(to_drop)
with self.assertRaises(RuntimeError):
# Re-dropping the same message should raise
self.circuit.drop_message(to_drop)
# Clears finalized flag
to_drop.packet_id = None
self.circuit.send_message(to_drop)
# Returns a new message without finalized flag
new_msg = to_drop.take()
self.circuit.send(new_msg)
with self.assertRaises(RuntimeError):
self.circuit.send_message(to_drop)
self.circuit.send(new_msg)
self.assertSequenceEqual(self.circuit.sent_simple, [
(1, "ChatFromViewer", Direction.OUT, False, ()),
(1, "PacketAck", Direction.IN, True, ()),
# ended up getting the same packet ID when injected
(2, "ChatFromViewer", Direction.OUT, True, ()),
])
async def test_reliable_unacked_queueing(self):
self._send_message(Message('ChatFromViewer', flags=PacketFlags.RELIABLE))
self._send_message(Message('ChatFromViewer', flags=PacketFlags.RELIABLE, packet_id=2))
# Only the first, injected message should be queued for resends
self.assertEqual({(Direction.OUT, 1)}, set(self.circuit.unacked_reliable))
async def test_reliable_resend_cadence(self):
self._send_message(Message('ChatFromViewer', flags=PacketFlags.RELIABLE))
resend_info = self.circuit.unacked_reliable[(Direction.OUT, 1)]
self.circuit.resend_unacked()
# Should have been too soon to retry
self.assertEqual(10, resend_info.tries_left)
# Switch to allowing resends every 0s
self.circuit.resend_every = 0.0
self.circuit.resend_unacked()
self.assertSequenceEqual(self.circuit.sent_simple, [
(1, "ChatFromViewer", Direction.OUT, True, ()),
# Should have resent
(1, "ChatFromViewer", Direction.OUT, True, ()),
])
self.assertEqual(9, resend_info.tries_left)
for _ in range(resend_info.tries_left):
self.circuit.resend_unacked()
# Should have used up all the retry attempts and been kicked out of the retry queue
self.assertEqual(set(), set(self.circuit.unacked_reliable))
async def test_reliable_ack_collection(self):
msg = Message('ChatFromViewer', flags=PacketFlags.RELIABLE)
fut = self.circuit.send_reliable(msg)
self.assertEqual(1, len(self.circuit.unacked_reliable))
# Shouldn't count: this is an ACK going in the wrong direction!
ack_msg = Message("PacketAck", Block("Packets", ID=msg.packet_id))
self.circuit.collect_acks(ack_msg)
self.assertEqual(1, len(self.circuit.unacked_reliable))
self.assertFalse(fut.done())
# But it should count if the ACK message is heading in
ack_msg.direction = Direction.IN
self.circuit.collect_acks(ack_msg)
self.assertEqual(0, len(self.circuit.unacked_reliable))
self.assertTrue(fut.done())
async def test_start_ping_check(self):
# Should not break if no unacked
self._send_message(Message(
"StartPingCheck",
Block("PingID", PingID=0, OldestUnacked=20),
packet_id=5,
))
injected_msg = Message('ChatFromViewer', flags=PacketFlags.RELIABLE)
self._send_message(injected_msg)
self._send_message(Message(
"StartPingCheck",
Block("PingID", PingID=0, OldestUnacked=20),
packet_id=8,
))
# Oldest unacked should have been replaced with the injected packet's ID, since it's older!
self.assertEqual(self.circuit.sent_msgs[2]["PingID"]["OldestUnacked"], injected_msg.packet_id)

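The reliable-resend tests constrain the new bookkeeping fairly tightly: injected RELIABLE packets are keyed by (direction, packet_id) in unacked_reliable, each entry starts with ten tries, resend_unacked() skips entries younger than resend_every, and an entry that exhausts its tries is dropped from the queue. A sketch consistent with those constraints (hypothetical helper, not the proxy's actual code):

import dataclasses
import time

@dataclasses.dataclass
class ResendInfo:
    message: "Message"
    tries_left: int = 10
    last_sent: float = dataclasses.field(default_factory=time.monotonic)

def resend_unacked(circuit) -> None:
    now = time.monotonic()
    for key, info in list(circuit.unacked_reliable.items()):
        if now - info.last_sent < circuit.resend_every:
            continue  # too soon to retry this one
        info.tries_left -= 1
        info.last_sent = now
        circuit.send(info.message.take())  # resend an unfinalized copy
        if info.tries_left <= 0:
            # Out of retries: kick it out of the queue
            del circuit.unacked_reliable[key]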
Some files were not shown because too many files have changed in this diff.