196 Commits
v0.3 ... v0.7.0

Author SHA1 Message Date
Salad Dais
c6e0a400a9 v0.7.0 2021-08-10 01:16:20 +00:00
Salad Dais
d01122d542 Call correct method to raise new message log window 2021-08-10 01:11:21 +00:00
Salad Dais
690d6b51b8 Upgrade to mitmproxy 7.0.2
Our fix for `Flow.set_state()` has been upstreamed
2021-08-09 22:16:23 +00:00
Salad Dais
2437a8b14f Add a framework for simple local anim creation, tail animator 2021-08-05 21:08:18 +00:00
Salad Dais
afa601fffe Support session-specific viewer cache directories 2021-08-02 18:23:13 +00:00
Salad Dais
874feff471 Fix incorrect reference to mitmproxy class 2021-08-01 12:16:10 +00:00
Salad Dais
05c53bba9f Add CapsClient to BaseClientSession 2021-08-01 06:39:04 +00:00
Salad Dais
578f1d8c4e Add setting to disable all proxy object autorequests
Will help with #18 by not changing object request behaviour when
running through the proxy.
2021-08-01 06:37:33 +00:00
Salad Dais
7d8e18440a Add local anim mangler support with example
Analogous to local mesh mangler support.
2021-07-31 11:56:17 +00:00
Salad Dais
66e112dd52 Add basic message log import / export feature
Closes #20
2021-07-30 03:13:33 +00:00
Salad Dais
02ac022ab3 Add export formats for message log entries 2021-07-30 01:06:29 +00:00
Salad Dais
33ce74754e Fix mirror_target_agent check in http hooks 2021-07-30 01:06:29 +00:00
Salad Dais
74dd6b977c Add extended to_dict() format for Message class
This will allow proper import / export of message logs.
2021-07-29 10:26:42 +00:00
Salad Dais
387652731a Add Message Mirror example addon 2021-07-29 09:43:20 +00:00
Salad Dais
e4601fd879 Support multiple Message Log windows
Closes #19
2021-07-29 01:00:57 +00:00
Salad Dais
6eb25f96d9 Support logging to a hierarchy of message loggers
Necessary to eventually support multiple message log windows
2021-07-27 02:35:03 +00:00
Salad Dais
22b9eeb5cb Better handling of optional command parameters 2021-07-22 23:59:55 +00:00
Salad Dais
0dbedcb2f5 Improve coverage 2021-07-22 23:58:17 +00:00
Salad Dais
7d9712c16e Fix message dropping and queueing corner cases 2021-07-22 05:08:47 +00:00
Salad Dais
82663c0fc2 Add parse_bool helper function for command parameters 2021-07-21 06:39:29 +00:00
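A command-parameter boolean parser like the one this commit adds might be sketched as follows. This is an illustrative sketch only; the accepted spellings and error behaviour of Hippolyzer's actual `parse_bool` helper may differ.

```python
def parse_bool(val: str) -> bool:
    """Parse a human-entered command parameter as a boolean.

    Illustrative sketch only -- the spellings accepted by the real
    helper in Hippolyzer may not match these.
    """
    lowered = val.strip().lower()
    if lowered in ("1", "true", "yes", "on"):
        return True
    if lowered in ("0", "false", "no", "off"):
        return False
    raise ValueError(f"can't interpret {val!r} as a boolean")
```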
Salad Dais
9fb4884470 Extend TlsLayer.tls_start_server instead of monkeypatching OpenSSL funcs
We have a more elegant way of unsetting `X509_CHECK_FLAG_NEVER_CHECK_SUBJECT`
now that mitmproxy 7.0 is out.

See https://github.com/mitmproxy/mitmproxy/pull/4688
2021-07-19 20:17:31 +00:00
Salad Dais
cf69c42f67 Rework HTTP proxying code to work with mitmproxy 7.0.0 2021-07-18 07:02:45 +00:00
Salad Dais
be658b9026 v0.6.3
Cutting a release before working on mitmproxy upgrade
2021-07-18 06:57:40 +00:00
Salad Dais
c505941595 Improve test for TE serialization 2021-07-18 06:33:55 +00:00
Salad Dais
96f471d6b7 Add initial support for Message-specific Block subclasses 2021-07-07 12:49:32 +00:00
Salad Dais
4238016767 Change readme wording
:)
2021-07-07 12:49:32 +00:00
Salad Dais
a35a67718d Add default_value to MessageTemplateVariable 2021-07-01 21:25:51 +00:00
Salad Dais
c2981b107a Remove CodeQL scanning
Maybe later, doesn't seem to do anything useful out of the box.
2021-06-28 06:00:42 -03:00
Salad Dais
851375499a Add CodeQL scanning 2021-06-28 05:44:02 -03:00
Salad Dais
d064ecd466 Don't raise when reading a new avatar_name_cache.xml 2021-06-25 18:45:42 +00:00
Salad Dais
fda37656c9 Reduce boilerplate for mesh mangling addons
Makes it less annoying to compose separate addons with different manglers
2021-06-24 05:29:23 +00:00
Salad Dais
49a9c6f28f Workaround for failed teleports due to EventQueue timeouts
Closes #16
2021-06-23 16:43:09 +00:00
Salad Dais
050ac5e3a9 v0.6.2 2021-06-19 03:06:39 +00:00
Salad Dais
fe0d3132e4 Update shield addon 2021-06-18 20:49:31 +00:00
Salad Dais
d7f18e05be Fix typo 2021-06-18 20:49:20 +00:00
Salad Dais
9bf4240411 Allow tagging UDPPackets with arbitrary metadata
The metadata should propagate to any Messages deserialized
from the packet as well.
2021-06-18 20:31:15 +00:00
Salad Dais
76df9a0424 Streamline template dictionary use 2021-06-17 21:28:22 +00:00
Salad Dais
a91bc67a43 v0.6.1 2021-06-16 14:27:26 +00:00
Salad Dais
48180b85d1 Export proxy test utils for use in addon test suites 2021-06-15 18:48:05 +00:00
Salad Dais
77d3bf2fe1 Make ObjectCacheChain handle invalid caches properly 2021-06-14 14:17:21 +00:00
Salad Dais
d8ec9ee77a Add hooks to allow swapping out transports 2021-06-14 13:48:30 +00:00
Salad Dais
0b46b95f81 Minor API changes 2021-06-14 13:33:17 +00:00
Salad Dais
73e66c56e5 Clarify addon state management example addon 2021-06-13 12:06:04 +00:00
Salad Dais
fd2a4d8dce Remove incorrect comment from JPEG2000 test 2021-06-13 10:23:18 +00:00
Salad Dais
2209ebdd0c Add unit tests for JPEG2000 utils 2021-06-13 10:20:18 +00:00
Salad Dais
ccfb641cc2 Add pixel artist example addon 2021-06-12 15:44:26 +00:00
Salad Dais
220d8ddf65 Add confirmation helper for InteractionManager API 2021-06-12 15:15:34 +00:00
Salad Dais
235bc8e09e Change TextureEntry type signatures to play nicer with type checker 2021-06-12 15:15:03 +00:00
Salad Dais
41fd67577a Add ability to wait on object-related events 2021-06-12 10:43:16 +00:00
Salad Dais
8347b341f5 Give default values for TextureEntry fields 2021-06-12 10:26:52 +00:00
Salad Dais
9d5599939e Add MCode enum definition 2021-06-12 08:54:34 +00:00
Salad Dais
1fd6decf91 Add integration tests for addon (un)loading 2021-06-11 19:44:53 +00:00
Salad Dais
4ddc6aa852 Remove unloaded addon scripts from sys.modules 2021-06-11 19:44:35 +00:00
Salad Dais
ab89f6bc14 Add integration test for asset server wrapper cap 2021-06-11 17:53:55 +00:00
Salad Dais
cb8c1cfe91 Only generate lowercase hostnames in register_wrapper_cap()
Hostnames are case insensitive and passing a URL through urlparse()
will always give you a lowercase domain name.
2021-06-11 17:52:03 +00:00
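The lowercasing behaviour this commit relies on is easy to verify: `urlparse()` normalizes the `hostname` attribute to lowercase, while `netloc` preserves the case as written. The URL below is illustrative, not one Hippolyzer actually generates.

```python
from urllib.parse import urlparse

# .hostname is normalized to lowercase; .netloc keeps the original case.
# This is why generated wrapper-cap hostnames should be lowercase to
# begin with -- otherwise lookups keyed on the parsed hostname miss.
parsed = urlparse("http://Asset-Server.Example.INVALID:8043/cap")
print(parsed.hostname)  # 'asset-server.example.invalid'
print(parsed.netloc)    # 'Asset-Server.Example.INVALID:8043'
```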
Salad Dais
52679bf708 HTTPAssetRepo: Don't throw when trying to serve invalid UUID 2021-06-11 17:51:45 +00:00
Salad Dais
a21c0439e9 Test for mitmproxy handling HTTPS requests as well 2021-06-10 23:32:38 +00:00
Salad Dais
216ffb3777 Add integration test for mitmproxy interception 2021-06-10 23:22:59 +00:00
Salad Dais
d4c30d998d Allow handling Firestorm Bridge responses, use to guess avatar Z pos 2021-06-09 02:02:09 +00:00
Salad Dais
003f37c3d3 Auto-request unknown objects when an avatar sits on them
We need to know about an avatar's parent to get their exact position
due to the Object.Position field always being relative to the parent.
2021-06-08 23:44:08 +00:00
Salad Dais
d64a07c04c Better guard to prevent accidental lazy serializable hydration 2021-06-08 18:57:57 +00:00
Salad Dais
82b156813b Add more name accessors to Avatar class 2021-06-08 18:57:24 +00:00
Salad Dais
b71da8f5a4 Add option to automatically request missing cached objects 2021-06-08 18:41:44 +00:00
Salad Dais
5618bcbac1 Add new persistent (Proxy)Settings object, use to pass down settings 2021-06-08 16:55:19 +00:00
Salad Dais
24abc36df2 Correct AgentState enum definition 2021-06-07 12:56:39 +00:00
Salad Dais
9ceea8324a Fix templates.py reloading by importing importlib 2021-06-07 12:56:21 +00:00
Salad Dais
29653c350f Bundle addon examples with Windows build 2021-06-07 11:40:45 +00:00
Salad Dais
b03ef1c36b v0.6.0 2021-06-07 08:24:10 +00:00
Salad Dais
a2d5414691 Add more ObjectManager tests 2021-06-07 08:10:28 +00:00
Salad Dais
135ce06452 Rewrite ObjectManager to have WorldObjectManager own objects
This simplifies a lot of the interdependencies between the
WorldObjectManagers and region ObjectManagers.
2021-06-07 05:31:54 +00:00
Salad Dais
12862fcd02 Keep Avatar wrappers around rather than regenerating them when queried
Allows callers to keep around a reference to an Avatar object and get
updated position and validity information without having to poll the
ObjectManager itself.
2021-06-05 14:23:49 +00:00
Salad Dais
9ab5c8a907 Update VFS impl type hints 2021-06-05 14:20:26 +00:00
Salad Dais
9652261b67 Increase timeouts in transfer tests to reduce flakiness 2021-06-04 09:44:17 +00:00
Salad Dais
3887e0a23c Add note about VOCache 2021-06-04 09:31:54 +00:00
Salad Dais
84733731fe Add distinct tests for CapsClient and ProxyCapsClient 2021-06-04 09:31:54 +00:00
Salad Dais
49f7ba960f Move tons more things to lib.base and lib.client
Put an abstract session and region implementation in client so things
that could be logically shared between client/proxy can be.

ObjectManager moved to client with proxy-specific details in
ProxyObjectManager.
2021-06-04 09:31:54 +00:00
Salad Dais
f2ee6f789f Correct region handle change comments in ObjectManager 2021-06-03 20:51:38 +00:00
Salad Dais
9df0224fbf Split CapsClient into proxy and non-proxy version 2021-06-03 08:02:11 +00:00
Salad Dais
59493e021c Move XferManager and TransferManager to base 2021-06-03 07:04:06 +00:00
Salad Dais
7b98c0b261 Split out human str formatting for Messages 2021-06-03 07:03:54 +00:00
Salad Dais
a39d025a04 Move Circuit and Message to lib.base
Fairly invasive, but will help make lib.base useful again. No
more Message / ProxiedMessage split!
2021-06-03 07:00:32 +00:00
Salad Dais
908d7a24f1 Add test for TransferManager 2021-06-02 21:08:27 +00:00
Salad Dais
0bf1e84da4 Make XferManager tests exercise both upload and download paths 2021-06-02 20:02:31 +00:00
Salad Dais
3d8da0af65 Remove TransferManager dependency on ProxiedRegion 2021-06-02 20:01:47 +00:00
Salad Dais
abf730cea5 serializer -> serialize 2021-06-02 12:07:58 +00:00
Salad Dais
0a45cd3739 Remove XferManager dependency on ProxiedRegion 2021-06-02 11:44:06 +00:00
Salad Dais
af17525071 Remove Circuit dependency on parent Region 2021-06-02 11:44:06 +00:00
dependabot[bot]
592ac4bec6 Bump urllib3 from 1.26.4 to 1.26.5 (#13)
Bumps [urllib3](https://github.com/urllib3/urllib3) from 1.26.4 to 1.26.5.
- [Release notes](https://github.com/urllib3/urllib3/releases)
- [Changelog](https://github.com/urllib3/urllib3/blob/main/CHANGES.rst)
- [Commits](https://github.com/urllib3/urllib3/compare/1.26.4...1.26.5)

---
updated-dependencies:
- dependency-name: urllib3
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-06-02 03:56:01 -03:00
Salad Dais
960c8aa905 Add test for large xfer payload case 2021-06-02 05:38:55 +00:00
Salad Dais
c1d795e850 Add XferManager tests 2021-06-02 02:48:13 +00:00
Salad Dais
984ac257a5 Rename ensure_ancestors_loaded(), add timeout tests 2021-06-01 22:51:57 +00:00
Salad Dais
9b970f07e5 Add session-level lookup_avatar 2021-06-01 22:35:49 +00:00
Salad Dais
d6a6fb4a91 Add ability to send EventQueue messages to message builder 2021-06-01 21:08:22 +00:00
Salad Dais
fd747c9615 Switch to importing hippolyzer.lib.base.templates
Should help deal with automatic template reloading issues since
mtime wasn't changing on `lib.proxy.templates`.
2021-06-01 08:24:17 +00:00
Salad Dais
69dd1ca9ce Unpack fixed point coords in particles as vectors 2021-06-01 01:39:42 +00:00
Salad Dais
2c914b43b0 Add better type hints to Object 2021-06-01 01:39:14 +00:00
Salad Dais
0d18bc1daa Test for roundtripping of ObjectUpdateCompressed's serializers 2021-05-31 13:53:22 +00:00
Salad Dais
626e59f22c Fix TextureEntry serialization 2021-05-31 13:33:16 +00:00
Salad Dais
8c614404d8 Improve NameCache implementation, share viewer name cache 2021-05-31 12:24:08 +00:00
Salad Dais
98df182110 Add common exclusions to coveragerc 2021-05-31 10:33:28 +00:00
Salad Dais
c856b5e7fc Add test for post-filtering logged messages 2021-05-31 10:25:31 +00:00
Salad Dais
c0e91273fd Fix cache location detection case 2021-05-31 10:25:12 +00:00
Salad Dais
e50a00064a Make greeting example use world object store and global positions 2021-05-31 00:25:42 +00:00
Salad Dais
ebc02f9a22 Make object handling happen at the world level, fix region handoffs
The previous model didn't really map to how Indra handles objects:
In Indra Local IDs are only really used to look up the FullID, and
that's used to look at a global object list.

This moves to a model where the world (Session) owns the object, and
objects can be freely moved between regions without killing the world's
reference to it.

The two ID design for objects was a mistake in my opinion, but whatever.
2021-05-30 14:24:39 +00:00
Salad Dais
f57087bf6c Clear timed-out futures in ObjectManager 2021-05-29 10:47:13 +00:00
Salad Dais
6c6ea66989 Allow awaiting object update / property requests 2021-05-29 08:51:15 +00:00
Salad Dais
6cc25118b9 Faster cap URL lookup
Was taking up 15% of proxy time, so worth optimizing.
2021-05-29 06:19:09 +00:00
Salad Dais
3aa5215587 Don't trigger a message parse when invalidating caches 2021-05-29 06:18:49 +00:00
Salad Dais
eb34a945bc Update the vocache state in the GUI properly 2021-05-29 06:00:52 +00:00
Salad Dais
ccb29f8eeb Simplify Object definition 2021-05-29 05:29:53 +00:00
Salad Dais
bf377ae323 Make using VOCache optional, off by default 2021-05-28 21:53:44 +00:00
Salad Dais
6df2224be5 Account for recent stringification of Filename in ShieldAddon 2021-05-28 20:51:18 +00:00
Salad Dais
9dbb719d52 Add fast path for ObjectUpdateCompressed decoding
Speeds up ObjectUpdateCompressed handling by 25%

Resolves #9
2021-05-28 02:19:51 +00:00
Salad Dais
2608a02d5c Use viewer's object cache to better handle ObjectUpdateCached hits
Without this we end up in weird cases where the viewer gets a cache
hit and never requests the object data, creating link hierarchies where
the viewer knows about all the prims but Hippolyzer only knows some
of them and orphans them.

Since we don't know what viewer the user is using, we scan around
the disk for object caches and try to use those. 99% of the time the
connection will be coming from localhost so this is fine.

Fixes #11
2021-05-28 02:18:20 +00:00
Salad Dais
eb2c5b7494 Allow getting coarse location for seated orphaned avatars 2021-05-25 20:29:37 +00:00
Salad Dais
a1bbfbf410 TurboXferAddon -> TurboObjectInventoryAddon 2021-05-25 02:34:22 +00:00
Salad Dais
2485831c47 Make Turbo Xfer example more reliable 2021-05-25 02:32:37 +00:00
Salad Dais
2e869e9219 Add turbo Xfer capabilities to XferManager 2021-05-25 02:32:16 +00:00
Salad Dais
c39db7f130 Fix for take()n messages having no deserializer 2021-05-25 01:29:39 +00:00
Salad Dais
c58d24bd16 Revert "Make it less annoying to pickle messages"
This reverts commit 8af87befbd.

It was breaking take()n messages sometimes.
2021-05-25 01:13:53 +00:00
Salad Dais
aef1261068 Add Turbo Xfer example addon 2021-05-24 05:15:42 +00:00
Salad Dais
2570269e29 Reorder subscribe_async call signature 2021-05-24 04:59:02 +00:00
Salad Dais
f3c937bf14 Add recapitator addon example 2021-05-24 03:49:49 +00:00
Salad Dais
2fab1a0fae Allow serving inbound RequestXfers outside asset upload flow 2021-05-24 03:29:36 +00:00
Salad Dais
935e3ccc40 Add linden character files to repo, parse visual params 2021-05-24 03:28:39 +00:00
Salad Dais
f5ededcdd7 Put stub templates.py back in
If I have to choose between breaking bisect and breaking blame,
I pick bisect. This was split across two commits to help Git with
its rename detection.
2021-05-23 10:44:41 +00:00
Salad Dais
237a409ee0 Move serialization templates and VFS code to lib.base
Not being able to use common enums in code in lib.base was
getting to be really annoying. It always should have been in
base anyways.
2021-05-23 10:44:18 +00:00
Salad Dais
058b9f5313 Allow getting an alias without implicitly creating one 2021-05-23 10:22:28 +00:00
Salad Dais
fdcb816585 Allow synthesizing inbound Xfer requests 2021-05-23 06:36:00 +00:00
Salad Dais
d22fef149b Fix multi-chunk Xfer uploads 2021-05-23 06:35:21 +00:00
Salad Dais
9e035e98ba Fix messages take()n inside addon LLUDP hooks not getting dropped 2021-05-23 06:34:36 +00:00
Salad Dais
c9138b4649 Add wearable asset serialization support 2021-05-23 05:01:37 +00:00
Salad Dais
0caba9da68 Add serialization support for task inventory schema 2021-05-23 04:30:55 +00:00
Salad Dais
b2f0de2db5 v0.5.0 2021-05-21 23:48:29 +00:00
Salad Dais
0b0e031091 Run Flake8 in CI 2021-05-21 19:02:15 +00:00
Salad Dais
4eeac738dc Clean up linter warnings 2021-05-21 19:00:06 +00:00
Salad Dais
d9416363b3 Add flake8 config 2021-05-21 18:58:15 +00:00
Salad Dais
5906140921 Make Monochrome example addon work with bakes on mesh 2021-05-20 20:42:17 +00:00
Salad Dais
58932e585e Add better ObjectUpdate change detection 2021-05-20 20:42:17 +00:00
Salad Dais
b9f8ce0da2 Update readme 2021-05-20 20:42:17 +00:00
Salad Dais
67aa5e6bcd Possible fix for flaky tests 2021-05-19 22:26:18 +00:00
Salad Dais
2a05529ceb Fix bad directive in pytest workflow 2021-05-19 22:20:41 +00:00
Salad Dais
a97aa88cc9 Add integration tests for MITMProxyEventManager 2021-05-19 22:14:27 +00:00
Salad Dais
febc0793f2 Add more HTTP flow tests 2021-05-19 20:44:28 +00:00
Salad Dais
141eb3afcd Add more HTTP request logging tests 2021-05-19 06:11:53 +00:00
Salad Dais
517888b1fa Fix missing import for byte escaping 2021-05-19 01:07:37 +00:00
Salad Dais
376b100ed9 Asset server proxying speedups
Should help with #7, will need to check on Windows.
2021-05-17 07:39:26 +00:00
Salad Dais
07fbec47e1 Fix autocompletion for enums used in subfields 2021-05-17 02:12:37 +00:00
Salad Dais
7836527305 Add NameCache so CoarseLocation-only Avatars can be named 2021-05-17 01:50:40 +00:00
Salad Dais
21b18b7a52 Make new base classes for enum and flag with pretty repr() 2021-05-16 17:35:23 +00:00
Salad Dais
28b09144f2 Add Avatar wrapper class for Avatar PCoded Objects
Must be specifically requested through lookup_avatar or all_avatars
Includes Avatars known either through CoarseLocationUpdates or ObjectUpdates
2021-05-16 00:05:28 +00:00
Salad Dais
1e13fede82 Minor changes to avatar position accessor, add tests 2021-05-15 21:28:29 +00:00
Salad Dais
1bfb719f08 Run tests on PRs 2021-05-15 20:01:04 +00:00
gwigz
e5b63f7550 Add basic support for coarse locations (#8) 2021-05-15 15:40:40 -03:00
Salad Dais
91328ac448 Add bodypart creation example, make short uploads take short path 2021-05-15 05:17:49 +00:00
Salad Dais
46dbacd475 Fix order of arg-only, kwarg-only specifiers 2021-05-14 04:04:35 +00:00
Salad Dais
187742c20a Fix typo in comment 2021-05-14 04:03:00 +00:00
Salad Dais
5eae956750 Add support for asset upload via xfer
Still needed for shapes.
2021-05-14 04:01:33 +00:00
Salad Dais
37e8f8a20e Add TeleportFlags enum 2021-05-14 04:01:33 +00:00
Salad Dais
b3125f3231 Minor changes to Transfer / Xfer 2021-05-13 00:22:16 +00:00
Salad Dais
46fed98d6a Add note about why Connection: close is there
I forgot.
2021-05-12 20:22:47 +00:00
Salad Dais
3b5938cf5c Better inbound RequestXfer filter 2021-05-12 19:57:12 +00:00
Salad Dais
c7aeb03ea4 Allow shape Xfers through 2021-05-12 05:43:41 +00:00
Salad Dais
ab1bd16b5c whitespace cleanup 2021-05-11 22:00:02 +00:00
Salad Dais
0412ca5019 v0.4.1 2021-05-11 18:49:52 +00:00
Salad Dais
4d238c8dc8 Update readme to mention Windows SOCKS wrapper
Closes #6
2021-05-11 18:49:11 +00:00
Salad Dais
3bcc510cfd Handle Windows config dirs in the roaming profile 2021-05-11 09:55:04 +00:00
Salad Dais
0d9593e14c v0.4.0 2021-05-08 01:44:13 +00:00
Salad Dais
28dfe2f1b2 Allow filter identifiers with underscores, fixes enum filters 2021-05-08 01:32:57 +00:00
Salad Dais
c8f7231eae Fix message log match highlighting 2021-05-08 01:27:11 +00:00
Salad Dais
00e9ecb765 Allow flag or enum references in filter expressions 2021-05-08 00:45:02 +00:00
Salad Dais
2892bbeb98 Add note about how object handling could be improved 2021-05-07 23:05:31 +00:00
Salad Dais
28f57a8836 More mesh documentation 2021-05-07 20:09:05 +00:00
Salad Dais
943b8b11d5 Improve KillObject handling
KillObject should kill the hierarchy. This brings us closer
to indra object handling semantics.
2021-05-07 19:47:49 +00:00
Salad Dais
88915dd8d7 Better handling of object LocalID changes 2021-05-07 05:38:27 +00:00
Salad Dais
60b39e27f8 Add note about attachment tp out / in brokenness 2021-05-07 04:49:49 +00:00
Salad Dais
8af87befbd Make it less annoying to pickle messages 2021-05-06 02:41:12 +00:00
Salad Dais
95e34bb07a Add a few tests for HTTP flow wrappers 2021-05-05 22:25:03 +00:00
Salad Dais
106eb5c063 Fix typo in CI YAML 2021-05-05 21:35:07 +00:00
Salad Dais
e7f88eeed9 Add tests for CapsClient 2021-05-05 21:30:01 +00:00
Salad Dais
d07f100452 Update codecov.yml 2021-05-05 17:37:52 +00:00
Salad Dais
02c212e4a6 Highlight matched line when matching on specific var values
Very helpful for debugging ObjectUpdates which are high frequency
and have many diff objects in a single message.

Just the first line of the var for now. Need to be smarter about
how we build the blocks in the message text if we want to highlight
the whole thing.
2021-05-05 04:15:35 +00:00
Salad Dais
8989843042 v0.3.2 2021-05-04 15:42:27 +00:00
Salad Dais
a217a30133 Log message after addon hooks have run
This used to be the behaviour, but switching from queueing to
immediately adding messages to the log removed the implicit delay
2021-05-04 03:01:18 +00:00
Salad Dais
8514d7bae8 Update readme 2021-05-04 00:10:17 +00:00
Salad Dais
d9084c3332 Include licenses in Windows bundles 2021-05-04 00:09:07 +00:00
Salad Dais
0f35cc00d5 Allow manually triggering windows build 2021-05-03 23:40:00 +00:00
Salad Dais
a6a7ce8fa3 Correct codecov threshold 2021-05-03 23:36:59 +00:00
Salad Dais
269a1e163b Don't fail commits on coverage dropping 2021-05-03 23:33:33 +00:00
Salad Dais
eb2b6ee870 Package a zip for Windows when a release is made 2021-05-03 23:20:40 +00:00
Salad Dais
79a4f72558 v0.3.1 2021-05-03 17:37:22 +00:00
Salad Dais
6316369e1a Don't fail CI if coverage drops 2021-05-03 17:36:37 +00:00
Salad Dais
1b0272f3b3 WIP cx_Freeze support 2021-05-03 17:28:42 +00:00
Salad Dais
aedc2bf48c Fix CapType resolution 2021-05-03 17:09:57 +00:00
Salad Dais
5d3fd69e35 Add badges 2021-05-03 15:05:37 +00:00
Salad Dais
ae464f2c06 Track code coverage on codecov 2021-05-03 14:49:48 +00:00
136 changed files with 27143 additions and 4201 deletions


@@ -1,2 +1,12 @@
[run]
omit =
concurrency = multiprocessing
[report]
exclude_lines =
pragma: no cover
if TYPE_CHECKING:
if typing.TYPE_CHECKING:
def __repr__
raise AssertionError
assert False
pass

.github/workflows/bundle_windows.yml

@@ -0,0 +1,46 @@
# Have to manually unzip this (it gets double zipped) and add it
# onto the release after it gets created. Don't want actions with repo write.
name: Bundle Windows EXE
on:
# Only trigger on release creation
release:
types:
- created
workflow_dispatch:
jobs:
build:
runs-on: windows-latest
strategy:
matrix:
python-version: [3.9]
steps:
- uses: actions/checkout@v2
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install -e .
pip install cx_freeze
- name: Bundle with cx_Freeze
run: |
python setup_cxfreeze.py build_exe
pip install pip-licenses
pip-licenses --format=plain-vertical --with-license-file --no-license-path --output-file=lib_licenses.txt
python setup_cxfreeze.py finalize_cxfreeze
- name: Upload the artifact
uses: actions/upload-artifact@v2
with:
name: hippolyzer-gui-windows-${{ github.sha }}
path: ./dist/**


@@ -6,6 +6,8 @@ on:
release:
types:
- created
workflow_dispatch:
# based on https://github.com/pypa/gh-action-pypi-publish


@@ -1,6 +1,6 @@
name: Run Python Tests
-on: [push]
+on: [push, pull_request]
jobs:
build:
@@ -12,16 +12,36 @@ jobs:
steps:
- uses: actions/checkout@v2
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install flake8 pytest
if [ -f requirements.txt ]; then pip install -r requirements.txt; fi
- name: Test with pytest
pip install -r requirements.txt
pip install -r requirements-test.txt
sudo apt-get install libopenjp2-7
- name: Run Flake8
run: |
pytest
flake8 .
- name: Test with pytest
# Tests are intentionally covered to detect broken tests.
run: |
pytest --cov=./hippolyzer --cov=./tests --cov-report=xml
# Keep this in a workflow without any other secrets in it.
- name: Upload coverage to Codecov
uses: codecov/codecov-action@v1
with:
token: ${{ secrets.CODECOV_TOKEN }}
files: ./coverage.xml
directory: ./coverage/reports/
flags: unittests
env_vars: OS,PYTHON
name: codecov-umbrella
fail_ci_if_error: false
path_to_write_report: ./coverage/codecov_report.txt
verbose: false

.gitignore

@@ -1,6 +1,7 @@
#use glob syntax
syntax: glob
__pycache__
*.pyc
build/*
*.egg-info


@@ -1,6 +1,8 @@
# Hippolyzer
[Hippolyzer](http://wiki.secondlife.com/wiki/Hippo) is a fork of Linden Lab's abandoned
![Python Test Status](https://github.com/SaladDais/Hippolyzer/workflows/Run%20Python%20Tests/badge.svg) [![codecov](https://codecov.io/gh/SaladDais/Hippolyzer/branch/master/graph/badge.svg?token=HCTFA4RAXX)](https://codecov.io/gh/SaladDais/Hippolyzer)
[Hippolyzer](http://wiki.secondlife.com/wiki/Hippo) is a revival of Linden Lab's
[PyOGP library](http://wiki.secondlife.com/wiki/PyOGP)
targeting modern Python 3, with a focus on debugging issues in Second Life-compatible
servers and clients. There is a secondary focus on mocking up new features without requiring a
@@ -22,6 +24,9 @@ with low-level SL details. See the [Local Animation addon example](https://githu
![Screenshot of proxy GUI](https://github.com/SaladDais/Hippolyzer/blob/master/static/screenshot.png?raw=true)
## Setup
### From Source
* Python 3.8 or above is **required**. If you're unable to upgrade your system Python package due to
being on a stable distro, you can use [pyenv](https://github.com/pyenv/pyenv) to create
a self-contained Python install with the appropriate version.
@@ -32,6 +37,11 @@ with low-level SL details. See the [Local Animation addon example](https://githu
* * Under Windows it's `<virtualenv_dir>\Scripts\activate.bat`
* Run `pip install hippolyzer`, or run `pip install -e .` in a cloned repo to install an editable version
### Binary Windows Builds
Binary Windows builds are available on the [Releases page](https://github.com/SaladDais/Hippolyzer/releases/).
I don't extensively test these, building from source is recommended.
## Proxy
A proxy is provided with both a CLI and Qt-based interface. The proxy application wraps a
@@ -52,16 +62,27 @@ the [Alchemy](https://github.com/AlchemyViewer/Alchemy) viewer.
On Linux that would be `~/.firestorm_x64/` if you're using Firestorm.
* * Certificate validation can be disabled entirely through viewer debug setting `NoVerifySSLCert`,
but is not recommended.
#### Windows
Windows viewers have broken SOCKS 5 proxy support. To work around that, you need to use a wrapper EXE that
can make the viewer correctly talk to Hippolyzer. Follow the instructions on https://github.com/SaladDais/WinHippoAutoProxy
to start the viewer and run it through Hippolyzer.
The proxy should _not_ be configured through the viewer's own preferences panel, it won't work correctly.
#### OS X & Linux
SOCKS 5 works correctly on these platforms, so you can just configure it through the
`preferences -> network -> proxy settings` panel:
* Start the viewer and configure it to use `127.0.0.1:9061` as a SOCKS proxy and `127.0.0.1:9062` as
an HTTP proxy. You **must** select the option in the viewer to use the HTTP proxy for all HTTP
traffic, or logins will fail.
* Optionally, If you want to reduce HTTP proxy lag you can have asset requests bypass the HTTP proxy by setting
the `no_proxy` env var appropriately. For ex. `no_proxy="asset-cdn.glb.agni.lindenlab.com" ./firestorm` or
`setx /m "no_proxy" "asset-cdn.glb.agni.lindenlab.com"` on Windows.
the `no_proxy` env var appropriately. For ex. `no_proxy="asset-cdn.glb.agni.lindenlab.com" ./firestorm`.
* Log in!
![Proxy config in firestorm](https://github.com/SaladDais/Hippolyzer/blob/master/static/proxy_config.png?raw=true)
### Filtering
By default, the proxy's display filter is configured to ignore many high-frequency messages.
@@ -85,11 +106,14 @@ agent's session, you can do `(Meta.AgentID == None || Meta.AgentID == "d929385f-
Vectors can also be compared. This will get any ObjectUpdate variant that occurs within a certain range:
`(*ObjectUpdate*.ObjectData.*Data.Position > (110, 50, 100) && *ObjectUpdate*.ObjectData.*Data.Position < (115, 55, 105))`
If you want to compare against an enum or a flag class in defined in `templates.py`, you can just specify its name:
`ViewerEffect.Effect.Type == ViewerEffectType.EFFECT_BEAM`
### Logging
Decoded messages are displayed in the log pane, clicking one will show the request and
response for HTTP messages, and a human-friendly form for UDP messages. Some messages and
-fields have [special packers defined](https://github.com/SaladDais/Hippolyzer/blob/master/hippolyzer/lib/proxy/templates.py)
+fields have [special packers defined](https://github.com/SaladDais/Hippolyzer/blob/master/hippolyzer/lib/base/templates.py)
that will give a more human-readable form of enum or binary fields, with the original form beside or below it.
For example, an `AgentUpdate` message may show up in the log pane like:
@@ -200,7 +224,7 @@ OUT ObjectAdd
```
The repeat spinner at the bottom of the window lets you send a message multiple times.
-an `i` variable is put into the eval context and can be used to vary messages accros repeats.
+an `i` variable is put into the eval context and can be used to vary messages across repeats.
With repeat set to two:
```
@@ -289,12 +313,8 @@ If you are a viewer developer, please put them in a viewer.
## Potential Changes
* Make package-able for PyPI
* GitHub action to build binary packages and pull together licenses bundle
* AISv3 wrapper?
* Higher level wrappers for common things? I don't really need these, so only if people want to write them.
* Highlight matched portion of message in log view, if applicable
* * Remember deep filters and return a map of them, have message formatter return text ranges?
* Move things out of `templates.py`, right now most binary serialization stuff lives there
because it's more convenient for me to hot-reload.
* Ability to add menus?
@@ -303,10 +323,23 @@ If you are a viewer developer, please put them in a viewer.
[LGPLv3](https://www.gnu.org/licenses/lgpl-3.0.en.html). If you have a good reason why, I might dual license.
-This package [includes portions of the Second Life(TM) Viewer Artwork](https://github.com/SaladDais/Hippolyzer/tree/master/hippolyzer/lib/proxy/data),
+This package [includes portions of the Second Life(TM) Viewer Artwork](https://github.com/SaladDais/Hippolyzer/tree/master/hippolyzer/lib/base/data),
Copyright (C) 2008 Linden Research, Inc. The viewer artwork is licensed under the Creative Commons
Attribution-Share Alike 3.0 License.
## Contributing
Ensure that any patches are clean with no unnecessary whitespace or formatting changes, and that you
add new tests for any added functionality.
## Philosophy
With a few notable exceptions, Hippolyzer focuses mainly on decomposition of data, and doesn't
provide many high-level abstractions for interpreting or manipulating that data. It's careful
to only do lossless transforms on data that are just prettier representations of the data sent
over the wire. Hippolyzer's goal is to help people understand how Second Life actually works,
automatically employing abstractions that hide how SL works is counter to that goal.
## For Client Developers
This section is mostly useful if you're developing a new SL-compatible client from scratch. Clients based
@@ -320,18 +353,20 @@ UDP proxy and an HTTP proxy.
To have your client's traffic proxied through Hippolyzer the general flow is:
* Open a TCP connection to Hippolyzer's SOCKS 5 proxy port
  * This should be done once per logical user session, as Hippolyzer assumes a 1:1 mapping of SOCKS TCP
    connections to SL sessions
* Send a UDP associate command without authentication
* The proxy will respond with a host / port pair that UDP messages may be sent through
* At this point you will no longer need to use the TCP connection, but it must be kept
alive until you want to break the UDP association
* Whenever you send a UDP packet to a remote host, you'll need to instead send it to the host / port
from the UDP associate response. A SOCKS 5 header must be prepended to the data indicating the ultimate destination
of the packet
* Any received UDP packets will also have a SOCKS 5 header indicating the real source IP and port
  * When in doubt, check `socks_proxy.py`, `packets.py` and the SOCKS 5 RFC for more info on how to deal with SOCKS.
  * <https://github.com/SaladDais/WinHippoAutoProxy/blob/master/winhippoautoproxy/socks5udphooker.cpp> is a simple
    example that wraps around `recvfrom()` and `sendto()` and could be used as a starting point.
* All HTTP requests must be sent through Hippolyzer's HTTP proxy port.
  * You may not need to do any extra plumbing to get this to work if your chosen HTTP client
    respects the `HTTP_PROXY` environment variable.
* All HTTPS connections will be encrypted with the proxy's TLS key. You'll need to either add it to whatever
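The UDP wrapping described above can be sketched roughly as follows. This is a minimal, IPv4-only illustration of the RFC 1928 UDP request header, assuming the TCP handshake and UDP ASSOCIATE have already completed; the function names are illustrative, not Hippolyzer's actual API:

```python
import socket
import struct

def wrap_socks5_udp(data: bytes, dest_host: str, dest_port: int) -> bytes:
    """Prepend the RFC 1928 UDP request header (RSV=0, FRAG=0, ATYP=IPv4)."""
    header = struct.pack("!HBB4sH", 0, 0, 0x01,
                         socket.inet_aton(dest_host), dest_port)
    return header + data

def unwrap_socks5_udp(datagram: bytes) -> tuple[str, int, bytes]:
    """Split a datagram from the relay into (real source host, port, payload)."""
    _rsv, _frag, atyp = struct.unpack_from("!HBB", datagram)
    if atyp != 0x01:
        # SL simulators are IPv4-only, so we don't bother with ATYP 3 or 4
        raise ValueError(f"unexpected address type {atyp}")
    host = socket.inet_ntoa(datagram[4:8])
    (port,) = struct.unpack_from("!H", datagram, 8)
    return host, port, datagram[10:]
```

Every packet bound for a sim would be passed through `wrap_socks5_udp()` and sent to the relay host / port from the UDP associate response, and every packet received from the relay through `unwrap_socks5_udp()` to recover the real sim address.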


@@ -9,7 +9,7 @@ from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
class PropertyHelloWorldAddon(BaseAddon):
class AddonStateHelloWorldAddon(BaseAddon):
# How to say hello, value shared across sessions and will be the same
# regardless of which session is active when accessed.
# "hello_greeting" is added to session_manager.addon_ctx's dict and will survive reloads
@@ -28,7 +28,11 @@ class PropertyHelloWorldAddon(BaseAddon):
# Shared across sessions and will die if the addon is reloaded
self.hello_punctuation = "!"
@handle_command(greeting=Parameter(str, sep=None))
@handle_command(
# Use the longer-form `Parameter()` for declaring this because
# this field should be greedy and take the rest of the message (no separator.)
greeting=Parameter(str, sep=None),
)
async def set_hello_greeting(self, _session: Session, _region: ProxiedRegion, greeting: str):
"""Set the person to say hello to"""
self.hello_greeting = greeting
@@ -38,7 +42,10 @@ class PropertyHelloWorldAddon(BaseAddon):
"""Set the person to say hello to"""
self.hello_person = person
@handle_command(punctuation=Parameter(str, sep=None))
@handle_command(
# Punctuation should have no whitespace, so using a simple parameter is OK.
punctuation=str,
)
async def set_hello_punctuation(self, _session: Session, _region: ProxiedRegion, punctuation: str):
"""Set the punctuation to use for saying hello"""
self.hello_punctuation = punctuation
@@ -47,8 +54,8 @@ class PropertyHelloWorldAddon(BaseAddon):
async def say_hello(self, _session: Session, _region: ProxiedRegion):
"""Say hello using the configured hello variables"""
# These aren't instance properties, they can be accessed via the class as well.
hello_person = PropertyHelloWorldAddon.hello_person
hello_person = AddonStateHelloWorldAddon.hello_person
send_chat(f"{self.hello_greeting} {hello_person}{self.hello_punctuation}")
addons = [PropertyHelloWorldAddon()]
addons = [AddonStateHelloWorldAddon()]


@@ -0,0 +1,32 @@
"""
Example anim mangler addon, to be used with local anim addon.
You can edit this live to apply various transforms to local anims,
as well as any uploaded anims. Any changes will be reflected in currently
playing local anims.
This example modifies any position keys of an animation's mHipRight joint.
"""
from hippolyzer.lib.base.llanim import Animation
from hippolyzer.lib.proxy.addons import AddonManager
import local_anim
AddonManager.hot_reload(local_anim, require_addons_loaded=True)
def offset_right_hip(anim: Animation):
hip_joint = anim.joints.get("mHipRight")
if hip_joint:
for pos_frame in hip_joint.pos_keyframes:
pos_frame.pos.Z *= 2.5
pos_frame.pos.X *= 5.0
return anim
class ExampleAnimManglerAddon(local_anim.BaseAnimManglerAddon):
ANIM_MANGLERS = [
offset_right_hip,
]
addons = [ExampleAnimManglerAddon()]


@@ -4,11 +4,11 @@ All buttons make you go backwards.
Except for backward, which makes you go left.
"""
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.base.templates import AgentControlFlags
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
from hippolyzer.lib.proxy.templates import AgentControlFlags
NUDGE_MASK = sum(x for x in AgentControlFlags if "NUDGE" in x.name)
@@ -19,7 +19,7 @@ BACK_MASK = (AgentControlFlags.AT_NEG | AgentControlFlags.NUDGE_AT_NEG)
class BackwardsAddon(BaseAddon):
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: ProxiedMessage):
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
if message.name == "AgentUpdate":
agent_data_block = message["AgentData"][0]
flags: AgentControlFlags = agent_data_block.deserialize_var("ControlFlags")


@@ -11,7 +11,7 @@ import secrets
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.proxy.addon_utils import BaseAddon, SessionProperty
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
@@ -41,7 +41,7 @@ class BezosifyAddon(BaseAddon):
# random value to XOR all CRCs with
self.bezos_crc_xor = secrets.randbits(32)
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: ProxiedMessage):
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
if message.name == "ObjectUpdateCached":
for block in message["ObjectData"]:
# Cached only really has a CRC, this will force the cache miss.


@@ -14,18 +14,17 @@ from typing import *
from PySide2 import QtCore, QtGui, QtWidgets
from hippolyzer.lib.base.datatypes import Vector3
from hippolyzer.lib.base.message.message import Block
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.objects import Object
from hippolyzer.lib.base.ui_helpers import loadUi
from hippolyzer.lib.base.templates import PCode
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.addon_utils import BaseAddon, SessionProperty
from hippolyzer.lib.proxy.commands import handle_command
from hippolyzer.lib.proxy.packets import Direction
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.base.network.transport import Direction
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
from hippolyzer.lib.proxy.task_scheduler import TaskLifeScope
from hippolyzer.lib.proxy.templates import PCode
def _is_color_blueish(color: bytes) -> bool:
@@ -81,7 +80,7 @@ class BlueishObjectListGUIAddon(BaseAddon):
raise
def _highlight_object(self, session: Session, obj: Object):
session.main_region.circuit.send_message(ProxiedMessage(
session.main_region.circuit.send_message(Message(
"ForceObjectSelect",
Block("Header", ResetList=False),
Block("Data", LocalID=obj.LocalID),
@@ -89,7 +88,7 @@ class BlueishObjectListGUIAddon(BaseAddon):
))
def _teleport_to_object(self, session: Session, obj: Object):
session.main_region.circuit.send_message(ProxiedMessage(
session.main_region.circuit.send_message(Message(
"TeleportLocationRequest",
Block("AgentData", AgentID=session.agent_id, SessionID=session.id),
Block(


@@ -1,9 +1,9 @@
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
def handle_lludp_message(session: Session, region: ProxiedRegion, message: ProxiedMessage):
def handle_lludp_message(session: Session, region: ProxiedRegion, message: Message):
# addon_ctx will persist across addon reloads, use for storing data that
# needs to survive across calls to this function
ctx = session.addon_ctx


@@ -10,13 +10,13 @@ message with a greeting.
"""
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
class CustomMetaExampleAddon(BaseAddon):
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: ProxiedMessage):
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
if not message.name.startswith("ChatFrom"):
return


@@ -16,8 +16,8 @@ import random
from hippolyzer.lib.base.message.msgtypes import PacketLayout
from hippolyzer.lib.base.message.udpserializer import UDPMessageSerializer
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.proxy.packets import Direction
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.network.transport import Direction
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
@@ -28,7 +28,7 @@ class PacketMutationAddon(BaseAddon):
def __init__(self):
self.serializer = UDPMessageSerializer()
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: ProxiedMessage):
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
# Only inbound messages, don't fiddle with the sim.
if message.direction != Direction.IN:
return


@@ -9,23 +9,24 @@ class GreetingAddon(BaseAddon):
@handle_command()
async def greetings(self, session: Session, region: ProxiedRegion):
"""Greet everyone around you"""
agent_obj = region.objects.lookup_fullid(session.agent_id)
if not agent_obj:
our_avatar = region.objects.lookup_avatar(session.agent_id)
if not our_avatar:
show_message("Don't have an agent object?")
# Note that this will only have avatars closeish to your camera. The sim sends
# KillObjects for avatars that get too far away.
other_agents = [o for o in region.objects.all_avatars if o.FullID != agent_obj.FullID]
# Look this up in the session object store since we may be next
# to a region border.
other_avatars = [o for o in session.objects.all_avatars if o.FullID != our_avatar.FullID]
if not other_agents:
show_message("No other agents?")
if not other_avatars:
show_message("No other avatars?")
for other_agent in other_agents:
dist = Vector3.dist(agent_obj.Position, other_agent.Position)
for other_avatar in other_avatars:
dist = Vector3.dist(our_avatar.GlobalPosition, other_avatar.GlobalPosition)
if dist >= 19.0:
continue
nv = other_agent.NameValue.to_dict()
send_chat(f"Greetings, {nv['FirstName']} {nv['LastName']}!")
if other_avatar.PreferredName is None:
continue
send_chat(f"Greetings, {other_avatar.PreferredName}!")
addons = [GreetingAddon()]


@@ -2,11 +2,11 @@
Drop outgoing packets that might leak what you're looking at, similar to Firestorm
"""
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.proxy.packets import Direction
from hippolyzer.lib.base.templates import ViewerEffectType
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.network.transport import Direction
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
from hippolyzer.lib.proxy.templates import ViewerEffectType
BLOCKED_EFFECTS = (
@@ -17,7 +17,7 @@ BLOCKED_EFFECTS = (
)
def handle_lludp_message(_session: Session, region: ProxiedRegion, msg: ProxiedMessage):
def handle_lludp_message(_session: Session, region: ProxiedRegion, msg: Message):
if msg.name == "ViewerEffect" and msg.direction == Direction.OUT:
new_blocks = [b for b in msg["Effect"] if b["Type"] not in BLOCKED_EFFECTS]
if new_blocks:


@@ -13,10 +13,10 @@ from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.llanim import Animation
from hippolyzer.lib.proxy.addon_utils import AssetAliasTracker, BaseAddon, GlobalProperty
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session, SessionManager
from hippolyzer.lib.proxy.vfs import STATIC_VFS
from hippolyzer.lib.base.vfs import STATIC_VFS
JOINT_REPLS = {
@@ -53,7 +53,7 @@ class HorrorAnimatorAddon(BaseAddon):
# We've reloaded, so make sure assets get new aliases
self.horror_anim_tracker.invalidate_aliases()
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: ProxiedMessage):
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
tracker = self.horror_anim_tracker
if message.name == "AvatarAnimation":
@@ -105,7 +105,7 @@ class HorrorAnimatorAddon(BaseAddon):
# send the response back immediately
block = STATIC_VFS[orig_anim_id]
anim_data = STATIC_VFS.read_block(block)
flow.response = mitmproxy.http.HTTPResponse.make(
flow.response = mitmproxy.http.Response.make(
200,
_mutate_anim_bytes(anim_data),
{


@@ -5,25 +5,38 @@ Local animations
assuming you loaded something.anim
/524 start_local_anim something
/524 stop_local_anim something
/524 save_local_anim something
If you want to trigger the animation from an object to simulate llStartAnimation():
llOwnerSay("@start_local_anim:something=force");
Also includes a concept of "anim manglers" similar to the "mesh manglers" of the
local mesh addon. This is useful if you want to test making procedural changes
to animations before uploading them. The manglers will be applied to any uploaded
animations as well.
May also be useful if you need to make ad-hoc changes to a bunch of animations on
bulk upload, like changing priority or removing a joint.
"""
import asyncio
import os
import pathlib
from abc import abstractmethod
from typing import *
from hippolyzer.lib.base import serialization as se
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Block
from hippolyzer.lib.base.llanim import Animation
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.proxy import addon_ctx
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.addon_utils import BaseAddon, SessionProperty
from hippolyzer.lib.proxy.addon_utils import BaseAddon, SessionProperty, GlobalProperty, show_message
from hippolyzer.lib.proxy.commands import handle_command
from hippolyzer.lib.proxy.http_asset_repo import HTTPAssetRepo
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
from hippolyzer.lib.proxy.sessions import Session, SessionManager
def _get_mtime(path: str):
@@ -36,12 +49,19 @@ def _get_mtime(path: str):
class LocalAnimAddon(BaseAddon):
# name -> path, only for anims actually from files
local_anim_paths: Dict[str, str] = SessionProperty(dict)
# name -> anim bytes
local_anim_bytes: Dict[str, bytes] = SessionProperty(dict)
# name -> mtime or None. Only for anims from files.
local_anim_mtimes: Dict[str, Optional[float]] = SessionProperty(dict)
# name -> current asset ID (changes each play)
local_anim_playing_ids: Dict[str, UUID] = SessionProperty(dict)
anim_manglers: List[Callable[[Animation], Animation]] = GlobalProperty(list)
def handle_init(self, session_manager: SessionManager):
self.remangle_local_anims(session_manager)
def handle_session_init(self, session: Session):
# Reload anims and reload any manglers if we have any
self._schedule_task(self._try_reload_anims(session))
@handle_command()
@@ -67,11 +87,23 @@ class LocalAnimAddon(BaseAddon):
"""Stop a named local animation"""
self.apply_local_anim(session, region, anim_name, new_data=None)
@handle_command(anim_name=str)
async def save_local_anim(self, _session: Session, _region: ProxiedRegion, anim_name: str):
"""Save a named local anim to disk"""
anim_bytes = self.local_anim_bytes.get(anim_name)
if not anim_bytes:
return
filename = await AddonManager.UI.save_file(filter_str="SL Anim (*.anim)", default_suffix="anim")
if not filename:
return
with open(filename, "wb") as f:
f.write(anim_bytes)
async def _try_reload_anims(self, session: Session):
while True:
region = session.main_region
if not region:
await asyncio.sleep(2.0)
await asyncio.sleep(1.0)
continue
# Loop over local anims we loaded
@@ -81,7 +113,7 @@ class LocalAnimAddon(BaseAddon):
continue
# is playing right now, check if there's a newer version
self.apply_local_anim_from_file(session, region, anim_name, only_if_changed=True)
await asyncio.sleep(2.0)
await asyncio.sleep(1.0)
def handle_rlv_command(self, session: Session, region: ProxiedRegion, source: UUID,
cmd: str, options: List[str], param: str):
@@ -101,7 +133,7 @@ class LocalAnimAddon(BaseAddon):
anim_name: str, new_data: Optional[bytes] = None):
asset_repo: HTTPAssetRepo = session.session_manager.asset_repo
next_id: Optional[UUID] = None
new_msg = ProxiedMessage(
new_msg = Message(
"AgentAnimation",
Block(
"AgentData",
@@ -128,9 +160,11 @@ class LocalAnimAddon(BaseAddon):
StartAnim=True,
))
cls.local_anim_playing_ids[anim_name] = next_id
cls.local_anim_bytes[anim_name] = new_data
else:
# No data means just stop the anim
cls.local_anim_playing_ids.pop(anim_name, None)
cls.local_anim_bytes.pop(anim_name, None)
region.circuit.send_message(new_msg)
print(f"Changing {anim_name} to {next_id}")
@@ -157,9 +191,94 @@ class LocalAnimAddon(BaseAddon):
with open(anim_path, "rb") as f:
anim_data = f.read()
anim_data = cls._mangle_anim(anim_data)
else:
print(f"Unknown anim {anim_name!r}")
cls.apply_local_anim(session, region, anim_name, new_data=anim_data)
@classmethod
def _mangle_anim(cls, anim_data: bytes) -> bytes:
if not cls.anim_manglers:
return anim_data
reader = se.BufferReader("<", anim_data)
spec = se.Dataclass(Animation)
anim = reader.read(spec)
for mangler in cls.anim_manglers:
anim = mangler(anim)
writer = se.BufferWriter("<")
writer.write(spec, anim)
return writer.copy_buffer()
@classmethod
def remangle_local_anims(cls, session_manager: SessionManager):
# Anim manglers are global, so we need to re-mangle anims for all sessions
for session in session_manager.sessions:
# Push the context of this session onto the stack so we can access
# session-scoped properties
with addon_ctx.push(new_session=session, new_region=session.main_region):
cls.local_anim_mtimes.clear()
def handle_http_request(self, session_manager: SessionManager, flow: HippoHTTPFlow):
if flow.name == "NewFileAgentInventoryUploader":
# Don't bother looking at this if we have no manglers
if not self.anim_manglers:
return
# This is kind of a crappy match, but these magic bytes shouldn't match anything that SL
# allows as an upload type other than animations.
if not flow.request.content or not flow.request.content.startswith(b"\x01\x00\x00\x00"):
return
# Replace the uploaded anim with the mangled version
flow.request.content = self._mangle_anim(flow.request.content)
show_message("Mangled upload request")
class BaseAnimManglerAddon(BaseAddon):
"""Base class for addons that mangle uploaded or file-based local animations"""
ANIM_MANGLERS: List[Callable[[Animation], Animation]]
def handle_init(self, session_manager: SessionManager):
# Add our manglers into the list
LocalAnimAddon.anim_manglers.extend(self.ANIM_MANGLERS)
LocalAnimAddon.remangle_local_anims(session_manager)
def handle_unload(self, session_manager: SessionManager):
# Clean up our manglers before we go away
mangler_list = LocalAnimAddon.anim_manglers
for mangler in self.ANIM_MANGLERS:
if mangler in mangler_list:
mangler_list.remove(mangler)
LocalAnimAddon.remangle_local_anims(session_manager)
class BaseAnimHelperAddon(BaseAddon):
"""
Base class for local creation of procedural animations
Animation generated by build_anim() gets applied to all active sessions
"""
ANIM_NAME: str
def handle_session_init(self, session: Session):
self._reapply_anim(session, session.main_region)
def handle_session_closed(self, session: Session):
LocalAnimAddon.apply_local_anim(session, session.main_region, self.ANIM_NAME, None)
def handle_unload(self, session_manager: SessionManager):
for session in session_manager.sessions:
# TODO: Nasty. Since we need to access session-local attrs we need to set the
# context even though we also explicitly pass session and region.
# Need to rethink the LocalAnimAddon API.
with addon_ctx.push(session, session.main_region):
LocalAnimAddon.apply_local_anim(session, session.main_region, self.ANIM_NAME, None)
@abstractmethod
def build_anim(self) -> Animation:
pass
def _reapply_anim(self, session: Session, region: ProxiedRegion):
LocalAnimAddon.apply_local_anim(session, region, self.ANIM_NAME, self.build_anim().to_bytes())
addons = [LocalAnimAddon()]


@@ -23,23 +23,22 @@ import ctypes
import secrets
from typing import *
import mitmproxy
from mitmproxy.http import HTTPFlow
import mitmproxy.http
from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.datatypes import *
from hippolyzer.lib.base.mesh import LLMeshSerializer, MeshAsset
from hippolyzer.lib.base import serialization as se
from hippolyzer.lib.base.objects import Object
from hippolyzer.lib.base.templates import ExtraParamType
from hippolyzer.lib.proxy import addon_ctx
from hippolyzer.lib.proxy.addon_utils import show_message, BaseAddon, GlobalProperty, SessionProperty
from hippolyzer.lib.proxy.commands import handle_command
from hippolyzer.lib.proxy.http_asset_repo import HTTPAssetRepo
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session, SessionManager
from hippolyzer.lib.proxy.templates import ExtraParamType
def _modify_crc(crc_tweak, crc_val):
@@ -126,7 +125,7 @@ class MeshUploadInterceptingAddon(BaseAddon):
region.objects.request_objects(old_locals)
show_message(f"Cleared target {old_locals}")
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: ProxiedMessage):
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
# Replace any mesh asset IDs in tracked objects with our local assets
if not self.local_mesh_target_locals:
return
@@ -202,7 +201,7 @@ class MeshUploadInterceptingAddon(BaseAddon):
self.local_mesh_mapping = {x["mesh_name"]: x["mesh"] for x in instances}
# Fake a response, we don't want to actually send off the request.
flow.response = mitmproxy.http.HTTPResponse.make(
flow.response = mitmproxy.http.Response.make(
200,
b"",
{
@@ -281,4 +280,23 @@ class MeshUploadInterceptingAddon(BaseAddon):
cls._replace_local_mesh(session.main_region, asset_repo, mesh_list)
class BaseMeshManglerAddon(BaseAddon):
"""Base class for addons that mangle uploaded or local mesh"""
MESH_MANGLERS: List[Callable[[MeshAsset], MeshAsset]]
def handle_init(self, session_manager: SessionManager):
# Add our manglers into the list
MeshUploadInterceptingAddon.mesh_manglers.extend(self.MESH_MANGLERS)
# Tell the local mesh plugin that the mangler list changed, and to re-apply
MeshUploadInterceptingAddon.remangle_local_mesh(session_manager)
def handle_unload(self, session_manager: SessionManager):
# Clean up our manglers before we go away
mangler_list = MeshUploadInterceptingAddon.mesh_manglers
for mangler in self.MESH_MANGLERS:
if mangler in mangler_list:
mangler_list.remove(mangler)
MeshUploadInterceptingAddon.remangle_local_mesh(session_manager)
addons = [MeshUploadInterceptingAddon()]


@@ -11,8 +11,6 @@ to add to give a mesh an arbitrary center of rotation / scaling.
from hippolyzer.lib.base.mesh import MeshAsset
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.sessions import SessionManager
import local_mesh
AddonManager.hot_reload(local_mesh, require_addons_loaded=True)
@@ -37,6 +35,9 @@ def reorient_mesh(orientation):
# X=1, Y=2, Z=3
def _reorienter(mesh: MeshAsset):
for material in mesh.iter_lod_materials():
if "Position" not in material:
# Must be a NoGeometry LOD
continue
# We don't need to use positions_(to/from)_domain here since we're just naively
# flipping the axes around.
material["Position"] = _reorient_coord_list(material["Position"], orientation)
@@ -46,28 +47,11 @@ def reorient_mesh(orientation):
return _reorienter
OUR_MANGLERS = [
# Negate the X and Y axes on any mesh we upload or create temp
reorient_mesh((-1, -2, 3)),
]
class ExampleMeshManglerAddon(local_mesh.BaseMeshManglerAddon):
MESH_MANGLERS = [
# Negate the X and Y axes on any mesh we upload or create temp
reorient_mesh((-1, -2, 3)),
]
class MeshManglerExampleAddon(BaseAddon):
def handle_init(self, session_manager: SessionManager):
# Add our manglers into the list
local_mesh_addon = local_mesh.MeshUploadInterceptingAddon
local_mesh_addon.mesh_manglers.extend(OUR_MANGLERS)
# Tell the local mesh plugin that the mangler list changed, and to re-apply
local_mesh_addon.remangle_local_mesh(session_manager)
def handle_unload(self, session_manager: SessionManager):
# Clean up our manglers before we go away
local_mesh_addon = local_mesh.MeshUploadInterceptingAddon
mangler_list = local_mesh_addon.mesh_manglers
for mangler in OUR_MANGLERS:
if mangler in mangler_list:
mangler_list.remove(mangler)
local_mesh_addon.remangle_local_mesh(session_manager)
addons = [MeshManglerExampleAddon()]
addons = [ExampleMeshManglerAddon()]


@@ -0,0 +1,244 @@
"""
Message Mirror
Re-routes messages through the circuit of another agent running through this proxy,
rewriting the messages to use the credentials tied to that circuit.
Useful if you need to quickly QA authorization checks on a message handler or script.
Or if you want to chat as two people at once. Whatever.
Also shows some advanced ways of managing / rerouting Messages and HTTP flows.
Fiddle with the values of `SEND_NORMALLY` and `MIRROR` to change how and which
messages get moved to other circuits.
Usage: /524 mirror_to <mirror_agent_uuid>
To Disable: /524 mirror_to
"""
import weakref
from typing import Optional
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.message.template_dict import DEFAULT_TEMPLATE_DICT
from hippolyzer.lib.base.network.transport import Direction
from hippolyzer.lib.proxy.addon_utils import BaseAddon, SessionProperty, show_message
from hippolyzer.lib.proxy.commands import handle_command, Parameter, parse_bool
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.caps import CapData, CapType
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session, SessionManager
# Things that make no sense to mirror, or will make everything explode if mirrored.
SEND_NORMALLY = {
'StartPingCheck', 'CompletePingCheck', 'PacketAck', 'SimulatorViewerTimeMessage', 'SimStats',
'SoundTrigger', 'EventQueueGet', 'GetMesh', 'GetMesh2', 'ParcelDwellRequest', 'ViewerEffect', 'ViewerStats',
'ParcelAccessListRequest', 'FirestormBridge', 'AvatarRenderInfo', 'ParcelPropertiesRequest', 'GetObjectCost',
'RequestMultipleObjects', 'GetObjectPhysicsData', 'GetExperienceInfo', 'RequestTaskInventory', 'AgentRequestSit',
'MuteListRequest', 'UpdateMuteListEntry', 'RemoveMuteListEntry', 'RequestImage',
'AgentThrottle', 'UseCircuitCode', 'AgentWearablesRequest', 'AvatarPickerRequest', 'CloseCircuit',
'CompleteAgentMovement', 'RegionHandshakeReply', 'LogoutRequest', 'ParcelPropertiesRequest',
'ParcelPropertiesRequestByID', 'MapBlockRequest', 'MapLayerRequest', 'MapItemRequest', 'MapNameRequest',
'ParcelAccessListRequest', 'AvatarPropertiesRequest', 'DirFindQuery',
'SetAlwaysRun', 'GetDisplayNames', 'ViewerMetrics', 'AgentResume', 'AgentPause',
'ViewerAsset', 'GetTexture', 'UUIDNameRequest', 'AgentUpdate', 'AgentAnimation',
# Would just be confusing for everyone
'ImprovedInstantMessage',
# Xfer system isn't authed to begin with, and duping Xfers can lead to premature file deletion. Skip.
'RequestXfer', 'ConfirmXferPacket', 'AbortXfer', 'SendXferPacket',
}
# Messages that _must_ be sent normally, but are worth mirroring onto the target session to see how
# they would respond
MIRROR = {
'RequestObjectPropertiesFamily', 'ObjectSelect', 'RequestObjectProperties', 'TransferRequest',
'RequestMultipleObjects', 'RequestTaskInventory', 'FetchInventory2', 'ScriptDialogReply',
'ObjectDeselect', 'GenericMessage', 'ChatFromViewer'
}
for msg_name in DEFAULT_TEMPLATE_DICT.message_templates.keys():
# There are a lot of these.
if msg_name.startswith("Group") and msg_name.endswith("Request"):
MIRROR.add(msg_name)
class MessageMirrorAddon(BaseAddon):
mirror_target_agent: Optional[UUID] = SessionProperty(None)
mirror_use_target_session: bool = SessionProperty(True)
mirror_use_target_agent: bool = SessionProperty(True)
@handle_command(target_agent=Parameter(UUID, optional=True))
async def mirror_to(self, session: Session, _region, target_agent: Optional[UUID] = None):
"""
Send this session's outbound messages over another proxied agent's circuit
"""
if target_agent:
if target_agent == session.agent_id:
show_message("Can't mirror our own session")
target_agent = None
elif not any(s.agent_id == target_agent for s in session.session_manager.sessions):
show_message(f"No active proxied session for agent {target_agent}")
target_agent = None
self.mirror_target_agent = target_agent
if target_agent:
show_message(f"Mirroring to {target_agent}")
else:
show_message("Message mirroring disabled")
@handle_command(enabled=parse_bool)
async def set_mirror_use_target_session(self, _session, _region, enabled):
"""Replace the original session ID with the target session's ID when mirroring"""
self.mirror_use_target_session = enabled
@handle_command(enabled=parse_bool)
async def set_mirror_use_target_agent(self, _session, _region, enabled):
"""Replace the original agent ID with the target agent's ID when mirroring"""
self.mirror_use_target_agent = enabled
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
if message.direction != Direction.OUT:
return
if not self.mirror_target_agent:
return
if message.name in SEND_NORMALLY:
return
target_session = None
for poss_session in session.session_manager.sessions:
if poss_session.agent_id == self.mirror_target_agent:
target_session = poss_session
if not target_session:
print("Couldn't find target session?")
return
target_region = None
for poss_region in target_session.regions:
if poss_region.circuit_addr == region.circuit_addr:
target_region = poss_region
if not target_region:
print("Couldn't find equivalent target region?")
return
# Send the message normally first if we're mirroring
if message.name in MIRROR:
region.circuit.send_message(message)
# We're going to send the message on a new circuit, we need to take
# it so we get a new packet ID and clean ACKs
message = message.take()
self._lludp_fixups(target_session, message)
target_region.circuit.send_message(message)
return True
def _lludp_fixups(self, target_session: Session, message: Message):
if "AgentData" in message:
agent_block = message["AgentData"][0]
if "AgentID" in agent_block and self.mirror_use_target_agent:
agent_block["AgentID"] = target_session.agent_id
if "SessionID" in agent_block and self.mirror_use_target_session:
agent_block["SessionID"] = target_session.id
if message.name == "TransferRequest":
transfer_block = message["TransferInfo"][0]
# This is a duplicated message so we need to give it a new ID
transfer_block["TransferID"] = UUID.random()
params = transfer_block.deserialize_var("Params")
# This kind of Transfer might not even use agent credentials
if self.mirror_use_target_agent and hasattr(params, 'AgentID'):
params.AgentID = target_session.agent_id
if self.mirror_use_target_session and hasattr(params, 'SessionID'):
params.SessionID = target_session.id
transfer_block.serialize_var("Params", params)
def handle_http_request(self, session_manager: SessionManager, flow: HippoHTTPFlow):
# Already mirrored, ignore.
if flow.is_replay:
return
cap_data = flow.cap_data
if not cap_data:
return
if cap_data.cap_name in SEND_NORMALLY:
return
if cap_data.asset_server_cap:
return
# Likely doesn't have an exact equivalent in the target session; this is a temporary
# cap like an uploader URL or a stats URL.
if cap_data.type == CapType.TEMPORARY:
return
session: Optional[Session] = cap_data.session and cap_data.session()
if not session:
return
region: Optional[ProxiedRegion] = cap_data.region and cap_data.region()
if not region:
return
# Session-scoped, so we need to know if we have a session before checking
if not self.mirror_target_agent:
return
target_session: Optional[Session] = None
for poss_session in session.session_manager.sessions:
if poss_session.agent_id == self.mirror_target_agent:
target_session = poss_session
if not target_session:
return
caps_source = target_session
target_region: Optional[ProxiedRegion] = None
if region:
target_region = None
for poss_region in target_session.regions:
if poss_region.circuit_addr == region.circuit_addr:
target_region = poss_region
if not target_region:
print("No region in cap?")
return
caps_source = target_region
new_base_url = caps_source.caps.get(cap_data.cap_name)
if not new_base_url:
print("No equiv cap?")
return
if cap_data.cap_name in MIRROR:
flow = flow.copy()
# Have the cap data reflect the new URL we're pointing at
flow.metadata["cap_data"] = CapData(
cap_name=cap_data.cap_name,
region=weakref.ref(target_region) if target_region else None,
session=weakref.ref(target_session),
base_url=new_base_url,
)
# Tack any params onto the new base URL for the cap
new_url = new_base_url + flow.request.url[len(cap_data.base_url):]
flow.request.url = new_url
if cap_data.cap_name in MIRROR:
self._replay_flow(flow, session.session_manager)
def _replay_flow(self, flow: HippoHTTPFlow, session_manager: SessionManager):
# Work around a mitmproxy bug: changing the URL updates the Host header, which may
# cause it to drop the port even when it shouldn't. Fix the host header.
if flow.request.port not in (80, 443) and ":" not in flow.request.host_header:
flow.request.host_header = f"{flow.request.host}:{flow.request.port}"
# Should get repopulated when it goes back through the MITM addon
flow.metadata.pop("cap_data_ser", None)
flow.metadata.pop("cap_data", None)
proxy_queue = session_manager.flow_context.to_proxy_queue
proxy_queue.put_nowait(("replay", None, flow.get_state()))
addons = [MessageMirrorAddon()]
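The cap rewrite above splices everything past the original cap's base URL onto the target session's equivalent cap URL. A minimal sketch with plain strings (the URLs here are made up for illustration):

```python
# Hypothetical cap URLs; the real ones come from each region's caps dict.
orig_base = "https://simhost.example.com:12043/cap/aaaa-1111"
target_base = "https://simhost.example.com:12043/cap/bbbb-2222"
request_url = orig_base + "/item/42?depth=1"

# Everything past the original base URL (path suffix and query string) is
# tacked on, mirroring `new_base_url + flow.request.url[len(cap_data.base_url):]`
new_url = target_base + request_url[len(orig_base):]
assert new_url == "https://simhost.example.com:12043/cap/bbbb-2222/item/42?depth=1"
```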

View File

@@ -27,16 +27,32 @@ from mitmproxy.http import HTTPFlow
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.jp2_utils import BufferedJp2k
from hippolyzer.lib.base.multiprocessing_utils import ParentProcessWatcher
from hippolyzer.lib.base.templates import TextureEntry
from hippolyzer.lib.proxy.addon_utils import AssetAliasTracker, BaseAddon, GlobalProperty, AddonProcess
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session, SessionManager
from hippolyzer.lib.proxy.templates import TextureEntry
glymur.set_option('lib.num_threads', 4)
# These should never be replaced; they're only used as aliases to tell the viewer
# it should fetch the relevant texture from the appearance service
BAKES_ON_MESH_TEXTURE_IDS = {UUID(x) for x in (
"5a9f4a74-30f2-821c-b88d-70499d3e7183",
"ae2de45c-d252-50b8-5c6e-19f39ce79317",
"24daea5f-0539-cfcf-047f-fbc40b2786ba",
"52cc6bb6-2ee5-e632-d3ad-50197b1dcb8a",
"43529ce8-7faa-ad92-165a-bc4078371687",
"09aac1fb-6bce-0bee-7d44-caac6dbb6c63",
"ff62763f-d60a-9855-890b-0c96f8f8cd98",
"8e915e25-31d1-cc95-ae08-d58a47488251",
"9742065b-19b5-297c-858a-29711d539043",
"03642e83-2bd1-4eb9-34b4-4c47ed586d2d",
"edd51b77-fc10-ce7a-4b3d-011dfc349e4f",
)}
def _modify_crc(crc_tweak: int, crc_val: int):
return ctypes.c_uint32(crc_val ^ crc_tweak).value
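`_modify_crc()` above is a plain XOR truncated to 32 bits, which makes it its own inverse: applying the same tweak twice recovers the original CRC. A quick standalone check:

```python
import ctypes

def modify_crc(crc_tweak: int, crc_val: int) -> int:
    # XOR with the per-session tweak, truncated to an unsigned 32-bit value
    return ctypes.c_uint32(crc_val ^ crc_tweak).value

tweaked = modify_crc(0xDEADBEEF, 12345)
assert tweaked != 12345
# XOR is self-inverse: the same tweak maps the aliased CRC back to the original
assert modify_crc(0xDEADBEEF, tweaked) == 12345
```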
@@ -82,7 +98,7 @@ class MonochromeAddon(BaseAddon):
# Tell queue consumers to shut down
self.mono_addon_shutdown_signal.set()
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: ProxiedMessage):
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
tracker = self.mono_tracker
if message.name == "ObjectUpdateCached":
for block in message["ObjectData"]:
@@ -137,6 +153,8 @@ class MonochromeAddon(BaseAddon):
# and we don't want to change the canonical view.
parsed_te = copy.deepcopy(parsed_te)
for k, v in parsed_te.Textures.items():
if v in BAKES_ON_MESH_TEXTURE_IDS:
continue
# Replace textures with their alias to bust the viewer cache
parsed_te.Textures[k] = tracker.get_alias_uuid(v)
for k, v in parsed_te.Color.items():
@@ -166,6 +184,8 @@ class MonochromeAddon(BaseAddon):
orig_texture_id = self.mono_tracker.get_orig_uuid(UUID(texture_id))
if not orig_texture_id:
return
if orig_texture_id in BAKES_ON_MESH_TEXTURE_IDS:
return
# The request was for a fake texture ID we created, rewrite the request to
# request the real asset and mark the flow for modification once we receive

View File

@@ -11,11 +11,11 @@ from typing import *
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.objects import Object
from hippolyzer.lib.base.templates import PCode
from hippolyzer.lib.proxy.addon_utils import BaseAddon, show_message, SessionProperty
from hippolyzer.lib.proxy.commands import handle_command
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
from hippolyzer.lib.proxy.templates import PCode
class ObjectUpdateBlameAddon(BaseAddon):

View File

@@ -3,16 +3,15 @@ Do the money dance whenever someone in the sim pays you directly
"""
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Block
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.templates import MoneyTransactionType, ChatType
from hippolyzer.lib.proxy.addon_utils import send_chat, BaseAddon
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
from hippolyzer.lib.proxy.templates import MoneyTransactionType, PCode, ChatType
class PaydayAddon(BaseAddon):
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: ProxiedMessage):
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
if message.name != "MoneyBalanceReply":
return
transaction_block = message["TransactionInfo"][0]
@@ -28,8 +27,8 @@ class PaydayAddon(BaseAddon):
return
# Check if they're likely to be in the sim
sender_obj = region.objects.lookup_fullid(sender)
if not sender_obj or sender_obj.PCode != PCode.AVATAR:
sender_obj = region.objects.lookup_avatar(sender)
if not sender_obj:
return
amount = transaction_block['Amount']
@@ -38,7 +37,7 @@ class PaydayAddon(BaseAddon):
chat_type=ChatType.SHOUT,
)
# Do the traditional money dance.
session.main_region.circuit.send_message(ProxiedMessage(
session.main_region.circuit.send_message(Message(
"AgentAnimation",
Block("AgentData", AgentID=session.agent_id, SessionID=session.id),
Block("AnimationList", AnimID=UUID("928cae18-e31d-76fd-9cc9-2f55160ff818"), StartAnim=True),

View File

@@ -0,0 +1,161 @@
"""
Import a small image (like a Nintendo sprite) and create it out of cube prims.
Inefficient and doesn't even do line fill; expect it to take `width * height`
prims for whatever image you import!
"""
import asyncio
import struct
from typing import *
from PySide2.QtGui import QImage
from hippolyzer.lib.base.datatypes import UUID, Vector3, Quaternion
from hippolyzer.lib.base.helpers import to_chunks
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.templates import ObjectUpdateFlags, PCode, MCode, MultipleObjectUpdateFlags, TextureEntry
from hippolyzer.lib.client.object_manager import ObjectEvent, UpdateType
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.commands import handle_command
from hippolyzer.lib.base.network.transport import Direction
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
JUST_CREATED_FLAGS = (ObjectUpdateFlags.CREATE_SELECTED | ObjectUpdateFlags.OBJECT_YOU_OWNER)
PRIM_SCALE = 0.2
class PixelArtistAddon(BaseAddon):
@handle_command()
async def import_pixel_art(self, session: Session, region: ProxiedRegion):
"""
Import a small image (like a nintendo sprite) and create it out of cube prims
"""
filename = await AddonManager.UI.open_file(
"Open an image",
filter_str="Images (*.png *.jpg *.jpeg *.bmp)",
)
if not filename:
return
img = QImage()
with open(filename, "rb") as f:
img.loadFromData(f.read(), aformat=None)
img = img.convertToFormat(QImage.Format_RGBA8888)
height = img.height()
width = img.width()
pixels: List[Optional[bytes]] = []
needed_prims = 0
for y in range(height):
for x in range(width):
color: int = img.pixel(x, y)
# This will be ARGB, SL wants RGBA
alpha = (color & 0xFF000000) >> 24
color = color & 0x00FFFFFF
if alpha > 20:
# Repack RGBA to the bytes format we use for colors
pixels.append(struct.pack("!I", (color << 8) | alpha))
needed_prims += 1
else:
# Pretty transparent, skip it
pixels.append(None)
if not await AddonManager.UI.confirm("Confirm prim use", f"This will take {needed_prims} prims"):
return
agent_obj = region.objects.lookup_fullid(session.agent_id)
agent_pos = agent_obj.RegionPosition
created_prims = []
# Watch for any newly created prims; this is basically what the viewer does to find
# prims that it just created with the build tool.
with session.objects.events.subscribe_async(
(UpdateType.OBJECT_UPDATE,),
predicate=lambda e: e.object.UpdateFlags & JUST_CREATED_FLAGS and "LocalID" in e.updated
) as get_events:
# Create a pool of prims to use for building the pixel art
for _ in range(needed_prims):
# TODO: We don't track the land group or user's active group, so
# "anyone can build" must be on for rezzing to work.
group_id = UUID()
region.circuit.send_message(Message(
'ObjectAdd',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id, GroupID=group_id),
Block(
'ObjectData',
PCode=PCode.PRIMITIVE,
Material=MCode.WOOD,
AddFlags=ObjectUpdateFlags.CREATE_SELECTED,
PathCurve=16,
ProfileCurve=1,
PathScaleX=100,
PathScaleY=100,
BypassRaycast=1,
RayStart=agent_obj.RegionPosition + Vector3(0, 0, 2),
RayEnd=agent_obj.RegionPosition + Vector3(0, 0, 2),
RayTargetID=UUID(),
RayEndIsIntersection=0,
Scale=Vector3(PRIM_SCALE, PRIM_SCALE, PRIM_SCALE),
Rotation=Quaternion(0.0, 0.0, 0.0, 1.0),
fill_missing=True,
),
))
# Don't spam a ton of creates at once
await asyncio.sleep(0.02)
# Read any creation events that queued up while we were creating the objects,
# so we can figure out the newly-created objects' IDs
for _ in range(needed_prims):
evt: ObjectEvent = await asyncio.wait_for(get_events(), 1.0)
created_prims.append(evt.object)
# Drawing origin starts at the top left, should be positioned just above the
# avatar on Z and centered on Y.
top_left = Vector3(0, (width * PRIM_SCALE) * -0.5, (height * PRIM_SCALE) + 2.0) + agent_pos
positioning_blocks = []
prim_idx = 0
for i, pixel_color in enumerate(pixels):
# Transparent, skip
if pixel_color is None:
continue
x = i % width
y = i // width
obj = created_prims[prim_idx]
# Set a blank texture on all faces
te = TextureEntry()
te.Textures[None] = UUID('5748decc-f629-461c-9a36-a35a221fe21f')
# Set the prim color to the color from the pixel
te.Color[None] = pixel_color
# Set the prim texture and color
region.circuit.send_message(Message(
'ObjectImage',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
Block('ObjectData', ObjectLocalID=obj.LocalID, MediaURL=b'', TextureEntry_=te),
direction=Direction.OUT,
))
# Save the repositioning data for later since it uses a different message,
# but it can be set in batches.
positioning_blocks.append(Block(
'ObjectData',
ObjectLocalID=obj.LocalID,
Type=MultipleObjectUpdateFlags.POSITION,
Data_={'POSITION': top_left + Vector3(0, x * PRIM_SCALE, y * -PRIM_SCALE)},
))
await asyncio.sleep(0.01)
# We actually used a prim for this, so increment the index
prim_idx += 1
# Move the "pixels" to their correct position in chunks
for chunk in to_chunks(positioning_blocks, 25):
region.circuit.send_message(Message(
'MultipleObjectUpdate',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
*chunk,
direction=Direction.OUT,
))
await asyncio.sleep(0.01)
addons = [PixelArtistAddon()]
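The ARGB-to-RGBA repacking in the pixel loop above can be isolated into a small helper; the `20` alpha cutoff matches the addon's "pretty transparent" threshold:

```python
import struct
from typing import Optional

def argb_to_rgba(qrgb: int) -> Optional[bytes]:
    """Repack a QImage ARGB pixel into the big-endian RGBA bytes SL expects."""
    alpha = (qrgb & 0xFF000000) >> 24
    rgb = qrgb & 0x00FFFFFF
    if alpha <= 20:
        return None  # effectively transparent, skip this pixel
    return struct.pack("!I", (rgb << 8) | alpha)

# Opaque red: 0xFFFF0000 in ARGB becomes FF 00 00 FF in RGBA
assert argb_to_rgba(0xFFFF0000) == b"\xff\x00\x00\xff"
assert argb_to_rgba(0x00FFFFFF) is None  # fully transparent white
```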

View File

@@ -0,0 +1,151 @@
"""
Recapitator addon, merges a base head shape into body shapes.
Only works if both the base shapes and shapes you need to edit are modify.
Useful if you switch heads a lot. Most heads come with a base shape you
have to start from if you don't want the head to look like garbage. If you
have an existing shape for your body, you have to write down all the values
of the base shape's head sliders and edit them onto your body shapes.
This addon does basically the same thing by intercepting shape uploads. After
enabling recapitation, you save the base head shape once. Then the next time you
edit and save a body shape, it will be saved with the head sliders from your base
shape.
"""
import logging
from typing import *
from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.templates import AssetType, WearableType
from hippolyzer.lib.base.wearables import Wearable, VISUAL_PARAMS
from hippolyzer.lib.proxy.addon_utils import BaseAddon, SessionProperty, AssetAliasTracker, show_message
from hippolyzer.lib.proxy.commands import handle_command
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.base.network.transport import Direction
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session, SessionManager
# Get all VisualParam IDs that belong to head sliders
HEAD_EDIT_GROUPS = ("shape_head", "shape_eyes", "shape_ears", "shape_nose", "shape_mouth", "shape_chin")
HEAD_PARAM_IDS = [v.id for v in VISUAL_PARAMS if v.edit_group in HEAD_EDIT_GROUPS]
class RecapitatorAddon(BaseAddon):
transaction_remappings: AssetAliasTracker = SessionProperty(AssetAliasTracker)
recapitating: bool = SessionProperty(bool)
recapitation_mappings: Dict[int, float] = SessionProperty(dict)
@handle_command()
async def enable_recapitation(self, _session: Session, _region: ProxiedRegion):
"""Apply base head shape when saving subsequent shapes"""
self.recapitating = True
self.recapitation_mappings.clear()
show_message("Recapitation enabled, wear the base shape containing the head parameters and save it.")
@handle_command()
async def disable_recapitation(self, _session: Session, _region: ProxiedRegion):
self.recapitating = False
show_message("Recapitation disabled")
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
if not self.recapitating:
return
if message.direction != Direction.OUT:
return
if message.name != "AssetUploadRequest":
return
if message["AssetBlock"]["Type"] != AssetType.BODYPART:
return
# Pending asset upload for a bodypart asset. Take the message and request
# it from the client ourselves so we can see what it wants to upload
new_message = message.take()
self._schedule_task(self._proxy_bodypart_upload(session, region, new_message))
return True
async def _proxy_bodypart_upload(self, session: Session, region: ProxiedRegion, message: Message):
asset_block = message["AssetBlock"]
# Asset will already be in the viewer's VFS as the expected asset ID; calculate it.
asset_id = session.transaction_to_assetid(asset_block["TransactionID"])
success = False
try:
# Xfer the asset from the viewer if it wasn't small enough to fit in AssetData
if asset_block["AssetData"]:
asset_data = asset_block["AssetData"]
else:
xfer = await region.xfer_manager.request(
vfile_id=asset_id,
vfile_type=AssetType.BODYPART,
direction=Direction.IN,
)
asset_data = xfer.reassemble_chunks()
wearable = Wearable.from_bytes(asset_data)
# If they're uploading a shape, process it.
if wearable.wearable_type == WearableType.SHAPE:
if self.recapitation_mappings:
# Copy our previously saved head params over
for key, value in self.recapitation_mappings.items():
wearable.parameters[key] = value
# Upload the changed version
asset_data = wearable.to_bytes()
show_message("Recapitated shape")
else:
# Don't have a recapitation mapping yet, use this shape as the base.
for param_id in HEAD_PARAM_IDS:
self.recapitation_mappings[param_id] = wearable.parameters[param_id]
show_message("Got base parameters for recapitation, head parameters will be copied")
# Upload it ourselves with a new transaction ID that can be traced back to
# the original. This is important because otherwise the viewer will use its
# own cached version of the shape, under the assumption it wasn't modified
# during upload.
new_transaction_id = self.transaction_remappings.get_alias_uuid(
asset_block["TransactionID"]
)
await region.xfer_manager.upload_asset(
asset_type=AssetType.BODYPART,
data=asset_data,
transaction_id=new_transaction_id,
)
success = True
except:
logging.exception("Exception while recapitating")
# Tell the viewer about the status of its original upload
region.circuit.send_message(Message(
"AssetUploadComplete",
Block("AssetBlock", UUID=asset_id, Type=asset_block["Type"], Success=success),
direction=Direction.IN,
))
def handle_http_request(self, session_manager: SessionManager, flow: HippoHTTPFlow):
# Skip requests that aren't related to patching an existing item
if flow.cap_data.cap_name != "InventoryAPIv3":
return
if flow.request.method != "PATCH":
return
if "/item/" not in flow.request.url:
return
parsed = llsd.parse_xml(flow.request.content)
if parsed.get("type") != "bodypart":
return
# `hash_id` being present means we're updating the item to point to a newly
# uploaded asset. It's actually a transaction ID.
transaction_id: Optional[UUID] = parsed.get("hash_id")
if not transaction_id:
return
# We have an original transaction ID; do we need to remap it to an alias ID?
orig_id = self.transaction_remappings.get_alias_uuid(transaction_id, create=False)
if not orig_id:
return
parsed["hash_id"] = orig_id
flow.request.content = llsd.format_xml(parsed)
addons = [RecapitatorAddon()]
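The core of the recapitation step is just overlaying the saved head parameters onto the uploaded shape's parameters. A sketch with plain dicts standing in for `Wearable.parameters` (the param IDs below are hypothetical; the addon derives the real set from `VISUAL_PARAMS`):

```python
from typing import Dict

# Hypothetical visual param IDs for head sliders
HEAD_PARAM_IDS = [652, 693]

def recapitate(body_params: Dict[int, float],
               base_head_params: Dict[int, float]) -> Dict[int, float]:
    merged = dict(body_params)
    merged.update(base_head_params)  # head sliders win over the body shape's
    return merged

base_head = {652: 0.8, 693: 0.1}       # saved from the base head shape
body = {652: 0.0, 693: 0.5, 33: 1.0}   # body shape being uploaded
assert recapitate(body, base_head) == {652: 0.8, 693: 0.1, 33: 1.0}
```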

View File

@@ -1,12 +1,12 @@
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
class REPLExampleAddon(BaseAddon):
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: ProxiedMessage):
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
if message.name == "ChatFromViewer":
chat_msg = message["ChatData"]["Message"]
if not chat_msg:

View File

@@ -15,8 +15,8 @@ from hippolyzer.lib.base import serialization as se
from hippolyzer.lib.base.message.udpdeserializer import UDPMessageDeserializer
from hippolyzer.lib.base.message.udpserializer import UDPMessageSerializer
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.proxy.packets import ProxiedUDPPacket
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.network.transport import UDPPacket
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import SessionManager, Session
@@ -28,11 +28,12 @@ class SerializationSanityChecker(BaseAddon):
self.serializer = UDPMessageSerializer()
self.deserializer = UDPMessageDeserializer()
def handle_proxied_packet(self, session_manager: SessionManager, packet: ProxiedUDPPacket,
session: Optional[Session], region: Optional[ProxiedRegion],
message: Optional[ProxiedMessage]):
def handle_proxied_packet(self, session_manager: SessionManager, packet: UDPPacket,
session: Optional[Session], region: Optional[ProxiedRegion]):
# Well this doesn't even parse as a message, can't do anything about it.
if message is None:
try:
message = self.deserializer.deserialize(packet.data)
except:
LOG.error(f"Received unparseable message from {packet.src_addr!r}: {packet.data!r}")
return
try:
@@ -63,7 +64,7 @@ class SerializationSanityChecker(BaseAddon):
except:
LOG.exception(f"Exception during message validation:\n{message!r}")
def _roundtrip_var_serializers(self, message: ProxiedMessage):
def _roundtrip_var_serializers(self, message: Message):
for block in itertools.chain(*message.blocks.values()):
for var_name in block.vars.keys():
orig_val = block[var_name]

View File

@@ -1,18 +1,23 @@
"""Block potentially bad things"""
from hippolyzer.lib.base.templates import IMDialogType, XferFilePath
from hippolyzer.lib.proxy.addon_utils import BaseAddon, show_message
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.proxy.packets import Direction
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.network.transport import Direction
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
from hippolyzer.lib.proxy.templates import IMDialogType
SUSPICIOUS_PACKETS = {"RequestXfer", "TransferRequest", "UUIDNameRequest",
"UUIDGroupNameRequest", "OpenCircuit"}
SUSPICIOUS_PACKETS = {
"TransferRequest",
"UUIDNameRequest",
"UUIDGroupNameRequest",
"OpenCircuit",
"AddCircuitCode",
}
REGULAR_IM_DIALOGS = (IMDialogType.TYPING_START, IMDialogType.TYPING_STOP, IMDialogType.NOTHING_SPECIAL)
class ShieldAddon(BaseAddon):
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: ProxiedMessage):
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
if message.direction != Direction.IN:
return
if message.name in SUSPICIOUS_PACKETS:
@@ -29,6 +34,13 @@ class ShieldAddon(BaseAddon):
else:
expected_id = from_agent ^ session.agent_id
msg_block["ID"] = expected_id
if message.name == "RequestXfer":
xfer_block = message["XferID"][0]
# Don't allow Xfers for files, only assets
if xfer_block["FilePath"] != XferFilePath.NONE or xfer_block["Filename"]:
show_message(f"Blocked suspicious {message.name} packet")
region.circuit.drop_message(message)
return True
addons = [ShieldAddon()]
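The `from_agent ^ session.agent_id` check above relies on IM session IDs being the XOR of the two participants' agent IDs, which is symmetric: both sides derive the same ID. The same property with stdlib UUIDs:

```python
import uuid

def im_session_id(agent_a: uuid.UUID, agent_b: uuid.UUID) -> uuid.UUID:
    # XOR the two 128-bit agent IDs; commutativity means either party
    # computes the same session ID
    return uuid.UUID(int=agent_a.int ^ agent_b.int)

a = uuid.UUID("11111111-1111-1111-1111-111111111111")
b = uuid.UUID("22222222-2222-2222-2222-222222222222")
assert im_session_id(a, b) == im_session_id(b, a)
# XORing the session ID with one agent ID recovers the other, which is
# what the shield uses to validate an incoming IM's session ID
assert uuid.UUID(int=im_session_id(a, b).int ^ a.int) == b
```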

View File

@@ -1,6 +1,6 @@
import itertools
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
@@ -12,7 +12,7 @@ def _to_spongecase(val):
return "".join(itertools.chain(*spongecased))
def handle_lludp_message(session: Session, _region: ProxiedRegion, message: ProxiedMessage):
def handle_lludp_message(session: Session, _region: ProxiedRegion, message: Message):
ctx = session.addon_ctx
ctx.setdefault("spongecase", False)
if message.name == "ChatFromViewer":

View File

@@ -0,0 +1,55 @@
"""
Tail animation generator
Demonstrates programmatic generation of local motions using BaseAnimHelperAddon
You can use this to create an animation with a script, fiddle with it until it
looks right, then finally save it with /524 save_local_anim <ANIM_NAME>.
The built animation is automatically applied to all active sessions when loaded,
and is re-generated whenever the script is edited. Unloading the script stops
the animations.
"""
from hippolyzer.lib.base.anim_utils import shift_keyframes, smooth_rot
from hippolyzer.lib.base.datatypes import Quaternion
from hippolyzer.lib.base.llanim import Animation, Joint
from hippolyzer.lib.proxy.addons import AddonManager
import local_anim
AddonManager.hot_reload(local_anim, require_addons_loaded=True)
class TailAnimator(local_anim.BaseAnimHelperAddon):
# Should be unique
ANIM_NAME = "tail_anim"
def build_anim(self) -> Animation:
anim = Animation(
base_priority=5,
duration=5.0,
loop_out_point=5.0,
loop=True,
)
# Iterate along tail joints 1 through 6
for joint_num in range(1, 7):
# Give further along joints a wider range of motion
start_rot = Quaternion.from_euler(0.2, -0.3, 0.15 * joint_num)
end_rot = Quaternion.from_euler(-0.2, -0.3, -0.15 * joint_num)
rot_keyframes = [
# Tween between start_rot and end_rot, using smooth interpolation.
# SL's keyframes only allow linear interpolation which doesn't look great
# for natural motions. `smooth_rot()` gets around that by generating
# smooth intermediate frames for SL to linearly interpolate between.
*smooth_rot(start_rot, end_rot, inter_frames=10, time=0.0, duration=2.5),
*smooth_rot(end_rot, start_rot, inter_frames=10, time=2.5, duration=2.5),
]
anim.joints[f"mTail{joint_num}"] = Joint(
priority=5,
# Each joint's frames should be ahead of the previous joint's by 2 frames
rot_keyframes=shift_keyframes(rot_keyframes, joint_num * 2),
)
return anim
addons = [TailAnimator()]
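The trick `smooth_rot()` relies on, emitting extra eased keyframes so that SL's linear interpolation traces a smooth curve, can be sketched in isolation. This is a hypothetical scalar version using smoothstep easing, not the library's actual implementation:

```python
from typing import List, Tuple

def smoothstep(t: float) -> float:
    # Zero slope at t=0 and t=1, so linear interpolation between dense
    # samples of this curve still looks smooth
    return t * t * (3.0 - 2.0 * t)

def eased_keyframes(time: float, duration: float,
                    inter_frames: int) -> List[Tuple[float, float]]:
    # Endpoints plus `inter_frames` intermediate keyframes, with eased
    # progress values for each evenly spaced frame time
    steps = inter_frames + 1
    return [(time + duration * i / steps, smoothstep(i / steps))
            for i in range(steps + 1)]

frames = eased_keyframes(0.0, 2.5, inter_frames=10)
assert frames[0] == (0.0, 0.0) and frames[-1] == (2.5, 1.0)
assert len(frames) == 12  # two endpoints plus 10 intermediate frames
```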

View File

@@ -4,13 +4,8 @@ Example of how to request a Transfer
from typing import *
from hippolyzer.lib.base.legacy_inv import InventoryModel, InventoryItem
from hippolyzer.lib.base.message.message import Block
from hippolyzer.lib.proxy.addon_utils import BaseAddon, show_message
from hippolyzer.lib.proxy.commands import handle_command
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
from hippolyzer.lib.proxy.templates import (
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.templates import (
AssetType,
EstateAssetType,
TransferRequestParamsSimEstate,
@@ -18,6 +13,10 @@ from hippolyzer.lib.proxy.templates import (
TransferSourceType,
XferFilePath,
)
from hippolyzer.lib.proxy.addon_utils import BaseAddon, show_message
from hippolyzer.lib.proxy.commands import handle_command
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
class TransferExampleAddon(BaseAddon):
@@ -36,12 +35,12 @@ class TransferExampleAddon(BaseAddon):
async def get_first_script(self, session: Session, region: ProxiedRegion):
"""Get the contents of the first script in the selected object"""
# Ask for the object inventory so we can find a script
region.circuit.send_message(ProxiedMessage(
region.circuit.send_message(Message(
'RequestTaskInventory',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
Block('InventoryData', LocalID=session.selected.object_local),
))
inv_message = await region.message_handler.wait_for('ReplyTaskInventory', timeout=5.0)
inv_message = await region.message_handler.wait_for(('ReplyTaskInventory',), timeout=5.0)
# Xfer the inventory file and look for a script
xfer = await region.xfer_manager.request(

View File

@@ -0,0 +1,105 @@
"""
Speed up outbound object inventory listing requests
by 20x at the cost of potentially failing to request some due to
dropped packets.
Useful for builders working on objects with very large inventories that
change very often.
Object Inventory transfers use the Xfer system. Xfers have their own,
terrible reliability system that probably pre-dates LLUDP reliability.
Each packet has to be ACKed before the far end will send the next packet.
Each packet can be around 1200 bytes and will fit about 1.5 inventory items' worth of data.
Let's say your sim ping is 100 ms. Because each packet needs to be ACKed
before the next will be sent, it'll take around `num_items * 100 / 1.5`
milliseconds before you receive the full inventory list of an object.
That means for an object with 300 items, it'll take about 20 seconds
for you to download the full inventory, and those downloads are triggered
every time the inventory is changed.
By faking ACKs for packets we haven't received yet, we can trick the server
into sending us packets much faster than it would otherwise. The only problem
is that if an inbound SendXferPacket gets lost after we faked an ACK for it,
we have no way to re-request it. The Xfer will just fail. The viewer will also
drop any out-of-order xfer packets, so packet re-ordering is a problem.
To deal with that, the proxy attempts its own Xfers using all the chunks
from the previous attempts before sending a final, reconstructed Xfer
to the viewer.
"""
import asyncio
from typing import *
from hippolyzer.lib.base.templates import XferFilePath
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.network.transport import Direction
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
from hippolyzer.lib.base.xfer_manager import Xfer
class TurboObjectInventoryAddon(BaseAddon):
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
if message.direction != Direction.OUT:
return
if message.name != "RequestTaskInventory":
return
self._schedule_task(self._proxy_task_inventory_request(region, message.take()))
return True
async def _proxy_task_inventory_request(
self,
region: ProxiedRegion,
request_msg: Message
):
# Keep around a dict of chunks we saw previously in case we have to restart
# an Xfer due to missing chunks. We don't expect chunks to change across Xfers,
# so this can be used to recover from dropped SendXferPackets in subsequent attempts.
existing_chunks: Dict[int, bytes] = {}
for i in range(3):
# Any previous requests will have triggered a delete of the inventory file
# by marking it complete on the server side. Re-send our RequestTaskInventory
# to make sure there's a fresh copy.
region.circuit.send_message(request_msg.take())
inv_message = await region.message_handler.wait_for(('ReplyTaskInventory',), timeout=5.0)
# No task inventory, send the reply as-is
file_name = inv_message["InventoryData"]["Filename"]
if not file_name:
region.circuit.send_message(inv_message)
return
xfer = region.xfer_manager.request(
file_name=file_name,
file_path=XferFilePath.CACHE,
turbo=True,
)
xfer.chunks.update(existing_chunks)
try:
await xfer
except asyncio.TimeoutError:
# We likely failed the request due to missing chunks, store
# the chunks that we _did_ get for the next attempt.
existing_chunks.update(xfer.chunks)
continue
# Send the original ReplyTaskInventory to the viewer so it knows the file is ready
region.circuit.send_message(inv_message)
proxied_xfer = Xfer(data=xfer.reassemble_chunks())
# Wait for the viewer to request the inventory file
await region.xfer_manager.serve_inbound_xfer_request(
xfer=proxied_xfer,
request_predicate=lambda x: x["XferID"]["Filename"] == file_name,
# indra's XferManager throttles confirms, so even local transfers will be
# slow if we wait for confirmation.
wait_for_confirm=False,
)
return
raise asyncio.TimeoutError("Failed to get inventory after 3 tries")
addons = [TurboObjectInventoryAddon()]
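The latency estimate in the docstring (`num_items * 100 / 1.5` milliseconds) works out as follows:

```python
def inventory_xfer_seconds(num_items: int, ping_ms: float = 100.0,
                           items_per_packet: float = 1.5) -> float:
    # One full round trip per ~1200-byte packet, since each SendXferPacket
    # must be ACKed before the far end sends the next one
    packets = num_items / items_per_packet
    return packets * ping_ms / 1000.0

# The docstring's example: 300 items at 100 ms ping is about 20 seconds.
# Faking ACKs removes the per-packet round trips, hence the ~20x speedup.
assert inventory_xfer_seconds(300) == 20.0
```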

View File

@@ -11,15 +11,14 @@ from typing import *
import aiohttp
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Block
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.templates import AssetType
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.addon_utils import ais_item_to_inventory_data, show_message, BaseAddon
from hippolyzer.lib.proxy.commands import handle_command, Parameter
from hippolyzer.lib.proxy.packets import Direction
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.base.network.transport import Direction
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
from hippolyzer.lib.proxy.templates import AssetType
class UploaderAddon(BaseAddon):
@@ -92,7 +91,7 @@ class UploaderAddon(BaseAddon):
async with region.caps_client.post('FetchInventory2', llsd=ais_req_data) as resp:
ais_item = (await resp.read_llsd())["items"][0]
message = ProxiedMessage(
message = Message(
"UpdateCreateInventoryItem",
Block(
"AgentData",


@@ -1,28 +1,28 @@
"""
Example of how to request an Xfer
"""
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.legacy_inv import InventoryModel
from hippolyzer.lib.base.message.message import Block
from hippolyzer.lib.base.templates import XferFilePath, AssetType, InventoryType, WearableType
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.proxy.addon_utils import BaseAddon, show_message
from hippolyzer.lib.proxy.commands import handle_command
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
from hippolyzer.lib.proxy.templates import XferFilePath
class XferExampleAddon(BaseAddon):
@handle_command()
async def get_mute_list(self, session: Session, region: ProxiedRegion):
"""Fetch the current user's mute list"""
region.circuit.send_message(ProxiedMessage(
region.circuit.send_message(Message(
'MuteListRequest',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
Block("MuteData", MuteCRC=0),
))
# Wait for any MuteListUpdate, dropping it before it reaches the viewer
update_msg = await region.message_handler.wait_for('MuteListUpdate', timeout=5.0)
update_msg = await region.message_handler.wait_for(('MuteListUpdate',), timeout=5.0)
mute_file_name = update_msg["MuteData"]["Filename"]
if not mute_file_name:
show_message("Nobody muted?")
@@ -35,14 +35,14 @@ class XferExampleAddon(BaseAddon):
@handle_command()
async def get_task_inventory(self, session: Session, region: ProxiedRegion):
"""Get the inventory of the currently selected object"""
region.circuit.send_message(ProxiedMessage(
region.circuit.send_message(Message(
'RequestTaskInventory',
# If no session is passed in, we'll use the session that was active when the coro was created
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
Block('InventoryData', LocalID=session.selected.object_local),
))
inv_message = await region.message_handler.wait_for('ReplyTaskInventory', timeout=5.0)
inv_message = await region.message_handler.wait_for(('ReplyTaskInventory',), timeout=5.0)
# The Xfer doesn't need to be awaited immediately; multiple signals can be waited on.
xfer = region.xfer_manager.request(
@@ -60,5 +60,61 @@ class XferExampleAddon(BaseAddon):
item_names = [item.name for item in inv_model.items.values()]
show_message(item_names)
@handle_command()
async def eyes_for_you(self, session: Session, region: ProxiedRegion):
"""Upload an eye bodypart and create an item for it"""
asset_data = f"""LLWearable version 22
New Eyes
\tpermissions 0
\t{{
\t\tbase_mask\t7fffffff
\t\towner_mask\t7fffffff
\t\tgroup_mask\t00000000
\t\teveryone_mask\t00000000
\t\tnext_owner_mask\t00082000
\t\tcreator_id\t{session.agent_id}
\t\towner_id\t{session.agent_id}
\t\tlast_owner_id\t00000000-0000-0000-0000-000000000000
\t\tgroup_id\t00000000-0000-0000-0000-000000000000
\t}}
\tsale_info\t0
\t{{
\t\tsale_type\tnot
\t\tsale_price\t10
\t}}
type 3
parameters 2
98 0
99 0
textures 1
3 89556747-24cb-43ed-920b-47caed15465f
"""
# If we want to create an item containing the asset, we need to know the
# transaction ID used to create the asset.
transaction_id = UUID.random()
await region.xfer_manager.upload_asset(
AssetType.BODYPART,
data=asset_data,
transaction_id=transaction_id
)
region.circuit.send_message(Message(
'CreateInventoryItem',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
Block(
'InventoryBlock',
CallbackID=0,
# Null folder ID will put it in the default folder for the type
FolderID=UUID(),
TransactionID=transaction_id,
NextOwnerMask=0x7fFFffFF,
Type=AssetType.BODYPART,
InvType=InventoryType.WEARABLE,
WearableType=WearableType.EYES,
Name='Eyes For You',
Description=b''
),
))
addons = [XferExampleAddon()]

codecov.yml

@@ -0,0 +1,14 @@
coverage:
precision: 1
round: down
range: "50...80"
status:
project:
default:
# Do not fail commits if the code coverage drops.
target: 0%
threshold: 100%
base: auto
patch:
default:
only_pulls: true


@@ -19,9 +19,9 @@ class MessageLogHeader(enum.IntEnum):
class MessageLogModel(QtCore.QAbstractTableModel, FilteringMessageLogger):
def __init__(self, parent=None):
def __init__(self, parent=None, maxlen=2000):
QtCore.QAbstractTableModel.__init__(self, parent)
FilteringMessageLogger.__init__(self)
FilteringMessageLogger.__init__(self, maxlen=maxlen)
def _begin_insert(self, insert_idx: int):
self.beginInsertRows(QtCore.QModelIndex(), insert_idx, insert_idx)


@@ -17,15 +17,16 @@ from hippolyzer.lib.proxy.commands import handle_command
from hippolyzer.lib.proxy.http_proxy import create_http_proxy, create_proxy_master, HTTPFlowContext
from hippolyzer.lib.proxy.http_event_manager import MITMProxyEventManager
from hippolyzer.lib.proxy.lludp_proxy import SLSOCKS5Server
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import SessionManager, Session
from hippolyzer.lib.proxy.settings import ProxySettings
LOG = logging.getLogger(__name__)
class SelectionManagerAddon(BaseAddon):
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: ProxiedMessage):
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
selected = session.selected
if message.name == "ObjectSelect":
# ObjectDeselect intentionally ignored to deal with messages that
@@ -42,7 +43,7 @@ class SelectionManagerAddon(BaseAddon):
LOG.debug(f"Don't know about selected {local_id}, requesting object")
needed_objects.add(local_id)
if needed_objects:
if needed_objects and session.session_manager.settings.ALLOW_AUTO_REQUEST_OBJECTS:
region.objects.request_objects(needed_objects)
# ParcelDwellRequests are sent whenever "about land" is opened. This gives us a
# decent mechanism for selecting parcels.
@@ -91,8 +92,8 @@ def run_http_proxy_process(proxy_host, http_proxy_port, flow_context: HTTPFlowCo
mitm_loop.run_forever()
def start_proxy(extra_addons: Optional[list] = None, extra_addon_paths: Optional[list] = None,
session_manager=None, proxy_host=None):
def start_proxy(session_manager: SessionManager, extra_addons: Optional[list] = None,
extra_addon_paths: Optional[list] = None, proxy_host=None):
extra_addons = extra_addons or []
extra_addon_paths = extra_addon_paths or []
extra_addons.append(SelectionManagerAddon())
@@ -105,20 +106,20 @@ def start_proxy(extra_addons: Optional[list] = None, extra_addon_paths: Optional
loop = asyncio.get_event_loop()
udp_proxy_port = int(os.environ.get("HIPPO_UDP_PORT", 9061))
http_proxy_port = int(os.environ.get("HIPPO_HTTP_PORT", 9062))
udp_proxy_port = session_manager.settings.SOCKS_PROXY_PORT
http_proxy_port = session_manager.settings.HTTP_PROXY_PORT
if proxy_host is None:
proxy_host = os.environ.get("HIPPO_BIND_HOST", "127.0.0.1")
proxy_host = session_manager.settings.PROXY_BIND_ADDR
session_manager = session_manager or SessionManager()
flow_context = session_manager.flow_context
session_manager.name_cache.load_viewer_caches()
# TODO: argparse
if len(sys.argv) == 3:
if sys.argv[1] == "--setup-ca":
try:
mitmproxy_master = create_http_proxy(proxy_host, http_proxy_port, flow_context)
except mitmproxy.exceptions.ServerException:
except mitmproxy.exceptions.MitmproxyException:
# Proxy already running; create the master so we don't try to bind to a port
mitmproxy_master = create_proxy_master(proxy_host, http_proxy_port, flow_context)
setup_ca(sys.argv[2], mitmproxy_master)
@@ -136,7 +137,7 @@ def start_proxy(extra_addons: Optional[list] = None, extra_addon_paths: Optional
async_server = loop.run_until_complete(coro)
event_manager = MITMProxyEventManager(session_manager, flow_context)
loop.create_task(event_manager.pump_proxy_events())
loop.create_task(event_manager.run())
addon_paths = sys.argv[1:]
addon_paths.extend(extra_addon_paths)
@@ -179,10 +180,15 @@ def start_proxy(extra_addons: Optional[list] = None, extra_addon_paths: Optional
def _windows_timeout_killer(pid: int):
time.sleep(2.0)
print(f"Killing hanging event loop")
print("Killing hanging event loop")
os.kill(pid, 9)
def main():
multiprocessing.set_start_method("spawn")
start_proxy()
start_proxy(SessionManager(ProxySettings()))
if __name__ == "__main__":
multiprocessing.freeze_support()
main()


@@ -1,5 +1,6 @@
import asyncio
import base64
import dataclasses
import email
import functools
import html
@@ -8,7 +9,6 @@ import json
import logging
import pathlib
import multiprocessing
import os
import re
import signal
import socket
@@ -17,38 +17,44 @@ import urllib.parse
from typing import *
import multidict
from qasync import QEventLoop
from qasync import QEventLoop, asyncSlot
from PySide2 import QtCore, QtWidgets, QtGui
from hippolyzer.apps.model import MessageLogModel, MessageLogHeader, RegionListModel
from hippolyzer.apps.proxy import start_proxy
from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.helpers import bytes_unescape, bytes_escape
from hippolyzer.lib.base.helpers import bytes_unescape, bytes_escape, get_resource_filename
from hippolyzer.lib.base.message.llsd_msg_serializer import LLSDMessageSerializer
from hippolyzer.lib.base.message.message import Block
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.message.message_formatting import (
HumanMessageSerializer,
VerbatimHumanVal,
subfield_eval,
SpannedString,
)
from hippolyzer.lib.base.message.msgtypes import MsgType
from hippolyzer.lib.base.message.template_dict import TemplateDictionary
from hippolyzer.lib.base.message.template_dict import DEFAULT_TEMPLATE_DICT
from hippolyzer.lib.base.ui_helpers import loadUi
import hippolyzer.lib.base.serialization as se
from hippolyzer.lib.base.network.transport import Direction, SocketUDPTransport
from hippolyzer.lib.proxy.addons import BaseInteractionManager, AddonManager
from hippolyzer.lib.proxy.ca_utils import setup_ca_everywhere
from hippolyzer.lib.proxy.caps_client import CapsClient
from hippolyzer.lib.proxy.caps_client import ProxyCapsClient
from hippolyzer.lib.proxy.http_proxy import create_proxy_master, HTTPFlowContext
from hippolyzer.lib.proxy.packets import Direction
from hippolyzer.lib.proxy.message import ProxiedMessage, VerbatimHumanVal, proxy_eval
from hippolyzer.lib.proxy.message_logger import LLUDPMessageLogEntry, AbstractMessageLogEntry
from hippolyzer.lib.proxy.message_logger import LLUDPMessageLogEntry, AbstractMessageLogEntry, WrappingMessageLogger, \
import_log_entries, export_log_entries
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session, SessionManager
from hippolyzer.lib.proxy.settings import ProxySettings
from hippolyzer.lib.proxy.templates import CAP_TEMPLATES
LOG = logging.getLogger(__name__)
BASE_PATH = os.path.dirname(os.path.abspath(__file__))
MAIN_WINDOW_UI_PATH = os.path.join(BASE_PATH, "proxy_mainwindow.ui")
MESSAGE_BUILDER_UI_PATH = os.path.join(BASE_PATH, "message_builder.ui")
ADDON_DIALOG_UI_PATH = os.path.join(BASE_PATH, "addon_dialog.ui")
FILTER_DIALOG_UI_PATH = os.path.join(BASE_PATH, "filter_dialog.ui")
MAIN_WINDOW_UI_PATH = get_resource_filename("apps/proxy_mainwindow.ui")
MESSAGE_BUILDER_UI_PATH = get_resource_filename("apps/message_builder.ui")
ADDON_DIALOG_UI_PATH = get_resource_filename("apps/addon_dialog.ui")
FILTER_DIALOG_UI_PATH = get_resource_filename("apps/filter_dialog.ui")
def show_error_message(error_msg, parent=None):
@@ -63,11 +69,11 @@ class GUISessionManager(SessionManager, QtCore.QObject):
regionAdded = QtCore.Signal(ProxiedRegion)
regionRemoved = QtCore.Signal(ProxiedRegion)
def __init__(self, model):
SessionManager.__init__(self)
def __init__(self, settings):
SessionManager.__init__(self, settings)
QtCore.QObject.__init__(self)
self.all_regions = []
self.message_logger = model
self.message_logger = WrappingMessageLogger()
def checkRegions(self):
new_regions = itertools.chain(*[s.regions for s in self.sessions])
@@ -96,12 +102,16 @@ class GUIInteractionManager(BaseInteractionManager, QtCore.QObject):
dialog.open()
return future
async def _file_dialog(self, caption: str, directory: str, filter_str: str, mode: QtWidgets.QFileDialog.FileMode) \
-> Tuple[bool, QtWidgets.QFileDialog]:
async def _file_dialog(
self, caption: str, directory: str, filter_str: str, mode: QtWidgets.QFileDialog.FileMode,
default_suffix: str = '',
) -> Tuple[bool, QtWidgets.QFileDialog]:
dialog = QtWidgets.QFileDialog(self.parent(), caption=caption, directory=directory, filter=filter_str)
dialog.setFileMode(mode)
if mode == QtWidgets.QFileDialog.FileMode.AnyFile:
dialog.setAcceptMode(QtWidgets.QFileDialog.AcceptMode.AcceptSave)
if default_suffix:
dialog.setDefaultSuffix(default_suffix)
res = await self._dialog_async_exec(dialog)
return res, dialog
@@ -129,14 +139,44 @@ class GUIInteractionManager(BaseInteractionManager, QtCore.QObject):
return None
return dialog.selectedFiles()[0]
async def save_file(self, caption: str = '', directory: str = '', filter_str: str = '') -> Optional[str]:
async def save_file(self, caption: str = '', directory: str = '', filter_str: str = '',
default_suffix: str = '') -> Optional[str]:
res, dialog = await self._file_dialog(
caption, directory, filter_str, QtWidgets.QFileDialog.FileMode.AnyFile
caption, directory, filter_str, QtWidgets.QFileDialog.FileMode.AnyFile, default_suffix,
)
if not res or not dialog.selectedFiles():
return None
return dialog.selectedFiles()[0]
async def confirm(self, title: str, caption: str) -> bool:
msg = QtWidgets.QMessageBox(
QtWidgets.QMessageBox.Icon.Question,
title,
caption,
QtWidgets.QMessageBox.Ok | QtWidgets.QMessageBox.Cancel,
self.parent(),
)
fut = asyncio.Future()
msg.finished.connect(lambda r: fut.set_result(r))
msg.open()
return (await fut) == QtWidgets.QMessageBox.Ok
class GUIProxySettings(ProxySettings):
"""Persistent settings backed by QSettings"""
def __init__(self, settings: QtCore.QSettings):
super().__init__()
self._settings_obj = settings
def get_setting(self, name: str) -> Any:
val: Any = self._settings_obj.value(name, defaultValue=dataclasses.MISSING)
if val is dataclasses.MISSING:
return val
return json.loads(val)
def set_setting(self, name: str, val: Any):
self._settings_obj.setValue(name, json.dumps(val))
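The pattern in `GUIProxySettings` — JSON round-tripping values through the backing store, with `dataclasses.MISSING` as the "unset" sentinel — can be sketched without Qt. A plain dict stands in for `QSettings` here (an assumption for illustration):

```python
import dataclasses
import json
from typing import Any, Dict

class DictBackedSettings:
    """Dict-backed sketch of the QSettings-backed pattern above."""
    def __init__(self):
        self._store: Dict[str, str] = {}

    def get_setting(self, name: str) -> Any:
        # MISSING distinguishes "never set" from stored falsy values
        # like 0, "" or False, which json.loads would legitimately return.
        val = self._store.get(name, dataclasses.MISSING)
        if val is dataclasses.MISSING:
            return val
        return json.loads(val)

    def set_setting(self, name: str, val: Any) -> None:
        # Serializing to JSON keeps the store string-typed and type-agnostic.
        self._store[name] = json.dumps(val)
```

The sentinel matters because `QSettings.value()` (and `dict.get()`) would otherwise need `None` as the default, making a stored `null` indistinguishable from an unset key.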
def nonFatalExceptions(f):
@functools.wraps(f)
@@ -151,7 +191,35 @@ def nonFatalExceptions(f):
return _wrapper
class ProxyGUI(QtWidgets.QMainWindow):
def buildReplacements(session: Session, region: ProxiedRegion):
if not session or not region:
return {}
selected = session.selected
agent_object = region.objects.lookup_fullid(session.agent_id)
selected_local = selected.object_local
selected_object = None
if selected_local:
# We may or may not have an object for this
selected_object = region.objects.lookup_localid(selected_local)
return {
"SELECTED_LOCAL": selected_local,
"SELECTED_FULL": selected_object.FullID if selected_object else None,
"SELECTED_PARCEL_LOCAL": selected.parcel_local,
"SELECTED_PARCEL_FULL": selected.parcel_full,
"SELECTED_SCRIPT_ITEM": selected.script_item,
"SELECTED_TASK_ITEM": selected.task_item,
"AGENT_ID": session.agent_id,
"AGENT_LOCAL": agent_object.LocalID if agent_object else None,
"SESSION_ID": session.id,
"AGENT_POS": agent_object.Position if agent_object else None,
"NULL_KEY": UUID(),
"RANDOM_KEY": UUID.random,
"CIRCUIT_CODE": session.circuit_code,
"REGION_HANDLE": region.handle,
}
class MessageLogWindow(QtWidgets.QMainWindow):
DEFAULT_IGNORE = "StartPingCheck CompletePingCheck PacketAck SimulatorViewerTimeMessage SimStats " \
"AgentUpdate AgentAnimation AvatarAnimation ViewerEffect CoarseLocationUpdate LayerData " \
"CameraConstraint ObjectUpdateCached RequestMultipleObjects ObjectUpdate ObjectUpdateCompressed " \
@@ -163,41 +231,59 @@ class ProxyGUI(QtWidgets.QMainWindow):
"ViewerAsset GetTexture SetAlwaysRun GetDisplayNames MapImageService MapItemReply".split(" ")
DEFAULT_FILTER = f"!({' || '.join(ignored for ignored in DEFAULT_IGNORE)})"
def __init__(self):
super().__init__()
textRequest: QtWidgets.QTextEdit
def __init__(
self, settings: GUIProxySettings, session_manager: GUISessionManager,
log_live_messages: bool, parent: Optional[QtWidgets.QWidget] = None,
):
super().__init__(parent=parent)
loadUi(MAIN_WINDOW_UI_PATH, self)
self.settings = QtCore.QSettings("SaladDais", "hippolyzer")
if parent:
self.setWindowTitle("Message Log")
self.menuBar.setEnabled(False) # type: ignore
self.menuBar.hide() # type: ignore
self._selectedEntry: Optional[AbstractMessageLogEntry] = None
self.model = MessageLogModel(parent=self.tableView)
self.settings = settings
self.sessionManager = session_manager
if log_live_messages:
self.model = MessageLogModel(parent=self.tableView)
session_manager.message_logger.loggers.append(self.model)
else:
self.model = MessageLogModel(parent=self.tableView, maxlen=None)
self.tableView.setModel(self.model)
self.model.rowsAboutToBeInserted.connect(self.beforeInsert)
self.model.rowsInserted.connect(self.afterInsert)
self.tableView.selectionModel().selectionChanged.connect(self._messageSelected)
self.checkBeautify.clicked.connect(self._showSelectedMessage)
self.checkPause.clicked.connect(self._setPaused)
self._setFilter(self.DEFAULT_FILTER)
self.setFilter(self.DEFAULT_FILTER)
self.btnClearLog.clicked.connect(self.model.clear)
self.lineEditFilter.editingFinished.connect(self._setFilter)
self.lineEditFilter.editingFinished.connect(self.setFilter)
self.btnMessageBuilder.clicked.connect(self._sendToMessageBuilder)
self.btnCopyRepr.clicked.connect(self._copyRepr)
self.actionInstallHTTPSCerts.triggered.connect(self._installHTTPSCerts)
self.actionManageAddons.triggered.connect(self._manageAddons)
self.actionManageFilters.triggered.connect(self._manageFilters)
self.actionOpenMessageBuilder.triggered.connect(self._openMessageBuilder)
self.actionProxyRemotelyAccessible.setChecked(
self.settings.value("RemotelyAccessible", False, type=bool))
self.actionProxyRemotelyAccessible.setChecked(self.settings.REMOTELY_ACCESSIBLE)
self.actionUseViewerObjectCache.setChecked(self.settings.USE_VIEWER_OBJECT_CACHE)
self.actionRequestMissingObjects.setChecked(self.settings.AUTOMATICALLY_REQUEST_MISSING_OBJECTS)
self.actionProxyRemotelyAccessible.triggered.connect(self._setProxyRemotelyAccessible)
self.actionUseViewerObjectCache.triggered.connect(self._setUseViewerObjectCache)
self.actionRequestMissingObjects.triggered.connect(self._setRequestMissingObjects)
self.actionOpenNewMessageLogWindow.triggered.connect(self._openNewMessageLogWindow)
self.actionImportLogEntries.triggered.connect(self._importLogEntries)
self.actionExportLogEntries.triggered.connect(self._exportLogEntries)
self._filterMenu = QtWidgets.QMenu()
self._populateFilterMenu()
self.toolButtonFilter.setMenu(self._filterMenu)
self.sessionManager = GUISessionManager(self.model)
self.interactionManager = GUIInteractionManager(self)
AddonManager.UI = self.interactionManager
self._shouldScrollOnInsert = True
self.tableView.horizontalHeader().resizeSection(MessageLogHeader.Host, 80)
self.tableView.horizontalHeader().resizeSection(MessageLogHeader.Method, 60)
@@ -206,24 +292,27 @@ class ProxyGUI(QtWidgets.QMainWindow):
self.textResponse.hide()
def closeEvent(self, event) -> None:
loggers = self.sessionManager.message_logger.loggers
if self.model in loggers:
loggers.remove(self.model)
super().closeEvent(event)
def _populateFilterMenu(self):
def _addFilterAction(text, filter_str):
filter_action = QtWidgets.QAction(text, self)
filter_action.triggered.connect(lambda: self._setFilter(filter_str))
filter_action.triggered.connect(lambda: self.setFilter(filter_str))
self._filterMenu.addAction(filter_action)
self._filterMenu.clear()
_addFilterAction("Default", self.DEFAULT_FILTER)
filters = self.getFilterDict()
filters = self.settings.FILTERS
for preset_name, preset_filter in filters.items():
_addFilterAction(preset_name, preset_filter)
def getFilterDict(self):
return json.loads(str(self.settings.value("Filters", "{}")))
def setFilterDict(self, val: dict):
self.settings.setValue("Filters", json.dumps(val))
self.settings.FILTERS = val
self._populateFilterMenu()
def _manageFilters(self):
@@ -231,7 +320,7 @@ class ProxyGUI(QtWidgets.QMainWindow):
dialog.exec_()
@nonFatalExceptions
def _setFilter(self, filter_str=None):
def setFilter(self, filter_str=None):
if filter_str is None:
filter_str = self.lineEditFilter.text()
else:
@@ -263,10 +352,25 @@ class ProxyGUI(QtWidgets.QMainWindow):
return
req = entry.request(
beautify=self.checkBeautify.isChecked(),
replacements=self.buildReplacements(entry.session, entry.region),
replacements=buildReplacements(entry.session, entry.region),
)
resp = entry.response(beautify=self.checkBeautify.isChecked())
highlight_range = None
if isinstance(req, SpannedString):
match_result = self.model.filter.match(entry)
# Match result was a tuple indicating what matched
if isinstance(match_result, tuple):
highlight_range = req.spans.get(match_result)
self.textRequest.setPlainText(req)
if highlight_range:
cursor = self.textRequest.textCursor()
cursor.setPosition(highlight_range[0], QtGui.QTextCursor.MoveAnchor)
cursor.setPosition(highlight_range[1], QtGui.QTextCursor.KeepAnchor)
highlight_format = QtGui.QTextBlockFormat()
highlight_format.setBackground(QtCore.Qt.yellow)
cursor.setBlockFormat(highlight_format)
resp = entry.response(beautify=self.checkBeautify.isChecked())
if resp:
self.textResponse.show()
self.textResponse.setPlainText(resp)
@@ -288,7 +392,7 @@ class ProxyGUI(QtWidgets.QMainWindow):
win.show()
msg = self._selectedEntry
beautify = self.checkBeautify.isChecked()
replacements = self.buildReplacements(msg.session, msg.region)
replacements = buildReplacements(msg.session, msg.region)
win.setMessageText(msg.request(beautify=beautify, replacements=replacements))
@nonFatalExceptions
@@ -304,32 +408,38 @@ class ProxyGUI(QtWidgets.QMainWindow):
win = MessageBuilderWindow(self, self.sessionManager)
win.show()
def buildReplacements(self, session: Session, region: ProxiedRegion):
if not session or not region:
return {}
selected = session.selected
agent_object = region.objects.lookup_fullid(session.agent_id)
selected_local = selected.object_local
selected_object = None
if selected_local:
# We may or may not have an object for this
selected_object = region.objects.lookup_localid(selected_local)
return {
"SELECTED_LOCAL": selected_local,
"SELECTED_FULL": selected_object.FullID if selected_object else None,
"SELECTED_PARCEL_LOCAL": selected.parcel_local,
"SELECTED_PARCEL_FULL": selected.parcel_full,
"SELECTED_SCRIPT_ITEM": selected.script_item,
"SELECTED_TASK_ITEM": selected.task_item,
"AGENT_ID": session.agent_id,
"AGENT_LOCAL": agent_object.LocalID if agent_object else None,
"SESSION_ID": session.id,
"AGENT_POS": agent_object.Position if agent_object else None,
"NULL_KEY": UUID(),
"RANDOM_KEY": UUID.random,
"CIRCUIT_CODE": session.circuit_code,
"REGION_HANDLE": region.handle,
}
def _openNewMessageLogWindow(self):
win: QtWidgets.QMainWindow = MessageLogWindow(
self.settings, self.sessionManager, log_live_messages=True, parent=self)
win.setFilter(self.lineEditFilter.text())
win.show()
win.activateWindow()
@asyncSlot()
async def _importLogEntries(self):
log_file = await AddonManager.UI.open_file(
caption="Import Log Entries", filter_str="Hippolyzer Logs (*.hippolog)"
)
if not log_file:
return
win = MessageLogWindow(self.settings, self.sessionManager, log_live_messages=False, parent=self)
win.setFilter(self.lineEditFilter.text())
with open(log_file, "rb") as f:
entries = import_log_entries(f.read())
for entry in entries:
win.model.add_log_entry(entry)
win.show()
win.activateWindow()
@asyncSlot()
async def _exportLogEntries(self):
log_file = await AddonManager.UI.save_file(
caption="Export Log Entries", filter_str="Hippolyzer Logs (*.hippolog)", default_suffix="hippolog",
)
if not log_file:
return
with open(log_file, "wb") as f:
f.write(export_log_entries(self.model))
def _installHTTPSCerts(self):
msg = QtWidgets.QMessageBox()
@@ -353,20 +463,26 @@ class ProxyGUI(QtWidgets.QMainWindow):
msg.exec()
def _setProxyRemotelyAccessible(self, checked: bool):
self.settings.setValue("RemotelyAccessible", checked)
self.sessionManager.settings.REMOTELY_ACCESSIBLE = checked
msg = QtWidgets.QMessageBox()
msg.setText("Remote accessibility setting changes will take effect on next run")
msg.exec()
def _setUseViewerObjectCache(self, checked: bool):
self.sessionManager.settings.USE_VIEWER_OBJECT_CACHE = checked
def _setRequestMissingObjects(self, checked: bool):
self.sessionManager.settings.AUTOMATICALLY_REQUEST_MISSING_OBJECTS = checked
def _manageAddons(self):
dialog = AddonDialog(self)
dialog.exec_()
def getAddonList(self) -> List[str]:
return json.loads(str(self.settings.value("Addons", "[]")))
return self.sessionManager.settings.ADDON_SCRIPTS
def setAddonList(self, val: List[str]):
self.settings.setValue("Addons", json.dumps(val))
self.sessionManager.settings.ADDON_SCRIPTS = val
BANNED_HEADERS = ("content-length", "host")
@@ -404,7 +520,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
def __init__(self, parent, session_manager):
super().__init__(parent=parent)
loadUi(MESSAGE_BUILDER_UI_PATH, self)
self.templateDict = TemplateDictionary()
self.templateDict = DEFAULT_TEMPLATE_DICT
self.llsdSerializer = LLSDMessageSerializer()
self.sessionManager: SessionManager = session_manager
self.regionModel = RegionListModel(self, self.sessionManager)
@@ -491,7 +607,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
self.textRequest.clear()
template = self.templateDict[message_name]
msg = ProxiedMessage(message_name, direction=Direction.OUT)
msg = Message(message_name, direction=Direction.OUT)
for tmpl_block in template.blocks:
num_blocks = tmpl_block.number or 1
@@ -502,7 +618,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
msg_block = Block(tmpl_block.name, **fill_vars)
msg.add_block(msg_block)
self.textRequest.setPlainText(
msg.to_human_string(replacements={}, beautify=True, template=template)
HumanMessageSerializer.to_human_string(msg, replacements={}, beautify=True, template=template)
)
def _getVarPlaceholder(self, msg, block, var):
@@ -533,24 +649,9 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
if var.name in ("TaskID", "ObjectID"):
return VerbatimHumanVal("[[SELECTED_FULL]]")
if var.type.is_int:
return 0
elif var.type.is_float:
return 0.0
elif var.type == MsgType.MVT_LLUUID:
return UUID()
elif var.type == MsgType.MVT_BOOL:
return False
elif var.type == MsgType.MVT_VARIABLE:
return ""
elif var.type in (MsgType.MVT_LLVector3, MsgType.MVT_LLVector3d, MsgType.MVT_LLQuaternion):
return VerbatimHumanVal("(0.0, 0.0, 0.0)")
elif var.type == MsgType.MVT_LLVector4:
return VerbatimHumanVal("(0.0, 0.0, 0.0, 0.0)")
elif var.type == MsgType.MVT_FIXED:
return b"\x00" * var.size
elif var.type == MsgType.MVT_IP_ADDR:
return "0.0.0.0"
default_val = var.default_value
if default_val is not None:
return default_val
return VerbatimHumanVal("")
@nonFatalExceptions
@@ -558,10 +659,12 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
session, region = self._getTarget()
msg_text = self.textRequest.toPlainText()
replacements = self.parent().buildReplacements(session, region)
replacements = buildReplacements(session, region)
if re.match(r"\A\s*(in|out)\s+", msg_text, re.I):
sender_func = self._sendLLUDPMessage
elif re.match(r"\A\s*(eq)\s+", msg_text, re.I):
sender_func = self._sendEQMessage
elif re.match(r"\A.*http/[0-9.]+\r?\n", msg_text, re.I):
sender_func = self._sendHTTPMessage
else:
@@ -585,10 +688,10 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
env = self._buildEnv(session, region)
# We specifically want to allow `eval()` in messages since
# messages from here are trusted.
msg = ProxiedMessage.from_human_string(msg_text, replacements, env, safe=False)
msg = HumanMessageSerializer.from_human_string(msg_text, replacements, env, safe=False)
if self.checkLLUDPViaCaps.isChecked():
if msg.direction == Direction.IN:
region.eq_manager.queue_event(
region.eq_manager.inject_event(
self.llsdSerializer.serialize(msg, as_dict=True)
)
else:
@@ -600,9 +703,22 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
)
else:
transport = None
if self.checkOffCircuit.isChecked():
transport = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
off_circuit = self.checkOffCircuit.isChecked()
if off_circuit:
transport = SocketUDPTransport(socket.socket(socket.AF_INET, socket.SOCK_DGRAM))
region.circuit.send_message(msg, transport=transport)
if off_circuit:
transport.close()
def _sendEQMessage(self, session, region: Optional[ProxiedRegion], msg_text: str, _replacements: dict):
if not session or not region:
raise RuntimeError("Need a valid session and region to send EQ event")
message_line, _, body = (x.strip() for x in msg_text.partition("\n"))
message_name = message_line.rsplit(" ", 1)[-1]
region.eq_manager.inject_event({
"message": message_name,
"body": llsd.parse_xml(body.encode("utf8")),
})
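The header parsing in `_sendEQMessage` is compact; isolated, it looks like the sketch below (a restatement of the two parsing lines above, not new behavior — the example message name is made up):

```python
def parse_eq_text(msg_text: str):
    """Split builder text like 'EQ SomeEvent\n<llsd>...' into name and body.

    The first line ends with the event message name; everything after the
    first newline is the LLSD XML body.
    """
    message_line, _, body = (x.strip() for x in msg_text.partition("\n"))
    message_name = message_line.rsplit(" ", 1)[-1]
    return message_name, body
```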
def _sendHTTPMessage(self, session, region, msg_text: str, replacements: dict):
env = self._buildEnv(session, region)
@@ -662,7 +778,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
elif directive == b"UNESCAPE":
val = unescaped_contents
elif directive == b"EVAL":
val = proxy_eval(contents.decode("utf8").strip(), globals_={**env, **replacements})
val = subfield_eval(contents.decode("utf8").strip(), globals_={**env, **replacements})
val = _coerce_to_bytes(val)
elif directive == b"REPL":
val = _coerce_to_bytes(replacements[contents.decode("utf8").strip()])
@@ -677,7 +793,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
return val
def _sendHTTPRequest(self, method, uri, headers, body):
caps_client = CapsClient()
caps_client = ProxyCapsClient(self.sessionManager.settings)
async def _send_request():
req = caps_client.request(method, uri, headers=headers, data=body)
@@ -692,7 +808,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
class AddonDialog(QtWidgets.QDialog):
listAddons: QtWidgets.QListWidget
def __init__(self, parent: ProxyGUI):
def __init__(self, parent: MessageLogWindow):
super().__init__()
loadUi(ADDON_DIALOG_UI_PATH, self)
@@ -743,7 +859,7 @@ class AddonDialog(QtWidgets.QDialog):
class FilterDialog(QtWidgets.QDialog):
listFilters: QtWidgets.QListWidget
def __init__(self, parent: ProxyGUI):
def __init__(self, parent: MessageLogWindow):
super().__init__()
loadUi(FILTER_DIALOG_UI_PATH, self)
@@ -787,18 +903,25 @@ def gui_main():
app = QtWidgets.QApplication(sys.argv)
loop = QEventLoop(app)
asyncio.set_event_loop(loop)
window = ProxyGUI()
settings = GUIProxySettings(QtCore.QSettings("SaladDais", "hippolyzer"))
session_manager = GUISessionManager(settings)
window = MessageLogWindow(settings, session_manager, log_live_messages=True)
AddonManager.UI = GUIInteractionManager(window)
timer = QtCore.QTimer(app)
timer.timeout.connect(window.sessionManager.checkRegions)
timer.start(100)
signal.signal(signal.SIGINT, lambda *args: QtWidgets.QApplication.quit())
window.show()
- remote_access = window.settings.value("RemotelyAccessible", False, type=bool)
http_host = None
- if remote_access:
+ if window.sessionManager.settings.REMOTELY_ACCESSIBLE:
http_host = "0.0.0.0"
start_proxy(
session_manager=window.sessionManager,
extra_addon_paths=window.getAddonList(),
proxy_host=http_host,
)
if __name__ == "__main__":
multiprocessing.freeze_support()
gui_main()


@@ -256,12 +256,18 @@
<bool>true</bool>
</property>
<addaction name="actionOpenMessageBuilder"/>
<addaction name="actionOpenNewMessageLogWindow"/>
<addaction name="separator"/>
<addaction name="actionImportLogEntries"/>
<addaction name="actionExportLogEntries"/>
<addaction name="separator"/>
<addaction name="actionInstallHTTPSCerts"/>
<addaction name="actionManageAddons"/>
<addaction name="actionManageFilters"/>
<addaction name="separator"/>
<addaction name="actionProxyRemotelyAccessible"/>
<addaction name="actionUseViewerObjectCache"/>
<addaction name="actionRequestMissingObjects"/>
</widget>
<addaction name="menuFile"/>
</widget>
@@ -299,6 +305,43 @@
<string>Make the proxy accessible from other devices on the network</string>
</property>
</action>
<action name="actionUseViewerObjectCache">
<property name="checkable">
<bool>true</bool>
</property>
<property name="text">
<string>Use Viewer Object Cache</string>
</property>
<property name="toolTip">
<string>Can help make the proxy aware of certain objects, but can cause slowdowns</string>
</property>
</action>
<action name="actionRequestMissingObjects">
<property name="checkable">
<bool>true</bool>
</property>
<property name="text">
<string>Automatically Request Missing Objects</string>
</property>
<property name="toolTip">
<string>Force the proxy to request objects that it doesn't know about due to cache misses</string>
</property>
</action>
<action name="actionOpenNewMessageLogWindow">
<property name="text">
<string>Open New Message Log Window</string>
</property>
</action>
<action name="actionImportLogEntries">
<property name="text">
<string>Import Log Entries</string>
</property>
</action>
<action name="actionExportLogEntries">
<property name="text">
<string>Export Log Entries</string>
</property>
</action>
</widget>
<resources/>
<connections/>


@@ -0,0 +1,91 @@
"""
Assorted utilities to make creating animations from scratch easier
"""
import copy
from typing import List, Union
from hippolyzer.lib.base.datatypes import Vector3, Quaternion
from hippolyzer.lib.base.llanim import PosKeyframe, RotKeyframe
def smooth_step(t: float):
t = max(0.0, min(1.0, t))
return t * t * (3 - 2 * t)
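For reference, the easing above is the standard smoothstep cubic, 3t² − 2t³. A standalone sanity check of its properties (the function is copied here verbatim so the snippet runs on its own):

```python
def smooth_step(t: float) -> float:
    # Clamp t to [0, 1], then apply the cubic Hermite easing 3t^2 - 2t^3
    t = max(0.0, min(1.0, t))
    return t * t * (3 - 2 * t)

# Fixed endpoints, symmetric about the midpoint, clamped outside [0, 1]
assert smooth_step(0.0) == 0.0 and smooth_step(1.0) == 1.0
assert smooth_step(0.5) == 0.5
assert abs(smooth_step(0.25) + smooth_step(0.75) - 1.0) < 1e-12
assert smooth_step(-2.0) == 0.0 and smooth_step(3.0) == 1.0
```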
def rot_interp(r0: Quaternion, r1: Quaternion, t: float):
"""
Bad quaternion interpolation
TODO: This is definitely not correct, yet it seems to work OK in practice. Implement proper slerp.
"""
# Ignore W
r0 = r0.data(3)
r1 = r1.data(3)
return Quaternion(*map(lambda pair: ((pair[0] * (1.0 - t)) + (pair[1] * t)), zip(r0, r1)))
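The TODO above could be resolved with a standard slerp. A minimal standalone sketch, using plain (x, y, z, w) tuples as hypothetical stand-ins for hippolyzer's Quaternion class:

```python
import math

def slerp(r0, r1, t):
    """Spherical interpolation between unit quaternions given as (x, y, z, w) tuples."""
    dot = sum(a * b for a, b in zip(r0, r1))
    if dot < 0.0:
        # Negate one endpoint so we interpolate along the shorter arc
        r1 = tuple(-c for c in r1)
        dot = -dot
    if dot > 0.9995:
        # Nearly parallel: normalized lerp avoids dividing by sin(theta) ~= 0
        out = tuple(a + t * (b - a) for a, b in zip(r0, r1))
        norm = math.sqrt(sum(c * c for c in out))
        return tuple(c / norm for c in out)
    theta = math.acos(dot)
    s0 = math.sin((1.0 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return tuple(a * s0 + b * s1 for a, b in zip(r0, r1))
```

Unlike the component-wise mix above, this keeps the result on the unit sphere and rotates at a constant angular rate.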
def unique_frames(frames: List[Union[PosKeyframe, RotKeyframe]]):
"""Drop frames where time and coordinate are exact duplicates of another frame"""
new_frames = []
for frame in frames:
# TODO: fudge factor for float comparison instead
if frame not in new_frames:
new_frames.append(frame)
return new_frames
def shift_keyframes(frames: List[Union[PosKeyframe, RotKeyframe]], num: int):
"""
Shift keyframes around by `num` frames
Assumes keyframes occur at a set cadence, and that first and last keyframe are at the same coord.
"""
# Get rid of duplicate frames
frames = unique_frames(frames)
pop_idx = -1
insert_idx = 0
if num < 0:
insert_idx = len(frames) - 1
pop_idx = 0
num = -num
old_times = [f.time for f in frames]
new_frames = frames.copy()
# Drop last, duped frame. We'll copy the first frame to replace it later
new_frames.pop(-1)
for _ in range(num):
new_frames.insert(insert_idx, new_frames.pop(pop_idx))
# Put first frame back on the end
new_frames.append(copy.copy(new_frames[0]))
assert len(old_times) == len(new_frames)
assert new_frames[0] == new_frames[-1]
# Make the times of the shifted keyframes match up with the previous timeline
for old_time, new_frame in zip(old_times, new_frames):
new_frame.time = old_time
return new_frames
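As a sanity check on the rotation logic above, here is a standalone sketch of the same algorithm operating on plain `[time, value]` lists (hypothetical stand-ins for the keyframe dataclasses, and assuming duplicates were already removed):

```python
import copy

def shift_frames(frames, num):
    # frames: list of [time, value] pairs; the first and last frame share a
    # value, closing the loop, just like the keyframe lists expected above.
    pop_idx, insert_idx = -1, 0
    if num < 0:
        pop_idx, insert_idx = 0, len(frames) - 1
        num = -num
    old_times = [f[0] for f in frames]
    new_frames = list(frames)
    new_frames.pop(-1)  # drop the duplicated closing frame
    for _ in range(num):
        new_frames.insert(insert_idx, new_frames.pop(pop_idx))
    new_frames.append(copy.copy(new_frames[0]))  # close the loop again
    # Re-time the rotated frames onto the original timeline
    for old_time, frame in zip(old_times, new_frames):
        frame[0] = old_time
    return new_frames

loop = [[0.0, "A"], [1.0, "B"], [2.0, "C"], [3.0, "A"]]
# Positive num rotates values right, negative rotates left; times are unchanged
assert [f[1] for f in shift_frames([f[:] for f in loop], 1)] == ["C", "A", "B", "C"]
assert [f[1] for f in shift_frames([f[:] for f in loop], -1)] == ["B", "C", "A", "B"]
```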
def smooth_pos(start: Vector3, end: Vector3, inter_frames: int, time: float, duration: float) -> List[PosKeyframe]:
"""Generate keyframes to smoothly interpolate between two positions"""
frames = [PosKeyframe(time=time, pos=start)]
for i in range(0, inter_frames):
t = (i + 1) / (inter_frames + 1)
smooth_t = smooth_step(t)
pos = Vector3(smooth_t, smooth_t, smooth_t).interpolate(start, end)
frames.append(PosKeyframe(time=time + (t * duration), pos=pos))
return frames + [PosKeyframe(time=time + duration, pos=end)]
def smooth_rot(start: Quaternion, end: Quaternion, inter_frames: int, time: float, duration: float)\
-> List[RotKeyframe]:
"""Generate keyframes to smoothly interpolate between two rotations"""
frames = [RotKeyframe(time=time, rot=start)]
for i in range(0, inter_frames):
t = (i + 1) / (inter_frames + 1)
smooth_t = smooth_step(t)
frames.append(RotKeyframe(time=time + (t * duration), rot=rot_interp(start, end, smooth_t)))
return frames + [RotKeyframe(time=time + duration, rot=end)]

File diff suppressed because it is too large


@@ -0,0 +1,232 @@
<linden_skeleton num_bones="133" num_collision_volumes="26" version="2.0">
<bone aliases="hip avatar_mPelvis" connected="false" end="0.000 0.000 0.084" group="Torso" name="mPelvis" pivot="0.000000 0.000000 1.067015" pos="0.000 0.000 1.067" rot="0.000000 0.000000 0.000000" scale="1.000 1.000 1.000" support="base">
<collision_volume end="0.030 0.000 0.095" group="Collision" name="PELVIS" pos="-0.01 0 -0.02" rot="0.000000 8.00000 0.000000" scale="0.12 0.16 0.17" support="base"/>
<collision_volume end="-0.100 0.000 0.000" group="Collision" name="BUTT" pos="-0.06 0 -0.1" rot="0.000000 0.00000 0.000000" scale="0.1 0.1 0.1" support="base"/>
<bone connected="true" end="0.000 0.000 -0.084" group="Spine" name="mSpine1" pivot="0.000000 0.000000 0.084073" pos="0.000 0.000 0.084" rot="0.000000 0.000000 0.000000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="0.000 0.000 0.084" group="Spine" name="mSpine2" pivot="0.000000 0.000000 -0.084073" pos="0.000 0.000 -0.084" rot="0.000000 0.000000 0.000000" scale="1.00 1.00 1.00" support="extended">
<bone aliases="abdomen avatar_mTorso" connected="true" end="-0.015 0.000 0.205" group="Torso" name="mTorso" pivot="0.000000 0.000000 0.084073" pos="0.000 0.000 0.084" rot="0.000000 0.000000 0.000000" scale="1.000 1.000 1.000" support="base">
<collision_volume end="0.028 0.000 0.094" group="Collision" name="BELLY" pos="0.028 0 0.04" rot="0.000000 8.00000 0.000000" scale="0.09 0.13 0.15" support="base"/>
<collision_volume end="0.000 0.100 0.000" group="Collision" name="LEFT_HANDLE" pos="0.0 0.10 0.058" rot="0.000000 0.00000 0.000000" scale="0.05 0.05 0.05" support="base"/>
<collision_volume end="0.000 -0.100 0.000" group="Collision" name="RIGHT_HANDLE" pos="0.0 -0.10 0.058" rot="0.000000 0.00000 0.000000" scale="0.05 0.05 0.05" support="base"/>
<collision_volume end="-0.100 0.000 0.000" group="Collision" name="LOWER_BACK" pos="0.0 0.0 0.023" rot="0.000000 0.00000 0.000000" scale="0.09 0.13 0.15" support="base"/>
<bone connected="true" end="0.015 0.000 -0.205" group="Spine" name="mSpine3" pivot="-0.015368 0.000000 0.204877" pos="-0.015 0.000 0.205" rot="0.000000 0.000000 0.000000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="-0.015 0.000 0.205" group="Spine" name="mSpine4" pivot="0.015368 0.000000 -0.204877" pos="0.015 0.000 -0.205" rot="0.000000 0.000000 0.000000" scale="1.00 1.00 1.00" support="extended">
<bone aliases="chest avatar_mChest" connected="true" end="-0.010 0.000 0.250" group="Torso" name="mChest" pivot="-0.015368 0.000000 0.204877" pos="-0.015 0.000 0.205" rot="0.000000 0.000000 0.000000" scale="1.000 1.000 1.000" support="base">
<collision_volume end="-0.096 0.000 0.152" group="Collision" name="CHEST" pos="0.028 0 0.07" rot="0.000000 -10.00000 0.000000" scale="0.11 0.15 0.2" support="base"/>
<collision_volume end="0.080 0.000 -0.006" group="Collision" name="LEFT_PEC" pos="0.119 0.082 0.042" rot="0.000000 4.29000 0.000000" scale="0.05 0.05 0.05" support="base"/>
<collision_volume end="0.080 0.000 -0.006" group="Collision" name="RIGHT_PEC" pos="0.119 -0.082 0.042" rot="0.000000 4.29000 0.000000" scale="0.05 0.05 0.05" support="base"/>
<collision_volume end="-0.100 0.000 0.000" group="Collision" name="UPPER_BACK" pos="0.0 0.0 0.017" rot="0.000000 0.00000 0.000000" scale="0.09 0.13 0.15" support="base"/>
<bone aliases="neck avatar_mNeck" connected="true" end="0.000 0.000 0.077" group="Torso" name="mNeck" pivot="-0.009507 0.000000 0.251108" pos="-0.010 0.000 0.251" rot="0.000000 0.000000 0.000000" scale="1.000 1.000 1.000" support="base">
<collision_volume end="0.000 0.000 0.080" group="Collision" name="NECK" pos="0.0 0 0.02" rot="0.000000 0.000000 0.000000" scale="0.05 0.06 0.08" support="base"/>
<bone aliases="head avatar_mHead" connected="true" end="0.000 0.000 0.079" group="Torso" name="mHead" pivot="0.000000 -0.000000 0.075630" pos="0.000 -0.000 0.076" rot="0.000000 0.000000 0.000000" scale="1.000 1.000 1.000" support="base">
<collision_volume end="0.000 0.000 0.100" group="Collision" name="HEAD" pos="0.02 0 0.07" rot="0.000000 0.000000 0.000000" scale="0.11 0.09 0.12" support="base"/>
<bone aliases="figureHair avatar_mSkull" connected="false" end="0.000 0.000 0.033" group="Extra" name="mSkull" pivot="0.000000 0.000000 0.079000" pos="0.000 0.000 0.079" rot="0.000000 0.000000 0.000000" scale="1.000 1.000 1.000" support="base"/>
<bone aliases="avatar_mEyeRight" connected="false" end="0.025 0.000 0.000" group="Extra" name="mEyeRight" pivot="0.098466 -0.036000 0.079000" pos="0.098 -0.036 0.079" rot="0.000000 0.000000 -0.000000" scale="1.000 1.000 1.000" support="base"/>
<bone aliases="avatar_mEyeLeft" connected="false" end="0.025 0.000 0.000" group="Extra" name="mEyeLeft" pivot="0.098461 0.036000 0.079000" pos="0.098 0.036 0.079" rot="0.000000 -0.000000 0.000000" scale="1.000 1.000 1.000" support="base"/>
<bone connected="false" end="0.020 0.000 0.000" group="Face" name="mFaceRoot" pivot="0.025000 0.000000 0.045000" pos="0.025 0.000 0.045" rot="0.000000 0.000000 0.000000" scale="1.00 1.00 1.00" support="extended">
<bone connected="false" end="0.025 0.000 0.000" group="Face" name="mFaceEyeAltRight" pivot="0.073466 -0.036000 0.0339300" pos="0.073 -0.036 0.034" rot="0.000000 0.000000 0.000000" scale="1.00 1.00 1.00" support="extended"/>
<bone connected="false" end="0.025 0.000 0.000" group="Face" name="mFaceEyeAltLeft" pivot="0.073461 0.036000 0.0339300" pos="0.073 0.036 0.034" rot="0.000000 0.000000 0.000000" scale="1.00 1.00 1.00" support="extended"/>
<bone connected="false" end="0.024 0.004 0.018" group="Face" name="mFaceForeheadLeft" pivot="0.061 0.035 0.083" pos="0.061 0.035 0.083" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
<bone connected="false" end="0.024 -0.004 0.018" group="Face" name="mFaceForeheadRight" pivot="0.061 -0.035 0.083" pos="0.061 -0.035 0.083" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
<bone connected="false" end="0.023 0.013 0.000" group="Eyes" name="mFaceEyebrowOuterLeft" pivot="0.064 0.051 0.048" pos="0.064 0.051 0.048" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
<bone connected="false" end="0.027 0.000 0.000" group="Eyes" name="mFaceEyebrowCenterLeft" pivot="0.070 0.043 0.056" pos="0.070 0.043 0.056" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
<bone connected="false" end="0.026 0.000 0.000" group="Eyes" name="mFaceEyebrowInnerLeft" pivot="0.075 0.022 0.051" pos="0.075 0.022 0.051" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
<bone connected="false" end="0.023 -0.013 0.000" group="Eyes" name="mFaceEyebrowOuterRight" pivot="0.064 -0.051 0.048" pos="0.064 -0.051 0.048" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
<bone connected="false" end="0.027 0.000 0.000" group="Eyes" name="mFaceEyebrowCenterRight" pivot="0.070 -0.043 0.056" pos="0.070 -0.043 0.056" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
<bone connected="false" end="0.026 0.000 0.000" group="Eyes" name="mFaceEyebrowInnerRight" pivot="0.075 -0.022 0.051" pos="0.075 -0.022 0.051" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
<bone connected="false" end="0.027 0.000 0.005" group="Eyes" name="mFaceEyeLidUpperLeft" pivot="0.073 0.036 0.034" pos="0.073 0.036 0.034" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
<bone connected="false" end="0.024 0.000 -0.007" group="Eyes" name="mFaceEyeLidLowerLeft" pivot="0.073 0.036 0.034" pos="0.073 0.036 0.034" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
<bone connected="false" end="0.027 0.000 0.005" group="Eyes" name="mFaceEyeLidUpperRight" pivot="0.073 -0.036 0.034" pos="0.073 -0.036 0.034" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
<bone connected="false" end="0.024 0.000 -0.007" group="Eyes" name="mFaceEyeLidLowerRight" pivot="0.073 -0.036 0.034" pos="0.073 -0.036 0.034" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
<bone connected="false" end="-0.019 0.018 0.025" group="Ears" name="mFaceEar1Left" pivot="0.000 0.080 0.002" pos="0.000 0.080 0.002" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="0.000 0.000 0.033" group="Ears" name="mFaceEar2Left" pivot="-0.019 0.018 0.025" pos="-0.019 0.018 0.025" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
</bone>
<bone connected="false" end="-0.019 -0.018 0.025" group="Ears" name="mFaceEar1Right" pivot="0.000 -0.080 0.002" pos="0.000 -0.080 0.002" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="0.000 0.000 0.033" group="Ears" name="mFaceEar2Right" pivot="-0.019 -0.018 0.025" pos="-0.019 -0.018 0.025" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
</bone>
<bone connected="false" end="0.015 0.004 0.000" group="Face" name="mFaceNoseLeft" pivot="0.086 0.015 -0.004" pos="0.086 0.015 -0.004" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
<bone connected="false" end="0.025 0.000 0.000" group="Face" name="mFaceNoseCenter" pivot="0.102 0.000 0.000" pos="0.102 0.000 0.000" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
<bone connected="false" end="0.015 -0.004 0.000" group="Face" name="mFaceNoseRight" pivot="0.086 -0.015 -0.004" pos="0.086 -0.015 -0.004" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
<bone connected="false" end="0.013 0.030 0.000" group="Face" name="mFaceCheekLowerLeft" pivot="0.050 0.034 -0.031" pos="0.050 0.034 -0.031" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
<bone connected="false" end="0.022 0.015 0.000" group="Face" name="mFaceCheekUpperLeft" pivot="0.070 0.034 -0.005" pos="0.070 0.034 -0.005" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
<bone connected="false" end="0.013 -0.030 0.000" group="Face" name="mFaceCheekLowerRight" pivot="0.050 -0.034 -0.031" pos="0.050 -0.034 -0.031" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
<bone connected="false" end="0.022 -0.015 0.000" group="Face" name="mFaceCheekUpperRight" pivot="0.070 -0.034 -0.005" pos="0.070 -0.034 -0.005" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
<bone connected="false" end="0.059 0.000 -0.039" group="Mouth" name="mFaceJaw" pivot="-0.001 0.000 -0.015" pos="-0.001 0.000 -0.015" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="false" end="0.021 0.000 -0.018" group="Mouth" name="mFaceChin" pivot="0.074 0.000 -0.054" pos="0.074 0.000 -0.054" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
<bone connected="false" end="0.035 0.000 0.000" group="Mouth" name="mFaceTeethLower" pivot="0.021 0.000 -0.039" pos="0.021 0.000 -0.039" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="false" end="0.034 0.017 0.005" group="Lips" name="mFaceLipLowerLeft" pivot="0.045 0.000 0.000" pos="0.045 0.000 0.000" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
<bone connected="false" end="0.034 -0.017 0.005" group="Lips" name="mFaceLipLowerRight" pivot="0.045 0.000 0.000" pos="0.045 0.000 0.000" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
<bone connected="false" end="0.040 0.000 0.002" group="Lips" name="mFaceLipLowerCenter" pivot="0.045 0.000 0.000" pos="0.045 0.000 0.000" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
<bone connected="false" end="0.022 0.000 0.007" group="Mouth" name="mFaceTongueBase" pivot="0.039 0.000 0.005" pos="0.039 0.000 0.005" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="0.010 0.000 0.000" group="Mouth" name="mFaceTongueTip" pivot="0.022 0.000 0.007" pos="0.022 0.000 0.007" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
</bone>
</bone>
</bone>
<bone connected="false" end="-0.017 0.000 0.000" group="Face" name="mFaceJawShaper" pivot="0.000 0.000 0.000" pos="0.000 0.000 0.000" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
<bone connected="false" end="0.036 0.000 0.000" group="Face" name="mFaceForeheadCenter" pivot="0.069 0.000 0.065" pos="0.069 0.000 0.065" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
<bone connected="false" end="0.014 0.000 0.000" group="Nose" name="mFaceNoseBase" pivot="0.094 0.000 -0.016" pos="0.094 0.000 -0.016" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
<bone connected="false" end="0.035 0.000 0.000" group="Mouth" name="mFaceTeethUpper" pivot="0.020 0.000 -0.030" pos="0.020 0.000 -0.030" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="false" end="0.041 0.015 0.000" group="Lips" name="mFaceLipUpperLeft" pivot="0.045 0.000 -0.003" pos="0.045 0.000 -0.003" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
<bone connected="false" end="0.041 -0.015 0.000" group="Lips" name="mFaceLipUpperRight" pivot="0.045 0.000 -0.003" pos="0.045 0.000 -0.003" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
<bone connected="false" end="0.045 0.051 0.000" group="Lips" name="mFaceLipCornerLeft" pivot="0.028 -0.019 -0.010" pos="0.028 -0.019 -0.010" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
<bone connected="false" end="0.045 -0.051 0.000" group="Lips" name="mFaceLipCornerRight" pivot="0.028 0.019 -0.010" pos="0.028 0.019 -0.010" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
<bone connected="false" end="0.043 0.000 0.002" group="Lips" name="mFaceLipUpperCenter" pivot="0.045 0.000 -0.003" pos="0.045 0.000 -0.003" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
</bone>
<bone connected="false" end="0.016 0.000 0.000" group="Face" name="mFaceEyecornerInnerLeft" pivot="0.075 0.017 0.032" pos="0.075 0.017 0.032" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
<bone connected="false" end="0.016 0.000 0.000" group="Face" name="mFaceEyecornerInnerRight" pivot="0.075 -0.017 0.032" pos="0.075 -0.017 0.032" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
<bone connected="false" end="0.015 0.000 0.008" group="Nose" name="mFaceNoseBridge" pivot="0.091 0.000 0.020" pos="0.091 0.000 0.020" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
</bone>
</bone>
</bone>
<bone aliases="lCollar avatar_mCollarLeft" connected="false" end="0.000 0.079 0.000" group="Arms" name="mCollarLeft" pivot="-0.020927 0.084665 0.165396" pos="-0.021 0.085 0.165" rot="0.000000 0.000000 0.000000" scale="1.000 1.000 1.000" support="base">
<collision_volume end="0.000 0.100 0.000" group="Collision" name="L_CLAVICLE" pos="0.02 0 0.02" rot="0.000000 0.00000 0.000000" scale="0.07 0.14 0.05" support="base"/>
<bone aliases="lShldr avatar_mShoulderLeft" connected="true" end="0.000 0.247 0.000" group="Arms" name="mShoulderLeft" pivot="0.000000 0.079000 -0.000000" pos="0.000 0.079 -0.000" rot="0.000000 0.000000 0.000000" scale="1.000 1.000 1.000" support="base">
<collision_volume end="0.000 0.130 -0.003" group="Collision" name="L_UPPER_ARM" pos="0.0 0.12 0.01" rot="-5.000000 0.00000 0.000000" scale="0.05 0.17 0.05" support="base"/>
<bone aliases="lForeArm avatar_mElbowLeft" connected="true" end="0.000 0.205 0.000" group="Arms" name="mElbowLeft" pivot="0.000000 0.248000 0.000000" pos="0.000 0.248 0.000" rot="0.000000 0.000000 0.000000" scale="1.000 1.000 1.000" support="base">
<collision_volume end="0.000 0.100 -0.001" group="Collision" name="L_LOWER_ARM" pos="0.0 0.1 0.0" rot="-3.000000 0.00000 0.000000" scale="0.04 0.14 0.04" support="base"/>
<bone aliases="lHand avatar_mWristLeft" connected="true" end="0.000 0.060 0.000" group="Arms" name="mWristLeft" pivot="-0.000000 0.204846 0.000000" pos="-0.000 0.205 0.000" rot="0.000000 0.000000 0.000000" scale="1.000 1.000 1.000" support="base">
<collision_volume end="0.005 0.049 -0.001" group="Collision" name="L_HAND" pos="0.01 0.05 0.0" rot="-3.000000 0.00000 -10.000000" scale="0.05 0.08 0.03" support="base"/>
<bone connected="false" end="-0.001 0.040 -0.006" group="Hand" name="mHandMiddle1Left" pivot="0.013 0.101 0.015" pos="0.013 0.101 0.015" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="-0.001 0.049 -0.008" group="Hand" name="mHandMiddle2Left" pivot="-0.001 0.040 -0.006" pos="-0.001 0.040 -0.006" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="-0.002 0.033 -0.006" group="Hand" name="mHandMiddle3Left" pivot="-0.001 0.049 -0.008" pos="-0.001 0.049 -0.008" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
</bone>
</bone>
<bone connected="false" end="0.017 0.036 -0.006" group="Hand" name="mHandIndex1Left" pivot="0.038 0.097 0.015" pos="0.038 0.097 0.015" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="0.014 0.032 -0.006" group="Hand" name="mHandIndex2Left" pivot="0.017 0.036 -0.006" pos="0.017 0.036 -0.006" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="0.011 0.025 -0.004" group="Hand" name="mHandIndex3Left" pivot="0.014 0.032 -0.006" pos="0.014 0.032 -0.006" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
</bone>
</bone>
<bone connected="false" end="-0.013 0.038 -0.008" group="Hand" name="mHandRing1Left" pivot="-0.010 0.099 0.009" pos="-0.010 0.099 0.009" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="-0.013 0.040 -0.009" group="Hand" name="mHandRing2Left" pivot="-0.013 0.038 -0.008" pos="-0.013 0.038 -0.008" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="-0.010 0.028 -0.006" group="Hand" name="mHandRing3Left" pivot="-0.013 0.040 -0.009" pos="-0.013 0.040 -0.009" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
</bone>
</bone>
<bone connected="false" end="-0.024 0.025 -0.006" group="Hand" name="mHandPinky1Left" pivot="-0.031 0.095 0.003" pos="-0.031 0.095 0.003" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="-0.015 0.018 -0.004" group="Hand" name="mHandPinky2Left" pivot="-0.024 0.025 -0.006" pos="-0.024 0.025 -0.006" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="-0.013 0.016 -0.004" group="Hand" name="mHandPinky3Left" pivot="-0.015 0.018 -0.004" pos="-0.015 0.018 -0.004" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
</bone>
</bone>
<bone connected="false" end="0.028 0.032 0.000" group="Hand" name="mHandThumb1Left" pivot="0.031 0.026 0.004" pos="0.031 0.026 0.004" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="0.023 0.031 0.000" group="Hand" name="mHandThumb2Left" pivot="0.028 0.032 -0.001" pos="0.028 0.032 -0.001" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="0.015 0.025 0.000" group="Hand" name="mHandThumb3Left" pivot="0.023 0.031 -0.001" pos="0.023 0.031 -0.001" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
</bone>
</bone>
</bone>
</bone>
</bone>
</bone>
<bone aliases="rCollar avatar_mCollarRight" connected="false" end="0.000 -0.079 0.000" group="Arms" name="mCollarRight" pivot="-0.020927 -0.085000 0.165396" pos="-0.021 -0.085 0.165" rot="0.000000 0.000000 0.000000" scale="1.000 1.000 1.000" support="base">
<collision_volume end="0.000 -0.100 0.000" group="Collision" name="R_CLAVICLE" pos="0.02 0 0.02" rot="0.000000 0.00000 0.000000" scale="0.07 0.14 0.05" support="base"/>
<bone aliases="rShldr avatar_mShoulderRight" connected="true" end="0.000 -0.247 0.000" group="Arms" name="mShoulderRight" pivot="0.000000 -0.079418 -0.000000" pos="0.000 -0.079 -0.000" rot="0.000000 0.000000 0.000000" scale="1.000 1.000 1.000" support="base">
<collision_volume end="0.000 -0.130 -0.003" group="Collision" name="R_UPPER_ARM" pos="0.0 -0.12 0.01" rot="5.000000 0.00000 0.000000" scale="0.05 0.17 0.05" support="base"/>
<bone aliases="rForeArm avatar_mElbowRight" connected="true" end="0.000 -0.205 0.000" group="Arms" name="mElbowRight" pivot="0.000000 -0.248000 -0.000000" pos="0.000 -0.248 -0.000" rot="0.000000 0.000000 0.000000" scale="1.000 1.000 1.000" support="base">
<collision_volume end="0.000 -0.100 -0.001" group="Collision" name="R_LOWER_ARM" pos="0.0 -0.1 0.0" rot="3.000000 0.00000 0.000000" scale="0.04 0.14 0.04" support="base"/>
<bone aliases="rHand avatar_mWristRight" connected="true" end="0.000 -0.060 0.000" group="Arms" name="mWristRight" pivot="-0.000000 -0.205000 -0.000000" pos="0.000 -0.205 -0.000" rot="0.000000 0.000000 0.000000" scale="1.000 1.000 1.000" support="base">
<collision_volume end="0.005 -0.049 -0.001" group="Collision" name="R_HAND" pos="0.01 -0.05 0.0" rot="3.000000 0.00000 10.000000" scale="0.05 0.08 0.03" support="base"/>
<bone connected="false" end="-0.001 -0.040 -0.006" group="Hand" name="mHandMiddle1Right" pivot="0.013 -0.101 0.015" pos="0.013 -0.101 0.015" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="-0.001 -0.049 -0.008" group="Hand" name="mHandMiddle2Right" pivot="-0.001 -0.040 -0.006" pos="-0.001 -0.040 -0.006" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="-0.002 -0.033 -0.006" group="Hand" name="mHandMiddle3Right" pivot="-0.001 -0.049 -0.008" pos="-0.001 -0.049 -0.008" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
</bone>
</bone>
<bone connected="false" end="0.017 -0.036 -0.006" group="Hand" name="mHandIndex1Right" pivot="0.038 -0.097 0.015" pos="0.038 -0.097 0.015" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="0.014 -0.032 -0.006" group="Hand" name="mHandIndex2Right" pivot="0.017 -0.036 -0.006" pos="0.017 -0.036 -0.006" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="0.011 -0.025 -0.004" group="Hand" name="mHandIndex3Right" pivot="0.014 -0.032 -0.006" pos="0.014 -0.032 -0.006" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
</bone>
</bone>
<bone connected="false" end="-0.013 -0.038 -0.008" group="Hand" name="mHandRing1Right" pivot="-0.010 -0.099 0.009" pos="-0.010 -0.099 0.009" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="-0.013 -0.040 -0.009" group="Hand" name="mHandRing2Right" pivot="-0.013 -0.038 -0.008" pos="-0.013 -0.038 -0.008" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="-0.010 -0.028 -0.006" group="Hand" name="mHandRing3Right" pivot="-0.013 -0.040 -0.009" pos="-0.013 -0.040 -0.009" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
</bone>
</bone>
<bone connected="false" end="-0.024 -0.025 -0.006" group="Hand" name="mHandPinky1Right" pivot="-0.031 -0.095 0.003" pos="-0.031 -0.095 0.003" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="-0.015 -0.018 -0.004" group="Hand" name="mHandPinky2Right" pivot="-0.024 -0.025 -0.006" pos="-0.024 -0.025 -0.006" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="-0.013 -0.016 -0.004" group="Hand" name="mHandPinky3Right" pivot="-0.015 -0.018 -0.004" pos="-0.015 -0.018 -0.004" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
</bone>
</bone>
<bone connected="false" end="0.028 -0.032 0.000" group="Hand" name="mHandThumb1Right" pivot="0.031 -0.026 0.004" pos="0.031 -0.026 0.004" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="0.023 -0.031 0.000" group="Hand" name="mHandThumb2Right" pivot="0.028 -0.032 -0.001" pos="0.028 -0.032 -0.001" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="0.015 -0.025 0.000" group="Hand" name="mHandThumb3Right" pivot="0.023 -0.031 -0.001" pos="0.023 -0.031 -0.001" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
</bone>
</bone>
</bone>
</bone>
</bone>
</bone>
<bone connected="false" end="-0.061 0.000 0.000" group="Wing" name="mWingsRoot" pivot="-0.014 0.000 0.000" pos="-0.014 0.000 0.000" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="false" end="-0.168 0.169 0.067" group="Wing" name="mWing1Left" pivot="-0.099 0.105 0.181" pos="-0.099 0.105 0.181" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="-0.181 0.183 0.000" group="Wing" name="mWing2Left" pivot="-0.168 0.169 0.067" pos="-0.168 0.169 0.067" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="-0.171 0.173 0.000" group="Wing" name="mWing3Left" pivot="-0.181 0.183 0.000" pos="-0.181 0.183 0.000" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="-0.146 0.132 0.000" group="Wing" name="mWing4Left" pivot="-0.171 0.173 0.000" pos="-0.171 0.173 0.000" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
<bone connected="true" end="-0.068 0.062 -0.159" group="Wing" name="mWing4FanLeft" pivot="-0.171 0.173 0.000" pos="-0.171 0.173 0.000" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
</bone>
</bone>
</bone>
<bone connected="false" end="-0.168 -0.169 0.067" group="Wing" name="mWing1Right" pivot="-0.099 -0.105 0.181" pos="-0.099 -0.105 0.181" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="-0.181 -0.183 0.000" group="Wing" name="mWing2Right" pivot="-0.168 -0.169 0.067" pos="-0.168 -0.169 0.067" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="-0.171 -0.173 0.000" group="Wing" name="mWing3Right" pivot="-0.181 -0.183 0.000" pos="-0.181 -0.183 0.000" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="-0.146 -0.132 0.000" group="Wing" name="mWing4Right" pivot="-0.171 -0.173 0.000" pos="-0.171 -0.173 0.000" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
<bone connected="true" end="-0.068 -0.062 -0.159" group="Wing" name="mWing4FanRight" pivot="-0.171 -0.173 0.000" pos="-0.171 -0.173 0.000" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
</bone>
</bone>
</bone>
</bone>
</bone>
</bone>
</bone>
</bone>
</bone>
</bone>
<bone aliases="rThigh avatar_mHipRight" connected="false" end="-0.001 0.049 -0.491" group="Legs" name="mHipRight" pivot="0.033620 -0.128806 -0.041086" pos="0.034 -0.129 -0.041" rot="0.000000 0.000000 0.000000" scale="1.000 1.000 1.000" support="base">
<collision_volume end="0.000 0.000 -0.200" group="Collision" name="R_UPPER_LEG" pos="-0.02 0.05 -0.22" rot="0.000000 0.00000 0.000000" scale="0.09 0.09 0.32" support="base"/>
<bone aliases="rShin avatar_mKneeRight" connected="true" end="-0.029 0.000 -0.469" group="Legs" name="mKneeRight" pivot="-0.000780 0.048635 -0.490922" pos="-0.001 0.049 -0.491" rot="0.000000 0.000000 0.000000" scale="1.000 1.000 1.000" support="base">
<collision_volume end="-0.010 0.000 -0.150" group="Collision" name="R_LOWER_LEG" pos="-0.02 0.0 -0.2" rot="0.000000 0.00000 0.000000" scale="0.06 0.06 0.25" support="base"/>
<bone aliases="rFoot avatar_mAnkleRight" connected="true" end="0.112 0.000 -0.061" group="Legs" name="mAnkleRight" pivot="-0.028869 0.000000 -0.468494" pos="-0.029 0.000 -0.468" rot="0.000000 0.000000 0.000000" scale="1.000 1.000 1.000" support="base">
<collision_volume end="0.089 0.000 -0.026" group="Collision" name="R_FOOT" pos="0.077 0.0 -0.041" rot="0.000000 10.00000 0.000000" scale="0.13 0.05 0.05" support="base"/>
<bone aliases="avatar_mFootRight" connected="true" end="0.105 -0.010 0.000" group="Extra" name="mFootRight" pivot="0.111956 -0.000000 -0.060637" pos="0.112 -0.000 -0.061" rot="0.000000 0.000000 0.000000" scale="1.000 1.000 1.000" support="base">
<bone aliases="avatar_mToeRight" connected="false" end="0.020 0.000 0.000" group="Extra" name="mToeRight" pivot="0.105399 -0.010408 -0.000104" pos="0.109 0.000 0.000" rot="0.000000 0.000000 0.000000" scale="1.000 1.000 1.000" support="base"/>
</bone>
</bone>
</bone>
</bone>
<bone aliases="lThigh avatar_mHipLeft" connected="false" end="-0.001 -0.046 -0.491" group="Legs" name="mHipLeft" pivot="0.033757 0.126765 -0.040998" pos="0.034 0.127 -0.041" rot="0.000000 0.000000 0.000000" scale="1.000 1.000 1.000" support="base">
<collision_volume end="0.000 0.000 -0.200" group="Collision" name="L_UPPER_LEG" pos="-0.02 -0.05 -0.22" rot="0.000000 0.00000 0.000000" scale="0.09 0.09 0.32" support="base"/>
<bone aliases="lShin avatar_mKneeLeft" connected="true" end="-0.029 0.001 -0.469" group="Legs" name="mKneeLeft" pivot="-0.000887 -0.045568 -0.491053" pos="-0.001 -0.046 -0.491" rot="0.000000 0.000000 0.000000" scale="1.000 1.000 1.000" support="base">
<collision_volume end="-0.010 0.000 -0.150" group="Collision" name="L_LOWER_LEG" pos="-0.02 0.0 -0.2" rot="0.000000 0.00000 0.000000" scale="0.06 0.06 0.25" support="base"/>
<bone aliases="lFoot avatar_mAnkleLeft" connected="true" end="0.112 0.000 -0.061" group="Legs" name="mAnkleLeft" pivot="-0.028887 0.001378 -0.468449" pos="-0.029 0.001 -0.468" rot="0.000000 0.000000 0.000000" scale="1.000 1.000 1.000" support="base">
<collision_volume end="0.089 0.000 -0.026" group="Collision" name="L_FOOT" pos="0.077 0.0 -0.041" rot="0.000000 10.00000 0.000000" scale="0.13 0.05 0.05" support="base"/>
<bone aliases="avatar_mFootLeft" connected="true" end="0.105 0.008 0.001" group="Extra" name="mFootLeft" pivot="0.111956 -0.000000 -0.060620" pos="0.112 -0.000 -0.061" rot="0.000000 0.000000 0.000000" scale="1.000 1.000 1.000" support="base">
<bone aliases="avatar_mToeLeft" connected="false" end="0.020 0.000 0.000" group="Extra" name="mToeLeft" pivot="0.105387 0.008270 0.000871" pos="0.109 0.000 0.000" rot="0.000000 0.000000 0.000000" scale="1.000 1.000 1.000" support="base"/>
</bone>
</bone>
</bone>
</bone>
<bone connected="false" end="-0.197 0.000 0.000" group="Tail" name="mTail1" pivot="-0.116 0.000 0.047" pos="-0.116 0.000 0.047" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="-0.168 0.000 0.000" group="Tail" name="mTail2" pivot="-0.197 0.000 0.000" pos="-0.197 0.000 0.000" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="-0.142 0.000 0.000" group="Tail" name="mTail3" pivot="-0.168 0.000 0.000" pos="-0.168 0.000 0.000" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="-0.112 0.000 0.000" group="Tail" name="mTail4" pivot="-0.142 0.000 0.000" pos="-0.142 0.000 0.000" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="-0.094 0.000 0.000" group="Tail" name="mTail5" pivot="-0.112 0.000 0.000" pos="-0.112 0.000 0.000" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="-0.089 0.000 0.000" group="Tail" name="mTail6" pivot="-0.094 0.000 0.000" pos="-0.094 0.000 0.000" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
</bone>
</bone>
</bone>
</bone>
</bone>
<bone connected="false" end="0.004 0.000 -0.066" group="Groin" name="mGroin" pivot="0.064 0.000 -0.097" pos="0.064 0.000 -0.097" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
<bone connected="false" end="-0.204 0.000 0.000" group="Limb" name="mHindLimbsRoot" pivot="-0.200 0.000 0.084" pos="-0.200 0.000 0.084" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="false" end="0.002 -0.046 -0.491" group="Limb" name="mHindLimb1Left" pivot="-0.204 0.129 -0.125" pos="-0.204 0.129 -0.125" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="-0.030 -0.003 -0.468" group="Limb" name="mHindLimb2Left" pivot="0.002 -0.046 -0.491" pos="0.002 -0.046 -0.491" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="0.112 0.000 -0.061" group="Limb" name="mHindLimb3Left" pivot="-0.030 -0.003 -0.468" pos="-0.030 -0.003 -0.468" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="0.105 0.008 0.000" group="Limb" name="mHindLimb4Left" pivot="0.112 0.000 -0.061" pos="0.112 0.000 -0.061" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
</bone>
</bone>
</bone>
<bone connected="false" end="0.002 0.046 -0.491" group="Limb" name="mHindLimb1Right" pivot="-0.204 -0.129 -0.125" pos="-0.204 -0.129 -0.125" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="-0.030 0.003 -0.468" group="Limb" name="mHindLimb2Right" pivot="0.002 0.046 -0.491" pos="0.002 0.046 -0.491" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="0.112 0.000 -0.061" group="Limb" name="mHindLimb3Right" pivot="-0.030 0.003 -0.468" pos="-0.030 0.003 -0.468" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended">
<bone connected="true" end="0.105 -0.008 0.000" group="Limb" name="mHindLimb4Right" pivot="0.112 0.000 -0.061" pos="0.112 0.000 -0.061" rot="0.000 0.000 0.000" scale="1.00 1.00 1.00" support="extended"/>
</bone>
</bone>
</bone>
</bone>
</bone>
</linden_skeleton>


@@ -294,11 +294,48 @@ class RawBytes(bytes):
pass
_T = TypeVar("_T")
class Pretty(Generic[_T]):
"""Wrapper for var values so Messages will know to serialize"""
__slots__ = ("value",)
def __init__(self, value: _T):
self.value: _T = value
class StringEnum(str, enum.Enum):
def __str__(self):
return self.value
class IntEnum(enum.IntEnum):
# Give a special repr() that'll eval in a REPL.
def __repr__(self):
return f"{self.__class__.__name__}.{self.name}"
class IntFlag(enum.IntFlag):
def __repr__(self):
# Make an ORed together version of the flags based on the POD version
flags = flags_to_pod(type(self), self)
flags = " | ".join(
(f"{self.__class__.__name__}.{v}" if isinstance(v, str) else str(v))
for v in flags
)
return f"({flags})"
def flags_to_pod(flag_cls: Type[enum.IntFlag], val: int) -> Tuple[Union[str, int], ...]:
# Shove any bits not represented in the IntFlag into an int
left_over = val
for flag in iter(flag_cls):
left_over &= ~flag.value
extra = (int(left_over),) if left_over else ()
return tuple(flag.name for flag in iter(flag_cls) if val & flag.value) + extra
class TaggedUnion(recordclass.datatuple): # type: ignore
tag: Any
value: Any
@@ -306,5 +343,6 @@ class TaggedUnion(recordclass.datatuple): # type: ignore
__all__ = [
"Vector3", "Vector4", "Vector2", "Quaternion", "TupleCoord",
"UUID", "RawBytes", "StringEnum", "JankStringyBytes", "TaggedUnion"
"UUID", "RawBytes", "StringEnum", "JankStringyBytes", "TaggedUnion",
"IntEnum", "IntFlag", "flags_to_pod", "Pretty"
]
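The `flags_to_pod()` helper and the `IntFlag.__repr__` above cooperate to render flag values losslessly: named bits become names, and any bits the enum doesn't know about land in a trailing int. A standalone sketch (the function body mirrors the diff; `ObjectFlags` is a hypothetical example enum, not from the source):

```python
import enum
from typing import Tuple, Type, Union


def flags_to_pod(flag_cls: Type[enum.IntFlag], val: int) -> Tuple[Union[str, int], ...]:
    # Shove any bits not represented in the IntFlag into a trailing int
    left_over = val
    for flag in iter(flag_cls):
        left_over &= ~flag.value
    extra = (int(left_over),) if left_over else ()
    return tuple(flag.name for flag in iter(flag_cls) if val & flag.value) + extra


class ObjectFlags(enum.IntFlag):  # hypothetical enum for illustration
    PHYSICS = 1
    PHANTOM = 2
    TEMPORARY = 4


# 133 == PHYSICS | TEMPORARY | 0x80; the unknown 0x80 bit survives as an int
assert flags_to_pod(ObjectFlags, 133) == ("PHYSICS", "TEMPORARY", 128)
assert flags_to_pod(ObjectFlags, 0) == ()
```

Keeping unknown bits intact matters for a proxy: the server may set flags newer than the local enum definition, and round-tripping must not silently drop them.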


@@ -347,7 +347,7 @@ class RegionCapNotAvailable(RegionDomainError):
class RegionMessageError(RegionDomainError):
""" an error raised when a region does not have a connection
over which it can send UDP messages
over which it can send UDP messages
accepts a region object as an attribute


@@ -1,6 +1,8 @@
from __future__ import annotations
import codecs
import functools
import pkg_resources
import re
import weakref
from pprint import PrettyPrinter
@@ -133,3 +135,13 @@ def bytes_unescape(val: bytes) -> bytes:
def bytes_escape(val: bytes) -> bytes:
# Try to keep newlines as-is
return re.sub(rb"(?<!\\)\\n", b"\n", codecs.escape_encode(val)[0]) # type: ignore
def get_resource_filename(resource_filename: str):
return pkg_resources.resource_filename("hippolyzer", resource_filename)
def to_chunks(chunkable: Sequence[_T], chunk_size: int) -> Generator[Sequence[_T], None, None]:
while chunkable:
yield chunkable[:chunk_size]
chunkable = chunkable[chunk_size:]
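The `to_chunks()` generator added here slices the front off the sequence repeatedly rather than tracking indices, so it works on any sliceable sequence (lists, bytes, strings). A quick self-contained check of that behaviour:

```python
from typing import Generator, Sequence, TypeVar

_T = TypeVar("_T")


def to_chunks(chunkable: Sequence[_T], chunk_size: int) -> Generator[Sequence[_T], None, None]:
    # Yield the first chunk_size elements, then drop them and repeat.
    # The final chunk may be shorter than chunk_size.
    while chunkable:
        yield chunkable[:chunk_size]
        chunkable = chunkable[chunk_size:]


assert list(to_chunks([1, 2, 3, 4, 5], 2)) == [[1, 2], [3, 4], [5]]
assert list(to_chunks(b"abcdef", 4)) == [b"abcd", b"ef"]
```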


@@ -3,11 +3,11 @@ import tempfile
from io import BytesIO
from typing import *
import defusedxml.cElementTree
import defusedxml.ElementTree
from glymur import jp2box, Jp2k
# Replace glymur's ElementTree with a safe one
jp2box.ET = defusedxml.cElementTree
jp2box.ET = defusedxml.ElementTree
SL_DEFAULT_ENCODE = {


@@ -1,71 +1,49 @@
"""
Parse the horrible legacy inventory format
Parse the horrible legacy inventory-related format.
It's typically only used for object contents now.
"""
from __future__ import annotations
import abc
import dataclasses
import datetime as dt
import itertools
import logging
import re
import weakref
from io import StringIO
from typing import *
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.legacy_schema import (
parse_schema_line,
SchemaBase,
SchemaDate,
SchemaFieldSerializer,
SchemaHexInt,
SchemaInt,
SchemaMultilineStr,
SchemaParsingError,
SchemaStr,
SchemaUUID,
schema_field,
)
LOG = logging.getLogger(__name__)
MAGIC_ID = UUID("3c115e51-04f4-523c-9fa6-98aff1034730")
LOG = logging.getLogger(__name__)
_T = TypeVar("_T")
def _parse_str(val: str):
return val.rstrip("|")
def _int_from_hex(val: str):
return int(val, 16)
def _parse_date(val: str):
return dt.datetime.utcfromtimestamp(int(val))
class InventoryParsingError(Exception):
pass
def _inv_field(spec: Union[Callable, Type], *, default=dataclasses.MISSING, init=True, repr=True, # noqa
hash=None, compare=True) -> dataclasses.Field: # noqa
"""Describe a field in the inventory schema and the shape of its value"""
return dataclasses.field(
metadata={"spec": spec}, default=default, init=init,
repr=repr, hash=hash, compare=compare
)
# The schema is meant to allow multi-line strings, but in practice
# it does not due to scanf() shenanigans. This is fine.
_INV_TOKEN_RE = re.compile(r'\A\s*([^\s]+)(\s+([^\t\r\n]+))?$')
def _parse_inv_line(line: str):
g = _INV_TOKEN_RE.search(line)
if not g:
raise InventoryParsingError("%r doesn't match the token regex" % line)
return g.group(1), g.group(3)
def _yield_inv_tokens(line_iter: Iterator[str]):
def _yield_schema_tokens(reader: StringIO):
in_bracket = False
for line in line_iter:
# empty str == EOF in Python
while line := reader.readline():
line = line.strip()
# Whitespace-only lines are automatically skipped
if not line:
continue
try:
key, val = _parse_inv_line(line)
except InventoryParsingError:
key, val = parse_schema_line(line)
except SchemaParsingError:
# Can happen if there's a malformed multi-line string, just
# skip by it.
LOG.warning(f"Found invalid inventory line {line!r}")
@@ -77,41 +55,91 @@ def _yield_inv_tokens(line_iter: Iterator[str]):
in_bracket = True
continue
if key == "}":
if not in_bracket:
LOG.warning("Unexpected closing bracket")
in_bracket = False
break
yield key, val
if in_bracket:
raise LOG.warning("Reached EOF while inside a bracket")
LOG.warning("Reached EOF while inside a bracket")
class InventoryModel:
class InventoryBase(SchemaBase):
SCHEMA_NAME: ClassVar[str]
@classmethod
def from_reader(cls, reader: StringIO, read_header=False) -> InventoryBase:
tok_iter = _yield_schema_tokens(reader)
# Someone else hasn't already read the header for us
if read_header:
schema_name, _ = next(tok_iter)
if schema_name != cls.SCHEMA_NAME:
raise ValueError(f"Expected schema name {schema_name!r} to be {cls.SCHEMA_NAME!r}")
fields = cls._fields_dict()
obj_dict = {}
for key, val in tok_iter:
if key in fields:
field: dataclasses.Field = fields[key]
spec = field.metadata.get("spec")
# Not a real key, an internal var on our dataclass
if not spec:
LOG.warning(f"Internal key {key!r}")
continue
# some kind of nested structure like sale_info
if issubclass(spec, SchemaBase):
obj_dict[key] = spec.from_reader(reader)
elif issubclass(spec, SchemaFieldSerializer):
obj_dict[key] = spec.deserialize(val)
else:
raise ValueError(f"Unsupported spec for {key!r}, {spec!r}")
else:
LOG.warning(f"Unknown key {key!r}")
return cls._obj_from_dict(obj_dict)
def to_writer(self, writer: StringIO):
writer.write(f"\t{self.SCHEMA_NAME}\t0\n")
writer.write("\t{\n")
for field_name, field in self._fields_dict().items():
spec = field.metadata.get("spec")
# Not meant to be serialized
if not spec:
continue
val = getattr(self, field_name)
if val is None:
continue
# Some kind of nested structure like sale_info
if isinstance(val, SchemaBase):
val.to_writer(writer)
elif issubclass(spec, SchemaFieldSerializer):
writer.write(f"\t\t{field_name}\t{spec.serialize(val)}\n")
else:
raise ValueError(f"Bad inventory spec {spec!r}")
writer.write("\t}\n")
class InventoryModel(InventoryBase):
def __init__(self):
self.containers: Dict[UUID, InventoryContainerBase] = {}
self.items: Dict[UUID, InventoryItem] = {}
self.root: Optional[InventoryContainerBase] = None
@classmethod
def from_str(cls, text: str):
return cls.from_iter(iter(text.splitlines()))
@classmethod
def from_bytes(cls, data: bytes):
return cls.from_str(data.decode("utf8"))
@classmethod
def from_iter(cls, line_iter: Iterator[str]) -> InventoryModel:
def from_reader(cls, reader: StringIO, read_header=False) -> InventoryModel:
model = cls()
for key, value in _yield_inv_tokens(line_iter):
for key, value in _yield_schema_tokens(reader):
if key == "inv_object":
obj = InventoryObject.from_iter(line_iter)
obj = InventoryObject.from_reader(reader)
if obj is not None:
model.add_container(obj)
elif key == "inv_category":
cat = InventoryCategory.from_iter(line_iter)
cat = InventoryCategory.from_reader(reader)
if cat is not None:
model.add_container(cat)
elif key == "inv_item":
item = InventoryItem.from_iter(line_iter)
item = InventoryItem.from_reader(reader)
if item is not None:
model.add_item(item)
else:
@@ -119,6 +147,12 @@ class InventoryModel:
model.reparent_nodes()
return model
def to_writer(self, writer: StringIO):
for container in self.containers.values():
container.to_writer(writer)
for item in self.items.values():
item.to_writer(writer)
def add_container(self, container: InventoryContainerBase):
self.containers[container.node_id] = container
container.model = weakref.proxy(self)
@@ -143,63 +177,34 @@ class InventoryModel:
parent_container.children.append(obj)
@dataclasses.dataclass
class InventoryBase(abc.ABC):
@classmethod
def _fields_dict(cls):
return {f.name: f for f in dataclasses.fields(cls)}
@classmethod
def from_iter(cls, line_iter: Iterator[str]):
fields = cls._fields_dict()
obj = {}
for key, val in _yield_inv_tokens(line_iter):
if key in fields:
field: dataclasses.Field = fields[key]
spec = field.metadata.get("spec")
# Not a real key, an internal var on our dataclass
if not spec:
LOG.warning(f"Internal key {key!r}")
continue
# some kind of nested structure like sale_info
if isinstance(spec, type) and issubclass(spec, InventoryBase):
obj[key] = spec.from_iter(line_iter)
else:
obj[key] = spec(val)
else:
LOG.warning(f"Unknown key {key!r}")
# Bad entry, ignore
# TODO: Check on these. might be symlinks or something.
if obj.get("type") == "-1":
LOG.warning(f"Skipping bad object with type == -1: {obj!r}")
return None
return cls(**obj) # type: ignore
@dataclasses.dataclass
class InventoryPermissions(InventoryBase):
base_mask: int = _inv_field(_int_from_hex)
owner_mask: int = _inv_field(_int_from_hex)
group_mask: int = _inv_field(_int_from_hex)
everyone_mask: int = _inv_field(_int_from_hex)
next_owner_mask: int = _inv_field(_int_from_hex)
creator_id: UUID = _inv_field(UUID)
owner_id: UUID = _inv_field(UUID)
last_owner_id: UUID = _inv_field(UUID)
group_id: UUID = _inv_field(UUID)
SCHEMA_NAME: ClassVar[str] = "permissions"
base_mask: int = schema_field(SchemaHexInt)
owner_mask: int = schema_field(SchemaHexInt)
group_mask: int = schema_field(SchemaHexInt)
everyone_mask: int = schema_field(SchemaHexInt)
next_owner_mask: int = schema_field(SchemaHexInt)
creator_id: UUID = schema_field(SchemaUUID)
owner_id: UUID = schema_field(SchemaUUID)
last_owner_id: UUID = schema_field(SchemaUUID)
group_id: UUID = schema_field(SchemaUUID)
@dataclasses.dataclass
class InventorySaleInfo(InventoryBase):
sale_type: str = _inv_field(str)
sale_price: int = _inv_field(int)
SCHEMA_NAME: ClassVar[str] = "sale_info"
sale_type: str = schema_field(SchemaStr)
sale_price: int = schema_field(SchemaInt)
@dataclasses.dataclass
class InventoryNodeBase(InventoryBase):
ID_ATTR: ClassVar[str]
parent_id: Optional[UUID] = _inv_field(UUID)
parent_id: Optional[UUID] = schema_field(SchemaUUID)
model: Optional[InventoryModel] = dataclasses.field(default=None, init=False)
@property
@@ -210,43 +215,58 @@ class InventoryNodeBase(InventoryBase):
def parent(self):
return self.model.containers.get(self.parent_id)
@classmethod
def _obj_from_dict(cls, obj_dict):
# Bad entry, ignore
# TODO: Check on these. might be symlinks or something.
if obj_dict.get("type") == "-1":
LOG.warning(f"Skipping bad object with type == -1: {obj_dict!r}")
return None
return super()._obj_from_dict(obj_dict)
@dataclasses.dataclass
class InventoryContainerBase(InventoryNodeBase):
type: str = _inv_field(str)
name: str = _inv_field(_parse_str)
type: str = schema_field(SchemaStr)
name: str = schema_field(SchemaMultilineStr)
children: List[InventoryNodeBase] = dataclasses.field(default_factory=list, init=False)
@dataclasses.dataclass
class InventoryObject(InventoryContainerBase):
SCHEMA_NAME: ClassVar[str] = "inv_object"
ID_ATTR: ClassVar[str] = "obj_id"
obj_id: UUID = _inv_field(UUID)
obj_id: UUID = schema_field(SchemaUUID)
@dataclasses.dataclass
class InventoryCategory(InventoryContainerBase):
ID_ATTR: ClassVar[str] = "cat_id"
cat_id: UUID = _inv_field(UUID)
pref_type: str = _inv_field(str)
owner_id: UUID = _inv_field(UUID)
version: int = _inv_field(int)
SCHEMA_NAME: ClassVar[str] = "inv_category"
cat_id: UUID = schema_field(SchemaUUID)
pref_type: str = schema_field(SchemaStr)
owner_id: UUID = schema_field(SchemaUUID)
version: int = schema_field(SchemaInt)
@dataclasses.dataclass
class InventoryItem(InventoryNodeBase):
SCHEMA_NAME: ClassVar[str] = "inv_item"
ID_ATTR: ClassVar[str] = "item_id"
item_id: UUID = _inv_field(UUID)
type: str = _inv_field(str)
inv_type: str = _inv_field(str)
flags: int = _inv_field(_int_from_hex)
name: str = _inv_field(_parse_str)
desc: str = _inv_field(_parse_str)
creation_date: dt.datetime = _inv_field(_parse_date)
permissions: InventoryPermissions = _inv_field(InventoryPermissions)
sale_info: InventorySaleInfo = _inv_field(InventorySaleInfo)
asset_id: Optional[UUID] = _inv_field(UUID, default=None)
shadow_id: Optional[UUID] = _inv_field(UUID, default=None)
item_id: UUID = schema_field(SchemaUUID)
type: str = schema_field(SchemaStr)
inv_type: str = schema_field(SchemaStr)
flags: int = schema_field(SchemaHexInt)
name: str = schema_field(SchemaMultilineStr)
desc: str = schema_field(SchemaMultilineStr)
creation_date: dt.datetime = schema_field(SchemaDate)
permissions: InventoryPermissions = schema_field(InventoryPermissions)
sale_info: InventorySaleInfo = schema_field(InventorySaleInfo)
asset_id: Optional[UUID] = schema_field(SchemaUUID, default=None)
shadow_id: Optional[UUID] = schema_field(SchemaUUID, default=None)
@property
def true_asset_id(self) -> UUID:

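The diff truncates before the body of `true_asset_id`, but the `MAGIC_ID` constant plus the `asset_id`/`shadow_id` field pair suggest the usual legacy-inventory obfuscation: the real asset ID is recovered by XORing the shadowed ID with the magic UUID. A hedged sketch of that presumed mechanism (the `deshadow` name is illustrative, not from the source):

```python
import uuid

# Well-known constant from the diff above
MAGIC_ID = uuid.UUID("3c115e51-04f4-523c-9fa6-98aff1034730")


def deshadow(shadow_id: uuid.UUID) -> uuid.UUID:
    # Presumed behaviour of true_asset_id when only shadow_id is present:
    # XOR each byte of the shadowed ID against the magic UUID.
    # XOR is its own inverse, so the same function shadows and de-shadows.
    return uuid.UUID(bytes=bytes(a ^ b for a, b in zip(shadow_id.bytes, MAGIC_ID.bytes)))


asset_id = uuid.UUID("12345678-1234-5678-9abc-def012345678")
assert deshadow(deshadow(asset_id)) == asset_id
# XORing the all-zero UUID just yields the magic constant
assert deshadow(uuid.UUID(int=0)) == MAGIC_ID
```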

@@ -0,0 +1,155 @@
"""
Legacy line-oriented schema parser base classes
Used for task inventory and wearables.
"""
from __future__ import annotations
import abc
import calendar
import dataclasses
import datetime as dt
import logging
import re
from io import StringIO
from typing import *
from hippolyzer.lib.base.datatypes import UUID
LOG = logging.getLogger(__name__)
_T = TypeVar("_T")
class SchemaFieldSerializer(abc.ABC, Generic[_T]):
@classmethod
@abc.abstractmethod
def deserialize(cls, val: str) -> _T:
pass
@classmethod
@abc.abstractmethod
def serialize(cls, val: _T) -> str:
pass
class SchemaDate(SchemaFieldSerializer[dt.datetime]):
@classmethod
def deserialize(cls, val: str) -> dt.datetime:
return dt.datetime.utcfromtimestamp(int(val))
@classmethod
def serialize(cls, val: dt.datetime) -> str:
return str(calendar.timegm(val.utctimetuple()))
class SchemaHexInt(SchemaFieldSerializer[int]):
@classmethod
def deserialize(cls, val: str) -> int:
return int(val, 16)
@classmethod
def serialize(cls, val: int) -> str:
return "%08x" % val
class SchemaInt(SchemaFieldSerializer[int]):
@classmethod
def deserialize(cls, val: str) -> int:
return int(val)
@classmethod
def serialize(cls, val: int) -> str:
return str(val)
class SchemaMultilineStr(SchemaFieldSerializer[str]):
@classmethod
def deserialize(cls, val: str) -> str:
# llinventory claims that it will parse multiple lines until it finds
# an "|" terminator. That's not true. Use llinventory's _actual_ behaviour.
return val.partition("|")[0]
@classmethod
def serialize(cls, val: str) -> str:
return val + "|"
class SchemaStr(SchemaFieldSerializer[str]):
@classmethod
def deserialize(cls, val: str) -> str:
return val
@classmethod
def serialize(cls, val: str) -> str:
return val
class SchemaUUID(SchemaFieldSerializer[UUID]):
@classmethod
def deserialize(cls, val: str) -> UUID:
return UUID(val)
@classmethod
def serialize(cls, val: UUID) -> str:
return str(val)
def schema_field(spec: Type[Union[SchemaBase, SchemaFieldSerializer]], *, default=dataclasses.MISSING, init=True,
repr=True, hash=None, compare=True) -> dataclasses.Field: # noqa
"""Describe a field in the inventory schema and the shape of its value"""
return dataclasses.field(
metadata={"spec": spec}, default=default, init=init, repr=repr, hash=hash, compare=compare
)
class SchemaParsingError(Exception):
pass
# The schema is meant to allow multi-line strings, but in practice
# it does not due to scanf() shenanigans. This is fine.
_SCHEMA_LINE_TOKENS_RE = re.compile(r'\A\s*([^\s]+)(\s+([^\t\r\n]+))?$')
def parse_schema_line(line: str):
g = _SCHEMA_LINE_TOKENS_RE.search(line)
if not g:
raise SchemaParsingError(f"{line!r} doesn't match the token regex")
return g.group(1), g.group(3)
@dataclasses.dataclass
class SchemaBase(abc.ABC):
@classmethod
def _fields_dict(cls):
return {f.name: f for f in dataclasses.fields(cls)}
@classmethod
def from_str(cls, text: str):
return cls.from_reader(StringIO(text))
@classmethod
@abc.abstractmethod
def from_reader(cls: Type[_T], reader: StringIO) -> _T:
pass
@classmethod
def from_bytes(cls, data: bytes):
return cls.from_str(data.decode("utf8"))
def to_bytes(self) -> bytes:
return self.to_str().encode("utf8")
def to_str(self) -> str:
writer = StringIO()
self.to_writer(writer)
writer.seek(0)
return writer.read()
@abc.abstractmethod
def to_writer(self, writer: StringIO):
pass
@classmethod
def _obj_from_dict(cls, obj_dict: Dict):
return cls(**obj_dict) # type: ignore

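The tokenizer and field serializers above are meant to round-trip the line-oriented schema. A small standalone check, with the regex and parse function copied from the diff and `SchemaHexInt`'s behaviour inlined:

```python
import re

_SCHEMA_LINE_TOKENS_RE = re.compile(r'\A\s*([^\s]+)(\s+([^\t\r\n]+))?$')


def parse_schema_line(line: str):
    # Split a line into (key, value); value is None for bare tokens like "{"
    g = _SCHEMA_LINE_TOKENS_RE.search(line)
    if not g:
        raise ValueError(f"{line!r} doesn't match the token regex")
    return g.group(1), g.group(3)


# Key/value lines split on the first run of whitespace
assert parse_schema_line("\t\tbase_mask\t7fffffff") == ("base_mask", "7fffffff")
# Structural tokens carry no value
assert parse_schema_line("\t{") == ("{", None)
# SchemaHexInt round-trip: parse base-16, emit zero-padded lowercase hex
assert int("7fffffff", 16) == 2147483647
assert "%08x" % 2147483647 == "7fffffff"
```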

@@ -39,6 +39,7 @@ class MeshAsset:
# These TypedDicts describe the expected shape of the LLSD in the mesh
# header and various segments. They're mainly for type hinting.
class MeshHeaderDict(TypedDict, total=False):
"""Header of the mesh file, includes offsets & sizes for segments' LLSD"""
version: int
creator: UUID
date: dt.datetime
@@ -54,6 +55,7 @@ class MeshHeaderDict(TypedDict, total=False):
class SegmentHeaderDict(TypedDict):
"""Standard shape for segment references within the header"""
offset: int
size: int
@@ -73,6 +75,7 @@ class PhysicsHavokSegmentHeaderDict(PhysicsSegmentHeaderDict, total=False):
class PhysicsCostDataHeaderDict(TypedDict, total=False):
"""Cost of physical representation, populated by server"""
decomposition: float
decomposition_discounted_vertices: int
decomposition_hulls: int
@@ -85,6 +88,7 @@ class PhysicsCostDataHeaderDict(TypedDict, total=False):
class MeshSegmentDict(TypedDict, total=False):
"""Dict of segments unpacked using the MeshHeaderDict"""
high_lod: List[LODSegmentDict]
medium_lod: List[LODSegmentDict]
low_lod: List[LODSegmentDict]
@@ -96,6 +100,7 @@ class MeshSegmentDict(TypedDict, total=False):
class LODSegmentDict(TypedDict, total=False):
"""Represents a single entry within the material list of a LOD segment"""
# Only present if True and no geometry
NoGeometry: bool
# -1.0 - 1.0
@@ -113,17 +118,22 @@ class LODSegmentDict(TypedDict, total=False):
class DomainDict(TypedDict):
"""Description of the real range for quantized coordinates"""
# number of elems depends on what the domain is for, Vec2 or Vec3
Max: List[float]
Min: List[float]
class VertexWeight(recordclass.datatuple): # type: ignore
"""Vertex weight for a specific joint on a specific vertex"""
# index of the joint within the joint_names list in the skin segment
joint_idx: int
# 0.0 - 1.0
weight: float
class SkinSegmentDict(TypedDict, total=False):
"""Rigging information"""
joint_names: List[str]
# model -> world transform matrix for model
bind_shape_matrix: List[float]
@@ -137,14 +147,17 @@ class SkinSegmentDict(TypedDict, total=False):
class PhysicsConvexSegmentDict(DomainDict, total=False):
"""Data for convex hull collisions, populated by the client"""
# Min / Max domain vals are inline, unlike for LODs
HullList: List[int]
# -1.0 - 1.0
# -1.0 - 1.0, dequantized from binary field of U16s
Positions: List[Vector3]
# -1.0 - 1.0
# -1.0 - 1.0, dequantized from binary field of U16s
BoundingVerts: List[Vector3]
class PhysicsHavokSegmentDict(TypedDict, total=False):
"""Cached data for Havok collisions, populated by sim and not used by client."""
HullMassProps: MassPropsDict
MOPP: MOPPDict
MeshDecompMassProps: MassPropsDict
@@ -169,8 +182,11 @@ class MOPPDict(TypedDict, total=False):
def positions_from_domain(positions: Iterable[TupleCoord], domain: DomainDict):
# Used for turning positions into their actual positions within the mesh / domain
# for ex: positions_from_domain(lod["Position"], lod["PositionDomain"])
"""
Used for turning positions into their actual positions within the mesh / domain
for ex: positions_from_domain(lod["Position"], lod["PositionDomain"])
"""
lower = domain['Min']
upper = domain['Max']
return [
@@ -179,7 +195,7 @@ def positions_from_domain(positions: Iterable[TupleCoord], domain: DomainDict):
def positions_to_domain(positions: Iterable[TupleCoord], domain: DomainDict):
# Used for turning positions into their actual positions within the mesh / domain
"""Used for turning positions into their actual positions within the mesh / domain"""
lower = domain['Min']
upper = domain['Max']
return [
@@ -187,7 +203,36 @@ def positions_to_domain(positions: Iterable[TupleCoord], domain: DomainDict):
]
class VertexWeights(se.SerializableBase):
"""Serializer for a list of joint weights on a single vertex"""
INFLUENCE_SER = se.QuantizedFloat(se.U16, 0.0, 1.0)
INFLUENCE_LIMIT = 4
INFLUENCE_TERM = 0xFF
@classmethod
def serialize(cls, vals, writer: se.BufferWriter, ctx=None):
if len(vals) > cls.INFLUENCE_LIMIT:
raise ValueError(f"{vals!r} is too long, can only have {cls.INFLUENCE_LIMIT} influences!")
for val in vals:
joint_idx, influence = val
writer.write(se.U8, joint_idx)
writer.write(cls.INFLUENCE_SER, influence, ctx=ctx)
if len(vals) != cls.INFLUENCE_LIMIT:
writer.write(se.U8, cls.INFLUENCE_TERM)
@classmethod
def deserialize(cls, reader: se.Reader, ctx=None):
influence_list = []
for _ in range(cls.INFLUENCE_LIMIT):
joint_idx = reader.read(se.U8)
if joint_idx == cls.INFLUENCE_TERM:
break
influence_list.append(VertexWeight(joint_idx, reader.read(cls.INFLUENCE_SER, ctx=ctx)))
return influence_list
class SegmentSerializer:
"""Serializer for binary fields within an LLSD object"""
def __init__(self, templates):
self._templates: Dict[str, se.SerializableBase] = templates
@@ -217,33 +262,6 @@ class SegmentSerializer:
return new_segment
class VertexWeights(se.SerializableBase):
INFLUENCE_SER = se.QuantizedFloat(se.U16, 0.0, 1.0)
INFLUENCE_LIMIT = 4
INFLUENCE_TERM = 0xFF
@classmethod
def serialize(cls, vals, writer: se.BufferWriter, ctx=None):
if len(vals) > cls.INFLUENCE_LIMIT:
raise ValueError(f"{vals!r} is too long, can only have {cls.INFLUENCE_LIMIT} influences!")
for val in vals:
joint_idx, influence = val
writer.write(se.U8, joint_idx)
writer.write(cls.INFLUENCE_SER, influence, ctx=ctx)
if len(vals) != cls.INFLUENCE_LIMIT:
writer.write(se.U8, cls.INFLUENCE_TERM)
@classmethod
def deserialize(cls, reader: se.Reader, ctx=None):
influence_list = []
for _ in range(cls.INFLUENCE_LIMIT):
joint_idx = reader.read(se.U8)
if joint_idx == cls.INFLUENCE_TERM:
break
influence_list.append(VertexWeight(joint_idx, reader.read(cls.INFLUENCE_SER, ctx=ctx)))
return influence_list
LOD_SEGMENT_SERIALIZER = SegmentSerializer({
# 16-bit indices to the verts making up the tri. Imposes a 16-bit
# upper limit on verts in any given material in the mesh.
@@ -265,6 +283,7 @@ class LLMeshSerializer(se.SerializableBase):
KNOWN_SEGMENTS = ("lowest_lod", "low_lod", "medium_lod", "high_lod",
"physics_mesh", "physics_convex", "skin", "physics_havok")
# Define unpackers for specific binary fields within the parsed LLSD segments
SEGMENT_TEMPLATES: Dict[str, SegmentSerializer] = {
"lowest_lod": LOD_SEGMENT_SERIALIZER,
"low_lod": LOD_SEGMENT_SERIALIZER,

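The domain helpers in this file map quantized coordinates into their real ranges via a per-component linear interpolation between the domain's `Min` and `Max`. A minimal sketch, assuming inputs already normalized to 0.0-1.0 per axis (the convex-segment comments note some fields dequantize to -1.0-1.0 instead, so the input range is an assumption here):

```python
from typing import Iterable, List, Tuple


def positions_from_domain(
    positions: Iterable[Tuple[float, ...]],
    domain: dict,
) -> List[Tuple[float, ...]]:
    # Per-component lerp: 0.0 maps to Min, 1.0 maps to Max
    lower, upper = domain["Min"], domain["Max"]
    return [
        tuple(lo + c * (hi - lo) for c, lo, hi in zip(pos, lower, upper))
        for pos in positions
    ]


domain = {"Min": [-2.0, -2.0, 0.0], "Max": [2.0, 2.0, 4.0]}
assert positions_from_domain([(0.0, 0.5, 1.0)], domain) == [(-2.0, 0.0, 4.0)]
```

The inverse (`positions_to_domain`) would divide out `(hi - lo)` instead, which is why both functions need the same `DomainDict` alongside the coordinate list.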

@@ -19,5 +19,3 @@ You should have received a copy of the GNU Lesser General Public License
along with this program; if not, write to the Free Software Foundation,
Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""


@@ -0,0 +1,82 @@
from __future__ import annotations
import abc
import datetime as dt
import logging
from typing import *
from typing import Optional
from .message_handler import MessageHandler
from ..network.transport import AbstractUDPTransport, UDPPacket, Direction, ADDR_TUPLE
from .message import Block, Message
from .msgtypes import PacketFlags
from .udpserializer import UDPMessageSerializer
class Circuit:
def __init__(self, near_host: Optional[ADDR_TUPLE], far_host: ADDR_TUPLE, transport):
self.near_host: Optional[ADDR_TUPLE] = near_host
self.host: ADDR_TUPLE = far_host
self.is_alive = True
self.transport: Optional[AbstractUDPTransport] = transport
self.serializer = UDPMessageSerializer()
self.last_packet_at = dt.datetime.now()
self.packet_id_base = 0
def _send_prepared_message(self, message: Message, transport=None):
try:
serialized = self.serializer.serialize(message)
except:
logging.exception(f"Failed to serialize: {message.to_dict()!r}")
raise
return self.send_datagram(serialized, message.direction, transport=transport)
def send_datagram(self, data: bytes, direction: Direction, transport=None):
self.last_packet_at = dt.datetime.now()
src_addr, dst_addr = self.host, self.near_host
if direction == Direction.OUT:
src_addr, dst_addr = self.near_host, self.host
packet = UDPPacket(src_addr, dst_addr, data, direction)
(transport or self.transport).send_packet(packet)
return packet
def prepare_message(self, message: Message):
if message.finalized:
raise RuntimeError(f"Trying to re-send finalized {message!r}")
message.packet_id = self.packet_id_base
self.packet_id_base += 1
if not message.acks:
message.send_flags &= ~PacketFlags.ACK
# If it was queued, it's not anymore
message.queued = False
message.finalized = True
def send_message(self, message: Message, transport=None):
self.prepare_message(message)
return self._send_prepared_message(message, transport)
def send_acks(self, to_ack: Sequence[int], direction=Direction.OUT, packet_id=None):
logging.debug("%r acking %r" % (direction, to_ack))
# TODO: maybe tack this onto `.acks` for next message?
message = Message('PacketAck', *[Block('Packets', ID=x) for x in to_ack])
message.packet_id = packet_id
message.direction = direction
message.injected = True
self.send_message(message)
def __repr__(self):
return "<%s %r : %r>" % (self.__class__.__name__, self.near_host, self.host)
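`Circuit` manipulates `send_flags` with bitwise operations on `PacketFlags`. Because `PacketFlags` is an `IntFlag`, OR-ing sets a bit and AND-ing with the complement clears it without disturbing other flags. A standalone sketch of the pattern (the flag values below are illustrative assumptions, not taken from `msgtypes.py`):

```python
import enum

class PacketFlags(enum.IntFlag):
    # Illustrative bit values; the real constants live in msgtypes.py
    ZEROCODED = 0x80
    RELIABLE = 0x40
    RESENT = 0x20
    ACK = 0x10

flags = PacketFlags.RELIABLE
flags |= PacketFlags.ACK    # set ACK; RELIABLE is untouched
flags &= ~PacketFlags.ACK   # clear only ACK again
```

After both operations, `flags` is back to just `RELIABLE`, which is why the class uses these ops rather than plain assignment.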
class ConnectionHolder(abc.ABC):
"""
Any object that has both a circuit and a message handler.
Preferred to explicitly passing around a (circuit, message handler) pair
because a ConnectionHolder generally represents a region or a client.
The same region or client may have multiple different circuits across the
lifetime of a session (due to region restarts, etc.)
"""
circuit: Optional[Circuit]
message_handler: MessageHandler[Message, str]

View File

@@ -20,8 +20,8 @@ along with this program; if not, write to the Free Software Foundation,
Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""
import os
from hippolyzer.lib.base.helpers import get_resource_filename
msg_tmpl = open(os.path.join(os.path.dirname(__file__), 'message_template.msg'))
with open(os.path.join(os.path.dirname(__file__), 'message.xml'), "rb") as _f:
msg_tmpl = open(get_resource_filename("lib/base/message/data/message_template.msg"))
with open(get_resource_filename("lib/base/message/data/message.xml"), "rb") as _f:
msg_details = _f.read()

View File

@@ -5,14 +5,13 @@ from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.message.data_packer import LLSDDataPacker
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.message.template import MessageTemplateVariable
from hippolyzer.lib.base.message.template_dict import TemplateDictionary
from hippolyzer.lib.base.message.template_dict import TemplateDictionary, DEFAULT_TEMPLATE_DICT
VAR_PAIR = Tuple[dict, MessageTemplateVariable]
class LLSDMessageSerializer:
DEFAULT_TEMPLATE = TemplateDictionary()
DEFAULT_TEMPLATE = DEFAULT_TEMPLATE_DICT
def __init__(self, message_template=None, message_cls: Type[Message] = Message):
if message_template is not None:

View File

@@ -18,32 +18,57 @@ You should have received a copy of the GNU Lesser General Public License
along with this program; if not, write to the Free Software Foundation,
Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""
from __future__ import annotations
import copy
import enum
import importlib
import itertools
import logging
import os
import uuid
from typing import *
from .. import serialization as se
from ..datatypes import *
from .msgtypes import PacketFlags
from hippolyzer.lib.base.datatypes import *
import hippolyzer.lib.base.serialization as se
import hippolyzer.lib.base.templates as templates
from hippolyzer.lib.base.datatypes import Pretty
from hippolyzer.lib.base.message.msgtypes import PacketFlags
from hippolyzer.lib.base.network.transport import Direction, ADDR_TUPLE
BLOCK_DICT = Dict[str, "MsgBlockList"]
VAR_TYPE = Union[TupleCoord, bytes, str, float, int, Tuple, UUID]
_TEMPLATES_MTIME = os.stat(templates.__file__).st_mtime
def maybe_reload_templates():
# Templates may be modified at runtime during development, check
# if they've changed since startup and reload if they have.
global _TEMPLATES_MTIME
templates_mtime = os.stat(templates.__file__).st_mtime
if _TEMPLATES_MTIME is None or _TEMPLATES_MTIME < templates_mtime:
print("Reloading templates")
try:
importlib.reload(templates) # type: ignore
_TEMPLATES_MTIME = templates_mtime
except:
logging.exception("Failed to reload templates!")
class Block:
"""
base representation of a block
Block expects a name, and kwargs for variables (var_name = value)
"""
__slots__ = ('name', 'size', 'vars', 'message_name', '_ser_cache', 'fill_missing',)
PARENT_MESSAGE_NAME: ClassVar[Optional[str]] = None
def __init__(self, name, /, fill_missing=False, **kwargs):
def __init__(self, name, /, *, fill_missing=False, **kwargs):
self.name = name
self.size = 0
self.message_name: Optional[str] = None
self.message_name: Optional[str] = self.PARENT_MESSAGE_NAME
self.vars: Dict[str, VAR_TYPE] = {}
self._ser_cache: Dict[str, Any] = {}
self.fill_missing = fill_missing
@@ -60,6 +85,9 @@ class Block:
return self.vars[name]
def __setitem__(self, key, value):
if isinstance(value, Pretty):
return self.serialize_var(key, value.value)
# These don't pickle well since they're likely to get hot-reloaded
if isinstance(value, (enum.IntEnum, enum.IntFlag)):
value = int(value)
@@ -129,24 +157,7 @@ class Block:
continue
# We have a serializer, include the pretty output in the repr,
# using the _ suffix so the builder knows it needs to be serialized.
deserialized = self.deserialize_var(key)
type_name = type(deserialized).__name__
# TODO: replace __repr__ for these in a context manager so nested
# Enums / Flags get handled correctly as well. The point of the
# pretty repr() is to make messages directly paste-able into code.
if isinstance(deserialized, enum.IntEnum):
deserialized = f"{type_name}.{deserialized.name}"
elif isinstance(deserialized, enum.IntFlag):
# Make an ORed together version of the flags based on the POD version
flags = se.flags_to_pod(type(deserialized), deserialized)
flags = " | ".join(
(f"{type_name}.{v}" if isinstance(v, str) else str(v))
for v in flags
)
deserialized = f"({flags})"
else:
deserialized = repr(deserialized)
block_vars[f"{key}_"] = deserialized
block_vars[f"{key}_"] = repr(self.deserialize_var(key))
else:
block_vars = self.vars
@@ -175,18 +186,23 @@ class MsgBlockList(List["Block"]):
class Message:
__slots__ = ("name", "send_flags", "_packet_id", "acks", "body_boundaries", "queued",
"offset", "raw_extra", "raw_body", "deserializer", "_blocks", "finalized")
__slots__ = ("name", "send_flags", "packet_id", "acks", "body_boundaries", "queued",
"offset", "raw_extra", "raw_body", "deserializer", "_blocks", "finalized",
"direction", "meta", "injected", "dropped", "sender")
def __init__(self, name, *args, packet_id=None, flags=0, acks=None, direction=None):
# TODO: Do this on a timer or something.
maybe_reload_templates()
def __init__(self, name, *args, packet_id=None, flags=0, acks=None):
self.name = name
self.send_flags = flags
self._packet_id: Optional[int] = packet_id # aka, sequence number
self.packet_id: Optional[int] = packet_id # aka, sequence number
self.acks = acks if acks is not None else tuple()
self.body_boundaries = (-1, -1)
self.offset = 0
self.raw_extra = b""
self.direction: Direction = direction if direction is not None else Direction.OUT
# For lazy deserialization
self.raw_body = None
self.deserializer = None
@@ -196,19 +212,13 @@ class Message:
# Whether message is owned by the queue or should be sent immediately
self.queued: bool = False
self._blocks: BLOCK_DICT = {}
self.meta = {}
self.injected = False
self.dropped = False
self.sender: Optional[ADDR_TUPLE] = None
self.add_blocks(args)
@property
def packet_id(self) -> Optional[int]:
return self._packet_id
@packet_id.setter
def packet_id(self, val: Optional[int]):
self._packet_id = val
# Changing packet ID clears the finalized flag
self.finalized = False
def add_blocks(self, block_list):
# can have a list of blocks if it is multiple or variable
for block in block_list:
@@ -281,7 +291,7 @@ class Message:
if self.raw_body and self.deserializer():
self.deserializer().parse_message_body(self)
def to_dict(self):
def to_dict(self, extended=False):
""" A dict representation of a message.
This is the form used for templated messages sent via EQ.
@@ -297,6 +307,18 @@ class Message:
new_vars[var_name] = val
dict_blocks.append(new_vars)
if extended:
base_repr.update({
"packet_id": self.packet_id,
"meta": self.meta.copy(),
"dropped": self.dropped,
"injected": self.injected,
"direction": self.direction.name,
"send_flags": int(self.send_flags),
"extra": self.extra,
"acks": self.acks,
})
return base_repr
@classmethod
@@ -306,9 +328,23 @@ class Message:
msg.create_block_list(block_type)
for block in blocks:
msg.add_block(Block(block_type, **block))
if 'packet_id' in dict_val:
# extended format
msg.packet_id = dict_val['packet_id']
msg.meta = dict_val['meta']
msg.dropped = dict_val['dropped']
msg.injected = dict_val['injected']
msg.direction = Direction[dict_val['direction']]
msg.send_flags = dict_val['send_flags']
msg.extra = dict_val['extra']
msg.acks = dict_val['acks']
return msg
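The extended `to_dict()`/`from_dict()` round-trip can be sketched standalone with a simplified message type. The field names mirror the diff; the class itself is illustrative, not the real `Message`:

```python
import enum
from dataclasses import dataclass
from typing import Optional, Tuple

class Direction(enum.Enum):
    OUT = 0
    IN = 1

@dataclass
class TinyMessage:
    name: str
    packet_id: Optional[int] = None
    direction: Direction = Direction.OUT
    injected: bool = False
    acks: Tuple[int, ...] = ()

    def to_dict(self, extended: bool = False) -> dict:
        base = {"name": self.name}
        if extended:
            # Enums are stored by name so the dict stays serialization-friendly
            base.update({
                "packet_id": self.packet_id,
                "direction": self.direction.name,
                "injected": self.injected,
                "acks": list(self.acks),
            })
        return base

    @classmethod
    def from_dict(cls, val: dict) -> "TinyMessage":
        msg = cls(val["name"])
        if "packet_id" in val:  # extended format
            msg.packet_id = val["packet_id"]
            msg.direction = Direction[val["direction"]]
            msg.injected = val["injected"]
            msg.acks = tuple(val["acks"])
        return msg

orig = TinyMessage("PacketAck", packet_id=7, direction=Direction.IN, acks=(1, 2))
round_tripped = TinyMessage.from_dict(orig.to_dict(extended=True))
```

Keying the format detection off `'packet_id' in dict_val`, as the diff does, lets the same loader accept both the plain and extended forms.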
def invalidate_caches(self):
# Don't have any caches if we haven't even parsed
if self.raw_body:
return
for blocks in self.blocks.values():
for block in blocks:
block.invalidate_caches()
@@ -331,7 +367,7 @@ class Message:
block_reprs = sep.join(x.repr(pretty=pretty) for x in itertools.chain(*self.blocks.values()))
if block_reprs:
block_reprs = sep + block_reprs
return f"{self.name!r}{block_reprs}"
return f"{self.name!r}{block_reprs}, direction=Direction.{self.direction.name}"
def repr(self, pretty=False):
self.ensure_parsed()
@@ -341,14 +377,29 @@ class Message:
message_copy = copy.deepcopy(self)
# Set the queued flag so the original will be dropped and acks will be sent
self.queued = True
if not self.finalized:
self.queued = True
# Original was dropped so let's make sure we have clean acks and packet id
message_copy.acks = tuple()
message_copy.send_flags &= ~PacketFlags.ACK
message_copy.packet_id = None
message_copy.dropped = False
message_copy.finalized = False
return message_copy
def to_summary(self):
string = ""
for block_name, block_list in self.blocks.items():
for block in block_list:
for var_name, val in block.items():
if block.name == "AgentData" and var_name in ("AgentID", "SessionID"):
continue
if string:
string += ", "
string += f"{var_name}={_trunc_repr(val, 10)}"
return string
def __repr__(self):
return self.repr()
@@ -356,3 +407,16 @@ class Message:
if not isinstance(other, self.__class__):
return NotImplemented
return self.to_dict() == other.to_dict()
def _trunc_repr(val, max_len):
if isinstance(val, (uuid.UUID, TupleCoord)):
val = str(val)
repr_val = repr(val)
if isinstance(val, str):
repr_val = repr_val[1:-1]
if isinstance(val, bytes):
repr_val = repr_val[2:-1]
if len(repr_val) > max_len:
return repr_val[:max_len] + "…"
return repr_val
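The summary truncation can be exercised standalone. This is a minimal sketch of the helper (UUID/coordinate stringification omitted; the ellipsis suffix is an assumption about the truncation marker):

```python
def trunc_repr(val, max_len):
    # str/bytes lose their repr() quoting so the summary stays compact
    repr_val = repr(val)
    if isinstance(val, str):
        repr_val = repr_val[1:-1]
    if isinstance(val, bytes):
        repr_val = repr_val[2:-1]
    if len(repr_val) > max_len:
        return repr_val[:max_len] + "…"
    return repr_val
```

Stripping the quotes before measuring means a ten-character budget buys ten characters of payload, not eight plus quoting.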

View File

@@ -1,53 +1,19 @@
import ast
import base64
import importlib
import logging
import math
import os
import re
import uuid
from typing import *
import hippolyzer.lib.base.datatypes
from hippolyzer.lib.base.datatypes import *
import hippolyzer.lib.base.serialization as se
from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.helpers import HippoPrettyPrinter
from hippolyzer.lib.base.message.message import Message, Block, PacketFlags
import hippolyzer.lib.proxy.templates as templates
from hippolyzer.lib.base.message.msgtypes import MsgBlockType
from hippolyzer.lib.base.message.template import MessageTemplate
from hippolyzer.lib.proxy.packets import Direction
_TEMPLATES_MTIME = os.stat(templates.__file__).st_mtime
def _maybe_reload_templates():
# Templates may be modified at runtime during development, check
# if they've changed since startup and reload if they have.
global _TEMPLATES_MTIME
templates_mtime = os.stat(templates.__file__).st_mtime
if _TEMPLATES_MTIME is None or _TEMPLATES_MTIME < templates_mtime:
print("Reloading templates")
try:
importlib.reload(templates) # type: ignore
_TEMPLATES_MTIME = templates_mtime
except:
logging.exception("Failed to reload templates!")
def _trunc_repr(val, max_len):
if isinstance(val, (uuid.UUID, TupleCoord)):
val = str(val)
repr_val = repr(val)
if isinstance(val, str):
repr_val = repr_val[1:-1]
if isinstance(val, bytes):
repr_val = repr_val[2:-1]
if len(repr_val) > max_len:
return repr_val[:max_len] + "…"
return repr_val
from .. import datatypes
from .. import llsd
from .. import serialization as se
from ..helpers import HippoPrettyPrinter
from ..network.transport import Direction
from .msgtypes import PacketFlags, MsgBlockType
from .template import MessageTemplate
from .message import Message, Block, maybe_reload_templates
class VerbatimHumanVal(str):
@@ -58,135 +24,31 @@ def _filtered_exports(mod):
return {k: getattr(mod, k) for k in mod.__all__}
def proxy_eval(eval_str: str, globals_=None, locals_=None):
def subfield_eval(eval_str: str, globals_=None, locals_=None):
return eval(
eval_str,
{
"llsd": llsd,
"base64": base64,
"math": math,
**_filtered_exports(hippolyzer.lib.base.datatypes),
**_filtered_exports(datatypes),
**(globals_ or {})},
locals_
)
class ProxiedMessage(Message):
__slots__ = ("meta", "injected", "dropped", "direction")
TextSpan = Tuple[int, int]
SpanDict = Dict[Tuple[Union[str, int], ...], TextSpan]
def __init__(self, *args, direction=None, **kwargs):
super().__init__(*args, **kwargs)
self.direction = direction if direction is not None else Direction.OUT
self.meta = {}
self.injected = False
self.dropped = False
_maybe_reload_templates()
def to_human_string(self, replacements=None, beautify=False,
template: Optional[MessageTemplate] = None):
replacements = replacements or {}
_maybe_reload_templates()
string = ""
if self.direction is not None:
string += f'{self.direction.name} '
string += self.name
if self.packet_id is not None:
string += f'\n# {self.packet_id}: {PacketFlags(self.send_flags)!r}'
string += f'{", DROPPED" if self.dropped else ""}{", INJECTED" if self.injected else ""}'
if self.extra:
string += f'\n# EXTRA: {self.extra!r}'
string += '\n\n'
class SpannedString(str):
spans: SpanDict = {}
for block_name, block_list in self.blocks.items():
block_suffix = ""
if template and template.get_block(block_name).block_type == MsgBlockType.MBT_VARIABLE:
block_suffix = ' # Variable'
for block in block_list:
string += f"[{block_name}]{block_suffix}\n"
for var_name, val in block.items():
string += self._format_var(block, var_name, val, replacements, beautify)
return string
def _format_var(self, block, var_name, var_val, replacements, beautify=False):
string = ""
# Check if we have a more human-readable way to present this field
ser_key = (self.name, block.name, var_name)
serializer = se.SUBFIELD_SERIALIZERS.get(ser_key)
field_prefix = ""
if isinstance(var_val, VerbatimHumanVal):
var_data = var_val
elif isinstance(var_val, (uuid.UUID, TupleCoord)):
var_data = str(var_val)
elif isinstance(var_val, (str, bytes)) and not serializer:
var_data = self._multi_line_pformat(var_val)
else:
var_data = repr(var_val)
if serializer and beautify and not isinstance(var_val, VerbatimHumanVal):
try:
pretty_data = serializer.deserialize(block, var_val, pod=True)
if pretty_data is not se.UNSERIALIZABLE:
string += f" {var_name} =| {self._multi_line_pformat(pretty_data)}"
if serializer.AS_HEX and isinstance(var_val, int):
var_data = hex(var_val)
if serializer.ORIG_INLINE:
string += f" #{var_data}\n"
return string
else:
string += "\n"
# Human-readable version should be used, orig data is commented out
field_prefix = "#"
except:
logging.exception(f"Failed in subfield serializer {ser_key!r}")
if beautify:
if block.name == "AgentData":
if var_name == "AgentID" and var_val == replacements.get("AGENT_ID"):
var_data = "[[AGENT_ID]]"
elif var_name == "SessionID" and var_val == replacements.get("SESSION_ID"):
var_data = "[[SESSION_ID]]"
if "CircuitCode" in var_name or ("Code" in var_name and "Circuit" in block.name):
if var_val == replacements.get("CIRCUIT_CODE"):
var_data = "[[CIRCUIT_CODE]]"
string += f" {field_prefix}{var_name} = {var_data}\n"
return string
@staticmethod
def _multi_line_pformat(val):
printer = HippoPrettyPrinter(width=100)
val = printer.pformat(val)
newstr = ""
# Now we need to rebuild this to add in the appropriate
# line continuations.
lines = list(val.splitlines())
first_line = True
while lines:
line = lines.pop(0)
prefix = ""
suffix = ""
if first_line:
first_line = False
else:
prefix = " "
if lines:
suffix = " \\\n"
newstr += f"{prefix}{line}{suffix}"
return newstr
def to_summary(self):
string = ""
for block_name, block_list in self.blocks.items():
for block in block_list:
for var_name, val in block.items():
if block.name == "AgentData" and var_name in ("AgentID", "SessionID"):
continue
if string:
string += ", "
string += f"{var_name}={_trunc_repr(val, 10)}"
return string
class HumanMessageSerializer:
@classmethod
def from_human_string(cls, string, replacements=None, env=None, safe=True):
_maybe_reload_templates()
maybe_reload_templates()
replacements = replacements or {}
env = env or {}
first_line = True
@@ -201,7 +63,7 @@ class ProxiedMessage(Message):
if first_line:
direction, message_name = line.split(" ", 1)
msg = ProxiedMessage(message_name)
msg = Message(message_name)
msg.direction = Direction[direction.upper()]
first_line = False
continue
@@ -240,14 +102,14 @@ class ProxiedMessage(Message):
var_val = tuple(float(x) for x in var_val.split(","))
# UUID-ish
elif re.match(r"\A\w+-\w+-.*", var_val):
var_val = UUID(var_val)
var_val = datatypes.UUID(var_val)
else:
var_val = ast.literal_eval(var_val)
# Normally gross, but necessary for expressiveness in built messages
# unless a metalanguage is added.
if evaled:
var_val = proxy_eval(
var_val = subfield_eval(
var_val,
globals_={**env, **replacements},
locals_={"block": cur_block}
@@ -265,6 +127,102 @@ class ProxiedMessage(Message):
cur_block[var_name] = var_val
return msg
def _args_repr(self, pretty=False):
base = super()._args_repr(pretty=pretty)
return f"{base}, direction=Direction.{self.direction.name}"
@classmethod
def to_human_string(cls, msg: Message, replacements=None, beautify=False,
template: Optional[MessageTemplate] = None) -> SpannedString:
replacements = replacements or {}
maybe_reload_templates()
spans: SpanDict = {}
string = ""
if msg.direction is not None:
string += f'{msg.direction.name} '
string += msg.name
if msg.packet_id is not None:
string += f'\n# {msg.packet_id}: {PacketFlags(msg.send_flags)!r}'
string += f'{", DROPPED" if msg.dropped else ""}{", INJECTED" if msg.injected else ""}'
if msg.extra:
string += f'\n# EXTRA: {msg.extra!r}'
string += '\n\n'
for block_name, block_list in msg.blocks.items():
block_suffix = ""
if template and template.get_block(block_name).block_type == MsgBlockType.MBT_VARIABLE:
block_suffix = ' # Variable'
for block_num, block in enumerate(block_list):
string += f"[{block_name}]{block_suffix}\n"
for var_name, val in block.items():
start_len = len(string)
string += cls._format_var(msg, block, var_name, val, replacements, beautify)
end_len = len(string)
# Store the spans for each var so we can highlight specific matches
spans[(msg.name, block_name, block_num, var_name)] = (start_len, end_len)
string += "\n"
spanned = SpannedString(string)
spanned.spans = spans
return spanned
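The `SpannedString` trick works because a `str` subclass gains a `__dict__`, so the rendered text can carry per-variable spans for later highlighting. A self-contained sketch (the span-key tuple shape mirrors the diff; the rendered vars are made up):

```python
from typing import Dict, Tuple

class SpannedString(str):
    """A str that also records where each var was written, for highlighting."""
    spans: Dict[Tuple, Tuple[int, int]] = {}

string = ""
spans = {}
for var_name, val in [("ID", 1), ("Flags", 0)]:
    start = len(string)
    string += f"{var_name} = {val}\n"
    # Key includes message, block, block index, and var name
    spans[("PacketAck", "Packets", 0, var_name)] = (start, len(string))

spanned = SpannedString(string)
spanned.spans = spans
```

Slicing the string with a stored span recovers exactly the text emitted for that variable, which is what the message log UI needs for match highlighting.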
@classmethod
def _format_var(cls, msg, block, var_name, var_val, replacements, beautify=False):
string = ""
# Check if we have a more human-readable way to present this field
ser_key = (msg.name, block.name, var_name)
serializer = se.SUBFIELD_SERIALIZERS.get(ser_key)
field_prefix = ""
if isinstance(var_val, VerbatimHumanVal):
var_data = var_val
elif isinstance(var_val, (uuid.UUID, datatypes.TupleCoord)):
var_data = str(var_val)
elif isinstance(var_val, (str, bytes)) and not serializer:
var_data = cls._multi_line_pformat(var_val)
else:
var_data = repr(var_val)
if serializer and beautify and not isinstance(var_val, VerbatimHumanVal):
try:
pretty_data = serializer.deserialize(block, var_val, pod=True)
if pretty_data is not se.UNSERIALIZABLE:
string += f" {var_name} =| {cls._multi_line_pformat(pretty_data)}"
if serializer.AS_HEX and isinstance(var_val, int):
var_data = hex(var_val)
if serializer.ORIG_INLINE:
string += f" #{var_data}"
return string
else:
string += "\n"
# Human-readable version should be used, orig data is commented out
field_prefix = "#"
except:
logging.exception(f"Failed in subfield serializer {ser_key!r}")
if beautify:
if block.name == "AgentData":
if var_name == "AgentID" and var_val == replacements.get("AGENT_ID"):
var_data = "[[AGENT_ID]]"
elif var_name == "SessionID" and var_val == replacements.get("SESSION_ID"):
var_data = "[[SESSION_ID]]"
if "CircuitCode" in var_name or ("Code" in var_name and "Circuit" in block.name):
if var_val == replacements.get("CIRCUIT_CODE"):
var_data = "[[CIRCUIT_CODE]]"
string += f" {field_prefix}{var_name} = {var_data}"
return string
@staticmethod
def _multi_line_pformat(val):
printer = HippoPrettyPrinter(width=100)
val = printer.pformat(val)
newstr = ""
# Now we need to rebuild this to add in the appropriate
# line continuations.
lines = list(val.splitlines())
first_line = True
while lines:
line = lines.pop(0)
prefix = ""
suffix = ""
if first_line:
first_line = False
else:
prefix = " "
if lines:
suffix = " \\\n"
newstr += f"{prefix}{line}{suffix}"
return newstr
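The while/pop loop in `_multi_line_pformat` amounts to joining the pretty-printed lines with backslash continuations. An equivalent compact sketch (the two-space continuation indent is an assumption about the original's whitespace):

```python
def add_continuations(text: str) -> str:
    """Rejoin a multi-line pformat() result with trailing backslashes."""
    newstr = ""
    lines = text.splitlines()
    for i, line in enumerate(lines):
        prefix = "  " if i else ""                     # indent continuation lines
        suffix = " \\\n" if i < len(lines) - 1 else ""  # all but the last continue
        newstr += f"{prefix}{line}{suffix}"
    return newstr
```

The continuations matter because the human-readable message format is line-oriented: a wrapped value must still parse as one logical `var = value` line.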

View File

@@ -28,36 +28,36 @@ from hippolyzer.lib.base.events import Event
LOG = logging.getLogger(__name__)
_T = TypeVar("_T")
_K = TypeVar("_K", bound=Hashable)
MESSAGE_HANDLER = Callable[[_T], Any]
PREDICATE = Callable[[_T], bool]
MESSAGE_NAMES = Union[str, Iterable[str]]
MESSAGE_NAMES = Iterable[_K]
class MessageHandler(Generic[_T]):
def __init__(self):
self.handlers: Dict[str, Event] = {}
class MessageHandler(Generic[_T, _K]):
def __init__(self, take_by_default: bool = True):
self.handlers: Dict[_K, Event] = {}
self.take_by_default = take_by_default
def register(self, message_name: str) -> Event:
def register(self, message_name: _K) -> Event:
LOG.debug('Creating a monitor for %s' % message_name)
return self.handlers.setdefault(message_name, Event())
def subscribe(self, message_name: str, handler: MESSAGE_HANDLER) -> Event:
def subscribe(self, message_name: _K, handler: MESSAGE_HANDLER) -> Event:
notifier = self.register(message_name)
notifier.subscribe(handler)
return notifier
def _subscribe_all(self, message_names: MESSAGE_NAMES, handler: MESSAGE_HANDLER,
predicate: Optional[PREDICATE] = None) -> List[Event]:
if isinstance(message_names, str):
message_names = (message_names,)
notifiers = [self.register(name) for name in message_names]
for n in notifiers:
n.subscribe(handler, predicate=predicate)
return notifiers
@contextlib.contextmanager
def subscribe_async(self, message_names: MESSAGE_NAMES, take: bool = True,
predicate: Optional[PREDICATE] = None) -> ContextManager[Callable[[], Awaitable[_T]]]:
def subscribe_async(self, message_names: MESSAGE_NAMES, predicate: Optional[PREDICATE] = None,
take: Optional[bool] = None) -> ContextManager[Callable[[], Awaitable[_T]]]:
"""
Subscribe to a set of messages matching the predicate while within a block
@@ -69,6 +69,8 @@ class MessageHandler(Generic[_T]):
If a subscriber is just an observer that will never drop or modify a message, take=False
may be used and messages will be sent as usual.
"""
if take is None:
take = self.take_by_default
msg_queue = asyncio.Queue()
def _handler_wrapper(message: _T):
@@ -91,8 +93,8 @@ class MessageHandler(Generic[_T]):
for n in notifiers:
n.unsubscribe(_handler_wrapper)
def wait_for(self, message_names: MESSAGE_NAMES,
predicate: Optional[PREDICATE] = None, timeout=None, take=True) -> Awaitable[_T]:
def wait_for(self, message_names: MESSAGE_NAMES, predicate: Optional[PREDICATE] = None,
timeout: Optional[float] = None, take: Optional[bool] = None) -> Awaitable[_T]:
"""
Wait for a single instance of one of message_names matching the predicate
@@ -101,8 +103,8 @@ class MessageHandler(Generic[_T]):
sequence of packets, since multiple packets may come in after the future has already
been marked completed, causing some to be missed.
"""
if isinstance(message_names, str):
message_names = (message_names,)
if take is None:
take = self.take_by_default
notifiers = [self.register(name) for name in message_names]
fut = asyncio.get_event_loop().create_future()
@@ -132,7 +134,7 @@ class MessageHandler(Generic[_T]):
notifier.subscribe(_handler, predicate=predicate)
return fut
def is_handled(self, message_name: str):
def is_handled(self, message_name: _K):
return message_name in self.handlers
def handle(self, message: _T):
@@ -140,7 +142,7 @@ class MessageHandler(Generic[_T]):
# Always try to call wildcard handlers
self._handle_type('*', message)
def _handle_type(self, name: str, message: _T):
def _handle_type(self, name: _K, message: _T):
handler = self.handlers.get(name)
if not handler:
return
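The future-based `wait_for()` pattern above can be sketched with a minimal handler. This toy version (unsubscription and timeouts omitted, names are illustrative) shows how a predicate-filtered future resolves on the first matching message:

```python
import asyncio

class TinyHandler:
    """Minimal sketch of the future-based wait_for pattern."""
    def __init__(self):
        self.handlers = {}

    def wait_for(self, names, predicate=None):
        fut = asyncio.get_running_loop().create_future()

        def _handler(message):
            if predicate and not predicate(message):
                return
            if not fut.done():
                fut.set_result(message)

        for name in names:
            self.handlers.setdefault(name, []).append(_handler)
        return fut

    def handle(self, name, message):
        for h in self.handlers.get(name, []):
            h(message)

async def main():
    handler = TinyHandler()
    fut = handler.wait_for(("PacketAck",), predicate=lambda m: m["ID"] > 1)
    handler.handle("PacketAck", {"ID": 1})  # filtered out by the predicate
    handler.handle("PacketAck", {"ID": 2})  # resolves the future
    return await fut

result = asyncio.run(main())
```

The `fut.done()` guard is the reason the docstring warns against using `wait_for` for sequences: messages arriving after resolution are silently dropped.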

View File

@@ -22,6 +22,7 @@ Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
import typing
from .msgtypes import MsgType, MsgBlockType
from ..datatypes import UUID
class MessageTemplateVariable:
@@ -56,17 +57,44 @@ class MessageTemplateVariable:
self._probably_text = False
else:
self._probably_text = any(x in self.name for x in (
"Name", "Text", "Title", "Description", "Message", "Label", "Method"))
"Name", "Text", "Title", "Description", "Message", "Label", "Method", "Filename",
))
self._probably_text = self._probably_text and self.name != "NameValue"
return self._probably_text
@property
def default_value(self):
if self.type.is_int:
return 0
elif self.type.is_float:
return 0.0
elif self.type == MsgType.MVT_LLUUID:
return UUID()
elif self.type == MsgType.MVT_BOOL:
return False
elif self.type == MsgType.MVT_VARIABLE:
if self.probably_binary:
return b""
if self.probably_text:
return ""
return b""
elif self.type in (MsgType.MVT_LLVector3, MsgType.MVT_LLVector3d, MsgType.MVT_LLQuaternion):
return 0.0, 0.0, 0.0
elif self.type == MsgType.MVT_LLVector4:
return 0.0, 0.0, 0.0, 0.0
elif self.type == MsgType.MVT_FIXED:
return b"\x00" * self.size
elif self.type == MsgType.MVT_IP_ADDR:
return "0.0.0.0"
return None
class MessageTemplateBlock:
def __init__(self, name):
self.variables: typing.List[MessageTemplateVariable] = []
self.variable_map: typing.Dict[str, MessageTemplateVariable] = {}
self.name = name
self.block_type = 0
self.block_type: MsgBlockType = MsgBlockType.MBT_SINGLE
self.number = 0
def add_variable(self, var):

View File

@@ -27,25 +27,35 @@ from .template import MessageTemplate
from .template_parser import MessageTemplateParser
DEFAULT_PARSER = MessageTemplateParser(msg_tmpl)
class TemplateDictionary:
"""The dictionary of all known templates"""
def __init__(self, template_list=None, message_template=None):
if template_list is None:
if message_template is None:
parser = MessageTemplateParser(msg_tmpl)
parser = DEFAULT_PARSER
else:
parser = MessageTemplateParser(message_template)
template_list = parser.message_templates
self.template_list: typing.List[MessageTemplate] = template_list
self.template_list: typing.List[MessageTemplate] = []
# maps name to template
self.message_templates = {}
# maps (freq,num) to template
self.message_dict = {}
self.load_templates(template_list)
def load_templates(self, template_list):
self.template_list.clear()
self.template_list.extend(template_list)
self.message_templates.clear()
self.message_dict.clear()
self.build_dictionaries(template_list)
self.build_message_ids()
@@ -99,3 +109,6 @@ class TemplateDictionary:
def __iter__(self):
return iter(self.template_list)
DEFAULT_TEMPLATE_DICT = TemplateDictionary()

View File

@@ -26,7 +26,7 @@ from logging import getLogger
from hippolyzer.lib.base.datatypes import JankStringyBytes
from hippolyzer.lib.base.settings import Settings
from .template import MessageTemplateVariable
from .template_dict import TemplateDictionary
from .template_dict import DEFAULT_TEMPLATE_DICT
from .msgtypes import MsgType, MsgBlockType, PacketLayout
from .data_packer import TemplateDataPacker
from .message import Message, Block
@@ -62,12 +62,11 @@ def _parse_msg_num(reader: se.BufferReader):
class UDPMessageDeserializer:
DEFAULT_TEMPLATE = TemplateDictionary()
DEFAULT_TEMPLATE = DEFAULT_TEMPLATE_DICT
def __init__(self, settings=None, message_cls: Type[Message] = Message):
def __init__(self, settings=None):
self.settings = settings or Settings()
self.template_dict = self.DEFAULT_TEMPLATE
self.message_cls = message_cls
def deserialize(self, msg_buff: bytes):
msg = self._parse_message_header(msg_buff)
@@ -85,7 +84,7 @@ class UDPMessageDeserializer:
reader = se.BufferReader("!", data)
msg: Message = self.message_cls("Placeholder")
msg: Message = Message("Placeholder")
msg.send_flags = reader.read(se.U8)
msg.packet_id = reader.read(se.U32)

View File

@@ -26,7 +26,7 @@ from .data_packer import TemplateDataPacker
from .message import Message, MsgBlockList
from .msgtypes import MsgType, MsgBlockType
from .template import MessageTemplateVariable, MessageTemplateBlock
from .template_dict import TemplateDictionary
from .template_dict import TemplateDictionary, DEFAULT_TEMPLATE_DICT
from hippolyzer.lib.base import exc
from hippolyzer.lib.base import serialization as se
from hippolyzer.lib.base.datatypes import RawBytes
@@ -35,7 +35,7 @@ logger = getLogger('message.udpserializer')
class UDPMessageSerializer:
DEFAULT_TEMPLATE = TemplateDictionary(None)
DEFAULT_TEMPLATE = DEFAULT_TEMPLATE_DICT
def __init__(self, message_template=None):
if message_template is not None:

View File

@@ -19,6 +19,3 @@ You should have received a copy of the GNU Lesser General Public License
along with this program; if not, write to the Free Software Foundation,
Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""

View File

@@ -0,0 +1,172 @@
from __future__ import annotations
import asyncio
import copy
import dataclasses
from types import TracebackType
from typing import *
import aiohttp
import multidict
from hippolyzer.lib.base import llsd as llsd_lib
class CapsClientResponse(aiohttp.ClientResponse):
"""
Not actually instantiated; used to lie to the type system,
since we'll dynamically put this onto a ClientResponse instance.
Will fail isinstance() checks.
"""
async def read_llsd(self) -> Any:
raise NotImplementedError()
class _HippoSessionRequestContextManager:
"""
Like aiohttp's _SessionRequestContextManager, but with a symmetrical API.
aiohttp.request() and aiohttp.ClientSession.request() have different APIs:
one is sync and returns a context manager, the other is async and returns a coro.
aiohttp.request() also doesn't accept the arguments that we need for custom
SSL contexts. To handle requests both with and without an existing session,
give both the same wrapper and don't close the session on context manager
exit if it wasn't our session.
"""
__slots__ = ("_coro", "_resp", "_session", "_session_owned")
def __init__(
self,
coro: Coroutine[asyncio.Future[Any], None, aiohttp.ClientResponse],
session: aiohttp.ClientSession,
session_owned: bool = True,
) -> None:
self._coro = coro
self._resp: Optional[aiohttp.ClientResponse] = None
self._session = session
self._session_owned = session_owned
async def __aenter__(self) -> CapsClientResponse:
try:
self._resp = await self._coro
# We don't control creation of the ClientResponse, so tack on
# a convenience method for reading LLSD.
async def _read_llsd():
return llsd_lib.parse_xml(await self._resp.read())
self._resp.read_llsd = _read_llsd
except BaseException:
if self._session_owned:
await self._session.close()
raise
else:
# intentionally fooling the type system
return self._resp # type: ignore
async def __aexit__(
self,
exc_type: Optional[Type[BaseException]],
exc: Optional[BaseException],
tb: Optional[TracebackType],
) -> None:
assert self._resp is not None
self._resp.close()
if self._session_owned:
await self._session.close()
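The core of the wrapper above is turning a request coroutine into something usable with `async with`. A minimal standalone sketch of that shape (session ownership and the `read_llsd` patch omitted; `fake_request` is a stand-in):

```python
import asyncio

class CoroContextManager:
    """Wrap a coroutine so it can be entered with `async with`."""
    def __init__(self, coro):
        self._coro = coro
        self._resp = None

    async def __aenter__(self):
        # Awaiting is deferred until the block is entered
        self._resp = await self._coro
        return self._resp

    async def __aexit__(self, exc_type, exc, tb):
        # Real code would close the response (and owned session) here
        self._resp = None

async def fake_request():
    return {"status": 200}

async def main():
    async with CoroContextManager(fake_request()) as resp:
        return resp["status"]

status = asyncio.run(main())
```

Because the coroutine is only awaited in `__aenter__`, callers with and without an existing session can be handed the same wrapper type.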
CAPS_DICT = Union[
Mapping[str, str],
multidict.MultiDict[str],
]
class CapsClient:
def __init__(self, caps: Optional[CAPS_DICT] = None):
self._caps = caps
def _request_fixups(self, cap_or_url: str, headers: Dict, proxy: Optional[bool], ssl: Any):
return cap_or_url, headers, proxy, ssl
def _get_caps(self) -> Optional[CAPS_DICT]:
return self._caps
def request(self, method: str, cap_or_url: str, *, path: str = "", data: Any = None,
headers: Optional[Dict] = None, session: Optional[aiohttp.ClientSession] = None,
llsd: Any = dataclasses.MISSING, params: Optional[Dict[str, Any]] = None,
proxy: Optional[str] = None, skip_auto_headers: Optional[Sequence[str]] = None,
**kwargs) -> _HippoSessionRequestContextManager:
if cap_or_url.startswith("http"):
if path:
raise ValueError("Specifying both path and a full URL not supported")
else:
caps = self._get_caps()
if caps is None:
raise RuntimeError(f"Need a caps dict to request a Cap like {cap_or_url}")
if cap_or_url not in caps:
raise KeyError(f"{cap_or_url} is not a full URL and not a Cap")
cap_or_url = caps[cap_or_url]
if path:
cap_or_url += path
if params is not None:
for pname, pval in params.items():
if not isinstance(pval, str):
params[pname] = str(pval)
session_owned = False
# Use an existing session if we have one to take advantage of connection pooling,
# otherwise create one
if session is None:
session_owned = True
session = aiohttp.ClientSession(
connector=aiohttp.TCPConnector(force_close=True),
connector_owner=True
)
if headers is None:
headers = {}
else:
headers = copy.copy(headers)
# Use sentinel val so explicit `None` can be passed
if llsd is not dataclasses.MISSING:
data = llsd_lib.format_xml(llsd)
# Sometimes needed even on GETs.
if "Content-Type" not in headers:
headers["Content-Type"] = "application/llsd+xml"
# Always present, usually ignored by the server.
if "Accept" not in headers:
headers["Accept"] = "application/llsd+xml"
# Ask to keep the connection open if we're sharing a session
if not session_owned:
headers["Connection"] = "keep-alive"
headers["Keep-Alive"] = "300"
ssl = kwargs.pop('ssl', None)
cap_or_url, headers, proxy, ssl = self._request_fixups(cap_or_url, headers, proxy, ssl)
resp = session._request(method, cap_or_url, data=data, headers=headers, # noqa: need internal call
params=params, ssl=ssl, proxy=proxy,
skip_auto_headers=skip_auto_headers or ("User-Agent",), **kwargs)
return _HippoSessionRequestContextManager(resp, session, session_owned=session_owned)
def get(self, cap_or_url: str, *, path: str = "", headers: Optional[dict] = None,
session: Optional[aiohttp.ClientSession] = None, params: Optional[Dict[str, Any]] = None,
proxy: Optional[str] = None, **kwargs) -> _HippoSessionRequestContextManager:
return self.request("GET", cap_or_url=cap_or_url, path=path, headers=headers,
session=session, params=params, proxy=proxy, **kwargs)
def post(self, cap_or_url: str, *, path: str = "", data: Any = None,
headers: Optional[dict] = None, session: Optional[aiohttp.ClientSession] = None,
llsd: Any = dataclasses.MISSING, params: Optional[Dict[str, Any]] = None,
proxy: Optional[str] = None, **kwargs) -> _HippoSessionRequestContextManager:
return self.request("POST", cap_or_url=cap_or_url, path=path, headers=headers, data=data,
llsd=llsd, session=session, params=params, proxy=proxy, **kwargs)
def put(self, cap_or_url: str, *, path: str = "", data: Any = None,
headers: Optional[dict] = None, session: Optional[aiohttp.ClientSession] = None,
llsd: Any = dataclasses.MISSING, params: Optional[Dict[str, Any]] = None,
proxy: Optional[str] = None, **kwargs) -> _HippoSessionRequestContextManager:
return self.request("PUT", cap_or_url=cap_or_url, path=path, headers=headers, data=data,
llsd=llsd, session=session, params=params, proxy=proxy, **kwargs)
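The first branch of `request()` decides whether `cap_or_url` is a full URL or a named Cap to be looked up and suffixed with `path`. That resolution logic can be distilled into a standalone helper (the function name here is illustrative, not part of the library):

```python
from typing import Mapping, Optional


def resolve_cap(cap_or_url: str, path: str = "",
                caps: Optional[Mapping[str, str]] = None) -> str:
    """Mirror of CapsClient.request()'s URL resolution, for illustration."""
    if cap_or_url.startswith("http"):
        if path:
            raise ValueError("Specifying both path and a full URL is not supported")
        return cap_or_url
    if caps is None:
        raise RuntimeError(f"Need a caps dict to request a Cap like {cap_or_url}")
    if cap_or_url not in caps:
        raise KeyError(f"{cap_or_url} is not a full URL and not a Cap")
    # Named Cap: look up its base URL and append any sub-path
    return caps[cap_or_url] + path
```

With `caps={"GetMesh": "http://sim.example/mesh"}`, `resolve_cap("GetMesh", "/?id=1", caps)` yields the Cap's URL plus the path, while a full `http...` URL passes through untouched.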

View File

@@ -0,0 +1,73 @@
import abc
import asyncio
import enum
import socket
from typing import *
ADDR_TUPLE = Tuple[str, int]
class Direction(enum.Enum):
OUT = enum.auto()
IN = enum.auto()
def __invert__(self):
if self == self.OUT:
return self.IN
return self.OUT
class UDPPacket:
def __init__(
self,
src_addr: Optional[ADDR_TUPLE],
dst_addr: ADDR_TUPLE,
data: bytes,
direction: Direction
):
self.src_addr = src_addr
self.dst_addr = dst_addr
self.data = data
self.direction = direction
self.meta = {}
@property
def outgoing(self):
return self.direction == Direction.OUT
@property
def incoming(self):
return self.direction == Direction.IN
@property
def far_addr(self):
if self.outgoing:
return self.dst_addr
return self.src_addr
class AbstractUDPTransport(abc.ABC):
__slots__ = ()
@abc.abstractmethod
def send_packet(self, packet: UDPPacket) -> None:
pass
@abc.abstractmethod
def close(self) -> None:
pass
class SocketUDPTransport(AbstractUDPTransport):
def __init__(self, transport: Union[asyncio.DatagramTransport, socket.socket]):
super().__init__()
self.transport = transport
def send_packet(self, packet: UDPPacket) -> None:
if not packet.outgoing:
raise ValueError(f"{self.__class__.__name__} can only send outbound packets")
self.transport.sendto(packet.data, packet.dst_addr)
def close(self) -> None:
self.transport.close()
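The direction-aware helpers on `UDPPacket` are easiest to see with a small standalone mirror of the classes above (trimmed to the parts being demonstrated):

```python
import enum
from typing import Optional, Tuple

ADDR_TUPLE = Tuple[str, int]


class Direction(enum.Enum):
    OUT = enum.auto()
    IN = enum.auto()

    def __invert__(self) -> "Direction":
        # Flipping a direction is used when mirroring / replaying packets
        return Direction.IN if self is Direction.OUT else Direction.OUT


class UDPPacket:
    def __init__(self, src_addr: Optional[ADDR_TUPLE], dst_addr: ADDR_TUPLE,
                 data: bytes, direction: Direction):
        self.src_addr, self.dst_addr = src_addr, dst_addr
        self.data, self.direction = data, direction

    @property
    def outgoing(self) -> bool:
        return self.direction == Direction.OUT

    @property
    def far_addr(self) -> Optional[ADDR_TUPLE]:
        # The "far" side is whichever end isn't the local socket
        return self.dst_addr if self.outgoing else self.src_addr


pkt = UDPPacket(("10.0.0.2", 9000), ("203.0.113.5", 13000), b"\x00", Direction.OUT)
```

An outgoing packet's `far_addr` is its destination; for an incoming one it would be the source.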

View File

@@ -20,17 +20,23 @@ Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""
from __future__ import annotations
import dataclasses
import logging
import struct
from typing import *
import lazy_object_proxy
import recordclass
from hippolyzer.lib.base.datatypes import Vector3, Quaternion, Vector4, UUID
from hippolyzer.lib.base.datatypes import Vector3, Quaternion, Vector4, UUID, TaggedUnion
from hippolyzer.lib.base.message.message import Block
from hippolyzer.lib.base.namevalue import NameValueCollection
import hippolyzer.lib.base.serialization as se
import hippolyzer.lib.base.templates as tmpls
class Object(recordclass.datatuple): # type: ignore
__options__ = {
"fast_new": False,
"use_weakref": True,
}
__weakref__: Any
@@ -39,8 +45,8 @@ class Object(recordclass.datatuple): # type: ignore
State: Optional[int] = None
FullID: Optional[UUID] = None
CRC: Optional[int] = None
PCode: Optional[int] = None
Material: Optional[int] = None
PCode: Optional[tmpls.PCode] = None
Material: Optional[tmpls.MCode] = None
ClickAction: Optional[int] = None
Scale: Optional[Vector3] = None
ParentID: Optional[int] = None
@@ -65,15 +71,15 @@ class Object(recordclass.datatuple): # type: ignore
ProfileBegin: Optional[int] = None
ProfileEnd: Optional[int] = None
ProfileHollow: Optional[int] = None
TextureEntry: Optional[Any] = None
TextureAnim: Optional[Any] = None
TextureEntry: Optional[tmpls.TextureEntry] = None
TextureAnim: Optional[tmpls.TextureAnim] = None
NameValue: Optional[Any] = None
Data: Optional[Any] = None
Text: Optional[str] = None
TextColor: Optional[bytes] = None
MediaURL: Optional[Any] = None
PSBlock: Optional[Any] = None
ExtraParams: Optional[Any] = None
MediaURL: Optional[str] = None
PSBlock: Optional[Dict] = None
ExtraParams: Optional[Dict[tmpls.ExtraParamType, Any]] = None
Sound: Optional[UUID] = None
OwnerID: Optional[UUID] = None
SoundGain: Optional[float] = None
@@ -122,109 +128,20 @@ class Object(recordclass.datatuple): # type: ignore
Description: Optional[str] = None
TouchName: Optional[str] = None
SitName: Optional[str] = None
TextureID: Optional[Any] = None
TextureID: Optional[List[UUID]] = None
RegionHandle: Optional[int] = None
def __init__(self, *, LocalID=None, State=None, FullID=None, CRC=None, PCode=None, Material=None,
ClickAction=None, Scale=None, ParentID=None, UpdateFlags=None, PathCurve=None, ProfileCurve=None,
PathBegin=None, PathEnd=None, PathScaleX=None, PathScaleY=None, PathShearX=None, PathShearY=None,
PathTwist=None, PathTwistBegin=None, PathRadiusOffset=None, PathTaperX=None, PathTaperY=None,
PathRevolutions=None, PathSkew=None, ProfileBegin=None, ProfileEnd=None, ProfileHollow=None,
TextureEntry=None, TextureAnim=None, NameValue=None, Data=None, Text=None, TextColor=None,
MediaURL=None, PSBlock=None, ExtraParams=None, Sound=None, OwnerID=None, SoundGain=None,
SoundFlags=None, SoundRadius=None, JointType=None, JointPivot=None, JointAxisOrAnchor=None,
FootCollisionPlane=None, Position=None, Velocity=None, Acceleration=None, Rotation=None,
AngularVelocity=None, TreeSpecies=None, ObjectCosts=None, ScratchPad=None):
def __init__(self, **_kwargs):
""" set up the object attributes """
self.LocalID = LocalID # U32
self.State = State # U8
self.FullID = FullID # LLUUID
self.CRC = CRC # U32 // TEMPORARY HACK FOR JAMES
self.PCode = PCode # U8
self.Material = Material # U8
self.ClickAction = ClickAction # U8
self.Scale = Scale # LLVector3
self.ParentID = ParentID # U32
# Actually contains a weakref proxy
self.Parent: Optional[Object] = None
self.UpdateFlags = UpdateFlags # U32 // U32, see object_flags.h
self.PathCurve = PathCurve # U8
self.ProfileCurve = ProfileCurve # U8
self.PathBegin = PathBegin # U16 // 0 to 1, quanta = 0.01
self.PathEnd = PathEnd # U16 // 0 to 1, quanta = 0.01
self.PathScaleX = PathScaleX # U8 // 0 to 1, quanta = 0.01
self.PathScaleY = PathScaleY # U8 // 0 to 1, quanta = 0.01
self.PathShearX = PathShearX # U8 // -.5 to .5, quanta = 0.01
self.PathShearY = PathShearY # U8 // -.5 to .5, quanta = 0.01
self.PathTwist = PathTwist # S8 // -1 to 1, quanta = 0.01
self.PathTwistBegin = PathTwistBegin # S8 // -1 to 1, quanta = 0.01
self.PathRadiusOffset = PathRadiusOffset # S8 // -1 to 1, quanta = 0.01
self.PathTaperX = PathTaperX # S8 // -1 to 1, quanta = 0.01
self.PathTaperY = PathTaperY # S8 // -1 to 1, quanta = 0.01
self.PathRevolutions = PathRevolutions # U8 // 0 to 3, quanta = 0.015
self.PathSkew = PathSkew # S8 // -1 to 1, quanta = 0.01
self.ProfileBegin = ProfileBegin # U16 // 0 to 1, quanta = 0.01
self.ProfileEnd = ProfileEnd # U16 // 0 to 1, quanta = 0.01
self.ProfileHollow = ProfileHollow # U16 // 0 to 1, quanta = 0.01
self.TextureEntry = TextureEntry # Variable 2
self.TextureAnim = TextureAnim # Variable 1
self.NameValue = NameValue # Variable 2
self.Data = Data # Variable 2
self.Text = Text # Variable 1 // llSetText() hovering text
self.TextColor = TextColor # Fixed 4 // actually, a LLColor4U
self.MediaURL = MediaURL # Variable 1 // URL for web page, movie, etc.
self.PSBlock = PSBlock # Variable 1
self.ExtraParams = ExtraParams or {} # Variable 1
self.Sound = Sound # LLUUID
self.OwnerID = OwnerID # LLUUID // HACK object's owner id, only set if non-null sound, for muting
self.SoundGain = SoundGain # F32
self.SoundFlags = SoundFlags # U8
self.SoundRadius = SoundRadius # F32 // cutoff radius
self.JointType = JointType # U8
self.JointPivot = JointPivot # LLVector3
self.JointAxisOrAnchor = JointAxisOrAnchor # LLVector3
self.TreeSpecies = TreeSpecies
self.ScratchPad = ScratchPad
self.ObjectCosts = ObjectCosts or {}
self.ExtraParams = self.ExtraParams or {} # Variable 1
self.ObjectCosts = self.ObjectCosts or {}
self.ChildIDs = []
# Same as parent, contains weakref proxies.
self.Children: List[Object] = []
# from ObjectUpdateCompressed
self.FootCollisionPlane: Optional[Vector4] = FootCollisionPlane
self.Position: Optional[Vector3] = Position
self.Velocity: Optional[Vector3] = Velocity
self.Acceleration: Optional[Vector3] = Acceleration
self.Rotation: Optional[Quaternion] = Rotation
self.AngularVelocity: Optional[Vector3] = AngularVelocity
# from ObjectProperties
self.CreatorID = None
self.GroupID = None
self.CreationDate = None
self.BaseMask = None
self.OwnerMask = None
self.GroupMask = None
self.EveryoneMask = None
self.NextOwnerMask = None
self.OwnershipCost = None
# TaxRate
self.SaleType = None
self.SalePrice = None
self.AggregatePerms = None
self.AggregatePermTextures = None
self.AggregatePermTexturesOwner = None
self.Category = None
self.InventorySerial = None
self.ItemID = None
self.FolderID = None
self.FromTaskID = None
self.LastOwnerID = None
self.Name = None
self.Description = None
self.TouchName = None
self.SitName = None
self.TextureID = None
@property
def GlobalPosition(self) -> Vector3:
return handle_to_global_pos(self.RegionHandle) + self.RegionPosition
@property
def RegionPosition(self) -> Vector3:
@@ -248,19 +165,280 @@ class Object(recordclass.datatuple): # type: ignore
# TODO: Cache this and dirty cache if ancestor updates rot?
return self.Rotation * self.Parent.RegionRotation
@property
def AncestorsKnown(self) -> bool:
obj = self
while obj.ParentID:
if not obj.Parent:
return False
obj = obj.Parent
return True
def update_properties(self, properties: Dict[str, Any]) -> Set[str]:
""" Takes a dict of {attribute: value}, applies it, and returns the set of changed attribute names """
updated_properties = set()
for key, val in properties.items():
if hasattr(self, key):
old_val = getattr(self, key, val)
old_val = getattr(self, key, dataclasses.MISSING)
# Don't check equality if we're using a lazy proxy,
# parsing is deferred until we actually use it.
is_proxy = isinstance(val, lazy_object_proxy.Proxy)
if is_proxy or old_val != val:
if any(isinstance(x, lazy_object_proxy.Proxy) for x in (old_val, val)):
# TODO: be smarter about this. Can we store the raw bytes and
# compare those if it's an unparsed object?
is_updated = old_val is not val
else:
is_updated = old_val != val
if is_updated:
updated_properties.add(key)
setattr(self, key, val)
return updated_properties
def to_dict(self):
return recordclass.asdict(self)
val = recordclass.asdict(self)
del val["Children"]
del val["Parent"]
return val
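Setting aside the `lazy_object_proxy` special-casing, the core of `update_properties()` is "apply each known attribute, report which ones actually changed". A minimal standalone sketch (the `Obj` class here is a stand-in, not the real `Object`):

```python
import dataclasses
from typing import Any, Dict, Set


class Obj:
    def __init__(self):
        self.Name = None
        self.Scale = None


def update_properties(obj: Any, properties: Dict[str, Any]) -> Set[str]:
    # Standalone version of Object.update_properties(), minus the
    # deferred-parse proxy handling: apply values, return changed names.
    updated = set()
    for key, val in properties.items():
        if hasattr(obj, key):
            # MISSING sentinel so a stored None still compares sanely
            old_val = getattr(obj, key, dataclasses.MISSING)
            if old_val != val:
                updated.add(key)
            setattr(obj, key, val)
    return updated


o = Obj()
first = update_properties(o, {"Name": "cube", "Scale": (1, 1, 1)})
second = update_properties(o, {"Name": "cube"})
```

The second call reports nothing changed, which is what lets object-update consumers react only to real deltas.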
def handle_to_gridxy(handle: int) -> Tuple[int, int]:
return (handle >> 32) // 256, (handle & 0xFFffFFff) // 256
def gridxy_to_handle(x: int, y: int):
return ((x * 256) << 32) | (y * 256)
def handle_to_global_pos(handle: int) -> Vector3:
return Vector3(handle >> 32, handle & 0xFFffFFff)
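The region handle packs global meters with X in the high 32 bits and Y in the low 32, and grid coordinates are 256m region units, so the two conversions above round-trip. That can be checked with a tiny standalone copy of the helpers:

```python
from typing import Tuple


def handle_to_gridxy(handle: int) -> Tuple[int, int]:
    # High 32 bits: global X meters; low 32 bits: global Y meters.
    # Dividing by 256 yields grid coordinates in region units.
    return (handle >> 32) // 256, (handle & 0xFFFFFFFF) // 256


def gridxy_to_handle(x: int, y: int) -> int:
    # Inverse: scale grid coords back to meters and repack
    return ((x * 256) << 32) | (y * 256)
```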
def normalize_object_update(block: Block, handle: int):
object_data = {
"RegionHandle": handle,
"FootCollisionPlane": None,
"SoundFlags": block["Flags"],
"SoundGain": block["Gain"],
"SoundRadius": block["Radius"],
**dict(block.items()),
"TextureEntry": block.deserialize_var("TextureEntry", make_copy=False),
"NameValue": block.deserialize_var("NameValue", make_copy=False),
"TextureAnim": block.deserialize_var("TextureAnim", make_copy=False),
"ExtraParams": block.deserialize_var("ExtraParams", make_copy=False) or {},
"PSBlock": block.deserialize_var("PSBlock", make_copy=False).value,
"UpdateFlags": block.deserialize_var("UpdateFlags", make_copy=False),
"State": block.deserialize_var("State", make_copy=False),
**block.deserialize_var("ObjectData", make_copy=False).value,
}
object_data["LocalID"] = object_data.pop("ID")
# Empty == not updated
if not object_data["TextureEntry"]:
object_data.pop("TextureEntry")
# OwnerID is only set in this packet if a sound is playing. Don't allow
# ObjectUpdates to clobber _real_ OwnerIDs we had from ObjectProperties
# with a null UUID.
if object_data["OwnerID"] == UUID():
del object_data["OwnerID"]
del object_data["Flags"]
del object_data["Gain"]
del object_data["Radius"]
del object_data["ObjectData"]
return object_data
def normalize_terse_object_update(block: Block, handle: int):
object_data = {
**block.deserialize_var("Data", make_copy=False),
**dict(block.items()),
"TextureEntry": block.deserialize_var("TextureEntry", make_copy=False),
"RegionHandle": handle,
}
object_data["LocalID"] = object_data.pop("ID")
object_data.pop("Data")
# Empty == not updated
if object_data["TextureEntry"] is None:
object_data.pop("TextureEntry")
return object_data
def normalize_object_update_compressed_data(data: bytes):
# Shared by ObjectUpdateCompressed and VOCache case
compressed = FastObjectUpdateCompressedDataDeserializer.read(data)
# TODO: ObjectUpdateCompressed doesn't provide a default value for unused
# fields, whereas ObjectUpdate and friends do (TextColor, etc.)
# need some way to normalize ObjectUpdates so they won't appear to have
# changed just because an ObjectUpdate got sent with a default value
# Only used for determining which sections are present
del compressed["Flags"]
ps_block = compressed.pop("PSBlockNew", None)
if ps_block is None:
ps_block = compressed.pop("PSBlock", None)
if ps_block is None:
ps_block = TaggedUnion(0, None)
compressed.pop("PSBlock", None)
if compressed["NameValue"] is None:
compressed["NameValue"] = NameValueCollection()
object_data = {
"PSBlock": ps_block.value,
# Parent flag not set means explicitly un-parented
"ParentID": compressed.pop("ParentID", None) or 0,
"LocalID": compressed.pop("ID"),
**compressed,
}
if object_data["TextureEntry"] is None:
object_data.pop("TextureEntry")
# Don't clobber OwnerID in case the object has a proper one.
if object_data["OwnerID"] == UUID():
del object_data["OwnerID"]
return object_data
def normalize_object_update_compressed(block: Block, handle: int):
compressed = normalize_object_update_compressed_data(block["Data"])
compressed["UpdateFlags"] = block.deserialize_var("UpdateFlags", make_copy=False)
compressed["RegionHandle"] = handle
return compressed
class SimpleStructReader(se.BufferReader):
def read_struct(self, spec: struct.Struct, peek=False) -> Tuple[Any, ...]:
val = spec.unpack_from(self._buffer, self._pos)
if not peek:
self._pos += spec.size
return val
def read_bytes_null_term(self) -> bytes:
old_offset = self._pos
while self._buffer[self._pos] != 0:
self._pos += 1
val = self._buffer[old_offset:self._pos]
self._pos += 1
return val
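`read_bytes_null_term()` scans forward to the NUL terminator, returns the bytes before it, and leaves the cursor just past it. A standalone functional sketch of the same scan (names illustrative):

```python
from typing import Tuple


def read_bytes_null_term(buffer: bytes, pos: int) -> Tuple[bytes, int]:
    """Return (bytes before the NUL at or after pos, position just past the NUL)."""
    start = pos
    # Walk forward until the terminator; raises IndexError if none exists,
    # matching the reader's behavior on a truncated buffer.
    while buffer[pos] != 0:
        pos += 1
    return buffer[start:pos], pos + 1
```

Reading `b"Hello\x00World\x00"` from offset 0 returns `b"Hello"` and positions the cursor at the `W`, ready for the next field.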
class FastObjectUpdateCompressedDataDeserializer:
HEADER_STRUCT = struct.Struct("<16sIBBIBB3f3f3fI16s")
ANGULAR_VELOCITY_STRUCT = struct.Struct("<3f")
PARENT_ID_STRUCT = struct.Struct("<I")
TREE_SPECIES_STRUCT = struct.Struct("<B")
DATAPACKER_LEN = struct.Struct("<I")
COLOR_ADAPTER = tmpls.Color4()
PARTICLES_OLD = se.TypedBytesFixed(86, tmpls.PSBLOCK_TEMPLATE)
SOUND_STRUCT = struct.Struct("<16sfBf")
PRIM_PARAMS_STRUCT = struct.Struct("<BBHHBBBBbbbbbBbHHH")
ATTACHMENT_STATE_ADAPTER = tmpls.AttachmentStateAdapter(None)
@classmethod
def read(cls, data: bytes) -> Dict:
reader = SimpleStructReader("<", data)
foo = reader.read_struct(cls.HEADER_STRUCT)
full_id, local_id, pcode, state, crc, material, click_action, \
scalex, scaley, scalez, posx, posy, posz, rotx, roty, rotz, \
flags, owner_id = foo
scale = Vector3(scalex, scaley, scalez)
full_id = UUID(bytes=full_id)
pcode = tmpls.PCode(pcode)
if pcode == tmpls.PCode.AVATAR:
state = tmpls.AgentState(state)
elif pcode == tmpls.PCode.PRIMITIVE:
state = cls.ATTACHMENT_STATE_ADAPTER.decode(state, None)
pos = Vector3(posx, posy, posz)
rot = Quaternion(rotx, roty, rotz)
owner_id = UUID(bytes=owner_id)
ang_vel = None
if flags & tmpls.CompressedFlags.ANGULAR_VELOCITY.value:
ang_vel = Vector3(*reader.read_struct(cls.ANGULAR_VELOCITY_STRUCT))
parent_id = None
if flags & tmpls.CompressedFlags.PARENT_ID.value:
parent_id = reader.read_struct(cls.PARENT_ID_STRUCT)[0]
tree_species = None
if flags & tmpls.CompressedFlags.TREE.value:
tree_species = reader.read_struct(cls.TREE_SPECIES_STRUCT)[0]
scratchpad = None
if flags & tmpls.CompressedFlags.SCRATCHPAD.value:
scratchpad = reader.read_bytes(reader.read_struct(cls.DATAPACKER_LEN)[0])
text = None
text_color = None
if flags & tmpls.CompressedFlags.TEXT.value:
text = reader.read_bytes_null_term().decode("utf8")
text_color = cls.COLOR_ADAPTER.decode(reader.read_bytes(4), ctx=None)
media_url = None
if flags & tmpls.CompressedFlags.MEDIA_URL.value:
media_url = reader.read_bytes_null_term().decode("utf8")
psblock = None
if flags & tmpls.CompressedFlags.PARTICLES.value:
psblock = reader.read(cls.PARTICLES_OLD)
extra_params = reader.read(tmpls.EXTRA_PARAM_COLLECTION)
sound, sound_gain, sound_flags, sound_radius = None, None, None, None
if flags & tmpls.CompressedFlags.SOUND.value:
sound, sound_gain, sound_flags, sound_radius = reader.read_struct(cls.SOUND_STRUCT)
sound = UUID(bytes=sound)
sound_flags = tmpls.SoundFlags(sound_flags)
name_value = None
if flags & tmpls.CompressedFlags.NAME_VALUES.value:
name_value = reader.read(tmpls.NAMEVALUES_TERMINATED_TEMPLATE)
path_curve, profile_curve, path_begin, path_end, path_scale_x, path_scale_y, \
path_shear_x, path_shear_y, path_twist, path_twist_begin, path_radius_offset, \
path_taper_x, path_taper_y, path_revolutions, path_skew, profile_begin, \
profile_end, profile_hollow = reader.read_struct(cls.PRIM_PARAMS_STRUCT)
texture_entry = reader.read(tmpls.DATA_PACKER_TE_TEMPLATE)
texture_anim = None
if flags & tmpls.CompressedFlags.TEXTURE_ANIM.value:
texture_anim = reader.read(se.TypedByteArray(se.U32, tmpls.TA_TEMPLATE))
psblock_new = None
if flags & tmpls.CompressedFlags.PARTICLES_NEW.value:
psblock_new = reader.read(tmpls.PSBLOCK_TEMPLATE)
if len(reader):
logging.warning(f"{len(reader)} bytes left at end of buffer for compressed {data!r}")
return {
"FullID": full_id,
"ID": local_id,
"PCode": pcode,
"State": state,
"CRC": crc,
"Material": material,
"ClickAction": click_action,
"Scale": scale,
"Position": pos,
"Rotation": rot,
"Flags": flags,
"OwnerID": owner_id,
"AngularVelocity": ang_vel,
"ParentID": parent_id,
"TreeSpecies": tree_species,
"ScratchPad": scratchpad,
"Text": text,
"TextColor": text_color,
"MediaURL": media_url,
"PSBlock": psblock,
"ExtraParams": extra_params,
"Sound": sound,
"SoundGain": sound_gain,
"SoundFlags": sound_flags,
"SoundRadius": sound_radius,
"NameValue": name_value,
"PathCurve": path_curve,
"ProfileCurve": profile_curve,
"PathBegin": path_begin, # 0 to 1, quanta = 0.01
"PathEnd": path_end, # 0 to 1, quanta = 0.01
"PathScaleX": path_scale_x, # 0 to 1, quanta = 0.01
"PathScaleY": path_scale_y, # 0 to 1, quanta = 0.01
"PathShearX": path_shear_x, # -.5 to .5, quanta = 0.01
"PathShearY": path_shear_y, # -.5 to .5, quanta = 0.01
"PathTwist": path_twist, # -1 to 1, quanta = 0.01
"PathTwistBegin": path_twist_begin, # -1 to 1, quanta = 0.01
"PathRadiusOffset": path_radius_offset, # -1 to 1, quanta = 0.01
"PathTaperX": path_taper_x, # -1 to 1, quanta = 0.01
"PathTaperY": path_taper_y, # -1 to 1, quanta = 0.01
"PathRevolutions": path_revolutions, # 0 to 3, quanta = 0.015
"PathSkew": path_skew, # -1 to 1, quanta = 0.01
"ProfileBegin": profile_begin, # 0 to 1, quanta = 0.01
"ProfileEnd": profile_end, # 0 to 1, quanta = 0.01
"ProfileHollow": profile_hollow, # 0 to 1, quanta = 0.01
"TextureEntry": texture_entry,
"TextureAnim": texture_anim,
"PSBlockNew": psblock_new,
}

View File

@@ -5,7 +5,6 @@ import enum
import math
import struct
import types
import typing
import weakref
from io import SEEK_CUR, SEEK_SET, SEEK_END, RawIOBase, BufferedIOBase
from typing import *
@@ -891,7 +890,23 @@ class TupleCoord(SerializableBase):
return cls.COORD_CLS
class QuantizedTupleCoord(TupleCoord):
class EncodedTupleCoord(TupleCoord, abc.ABC):
_elem_specs: Sequence[SERIALIZABLE_TYPE]
def serialize(self, vals, writer: BufferWriter, ctx):
vals = self._vals_to_tuple(vals)
for spec, val in zip(self._elem_specs, vals):
writer.write(spec, val, ctx=ctx)
def deserialize(self, reader: Reader, ctx):
vals = (reader.read(spec, ctx=ctx) for spec in self._elem_specs)
val = self.COORD_CLS(*vals)
if self.need_pod(reader):
return tuple(val)
return val
class QuantizedTupleCoord(EncodedTupleCoord):
def __init__(self, lower=None, upper=None, component_scales=None):
super().__init__()
if component_scales:
@@ -907,17 +922,14 @@ class QuantizedTupleCoord(TupleCoord):
)
assert len(self._elem_specs) == self.NUM_ELEMS
def serialize(self, vals, writer: BufferWriter, ctx):
vals = self._vals_to_tuple(vals)
for spec, val in zip(self._elem_specs, vals):
writer.write(spec, val, ctx=ctx)
def deserialize(self, reader: Reader, ctx):
vals = (reader.read(spec, ctx=ctx) for spec in self._elem_specs)
val = self.COORD_CLS(*vals)
if self.need_pod(reader):
return tuple(val)
return val
class FixedPointTupleCoord(EncodedTupleCoord):
def __init__(self, int_bits: int, frac_bits: int, signed: bool):
super().__init__()
self._elem_specs = tuple(
FixedPoint(self.ELEM_SPEC, int_bits, frac_bits, signed)
for _ in range(self.NUM_ELEMS)
)
class Vector3(TupleCoord):
@@ -993,6 +1005,12 @@ class Vector4U8(QuantizedTupleCoord):
COORD_CLS = dtypes.Vector4
class FixedPointVector3U16(FixedPointTupleCoord):
ELEM_SPEC = U16
NUM_ELEMS = 3
COORD_CLS = dtypes.Vector3
class OptionalPrefixed(SerializableBase):
"""Field prefixed by a U8 indicating whether or not it's present"""
OPTIONAL = True
@@ -1092,15 +1110,6 @@ class IntEnum(Adapter):
return lambda: self.enum_cls(0)
def flags_to_pod(flag_cls: Type[enum.IntFlag], val: int) -> typing.Tuple[Union[str, int], ...]:
# Shove any bits not represented in the IntFlag into an int
left_over = val
for flag in iter(flag_cls):
left_over &= ~flag.value
extra = (int(left_over),) if left_over else ()
return tuple(flag.name for flag in iter(flag_cls) if val & flag.value) + extra
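`flags_to_pod` (which this hunk shows moving into `dtypes`) converts a raw flags integer into the named bits plus a trailing int for any unrecognized remainder. A runnable copy, using an illustrative flag class rather than a real template enum:

```python
import enum
from typing import Tuple, Type, Union


def flags_to_pod(flag_cls: Type[enum.IntFlag], val: int) -> Tuple[Union[str, int], ...]:
    # Shove any bits not represented in the IntFlag into a trailing int
    left_over = val
    for flag in flag_cls:
        left_over &= ~flag.value
    extra = (int(left_over),) if left_over else ()
    return tuple(flag.name for flag in flag_cls if val & flag.value) + extra


class DemoFlags(enum.IntFlag):  # illustrative, not a Hippolyzer template
    LOOP = 1
    SYNC_MASTER = 2
```

`0b101` has the `LOOP` bit set plus an unnamed `4` bit, so the POD form keeps both without losing information.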
class IntFlag(Adapter):
def __init__(self, flag_cls: Type[enum.IntFlag],
flag_spec: Optional[SerializablePrimitive] = None):
@@ -1121,7 +1130,7 @@ class IntFlag(Adapter):
def decode(self, val: Any, ctx: Optional[ParseContext], pod: bool = False) -> Any:
if pod:
return flags_to_pod(self.flag_cls, val)
return dtypes.flags_to_pod(self.flag_cls, val)
return self.flag_cls(val)
def default_value(self) -> Any:
@@ -1501,6 +1510,9 @@ class DataclassAdapter(Adapter):
self._data_cls = data_cls
def encode(self, val: Any, ctx: Optional[ParseContext]) -> Any:
if isinstance(val, lazy_object_proxy.Proxy):
# Have to unwrap these or the dataclass check will fail
val = val.__wrapped__
if dataclasses.is_dataclass(val):
val = dataclasses.asdict(val)
return val
@@ -1613,7 +1625,7 @@ class BufferedLLSDBinaryParser(llsd.HippoLLSDBinaryParser):
byte = self._getc()[0]
except IndexError:
byte = None
raise llsd.LLSDParseError("%s at byte %d: %s" % (message, self._index+offset, byte))
raise llsd.LLSDParseError("%s at byte %d: %s" % (message, self._index + offset, byte))
def _getc(self, num=1):
return self._buffer.read_bytes(num)
@@ -1641,8 +1653,14 @@ def subfield_serializer(msg_name, block_name, var_name):
return f
_ENUM_TYPE = TypeVar("_ENUM_TYPE", bound=Type[dtypes.IntEnum])
_FLAG_TYPE = TypeVar("_FLAG_TYPE", bound=Type[dtypes.IntFlag])
def enum_field_serializer(msg_name, block_name, var_name):
def f(orig_cls):
def f(orig_cls: _ENUM_TYPE) -> _ENUM_TYPE:
if not issubclass(orig_cls, dtypes.IntEnum):
raise ValueError(f"{orig_cls} must be a subclass of Hippolyzer's IntEnum class")
wrapper = subfield_serializer(msg_name, block_name, var_name)
wrapper(IntEnumSubfieldSerializer(orig_cls))
return orig_cls
@@ -1650,7 +1668,9 @@ def enum_field_serializer(msg_name, block_name, var_name):
def flag_field_serializer(msg_name, block_name, var_name):
def f(orig_cls):
def f(orig_cls: _FLAG_TYPE) -> _FLAG_TYPE:
if not issubclass(orig_cls, dtypes.IntFlag):
raise ValueError(f"{orig_cls!r} must be a subclass of Hippolyzer's IntFlag class")
wrapper = subfield_serializer(msg_name, block_name, var_name)
wrapper(IntFlagSubfieldSerializer(orig_cls))
return orig_cls
@@ -1703,7 +1723,7 @@ class BaseSubfieldSerializer(abc.ABC):
"""Guess at which template a val might correspond to"""
if dataclasses.is_dataclass(val):
val = dataclasses.asdict(val) # noqa
if isinstance(val, bytes):
if isinstance(val, (bytes, bytearray)):
template_checker = cls._template_sizes_match
elif isinstance(val, dict):
template_checker = cls._template_keys_match

View File

@@ -19,81 +19,48 @@ along with this program; if not, write to the Free Software Foundation,
Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""
from __future__ import annotations
import dataclasses
from typing import *
_T = TypeVar("_T")
class SettingDescriptor(Generic[_T]):
__slots__ = ("name", "default")
def __init__(self, default: Union[Callable[[], _T], _T]):
self.default = default
self.name: Optional[str] = None
def __set_name__(self, owner: Settings, name: str):
self.name = name
def _make_default(self) -> _T:
if callable(self.default):
return self.default()
return self.default
def __get__(self, obj: Settings, owner: Optional[Type] = None) -> _T:
val: Union[_T, dataclasses.MISSING] = obj.get_setting(self.name)
if val is dataclasses.MISSING:
val = self._make_default()
return val
def __set__(self, obj: Settings, value: _T) -> None:
obj.set_setting(self.name, value)
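How the new descriptor-based settings behave is easier to see in a self-contained sketch, assuming the `Settings` shape shown in this diff (`get_setting`/`set_setting` backed by a dict, `dataclasses.MISSING` as the unset sentinel):

```python
import dataclasses
from typing import Any, Callable, Dict, Generic, Optional, Type, TypeVar, Union

_T = TypeVar("_T")


class SettingDescriptor(Generic[_T]):
    __slots__ = ("name", "default")

    def __init__(self, default: Union[Callable[[], _T], _T]):
        self.default = default
        self.name: Optional[str] = None

    def __set_name__(self, owner: type, name: str):
        self.name = name

    def __get__(self, obj: "Settings", owner: Optional[Type] = None) -> _T:
        val = obj.get_setting(self.name)
        if val is dataclasses.MISSING:
            # Call zero-arg factories so mutable defaults aren't shared
            return self.default() if callable(self.default) else self.default
        return val

    def __set__(self, obj: "Settings", value: _T) -> None:
        obj.set_setting(self.name, value)


class Settings:
    ENABLE_DEFERRED_PACKET_PARSING: bool = SettingDescriptor(True)

    def __init__(self):
        self._settings: Dict[str, Any] = {}

    def get_setting(self, name: str) -> Any:
        return self._settings.get(name, dataclasses.MISSING)

    def set_setting(self, name: str, val: Any):
        self._settings[name] = val


s = Settings()
s.ENABLE_DEFERRED_PACKET_PARSING = False
```

Unset settings read back their declared default; assignments are stored per-instance, so one `Settings` object can be overridden without touching others.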
class Settings:
def __init__(self, quiet_logging=False, spammy_logging=False, log_tests=True):
""" some lovely configurable settings
ENABLE_DEFERRED_PACKET_PARSING: bool = SettingDescriptor(True)
These are applied application wide, and can be
overridden at any time in a specific instance
quiet_logging overrides spammy_logging
"""
def __init__(self):
self._settings: Dict[str, Any] = {}
self.quiet_logging = quiet_logging
self.spammy_logging = spammy_logging
def get_setting(self, name: str) -> Any:
return self._settings.get(name, dataclasses.MISSING)
# toggle handling udp packets
self.HANDLE_PACKETS = True
self.HANDLE_OUTGOING_PACKETS = False
# toggle parsing all/handled packets
self.ENABLE_DEFERRED_PACKET_PARSING = True
# ~~~~~~~~~~~~~~~~~~
# Logging behaviors
# ~~~~~~~~~~~~~~~~~~
# being a test tool, and an immature one at that,
# enable fine granularity in the logging, but
# make sure we can tone it down as well
self.LOG_VERBOSE = True
self.ENABLE_BYTES_TO_HEX_LOGGING = False
self.ENABLE_CAPS_LOGGING = True
self.ENABLE_CAPS_LLSD_LOGGING = False
self.ENABLE_EQ_LOGGING = True
self.ENABLE_UDP_LOGGING = True
self.ENABLE_OBJECT_LOGGING = True
self.LOG_SKIPPED_PACKETS = True
self.ENABLE_HOST_LOGGING = True
self.LOG_COROUTINE_SPAWNS = True
self.PROXY_LOGGING = False
# allow disabling logging of certain packets
self.DISABLE_SPAMMERS = True
self.UDP_SPAMMERS = ['PacketAck', 'AgentUpdate']
# toggle handling a region's event queue
self.ENABLE_REGION_EVENT_QUEUE = True
# how many seconds to wait between polling
# a region's event queue
self.REGION_EVENT_QUEUE_POLL_INTERVAL = 1
if self.spammy_logging:
self.ENABLE_BYTES_TO_HEX_LOGGING = True
self.ENABLE_CAPS_LLSD_LOGGING = True
self.DISABLE_SPAMMERS = False
# override the defaults
if self.quiet_logging:
self.LOG_VERBOSE = False
self.ENABLE_BYTES_TO_HEX_LOGGING = False
self.ENABLE_CAPS_LOGGING = False
self.ENABLE_CAPS_LLSD_LOGGING = False
self.ENABLE_EQ_LOGGING = False
self.ENABLE_UDP_LOGGING = False
self.LOG_SKIPPED_PACKETS = False
self.ENABLE_OBJECT_LOGGING = False
self.ENABLE_HOST_LOGGING = False
self.LOG_COROUTINE_SPAWNS = False
self.DISABLE_SPAMMERS = True
# ~~~~~~~~~~~~~~~~~~~~~~
# Test related settings
# ~~~~~~~~~~~~~~~~~~~~~~
if log_tests:
self.ENABLE_LOGGING_IN_TESTS = True
else:
self.ENABLE_LOGGING_IN_TESTS = False
def set_setting(self, name: str, val: Any):
self._settings[name] = val

File diff suppressed because it is too large

View File

@@ -8,18 +8,15 @@ import dataclasses
from typing import *
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.helpers import proxify
from hippolyzer.lib.base.message.message import Block
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.proxy.templates import (
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.message.circuit import ConnectionHolder
from hippolyzer.lib.base.templates import (
TransferRequestParamsBase,
TransferChannelType,
TransferSourceType,
TransferStatus,
)
if TYPE_CHECKING:
from hippolyzer.lib.proxy.region import ProxiedRegion
_TRANSFER_MESSAGES = {"TransferInfo", "TransferPacket", "TransferAbort"}
@@ -49,7 +46,7 @@ class Transfer:
def cancelled(self) -> bool:
return self._future.cancelled()
def is_our_message(self, message: ProxiedMessage):
def is_our_message(self, message: Message):
if "TransferData" in message.blocks:
transfer_block = message["TransferData"][0]
else:
@@ -71,8 +68,15 @@ class Transfer:
class TransferManager:
def __init__(self, region: ProxiedRegion):
self._region: ProxiedRegion = proxify(region)
def __init__(
self,
connection_holder: ConnectionHolder,
agent_id: Optional[UUID] = None,
session_id: Optional[UUID] = None,
):
self._connection_holder = connection_holder
self._agent_id = agent_id
self._session_id = session_id
def request(
self, *,
@@ -86,11 +90,11 @@ class TransferManager:
params_dict = dataclasses.asdict(params)
# Fill in any missing AgentID or SessionID attrs if the params type has them
if params_dict.get("AgentID", dataclasses.MISSING) is None:
params.AgentID = self._region.session().agent_id
params.AgentID = self._agent_id
if params_dict.get("SessionID", dataclasses.MISSING) is None:
params.SessionID = self._region.session().id
params.SessionID = self._session_id
self._region.circuit.send_message(ProxiedMessage(
self._connection_holder.circuit.send_message(Message(
'TransferRequest',
Block(
'TransferInfo',
@@ -107,13 +111,13 @@ class TransferManager:
async def _pump_transfer_replies(self, transfer: Transfer):
# Subscribe to message related to our transfer while we're in this block
with self._region.message_handler.subscribe_async(
_TRANSFER_MESSAGES,
predicate=transfer.is_our_message
with self._connection_holder.message_handler.subscribe_async(
_TRANSFER_MESSAGES,
predicate=transfer.is_our_message,
) as get_msg:
while not transfer.done():
try:
msg: ProxiedMessage = await asyncio.wait_for(get_msg(), 5.0)
msg: Message = await asyncio.wait_for(get_msg(), 5.0)
except TimeoutError as e:
transfer.set_exception(e)
return
@@ -128,18 +132,18 @@ class TransferManager:
elif msg.name == "TransferAbort":
transfer.error_code = msg["TransferID"][0].deserialize_var("Result")
transfer.set_exception(
ConnectionAbortedError(f"Unknown failure")
ConnectionAbortedError("Unknown failure")
)
def _handle_transfer_packet(self, msg: ProxiedMessage, transfer: Transfer):
def _handle_transfer_packet(self, msg: Message, transfer: Transfer):
transfer_block = msg["TransferData"][0]
packet_id: int = transfer_block["Packet"]
packet_data = transfer_block["Data"]
transfer.chunks[packet_id] = packet_data
if transfer_block["Status"] == TransferStatus.DONE:
if transfer_block["Status"] == TransferStatus.DONE and not transfer.done():
transfer.mark_done()
def _handle_transfer_info(self, msg: ProxiedMessage, transfer: Transfer):
def _handle_transfer_info(self, msg: Message, transfer: Transfer):
transfer_block = msg["TransferInfo"][0]
transfer.expected_size = transfer_block["Size"]
# Don't re-set if we get a resend of packet 0

View File

@@ -1,11 +1,10 @@
import dataclasses
from typing import *
import pkg_resources
import hippolyzer.lib.base.serialization as se
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.proxy.templates import AssetType
from hippolyzer.lib.base.helpers import get_resource_filename
from hippolyzer.lib.base.templates import AssetType
@dataclasses.dataclass
@@ -21,7 +20,7 @@ class VFSBlock:
class VFS:
def __init__(self, index_path):
self._data_fh = None
self.blocks = []
self.blocks: List[VFSBlock] = []
self._uuid_lookup: Dict[UUID, VFSBlock] = {}
assert "index.db2" in index_path
@@ -45,10 +44,10 @@ class VFS:
self.blocks.append(block)
self._uuid_lookup[block.file_id] = block
def __iter__(self):
def __iter__(self) -> Iterator[VFSBlock]:
return iter(self.blocks)
def __getitem__(self, item: UUID):
def __getitem__(self, item: UUID) -> VFSBlock:
return self._uuid_lookup[item]
def __contains__(self, item: UUID):
@@ -59,10 +58,10 @@ class VFS:
self._data_fh.close()
self._data_fh = None
def read_block(self, block: VFSBlock):
def read_block(self, block: VFSBlock) -> bytes:
self._data_fh.seek(block.location)
return self._data_fh.read(block.size)
_static_path = pkg_resources.resource_filename("hippolyzer.lib.proxy", "data/static_index.db2")
_static_path = get_resource_filename("lib/base/data/static_index.db2")
STATIC_VFS = VFS(_static_path)
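The VFS above is a flat index over a single data file: each block records an offset and a size, and `read_block` is just a seek-and-read. A miniature standalone sketch of that access pattern (hypothetical names, with `BytesIO` standing in for the data file handle):

```python
import io


class Block:
    """Stand-in for VFSBlock: where a file's bytes live in the data file."""
    def __init__(self, location: int, size: int):
        self.location = location
        self.size = size


def read_block(fh, block: Block) -> bytes:
    # Seek to the block's offset in the data file and read its payload
    fh.seek(block.location)
    return fh.read(block.size)


data_file = io.BytesIO(b"AAAABBBBCCCC")
assert read_block(data_file, Block(4, 4)) == b"BBBB"
```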

View File

@@ -0,0 +1,148 @@
"""
Body parts and linden clothing layers
"""
from __future__ import annotations
import dataclasses
import logging
from io import StringIO
from typing import *
from xml.etree.ElementTree import parse as parse_etree
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.helpers import get_resource_filename
from hippolyzer.lib.base.legacy_inv import InventorySaleInfo, InventoryPermissions
from hippolyzer.lib.base.legacy_schema import SchemaBase, parse_schema_line, SchemaParsingError
from hippolyzer.lib.base.templates import WearableType
LOG = logging.getLogger(__name__)
_T = TypeVar("_T")
WEARABLE_VERSION = "LLWearable version 22"
@dataclasses.dataclass
class VisualParam:
id: int
name: str
value_min: float
value_max: float
# These might be `None` if the param isn't meant to be directly edited
edit_group: Optional[str]
wearable: Optional[str]
class VisualParams(List[VisualParam]):
def __init__(self):
super().__init__()
lad_path = get_resource_filename("lib/base/data/avatar_lad.xml")
with open(lad_path, "rb") as f:
doc = parse_etree(f)
for param in doc.findall(".//param"):
self.append(VisualParam(
id=int(param.attrib["id"]),
name=param.attrib["name"],
edit_group=param.get("edit_group"),
wearable=param.get("wearable"),
value_min=float(param.attrib["value_min"]),
value_max=float(param.attrib["value_max"]),
))
def by_name(self, name: str) -> VisualParam:
return [x for x in self if x.name == name][0]
def by_edit_group(self, edit_group: str) -> List[VisualParam]:
return [x for x in self if x.edit_group == edit_group]
def by_wearable(self, wearable: str) -> List[VisualParam]:
return [x for x in self if x.wearable == wearable]
VISUAL_PARAMS = VisualParams()
@dataclasses.dataclass
class Wearable(SchemaBase):
name: str
wearable_type: WearableType
permissions: InventoryPermissions
sale_info: InventorySaleInfo
# VisualParam ID -> val
parameters: Dict[int, float]
# TextureEntry ID -> texture ID
textures: Dict[int, UUID]
@classmethod
def _skip_to_next_populated_line(cls, reader: StringIO):
old_pos = reader.tell()
while peeked_data := reader.readline():
# Read until we find a non-blank line
if peeked_data.lstrip("\n"):
break
old_pos = reader.tell()
# Reading an empty string means EOF
if not peeked_data:
raise SchemaParsingError("Premature EOF")
reader.seek(old_pos)
@classmethod
def _read_and_parse_line(cls, reader: StringIO):
cls._skip_to_next_populated_line(reader)
return parse_schema_line(reader.readline())
@classmethod
def _read_expected_key(cls, reader: StringIO, expected_key: str) -> str:
key, val = cls._read_and_parse_line(reader)
if key != expected_key:
raise ValueError(f"Expected {expected_key} not found, {(key, val)!r}")
return val
@classmethod
def from_reader(cls, reader: StringIO) -> Wearable:
cls._skip_to_next_populated_line(reader)
version_str = reader.readline().rstrip()
if version_str != WEARABLE_VERSION:
raise ValueError(f"Bad wearable version {version_str!r}")
cls._skip_to_next_populated_line(reader)
name = reader.readline().rstrip()
permissions = InventoryPermissions.from_reader(reader, read_header=True)
sale_info = InventorySaleInfo.from_reader(reader, read_header=True)
wearable_type = WearableType(int(cls._read_expected_key(reader, "type")))
num_params = int(cls._read_expected_key(reader, "parameters"))
params = {}
for _ in range(num_params):
param_id, param_val = cls._read_and_parse_line(reader)
if param_val == ".":
param_val = "0.0"
params[int(param_id)] = float(param_val)
num_textures = int(cls._read_expected_key(reader, "textures"))
textures = {}
for _ in range(num_textures):
te_id, texture_id = cls._read_and_parse_line(reader)
textures[int(te_id)] = UUID(texture_id)
return Wearable(
name=name,
wearable_type=wearable_type,
permissions=permissions,
sale_info=sale_info,
parameters=params,
textures=textures
)
def to_writer(self, writer: StringIO):
writer.write(f"{WEARABLE_VERSION}\n")
writer.write(f"{self.name}\n\n")
self.permissions.to_writer(writer)
self.sale_info.to_writer(writer)
writer.write(f"type {int(self.wearable_type)}\n")
writer.write(f"parameters {len(self.parameters)}\n")
for param_id, param_val in self.parameters.items():
writer.write(f"{param_id} {param_val}\n")
writer.write(f"textures {len(self.textures)}\n")
for te_id, texture_id in self.textures.items():
writer.write(f"{te_id} {texture_id}\n")
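The LLWearable text format handled above is simple enough to exercise standalone. Here is a minimal parser for just the `parameters` section, mirroring the `from_reader` logic (the version header, permissions, and sale-info handling are omitted; the function name is hypothetical, not part of hippolyzer):

```python
def parse_params_section(lines):
    """Parse a 'parameters N' header plus N 'id value' lines into a dict."""
    it = iter(lines)
    key, count = next(it).split()
    assert key == "parameters"
    params = {}
    for _ in range(int(count)):
        param_id, param_val = next(it).split()
        # The viewer writes a bare "." for 0.0 in some wearables
        if param_val == ".":
            param_val = "0.0"
        params[int(param_id)] = float(param_val)
    return params


assert parse_params_section(["parameters 2", "33 0.5", "34 ."]) == {33: 0.5, 34: 0.0}
```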

View File

@@ -0,0 +1,286 @@
"""
Managers for inbound and outbound Xfers, as well as the AssetUploadRequest flow
"""
from __future__ import annotations
import asyncio
import enum
import random
from typing import *
from hippolyzer.lib.base.datatypes import UUID, RawBytes
from hippolyzer.lib.base.message.data_packer import TemplateDataPacker
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.message.msgtypes import MsgType
from hippolyzer.lib.base.network.transport import Direction
from hippolyzer.lib.base.message.circuit import ConnectionHolder
from hippolyzer.lib.base.templates import XferPacket, XferFilePath, AssetType, XferError
_XFER_MESSAGES = {"AbortXfer", "ConfirmXferPacket", "RequestXfer", "SendXferPacket"}
MAX_CHUNK_SIZE = 1150
ACK_AHEAD_MAX = 10
class Xfer:
def __init__(
self,
xfer_id: Optional[int] = None,
direction: Direction = Direction.OUT,
data: Optional[bytes] = None,
turbo: bool = False,
):
self.xfer_id: Optional[int] = xfer_id
self.chunks: Dict[int, bytes] = {}
self.expected_size: Optional[int] = None
self.size_known = asyncio.Future()
self.error_code: Union[int, XferError] = 0
self.next_ackable = 0
self.turbo = turbo
self.direction: Direction = direction
self.expected_chunks: Optional[int] = None
self._future: asyncio.Future[Xfer] = asyncio.Future()
if data is not None:
# Prepend the expected length field to the first chunk
if not isinstance(data, RawBytes):
data = TemplateDataPacker.pack(len(data), MsgType.MVT_S32) + data
chunk_num = 0
while data:
self.chunks[chunk_num] = data[:MAX_CHUNK_SIZE]
data = data[MAX_CHUNK_SIZE:]
chunk_num += 1
def reassemble_chunks(self) -> bytes:
assembled = bytearray()
for _, data in sorted(self.chunks.items()):
assembled.extend(data)
return assembled
def mark_done(self):
self._future.set_result(self)
def done(self) -> bool:
return self._future.done()
def cancelled(self) -> bool:
return self._future.cancelled()
def is_our_message(self, message):
return message["XferID"]["ID"] == self.xfer_id
def cancel(self) -> bool:
if not self.size_known.done():
self.size_known.cancel()
return self._future.cancel()
def set_exception(self, exc: Union[type, BaseException]) -> None:
if not self.size_known.done():
self.size_known.set_exception(exc)
return self._future.set_exception(exc)
def __await__(self) -> Generator[Any, None, Xfer]:
return self._future.__await__()
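The chunking in `Xfer.__init__` prepends the payload length and splits into 1150-byte chunks, and `reassemble_chunks` is its inverse. A self-contained round-trip sketch, using `struct` in place of `TemplateDataPacker` on the assumption that MVT_S32 packs as a little-endian signed 32-bit int:

```python
import struct

MAX_CHUNK_SIZE = 1150


def split_chunks(data: bytes) -> dict:
    # Prepend the expected length as a little-endian S32, then chunk
    data = struct.pack("<i", len(data)) + data
    chunks, num = {}, 0
    while data:
        chunks[num] = data[:MAX_CHUNK_SIZE]
        data = data[MAX_CHUNK_SIZE:]
        num += 1
    return chunks


def reassemble(chunks: dict) -> bytes:
    # Chunks may arrive out of order; sort by packet ID before joining
    assembled = bytearray()
    for _, chunk in sorted(chunks.items()):
        assembled.extend(chunk)
    return bytes(assembled)


payload = b"x" * 3000
chunks = split_chunks(payload)
joined = reassemble(chunks)
assert struct.unpack("<i", joined[:4])[0] == 3000
assert joined[4:] == payload
```

The length prefix is only a pre-allocation hint on the wire; as the diff notes, the EOF bit on the final packet is what actually terminates the transfer.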
class UploadStrategy(enum.IntEnum):
XFER = enum.auto()
ASSET_UPLOAD_REQUEST = enum.auto()
class XferManager:
def __init__(
self,
connection_holder: ConnectionHolder,
secure_session_id: Optional[UUID] = None,
):
self._connection_holder = connection_holder
self._secure_session_id = secure_session_id
def request(
self, xfer_id: Optional[int] = None,
file_name: Union[bytes, str, None] = None,
file_path: Optional[Union[XferFilePath, int]] = None,
vfile_id: Optional[UUID] = None,
vfile_type: Optional[Union[AssetType, int]] = None,
use_big_packets: bool = False,
delete_on_completion: bool = True,
turbo: bool = False,
direction: Direction = Direction.OUT,
) -> Xfer:
xfer_id = xfer_id if xfer_id is not None else random.getrandbits(64)
self._connection_holder.circuit.send_message(Message(
'RequestXfer',
Block(
'XferID',
ID=xfer_id,
Filename=file_name or b'',
FilePath=file_path or XferFilePath.NONE,
DeleteOnCompletion=delete_on_completion,
UseBigPackets=use_big_packets,
VFileID=vfile_id or UUID(),
VFileType=vfile_type or AssetType.NONE,
),
direction=direction,
))
xfer = Xfer(xfer_id, direction=direction, turbo=turbo)
asyncio.create_task(self._pump_xfer_replies(xfer))
return xfer
async def _pump_xfer_replies(self, xfer: Xfer):
with self._connection_holder.message_handler.subscribe_async(
_XFER_MESSAGES,
predicate=xfer.is_our_message,
) as get_msg:
while not xfer.done():
try:
msg: Message = await asyncio.wait_for(get_msg(), 5.0)
except asyncio.exceptions.TimeoutError as e:
xfer.set_exception(e)
return
if xfer.cancelled():
# AbortXfer doesn't seem to work on in-progress Xfers.
# Just let any new packets drop on the floor.
return
if msg.name == "SendXferPacket":
self._handle_send_xfer_packet(msg, xfer)
elif msg.name == "AbortXfer":
xfer.error_code = msg["XferID"][0].deserialize_var("Result")
xfer.set_exception(
ConnectionAbortedError(f"Xfer failed with {xfer.error_code!r}")
)
def _handle_send_xfer_packet(self, msg: Message, xfer: Xfer):
# Received a SendXferPacket for an Xfer we requested ourselves
packet_id: XferPacket = msg["XferID"][0].deserialize_var("Packet")
packet_data = msg["DataPacket"]["Data"]
# First 4 bytes are expected total data length
if packet_id.PacketID == 0:
# Yes, S32. Only used as a hint so buffers can be pre-allocated,
# EOF bit determines when the data actually ends.
xfer.expected_size = TemplateDataPacker.unpack(packet_data[:4], MsgType.MVT_S32)
# Don't re-set if we get a resend of packet 0
if not xfer.size_known.done():
xfer.size_known.set_result(xfer.expected_size)
packet_data = packet_data[4:]
to_ack = (packet_id.PacketID,)
if xfer.turbo:
# ACK the next few packets we expect to be sent, if we haven't already
ack_max = packet_id.PacketID + ACK_AHEAD_MAX
to_ack = range(xfer.next_ackable, ack_max)
xfer.next_ackable = ack_max
for ack_id in to_ack:
self._connection_holder.circuit.send_message(Message(
"ConfirmXferPacket",
Block("XferID", ID=xfer.xfer_id, Packet=ack_id),
direction=xfer.direction,
))
xfer.chunks[packet_id.PacketID] = packet_data
# We may be waiting on other packets so we can't end immediately.
if packet_id.IsEOF:
xfer.expected_chunks = packet_id.PacketID + 1
if not xfer.done() and len(xfer.chunks) == xfer.expected_chunks:
xfer.mark_done()
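The turbo ACK-ahead logic above boils down to: confirm every packet from the last ackable ID up to (but not including) `PacketID + ACK_AHEAD_MAX`, instead of confirming one packet at a time. Sketched in isolation with hypothetical names:

```python
ACK_AHEAD_MAX = 10


def next_acks(packet_id: int, next_ackable: int, turbo: bool):
    """Return (ids_to_ack, new_next_ackable) per the turbo scheme above."""
    if not turbo:
        # Normal mode: confirm only the packet we just received
        return (packet_id,), next_ackable
    # Turbo mode: pre-confirm the next few packets we expect to be sent
    ack_max = packet_id + ACK_AHEAD_MAX
    return tuple(range(next_ackable, ack_max)), ack_max


assert next_acks(0, 0, True) == (tuple(range(10)), 10)
# Subsequent packets only add one new ID past the already-acked window
assert next_acks(1, 10, True) == ((10,), 11)
```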
def upload_asset(
self,
asset_type: AssetType,
data: Union[bytes, str],
store_local: bool = False,
temp_file: bool = False,
transaction_id: Optional[UUID] = None,
upload_strategy: Optional[UploadStrategy] = None,
) -> asyncio.Future[UUID]:
"""Upload an asset through the Xfer upload path"""
if not transaction_id:
transaction_id = UUID.random()
if isinstance(data, str):
data = data.encode("utf8")
# Small amounts of data can be sent inline, decide based on size
if upload_strategy is None:
if len(data) >= MAX_CHUNK_SIZE:
upload_strategy = UploadStrategy.XFER
else:
upload_strategy = UploadStrategy.ASSET_UPLOAD_REQUEST
xfer = None
inline_data = b''
if upload_strategy == UploadStrategy.XFER:
xfer = Xfer(data=data)
else:
inline_data = data
self._connection_holder.circuit.send_message(Message(
"AssetUploadRequest",
Block(
"AssetBlock",
TransactionID=transaction_id,
Type=asset_type,
Tempfile=temp_file,
StoreLocal=store_local,
AssetData=inline_data,
)
))
fut = asyncio.Future()
asyncio.create_task(self._pump_asset_upload(xfer, transaction_id, fut))
return fut
async def _pump_asset_upload(self, xfer: Optional[Xfer], transaction_id: UUID, fut: asyncio.Future):
message_handler = self._connection_holder.message_handler
# We'll receive an Xfer request for the asset we're uploading.
# The asset ID is determined by hashing the secure session ID with the chosen transaction ID.
asset_id: UUID = UUID.combine(transaction_id, self._secure_session_id)
try:
# Only need to do this if we're using the xfer upload strategy, otherwise all the
# data was already sent in the AssetUploadRequest and we don't expect a RequestXfer.
def request_predicate(request_msg: Message):
return request_msg["XferID"]["VFileID"] == asset_id
if xfer is not None:
await self.serve_inbound_xfer_request(xfer, request_predicate)
def complete_predicate(complete_msg: Message):
return complete_msg["AssetBlock"]["UUID"] == asset_id
msg = await message_handler.wait_for(('AssetUploadComplete',), predicate=complete_predicate)
if msg["AssetBlock"]["Success"] == 1:
fut.set_result(asset_id)
else:
fut.set_exception(RuntimeError(f"Xfer for transaction {transaction_id} failed"))
except asyncio.TimeoutError as e:
fut.set_exception(e)
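The asset ID derivation hinges on `UUID.combine`. Assuming it matches indra's `LLUUID::combine` — an MD5 digest over the two UUIDs' raw 16-byte values concatenated, with the digest used as the resulting UUID — it can be sketched with the stdlib. This is my reading of the viewer source, not something the diff itself confirms:

```python
import hashlib
import uuid


def combine(a: uuid.UUID, b: uuid.UUID) -> uuid.UUID:
    """Sketch of LLUUID::combine: MD5 over both raw 16-byte values."""
    digest = hashlib.md5(a.bytes + b.bytes).digest()
    # An MD5 digest is exactly 16 bytes, the size of a UUID
    return uuid.UUID(bytes=digest)


transaction_id = uuid.uuid4()
secure_session_id = uuid.uuid4()
asset_id = combine(transaction_id, secure_session_id)
```

Note the derivation is deterministic but order-sensitive: `combine(a, b)` and `combine(b, a)` hash different byte strings.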
async def serve_inbound_xfer_request(
self,
xfer: Xfer,
request_predicate: Callable[[Message], bool],
wait_for_confirm: bool = True
):
message_handler = self._connection_holder.message_handler
request_msg = await message_handler.wait_for(
('RequestXfer',), predicate=request_predicate, timeout=5.0)
xfer.xfer_id = request_msg["XferID"]["ID"]
packet_id = 0
# TODO: No resend yet. If it's lost, it's lost.
while xfer.chunks:
chunk = xfer.chunks.pop(packet_id)
# EOF if there are no chunks left
packet_val = XferPacket(PacketID=packet_id, IsEOF=not bool(xfer.chunks))
self._connection_holder.circuit.send_message(Message(
"SendXferPacket",
Block("XferID", ID=xfer.xfer_id, Packet_=packet_val),
Block("DataPacket", Data=chunk),
# Send this towards the sender of the RequestXfer
direction=~request_msg.direction,
))
# Don't care about the value, just want to know it was confirmed.
if wait_for_confirm:
await message_handler.wait_for(
("ConfirmXferPacket",), predicate=xfer.is_our_message, timeout=5.0)
packet_id += 1

View File

@@ -0,0 +1,82 @@
import dataclasses
from typing import *
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.message.message_handler import MessageHandler
@dataclasses.dataclass
class NameCacheEntry:
full_id: UUID
first_name: Optional[str] = None
last_name: Optional[str] = None
display_name: Optional[str] = None
def __str__(self):
if self.display_name:
return f"{self.display_name} ({self.legacy_name})"
if self.legacy_name:
return self.legacy_name
return f"(???) ({self.full_id})"
@property
def legacy_name(self) -> Optional[str]:
if self.first_name is None:
return None
return f"{self.first_name} {self.last_name}"
@property
def preferred_name(self) -> Optional[str]:
if self.display_name:
return self.display_name
return self.legacy_name
class NameCache:
def __init__(self):
self._cache: Dict[UUID, NameCacheEntry] = {}
def create_subscriptions(
self,
message_handler: MessageHandler[Message, str],
):
message_handler.subscribe("UUIDNameReply", self._handle_uuid_name_reply)
def lookup(self, uuid: UUID, create_if_none: bool = False) -> Optional[NameCacheEntry]:
val = self._cache.get(uuid)
if create_if_none and val is None:
val = NameCacheEntry(full_id=uuid)
self._cache[uuid] = val
return val
def update(self, full_id: UUID, vals: dict):
# upsert the cache entry
entry = self._cache.get(full_id) or NameCacheEntry(full_id=full_id)
if "FirstName" in vals:
entry.first_name = vals["FirstName"]
if "LastName" in vals:
entry.last_name = vals["LastName"]
if "DisplayName" in vals:
entry.display_name = vals["DisplayName"] if vals["DisplayName"] else None
self._cache[full_id] = entry
def _handle_uuid_name_reply(self, msg: Message):
for block in msg.blocks["UUIDNameBlock"]:
self.update(block["ID"], {
"FirstName": block["FirstName"],
"LastName": block["LastName"],
})
def _process_display_names_response(self, parsed: dict):
"""Handle the response from the GetDisplayNames cap"""
for agent in parsed["agents"]:
# Don't set display name if they just have the default
display_name = None
if not agent["is_display_name_default"]:
display_name = agent["display_name"]
self.update(agent["id"], {
"FirstName": agent["legacy_first_name"],
"LastName": agent["legacy_last_name"],
"DisplayName": display_name,
})
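The name-preference fallback above (display name if set and non-default, else legacy "First Last", else unknown) can be exercised standalone. A trimmed copy of the `NameCacheEntry` properties:

```python
import dataclasses
from typing import Optional


@dataclasses.dataclass
class Entry:
    first_name: Optional[str] = None
    last_name: Optional[str] = None
    display_name: Optional[str] = None

    @property
    def legacy_name(self) -> Optional[str]:
        if self.first_name is None:
            return None
        return f"{self.first_name} {self.last_name}"

    @property
    def preferred_name(self) -> Optional[str]:
        # Display name wins when present; fall back to the legacy name
        return self.display_name or self.legacy_name


e = Entry(first_name="Salad", last_name="Dais")
assert e.preferred_name == "Salad Dais"
e.display_name = "salad"
assert e.preferred_name == "salad"
```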

View File

@@ -0,0 +1,881 @@
"""
Manager for a client's view of objects in the region and world.
"""
from __future__ import annotations
import asyncio
import collections
import enum
import itertools
import logging
import math
import weakref
from typing import *
from hippolyzer.lib.base.datatypes import UUID, Vector3
from hippolyzer.lib.base.helpers import proxify
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.message.message_handler import MessageHandler
from hippolyzer.lib.base.objects import (
normalize_object_update,
normalize_terse_object_update,
normalize_object_update_compressed_data,
normalize_object_update_compressed,
Object, handle_to_global_pos,
)
from hippolyzer.lib.base.settings import Settings
from hippolyzer.lib.client.namecache import NameCache, NameCacheEntry
from hippolyzer.lib.client.state import BaseClientSession, BaseClientRegion
from hippolyzer.lib.base.templates import PCode, ObjectStateSerializer
LOG = logging.getLogger(__name__)
OBJECT_OR_LOCAL = Union[Object, int]
class UpdateType(enum.IntEnum):
OBJECT_UPDATE = enum.auto()
PROPERTIES = enum.auto()
FAMILY = enum.auto()
COSTS = enum.auto()
KILL = enum.auto()
class ClientObjectManager:
"""
Object manager for a specific region
"""
__slots__ = ("_region", "_world_objects", "state")
def __init__(self, region: BaseClientRegion):
self._region: BaseClientRegion = proxify(region)
self._world_objects: ClientWorldObjectManager = proxify(region.session().objects)
self.state: RegionObjectsState = RegionObjectsState()
def __len__(self):
return len(self.state.localid_lookup)
@property
def all_objects(self) -> Iterable[Object]:
return self.state.localid_lookup.values()
@property
def missing_locals(self) -> Set[int]:
return self.state.missing_locals
def clear(self):
self.state.clear()
if self._region.handle is not None:
# We're tracked by the world object manager, tell it to untrack
# any objects that we owned
self._world_objects.clear_region_objects(self._region.handle)
def lookup_localid(self, localid: int) -> Optional[Object]:
return self.state.lookup_localid(localid)
def lookup_fullid(self, fullid: UUID) -> Optional[Object]:
obj = self._world_objects.lookup_fullid(fullid)
if obj is None or obj.RegionHandle != self._region.handle:
return None
return obj
@property
def all_avatars(self) -> Iterable[Avatar]:
return tuple(a for a in self._world_objects.all_avatars
if a.RegionHandle == self._region.handle)
def lookup_avatar(self, fullid: UUID) -> Optional[Avatar]:
for avatar in self.all_avatars:
if avatar.FullID == fullid:
return avatar
return None
# noinspection PyUnusedLocal
def _is_localid_selected(self, local_id: int):
return False
def request_object_properties(self, objects: Union[OBJECT_OR_LOCAL, Sequence[OBJECT_OR_LOCAL]]) \
-> List[asyncio.Future[Object]]:
if isinstance(objects, (Object, int)):
objects = (objects,)
if not objects:
return []
local_ids = tuple((o.LocalID if isinstance(o, Object) else o) for o in objects)
# Don't mess with already selected objects
unselected_ids = tuple(local for local in local_ids if not self._is_localid_selected(local))
ids_to_req = unselected_ids
session = self._region.session()
while ids_to_req:
blocks = [
Block("AgentData", AgentID=session.agent_id, SessionID=session.id),
*[Block("ObjectData", ObjectLocalID=x) for x in ids_to_req[:255]],
]
# Selecting causes ObjectProperties to be sent
self._region.circuit.send_message(Message("ObjectSelect", blocks))
self._region.circuit.send_message(Message("ObjectDeselect", blocks))
ids_to_req = ids_to_req[255:]
futures = []
for local_id in local_ids:
if local_id in unselected_ids:
# Need to wait until we get our reply
fut = self.state.register_future(local_id, UpdateType.PROPERTIES)
else:
# This was selected so we should already have up to date info
fut = asyncio.Future()
fut.set_result(self.lookup_localid(local_id))
futures.append(fut)
return futures
def request_missing_objects(self) -> List[asyncio.Future[Object]]:
return self.request_objects(self.state.missing_locals)
def request_objects(self, local_ids: Union[int, Iterable[int]]) -> List[asyncio.Future[Object]]:
"""
Request object local IDs, returning a list of awaitable handles for the objects
Some may never be resolved, so use `asyncio.wait()` or `asyncio.wait_for()`.
"""
if isinstance(local_ids, int):
local_ids = (local_ids,)
elif isinstance(local_ids, set):
local_ids = tuple(local_ids)
session = self._region.session()
ids_to_req = local_ids
while ids_to_req:
self._region.circuit.send_message(Message(
"RequestMultipleObjects",
Block("AgentData", AgentID=session.agent_id, SessionID=session.id),
*[Block("ObjectData", CacheMissType=0, ID=x) for x in ids_to_req[:255]],
))
ids_to_req = ids_to_req[255:]
futures = []
for local_id in local_ids:
futures.append(self.state.register_future(local_id, UpdateType.OBJECT_UPDATE))
return futures
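Both request paths above batch local IDs into messages of at most 255 `ObjectData` blocks via repeated `[:255]` / `[255:]` slicing. The batching idiom in isolation (helper name is hypothetical):

```python
def batch(ids, size: int = 255):
    """Yield successive slices of at most `size` IDs, as the request loops do."""
    ids = tuple(ids)
    while ids:
        yield ids[:size]
        ids = ids[size:]


# 600 IDs split into messages of 255, 255, and 90 blocks
assert [len(b) for b in batch(range(600))] == [255, 255, 90]
```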
class ObjectEvent:
__slots__ = ("object", "updated", "update_type")
object: Object
updated: Set[str]
update_type: UpdateType
def __init__(self, obj: Object, updated: Set[str], update_type: UpdateType):
self.object = obj
self.updated = updated
self.update_type = update_type
@property
def name(self) -> UpdateType:
return self.update_type
class ClientWorldObjectManager:
"""Manages Objects for a session's whole world"""
def __init__(self, session: BaseClientSession, settings: Settings, name_cache: Optional[NameCache]):
self._session: BaseClientSession = session
self._settings = settings
self.name_cache = name_cache or NameCache()
self.events: MessageHandler[ObjectEvent, UpdateType] = MessageHandler(take_by_default=False)
self._fullid_lookup: Dict[UUID, Object] = {}
self._avatars: Dict[UUID, Avatar] = {}
self._avatar_objects: Dict[UUID, Object] = {}
self._region_managers: Dict[int, ClientObjectManager] = {}
message_handler = self._session.message_handler
message_handler.subscribe("ObjectUpdate", self._handle_object_update)
message_handler.subscribe("ImprovedTerseObjectUpdate",
self._handle_terse_object_update)
message_handler.subscribe("ObjectUpdateCompressed",
self._handle_object_update_compressed)
message_handler.subscribe("ObjectUpdateCached",
self._handle_object_update_cached)
message_handler.subscribe("CoarseLocationUpdate",
self._handle_coarse_location_update)
message_handler.subscribe("KillObject",
self._handle_kill_object)
message_handler.subscribe("ObjectProperties",
self._handle_object_properties_generic)
message_handler.subscribe("ObjectPropertiesFamily",
self._handle_object_properties_generic)
def lookup_fullid(self, full_id: UUID) -> Optional[Object]:
return self._fullid_lookup.get(full_id, None)
@property
def all_objects(self) -> Iterable[Object]:
return self._fullid_lookup.values()
def lookup_avatar(self, full_id: UUID) -> Optional[Avatar]:
return {a.FullID: a for a in self.all_avatars}.get(full_id, None)
@property
def all_avatars(self) -> Iterable[Avatar]:
return tuple(self._avatars.values())
def __len__(self):
return len(self._fullid_lookup)
def _get_region_state(self, handle: int) -> Optional[RegionObjectsState]:
val = self._get_region_manager(handle)
if val is None:
return None
return val.state
def track_region_objects(self, handle: int):
"""Start tracking objects for a region"""
if self._get_region_manager(handle) is None:
self._region_managers[handle] = proxify(self._session.region_by_handle(handle).objects)
def clear_region_objects(self, handle: int):
"""Handle signal that a region object manager was just cleared"""
# Make sure they're gone from our lookup table
for obj in tuple(self._fullid_lookup.values()):
if obj.RegionHandle == handle:
del self._fullid_lookup[obj.FullID]
self._rebuild_avatar_objects()
def _get_region_manager(self, handle: int) -> Optional[ClientObjectManager]:
return self._region_managers.get(handle)
def request_missing_objects(self) -> List[asyncio.Future[Object]]:
futs = []
for region in self._session.regions:
futs.extend(region.objects.request_missing_objects())
return futs
def request_object_properties(self, objects: Union[Object, Sequence[Object]]) \
-> List[asyncio.Future[Object]]:
# Unlike the per-region object manager, this doesn't accept local IDs, since they're ambiguous across regions.
if isinstance(objects, Object):
objects = (objects,)
if not objects:
return []
# Has to be sent to the region they belong to, so split the objects out by region handle.
objs_by_region = collections.defaultdict(list)
for obj in objects:
objs_by_region[obj.RegionHandle].append(obj)
futs = []
for region_handle, region_objs in objs_by_region.items():
region_mgr = self._get_region_manager(region_handle)
futs.extend(region_mgr.request_object_properties(region_objs))
return futs
async def load_ancestors(self, obj: Object, wait_time: float = 1.0):
"""
Ensure that the entire chain of parents above this object is loaded
Use this to make sure the object you're dealing with isn't orphaned and
its RegionPosition can be determined.
"""
region_mgr = self._get_region_manager(obj.RegionHandle)
while obj.ParentID:
if obj.Parent is None:
await asyncio.wait_for(region_mgr.request_objects(obj.ParentID)[0], wait_time)
obj = obj.Parent
def clear(self):
self._avatars.clear()
for region_mgr in self._region_managers.values():
region_mgr.clear()
if self._fullid_lookup:
LOG.warning(f"Had {len(self._fullid_lookup)} objects not tied to a region manager!")
self._fullid_lookup.clear()
self._rebuild_avatar_objects()
self._region_managers.clear()
def _update_existing_object(self, obj: Object, new_properties: dict, update_type: UpdateType):
old_parent_id = obj.ParentID
new_parent_id = new_properties.get("ParentID", obj.ParentID)
old_local_id = obj.LocalID
new_local_id = new_properties.get("LocalID", obj.LocalID)
old_region_handle = obj.RegionHandle
new_region_handle = new_properties.get("RegionHandle", obj.RegionHandle)
old_region_state = self._get_region_state(old_region_handle)
new_region_state = self._get_region_state(new_region_handle)
actually_updated_props = set()
if old_region_handle != new_region_handle:
# The object just changed regions, we have to remove it from the old one.
# Our LocalID will most likely change because, well, our locale changed.
old_region_state.untrack_object(obj)
elif old_local_id != new_local_id:
# Our LocalID changed, and we deal with linkages to other prims by
# LocalID association. Break any links since our LocalID is changing.
# Could happen if we didn't mark an attachment prim dead and the parent agent
# came back into the sim. Attachment FullIDs do not change across TPs,
# LocalIDs do. This at least lets us partially recover from the bad state.
new_localid = new_properties["LocalID"]
LOG.warning(f"Got an update with new LocalID for {obj.FullID}, {obj.LocalID} != {new_localid}. "
f"May have mishandled a KillObject for a prim that left and re-entered region.")
old_region_state.untrack_object(obj)
obj.LocalID = new_localid
old_region_state.track_object(obj)
actually_updated_props |= {"LocalID"}
actually_updated_props |= obj.update_properties(new_properties)
if new_region_handle != old_region_handle:
# Region just changed to this region, we should have untracked it before
# so mark it tracked on this region. This should implicitly pick up any
# orphans and handle parent ID changes.
if new_region_state is not None:
new_region_state.track_object(obj)
else:
# This will leave a regionless object in the global lookup dict, same as indra.
LOG.warning(f"Tried to move object {obj!r} to unknown region {new_region_handle}")
if obj.PCode == PCode.AVATAR:
# `Avatar` instances are handled separately. Update all Avatar objects so
# we can deal with the RegionHandle change.
self._rebuild_avatar_objects()
elif new_parent_id != old_parent_id:
# Parent ID changed, but we're in the same region
new_region_state.handle_object_reparented(obj, old_parent_id=old_parent_id)
if actually_updated_props and new_region_state is not None:
self._run_object_update_hooks(obj, actually_updated_props, update_type)
def _track_new_object(self, region: RegionObjectsState, obj: Object):
region.track_object(obj)
self._fullid_lookup[obj.FullID] = obj
if obj.PCode == PCode.AVATAR:
self._avatar_objects[obj.FullID] = obj
self._rebuild_avatar_objects()
self._run_object_update_hooks(obj, set(obj.to_dict().keys()), UpdateType.OBJECT_UPDATE)
def _kill_object_by_local_id(self, region_state: RegionObjectsState, local_id: int):
obj = region_state.lookup_localid(local_id)
region_state.missing_locals -= {local_id}
child_ids: Sequence[int]
if obj:
self._run_kill_object_hooks(obj)
child_ids = obj.ChildIDs
else:
LOG.debug(f"Tried to kill unknown object {local_id}")
# Kill any pending futures it might have had since untrack_object()
# won't be called.
region_state.cancel_futures(local_id)
# If it had any orphans, they need to die.
child_ids = region_state.collect_orphans(local_id)
# KillObject implicitly kills descendants.
# This may mutate child_ids; iterate in reverse so removal doesn't
# invalidate the iterator mid-loop.
for child_id in reversed(child_ids):
# indra special-cases avatar PCodes and doesn't mark them dead
# due to cascading kill. Is this correct? Do avatars require
# explicit kill? Does this imply ParentID = 0 or do we need
# an explicit follow-up update?
child_obj = region_state.lookup_localid(child_id)
if child_obj and child_obj.PCode == PCode.AVATAR:
continue
self._kill_object_by_local_id(region_state, child_id)
# Have to do this last, since untracking will clear child IDs
if obj:
region_state.untrack_object(obj)
self._fullid_lookup.pop(obj.FullID, None)
if obj.PCode == PCode.AVATAR:
self._avatar_objects.pop(obj.FullID, None)
self._rebuild_avatar_objects()
def _handle_object_update(self, msg: Message):
seen_locals = []
handle = msg["RegionData"]["RegionHandle"]
region_state = self._get_region_state(handle)
for block in msg['ObjectData']:
object_data = normalize_object_update(block, handle)
seen_locals.append(object_data["LocalID"])
if region_state is None:
LOG.warning(f"Got ObjectUpdate for unknown region {handle}: {object_data!r}")
# Do a lookup by FullID, if an object with this FullID already exists anywhere in
# our view of the world then we want to move it to this region.
obj = self.lookup_fullid(object_data["FullID"])
if obj:
self._update_existing_object(obj, object_data, UpdateType.OBJECT_UPDATE)
else:
if region_state is None:
continue
self._track_new_object(region_state, Object(**object_data))
msg.meta["ObjectUpdateIDs"] = tuple(seen_locals)
def _handle_terse_object_update(self, msg: Message):
seen_locals = []
handle = msg["RegionData"]["RegionHandle"]
region_state = self._get_region_state(handle)
for block in msg['ObjectData']:
object_data = normalize_terse_object_update(block, handle)
if region_state is None:
LOG.warning(f"Got ImprovedTerseObjectUpdate for unknown region {handle}: {object_data!r}")
continue
obj = region_state.lookup_localid(object_data["LocalID"])
# Can only update existing object with this message
if obj:
# Need the Object as context because decoding state requires PCode.
state_deserializer = ObjectStateSerializer.deserialize
object_data["State"] = state_deserializer(ctx_obj=obj, val=object_data["State"])
self._update_existing_object(obj, object_data, UpdateType.OBJECT_UPDATE)
else:
if region_state:
region_state.missing_locals.add(object_data["LocalID"])
LOG.debug(f"Received terse update for unknown object {object_data['LocalID']}")
seen_locals.append(object_data["LocalID"])
msg.meta["ObjectUpdateIDs"] = tuple(seen_locals)
def _handle_object_update_cached(self, msg: Message):
seen_locals = []
missing_locals = set()
handle = msg["RegionData"]["RegionHandle"]
region_state = self._get_region_state(handle)
for block in msg['ObjectData']:
seen_locals.append(block["ID"])
update_flags = block.deserialize_var("UpdateFlags", make_copy=False)
if region_state is None:
LOG.warning(f"Got ObjectUpdateCached for unknown region {handle}: {block!r}")
continue
# Check if we already know about the object
obj = region_state.lookup_localid(block["ID"])
if obj is not None and obj.CRC == block["CRC"]:
self._update_existing_object(obj, {
"UpdateFlags": update_flags,
"RegionHandle": handle,
}, UpdateType.OBJECT_UPDATE)
continue
cached_obj_data = self._lookup_cache_entry(handle, block["ID"], block["CRC"])
if cached_obj_data is not None:
cached_obj = normalize_object_update_compressed_data(cached_obj_data)
cached_obj["UpdateFlags"] = update_flags
cached_obj["RegionHandle"] = handle
self._track_new_object(region_state, Object(**cached_obj))
continue
# Don't know about it and wasn't cached.
missing_locals.add(block["ID"])
if region_state:
region_state.missing_locals.update(missing_locals)
if missing_locals:
self._handle_object_update_cached_misses(handle, missing_locals)
msg.meta["ObjectUpdateIDs"] = tuple(seen_locals)
def _handle_object_update_cached_misses(self, region_handle: int, missing_locals: Set[int]):
"""Handle an ObjectUpdateCached that referenced some un-cached local IDs"""
region_mgr = self._get_region_manager(region_handle)
region_mgr.request_objects(missing_locals)
# noinspection PyUnusedLocal
def _lookup_cache_entry(self, region_handle: int, local_id: int, crc: int) -> Optional[bytes]:
return None
def _handle_object_update_compressed(self, msg: Message):
seen_locals = []
handle = msg["RegionData"]["RegionHandle"]
region_state = self._get_region_state(handle)
for block in msg['ObjectData']:
object_data = normalize_object_update_compressed(block, handle)
seen_locals.append(object_data["LocalID"])
if region_state is None:
LOG.warning(f"Got ObjectUpdateCompressed for unknown region {handle}: {object_data!r}")
obj = self.lookup_fullid(object_data["FullID"])
if obj:
self._update_existing_object(obj, object_data, UpdateType.OBJECT_UPDATE)
else:
if region_state is None:
continue
self._track_new_object(region_state, Object(**object_data))
msg.meta["ObjectUpdateIDs"] = tuple(seen_locals)
def _handle_object_properties_generic(self, packet: Message):
seen_locals = []
for block in packet["ObjectData"]:
object_properties = dict(block.items())
if packet.name == "ObjectProperties":
object_properties["TextureID"] = block.deserialize_var("TextureID")
obj = self.lookup_fullid(block["ObjectID"])
if obj:
seen_locals.append(obj.LocalID)
self._update_existing_object(obj, object_properties, UpdateType.PROPERTIES)
else:
LOG.debug(f"Received {packet.name} for unknown {block['ObjectID']}")
packet.meta["ObjectUpdateIDs"] = tuple(seen_locals)
def _handle_kill_object(self, message: Message):
seen_locals = []
# Have to look up region based on sender, handle not sent in this message
region = self._session.region_by_circuit_addr(message.sender)
region_state = region.objects.state
for block in message["ObjectData"]:
self._kill_object_by_local_id(region_state, block["ID"])
seen_locals.append(block["ID"])
message.meta["ObjectUpdateIDs"] = tuple(seen_locals)
def _handle_coarse_location_update(self, message: Message):
# Have to look up region based on sender, handle not sent in this message
region = self._session.region_by_circuit_addr(message.sender)
region_state = region.objects.state
region_state.coarse_locations.clear()
coarse_locations: Dict[UUID, Vector3] = {}
for agent_block, location_block in zip(message["AgentData"], message["Location"]):
x, y, z = location_block["X"], location_block["Y"], location_block["Z"]
coarse_locations[agent_block["AgentID"]] = Vector3(
X=x,
Y=y,
# The Z value is multiplied by 4 to obtain the true Z location,
# which limits the reportable height to 1020m.
# If z == 255 then the true Z is unknown.
# http://wiki.secondlife.com/wiki/CoarseLocationUpdate
Z=z * 4 if z != 255 else math.inf,
)
region_state.coarse_locations.update(coarse_locations)
self._rebuild_avatar_objects()
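The Z handling described in the comments above can be sketched as a standalone helper (hedged: `decode_coarse_z` is an illustrative name, not a Hippolyzer API):

```python
import math

def decode_coarse_z(z_byte: int) -> float:
    """Decode the quantized Z byte from a CoarseLocationUpdate block.

    Z is sent divided by 4, so heights up to 1020m fit in one byte;
    the sentinel value 255 means the true Z is unknown (above 1020m).
    """
    if z_byte == 255:
        return math.inf
    return z_byte * 4.0

print(decode_coarse_z(100))  # -> 400.0
print(decode_coarse_z(255))  # -> inf
```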
def _process_get_object_cost_response(self, parsed: dict):
if "error" in parsed:
return
for object_id, object_costs in parsed.items():
obj = self.lookup_fullid(UUID(object_id))
if not obj:
LOG.debug(f"Received ObjectCost for unknown {object_id}")
continue
obj.ObjectCosts.update(object_costs)
self._run_object_update_hooks(obj, {"ObjectCosts"}, UpdateType.COSTS)
def _run_object_update_hooks(self, obj: Object, updated_props: Set[str], update_type: UpdateType):
region_state = self._get_region_state(obj.RegionHandle)
region_state.resolve_futures(obj, update_type)
if obj.PCode == PCode.AVATAR and "NameValue" in updated_props:
if obj.NameValue:
self.name_cache.update(obj.FullID, obj.NameValue.to_dict())
self.events.handle(ObjectEvent(obj, updated_props, update_type))
def _run_kill_object_hooks(self, obj: Object):
self.events.handle(ObjectEvent(obj, set(), UpdateType.KILL))
def _rebuild_avatar_objects(self):
# Get all avatars known through coarse locations and which region the location was in
coarse_locations: Dict[UUID, Tuple[int, Vector3]] = {}
for region_handle, region in self._region_managers.items():
for av_key, location in region.state.coarse_locations.items():
coarse_locations[av_key] = (region_handle, location)
# Merge together avatars known through coarse locations or objects, with details for both
current_av_details: Dict[UUID, Tuple[Optional[Tuple[int, Vector3]], Optional[Object]]] = {}
for av_key in set(coarse_locations.keys()) | set(self._avatar_objects.keys()):
details = (coarse_locations.get(av_key), self._avatar_objects.get(av_key))
current_av_details[av_key] = details
# Look for changes in avatars we're already tracking
for existing_key in tuple(self._avatars.keys()):
av = self._avatars[existing_key]
if existing_key in current_av_details:
# This avatar still exists, update it.
coarse_pair, av_obj = current_av_details[existing_key]
av.Object = av_obj
if coarse_pair:
coarse_handle, coarse_location = coarse_pair
av.CoarseLocation = coarse_location
av.RegionHandle = coarse_handle
# If we have a real value for Z then throw away any stale guesses
if av.CoarseLocation.Z != math.inf:
av.GuessedZ = None
if av_obj:
av.Object = av_obj
av.RegionHandle = av_obj.RegionHandle
else:
# Avatar isn't in coarse locations or objects, it's gone.
self._avatars.pop(existing_key, None)
av.Object = None
av.CoarseLocation = None
av.Valid = False
# Check for any new avatars
for av_key, (coarse_pair, av_obj) in current_av_details.items():
if av_key in self._avatars:
# Already handled in the update step above
continue
region_handle = None
coarse_location = None
if coarse_pair:
region_handle, coarse_location = coarse_pair
if av_obj:
# Prefer the region handle from the Object if we have one
region_handle = av_obj.RegionHandle
assert region_handle is not None
self._avatars[av_key] = Avatar(
full_id=av_key,
region_handle=region_handle,
resolved_name=self.name_cache.lookup(av_key, create_if_none=True),
coarse_location=coarse_location,
obj=av_obj,
)
class RegionObjectsState:
"""
Internal class for tracking Object state within a specific region
Should only be directly used by the world and region ObjectManagers.
"""
__slots__ = (
"handle", "missing_locals", "_orphans", "localid_lookup", "coarse_locations",
"_object_futures"
)
def __init__(self):
self.missing_locals = set()
self.localid_lookup: Dict[int, Object] = {}
self.coarse_locations: Dict[UUID, Vector3] = {}
self._object_futures: Dict[Tuple[int, int], List[asyncio.Future]] = {}
self._orphans: Dict[int, List[int]] = collections.defaultdict(list)
def clear(self):
"""Called by the owning ObjectManager when it knows the region is going away"""
for fut in tuple(itertools.chain(*self._object_futures.values())):
fut.cancel()
self._object_futures.clear()
self._orphans.clear()
self.coarse_locations.clear()
self.missing_locals.clear()
self.localid_lookup.clear()
def lookup_localid(self, localid: int) -> Optional[Object]:
return self.localid_lookup.get(localid)
def track_object(self, obj: Object):
"""Assign ownership of Object to this region"""
obj_same_localid = self.localid_lookup.get(obj.LocalID)
if obj_same_localid:
LOG.error(f"Clobbering existing object with LocalID {obj.LocalID}! "
f"{obj.to_dict()} clobbered {obj_same_localid.to_dict()}")
self.localid_lookup[obj.LocalID] = obj
# If it was missing, it's not missing anymore.
self.missing_locals -= {obj.LocalID}
self._parent_object(obj)
# Adopt any of our orphaned child objects.
for orphan_local in self.collect_orphans(obj.LocalID):
child_obj = self.localid_lookup.get(orphan_local)
# Shouldn't be any dead children in the orphanage
assert child_obj is not None
self._parent_object(child_obj)
def untrack_object(self, obj: Object):
"""
Remove an Object from this region's ownership
Can happen due to the object being killed, or due to it moving to another region
"""
former_child_ids = obj.ChildIDs[:]
for child_id in former_child_ids:
child_obj = self.localid_lookup.get(child_id)
assert child_obj is not None
self._unparent_object(child_obj, child_obj.ParentID)
# Place any remaining unkilled children in the orphanage
for child_id in former_child_ids:
self._track_orphan(child_id, obj.LocalID)
assert not obj.ChildIDs
# Make sure the parent knows we went away
self._unparent_object(obj, obj.ParentID)
# Object doesn't belong to this region anymore and won't receive
# any updates, so cancel any pending futures
self.cancel_futures(obj.LocalID)
del self.localid_lookup[obj.LocalID]
def _parent_object(self, obj: Object, insert_at_head=False):
"""Create any links to ancestor Objects for obj"""
if obj.ParentID:
parent = self.localid_lookup.get(obj.ParentID)
if parent is not None:
assert obj.LocalID not in parent.ChildIDs
# Link order is never explicitly passed to clients, so we have to do
# some nasty guesswork based on order of received initial ObjectUpdates
# Note that this is broken in the viewer as well, and there doesn't seem
# to be a foolproof way to get this.
idx = 0 if insert_at_head else len(parent.ChildIDs)
parent.ChildIDs.insert(idx, obj.LocalID)
parent.Children.insert(idx, obj)
obj.Parent = weakref.proxy(parent)
else:
# We have a parent, but we don't have an Object for it yet
self.missing_locals.add(obj.ParentID)
self._track_orphan(obj.LocalID, parent_id=obj.ParentID)
obj.Parent = None
LOG.debug(f"{obj.LocalID} updated with parent {obj.ParentID}, but parent wasn't found!")
def _unparent_object(self, obj: Object, old_parent_id: int):
"""Break any links to ancestor Objects for obj"""
obj.Parent = None
if old_parent_id:
# Had a parent, remove this from the child and orphan lists.
removed = self._untrack_orphan(obj, old_parent_id)
old_parent = self.localid_lookup.get(old_parent_id)
if old_parent:
if obj.LocalID in old_parent.ChildIDs:
idx = old_parent.ChildIDs.index(obj.LocalID)
del old_parent.ChildIDs[idx]
del old_parent.Children[idx]
else:
# Something is very broken if this happens
LOG.warning(f"Changing parent of {obj.LocalID}, but old parent didn't correctly adopt, "
f"was {'' if removed else 'not '}in orphan list")
else:
LOG.debug(f"Changing parent of {obj.LocalID}, but couldn't find old parent")
def handle_object_reparented(self, obj: Object, old_parent_id: int):
"""Recreate any links to ancestor Objects for obj due to parent changes"""
self._unparent_object(obj, old_parent_id)
self._parent_object(obj, insert_at_head=True)
def collect_orphans(self, parent_localid: int) -> Sequence[int]:
"""Take ownership of any orphan IDs belonging to parent_localid"""
return self._orphans.pop(parent_localid, [])
def _track_orphan(self, local_id: int, parent_id: int):
if len(self._orphans) > 100:
LOG.warning(f"Orphaned object dict is getting large: {len(self._orphans)}")
self._orphans[parent_id].append(local_id)
def _untrack_orphan(self, obj: Object, parent_id: int):
"""Remove obj from parent_id's list of orphans if present"""
if parent_id not in self._orphans:
return False
orphan_list = self._orphans[parent_id]
removed = False
if obj.LocalID in orphan_list:
orphan_list.remove(obj.LocalID)
removed = True
# List is empty now, get rid of it.
if not orphan_list:
del self._orphans[parent_id]
return removed
def register_future(self, local_id: int, future_type: UpdateType) -> asyncio.Future[Object]:
fut = asyncio.Future()
fut_key = (local_id, future_type)
local_futs = self._object_futures.get(fut_key, [])
local_futs.append(fut)
self._object_futures[fut_key] = local_futs
fut.add_done_callback(local_futs.remove)
return fut
def resolve_futures(self, obj: Object, update_type: UpdateType):
futures = self._object_futures.get((obj.LocalID, update_type), [])
for fut in futures[:]:
fut.set_result(obj)
def cancel_futures(self, local_id: int):
# Object went away, so we need to cancel any pending futures for it.
# Check every (local_id, update_type) key; stopping at the first match
# would leave futures registered under other update types pending.
for fut_key, futs in self._object_futures.items():
if fut_key[0] == local_id:
for fut in futs:
fut.cancel()
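The future registry above follows a common asyncio pattern: futures are keyed by `(local_id, update_type)`, and a done callback removes each future from its list whether it resolved or was cancelled. A self-contained sketch of the same idea (illustrative names only):

```python
import asyncio

class FutureRegistry:
    """Minimal sketch of a (local_id, update_type)-keyed future registry."""

    def __init__(self):
        self._futures = {}  # (local_id, update_type) -> [asyncio.Future, ...]

    def register(self, local_id, update_type):
        fut = asyncio.Future()
        futs = self._futures.setdefault((local_id, update_type), [])
        futs.append(fut)
        # Drop the future from its list once it resolves or is cancelled
        fut.add_done_callback(futs.remove)
        return fut

    def resolve(self, local_id, update_type, value):
        # Iterate over a copy: done callbacks mutate the list as futures finish
        for fut in self._futures.get((local_id, update_type), [])[:]:
            fut.set_result(value)

async def demo():
    registry = FutureRegistry()
    fut = registry.register(1234, "OBJECT_UPDATE")
    registry.resolve(1234, "OBJECT_UPDATE", "object-1234")
    return await fut

print(asyncio.run(demo()))  # -> object-1234
```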
class LocationType(enum.IntEnum):
NONE = enum.auto()
COARSE = enum.auto()
EXACT = enum.auto()
class Avatar:
"""Wrapper for an avatar known through ObjectUpdate or CoarseLocationUpdate"""
def __init__(
self,
full_id: UUID,
region_handle: int,
obj: Optional["Object"] = None,
coarse_location: Optional[Vector3] = None,
resolved_name: Optional[NameCacheEntry] = None,
):
self.FullID: UUID = full_id
self.Object: Optional["Object"] = obj
self.RegionHandle: int = region_handle
# TODO: Allow hooking into getZOffsets FS bridge response
# to fill in the Z axis if it's infinite
self.CoarseLocation = coarse_location
self.Valid = True
self.GuessedZ: Optional[float] = None
self._resolved_name = resolved_name
@property
def LocationType(self) -> "LocationType":
if self.Object and self.Object.AncestorsKnown:
return LocationType.EXACT
if self.CoarseLocation is not None:
return LocationType.COARSE
return LocationType.NONE
@property
def RegionPosition(self) -> Vector3:
if self.Object and self.Object.AncestorsKnown:
return self.Object.RegionPosition
if self.CoarseLocation is not None:
if self.CoarseLocation.Z == math.inf and self.GuessedZ is not None:
coarse = self.CoarseLocation
return Vector3(coarse.X, coarse.Y, self.GuessedZ)
return self.CoarseLocation
raise ValueError(f"Avatar {self.FullID} has no known position")
@property
def GlobalPosition(self) -> Vector3:
return self.RegionPosition + handle_to_global_pos(self.RegionHandle)
@property
def Name(self) -> Optional[str]:
if not self._resolved_name:
return None
return str(self._resolved_name)
@property
def PreferredName(self) -> Optional[str]:
if not self._resolved_name:
return None
return self._resolved_name.preferred_name
@property
def DisplayName(self) -> Optional[str]:
if not self._resolved_name:
return None
return self._resolved_name.display_name
@property
def LegacyName(self) -> Optional[str]:
if not self._resolved_name:
return None
return self._resolved_name.legacy_name
def __repr__(self):
loc_str = str(self.RegionPosition) if self.LocationType != LocationType.NONE else "?"
return f"<{self.__class__.__name__} {self.FullID} {self.Name!r} @ {loc_str}>"
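`GlobalPosition` above relies on `handle_to_global_pos`. In Second Life's protocol, a 64-bit region handle packs the region's global X origin into the high 32 bits and its global Y origin into the low 32 bits, both in meters on the 256m region grid. A hedged sketch of that decoding (the function name follows the source; this version returns a plain `(x, y)` tuple rather than the source's Vector3):

```python
def handle_to_global_pos(handle: int):
    """Unpack a 64-bit region handle into the region's global (x, y) origin.

    High 32 bits hold global X, low 32 bits hold global Y, in meters.
    """
    return (handle >> 32, handle & 0xFFFFFFFF)

# A region whose southwest corner sits at global (256, 512):
handle = (256 << 32) | 512
print(handle_to_global_pos(handle))  # -> (256, 512)
```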

View File

@@ -0,0 +1,38 @@
"""
Base classes for common session-related state shared between clients and proxies
"""
from __future__ import annotations
import abc
from typing import *
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.circuit import ConnectionHolder
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.message.message_handler import MessageHandler
from hippolyzer.lib.base.network.caps_client import CapsClient
from hippolyzer.lib.base.network.transport import ADDR_TUPLE
if TYPE_CHECKING:
from hippolyzer.lib.client.object_manager import ClientObjectManager, ClientWorldObjectManager
class BaseClientRegion(ConnectionHolder, abc.ABC):
"""Represents a client's view of a remote region"""
handle: Optional[int]
# Actually a weakref
session: Callable[[], BaseClientSession]
objects: ClientObjectManager
caps_client: CapsClient
class BaseClientSession(abc.ABC):
"""Represents a client's view of a remote session"""
id: UUID
agent_id: UUID
secure_session_id: UUID
message_handler: MessageHandler[Message, str]
regions: Sequence[BaseClientRegion]
region_by_handle: Callable[[int], Optional[BaseClientRegion]]
region_by_circuit_addr: Callable[[ADDR_TUPLE], Optional[BaseClientRegion]]
objects: ClientWorldObjectManager

View File

@@ -8,17 +8,16 @@ import warnings
from typing import *
from hippolyzer.lib.base.datatypes import UUID, Vector3
from hippolyzer.lib.base.message.message import Block
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.objects import Object
from hippolyzer.lib.proxy import addon_ctx
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.packets import Direction, ProxiedUDPPacket
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.base.network.transport import UDPPacket, Direction
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import SessionManager, Session
from hippolyzer.lib.proxy.task_scheduler import TaskLifeScope
from hippolyzer.lib.proxy.templates import ChatSourceType, ChatType
from hippolyzer.lib.base.templates import ChatSourceType, ChatType
class AssetAliasTracker:
@@ -40,8 +39,13 @@ class AssetAliasTracker:
def get_orig_uuid(self, val: UUID) -> Optional[UUID]:
return self.rev_mapping.get(val)
def get_alias_uuid(self, val: UUID):
alias_id = self.alias_mapping.setdefault(val, UUID.random())
def get_alias_uuid(self, val: UUID, create: bool = True) -> Optional[UUID]:
if create:
alias_id = self.alias_mapping.setdefault(val, UUID.random())
else:
alias_id = self.alias_mapping.get(val)
if alias_id is None:
return None
self.rev_mapping.setdefault(alias_id, val)
return alias_id
@@ -53,7 +57,7 @@ def show_message(text, session=None) -> None:
# `or None` so we don't use a dead weakref Proxy, which is False-y
session = session or addon_ctx.session.get(None) or None
message = ProxiedMessage(
message = Message(
"ChatFromSimulator",
Block(
"ChatData",
@@ -79,7 +83,7 @@ def send_chat(message: Union[bytes, str], channel=0, chat_type=ChatType.NORMAL,
session = session or addon_ctx.session.get(None) or None
if not session:
raise RuntimeError("Tried to send chat without session")
session.main_region.circuit.send_message(ProxiedMessage(
session.main_region.circuit.send_message(Message(
"ChatFromViewer",
Block(
"AgentData",
@@ -155,7 +159,7 @@ class BaseAddon(abc.ABC):
def handle_unload(self, session_manager: SessionManager):
pass
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: ProxiedMessage):
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
pass
def handle_http_request(self, session_manager: SessionManager, flow: HippoHTTPFlow):
@@ -177,13 +181,15 @@ class BaseAddon(abc.ABC):
def handle_region_changed(self, session: Session, region: ProxiedRegion):
pass
def handle_circuit_created(self, session: Session, region: ProxiedRegion):
pass
def handle_rlv_command(self, session: Session, region: ProxiedRegion, source: UUID,
cmd: str, options: List[str], param: str):
pass
def handle_proxied_packet(self, session_manager: SessionManager, packet: ProxiedUDPPacket,
session: Optional[Session], region: Optional[ProxiedRegion],
message: Optional[ProxiedMessage]):
def handle_proxied_packet(self, session_manager: SessionManager, packet: UDPPacket,
session: Optional[Session], region: Optional[ProxiedRegion]):
pass

View File

@@ -16,15 +16,15 @@ from types import ModuleType
from typing import *
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.network.transport import UDPPacket
from hippolyzer.lib.proxy import addon_ctx
from hippolyzer.lib.proxy.task_scheduler import TaskLifeScope, TaskScheduler
if TYPE_CHECKING:
from hippolyzer.lib.proxy.commands import CommandDetails, WrappedCommandCallable
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.proxy.objects import Object
from hippolyzer.lib.proxy.packets import ProxiedUDPPacket
from hippolyzer.lib.proxy.object_manager import Object
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session, SessionManager
@@ -52,7 +52,12 @@ class BaseInteractionManager:
pass
@abc.abstractmethod
async def save_file(self, caption: str = '', directory: str = '', filter_str: str = '') -> Optional[str]:
async def save_file(self, caption: str = '', directory: str = '', filter_str: str = '',
default_suffix: str = '') -> Optional[str]:
pass
@abc.abstractmethod
async def confirm(self, title: str, caption: str) -> bool:
pass
def main_window_handle(self) -> Any:
@@ -97,9 +102,14 @@ class AddonManager:
@classmethod
def shutdown(cls):
to_pop = []
for mod in cls.FRESH_ADDON_MODULES.values():
to_pop.append(mod)
cls._call_module_hooks(mod, "handle_unload", cls.SESSION_MANAGER)
cls.SCHEDULER.shutdown()
for mod in to_pop:
if isinstance(mod, ModuleType):
sys.modules.pop(mod.__name__, None)
@classmethod
def have_active_repl(cls):
@@ -169,6 +179,7 @@ class AddonManager:
old_mod = cls.FRESH_ADDON_MODULES.pop(specs[0].name, None)
if old_mod:
cls._unload_module(old_mod)
sys.modules.pop(old_mod.__name__, None)
if reload:
cls._reload_addons()
@@ -385,7 +396,7 @@ class AddonManager:
LOG.error(text)
@classmethod
def handle_lludp_message(cls, session: Session, region: ProxiedRegion, message: ProxiedMessage):
def handle_lludp_message(cls, session: Session, region: ProxiedRegion, message: Message):
cls._reload_addons()
if message.name == "ChatFromViewer" and "ChatData" in message:
if message["ChatData"]["Channel"] == cls.COMMAND_CHANNEL:
@@ -517,8 +528,13 @@ class AddonManager:
return cls._call_all_addon_hooks("handle_region_changed", session, region)
@classmethod
def handle_proxied_packet(cls, session_manager: SessionManager, packet: ProxiedUDPPacket,
session: Optional[Session], region: Optional[ProxiedRegion],
message: Optional[ProxiedMessage]):
return cls._call_all_addon_hooks("handle_proxied_packet", session_manager,
packet, session, region, message)
def handle_circuit_created(cls, session: Session, region: ProxiedRegion):
with addon_ctx.push(session, region):
return cls._call_all_addon_hooks("handle_circuit_created", session, region)
@classmethod
def handle_proxied_packet(cls, session_manager: SessionManager, packet: UDPPacket,
session: Optional[Session], region: Optional[ProxiedRegion]):
with addon_ctx.push(session, region):
return cls._call_all_addon_hooks("handle_proxied_packet", session_manager,
packet, session, region)

View File

@@ -1,6 +1,7 @@
from pathlib import Path
import shutil
import sys
from hippolyzer.lib.proxy.viewer_settings import iter_viewer_config_dirs, has_settings_file
class InvalidConfigDir(Exception):
@@ -11,39 +12,22 @@ def setup_ca(config_path, mitmproxy_master):
p = Path(config_path)
if not p.exists():
raise InvalidConfigDir("Config path does not exist!")
settings_path = p / "user_settings"
if not (settings_path / "settings.xml").exists():
if not has_settings_file(p):
raise InvalidConfigDir("Path is not a second life config dir!")
mitmproxy_conf_dir = Path(mitmproxy_master.options.confdir)
mitmproxy_ca_path = (mitmproxy_conf_dir.expanduser() / "mitmproxy-ca-cert.pem")
shutil.copy(mitmproxy_ca_path, settings_path / "CA.pem")
shutil.copy(mitmproxy_ca_path, p / "user_settings" / "CA.pem")
def setup_ca_everywhere(mitmproxy_master):
valid_paths = set()
paths = _viewer_config_dir_iter()
paths = iter_viewer_config_dirs()
for path in paths:
try:
setup_ca(path, mitmproxy_master)
valid_paths.add(path)
except InvalidConfigDir:
pass
except PermissionError:
pass
return valid_paths
def _viewer_config_dir_iter():
if sys.platform.startswith("linux"):
paths = (x for x in Path.home().iterdir() if x.name.startswith("."))
elif sys.platform == "darwin":
paths = (Path.home() / "Library" / "Application Support").iterdir()
elif sys.platform in ("win32", "msys", "cygwin"):
paths = (Path.home() / "AppData" / "Local").iterdir()
else:
raise Exception("Unknown OS, can't locate viewer config dirs!")
return (path for path in paths if path.is_dir())

View File

@@ -0,0 +1,89 @@
from __future__ import annotations
import enum
import typing
from weakref import ref
from typing import *
if TYPE_CHECKING:
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session, SessionManager
def is_asset_server_cap_name(cap_name):
return cap_name and (
cap_name.startswith("GetMesh")
or cap_name.startswith("GetTexture")
or cap_name.startswith("ViewerAsset")
)
class CapType(enum.Enum):
NORMAL = enum.auto()
TEMPORARY = enum.auto()
WRAPPER = enum.auto()
PROXY_ONLY = enum.auto()
class SerializedCapData(typing.NamedTuple):
cap_name: typing.Optional[str] = None
region_addr: typing.Optional[str] = None
session_id: typing.Optional[str] = None
base_url: typing.Optional[str] = None
type: str = "NORMAL"
def __bool__(self):
return bool(self.cap_name or self.session_id)
@property
def asset_server_cap(self):
return is_asset_server_cap_name(self.cap_name)
class CapData(NamedTuple):
cap_name: Optional[str] = None
# Actually they're weakrefs but the type sigs suck.
region: Optional[Callable[[], Optional[ProxiedRegion]]] = None
session: Optional[Callable[[], Optional[Session]]] = None
base_url: Optional[str] = None
type: CapType = CapType.NORMAL
def __bool__(self):
return bool(self.cap_name or self.session)
def serialize(self) -> "SerializedCapData":
return SerializedCapData(
cap_name=self.cap_name,
region_addr=str(self.region().circuit_addr) if self.region and self.region() else None,
session_id=str(self.session().id) if self.session and self.session() else None,
base_url=self.base_url,
type=self.type.name,
)
@classmethod
def deserialize(
cls,
ser_cap_data: "SerializedCapData",
session_mgr: Optional[SessionManager],
) -> "CapData":
cap_session = None
cap_region = None
if session_mgr and ser_cap_data.session_id:
for session in session_mgr.sessions:
if ser_cap_data.session_id == str(session.id):
cap_session = session
if cap_session and ser_cap_data.region_addr:
for region in cap_session.regions:
if ser_cap_data.region_addr == str(region.circuit_addr):
cap_region = region
return cls(
cap_name=ser_cap_data.cap_name,
region=ref(cap_region) if cap_region else None,
session=ref(cap_session) if cap_session else None,
base_url=ser_cap_data.base_url,
type=CapType[ser_cap_data.type],
)
@property
def asset_server_cap(self) -> bool:
return is_asset_server_cap_name(self.cap_name)
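`CapData.serialize()`/`deserialize()` above work around the fact that weakrefs can't be serialized directly: the live references are flattened to string IDs, then re-resolved against the session manager on load. The round-trip pattern, sketched with stdlib-only stand-ins (all names here are illustrative):

```python
import weakref

class Region:
    def __init__(self, addr):
        self.circuit_addr = addr

class CapEntry:
    """Holds a weakref so a cap never keeps its region alive."""
    def __init__(self, name, region):
        self.name = name
        self.region = weakref.ref(region)

    def serialize(self):
        region = self.region()
        return {"name": self.name,
                "region_addr": str(region.circuit_addr) if region else None}

    @classmethod
    def deserialize(cls, data, regions):
        # Re-resolve the weakref by matching the serialized address
        for region in regions:
            if str(region.circuit_addr) == data["region_addr"]:
                return cls(data["name"], region)
        return None

region = Region(("127.0.0.1", 13000))
entry = CapEntry("GetTexture", region)
restored = CapEntry.deserialize(entry.serialize(), [region])
print(restored.region() is region)  # -> True
```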

View File

@@ -1,150 +1,35 @@
from __future__ import annotations
import asyncio
import copy
import dataclasses
import os
import re
import sys
from types import TracebackType
from typing import *
import aiohttp
from hippolyzer.lib.base import llsd as llsd_lib
from hippolyzer.lib.base.helpers import proxify
from hippolyzer.lib.base.network.caps_client import CapsClient, CAPS_DICT
from hippolyzer.lib.proxy.settings import ProxySettings
if TYPE_CHECKING:
from hippolyzer.lib.proxy.region import ProxiedRegion
class CapsClientResponse(aiohttp.ClientResponse):
"""
Not actually instantiated, used for lying to the type system
since we'll dynamically put this onto a ClientResponse instance.
Will fail isinstance().
"""
async def read_llsd(self) -> Any:
raise NotImplementedError()
class ProxyCapsClient(CapsClient):
def __init__(self, settings: ProxySettings, region: Optional[ProxiedRegion] = None):
super().__init__(None)
self._region = region
self._settings = settings
def _get_caps(self) -> Optional[CAPS_DICT]:
if not self._region:
return None
return self._region.caps
class _HippoSessionRequestContextManager:
"""
_SessionRequestContextManager, but with a symmetrical API.
aiohttp.request() and aiohttp.ClientSession.request() have different APIs.
One is sync returning a context manager, one is async returning a coro.
aiohttp.request() also doesn't accept the arguments that we need for custom
SSL contexts. To deal with requests that have existing sessions and those without,
just give them both the same wrapper and don't close the session on context manager
exit if it wasn't our session.
"""
__slots__ = ("_coro", "_resp", "_session", "_session_owned")
def __init__(
self,
coro: Coroutine[asyncio.Future[Any], None, aiohttp.ClientResponse],
session: aiohttp.ClientSession,
session_owned: bool = True,
) -> None:
self._coro = coro
self._resp: Optional[aiohttp.ClientResponse] = None
self._session = session
self._session_owned = session_owned
async def __aenter__(self) -> CapsClientResponse:
try:
self._resp = await self._coro
# We don't control creation of the ClientResponse, so tack on
# a convenience method for reading LLSD.
async def _read_llsd():
return llsd_lib.parse_xml(await self._resp.read())
self._resp.read_llsd = _read_llsd
except BaseException:
if self._session_owned:
await self._session.close()
raise
else:
# intentionally fooling the type system
return self._resp # type: ignore
async def __aexit__(
self,
exc_type: Optional[Type[BaseException]],
exc: Optional[BaseException],
tb: Optional[TracebackType],
) -> None:
assert self._resp is not None
self._resp.close()
if self._session_owned:
await self._session.close()
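The wrapper class above smooths over aiohttp's asymmetry (sync `aiohttp.request()` returning a context manager vs. `ClientSession.request()` returning a coroutine). The underlying trick, an async context manager that awaits a coroutine on entry and conditionally closes an owned session on exit, can be sketched with stdlib-only stand-ins (all names here are illustrative):

```python
import asyncio

class OwnedSessionCM:
    """Await a response coroutine on entry; close the session on exit
    only if this wrapper owns it."""
    def __init__(self, coro, session, session_owned=True):
        self._coro = coro
        self._session = session
        self._session_owned = session_owned

    async def __aenter__(self):
        try:
            return await self._coro
        except BaseException:
            # Don't leak an owned session if the request itself fails
            if self._session_owned:
                await self._session.close()
            raise

    async def __aexit__(self, exc_type, exc, tb):
        if self._session_owned:
            await self._session.close()

class FakeSession:
    def __init__(self):
        self.closed = False
    async def close(self):
        self.closed = True

async def demo():
    session = FakeSession()
    async def fake_request():
        return "response"
    async with OwnedSessionCM(fake_request(), session) as resp:
        assert resp == "response"
    return session.closed

print(asyncio.run(demo()))  # -> True
```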
class CapsClient:
def __init__(self, region: Optional[ProxiedRegion] = None):
self._region: Optional[ProxiedRegion] = proxify(region)
def request(self, method: str, cap_or_url: str, *, path: str = "", data: Any = None,
headers: Optional[dict] = None, session: Optional[aiohttp.ClientSession] = None,
llsd: Any = dataclasses.MISSING, params: Optional[Dict[str, Any]] = None,
proxy: Optional[str] = None, skip_auto_headers: Optional[Sequence[str]] = None,
**kwargs) -> _HippoSessionRequestContextManager:
if cap_or_url.startswith("http"):
if path:
raise ValueError("Specifying both path and a full URL not supported")
else:
if self._region is None:
raise RuntimeError(f"Need a region to request a Cap like {cap_or_url}")
if cap_or_url not in self._region.caps:
raise KeyError(f"{cap_or_url} is not a full URL and not a Cap")
cap_or_url = self._region.caps[cap_or_url]
if path:
cap_or_url += path
if params is not None:
for pname, pval in params.items():
if not isinstance(pval, str):
params[pname] = str(pval)
session_owned = False
# Use an existing session if we have one to take advantage of connection pooling
# otherwise create one
if session is None:
session_owned = True
session = aiohttp.ClientSession(
connector=aiohttp.TCPConnector(force_close=True),
connector_owner=True
)
if headers is None:
headers = {}
else:
headers = copy.copy(headers)
# Use sentinel val so explicit `None` can be passed
if llsd is not dataclasses.MISSING:
data = llsd_lib.format_xml(llsd)
# Sometimes needed even on GETs.
if "Content-Type" not in headers:
headers["Content-Type"] = "application/llsd+xml"
# Always present, usually ignored by the server.
if "Accept" not in headers:
headers["Accept"] = "application/llsd+xml"
# Ask to keep the connection open if we're sharing a session
if not session_owned:
headers["Connection"] = "keep-alive"
headers["Keep-alive"] = "300"
# We go through the proxy by default, tack on a header letting mitmproxy know the
# request came from us so we can tag the request as injected. The header will be popped
# off before passing through to the server.
ssl = kwargs.pop('ssl', None)
def _request_fixups(self, cap_or_url: str, headers: Dict, proxy: Optional[bool], ssl: Any):
# We want to proxy this through Hippolyzer
if proxy is None:
# Always set this so we know this request was from the proxy
# We go through the proxy by default, tack on a header letting mitmproxy know the
# request came from us so we can tag the request as injected. The header will be popped
# off before passing through to the server.
headers["X-Hippo-Injected"] = "1"
# TODO: Have a setting for this
proxy_port = int(os.environ.get("HIPPO_HTTP_PORT", 9062))
proxy_port = self._settings.HTTP_PROXY_PORT
proxy = f"http://127.0.0.1:{proxy_port}"
# TODO: set up the SSLContext to validate mitmproxy's cert
ssl = ssl or False
@@ -160,28 +45,4 @@ class CapsClient:
if sys.platform == "win32" and cap_or_url.startswith("https:"):
headers["X-Hippo-Windows-SSL-Hack"] = "1"
cap_or_url = re.sub(r"^https:", "http:", cap_or_url)
resp = session._request(method, cap_or_url, data=data, headers=headers, # noqa: need internal call
params=params, ssl=ssl, proxy=proxy,
skip_auto_headers=skip_auto_headers or ("User-Agent",), **kwargs)
return _HippoSessionRequestContextManager(resp, session, session_owned=session_owned)
def get(self, cap_or_url: str, *, path: str = "", headers: Optional[dict] = None,
session: Optional[aiohttp.ClientSession] = None, params: Optional[Dict[str, Any]] = None,
proxy: Optional[str] = None, **kwargs) -> _HippoSessionRequestContextManager:
return self.request("GET", cap_or_url=cap_or_url, path=path, headers=headers,
session=session, params=params, proxy=proxy, **kwargs)
def post(self, cap_or_url: str, *, path: str = "", data: Any = None,
headers: Optional[dict] = None, session: Optional[aiohttp.ClientSession] = None,
llsd: Any = dataclasses.MISSING, params: Optional[Dict[str, Any]] = None,
proxy: Optional[str] = None, **kwargs) -> _HippoSessionRequestContextManager:
return self.request("POST", cap_or_url=cap_or_url, path=path, headers=headers, data=data,
llsd=llsd, session=session, params=params, proxy=proxy, **kwargs)
def put(self, cap_or_url: str, *, path: str = "", data: Any = None,
headers: Optional[dict] = None, session: Optional[aiohttp.ClientSession] = None,
llsd: Any = dataclasses.MISSING, params: Optional[Dict[str, Any]] = None,
proxy: Optional[str] = None, **kwargs) -> _HippoSessionRequestContextManager:
return self.request("PUT", cap_or_url=cap_or_url, path=path, headers=headers, data=data,
llsd=llsd, session=session, params=params, proxy=proxy, **kwargs)
return cap_or_url, headers, proxy, ssl
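The `llsd` keyword in the request methods above defaults to `dataclasses.MISSING` rather than `None`, so that an explicit `None` can still be passed and serialized as an LLSD body. A minimal sketch of that sentinel pattern (the `serialize_body` helper is hypothetical, for illustration only):

```python
import dataclasses

def serialize_body(llsd=dataclasses.MISSING, data=None):
    # MISSING is a sentinel meaning "llsd was not passed at all";
    # None is itself a legal LLSD value, so it can't be the default.
    if llsd is not dataclasses.MISSING:
        return f"<llsd>{llsd!r}</llsd>"
    return data

# Explicit None still produces an LLSD body...
assert serialize_body(llsd=None) == "<llsd>None</llsd>"
# ...while omitting llsd entirely falls back to the raw data payload.
assert serialize_body(data=b"raw") == b"raw"
```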

View File

@@ -1,69 +1,45 @@
from __future__ import annotations
import asyncio
import datetime as dt
import logging
from collections import deque
from typing import *
from hippolyzer.lib.base.message.message import Block
from hippolyzer.lib.base.message.circuit import Circuit
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.message.msgtypes import PacketFlags
from hippolyzer.lib.base.message.udpserializer import UDPMessageSerializer
from hippolyzer.lib.proxy.packets import Direction, ProxiedUDPPacket
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.base.network.transport import Direction
if TYPE_CHECKING:
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.message_logger import BaseMessageLogger
LLUDP_LOGGING_HOOK = Optional[Callable[[Message], Any]]
class ProxiedCircuit:
def __init__(self, near_host, far_host, transport, region: Optional[ProxiedRegion] = None,
socks_transport: Optional[bool] = None):
self.near_host = near_host
self.host = far_host
self.is_alive = True
self.socks_transport = socks_transport
self.transport: Optional[asyncio.DatagramTransport] = transport
class ProxiedCircuit(Circuit):
def __init__(self, near_host, far_host, transport, logging_hook: LLUDP_LOGGING_HOOK = None):
super().__init__(near_host, far_host, transport)
self.in_injections = InjectionTracker(0)
self.out_injections = InjectionTracker(0)
self.serializer = UDPMessageSerializer()
self.last_packet_at = dt.datetime.now()
self.region: Optional[ProxiedRegion] = region
message_logger = None
if region:
message_logger = region.session().session_manager.message_logger
self.message_logger: Optional[BaseMessageLogger] = message_logger
self.logging_hook: LLUDP_LOGGING_HOOK = logging_hook
def _send_prepared_message(self, message: ProxiedMessage, direction, transport=None):
def _send_prepared_message(self, message: Message, transport=None):
try:
serialized = self.serializer.serialize(message)
except:
logging.exception(f"Failed to serialize: {message.to_dict()!r}")
raise
if self.message_logger and message.injected:
self.message_logger.log_lludp_message(self.region.session(), self.region, message)
return self.send_datagram(serialized, direction, transport=transport)
def send_datagram(self, data: bytes, direction: Direction, transport=None):
self.last_packet_at = dt.datetime.now()
src_addr, dst_addr = self.host, self.near_host
if direction == Direction.OUT:
src_addr, dst_addr = self.near_host, self.host
packet = ProxiedUDPPacket(src_addr, dst_addr, data, direction)
packet_data = packet.serialize(socks_header=self.socks_transport)
(transport or self.transport).sendto(packet_data, dst_addr)
return packet
if self.logging_hook and message.injected:
self.logging_hook(message)
return self.send_datagram(serialized, message.direction, transport=transport)
def _get_injections(self, direction: Direction):
if direction == Direction.OUT:
return self.out_injections, self.in_injections
return self.in_injections, self.out_injections
def prepare_message(self, message: ProxiedMessage, direction=None):
def prepare_message(self, message: Message, direction=None):
if message.finalized:
raise RuntimeError(f"Trying to re-send finalized {message!r}")
if message.queued:
# This is due to be dropped, nothing should be sending the original
raise RuntimeError(f"Trying to send original of queued {message!r}")
direction = direction or getattr(message, 'direction')
fwd_injections, reverse_injections = self._get_injections(direction)
@@ -102,12 +78,7 @@ class ProxiedCircuit:
message.send_flags &= ~PacketFlags.ACK
return True
def send_message(self, message: ProxiedMessage, direction=None, transport=None):
direction = direction or getattr(message, 'direction')
if self.prepare_message(message, direction):
return self._send_prepared_message(message, direction, transport)
def _rewrite_packet_ack(self, message: ProxiedMessage, reverse_injections):
def _rewrite_packet_ack(self, message: Message, reverse_injections):
new_blocks = []
for block in message["Packets"]:
packet_id = block["ID"]
@@ -124,14 +95,14 @@ class ProxiedCircuit:
message["Packets"] = new_blocks
return True
def _rewrite_start_ping_check(self, message: ProxiedMessage, fwd_injections):
def _rewrite_start_ping_check(self, message: Message, fwd_injections):
orig_id = message["PingID"]["OldestUnacked"]
new_id = fwd_injections.get_effective_id(orig_id)
if orig_id != new_id:
logging.debug("Rewrote oldest unacked %s -> %s" % (orig_id, new_id))
message["PingID"]["OldestUnacked"] = new_id
def drop_message(self, message: ProxiedMessage, orig_direction=None):
def drop_message(self, message: Message, orig_direction=None):
if message.finalized:
raise RuntimeError(f"Trying to drop finalized {message!r}")
if message.packet_id is None:
@@ -140,13 +111,12 @@ class ProxiedCircuit:
fwd_injections, reverse_injections = self._get_injections(orig_direction)
fwd_injections.mark_dropped(message.packet_id)
if hasattr(message, 'dropped'):
message.dropped = True
message.dropped = True
message.finalized = True
# Was sent reliably, tell the other end that we saw it and to shut up.
if message.reliable:
self._send_acks([message.packet_id], ~orig_direction)
self.send_acks([message.packet_id], ~orig_direction)
# This packet had acks for the other end, send them in a separate PacketAck
effective_acks = tuple(
@@ -154,20 +124,7 @@ class ProxiedCircuit:
if not reverse_injections.was_injected(x)
)
if effective_acks:
self._send_acks(effective_acks, orig_direction, packet_id=message.packet_id)
def _send_acks(self, to_ack, direction, packet_id=None):
logging.debug("%r acking %r" % (direction, to_ack))
# TODO: maybe tack this onto `.acks` for next message?
packet = ProxiedMessage('PacketAck',
*[Block('Packets', ID=x) for x in to_ack])
packet.packet_id = packet_id
packet.injected = True
packet.direction = direction
self.send_message(packet)
def __repr__(self):
return "<%s %r : %r>" % (self.__class__.__name__, self.near_host, self.host)
self.send_acks(effective_acks, orig_direction, packet_id=message.packet_id)
class InjectionTracker:
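The `InjectionTracker` referenced throughout the circuit code rewrites packet IDs so that proxy-injected packets don't desync the sequence numbers seen by either end. A hypothetical sketch of the core idea (not the project's actual implementation):

```python
import bisect

class InjectionTracker:
    """Hypothetical sketch: track the effective IDs of packets we injected
    so the IDs of original packets can be rewritten to the IDs the other
    end actually saw."""

    def __init__(self):
        self._injected = []  # sorted effective IDs of injected packets

    def mark_injected(self, effective_id: int) -> None:
        bisect.insort(self._injected, effective_id)

    def get_effective_id(self, orig_id: int) -> int:
        # Every injection at or below the (already shifted) original ID
        # pushes the original packet's ID up by one.
        shift = 0
        for eff in self._injected:
            if eff <= orig_id + shift:
                shift += 1
            else:
                break
        return orig_id + shift

tracker = InjectionTracker()
tracker.mark_injected(5)                   # we injected a packet that took ID 5
assert tracker.get_effective_id(4) == 4    # earlier packets are unaffected
assert tracker.get_effective_id(5) == 6    # later packets shift past the injection
```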

View File

@@ -26,6 +26,10 @@ class CommandDetails(NamedTuple):
lifetime: Optional[TaskLifeScope] = None
def parse_bool(val: str) -> bool:
return val.lower() in ('on', 'true', '1', '1.0', 'yes')
def handle_command(command_name: Optional[str] = None, /, *, lifetime: Optional[TaskLifeScope] = None,
single_instance: bool = False, **params: Union[Parameter, callable]):
"""
@@ -61,13 +65,13 @@ def handle_command(command_name: Optional[str] = None, /, *, lifetime: Optional[
# Greedy, takes the rest of the message
if param.sep is None:
param_val = message
message = None
message = ""
else:
message = message.lstrip(param.sep)
if not message:
if param.optional:
break
raise KeyError(f"Missing parameter {param_name}")
if not param.optional:
raise KeyError(f"Missing parameter {param_name}")
continue
param_val, _, message = message.partition(param.sep) # type: ignore
param_vals[param_name] = param.parser(param_val)
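The parameter-parsing loop above distinguishes greedy parameters (`sep is None`, consuming the rest of the message) from separated ones, and after this change skips missing optional parameters instead of aborting. A standalone sketch of that loop (the `parse_params` helper and its tuple spec are illustrative, not the real `handle_command` API):

```python
def parse_params(message, spec):
    """Sketch: spec is a list of (name, separator_or_None, optional)
    tuples; a None separator means the parameter greedily takes the
    rest of the message."""
    vals = {}
    for name, sep, optional in spec:
        if sep is None:       # greedy: consume everything remaining
            vals[name] = message
            message = ""
            continue
        message = message.lstrip(sep)
        if not message:
            if not optional:
                raise KeyError(f"Missing parameter {name}")
            continue          # optional param absent: keep going
        val, _, message = message.partition(sep)
        vals[name] = val
    return vals

spec = [("enabled", " ", False), ("count", " ", True)]
assert parse_params("on 5", spec) == {"enabled": "on", "count": "5"}
# The optional trailing parameter may simply be omitted.
assert parse_params("on", spec) == {"enabled": "on"}
```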

View File

@@ -48,13 +48,17 @@ class HTTPAssetRepo(collections.UserDict):
asset_id = None
for name, val in flow.request.query.items():
if name.endswith("_id"):
asset_id = UUID(val)
try:
asset_id = UUID(val)
break
except ValueError:
pass
if not asset_id or asset_id not in self.data:
return False
asset = self[asset_id]
flow.response = http.HTTPResponse.make(
flow.response = http.Response.make(
content=asset.data,
headers={
"Content-Type": "application/octet-stream",

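The fix above wraps `UUID(val)` in a try/except so a malformed `*_id` query parameter no longer aborts the scan; the loop keeps looking for the first value that actually parses. A minimal sketch (the `find_asset_id` helper is hypothetical):

```python
from uuid import UUID

def find_asset_id(query_items):
    """Return the first *_id query parameter that parses as a UUID,
    or None if no parameter does."""
    for name, val in query_items:
        if name.endswith("_id"):
            try:
                return UUID(val)
            except ValueError:
                continue  # malformed value: keep scanning
    return None

items = [("texture_id", "not-a-uuid"),
         ("asset_id", "12345678-1234-5678-1234-567812345678")]
assert find_asset_id(items) == UUID("12345678-1234-5678-1234-567812345678")
assert find_asset_id([("foo", "bar")]) is None
```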
View File

@@ -9,22 +9,24 @@ import urllib.parse
import weakref
import xmlrpc.client
import defusedxml.cElementTree
import defusedxml.ElementTree
import defusedxml.xmlrpc
import mitmproxy.http
from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.llsd_msg_serializer import LLSDMessageSerializer
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.region import ProxiedRegion, CapType
from hippolyzer.lib.proxy.sessions import SessionManager, CapData, Session
from hippolyzer.lib.proxy.caps import CapData, CapType
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import SessionManager, Session
from hippolyzer.lib.proxy.http_proxy import HTTPFlowContext
def apply_security_monkeypatches():
defusedxml.xmlrpc.monkey_patch()
llsd.fromstring = defusedxml.cElementTree.fromstring
llsd.fromstring = defusedxml.ElementTree.fromstring
apply_security_monkeypatches()
@@ -51,36 +53,39 @@ class MITMProxyEventManager:
self.llsd_message_serializer = LLSDMessageSerializer()
self._asset_server_proxied = False
async def pump_proxy_events(self):
async def run(self):
while not self.shutdown_signal.is_set():
try:
try:
event_type, flow_state = self.from_proxy_queue.get(False)
except queue.Empty:
await asyncio.sleep(0.001)
continue
flow = HippoHTTPFlow.from_state(flow_state, self.session_manager)
try:
if event_type == "request":
self._handle_request(flow)
# A response was injected early in the cycle, we won't get a response
# callback from mitmproxy so just log it now.
message_logger = self.session_manager.message_logger
if message_logger and flow.response_injected:
message_logger.log_http_response(flow)
elif event_type == "response":
self._handle_response(flow)
else:
raise Exception(f"Unknown mitmproxy event type {event_type}")
finally:
# If someone has taken this request out of the regular callback flow,
# they'll manually send a callback at some later time.
if not flow.taken:
self.to_proxy_queue.put(("callback", flow.id, flow.get_state()))
await self.pump_proxy_event()
except:
logging.exception("Exploded when handling parsed packets")
async def pump_proxy_event(self):
try:
event_type, flow_state = self.from_proxy_queue.get(False)
except queue.Empty:
await asyncio.sleep(0.001)
return
flow = HippoHTTPFlow.from_state(flow_state, self.session_manager)
try:
if event_type == "request":
self._handle_request(flow)
# A response was injected early in the cycle, we won't get a response
# callback from mitmproxy so just log it now.
message_logger = self.session_manager.message_logger
if message_logger and flow.response_injected:
message_logger.log_http_response(flow)
elif event_type == "response":
self._handle_response(flow)
else:
raise Exception(f"Unknown mitmproxy event type {event_type}")
finally:
# If someone has taken this request out of the regular callback flow,
# they'll manually send a callback at some later time.
if not flow.taken:
self.to_proxy_queue.put(("callback", flow.id, flow.get_state()))
def _handle_request(self, flow: HippoHTTPFlow):
url = flow.request.url
cap_data = self.session_manager.resolve_cap(url)
@@ -116,13 +121,16 @@ class MITMProxyEventManager:
if not flow.can_stream or self._asset_server_proxied:
flow.request.url = redir_url
else:
flow.response = mitmproxy.http.HTTPResponse.make(
flow.response = mitmproxy.http.Response.make(
307,
b"Redirecting...",
# Can't provide explanation in the body because this results in failing Range requests under
# mitmproxy that return garbage data. Chances are there's weird interactions
# between HTTP/1.x pipelining and range requests under mitmproxy that no other
# applications have hit. If that's a concern then Connection: close should be used.
b"",
{
"Content-Type": "text/plain",
"Connection": "keep-alive",
"Location": redir_url,
"Connection": "close",
}
)
elif cap_data and cap_data.asset_server_cap:
@@ -130,6 +138,27 @@ class MITMProxyEventManager:
# the proxy
self._asset_server_proxied = True
logging.warning("noproxy not used, switching to URI rewrite strategy")
elif cap_data and cap_data.cap_name == "EventQueueGet":
# HACK: The sim's EQ acking mechanism doesn't seem to actually work.
# if the client drops the connection due to timeout before we can
# proxy back the response then it will be lost forever. Keep around
# the last EQ response we got so we can re-send it if the client repeats
# its previous request.
req_ack_id = llsd.parse_xml(flow.request.content)["ack"]
eq_manager = cap_data.region().eq_manager
cached_resp = eq_manager.get_cached_poll_response(req_ack_id)
if cached_resp:
logging.warning("Had to serve a cached EventQueueGet due to client desync")
flow.response = mitmproxy.http.Response.make(
200,
llsd.format_xml(cached_resp),
{
"Content-Type": "application/llsd+xml",
# So we can differentiate these in the log
"X-Hippo-Fake-EQ": "1",
"Connection": "close",
},
)
elif not cap_data:
if self._is_login_request(flow):
# Not strictly a Cap, but makes it easier to filter on.
@@ -138,7 +167,7 @@ class MITMProxyEventManager:
if cap_data and cap_data.type == CapType.PROXY_ONLY:
# A proxy addon was supposed to respond itself, but it didn't.
if not flow.taken and not flow.response_injected:
flow.response = mitmproxy.http.HTTPResponse.make(
flow.response = mitmproxy.http.Response.make(
500,
b"Proxy didn't handle proxy-only Cap correctly",
{
@@ -175,64 +204,92 @@ class MITMProxyEventManager:
if flow.request_injected:
return
if AddonManager.handle_http_response(flow):
return
status = flow.response.status_code
cap_data: Optional[CapData] = flow.metadata["cap_data"]
if cap_data:
if status != 200:
if status == 200 and cap_data and cap_data.cap_name == "FirestormBridge":
# Fake FirestormBridge cap based on a bridge-like response coming from
# a non-browser HTTP request. Figure out what session it belongs to
# so it can be handled in the session and region HTTP MessageHandlers
agent_id_str = flow.response.headers.get("X-SecondLife-Owner-Key", "")
if not agent_id_str:
return
agent_id = UUID(agent_id_str)
for session in self.session_manager.sessions:
if session.pending:
continue
if session.agent_id == agent_id:
# Enrich the flow with the session and region info
cap_data = CapData(
cap_name="FirestormBridge",
region=weakref.ref(session.main_region),
session=weakref.ref(session),
)
flow.cap_data = cap_data
break
if cap_data.cap_name == "LoginRequest":
self._handle_login_flow(flow)
if AddonManager.handle_http_response(flow):
return
if status != 200 or not cap_data:
return
if cap_data.cap_name == "LoginRequest":
self._handle_login_flow(flow)
return
try:
session = cap_data.session and cap_data.session()
if not session:
return
try:
region = cap_data.region and cap_data.region()
session.http_message_handler.handle(flow)
region = cap_data.region and cap_data.region()
if not region:
return
region.http_message_handler.handle(flow)
if cap_data.cap_name == "Seed":
parsed = llsd.parse_xml(flow.response.content)
logging.debug("Got seed cap for %r : %r" % (cap_data, parsed))
region.update_caps(parsed)
# On LL's grid these URIs aren't unique across sessions or regions,
# so we get request attribution by replacing them with a unique
# alias URI.
logging.debug("Replacing GetMesh caps with wrapped versions")
wrappable_caps = {"GetMesh2", "GetMesh", "GetTexture", "ViewerAsset"}
for cap_name in wrappable_caps:
if cap_name in parsed:
parsed[cap_name] = region.register_wrapper_cap(cap_name)
flow.response.content = llsd.format_pretty_xml(parsed)
elif cap_data.cap_name == "EventQueueGet":
parsed_eq_resp = llsd.parse_xml(flow.response.content)
if parsed_eq_resp:
old_events = parsed_eq_resp["events"]
new_events = []
for event in old_events:
if not self._handle_eq_event(cap_data.session(), region, event):
new_events.append(event)
# Add on any fake events that've been queued by addons
eq_manager = cap_data.region().eq_manager
new_events.extend(eq_manager.take_injected_events())
parsed_eq_resp["events"] = new_events
# Empty event list is an error, need to return undef instead.
if old_events and not new_events:
parsed_eq_resp = None
# HACK: see note in above request handler for EventQueueGet
req_ack_id = llsd.parse_xml(flow.request.content)["ack"]
eq_manager.cache_last_poll_response(req_ack_id, parsed_eq_resp)
flow.response.content = llsd.format_pretty_xml(parsed_eq_resp)
elif cap_data.cap_name in self.UPLOAD_CREATING_CAPS:
if not region:
return
region.http_message_handler.handle(flow)
if cap_data.cap_name == "Seed":
parsed = llsd.parse_xml(flow.response.content)
logging.debug("Got seed cap for %r : %r" % (cap_data, parsed))
region.update_caps(parsed)
# On LL's grid these URIs aren't unique across sessions or regions,
# so we get request attribution by replacing them with a unique
# alias URI.
logging.debug("Replacing GetMesh caps with wrapped versions")
wrappable_caps = {"GetMesh2", "GetMesh", "GetTexture", "ViewerAsset"}
for cap_name in wrappable_caps:
if cap_name in parsed:
parsed[cap_name] = region.register_wrapper_cap(cap_name)
flow.response.content = llsd.format_pretty_xml(parsed)
elif cap_data.cap_name == "EventQueueGet":
parsed_eq_resp = llsd.parse_xml(flow.response.content)
if parsed_eq_resp:
old_events = parsed_eq_resp["events"]
new_events = []
for event in old_events:
if not self._handle_eq_event(cap_data.session(), region, event):
new_events.append(event)
# Add on any fake events that've been queued by addons
eq_manager = cap_data.region().eq_manager
new_events.extend(eq_manager.take_events())
parsed_eq_resp["events"] = new_events
if old_events and not new_events:
# Need at least one event or the viewer will refuse to ack!
new_events.append({"message": "NOP", "body": {}})
flow.response.content = llsd.format_pretty_xml(parsed_eq_resp)
elif cap_data.cap_name in self.UPLOAD_CREATING_CAPS:
if not region:
return
parsed = llsd.parse_xml(flow.response.content)
if "uploader" in parsed:
region.register_temporary_cap(cap_data.cap_name + "Uploader", parsed["uploader"])
except:
logging.exception("OOPS, blew up in HTTP proxy!")
parsed = llsd.parse_xml(flow.response.content)
if "uploader" in parsed:
region.register_temporary_cap(cap_data.cap_name + "Uploader", parsed["uploader"])
except:
logging.exception("OOPS, blew up in HTTP proxy!")
def _handle_login_flow(self, flow: HippoHTTPFlow):
resp = xmlrpc.client.loads(flow.response.content)[0][0] # type: ignore

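The EventQueueGet hack above caches the last poll response keyed by the client's ack ID, so a client that dropped the connection on timeout and repeats its previous request can be re-served from cache instead of losing events forever. A simplified sketch (the `EventQueueManager` below is illustrative, not the project's class):

```python
class EventQueueManager:
    """Sketch: remember the last poll response keyed by the client's
    ack id so a repeated request can be answered from cache."""

    def __init__(self):
        self._last_ack_id = None
        self._last_response = None

    def cache_last_poll_response(self, ack_id, response):
        self._last_ack_id = ack_id
        self._last_response = response

    def get_cached_poll_response(self, ack_id):
        # The client repeating the same ack means it never saw our reply.
        if ack_id is not None and ack_id == self._last_ack_id:
            return self._last_response
        return None

eq = EventQueueManager()
eq.cache_last_poll_response(7, {"id": 8, "events": [{"message": "NOP"}]})
assert eq.get_cached_poll_response(7) == {"id": 8, "events": [{"message": "NOP"}]}
assert eq.get_cached_poll_response(8) is None  # a new ack id is not a repeat
```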
View File

@@ -2,12 +2,15 @@ from __future__ import annotations
import copy
from typing import *
from typing import Optional
import mitmproxy.http
from mitmproxy.http import HTTPFlow
from hippolyzer.lib.proxy.caps import CapData
if TYPE_CHECKING:
from hippolyzer.lib.proxy.sessions import CapData, SessionManager
from hippolyzer.lib.proxy.sessions import SessionManager
class HippoHTTPFlow:
@@ -30,11 +33,11 @@ class HippoHTTPFlow:
meta.setdefault("from_browser", False)
@property
def request(self) -> mitmproxy.http.HTTPRequest:
def request(self) -> mitmproxy.http.Request:
return self.flow.request
@property
def response(self) -> Optional[mitmproxy.http.HTTPResponse]:
def response(self) -> Optional[mitmproxy.http.Response]:
return self.flow.response
@property
@@ -42,7 +45,7 @@ class HippoHTTPFlow:
return self.flow.id
@response.setter
def response(self, val: Optional[mitmproxy.http.HTTPResponse]):
def response(self, val: Optional[mitmproxy.http.Response]):
self.flow.metadata["response_injected"] = True
self.flow.response = val
@@ -113,12 +116,12 @@ class HippoHTTPFlow:
return state
@classmethod
def from_state(cls, flow_state: Dict, session_manager: SessionManager) -> HippoHTTPFlow:
def from_state(cls, flow_state: Dict, session_manager: Optional[SessionManager]) -> HippoHTTPFlow:
flow: Optional[HTTPFlow] = HTTPFlow.from_state(flow_state)
assert flow is not None
cap_data_ser = flow.metadata.get("cap_data_ser")
if cap_data_ser is not None:
flow.metadata["cap_data"] = session_manager.deserialize_cap_data(cap_data_ser)
flow.metadata["cap_data"] = CapData.deserialize(cap_data_ser, session_manager)
else:
flow.metadata["cap_data"] = None
return cls(flow)

View File

@@ -1,11 +1,9 @@
import asyncio
import functools
import logging
import multiprocessing
import os
import re
import sys
import pkg_resources
import queue
import typing
import uuid
@@ -16,41 +14,30 @@ import mitmproxy.log
import mitmproxy.master
import mitmproxy.options
import mitmproxy.proxy
from mitmproxy.addons import core, clientplayback
from mitmproxy.addons import core, clientplayback, proxyserver, next_layer, disable_h2c
from mitmproxy.http import HTTPFlow
from mitmproxy.proxy.layers import tls
import OpenSSL
from hippolyzer.lib.base.helpers import get_resource_filename
from hippolyzer.lib.base.multiprocessing_utils import ParentProcessWatcher
orig_sethostflags = OpenSSL.SSL._lib.X509_VERIFY_PARAM_set_hostflags # noqa
@functools.wraps(orig_sethostflags)
def _sethostflags_wrapper(param, flags):
# Since 2000 the recommendation per RFCs has been to only check SANs and not the CN field.
# Most browsers do this, as does mitmproxy. The viewer does not, and the sim certs have no SAN
# field. Just monkeypatch out this flag since mitmproxy's internals are in flux and there's
# no good way to stop setting this flag currently.
return orig_sethostflags(
param,
flags & (~OpenSSL.SSL._lib.X509_CHECK_FLAG_NEVER_CHECK_SUBJECT) # noqa
)
OpenSSL.SSL._lib.X509_VERIFY_PARAM_set_hostflags = _sethostflags_wrapper # noqa
from hippolyzer.lib.proxy.caps import SerializedCapData
class SLCertStore(mitmproxy.certs.CertStore):
def get_cert(self, commonname: typing.Optional[bytes], sans: typing.List[bytes], *args):
cert, privkey, chain = super().get_cert(commonname, sans, *args)
x509: OpenSSL.crypto.X509 = cert.x509
def get_cert(self, commonname: typing.Optional[str], sans: typing.List[str], *args, **kwargs):
entry = super().get_cert(commonname, sans, *args, **kwargs)
cert, privkey, chain = entry.cert, entry.privatekey, entry.chain_file
x509 = cert.to_pyopenssl()
# The cert must have a subject key ID or the viewer will reject it.
for i in range(0, x509.get_extension_count()):
ext = x509.get_extension(i)
# This cert already has a subject key id, pass through.
if ext.get_short_name() == b"subjectKeyIdentifier":
return cert, privkey, chain
return entry
# Need to add a subject key ID onto this cert or the viewer will reject it.
# The viewer doesn't actually use the subject key ID for its intended purpose,
# so a random, unique value is fine.
x509.add_extensions([
OpenSSL.crypto.X509Extension(
b"subjectKeyIdentifier",
@@ -58,17 +45,24 @@ class SLCertStore(mitmproxy.certs.CertStore):
uuid.uuid4().hex.encode("utf8"),
),
])
x509.sign(privkey, "sha256") # type: ignore
return cert, privkey, chain
x509.sign(OpenSSL.crypto.PKey.from_cryptography_key(privkey), "sha256") # type: ignore
new_entry = mitmproxy.certs.CertStoreEntry(
mitmproxy.certs.Cert.from_pyopenssl(x509), privkey, chain
)
# Replace the cert that was created in the base `get_cert()` with our modified cert
self.certs[(commonname, tuple(sans))] = new_entry
self.expire_queue.pop(-1)
self.expire(new_entry)
return new_entry
class SLProxyConfig(mitmproxy.proxy.ProxyConfig):
def configure(self, options, updated) -> None:
super().configure(options, updated)
class SLTlsConfig(mitmproxy.addons.tlsconfig.TlsConfig):
def running(self):
super().running()
old_cert_store = self.certstore
# Replace the cert store with one that knows how to add
# a subject key ID extension.
self.certstore = SLCertStore( # noqa
self.certstore = SLCertStore(
default_privatekey=old_cert_store.default_privatekey,
default_ca=old_cert_store.default_ca,
default_chain_file=old_cert_store.default_chain_file,
@@ -76,12 +70,25 @@ class SLProxyConfig(mitmproxy.proxy.ProxyConfig):
)
self.certstore.certs = old_cert_store.certs
def tls_start_server(self, tls_start: tls.TlsStartData):
super().tls_start_server(tls_start)
# Since 2000 the recommendation per RFCs has been to only check SANs and not the CN field.
# Most browsers do this, as does mitmproxy. The viewer does not, and the sim certs have no SAN
# field. set the host verification flags to remove the flag that disallows falling back to
# checking the CN (X509_CHECK_FLAG_NEVER_CHECK_SUBJECT)
param = OpenSSL.SSL._lib.SSL_get0_param(tls_start.ssl_conn._ssl) # noqa
# get_hostflags() doesn't seem to be exposed, just set the usual flags without
# the problematic `X509_CHECK_FLAG_NEVER_CHECK_SUBJECT` flag.
flags = OpenSSL.SSL._lib.X509_CHECK_FLAG_NO_PARTIAL_WILDCARDS # noqa
OpenSSL.SSL._lib.X509_VERIFY_PARAM_set_hostflags(param, flags) # noqa
class HTTPFlowContext:
def __init__(self):
self.from_proxy_queue = multiprocessing.Queue()
self.to_proxy_queue = multiprocessing.Queue()
self.shutdown_signal = multiprocessing.Event()
self.mitmproxy_ready = multiprocessing.Event()
class IPCInterceptionAddon:
@@ -91,12 +98,13 @@ class IPCInterceptionAddon:
flow which is merged in and resumed.
"""
def __init__(self, flow_context: HTTPFlowContext):
self.mitmproxy_ready = flow_context.mitmproxy_ready
self.intercepted_flows: typing.Dict[str, HTTPFlow] = {}
self.from_proxy_queue: multiprocessing.Queue = flow_context.from_proxy_queue
self.to_proxy_queue: multiprocessing.Queue = flow_context.to_proxy_queue
self.shutdown_signal: multiprocessing.Event = flow_context.shutdown_signal
def log(self, entry: mitmproxy.log.LogEntry):
def add_log(self, entry: mitmproxy.log.LogEntry):
if entry.level == "debug":
logging.debug(entry.msg)
elif entry.level in ("alert", "info"):
@@ -111,6 +119,8 @@ class IPCInterceptionAddon:
def running(self):
# register to pump the events or something here
asyncio.create_task(self._pump_callbacks())
# Tell the main process mitmproxy is ready to handle requests
self.mitmproxy_ready.set()
async def _pump_callbacks(self):
watcher = ParentProcessWatcher(self.shutdown_signal)
@@ -212,7 +222,11 @@ class SLMITMMaster(mitmproxy.master.Master):
self.addons.add(
core.Core(),
clientplayback.ClientPlayback(),
SLMITMAddon(flow_context)
disable_h2c.DisableH2C(),
proxyserver.Proxyserver(),
next_layer.NextLayer(),
SLTlsConfig(),
SLMITMAddon(flow_context),
)
def start_server(self):
@@ -230,7 +244,7 @@ def create_proxy_master(host, port, flow_context: HTTPFlowContext): # pragma: n
os.path.join(opts.confdir, "config.yml"),
)
# Use SL's CA bundle so LL's CA certs won't cause verification errors
ca_bundle = pkg_resources.resource_filename("hippolyzer.lib.base", "network/data/ca-bundle.crt")
ca_bundle = get_resource_filename("lib/base/network/data/ca-bundle.crt")
opts.update(
ssl_verify_upstream_trusted_ca=ca_bundle,
listen_host=host,
@@ -241,30 +255,4 @@ def create_proxy_master(host, port, flow_context: HTTPFlowContext): # pragma: n
def create_http_proxy(bind_host, port, flow_context: HTTPFlowContext): # pragma: no cover
master = create_proxy_master(bind_host, port, flow_context)
pconf = SLProxyConfig(master.options)
server = mitmproxy.proxy.server.ProxyServer(pconf)
master.server = server
return master
def is_asset_server_cap_name(cap_name):
return cap_name and (
cap_name.startswith("GetMesh") or
cap_name.startswith("GetTexture") or
cap_name.startswith("ViewerAsset")
)
class SerializedCapData(typing.NamedTuple):
cap_name: typing.Optional[str] = None
region_addr: typing.Optional[str] = None
session_id: typing.Optional[str] = None
base_url: typing.Optional[str] = None
type: str = "NORMAL"
def __bool__(self):
return bool(self.cap_name or self.session_id)
@property
def asset_server_cap(self):
return is_asset_server_cap_name(self.cap_name)

View File

@@ -5,10 +5,9 @@ from typing import Optional, Tuple
from hippolyzer.lib.base.message.message_dot_xml import MessageDotXML
from hippolyzer.lib.base.message.udpdeserializer import UDPMessageDeserializer
from hippolyzer.lib.base.message.udpserializer import UDPMessageSerializer
from hippolyzer.lib.base.settings import Settings
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.packets import ProxiedUDPPacket
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.base.network.transport import UDPPacket
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session, SessionManager
from hippolyzer.lib.proxy.socks_proxy import SOCKS5Server, UDPProxyProtocol
@@ -26,56 +25,40 @@ class SLSOCKS5Server(SOCKS5Server):
return lambda: InterceptingLLUDPProxyProtocol(source_addr, self.session_manager)
class BaseLLUDPProxyProtocol(UDPProxyProtocol):
def __init__(self, source_addr: Tuple[str, int]):
class InterceptingLLUDPProxyProtocol(UDPProxyProtocol):
def __init__(self, source_addr: Tuple[str, int], session_manager: SessionManager):
super().__init__(source_addr)
self.settings = Settings()
self.settings.ENABLE_DEFERRED_PACKET_PARSING = True
self.settings.HANDLE_PACKETS = False
self.session_manager: SessionManager = session_manager
self.serializer = UDPMessageSerializer()
self.deserializer = UDPMessageDeserializer(
settings=self.settings,
message_cls=ProxiedMessage,
settings=self.session_manager.settings,
)
self.message_xml = MessageDotXML()
self.session: Optional[Session] = None
def _ensure_message_allowed(self, msg: ProxiedMessage):
def _ensure_message_allowed(self, msg: Message):
if not self.message_xml.validate_udp_msg(msg.name):
LOG.warning(
f"Received {msg.name!r} over UDP, when it should come over the event queue. Discarding."
)
raise PermissionError(f"UDPBanned message {msg.name}")
class InterceptingLLUDPProxyProtocol(BaseLLUDPProxyProtocol):
def __init__(self, source_addr: Tuple[str, int], session_manager: SessionManager):
super().__init__(source_addr)
self.session_manager: SessionManager = session_manager
self.session: Optional[Session] = None
def _handle_proxied_packet(self, packet: ProxiedUDPPacket):
message: Optional[ProxiedMessage] = None
def handle_proxied_packet(self, packet: UDPPacket):
region: Optional[ProxiedRegion] = None
# Try to do an initial region lookup so we have it for handle_proxied_packet()
if self.session:
region = self.session.region_by_circuit_addr(packet.far_addr)
deserialize_exc = None
try:
message = self.deserializer.deserialize(packet.data)
message.direction = packet.direction
except Exception as e:
# Hang onto this since handle_proxied_packet doesn't need a parseable
# message. If that hook doesn't handle the packet then re-raise.
deserialize_exc = e
# the proxied packet handler is allowed to mutate `packet.data` before
# the message gets parsed.
if AddonManager.handle_proxied_packet(self.session_manager, packet,
self.session, region, message):
# Swallow any error raised by above message deserialization, it was handled.
self.session, region):
return
if deserialize_exc is not None:
# handle_proxied_packet() didn't deal with the error, so it's fatal.
raise deserialize_exc
message = self.deserializer.deserialize(packet.data)
message.direction = packet.direction
message.sender = packet.src_addr
message.meta.update(packet.meta)
assert message is not None
# Check for UDP bans on inbound messages
@@ -122,20 +105,39 @@ class InterceptingLLUDPProxyProtocol(BaseLLUDPProxyProtocol):
region.handle = message["Data"]["RegionHandle"]
LOG.info(f"Setting main region to {region!r}, had circuit addr {packet.far_addr!r}")
AddonManager.handle_region_changed(self.session, region)
if message.name == "RegionHandshake":
region.cache_id = message["RegionInfo"]["CacheID"]
self.session.objects.track_region_objects(region.handle)
if self.session_manager.settings.USE_VIEWER_OBJECT_CACHE:
try:
region.objects.load_cache()
except Exception:
LOG.exception("Failed to load region cache, skipping")
try:
self.session.message_handler.handle(message)
except Exception:
LOG.exception("Failed in session message handler")
try:
region.message_handler.handle(message)
except Exception:
LOG.exception("Failed in region message handler")
message_logger = self.session_manager.message_logger
if message_logger:
message_logger.log_lludp_message(self.session, region, message)
handled = AddonManager.handle_lludp_message(
self.session, region, message
)
# This message is owned by an async handler, drop it so it doesn't get
# sent with the normal flow.
if message.queued:
region.circuit.drop_message(message)
# Shouldn't mutate the message past this point, so log it now.
if message_logger:
message_logger.log_lludp_message(self.session, region, message)
if handled:
return
@@ -144,12 +146,8 @@ class InterceptingLLUDPProxyProtocol(BaseLLUDPProxyProtocol):
elif message.name == "RegionHandshake":
region.name = str(message["RegionInfo"][0]["SimName"])
# This message is owned by an async handler, drop it so it doesn't get
# sent with the normal flow.
if message.queued and not message.dropped:
region.circuit.drop_message(message)
if not message.dropped:
# Send the message if it wasn't explicitly dropped or sent before
if not message.finalized:
region.circuit.send_message(message)
def close(self):


@@ -11,6 +11,9 @@ def literal():
# Nightmare. str or bytes literal.
# https://stackoverflow.com/questions/14366401/#comment79795017_14366904
RegExMatch(r'''b?(\"\"\"|\'\'\'|\"|\')((?<!\\)(\\\\)*\\\1|.)*?\1'''),
# base16
RegExMatch(r'0x[0-9a-fA-F]+'),
# base10 int or float.
RegExMatch(r'\d+(\.\d+)?'),
"None",
"True",
@@ -23,7 +26,7 @@ def literal():
def identifier():
return RegExMatch(r'[a-zA-Z*]([a-zA-Z0-9*]+)?')
return RegExMatch(r'[a-zA-Z*]([a-zA-Z0-9_*]+)?')
def field_specifier():
@@ -42,12 +45,16 @@ def meta_field_specifier():
return "Meta", ".", identifier
def enum_field_specifier():
return identifier, ".", identifier
def compare_val():
return [literal, meta_field_specifier]
return [literal, meta_field_specifier, enum_field_specifier]
def binary_expression():
return field_specifier, ["==", "!=", "^=", "$=", "~=", ">", ">=", "<", "<="], compare_val
return field_specifier, ["==", "!=", "^=", "$=", "~=", ">", ">=", "<", "<=", "&"], compare_val
def term():
@@ -62,9 +69,12 @@ def message_filter():
return expression, EOF
MATCH_RESULT = typing.Union[bool, typing.Tuple]
class BaseFilterNode(abc.ABC):
@abc.abstractmethod
def match(self, msg) -> bool:
def match(self, msg) -> MATCH_RESULT:
raise NotImplementedError()
@property
@@ -94,17 +104,17 @@ class BinaryFilterNode(BaseFilterNode, abc.ABC):
class UnaryNotFilterNode(UnaryFilterNode):
def match(self, msg) -> bool:
def match(self, msg) -> MATCH_RESULT:
return not self.node.match(msg)
class OrFilterNode(BinaryFilterNode):
def match(self, msg) -> bool:
def match(self, msg) -> MATCH_RESULT:
return self.left_node.match(msg) or self.right_node.match(msg)
class AndFilterNode(BinaryFilterNode):
def match(self, msg) -> bool:
def match(self, msg) -> MATCH_RESULT:
return self.left_node.match(msg) and self.right_node.match(msg)
@@ -114,7 +124,7 @@ class MessageFilterNode(BaseFilterNode):
self.operator = operator
self.value = value
def match(self, msg) -> bool:
def match(self, msg) -> MATCH_RESULT:
return msg.matches(self)
@property
@@ -126,6 +136,11 @@ class MetaFieldSpecifier(str):
pass
class EnumFieldSpecifier(typing.NamedTuple):
enum_name: str
field_name: str
class LiteralValue:
"""Only exists because we can't return `None` in a visitor, need to box it"""
def __init__(self, value):
@@ -145,6 +160,9 @@ class MessageFilterVisitor(PTNodeVisitor):
def visit_meta_field_specifier(self, _node, children):
return MetaFieldSpecifier(children[0])
def visit_enum_field_specifier(self, _node, children):
return EnumFieldSpecifier(*children)
def visit_unary_field_specifier(self, _node, children):
# Looks like a bare field specifier with no operator
return MessageFilterNode(tuple(children), None, None)


@@ -1,8 +1,11 @@
from __future__ import annotations
import abc
import ast
import collections
import copy
import fnmatch
import gzip
import io
import logging
import pickle
@@ -13,38 +16,59 @@ import weakref
from defusedxml import minidom
from hippolyzer.lib.base import serialization as se, llsd
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.datatypes import TaggedUnion, UUID, TupleCoord
from hippolyzer.lib.base.helpers import bytes_escape
from hippolyzer.lib.proxy.message_filter import MetaFieldSpecifier, compile_filter, BaseFilterNode, MessageFilterNode
from hippolyzer.lib.base.message.message_formatting import HumanMessageSerializer
from hippolyzer.lib.proxy.message_filter import MetaFieldSpecifier, compile_filter, BaseFilterNode, MessageFilterNode, \
EnumFieldSpecifier
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.caps import CapType, SerializedCapData
if typing.TYPE_CHECKING:
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.proxy.region import ProxiedRegion, CapType
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
LOG = logging.getLogger(__name__)
class BaseMessageLogger:
def log_lludp_message(self, session: Session, region: ProxiedRegion, message: ProxiedMessage):
pass
paused: bool
def log_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
if self.paused:
return False
return self.add_log_entry(LLUDPMessageLogEntry(message, region, session))
def log_http_response(self, flow: HippoHTTPFlow):
pass
if self.paused:
return False
# These are huge, let's not log them for now.
if flow.cap_data and flow.cap_data.asset_server_cap:
return False
return self.add_log_entry(HTTPMessageLogEntry(flow))
def log_eq_event(self, session: Session, region: ProxiedRegion, event: dict):
if self.paused:
return False
return self.add_log_entry(EQMessageLogEntry(event, region, session))
@abc.abstractmethod
def add_log_entry(self, entry: AbstractMessageLogEntry):
pass
class FilteringMessageLogger(BaseMessageLogger):
def __init__(self):
def __init__(self, maxlen=2000):
BaseMessageLogger.__init__(self)
self._raw_entries = collections.deque(maxlen=2000)
self._raw_entries = collections.deque(maxlen=maxlen)
self._filtered_entries: typing.List[AbstractMessageLogEntry] = []
self._paused = False
self.paused = False
self.filter: BaseFilterNode = compile_filter("")
def __iter__(self) -> typing.Iterator[AbstractMessageLogEntry]:
return iter(self._filtered_entries)
def set_filter(self, filter_str: str):
self.filter = compile_filter(filter_str)
self._begin_reset()
@@ -58,25 +82,7 @@ class FilteringMessageLogger(BaseMessageLogger):
self._end_reset()
def set_paused(self, paused: bool):
self._paused = paused
def log_lludp_message(self, session: Session, region: ProxiedRegion, message: ProxiedMessage):
if self._paused:
return
self._add_log_entry(LLUDPMessageLogEntry(message, region, session))
def log_http_response(self, flow: HippoHTTPFlow):
if self._paused:
return
# These are huge, let's not log them for now.
if flow.cap_data and flow.cap_data.asset_server_cap:
return
self._add_log_entry(HTTPMessageLogEntry(flow))
def log_eq_event(self, session: Session, region: ProxiedRegion, event: dict):
if self._paused:
return
self._add_log_entry(EQMessageLogEntry(event, region, session))
self.paused = paused
# Hooks that Qt models will want to implement
def _begin_insert(self, insert_idx: int):
@@ -91,25 +97,21 @@ class FilteringMessageLogger(BaseMessageLogger):
def _end_reset(self):
pass
def _add_log_entry(self, entry: AbstractMessageLogEntry):
def add_log_entry(self, entry: AbstractMessageLogEntry):
try:
# Paused, throw it away.
if self._paused:
return
if self.paused:
return False
self._raw_entries.append(entry)
if self.filter.match(entry):
next_idx = len(self._filtered_entries)
self._begin_insert(next_idx)
self._filtered_entries.append(entry)
self._end_insert()
entry.cache_summary()
# In the common case we don't need to keep around the serialization
# caches anymore. If the filter changes, the caches will be repopulated
# as necessary.
entry.freeze()
return True
except Exception:
LOG.exception("Failed to filter queued message")
return False
def clear(self):
self._begin_reset()
@@ -118,7 +120,27 @@ class FilteringMessageLogger(BaseMessageLogger):
self._end_reset()
class AbstractMessageLogEntry:
class WrappingMessageLogger(BaseMessageLogger):
def __init__(self):
self.loggers: typing.List[BaseMessageLogger] = []
@property
def paused(self):
return all(x.paused for x in self.loggers)
def add_log_entry(self, entry: AbstractMessageLogEntry):
logged = False
for logger in self.loggers:
if logger.add_log_entry(entry):
logged = True
# At least one logger ended up keeping the message around, so let's
# cache the summary before we freeze the message.
if logged:
entry.cache_summary()
entry.freeze()
class AbstractMessageLogEntry(abc.ABC):
region: typing.Optional[ProxiedRegion]
session: typing.Optional[Session]
name: str
@@ -126,7 +148,7 @@ class AbstractMessageLogEntry:
__slots__ = ["_region", "_session", "_region_name", "_agent_id", "_summary", "meta"]
def __init__(self, region, session):
def __init__(self, region: typing.Optional[ProxiedRegion], session: typing.Optional[Session]):
if region and not isinstance(region, weakref.ReferenceType):
region = weakref.ref(region)
if session and not isinstance(session, weakref.ReferenceType):
@@ -156,6 +178,45 @@ class AbstractMessageLogEntry:
"SelectedFull": self._current_selected_full(),
}
def to_dict(self) -> dict:
meta = self.meta.copy()
def _dehydrate_meta_uuid(key: str):
if meta[key]:
meta[key] = str(meta[key])
_dehydrate_meta_uuid("AgentID")
_dehydrate_meta_uuid("SelectedFull")
_dehydrate_meta_uuid("SessionID")
return {
"type": self.type,
"region_name": self.region_name,
"agent_id": str(self.agent_id) if self.agent_id is not None else None,
"summary": self.summary,
"meta": meta,
}
@classmethod
@abc.abstractmethod
def from_dict(cls, val: dict):
pass
def apply_dict(self, val: dict) -> None:
self._region_name = val['region_name']
self._agent_id = UUID(val['agent_id']) if val['agent_id'] else None
self._summary = val['summary']
meta = val['meta'].copy()
def _hydrate_meta_uuid(key: str):
if meta[key]:
meta[key] = UUID(meta[key])
_hydrate_meta_uuid("AgentID")
_hydrate_meta_uuid("SelectedFull")
_hydrate_meta_uuid("SessionID")
self.meta.update(meta)
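The `to_dict()` / `apply_dict()` pair above round-trips UUID-valued meta keys through plain strings so the export format only contains literal-safe types. A minimal sketch of that dehydrate/hydrate pattern (function names here are illustrative, not the project's API):

```python
import uuid

UUID_KEYS = ("AgentID", "SelectedFull", "SessionID")

def dehydrate_meta(meta: dict) -> dict:
    """Convert UUID values to strings so the dict survives repr()/literal_eval()."""
    out = meta.copy()
    for key in UUID_KEYS:
        if out.get(key):
            out[key] = str(out[key])
    return out

def hydrate_meta(meta: dict) -> dict:
    """Inverse: turn the string forms back into uuid.UUID objects."""
    out = meta.copy()
    for key in UUID_KEYS:
        if out.get(key):
            out[key] = uuid.UUID(out[key])
    return out
```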
def freeze(self):
pass
@@ -253,6 +314,11 @@ class AbstractMessageLogEntry:
expected = expected()
else:
expected = str(expected)
elif isinstance(expected, EnumFieldSpecifier):
# Local import so we get a fresh copy of the templates module
from hippolyzer.lib.proxy import templates
enum_cls = getattr(templates, expected.enum_name)
expected = enum_cls[expected.field_name]
elif expected is not None:
# Unbox the expected value
expected = expected.value
@@ -285,6 +351,8 @@ class AbstractMessageLogEntry:
return val > expected
elif operator == ">=":
return val >= expected
elif operator == "&":
return val & expected
else:
raise ValueError(f"Unexpected operator {operator!r}")
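The new `&` operator makes flag-style filters possible: the comparison is truthy when the field value and the expected value share any set bit. A small illustration of the semantics (the `ObjectFlags` values below are made up for the example, not the proxy's templates):

```python
import enum

class ObjectFlags(enum.IntFlag):  # hypothetical flag values, for illustration only
    PHYSICS = 0x1
    TEMPORARY = 0x4

def flag_matches(val: int, expected: int) -> bool:
    # Mirrors the '&' branch above: truthy when val and expected share a set bit
    return bool(val & expected)
```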
@@ -358,8 +426,8 @@ class HTTPMessageLogEntry(AbstractMessageLogEntry):
cap_name = cap_data and cap_data.cap_name
base_url = cap_name and cap_data.base_url
temporary_cap = cap_data and cap_data.type == CapType.TEMPORARY
beautify_url = (beautify and base_url and cap_name and
not temporary_cap and self.session and want_request)
beautify_url = (beautify and base_url and cap_name
and not temporary_cap and self.session and want_request)
if want_request:
buf.write(message.method)
buf.write(" ")
@@ -473,6 +541,26 @@ class HTTPMessageLogEntry(AbstractMessageLogEntry):
return "application/xml"
return content_type
def to_dict(self):
val = super().to_dict()
val['flow'] = self.flow.get_state()
cap_data = val['flow'].get('metadata', {}).get('cap_data_ser')
if cap_data is not None:
# Have to convert this from a namedtuple to a dict to make
# it importable
cap_dict = cap_data._asdict() # noqa
val['flow']['metadata']['cap_data_ser'] = cap_dict
return val
@classmethod
def from_dict(cls, val: dict):
cap_data = val['flow'].get('metadata', {}).get('cap_data_ser')
if cap_data:
val['flow']['metadata']['cap_data_ser'] = SerializedCapData(**cap_data)
ev = cls(HippoHTTPFlow.from_state(val['flow'], None))
ev.apply_dict(val)
return ev
class EQMessageLogEntry(AbstractMessageLogEntry):
__slots__ = ["event"]
@@ -486,7 +574,7 @@ class EQMessageLogEntry(AbstractMessageLogEntry):
return "EQ"
def request(self, beautify=False, replacements=None):
return self._format_llsd(self.event["body"])
return f'EQ {self.event["message"]}\n\n{self._format_llsd(self.event["body"])}'
@property
def name(self):
@@ -500,12 +588,23 @@ class EQMessageLogEntry(AbstractMessageLogEntry):
self._summary = llsd.format_notation(self.event["body"]).decode("utf8")[:500]
return self._summary
def to_dict(self) -> dict:
val = super().to_dict()
val['event'] = llsd.format_notation(self.event)
return val
@classmethod
def from_dict(cls, val: dict):
ev = cls(llsd.parse_notation(val['event']), None, None)
ev.apply_dict(val)
return ev
class LLUDPMessageLogEntry(AbstractMessageLogEntry):
__slots__ = ["_message", "_name", "_direction", "_frozen_message", "_seq", "_deserializer"]
def __init__(self, message: ProxiedMessage, region, session):
self._message: ProxiedMessage = message
def __init__(self, message: Message, region, session):
self._message: Message = message
self._deserializer = None
self._name = message.name
self._direction = message.direction
@@ -530,7 +629,7 @@ class LLUDPMessageLogEntry(AbstractMessageLogEntry):
return super()._get_meta(name)
@property
def message(self):
def message(self) -> Message:
if self._message:
return self._message
elif self._frozen_message:
@@ -541,12 +640,16 @@ class LLUDPMessageLogEntry(AbstractMessageLogEntry):
raise ValueError("Didn't have a fresh or frozen message somehow")
def freeze(self):
self.message.invalidate_caches()
message = self.message
message.invalidate_caches()
# These are expensive to keep around. pickle them and un-pickle on
# an as-needed basis.
self._deserializer = self.message.deserializer
self.message.deserializer = None
self._frozen_message = pickle.dumps(self._message, protocol=pickle.HIGHEST_PROTOCOL)
message.deserializer = None
try:
self._frozen_message = pickle.dumps(self._message, protocol=pickle.HIGHEST_PROTOCOL)
finally:
message.deserializer = self._deserializer
self._message = None
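`freeze()` now detaches the deserializer before pickling and reattaches it in a `finally` block, so a pickling failure can no longer leave the message stripped of its deserializer. The general shape of that detach-pickle-restore pattern, with hypothetical names:

```python
import pickle

class Entry:
    def __init__(self, payload, codec):
        self.payload = payload
        self.codec = codec  # stands in for an expensive / unpicklable helper

def freeze(entry: Entry) -> bytes:
    """Pickle entry without its helper, restoring the helper even on failure."""
    codec = entry.codec
    entry.codec = None
    try:
        return pickle.dumps(entry, protocol=pickle.HIGHEST_PROTOCOL)
    finally:
        entry.codec = codec  # always reattach, even if pickling raised
```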
@property
@@ -566,7 +669,7 @@ class LLUDPMessageLogEntry(AbstractMessageLogEntry):
return self._direction.name if self._direction is not None else ""
def request(self, beautify=False, replacements=None):
return self.message.to_human_string(replacements, beautify)
return HumanMessageSerializer.to_human_string(self.message, replacements, beautify)
def matches(self, matcher):
base_matched = self._base_matches(matcher)
@@ -585,15 +688,19 @@ class LLUDPMessageLogEntry(AbstractMessageLogEntry):
for block_name in message.blocks:
if not fnmatch.fnmatchcase(block_name, matcher.selector[1]):
continue
for block in message[block_name]:
for block_num, block in enumerate(message[block_name]):
for var_name in block.vars.keys():
if not fnmatch.fnmatchcase(var_name, matcher.selector[2]):
continue
# So we know where the match happened
span_key = (message.name, block_name, block_num, var_name)
if selector_len == 3:
# We're just matching on the var existing, not having any particular value
if matcher.value is None:
return True
return span_key
if self._val_matches(matcher.operator, block[var_name], matcher.value):
return True
return span_key
# Need to invoke a special unpacker
elif selector_len == 4:
try:
deserialized = block.deserialize_var(var_name)
@@ -607,9 +714,9 @@ class LLUDPMessageLogEntry(AbstractMessageLogEntry):
for key in deserialized.keys():
if fnmatch.fnmatchcase(str(key), matcher.selector[3]):
if matcher.value is None:
return True
return span_key
if self._val_matches(matcher.operator, deserialized[key], matcher.value):
return True
return span_key
return False
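`match()` now returns either `False` or a span tuple naming where the hit occurred; because non-empty tuples are truthy, existing boolean uses keep working while the UI gains the match location. The trick in isolation:

```python
from typing import Tuple, Union

MATCH_RESULT = Union[bool, Tuple]

def find_hit(rows) -> MATCH_RESULT:
    """Return a truthy (collection, index) span on a hit, False otherwise."""
    for i, row in enumerate(rows):
        if row.get("hit"):
            return ("rows", i)  # truthy, and says where the match happened
    return False
```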
@@ -624,3 +731,30 @@ class LLUDPMessageLogEntry(AbstractMessageLogEntry):
if self._message:
self._seq = self._message.packet_id
return self._seq
def to_dict(self):
val = super().to_dict()
val['message'] = llsd.format_notation(self.message.to_dict(extended=True))
return val
@classmethod
def from_dict(cls, val: dict):
ev = cls(Message.from_dict(llsd.parse_notation(val['message'])), None, None)
ev.apply_dict(val)
return ev
def export_log_entries(entries: typing.Iterable[AbstractMessageLogEntry]) -> bytes:
return gzip.compress(repr([e.to_dict() for e in entries]).encode("utf8"))
_TYPE_CLASSES = {
"HTTP": HTTPMessageLogEntry,
"LLUDP": LLUDPMessageLogEntry,
"EQ": EQMessageLogEntry,
}
def import_log_entries(data: bytes) -> typing.List[AbstractMessageLogEntry]:
entries = ast.literal_eval(gzip.decompress(data).decode("utf8"))
return [_TYPE_CLASSES[e['type']].from_dict(e) for e in entries]
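The export format is just the `repr()` of a list of plain-literal dicts, gzip-compressed, which is why `ast.literal_eval` (safe, unlike `eval` or `pickle.loads`) can rehydrate it. The round-trip in miniature:

```python
import ast
import gzip

def export_entries(entries: list) -> bytes:
    # repr() of literal-only data is itself valid Python literal syntax
    return gzip.compress(repr(entries).encode("utf8"))

def import_entries(data: bytes) -> list:
    # literal_eval only parses literals, so an untrusted log can't execute code
    return ast.literal_eval(gzip.decompress(data).decode("utf8"))
```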


@@ -0,0 +1,54 @@
from __future__ import annotations
import logging
from typing import *
from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.message.message_handler import MessageHandler
from hippolyzer.lib.client.namecache import NameCache
from hippolyzer.lib.proxy.viewer_settings import iter_viewer_cache_dirs
if TYPE_CHECKING:
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
class ProxyNameCache(NameCache):
def create_subscriptions(
self,
message_handler: MessageHandler[Message, str],
http_message_handler: Optional[MessageHandler[HippoHTTPFlow, str]] = None,
):
super().create_subscriptions(message_handler)
if http_message_handler is not None:
http_message_handler.subscribe("GetDisplayNames", self._handle_get_display_names)
def load_viewer_caches(self):
for cache_dir in iter_viewer_cache_dirs():
try:
namecache_file = cache_dir / "avatar_name_cache.xml"
if namecache_file.exists():
with open(namecache_file, "rb") as f:
namecache_bytes = f.read()
agents = llsd.parse_xml(namecache_bytes)["agents"]
# Can be `None` if the file was just created
if not agents:
continue
for agent_id, agent_data in agents.items():
# Don't set display name if they just have the default
display_name = None
if not agent_data["is_display_name_default"]:
display_name = agent_data["display_name"]
self.update(UUID(agent_id), {
"FirstName": agent_data["legacy_first_name"],
"LastName": agent_data["legacy_last_name"],
"DisplayName": display_name,
})
except Exception:
logging.exception(f"Failed to load namecache from {cache_dir}")
def _handle_get_display_names(self, flow: HippoHTTPFlow):
if flow.response.status_code != 200:
return
self._process_display_names_response(llsd.parse_xml(flow.response.content))


@@ -0,0 +1,175 @@
from __future__ import annotations
import asyncio
import logging
from typing import *
from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.templates import PCode
from hippolyzer.lib.client.namecache import NameCache
from hippolyzer.lib.client.object_manager import (
ClientObjectManager,
UpdateType, ClientWorldObjectManager,
)
from hippolyzer.lib.base.objects import Object
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.settings import ProxySettings
from hippolyzer.lib.proxy.vocache import RegionViewerObjectCacheChain
if TYPE_CHECKING:
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
LOG = logging.getLogger(__name__)
class ProxyObjectManager(ClientObjectManager):
"""
Object manager for a specific region
"""
_region: ProxiedRegion
def __init__(
self,
region: ProxiedRegion,
may_use_vo_cache: bool = False
):
super().__init__(region)
self.may_use_vo_cache = may_use_vo_cache
self.cache_loaded = False
self.object_cache = RegionViewerObjectCacheChain([])
self._cache_miss_timer: Optional[asyncio.TimerHandle] = None
self.queued_cache_misses: Set[int] = set()
region.message_handler.subscribe(
"RequestMultipleObjects",
self._handle_request_multiple_objects,
)
def load_cache(self):
if not self.may_use_vo_cache or self.cache_loaded:
return
handle = self._region.handle
if not handle:
LOG.warning(f"Tried to load cache for {self._region} without a handle")
return
self.cache_loaded = True
self.object_cache = RegionViewerObjectCacheChain.for_region(
handle=handle,
cache_id=self._region.cache_id,
cache_dir=self._region.session().cache_dir,
)
def request_missed_cached_objects_soon(self):
if self._cache_miss_timer:
self._cache_miss_timer.cancel()
# Basically a debounce: only fires 0.2 seconds after the last invocation, to absorb
# the initial flood of ObjectUpdateCached messages and the natural lag between that
# and the viewer's RequestMultipleObjects messages.
self._cache_miss_timer = asyncio.get_event_loop().call_later(
0.2, self._request_missed_cached_objects)
def _request_missed_cached_objects(self):
self._cache_miss_timer = None
self.request_objects(self.queued_cache_misses)
self.queued_cache_misses.clear()
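The cache-miss requester above is a classic debounce: each call restarts a short timer, so a flood of cache misses collapses into a single batched request. A standalone sketch of the same timer-reset pattern (illustrative names, not the proxy's API):

```python
import asyncio

class Debouncer:
    """Coalesce bursts of items into a single callback, fired after a quiet period."""
    def __init__(self, delay: float, callback):
        self._delay = delay
        self._callback = callback
        self._timer = None
        self._pending = set()

    def add(self, items):
        self._pending |= set(items)
        if self._timer:
            self._timer.cancel()  # restart the quiet-period window
        self._timer = asyncio.get_running_loop().call_later(self._delay, self._fire)

    def _fire(self):
        self._timer = None
        pending, self._pending = self._pending, set()
        self._callback(pending)
```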
def clear(self):
super().clear()
self.object_cache = RegionViewerObjectCacheChain([])
self.cache_loaded = False
self.queued_cache_misses.clear()
if self._cache_miss_timer:
self._cache_miss_timer.cancel()
self._cache_miss_timer = None
def _is_localid_selected(self, localid: int):
return localid in self._region.session().selected.object_locals
def _handle_request_multiple_objects(self, msg: Message):
# Remove any queued cache misses that the viewer just requested for itself
self.queued_cache_misses -= {b["ID"] for b in msg["ObjectData"]}
class ProxyWorldObjectManager(ClientWorldObjectManager):
_session: Session
_settings: ProxySettings
def __init__(self, session: Session, settings: ProxySettings, name_cache: Optional[NameCache]):
super().__init__(session, settings, name_cache)
session.http_message_handler.subscribe(
"GetObjectCost",
self._handle_get_object_cost
)
session.http_message_handler.subscribe(
"FirestormBridge",
self._handle_firestorm_bridge_request,
)
def _handle_object_update_cached_misses(self, region_handle: int, missing_locals: Set[int]):
if not self._settings.ALLOW_AUTO_REQUEST_OBJECTS:
return
if self._settings.AUTOMATICALLY_REQUEST_MISSING_OBJECTS:
# Schedule these local IDs to be requested soon if the viewer doesn't request
# them itself. Ideally we could just mutate the CRC of the ObjectUpdateCached
# to force a CRC cache miss in the viewer, but that appears to cause the viewer
# to drop the resulting ObjectUpdateCompressed when the CRC doesn't match?
# It was causing all objects to go missing even though the ObjectUpdateCompressed
# was received.
region_mgr: Optional[ProxyObjectManager] = self._get_region_manager(region_handle)
region_mgr.queued_cache_misses |= missing_locals
region_mgr.request_missed_cached_objects_soon()
def _run_object_update_hooks(self, obj: Object, updated_props: Set[str], update_type: UpdateType):
super()._run_object_update_hooks(obj, updated_props, update_type)
region = self._session.region_by_handle(obj.RegionHandle)
if self._settings.ALLOW_AUTO_REQUEST_OBJECTS:
if obj.PCode == PCode.AVATAR and "ParentID" in updated_props:
if obj.ParentID and not region.objects.lookup_localid(obj.ParentID):
# If an avatar just sat on an object we don't know about, add it to the queued
# cache misses and request it if the viewer doesn't. This should happen
# regardless of the auto-request object setting because otherwise we have no way
# to get a sitting agent's true region location, even if it's ourself.
region.objects.queued_cache_misses.add(obj.ParentID)
region.objects.request_missed_cached_objects_soon()
AddonManager.handle_object_updated(self._session, region, obj, updated_props)
def _run_kill_object_hooks(self, obj: Object):
super()._run_kill_object_hooks(obj)
region = self._session.region_by_handle(obj.RegionHandle)
AddonManager.handle_object_killed(self._session, region, obj)
def _lookup_cache_entry(self, region_handle: int, local_id: int, crc: int) -> Optional[bytes]:
region_mgr: Optional[ProxyObjectManager] = self._get_region_manager(region_handle)
return region_mgr.object_cache.lookup_object_data(local_id, crc)
def _handle_get_object_cost(self, flow: HippoHTTPFlow):
parsed = llsd.parse_xml(flow.response.content)
self._process_get_object_cost_response(parsed)
def _handle_firestorm_bridge_request(self, flow: HippoHTTPFlow):
"""
Pull guessed avatar Z offsets from Firestorm Bridge requests
CoarseLocationUpdate packets can only represent heights up to 1024, so
viewers typically use an LSL bridge to get avatar heights beyond that range
and combine it with their X and Y coords from CoarseLocationUpdate packets.
"""
if not flow.request.content.startswith(b'<llsd><string>getZOffsets|'):
return
parsed: str = llsd.parse_xml(flow.response.content)
if not parsed:
return
# av_1_id, 1025.001, av_2_id, 3000.0, ...
split = parsed.split(", ")
for av_id, z_offset in zip(split[0::2], split[1::2]):
av_id = UUID(av_id)
z_offset = float(z_offset)
av = self.lookup_avatar(av_id)
if not av:
continue
av.GuessedZ = z_offset
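The bridge reply is a flat comma-separated alternation of avatar IDs and Z values; the stride slicing `split[0::2]` / `split[1::2]` pairs them back up. The parsing step in isolation:

```python
def parse_z_offsets(payload: str) -> dict:
    """Parse 'id, z, id, z, ...' into {id: float(z)} via stride-2 slicing."""
    parts = payload.split(", ")
    return {av_id: float(z) for av_id, z in zip(parts[0::2], parts[1::2])}
```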


@@ -1,424 +0,0 @@
from __future__ import annotations
import collections
import copy
import logging
import typing
import weakref
from typing import *
from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.datatypes import UUID, TaggedUnion
from hippolyzer.lib.base.helpers import proxify
from hippolyzer.lib.base.message.message import Block
from hippolyzer.lib.base.namevalue import NameValueCollection
from hippolyzer.lib.base.objects import Object
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.proxy.templates import PCode, ObjectStateSerializer
if TYPE_CHECKING:
from hippolyzer.lib.proxy.region import ProxiedRegion
LOG = logging.getLogger(__name__)
class OrphanManager:
def __init__(self):
self._orphans: typing.Dict[int, typing.List[int]] = collections.defaultdict(list)
def clear(self):
return self._orphans.clear()
def untrack_orphan(self, obj: Object, parent_id: int):
if parent_id not in self._orphans:
return False
orphan_list = self._orphans[parent_id]
removed = False
if obj.LocalID in orphan_list:
orphan_list.remove(obj.LocalID)
removed = True
# List is empty now, get rid of it.
if not orphan_list:
del self._orphans[parent_id]
return removed
def collect_orphans(self, parent: Object) -> typing.Sequence[int]:
return self._orphans.pop(parent.LocalID, [])
def track_orphan(self, obj: Object):
self.track_orphan_by_id(obj.LocalID, obj.ParentID)
def track_orphan_by_id(self, local_id, parent_id):
if len(self._orphans) > 100:
LOG.warning(f"Orphaned object dict is getting large: {len(self._orphans)}")
self._orphans[parent_id].append(local_id)
OBJECT_OR_LOCAL = typing.Union[Object, int]
class ObjectManager:
"""Object manager for a specific region"""
def __init__(self, region: ProxiedRegion):
self._localid_lookup: typing.Dict[int, Object] = {}
self._fullid_lookup: typing.Dict[UUID, int] = {}
# Objects that we've seen references to but don't have data for
self.missing_locals = set()
self._region: ProxiedRegion = proxify(region)
self._orphan_manager = OrphanManager()
message_handler = region.message_handler
message_handler.subscribe("ObjectUpdate", self._handle_object_update)
message_handler.subscribe("ImprovedTerseObjectUpdate",
self._handle_terse_object_update)
message_handler.subscribe("ObjectUpdateCompressed",
self._handle_object_update_compressed)
message_handler.subscribe("ObjectUpdateCached",
self._handle_object_update_cached)
message_handler.subscribe("ObjectProperties",
self._handle_object_properties_generic)
message_handler.subscribe("ObjectPropertiesFamily",
self._handle_object_properties_generic)
region.http_message_handler.subscribe("GetObjectCost",
self._handle_get_object_cost)
message_handler.subscribe("KillObject",
self._handle_kill_object)
@property
def all_objects(self) -> typing.Iterable[Object]:
return self._localid_lookup.values()
@property
def all_avatars(self) -> typing.Iterable[Object]:
# This only includes avatars within draw distance. Might be useful to add another
# accessor for UUID + position based on CoarseLocationUpdate.
return (o for o in self.all_objects if o.PCode == PCode.AVATAR)
def lookup_localid(self, localid) -> typing.Optional[Object]:
return self._localid_lookup.get(localid, None)
def lookup_fullid(self, fullid: UUID) -> typing.Optional[Object]:
local_id = self._fullid_lookup.get(fullid, None)
if local_id is None:
return None
return self.lookup_localid(local_id)
def _track_object(self, obj: Object):
self._localid_lookup[obj.LocalID] = obj
self._fullid_lookup[obj.FullID] = obj.LocalID
# If it was missing, it's not missing anymore.
self.missing_locals -= {obj.LocalID}
self._parent_object(obj)
# Adopt any of our orphaned child objects.
for orphan_local in self._orphan_manager.collect_orphans(obj):
child_obj = self.lookup_localid(orphan_local)
# Shouldn't be any dead children in the orphanage
assert child_obj is not None
self._parent_object(child_obj)
self._notify_object_updated(obj, set(obj.to_dict().keys()))
def _parent_object(self, obj: Object, insert_at_head=False):
if obj.ParentID:
parent = self.lookup_localid(obj.ParentID)
if parent is not None:
assert obj.LocalID not in parent.ChildIDs
# Link order is never explicitly passed to clients, so we have to do
# some nasty guesswork based on order of received initial ObjectUpdates
# Note that this is broken in the viewer as well, and there doesn't seem
# to be a foolproof way to get this.
idx = 0 if insert_at_head else len(parent.ChildIDs)
parent.ChildIDs.insert(idx, obj.LocalID)
parent.Children.insert(idx, obj)
obj.Parent = weakref.proxy(parent)
else:
self.missing_locals.add(obj.ParentID)
self._orphan_manager.track_orphan(obj)
obj.Parent = None
LOG.debug(f"{obj.LocalID} updated with parent {obj.ParentID}, but parent wasn't found!")
def _unparent_object(self, obj: Object, old_parent_id: int):
obj.Parent = None
if old_parent_id:
# Had a parent, remove this from the child list.
removed = self._orphan_manager.untrack_orphan(obj, old_parent_id)
old_parent = self.lookup_localid(old_parent_id)
if old_parent:
if obj.LocalID in old_parent.ChildIDs:
idx = old_parent.ChildIDs.index(obj.LocalID)
del old_parent.ChildIDs[idx]
del old_parent.Children[idx]
else:
# Something is very broken if this happens
LOG.warning(f"Changing parent of {obj.LocalID}, but old parent didn't correctly adopt, "
f"was {'' if removed else 'not '}in orphan list")
else:
LOG.debug(f"Changing parent of {obj.LocalID}, but couldn't find old parent")
def _update_existing_object(self, obj: Object, new_properties):
new_parent_id = new_properties.get("ParentID", obj.ParentID)
old_parent_id = obj.ParentID
actually_updated_props = obj.update_properties(new_properties)
if new_parent_id != old_parent_id:
self._unparent_object(obj, old_parent_id)
self._parent_object(obj, insert_at_head=True)
# Common case where this may be falsy is if we get an ObjectUpdateCached
# that didn't have a changed UpdateFlags field.
if actually_updated_props:
self._notify_object_updated(obj, actually_updated_props)
def _normalize_object_update(self, block: Block):
object_data = {
"FootCollisionPlane": None,
"SoundFlags": block["Flags"],
"SoundGain": block["Gain"],
"SoundRadius": block["Radius"],
**dict(block.items()),
"TextureEntry": block.deserialize_var("TextureEntry", make_copy=False),
"NameValue": block.deserialize_var("NameValue", make_copy=False),
"TextureAnim": block.deserialize_var("TextureAnim", make_copy=False),
"ExtraParams": block.deserialize_var("ExtraParams", make_copy=False) or {},
"PSBlock": block.deserialize_var("PSBlock", make_copy=False).value,
"UpdateFlags": block.deserialize_var("UpdateFlags", make_copy=False),
"State": block.deserialize_var("State", make_copy=False),
**block.deserialize_var("ObjectData", make_copy=False).value,
}
object_data["LocalID"] = object_data.pop("ID")
# Empty == not updated
if not object_data["TextureEntry"]:
object_data.pop("TextureEntry")
# OwnerID is only set in this packet if a sound is playing. Don't allow
# ObjectUpdates to clobber _real_ OwnerIDs we had from ObjectProperties
# with a null UUID.
if object_data["OwnerID"] == UUID():
del object_data["OwnerID"]
del object_data["Flags"]
del object_data["Gain"]
del object_data["Radius"]
del object_data["ObjectData"]
return object_data
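The normalizer above layers defaults underneath the raw block fields, renames wire keys to canonical ones, and strips sentinel values that would otherwise clobber better data. A standalone sketch of that dict-overlay pattern (field set trimmed for illustration):

```python
NULL_KEY = "00000000-0000-0000-0000-000000000000"

def normalize_update(block: dict) -> dict:
    # Defaults first, raw fields second, so real values win the merge.
    data = {"FootCollisionPlane": None, **block}
    # The wire calls it "ID"; the object model calls it "LocalID".
    data["LocalID"] = data.pop("ID")
    # A null OwnerID means "no sound playing", not "owner unset"; drop it
    # so it can't overwrite a real owner learned from ObjectProperties.
    if data.get("OwnerID") == NULL_KEY:
        del data["OwnerID"]
    return data
```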
def _handle_object_update(self, packet: ProxiedMessage):
seen_locals = []
for block in packet['ObjectData']:
object_data = self._normalize_object_update(block)
seen_locals.append(object_data["LocalID"])
obj = self.lookup_fullid(object_data["FullID"])
if obj:
self._update_existing_object(obj, object_data)
else:
obj = Object(**object_data)
self._track_object(obj)
packet.meta["ObjectUpdateIDs"] = tuple(seen_locals)
def _normalize_terse_object_update(self, block: Block):
object_data = {
**block.deserialize_var("Data", make_copy=False),
**dict(block.items()),
"TextureEntry": block.deserialize_var("TextureEntry", make_copy=False),
}
object_data["LocalID"] = object_data.pop("ID")
object_data.pop("Data")
# Empty == not updated
if object_data["TextureEntry"] is None:
object_data.pop("TextureEntry")
return object_data
def _handle_terse_object_update(self, packet: ProxiedMessage):
seen_locals = []
for block in packet['ObjectData']:
object_data = self._normalize_terse_object_update(block)
obj = self.lookup_localid(object_data["LocalID"])
# Can only update existing object with this message
if obj:
# Need the Object as context because decoding state requires PCode.
state_deserializer = ObjectStateSerializer.deserialize
object_data["State"] = state_deserializer(ctx_obj=obj, val=object_data["State"])
seen_locals.append(object_data["LocalID"])
if obj:
self._update_existing_object(obj, object_data)
else:
self.missing_locals.add(object_data["LocalID"])
LOG.debug(f"Received terse update for unknown object {object_data['LocalID']}")
packet.meta["ObjectUpdateIDs"] = tuple(seen_locals)
def _handle_object_update_cached(self, packet: ProxiedMessage):
seen_locals = []
for block in packet['ObjectData']:
seen_locals.append(block["ID"])
obj = self.lookup_localid(block["ID"])
if obj is not None:
self._update_existing_object(obj, {
"UpdateFlags": block.deserialize_var("UpdateFlags", make_copy=False),
})
else:
self.missing_locals.add(block["ID"])
packet.meta["ObjectUpdateIDs"] = tuple(seen_locals)
def _normalize_object_update_compressed(self, block: Block):
# TODO: ObjectUpdateCompressed doesn't provide a default value for unused
# fields, whereas ObjectUpdate and friends do (TextColor, etc.)
# need some way to normalize ObjectUpdates so they won't appear to have
# changed just because an ObjectUpdate got sent with a default value
# Only do a shallow copy
compressed = copy.copy(block.deserialize_var("Data", make_copy=False))
# Only used for determining which sections are present
del compressed["Flags"]
ps_block = compressed.pop("PSBlockNew", None)
if ps_block is None:
ps_block = compressed.pop("PSBlock", None)
if ps_block is None:
ps_block = TaggedUnion(0, None)
compressed.pop("PSBlock", None)
if compressed["NameValue"] is None:
compressed["NameValue"] = NameValueCollection()
object_data = {
"PSBlock": ps_block.value,
# Parent flag not set means explicitly un-parented
"ParentID": compressed.pop("ParentID", None) or 0,
"LocalID": compressed.pop("ID"),
**compressed,
**dict(block.items()),
"UpdateFlags": block.deserialize_var("UpdateFlags", make_copy=False),
}
if object_data["TextureEntry"] is None:
object_data.pop("TextureEntry")
# Don't clobber OwnerID in case the object has a proper one.
if object_data["OwnerID"] == UUID():
del object_data["OwnerID"]
object_data.pop("Data")
return object_data
def _handle_object_update_compressed(self, packet: ProxiedMessage):
seen_locals = []
for block in packet['ObjectData']:
object_data = self._normalize_object_update_compressed(block)
obj = self.lookup_localid(object_data["LocalID"])
seen_locals.append(object_data["LocalID"])
if obj:
self._update_existing_object(obj, object_data)
else:
obj = Object(**object_data)
self._track_object(obj)
packet.meta["ObjectUpdateIDs"] = tuple(seen_locals)
def _handle_object_properties_generic(self, packet: ProxiedMessage):
seen_locals = []
for block in packet["ObjectData"]:
object_properties = dict(block.items())
if packet.name == "ObjectProperties":
object_properties["TextureID"] = block.deserialize_var("TextureID")
obj = self.lookup_fullid(block["ObjectID"])
if obj:
seen_locals.append(obj.LocalID)
self._update_existing_object(obj, object_properties)
else:
LOG.debug(f"Received {packet.name} for unknown {block['ObjectID']}")
packet.meta["ObjectUpdateIDs"] = tuple(seen_locals)
def _handle_kill_object(self, packet: ProxiedMessage):
seen_locals = []
for block in packet["ObjectData"]:
obj = self.lookup_localid(block["ID"])
seen_locals.append(block["ID"])
self.missing_locals -= {block["ID"]}
if obj:
AddonManager.handle_object_killed(self._region.session(), self._region, obj)
former_child_ids = obj.ChildIDs[:]
for child_id in former_child_ids:
child_obj = self.lookup_localid(child_id)
assert child_obj is not None
self._unparent_object(child_obj, child_obj.ParentID)
del self._localid_lookup[obj.LocalID]
del self._fullid_lookup[obj.FullID]
# Place any remaining unkilled children in the orphanage
for child_id in former_child_ids:
self._orphan_manager.track_orphan_by_id(child_id, obj.LocalID)
assert not obj.ChildIDs
# Make sure the parent knows we went away
self._unparent_object(obj, obj.ParentID)
else:
LOG.debug(f"Received {packet.name} for unknown {block['ID']}")
packet.meta["ObjectUpdateIDs"] = tuple(seen_locals)
def _handle_get_object_cost(self, flow: HippoHTTPFlow):
parsed = llsd.parse_xml(flow.response.content)
if "error" in parsed:
return
for object_id, object_costs in parsed.items():
obj = self.lookup_fullid(UUID(object_id))
if not obj:
LOG.debug(f"Received ObjectCost for unknown {object_id}")
continue
obj.ObjectCosts.update(object_costs)
self._notify_object_updated(obj, {"ObjectCosts"})
def _notify_object_updated(self, obj: Object, updated_props: Set[str]):
AddonManager.handle_object_updated(self._region.session(), self._region, obj, updated_props)
def clear(self):
self._localid_lookup.clear()
self._fullid_lookup.clear()
self._orphan_manager.clear()
self.missing_locals.clear()
def request_object_properties(self, objects: typing.Union[OBJECT_OR_LOCAL, typing.Sequence[OBJECT_OR_LOCAL]]):
if isinstance(objects, (Object, int)):
objects = (objects,)
if not objects:
return
session = self._region.session()
local_ids = tuple((o.LocalID if isinstance(o, Object) else o) for o in objects)
# Don't mess with already selected objects
local_ids = tuple(local for local in local_ids if local not in session.selected.object_locals)
while local_ids:
blocks = [
Block("AgentData", AgentID=session.agent_id, SessionID=session.id),
*[Block("ObjectData", ObjectLocalID=x) for x in local_ids[:100]],
]
# Selecting causes ObjectProperties to be sent
self._region.circuit.send_message(ProxiedMessage("ObjectSelect", blocks))
self._region.circuit.send_message(ProxiedMessage("ObjectDeselect", blocks))
local_ids = local_ids[100:]
def request_missing_objects(self):
self.request_objects(self.missing_locals)
def request_objects(self, local_ids):
if isinstance(local_ids, int):
local_ids = (local_ids,)
if isinstance(local_ids, set):
local_ids = tuple(local_ids)
session = self._region.session()
while local_ids:
self._region.circuit.send_message(ProxiedMessage(
"RequestMultipleObjects",
Block("AgentData", AgentID=session.agent_id, SessionID=session.id),
*[Block("ObjectData", CacheMissType=0, ID=x) for x in local_ids[:100]],
))
local_ids = local_ids[100:]
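Both `request_object_properties()` and `request_objects()` cap each outgoing message at 100 ObjectData blocks via the `local_ids[:100]` / `local_ids[100:]` loop. That slicing is equivalent to this standalone chunking helper:

```python
def chunked(seq, size=100):
    """Yield successive fixed-size slices, mirroring the [:100]/[100:] send loop."""
    seq = tuple(seq)
    while seq:
        yield seq[:size]
        seq = seq[size:]
```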
@@ -1,53 +0,0 @@
import enum
import socket
import struct
import typing
class Direction(enum.Enum):
OUT = enum.auto()
IN = enum.auto()
def __invert__(self):
if self == self.OUT:
return self.IN
return self.OUT
ADDR_TUPLE = typing.Tuple[str, int]
class ProxiedUDPPacket:
HEADER_STRUCT = struct.Struct("!HBB4sH")
def __init__(self, src_addr: ADDR_TUPLE, dst_addr: ADDR_TUPLE, data: bytes, direction: Direction):
self.src_addr = src_addr
self.dst_addr = dst_addr
self.data = data
self.direction = direction
@property
def outgoing(self):
return self.direction == Direction.OUT
@property
def incoming(self):
return self.direction == Direction.IN
@property
def far_addr(self):
if self.outgoing:
return self.dst_addr
return self.src_addr
def _make_socks_header(self):
return self.HEADER_STRUCT.pack(
0, 0, 1, socket.inet_aton(self.far_addr[0]), self.far_addr[1])
def serialize(self, socks_header=None):
# Decide whether we need a header based on packet direction
if socks_header is None:
socks_header = self.incoming
if not socks_header:
return self.data
return self._make_socks_header() + self.data
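`HEADER_STRUCT`'s `"!HBB4sH"` format matches the SOCKS5 UDP relay header from RFC 1928: a 2-byte reserved field, a fragment byte, an address type (1 for IPv4), the 4-byte address, and a 2-byte port, all big-endian. A minimal round-trip sketch of that framing:

```python
import socket
import struct

HEADER = struct.Struct("!HBB4sH")  # RSV, FRAG, ATYP, IPv4 address, port

def pack_socks_udp(addr: str, port: int, payload: bytes) -> bytes:
    # ATYP=1 marks an IPv4 address per the SOCKS5 spec.
    return HEADER.pack(0, 0, 1, socket.inet_aton(addr), port) + payload

def unpack_socks_udp(datagram: bytes):
    rsv, frag, atyp, raw_addr, port = HEADER.unpack_from(datagram)
    return socket.inet_ntoa(raw_addr), port, datagram[HEADER.size:]
```

This is why `serialize()` only prepends the header for incoming packets by default: the SOCKS client expects relayed datagrams to carry the far address, while outbound data has already had its header consumed.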
@@ -1,6 +1,5 @@
from __future__ import annotations
import enum
import logging
import hashlib
import uuid
@@ -10,28 +9,29 @@ import urllib.parse
import multidict
from hippolyzer.lib.base.datatypes import Vector3
from hippolyzer.lib.base.datatypes import Vector3, UUID
from hippolyzer.lib.base.helpers import proxify
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.message.message_handler import MessageHandler
from hippolyzer.lib.proxy.caps_client import CapsClient
from hippolyzer.lib.base.objects import handle_to_global_pos
from hippolyzer.lib.client.state import BaseClientRegion
from hippolyzer.lib.proxy.caps_client import ProxyCapsClient
from hippolyzer.lib.proxy.circuit import ProxiedCircuit
from hippolyzer.lib.proxy.objects import ObjectManager
from hippolyzer.lib.proxy.transfer_manager import TransferManager
from hippolyzer.lib.proxy.xfer_manager import XferManager
from hippolyzer.lib.proxy.caps import CapType
from hippolyzer.lib.proxy.object_manager import ProxyObjectManager
from hippolyzer.lib.base.transfer_manager import TransferManager
from hippolyzer.lib.base.xfer_manager import XferManager
if TYPE_CHECKING:
from hippolyzer.lib.proxy.sessions import Session
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.message import ProxiedMessage
class CapType(enum.Enum):
NORMAL = enum.auto()
TEMPORARY = enum.auto()
WRAPPER = enum.auto()
PROXY_ONLY = enum.auto()
class CapsMultiDict(multidict.MultiDict[Tuple[CapType, str]]):
# TODO: Make a view object for this that's just name -> URL
# deriving from MultiMapping[_T] so we don't have to do
# so many copies for consumers that aren't expecting the
# CapType tag.
def add(self, key, value) -> None:
# Prepend rather than append when adding caps.
# Necessary so the most recent for a region URI is returned
@@ -41,25 +41,30 @@ class CapsMultiDict(multidict.MultiDict[Tuple[CapType, str]]):
super().add(key, val)
class ProxiedRegion:
def __init__(self, circuit_addr, seed_cap: str, session, handle=None):
class ProxiedRegion(BaseClientRegion):
def __init__(self, circuit_addr, seed_cap: str, session: Session, handle=None):
# A client may make a Seed request twice, and may get back two (valid!) sets of
# Cap URIs. We need to be able to look up both, so MultiDict is necessary.
self.handle: Optional[int] = handle
self._name: Optional[str] = None
# TODO: when does this change?
self.cache_id: Optional[UUID] = None
self.circuit: Optional[ProxiedCircuit] = None
self.circuit_addr = circuit_addr
self._caps = CapsMultiDict()
self._caps_url_lookup: Dict[str, Tuple[CapType, str]] = {}
if seed_cap:
self._caps["Seed"] = (CapType.NORMAL, seed_cap)
self.session: Optional[Callable[[], Session]] = weakref.ref(session)
self.message_handler: MessageHandler[ProxiedMessage] = MessageHandler()
self.http_message_handler: MessageHandler[HippoHTTPFlow] = MessageHandler()
self.session: Callable[[], Session] = weakref.ref(session)
self.message_handler: MessageHandler[Message, str] = MessageHandler()
self.http_message_handler: MessageHandler[HippoHTTPFlow, str] = MessageHandler()
self.eq_manager = EventQueueManager(self)
self.xfer_manager = XferManager(self)
self.transfer_manager = TransferManager(self)
self.caps_client = CapsClient(self)
self.objects = ObjectManager(self)
settings = session.session_manager.settings
self.caps_client = ProxyCapsClient(settings, proxify(self))
self.objects: ProxyObjectManager = ProxyObjectManager(self, may_use_vo_cache=True)
self.xfer_manager = XferManager(proxify(self), self.session().secure_session_id)
self.transfer_manager = TransferManager(proxify(self), session.agent_id, session.id)
self._recalc_caps()
@property
def name(self):
@@ -76,10 +81,10 @@ class ProxiedRegion:
return multidict.MultiDict((x, y[1]) for x, y in self._caps.items())
@property
def global_pos(self):
def global_pos(self) -> Vector3:
if self.handle is None:
raise ValueError("Can't determine global region position without handle")
return Vector3(self.handle >> 32, self.handle & 0xFFffFFff)
return handle_to_global_pos(self.handle)
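The `handle_to_global_pos` helper factors out the inline bit math it replaces: a 64-bit region handle packs the region's global X coordinate into the upper 32 bits and Y into the lower 32 (the same packing `register_region` uses for `region_x`/`region_y` from login data). A sketch of both directions:

```python
def handle_to_global_pos(handle: int):
    """Split a 64-bit region handle into (global_x, global_y)."""
    return (handle >> 32) & 0xFFFFFFFF, handle & 0xFFFFFFFF

def global_pos_to_handle(x: int, y: int) -> int:
    """Pack global coordinates back into a region handle."""
    return (x << 32) | y
```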
@property
def is_alive(self):
@@ -91,6 +96,13 @@ class ProxiedRegion:
for cap_name, cap_url in caps.items():
if isinstance(cap_url, str) and cap_url.startswith('http'):
self._caps.add(cap_name, (CapType.NORMAL, cap_url))
self._recalc_caps()
def _recalc_caps(self):
self._caps_url_lookup.clear()
for name, cap_info in self._caps.items():
cap_type, cap_url = cap_info
self._caps_url_lookup[cap_url] = (cap_type, name)
def register_wrapper_cap(self, name: str):
"""
@@ -102,9 +114,13 @@ class ProxiedRegion:
parsed = list(urllib.parse.urlsplit(self._caps[name][1]))
seed_id = self._caps["Seed"][1].split("/")[-1].encode("utf8")
# Give it a unique domain tied to the current Seed URI
parsed[1] = f"{name}-{hashlib.sha256(seed_id).hexdigest()[:16]}.hippo-proxy.localhost"
parsed[1] = f"{name.lower()}-{hashlib.sha256(seed_id).hexdigest()[:16]}.hippo-proxy.localhost"
# Force the URL to HTTP, we're going to handle the request ourselves so it doesn't need
# to be secure. This should save on expensive TLS context setup for each req.
parsed[0] = "http"
wrapper_url = urllib.parse.urlunsplit(parsed)
self._caps.add(name + "ProxyWrapper", (CapType.WRAPPER, wrapper_url))
self._recalc_caps()
return wrapper_url
def register_proxy_cap(self, name: str):
@@ -113,21 +129,24 @@ class ProxiedRegion:
"""
cap_url = f"https://caps.hippo-proxy.localhost/cap/{uuid.uuid4()!s}"
self._caps.add(name, (CapType.PROXY_ONLY, cap_url))
self._recalc_caps()
return cap_url
def register_temporary_cap(self, name: str, cap_url: str):
"""Register a Cap that only has meaning the first time it's used"""
self._caps.add(name, (CapType.TEMPORARY, cap_url))
self._recalc_caps()
def resolve_cap(self, url: str, consume=True) -> Optional[Tuple[str, str, CapType]]:
for name, cap_info in self._caps.items():
cap_type, cap_url = cap_info
for cap_url in self._caps_url_lookup.keys():
if url.startswith(cap_url):
cap_type, name = self._caps_url_lookup[cap_url]
if cap_type == CapType.TEMPORARY and consume:
# Resolving a temporary cap pops it out of the dict
temporary_caps = self._caps.popall(name)
temporary_caps.remove(cap_info)
temporary_caps.remove((cap_type, cap_url))
self._caps.extend((name, x) for x in temporary_caps)
self._recalc_caps()
return name, cap_url, cap_type
return None
@@ -136,6 +155,7 @@ class ProxiedRegion:
if self.circuit:
self.circuit.is_alive = False
self.objects.clear()
self.eq_manager.clear()
def __repr__(self):
return "<%s %s>" % (self.__class__.__name__, self.name)
@@ -146,11 +166,27 @@ class EventQueueManager:
# TODO: Per-EQ InjectionTracker so we can inject fake responses on 499
self._queued_events = []
self._region = weakref.proxy(region)
self._last_ack: Optional[int] = None
self._last_payload: Optional[Any] = None
def queue_event(self, event: dict):
def inject_event(self, event: dict):
self._queued_events.append(event)
def take_events(self):
def take_injected_events(self):
events = self._queued_events
self._queued_events = []
return events
def cache_last_poll_response(self, req_ack: int, payload: Any):
self._last_ack = req_ack
self._last_payload = payload
def get_cached_poll_response(self, req_ack: Optional[int]) -> Optional[Any]:
if self._last_ack == req_ack:
return self._last_payload
return None
def clear(self):
self._queued_events.clear()
self._last_ack = None
self._last_payload = None
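The `cache_last_poll_response()` / `get_cached_poll_response()` pair lets the proxy replay the last event-queue payload when a client retries a long poll with the same ack it sent before. A self-contained sketch of that single-entry cache (hypothetical class name):

```python
from typing import Any, Optional

class PollCache:
    """Replays the last long-poll payload when the client retries the same ack."""
    def __init__(self):
        self._last_ack: Optional[int] = None
        self._last_payload: Optional[Any] = None

    def store(self, ack: int, payload: Any):
        self._last_ack = ack
        self._last_payload = payload

    def lookup(self, ack: Optional[int]) -> Optional[Any]:
        # A retry carries the ack of the response it never received;
        # any other ack is a genuinely new poll, so it's a miss.
        if ack == self._last_ack:
            return self._last_payload
        return None
```

Only one entry is needed because the event queue is strictly sequential: a client can only ever be retrying its most recent poll.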
@@ -2,6 +2,7 @@ from __future__ import annotations
import dataclasses
import datetime
import functools
import logging
import multiprocessing
import weakref
@@ -9,16 +10,27 @@ from typing import *
from weakref import ref
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.message.message_handler import MessageHandler
from hippolyzer.lib.client.state import BaseClientSession
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.circuit import ProxiedCircuit
from hippolyzer.lib.proxy.http_asset_repo import HTTPAssetRepo
from hippolyzer.lib.proxy.http_proxy import HTTPFlowContext, is_asset_server_cap_name, SerializedCapData
from hippolyzer.lib.proxy.message_logger import BaseMessageLogger
from hippolyzer.lib.proxy.region import ProxiedRegion, CapType
from hippolyzer.lib.proxy.http_proxy import HTTPFlowContext
from hippolyzer.lib.proxy.caps import is_asset_server_cap_name, CapData, CapType
from hippolyzer.lib.proxy.namecache import ProxyNameCache
from hippolyzer.lib.proxy.object_manager import ProxyWorldObjectManager
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.settings import ProxySettings
if TYPE_CHECKING:
from hippolyzer.lib.proxy.message_logger import BaseMessageLogger
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
class Session:
class Session(BaseClientSession):
def __init__(self, session_id, secure_session_id, agent_id, circuit_code,
login_data=None, session_manager=None):
session_manager: Optional[SessionManager], login_data=None):
self.login_data = login_data or {}
self.pending = True
self.id: UUID = session_id
@@ -32,6 +44,11 @@ class Session:
self.selected: SelectionModel = SelectionModel()
self.regions: List[ProxiedRegion] = []
self.started_at = datetime.datetime.now()
self.message_handler: MessageHandler[Message, str] = MessageHandler()
self.http_message_handler: MessageHandler[HippoHTTPFlow, str] = MessageHandler()
self.objects = ProxyWorldObjectManager(self, session_manager.settings, session_manager.name_cache)
# Base path of a newview type cache directory for this session
self.cache_dir: Optional[str] = None
self._main_region = None
@property
@@ -47,8 +64,8 @@ class Session:
secure_session_id=UUID(login_data["secure_session_id"]),
agent_id=UUID(login_data["agent_id"]),
circuit_code=int(login_data["circuit_code"]),
login_data=login_data,
session_manager=session_manager,
login_data=login_data,
)
appearance_service = login_data.get("agent_appearance_service")
map_image_service = login_data.get("map-server-url")
@@ -59,6 +76,7 @@ class Session:
# Login data also has details about the initial sim
sess.register_region(
circuit_addr=(login_data["sim_ip"], login_data["sim_port"]),
handle=(login_data["region_x"] << 32) | login_data["region_y"],
seed_url=login_data["seed_capability"],
)
return sess
@@ -103,12 +121,26 @@ class Session:
return region
return None
def region_by_handle(self, handle: int) -> Optional[ProxiedRegion]:
for region in self.regions:
if region.handle == handle:
return region
return None
def open_circuit(self, near_addr, circuit_addr, transport):
for region in self.regions:
if region.circuit_addr == circuit_addr:
if not region.circuit or not region.circuit.is_alive:
logging_hook = None
if self.session_manager.message_logger:
logging_hook = functools.partial(
self.session_manager.message_logger.log_lludp_message,
self,
region,
)
region.circuit = ProxiedCircuit(
near_addr, circuit_addr, transport, region=region)
near_addr, circuit_addr, transport, logging_hook=logging_hook)
AddonManager.handle_circuit_created(self, region)
return True
if region.circuit and region.circuit.is_alive:
# Whatever, already open
@@ -134,7 +166,7 @@ class Session:
return CapData(cap_name, ref(region), ref(self), base_url, cap_type)
return None
def tid_to_assetid(self, transaction_id: UUID):
def transaction_to_assetid(self, transaction_id: UUID):
return UUID.combine(transaction_id, self.secure_session_id)
def __repr__(self):
@@ -142,16 +174,22 @@ class Session:
class SessionManager:
def __init__(self):
def __init__(self, settings: ProxySettings):
self.settings: ProxySettings = settings
self.sessions: List[Session] = []
self.shutdown_signal = multiprocessing.Event()
self.flow_context = HTTPFlowContext()
self.asset_repo = HTTPAssetRepo()
self.message_logger: Optional[BaseMessageLogger] = None
self.addon_ctx: Dict[str, Any] = {}
self.name_cache = ProxyNameCache()
def create_session(self, login_data) -> Session:
session = Session.from_login_data(login_data, self)
self.name_cache.create_subscriptions(
session.message_handler,
session.http_message_handler,
)
self.sessions.append(session)
logging.info("Created %r" % session)
return session
@@ -166,6 +204,7 @@ class SessionManager:
def close_session(self, session: Session):
logging.info("Closed %r" % session)
session.objects.clear()
self.sessions.remove(session)
def resolve_cap(self, url: str) -> Optional["CapData"]:
@@ -175,50 +214,6 @@ class SessionManager:
return cap_data
return CapData()
def deserialize_cap_data(self, ser_cap_data: "SerializedCapData") -> "CapData":
cap_session = None
cap_region = None
if ser_cap_data.session_id:
for session in self.sessions:
if ser_cap_data.session_id == str(session.id):
cap_session = session
if cap_session and ser_cap_data.region_addr:
for region in cap_session.regions:
if ser_cap_data.region_addr == str(region.circuit_addr):
cap_region = region
return CapData(
cap_name=ser_cap_data.cap_name,
region=ref(cap_region) if cap_region else None,
session=ref(cap_session) if cap_session else None,
base_url=ser_cap_data.base_url,
type=CapType[ser_cap_data.type],
)
class CapData(NamedTuple):
cap_name: Optional[str] = None
# Actually they're weakrefs but the type sigs suck.
region: Optional[Callable[[], Optional[ProxiedRegion]]] = None
session: Optional[Callable[[], Optional[Session]]] = None
base_url: Optional[str] = None
type: CapType = CapType.NORMAL
def __bool__(self):
return bool(self.cap_name or self.session)
def serialize(self) -> "SerializedCapData":
return SerializedCapData(
cap_name=self.cap_name,
region_addr=str(self.region().circuit_addr) if self.region and self.region() else None,
session_id=str(self.session().id) if self.session and self.session() else None,
base_url=self.base_url,
type=self.type.name,
)
@property
def asset_server_cap(self) -> bool:
return is_asset_server_cap_name(self.cap_name)
@dataclasses.dataclass
class SelectionModel:
@@ -0,0 +1,36 @@
import os
from typing import *
from hippolyzer.lib.base.settings import Settings, SettingDescriptor
_T = TypeVar("_T")
class EnvSettingDescriptor(SettingDescriptor):
"""A setting that prefers to pull its value from the environment"""
__slots__ = ("_env_name", "_env_callable")
def __init__(self, default: Union[Callable[[], _T], _T], env_name: str, spec: Callable[[str], _T]):
super().__init__(default)
self._env_name = env_name
self._env_callable = spec
def __get__(self, obj, owner=None) -> _T:
val = os.getenv(self._env_name)
if val is not None:
return self._env_callable(val)
return super().__get__(obj, owner)
class ProxySettings(Settings):
SOCKS_PROXY_PORT: int = EnvSettingDescriptor(9061, "HIPPO_UDP_PORT", int)
HTTP_PROXY_PORT: int = EnvSettingDescriptor(9062, "HIPPO_HTTP_PORT", int)
PROXY_BIND_ADDR: str = EnvSettingDescriptor("127.0.0.1", "HIPPO_BIND_HOST", str)
REMOTELY_ACCESSIBLE: bool = SettingDescriptor(False)
USE_VIEWER_OBJECT_CACHE: bool = SettingDescriptor(False)
# Whether the proxy is allowed to make automatic internal object requests at all
ALLOW_AUTO_REQUEST_OBJECTS: bool = SettingDescriptor(True)
# Whether the proxy should request any directly referenced objects it didn't know about.
AUTOMATICALLY_REQUEST_MISSING_OBJECTS: bool = SettingDescriptor(False)
ADDON_SCRIPTS: List[str] = SettingDescriptor(list)
FILTERS: Dict[str, str] = SettingDescriptor(dict)
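`EnvSettingDescriptor` makes settings like `SOCKS_PROXY_PORT` prefer their environment variable over the stored default. A self-contained sketch of that descriptor behaviour, using a hypothetical minimal class rather than hippolyzer's real `Settings` base:

```python
import os

class EnvSetting:
    """Descriptor returning a parsed env var if present, else a default."""
    def __init__(self, default, env_name, spec):
        self._default = default
        self._env_name = env_name
        self._spec = spec  # callable parsing the env string, e.g. int

    def __get__(self, obj, owner=None):
        val = os.getenv(self._env_name)
        if val is not None:
            return self._spec(val)
        return self._default

class DemoSettings:
    UDP_PORT = EnvSetting(9061, "HIPPO_UDP_PORT", int)
```

Reading the environment on every `__get__` (rather than once at import) means a test or launcher can change `HIPPO_UDP_PORT` and have the setting pick it up immediately.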