417 Commits

Author SHA1 Message Date
Salad Dais
6f87ec8725 Split up dependencies so core can be used without GUI deps 2025-11-24 01:44:08 +00:00
Salad Dais
fb885d8eec Add example addon for debugging avatar load times 2025-10-29 02:13:25 +00:00
Salad Dais
78281ed12b Make send off-circuit work correctly again 2025-10-29 02:08:13 +00:00
Salad Dais
4087eaa3c6 Don't trigger resends on off-circuit messages 2025-08-26 21:53:32 +00:00
Salad Dais
32428941d7 Fix up inventory handlers a little 2025-08-18 21:28:26 +00:00
Salad Dais
0cc3397402 Improve inventory handling 2025-07-15 01:53:24 +00:00
Salad Dais
0c2dfd3213 Pass EQ messages off to session message handler as well 2025-07-14 07:43:56 +00:00
Salad Dais
e119181e3f Handle RemoveInventoryObjects message 2025-07-14 07:43:36 +00:00
Salad Dais
64c7265578 Beautify JSON responses 2025-07-14 03:56:27 +00:00
Salad Dais
eb652152f5 Update some flags 2025-07-14 03:56:16 +00:00
Salad Dais
cd03dd4fdd Fix duplication not handling update messages properly 2025-07-07 23:54:26 +00:00
Salad Dais
056e142347 Add API for duplicating inventory folders / items 2025-07-07 22:52:38 +00:00
Salad Dais
927a353dec Use windows-2022 for CI, windows-2019 is retired 2025-07-06 05:47:16 +00:00
Salad Dais
bc68eeb7d2 Add shape creator example addon 2025-07-06 05:27:23 +00:00
Salad Dais
de79f42aa6 Start handling AvatarAppearance messages 2025-07-05 03:59:14 +00:00
Salad Dais
e138ae88a1 Start adding tests for inventory manager 2025-06-30 22:19:24 +00:00
Salad Dais
e20a4a01ad Add tools for mirroring animations 2025-06-28 04:23:18 +00:00
Salad Dais
a2b49fdc44 Allow updating skeleton definitions with attributes from mesh 2025-06-21 08:45:30 +00:00
Salad Dais
988a82179e Update templates 2025-06-18 20:44:11 +00:00
Salad Dais
4eb97b5958 Improve anim tracker addon 2025-06-18 20:43:49 +00:00
Salad Dais
4962d8e7bf Add example addon for debugging object animations starting / stopping 2025-06-15 17:44:40 +00:00
Salad Dais
a652779cc5 Add object inventory helpers to region object manager 2025-06-15 17:44:03 +00:00
Salad Dais
d7092e7733 Track animations for avatars and objects 2025-06-14 23:33:53 +00:00
Salad Dais
8b5a7ebecf Add RLV at home 2025-06-14 07:48:19 +00:00
Salad Dais
8effd431a6 Some typing fixups 2025-06-14 07:06:18 +00:00
Salad Dais
22fb44ef28 Move asset_type helper to WearableType, where it belongs 2025-06-14 03:21:44 +00:00
Salad Dais
c8dc67ea37 More inventory / wearables updates 2025-06-13 09:26:42 +00:00
Salad Dais
0dbba40fe1 Serialization template updates 2025-06-09 13:18:01 +00:00
Salad Dais
97e567be77 More inventory fixups 2025-06-09 13:17:42 +00:00
Salad Dais
76216ee390 More inventory code cleanup 2025-06-07 10:00:03 +00:00
Salad Dais
c60c2819ac Add more AIS-related util functions 2025-06-06 12:43:57 +00:00
Salad Dais
7cbef457cf Update inventory handling code 2025-06-05 16:33:26 +00:00
Salad Dais
4916bdc543 Relax UDP serialization behavior when previous var blocks missing 2025-06-05 16:08:55 +00:00
Salad Dais
bb0e88e460 Add more inventory-related utilities 2025-06-05 00:46:22 +00:00
Salad Dais
46e598cded Don't use setup.py for bundling 2025-05-26 19:15:33 +00:00
Salad Dais
ce130c4831 Use a newer cx_Freeze 2025-05-26 18:50:37 +00:00
Salad Dais
b6ac988601 Always fetch tags so SCM versioning works 2025-05-19 23:22:05 +00:00
Salad Dais
c8dbbef8fc Let's use newer Python versions 2025-05-19 23:14:40 +00:00
Salad Dais
a974f167d1 Update requirements and package dirs 2025-05-19 23:05:34 +00:00
Salad Dais
2d3b3daf10 Start switching to pyproject.toml 2025-05-19 22:49:05 +00:00
Salad Dais
1d54c70164 Update uses of recordclass and utcfromtimestamp() 2025-05-16 22:47:17 +00:00
Salad Dais
6dafe32f6a Update version to v0.15.6
I forgot I have to manually do it in this repo.
2025-04-18 04:33:00 +00:00
Salad Dais
3149d3610f Pin cx_freeze version 2025-04-18 04:30:11 +00:00
Salad Dais
f8f3bcfc36 Make PyPi stop whining about attestations 2025-04-18 04:26:42 +00:00
Salad Dais
8548cce4e5 Use new upload-artifact action 2025-04-18 04:19:52 +00:00
Salad Dais
ad2aca1803 Upgrade mitmproxy 2025-04-18 01:44:23 +00:00
Salad Dais
8cf500ce44 Be more verbose if we can't parse legacy schema 2025-04-18 01:43:10 +00:00
Salad Dais
ceda7f370e Update message template to upstream 2024-12-11 22:59:27 +00:00
Salad Dais
0692a10253 Add support for JankStringyBytes in LLSD 2024-12-11 22:58:56 +00:00
Salad Dais
c1c2a96295 Fix some event handling quirks 2024-12-11 22:56:50 +00:00
Salad Dais
b4be9fa757 Better handle resent reliable messages 2024-10-29 07:31:59 +00:00
Salad Dais
a8967f0b7d Handle unknown messages better 2024-10-29 07:31:35 +00:00
Salad Dais
10af5cc250 Handle more JankStringyBytes ops 2024-10-29 07:15:24 +00:00
Salad Dais
0ea1b0324e v0.15.2 2024-03-14 02:04:25 +00:00
Salad Dais
4ece6efe60 Fix #45, add support for attachment block in AvatarAppearance
This is just a guess based on what the data looks like. The message
template may not be representative of the actual template LL is using
and they may remove it at any time, but this seems close enough
to what is actually being used.

Also it stops the message from spamming me about unparsed data.
2024-03-14 01:44:00 +00:00
Salad Dais
15bc8e0ed2 Log when applying deferred inv calls 2024-02-20 04:56:25 +00:00
dependabot[bot]
33fad6339f Bump aiohttp from 3.9.1 to 3.9.2 (#43)
Bumps [aiohttp](https://github.com/aio-libs/aiohttp) from 3.9.1 to 3.9.2.
- [Release notes](https://github.com/aio-libs/aiohttp/releases)
- [Changelog](https://github.com/aio-libs/aiohttp/blob/master/CHANGES.rst)
- [Commits](https://github.com/aio-libs/aiohttp/compare/v3.9.1...v3.9.2)

---
updated-dependencies:
- dependency-name: aiohttp
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-19 22:46:10 -04:00
dependabot[bot]
93916104db Bump jinja2 from 3.1.2 to 3.1.3 (#42)
Bumps [jinja2](https://github.com/pallets/jinja) from 3.1.2 to 3.1.3.
- [Release notes](https://github.com/pallets/jinja/releases)
- [Changelog](https://github.com/pallets/jinja/blob/main/CHANGES.rst)
- [Commits](https://github.com/pallets/jinja/compare/3.1.2...3.1.3)

---
updated-dependencies:
- dependency-name: jinja2
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-21 19:45:05 -04:00
Salad Dais
3bb4fb0640 Basic AIS response handling in proxy 2024-01-19 04:37:14 +00:00
Salad Dais
c9495763e5 Defer inventory update processing til cache is loaded 2024-01-18 05:08:36 +00:00
Salad Dais
a7825a881c Start improving InventoryManager 2024-01-16 01:56:34 +00:00
Salad Dais
a6bbd97b98 Make sure asyncio.Tasks always have their exceptions logged 2024-01-15 22:24:16 +00:00
Salad Dais
3500212da0 Start handling messages in InventoryManager 2024-01-14 07:04:28 +00:00
Salad Dais
01ea9d7879 Improve MessageHandler resiliency 2024-01-14 07:00:20 +00:00
Salad Dais
f19e1b8bfb Upgrade to outleap 0.6.1 2024-01-10 20:34:21 +00:00
Salad Dais
f2202556d7 Mark as compatible with Python 3.12 2024-01-10 16:20:03 +00:00
Salad Dais
5a5b471fe4 v0.15.1 2024-01-10 16:12:23 +00:00
Salad Dais
ff0f20d1dd Correct parcel bitmap parsing 2024-01-10 07:27:50 +00:00
Salad Dais
4898c852c1 Cache render materials in proxy object manager 2024-01-09 13:42:45 +00:00
Salad Dais
adf5295e2b Add start of ProxyParcelManager 2024-01-09 13:41:37 +00:00
Salad Dais
7514baaa5f Add serializer for ParcelProperty bitmaps 2024-01-09 13:40:52 +00:00
Salad Dais
0ba1a779ef Allow handling EQ events through message_handler in proxy 2024-01-09 13:40:07 +00:00
Salad Dais
3ea8a27914 Bitten by YAML floatification... 2024-01-09 12:26:30 +00:00
Salad Dais
2451ad3674 v0.15.0 2024-01-09 12:19:53 +00:00
Salad Dais
25804df238 Windows build needs mitmproxy-windows 2024-01-09 12:09:18 +00:00
Salad Dais
474173ba54 Update workflow python versions 2024-01-09 09:21:12 +00:00
Salad Dais
049a3b703f Update requirements 2024-01-09 09:19:15 +00:00
Salad Dais
ac77fde892 Update mitmproxy, change required Python to 3.10 2024-01-09 09:17:05 +00:00
Salad Dais
6ee9b22923 Start updating Windows release bundling 2024-01-09 08:53:33 +00:00
Salad Dais
f355138cd2 Update requirements 2024-01-08 22:43:08 +00:00
dependabot[bot]
478d135d1f Bump pygments from 2.10.0 to 2.15.0 (#40)
Bumps [pygments](https://github.com/pygments/pygments) from 2.10.0 to 2.15.0.
- [Release notes](https://github.com/pygments/pygments/releases)
- [Changelog](https://github.com/pygments/pygments/blob/master/CHANGES)
- [Commits](https://github.com/pygments/pygments/compare/2.10.0...2.15.0)

---
updated-dependencies:
- dependency-name: pygments
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-08 17:03:58 -04:00
dependabot[bot]
80c9acdabe Bump tornado from 6.1 to 6.3.3 (#41)
Bumps [tornado](https://github.com/tornadoweb/tornado) from 6.1 to 6.3.3.
- [Changelog](https://github.com/tornadoweb/tornado/blob/master/docs/releases.rst)
- [Commits](https://github.com/tornadoweb/tornado/compare/v6.1.0...v6.3.3)

---
updated-dependencies:
- dependency-name: tornado
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-08 17:03:23 -04:00
dependabot[bot]
d4eaa7c543 Bump urllib3 from 1.26.7 to 1.26.18 (#38)
Bumps [urllib3](https://github.com/urllib3/urllib3) from 1.26.7 to 1.26.18.
- [Release notes](https://github.com/urllib3/urllib3/releases)
- [Changelog](https://github.com/urllib3/urllib3/blob/main/CHANGES.rst)
- [Commits](https://github.com/urllib3/urllib3/compare/1.26.7...1.26.18)

---
updated-dependencies:
- dependency-name: urllib3
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-08 17:00:00 -04:00
dependabot[bot]
2571550da4 Bump aiohttp from 3.8.3 to 3.9.0 (#37)
Bumps [aiohttp](https://github.com/aio-libs/aiohttp) from 3.8.3 to 3.9.0.
- [Release notes](https://github.com/aio-libs/aiohttp/releases)
- [Changelog](https://github.com/aio-libs/aiohttp/blob/master/CHANGES.rst)
- [Commits](https://github.com/aio-libs/aiohttp/compare/v3.8.3...v3.9.0)

---
updated-dependencies:
- dependency-name: aiohttp
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-08 16:55:37 -04:00
Salad Dais
b3ee3a3506 Add packet stats addon example 2024-01-08 00:03:45 +00:00
Salad Dais
11feccd93b Add support for Material inventory types 2024-01-07 17:47:09 +00:00
Salad Dais
bb6ce5c013 Handle binary LLSD headers generated by indra 2024-01-07 17:46:54 +00:00
Salad Dais
a35aa9046e v0.14.3 2024-01-07 08:00:21 +00:00
Salad Dais
6c32da878d Handle (and ignore by default) the new GenericStreamingMessage
This is _enormously_ spammy, good god. Apparently related to PBR.
2024-01-07 07:51:52 +00:00
Salad Dais
49c54bc896 Automatically request all materials by default 2024-01-06 21:50:29 +00:00
Salad Dais
4c9fa38ffb Move material management to ClientObjectManager 2024-01-06 21:40:49 +00:00
Salad Dais
2856e78f16 Start adding MaterialManager for RenderMaterials 2024-01-06 20:40:04 +00:00
Salad Dais
33884925f4 enum.IntFlag -> IntFlag 2024-01-06 20:39:29 +00:00
Salad Dais
a11ef96d9a Serve inbound Xfers reliably 2024-01-05 02:53:05 +00:00
Salad Dais
7b6239d66a Add more parcel enums 2024-01-05 02:49:51 +00:00
Salad Dais
2c3bd140ff Update MapImageFlags 2024-01-04 22:24:36 +00:00
Salad Dais
9d2087a0fb Add ParcelManager to HippoClient 2024-01-04 21:45:54 +00:00
Salad Dais
67db8110a1 Fix ParcelOverlay data template 2024-01-04 20:01:32 +00:00
Salad Dais
ab1c56ff3e Start writing client parcel manager 2024-01-04 19:51:47 +00:00
Salad Dais
142f2e42ca Clean up message template code 2024-01-04 19:08:09 +00:00
Salad Dais
e7764c1665 Display templated EQ messages as templated messages
This makes them less annoying to read, and allows us to use
subfield serializers to pretty-print their contents.
2024-01-04 18:00:14 +00:00
Salad Dais
582cfea47c Send AgentUpdate after connecting to main region 2024-01-03 07:53:47 +00:00
Salad Dais
6f38d84a1c Add ParcelOverlay serializers 2024-01-03 07:51:51 +00:00
Salad Dais
1fc46e66bc Support __add__ and __radd__ on JankStringyBytes 2023-12-31 15:58:05 +00:00
Salad Dais
167673aa08 Be nicer about zero-length strings in Messages 2023-12-31 15:52:15 +00:00
Salad Dais
5ad8ee986f Keep track of user's groups in their session 2023-12-31 15:28:00 +00:00
Salad Dais
e9d7ee7e8e ObjectUpdateType.OBJECT_UPDATE -> ObjectUpdateType.UPDATE 2023-12-31 14:57:28 +00:00
Salad Dais
d21c3ec004 Update templates 2023-12-31 14:55:46 +00:00
Salad Dais
01c6931d53 v0.14.2 2023-12-24 18:05:05 +00:00
Salad Dais
493563bb6f Add a few asset type lookups 2023-12-24 06:47:04 +00:00
Salad Dais
ca5c71402b Bump Python requirement to 3.9 2023-12-24 05:57:14 +00:00
Salad Dais
ad765a1ede Load inventory cache in a background thread
llsd.parse_notation() is slow as hell, no way around it.
2023-12-24 05:55:56 +00:00
Salad Dais
9adee14e0f Allow non-byte legacy schema flag fields 2023-12-23 15:40:00 +00:00
Salad Dais
57c4bd0e7c Improve AIS support 2023-12-22 21:25:05 +00:00
Salad Dais
1085dbc8ab v0.14.1 2023-12-22 04:38:30 +00:00
Salad Dais
fb9740003e Fix a couple AIS cases 2023-12-22 04:38:30 +00:00
Salad Dais
087f16fbc5 Simplify Inventory/AssetType legacy conversion 2023-12-22 03:57:36 +00:00
Salad Dais
fa96e80590 Simplify AIS<->InventoryData conversion 2023-12-22 02:40:53 +00:00
Salad Dais
539d38fb4a Fix legacy serialization for categories 2023-12-21 22:09:48 +00:00
Salad Dais
caaf0b0e13 Add tests for legacy category parsing 2023-12-21 20:12:41 +00:00
Salad Dais
16958e516d More enumification in inventory code 2023-12-21 19:18:58 +00:00
Salad Dais
74e4e0c4ec Start supporting enums in inventory schema 2023-12-21 14:55:14 +00:00
Salad Dais
3efeb46500 Add notes about inventory compatibility issues 2023-12-21 06:41:47 +00:00
Salad Dais
0f2e933be1 Make legacy input schema round-trip correctly 2023-12-20 22:26:03 +00:00
Salad Dais
a7f40b0d15 Properly handle inventory metadata field 2023-12-20 03:23:03 +00:00
Salad Dais
e6ac99458f v0.14.0 2023-12-20 01:38:31 +00:00
Salad Dais
92cadf26e9 Support inventory cache v3 2023-12-20 01:21:54 +00:00
Salad Dais
305038a31d Add HippoClient.main_caps_client convenience property 2023-12-20 00:58:12 +00:00
Salad Dais
bd67d6f19f Split out RLV handling 2023-12-20 00:49:16 +00:00
Salad Dais
81eae4edbf Make default log level less insane 2023-12-19 18:43:08 +00:00
Salad Dais
776ef71574 Fix participant removal on session close 2023-12-19 18:41:46 +00:00
Salad Dais
31125ca489 Defer returning from join_session() until we're a participant 2023-12-19 06:38:35 +00:00
Salad Dais
29ab108764 Store capture and render device info for voice 2023-12-19 05:30:21 +00:00
Salad Dais
61820f1670 Better handling of client start locations 2023-12-19 04:24:47 +00:00
Salad Dais
7fafb8b5ae message_handler -> event_handler 2023-12-19 01:31:49 +00:00
Salad Dais
28e84c0c5a Clean up session joining code 2023-12-18 23:32:57 +00:00
Salad Dais
e629214bef Switch voice stuff to use MessageHandler for events 2023-12-18 23:18:25 +00:00
Salad Dais
5e9433b4a4 3d_position -> 3d_pos 2023-12-18 21:34:39 +00:00
Salad Dais
5f2082c6e9 Minor cleanup of asyncio usage 2023-12-18 21:32:25 +00:00
Salad Dais
12c0deadee Add tests for setting voice region pos 2023-12-18 21:16:35 +00:00
Salad Dais
6da766ef22 Add test for joining voice session 2023-12-18 20:11:21 +00:00
Salad Dais
f278a4bfcf Use asyncio.Event when events should be re-awaitable 2023-12-18 18:34:14 +00:00
Salad Dais
631fe91049 Correct coveragerc exclude_lines 2023-12-18 07:27:35 +00:00
Salad Dais
159f39227a Add more voice client tests 2023-12-18 07:08:37 +00:00
Salad Dais
670acef0b4 Add tests for voice connector setup 2023-12-18 06:10:51 +00:00
Salad Dais
1165769aca Start writing voice client tests 2023-12-18 05:34:33 +00:00
Salad Dais
613dd32a40 Add tests for voice stuff 2023-12-18 03:29:40 +00:00
Salad Dais
d7a88f904e Add voice-related tooling 2023-12-18 02:02:39 +00:00
Salad Dais
a8344a231b Make hippolyzer events awaitable 2023-12-17 23:37:10 +00:00
Salad Dais
11043e365a On second thought, don't handle EnableSimulator at all 2023-12-16 21:51:56 +00:00
Salad Dais
ad34ba78ea Handle EnableSimulator correctly in client 2023-12-16 20:53:38 +00:00
Salad Dais
f9b4ae1308 Get rid of decorator so we don't mess up type signature 2023-12-16 20:34:10 +00:00
Salad Dais
7fee8f6bfe Fix Python 3.8 2023-12-16 20:08:09 +00:00
Salad Dais
2e0ca3649c Use Future instead of Event for connected signal 2023-12-16 17:29:35 +00:00
Salad Dais
e0d44741e9 Better teleport request handling 2023-12-16 04:44:49 +00:00
Salad Dais
008d59c7d6 Fix Python 3.8 2023-12-15 21:34:45 +00:00
Salad Dais
ed03b0d49f Add a teleport method to client 2023-12-15 21:32:45 +00:00
Salad Dais
4cc1513e58 Correct type signatures in MessageHandler 2023-12-15 19:07:17 +00:00
Salad Dais
c768aeaf40 Be smarter about clearing out ObjectManagers 2023-12-15 17:18:35 +00:00
Salad Dais
42ebb0e915 Fix multi-region connections 2023-12-15 17:08:00 +00:00
Salad Dais
31ba9635eb WIP multi-region support for client 2023-12-15 00:55:14 +00:00
Salad Dais
dc58512ee6 Better handle sim disconnects in client 2023-12-14 23:22:32 +00:00
Salad Dais
4a58731441 Make client circuits easier to work with 2023-12-14 12:33:23 +00:00
Salad Dais
c2b92d2d7d Add test for non-templated EQ events 2023-12-14 10:10:41 +00:00
Salad Dais
640b384d27 Add tests for resend suppression 2023-12-14 09:31:19 +00:00
Salad Dais
a2ef3d9f8e More client refactoring 2023-12-14 09:14:07 +00:00
Salad Dais
0456b4b62d Make main region caps less annoying to work with 2023-12-14 02:19:11 +00:00
Salad Dais
92c9c82e73 Move some things from session to region 2023-12-14 02:08:12 +00:00
Salad Dais
c5ed1cff24 Handle non-templated EQ events in client 2023-12-14 01:23:57 +00:00
Salad Dais
0710735546 Make client handle ping checks 2023-12-13 22:01:34 +00:00
Salad Dais
7869df224e Simplify chat client example 2023-12-13 20:42:21 +00:00
Salad Dais
6f6274ec7d Add client example 2023-12-13 19:19:14 +00:00
Salad Dais
40da130066 Update docs related to client 2023-12-13 17:57:48 +00:00
Salad Dais
5947d52c8d Add inventory manager to client 2023-12-13 17:52:03 +00:00
Salad Dais
e4b73a7196 Don't take by default in client messagehandlers 2023-12-13 04:18:49 +00:00
Salad Dais
1ded1180dc Clean up client tests 2023-12-13 04:10:43 +00:00
Salad Dais
5517d60e7a Use correct user-agent for hippolyzer client 2023-12-12 22:20:39 +00:00
Salad Dais
ed7e42625e Add Hippolyzer proxy support to client 2023-12-12 22:15:28 +00:00
Salad Dais
d5cde896fb Add tests for client EQ handling 2023-12-12 21:47:34 +00:00
Salad Dais
007c79f4a7 Add basic EQ handling to client 2023-12-12 21:17:47 +00:00
Salad Dais
f1b523b5de Support client seed cap, support async message handlers 2023-12-11 21:47:15 +00:00
Salad Dais
c42e0d7291 Make client login testable 2023-12-11 19:08:01 +00:00
Salad Dais
1ee1b9acc6 Basic working client 2023-12-10 23:55:19 +00:00
Salad Dais
9904633a99 More client work 2023-12-10 23:26:28 +00:00
Salad Dais
c8791db75e Start adding client-related lib files 2023-12-10 19:52:24 +00:00
Salad Dais
21d1c7ebfe v0.13.4 2023-12-07 18:47:43 +00:00
Salad Dais
996a43be5b Add option to allow insecure upstream SSL connections 2023-12-07 18:44:10 +00:00
Salad Dais
9e8127e577 Don't use asyncio.get_running_loop() 2023-12-06 20:35:55 +00:00
Salad Dais
cfcd324a11 Pin to Werkzeug under 3.0 2023-12-06 20:35:39 +00:00
Salad Dais
6872634bf4 Be more resilient when faced with no cap_data 2023-12-06 20:35:18 +00:00
Salad Dais
091090c6fd Reparent avatars correctly when recalculating linksets 2023-12-03 23:51:11 +00:00
Salad Dais
bd4fff4200 Add support for PBR / reflection probes 2023-12-03 23:50:32 +00:00
Salad Dais
52dfd0be05 v0.13.3 2023-10-10 23:23:57 +00:00
Salad Dais
60f1737115 Appease new flake8 rules 2023-10-10 23:20:43 +00:00
Salad Dais
7a5d6baf02 Make failing to load invcache non-fatal 2023-10-10 23:15:15 +00:00
Salad Dais
44a332a77b Handle failing to load an addon correctly 2023-10-10 23:14:59 +00:00
Salad Dais
beb0a2d6a4 v0.13.2 2023-07-06 21:49:35 +00:00
Salad Dais
9be66df52b Add AgentFOV to default message ignorelist
It's incredibly spammy when the mesh upload preview is open
2023-07-06 21:48:46 +00:00
Salad Dais
da0117db1b v0.13.1 2023-07-05 20:29:40 +00:00
Salad Dais
4dbf01a604 Blacklist new versions of recordclass 2023-07-05 20:27:05 +00:00
Salad Dais
36858ed3e2 Fix flake error 2023-06-18 18:37:14 +00:00
Salad Dais
370c586582 Decode more flags fields 2023-06-18 18:33:52 +00:00
Salad Dais
fdfffd96c9 Fix UUID serialization with invalid AIS LLSD payloads 2023-06-18 18:33:26 +00:00
Salad Dais
6da9f58b23 Pass original Message through to objectupdate hooks 2023-06-18 18:29:51 +00:00
Salad Dais
12e3912a37 Update README.md
This isn't even in there anymore!
2023-02-07 19:43:51 +00:00
Salad Dais
8147e7e1d7 Remove stylesheet from message builder 2023-02-07 19:43:29 +00:00
Salad Dais
19dba6651c v0.13.0 2023-02-07 19:36:22 +00:00
Salad Dais
274f96c710 Run CI tests on Python 3.11 instead of 3.10 2023-02-07 18:49:14 +00:00
Salad Dais
09e1d0b6fc Remove custom stylesheet for HTTP request / response panes 2023-02-07 18:49:14 +00:00
dependabot[bot]
f4fb68e310 Bump certifi from 2021.10.8 to 2022.12.7 (#34)
Bumps [certifi](https://github.com/certifi/python-certifi) from 2021.10.8 to 2022.12.7.
- [Release notes](https://github.com/certifi/python-certifi/releases)
- [Commits](https://github.com/certifi/python-certifi/compare/2021.10.08...2022.12.07)

---
updated-dependencies:
- dependency-name: certifi
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2023-02-07 13:39:26 -04:00
Salad Dais
8edf7ae89b Rough cut of Python 3.11 support 2023-02-07 17:35:44 +00:00
Salad Dais
b6458e9eb7 Add mute enum definitions 2022-11-15 06:24:28 +00:00
Salad Dais
375af1e7f6 Improvements to Object and Skeleton APIs 2022-11-14 21:54:52 +00:00
Salad Dais
76d0a72590 Fix ObjectUpdateBlame addon example always requesting 2022-11-01 23:06:17 +00:00
Salad Dais
3255556835 Add CreationDate SubfieldSerializer 2022-11-01 08:18:40 +00:00
Salad Dais
d19122c039 Fix copy/paste error in puppetry addon 2022-10-27 16:10:05 +00:00
Salad Dais
5692f7b8b6 Add WIP puppetry code 2022-10-19 02:11:04 +00:00
Salad Dais
21cea0f009 Claim LEAP client when session is first created 2022-10-19 02:06:35 +00:00
Salad Dais
193d762132 Give each addon a separate addon_ctx bucket
This fixes addons being able to accidentally stomp all over each
others' state just because they happened to use the same name for
a SessionProperty.
2022-10-18 22:40:15 +00:00
Salad Dais
227fbf7a2e Improve avatar skeleton implementation 2022-10-18 19:39:39 +00:00
Salad Dais
25a397bcc5 add LEAP client connection addon hook 2022-10-17 21:28:11 +00:00
Salad Dais
b0dca80b87 Simplify MetaBaseAddon 2022-10-15 22:56:32 +00:00
Salad Dais
ea475b528f v0.12.2 2022-10-14 06:17:07 +00:00
Salad Dais
2036e3c5b3 Add LEAP / outleap support 2022-10-14 06:11:51 +00:00
Salad Dais
584d9f11e8 Use llsd package instead of llbase.llsd 2022-10-14 03:47:48 +00:00
Salad Dais
df020281f1 Remove send_message() alias 2022-09-28 11:46:24 +00:00
Salad Dais
78c1b8869e Remove LEAP-related code
It lives in https://github.com/SaladDais/outleap now.
Hippolyzer-specific integration will be added back in later.
2022-09-19 04:37:31 +00:00
Salad Dais
87d5e8340b Split LEAPProtocol out of LEAPClient 2022-09-18 18:05:16 +00:00
Salad Dais
e6423d2f43 More work on LEAP API wrappers 2022-09-18 07:49:18 +00:00
Salad Dais
fac44a12b0 Update cap templates 2022-09-18 05:05:00 +00:00
Salad Dais
99ca7b1674 Allow paths for text_input() 2022-09-18 05:04:36 +00:00
Salad Dais
e066724a2f Add API wrappers for LLUI and LLWindow LEAP APIs 2022-09-18 03:28:20 +00:00
Salad Dais
dce032de31 Get both scoped and unscoped LEAP listeners working 2022-09-17 22:30:47 +00:00
Salad Dais
2f578b2bc4 More LEAP work 2022-09-17 08:50:52 +00:00
Salad Dais
0c1656e6ab Start of basic LEAP client / forwarding agent 2022-09-16 09:06:01 +00:00
Salad Dais
2b6d8a70f4 v0.12.1 2022-09-12 14:30:18 +00:00
Salad Dais
1a308e9671 Mesh serialization clarifications 2022-09-12 14:17:33 +00:00
Salad Dais
7b21e5634c Slightly faster weights (de)serialization 2022-09-10 00:04:49 +00:00
Salad Dais
e4548a285d Serialize LLMesh internals with NumPy
Easy 2x speedup! Still need to do the vertex weights, but those
have irregular alignment.
2022-09-08 23:44:53 +00:00
Salad Dais
72e926f04c Better bind shape application 2022-09-08 18:58:28 +00:00
Salad Dais
d9fa14b17c Faster vec3 normalization 2022-09-08 18:27:01 +00:00
Salad Dais
33c5abaaf4 Clarify glTF comments 2022-09-08 17:17:54 +00:00
Salad Dais
2dfd61fcc5 Only calculate inverse transpose bind shape matrix once 2022-09-08 05:48:52 +00:00
Salad Dais
eb58e747ce Fix glTF skinning implementation
Still a little funky, but things display correctly now.
2022-09-08 00:32:10 +00:00
Salad Dais
1d221a2289 glTF: Apply bone scale and rotation to inverse bind matrices instead
Blender can't do anything intelligent with them. Fixes major display
issues for collision volume scaling. Figure out how to round-trip
correctly on export later.
2022-09-02 06:47:09 +00:00
Salad Dais
2ffd0458d0 More glTF cleanup 2022-09-01 20:20:02 +00:00
Salad Dais
25f533a31b glTF fixups, parse skeleton definition from avatar_skeleton.xml 2022-09-01 16:57:36 +00:00
Salad Dais
570dbce181 Add WIP glTF conversion code
Related to #24
2022-08-29 14:10:56 +00:00
Salad Dais
ccb63e971b Reorganize collada code a bit 2022-08-29 13:49:55 +00:00
Salad Dais
8be4bce8bc Make mesh uploader handle multi-faced meshes 2022-08-22 01:15:35 +00:00
Salad Dais
e945706d2b Don't hardcode VisualParams path 2022-08-21 04:52:30 +00:00
Salad Dais
6c748a6ab2 More collada notes 2022-08-21 04:52:05 +00:00
Salad Dais
6abc7ca7d2 Fix colladatools log call 2022-08-19 16:57:31 +00:00
Salad Dais
c57e0e467c Better handle dynamically-imported hot_reload()s 2022-08-19 16:54:42 +00:00
Salad Dais
e46b4adad2 Update collada notes 2022-08-18 15:44:23 +00:00
Salad Dais
5ef9b5354a v0.12.0 2022-08-18 15:13:02 +00:00
Salad Dais
34ca7d54be Support formatting SL's busted login endpoint responses 2022-08-18 14:40:33 +00:00
Salad Dais
cb316f1992 Only load the newest version of an agent's inventory cache
This isn't entirely correct, but without a cross-platform way to
map specifically the requesting viewer to a cache directory this
is the least annoying thing we can do.
2022-08-18 14:39:49 +00:00
Salad Dais
da05a6cf1f Begin reshuffling inventory management code 2022-08-18 14:30:42 +00:00
Salad Dais
f06c31e225 Greatly improve matrix handling logic in collada code 2022-08-18 14:29:28 +00:00
Salad Dais
b4e5596ca2 Add more utils for converting between quat and euler 2022-08-08 00:38:09 +00:00
Salad Dais
49a54ce099 Fix anim mangler exceptions causing reload to fail 2022-08-07 04:42:06 +00:00
Salad Dais
0349fd9078 Fix RLV command parser to better match RLV's actual behavior 2022-08-02 08:18:28 +00:00
Salad Dais
118ef2813a Fix new flake8 lint errors 2022-08-01 01:41:15 +00:00
Salad Dais
256f74b71a Add InventoryManager to proxy Session object 2022-07-31 18:31:56 +00:00
Salad Dais
4a84453ca4 Add start of proxy inventory manager 2022-07-31 16:54:57 +00:00
Salad Dais
34316cb166 Fix LLSD notation serialization with embedded newline 2022-07-30 14:39:48 +00:00
Salad Dais
0f7d35cdca Handle HTTP messages with missing (not just empty) body 2022-07-30 00:37:35 +00:00
Salad Dais
2ee8a6f008 Clean up typing to appease the linter 2022-07-28 18:26:05 +00:00
Salad Dais
848a6745c0 v0.11.3 2022-07-28 03:55:22 +00:00
Salad Dais
0cbbedd27b Make assignments on BaseAddon class objects work as expected
The descriptors were being silently clobbered for a while now, and
I never noticed. Oops!
2022-07-28 03:39:53 +00:00
Salad Dais
e951a5b5c3 Make datetime objects (de)serialize in binary LLSD more accurately
Fixes some precision issues with LLBase's LLSD serialization stuff
where the microseconds component was dropped. May still get some
off-by-one serialization differences due to rounding.
2022-07-27 22:42:58 +00:00
Salad Dais
68bf3ba4a2 More comments in mesh module 2022-07-27 22:21:42 +00:00
Salad Dais
5b4f8f03dc Use same compression ratio for LLSD as indra 2022-07-27 22:16:31 +00:00
Salad Dais
d7c2215cbc Remove special Firestorm section from readme
The new Firestorm release added proxy configuration back in.
2022-07-27 02:50:06 +00:00
Salad Dais
629e59d3f9 Add option to upload mesh deformer directly 2022-07-26 04:13:15 +00:00
Salad Dais
8f68bc219e Split up deformer helper a little 2022-07-26 03:44:32 +00:00
Salad Dais
ba296377de Save mesh deformers as files rather than uploading directly 2022-07-26 02:12:54 +00:00
Salad Dais
e34927a996 Improve AssetUploader API, make uploader example addon use it 2022-07-26 00:11:37 +00:00
Salad Dais
3c6a917550 Add command to deformer_helper addon that uploads mesh deformers
Sometimes these are preferable to deformer anims.
2022-07-25 23:11:15 +00:00
Salad Dais
dbae2acf27 Add basic AssetUploader class
Should make it less anoying to upload procedurally generated mesh
outside of local mesh mode
2022-07-25 22:08:28 +00:00
Salad Dais
722e8eeabf v0.11.2 2022-07-24 09:02:02 +00:00
Salad Dais
a6a26a9999 Make sure module unload hooks always run
Fixes anim and mesh mangler addons not getting their manglers unregistered
2022-07-24 08:57:47 +00:00
Salad Dais
a6328d5aee Update get_task_inventory_cap example 2022-07-22 04:04:13 +00:00
Salad Dais
4e76ebe7cf Fix get_task_inventory_cap example 2022-07-21 21:44:32 +00:00
Salad Dais
c0a26ffb57 Send proxy-created Messages reliably where appropriate 2022-07-21 21:44:06 +00:00
Salad Dais
7dfb10cb51 Make TextureEntry deserialization lazy in the ObjectUpdate case too 2022-07-21 08:05:25 +00:00
Salad Dais
de33906db5 Add a couple more enum defs 2022-07-21 08:05:17 +00:00
Salad Dais
605337b280 Remove erroneous comment 2022-07-20 21:30:03 +00:00
Salad Dais
235cd4929f Update message template to add new messages / blocks 2022-07-20 21:23:28 +00:00
Salad Dais
220a02543e v0.11.1 2022-07-20 20:38:17 +00:00
Salad Dais
8ac47c2397 Fix use of dynamically imported globals in REPL 2022-07-20 20:30:41 +00:00
Salad Dais
d384978322 UpdateType -> ObjectUpdateType 2022-07-20 20:26:50 +00:00
Salad Dais
f02a479834 Add get_task_inventory_cap.py addon example
An example of mocking out actually useful behavior for the viewer.
Better (faster!) task inventory fetching API.
2022-07-20 09:20:27 +00:00
Salad Dais
b5e8b36173 Add more enum and flag defs to templates.py 2022-07-20 06:35:04 +00:00
Salad Dais
08a39f4df7 Make object update handling more robust 2022-07-20 06:35:04 +00:00
Salad Dais
61ec51beec Add demo autoattacher addon example 2022-07-19 23:48:40 +00:00
Salad Dais
9adbdcdcc8 Add a couple more flag definitions to templates.py 2022-07-19 09:49:43 +00:00
Salad Dais
e7b05f72ca Dequantize TimeDilation message var 2022-07-19 05:57:19 +00:00
Salad Dais
75f2f363a4 Handle TE glow field quantization 2022-07-18 22:29:37 +00:00
Salad Dais
cc1bb9ac1d Give MediaFlags and BasicMaterials sensible default values 2022-07-18 22:08:06 +00:00
Salad Dais
d498d1f2c8 v0.11.0 2022-07-18 08:53:24 +00:00
Salad Dais
8c0635bb2a Add classmethod for rebuilding TEs into a TECollection 2022-07-18 06:37:20 +00:00
Salad Dais
309dbeeb52 Add TextureEntry.st_to_uv() to convert between coords 2022-07-18 00:34:56 +00:00
Salad Dais
4cc87bf81e Add a default value for TextureEntryCollection.realize() num_faces 2022-07-17 01:09:22 +00:00
Salad Dais
f34bb42dcb TextureEntry -> TextureEntryCollection, improve .realize()
The "TextureEntry" name from the message template is kind of a
misnomer, the field actually includes multiple TextureEntries.
2022-07-17 00:45:20 +00:00
Salad Dais
59ec99809a Correct TE rotation quantization
Literally everything has its own special float quantization. Argh.
2022-07-16 23:17:34 +00:00
Salad Dais
4b963f96d2 Add TextureEntry.realize() to ease indexing into specific faces 2022-07-14 03:10:11 +00:00
Salad Dais
58db8f66de Correct type signatures for TextureEntry 2022-07-10 17:58:13 +00:00
Salad Dais
95623eba58 More InventoryModel fixes 2022-07-10 01:55:34 +00:00
Salad Dais
8dba0617bd Make injecting inventory EQ events easier 2022-07-09 04:21:44 +00:00
Salad Dais
289073be8e Add InventoryModel diffing 2022-07-09 02:48:23 +00:00
Salad Dais
f3c8015366 Support mutable InventoryModels 2022-07-08 22:06:14 +00:00
Salad Dais
99e8118458 Support HIPPO XML directives in injected EQ events 2022-07-05 14:24:35 +00:00
Salad Dais
80745cfd1c Add TextureEntry.unwrap() to ease working with potentially lazy TEs 2022-07-05 03:08:52 +00:00
Salad Dais
92a06bccaf Dequantize OffsetS and OffsetT in TextureEntrys 2022-07-05 02:08:53 +00:00
Salad Dais
fde9ddf4d9 Initial work to support in-flight EQ response pre-emption 2022-07-04 17:57:05 +00:00
Salad Dais
03a56c9982 Auto-load certain symbols in REPL, add docs for REPL 2022-06-27 01:49:27 +00:00
Salad Dais
d07a0df0fd WIP LLMesh -> Collada
First half of the LLMesh -> Collada -> LLMesh transform for #24
2022-06-24 13:15:20 +00:00
Salad Dais
848397fe63 Fix windows build workflow 2022-06-24 07:36:51 +00:00
Salad Dais
0f9246c5c6 Use github.ref_name instead of github.ref 2022-06-24 02:32:50 +00:00
Salad Dais
2e7f887970 v0.10.0 2022-06-24 01:54:37 +00:00
Salad Dais
ef9df6b058 Update Windows bundling action to add artifact to release 2022-06-24 01:12:21 +00:00
Salad Dais
baae0f6d6e Fix TupleCoord negation 2022-06-21 07:15:49 +00:00
Salad Dais
0f369b682d Upgrade to mitmproxy 8.0
Not 8.1 since that drops Python 3.8 support. Closes #26
2022-06-20 15:15:57 +00:00
Salad Dais
1f1e4de254 Add addon for testing object manager conformance against viewer
Closes #18
2022-06-20 12:38:11 +00:00
Salad Dais
75ddc0a5ba Be smarter about object cache miss autorequests 2022-06-20 12:33:12 +00:00
Salad Dais
e4cb168138 Clear up last few event loop warnings 2022-06-20 12:31:08 +00:00
Salad Dais
63aebba754 Clear up some event loop deprecation warnings 2022-06-20 05:55:01 +00:00
Salad Dais
8cf1a43d59 Better defaults when parsing ObjectUpdateCompressed
This helps our view of the cache better match the viewer's VOCache
2022-06-20 03:23:46 +00:00
Salad Dais
bbc8813b61 Add unary minus for TupleCoords 2022-06-19 04:33:20 +00:00
Salad Dais
5b51dbd30f Add workaround instructions for most recent Firestorm release
Closes #25
2022-05-13 23:52:50 +00:00
Salad Dais
295c7972e7 Use windows-2019 runner instead of windows-latest
windows-latest has some weird ACL changes that cause the cx_Freeze
packaging steps to fail.
2022-05-13 23:39:37 +00:00
Salad Dais
b034661c38 Revert "Temporarily stop generating lib_licenses.txt automatically"
This reverts commit f12fd95ee1.
2022-05-13 23:39:09 +00:00
Salad Dais
f12fd95ee1 Temporarily stop generating lib_licenses.txt automatically
Something is busted with pip-licenses in CI. Not sure why, but
it's only needed for Windows builds anyway.
2022-03-12 19:15:59 +00:00
Salad Dais
bc33313fc7 v0.9.0 2022-03-12 18:40:38 +00:00
Salad Dais
affc7fcf89 Clarify comment in proxy object manager 2022-03-05 11:03:28 +00:00
Salad Dais
b8f1593a2c Allow filtering on HTTP status code 2022-03-05 10:50:09 +00:00
Salad Dais
7879f4e118 Split up mitmproxy integration test a bit 2022-03-05 10:49:55 +00:00
Salad Dais
4ba611ae01 Only apply local mesh to selected links 2022-02-28 07:32:46 +00:00
Salad Dais
82ff6d9c64 Add more TeleportFlags 2022-02-28 07:32:22 +00:00
Salad Dais
f603ea6186 Better handle timeouts that have missing cap_data metadata 2021-12-18 20:43:10 +00:00
Salad Dais
fcf6a4568b Better handling for proxied HTTP requests that timeout 2021-12-17 19:27:20 +00:00
Salad Dais
2ad6cc1b51 Better handle broken 'LLSD' responses 2021-12-17 00:18:51 +00:00
Salad Dais
025f7d31f2 Make sure .queued is cleared if message take()n twice 2021-12-15 20:17:54 +00:00
Salad Dais
9fdb281e4a Create example addon for simulating packet loss 2021-12-13 06:12:43 +00:00
Salad Dais
11e28bde2a Allow filtering message log on HTTP headers 2021-12-11 15:08:45 +00:00
Salad Dais
1faa6f977c Update docs on send() and send_reliable() 2021-12-10 13:41:20 +00:00
Salad Dais
6866e7397f Clean up cap registration API 2021-12-10 13:22:54 +00:00
Salad Dais
fa0b3a5340 Mark all Messages synthetic unless they came off the wire 2021-12-10 07:30:02 +00:00
Salad Dais
16c808bce8 Match viewer resend behaviour 2021-12-10 07:04:36 +00:00
Salad Dais
ec4b2d0770 Move last of the explicit direction params 2021-12-10 06:50:07 +00:00
Salad Dais
3b610fdfd1 Add awaitable send_reliable() 2021-12-09 05:30:35 +00:00
Salad Dais
8b93c5eefa Rename send_message() to send() 2021-12-09 05:30:12 +00:00
Salad Dais
f4bb9eae8f Fix __contains__ for JankStringyBytes 2021-12-09 03:48:29 +00:00
Salad Dais
ecb14197cf Make message log filter highlight every matched field
Previously only the first match was being highlighted.
2021-12-09 01:14:09 +00:00
Salad Dais
95fd58e25a Begin PySide6 cleanup 2021-12-09 00:02:48 +00:00
Salad Dais
afc333ab49 Improve highlighting of matched fields in message log 2021-12-08 23:50:16 +00:00
Salad Dais
eb6406bca4 Fix ACK collection logic for injected reliable messages 2021-12-08 22:29:29 +00:00
Salad Dais
d486aa130d Add support for specifying flags in message builder 2021-12-08 21:10:06 +00:00
Salad Dais
d66d5226a2 Initial implementation of reliable injected packets
See #17. Not yet tested for real.
2021-12-08 04:49:45 +00:00
Salad Dais
d86da70eeb v0.8.0 2021-12-07 07:16:25 +00:00
Salad Dais
aa0b4b63a9 Update cx_freeze script to handle PySide6 2021-12-07 07:16:25 +00:00
Salad Dais
5f479e46b4 Automatically offer to install the HTTPS certs on first run 2021-12-07 07:16:25 +00:00
Salad Dais
1e55d5a9d8 Continue handling HTTP flows if flow logging fails
If flow beautification for display throws then we don't want
to bypass other handling of the flow.

This fixes a login failure due to SL's login XML-RPC endpoint
returning a Content-Type of "application/llsd+xml/r/n" when it's
actually "application/xml".
2021-12-06 17:01:13 +00:00
Salad Dais
077a95b5e7 Migrate to PySide6 to support Python 3.10
Update Glymur too
2021-12-06 13:37:31 +00:00
Salad Dais
4f1399cf66 Add note about LinHippoAutoProxy 2021-12-06 12:26:16 +00:00
Salad Dais
9590b30e66 Add note about Python 3.10 support 2021-12-05 20:25:06 +00:00
Salad Dais
34f3ee4c3e Move mtime wrapper to helpers 2021-12-05 18:14:26 +00:00
Salad Dais
7d655543f5 Don't reserialize responses as pretty LLSD-XML
Certain LLSD parsers don't like the empty text nodes it adds around
the root element of the document. Yuck.
2021-12-05 18:12:53 +00:00
Salad Dais
5de3ed0d5e Add support for LLSD inventory representations 2021-12-03 05:59:58 +00:00
Salad Dais
74c3287cc0 Add base addon for creating proxy-only caps based on ASGI apps 2021-12-02 06:04:29 +00:00
Salad Dais
3a7f8072a0 Initial implementation of proxy-provided caps
Useful for mocking out a cap while developing the viewer-side
pieces of it.
2021-12-02 03:22:47 +00:00
dependabot[bot]
5fa91580eb Bump mitmproxy from 7.0.2 to 7.0.3 (#21)
Bumps [mitmproxy](https://github.com/mitmproxy/mitmproxy) from 7.0.2 to 7.0.3.
- [Release notes](https://github.com/mitmproxy/mitmproxy/releases)
- [Changelog](https://github.com/mitmproxy/mitmproxy/blob/main/CHANGELOG.md)
- [Commits](https://github.com/mitmproxy/mitmproxy/compare/v7.0.2...v7.0.3)

---
updated-dependencies:
- dependency-name: mitmproxy
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-11-30 05:30:06 -04:00
Salad Dais
d8fbb55438 Improve LLUDP integration tests 2021-11-30 09:25:31 +00:00
Salad Dais
99eb4fed74 Fix _reorient_coord to work correctly for normals again 2021-11-30 09:24:49 +00:00
Salad Dais
6b78b841df Fix range of mesh normals 2021-11-23 01:36:14 +00:00
Salad Dais
dae852db69 Fix filter dialog 2021-11-19 04:30:36 +00:00
Salad Dais
0c0de2bcbc v0.7.1 2021-09-04 07:27:20 +00:00
Salad Dais
9f2d2f2194 Pin recordclass version, use requirements.txt for windows build
recordclass had some breaking changes in 0.15
2021-09-04 07:12:45 +00:00
Salad Dais
c6e0a400a9 v0.7.0 2021-08-10 01:16:20 +00:00
Salad Dais
d01122d542 Call correct method to raise new message log window 2021-08-10 01:11:21 +00:00
Salad Dais
690d6b51b8 Upgrade to mitmproxy 7.0.2
Our fix for `Flow.set_state()` has been upstreamed
2021-08-09 22:16:23 +00:00
Salad Dais
2437a8b14f Add a framework for simple local anim creation, tail animator 2021-08-05 21:08:18 +00:00
Salad Dais
afa601fffe Support session-specific viewer cache directories 2021-08-02 18:23:13 +00:00
Salad Dais
874feff471 Fix incorrect reference to mitmproxy class 2021-08-01 12:16:10 +00:00
Salad Dais
05c53bba9f Add CapsClient to BaseClientSession 2021-08-01 06:39:04 +00:00
Salad Dais
578f1d8c4e Add setting to disable all proxy object autorequests
Will help with #18 by not changing object request behaviour when
running through the proxy.
2021-08-01 06:37:33 +00:00
Salad Dais
7d8e18440a Add local anim mangler support with example
Analogous to local mesh mangler support.
2021-07-31 11:56:17 +00:00
Salad Dais
66e112dd52 Add basic message log import / export feature
Closes #20
2021-07-30 03:13:33 +00:00
Salad Dais
02ac022ab3 Add export formats for message log entries 2021-07-30 01:06:29 +00:00
Salad Dais
33ce74754e Fix mirror_target_agent check in http hooks 2021-07-30 01:06:29 +00:00
Salad Dais
74dd6b977c Add extended to_dict() format for Message class
This will allow proper import / export of message logs.
2021-07-29 10:26:42 +00:00
Salad Dais
387652731a Add Message Mirror example addon 2021-07-29 09:43:20 +00:00
Salad Dais
e4601fd879 Support multiple Message Log windows
Closes #19
2021-07-29 01:00:57 +00:00
Salad Dais
6eb25f96d9 Support logging to a hierarchy of message loggers
Necessary to eventually support multiple message log windows
2021-07-27 02:35:03 +00:00
Salad Dais
22b9eeb5cb Better handling of optional command parameters 2021-07-22 23:59:55 +00:00
Salad Dais
0dbedcb2f5 Improve coverage 2021-07-22 23:58:17 +00:00
Salad Dais
7d9712c16e Fix message dropping and queueing corner cases 2021-07-22 05:08:47 +00:00
Salad Dais
82663c0fc2 Add parse_bool helper function for command parameters 2021-07-21 06:39:29 +00:00
Salad Dais
9fb4884470 Extend TlsLayer.tls_start_server instead of monkeypatching OpenSSL funcs
We have a more elegant way of unsetting `X509_CHECK_FLAG_NEVER_CHECK_SUBJECT`
now that mitmproxy 7.0 is out.

See https://github.com/mitmproxy/mitmproxy/pull/4688
2021-07-19 20:17:31 +00:00
Salad Dais
cf69c42f67 Rework HTTP proxying code to work with mitmproxy 7.0.0 2021-07-18 07:02:45 +00:00
Salad Dais
be658b9026 v0.6.3
Cutting a release before working on mitmproxy upgrade
2021-07-18 06:57:40 +00:00
Salad Dais
c505941595 Improve test for TE serialization 2021-07-18 06:33:55 +00:00
Salad Dais
96f471d6b7 Add initial support for Message-specific Block subclasses 2021-07-07 12:49:32 +00:00
Salad Dais
4238016767 Change readme wording
:)
2021-07-07 12:49:32 +00:00
Salad Dais
a35a67718d Add default_value to MessageTemplateVariable 2021-07-01 21:25:51 +00:00
Salad Dais
c2981b107a Remove CodeQL scanning
Maybe later, doesn't seem to do anything useful out of the box.
2021-06-28 06:00:42 -03:00
Salad Dais
851375499a Add CodeQL scanning 2021-06-28 05:44:02 -03:00
Salad Dais
d064ecd466 Don't raise when reading a new avatar_name_cache.xml 2021-06-25 18:45:42 +00:00
Salad Dais
fda37656c9 Reduce boilerplate for mesh mangling addons
Makes it less annoying to compose separate addons with different manglers
2021-06-24 05:29:23 +00:00
Salad Dais
49a9c6f28f Workaround for failed teleports due to EventQueue timeouts
Closes #16
2021-06-23 16:43:09 +00:00
Salad Dais
050ac5e3a9 v0.6.2 2021-06-19 03:06:39 +00:00
Salad Dais
fe0d3132e4 Update shield addon 2021-06-18 20:49:31 +00:00
Salad Dais
d7f18e05be Fix typo 2021-06-18 20:49:20 +00:00
Salad Dais
9bf4240411 Allow tagging UDPPackets with arbitrary metadata
The metadata should propagate to any Messages deserialized
from the packet as well.
2021-06-18 20:31:15 +00:00
Salad Dais
76df9a0424 Streamline template dictionary use 2021-06-17 21:28:22 +00:00
159 changed files with 19170 additions and 8203 deletions

View File

@@ -8,3 +8,5 @@ exclude_lines =
    if typing.TYPE_CHECKING:
    def __repr__
    raise AssertionError
    assert False
    ^\s*pass\b

View File

@@ -1,26 +1,36 @@
# Have to manually unzip this (it gets double zipped) and add it
# onto the release after it gets created. Don't want actions with repo write.
name: Bundle Windows EXE
on:
  # Only trigger on release creation
  release:
    types:
      - created
  workflow_dispatch:
    inputs:
      ref_name:
        description: Name to use for the release
env:
  target_tag: ${{ github.ref_name || github.event.inputs.ref_name }}
  sha: ${{ github.sha || github.event.inputs.ref_name }}
jobs:
  build:
    runs-on: windows-latest
    runs-on: windows-2022
    permissions:
      contents: write
    strategy:
      matrix:
        python-version: [3.9]
        python-version: ["3.12"]
    steps:
      - uses: actions/checkout@v2
      - uses: actions/checkout@v4
      - name: Get history and tags for SCM versioning to work
        run: |
          git fetch --prune --unshallow
          git fetch --depth=1 origin +refs/tags/*:refs/tags/*
      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v2
        with:
@@ -29,18 +39,30 @@ jobs:
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -e .
          pip install -r requirements.txt
          pip install -e .[gui]
          pip install cx_freeze
      - name: Bundle with cx_Freeze
        shell: bash
        run: |
          python setup_cxfreeze.py build_exe
          pip install pip-licenses
          pip-licenses --format=plain-vertical --with-license-file --no-license-path --output-file=lib_licenses.txt
          python setup_cxfreeze.py finalize_cxfreeze
          # Should only be one, but we don't know what it's named
          mv ./dist/*.zip hippolyzer-windows-${{ env.target_tag }}.zip
      - name: Upload the artifact
        uses: actions/upload-artifact@v2
        uses: actions/upload-artifact@v4
        with:
          name: hippolyzer-gui-windows-${{ github.sha }}
          path: ./dist/**
          name: hippolyzer-windows-${{ env.sha }}
          path: ./hippolyzer-windows-${{ env.target_tag }}.zip
      - uses: ncipollo/release-action@v1.10.0
        if: github.event_name != 'workflow_dispatch'
        with:
          artifacts: hippolyzer-windows-${{ env.target_tag }}.zip
          tag: ${{ env.target_tag }}
          token: ${{ secrets.GITHUB_TOKEN }}
          allowUpdates: true

View File

@@ -16,18 +16,22 @@ jobs:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/checkout@v4
      - name: Get history and tags for SCM versioning to work
        run: |
          git fetch --prune --unshallow
          git fetch --depth=1 origin +refs/tags/*:refs/tags/*
      - uses: actions/setup-python@v2
        with:
          python-version: 3.9
          python-version: "3.12"
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip setuptools wheel
          python -m pip install --upgrade pip setuptools wheel build
          if [ -f requirements.txt ]; then pip install -r requirements.txt; fi
      - name: Build
        run: >-
          python setup.py sdist bdist_wheel
          python -m build
      # We do this, since failures on test.pypi aren't that bad
      - name: Publish to Test PyPI
        if: startsWith(github.event.ref, 'refs/tags') || github.event_name == 'release'
@@ -36,6 +40,7 @@ jobs:
          user: __token__
          password: ${{ secrets.TEST_PYPI_API_TOKEN }}
          repository_url: https://test.pypi.org/legacy/
          attestations: false
      - name: Publish to PyPI
        if: startsWith(github.event.ref, 'refs/tags') || github.event_name == 'release'
@@ -43,3 +48,4 @@ jobs:
        with:
          user: __token__
          password: ${{ secrets.PYPI_API_TOKEN }}
          attestations: false

View File

@@ -1,6 +1,12 @@
name: Run Python Tests
on: [push, pull_request]
on:
  push:
    paths-ignore:
      - '*.md'
  pull_request:
    paths-ignore:
      - '*.md'
jobs:
  build:
@@ -8,11 +14,14 @@ jobs:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: [3.8, 3.9]
        python-version: ["3.12", "3.13"]
    steps:
      - uses: actions/checkout@v2
      - uses: actions/checkout@v4
      - name: Get history and tags for SCM versioning to work
        run: |
          git fetch --prune --unshallow
          git fetch --depth=1 origin +refs/tags/*:refs/tags/*
      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v2
        with:
@@ -20,10 +29,11 @@ jobs:
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          python -m pip install --upgrade pip wheel
          pip install -r requirements.txt
          pip install -r requirements-test.txt
          sudo apt-get install libopenjp2-7
          pip install -e .[gui]
      - name: Run Flake8
        run: |
          flake8 .

View File

@@ -2,7 +2,7 @@
![Python Test Status](https://github.com/SaladDais/Hippolyzer/workflows/Run%20Python%20Tests/badge.svg) [![codecov](https://codecov.io/gh/SaladDais/Hippolyzer/branch/master/graph/badge.svg?token=HCTFA4RAXX)](https://codecov.io/gh/SaladDais/Hippolyzer)
[Hippolyzer](http://wiki.secondlife.com/wiki/Hippo) is a fork of Linden Lab's abandoned
[Hippolyzer](http://wiki.secondlife.com/wiki/Hippo) is a revival of Linden Lab's
[PyOGP library](http://wiki.secondlife.com/wiki/PyOGP)
targeting modern Python 3, with a focus on debugging issues in Second Life-compatible
servers and clients. There is a secondary focus on mocking up new features without requiring a
@@ -27,7 +27,7 @@ with low-level SL details. See the [Local Animation addon example](https://githu
### From Source
* Python 3.8 or above is **required**. If you're unable to upgrade your system Python package due to
* Python 3.12 or above is **required**. If you're unable to upgrade your system Python package due to
being on a stable distro, you can use [pyenv](https://github.com/pyenv/pyenv) to create
a self-contained Python install with the appropriate version.
* [Create a clean Python 3 virtualenv](https://packaging.python.org/guides/installing-using-pip-and-virtual-environments/#creating-a-virtual-environment)
@@ -35,7 +35,9 @@ with low-level SL details. See the [Local Animation addon example](https://githu
* Activate the virtualenv by running the appropriate activation script
* * Under Linux this would be something like `source <virtualenv_dir>/bin/activate`
* * Under Windows it's `<virtualenv_dir>\Scripts\activate.bat`
* Run `pip install hippolyzer`, or run `pip install -e .` in a cloned repo to install an editable version
* Run `pip install hippolyzer[gui]` for a full install, or run `pip install -e .[gui]` in a cloned repo to install an editable version
* * If you only want the core library without proxy or GUI support, use `pip install hippolyzer` or `pip install -e .`
* * If you only want proxy/CLI support without the GUI, use `pip install hippolyzer[proxy]` or `pip install -e .[proxy]`
### Binary Windows Builds
@@ -48,8 +50,7 @@ A proxy is provided with both a CLI and Qt-based interface. The proxy applicatio
custom SOCKS 5 UDP proxy, as well as an HTTP proxy based on [mitmproxy](https://mitmproxy.org/).
Multiple clients are supported at a time, and UDP messages may be injected in either
direction. The proxy UI was inspired by the Message Log and Message Builder as present in
the [Alchemy](https://github.com/AlchemyViewer/Alchemy) viewer.
direction.
### Proxy Setup
@@ -83,6 +84,10 @@ SOCKS 5 works correctly on these platforms, so you can just configure it through
the `no_proxy` env var appropriately. For ex. `no_proxy="asset-cdn.glb.agni.lindenlab.com" ./firestorm`.
* Log in!
Or, if you're on Linux, you can instead use [LinHippoAutoProxy](https://github.com/SaladDais/LinHippoAutoProxy)
to launch your viewer, which will configure everything for you. Note that connections from the in-viewer browser will
likely _not_ be run through Hippolyzer when using LinHippoAutoProxy.
### Filtering
By default, the proxy's display filter is configured to ignore many high-frequency messages.
@@ -224,7 +229,7 @@ OUT ObjectAdd
```
The repeat spinner at the bottom of the window lets you send a message multiple times.
an `i` variable is put into the eval context and can be used to vary messages accros repeats.
an `i` variable is put into the eval context and can be used to vary messages across repeats.
With repeat set to two:
```
@@ -311,6 +316,22 @@ If you are a viewer developer, please put them in a viewer.
apply the mesh to the local mesh target. It works on attachments too. Useful for testing rigs before a
final, real upload.
## REPL
A quick and dirty REPL is also included for when you want to do ad-hoc introspection of proxy state.
It can be launched at any time by typing `/524 spawn_repl` in chat.
![Screenshot of REPL](https://github.com/SaladDais/Hippolyzer/blob/master/static/repl_screenshot.png?raw=true)
The REPL is fully async aware and allows awaiting events without blocking:
```python
>>> from hippolyzer.lib.client.object_manager import ObjectUpdateType
>>> evt = await session.objects.events.wait_for((ObjectUpdateType.UPDATE,), timeout=2.0)
>>> evt.updated
{'Position'}
```
## Potential Changes
* AISv3 wrapper?
@@ -375,11 +396,21 @@ To have your client's traffic proxied through Hippolyzer the general flow is:
* The proxy needs to use content sniffing to figure out which requests are login requests,
so make sure your request would pass `MITMProxyEventManager._is_login_request()`
#### Do I have to do all that?
You might be able to automate some of it on Linux by using
[LinHippoAutoProxy](https://github.com/SaladDais/LinHippoAutoProxy). If you're on Windows or MacOS the
above is your only option.
### Should I use this library to make an SL client in Python?
No. If you just want to write a client in Python, you should instead look at using
Probably not. If you just want to write a client in Python, you should instead look at using
[libremetaverse](https://github.com/cinderblocks/libremetaverse/) via pythonnet.
I removed the client-related code inherited from PyOGP because libremetaverse's was simply better.
I removed the client-related code inherited from PyOGP because libremetaverse's was simply better
for general use.
<https://github.com/CasperTech/node-metaverse/> also looks like a good, modern wrapper if you
prefer TypeScript.
There is, however, a very low-level `HippoClient` class provided for testing, but it's unlikely
to be what you want for writing a general-purpose bot.

View File

@@ -0,0 +1,32 @@
"""
Example anim mangler addon, to be used with local anim addon.

You can edit this live to apply various transforms to local anims,
as well as any uploaded anims. Any changes will be reflected in currently
playing local anims.

This example modifies any position keys of an animation's mHipRight joint.
"""
from hippolyzer.lib.base.llanim import Animation
from hippolyzer.lib.proxy.addons import AddonManager

import local_anim

AddonManager.hot_reload(local_anim, require_addons_loaded=True)


def offset_right_hip(anim: Animation):
    hip_joint = anim.joints.get("mHipRight")
    if hip_joint:
        for pos_frame in hip_joint.pos_keyframes:
            pos_frame.pos.Z *= 2.5
            pos_frame.pos.X *= 5.0
    return anim


class ExampleAnimManglerAddon(local_anim.BaseAnimManglerAddon):
    ANIM_MANGLERS = [
        offset_right_hip,
    ]


addons = [ExampleAnimManglerAddon()]
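
For illustration only, a minimal sketch of a second mangler, assuming the same `Animation` / keyframe attributes used in the example above; chaining transforms is then just a matter of listing more functions in `ANIM_MANGLERS`:

```python
# Hypothetical extra mangler; reuses only the attributes shown in the example above.
def mirror_right_hip(anim: Animation):
    hip_joint = anim.joints.get("mHipRight")
    if hip_joint:
        for pos_frame in hip_joint.pos_keyframes:
            # Flip the hip's position keys across the X axis
            pos_frame.pos.X *= -1.0
    return anim


class ChainedAnimManglerAddon(local_anim.BaseAnimManglerAddon):
    # Both transforms are listed, presumably applied in order to each anim
    ANIM_MANGLERS = [
        offset_right_hip,
        mirror_right_hip,
    ]
```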

View File

@@ -0,0 +1,125 @@
"""
Debugger for detecting when animations within an object get started or stopped
Useful for tracking down animation sequence-related bugs within your LSL scripts,
or debugging automatic animation stopping behavior in the viewer.
If an animation unexpectedly stops and nobody requested it be stopped, it's a potential viewer bug (or priority issue).
If an animation unexpectedly stops and the viewer requested it be stopped, it's also a potential viewer bug.
If an animation unexpectedly stops and only the server requested it be stopped, it's a potential script / server bug.
"""
from typing import *
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.network.transport import Direction
from hippolyzer.lib.base.objects import Object
from hippolyzer.lib.base.templates import AssetType
from hippolyzer.lib.proxy.addon_utils import BaseAddon, SessionProperty
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.proxy.commands import handle_command
from hippolyzer.lib.proxy.addon_utils import show_message
class AnimTrackerAddon(BaseAddon):
should_track_anims: bool = SessionProperty(False)
anims_lookup: Dict[UUID, str] = SessionProperty(dict)
last_tracker_anims: Set[UUID] = SessionProperty(set)
def _format_anim_diffs(self, started_anims: Set[UUID], stopped_anims: Set[UUID]):
added_strs = [f"+{self.anims_lookup[x]!r}" for x in started_anims]
removed_strs = [f"-{self.anims_lookup[x]!r}" for x in stopped_anims]
return ", ".join(removed_strs + added_strs)
@handle_command()
async def track_anims(self, session: Session, region: ProxiedRegion):
"""Track when animations within this object get started or stopped"""
if self.should_track_anims:
self.last_tracker_anims.clear()
self.anims_lookup.clear()
selected = region.objects.lookup_localid(session.selected.object_local)
if not selected:
return
self.should_track_anims = True
object_items = await region.objects.request_object_inv(selected)
anims: Dict[UUID, str] = {}
for item in object_items:
if item.type != AssetType.ANIMATION:
continue
anims[item.true_asset_id] = item.name
self.anims_lookup = anims
@handle_command()
async def stop_tracking_anims(self, _session: Session, _region: ProxiedRegion):
"""Stop reporting differences"""
if self.should_track_anims:
self.should_track_anims = False
self.last_tracker_anims.clear()
self.anims_lookup.clear()
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
if not self.should_track_anims:
return
if message.name != "AgentAnimation" or message.direction != Direction.OUT:
# AgentAnimation is the message the viewer uses to request manually starting or stopping animations.
# We don't care about other messages; we're just interested in distinguishing cases where the viewer
# specifically requested something vs. something being done by the server on its own.
return
av = session.objects.lookup_avatar(session.agent_id)
if not av or not av.Object:
print("Somehow didn't know about our own av object?")
return
current_anims = set([x for x in av.Object.Animations if x in self.anims_lookup])
started_anims: Set[UUID] = set()
stopped_anims: Set[UUID] = set()
for block in message["AnimationList"]:
anim_id = block["AnimID"]
if anim_id not in self.anims_lookup:
continue
start_anim = block["StartAnim"]
already_started = anim_id in current_anims
if start_anim == already_started:
# No change
continue
if start_anim:
started_anims.add(anim_id)
else:
stopped_anims.add(anim_id)
if started_anims or stopped_anims:
show_message("Viewer Requested Anims: " + self._format_anim_diffs(started_anims, stopped_anims))
def handle_object_updated(self, session: Session, region: ProxiedRegion,
obj: Object, updated_props: Set[str], msg: Optional[Message]):
if not self.should_track_anims:
return
if obj.FullID != session.agent_id:
return
if "Animations" not in updated_props:
return
current_anims = set([x for x in obj.Animations if x in self.anims_lookup])
started_anims = current_anims - self.last_tracker_anims
stopped_anims = self.last_tracker_anims - current_anims
self.last_tracker_anims.clear()
self.last_tracker_anims.update(current_anims)
if started_anims or stopped_anims:
show_message("Anim Diffs: " + self._format_anim_diffs(started_anims, stopped_anims))
addons = [AnimTrackerAddon()]

View File

@@ -0,0 +1,94 @@
"""
Try to diagnose very slow avatar appearance loads when avatars first come on the scene.
I guess use LEAP or something to detect when things _actually_ declouded.
"""
from typing import *
import dataclasses
import datetime as dt
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.objects import Object
from hippolyzer.lib.base.templates import PCode
from hippolyzer.lib.proxy.addon_utils import BaseAddon, GlobalProperty
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session, SessionManager
@dataclasses.dataclass
class AvatarBakeRequest:
requested: dt.datetime
received: Optional[dt.datetime] = None
@dataclasses.dataclass
class AvatarAppearanceRecord:
object_received: dt.datetime
"""When we learned about the agent as an object"""
appearance_received: Optional[dt.datetime] = None
"""When AvatarAppearance was first received"""
bake_requests: Dict[str, AvatarBakeRequest] = dataclasses.field(default_factory=dict)
"""Layer name -> request / response details"""
class AppearanceDelayTrackerAddon(BaseAddon):
# Should be able to access this in the REPL
# Normally we'd use a session property, but we may not have a proper session context for some requests
av_appearance_data: Dict[UUID, AvatarAppearanceRecord] = GlobalProperty(dict)
def handle_object_updated(self, session: Session, region: ProxiedRegion,
obj: Object, updated_props: Set[str], msg: Optional[Message]):
if obj.PCode == PCode.AVATAR and obj.FullID not in self.av_appearance_data:
self.av_appearance_data[obj.FullID] = AvatarAppearanceRecord(object_received=dt.datetime.now())
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
if message.name != "AvatarAppearance":
return
agent_id = message["Sender"]["ID"]
appearance_data = self.av_appearance_data.get(agent_id)
if not appearance_data:
print(f"Got appearance for {agent_id} without knowing about object?")
return
if appearance_data.appearance_received:
return
appearance_data.appearance_received = dt.datetime.now()
def handle_http_request(self, session_manager: SessionManager, flow: HippoHTTPFlow):
if not flow.cap_data:
return
if flow.cap_data.cap_name != "AppearanceService":
return
agent_id = UUID(flow.request.url.split('/')[-3])
slot_name = flow.request.url.split('/')[-2]
appearance_data = self.av_appearance_data.get(agent_id)
if not appearance_data:
print(f"Got AppearanceService req for {agent_id} without knowing about object?")
return
if slot_name in appearance_data.bake_requests:
# We already requested this slot before
return
appearance_data.bake_requests[slot_name] = AvatarBakeRequest(requested=dt.datetime.now())
def handle_http_response(self, session_manager: SessionManager, flow: HippoHTTPFlow):
if not flow.cap_data:
return
if flow.cap_data.cap_name != "AppearanceService":
return
agent_id = UUID(flow.request.url.split('/')[-3])
slot_name = flow.request.url.split('/')[-2]
appearance_data = self.av_appearance_data.get(agent_id)
if not appearance_data:
return
slot_details = appearance_data.bake_requests.get(slot_name)
if not slot_details:
return
slot_details.received = dt.datetime.now()
addons = [AppearanceDelayTrackerAddon()]
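# A hedged sketch, not part of the addon itself: how the collected timings might be
# inspected from the proxy REPL, assuming the GlobalProperty is readable through the
# addon instance exposed in this module's `addons` list.
def print_appearance_delays():
    for av_id, record in addons[0].av_appearance_data.items():
        if record.appearance_received is None:
            print(f"{av_id}: no AvatarAppearance received yet")
            continue
        delay = (record.appearance_received - record.object_received).total_seconds()
        print(f"{av_id}: first AvatarAppearance after {delay:.1f}s")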

View File

@@ -11,7 +11,7 @@ import enum
import os.path
from typing import *
from PySide2 import QtCore, QtGui, QtWidgets
from PySide6 import QtCore, QtGui, QtWidgets
from hippolyzer.lib.base.datatypes import Vector3
from hippolyzer.lib.base.message.message import Block, Message
@@ -80,7 +80,7 @@ class BlueishObjectListGUIAddon(BaseAddon):
raise
def _highlight_object(self, session: Session, obj: Object):
session.main_region.circuit.send_message(Message(
session.main_region.circuit.send(Message(
"ForceObjectSelect",
Block("Header", ResetList=False),
Block("Data", LocalID=obj.LocalID),
@@ -88,7 +88,7 @@ class BlueishObjectListGUIAddon(BaseAddon):
))
def _teleport_to_object(self, session: Session, obj: Object):
session.main_region.circuit.send_message(Message(
session.main_region.circuit.send(Message(
"TeleportLocationRequest",
Block("AgentData", AgentID=session.agent_id, SessionID=session.id),
Block(
@@ -114,7 +114,7 @@ class BlueishObjectListGUIAddon(BaseAddon):
region.objects.request_missing_objects()
def handle_object_updated(self, session: Session, region: ProxiedRegion,
obj: Object, updated_props: Set[str]):
obj: Object, updated_props: Set[str], msg: Optional[Message]):
if self.blueish_model is None:
return

View File

@@ -6,7 +6,7 @@ from hippolyzer.lib.proxy.sessions import Session
def handle_lludp_message(session: Session, region: ProxiedRegion, message: Message):
# addon_ctx will persist across addon reloads, use for storing data that
# needs to survive across calls to this function
ctx = session.addon_ctx
ctx = session.addon_ctx[__name__]
if message.name == "ChatFromViewer":
chat = message["ChatData"]["Message"]
if chat == "COUNT":

View File

@@ -0,0 +1,44 @@
"""
Demonstrates item creation as well as bodypart / clothing upload
"""
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.templates import WearableType, Permissions
from hippolyzer.lib.base.wearables import Wearable, VISUAL_PARAMS
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.commands import handle_command
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
class ShapeCreatorAddon(BaseAddon):
@handle_command()
async def create_shape(self, session: Session, region: ProxiedRegion):
"""Make a shape with pre-set parameters and place it in the body parts folder"""
wearable = Wearable.make_default(WearableType.SHAPE)
# Max out the jaw jut param
jaw_param = VISUAL_PARAMS.by_name("Jaw Jut")
wearable.parameters[jaw_param.id] = jaw_param.value_max
wearable.name = "Cool Shape"
# A unique transaction ID is needed to tie the item creation to the following asset upload.
transaction_id = UUID.random()
item = await session.inventory.create_item(
UUID.ZERO, # This will place it in the default folder for the type
name=wearable.name,
type=wearable.wearable_type.asset_type,
inv_type=wearable.wearable_type.asset_type.inventory_type,
wearable_type=wearable.wearable_type,
next_mask=Permissions.MOVE | Permissions.MODIFY | Permissions.COPY | Permissions.TRANSFER,
transaction_id=transaction_id,
)
print(f"Created {item!r}")
await region.xfer_manager.upload_asset(
wearable.wearable_type.asset_type,
wearable.to_str(),
transaction_id=transaction_id,
)
addons = [ShapeCreatorAddon()]

View File

@@ -4,8 +4,13 @@ Helper for making deformer anims. This could have a GUI I guess.
import dataclasses
from typing import *
import numpy as np
import transformations
from hippolyzer.lib.base.datatypes import Vector3, Quaternion, UUID
from hippolyzer.lib.base.llanim import Joint, Animation, PosKeyframe, RotKeyframe
from hippolyzer.lib.base.mesh import MeshAsset, SegmentHeaderDict, SkinSegmentDict, LLMeshSerializer
from hippolyzer.lib.base.serialization import BufferWriter
from hippolyzer.lib.proxy.addon_utils import show_message, BaseAddon, SessionProperty
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.commands import handle_command, Parameter
@@ -45,6 +50,58 @@ def build_deformer(joints: Dict[str, DeformerJoint]) -> bytes:
return anim.to_bytes()
def build_mesh_deformer(joints: Dict[str, DeformerJoint]) -> bytes:
skin_seg = SkinSegmentDict(
joint_names=[],
bind_shape_matrix=identity_mat4(),
inverse_bind_matrix=[],
alt_inverse_bind_matrix=[],
pelvis_offset=0.0,
lock_scale_if_joint_position=False
)
for joint_name, joint in joints.items():
# We can only represent joint translations; ignore this joint if it doesn't have any.
if not joint.pos:
continue
skin_seg['joint_names'].append(joint_name)
# Inverse bind matrix isn't actually used, so we can just give it a placeholder value of the
# identity mat4. This might break things in weird ways because the matrix isn't actually sensible.
skin_seg['inverse_bind_matrix'].append(identity_mat4())
# Create a flattened mat4 that only has a translation component of our joint pos
# The viewer ignores any other component of these matrices so no point putting shear
# or perspective or whatever :)
joint_mat4 = pos_to_mat4(joint.pos)
# Ask the viewer to override this joint's usual parent-relative position with our matrix
skin_seg['alt_inverse_bind_matrix'].append(joint_mat4)
# Make a dummy mesh and shove our skin segment onto it. None of the tris are rigged, so the
# viewer will freak out and refuse to display the tri; only the joint translations will be used.
# Supposedly a mesh with a `skin` segment but no weights on the material should just result in an
# effectively unrigged material, but that's not the case. Oh well.
mesh = MeshAsset.make_triangle()
mesh.header['skin'] = SegmentHeaderDict(offset=0, size=0)
mesh.segments['skin'] = skin_seg
writer = BufferWriter("!")
writer.write(LLMeshSerializer(), mesh)
return writer.copy_buffer()
def identity_mat4() -> List[float]:
"""
Return an "Identity" mat4
Effectively represents a transform of no rot, no translation, no shear, no perspective
and scaling by 1.0 on every axis.
"""
return list(np.identity(4).flatten('F'))
def pos_to_mat4(pos: Vector3) -> List[float]:
"""Convert a position Vector3 to a Translation Mat4"""
return list(transformations.compose_matrix(translate=tuple(pos)).flatten('F'))
class DeformerAddon(BaseAddon):
deform_joints: Dict[str, DeformerJoint] = SessionProperty(dict)
@@ -95,7 +152,7 @@ class DeformerAddon(BaseAddon):
local_anim.LocalAnimAddon.apply_local_anim(session, region, "deformer_addon", anim_data)
def handle_rlv_command(self, session: Session, region: ProxiedRegion, source: UUID,
cmd: str, options: List[str], param: str):
behaviour: str, options: List[str], param: str):
# An object in-world can also tell the client how to deform itself via
# RLV-style commands.
@@ -103,9 +160,9 @@ class DeformerAddon(BaseAddon):
if param != "force":
return
if cmd == "stop_deforming":
if behaviour == "stop_deforming":
self.deform_joints.clear()
elif cmd == "deform_joints":
elif behaviour == "deform_joints":
self.deform_joints.clear()
for joint_data in options:
joint_split = joint_data.split("|")
@@ -118,5 +175,41 @@ class DeformerAddon(BaseAddon):
self._reapply_deformer(session, region)
return True
@handle_command()
async def save_deformer_as_mesh(self, _session: Session, _region: ProxiedRegion):
"""
Export the deformer as a crafted rigged mesh rather than an animation.
Mesh deformers have the advantage that they don't cause your joints to "stick"
the way animations with pos keyframes do.
"""
filename = await AddonManager.UI.save_file(filter_str="LL Mesh (*.llmesh)")
if not filename:
return
with open(filename, "wb") as f:
f.write(build_mesh_deformer(self.deform_joints))
@handle_command()
async def upload_deformer_as_mesh(self, _session: Session, region: ProxiedRegion):
"""Same as save_deformer_as_mesh, but uploads the mesh directly to SL."""
mesh_bytes = build_mesh_deformer(self.deform_joints)
try:
# Send off mesh to calculate upload cost
upload_token = await region.asset_uploader.initiate_mesh_upload("deformer", mesh_bytes)
except Exception as e:
show_message(e)
raise
if not await AddonManager.UI.confirm("Upload", f"Spend {upload_token.linden_cost}L on upload?"):
return
# Do the actual upload
try:
await region.asset_uploader.complete_upload(upload_token)
except Exception as e:
show_message(e)
raise
addons = [DeformerAddon()]

View File

@@ -0,0 +1,158 @@
"""
Detect receipt of a marketplace order for a demo, and auto-attach the most appropriate object
"""
import asyncio
import re
from typing import List, Tuple, Dict, Optional, Sequence
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Message, Block
from hippolyzer.lib.base.templates import InventoryType, Permissions, FolderType
from hippolyzer.lib.proxy.addon_utils import BaseAddon, show_message
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
MARKETPLACE_TRANSACTION_ID = UUID('ffffffff-ffff-ffff-ffff-ffffffffffff')
class DemoAutoAttacher(BaseAddon):
def handle_eq_event(self, session: Session, region: ProxiedRegion, event: dict):
if event["message"] != "BulkUpdateInventory":
return
# Check that this update even possibly came from the marketplace
if event["body"]["AgentData"][0]["TransactionID"] != MARKETPLACE_TRANSACTION_ID:
return
# Make sure that the transaction targeted our real received items folder
folders = event["body"]["FolderData"]
received_folder = folders[0]
if received_folder["Name"] != "Received Items":
return
skel = session.login_data['inventory-skeleton']
actual_received = [x for x in skel if x['type_default'] == FolderType.INBOX]
assert actual_received
if UUID(actual_received[0]['folder_id']) != received_folder["FolderID"]:
show_message(f"Strange received folder ID spoofing? {folders!r}")
return
if not re.match(r".*\bdemo\b.*", folders[1]["Name"], flags=re.I):
return
# Alright, so we have a demo... thing from the marketplace. What now?
items = event["body"]["ItemData"]
object_items = [x for x in items if x["InvType"] == InventoryType.OBJECT]
if not object_items:
return
self._schedule_task(self._attach_best_object(session, region, object_items))
async def _attach_best_object(self, session: Session, region: ProxiedRegion, object_items: List[Dict]):
own_body_type = await self._guess_own_body(session, region)
show_message(f"Trying to find demo for {own_body_type}")
guess_patterns = self.BODY_CLOTHING_PATTERNS.get(own_body_type)
to_attach = []
if own_body_type and guess_patterns:
matching_items = self._get_matching_items(object_items, guess_patterns)
if matching_items:
# Only take the first one
to_attach.append(matching_items[0])
if not to_attach:
# Don't know what body's being used or couldn't figure out what item
# would work best with our body. Just attach the first object in the folder.
to_attach.append(object_items[0])
# Also attach whatever HUDs there are; maybe we need them.
for hud in self._get_matching_items(object_items, ("hud",)):
if hud not in to_attach:
to_attach.append(hud)
region.circuit.send(Message(
'RezMultipleAttachmentsFromInv',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
Block('HeaderData', CompoundMsgID=UUID.random(), TotalObjects=len(to_attach), FirstDetachAll=0),
*[Block(
'ObjectData',
ItemID=o["ItemID"],
OwnerID=session.agent_id,
# 128 = "add"; uses whatever attachment point was defined on the object
AttachmentPt=128,
ItemFlags_=(),
GroupMask_=(),
EveryoneMask_=(),
NextOwnerMask_=(Permissions.COPY | Permissions.MOVE),
Name=o["Name"],
Description=o["Description"],
) for o in to_attach]
))
def _get_matching_items(self, items: List[dict], patterns: Sequence[str]):
# Loop over patterns to search for our body type, in order of preference
matched = []
for guess_pattern in patterns:
# Check each item for that pattern
for item in items:
if re.match(rf".*\b{guess_pattern}\b.*", item["Name"], re.I):
matched.append(item)
return matched
# We scan the agent's attached objects to guess what kind of body they use
BODY_PREFIXES = {
"-Belleza- Jake ": "jake",
"-Belleza- Freya ": "freya",
"-Belleza- Isis ": "isis",
"-Belleza- Venus ": "venus",
"[Signature] Gianni Body": "gianni",
"[Signature] Geralt Body": "geralt",
"Maitreya Mesh Body - Lara": "maitreya",
"Slink Physique Hourglass Petite": "hg_petite",
"Slink Physique Mesh Body Hourglass": "hourglass",
"Slink Physique Original Petite": "phys_petite",
"Slink Physique Mesh Body Original": "physique",
"[BODY] Legacy (f)": "legacy_f",
"[BODY] Legacy (m)": "legacy_m",
"[Signature] Alice Body": "sig_alice",
"Slink Physique MALE Mesh Body": "slink_male",
"AESTHETIC - [Mesh Body]": "aesthetic",
}
# Different bodies' clothes follow different naming conventions depending on the merchant.
# These are common naming patterns we use to choose objects to attach, in order of preference.
BODY_CLOTHING_PATTERNS: Dict[str, Tuple[str, ...]] = {
"jake": ("jake", "belleza"),
"freya": ("freya", "belleza"),
"isis": ("isis", "belleza"),
"venus": ("venus", "belleza"),
"gianni": ("gianni", "signature", "sig"),
"geralt": ("geralt", "signature", "sig"),
"hg_petite": ("hourglass petite", "hg petite", "hourglass", "hg", "slink"),
"hourglass": ("hourglass", "hg", "slink"),
"phys_petite": ("physique petite", "phys petite", "physique", "phys", "slink"),
"physique": ("physique", "phys", "slink"),
"legacy_f": ("legacy",),
"legacy_m": ("legacy",),
"sig_alice": ("alice", "signature"),
"slink_male": ("physique", "slink"),
"aesthetic": ("aesthetic",),
}
async def _guess_own_body(self, session: Session, region: ProxiedRegion) -> Optional[str]:
agent_obj = region.objects.lookup_fullid(session.agent_id)
if not agent_obj:
return None
# We probably won't know the names for all of our attachments, so request them.
# Could be obviated by looking at the COF, but that's not worth it for this.
try:
await asyncio.wait(region.objects.request_object_properties(agent_obj.Children), timeout=0.5)
except asyncio.TimeoutError:
# We expect that we just won't ever receive some property requests, that's fine
pass
for prefix, body_type in self.BODY_PREFIXES.items():
for obj in agent_obj.Children:
if not obj.Name:
continue
if obj.Name.startswith(prefix):
return body_type
return None
addons = [DemoAutoAttacher()]

View File

@@ -0,0 +1,119 @@
"""
Loading task inventory doesn't actually need to be slow.
By using a cap instead of the slow xfer path and sending the LLSD inventory
model, we get 15x speedups even when mocking things behind the scenes with a
hacked-up version of xfer. See turbo_object_inventory.py.
"""
import asyncio
import asgiref.wsgi
from typing import *
from flask import Flask, Response, request
from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.inventory import InventoryModel, InventoryObject
from hippolyzer.lib.base.message.message import Message, Block
from hippolyzer.lib.base.templates import XferFilePath, AssetType
from hippolyzer.lib.proxy import addon_ctx
from hippolyzer.lib.proxy.webapp_cap_addon import WebAppCapAddon
app = Flask("GetTaskInventoryCapApp")
@app.route('/', methods=["GET"])
async def get_task_inventory():
# Should always have the current region; the cap handler is bound to one.
# Just need to pull it from the `addon_ctx` module's global.
region = addon_ctx.region.get()
session = addon_ctx.session.get()
obj_id = UUID(request.args["task_id"])
obj = region.objects.lookup_fullid(obj_id)
if not obj:
return Response(f"Couldn't find {obj_id}", status=404, mimetype="text/plain")
request_msg = Message(
'RequestTaskInventory',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
Block('InventoryData', LocalID=obj.LocalID),
)
# Keep around a dict of chunks we saw previously in case we have to restart
# an Xfer due to missing chunks. We don't expect chunks to change across Xfers,
# so this can be used to recover from dropped SendXferPackets in subsequent attempts.
existing_chunks: Dict[int, bytes] = {}
for _ in range(3):
# Any previous requests will have triggered a delete of the inventory file
# by marking it complete on the server side. Re-send our RequestTaskInventory
# to make sure there's a fresh copy.
region.circuit.send(request_msg.take())
inv_message = await region.message_handler.wait_for(
('ReplyTaskInventory',),
predicate=lambda x: x["InventoryData"]["TaskID"] == obj.FullID,
timeout=5.0,
)
# No task inventory, send the reply as-is
file_name = inv_message["InventoryData"]["Filename"]
if not file_name:
# The "Contents" folder always has to be there, if we don't put it here
# then the viewer will have to lie about it being there itself.
return Response(
llsd.format_xml({
"inventory": [
InventoryObject(
name="Contents",
parent_id=UUID.ZERO,
type=AssetType.CATEGORY,
obj_id=obj_id
).to_llsd()
],
"inv_serial": inv_message["InventoryData"]["Serial"],
}),
headers={"Content-Type": "application/llsd+xml"},
status=200,
)
last_serial = request.args.get("last_serial", None)
if last_serial:
last_serial = int(last_serial)
if inv_message["InventoryData"]["Serial"] == last_serial:
# Nothing has changed since the version of the inventory they say they have, say so.
return Response("", status=304)
xfer = region.xfer_manager.request(
file_name=file_name,
file_path=XferFilePath.CACHE,
turbo=True,
)
xfer.chunks.update(existing_chunks)
try:
await xfer
except asyncio.TimeoutError:
# We likely failed the request due to missing chunks, store
# the chunks that we _did_ get for the next attempt.
existing_chunks.update(xfer.chunks)
continue
inv_model = InventoryModel.from_str(xfer.reassemble_chunks().decode("utf8"))
return Response(
llsd.format_xml({
"inventory": inv_model.to_llsd(),
"inv_serial": inv_message["InventoryData"]["Serial"],
}),
headers={"Content-Type": "application/llsd+xml"},
)
raise asyncio.TimeoutError("Failed to get inventory after 3 tries")
class GetTaskInventoryCapExampleAddon(WebAppCapAddon):
# A cap URL with this name will be tied to each region when
# the sim is first connected to. The URL will be returned to the
# viewer in the Seed if the viewer requests it by name.
CAP_NAME = "GetTaskInventoryExample"
# Any asgi app should be fine.
APP = asgiref.wsgi.WsgiToAsgi(app)
addons = [GetTaskInventoryCapExampleAddon()]

View File

@@ -105,7 +105,7 @@ class HorrorAnimatorAddon(BaseAddon):
# send the response back immediately
block = STATIC_VFS[orig_anim_id]
anim_data = STATIC_VFS.read_block(block)
flow.response = mitmproxy.http.HTTPResponse.make(
flow.response = mitmproxy.http.Response.make(
200,
_mutate_anim_bytes(anim_data),
{

View File

@@ -0,0 +1,50 @@
"""
Example of how to control a viewer over LEAP
Must launch the viewer with the `outleap-agent` LEAP script.
See https://github.com/SaladDais/outleap/ for more info on LEAP / outleap.
"""
import outleap
from outleap.scripts.inspector import LEAPInspectorGUI
from hippolyzer.lib.proxy.addon_utils import send_chat, BaseAddon, show_message
from hippolyzer.lib.proxy.commands import handle_command
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session, SessionManager
# Path found using `outleap-inspector`
FPS_PATH = outleap.UIPath("/main_view/menu_stack/status_bar_container/status/time_and_media_bg/FPSText")
class LEAPExampleAddon(BaseAddon):
async def handle_leap_client_added(self, session_manager: SessionManager, leap_client: outleap.LEAPClient):
# You can do things as soon as the LEAP client connects, like if you want to automate
# login or whatever.
viewer_control_api = outleap.LLViewerControlAPI(leap_client)
# Ask for a config value and print it in the viewer logs
print(await viewer_control_api.get("Global", "StatsPilotFile"))
@handle_command()
async def show_ui_inspector(self, session: Session, _region: ProxiedRegion):
"""Spawn a GUI for inspecting the UI state"""
if not session.leap_client:
show_message("No LEAP client connected?")
return
LEAPInspectorGUI(session.leap_client).show()
@handle_command()
async def say_fps(self, session: Session, _region: ProxiedRegion):
"""Say your current FPS in chat"""
if not session.leap_client:
show_message("No LEAP client connected?")
return
window_api = outleap.LLWindowAPI(session.leap_client)
fps = (await window_api.get_info(path=FPS_PATH))['value']
send_chat(f"LEAP says I'm running at {fps} FPS!")
addons = [LEAPExampleAddon()]

View File

@@ -5,42 +5,58 @@ Local animations
assuming you loaded something.anim
/524 start_local_anim something
/524 stop_local_anim something
/524 save_local_anim something
If you want to trigger the animation from an object to simulate llStartAnimation():
llOwnerSay("@start_local_anim:something=force");
Also includes a concept of "anim manglers" similar to the "mesh manglers" of the
local mesh addon. This is useful if you want to test making procedural changes
to animations before uploading them. The manglers will be applied to any uploaded
animations as well.
May also be useful if you need to make ad-hoc changes to a bunch of animations on
bulk upload, like changing priority or removing a joint.
"""
import asyncio
import os
import logging
import pathlib
from abc import abstractmethod
from typing import *
from hippolyzer.lib.base import serialization as se
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.helpers import get_mtime
from hippolyzer.lib.base.llanim import Animation
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.message.msgtypes import PacketFlags
from hippolyzer.lib.proxy import addon_ctx
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.addon_utils import BaseAddon, SessionProperty
from hippolyzer.lib.proxy.addon_utils import BaseAddon, SessionProperty, GlobalProperty, show_message
from hippolyzer.lib.proxy.commands import handle_command
from hippolyzer.lib.proxy.http_asset_repo import HTTPAssetRepo
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
def _get_mtime(path: str):
try:
return os.stat(path).st_mtime
except:
return None
from hippolyzer.lib.proxy.sessions import Session, SessionManager
class LocalAnimAddon(BaseAddon):
# name -> path, only for anims actually from files
local_anim_paths: Dict[str, str] = SessionProperty(dict)
# name -> anim bytes
local_anim_bytes: Dict[str, bytes] = SessionProperty(dict)
# name -> mtime or None. Only for anims from files.
local_anim_mtimes: Dict[str, Optional[float]] = SessionProperty(dict)
# name -> current asset ID (changes each play)
local_anim_playing_ids: Dict[str, UUID] = SessionProperty(dict)
anim_manglers: List[Callable[[Animation], Animation]] = GlobalProperty(list)
def handle_init(self, session_manager: SessionManager):
self.remangle_local_anims(session_manager)
def handle_session_init(self, session: Session):
# Reload anims and reload any manglers if we have any
self._schedule_task(self._try_reload_anims(session))
@handle_command()
@@ -66,11 +82,23 @@ class LocalAnimAddon(BaseAddon):
"""Stop a named local animation"""
self.apply_local_anim(session, region, anim_name, new_data=None)
@handle_command(anim_name=str)
async def save_local_anim(self, _session: Session, _region: ProxiedRegion, anim_name: str):
"""Save a named local anim to disk"""
anim_bytes = self.local_anim_bytes.get(anim_name)
if not anim_bytes:
return
filename = await AddonManager.UI.save_file(filter_str="SL Anim (*.anim)", default_suffix="anim")
if not filename:
return
with open(filename, "wb") as f:
f.write(anim_bytes)
async def _try_reload_anims(self, session: Session):
while True:
region = session.main_region
if not region:
await asyncio.sleep(2.0)
await asyncio.sleep(1.0)
continue
# Loop over local anims we loaded
@@ -79,19 +107,22 @@ class LocalAnimAddon(BaseAddon):
if not anim_id:
continue
# is playing right now, check if there's a newer version
self.apply_local_anim_from_file(session, region, anim_name, only_if_changed=True)
await asyncio.sleep(2.0)
try:
self.apply_local_anim_from_file(session, region, anim_name, only_if_changed=True)
except Exception:
logging.exception("Exploded while replaying animation")
await asyncio.sleep(1.0)
def handle_rlv_command(self, session: Session, region: ProxiedRegion, source: UUID,
cmd: str, options: List[str], param: str):
behaviour: str, options: List[str], param: str):
# We only handle forced commands ("=force"), not restrictions
if param != "force":
return
if cmd == "stop_local_anim":
if behaviour == "stop_local_anim":
self.apply_local_anim(session, region, options[0], new_data=None)
return True
elif cmd == "start_local_anim":
elif behaviour == "start_local_anim":
self.apply_local_anim_from_file(session, region, options[0])
return True
@@ -107,6 +138,7 @@ class LocalAnimAddon(BaseAddon):
AgentID=session.agent_id,
SessionID=session.id,
),
flags=PacketFlags.RELIABLE,
)
# Stop any old version of the anim that might be playing first
@@ -127,11 +159,13 @@ class LocalAnimAddon(BaseAddon):
StartAnim=True,
))
cls.local_anim_playing_ids[anim_name] = next_id
cls.local_anim_bytes[anim_name] = new_data
else:
# No data means just stop the anim
cls.local_anim_playing_ids.pop(anim_name, None)
cls.local_anim_bytes.pop(anim_name, None)
region.circuit.send_message(new_msg)
region.circuit.send(new_msg)
print(f"Changing {anim_name} to {next_id}")
@classmethod
@@ -141,11 +175,10 @@ class LocalAnimAddon(BaseAddon):
anim_data = None
if anim_path:
old_mtime = cls.local_anim_mtimes.get(anim_name)
mtime = _get_mtime(anim_path)
mtime = get_mtime(anim_path)
if only_if_changed and old_mtime == mtime:
return
cls.local_anim_mtimes[anim_name] = mtime
# file might not even exist anymore if mtime is `None`,
# anim will automatically stop if that happens.
if mtime:
@@ -156,9 +189,95 @@ class LocalAnimAddon(BaseAddon):
with open(anim_path, "rb") as f:
anim_data = f.read()
anim_data = cls._mangle_anim(anim_data)
cls.local_anim_mtimes[anim_name] = mtime
else:
print(f"Unknown anim {anim_name!r}")
cls.apply_local_anim(session, region, anim_name, new_data=anim_data)
@classmethod
def _mangle_anim(cls, anim_data: bytes) -> bytes:
if not cls.anim_manglers:
return anim_data
reader = se.BufferReader("<", anim_data)
spec = se.Dataclass(Animation)
anim = reader.read(spec)
for mangler in cls.anim_manglers:
anim = mangler(anim)
writer = se.BufferWriter("<")
writer.write(spec, anim)
return writer.copy_buffer()
@classmethod
def remangle_local_anims(cls, session_manager: SessionManager):
# Anim manglers are global, so we need to re-mangle anims for all sessions
for session in session_manager.sessions:
# Push the context of this session onto the stack so we can access
# session-scoped properties
with addon_ctx.push(new_session=session, new_region=session.main_region):
cls.local_anim_mtimes.clear()
def handle_http_request(self, session_manager: SessionManager, flow: HippoHTTPFlow):
if flow.name == "NewFileAgentInventoryUploader":
# Don't bother looking at this if we have no manglers
if not self.anim_manglers:
return
# This is kind of a crappy match, but these magic bytes shouldn't match anything that SL
# allows as an upload type except animations.
if not flow.request.content or not flow.request.content.startswith(b"\x01\x00\x00\x00"):
return
# Replace the uploaded anim with the mangled version
flow.request.content = self._mangle_anim(flow.request.content)
show_message("Mangled upload request")
class BaseAnimManglerAddon(BaseAddon):
"""Base class for addons that mangle uploaded or file-based local animations"""
ANIM_MANGLERS: List[Callable[[Animation], Animation]]
def handle_init(self, session_manager: SessionManager):
# Add our manglers into the list
LocalAnimAddon.anim_manglers.extend(self.ANIM_MANGLERS)
LocalAnimAddon.remangle_local_anims(session_manager)
def handle_unload(self, session_manager: SessionManager):
# Clean up our manglers before we go away
mangler_list = LocalAnimAddon.anim_manglers
for mangler in self.ANIM_MANGLERS:
if mangler in mangler_list:
mangler_list.remove(mangler)
LocalAnimAddon.remangle_local_anims(session_manager)
class BaseAnimHelperAddon(BaseAddon):
"""
Base class for local creation of procedural animations
Animation generated by build_anim() gets applied to all active sessions
"""
ANIM_NAME: str
def handle_session_init(self, session: Session):
self._reapply_anim(session, session.main_region)
def handle_session_closed(self, session: Session):
LocalAnimAddon.apply_local_anim(session, session.main_region, self.ANIM_NAME, None)
def handle_unload(self, session_manager: SessionManager):
for session in session_manager.sessions:
# TODO: Nasty. Since we need to access session-local attrs we need to set the
# context even though we also explicitly pass session and region.
# Need to rethink the LocalAnimAddon API.
with addon_ctx.push(session, session.main_region):
LocalAnimAddon.apply_local_anim(session, session.main_region, self.ANIM_NAME, None)
@abstractmethod
def build_anim(self) -> Animation:
pass
def _reapply_anim(self, session: Session, region: ProxiedRegion):
LocalAnimAddon.apply_local_anim(session, region, self.ANIM_NAME, self.build_anim().to_bytes())
addons = [LocalAnimAddon()]

View File

@@ -81,17 +81,16 @@ class MeshUploadInterceptingAddon(BaseAddon):
@handle_command()
async def set_local_mesh_target(self, session: Session, region: ProxiedRegion):
"""Set the currently selected object as the target for local mesh"""
parent_object = region.objects.lookup_localid(session.selected.object_local)
if not parent_object:
"""Set the currently selected objects as the target for local mesh"""
selected_links = [region.objects.lookup_localid(l_id) for l_id in session.selected.object_locals]
selected_links = [o for o in selected_links if o is not None]
if not selected_links:
show_message("Nothing selected")
return
linkset_objects = [parent_object] + parent_object.Children
old_locals = self.local_mesh_target_locals
self.local_mesh_target_locals = [
x.LocalID
for x in linkset_objects
for x in selected_links
if ExtraParamType.MESH in x.ExtraParams
]
@@ -201,7 +200,7 @@ class MeshUploadInterceptingAddon(BaseAddon):
self.local_mesh_mapping = {x["mesh_name"]: x["mesh"] for x in instances}
# Fake a response, we don't want to actually send off the request.
flow.response = mitmproxy.http.HTTPResponse.make(
flow.response = mitmproxy.http.Response.make(
200,
b"",
{
@@ -231,7 +230,7 @@ class MeshUploadInterceptingAddon(BaseAddon):
show_message("Mangled upload request")
def handle_object_updated(self, session: Session, region: ProxiedRegion,
obj: Object, updated_props: Set[str]):
obj: Object, updated_props: Set[str], msg: Optional[Message]):
if obj.LocalID not in self.local_mesh_target_locals:
return
if "Name" not in updated_props or obj.Name is None:
@@ -280,4 +279,23 @@ class MeshUploadInterceptingAddon(BaseAddon):
cls._replace_local_mesh(session.main_region, asset_repo, mesh_list)
class BaseMeshManglerAddon(BaseAddon):
"""Base class for addons that mangle uploaded or local mesh"""
MESH_MANGLERS: List[Callable[[MeshAsset], MeshAsset]]
def handle_init(self, session_manager: SessionManager):
# Add our manglers into the list
MeshUploadInterceptingAddon.mesh_manglers.extend(self.MESH_MANGLERS)
# Tell the local mesh plugin that the mangler list changed, and to re-apply
MeshUploadInterceptingAddon.remangle_local_mesh(session_manager)
def handle_unload(self, session_manager: SessionManager):
# Clean up our manglers before we go away
mangler_list = MeshUploadInterceptingAddon.mesh_manglers
for mangler in self.MESH_MANGLERS:
if mangler in mangler_list:
mangler_list.remove(mangler)
MeshUploadInterceptingAddon.remangle_local_mesh(session_manager)
addons = [MeshUploadInterceptingAddon()]

View File

@@ -8,28 +8,16 @@ applied to the mesh before upload.
I personally use manglers to strip bounding box materials you need
to add to give a mesh an arbitrary center of rotation / scaling.
"""
from hippolyzer.lib.base.helpers import reorient_coord
from hippolyzer.lib.base.mesh import MeshAsset
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.sessions import SessionManager
import local_mesh
AddonManager.hot_reload(local_mesh, require_addons_loaded=True)
def _reorient_coord(coord, orientation):
coords = []
for axis in orientation:
axis_idx = abs(axis) - 1
coords.append(coord[axis_idx] if axis >= 0 else 1.0 - coord[axis_idx])
if coord.__class__ in (list, tuple):
return coord.__class__(coords)
return coord.__class__(*coords)
def _reorient_coord_list(coord_list, orientation):
return [_reorient_coord(x, orientation) for x in coord_list]
def _reorient_coord_list(coord_list, orientation, min_val: int | float = 0):
return [reorient_coord(x, orientation, min_val) for x in coord_list]
def reorient_mesh(orientation):
@@ -37,37 +25,23 @@ def reorient_mesh(orientation):
# X=1, Y=2, Z=3
def _reorienter(mesh: MeshAsset):
for material in mesh.iter_lod_materials():
if "Position" not in material:
# Must be a NoGeometry LOD
continue
# We don't need to use positions_(to/from)_domain here since we're just naively
# flipping the axes around.
material["Position"] = _reorient_coord_list(material["Position"], orientation)
# Are you even supposed to do this to the normals?
material["Normal"] = _reorient_coord_list(material["Normal"], orientation)
material["Normal"] = _reorient_coord_list(material["Normal"], orientation, min_val=-1)
return mesh
return _reorienter
OUR_MANGLERS = [
# Negate the X and Y axes on any mesh we upload or create temp
reorient_mesh((-1, -2, 3)),
]
class ExampleMeshManglerAddon(local_mesh.BaseMeshManglerAddon):
MESH_MANGLERS = [
# Negate the X and Y axes on any mesh we upload or create temp
reorient_mesh((-1, -2, 3)),
]
class MeshManglerExampleAddon(BaseAddon):
def handle_init(self, session_manager: SessionManager):
# Add our manglers into the list
local_mesh_addon = local_mesh.MeshUploadInterceptingAddon
local_mesh_addon.mesh_manglers.extend(OUR_MANGLERS)
# Tell the local mesh plugin that the mangler list changed, and to re-apply
local_mesh_addon.remangle_local_mesh(session_manager)
def handle_unload(self, session_manager: SessionManager):
# Clean up our manglers before we go away
local_mesh_addon = local_mesh.MeshUploadInterceptingAddon
mangler_list = local_mesh_addon.mesh_manglers
for mangler in OUR_MANGLERS:
if mangler in mangler_list:
mangler_list.remove(mangler)
local_mesh_addon.remangle_local_mesh(session_manager)
addons = [MeshManglerExampleAddon()]
addons = [ExampleMeshManglerAddon()]

View File

@@ -0,0 +1,244 @@
"""
Message Mirror
Re-routes messages through the circuit of another agent running through this proxy,
rewriting the messages to use the credentials tied to that circuit.
Useful if you need to quickly QA authorization checks on a message handler or script.
Or if you want to chat as two people at once. Whatever.
Also shows some advanced ways of managing / rerouting Messages and HTTP flows.
Fiddle with the values of `SEND_NORMALLY` and `MIRROR` to change how and which
messages get moved to other circuits.
Usage: /524 mirror_to <mirror_agent_uuid>
To disable: /524 mirror_to
"""
import weakref
from typing import Optional
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.message.template_dict import DEFAULT_TEMPLATE_DICT
from hippolyzer.lib.base.network.transport import Direction
from hippolyzer.lib.proxy.addon_utils import BaseAddon, SessionProperty, show_message
from hippolyzer.lib.proxy.commands import handle_command, Parameter, parse_bool
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.caps import CapData, CapType
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session, SessionManager
# Things that make no sense to mirror, or will make everything explode if mirrored.
SEND_NORMALLY = {
'StartPingCheck', 'CompletePingCheck', 'PacketAck', 'SimulatorViewerTimeMessage', 'SimStats',
'SoundTrigger', 'EventQueueGet', 'GetMesh', 'GetMesh2', 'ParcelDwellRequest', 'ViewerEffect', 'ViewerStats',
'ParcelAccessListRequest', 'FirestormBridge', 'AvatarRenderInfo', 'ParcelPropertiesRequest', 'GetObjectCost',
'RequestMultipleObjects', 'GetObjectPhysicsData', 'GetExperienceInfo', 'RequestTaskInventory', 'AgentRequestSit',
'MuteListRequest', 'UpdateMuteListEntry', 'RemoveMuteListEntry', 'RequestImage',
'AgentThrottle', 'UseCircuitCode', 'AgentWearablesRequest', 'AvatarPickerRequest', 'CloseCircuit',
'CompleteAgentMovement', 'RegionHandshakeReply', 'LogoutRequest', 'ParcelPropertiesRequest',
'ParcelPropertiesRequestByID', 'MapBlockRequest', 'MapLayerRequest', 'MapItemRequest', 'MapNameRequest',
'ParcelAccessListRequest', 'AvatarPropertiesRequest', 'DirFindQuery',
'SetAlwaysRun', 'GetDisplayNames', 'ViewerMetrics', 'AgentResume', 'AgentPause',
'ViewerAsset', 'GetTexture', 'UUIDNameRequest', 'AgentUpdate', 'AgentAnimation',
# Would just be confusing for everyone
'ImprovedInstantMessage',
# Xfer system isn't authed to begin with, and duping Xfers can lead to premature file deletion. Skip.
'RequestXfer', 'ConfirmXferPacket', 'AbortXfer', 'SendXferPacket',
}
# Messages that _must_ be sent normally, but are worth mirroring onto the target session to see how
# they would respond
MIRROR = {
'RequestObjectPropertiesFamily', 'ObjectSelect', 'RequestObjectProperties', 'TransferRequest',
'RequestMultipleObjects', 'RequestTaskInventory', 'FetchInventory2', 'ScriptDialogReply',
'ObjectDeselect', 'GenericMessage', 'ChatFromViewer'
}
for msg_name in DEFAULT_TEMPLATE_DICT.message_templates.keys():
# There are a lot of these.
if msg_name.startswith("Group") and msg_name.endswith("Request"):
MIRROR.add(msg_name)
class MessageMirrorAddon(BaseAddon):
mirror_target_agent: Optional[UUID] = SessionProperty(None)
mirror_use_target_session: bool = SessionProperty(True)
mirror_use_target_agent: bool = SessionProperty(True)
@handle_command(target_agent=Parameter(UUID, optional=True))
async def mirror_to(self, session: Session, _region, target_agent: Optional[UUID] = None):
"""
Send this session's outbound messages over another proxied agent's circuit
"""
if target_agent:
if target_agent == session.agent_id:
show_message("Can't mirror our own session")
target_agent = None
elif not any(s.agent_id == target_agent for s in session.session_manager.sessions):
show_message(f"No active proxied session for agent {target_agent}")
target_agent = None
self.mirror_target_agent = target_agent
if target_agent:
show_message(f"Mirroring to {target_agent}")
else:
show_message("Message mirroring disabled")
@handle_command(enabled=parse_bool)
async def set_mirror_use_target_session(self, _session, _region, enabled):
"""Replace the original session ID with the target session's ID when mirroring"""
self.mirror_use_target_session = enabled
@handle_command(enabled=parse_bool)
async def set_mirror_use_target_agent(self, _session, _region, enabled):
"""Replace the original agent ID with the target agent's ID when mirroring"""
self.mirror_use_target_agent = enabled
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
if message.direction != Direction.OUT:
return
if not self.mirror_target_agent:
return
if message.name in SEND_NORMALLY:
return
target_session = None
for poss_session in session.session_manager.sessions:
if poss_session.agent_id == self.mirror_target_agent:
target_session = poss_session
if not target_session:
print("Couldn't find target session?")
return
target_region = None
for poss_region in target_session.regions:
if poss_region.circuit_addr == region.circuit_addr:
target_region = poss_region
if not target_region:
print("Couldn't find equivalent target region?")
return
# Send the message normally first if we're mirroring
if message.name in MIRROR:
region.circuit.send(message)
# We're going to send the message on a new circuit, so we need to take
# it so we get a new packet ID and clean ACKs
message = message.take()
self._lludp_fixups(target_session, message)
target_region.circuit.send(message)
return True
def _lludp_fixups(self, target_session: Session, message: Message):
if "AgentData" in message:
agent_block = message["AgentData"][0]
if "AgentID" in agent_block and self.mirror_use_target_agent:
agent_block["AgentID"] = target_session.agent_id
if "SessionID" in agent_block and self.mirror_use_target_session:
agent_block["SessionID"] = target_session.id
if message.name == "TransferRequest":
transfer_block = message["TransferInfo"][0]
# This is a duplicated message so we need to give it a new ID
transfer_block["TransferID"] = UUID.random()
params = transfer_block.deserialize_var("Params")
# This kind of Transfer might not even use agent credentials
if self.mirror_use_target_agent and hasattr(params, 'AgentID'):
params.AgentID = target_session.agent_id
if self.mirror_use_target_session and hasattr(params, 'SessionID'):
params.SessionID = target_session.id
transfer_block.serialize_var("Params", params)
def handle_http_request(self, session_manager: SessionManager, flow: HippoHTTPFlow):
# Already mirrored, ignore.
if flow.is_replay:
return
cap_data = flow.cap_data
if not cap_data:
return
if cap_data.cap_name in SEND_NORMALLY:
return
if cap_data.asset_server_cap:
return
# Likely doesn't have an exact equivalent in the target session; this is a temporary
# cap like an uploader URL or a stats URL.
if cap_data.type == CapType.TEMPORARY:
return
session: Optional[Session] = cap_data.session and cap_data.session()
if not session:
return
region: Optional[ProxiedRegion] = cap_data.region and cap_data.region()
if not region:
return
# Session-scoped, so we need to know if we have a session before checking
if not self.mirror_target_agent:
return
target_session: Optional[Session] = None
for poss_session in session.session_manager.sessions:
if poss_session.agent_id == self.mirror_target_agent:
target_session = poss_session
if not target_session:
return
caps_source = target_session
target_region: Optional[ProxiedRegion] = None
if region:
target_region = None
for poss_region in target_session.regions:
if poss_region.circuit_addr == region.circuit_addr:
target_region = poss_region
if not target_region:
print("No region in cap?")
return
caps_source = target_region
new_base_url = caps_source.cap_urls.get(cap_data.cap_name)
if not new_base_url:
print("No equiv cap?")
return
if cap_data.cap_name in MIRROR:
flow = flow.copy()
# Have the cap data reflect the new URL we're pointing at
flow.metadata["cap_data"] = CapData(
cap_name=cap_data.cap_name,
region=weakref.ref(target_region) if target_region else None,
session=weakref.ref(target_session),
base_url=new_base_url,
)
# Tack any params onto the new base URL for the cap
new_url = new_base_url + flow.request.url[len(cap_data.base_url):]
flow.request.url = new_url
if cap_data.cap_name in MIRROR:
self._replay_flow(flow, session.session_manager)
def _replay_flow(self, flow: HippoHTTPFlow, session_manager: SessionManager):
# Work around a mitmproxy bug: changing the URL updates the Host header, which may
# cause it to drop the port even when it shouldn't have. Fix the host header.
if flow.request.port not in (80, 443) and ":" not in flow.request.host_header:
flow.request.host_header = f"{flow.request.host}:{flow.request.port}"
# Should get repopulated when it goes back through the MITM addon
flow.metadata.pop("cap_data_ser", None)
flow.metadata.pop("cap_data", None)
proxy_queue = session_manager.flow_context.to_proxy_queue
proxy_queue.put_nowait(("replay", None, flow.get_state()))
addons = [MessageMirrorAddon()]

View File

@@ -0,0 +1,49 @@
"""
Example of proxy-provided caps
Useful for mocking out a cap that isn't actually implemented by the server
while developing the viewer-side pieces of it.
Implements a cap that accepts an `obj_id` UUID query parameter and returns
the name of the object.
"""
import asyncio
import asgiref.wsgi
from flask import Flask, Response, request
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.proxy import addon_ctx
from hippolyzer.lib.proxy.webapp_cap_addon import WebAppCapAddon
app = Flask("GetObjectNameCapApp")
@app.route('/')
async def get_object_name():
# Should always have the current region; the cap handler is bound to one.
# Just need to pull it from the `addon_ctx` module's global.
obj_mgr = addon_ctx.region.get().objects
obj_id = UUID(request.args['obj_id'])
obj = obj_mgr.lookup_fullid(obj_id)
if not obj:
return Response(f"Couldn't find {obj_id!r}", status=404, mimetype="text/plain")
try:
await asyncio.wait_for(obj_mgr.request_object_properties(obj)[0], 1.0)
except asyncio.TimeoutError:
return Response(f"Timed out requesting {obj_id!r}'s properties", status=500, mimetype="text/plain")
return Response(obj.Name, mimetype="text/plain")
class MockProxyCapExampleAddon(WebAppCapAddon):
# A cap URL with this name will be tied to each region when
# the sim is first connected to. The URL will be returned to the
# viewer in the Seed if the viewer requests it by name.
CAP_NAME = "GetObjectNameExample"
# Any asgi app should be fine.
APP = asgiref.wsgi.WsgiToAsgi(app)
addons = [MockProxyCapExampleAddon()]
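# Hedged usage sketch, not part of the example: once the cap has been granted via the
# Seed, hitting it is just a plain GET with an `obj_id` query parameter. The URL and the
# commented-out call below are made-up placeholders.
import urllib.parse
import urllib.request

def fetch_object_name(cap_url: str, obj_id: str) -> str:
    query = urllib.parse.urlencode({"obj_id": obj_id})
    with urllib.request.urlopen(f"{cap_url}?{query}") as resp:
        return resp.read().decode("utf8")

# print(fetch_object_name("https://example.invalid/GetObjectNameExample", str(UUID.random())))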

View File

@@ -27,7 +27,7 @@ from mitmproxy.http import HTTPFlow
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.jp2_utils import BufferedJp2k
from hippolyzer.lib.base.multiprocessing_utils import ParentProcessWatcher
from hippolyzer.lib.base.templates import TextureEntry
from hippolyzer.lib.base.templates import TextureEntryCollection
from hippolyzer.lib.proxy.addon_utils import AssetAliasTracker, BaseAddon, GlobalProperty, AddonProcess
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.base.message.message import Message
@@ -148,7 +148,7 @@ class MonochromeAddon(BaseAddon):
message["RegionInfo"][field_name] = tracker.get_alias_uuid(val)
@staticmethod
def _make_te_monochrome(tracker: AssetAliasTracker, parsed_te: TextureEntry):
def _make_te_monochrome(tracker: AssetAliasTracker, parsed_te: TextureEntryCollection):
# Need a deepcopy because TEs are owned by the ObjectManager
# and we don't want to change the canonical view.
parsed_te = copy.deepcopy(parsed_te)

View File

@@ -0,0 +1,111 @@
"""
Check object manager state against region ViewerObject cache
We can't simply compare every object we've tracked against every object in VOCache
and report mismatches, due to weird VOCache eviction criteria and certain
cacheable objects not being added to the VOCache.
Off the top of my head, animesh objects get explicit KillObjects at extreme
view distances, same as avatars, but will still be present in the cache even
though they will not be in gObjectList.
"""
import asyncio
import logging
from typing import *
from hippolyzer.lib.base.objects import normalize_object_update_compressed_data
from hippolyzer.lib.base.templates import ObjectUpdateFlags, PCode
from hippolyzer.lib.proxy.addon_utils import BaseAddon, GlobalProperty
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import SessionManager, Session
from hippolyzer.lib.proxy.vocache import is_valid_vocache_dir, RegionViewerObjectCacheChain
LOG = logging.getLogger(__name__)
class ObjectManagementValidator(BaseAddon):
base_cache_path: Optional[str] = GlobalProperty(None)
orig_auto_request: Optional[bool] = GlobalProperty(None)
def handle_init(self, session_manager: SessionManager):
if self.orig_auto_request is None:
self.orig_auto_request = session_manager.settings.ALLOW_AUTO_REQUEST_OBJECTS
session_manager.settings.ALLOW_AUTO_REQUEST_OBJECTS = False
async def _choose_cache_path():
while not self.base_cache_path:
cache_dir = await AddonManager.UI.open_dir("Choose the base cache directory")
if not cache_dir:
return
if not is_valid_vocache_dir(cache_dir):
continue
self.base_cache_path = cache_dir
if not self.base_cache_path:
self._schedule_task(_choose_cache_path(), session_scoped=False)
def handle_unload(self, session_manager: SessionManager):
session_manager.settings.ALLOW_AUTO_REQUEST_OBJECTS = self.orig_auto_request
def handle_session_init(self, session: Session):
# Use only the specified cache path for the vocache
session.cache_dir = self.base_cache_path
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
if message.name != "DisableSimulator":
return
# Send it off to the client without handling it normally, since
# we need to defer region teardown in the proxy
region.circuit.send(message)
self._schedule_task(self._check_cache_before_region_teardown(region))
return True
async def _check_cache_before_region_teardown(self, region: ProxiedRegion):
await asyncio.sleep(0.5)
print("Ok, checking cache differences")
try:
# Index will have been rewritten, so re-read it.
region_cache_chain = RegionViewerObjectCacheChain.for_region(
handle=region.handle,
cache_id=region.cache_id,
cache_dir=self.base_cache_path
)
if not region_cache_chain.region_caches:
print(f"no caches for {region!r}?")
return
all_full_ids = set()
for obj in region.objects.all_objects:
cacheable = True
orig_obj = obj
# Walk along the ancestry checking for things that would make the tree non-cacheable
while obj is not None:
if obj.UpdateFlags & ObjectUpdateFlags.TEMPORARY_ON_REZ:
cacheable = False
if obj.PCode == PCode.AVATAR:
cacheable = False
obj = obj.Parent
if cacheable:
all_full_ids.add(orig_obj.FullID)
for key in all_full_ids:
obj = region.objects.lookup_fullid(key)
cached_data = region_cache_chain.lookup_object_data(obj.LocalID, obj.CRC)
if not cached_data:
continue
orig_dict = obj.to_dict()
parsed_data = normalize_object_update_compressed_data(cached_data)
updated = obj.update_properties(parsed_data)
# Can't compare this yet
updated -= {"TextureEntry"}
if updated:
print(key)
for attr in updated:
print("\t", attr, orig_dict[attr], parsed_data[attr])
finally:
# Ok to teardown region in the proxy now
region.mark_dead()
addons = [ObjectManagementValidator()]

View File

@@ -10,6 +10,7 @@ before you start tracking can help too.
from typing import *
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.objects import Object
from hippolyzer.lib.base.templates import PCode
from hippolyzer.lib.proxy.addon_utils import BaseAddon, show_message, SessionProperty
@@ -20,7 +21,7 @@ from hippolyzer.lib.proxy.sessions import Session
class ObjectUpdateBlameAddon(BaseAddon):
update_blame_counter: Counter[UUID] = SessionProperty(Counter)
track_update_blame: bool = SessionProperty(False)
should_track_update_blame: bool = SessionProperty(False)
@handle_command()
async def precache_objects(self, _session: Session, region: ProxiedRegion):
@@ -38,11 +39,11 @@ class ObjectUpdateBlameAddon(BaseAddon):
@handle_command()
async def track_update_blame(self, _session: Session, _region: ProxiedRegion):
self.track_update_blame = True
self.should_track_update_blame = True
@handle_command()
async def untrack_update_blame(self, _session: Session, _region: ProxiedRegion):
self.track_update_blame = False
self.should_track_update_blame = False
@handle_command()
async def clear_update_blame(self, _session: Session, _region: ProxiedRegion):
@@ -57,8 +58,8 @@ class ObjectUpdateBlameAddon(BaseAddon):
print(f"{obj_id} ({name!r}): {count}")
def handle_object_updated(self, session: Session, region: ProxiedRegion,
obj: Object, updated_props: Set[str]):
if not self.track_update_blame:
obj: Object, updated_props: Set[str], msg: Optional[Message]):
if not self.should_track_update_blame:
return
if region != session.main_region:
return


@@ -0,0 +1,21 @@
import collections
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.proxy.addon_utils import BaseAddon, GlobalProperty
from hippolyzer.lib.proxy.commands import handle_command
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
class PacketStatsAddon(BaseAddon):
packet_stats: collections.Counter = GlobalProperty(collections.Counter)
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
self.packet_stats[message.name] += 1
@handle_command()
async def print_packet_stats(self, _session: Session, _region: ProxiedRegion):
print(self.packet_stats.most_common(10))
addons = [PacketStatsAddon()]
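
A small companion sketch, not part of the file above: following the same handle_command pattern, a method like this could be added to PacketStatsAddon to reset the counter from chat without reloading the addon.

    @handle_command()
    async def clear_packet_stats(self, _session: Session, _region: ProxiedRegion):
        """Reset the packet counter in place (illustrative sketch only)"""
        self.packet_stats.clear()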


@@ -37,7 +37,7 @@ class PaydayAddon(BaseAddon):
chat_type=ChatType.SHOUT,
)
# Do the traditional money dance.
session.main_region.circuit.send_message(Message(
session.main_region.circuit.send(Message(
"AgentAnimation",
Block("AgentData", AgentID=session.agent_id, SessionID=session.id),
Block("AnimationList", AnimID=UUID("928cae18-e31d-76fd-9cc9-2f55160ff818"), StartAnim=True),


@@ -9,13 +9,14 @@ import asyncio
import struct
from typing import *
from PySide2.QtGui import QImage
from PySide6.QtGui import QImage
from hippolyzer.lib.base.datatypes import UUID, Vector3, Quaternion
from hippolyzer.lib.base.helpers import to_chunks
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.templates import ObjectUpdateFlags, PCode, MCode, MultipleObjectUpdateFlags, TextureEntry
from hippolyzer.lib.client.object_manager import ObjectEvent, UpdateType
from hippolyzer.lib.base.templates import ObjectUpdateFlags, PCode, MCode, MultipleObjectUpdateFlags, \
TextureEntryCollection, JUST_CREATED_FLAGS
from hippolyzer.lib.client.object_manager import ObjectEvent, ObjectUpdateType
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.commands import handle_command
@@ -24,7 +25,6 @@ from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
JUST_CREATED_FLAGS = (ObjectUpdateFlags.CREATE_SELECTED | ObjectUpdateFlags.OBJECT_YOU_OWNER)
PRIM_SCALE = 0.2
@@ -42,7 +42,7 @@ class PixelArtistAddon(BaseAddon):
return
img = QImage()
with open(filename, "rb") as f:
img.loadFromData(f.read(), aformat=None)
img.loadFromData(f.read(), format=None)
img = img.convertToFormat(QImage.Format_RGBA8888)
height = img.height()
width = img.width()
@@ -72,15 +72,14 @@ class PixelArtistAddon(BaseAddon):
# Watch for any newly created prims, this is basically what the viewer does to find
# prims that it just created with the build tool.
with session.objects.events.subscribe_async(
(UpdateType.OBJECT_UPDATE,),
(ObjectUpdateType.UPDATE,),
predicate=lambda e: e.object.UpdateFlags & JUST_CREATED_FLAGS and "LocalID" in e.updated
) as get_events:
# Create a pool of prims to use for building the pixel art
for _ in range(needed_prims):
# TODO: We don't track the land group or user's active group, so
# "anyone can build" must be on for rezzing to work.
group_id = UUID()
region.circuit.send_message(Message(
# TODO: Can't get land group atm, just tries to rez with the user's active group
group_id = session.active_group
region.circuit.send(Message(
'ObjectAdd',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id, GroupID=group_id),
Block(
@@ -124,12 +123,12 @@ class PixelArtistAddon(BaseAddon):
y = i // width
obj = created_prims[prim_idx]
# Set a blank texture on all faces
te = TextureEntry()
te = TextureEntryCollection()
te.Textures[None] = UUID('5748decc-f629-461c-9a36-a35a221fe21f')
# Set the prim color to the color from the pixel
te.Color[None] = pixel_color
# Set the prim texture and color
region.circuit.send_message(Message(
region.circuit.send(Message(
'ObjectImage',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
Block('ObjectData', ObjectLocalID=obj.LocalID, MediaURL=b'', TextureEntry_=te),
@@ -149,7 +148,7 @@ class PixelArtistAddon(BaseAddon):
# Move the "pixels" to their correct position in chunks
for chunk in to_chunks(positioning_blocks, 25):
region.circuit.send_message(Message(
region.circuit.send(Message(
'MultipleObjectUpdate',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
*chunk,


@@ -0,0 +1,111 @@
"""
Control a puppetry-enabled viewer and make your neck spin like crazy
It currently requires a custom Firestorm build with the puppetry branch rebased on top,
plus extra patches so that startup LEAP scripts are treated as puppetry modules.
Basically, you probably don't want to use this yet. But hey, Puppetry is still only
on the beta grid anyway.
"""
import asyncio
import enum
import logging
import math
from typing import *
import outleap
from hippolyzer.lib.base.datatypes import Quaternion
from hippolyzer.lib.proxy.addon_utils import BaseAddon, SessionProperty
from hippolyzer.lib.proxy.sessions import Session
LOG = logging.getLogger(__name__)
class BodyPartMask(enum.IntFlag):
"""Which joints to send the viewer as part of "move" puppetry command"""
HEAD = 1 << 0
FACE = 1 << 1
LHAND = 1 << 2
RHAND = 1 << 3
FINGERS = 1 << 4
def register_puppetry_command(func: Callable[[dict], Awaitable[None]]):
"""Register a method as handling inbound puppetry commands from the viewer"""
func._puppetry_command = True
return func
class PuppetryExampleAddon(BaseAddon):
server_skeleton: Dict[str, Dict[str, Any]] = SessionProperty(dict)
camera_num: int = SessionProperty(0)
parts_active: BodyPartMask = SessionProperty(lambda: BodyPartMask(0x1F))
puppetry_api: Optional[outleap.LLPuppetryAPI] = SessionProperty(None)
leap_client: Optional[outleap.LEAPClient] = SessionProperty(None)
def handle_session_init(self, session: Session):
if not session.leap_client:
return
self.puppetry_api = outleap.LLPuppetryAPI(session.leap_client)
self.leap_client = session.leap_client
self._schedule_task(self._serve())
self._schedule_task(self._exorcist(session))
@register_puppetry_command
async def enable_parts(self, args: dict):
if (new_mask := args.get("parts_mask")) is not None:
self.parts_active = BodyPartMask(new_mask)
@register_puppetry_command
async def set_camera(self, args: dict):
if (camera_num := args.get("camera_num")) is not None:
self.camera_num = camera_num
@register_puppetry_command
async def stop(self, _args: dict):
LOG.info("Viewer asked us to stop puppetry")
@register_puppetry_command
async def log(self, _args: dict):
# Intentionally ignored, we don't care about things the viewer
# asked us to log
pass
@register_puppetry_command
async def set_skeleton(self, args: dict):
# Don't really care what the viewer thinks the skeleton looks like.
# Just store it.
self.server_skeleton = args
async def _serve(self):
"""Handle inbound puppetry commands from viewer in a loop"""
async with self.leap_client.listen_scoped("puppetry.controller") as listener:
while True:
msg = await listener.get()
cmd = msg["command"]
handler = getattr(self, cmd, None)
if handler is None or not hasattr(handler, "_puppetry_command"):
LOG.warning(f"Unknown puppetry command {cmd!r}: {msg!r}")
continue
await handler(msg.get("args", {}))
async def _exorcist(self, session):
"""Do the Linda Blair thing with your neck"""
spin_rad = 0.0
while True:
await asyncio.sleep(0.05)
if not session.main_region:
continue
# Wrap spin_rad around if necessary
while spin_rad > math.pi:
spin_rad -= math.pi * 2
# LEAP wants rot as a quaternion with just the imaginary parts.
neck_rot = Quaternion.from_euler(0, 0, spin_rad).data(3)
self.puppetry_api.move({
"mNeck": {"no_constraint": True, "local_rot": neck_rot},
})
spin_rad += math.pi / 25
addons = [PuppetryExampleAddon()]
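
An illustrative sketch of how the dispatch in _serve() is meant to be extended: any async method on the addon tagged with @register_puppetry_command is looked up by the msg["command"] name. The "look_at" command below is hypothetical, not part of the real puppetry protocol.

    @register_puppetry_command
    async def look_at(self, args: dict):
        # Hypothetical command name: _serve() finds this method via getattr() and only
        # calls it because the decorator set the _puppetry_command marker on it.
        LOG.info(f"Viewer sent look_at args: {args!r}")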


@@ -116,7 +116,7 @@ class RecapitatorAddon(BaseAddon):
except:
logging.exception("Exception while recapitating")
# Tell the viewer about the status of its original upload
region.circuit.send_message(Message(
region.circuit.send(Message(
"AssetUploadComplete",
Block("AssetBlock", UUID=asset_id, Type=asset_block["Type"], Success=success),
direction=Direction.IN,


@@ -0,0 +1,53 @@
"""
You don't need RLV, we have RLV at home.
RLV at home:
"""
from typing import *
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Message, Block
from hippolyzer.lib.base.templates import ChatType
from hippolyzer.lib.proxy.addon_utils import BaseAddon, send_chat
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
def send_rlv_chat(channel: int, message: str):
# We always shout.
send_chat(channel=channel, message=message, chat_type=ChatType.NORMAL)
class RLVAtHomeAddon(BaseAddon):
"""
Addon for pretending to be an RLV-enabled viewer
Useful if you want only a specific subset of RLV and don't want everything RLV normally allows,
or want to override some RLV builtins.
"""
def handle_rlv_command(self, session: Session, region: ProxiedRegion, source: UUID,
behaviour: str, options: List[str], param: str) -> bool | None:
# print(behaviour, options, param)
if behaviour == "clear":
return True
elif behaviour in ("versionnum", "versionnew", "version"):
# People tend to just check that this returned anything at all. Just say we're 2.0.0 for all of these.
send_rlv_chat(int(param), "2.0.0")
return True
elif behaviour == "getinv":
# Pretend we don't have anything
send_rlv_chat(int(param), "")
return True
elif behaviour == "sit":
# Sure, we can sit on stuff, whatever.
region.circuit.send(Message(
'AgentRequestSit',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
Block('TargetObject', TargetID=UUID(options[0]), Offset=(0, 0, 0)),
))
return True
return None
addons = [RLVAtHomeAddon()]
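
For context, RLV commands arrive as llOwnerSay text of the form "@behaviour:options=param", and the proxy parses them into the handler arguments above. A hedged sketch of answering one more query in a separate addon (the exact reply format scripts expect for @getstatus varies; this just sends something back on the requested channel):

class RLVGetStatusAddon(BaseAddon):
    def handle_rlv_command(self, session: Session, region: ProxiedRegion, source: UUID,
                           behaviour: str, options: List[str], param: str) -> bool | None:
        if behaviour == "getstatus":
            # Claim that no restrictions are active; many scripts only check that a reply arrived.
            send_rlv_chat(int(param), "")
            return True
        # None means "not handled here", letting the behaviours above fall through as before.
        return None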


@@ -6,7 +6,13 @@ from hippolyzer.lib.base.network.transport import Direction
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
SUSPICIOUS_PACKETS = {"TransferRequest", "UUIDNameRequest", "UUIDGroupNameRequest", "OpenCircuit"}
SUSPICIOUS_PACKETS = {
"TransferRequest",
"UUIDNameRequest",
"UUIDGroupNameRequest",
"OpenCircuit",
"AddCircuitCode",
}
REGULAR_IM_DIALOGS = (IMDialogType.TYPING_STOP, IMDialogType.TYPING_STOP, IMDialogType.NOTHING_SPECIAL)


@@ -0,0 +1,22 @@
import random
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
class SimulatePacketLossAddon(BaseAddon):
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
# Messing with these may kill your circuit
if message.name in {"PacketAck", "StartPingCheck", "CompletePingCheck", "UseCircuitCode",
"CompleteAgentMovement", "AgentMovementComplete"}:
return
# Simulate 30% packet loss
if random.random() > 0.7:
# Do nothing, drop this packet on the floor
return True
return
addons = [SimulatePacketLossAddon()]
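
A variant sketch of the same hook with the drop probability pulled out into a class attribute; it reuses only what the addon above already imports, and the 10% figure is just an example.

class TunablePacketLossAddon(BaseAddon):
    DROP_RATE = 0.1  # fraction of eligible packets to drop

    def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
        # Leave circuit-critical messages alone, same as above
        if message.name in {"PacketAck", "StartPingCheck", "CompletePingCheck", "UseCircuitCode",
                            "CompleteAgentMovement", "AgentMovementComplete"}:
            return
        if random.random() < self.DROP_RATE:
            # Returning True swallows the packet so neither side sees it
            return True
        return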


@@ -13,7 +13,7 @@ def _to_spongecase(val):
def handle_lludp_message(session: Session, _region: ProxiedRegion, message: Message):
ctx = session.addon_ctx
ctx = session.addon_ctx[__name__]
ctx.setdefault("spongecase", False)
if message.name == "ChatFromViewer":
chat = message["ChatData"]["Message"]


@@ -0,0 +1,55 @@
"""
Tail animation generator
Demonstrates programmatic generation of local motions using BaseAnimHelperAddon
You can use this to create an animation with a script, fiddle with it until it
looks right, then finally save it with /524 save_local_anim <ANIM_NAME>.
The built animation is automatically applied to all active sessions when loaded,
and is re-generated whenever the script is edited. Unloading the script stops
the animations.
"""
from hippolyzer.lib.base.anim_utils import shift_keyframes, smooth_rot
from hippolyzer.lib.base.datatypes import Quaternion
from hippolyzer.lib.base.llanim import Animation, Joint
from hippolyzer.lib.proxy.addons import AddonManager
import local_anim
AddonManager.hot_reload(local_anim, require_addons_loaded=True)
class TailAnimator(local_anim.BaseAnimHelperAddon):
# Should be unique
ANIM_NAME = "tail_anim"
def build_anim(self) -> Animation:
anim = Animation(
base_priority=5,
duration=5.0,
loop_out_point=5.0,
loop=True,
)
# Iterate along tail joints 1 through 6
for joint_num in range(1, 7):
# Give joints further along the tail a wider range of motion
start_rot = Quaternion.from_euler(0.2, -0.3, 0.15 * joint_num)
end_rot = Quaternion.from_euler(-0.2, -0.3, -0.15 * joint_num)
rot_keyframes = [
# Tween between start_rot and end_rot, using smooth interpolation.
# SL's keyframes only allow linear interpolation, which doesn't look great
# for natural motions. `smooth_rot()` gets around that by generating
# intermediate frames for SL to linearly interpolate between.
*smooth_rot(start_rot, end_rot, inter_frames=10, time=0.0, duration=2.5),
*smooth_rot(end_rot, start_rot, inter_frames=10, time=2.5, duration=2.5),
]
anim.joints[f"mTail{joint_num}"] = Joint(
priority=5,
# Each joint's frames should be ahead of the previous joint's by 2 frames
rot_keyframes=shift_keyframes(rot_keyframes, joint_num * 2),
)
return anim
addons = [TailAnimator()]
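
If positional motion were wanted as well, a hedged tweak could build a pos_keyframes list with smooth_pos() from the same module; whether Joint accepts a pos_keyframes argument alongside rot_keyframes isn't shown in this diff, so that part is an assumption.

from hippolyzer.lib.base.anim_utils import smooth_pos
from hippolyzer.lib.base.datatypes import Vector3

bob_keyframes = [
    # Gently bob the joint up 2cm and back down over the 5 second loop
    *smooth_pos(Vector3(0, 0, 0), Vector3(0, 0, 0.02), inter_frames=10, time=0.0, duration=2.5),
    *smooth_pos(Vector3(0, 0, 0.02), Vector3(0, 0, 0), inter_frames=10, time=2.5, duration=2.5),
]
# Assumed signature: anim.joints[...] = Joint(priority=5, rot_keyframes=..., pos_keyframes=bob_keyframes)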


@@ -3,7 +3,7 @@ Example of how to request a Transfer
"""
from typing import *
from hippolyzer.lib.base.legacy_inv import InventoryModel, InventoryItem
from hippolyzer.lib.base.inventory import InventoryModel, InventoryItem
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.templates import (
AssetType,
@@ -35,7 +35,7 @@ class TransferExampleAddon(BaseAddon):
async def get_first_script(self, session: Session, region: ProxiedRegion):
"""Get the contents of the first script in the selected object"""
# Ask for the object inventory so we can find a script
region.circuit.send_message(Message(
region.circuit.send(Message(
'RequestTaskInventory',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
Block('InventoryData', LocalID=session.selected.object_local),
@@ -47,7 +47,7 @@ class TransferExampleAddon(BaseAddon):
file_name=inv_message["InventoryData"]["Filename"], file_path=XferFilePath.CACHE)
inv_model = InventoryModel.from_bytes(xfer.reassemble_chunks())
first_script: Optional[InventoryItem] = None
for item in inv_model.items.values():
for item in inv_model.all_items:
if item.type == "lsltext":
first_script = item
if not first_script:


@@ -64,12 +64,12 @@ class TurboObjectInventoryAddon(BaseAddon):
# Any previous requests will have triggered a delete of the inventory file
# by marking it complete on the server side. Re-send our RequestTaskInventory
# to make sure there's a fresh copy.
region.circuit.send_message(request_msg.take())
region.circuit.send(request_msg.take())
inv_message = await region.message_handler.wait_for(('ReplyTaskInventory',), timeout=5.0)
# No task inventory, send the reply as-is
file_name = inv_message["InventoryData"]["Filename"]
if not file_name:
region.circuit.send_message(inv_message)
region.circuit.send(inv_message)
return
xfer = region.xfer_manager.request(
@@ -87,7 +87,7 @@ class TurboObjectInventoryAddon(BaseAddon):
continue
# Send the original ReplyTaskInventory to the viewer so it knows the file is ready
region.circuit.send_message(inv_message)
region.circuit.send(inv_message)
proxied_xfer = Xfer(data=xfer.reassemble_chunks())
# Wait for the viewer to request the inventory file


@@ -2,21 +2,17 @@
Example of how to upload assets; assumes the asset is already encoded
in the appropriate format.
/524 upload <asset type>
/524 upload_asset <asset type>
"""
import pprint
from pathlib import Path
from typing import *
import aiohttp
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.mesh import LLMeshSerializer
from hippolyzer.lib.base.serialization import BufferReader
from hippolyzer.lib.base.templates import AssetType
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.addon_utils import ais_item_to_inventory_data, show_message, BaseAddon
from hippolyzer.lib.proxy.addon_utils import show_message, BaseAddon
from hippolyzer.lib.proxy.commands import handle_command, Parameter
from hippolyzer.lib.base.network.transport import Direction
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
@@ -29,7 +25,6 @@ class UploaderAddon(BaseAddon):
async def upload_asset(self, _session: Session, region: ProxiedRegion,
asset_type: AssetType, flags: Optional[int] = None):
"""Upload a raw asset with optional flags"""
inv_type = asset_type.inventory_type
file = await AddonManager.UI.open_file()
if not file:
return
@@ -42,67 +37,32 @@ class UploaderAddon(BaseAddon):
with open(file, "rb") as f:
file_body = f.read()
params = {
"asset_type": asset_type.human_name,
"description": "(No Description)",
"everyone_mask": 0,
"group_mask": 0,
"folder_id": UUID(), # Puts it in the default folder, I guess. Undocumented.
"inventory_type": inv_type.human_name,
"name": name,
"next_owner_mask": 581632,
}
if flags is not None:
params['flags'] = flags
try:
if asset_type == AssetType.MESH:
# Kicking off a mesh upload works a little differently internally
# Half-parse the mesh so that we can figure out how many faces it has
reader = BufferReader("!", file_body)
mesh = reader.read(LLMeshSerializer(parse_segment_contents=False))
upload_token = await region.asset_uploader.initiate_mesh_upload(
name, mesh, flags=flags
)
else:
upload_token = await region.asset_uploader.initiate_asset_upload(
name, asset_type, file_body, flags=flags,
)
except Exception as e:
show_message(e)
raise
caps = region.caps_client
async with aiohttp.ClientSession() as sess:
async with caps.post('NewFileAgentInventory', llsd=params, session=sess) as resp:
parsed = await resp.read_llsd()
if "uploader" not in parsed:
show_message(f"Upload error!: {parsed!r}")
return
print("Got upload URL, uploading...")
if not await AddonManager.UI.confirm("Upload", f"Spend {upload_token.linden_cost}L on upload?"):
return
async with caps.post(parsed["uploader"], data=file_body, session=sess) as resp:
upload_parsed = await resp.read_llsd()
if "new_inventory_item" not in upload_parsed:
show_message(f"Got weird upload resp: {pprint.pformat(upload_parsed)}")
return
await self._force_inv_update(region, upload_parsed['new_inventory_item'])
@handle_command(item_id=UUID)
async def force_inv_update(self, _session: Session, region: ProxiedRegion, item_id: UUID):
"""Force an inventory update for a given item id"""
await self._force_inv_update(region, item_id)
async def _force_inv_update(self, region: ProxiedRegion, item_id: UUID):
session = region.session()
ais_req_data = {
"items": [
{
"owner_id": session.agent_id,
"item_id": item_id,
}
]
}
async with region.caps_client.post('FetchInventory2', llsd=ais_req_data) as resp:
ais_item = (await resp.read_llsd())["items"][0]
message = Message(
"UpdateCreateInventoryItem",
Block(
"AgentData",
AgentID=session.agent_id,
SimApproved=1,
TransactionID=UUID.random(),
),
ais_item_to_inventory_data(ais_item),
direction=Direction.IN
)
region.circuit.send_message(message)
# Do the actual upload
try:
await region.asset_uploader.complete_upload(upload_token)
except Exception as e:
show_message(e)
raise
addons = [UploaderAddon()]


@@ -2,7 +2,7 @@
Example of how to request an Xfer
"""
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.legacy_inv import InventoryModel
from hippolyzer.lib.base.inventory import InventoryModel
from hippolyzer.lib.base.templates import XferFilePath, AssetType, InventoryType, WearableType
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.proxy.addon_utils import BaseAddon, show_message
@@ -15,7 +15,7 @@ class XferExampleAddon(BaseAddon):
@handle_command()
async def get_mute_list(self, session: Session, region: ProxiedRegion):
"""Fetch the current user's mute list"""
region.circuit.send_message(Message(
region.circuit.send(Message(
'MuteListRequest',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
Block("MuteData", MuteCRC=0),
@@ -35,7 +35,7 @@ class XferExampleAddon(BaseAddon):
@handle_command()
async def get_task_inventory(self, session: Session, region: ProxiedRegion):
"""Get the inventory of the currently selected object"""
region.circuit.send_message(Message(
region.circuit.send(Message(
'RequestTaskInventory',
# If no session is passed in we'll use the active session when the coro was created
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
@@ -57,7 +57,7 @@ class XferExampleAddon(BaseAddon):
await xfer
inv_model = InventoryModel.from_bytes(xfer.reassemble_chunks())
item_names = [item.name for item in inv_model.items.values()]
item_names = [item.name for item in inv_model.all_items]
show_message(item_names)
@handle_command()
@@ -98,7 +98,7 @@ textures 1
data=asset_data,
transaction_id=transaction_id
)
region.circuit.send_message(Message(
region.circuit.send(Message(
'CreateInventoryItem',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
Block(


@@ -0,0 +1,53 @@
"""
A simple client that just says hello to people
"""
import asyncio
import pprint
from contextlib import aclosing
import os
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.templates import ChatType, ChatSourceType
from hippolyzer.lib.client.hippo_client import HippoClient
async def amain():
client = HippoClient()
async def _respond_to_chat(message: Message):
if message["ChatData"]["SourceID"] == client.session.agent_id:
return
if message["ChatData"]["SourceType"] != ChatSourceType.AGENT:
return
if "hello" not in message["ChatData"]["Message"].lower():
return
await client.send_chat(f'Hello {message["ChatData"]["FromName"]}!', chat_type=ChatType.SHOUT)
async with aclosing(client):
await client.login(
username=os.environ["HIPPO_USERNAME"],
password=os.environ["HIPPO_PASSWORD"],
start_location=os.environ.get("HIPPO_START_LOCATION", "last"),
)
print("I'm here")
# Wait until we have details about parcels and print them
await client.main_region.parcel_manager.parcels_downloaded.wait()
pprint.pprint(client.main_region.parcel_manager.parcels)
await client.send_chat("Hello World!", chat_type=ChatType.SHOUT)
client.session.message_handler.subscribe("ChatFromSimulator", _respond_to_chat)
# Example of how to work with caps
async with client.main_caps_client.get("SimulatorFeatures") as features_resp:
print("Features:", await features_resp.read_llsd())
while True:
try:
await asyncio.sleep(0.001)
except (KeyboardInterrupt, asyncio.CancelledError):
await client.send_chat("Goodbye World!", chat_type=ChatType.SHOUT)
return
if __name__ == "__main__":
asyncio.run(amain())
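
As an aside, a one-shot alternative to subscribing a callback might look like the sketch below; it assumes the session-level message handler exposes the same wait_for() coroutine used on region message handlers elsewhere in this changeset.

async def wait_for_one_chat(client: HippoClient) -> Message:
    # Block until a single ChatFromSimulator arrives (or time out after a minute)
    chat = await client.session.message_handler.wait_for(("ChatFromSimulator",), timeout=60.0)
    print("Heard:", chat["ChatData"]["Message"])
    return chat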


@@ -191,7 +191,7 @@
</size>
</property>
<property name="styleSheet">
<string notr="true">color: rgb(80, 0, 0)</string>
<string notr="true"/>
</property>
<property name="tabChangesFocus">
<bool>true</bool>


@@ -2,7 +2,7 @@ import enum
import logging
import typing
from PySide2 import QtCore, QtGui
from PySide6 import QtCore, QtGui
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.message_logger import FilteringMessageLogger
@@ -19,9 +19,9 @@ class MessageLogHeader(enum.IntEnum):
class MessageLogModel(QtCore.QAbstractTableModel, FilteringMessageLogger):
def __init__(self, parent=None):
def __init__(self, parent=None, maxlen=2000):
QtCore.QAbstractTableModel.__init__(self, parent)
FilteringMessageLogger.__init__(self)
FilteringMessageLogger.__init__(self, maxlen=maxlen)
def _begin_insert(self, insert_idx: int):
self.beginInsertRows(QtCore.QModelIndex(), insert_idx, insert_idx)


@@ -7,14 +7,16 @@ import sys
import time
from typing import Optional
import mitmproxy.ctx
import mitmproxy.exceptions
import outleap
from hippolyzer.lib.base import llsd
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.ca_utils import setup_ca
from hippolyzer.lib.proxy.commands import handle_command
from hippolyzer.lib.proxy.http_proxy import create_http_proxy, create_proxy_master, HTTPFlowContext
from hippolyzer.lib.proxy.http_proxy import create_http_proxy, HTTPFlowContext
from hippolyzer.lib.proxy.http_event_manager import MITMProxyEventManager
from hippolyzer.lib.proxy.lludp_proxy import SLSOCKS5Server
from hippolyzer.lib.base.message.message import Message
@@ -43,7 +45,7 @@ class SelectionManagerAddon(BaseAddon):
LOG.debug(f"Don't know about selected {local_id}, requesting object")
needed_objects.add(local_id)
if needed_objects:
if needed_objects and session.session_manager.settings.ALLOW_AUTO_REQUEST_OBJECTS:
region.objects.request_objects(needed_objects)
# ParcelDwellRequests are sent whenever "about land" is opened. This gives us a
# decent mechanism for selecting parcels.
@@ -75,6 +77,15 @@ class SelectionManagerAddon(BaseAddon):
selected.task_item = parsed["item-id"]
class AgentUpdaterAddon(BaseAddon):
def handle_eq_event(self, session: Session, region: ProxiedRegion, event: dict):
if event['message'] != 'AgentGroupDataUpdate':
return
session.groups.clear()
for group in event['body']['GroupData']:
session.groups.add(group['GroupID'])
class REPLAddon(BaseAddon):
@handle_command()
async def spawn_repl(self, session: Session, region: ProxiedRegion):
@@ -83,32 +94,36 @@ class REPLAddon(BaseAddon):
AddonManager.spawn_repl()
def run_http_proxy_process(proxy_host, http_proxy_port, flow_context: HTTPFlowContext):
def run_http_proxy_process(proxy_host, http_proxy_port, flow_context: HTTPFlowContext, ssl_insecure=False):
mitm_loop = asyncio.new_event_loop()
asyncio.set_event_loop(mitm_loop)
mitmproxy_master = create_http_proxy(proxy_host, http_proxy_port, flow_context)
mitmproxy_master.start_server()
gc.freeze()
flow_context.mitmproxy_ready.set()
mitm_loop.run_forever()
async def mitmproxy_loop():
mitmproxy_master = create_http_proxy(proxy_host, http_proxy_port, flow_context, ssl_insecure=ssl_insecure)
gc.freeze()
await mitmproxy_master.run()
asyncio.run(mitmproxy_loop())
def start_proxy(session_manager: SessionManager, extra_addons: Optional[list] = None,
extra_addon_paths: Optional[list] = None, proxy_host=None):
extra_addon_paths: Optional[list] = None, proxy_host=None, ssl_insecure=False):
extra_addons = extra_addons or []
extra_addon_paths = extra_addon_paths or []
extra_addons.append(SelectionManagerAddon())
extra_addons.append(REPLAddon())
extra_addons.append(AgentUpdaterAddon())
root_log = logging.getLogger()
root_log.addHandler(logging.StreamHandler())
root_log.setLevel(logging.INFO)
logging.basicConfig()
loop = asyncio.get_event_loop()
loop = asyncio.get_event_loop_policy().get_event_loop()
udp_proxy_port = session_manager.settings.SOCKS_PROXY_PORT
http_proxy_port = session_manager.settings.HTTP_PROXY_PORT
leap_port = session_manager.settings.LEAP_PORT
if proxy_host is None:
proxy_host = session_manager.settings.PROXY_BIND_ADDR
@@ -118,25 +133,28 @@ def start_proxy(session_manager: SessionManager, extra_addons: Optional[list] =
# TODO: argparse
if len(sys.argv) == 3:
if sys.argv[1] == "--setup-ca":
try:
mitmproxy_master = create_http_proxy(proxy_host, http_proxy_port, flow_context)
except mitmproxy.exceptions.ServerException:
# Proxy already running, create the master so we don't try to bind to a port
mitmproxy_master = create_proxy_master(proxy_host, http_proxy_port, flow_context)
mitmproxy_master = create_http_proxy(proxy_host, http_proxy_port, flow_context)
setup_ca(sys.argv[2], mitmproxy_master)
return sys.exit(0)
http_proc = multiprocessing.Process(
target=run_http_proxy_process,
args=(proxy_host, http_proxy_port, flow_context),
args=(proxy_host, http_proxy_port, flow_context, ssl_insecure),
daemon=True,
)
http_proc.start()
# These need to be set for mitmproxy's ASGIApp serving code to work.
mitmproxy.ctx.master = None
mitmproxy.ctx.log = logging.getLogger("mitmproxy log")
server = SLSOCKS5Server(session_manager)
coro = asyncio.start_server(server.handle_connection, proxy_host, udp_proxy_port)
async_server = loop.run_until_complete(coro)
leap_server = outleap.LEAPBridgeServer(session_manager.leap_client_connected)
coro = asyncio.start_server(leap_server.handle_connection, proxy_host, leap_port)
async_leap_server = loop.run_until_complete(coro)
event_manager = MITMProxyEventManager(session_manager, flow_context)
loop.create_task(event_manager.run())
@@ -163,6 +181,8 @@ def start_proxy(session_manager: SessionManager, extra_addons: Optional[list] =
# Close the server
print("Closing SOCKS server")
async_server.close()
print("Shutting down LEAP server")
async_leap_server.close()
print("Shutting down addons")
AddonManager.shutdown()
print("Waiting for SOCKS server to close")


@@ -17,14 +17,14 @@ import urllib.parse
from typing import *
import multidict
from qasync import QEventLoop
from PySide2 import QtCore, QtWidgets, QtGui
from qasync import QEventLoop, asyncSlot
from PySide6 import QtCore, QtWidgets, QtGui
from hippolyzer.apps.model import MessageLogModel, MessageLogHeader, RegionListModel
from hippolyzer.apps.proxy import start_proxy
from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.helpers import bytes_unescape, bytes_escape, get_resource_filename
from hippolyzer.lib.base.helpers import bytes_unescape, bytes_escape, get_resource_filename, create_logged_task
from hippolyzer.lib.base.message.llsd_msg_serializer import LLSDMessageSerializer
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.message.message_formatting import (
@@ -34,15 +34,18 @@ from hippolyzer.lib.base.message.message_formatting import (
SpannedString,
)
from hippolyzer.lib.base.message.msgtypes import MsgType
from hippolyzer.lib.base.message.template_dict import TemplateDictionary
from hippolyzer.lib.base.message.template_dict import DEFAULT_TEMPLATE_DICT
from hippolyzer.lib.base.settings import SettingDescriptor
from hippolyzer.lib.base.ui_helpers import loadUi
import hippolyzer.lib.base.serialization as se
from hippolyzer.lib.base.network.transport import Direction, SocketUDPTransport
from hippolyzer.lib.client.state import BaseClientSessionManager
from hippolyzer.lib.proxy.addons import BaseInteractionManager, AddonManager
from hippolyzer.lib.proxy.ca_utils import setup_ca_everywhere
from hippolyzer.lib.proxy.caps_client import ProxyCapsClient
from hippolyzer.lib.proxy.http_proxy import create_proxy_master, HTTPFlowContext
from hippolyzer.lib.proxy.message_logger import LLUDPMessageLogEntry, AbstractMessageLogEntry
from hippolyzer.lib.proxy.http_proxy import create_http_proxy, HTTPFlowContext
from hippolyzer.lib.proxy.message_logger import LLUDPMessageLogEntry, AbstractMessageLogEntry, WrappingMessageLogger, \
import_log_entries, export_log_entries
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session, SessionManager
from hippolyzer.lib.proxy.settings import ProxySettings
@@ -60,7 +63,7 @@ def show_error_message(error_msg, parent=None):
error_dialog = QtWidgets.QErrorMessage(parent=parent)
# No obvious way to set this to plaintext, yuck...
error_dialog.showMessage(html.escape(error_msg))
error_dialog.exec_()
error_dialog.exec()
error_dialog.raise_()
@@ -68,11 +71,12 @@ class GUISessionManager(SessionManager, QtCore.QObject):
regionAdded = QtCore.Signal(ProxiedRegion)
regionRemoved = QtCore.Signal(ProxiedRegion)
def __init__(self, settings, model):
def __init__(self, settings):
BaseClientSessionManager.__init__(self)
SessionManager.__init__(self, settings)
QtCore.QObject.__init__(self)
self.all_regions = []
self.message_logger = model
self.message_logger = WrappingMessageLogger()
def checkRegions(self):
new_regions = itertools.chain(*[s.regions for s in self.sessions])
@@ -87,13 +91,13 @@ class GUISessionManager(SessionManager, QtCore.QObject):
self.all_regions = new_regions
class GUIInteractionManager(BaseInteractionManager, QtCore.QObject):
def __init__(self, parent):
class GUIInteractionManager(BaseInteractionManager):
def __init__(self, parent: QtWidgets.QWidget):
BaseInteractionManager.__init__(self)
QtCore.QObject.__init__(self, parent=parent)
self._parent = parent
def main_window_handle(self) -> Any:
return self.parent()
return self._parent
def _dialog_async_exec(self, dialog: QtWidgets.QDialog):
future = asyncio.Future()
@@ -101,12 +105,16 @@ class GUIInteractionManager(BaseInteractionManager, QtCore.QObject):
dialog.open()
return future
async def _file_dialog(self, caption: str, directory: str, filter_str: str, mode: QtWidgets.QFileDialog.FileMode) \
-> Tuple[bool, QtWidgets.QFileDialog]:
dialog = QtWidgets.QFileDialog(self.parent(), caption=caption, directory=directory, filter=filter_str)
async def _file_dialog(
self, caption: str, directory: str, filter_str: str, mode: QtWidgets.QFileDialog.FileMode,
default_suffix: str = '',
) -> Tuple[bool, QtWidgets.QFileDialog]:
dialog = QtWidgets.QFileDialog(self._parent, caption=caption, directory=directory, filter=filter_str)
dialog.setFileMode(mode)
if mode == QtWidgets.QFileDialog.FileMode.AnyFile:
dialog.setAcceptMode(QtWidgets.QFileDialog.AcceptMode.AcceptSave)
if default_suffix:
dialog.setDefaultSuffix(default_suffix)
res = await self._dialog_async_exec(dialog)
return res, dialog
@@ -134,9 +142,10 @@ class GUIInteractionManager(BaseInteractionManager, QtCore.QObject):
return None
return dialog.selectedFiles()[0]
async def save_file(self, caption: str = '', directory: str = '', filter_str: str = '') -> Optional[str]:
async def save_file(self, caption: str = '', directory: str = '', filter_str: str = '',
default_suffix: str = '') -> Optional[str]:
res, dialog = await self._file_dialog(
caption, directory, filter_str, QtWidgets.QFileDialog.FileMode.AnyFile
caption, directory, filter_str, QtWidgets.QFileDialog.FileMode.AnyFile, default_suffix,
)
if not res or not dialog.selectedFiles():
return None
@@ -148,7 +157,7 @@ class GUIInteractionManager(BaseInteractionManager, QtCore.QObject):
title,
caption,
QtWidgets.QMessageBox.Ok | QtWidgets.QMessageBox.Cancel,
self.parent(),
self._parent,
)
fut = asyncio.Future()
msg.finished.connect(lambda r: fut.set_result(r))
@@ -156,6 +165,24 @@ class GUIInteractionManager(BaseInteractionManager, QtCore.QObject):
return (await fut) == QtWidgets.QMessageBox.Ok
class GUIProxySettings(ProxySettings):
FIRST_RUN: bool = SettingDescriptor(True)
"""Persistent settings backed by QSettings"""
def __init__(self, settings: QtCore.QSettings):
super().__init__()
self._settings_obj = settings
def get_setting(self, name: str) -> Any:
val: Any = self._settings_obj.value(name, defaultValue=dataclasses.MISSING)
if val is dataclasses.MISSING:
return val
return json.loads(val)
def set_setting(self, name: str, val: Any):
self._settings_obj.setValue(name, json.dumps(val))
def nonFatalExceptions(f):
@functools.wraps(f)
def _wrapper(self, *args, **kwargs):
@@ -169,7 +196,35 @@ def nonFatalExceptions(f):
return _wrapper
class ProxyGUI(QtWidgets.QMainWindow):
def buildReplacements(session: Session, region: ProxiedRegion):
if not session or not region:
return {}
selected = session.selected
agent_object = region.objects.lookup_fullid(session.agent_id)
selected_local = selected.object_local
selected_object = None
if selected_local:
# We may or may not have an object for this
selected_object = region.objects.lookup_localid(selected_local)
return {
"SELECTED_LOCAL": selected_local,
"SELECTED_FULL": selected_object.FullID if selected_object else None,
"SELECTED_PARCEL_LOCAL": selected.parcel_local,
"SELECTED_PARCEL_FULL": selected.parcel_full,
"SELECTED_SCRIPT_ITEM": selected.script_item,
"SELECTED_TASK_ITEM": selected.task_item,
"AGENT_ID": session.agent_id,
"AGENT_LOCAL": agent_object.LocalID if agent_object else None,
"SESSION_ID": session.id,
"AGENT_POS": agent_object.Position if agent_object else None,
"NULL_KEY": UUID(),
"RANDOM_KEY": UUID.random,
"CIRCUIT_CODE": session.circuit_code,
"REGION_HANDLE": region.handle,
}
class MessageLogWindow(QtWidgets.QMainWindow):
DEFAULT_IGNORE = "StartPingCheck CompletePingCheck PacketAck SimulatorViewerTimeMessage SimStats " \
"AgentUpdate AgentAnimation AvatarAnimation ViewerEffect CoarseLocationUpdate LayerData " \
"CameraConstraint ObjectUpdateCached RequestMultipleObjects ObjectUpdate ObjectUpdateCompressed " \
@@ -178,50 +233,65 @@ class ProxyGUI(QtWidgets.QMainWindow):
"AvatarRenderInfo FirestormBridge ObjectAnimation ParcelDwellRequest ParcelAccessListRequest " \
"ParcelDwellReply ParcelAccessListReply AttachedSoundGainChange " \
"ParcelPropertiesRequest ParcelProperties GetObjectCost GetObjectPhysicsData ObjectImage " \
"ViewerAsset GetTexture SetAlwaysRun GetDisplayNames MapImageService MapItemReply".split(" ")
"ViewerAsset GetTexture SetAlwaysRun GetDisplayNames MapImageService MapItemReply " \
"AgentFOV GenericStreamingMessage".split(" ")
DEFAULT_FILTER = f"!({' || '.join(ignored for ignored in DEFAULT_IGNORE)})"
textRequest: QtWidgets.QTextEdit
def __init__(self):
super().__init__()
def __init__(
self, settings: GUIProxySettings, session_manager: GUISessionManager,
log_live_messages: bool, parent: Optional[QtWidgets.QWidget] = None,
):
super().__init__(parent=parent)
loadUi(MAIN_WINDOW_UI_PATH, self)
if parent:
self.setWindowTitle("Message Log")
self.menuBar.setEnabled(False) # type: ignore
self.menuBar.hide() # type: ignore
self._selectedEntry: Optional[AbstractMessageLogEntry] = None
self.settings = GUIProxySettings(QtCore.QSettings("SaladDais", "hippolyzer"))
self.model = MessageLogModel(parent=self.tableView)
self.settings = settings
self.sessionManager = session_manager
if log_live_messages:
self.model = MessageLogModel(parent=self.tableView)
session_manager.message_logger.loggers.append(self.model)
else:
self.model = MessageLogModel(parent=self.tableView, maxlen=None)
self.tableView.setModel(self.model)
self.model.rowsAboutToBeInserted.connect(self.beforeInsert)
self.model.rowsInserted.connect(self.afterInsert)
self.tableView.selectionModel().selectionChanged.connect(self._messageSelected)
self.checkBeautify.clicked.connect(self._showSelectedMessage)
self.checkPause.clicked.connect(self._setPaused)
self._setFilter(self.DEFAULT_FILTER)
self.setFilter(self.DEFAULT_FILTER)
self.btnClearLog.clicked.connect(self.model.clear)
self.lineEditFilter.editingFinished.connect(self._setFilter)
self.lineEditFilter.editingFinished.connect(self.setFilter)
self.btnMessageBuilder.clicked.connect(self._sendToMessageBuilder)
self.btnCopyRepr.clicked.connect(self._copyRepr)
self.actionInstallHTTPSCerts.triggered.connect(self._installHTTPSCerts)
self.actionInstallHTTPSCerts.triggered.connect(self.installHTTPSCerts)
self.actionManageAddons.triggered.connect(self._manageAddons)
self.actionManageFilters.triggered.connect(self._manageFilters)
self.actionOpenMessageBuilder.triggered.connect(self._openMessageBuilder)
self.actionProxyRemotelyAccessible.setChecked(self.settings.REMOTELY_ACCESSIBLE)
self.actionProxySSLInsecure.setChecked(self.settings.SSL_INSECURE)
self.actionUseViewerObjectCache.setChecked(self.settings.USE_VIEWER_OBJECT_CACHE)
self.actionRequestMissingObjects.setChecked(self.settings.AUTOMATICALLY_REQUEST_MISSING_OBJECTS)
self.actionProxyRemotelyAccessible.triggered.connect(self._setProxyRemotelyAccessible)
self.actionProxySSLInsecure.triggered.connect(self._setProxySSLInsecure)
self.actionUseViewerObjectCache.triggered.connect(self._setUseViewerObjectCache)
self.actionRequestMissingObjects.triggered.connect(self._setRequestMissingObjects)
self.actionOpenNewMessageLogWindow.triggered.connect(self._openNewMessageLogWindow)
self.actionImportLogEntries.triggered.connect(self._importLogEntries)
self.actionExportLogEntries.triggered.connect(self._exportLogEntries)
self._filterMenu = QtWidgets.QMenu()
self._populateFilterMenu()
self.toolButtonFilter.setMenu(self._filterMenu)
self.sessionManager = GUISessionManager(self.settings, self.model)
self.interactionManager = GUIInteractionManager(self)
AddonManager.UI = self.interactionManager
self._shouldScrollOnInsert = True
self.tableView.horizontalHeader().resizeSection(MessageLogHeader.Host, 80)
self.tableView.horizontalHeader().resizeSection(MessageLogHeader.Method, 60)
@@ -230,10 +300,16 @@ class ProxyGUI(QtWidgets.QMainWindow):
self.textResponse.hide()
def closeEvent(self, event) -> None:
loggers = self.sessionManager.message_logger.loggers
if self.model in loggers:
loggers.remove(self.model)
super().closeEvent(event)
def _populateFilterMenu(self):
def _addFilterAction(text, filter_str):
filter_action = QtWidgets.QAction(text, self)
filter_action.triggered.connect(lambda: self._setFilter(filter_str))
filter_action = QtGui.QAction(text, self)
filter_action.triggered.connect(lambda: self.setFilter(filter_str))
self._filterMenu.addAction(filter_action)
self._filterMenu.clear()
@@ -243,16 +319,19 @@ class ProxyGUI(QtWidgets.QMainWindow):
for preset_name, preset_filter in filters.items():
_addFilterAction(preset_name, preset_filter)
def getFilterDict(self):
return self.settings.FILTERS
def setFilterDict(self, val: dict):
self.settings.FILTERS = val
self._populateFilterMenu()
def _manageFilters(self):
dialog = FilterDialog(self)
dialog.exec_()
dialog.exec()
@nonFatalExceptions
def _setFilter(self, filter_str=None):
def setFilter(self, filter_str=None):
if filter_str is None:
filter_str = self.lineEditFilter.text()
else:
@@ -284,23 +363,22 @@ class ProxyGUI(QtWidgets.QMainWindow):
return
req = entry.request(
beautify=self.checkBeautify.isChecked(),
replacements=self.buildReplacements(entry.session, entry.region),
replacements=buildReplacements(entry.session, entry.region),
)
highlight_range = None
if isinstance(req, SpannedString):
match_result = self.model.filter.match(entry)
# Match result was a tuple indicating what matched
if isinstance(match_result, tuple):
highlight_range = req.spans.get(match_result)
self.textRequest.setPlainText(req)
if highlight_range:
cursor = self.textRequest.textCursor()
cursor.setPosition(highlight_range[0], QtGui.QTextCursor.MoveAnchor)
cursor.setPosition(highlight_range[1], QtGui.QTextCursor.KeepAnchor)
highlight_format = QtGui.QTextBlockFormat()
highlight_format.setBackground(QtCore.Qt.yellow)
cursor.setBlockFormat(highlight_format)
# The string has a map of fields and their associated positions within the string,
# use that to highlight any individual fields the filter matched on.
if isinstance(req, SpannedString):
for field in self.model.filter.match(entry, short_circuit=False).fields:
field_span = req.spans.get(field)
if not field_span:
continue
cursor = self.textRequest.textCursor()
cursor.setPosition(field_span[0], QtGui.QTextCursor.MoveAnchor)
cursor.setPosition(field_span[1], QtGui.QTextCursor.KeepAnchor)
highlight_format = QtGui.QTextBlockFormat()
highlight_format.setBackground(QtCore.Qt.yellow)
cursor.setBlockFormat(highlight_format)
resp = entry.response(beautify=self.checkBeautify.isChecked())
if resp:
@@ -324,7 +402,7 @@ class ProxyGUI(QtWidgets.QMainWindow):
win.show()
msg = self._selectedEntry
beautify = self.checkBeautify.isChecked()
replacements = self.buildReplacements(msg.session, msg.region)
replacements = buildReplacements(msg.session, msg.region)
win.setMessageText(msg.request(beautify=beautify, replacements=replacements))
@nonFatalExceptions
@@ -340,37 +418,43 @@ class ProxyGUI(QtWidgets.QMainWindow):
win = MessageBuilderWindow(self, self.sessionManager)
win.show()
def buildReplacements(self, session: Session, region: ProxiedRegion):
if not session or not region:
return {}
selected = session.selected
agent_object = region.objects.lookup_fullid(session.agent_id)
selected_local = selected.object_local
selected_object = None
if selected_local:
# We may or may not have an object for this
selected_object = region.objects.lookup_localid(selected_local)
return {
"SELECTED_LOCAL": selected_local,
"SELECTED_FULL": selected_object.FullID if selected_object else None,
"SELECTED_PARCEL_LOCAL": selected.parcel_local,
"SELECTED_PARCEL_FULL": selected.parcel_full,
"SELECTED_SCRIPT_ITEM": selected.script_item,
"SELECTED_TASK_ITEM": selected.task_item,
"AGENT_ID": session.agent_id,
"AGENT_LOCAL": agent_object.LocalID if agent_object else None,
"SESSION_ID": session.id,
"AGENT_POS": agent_object.Position if agent_object else None,
"NULL_KEY": UUID(),
"RANDOM_KEY": UUID.random,
"CIRCUIT_CODE": session.circuit_code,
"REGION_HANDLE": region.handle,
}
def _openNewMessageLogWindow(self):
win: QtWidgets.QMainWindow = MessageLogWindow(
self.settings, self.sessionManager, log_live_messages=True, parent=self)
win.setFilter(self.lineEditFilter.text())
win.show()
win.activateWindow()
def _installHTTPSCerts(self):
@asyncSlot()
async def _importLogEntries(self):
log_file = await AddonManager.UI.open_file(
caption="Import Log Entries", filter_str="Hippolyzer Logs (*.hippolog)"
)
if not log_file:
return
win = MessageLogWindow(self.settings, self.sessionManager, log_live_messages=False, parent=self)
win.setFilter(self.lineEditFilter.text())
with open(log_file, "rb") as f:
entries = import_log_entries(f.read())
for entry in entries:
win.model.add_log_entry(entry)
win.show()
win.activateWindow()
@asyncSlot()
async def _exportLogEntries(self):
log_file = await AddonManager.UI.save_file(
caption="Export Log Entries", filter_str="Hippolyzer Logs (*.hippolog)", default_suffix="hippolog",
)
if not log_file:
return
with open(log_file, "wb") as f:
f.write(export_log_entries(self.model))
def installHTTPSCerts(self):
msg = QtWidgets.QMessageBox()
msg.setText("This will install the proxy's HTTPS certificate in the config dir"
" of any installed viewers, continue?")
msg.setText("Would you like to install the proxy's HTTPS certificate in the config dir"
" of any installed viewers so that HTTPS connections will work?")
yes_btn = msg.addButton("Yes", QtWidgets.QMessageBox.NoRole)
msg.addButton("No", QtWidgets.QMessageBox.NoRole)
msg.exec()
@@ -378,7 +462,7 @@ class ProxyGUI(QtWidgets.QMainWindow):
if clicked_btn is not yes_btn:
return
master = create_proxy_master("127.0.0.1", -1, HTTPFlowContext())
master = create_http_proxy("127.0.0.1", -1, HTTPFlowContext())
dirs = setup_ca_everywhere(master)
msg = QtWidgets.QMessageBox()
@@ -394,6 +478,12 @@ class ProxyGUI(QtWidgets.QMainWindow):
msg.setText("Remote accessibility setting changes will take effect on next run")
msg.exec()
def _setProxySSLInsecure(self, checked: bool):
self.sessionManager.settings.SSL_INSECURE = checked
msg = QtWidgets.QMessageBox()
msg.setText("SSL security setting changes will take effect on next run")
msg.exec()
def _setUseViewerObjectCache(self, checked: bool):
self.sessionManager.settings.USE_VIEWER_OBJECT_CACHE = checked
@@ -402,7 +492,7 @@ class ProxyGUI(QtWidgets.QMainWindow):
def _manageAddons(self):
dialog = AddonDialog(self)
dialog.exec_()
dialog.exec()
def getAddonList(self) -> List[str]:
return self.sessionManager.settings.ADDON_SCRIPTS
@@ -446,7 +536,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
def __init__(self, parent, session_manager):
super().__init__(parent=parent)
loadUi(MESSAGE_BUILDER_UI_PATH, self)
self.templateDict = TemplateDictionary()
self.templateDict = DEFAULT_TEMPLATE_DICT
self.llsdSerializer = LLSDMessageSerializer()
self.sessionManager: SessionManager = session_manager
self.regionModel = RegionListModel(self, self.sessionManager)
@@ -486,12 +576,12 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
message_names = sorted(x.name for x in self.templateDict)
for message_name in message_names:
if self.templateDict[message_name].msg_trust:
if self.templateDict[message_name].trusted:
self.comboTrusted.addItem(message_name)
else:
self.comboUntrusted.addItem(message_name)
cap_names = sorted(set(itertools.chain(*[r.caps.keys() for r in self.regionModel.regions])))
cap_names = sorted(set(itertools.chain(*[r.cap_urls.keys() for r in self.regionModel.regions])))
for cap_name in cap_names:
if cap_name.endswith("ProxyWrapper"):
continue
@@ -522,7 +612,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
break
self.textRequest.setPlainText(
f"""{method} [[{cap_name}]]{path}{params} HTTP/1.1
# {region.caps.get(cap_name, "<unknown URI>")}
# {region.cap_urls.get(cap_name, "<unknown URI>")}
{headers}
{body}"""
)
@@ -575,24 +665,9 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
if var.name in ("TaskID", "ObjectID"):
return VerbatimHumanVal("[[SELECTED_FULL]]")
if var.type.is_int:
return 0
elif var.type.is_float:
return 0.0
elif var.type == MsgType.MVT_LLUUID:
return UUID()
elif var.type == MsgType.MVT_BOOL:
return False
elif var.type == MsgType.MVT_VARIABLE:
return ""
elif var.type in (MsgType.MVT_LLVector3, MsgType.MVT_LLVector3d, MsgType.MVT_LLQuaternion):
return VerbatimHumanVal("(0.0, 0.0, 0.0)")
elif var.type == MsgType.MVT_LLVector4:
return VerbatimHumanVal("(0.0, 0.0, 0.0, 0.0)")
elif var.type == MsgType.MVT_FIXED:
return b"\x00" * var.size
elif var.type == MsgType.MVT_IP_ADDR:
return "0.0.0.0"
default_val = var.default_value
if default_val is not None:
return default_val
return VerbatimHumanVal("")
@nonFatalExceptions
@@ -600,7 +675,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
session, region = self._getTarget()
msg_text = self.textRequest.toPlainText()
replacements = self.parent().buildReplacements(session, region)
replacements = buildReplacements(session, region)
if re.match(r"\A\s*(in|out)\s+", msg_text, re.I):
sender_func = self._sendLLUDPMessage
@@ -632,13 +707,11 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
msg = HumanMessageSerializer.from_human_string(msg_text, replacements, env, safe=False)
if self.checkLLUDPViaCaps.isChecked():
if msg.direction == Direction.IN:
region.eq_manager.queue_event(
self.llsdSerializer.serialize(msg, as_dict=True)
)
region.eq_manager.inject_message(msg)
else:
self._sendHTTPRequest(
"POST",
region.caps["UntrustedSimulatorMessage"],
region.cap_urls["UntrustedSimulatorMessage"],
{"Content-Type": "application/llsd+xml", "Accept": "application/llsd+xml"},
self.llsdSerializer.serialize(msg),
)
@@ -646,19 +719,28 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
transport = None
off_circuit = self.checkOffCircuit.isChecked()
if off_circuit:
transport = SocketUDPTransport(socket.socket(socket.AF_INET, socket.SOCK_DGRAM))
region.circuit.send_message(msg, transport=transport)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 0))
transport = SocketUDPTransport(sock)
region.circuit.send(msg, transport=transport)
if off_circuit:
transport.close()
def _sendEQMessage(self, session, region: Optional[ProxiedRegion], msg_text: str, _replacements: dict):
def _sendEQMessage(self, session, region: Optional[ProxiedRegion], msg_text: str, replacements: dict):
if not session or not region:
raise RuntimeError("Need a valid session and region to send EQ event")
message_line, _, body = (x.strip() for x in msg_text.partition("\n"))
message_name = message_line.rsplit(" ", 1)[-1]
region.eq_manager.queue_event({
env = self._buildEnv(session, region)
def directive_handler(m):
return self._handleHTTPDirective(env, replacements, False, m)
body = re.sub(rb"<!HIPPO(\w+)\[\[(.*?)]]>", directive_handler, body.encode("utf8"), flags=re.S)
region.eq_manager.inject_event({
"message": message_name,
"body": llsd.parse_xml(body.encode("utf8")),
"body": llsd.parse_xml(body),
})
def _sendHTTPMessage(self, session, region, msg_text: str, replacements: dict):
@@ -682,7 +764,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
cap_name = match.group(1)
cap_url = session.global_caps.get(cap_name)
if not cap_url:
cap_url = region.caps.get(cap_name)
cap_url = region.cap_urls.get(cap_name)
if not cap_url:
raise ValueError("Don't have a Cap for %s" % cap_name)
uri = cap_url + match.group(2)
@@ -722,7 +804,10 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
val = subfield_eval(contents.decode("utf8").strip(), globals_={**env, **replacements})
val = _coerce_to_bytes(val)
elif directive == b"REPL":
val = _coerce_to_bytes(replacements[contents.decode("utf8").strip()])
repl = replacements[contents.decode("utf8").strip()]
if callable(repl):
repl = repl()
val = _coerce_to_bytes(repl)
else:
raise ValueError(f"Unknown directive {directive}")
@@ -743,13 +828,13 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
# enough for the full response to pass through the proxy
await resp.read()
asyncio.create_task(_send_request())
create_logged_task(_send_request(), "Send HTTP Request")
class AddonDialog(QtWidgets.QDialog):
listAddons: QtWidgets.QListWidget
def __init__(self, parent: ProxyGUI):
def __init__(self, parent: MessageLogWindow):
super().__init__()
loadUi(ADDON_DIALOG_UI_PATH, self)
@@ -800,7 +885,7 @@ class AddonDialog(QtWidgets.QDialog):
class FilterDialog(QtWidgets.QDialog):
listFilters: QtWidgets.QListWidget
def __init__(self, parent: ProxyGUI):
def __init__(self, parent: MessageLogWindow):
super().__init__()
loadUi(FILTER_DIALOG_UI_PATH, self)
@@ -838,29 +923,16 @@ class FilterDialog(QtWidgets.QDialog):
self.listFilters.takeItem(idx)
class GUIProxySettings(ProxySettings):
"""Persistent settings backed by QSettings"""
def __init__(self, settings: QtCore.QSettings):
super().__init__()
self._settings_obj = settings
def get_setting(self, name: str) -> Any:
val: Any = self._settings_obj.value(name, defaultValue=dataclasses.MISSING)
if val is dataclasses.MISSING:
return val
return json.loads(val)
def set_setting(self, name: str, val: Any):
self._settings_obj.setValue(name, json.dumps(val))
def gui_main():
multiprocessing.set_start_method('spawn')
QtCore.QCoreApplication.setAttribute(QtCore.Qt.AA_ShareOpenGLContexts)
app = QtWidgets.QApplication(sys.argv)
loop = QEventLoop(app)
asyncio.set_event_loop(loop)
window = ProxyGUI()
settings = GUIProxySettings(QtCore.QSettings("SaladDais", "hippolyzer"))
session_manager = GUISessionManager(settings)
window = MessageLogWindow(settings, session_manager, log_live_messages=True)
AddonManager.UI = GUIInteractionManager(window)
timer = QtCore.QTimer(app)
timer.timeout.connect(window.sessionManager.checkRegions)
timer.start(100)
@@ -869,10 +941,15 @@ def gui_main():
http_host = None
if window.sessionManager.settings.REMOTELY_ACCESSIBLE:
http_host = "0.0.0.0"
if settings.FIRST_RUN:
settings.FIRST_RUN = False
# Automatically offer to install the HTTPS certs on first run.
window.installHTTPSCerts()
start_proxy(
session_manager=window.sessionManager,
extra_addon_paths=window.getAddonList(),
proxy_host=http_host,
ssl_insecure=settings.SSL_INSECURE,
)


@@ -193,7 +193,7 @@
</size>
</property>
<property name="styleSheet">
<string notr="true">color: rgb(80, 0, 0)</string>
<string notr="true"/>
</property>
<property name="tabChangesFocus">
<bool>true</bool>
@@ -213,7 +213,7 @@
</widget>
<widget class="QPlainTextEdit" name="textResponse">
<property name="styleSheet">
<string notr="true">color: rgb(0, 0, 80)</string>
<string notr="true"/>
</property>
<property name="tabChangesFocus">
<bool>true</bool>
@@ -245,7 +245,7 @@
<x>0</x>
<y>0</y>
<width>700</width>
<height>22</height>
<height>29</height>
</rect>
</property>
<widget class="QMenu" name="menuFile">
@@ -256,6 +256,10 @@
<bool>true</bool>
</property>
<addaction name="actionOpenMessageBuilder"/>
<addaction name="actionOpenNewMessageLogWindow"/>
<addaction name="separator"/>
<addaction name="actionImportLogEntries"/>
<addaction name="actionExportLogEntries"/>
<addaction name="separator"/>
<addaction name="actionInstallHTTPSCerts"/>
<addaction name="actionManageAddons"/>
@@ -264,6 +268,7 @@
<addaction name="actionProxyRemotelyAccessible"/>
<addaction name="actionUseViewerObjectCache"/>
<addaction name="actionRequestMissingObjects"/>
<addaction name="actionProxySSLInsecure"/>
</widget>
<addaction name="menuFile"/>
</widget>
@@ -323,6 +328,32 @@
<string>Force the proxy to request objects that it doesn't know about due to cache misses</string>
</property>
</action>
<action name="actionOpenNewMessageLogWindow">
<property name="text">
<string>Open New Message Log Window</string>
</property>
</action>
<action name="actionImportLogEntries">
<property name="text">
<string>Import Log Entries</string>
</property>
</action>
<action name="actionExportLogEntries">
<property name="text">
<string>Export Log Entries</string>
</property>
</action>
<action name="actionProxySSLInsecure">
<property name="checkable">
<bool>true</bool>
</property>
<property name="text">
<string>Allow Insecure SSL Connections</string>
</property>
<property name="toolTip">
<string>Allow invalid SSL certificates from upstream connections</string>
</property>
</action>
</widget>
<resources/>
<connections/>

View File

@@ -0,0 +1,125 @@
"""
Assorted utilities to make creating animations from scratch easier
"""
import copy
from typing import List, Union, Mapping
from hippolyzer.lib.base.datatypes import Vector3, Quaternion
from hippolyzer.lib.base.llanim import PosKeyframe, RotKeyframe, JOINTS_DICT, Joint
from hippolyzer.lib.base.mesh_skeleton import AVATAR_SKELETON
from hippolyzer.lib.base.multidict import OrderedMultiDict
def smooth_step(t: float):
t = max(0.0, min(1.0, t))
return t * t * (3 - 2 * t)
def rot_interp(r0: Quaternion, r1: Quaternion, t: float):
"""
Bad quaternion interpolation
TODO: This is definitely not correct, yet it seems to work OK in practice. Implement slerp.
"""
# Ignore W
r0 = r0.data(3)
r1 = r1.data(3)
return Quaternion(*map(lambda pair: ((pair[0] * (1.0 - t)) + (pair[1] * t)), zip(r0, r1)))
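For reference, a minimal slerp sketch along the lines of the TODO above; this is not part of the diff, and it assumes Quaternion iterates its components as (X, Y, Z, W) and accepts them positionally, as elsewhere in this module:

import math

def slerp(r0: Quaternion, r1: Quaternion, t: float) -> Quaternion:
    """Spherical linear interpolation between two unit quaternions"""
    dot = sum(a * b for a, b in zip(r0, r1))
    if dot < 0.0:
        # Flip one quaternion so we interpolate along the shorter arc
        r1 = Quaternion(*(-x for x in r1))
        dot = -dot
    if dot > 0.9995:
        # Nearly parallel: fall back to normalized lerp to avoid dividing by ~0
        mixed = [a + t * (b - a) for a, b in zip(r0, r1)]
        norm = math.sqrt(sum(x * x for x in mixed))
        return Quaternion(*(x / norm for x in mixed))
    theta = math.acos(dot)
    s0 = math.sin((1.0 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return Quaternion(*(s0 * a + s1 * b for a, b in zip(r0, r1)))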
def unique_frames(frames: List[Union[PosKeyframe, RotKeyframe]]):
"""Drop frames where time and coordinate are exact duplicates of another frame"""
new_frames = []
for frame in frames:
# TODO: fudge factor for float comparison instead
if frame not in new_frames:
new_frames.append(frame)
return new_frames
def shift_keyframes(frames: List[Union[PosKeyframe, RotKeyframe]], num: int):
"""
Shift keyframes around by `num` frames
Assumes keyframes occur at a set cadence, and that the first and last keyframes are at the same coord.
"""
# Get rid of duplicate frames
frames = unique_frames(frames)
pop_idx = -1
insert_idx = 0
if num < 0:
insert_idx = len(frames) - 1
pop_idx = 0
num = -num
old_times = [f.time for f in frames]
new_frames = frames.copy()
# Drop last, duped frame. We'll copy the first frame to replace it later
new_frames.pop(-1)
for _ in range(num):
new_frames.insert(insert_idx, new_frames.pop(pop_idx))
# Put first frame back on the end
new_frames.append(copy.copy(new_frames[0]))
assert len(old_times) == len(new_frames)
assert new_frames[0] == new_frames[-1]
# Make the times of the shifted keyframes match up with the previous timeline
for old_time, new_frame in zip(old_times, new_frames):
new_frame.time = old_time
return new_frames
def smooth_pos(start: Vector3, end: Vector3, inter_frames: int, time: float, duration: float) -> List[PosKeyframe]:
"""Generate keyframes to smoothly interpolate between two positions"""
frames = [PosKeyframe(time=time, pos=start)]
for i in range(0, inter_frames):
t = (i + 1) / (inter_frames + 1)
smooth_t = smooth_step(t)
pos = Vector3(smooth_t, smooth_t, smooth_t).interpolate(start, end)
frames.append(PosKeyframe(time=time + (t * duration), pos=pos))
return frames + [PosKeyframe(time=time + duration, pos=end)]
def smooth_rot(start: Quaternion, end: Quaternion, inter_frames: int, time: float, duration: float)\
-> List[RotKeyframe]:
"""Generate keyframes to smoothly interpolate between two rotations"""
frames = [RotKeyframe(time=time, rot=start)]
for i in range(0, inter_frames):
t = (i + 1) / (inter_frames + 1)
smooth_t = smooth_step(t)
frames.append(RotKeyframe(time=time + (t * duration), rot=rot_interp(start, end, smooth_t)))
return frames + [RotKeyframe(time=time + duration, rot=end)]
def mirror_joints(joints_dict: Mapping[str, Joint]) -> JOINTS_DICT:
"""Mirror a joints dict so left / right are swapped, including transformations"""
new_joints: JOINTS_DICT = OrderedMultiDict()
for joint_name, joint in joints_dict.items():
inverse_joint_node = AVATAR_SKELETON[joint_name].inverse
if not inverse_joint_node:
new_joints[joint_name] = joint
continue
# Okay, this is one we have to actually mirror
new_joint = Joint(joint.priority, [], [])
for rot_keyframe in joint.rot_keyframes:
new_joint.rot_keyframes.append(RotKeyframe(
time=rot_keyframe.time,
# Just need to mirror on yaw and roll
rot=Quaternion.from_euler(*(rot_keyframe.rot.to_euler() * Vector3(-1, 1, -1)))
))
for pos_keyframe in joint.pos_keyframes:
new_joint.pos_keyframes.append(PosKeyframe(
time=pos_keyframe.time,
# Y is left / right so just negate it.
pos=pos_keyframe.pos * Vector3(1, -1, 1)
))
new_joints[inverse_joint_node.name] = new_joint
return new_joints
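A brief, hedged usage sketch of the helpers above; the joint name, priority, and timings are illustrative, and it assumes Joint accepts rot_keyframes / pos_keyframes keyword arguments matching its attribute names:

# Swing mElbowLeft to 90 degrees of pitch and back over one second,
# easing in and out with four intermediate keyframes each way.
rest = Quaternion.from_euler(0, 0, 0)
bent = Quaternion.from_euler(0, 90, 0, degrees=True)
rot_keyframes = unique_frames(
    smooth_rot(rest, bent, inter_frames=4, time=0.0, duration=0.5)
    + smooth_rot(bent, rest, inter_frames=4, time=0.5, duration=0.5)
)
joints: JOINTS_DICT = OrderedMultiDict()
joints["mElbowLeft"] = Joint(priority=4, rot_keyframes=rot_keyframes, pos_keyframes=[])
# A right-elbow version of the same motion, mirrored across the body
mirrored = mirror_joints(joints)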

View File

@@ -0,0 +1,330 @@
# This currently implements basic LLMesh -> Collada.
#
# TODO:
# * inverse, Collada -> LLMesh (for simple cases, maybe using impasse rather than pycollada)
# * round-tripping tests, LLMesh->Collada->LLMesh
# * * Can't really test using Collada->LLMesh->Collada because Collada->LLMesh is almost always
# going to be lossy due to how SL represents vertex data and materials compared to what
# Collada allows.
# * Eventually scrap this and just use GLTF instead once we know we have the semantics correct
# * * Collada was just easier to bootstrap given that it's the only officially supported input format
# * * Collada tooling sucks and even LL is moving away from it
# * * Ensuring LLMesh->Collada and LLMesh->GLTF conversion don't differ semantically is easy via assimp.
import logging
import os.path
import secrets
import sys
from typing import Dict, Optional
import collada
import collada.source
from collada import E
from lxml import etree
import numpy as np
import transformations
from hippolyzer.lib.base.helpers import get_resource_filename
from hippolyzer.lib.base.serialization import BufferReader
from hippolyzer.lib.base.mesh import (
LLMeshSerializer,
MeshAsset,
positions_from_domain,
SkinSegmentDict,
llsd_to_mat4,
)
LOG = logging.getLogger(__name__)
DIR = os.path.dirname(os.path.realpath(__file__))
def mat4_to_collada(mat: np.ndarray) -> np.ndarray:
return mat.flatten(order='C')
def mesh_to_collada(ll_mesh: MeshAsset, include_skin=True) -> collada.Collada:
dae = collada.Collada()
axis = collada.asset.UP_AXIS.Z_UP
dae.assetInfo.upaxis = axis
scene = collada.scene.Scene("scene", [llmesh_to_node(ll_mesh, dae, include_skin=include_skin)])
dae.scenes.append(scene)
dae.scene = scene
return dae
def llmesh_to_node(ll_mesh: MeshAsset, dae: collada.Collada, uniq=None,
include_skin=True, node_transform: Optional[np.ndarray] = None) -> collada.scene.Node:
if node_transform is None:
node_transform = np.identity(4)
should_skin = False
skin_seg = ll_mesh.segments.get('skin')
bind_shape_matrix = None
if include_skin and skin_seg:
bind_shape_matrix = llsd_to_mat4(skin_seg["bind_shape_matrix"])
should_skin = True
# Transform from the skin will be applied on the controller, not the node
node_transform = np.identity(4)
if not uniq:
uniq = secrets.token_urlsafe(4)
geom_nodes = []
node_name = f"mainnode{uniq}"
# TODO: do the other LODs?
for submesh_num, submesh in enumerate(ll_mesh.segments["high_lod"]):
# Make sure none of our IDs collide with those of other nodes
sub_uniq = uniq + str(submesh_num)
range_xyz = positions_from_domain(submesh["Position"], submesh["PositionDomain"])
xyz = np.array([x.data() for x in range_xyz])
range_uv = positions_from_domain(submesh['TexCoord0'], submesh['TexCoord0Domain'])
uv = np.array([x.data() for x in range_uv]).flatten()
norms = np.array([x.data() for x in submesh["Normal"]])
effect = collada.material.Effect(
id=f"effect{sub_uniq}",
params=[],
specular=(0.0, 0.0, 0.0, 0.0),
reflectivity=(0.0, 0.0, 0.0, 0.0),
emission=(0.0, 0.0, 0.0, 0.0),
ambient=(0.0, 0.0, 0.0, 0.0),
reflective=0.0,
shadingtype="blinn",
shininess=0.0,
diffuse=(1.0, 1.0, 1.0),
)
mat = collada.material.Material(f"material{sub_uniq}", f"material{sub_uniq}", effect)
dae.materials.append(mat)
dae.effects.append(effect)
vert_src = collada.source.FloatSource(f"verts-array{sub_uniq}", xyz.flatten(), ("X", "Y", "Z"))
norm_src = collada.source.FloatSource(f"norms-array{sub_uniq}", norms.flatten(), ("X", "Y", "Z"))
# UV maps have to have the same name or they'll behave weirdly when objects are merged.
uv_src = collada.source.FloatSource("uvs-array", np.array(uv), ("U", "V"))
geom = collada.geometry.Geometry(dae, f"geometry{sub_uniq}", "geometry", [vert_src, norm_src, uv_src])
input_list = collada.source.InputList()
input_list.addInput(0, 'VERTEX', f'#verts-array{sub_uniq}', set="0")
input_list.addInput(0, 'NORMAL', f'#norms-array{sub_uniq}', set="0")
input_list.addInput(0, 'TEXCOORD', '#uvs-array', set="0")
tri_idxs = np.array(submesh["TriangleList"]).flatten()
matnode = collada.scene.MaterialNode(f"materialref{sub_uniq}", mat, inputs=[])
tri_set = geom.createTriangleSet(tri_idxs, input_list, f'materialref{sub_uniq}')
geom.primitives.append(tri_set)
dae.geometries.append(geom)
if should_skin:
joint_names = np.array(skin_seg['joint_names'], dtype=object)
joints_source = collada.source.NameSource(f"joint-names{sub_uniq}", joint_names, ("JOINT",))
# PyCollada has a bug where it doesn't set the source URI correctly. Fix it.
accessor = joints_source.xmlnode.find(f"{dae.tag('technique_common')}/{dae.tag('accessor')}")
if not accessor.get('source').startswith('#'):
accessor.set('source', f"#{accessor.get('source')}")
flattened_bind_poses = []
for bind_pose in skin_seg['inverse_bind_matrix']:
flattened_bind_poses.append(mat4_to_collada(llsd_to_mat4(bind_pose)))
flattened_bind_poses = np.array(flattened_bind_poses)
inv_bind_source = _create_mat4_source(f"bind-poses{sub_uniq}", flattened_bind_poses, "TRANSFORM")
weight_joint_idxs = []
weights = []
vert_weight_counts = []
cur_weight_idx = 0
for vert_weights in submesh['Weights']:
vert_weight_counts.append(len(vert_weights))
for vert_weight in vert_weights:
weights.append(vert_weight.weight)
weight_joint_idxs.append(vert_weight.joint_idx)
weight_joint_idxs.append(cur_weight_idx)
cur_weight_idx += 1
weights_source = collada.source.FloatSource(f"skin-weights{sub_uniq}", np.array(weights), ("WEIGHT",))
# We need to make a controller for each material since materials are essentially distinct meshes
# in SL, with their own distinct sets of weights and vertex data.
controller_node = E.controller(
E.skin(
E.bind_shape_matrix(' '.join(str(x) for x in mat4_to_collada(bind_shape_matrix))),
joints_source.xmlnode,
inv_bind_source.xmlnode,
weights_source.xmlnode,
E.joints(
E.input(semantic="JOINT", source=f"#joint-names{sub_uniq}"),
E.input(semantic="INV_BIND_MATRIX", source=f"#bind-poses{sub_uniq}")
),
E.vertex_weights(
E.input(semantic="JOINT", source=f"#joint-names{sub_uniq}", offset="0"),
E.input(semantic="WEIGHT", source=f"#skin-weights{sub_uniq}", offset="1"),
E.vcount(' '.join(str(x) for x in vert_weight_counts)),
E.v(' '.join(str(x) for x in weight_joint_idxs)),
count=str(len(submesh['Weights']))
),
source=f"#geometry{sub_uniq}"
),
id=f"Armature-{sub_uniq}",
name=node_name
)
controller = collada.controller.Controller.load(dae, {}, controller_node)
dae.controllers.append(controller)
geom_node = collada.scene.ControllerNode(controller, [matnode])
else:
geom_node = collada.scene.GeometryNode(geom, [matnode])
geom_nodes.append(geom_node)
node = collada.scene.Node(
node_name,
children=geom_nodes,
transforms=[collada.scene.MatrixTransform(mat4_to_collada(node_transform))],
)
if should_skin:
# We need a skeleton per _mesh asset_ because you could have incongruous skeletons
# within the same linkset.
# TODO: can we maintain some kind of skeleton cache, where if this skeleton has no conflicts
# with another skeleton in the cache, we just use that skeleton and add any additional joints?
skel_root = load_skeleton_nodes()
transform_skeleton(skel_root, dae, skin_seg)
skel = collada.scene.Node.load(dae, skel_root, {})
skel.children.append(node)
skel.id = f"Skel-{uniq}"
skel.save()
node = skel
return node
def load_skeleton_nodes() -> etree.ElementBase:
# TODO: this sucks. Can't we construct nodes with the appropriate transformation
# matrices from the data in `avatar_skeleton.xml`?
skel_path = get_resource_filename("lib/base/data/male_collada_joints.xml")
with open(skel_path, 'r') as f:
return etree.fromstring(f.read())
def transform_skeleton(skel_root: etree.ElementBase, dae: collada.Collada, skin_seg: SkinSegmentDict,
include_unreferenced_bones=False):
"""Update skeleton XML nodes to account for joint translations in the mesh"""
joint_nodes: Dict[str, collada.scene.Node] = {}
for skel_node in skel_root.iter():
# xpath is loathsome so this is easier.
if skel_node.tag != dae.tag('node') or skel_node.get('type') != 'JOINT':
continue
joint_nodes[skel_node.get('name')] = collada.scene.Node.load(dae, skel_node, {})
for joint_name, matrix in zip(skin_seg['joint_names'], skin_seg.get('alt_inverse_bind_matrix', [])):
joint_node = joint_nodes[joint_name]
joint_decomp = transformations.decompose_matrix(llsd_to_mat4(matrix))
joint_node.matrix = mat4_to_collada(transformations.compose_matrix(translate=joint_decomp[3]))
# Update the underlying XML element with the new transform matrix
joint_node.save()
if not include_unreferenced_bones:
needed_heirarchy = set()
for skel_node in joint_nodes.values():
skel_node = skel_node.xmlnode
if skel_node.get('name') in skin_seg['joint_names']:
# Add this joint and any ancestors to the list of needed joints
while skel_node is not None:
needed_heirarchy.add(skel_node.get('name'))
skel_node = skel_node.getparent()
for skel_node in joint_nodes.values():
skel_node = skel_node.xmlnode
if skel_node.get('name') not in needed_heirarchy:
skel_node.getparent().remove(skel_node)
pelvis_offset = skin_seg.get('pelvis_offset')
# TODO: should we even do this here? It's not present in the collada, just
# something that's specified in the uploader before conversion to LLMesh.
if pelvis_offset and 'mPelvis' in joint_nodes:
pelvis_node = joint_nodes['mPelvis']
# Column-major!
pelvis_node.matrix[3][2] += pelvis_offset
pelvis_node.save()
def _create_mat4_source(name: str, data: np.ndarray, semantic: str):
# PyCollada has no way to make a source with a float4x4 semantic. Do it a bad way.
# Note that collada demands column-major matrices whereas LLSD mesh has them row-major!
source = collada.source.FloatSource(name, data, tuple(f"M{x}" for x in range(16)))
accessor = source.xmlnode[1][0]
for child in list(accessor):
accessor.remove(child)
accessor.append(E.param(name=semantic, type="float4x4"))
return source
def fix_weird_bind_matrices(skin_seg: SkinSegmentDict) -> None:
"""
Fix weird-looking bind matrices to have sensible scaling and rotations
Sometimes we get enormous inverse bind matrices (each component 10k+) and tiny
bind shape matrix components. This detects inverse bind shape matrices
with weird scales and tries to set them to what they "should" be without
the weird inverted scaling.
"""
# Sometimes we get mesh assets that have the vertex data naturally in y-up orientation,
# and get re-oriented to z-up not through the bind shape matrix, but through the
# transforms in the inverse bind matrices!
#
# Blender, for one, does not like this very much, and generally won't generate mesh
# assets like this, as explained here https://developer.blender.org/T38660.
# In vanilla Blender, these mesh assets will show up scaled and rotated _only_ according
# to the bind shape matrix, which may end up with the model 25 meters tall and sitting
# on its side.
#
# https://avalab.org/avastar/292/knowledge/compare-workbench/, while somewhat outdated,
# has some information on rest pose vs default pose and scaling that I believe is relevant.
# https://github.com/KhronosGroup/glTF-Blender-IO/issues/994 as well.
#
# While trying to figure out what was going on, I searched for something like
# "inverse bind matrix scale collada", "bind pose scale blender", etc. Pretty much every
# result was either a bug filed by, or a question asked by the creator of Avastar, or an SL user.
# I think that says a lot about how annoying it is to author mesh for SL in particular.
#
# I spent a good month or so tearing my hair out over this wondering how these values could
# even be possible. I wasn't sure how I should write mesh import code if I didn't understand
# how to interpret existing data, or how it even ended up the way it did. Turns out I wasn't
# misinterpreting the data; the data really is just weird.
#
# I'd also had the idea that you could sniff which body a given rigged asset was meant
# for by doing trivial matching on the inverse bind matrices, but obviously that isn't true!
#
# Basically:
# 1) Maya is evil and generates evil; this evil bleeds into SL's assets through transforms.
# 2) Blender is also evil, but in a manner that doesn't agree with Maya's evil.
# 3) Collada was a valiant effort, but is evil in practice. Seemingly simple Collada
# files are interpreted completely differently by Blender, Maya, and sometimes SL.
# 4) Those three evils collude to make an interop nightmare for everyone like "oh my rigger
# rigs using Maya and now my model is huge and all my normals are fucked on reimport"
# 5) Yes, there are still good reasons to be using Avastar in 2022, even though nobody authoring
#    rigged mesh for any other platform has to use something similar.
if not skin_seg['joint_names']:
return
# TODO: calculate the correct inverse bind matrix scale & rotations from avatar_skeleton.xml
# definitions. If the rotation and scale factors are the same across all inverse bind matrices then
# they can be moved over to the bind shape matrix to keep Blender happy.
# Maybe add a scaled / rotated empty as a parent for the armature instead?
return
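As a starting point for the TODO above, a hedged sketch of how the anomalous scales might be detected, using the same transformations / llsd_to_mat4 helpers already imported in this module; the helper name and threshold are illustrative:

def _has_weird_bind_scale(skin_seg: SkinSegmentDict, threshold: float = 10.0) -> bool:
    """Return True if any inverse bind matrix carries an implausibly large scale"""
    for matrix in skin_seg.get('inverse_bind_matrix', []):
        scale, _, _, _, _ = transformations.decompose_matrix(llsd_to_mat4(matrix))
        if any(abs(component) > threshold for component in scale):
            # The y-up/z-up and scale fixups were likely baked into the rig
            # rather than the bind shape matrix.
            return True
    return False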
def main():
# Take an llmesh file as an argument and spit out basename-converted.dae
with open(sys.argv[1], "rb") as f:
reader = BufferReader("<", f.read())
mesh = mesh_to_collada(reader.read(LLMeshSerializer(parse_segment_contents=True)))
mesh.write(sys.argv[1].rsplit(".", 1)[0] + "-converted.dae")
if __name__ == "__main__":
main()

View File

@@ -0,0 +1,485 @@
<!-- from http://wiki.secondlife.com/wiki/Project_Bento_Resources_and_Information collada -->
<node id="Avatar" name="Avatar" type="NODE" xmlns="http://www.collada.org/2005/11/COLLADASchema">
<translate sid="location">0 0 0</translate>
<rotate sid="rotationZ">0 0 1 0</rotate>
<rotate sid="rotationY">0 1 0 0</rotate>
<rotate sid="rotationX">1 0 0 0</rotate>
<scale sid="scale">1 1 1</scale>
<node id="mPelvis" name="mPelvis" sid="mPelvis" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 1.067 0 0 0 1</matrix>
<node id="PELVIS" name="PELVIS" sid="PELVIS" type="JOINT">
<matrix sid="transform">1 0 0 -0.01 0 1 0 0 0 0 1 -0.02 0 0 0 1</matrix>
</node>
<node id="BUTT" name="BUTT" sid="BUTT" type="JOINT">
<matrix sid="transform">1 0 0 -0.06 0 1 0 0 0 0 1 -0.1 0 0 0 1</matrix>
</node>
<node id="mSpine1" name="mSpine1" sid="mSpine1" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0.084 0 0 0 1</matrix>
<node id="mSpine2" name="mSpine2" sid="mSpine2" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 -0.084 0 0 0 1</matrix>
<node id="mTorso" name="mTorso" sid="mTorso" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0.084 0 0 0 1</matrix>
<node id="BELLY" name="BELLY" sid="BELLY" type="JOINT">
<matrix sid="transform">1 0 0 0.028 0 1 0 0 0 0 1 0.04 0 0 0 1</matrix>
</node>
<node id="LEFT_HANDLE" name="LEFT_HANDLE" sid="LEFT_HANDLE" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0.1 0 0 1 0.058 0 0 0 1</matrix>
</node>
<node id="RIGHT_HANDLE" name="RIGHT_HANDLE" sid="RIGHT_HANDLE" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 -0.1 0 0 1 0.058 0 0 0 1</matrix>
</node>
<node id="LOWER_BACK" name="LOWER_BACK" sid="LOWER_BACK" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0.023 0 0 0 1</matrix>
</node>
<node id="mSpine3" name="mSpine3" sid="mSpine3" type="JOINT">
<matrix sid="transform">1 0 0 -0.015 0 1 0 0 0 0 1 0.205 0 0 0 1</matrix>
<node id="mSpine4" name="mSpine4" sid="mSpine4" type="JOINT">
<matrix sid="transform">1 0 0 0.015 0 1 0 0 0 0 1 -0.205 0 0 0 1</matrix>
<node id="mChest" name="mChest" sid="mChest" type="JOINT">
<matrix sid="transform">1 0 0 -0.015 0 1 0 0 0 0 1 0.205 0 0 0 1</matrix>
<node id="CHEST" name="CHEST" sid="CHEST" type="JOINT">
<matrix sid="transform">1 0 0 0.028 0 1 0 0 0 0 1 0.07 0 0 0 1</matrix>
</node>
<node id="LEFT_PEC" name="LEFT_PEC" sid="LEFT_PEC" type="JOINT">
<matrix sid="transform">1 0 0 0.119 0 1 0 0.082 0 0 1 0.042 0 0 0 1</matrix>
</node>
<node id="RIGHT_PEC" name="RIGHT_PEC" sid="RIGHT_PEC" type="JOINT">
<matrix sid="transform">1 0 0 0.119 0 1 0 -0.082 0 0 1 0.042 0 0 0 1</matrix>
</node>
<node id="UPPER_BACK" name="UPPER_BACK" sid="UPPER_BACK" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0.017 0 0 0 1</matrix>
</node>
<node id="mNeck" name="mNeck" sid="mNeck" type="JOINT">
<matrix sid="transform">1 0 0 -0.01 0 1 0 0 0 0 1 0.251 0 0 0 1</matrix>
<node id="NECK" name="NECK" sid="NECK" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0.02 0 0 0 1</matrix>
</node>
<node id="mHead" name="mHead" sid="mHead" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0.076 0 0 0 1</matrix>
<node id="HEAD" name="HEAD" sid="HEAD" type="JOINT">
<matrix sid="transform">1 0 0 0.02 0 1 0 0 0 0 1 0.07 0 0 0 1</matrix>
</node>
<node id="mSkull" name="mSkull" sid="mSkull" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0.079 0 0 0 1</matrix>
</node>
<node id="mEyeRight" name="mEyeRight" sid="mEyeRight" type="JOINT">
<matrix sid="transform">1 0 0 0.098 0 1 0 -0.036 0 0 1 0.079 0 0 0 1</matrix>
</node>
<node id="mEyeLeft" name="mEyeLeft" sid="mEyeLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.098 0 1 0 0.036 0 0 1 0.079 0 0 0 1</matrix>
</node>
<node id="mFaceRoot" name="mFaceRoot" sid="mFaceRoot" type="JOINT">
<matrix sid="transform">1 0 0 0.025 0 1 0 0 0 0 1 0.045 0 0 0 1</matrix>
<node id="mFaceEyeAltRight" name="mFaceEyeAltRight" sid="mFaceEyeAltRight" type="JOINT">
<matrix sid="transform">1 0 0 0.073 0 1 0 -0.036 0 0 1 0.034 0 0 0 1</matrix>
</node>
<node id="mFaceEyeAltLeft" name="mFaceEyeAltLeft" sid="mFaceEyeAltLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.073 0 1 0 0.036 0 0 1 0.034 0 0 0 1</matrix>
</node>
<node id="mFaceForeheadLeft" name="mFaceForeheadLeft" sid="mFaceForeheadLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.061 0 1 0 0.035 0 0 1 0.083 0 0 0 1</matrix>
</node>
<node id="mFaceForeheadRight" name="mFaceForeheadRight" sid="mFaceForeheadRight" type="JOINT">
<matrix sid="transform">1 0 0 0.061 0 1 0 -0.035 0 0 1 0.083 0 0 0 1</matrix>
</node>
<node id="mFaceEyebrowOuterLeft" name="mFaceEyebrowOuterLeft" sid="mFaceEyebrowOuterLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.064 0 1 0 0.051 0 0 1 0.048 0 0 0 1</matrix>
</node>
<node id="mFaceEyebrowCenterLeft" name="mFaceEyebrowCenterLeft" sid="mFaceEyebrowCenterLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.07 0 1 0 0.043 0 0 1 0.056 0 0 0 1</matrix>
</node>
<node id="mFaceEyebrowInnerLeft" name="mFaceEyebrowInnerLeft" sid="mFaceEyebrowInnerLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.075 0 1 0 0.022 0 0 1 0.051 0 0 0 1</matrix>
</node>
<node id="mFaceEyebrowOuterRight" name="mFaceEyebrowOuterRight" sid="mFaceEyebrowOuterRight" type="JOINT">
<matrix sid="transform">1 0 0 0.064 0 1 0 -0.051 0 0 1 0.048 0 0 0 1</matrix>
</node>
<node id="mFaceEyebrowCenterRight" name="mFaceEyebrowCenterRight" sid="mFaceEyebrowCenterRight" type="JOINT">
<matrix sid="transform">1 0 0 0.07 0 1 0 -0.043 0 0 1 0.056 0 0 0 1</matrix>
</node>
<node id="mFaceEyebrowInnerRight" name="mFaceEyebrowInnerRight" sid="mFaceEyebrowInnerRight" type="JOINT">
<matrix sid="transform">1 0 0 0.075 0 1 0 -0.022 0 0 1 0.051 0 0 0 1</matrix>
</node>
<node id="mFaceEyeLidUpperLeft" name="mFaceEyeLidUpperLeft" sid="mFaceEyeLidUpperLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.073 0 1 0 0.036 0 0 1 0.034 0 0 0 1</matrix>
</node>
<node id="mFaceEyeLidLowerLeft" name="mFaceEyeLidLowerLeft" sid="mFaceEyeLidLowerLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.073 0 1 0 0.036 0 0 1 0.034 0 0 0 1</matrix>
</node>
<node id="mFaceEyeLidUpperRight" name="mFaceEyeLidUpperRight" sid="mFaceEyeLidUpperRight" type="JOINT">
<matrix sid="transform">1 0 0 0.073 0 1 0 -0.036 0 0 1 0.034 0 0 0 1</matrix>
</node>
<node id="mFaceEyeLidLowerRight" name="mFaceEyeLidLowerRight" sid="mFaceEyeLidLowerRight" type="JOINT">
<matrix sid="transform">1 0 0 0.073 0 1 0 -0.036 0 0 1 0.034 0 0 0 1</matrix>
</node>
<node id="mFaceEar1Left" name="mFaceEar1Left" sid="mFaceEar1Left" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0.08 0 0 1 0.002 0 0 0 1</matrix>
<node id="mFaceEar2Left" name="mFaceEar2Left" sid="mFaceEar2Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.019 0 1 0 0.018 0 0 1 0.025 0 0 0 1</matrix>
</node>
</node>
<node id="mFaceEar1Right" name="mFaceEar1Right" sid="mFaceEar1Right" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 -0.08 0 0 1 0.002 0 0 0 1</matrix>
<node id="mFaceEar2Right" name="mFaceEar2Right" sid="mFaceEar2Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.019 0 1 0 -0.018 0 0 1 0.025 0 0 0 1</matrix>
</node>
</node>
<node id="mFaceNoseLeft" name="mFaceNoseLeft" sid="mFaceNoseLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.086 0 1 0 0.015 0 0 1 -0.004 0 0 0 1</matrix>
</node>
<node id="mFaceNoseCenter" name="mFaceNoseCenter" sid="mFaceNoseCenter" type="JOINT">
<matrix sid="transform">1 0 0 0.102 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mFaceNoseRight" name="mFaceNoseRight" sid="mFaceNoseRight" type="JOINT">
<matrix sid="transform">1 0 0 0.086 0 1 0 -0.015 0 0 1 -0.004 0 0 0 1</matrix>
</node>
<node id="mFaceCheekLowerLeft" name="mFaceCheekLowerLeft" sid="mFaceCheekLowerLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.05 0 1 0 0.034 0 0 1 -0.031 0 0 0 1</matrix>
</node>
<node id="mFaceCheekUpperLeft" name="mFaceCheekUpperLeft" sid="mFaceCheekUpperLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.07 0 1 0 0.034 0 0 1 -0.005 0 0 0 1</matrix>
</node>
<node id="mFaceCheekLowerRight" name="mFaceCheekLowerRight" sid="mFaceCheekLowerRight" type="JOINT">
<matrix sid="transform">1 0 0 0.05 0 1 0 -0.034 0 0 1 -0.031 0 0 0 1</matrix>
</node>
<node id="mFaceCheekUpperRight" name="mFaceCheekUpperRight" sid="mFaceCheekUpperRight" type="JOINT">
<matrix sid="transform">1 0 0 0.07 0 1 0 -0.034 0 0 1 -0.005 0 0 0 1</matrix>
</node>
<node id="mFaceJaw" name="mFaceJaw" sid="mFaceJaw" type="JOINT">
<matrix sid="transform">1 0 0 -0.001 0 1 0 0 0 0 1 -0.015 0 0 0 1</matrix>
<node id="mFaceChin" name="mFaceChin" sid="mFaceChin" type="JOINT">
<matrix sid="transform">1 0 0 0.074 0 1 0 0 0 0 1 -0.054 0 0 0 1</matrix>
</node>
<node id="mFaceTeethLower" name="mFaceTeethLower" sid="mFaceTeethLower" type="JOINT">
<matrix sid="transform">1 0 0 0.021 0 1 0 0 0 0 1 -0.039 0 0 0 1</matrix>
<node id="mFaceLipLowerLeft" name="mFaceLipLowerLeft" sid="mFaceLipLowerLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.045 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mFaceLipLowerRight" name="mFaceLipLowerRight" sid="mFaceLipLowerRight" type="JOINT">
<matrix sid="transform">1 0 0 0.045 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mFaceLipLowerCenter" name="mFaceLipLowerCenter" sid="mFaceLipLowerCenter" type="JOINT">
<matrix sid="transform">1 0 0 0.045 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mFaceTongueBase" name="mFaceTongueBase" sid="mFaceTongueBase" type="JOINT">
<matrix sid="transform">1 0 0 0.039 0 1 0 0 0 0 1 0.005 0 0 0 1</matrix>
<node id="mFaceTongueTip" name="mFaceTongueTip" sid="mFaceTongueTip" type="JOINT">
<matrix sid="transform">1 0 0 0.022 0 1 0 0 0 0 1 0.007 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
<node id="mFaceJawShaper" name="mFaceJawShaper" sid="mFaceJawShaper" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mFaceForeheadCenter" name="mFaceForeheadCenter" sid="mFaceForeheadCenter" type="JOINT">
<matrix sid="transform">1 0 0 0.069 0 1 0 0 0 0 1 0.065 0 0 0 1</matrix>
</node>
<node id="mFaceNoseBase" name="mFaceNoseBase" sid="mFaceNoseBase" type="JOINT">
<matrix sid="transform">1 0 0 0.094 0 1 0 0 0 0 1 -0.016 0 0 0 1</matrix>
</node>
<node id="mFaceTeethUpper" name="mFaceTeethUpper" sid="mFaceTeethUpper" type="JOINT">
<matrix sid="transform">1 0 0 0.02 0 1 0 0 0 0 1 -0.03 0 0 0 1</matrix>
<node id="mFaceLipUpperLeft" name="mFaceLipUpperLeft" sid="mFaceLipUpperLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.045 0 1 0 0 0 0 1 -0.003 0 0 0 1</matrix>
</node>
<node id="mFaceLipUpperRight" name="mFaceLipUpperRight" sid="mFaceLipUpperRight" type="JOINT">
<matrix sid="transform">1 0 0 0.045 0 1 0 0 0 0 1 -0.003 0 0 0 1</matrix>
</node>
<node id="mFaceLipCornerLeft" name="mFaceLipCornerLeft" sid="mFaceLipCornerLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.028 0 1 0 -0.019 0 0 1 -0.01 0 0 0 1</matrix>
</node>
<node id="mFaceLipCornerRight" name="mFaceLipCornerRight" sid="mFaceLipCornerRight" type="JOINT">
<matrix sid="transform">1 0 0 0.028 0 1 0 0.019 0 0 1 -0.01 0 0 0 1</matrix>
</node>
<node id="mFaceLipUpperCenter" name="mFaceLipUpperCenter" sid="mFaceLipUpperCenter" type="JOINT">
<matrix sid="transform">1 0 0 0.045 0 1 0 0 0 0 1 -0.003 0 0 0 1</matrix>
</node>
</node>
<node id="mFaceEyecornerInnerLeft" name="mFaceEyecornerInnerLeft" sid="mFaceEyecornerInnerLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.075 0 1 0 0.017 0 0 1 0.032 0 0 0 1</matrix>
</node>
<node id="mFaceEyecornerInnerRight" name="mFaceEyecornerInnerRight" sid="mFaceEyecornerInnerRight" type="JOINT">
<matrix sid="transform">1 0 0 0.075 0 1 0 -0.017 0 0 1 0.032 0 0 0 1</matrix>
</node>
<node id="mFaceNoseBridge" name="mFaceNoseBridge" sid="mFaceNoseBridge" type="JOINT">
<matrix sid="transform">1 0 0 0.091 0 1 0 0 0 0 1 0.02 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
<node id="mCollarLeft" name="mCollarLeft" sid="mCollarLeft" type="JOINT">
<matrix sid="transform">1 0 0 -0.021 0 1 0 0.085 0 0 1 0.165 0 0 0 1</matrix>
<node id="L_CLAVICLE" name="L_CLAVICLE" sid="L_CLAVICLE" type="JOINT">
<matrix sid="transform">1 0 0 0.02 0 1 0 0 0 0 1 0.02 0 0 0 1</matrix>
</node>
<node id="mShoulderLeft" name="mShoulderLeft" sid="mShoulderLeft" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0.079 0 0 1 0 0 0 0 1</matrix>
<node id="L_UPPER_ARM" name="L_UPPER_ARM" sid="L_UPPER_ARM" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0.12 0 0 1 0.01 0 0 0 1</matrix>
</node>
<node id="mElbowLeft" name="mElbowLeft" sid="mElbowLeft" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0.248 0 0 1 0 0 0 0 1</matrix>
<node id="L_LOWER_ARM" name="L_LOWER_ARM" sid="L_LOWER_ARM" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0.1 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mWristLeft" name="mWristLeft" sid="mWristLeft" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0.205 0 0 1 0 0 0 0 1</matrix>
<node id="L_HAND" name="L_HAND" sid="L_HAND" type="JOINT">
<matrix sid="transform">1 0 0 0.01 0 1 0 0.05 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mHandMiddle1Left" name="mHandMiddle1Left" sid="mHandMiddle1Left" type="JOINT">
<matrix sid="transform">1 0 0 0.013 0 1 0 0.101 0 0 1 0.015 0 0 0 1</matrix>
<node id="mHandMiddle2Left" name="mHandMiddle2Left" sid="mHandMiddle2Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.001 0 1 0 0.04 0 0 1 -0.006 0 0 0 1</matrix>
<node id="mHandMiddle3Left" name="mHandMiddle3Left" sid="mHandMiddle3Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.001 0 1 0 0.049 0 0 1 -0.008 0 0 0 1</matrix>
</node>
</node>
</node>
<node id="mHandIndex1Left" name="mHandIndex1Left" sid="mHandIndex1Left" type="JOINT">
<matrix sid="transform">1 0 0 0.038 0 1 0 0.097 0 0 1 0.015 0 0 0 1</matrix>
<node id="mHandIndex2Left" name="mHandIndex2Left" sid="mHandIndex2Left" type="JOINT">
<matrix sid="transform">1 0 0 0.017 0 1 0 0.036 0 0 1 -0.006 0 0 0 1</matrix>
<node id="mHandIndex3Left" name="mHandIndex3Left" sid="mHandIndex3Left" type="JOINT">
<matrix sid="transform">1 0 0 0.014 0 1 0 0.032 0 0 1 -0.006 0 0 0 1</matrix>
</node>
</node>
</node>
<node id="mHandRing1Left" name="mHandRing1Left" sid="mHandRing1Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.01 0 1 0 0.099 0 0 1 0.009 0 0 0 1</matrix>
<node id="mHandRing2Left" name="mHandRing2Left" sid="mHandRing2Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.013 0 1 0 0.038 0 0 1 -0.008 0 0 0 1</matrix>
<node id="mHandRing3Left" name="mHandRing3Left" sid="mHandRing3Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.013 0 1 0 0.04 0 0 1 -0.009 0 0 0 1</matrix>
</node>
</node>
</node>
<node id="mHandPinky1Left" name="mHandPinky1Left" sid="mHandPinky1Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.031 0 1 0 0.095 0 0 1 0.003 0 0 0 1</matrix>
<node id="mHandPinky2Left" name="mHandPinky2Left" sid="mHandPinky2Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.024 0 1 0 0.025 0 0 1 -0.006 0 0 0 1</matrix>
<node id="mHandPinky3Left" name="mHandPinky3Left" sid="mHandPinky3Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.015 0 1 0 0.018 0 0 1 -0.004 0 0 0 1</matrix>
</node>
</node>
</node>
<node id="mHandThumb1Left" name="mHandThumb1Left" sid="mHandThumb1Left" type="JOINT">
<matrix sid="transform">1 0 0 0.031 0 1 0 0.026 0 0 1 0.004 0 0 0 1</matrix>
<node id="mHandThumb2Left" name="mHandThumb2Left" sid="mHandThumb2Left" type="JOINT">
<matrix sid="transform">1 0 0 0.028 0 1 0 0.032 0 0 1 -0.001 0 0 0 1</matrix>
<node id="mHandThumb3Left" name="mHandThumb3Left" sid="mHandThumb3Left" type="JOINT">
<matrix sid="transform">1 0 0 0.023 0 1 0 0.031 0 0 1 -0.001 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
</node>
</node>
</node>
<node id="mCollarRight" name="mCollarRight" sid="mCollarRight" type="JOINT">
<matrix sid="transform">1 0 0 -0.021 0 1 0 -0.085 0 0 1 0.165 0 0 0 1</matrix>
<node id="R_CLAVICLE" name="R_CLAVICLE" sid="R_CLAVICLE" type="JOINT">
<matrix sid="transform">1 0 0 0.02 0 1 0 0 0 0 1 0.02 0 0 0 1</matrix>
</node>
<node id="mShoulderRight" name="mShoulderRight" sid="mShoulderRight" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 -0.079 0 0 1 0 0 0 0 1</matrix>
<node id="R_UPPER_ARM" name="R_UPPER_ARM" sid="R_UPPER_ARM" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 -0.12 0 0 1 0.01 0 0 0 1</matrix>
</node>
<node id="mElbowRight" name="mElbowRight" sid="mElbowRight" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 -0.248 0 0 1 0 0 0 0 1</matrix>
<node id="R_LOWER_ARM" name="R_LOWER_ARM" sid="R_LOWER_ARM" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 -0.1 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mWristRight" name="mWristRight" sid="mWristRight" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 -0.205 0 0 1 0 0 0 0 1</matrix>
<node id="R_HAND" name="R_HAND" sid="R_HAND" type="JOINT">
<matrix sid="transform">1 0 0 0.01 0 1 0 -0.05 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mHandMiddle1Right" name="mHandMiddle1Right" sid="mHandMiddle1Right" type="JOINT">
<matrix sid="transform">1 0 0 0.013 0 1 0 -0.101 0 0 1 0.015 0 0 0 1</matrix>
<node id="mHandMiddle2Right" name="mHandMiddle2Right" sid="mHandMiddle2Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.001 0 1 0 -0.04 0 0 1 -0.006 0 0 0 1</matrix>
<node id="mHandMiddle3Right" name="mHandMiddle3Right" sid="mHandMiddle3Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.001 0 1 0 -0.049 0 0 1 -0.008 0 0 0 1</matrix>
</node>
</node>
</node>
<node id="mHandIndex1Right" name="mHandIndex1Right" sid="mHandIndex1Right" type="JOINT">
<matrix sid="transform">1 0 0 0.038 0 1 0 -0.097 0 0 1 0.015 0 0 0 1</matrix>
<node id="mHandIndex2Right" name="mHandIndex2Right" sid="mHandIndex2Right" type="JOINT">
<matrix sid="transform">1 0 0 0.017 0 1 0 -0.036 0 0 1 -0.006 0 0 0 1</matrix>
<node id="mHandIndex3Right" name="mHandIndex3Right" sid="mHandIndex3Right" type="JOINT">
<matrix sid="transform">1 0 0 0.014 0 1 0 -0.032 0 0 1 -0.006 0 0 0 1</matrix>
</node>
</node>
</node>
<node id="mHandRing1Right" name="mHandRing1Right" sid="mHandRing1Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.01 0 1 0 -0.099 0 0 1 0.009 0 0 0 1</matrix>
<node id="mHandRing2Right" name="mHandRing2Right" sid="mHandRing2Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.013 0 1 0 -0.038 0 0 1 -0.008 0 0 0 1</matrix>
<node id="mHandRing3Right" name="mHandRing3Right" sid="mHandRing3Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.013 0 1 0 -0.04 0 0 1 -0.009 0 0 0 1</matrix>
</node>
</node>
</node>
<node id="mHandPinky1Right" name="mHandPinky1Right" sid="mHandPinky1Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.031 0 1 0 -0.095 0 0 1 0.003 0 0 0 1</matrix>
<node id="mHandPinky2Right" name="mHandPinky2Right" sid="mHandPinky2Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.024 0 1 0 -0.025 0 0 1 -0.006 0 0 0 1</matrix>
<node id="mHandPinky3Right" name="mHandPinky3Right" sid="mHandPinky3Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.015 0 1 0 -0.018 0 0 1 -0.004 0 0 0 1</matrix>
</node>
</node>
</node>
<node id="mHandThumb1Right" name="mHandThumb1Right" sid="mHandThumb1Right" type="JOINT">
<matrix sid="transform">1 0 0 0.031 0 1 0 -0.026 0 0 1 0.004 0 0 0 1</matrix>
<node id="mHandThumb2Right" name="mHandThumb2Right" sid="mHandThumb2Right" type="JOINT">
<matrix sid="transform">1 0 0 0.028 0 1 0 -0.032 0 0 1 -0.001 0 0 0 1</matrix>
<node id="mHandThumb3Right" name="mHandThumb3Right" sid="mHandThumb3Right" type="JOINT">
<matrix sid="transform">1 0 0 0.023 0 1 0 -0.031 0 0 1 -0.001 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
</node>
</node>
</node>
<node id="mWingsRoot" name="mWingsRoot" sid="mWingsRoot" type="JOINT">
<matrix sid="transform">1 0 0 -0.014 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
<node id="mWing1Left" name="mWing1Left" sid="mWing1Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.099 0 1 0 0.105 0 0 1 0.181 0 0 0 1</matrix>
<node id="mWing2Left" name="mWing2Left" sid="mWing2Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.168 0 1 0 0.169 0 0 1 0.067 0 0 0 1</matrix>
<node id="mWing3Left" name="mWing3Left" sid="mWing3Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.181 0 1 0 0.183 0 0 1 0 0 0 0 1</matrix>
<node id="mWing4Left" name="mWing4Left" sid="mWing4Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.171 0 1 0 0.173 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mWing4FanLeft" name="mWing4FanLeft" sid="mWing4FanLeft" type="JOINT">
<matrix sid="transform">1 0 0 -0.171 0 1 0 0.173 0 0 1 0 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
<node id="mWing1Right" name="mWing1Right" sid="mWing1Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.099 0 1 0 -0.105 0 0 1 0.181 0 0 0 1</matrix>
<node id="mWing2Right" name="mWing2Right" sid="mWing2Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.168 0 1 0 -0.169 0 0 1 0.067 0 0 0 1</matrix>
<node id="mWing3Right" name="mWing3Right" sid="mWing3Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.181 0 1 0 -0.183 0 0 1 0 0 0 0 1</matrix>
<node id="mWing4Right" name="mWing4Right" sid="mWing4Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.171 0 1 0 -0.173 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mWing4FanRight" name="mWing4FanRight" sid="mWing4FanRight" type="JOINT">
<matrix sid="transform">1 0 0 -0.171 0 1 0 -0.173 0 0 1 0 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
</node>
</node>
</node>
</node>
</node>
</node>
</node>
<node id="mHipRight" name="mHipRight" sid="mHipRight" type="JOINT">
<matrix sid="transform">1 0 0 0.034 0 1 0 -0.129 0 0 1 -0.041 0 0 0 1</matrix>
<node id="R_UPPER_LEG" name="R_UPPER_LEG" sid="R_UPPER_LEG" type="JOINT">
<matrix sid="transform">1 0 0 -0.02 0 1 0 0.05 0 0 1 -0.22 0 0 0 1</matrix>
</node>
<node id="mKneeRight" name="mKneeRight" sid="mKneeRight" type="JOINT">
<matrix sid="transform">1 0 0 -0.001 0 1 0 0.049 0 0 1 -0.491 0 0 0 1</matrix>
<node id="R_LOWER_LEG" name="R_LOWER_LEG" sid="R_LOWER_LEG" type="JOINT">
<matrix sid="transform">1 0 0 -0.02 0 1 0 0 0 0 1 -0.2 0 0 0 1</matrix>
</node>
<node id="mAnkleRight" name="mAnkleRight" sid="mAnkleRight" type="JOINT">
<matrix sid="transform">1 0 0 -0.029 0 1 0 0 0 0 1 -0.468 0 0 0 1</matrix>
<node id="R_FOOT" name="R_FOOT" sid="R_FOOT" type="JOINT">
<matrix sid="transform">1 0 0 0.077 0 1 0 0 0 0 1 -0.041 0 0 0 1</matrix>
</node>
<node id="mFootRight" name="mFootRight" sid="mFootRight" type="JOINT">
<matrix sid="transform">1 0 0 0.112 0 1 0 0 0 0 1 -0.061 0 0 0 1</matrix>
<node id="mToeRight" name="mToeRight" sid="mToeRight" type="JOINT">
<matrix sid="transform">1 0 0 0.109 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
</node>
<node id="mHipLeft" name="mHipLeft" sid="mHipLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.034 0 1 0 0.127 0 0 1 -0.041 0 0 0 1</matrix>
<node id="L_UPPER_LEG" name="L_UPPER_LEG" sid="L_UPPER_LEG" type="JOINT">
<matrix sid="transform">1 0 0 -0.02 0 1 0 -0.05 0 0 1 -0.22 0 0 0 1</matrix>
</node>
<node id="mKneeLeft" name="mKneeLeft" sid="mKneeLeft" type="JOINT">
<matrix sid="transform">1 0 0 -0.001 0 1 0 -0.046 0 0 1 -0.491 0 0 0 1</matrix>
<node id="L_LOWER_LEG" name="L_LOWER_LEG" sid="L_LOWER_LEG" type="JOINT">
<matrix sid="transform">1 0 0 -0.02 0 1 0 0 0 0 1 -0.2 0 0 0 1</matrix>
</node>
<node id="mAnkleLeft" name="mAnkleLeft" sid="mAnkleLeft" type="JOINT">
<matrix sid="transform">1 0 0 -0.029 0 1 0 0.001 0 0 1 -0.468 0 0 0 1</matrix>
<node id="L_FOOT" name="L_FOOT" sid="L_FOOT" type="JOINT">
<matrix sid="transform">1 0 0 0.077 0 1 0 0 0 0 1 -0.041 0 0 0 1</matrix>
</node>
<node id="mFootLeft" name="mFootLeft" sid="mFootLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.112 0 1 0 0 0 0 1 -0.061 0 0 0 1</matrix>
<node id="mToeLeft" name="mToeLeft" sid="mToeLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.109 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
</node>
<node id="mTail1" name="mTail1" sid="mTail1" type="JOINT">
<matrix sid="transform">1 0 0 -0.116 0 1 0 0 0 0 1 0.047 0 0 0 1</matrix>
<node id="mTail2" name="mTail2" sid="mTail2" type="JOINT">
<matrix sid="transform">1 0 0 -0.197 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
<node id="mTail3" name="mTail3" sid="mTail3" type="JOINT">
<matrix sid="transform">1 0 0 -0.168 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
<node id="mTail4" name="mTail4" sid="mTail4" type="JOINT">
<matrix sid="transform">1 0 0 -0.142 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
<node id="mTail5" name="mTail5" sid="mTail5" type="JOINT">
<matrix sid="transform">1 0 0 -0.112 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
<node id="mTail6" name="mTail6" sid="mTail6" type="JOINT">
<matrix sid="transform">1 0 0 -0.094 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
</node>
</node>
<node id="mGroin" name="mGroin" sid="mGroin" type="JOINT">
<matrix sid="transform">1 0 0 0.064 0 1 0 0 0 0 1 -0.097 0 0 0 1</matrix>
</node>
<node id="mHindLimbsRoot" name="mHindLimbsRoot" sid="mHindLimbsRoot" type="JOINT">
<matrix sid="transform">1 0 0 -0.2 0 1 0 0 0 0 1 0.084 0 0 0 1</matrix>
<node id="mHindLimb1Left" name="mHindLimb1Left" sid="mHindLimb1Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.204 0 1 0 0.129 0 0 1 -0.125 0 0 0 1</matrix>
<node id="mHindLimb2Left" name="mHindLimb2Left" sid="mHindLimb2Left" type="JOINT">
<matrix sid="transform">1 0 0 0.002 0 1 0 -0.046 0 0 1 -0.491 0 0 0 1</matrix>
<node id="mHindLimb3Left" name="mHindLimb3Left" sid="mHindLimb3Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.03 0 1 0 -0.003 0 0 1 -0.468 0 0 0 1</matrix>
<node id="mHindLimb4Left" name="mHindLimb4Left" sid="mHindLimb4Left" type="JOINT">
<matrix sid="transform">1 0 0 0.112 0 1 0 0 0 0 1 -0.061 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
<node id="mHindLimb1Right" name="mHindLimb1Right" sid="mHindLimb1Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.204 0 1 0 -0.129 0 0 1 -0.125 0 0 0 1</matrix>
<node id="mHindLimb2Right" name="mHindLimb2Right" sid="mHindLimb2Right" type="JOINT">
<matrix sid="transform">1 0 0 0.002 0 1 0 0.046 0 0 1 -0.491 0 0 0 1</matrix>
<node id="mHindLimb3Right" name="mHindLimb3Right" sid="mHindLimb3Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.03 0 1 0 0.003 0 0 1 -0.468 0 0 0 1</matrix>
<node id="mHindLimb4Right" name="mHindLimb4Right" sid="mHindLimb4Right" type="JOINT">
<matrix sid="transform">1 0 0 0.112 0 1 0 0 0 0 1 -0.061 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
</node>
</node>
</node>

View File

@@ -18,6 +18,8 @@ You should have received a copy of the GNU Lesser General Public License
along with this program; if not, write to the Free Software Foundation,
Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""
from __future__ import annotations
import ast
import enum
import hashlib
@@ -27,6 +29,7 @@ import math
from typing import *
import recordclass
import transformations
logger = getLogger('hippolyzer.lib.base.datatypes')
@@ -36,12 +39,13 @@ class _IterableStub:
__iter__: Callable
class TupleCoord(recordclass.datatuple, _IterableStub): # type: ignore
__options__ = {
"fast_new": False,
}
RAD_TO_DEG = 180 / math.pi
class TupleCoord(recordclass.RecordClass, _IterableStub):
def __init__(self, *args):
# Only to help typing, doesn't actually do anything.
# All the important stuff happens in `__new__()`
pass
@classmethod
@@ -58,6 +62,9 @@ class TupleCoord(recordclass.datatuple, _IterableStub): # type: ignore
def __abs__(self):
return self.__class__(*(abs(x) for x in self))
def __neg__(self):
return self.__class__(*(-x for x in self))
def __add__(self, other):
return self.__class__(*(x + y for x, y in zip(self, other)))
@@ -215,6 +222,15 @@ class Quaternion(TupleCoord):
)
return super().__mul__(other)
@classmethod
def from_transformations(cls, coord) -> Quaternion:
"""Convert to W (S) last form"""
return cls(coord[1], coord[2], coord[3], coord[0])
def to_transformations(self) -> Tuple[float, float, float, float]:
"""Convert to W (S) first form for use with the transformations lib"""
return self.W, self.X, self.Y, self.Z
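A small illustration of the ordering difference these helpers paper over, assuming the transformations library's quaternion helpers (which take and return W-first tuples):

import transformations

q = Quaternion.from_euler(0.0, 0.0, 90.0, degrees=True)
# transformations wants (W, X, Y, Z); SL/LLSD-style quaternions are (X, Y, Z, W)
mat = transformations.quaternion_matrix(q.to_transformations())
roundtripped = Quaternion.from_transformations(transformations.quaternion_from_matrix(mat))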
@classmethod
def from_euler(cls, roll, pitch, yaw, degrees=False):
if degrees:
@@ -236,6 +252,9 @@ class Quaternion(TupleCoord):
return cls(X=x, Y=y, Z=z, W=w)
def to_euler(self) -> Vector3:
return Vector3(*transformations.euler_from_quaternion(self.to_transformations()))
def data(self, wanted_components=None):
if wanted_components == 3:
return self.X, self.Y, self.Z
@@ -244,6 +263,7 @@ class Quaternion(TupleCoord):
class UUID(uuid.UUID):
_NULL_UUID_STR = '00000000-0000-0000-0000-000000000000'
ZERO: UUID
__slots__ = ()
def __init__(self, val: Union[uuid.UUID, str, None] = None, bytes=None, int=None):
@@ -268,18 +288,25 @@ class UUID(uuid.UUID):
return self.__class__(int=self.int ^ other.int)
UUID.ZERO = UUID()
class JankStringyBytes(bytes):
"""
Treat bytes as UTF8 if used in string context
Sinful, but a necessary evil for now, since templates don't specify what's
binary and what's a string.
binary and what's a string. There are also certain fields where the value
may be either binary _or_ a string, depending on the context.
"""
__slots__ = ()
def __str__(self):
return self.rstrip(b"\x00").decode("utf8", errors="replace")
def __bool__(self):
return not (super().__eq__(b"") or super().__eq__(b"\x00"))
def __eq__(self, other):
if isinstance(other, str):
return str(self) == other
@@ -288,12 +315,58 @@ class JankStringyBytes(bytes):
def __ne__(self, other):
return not self.__eq__(other)
def __contains__(self, item):
if isinstance(item, str):
return item in str(self)
return item in bytes(self)
def __add__(self, other):
if isinstance(other, bytes):
return JankStringyBytes(bytes(self) + other)
return str(self) + other
def __radd__(self, other):
if isinstance(other, bytes):
return JankStringyBytes(other + bytes(self))
return other + str(self)
def lower(self):
return str(self).lower()
def upper(self):
return str(self).upper()
def startswith(self, __prefix, __start=None, __end=None):
if __start or __end:
raise RuntimeError("Can't handle __start or __end")
if isinstance(__prefix, str):
return str(self).startswith(__prefix)
return super().startswith(__prefix)
def endswith(self, __prefix, __start=None, __end=None):
if __start or __end:
raise RuntimeError("Can't handle __start or __end")
if isinstance(__prefix, str):
return str(self).endswith(__prefix)
return super().endswith(__prefix)
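A hedged illustration of the intended semantics of the class above (the values are made up):

name = JankStringyBytes(b"Wanted name\x00")
assert name == "Wanted name"                      # str comparison goes through the decoded, NUL-stripped form
assert name.startswith("Wanted")                  # str prefixes are checked against the decoded form
assert bytes(name + b"!") == b"Wanted name\x00!"  # adding bytes stays in bytes-land
assert not JankStringyBytes(b"\x00")              # a lone NUL byte is treated as falsy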
class RawBytes(bytes):
__slots__ = ()
pass
_T = TypeVar("_T")
class Pretty(Generic[_T]):
"""Wrapper for var values so Messages will know to serialize"""
__slots__ = ("value",)
def __init__(self, value: _T):
self.value: _T = value
class StringEnum(str, enum.Enum):
def __str__(self):
return self.value
@@ -325,7 +398,7 @@ def flags_to_pod(flag_cls: Type[enum.IntFlag], val: int) -> Tuple[Union[str, int
return tuple(flag.name for flag in iter(flag_cls) if val & flag.value) + extra
class TaggedUnion(recordclass.datatuple): # type: ignore
class TaggedUnion(recordclass.RecordClass):
tag: Any
value: Any
@@ -333,5 +406,5 @@ class TaggedUnion(recordclass.datatuple): # type: ignore
__all__ = [
"Vector3", "Vector4", "Vector2", "Quaternion", "TupleCoord",
"UUID", "RawBytes", "StringEnum", "JankStringyBytes", "TaggedUnion",
"IntEnum", "IntFlag", "flags_to_pod"
"IntEnum", "IntFlag", "flags_to_pod", "Pretty", "RAD_TO_DEG"
]

View File

@@ -18,17 +18,20 @@ You should have received a copy of the GNU Lesser General Public License
along with this program; if not, write to the Free Software Foundation,
Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""
import asyncio
import logging
from logging import getLogger
from hippolyzer.lib.base.helpers import create_logged_task
logger = getLogger('utilities.events')
LOG = logging.getLogger(__name__)
class Event:
""" an object containing data which will be passed out to all subscribers """
def __init__(self):
def __init__(self, name=None):
self.subscribers = []
self.name = name
def subscribe(self, handler, *args, one_shot=False, predicate=None, **kwargs):
""" establish the subscribers (handlers) to this event """
@@ -38,7 +41,8 @@ class Event:
return self
def _handler_key(self, handler):
@staticmethod
def _handler_key(handler):
return handler[:3]
def unsubscribe(self, handler, *args, **kwargs):
@@ -52,24 +56,37 @@ class Event:
raise ValueError(f"Handler {handler!r} is not subscribed to this event.")
return self
def _create_async_wrapper(self, handler, args, inner_args, kwargs):
# Note that unsubscription may be delayed due to asyncio scheduling :)
async def _run_handler_wrapper():
unsubscribe = await handler(args, *inner_args, **kwargs)
if unsubscribe:
_ = self.unsubscribe(handler, *inner_args, **kwargs)
return _run_handler_wrapper
def notify(self, args):
for handler in self.subscribers[:]:
instance, inner_args, kwargs, one_shot, predicate = handler
for subscriber in self.subscribers[:]:
handler, inner_args, kwargs, one_shot, predicate = subscriber
if predicate and not predicate(args):
continue
if one_shot:
self.unsubscribe(instance, *inner_args, **kwargs)
if instance(args, *inner_args, **kwargs):
self.unsubscribe(instance, *inner_args, **kwargs)
self.unsubscribe(handler, *inner_args, **kwargs)
if asyncio.iscoroutinefunction(handler):
create_logged_task(self._create_async_wrapper(handler, args, inner_args, kwargs)(), self.name, LOG)
else:
try:
if handler(args, *inner_args, **kwargs) and not one_shot:
self.unsubscribe(handler, *inner_args, **kwargs)
except:
# One handler failing shouldn't prevent notification of other handlers.
LOG.exception(f"Failed in handler for {self.name}")
def get_subscriber_count(self):
def __len__(self):
return len(self.subscribers)
def clear_subscribers(self):
self.subscribers.clear()
return self
__iadd__ = subscribe
__isub__ = unsubscribe
__call__ = notify
__len__ = get_subscriber_count
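A brief usage sketch of the Event semantics shown above; the event and handler names are illustrative:

object_updates = Event(name="ObjectUpdate")

def log_update(block, region_name):
    print(region_name, block)

def only_once(block):
    # Returning a truthy value from a handler unsubscribes it,
    # just like passing one_shot=True at subscription time.
    return True

object_updates.subscribe(log_update, "Ahern", predicate=lambda block: block is not None)
object_updates += only_once               # __iadd__ is an alias for subscribe
object_updates({"LocalID": 12345})        # __call__ is an alias for notify
assert len(object_updates) == 1           # only_once unsubscribed itself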

View File

@@ -176,7 +176,7 @@ class MessageTemplateNotFound(MessageSystemError):
self.template = template
def __str__(self):
return "No message template found, context: '%s'" % self.context
return "No message template found for %s, context: '%s'" % (self.template, self.context)
class MessageTemplateParsingError(MessageSystemError):

View File

@@ -0,0 +1,528 @@
"""
WIP LLMesh -> glTF converter, for testing eventual glTF -> LLMesh conversion logic.
"""
# TODO:
# * Simple tests
# * Round-tripping skinning data from Blender-compatible glTF back to LLMesh (maybe through rig retargeting?)
# * Panda3D-glTF viewer for LLMesh? The glTFs seem to work fine in Panda3D-glTF's `gltf-viewer`.
# * Check if skew and projection components of transform matrices are ignored in practice as the spec requires.
# I suppose this would render some real assets impossible to represent with glTF.
import dataclasses
import math
import pprint
import sys
import uuid
from pathlib import Path
from typing import *
import gltflib
import numpy as np
import transformations
from hippolyzer.lib.base.datatypes import Vector3
from hippolyzer.lib.base.mesh import (
LLMeshSerializer, MeshAsset, positions_from_domain, SkinSegmentDict, VertexWeight, llsd_to_mat4
)
from hippolyzer.lib.base.mesh_skeleton import AVATAR_SKELETON
from hippolyzer.lib.base.serialization import BufferReader
class IdentityList(list):
"""
List, but does index() by object identity, not equality
GLTF references objects by their index within some list, but we prefer to pass around
actual object references internally. If we don't do this, then when we try to get
a GLTF reference to a given object via `.index()` we could end up getting
a reference to some other object that just happens to be equal. This was causing issues
with all primitives ending up with the same material, due to the default material's value
being the same across all primitives.
"""
def index(self, value, start: Optional[int] = None, stop: Optional[int] = None) -> int:
view = self[start:stop]
for i, x in enumerate(view):
if x is value:
if start:
return i + start
return i
raise ValueError(value)
def sl_to_gltf_coords(coords):
"""
SL (X, Y, Z) -> GL (X, Z, Y), as GLTF commandeth
Note that this will only work when reordering axes, flipping an axis is more complicated.
"""
return coords[0], coords[2], coords[1], *coords[3:]
def sl_to_gltf_uv(uv):
"""Flip the V coordinate of a UV to match glTF convention"""
return [uv[0], -uv[1]]
def sl_mat4_to_gltf(mat: np.ndarray) -> List[float]:
"""
Convert an SL Mat4 to the glTF coordinate system
This should only be done immediately before storing the matrix in a glTF structure!
"""
# TODO: This is probably not correct. We definitely need to flip Z but there's
# probably a better way to do it.
decomp = [sl_to_gltf_coords(x) for x in transformations.decompose_matrix(mat)]
trans = decomp[3]
decomp[3] = (trans[0], trans[1], -trans[2])
return list(transformations.compose_matrix(*decomp).flatten(order='F'))
# Mat3 to convert points from SL coordinate space to GLTF coordinate space
POINT_TO_GLTF_MAT = transformations.compose_matrix(angles=(-(math.pi / 2), 0, 0))[:3, :3]
def sl_vec3_array_to_gltf(vec_list: np.ndarray) -> np.ndarray:
new_array = []
for x in vec_list:
new_array.append(POINT_TO_GLTF_MAT.dot(x))
return np.array(new_array)
def sl_weights_to_gltf(sl_weights: List[List[VertexWeight]]) -> Tuple[np.ndarray, np.ndarray]:
"""Convert SL Weights to separate JOINTS_0 and WEIGHTS_0 vec4 arrays"""
joints = np.zeros((len(sl_weights), 4), dtype=np.uint8)
weights = np.zeros((len(sl_weights), 4), dtype=np.float32)
for i, vert_weights in enumerate(sl_weights):
# We need to re-normalize these since the quantization can mess them up
collected_weights = []
for j, vert_weight in enumerate(vert_weights):
joints[i, j] = vert_weight.joint_idx
collected_weights.append(vert_weight.weight)
weight_sum = sum(collected_weights)
if weight_sum:
for j, weight in enumerate(collected_weights):
weights[i, j] = weight / weight_sum
return joints, weights
def normalize_vec3(a):
norm = np.linalg.norm(a)
if norm == 0:
return a
return a / norm
def apply_bind_shape_matrix(bind_shape_matrix: np.ndarray, verts: np.ndarray, norms: np.ndarray) \
-> Tuple[np.ndarray, np.ndarray]:
"""
Apply the bind shape matrix to the mesh data
glTF expects all verts and normals to be in armature-local space so that mesh data can be shared
between differently-oriented armatures. Or something.
# https://github.com/KhronosGroup/glTF-Blender-IO/issues/566#issuecomment-523119339
glTF also doesn't have a concept of a "bind shape matrix" like Collada does
per its skinning docs, so we have to mix it into the mesh data manually.
See https://github.com/KhronosGroup/glTF-Tutorials/blob/master/gltfTutorial/gltfTutorial_020_Skins.md
"""
scale, _, angles, translation, _ = transformations.decompose_matrix(bind_shape_matrix)
scale_mat = transformations.compose_matrix(scale=scale)[:3, :3]
rot_mat = transformations.euler_matrix(*angles)[:3, :3]
rot_scale_mat = scale_mat @ np.linalg.inv(rot_mat)
# Apply the SRT transform to each vert
verts = (verts @ rot_scale_mat) + translation
# Our scale is unlikely to be uniform, so we have to fix up our normals as well.
# https://paroj.github.io/gltut/Illumination/Tut09%20Normal%20Transformation.html
inv_transpose_mat = np.transpose(np.linalg.inv(bind_shape_matrix)[:3, :3])
new_norms = [normalize_vec3(inv_transpose_mat @ norm) for norm in norms]
return verts, np.array(new_norms)
@dataclasses.dataclass
class JointContext:
node: gltflib.Node
# Original matrix for the bone, may have custom translation, but otherwise the same.
orig_matrix: np.ndarray
# xform that must be applied to inverse bind matrices to account for the changed bone
fixup_matrix: np.ndarray
JOINT_CONTEXT_DICT = Dict[str, JointContext]
class GLTFBuilder:
def __init__(self, blender_compatibility=False):
self.scene = gltflib.Scene(nodes=IdentityList())
self.model = gltflib.GLTFModel(
asset=gltflib.Asset(version="2.0"),
accessors=IdentityList(),
nodes=IdentityList(),
materials=IdentityList(),
buffers=IdentityList(),
bufferViews=IdentityList(),
meshes=IdentityList(),
skins=IdentityList(),
scenes=IdentityList((self.scene,)),
extensionsUsed=["KHR_materials_specular"],
scene=0,
)
self.gltf = gltflib.GLTF(
model=self.model,
resources=IdentityList(),
)
self.blender_compatibility = blender_compatibility
def add_nodes_from_llmesh(self, mesh: MeshAsset, name: str, mesh_transform: Optional[np.ndarray] = None):
"""Build a glTF version of a mesh asset, appending it and its armature to the scene root"""
# TODO: mesh data instancing?
# consider https://github.com/KhronosGroup/glTF-Blender-IO/issues/1634.
if mesh_transform is None:
mesh_transform = np.identity(4)
skin_seg: Optional[SkinSegmentDict] = mesh.segments.get('skin')
skin = None
if skin_seg:
mesh_transform = llsd_to_mat4(skin_seg['bind_shape_matrix'])
joint_ctxs = self.add_joints(skin_seg)
# Give our armature a root node and parent the pelvis to it
armature_node = self.add_node("Armature")
self.scene.nodes.append(self.model.nodes.index(armature_node))
armature_node.children.append(self.model.nodes.index(joint_ctxs['mPelvis'].node))
skin = self.add_skin("Armature", joint_ctxs, skin_seg)
skin.skeleton = self.model.nodes.index(armature_node)
primitives = []
# Just the high LOD for now
for submesh in mesh.segments['high_lod']:
verts = np.array(positions_from_domain(submesh['Position'], submesh['PositionDomain']))
norms = np.array(submesh['Normal'])
tris = np.array(submesh['TriangleList'])
joints = np.array([])
weights = np.array([])
range_uv = np.array([])
if "TexCoord0" in submesh:
range_uv = np.array(positions_from_domain(submesh['TexCoord0'], submesh['TexCoord0Domain']))
if 'Weights' in submesh:
joints, weights = sl_weights_to_gltf(submesh['Weights'])
if skin:
# Convert verts and norms to armature-local space
verts, norms = apply_bind_shape_matrix(mesh_transform, verts, norms)
primitives.append(self.add_primitive(
tris=tris,
positions=verts,
normals=norms,
uvs=range_uv,
joints=joints,
weights=weights,
))
mesh_node = self.add_node(
name,
self.add_mesh(name, primitives),
transform=mesh_transform,
)
if skin:
# Node translation isn't relevant; we're going to use the bind matrices.
# If you pull this into Blender you may want to untick "Guess Original Bind Pose";
# it guesses that from the inverse bind matrices, which may have Maya poisoning.
# TODO: Maybe we could automatically undo that by comparing expected bone scale and rot
# to scale and rot in the inverse bind matrices, and applying fixups to the
# bind shape matrix and inverse bind matrices?
mesh_node.matrix = None
mesh_node.skin = self.model.skins.index(skin)
self.scene.nodes.append(self.model.nodes.index(mesh_node))
def add_node(
self,
name: str,
mesh: Optional[gltflib.Mesh] = None,
transform: Optional[np.ndarray] = None,
) -> gltflib.Node:
node = gltflib.Node(
name=name,
mesh=self.model.meshes.index(mesh) if mesh else None,
matrix=sl_mat4_to_gltf(transform) if transform is not None else None,
children=[],
)
self.model.nodes.append(node)
return node
def add_mesh(
self,
name: str,
primitives: List[gltflib.Primitive],
) -> gltflib.Mesh:
for i, prim in enumerate(primitives):
# Give the materials a name relating to what "face" they belong to
self.model.materials[prim.material].name = f"{name}.{i:03}"
mesh = gltflib.Mesh(name=name, primitives=primitives)
self.model.meshes.append(mesh)
return mesh
def add_primitive(
self,
tris: np.ndarray,
positions: np.ndarray,
normals: np.ndarray,
uvs: np.ndarray,
weights: np.ndarray,
joints: np.ndarray,
) -> gltflib.Primitive:
# Make a Material for the primitive. Materials pretty much _are_ the primitives in
# LLMesh, so just make them both in one go. We need a unique material for each primitive.
material = gltflib.Material(
pbrMetallicRoughness=gltflib.PBRMetallicRoughness(
baseColorFactor=[1.0, 1.0, 1.0, 1.0],
metallicFactor=0.0,
roughnessFactor=0.0,
),
extensions={
"KHR_materials_specular": {
"specularFactor": 0.0,
"specularColorFactor": [0, 0, 0]
},
}
)
self.model.materials.append(material)
attributes = gltflib.Attributes(
POSITION=self.maybe_add_vec_array(sl_vec3_array_to_gltf(positions), gltflib.AccessorType.VEC3),
NORMAL=self.maybe_add_vec_array(sl_vec3_array_to_gltf(normals), gltflib.AccessorType.VEC3),
TEXCOORD_0=self.maybe_add_vec_array(np.array([sl_to_gltf_uv(uv) for uv in uvs]), gltflib.AccessorType.VEC2),
JOINTS_0=self.maybe_add_vec_array(joints, gltflib.AccessorType.VEC4, gltflib.ComponentType.UNSIGNED_BYTE),
WEIGHTS_0=self.maybe_add_vec_array(weights, gltflib.AccessorType.VEC4),
)
return gltflib.Primitive(
attributes=attributes,
indices=self.model.accessors.index(self.add_scalars(tris)),
material=self.model.materials.index(material),
mode=gltflib.PrimitiveMode.TRIANGLES,
)
def add_scalars(self, scalars: np.ndarray) -> gltflib.Accessor:
"""
Add a potentially multidimensional array of scalars, returning the accessor
Generally only used for triangle indices
"""
scalar_bytes = scalars.astype(np.uint32).flatten().tobytes()
buffer_view = self.add_buffer_view(scalar_bytes, None)
accessor = gltflib.Accessor(
bufferView=self.model.bufferViews.index(buffer_view),
componentType=gltflib.ComponentType.UNSIGNED_INT,
count=scalars.size, # use the flattened size!
type=gltflib.AccessorType.SCALAR.value, # type: ignore
min=[int(scalars.min())], # type: ignore
max=[int(scalars.max())], # type: ignore
)
self.model.accessors.append(accessor)
return accessor
def maybe_add_vec_array(
self,
vecs: np.ndarray,
vec_type: gltflib.AccessorType,
component_type: gltflib.ComponentType = gltflib.ComponentType.FLOAT,
) -> Optional[int]:
if not vecs.size:
return None
accessor = self.add_vec_array(vecs, vec_type, component_type)
return self.model.accessors.index(accessor)
def add_vec_array(
self,
vecs: np.ndarray,
vec_type: gltflib.AccessorType,
component_type: gltflib.ComponentType = gltflib.ComponentType.FLOAT
) -> gltflib.Accessor:
"""
Add a two-dimensional array of vecs (positions, normals, weights, UVs) returning the accessor
Vec type may be a vec2, vec3, or vec4.
"""
# Pretty much all of these are float32 except the ones that aren't
dtype = np.float32
if component_type == gltflib.ComponentType.UNSIGNED_BYTE:
dtype = np.uint8
vec_data = vecs.astype(dtype).tobytes()
buffer_view = self.add_buffer_view(vec_data, target=None)
accessor = gltflib.Accessor(
bufferView=self.model.bufferViews.index(buffer_view),
componentType=component_type,
count=len(vecs),
type=vec_type.value, # type: ignore
min=vecs.min(axis=0).tolist(), # type: ignore
max=vecs.max(axis=0).tolist(), # type: ignore
)
self.model.accessors.append(accessor)
return accessor
def add_buffer_view(self, data: bytes, target: Optional[gltflib.BufferTarget]) -> gltflib.BufferView:
"""Create a buffer view and associated buffer and resource for a blob of data"""
resource = gltflib.FileResource(filename=f"res-{uuid.uuid4()}.bin", data=data)
self.gltf.resources.append(resource)
buffer = gltflib.Buffer(uri=resource.filename, byteLength=len(resource.data))
self.model.buffers.append(buffer)
buffer_view = gltflib.BufferView(
buffer=self.model.buffers.index(buffer),
byteLength=buffer.byteLength,
byteOffset=0,
target=target
)
self.model.bufferViews.append(buffer_view)
return buffer_view
def add_joints(self, skin: SkinSegmentDict) -> JOINT_CONTEXT_DICT:
# There may be some joints not present in the mesh that we need to add to reach the mPelvis root
required_joints = set()
for joint_name in skin['joint_names']:
joint_node = AVATAR_SKELETON[joint_name]
required_joints.add(joint_node)
required_joints.update(joint_node.ancestors)
# If this is present, it may override the joint positions from the skeleton definition
if 'alt_inverse_bind_matrix' in skin:
joint_overrides = dict(zip(skin['joint_names'], skin['alt_inverse_bind_matrix']))
else:
joint_overrides = {}
built_joints: JOINT_CONTEXT_DICT = {}
for joint in required_joints:
joint_matrix = joint.matrix
# Do we have a joint position override that would affect joint_matrix?
override = joint_overrides.get(joint.name)
if override:
decomp = list(transformations.decompose_matrix(joint_matrix))
# We specifically only want the translation from the override!
translation = transformations.translation_from_matrix(llsd_to_mat4(override))
# Only do it if the difference is over 0.1mm though
if Vector3.dist(Vector3(*translation), joint.translation) > 0.0001:
decomp[3] = translation
joint_matrix = transformations.compose_matrix(*decomp)
# Do we need to mess with the bone's matrices to make Blender cooperate?
orig_matrix = joint_matrix
fixup_matrix = np.identity(4)
if self.blender_compatibility:
joint_matrix, fixup_matrix = self._fix_blender_joint(joint_matrix)
# TODO: populate "extras" here with the metadata the Blender collada stuff uses to store
# "bind_mat" and "rest_mat" so we can go back to our original matrices when exporting
# from blender to .dae!
gltf_joint = self.add_node(joint.name, transform=joint_matrix)
# Store the node along with any fixups we may need to apply to the bind matrices later
built_joints[joint.name] = JointContext(gltf_joint, orig_matrix, fixup_matrix)
# Add each joint to the child list of their respective parent
for joint_name, joint_ctx in built_joints.items():
if parent_name := AVATAR_SKELETON[joint_name].parent_name:
built_joints[parent_name].node.children.append(self.model.nodes.index(joint_ctx.node))
return built_joints
def _fix_blender_joint(self, joint_matrix: np.ndarray) -> Tuple[np.ndarray, np.ndarray]:
"""
Split a joint matrix into a joint matrix and fixup matrix
If we don't account for weird scaling on the collision volumes, then
Blender freaks out. This is an issue in blender where it doesn't
apply the inverse bind matrices relative to the scale and rotation of
the bones themselves, as it should per the glTF spec. Blender's glTF loader
tries to recover from this by applying certain transforms as a pose, but
the damage has been done by that point. Nobody else really runs into
this because they have the good sense to not use some nightmare abomination
rig with scaling and rotation on the skeleton like SL does.
Blender will _only_ correctly handle the translation component of the joint;
any other transforms need to be mixed into the inverse bind matrices themselves.
There's no internal concept of bone scale or rot in Blender right now.
Should investigate an Avastar-style approach of optionally retargeting
to a Blender-compatible rig with translation-only bones, and modify
the bind matrices to accommodate. The glTF importer supports metadata through
the "extras" fields, so we can potentially abuse the "bind_mat" metadata field
that Blender already uses for the "Keep Bind Info" Collada import / export hack.
For context:
* https://github.com/KhronosGroup/glTF-Blender-IO/issues/1305
* https://developer.blender.org/T38660 (these are Collada, but still relevant)
* https://developer.blender.org/T29246
* https://developer.blender.org/T50412
* https://developer.blender.org/T53620 (FBX but still relevant)
"""
scale, shear, angles, translate, projection = transformations.decompose_matrix(joint_matrix)
joint_matrix = transformations.compose_matrix(translate=translate)
fixup_matrix = transformations.compose_matrix(scale=scale, angles=angles)
return joint_matrix, fixup_matrix
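# Sanity-check sketch (arbitrary values): the two halves should recompose to the
# original matrix, modulo shear / perspective, which is why add_skin() pre-multiplies
# the inverse bind matrices by fixup_matrix.
#   m = transformations.compose_matrix(scale=(1, 1, 2), angles=(0.1, 0, 0), translate=(0, 0.5, 0))
#   jm, fm = GLTFBuilder(blender_compatibility=True)._fix_blender_joint(m)
#   np.allclose(jm @ fm, m)   # expected: True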
def add_skin(self, name: str, joint_nodes: JOINT_CONTEXT_DICT, skin_seg: SkinSegmentDict) -> gltflib.Skin:
joints_arr = []
for joint_name in skin_seg['joint_names']:
joint_ctx = joint_nodes[joint_name]
joints_arr.append(self.model.nodes.index(joint_ctx.node))
inv_binds = []
for joint_name, inv_bind in zip(skin_seg['joint_names'], skin_seg['inverse_bind_matrix']):
joint_ctx = joint_nodes[joint_name]
inv_bind = joint_ctx.fixup_matrix @ llsd_to_mat4(inv_bind)
inv_binds.append(sl_mat4_to_gltf(inv_bind))
inv_binds_data = np.array(inv_binds, dtype=np.float32).tobytes()
buffer_view = self.add_buffer_view(inv_binds_data, target=None)
accessor = gltflib.Accessor(
bufferView=self.model.bufferViews.index(buffer_view),
componentType=gltflib.ComponentType.FLOAT,
count=len(inv_binds),
type=gltflib.AccessorType.MAT4.value, # type: ignore
)
self.model.accessors.append(accessor)
accessor_idx = self.model.accessors.index(accessor)
skin = gltflib.Skin(name=name, joints=joints_arr, inverseBindMatrices=accessor_idx)
self.model.skins.append(skin)
return skin
def finalize(self):
"""Clean up the mesh to pass the glTF smell test, should be done last"""
def _nullify_empty_lists(dc):
for field in dataclasses.fields(dc):
# Empty lists should be replaced with None
if getattr(dc, field.name) == []:
setattr(dc, field.name, None)
for node in self.model.nodes:
_nullify_empty_lists(node)
_nullify_empty_lists(self.model)
return self.gltf
def main():
# Take an llmesh file as an argument and spit out basename-converted.gltf
with open(sys.argv[1], "rb") as f:
reader = BufferReader("<", f.read())
filename = Path(sys.argv[1]).stem
mesh: MeshAsset = reader.read(LLMeshSerializer(parse_segment_contents=True))
builder = GLTFBuilder(blender_compatibility=True)
builder.add_nodes_from_llmesh(mesh, filename)
gltf = builder.finalize()
pprint.pprint(gltf.model)
gltf.export_glb(sys.argv[1].rsplit(".", 1)[0] + "-converted.gltf")
if __name__ == "__main__":
main()
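# Invocation sketch (the module filename here is hypothetical):
#   python llmesh_to_gltf.py some_object.llmesh
# writes some_object-converted.gltf next to the input; note that export_glb() emits
# GLB-packaged data despite the .gltf extension.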

View File

@@ -1,7 +1,12 @@
from __future__ import annotations
import asyncio
import codecs
import functools
import logging
import os
import lazy_object_proxy
import pkg_resources
import re
import weakref
@@ -17,7 +22,7 @@ def _with_patched_multidict(f):
# There's no way to tell pprint "hey, this is a dict,
# this is how you access its items." A lot of the formatting logic
# is in the module-level `_safe_repr()` which we don't want to mess with.
# Instead, pretend our MultiDict has dict's __repr__ and while we're inside
# Instead, pretend our MultiDict has dict's __repr__ while we're inside
# calls to pprint. Hooray.
orig_repr = MultiDict.__repr__
if orig_repr is dict.__repr__:
@@ -65,6 +70,9 @@ class HippoPrettyPrinter(PrettyPrinter):
return f"({reprs})"
def pformat(self, obj: object, *args, **kwargs) -> str:
# Unwrap lazy object proxies before pprinting them
if isinstance(obj, lazy_object_proxy.Proxy):
obj = obj.__wrapped__
if isinstance(obj, (bytes, str)):
return self._str_format(obj)
return self._base_pformat(obj, *args, **kwargs)
@@ -126,6 +134,13 @@ def proxify(obj: Union[Callable[[], _T], weakref.ReferenceType, _T]) -> _T:
return obj
class BiDiDict(Generic[_T]):
"""Dictionary for bidirectional lookups"""
def __init__(self, values: Dict[_T, _T]):
self.forward = {**values}
self.backward = {value: key for (key, value) in values.items()}
def bytes_unescape(val: bytes) -> bytes:
# Only in CPython. bytes -> bytes with escape decoding.
# https://stackoverflow.com/a/23151714
@@ -141,7 +156,60 @@ def get_resource_filename(resource_filename: str):
return pkg_resources.resource_filename("hippolyzer", resource_filename)
def to_chunks(chunkable: Sequence[_T], chunk_size: int) -> Generator[_T, None, None]:
def to_chunks(chunkable: Sequence[_T], chunk_size: int) -> Generator[Sequence[_T], None, None]:
while chunkable:
yield chunkable[:chunk_size]
chunkable = chunkable[chunk_size:]
def get_mtime(path):
try:
return os.stat(path).st_mtime
except:
return None
def fut_logger(name: str, logger: logging.Logger, fut: asyncio.Future, *args) -> None:
"""Callback suitable for exception logging in `Future.add_done_callback()`"""
if not fut.cancelled() and fut.exception():
if isinstance(fut.exception(), asyncio.CancelledError):
# Don't really care if the task was just cancelled
return
logger.exception(f"Failed in task for {name}", exc_info=fut.exception())
def add_future_logger(
fut: asyncio.Future,
name: Optional[str] = None,
logger: Optional[logging.Logger] = None,
):
"""Add a logger to Futures that will never be directly `await`ed, logging exceptions"""
fut.add_done_callback(functools.partial(fut_logger, name, logger or logging.getLogger()))
def create_logged_task(
coro: Coroutine,
name: Optional[str] = None,
logger: Optional[logging.Logger] = None,
) -> asyncio.Task:
task = asyncio.create_task(coro, name=name)
add_future_logger(task, name, logger)
return task
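# Usage sketch (do_background_work is a hypothetical coroutine): fire-and-forget a
# task and still get exceptions logged rather than silently dropped.
#   task = create_logged_task(do_background_work(), name="background work")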
def reorient_coord(coord, new_orientation, min_val: int | float = 0):
"""
Reorient a coordinate instance such that its components are negated and transposed appropriately.
For ex:
reorient_coord((1,2,3), (3,-2,-1)) == (3,-2,-1)
"""
min_val = abs(min_val)
coords = []
for axis in new_orientation:
axis_idx = abs(axis) - 1
new_coord = coord[axis_idx] if axis >= 0 else min_val - coord[axis_idx]
coords.append(new_coord)
if coord.__class__ in (list, tuple):
return coord.__class__(coords)
return coord.__class__(*coords)

View File

@@ -0,0 +1,749 @@
"""
Parse the horrible legacy inventory-related format.
It's typically only used for object contents now.
"""
# TODO: Maybe handle CRC calculation? Does anything care about that?
# I don't think anything in the viewer actually looks at the result
# of the CRC check for UDP stuff.
from __future__ import annotations
import abc
import asyncio
import dataclasses
import datetime as dt
import inspect
import logging
import secrets
import struct
import weakref
from io import StringIO
from typing import *
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.legacy_schema import (
parse_schema_line,
SchemaBase,
SchemaDate,
SchemaFieldSerializer,
SchemaHexInt,
SchemaInt,
SchemaLLSD,
SchemaMultilineStr,
SchemaParsingError,
SchemaStr,
SchemaUUID,
schema_field,
)
from hippolyzer.lib.base.message.message import Block
from hippolyzer.lib.base.templates import SaleType, InventoryType, LookupIntEnum, AssetType, FolderType
MAGIC_ID = UUID("3c115e51-04f4-523c-9fa6-98aff1034730")
LOG = logging.getLogger(__name__)
_T = TypeVar("_T")
class SchemaFlagField(SchemaHexInt):
"""Like a hex int, but must be serialized as bytes in LLSD due to being a U32"""
@classmethod
def from_llsd(cls, val: Any, flavor: str) -> int:
# Sometimes values in S32 range will just come through normally
if isinstance(val, int):
return val
if flavor == "legacy":
return struct.unpack("!I", val)[0]
return val
@classmethod
def to_llsd(cls, val: int, flavor: str) -> Any:
if flavor == "legacy":
return struct.pack("!I", val)
return val
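# Round-trip sketch: legacy-flavored LLSD carries U32 flags as big-endian bytes,
# while other flavors keep plain ints.
#   SchemaFlagField.to_llsd(0x80000000, "legacy")             # -> b'\x80\x00\x00\x00'
#   SchemaFlagField.from_llsd(b'\x80\x00\x00\x00', "legacy")  # -> 2147483648
#   SchemaFlagField.to_llsd(7, "ais")                         # -> 7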
class SchemaEnumField(SchemaStr, Generic[_T]):
def __init__(self, enum_cls: Type[LookupIntEnum]):
super().__init__()
self._enum_cls = enum_cls
def deserialize(self, val: str) -> _T:
return self._enum_cls.from_lookup_name(val)
def serialize(self, val: _T) -> str:
return self._enum_cls(val).to_lookup_name()
def from_llsd(self, val: Union[str, int], flavor: str) -> _T:
if flavor == "legacy":
return self.deserialize(val)
return self._enum_cls(val)
def to_llsd(self, val: _T, flavor: str) -> Union[int, str]:
if flavor == "legacy":
return self.serialize(val)
return int(val)
def _yield_schema_tokens(reader: StringIO):
in_bracket = False
# empty str == EOF in Python
while line := reader.readline():
line = line.strip()
# Whitespace-only lines are automatically skipped
if not line:
continue
try:
key, val = parse_schema_line(line)
except SchemaParsingError:
# Can happen if there's a malformed multi-line string, just
# skip by it.
LOG.warning(f"Found invalid inventory line {line!r}")
continue
if key == "{":
if in_bracket:
LOG.warning("Found multiple opening brackets inside structure, "
"was a nested structure not handled?")
in_bracket = True
continue
if key == "}":
if not in_bracket:
LOG.warning("Unexpected closing bracket")
in_bracket = False
break
yield key, val
if in_bracket:
LOG.warning("Reached EOF while inside a bracket")
class InventoryBase(SchemaBase):
SCHEMA_NAME: ClassVar[str]
@classmethod
def from_reader(cls, reader: StringIO, read_header=False) -> InventoryBase:
tok_iter = _yield_schema_tokens(reader)
# Someone else hasn't already read the header for us
if read_header:
schema_name, _ = next(tok_iter)
if schema_name != cls.SCHEMA_NAME:
raise ValueError(f"Expected schema name {schema_name!r} to be {cls.SCHEMA_NAME!r}")
fields = cls._get_fields_dict()
obj_dict = {}
for key, val in tok_iter:
if key in fields:
field: dataclasses.Field = fields[key]
spec = field.metadata.get("spec")
# Not a real key, an internal var on our dataclass
if not spec:
LOG.warning(f"Internal key {key!r}")
continue
spec_cls = spec
if not inspect.isclass(spec_cls):
spec_cls = spec_cls.__class__
# some kind of nested structure like sale_info
if issubclass(spec_cls, SchemaBase):
obj_dict[key] = spec.from_reader(reader)
elif issubclass(spec_cls, SchemaFieldSerializer):
obj_dict[key] = spec.deserialize(val)
else:
raise ValueError(f"Unsupported spec for {key!r}, {spec!r}")
else:
LOG.warning(f"Unknown key {key!r}")
return cls._obj_from_dict(obj_dict)
def to_writer(self, writer: StringIO):
writer.write(f"\t{self.SCHEMA_NAME}")
if self.SCHEMA_NAME == "permissions":
writer.write(" 0\n")
else:
writer.write("\t0\n")
writer.write("\t{\n")
# Make sure the ID field always comes first, if there is one.
fields_dict: Dict[str, dataclasses.Field] = {}
if hasattr(self, "ID_ATTR"):
fields_dict = {getattr(self, "ID_ATTR"): dataclasses.field()}
# update()ing will put all fields that aren't yet in the dict after the ID attr.
fields_dict.update(self._get_fields_dict())
for field_name, field in fields_dict.items():
spec = field.metadata.get("spec")
# Not meant to be serialized
if not spec:
continue
if field.metadata.get("llsd_only"):
continue
val = getattr(self, field_name)
if val is None and not field.metadata.get("include_none"):
continue
spec_cls = spec
if not inspect.isclass(spec_cls):
spec_cls = spec_cls.__class__
# Some kind of nested structure like sale_info
if isinstance(val, SchemaBase):
val.to_writer(writer)
elif issubclass(spec_cls, SchemaFieldSerializer):
writer.write(f"\t\t{field_name}\t{spec.serialize(val)}\n")
else:
raise ValueError(f"Bad inventory spec {spec!r}")
writer.write("\t}\n")
class InventoryDifferences(NamedTuple):
changed: List[InventoryNodeBase]
removed: List[InventoryNodeBase]
class InventoryModel(InventoryBase):
def __init__(self):
self.nodes: Dict[UUID, InventoryNodeBase] = {}
self.root: Optional[InventoryContainerBase] = None
self.any_dirty = asyncio.Event()
@classmethod
def from_reader(cls, reader: StringIO, read_header=False) -> InventoryModel:
model = cls()
for key, value in _yield_schema_tokens(reader):
if key == "inv_object":
obj = InventoryObject.from_reader(reader)
if obj is not None:
model.add(obj)
elif key == "inv_category":
cat = InventoryCategory.from_reader(reader)
if cat is not None:
model.add(cat)
elif key == "inv_item":
item = InventoryItem.from_reader(reader)
if item is not None:
model.add(item)
else:
LOG.warning("Unknown key {0}".format(key))
return model
@classmethod
def from_llsd(cls, llsd_val: List[Dict], flavor: str = "legacy") -> Self:
model = cls()
for obj_dict in llsd_val:
obj = None
for inv_type in INVENTORY_TYPES:
if inv_type.ID_ATTR in obj_dict:
if (obj := inv_type.from_llsd(obj_dict, flavor)) is not None:
model.add(obj)
break
if obj is None:
LOG.warning(f"Unknown object type {obj_dict!r}")
return model
@property
def ordered_nodes(self) -> Iterable[InventoryNodeBase]:
yield from self.all_containers
yield from self.all_items
@property
def all_containers(self) -> Iterable[InventoryContainerBase]:
for node in self.nodes.values():
if isinstance(node, InventoryContainerBase):
yield node
@property
def dirty_categories(self) -> Iterable[InventoryCategory]:
for node in self.nodes.values():
if isinstance(node, InventoryCategory) and node.version == InventoryCategory.VERSION_NONE:
yield node
@property
def all_items(self) -> Iterable[InventoryItem]:
for node in self.nodes.values():
if not isinstance(node, InventoryContainerBase):
yield node # type: ignore
def __eq__(self, other):
if not isinstance(other, InventoryModel):
return False
return set(self.nodes.values()) == set(other.nodes.values())
def to_writer(self, writer: StringIO):
for node in self.ordered_nodes:
node.to_writer(writer)
def to_llsd(self, flavor: str = "legacy"):
return list(node.to_llsd(flavor) for node in self.ordered_nodes)
def add(self, node: InventoryNodeBase):
if node.node_id in self.nodes:
raise KeyError(f"{node.node_id} already exists in the inventory model")
self.nodes[node.node_id] = node
if isinstance(node, InventoryContainerBase):
if node.parent_id == UUID.ZERO:
self.root = node
node.model = weakref.proxy(self)
return node
def update(self, node: InventoryNodeBase, update_fields: Optional[Iterable[str]] = None) -> InventoryNodeBase:
"""Update an existing node, optionally only updating specific fields"""
if node.node_id not in self.nodes:
raise KeyError(f"{node.node_id} not in the inventory model")
orig_node = self.nodes[node.node_id]
if node.__class__ != orig_node.__class__:
raise ValueError(f"Tried to update {orig_node!r} from non-matching {node!r}")
if not update_fields:
# Update everything but the model parameter
update_fields = node.get_field_names()
for field_name in update_fields:
setattr(orig_node, field_name, getattr(node, field_name))
return orig_node
def upsert(self, node: InventoryNodeBase, update_fields: Optional[Iterable[str]] = None) -> InventoryNodeBase:
"""Add or update a node"""
if node.node_id in self.nodes:
return self.update(node, update_fields)
return self.add(node)
def unlink(self, node: InventoryNodeBase, single_only: bool = False) -> Sequence[InventoryNodeBase]:
"""Unlink a node and its descendants from the tree, returning the removed nodes"""
assert node.model == self
if node == self.root:
self.root = None
unlinked = [node]
if isinstance(node, InventoryContainerBase) and not single_only:
for child in node.children:
unlinked.extend(self.unlink(child))
self.nodes.pop(node.node_id, None)
node.model = None
return unlinked
def get_differences(self, other: InventoryModel) -> InventoryDifferences:
# Includes modified things with the same ID
changed_in_other = []
removed_in_other = []
other_keys = set(other.nodes.keys())
our_keys = set(self.nodes.keys())
# Removed
for key in our_keys - other_keys:
removed_in_other.append(self.nodes[key])
# Updated
for key in other_keys.intersection(our_keys):
other_node = other.nodes[key]
if other_node != self.nodes[key]:
changed_in_other.append(other_node)
# Added
for key in other_keys - our_keys:
changed_in_other.append(other.nodes[key])
return InventoryDifferences(
changed=changed_in_other,
removed=removed_in_other,
)
def flag_if_dirty(self):
if any(self.dirty_categories):
self.any_dirty.set()
def __getitem__(self, item: UUID) -> InventoryNodeBase:
return self.nodes[item]
def __contains__(self, item: UUID):
return item in self.nodes
def get(self, key: UUID) -> Optional[InventoryNodeBase]:
return self.nodes.get(key)
def get_category(self, key: UUID) -> InventoryCategory:
node = self.get(key)
if not isinstance(node, InventoryCategory):
raise ValueError(f"{node!r} is not a category")
return node
def get_item(self, key: UUID) -> InventoryItem:
node = self.get(key)
if not isinstance(node, InventoryItem):
raise ValueError(f"{node!r} is not an item")
return node
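# A minimal usage sketch (IDs, names, and other_model are hypothetical):
#   model = InventoryModel()
#   root = model.add(InventoryCategory(
#       cat_id=UUID.random(), parent_id=UUID.ZERO, name="Contents",
#       type=AssetType.CATEGORY, pref_type=FolderType.NONE,
#   ))
#   model.root is root                                # zero-parented containers become the root
#   InventoryModel.from_str(model.to_str()) == model  # round trip should compare equal
#   model.get_differences(other_model)                # -> InventoryDifferences(changed=[...], removed=[...])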
@dataclasses.dataclass
class InventoryPermissions(InventoryBase):
SCHEMA_NAME: ClassVar[str] = "permissions"
base_mask: int = schema_field(SchemaHexInt)
owner_mask: int = schema_field(SchemaHexInt)
group_mask: int = schema_field(SchemaHexInt)
everyone_mask: int = schema_field(SchemaHexInt)
next_owner_mask: int = schema_field(SchemaHexInt)
creator_id: UUID = schema_field(SchemaUUID)
owner_id: UUID = schema_field(SchemaUUID)
last_owner_id: UUID = schema_field(SchemaUUID)
group_id: UUID = schema_field(SchemaUUID)
# Nothing actually cares about this, but it could be there.
# It's kind of redundant since it just means owner_id == NULL_KEY && group_id != NULL_KEY.
is_owner_group: Optional[int] = schema_field(SchemaInt, default=None, llsd_only=True)
@classmethod
def make_default(cls) -> Self:
return cls(
base_mask=0xFFffFFff,
owner_mask=0xFFffFFff,
group_mask=0,
everyone_mask=0,
next_owner_mask=0x82000,
creator_id=UUID.ZERO,
owner_id=UUID.ZERO,
last_owner_id=UUID.ZERO,
group_id=UUID.ZERO,
is_owner_group=None
)
@dataclasses.dataclass
class InventorySaleInfo(InventoryBase):
SCHEMA_NAME: ClassVar[str] = "sale_info"
sale_type: SaleType = schema_field(SchemaEnumField(SaleType))
sale_price: int = schema_field(SchemaInt)
@classmethod
def make_default(cls) -> Self:
return cls(sale_type=SaleType.NOT, sale_price=10)
class _HasBaseNodeAttrs(abc.ABC):
"""
Only exists so that we can assert that all subclasses have these attributes, without forcing
a particular serialization order as would happen if they were declared on InventoryNodeBase.
"""
name: str
type: AssetType
@dataclasses.dataclass
class InventoryNodeBase(InventoryBase, _HasBaseNodeAttrs):
ID_ATTR: ClassVar[str]
parent_id: Optional[UUID] = schema_field(SchemaUUID)
model: Optional[InventoryModel] = dataclasses.field(
default=None, init=False, hash=False, compare=False, repr=False
)
@classmethod
def get_field_names(cls) -> Set[str]:
return set(cls._get_fields_dict().keys()) - {"model"}
@property
def node_id(self) -> UUID:
return getattr(self, self.ID_ATTR)
@node_id.setter
def node_id(self, val: UUID):
setattr(self, self.ID_ATTR, val)
@property
def parent(self) -> Optional[InventoryContainerBase]:
return self.model.nodes.get(self.parent_id)
def unlink(self) -> Sequence[InventoryNodeBase]:
return self.model.unlink(self)
@classmethod
def _obj_from_dict(cls, obj_dict):
# Bad entry, ignore
# TODO: Check on these. might be symlinks or something.
if obj_dict.get("type") == "-1":
LOG.warning(f"Skipping bad object with type == -1: {obj_dict!r}")
return None
return super()._obj_from_dict(obj_dict)
def __hash__(self):
return hash(self.node_id)
def __iter__(self) -> Iterator[InventoryNodeBase]:
return iter(())
def __contains__(self, item) -> bool:
return item in tuple(self)
@dataclasses.dataclass
class InventoryContainerBase(InventoryNodeBase):
type: AssetType = schema_field(SchemaEnumField(AssetType))
@property
def children(self) -> Sequence[InventoryNodeBase]:
return tuple(
x for x in self.model.nodes.values()
if x.parent_id == self.node_id
)
@property
def descendents(self) -> List[InventoryNodeBase]:
new_children: List[InventoryNodeBase] = [self]
descendents = []
while new_children:
to_check = new_children[:]
new_children.clear()
for obj in to_check:
if isinstance(obj, InventoryContainerBase):
for child in obj.children:
if child in descendents:
continue
new_children.append(child)
descendents.append(child)
else:
if obj not in descendents:
descendents.append(obj)
return descendents
def __getitem__(self, item: Union[int, str]) -> InventoryNodeBase:
if isinstance(item, int):
return self.children[item]
for child in self.children:
if child.name == item:
return child
raise KeyError(f"{item!r} not found in children")
def __iter__(self) -> Iterator[InventoryNodeBase]:
return iter(self.children)
def get_or_create_subcategory(self, name: str) -> InventoryCategory:
for child in self:
if child.name == name and isinstance(child, InventoryCategory):
return child
child = InventoryCategory(
name=name,
cat_id=UUID.random(),
parent_id=self.node_id,
type=AssetType.CATEGORY,
pref_type=FolderType.NONE,
owner_id=getattr(self, 'owner_id', UUID.ZERO),
version=1,
)
self.model.add(child)
return child
# So autogenerated __hash__ doesn't kill our inherited one
__hash__ = InventoryNodeBase.__hash__
@dataclasses.dataclass
class InventoryObject(InventoryContainerBase):
SCHEMA_NAME: ClassVar[str] = "inv_object"
ID_ATTR: ClassVar[str] = "obj_id"
obj_id: UUID = schema_field(SchemaUUID)
name: str = schema_field(SchemaMultilineStr)
metadata: Optional[Dict[str, Any]] = schema_field(SchemaLLSD, default=None, include_none=True)
__hash__ = InventoryNodeBase.__hash__
@dataclasses.dataclass
class InventoryCategory(InventoryContainerBase):
ID_ATTR: ClassVar[str] = "cat_id"
# AIS calls this something else...
ID_ATTR_AIS: ClassVar[str] = "category_id"
SCHEMA_NAME: ClassVar[str] = "inv_category"
VERSION_NONE: ClassVar[int] = -1
cat_id: UUID = schema_field(SchemaUUID)
pref_type: FolderType = schema_field(SchemaEnumField(FolderType), llsd_name="preferred_type")
name: str = schema_field(SchemaMultilineStr)
owner_id: Optional[UUID] = schema_field(SchemaUUID, default=None)
version: int = schema_field(SchemaInt, default=VERSION_NONE, llsd_only=True)
metadata: Optional[Dict[str, Any]] = schema_field(SchemaLLSD, default=None, include_none=False)
def to_folder_data(self) -> Block:
return Block(
"FolderData",
FolderID=self.cat_id,
ParentID=self.parent_id,
CallbackID=0,
Type=self.pref_type,
Name=self.name,
)
@classmethod
def from_folder_data(cls, block: Block):
return cls(
cat_id=block["FolderID"],
parent_id=block["ParentID"],
pref_type=block["Type"],
name=block["Name"],
type=AssetType.CATEGORY,
)
@classmethod
def from_llsd(cls, inv_dict: Dict, flavor: str = "legacy") -> Self:
if flavor == "ais" and "type" not in inv_dict:
inv_dict = inv_dict.copy()
inv_dict["type"] = AssetType.CATEGORY
return super().from_llsd(inv_dict, flavor)
def to_llsd(self, flavor: str = "legacy"):
payload = super().to_llsd(flavor)
if flavor == "ais":
# AIS already knows the inventory type is category
payload.pop("type", None)
return payload
@classmethod
def _get_fields_dict(cls, llsd_flavor: Optional[str] = None):
fields = super()._get_fields_dict(llsd_flavor)
if llsd_flavor == "ais":
# These have different names though
fields["type_default"] = fields.pop("preferred_type")
fields["agent_id"] = fields.pop("owner_id")
fields["category_id"] = fields.pop("cat_id")
return fields
__hash__ = InventoryNodeBase.__hash__
@dataclasses.dataclass
class InventoryItem(InventoryNodeBase):
SCHEMA_NAME: ClassVar[str] = "inv_item"
ID_ATTR: ClassVar[str] = "item_id"
item_id: UUID = schema_field(SchemaUUID)
permissions: InventoryPermissions = schema_field(InventoryPermissions)
asset_id: Optional[UUID] = schema_field(SchemaUUID, default=None)
shadow_id: Optional[UUID] = schema_field(SchemaUUID, default=None)
type: Optional[AssetType] = schema_field(SchemaEnumField(AssetType), default=None)
inv_type: Optional[InventoryType] = schema_field(SchemaEnumField(InventoryType), default=None)
flags: Optional[int] = schema_field(SchemaFlagField, default=None)
sale_info: Optional[InventorySaleInfo] = schema_field(InventorySaleInfo, default=None)
name: Optional[str] = schema_field(SchemaMultilineStr, default=None)
desc: Optional[str] = schema_field(SchemaMultilineStr, default=None)
metadata: Optional[Dict[str, Any]] = schema_field(SchemaLLSD, default=None, include_none=True)
"""Specifically for script metadata, generally just experience info"""
thumbnail: Optional[Dict[str, Any]] = schema_field(SchemaLLSD, default=None, include_none=False)
"""Generally just a dict with the thumbnail UUID in it"""
creation_date: Optional[dt.datetime] = schema_field(SchemaDate, llsd_name="created_at", default=None)
__hash__ = InventoryNodeBase.__hash__
@property
def true_asset_id(self) -> UUID:
if self.asset_id is not None:
return self.asset_id
return self.shadow_id ^ MAGIC_ID
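# Shadow-ID sketch (UUIDs hypothetical): when only shadow_id is set, the real asset ID
# comes back out via XOR, since XORing with MAGIC_ID twice is a no-op.
#   item.shadow_id = real_asset_id ^ MAGIC_ID
#   item.asset_id = None
#   item.true_asset_id == real_asset_id   # expected: True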
def to_inventory_data(self, block_name: str = "InventoryData") -> Block:
return Block(
block_name,
ItemID=self.item_id,
FolderID=self.parent_id,
CallbackID=0,
CreatorID=self.permissions.creator_id,
OwnerID=self.permissions.owner_id,
GroupID=self.permissions.group_id,
BaseMask=self.permissions.base_mask,
OwnerMask=self.permissions.owner_mask,
GroupMask=self.permissions.group_mask,
EveryoneMask=self.permissions.everyone_mask,
NextOwnerMask=self.permissions.next_owner_mask,
GroupOwned=self.permissions.owner_id == UUID.ZERO and self.permissions.group_id != UUID.ZERO,
AssetID=self.true_asset_id,
Type=self.type,
InvType=self.inv_type,
Flags=self.flags,
SaleType=self.sale_info.sale_type,
SalePrice=self.sale_info.sale_price,
Name=self.name,
Description=self.desc,
CreationDate=SchemaDate.to_llsd(self.creation_date, "legacy"),
# Meaningless here
CRC=secrets.randbits(32),
)
@classmethod
def from_inventory_data(cls, block: Block):
return cls(
item_id=block["ItemID"],
# Might be under one of two names
parent_id=block.get("ParentID", block["FolderID"]),
permissions=InventoryPermissions(
creator_id=block["CreatorID"],
owner_id=block["OwnerID"],
# Unknown, not sent in this schema
last_owner_id=block.get("LastOwnerID", UUID.ZERO),
group_id=block["GroupID"],
base_mask=block["BaseMask"],
owner_mask=block["OwnerMask"],
group_mask=block["GroupMask"],
everyone_mask=block["EveryoneMask"],
next_owner_mask=block["NextOwnerMask"],
),
# May be missing in UpdateInventoryItem
asset_id=block.get("AssetID"),
type=AssetType(block["Type"]),
inv_type=InventoryType(block["InvType"]),
flags=block["Flags"],
sale_info=InventorySaleInfo(
sale_type=SaleType(block["SaleType"]),
sale_price=block["SalePrice"],
),
name=block["Name"],
desc=block["Description"],
creation_date=SchemaDate.from_llsd(block["CreationDate"], "legacy"),
)
def to_llsd(self, flavor: str = "legacy"):
val = super().to_llsd(flavor=flavor)
if flavor == "ais":
# There's little chance this differs from owner ID, just place it.
val["agent_id"] = val["permissions"]["owner_id"]
if val["type"] == AssetType.LINK:
# For link items, there is no asset, only a linked ID.
val["linked_id"] = val.pop("asset_id")
# These don't exist either
val.pop("permissions", None)
val.pop("sale_info", None)
return val
@classmethod
def from_llsd(cls, inv_dict: Dict, flavor: str = "legacy") -> Self:
if flavor == "ais" and "linked_id" in inv_dict:
# Links get represented differently than other items for whatever reason.
# This is incredibly annoying; under *NIX there's nothing really special about symlinks.
inv_dict = inv_dict.copy()
# Fill this in since it needs to be there
if "permissions" not in inv_dict:
inv_dict["permissions"] = InventoryPermissions(
base_mask=0xFFffFFff,
owner_mask=0xFFffFFff,
group_mask=0xFFffFFff,
everyone_mask=0,
next_owner_mask=0xFFffFFff,
creator_id=UUID.ZERO,
owner_id=UUID.ZERO,
last_owner_id=UUID.ZERO,
group_id=UUID.ZERO,
).to_llsd("ais")
if "sale_info" not in inv_dict:
inv_dict["sale_info"] = InventorySaleInfo(
sale_type=SaleType.NOT,
sale_price=0,
).to_llsd("ais")
if "type" not in inv_dict:
inv_dict["type"] = AssetType.LINK
# In the context of symlinks, asset id means linked item ID.
# This is also how indra stores symlinks. Why the asymmetry in AIS if none of the
# consumers actually want it? Who knows.
inv_dict["asset_id"] = inv_dict.pop("linked_id")
return super().from_llsd(inv_dict, flavor)
INVENTORY_TYPES: Tuple[Type[InventoryNodeBase], ...] = (InventoryCategory, InventoryObject, InventoryItem)

View File

@@ -1,7 +1,6 @@
import os
import tempfile
from io import BytesIO
from typing import *
import defusedxml.ElementTree
from glymur import jp2box, Jp2k
@@ -10,12 +9,6 @@ from glymur import jp2box, Jp2k
jp2box.ET = defusedxml.ElementTree
SL_DEFAULT_ENCODE = {
"cratios": (1920.0, 480.0, 120.0, 30.0, 10.0),
"irreversible": True,
}
class BufferedJp2k(Jp2k):
"""
For manipulating JP2K from within a binary buffer.
@@ -24,12 +17,7 @@ class BufferedJp2k(Jp2k):
based on filename, so this is the least brittle approach.
"""
def __init__(self, contents: bytes, encode_kwargs: Optional[Dict] = None):
if encode_kwargs is None:
self.encode_kwargs = SL_DEFAULT_ENCODE.copy()
else:
self.encode_kwargs = encode_kwargs
def __init__(self, contents: bytes):
stream = BytesIO(contents)
self.temp_file = tempfile.NamedTemporaryFile(delete=False)
stream.seek(0)
@@ -44,11 +32,12 @@ class BufferedJp2k(Jp2k):
os.remove(self.temp_file.name)
self.temp_file = None
def _write(self, img_array, verbose=False, **kwargs):
# Glymur normally only lets you control encode params when a write happens within
# the constructor. Keep around the encode params from the constructor and pass
# them to successive write calls.
return super()._write(img_array, verbose=False, **self.encode_kwargs, **kwargs)
def _populate_cparams(self, img_array):
if self._cratios is None:
self._cratios = (1920.0, 480.0, 120.0, 30.0, 10.0)
if self._irreversible is None:
self.irreversible = True
return super()._populate_cparams(img_array)
def __bytes__(self):
with open(self.temp_file.name, "rb") as f:

View File

@@ -1,275 +0,0 @@
"""
Parse the horrible legacy inventory-related format.
It's typically only used for object contents now.
"""
from __future__ import annotations
import dataclasses
import datetime as dt
import itertools
import logging
import weakref
from io import StringIO
from typing import *
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.legacy_schema import (
parse_schema_line,
SchemaBase,
SchemaDate,
SchemaFieldSerializer,
SchemaHexInt,
SchemaInt,
SchemaMultilineStr,
SchemaParsingError,
SchemaStr,
SchemaUUID,
schema_field,
)
MAGIC_ID = UUID("3c115e51-04f4-523c-9fa6-98aff1034730")
LOG = logging.getLogger(__name__)
_T = TypeVar("_T")
def _yield_schema_tokens(reader: StringIO):
in_bracket = False
# empty str == EOF in Python
while line := reader.readline():
line = line.strip()
# Whitespace-only lines are automatically skipped
if not line:
continue
try:
key, val = parse_schema_line(line)
except SchemaParsingError:
# Can happen if there's a malformed multi-line string, just
# skip by it.
LOG.warning(f"Found invalid inventory line {line!r}")
continue
if key == "{":
if in_bracket:
LOG.warning("Found multiple opening brackets inside structure, "
"was a nested structure not handled?")
in_bracket = True
continue
if key == "}":
if not in_bracket:
LOG.warning("Unexpected closing bracket")
in_bracket = False
break
yield key, val
if in_bracket:
LOG.warning("Reached EOF while inside a bracket")
class InventoryBase(SchemaBase):
SCHEMA_NAME: ClassVar[str]
@classmethod
def from_reader(cls, reader: StringIO, read_header=False) -> InventoryBase:
tok_iter = _yield_schema_tokens(reader)
# Someone else hasn't already read the header for us
if read_header:
schema_name, _ = next(tok_iter)
if schema_name != cls.SCHEMA_NAME:
raise ValueError(f"Expected schema name {schema_name!r} to be {cls.SCHEMA_NAME!r}")
fields = cls._fields_dict()
obj_dict = {}
for key, val in tok_iter:
if key in fields:
field: dataclasses.Field = fields[key]
spec = field.metadata.get("spec")
# Not a real key, an internal var on our dataclass
if not spec:
LOG.warning(f"Internal key {key!r}")
continue
# some kind of nested structure like sale_info
if issubclass(spec, SchemaBase):
obj_dict[key] = spec.from_reader(reader)
elif issubclass(spec, SchemaFieldSerializer):
obj_dict[key] = spec.deserialize(val)
else:
raise ValueError(f"Unsupported spec for {key!r}, {spec!r}")
else:
LOG.warning(f"Unknown key {key!r}")
return cls._obj_from_dict(obj_dict)
def to_writer(self, writer: StringIO):
writer.write(f"\t{self.SCHEMA_NAME}\t0\n")
writer.write("\t{\n")
for field_name, field in self._fields_dict().items():
spec = field.metadata.get("spec")
# Not meant to be serialized
if not spec:
continue
val = getattr(self, field_name)
if val is None:
continue
# Some kind of nested structure like sale_info
if isinstance(val, SchemaBase):
val.to_writer(writer)
elif issubclass(spec, SchemaFieldSerializer):
writer.write(f"\t\t{field_name}\t{spec.serialize(val)}\n")
else:
raise ValueError(f"Bad inventory spec {spec!r}")
writer.write("\t}\n")
class InventoryModel(InventoryBase):
def __init__(self):
self.containers: Dict[UUID, InventoryContainerBase] = {}
self.items: Dict[UUID, InventoryItem] = {}
self.root: Optional[InventoryContainerBase] = None
@classmethod
def from_reader(cls, reader: StringIO, read_header=False) -> InventoryModel:
model = cls()
for key, value in _yield_schema_tokens(reader):
if key == "inv_object":
obj = InventoryObject.from_reader(reader)
if obj is not None:
model.add_container(obj)
elif key == "inv_category":
cat = InventoryCategory.from_reader(reader)
if cat is not None:
model.add_container(cat)
elif key == "inv_item":
item = InventoryItem.from_reader(reader)
if item is not None:
model.add_item(item)
else:
LOG.warning("Unknown key {0}".format(key))
model.reparent_nodes()
return model
def to_writer(self, writer: StringIO):
for container in self.containers.values():
container.to_writer(writer)
for item in self.items.values():
item.to_writer(writer)
def add_container(self, container: InventoryContainerBase):
self.containers[container.node_id] = container
container.model = weakref.proxy(self)
def add_item(self, item: InventoryItem):
self.items[item.item_id] = item
item.model = weakref.proxy(self)
def reparent_nodes(self):
self.root = None
for container in self.containers.values():
container.children.clear()
if container.parent_id == UUID():
self.root = container
for obj in itertools.chain(self.items.values(), self.containers.values()):
if not obj.parent_id or obj.parent_id == UUID():
continue
parent_container = self.containers.get(obj.parent_id)
if not parent_container:
LOG.warning("{0} had an invalid parent {1}".format(obj, obj.parent_id))
continue
parent_container.children.append(obj)
@dataclasses.dataclass
class InventoryPermissions(InventoryBase):
SCHEMA_NAME: ClassVar[str] = "permissions"
base_mask: int = schema_field(SchemaHexInt)
owner_mask: int = schema_field(SchemaHexInt)
group_mask: int = schema_field(SchemaHexInt)
everyone_mask: int = schema_field(SchemaHexInt)
next_owner_mask: int = schema_field(SchemaHexInt)
creator_id: UUID = schema_field(SchemaUUID)
owner_id: UUID = schema_field(SchemaUUID)
last_owner_id: UUID = schema_field(SchemaUUID)
group_id: UUID = schema_field(SchemaUUID)
@dataclasses.dataclass
class InventorySaleInfo(InventoryBase):
SCHEMA_NAME: ClassVar[str] = "sale_info"
sale_type: str = schema_field(SchemaStr)
sale_price: int = schema_field(SchemaInt)
@dataclasses.dataclass
class InventoryNodeBase(InventoryBase):
ID_ATTR: ClassVar[str]
parent_id: Optional[UUID] = schema_field(SchemaUUID)
model: Optional[InventoryModel] = dataclasses.field(default=None, init=False)
@property
def node_id(self) -> UUID:
return getattr(self, self.ID_ATTR)
@property
def parent(self):
return self.model.containers.get(self.parent_id)
@classmethod
def _obj_from_dict(cls, obj_dict):
# Bad entry, ignore
# TODO: Check on these. might be symlinks or something.
if obj_dict.get("type") == "-1":
LOG.warning(f"Skipping bad object with type == -1: {obj_dict!r}")
return None
return super()._obj_from_dict(obj_dict)
@dataclasses.dataclass
class InventoryContainerBase(InventoryNodeBase):
type: str = schema_field(SchemaStr)
name: str = schema_field(SchemaMultilineStr)
children: List[InventoryNodeBase] = dataclasses.field(default_factory=list, init=False)
@dataclasses.dataclass
class InventoryObject(InventoryContainerBase):
SCHEMA_NAME: ClassVar[str] = "inv_object"
ID_ATTR: ClassVar[str] = "obj_id"
obj_id: UUID = schema_field(SchemaUUID)
@dataclasses.dataclass
class InventoryCategory(InventoryContainerBase):
ID_ATTR: ClassVar[str] = "cat_id"
SCHEMA_NAME: ClassVar[str] = "inv_object"
cat_id: UUID = schema_field(SchemaUUID)
pref_type: str = schema_field(SchemaStr)
owner_id: UUID = schema_field(SchemaUUID)
version: int = schema_field(SchemaInt)
@dataclasses.dataclass
class InventoryItem(InventoryNodeBase):
SCHEMA_NAME: ClassVar[str] = "inv_item"
ID_ATTR: ClassVar[str] = "item_id"
item_id: UUID = schema_field(SchemaUUID)
type: str = schema_field(SchemaStr)
inv_type: str = schema_field(SchemaStr)
flags: int = schema_field(SchemaHexInt)
name: str = schema_field(SchemaMultilineStr)
desc: str = schema_field(SchemaMultilineStr)
creation_date: dt.datetime = schema_field(SchemaDate)
permissions: InventoryPermissions = schema_field(InventoryPermissions)
sale_info: InventorySaleInfo = schema_field(InventorySaleInfo)
asset_id: Optional[UUID] = schema_field(SchemaUUID, default=None)
shadow_id: Optional[UUID] = schema_field(SchemaUUID, default=None)
@property
def true_asset_id(self) -> UUID:
if self.asset_id is not None:
return self.asset_id
return self.shadow_id ^ MAGIC_ID

View File

@@ -9,11 +9,14 @@ import abc
import calendar
import dataclasses
import datetime as dt
import inspect
import logging
import re
from io import StringIO
from typing import *
import hippolyzer.lib.base.llsd as llsd
from hippolyzer.lib.base.datatypes import UUID
LOG = logging.getLogger(__name__)
@@ -31,16 +34,32 @@ class SchemaFieldSerializer(abc.ABC, Generic[_T]):
def serialize(cls, val: _T) -> str:
pass
@classmethod
def from_llsd(cls, val: Any, flavor: str) -> _T:
return val
@classmethod
def to_llsd(cls, val: _T, flavor: str) -> Any:
return val
class SchemaDate(SchemaFieldSerializer[dt.datetime]):
@classmethod
def deserialize(cls, val: str) -> dt.datetime:
return dt.datetime.utcfromtimestamp(int(val))
return dt.datetime.fromtimestamp(int(val), dt.timezone.utc)
@classmethod
def serialize(cls, val: dt.datetime) -> str:
return str(calendar.timegm(val.utctimetuple()))
@classmethod
def from_llsd(cls, val: Any, flavor: str) -> dt.datetime:
return dt.datetime.fromtimestamp(val, dt.timezone.utc)
@classmethod
def to_llsd(cls, val: dt.datetime, flavor: str):
return calendar.timegm(val.utctimetuple())
class SchemaHexInt(SchemaFieldSerializer[int]):
@classmethod
@@ -85,6 +104,13 @@ class SchemaStr(SchemaFieldSerializer[str]):
class SchemaUUID(SchemaFieldSerializer[UUID]):
@classmethod
def from_llsd(cls, val: Any, flavor: str) -> UUID:
# FetchInventory2 will return a string, but we want a UUID. Returning a UUID
# there isn't a problem, because the consumer will just cast it to a string if
# that's what it wants
return UUID(val)
@classmethod
def deserialize(cls, val: str) -> UUID:
return UUID(val)
@@ -94,11 +120,28 @@ class SchemaUUID(SchemaFieldSerializer[UUID]):
return str(val)
def schema_field(spec: Type[Union[SchemaBase, SchemaFieldSerializer]], *, default=dataclasses.MISSING, init=True,
repr=True, hash=None, compare=True) -> dataclasses.Field: # noqa
class SchemaLLSD(SchemaFieldSerializer[_T]):
"""Arbitrary LLSD embedded in a field"""
@classmethod
def deserialize(cls, val: str) -> _T:
return llsd.parse_xml(val.partition("|")[0].encode("utf8"))
@classmethod
def serialize(cls, val: _T) -> str:
# Don't include the XML header
return llsd.format_xml(val).split(b">", 1)[1].decode("utf8") + "\n|"
_SCHEMA_SPEC = Union[Type[Union["SchemaBase", SchemaFieldSerializer]], SchemaFieldSerializer]
def schema_field(spec: _SCHEMA_SPEC, *, default=dataclasses.MISSING, init=True,
repr=True, hash=None, compare=True, llsd_name=None, llsd_only=False,
include_none=False) -> dataclasses.Field: # noqa
"""Describe a field in the inventory schema and the shape of its value"""
return dataclasses.field(
metadata={"spec": spec}, default=default, init=init, repr=repr, hash=hash, compare=compare
return dataclasses.field( # noqa
metadata={"spec": spec, "llsd_name": llsd_name, "llsd_only": llsd_only, "include_none": include_none},
default=default, init=init, repr=repr, hash=hash, compare=compare,
)
@@ -121,11 +164,17 @@ def parse_schema_line(line: str):
@dataclasses.dataclass
class SchemaBase(abc.ABC):
@classmethod
def _fields_dict(cls):
return {f.name: f for f in dataclasses.fields(cls)}
def _get_fields_dict(cls, llsd_flavor: Optional[str] = None) -> Dict[str, dataclasses.Field]:
fields_dict = {}
for field in dataclasses.fields(cls):
field_name = field.name
if llsd_flavor:
field_name = field.metadata.get("llsd_name") or field_name
fields_dict[field_name] = field
return fields_dict
@classmethod
def from_str(cls, text: str):
def from_str(cls, text: str) -> Self:
return cls.from_reader(StringIO(text))
@classmethod
@@ -134,9 +183,45 @@ class SchemaBase(abc.ABC):
pass
@classmethod
def from_bytes(cls, data: bytes):
def from_bytes(cls, data: bytes) -> Self:
return cls.from_str(data.decode("utf8"))
@classmethod
def from_llsd(cls, inv_dict: Dict, flavor: str = "legacy") -> Self:
fields = cls._get_fields_dict(llsd_flavor=flavor)
obj_dict = {}
try:
for key, val in inv_dict.items():
if key in fields:
field = fields[key]
key = field.name
spec = field.metadata.get("spec")
# Not a real key, an internal var on our dataclass
if not spec:
LOG.warning(f"Internal key {key!r}")
continue
spec_cls = spec
if not inspect.isclass(spec_cls):
spec_cls = spec_cls.__class__
# some kind of nested structure like sale_info
if issubclass(spec_cls, SchemaBase):
obj_dict[key] = spec.from_llsd(val, flavor)
elif issubclass(spec_cls, SchemaFieldSerializer):
obj_dict[key] = spec.from_llsd(val, flavor)
else:
raise ValueError(f"Unsupported spec for {key!r}, {spec!r}")
else:
if flavor != "ais":
# AIS has a number of different fields that are irrelevant depending on
# what exactly sent the payload
LOG.warning(f"Unknown key {key!r}")
except:
LOG.error(f"Failed to parse inventory schema: {inv_dict!r}")
raise
return cls._obj_from_dict(obj_dict)
def to_bytes(self) -> bytes:
return self.to_str().encode("utf8")
@@ -146,10 +231,36 @@ class SchemaBase(abc.ABC):
writer.seek(0)
return writer.read()
def to_llsd(self, flavor: str = "legacy"):
obj_dict = {}
for field_name, field in self._get_fields_dict(llsd_flavor=flavor).items():
spec = field.metadata.get("spec")
# Not meant to be serialized
if not spec:
continue
val = getattr(self, field.name)
if val is None:
continue
spec_cls = spec
if not inspect.isclass(spec_cls):
spec_cls = spec_cls.__class__
# Some kind of nested structure like sale_info
if isinstance(val, SchemaBase):
val = val.to_llsd(flavor)
elif issubclass(spec_cls, SchemaFieldSerializer):
val = spec.to_llsd(val, flavor)
else:
raise ValueError(f"Bad inventory spec {spec!r}")
obj_dict[field_name] = val
return obj_dict
@abc.abstractmethod
def to_writer(self, writer: StringIO):
pass
@classmethod
def _obj_from_dict(cls, obj_dict: Dict):
def _obj_from_dict(cls, obj_dict: Dict) -> Self:
return cls(**obj_dict) # type: ignore

View File

@@ -15,6 +15,8 @@ CONSTRAINT_DATACLASS = se.ForwardSerializable(lambda: se.Dataclass(Constraint))
POSKEYFRAME_DATACLASS = se.ForwardSerializable(lambda: se.Dataclass(PosKeyframe))
ROTKEYFRAME_DATACLASS = se.ForwardSerializable(lambda: se.Dataclass(RotKeyframe))
JOINTS_DICT = OrderedMultiDict[str, "Joint"]
@dataclasses.dataclass
class Animation:
@@ -29,7 +31,7 @@ class Animation:
ease_in_duration: float = se.dataclass_field(se.F32)
ease_out_duration: float = se.dataclass_field(se.F32)
hand_pose: HandPose = se.dataclass_field(lambda: se.IntEnum(HandPose, se.U32), default=0)
joints: OrderedMultiDict[str, Joint] = se.dataclass_field(se.MultiDictAdapter(
joints: JOINTS_DICT = se.dataclass_field(se.MultiDictAdapter(
se.Collection(se.U32, se.Tuple(se.CStr(), JOINT_DATACLASS)),
))
constraints: List[Constraint] = se.dataclass_field(

View File

@@ -1,20 +1,27 @@
import calendar
import datetime
import struct
import typing
import uuid
import zlib
from llbase.llsd import *
from llsd import *
# So we can directly reference the original wrapper funcs where necessary
import llbase.llsd
import llsd as base_llsd
from llsd.base import is_string, is_unicode
from hippolyzer.lib.base.datatypes import *
class HippoLLSDBaseFormatter(llbase.llsd.LLSDBaseFormatter):
class HippoLLSDBaseFormatter(base_llsd.base.LLSDBaseFormatter):
UUID: callable
ARRAY: callable
BINARY: callable
def __init__(self):
super().__init__()
self.type_map[UUID] = self.UUID
self.type_map[JankStringyBytes] = self.BINARY
self.type_map[Vector2] = self.TUPLECOORD
self.type_map[Vector3] = self.TUPLECOORD
self.type_map[Vector4] = self.TUPLECOORD
@@ -24,44 +31,131 @@ class HippoLLSDBaseFormatter(llbase.llsd.LLSDBaseFormatter):
return self.ARRAY(v.data())
class HippoLLSDXMLFormatter(llbase.llsd.LLSDXMLFormatter, HippoLLSDBaseFormatter):
class HippoLLSDXMLFormatter(base_llsd.serde_xml.LLSDXMLFormatter, HippoLLSDBaseFormatter):
def __init__(self):
super().__init__()
def _generate(self, something):
if isinstance(something, int) and type(something) is not int:
# The lookup in the underlying library will fail if we don't convert IntEnums to actual ints.
something = int(something)
return super()._generate(something)
class HippoLLSDXMLPrettyFormatter(base_llsd.serde_xml.LLSDXMLPrettyFormatter, HippoLLSDBaseFormatter):
def __init__(self):
super().__init__()
class HippoLLSDXMLPrettyFormatter(llbase.llsd.LLSDXMLPrettyFormatter, HippoLLSDBaseFormatter):
def __init__(self):
super().__init__()
def format_pretty_xml(val: typing.Any):
def format_pretty_xml(val: typing.Any) -> bytes:
return HippoLLSDXMLPrettyFormatter().format(val)
def format_xml(val: typing.Any):
def format_xml(val: typing.Any) -> bytes:
return HippoLLSDXMLFormatter().format(val)
class HippoLLSDNotationFormatter(llbase.llsd.LLSDNotationFormatter, HippoLLSDBaseFormatter):
class HippoLLSDNotationFormatter(base_llsd.serde_notation.LLSDNotationFormatter, HippoLLSDBaseFormatter):
def __init__(self):
super().__init__()
def STRING(self, v):
# llbase's notation LLSD encoder isn't suitable for generating line-delimited
# LLSD because the string formatter leaves \n unencoded, unlike indra's llcommon.
# Add our own escaping rule.
return super().STRING(v).replace(b"\n", b"\\n")
def format_notation(val: typing.Any):
def format_notation(val: typing.Any) -> bytes:
return HippoLLSDNotationFormatter().format(val)
def format_binary(val: typing.Any, with_header=True):
val = llbase.llsd.format_binary(val)
if not with_header:
return val.split(b"\n", 1)[1]
def format_binary(val: typing.Any, with_header=True) -> bytes:
val = _format_binary_recurse(val)
if with_header:
return b'<?llsd/binary?>\n' + val
return val
class HippoLLSDBinaryParser(llbase.llsd.LLSDBinaryParser):
# This is copied almost wholesale from https://bitbucket.org/lindenlab/llbase/src/master/llbase/llsd.py
# With a few minor changes to make serialization round-trip correctly. It's evil.
def _format_binary_recurse(something) -> bytes:
"""Binary formatter workhorse."""
def _format_list(list_something):
array_builder = [b'[' + struct.pack('!i', len(list_something))]
for item in list_something:
array_builder.append(_format_binary_recurse(item))
array_builder.append(b']')
return b''.join(array_builder)
if something is None:
return b'!'
elif isinstance(something, LLSD):
return _format_binary_recurse(something.thing)
elif isinstance(something, bool):
if something:
return b'1'
else:
return b'0'
elif isinstance(something, int):
try:
return b'i' + struct.pack('!i', something)
except (OverflowError, struct.error) as exc:
raise LLSDSerializationError(str(exc), something)
elif isinstance(something, float):
try:
return b'r' + struct.pack('!d', something)
except SystemError as exc:
raise LLSDSerializationError(str(exc), something)
elif isinstance(something, uuid.UUID):
return b'u' + something.bytes
elif isinstance(something, (binary, JankStringyBytes)):
return b'b' + struct.pack('!i', len(something)) + something
elif is_string(something):
if is_unicode(something):
something = something.encode("utf8")
return b's' + struct.pack('!i', len(something)) + something
elif isinstance(something, uri):
return b'l' + struct.pack('!i', len(something)) + something.encode("utf8")
elif isinstance(something, datetime.datetime):
return b'd' + struct.pack('<d', something.timestamp())
elif isinstance(something, datetime.date):
seconds_since_epoch = calendar.timegm(something.timetuple())
return b'd' + struct.pack('<d', seconds_since_epoch)
elif isinstance(something, (list, tuple)):
return _format_list(something)
elif isinstance(something, dict):
map_builder = [b'{' + struct.pack('!i', len(something))]
for key, value in something.items():
if isinstance(key, str):
key = key.encode("utf8")
map_builder.append(b'k' + struct.pack('!i', len(key)) + key)
map_builder.append(_format_binary_recurse(value))
map_builder.append(b'}')
return b''.join(map_builder)
else:
try:
return _format_list(list(something))
except TypeError:
raise LLSDSerializationError(
"Cannot serialize unknown type: %s (%s)" %
(type(something), something))
class HippoLLSDBinaryParser(base_llsd.serde_binary.LLSDBinaryParser):
def __init__(self):
super().__init__()
self._dispatch[ord('u')] = lambda: UUID(bytes=self._getc(16))
self._dispatch[ord('d')] = self._parse_date
def _parse_date(self):
seconds = struct.unpack("<d", self._getc(8))[0]
try:
return datetime.datetime.fromtimestamp(seconds, tz=datetime.timezone.utc)
except OverflowError as exc:
# A garbage seconds value can cause utcfromtimestamp() to raise
# OverflowError: timestamp out of range for platform time_t
self._error(exc, -8)
def _parse_string(self):
# LLSD's C++ API lets you stuff binary in a string field even though it's only
@@ -74,22 +168,26 @@ class HippoLLSDBinaryParser(llbase.llsd.LLSDBinaryParser):
return bytes_val
# Python uses one, C++ uses the other, and everyone's unhappy.
_BINARY_HEADERS = (b'<? LLSD/Binary ?>', b'<?llsd/binary?>')
def parse_binary(data: bytes):
if data.startswith(b'<?llsd/binary?>'):
if any(data.startswith(x) for x in _BINARY_HEADERS):
data = data.split(b'\n', 1)[1]
return HippoLLSDBinaryParser().parse(data)
def parse_xml(data: bytes):
return llbase.llsd.parse_xml(data)
return base_llsd.parse_xml(data)
def parse_notation(data: bytes):
return llbase.llsd.parse_notation(data)
return base_llsd.parse_notation(data)
def zip_llsd(val: typing.Any):
return zlib.compress(format_binary(val, with_header=False))
return zlib.compress(format_binary(val, with_header=False), level=zlib.Z_BEST_COMPRESSION)
def unzip_llsd(data: bytes):
@@ -101,13 +199,13 @@ def parse(data: bytes):
# content-type is usually nonsense.
try:
data = data.lstrip()
if data.startswith(b'<?llsd/binary?>'):
if any(data.startswith(x) for x in _BINARY_HEADERS):
return parse_binary(data)
elif data.startswith(b'<'):
return parse_xml(data)
else:
return parse_notation(data)
except KeyError as e:
raise llbase.llsd.LLSDParseError('LLSD could not be parsed: %s' % (e,))
raise base_llsd.LLSDParseError('LLSD could not be parsed: %s' % (e,))
except TypeError as e:
raise llbase.llsd.LLSDParseError('Input stream not of type bytes. %s' % (e,))
raise base_llsd.LLSDParseError('Input stream not of type bytes. %s' % (e,))
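As a hedged illustration of how the wrappers above fit together (assuming this module is importable as hippolyzer.lib.base.llsd and behaves as the diff shows):

from hippolyzer.lib.base import llsd

val = {"name": "prim", "count": 2}
data = llsd.format_binary(val)          # b'<?llsd/binary?>\n' + binary payload
assert llsd.parse(data) == val          # parse() sniffs the binary header
assert llsd.parse_binary(data) == val   # either header spelling is stripped before parsing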

View File

@@ -11,21 +11,75 @@ from typing import *
import zlib
from copy import deepcopy
import numpy as np
import recordclass
from hippolyzer.lib.base import serialization as se
from hippolyzer.lib.base.datatypes import Vector3, Vector2, UUID, TupleCoord
from hippolyzer.lib.base.llsd import zip_llsd, unzip_llsd
from hippolyzer.lib.base.serialization import ParseContext
LOG = logging.getLogger(__name__)
def llsd_to_mat4(mat: Union[np.ndarray, Sequence[float]]) -> np.ndarray:
return np.array(mat).reshape((4, 4), order='F')
def mat4_to_llsd(mat: np.ndarray) -> List[float]:
return list(mat.flatten(order='F'))
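A quick illustrative round trip of the column-major ('F' order) helpers above:

mat = llsd_to_mat4(list(range(16)))   # 16 flat values -> 4x4 matrix, column-major
assert mat[0, 1] == 4                 # row 0, column 1 comes from flat index 4
assert mat4_to_llsd(mat) == list(range(16))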
@dataclasses.dataclass
class MeshAsset:
header: MeshHeaderDict = dataclasses.field(default_factory=dict)
segments: MeshSegmentDict = dataclasses.field(default_factory=dict)
raw_segments: Dict[str, bytes] = dataclasses.field(default_factory=dict)
@classmethod
def make_triangle(cls) -> MeshAsset:
"""Make an asset representing an un-rigged single-sided mesh triangle"""
inst = cls()
inst.header = {
"version": 1,
"high_lod": {"offset": 0, "size": 0},
"physics_mesh": {"offset": 0, "size": 0},
"physics_convex": {"offset": 0, "size": 0},
}
base_lod: LODSegmentDict = {
'Normal': [
Vector3(-0.0, -0.0, -1.0),
Vector3(-0.0, -0.0, -1.0),
Vector3(-0.0, -0.0, -1.0)
],
'PositionDomain': {'Max': [0.5, 0.5, 0.0], 'Min': [-0.5, -0.5, 0.0]},
'Position': [
Vector3(0.0, 0.0, 0.0),
Vector3(1.0, 0.0, 0.0),
Vector3(0.5, 1.0, 0.0)
],
'TexCoord0Domain': {'Max': [1.0, 1.0], 'Min': [0.0, 0.0]},
'TexCoord0': [
Vector2(0.0, 0.0),
Vector2(1.0, 0.0),
Vector2(0.5, 1.0)
],
'TriangleList': [[0, 1, 2]],
}
inst.segments['physics_mesh'] = [deepcopy(base_lod)]
inst.segments['high_lod'] = [deepcopy(base_lod)]
convex_segment: PhysicsConvexSegmentDict = {
'BoundingVerts': [
Vector3(-0.0, 1.0, -1.0),
Vector3(-1.0, -1.0, -1.0),
Vector3(1.0, -1.0, -1.0)
],
'Max': [0.5, 0.5, 0.0],
'Min': [-0.5, -0.5, 0.0]
}
inst.segments['physics_convex'] = convex_segment
return inst
def iter_lods(self) -> Generator[List[LODSegmentDict], None, None]:
for lod_name, lod_val in self.segments.items():
if lod_name.endswith("_lod"):
@@ -124,7 +178,7 @@ class DomainDict(TypedDict):
Min: List[float]
class VertexWeight(recordclass.datatuple): # type: ignore
class VertexWeight(recordclass.RecordClass):
"""Vertex weight for a specific joint on a specific vertex"""
# index of the joint within the joint_names list in the skin segment
joint_idx: int
@@ -135,20 +189,26 @@ class VertexWeight(recordclass.datatuple): # type: ignore
class SkinSegmentDict(TypedDict, total=False):
"""Rigging information"""
joint_names: List[str]
# model -> world transform matrix for model
# model -> world transform mat4 for model
bind_shape_matrix: List[float]
# world -> joint local transform matrices
# world -> joint local transform mat4s
inverse_bind_matrix: List[List[float]]
# offset matrices for joints, translation-only.
# Not sure what these are relative to, base joint or model <0,0,0>.
# Transform mat4s for the joint nodes themselves.
# The matrices may have scale or other components, but only the
# translation component will be used by the viewer.
# All translations are relative to the joint's parent.
alt_inverse_bind_matrix: List[List[float]]
lock_scale_if_joint_position: bool
pelvis_offset: float
class PhysicsConvexSegmentDict(DomainDict, total=False):
"""Data for convex hull collisions, populated by the client"""
# Min / Max domain vals are inline, unlike for LODs
"""
Data for convex hull collisions, populated by the client
Min / Max pos domain vals are inline, unlike for LODs, so this inherits from DomainDict
"""
# Indices into the Positions list
HullList: List[int]
# -1.0 - 1.0, dequantized from binary field of U16s
Positions: List[Vector3]
@@ -158,13 +218,13 @@ class PhysicsConvexSegmentDict(DomainDict, total=False):
class PhysicsHavokSegmentDict(TypedDict, total=False):
"""Cached data for Havok collisions, populated by sim and not used by client."""
HullMassProps: MassPropsDict
MOPP: MOPPDict
MeshDecompMassProps: MassPropsDict
HullMassProps: HavokMassPropsDict
MOPP: HavokMOPPDict
MeshDecompMassProps: HavokMassPropsDict
WeldingData: bytes
class MassPropsDict(TypedDict, total=False):
class HavokMassPropsDict(TypedDict, total=False):
# Vec, center of mass
CoM: List[float]
# 9 floats, Mat3?
@@ -173,7 +233,7 @@ class MassPropsDict(TypedDict, total=False):
volume: float
class MOPPDict(TypedDict, total=False):
class HavokMOPPDict(TypedDict, total=False):
"""Memory Optimized Partial Polytope"""
BuildType: int
MoppData: bytes
@@ -205,7 +265,6 @@ def positions_to_domain(positions: Iterable[TupleCoord], domain: DomainDict):
class VertexWeights(se.SerializableBase):
"""Serializer for a list of joint weights on a single vertex"""
INFLUENCE_SER = se.QuantizedFloat(se.U16, 0.0, 1.0)
INFLUENCE_LIMIT = 4
INFLUENCE_TERM = 0xFF
@@ -216,18 +275,30 @@ class VertexWeights(se.SerializableBase):
for val in vals:
joint_idx, influence = val
writer.write(se.U8, joint_idx)
writer.write(cls.INFLUENCE_SER, influence, ctx=ctx)
writer.write(se.U16, round(influence * 0xFFff), ctx=ctx)
if len(vals) != cls.INFLUENCE_LIMIT:
writer.write(se.U8, cls.INFLUENCE_TERM)
@classmethod
def deserialize(cls, reader: se.Reader, ctx=None):
# NOTE: normally you'd want to do something like arrange this into a nicely
# aligned byte array with zero padding so that you could vectorize the decoding.
# In cases where having a vertex with no weights is semantically equivalent to
# having a vertex whose weights are all 0.0, that's fine. That isn't the case
# in LL's implementation of mesh:
#
# https://bitbucket.org/lindenlab/viewer/src/d31a83fb946c49a38376ea3b312b5380d0c8c065/indra/llmath/llvolume.cpp#lines-2560:2628
#
# Consider the difference between handling of b"\x00\x00\x00\xFF" and b"\xFF" with the above logic.
# To simplify round-tripping while preserving those semantics, we don't do a vectorized decode.
# I had a vectorized numpy version, but those requirements made everything a bit of a mess.
influence_list = []
for _ in range(cls.INFLUENCE_LIMIT):
joint_idx = reader.read(se.U8)
joint_idx = reader.read_bytes(1)[0]
if joint_idx == cls.INFLUENCE_TERM:
break
influence_list.append(VertexWeight(joint_idx, reader.read(cls.INFLUENCE_SER, ctx=ctx)))
weight = reader.read(se.U16, ctx=ctx) / 0xFFff
influence_list.append(VertexWeight(joint_idx, weight))
return influence_list
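To make the b"\x00\x00\x00\xFF" vs. b"\xFF" distinction from the comment above concrete, here is a standalone sketch of the same per-vertex decode loop (assuming little-endian U16 weights, as the rest of this module uses):

def sketch_decode_weights(buf: bytes) -> list:
    """Toy re-implementation of the loop above, for illustration only."""
    weights, offset = [], 0
    for _ in range(4):                               # INFLUENCE_LIMIT
        joint_idx = buf[offset]; offset += 1
        if joint_idx == 0xFF:                        # INFLUENCE_TERM
            break
        raw = int.from_bytes(buf[offset:offset + 2], "little"); offset += 2
        weights.append((joint_idx, raw / 0xFFFF))
    return weights

assert sketch_decode_weights(b"\xff") == []                      # vertex with no weights at all
assert sketch_decode_weights(b"\x00\x00\x00\xff") == [(0, 0.0)]  # one explicit weight of 0.0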
@@ -262,16 +333,46 @@ class SegmentSerializer:
return new_segment
class VecListAdapter(se.Adapter):
def __init__(self, child_spec: se.SERIALIZABLE_TYPE, vec_type: Type):
super().__init__(child_spec)
self.vec_type = vec_type
def encode(self, val: Any, ctx: Optional[ParseContext]) -> Any:
return val
def decode(self, val: Any, ctx: Optional[ParseContext], pod: bool = False) -> Any:
new_vals = []
for elem in val:
new_vals.append(self.vec_type(*elem))
return new_vals
LE_U16: np.dtype = np.dtype(np.uint16).newbyteorder('<') # noqa
LOD_SEGMENT_SERIALIZER = SegmentSerializer({
# 16-bit indices to the verts making up the tri. Imposes a 16-bit
# upper limit on verts in any given material in the mesh.
"TriangleList": se.Collection(None, se.Collection(3, se.U16)),
"TriangleList": se.ExprAdapter(
se.NumPyArray(se.BytesGreedy(), LE_U16, 3),
decode_func=lambda x: x.tolist(),
),
# These are used to interpolate between values in their respective domains
# Each position represents a single vert.
"Position": se.Collection(None, se.Vector3U16(0.0, 1.0)),
"TexCoord0": se.Collection(None, se.Vector2U16(0.0, 1.0)),
# Normals have a static domain between -1 and 1
"Normal": se.Collection(None, se.Vector3U16(0.0, 1.0)),
"Position": VecListAdapter(
se.QuantizedNumPyArray(se.NumPyArray(se.BytesGreedy(), LE_U16, 3), 0.0, 1.0),
Vector3,
),
"TexCoord0": VecListAdapter(
se.QuantizedNumPyArray(se.NumPyArray(se.BytesGreedy(), LE_U16, 2), 0.0, 1.0),
Vector2,
),
# Normals have a static domain between -1 and 1, so we just use that rather than 0.0 - 1.0.
"Normal": VecListAdapter(
se.QuantizedNumPyArray(se.NumPyArray(se.BytesGreedy(), LE_U16, 3), -1.0, 1.0),
Vector3,
),
"Weights": se.Collection(None, VertexWeights)
})
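A short usage sketch of the triangle helper above (values come straight from make_triangle()):

tri = MeshAsset.make_triangle()
print(tri.header["version"])                        # 1
print(tri.segments["high_lod"][0]["TriangleList"])  # [[0, 1, 2]]
print(tri.segments["physics_convex"]["Max"])        # [0.5, 0.5, 0.0]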

View File

@@ -0,0 +1,182 @@
from __future__ import annotations
import copy
import dataclasses
import re
import weakref
from typing import *
import transformations
from lxml import etree
from hippolyzer.lib.base.datatypes import Vector3, RAD_TO_DEG
from hippolyzer.lib.base.helpers import get_resource_filename
from hippolyzer.lib.base.mesh import MeshAsset, SkinSegmentDict, llsd_to_mat4
MAYBE_JOINT_REF = Optional[str]
SKELETON_REF = Optional[Callable[[], "Skeleton"]]
@dataclasses.dataclass
class JointNode:
name: str
parent_name: MAYBE_JOINT_REF
skeleton: SKELETON_REF
translation: Vector3
pivot: Vector3 # pivot point for the joint, generally the same as translation
rotation: Vector3 # Euler rotation in degrees
scale: Vector3
type: str # bone or collision_volume
support: str
def __hash__(self):
return hash((self.name, self.type))
@property
def matrix(self):
return transformations.compose_matrix(
scale=tuple(self.scale),
angles=tuple(self.rotation / RAD_TO_DEG),
translate=tuple(self.translation),
)
@property
def parent(self) -> Optional[JointNode]:
if self.parent_name:
return self.skeleton()[self.parent_name]
return None
@property
def index(self) -> int:
bone_idx = 0
for node in self.skeleton().joint_dict.values():
if node.type != "bone":
continue
if self is node:
return bone_idx
bone_idx += 1
raise KeyError(f"{self.name!r} doesn't exist in skeleton")
@property
def ancestors(self) -> Sequence[JointNode]:
joint_node = self
skeleton = self.skeleton()
ancestors: List[JointNode] = []
while joint_node.parent_name:
joint_node = skeleton.joint_dict.get(joint_node.parent_name)
ancestors.append(joint_node)
return ancestors
@property
def children(self) -> Sequence[JointNode]:
children: List[JointNode] = []
for node in self.skeleton().joint_dict.values():
if node.parent_name and node.parent_name == self.name:
children.append(node)
return children
@property
def inverse(self) -> Optional[JointNode]:
l_re = re.compile(r"(.*?(?:_|\b))L((?:_|\b).*)")
r_re = re.compile(r"(.*?(?:_|\b))R((?:_|\b).*)")
inverse_name = None
if "Left" in self.name:
inverse_name = self.name.replace("Left", "Right")
elif "LEFT" in self.name:
inverse_name = self.name.replace("LEFT", "RIGHT")
elif l_re.match(self.name):
inverse_name = re.sub(l_re, r"\1R\2", self.name)
elif "Right" in self.name:
inverse_name = self.name.replace("Right", "Left")
elif "RIGHT" in self.name:
inverse_name = self.name.replace("RIGHT", "LEFT")
elif r_re.match(self.name):
inverse_name = re.sub(r_re, r"\1L\2", self.name)
if inverse_name:
return self.skeleton().joint_dict.get(inverse_name)
return None
@property
def descendents(self) -> Set[JointNode]:
descendents: Set[JointNode] = set()
ancestors: Set[str] = {self.name}
last_ancestors: Set[str] = set()
while last_ancestors != ancestors:
last_ancestors = ancestors.copy()
for node in self.skeleton().joint_dict.values():
if node.parent_name and node.parent_name in ancestors:
ancestors.add(node.name)
descendents.add(node)
return descendents
class Skeleton:
def __init__(self, root_node: Optional[etree.ElementBase] = None):
self.joint_dict: Dict[str, JointNode] = {}
if root_node is not None:
self._parse_node_children(root_node, None)
def __getitem__(self, item: str) -> JointNode:
return self.joint_dict[item]
def clone(self) -> Self:
val = copy.deepcopy(self)
skel_ref = weakref.ref(val)
for joint in val.joint_dict.values():
joint.skeleton = skel_ref
return val
def _parse_node_children(self, node: etree.ElementBase, parent_name: MAYBE_JOINT_REF):
name = node.get('name')
joint = JointNode(
name=name,
parent_name=parent_name,
skeleton=weakref.ref(self),
translation=_get_vec_attr(node, "pos", Vector3()),
pivot=_get_vec_attr(node, "pivot", Vector3()),
rotation=_get_vec_attr(node, "rot", Vector3()),
scale=_get_vec_attr(node, "scale", Vector3(1, 1, 1)),
support=node.get('support', 'base'),
type=node.tag,
)
self.joint_dict[name] = joint
for child in node.iterchildren():
self._parse_node_children(child, joint.name)
def merge_mesh_skeleton(self, mesh: MeshAsset) -> None:
"""Update this skeleton with a skeleton definition from a mesh asset"""
skin_seg: Optional[SkinSegmentDict] = mesh.segments.get('skin')
if not skin_seg:
return
for joint_name, matrix in zip(skin_seg['joint_names'], skin_seg.get('alt_inverse_bind_matrix', [])):
# We're only meant to use the translation component from the alt inverse bind matrix.
joint_decomp = transformations.decompose_matrix(llsd_to_mat4(matrix))
joint_node = self.joint_dict.get(joint_name)
if not joint_node:
continue
joint_node.translation = Vector3(*joint_decomp[3])
if pelvis_offset := skin_seg.get('pelvis_offset'):
# TODO: Should we even do this?
pelvis_node = self["mPelvis"]
pelvis_node.translation += Vector3(0, 0, pelvis_offset)
def _get_vec_attr(node, attr_name: str, default: Vector3) -> Vector3:
attr_val = node.get(attr_name, None)
if not attr_val:
return default
return Vector3(*(float(x) for x in attr_val.split(" ") if x))
def load_avatar_skeleton() -> Skeleton:
skel_path = get_resource_filename("lib/base/data/avatar_skeleton.xml")
with open(skel_path, 'r') as f:
skel_root = etree.fromstring(f.read())
return Skeleton(skel_root.getchildren()[0])
AVATAR_SKELETON = load_avatar_skeleton()
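A hedged usage sketch for the skeleton helpers above; the joint names are assumed to exist in the bundled avatar_skeleton.xml rather than guaranteed by this module:

skel = AVATAR_SKELETON.clone()            # clone() re-points each joint's weakref at the copy
wrist = skel["mWristLeft"]                # assumed joint name
print([j.name for j in wrist.ancestors])  # parent chain up to the root
print(wrist.inverse.name if wrist.inverse else None)  # mirrored joint, likely "mWristRight"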

View File

@@ -1,8 +1,12 @@
from __future__ import annotations
import abc
import asyncio
import copy
import dataclasses
import datetime as dt
import logging
from collections import deque
from typing import *
from typing import Optional
@@ -13,15 +17,32 @@ from .msgtypes import PacketFlags
from .udpserializer import UDPMessageSerializer
@dataclasses.dataclass
class ReliableResendInfo:
last_resent: dt.datetime
message: Message
completed: asyncio.Future = dataclasses.field(default_factory=asyncio.Future)
tries_left: int = 10
class Circuit:
def __init__(self, near_host: Optional[ADDR_TUPLE], far_host: ADDR_TUPLE, transport):
def __init__(
self,
near_host: Optional[ADDR_TUPLE],
far_host: ADDR_TUPLE,
transport: Optional[AbstractUDPTransport] = None,
):
self.near_host: Optional[ADDR_TUPLE] = near_host
self.host: ADDR_TUPLE = far_host
self.is_alive = True
self.transport: Optional[AbstractUDPTransport] = transport
self.transport = transport
self.serializer = UDPMessageSerializer()
self.last_packet_at = dt.datetime.now()
self.packet_id_base = 0
self.unacked_reliable: Dict[Tuple[Direction, int], ReliableResendInfo] = {}
self.resend_every: float = 3.0
# Reliable messages that we've already seen and handled, for resend suppression
self.seen_reliable: deque[int] = deque(maxlen=1_000)
def _send_prepared_message(self, message: Message, transport=None):
try:
@@ -31,6 +52,11 @@ class Circuit:
raise
return self.send_datagram(serialized, message.direction, transport=transport)
def disconnect(self):
self.packet_id_base = 0
self.unacked_reliable.clear()
self.is_alive = False
def send_datagram(self, data: bytes, direction: Direction, transport=None):
self.last_packet_at = dt.datetime.now()
src_addr, dst_addr = self.host, self.near_host
@@ -46,22 +72,74 @@ class Circuit:
raise RuntimeError(f"Trying to re-send finalized {message!r}")
message.packet_id = self.packet_id_base
self.packet_id_base += 1
if not message.acks:
message.send_flags &= PacketFlags.ACK
if message.acks:
message.send_flags |= PacketFlags.ACK
else:
message.send_flags &= ~PacketFlags.ACK
# If it was queued, it's not anymore
message.queued = False
message.finalized = True
return True
def send_message(self, message: Message, transport=None):
def send(self, message: Message, transport=None) -> UDPPacket:
if self.prepare_message(message):
# If the message originates from us then we're responsible for resends.
if message.reliable and message.synthetic and not transport:
self.unacked_reliable[(message.direction, message.packet_id)] = ReliableResendInfo(
last_resent=dt.datetime.now(),
message=message,
)
return self._send_prepared_message(message, transport)
def send_reliable(self, message: Message, transport=None) -> asyncio.Future:
"""send() wrapper that always sends reliably and allows `await`ing ACK receipt"""
if not message.synthetic:
raise ValueError("Not able to send non-synthetic message reliably!")
message.send_flags |= PacketFlags.RELIABLE
self.send(message, transport)
return self.unacked_reliable[(message.direction, message.packet_id)].completed
def collect_acks(self, message: Message):
effective_acks = list(message.acks)
if message.name == "PacketAck":
effective_acks.extend(x["ID"] for x in message["Packets"])
for ack in effective_acks:
resend_info = self.unacked_reliable.pop((~message.direction, ack), None)
if resend_info:
resend_info.completed.set_result(None)
def resend_unacked(self):
for resend_info in list(self.unacked_reliable.values()):
# Not time to attempt a resend yet
if dt.datetime.now() - resend_info.last_resent < dt.timedelta(seconds=self.resend_every):
continue
msg = copy.copy(resend_info.message)
resend_info.tries_left -= 1
# We were on our last try and we never received an ack
if not resend_info.tries_left:
logging.warning(f"Giving up on unacked {msg.packet_id}")
del self.unacked_reliable[(msg.direction, msg.packet_id)]
resend_info.completed.set_exception(TimeoutError("Exceeded resend limit"))
continue
resend_info.last_resent = dt.datetime.now()
msg.send_flags |= PacketFlags.RESENT
self._send_prepared_message(msg)
def send_acks(self, to_ack: Sequence[int], direction=Direction.OUT, packet_id=None):
logging.debug("%r acking %r" % (direction, to_ack))
# TODO: maybe tack this onto `.acks` for next message?
message = Message('PacketAck', *[Block('Packets', ID=x) for x in to_ack])
message.packet_id = packet_id
message.direction = direction
message.injected = True
self.send_message(message)
self.send(message)
def track_reliable(self, packet_id: int) -> bool:
"""Tracks a reliable packet, returning if it's a new message"""
if packet_id in self.seen_reliable:
return False
self.seen_reliable.append(packet_id)
return True
def __repr__(self):
return "<%s %r : %r>" % (self.__class__.__name__, self.near_host, self.host)

File diff suppressed because it is too large

View File

@@ -29,7 +29,10 @@ from hippolyzer.lib.base.message.msgtypes import MsgType
PACKER = Callable[[Any], bytes]
UNPACKER = Callable[[bytes], Any]
LLSD_PACKER = Callable[[Any], Any]
LLSD_UNPACKER = Callable[[Any], Any]
SPEC = Tuple[UNPACKER, PACKER]
LLSD_SPEC = Tuple[LLSD_UNPACKER, LLSD_PACKER]
def _pack_string(pack_string):
@@ -64,6 +67,21 @@ def _make_tuplecoord_spec(typ: Type[TupleCoord], struct_fmt: str,
return lambda x: typ(*struct_obj.unpack(x)), _packer
def _make_llsd_tuplecoord_spec(typ: Type[TupleCoord], needed_elems: Optional[int] = None):
if needed_elems is None:
# Number of elems needed matches the number in the coord type
def _packer(x):
return list(x)
else:
# Special case, we only want to pack some of the components.
# Mostly for Quaternion since we don't actually need to send W.
def _packer(x):
if not isinstance(x, TupleCoord):
x = typ(*x)
return list(x.data(needed_elems))
return lambda x: typ(*x), _packer
def _unpack_specs(cls):
cls.UNPACKERS = {k: v[0] for (k, v) in cls.SPECS.items()}
cls.PACKERS = {k: v[1] for (k, v) in cls.SPECS.items()}
@@ -78,7 +96,7 @@ class TemplateDataPacker:
MsgType.MVT_S8: _make_struct_spec('b'),
MsgType.MVT_U8: _make_struct_spec('B'),
MsgType.MVT_BOOL: _make_struct_spec('B'),
MsgType.MVT_LLUUID: (lambda x: UUID(bytes=bytes(x)), lambda x: x.bytes),
MsgType.MVT_LLUUID: (lambda x: UUID(bytes=bytes(x)), lambda x: UUID(x).bytes),
MsgType.MVT_IP_ADDR: (socket.inet_ntoa, socket.inet_aton),
MsgType.MVT_IP_PORT: _make_struct_spec('!H'),
MsgType.MVT_U16: _make_struct_spec('<H'),
@@ -110,10 +128,15 @@ class TemplateDataPacker:
class LLSDDataPacker(TemplateDataPacker):
# Some template var types aren't directly representable in LLSD, so they
# get encoded to binary fields.
SPECS = {
SPECS: Dict[MsgType, LLSD_SPEC] = {
MsgType.MVT_IP_ADDR: (socket.inet_ntoa, socket.inet_aton),
# LLSD ints are technically bound to S32 range.
MsgType.MVT_U32: _make_struct_spec('!I'),
MsgType.MVT_U64: _make_struct_spec('!Q'),
MsgType.MVT_S64: _make_struct_spec('!q'),
# These are arrays in LLSD, so we need to turn them into coords.
MsgType.MVT_LLVector3: _make_llsd_tuplecoord_spec(Vector3),
MsgType.MVT_LLVector3d: _make_llsd_tuplecoord_spec(Vector3),
MsgType.MVT_LLVector4: _make_llsd_tuplecoord_spec(Vector4),
MsgType.MVT_LLQuaternion: _make_llsd_tuplecoord_spec(Quaternion, needed_elems=3)
}
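For illustration, the MVT_IP_ADDR spec above simply round-trips dotted-quad strings through socket.inet_ntoa / inet_aton (the first element of each spec tuple is the unpacker, the second the packer):

unpack_ip, pack_ip = LLSDDataPacker.SPECS[MsgType.MVT_IP_ADDR]
assert pack_ip("127.0.0.1") == b"\x7f\x00\x00\x01"
assert unpack_ip(b"\x7f\x00\x00\x01") == "127.0.0.1"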

View File

@@ -5,14 +5,13 @@ from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.message.data_packer import LLSDDataPacker
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.message.template import MessageTemplateVariable
from hippolyzer.lib.base.message.template_dict import TemplateDictionary
from hippolyzer.lib.base.message.template_dict import TemplateDictionary, DEFAULT_TEMPLATE_DICT
VAR_PAIR = Tuple[dict, MessageTemplateVariable]
class LLSDMessageSerializer:
DEFAULT_TEMPLATE = TemplateDictionary()
DEFAULT_TEMPLATE = DEFAULT_TEMPLATE_DICT
def __init__(self, message_template=None, message_cls: Type[Message] = Message):
if message_template is not None:

View File

@@ -32,6 +32,7 @@ from typing import *
from hippolyzer.lib.base.datatypes import *
import hippolyzer.lib.base.serialization as se
import hippolyzer.lib.base.templates as templates
from hippolyzer.lib.base.datatypes import Pretty
from hippolyzer.lib.base.message.msgtypes import PacketFlags
from hippolyzer.lib.base.network.transport import Direction, ADDR_TUPLE
@@ -62,19 +63,20 @@ class Block:
Block expects a name, and kwargs for variables (var_name = value)
"""
__slots__ = ('name', 'size', 'vars', 'message_name', '_ser_cache', 'fill_missing',)
PARENT_MESSAGE_NAME: ClassVar[Optional[str]] = None
def __init__(self, name, /, *, fill_missing=False, **kwargs):
self.name = name
self.size = 0
self.message_name: Optional[str] = None
self.message_name: Optional[str] = self.PARENT_MESSAGE_NAME
self.vars: Dict[str, VAR_TYPE] = {}
self._ser_cache: Dict[str, Any] = {}
self.fill_missing = fill_missing
for var_name, val in kwargs.items():
self[var_name] = val
def get_variable(self, var_name):
return self.vars.get(var_name)
def get(self, var_name, default: Optional[VAR_TYPE] = None) -> Optional[VAR_TYPE]:
return self.vars.get(var_name, default)
def __contains__(self, item):
return item in self.vars
@@ -83,6 +85,9 @@ class Block:
return self.vars[name]
def __setitem__(self, key, value):
if isinstance(value, Pretty):
return self.serialize_var(key, value.value)
# These don't pickle well since they're likely to get hot-reloaded
if isinstance(value, (enum.IntEnum, enum.IntFlag)):
value = int(value)
@@ -181,9 +186,9 @@ class MsgBlockList(List["Block"]):
class Message:
__slots__ = ("name", "send_flags", "_packet_id", "acks", "body_boundaries", "queued",
__slots__ = ("name", "send_flags", "packet_id", "acks", "body_boundaries", "queued",
"offset", "raw_extra", "raw_body", "deserializer", "_blocks", "finalized",
"direction", "meta", "injected", "dropped", "sender")
"direction", "meta", "synthetic", "dropped", "sender", "unknown_message")
def __init__(self, name, *args, packet_id=None, flags=0, acks=None, direction=None):
# TODO: Do this on a timer or something.
@@ -191,10 +196,11 @@ class Message:
self.name = name
self.send_flags = flags
self._packet_id: Optional[int] = packet_id # aka, sequence number
self.packet_id: Optional[int] = packet_id # aka, sequence number
self.acks = acks if acks is not None else tuple()
self.body_boundaries = (-1, -1)
self.unknown_message = False
self.offset = 0
self.raw_extra = b""
self.direction: Direction = direction if direction is not None else Direction.OUT
@@ -208,26 +214,16 @@ class Message:
self.queued: bool = False
self._blocks: BLOCK_DICT = {}
self.meta = {}
self.injected = False
self.synthetic = packet_id is None
self.dropped = False
self.sender: Optional[ADDR_TUPLE] = None
self.add_blocks(args)
@property
def packet_id(self) -> Optional[int]:
return self._packet_id
@packet_id.setter
def packet_id(self, val: Optional[int]):
self._packet_id = val
# Changing packet ID clears the finalized flag
self.finalized = False
def add_blocks(self, block_list):
# can have a list of blocks if it is multiple or variable
for block in block_list:
if type(block) == list:
if type(block) is list:
for bl in block:
self.add_block(bl)
else:
@@ -271,7 +267,7 @@ class Message:
block.message_name = self.name
block.finalize()
def get_block(self, block_name: str, default=None, /) -> Optional[Block]:
def get_blocks(self, block_name: str, default=None, /) -> Optional[MsgBlockList]:
return self.blocks.get(block_name, default)
@property
@@ -293,10 +289,10 @@ class Message:
def ensure_parsed(self):
# This is a little magic, think about whether we want this.
if self.raw_body and self.deserializer():
if self.raw_body and self.deserializer and self.deserializer():
self.deserializer().parse_message_body(self)
def to_dict(self):
def to_dict(self, extended=False):
""" A dict representation of a message.
This is the form used for templated messages sent via EQ.
@@ -312,6 +308,18 @@ class Message:
new_vars[var_name] = val
dict_blocks.append(new_vars)
if extended:
base_repr.update({
"packet_id": self.packet_id,
"meta": self.meta.copy(),
"dropped": self.dropped,
"synthetic": self.synthetic,
"direction": self.direction.name,
"send_flags": int(self.send_flags),
"extra": self.extra,
"acks": self.acks,
})
return base_repr
@classmethod
@@ -321,6 +329,32 @@ class Message:
msg.create_block_list(block_type)
for block in blocks:
msg.add_block(Block(block_type, **block))
if 'packet_id' in dict_val:
# extended format
msg.packet_id = dict_val['packet_id']
msg.meta = dict_val['meta']
msg.dropped = dict_val['dropped']
msg.synthetic = dict_val['synthetic']
msg.direction = Direction[dict_val['direction']]
msg.send_flags = dict_val['send_flags']
msg.extra = dict_val['extra']
msg.acks = dict_val['acks']
return msg
@classmethod
def from_eq_event(cls, event) -> Message:
# If this isn't a templated message (like some EQ-only events are),
# then we wrap it in a synthetic `Message` so that the API for handling
# both EQ-only and templated message events can be the same. Ick.
msg = cls(event["message"])
if isinstance(event["body"], dict):
msg.add_block(Block("EventData", **event["body"]))
else:
# Shouldn't be any events that have anything other than a dict
# as a body, but just to be sure...
msg.add_block(Block("EventData", Data=event["body"]))
msg.synthetic = True
return msg
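A hedged sketch of the EQ wrapping above; the event name and body are made up, and indexing msg["EventData"] is assumed to return the block list, as the indexing elsewhere in this diff suggests:

event = {"message": "SomeEQOnlyEvent", "body": {"Serial": 1}}
msg = Message.from_eq_event(event)
assert msg.name == "SomeEQOnlyEvent"
assert msg.synthetic
assert msg["EventData"][0]["Serial"] == 1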
def invalidate_caches(self):
@@ -359,12 +393,16 @@ class Message:
message_copy = copy.deepcopy(self)
# Set the queued flag so the original will be dropped and acks will be sent
self.queued = True
if not self.finalized:
self.queued = True
# Original was dropped so let's make sure we have clean acks and packet id
message_copy.acks = tuple()
message_copy.send_flags &= ~PacketFlags.ACK
message_copy.packet_id = None
message_copy.dropped = False
message_copy.finalized = False
message_copy.queued = False
return message_copy
def to_summary(self):

View File

@@ -20,7 +20,7 @@ Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""
from logging import getLogger
from llbase import llsd
import llsd
from hippolyzer.lib.base.message.data import msg_details

View File

@@ -62,9 +62,16 @@ class HumanMessageSerializer:
continue
if first_line:
direction, message_name = line.split(" ", 1)
first_split = [x for x in line.split(" ") if x]
direction, message_name = first_split[:2]
options = [x.strip("[]") for x in first_split[2:]]
msg = Message(message_name)
msg.direction = Direction[direction.upper()]
for option in options:
if option in PacketFlags.__members__:
msg.send_flags |= PacketFlags[option]
elif re.match(r"^\d+$", option):
msg.send_flags |= int(option)
first_line = False
continue
@@ -137,9 +144,17 @@ class HumanMessageSerializer:
if msg.direction is not None:
string += f'{msg.direction.name} '
string += msg.name
flags = msg.send_flags
for poss_flag in iter(PacketFlags):
if flags & poss_flag:
flags &= ~poss_flag
string += f" [{poss_flag.name}]"
# Make sure flags with unknown meanings don't get lost
if flags:
string += f" [{int(flags)}]"
if msg.packet_id is not None:
string += f'\n# {msg.packet_id}: {PacketFlags(msg.send_flags)!r}'
string += f'{", DROPPED" if msg.dropped else ""}{", INJECTED" if msg.injected else ""}'
string += f'\n# ID: {msg.packet_id}'
string += f'{", DROPPED" if msg.dropped else ""}{", SYNTHETIC" if msg.synthetic else ""}'
if msg.extra:
string += f'\n# EXTRA: {msg.extra!r}'
string += '\n\n'
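For illustration, a hypothetical header line in this human-readable format might look like the following; bracketed names map back to PacketFlags members, and a bare bracketed integer preserves any flag bits without a known name:

OUT SomeMessage [RELIABLE] [4]
# ID: 12345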

View File

@@ -31,7 +31,8 @@ _T = TypeVar("_T")
_K = TypeVar("_K", bound=Hashable)
MESSAGE_HANDLER = Callable[[_T], Any]
PREDICATE = Callable[[_T], bool]
MESSAGE_NAMES = Iterable[_K]
# TODO: Can't do `Iterable[Union[_K, Literal["*"]]]` apparently?
MESSAGE_NAMES = Iterable[Union[_K, str]]
class MessageHandler(Generic[_T, _K]):
@@ -41,12 +42,11 @@ class MessageHandler(Generic[_T, _K]):
def register(self, message_name: _K) -> Event:
LOG.debug('Creating a monitor for %s' % message_name)
return self.handlers.setdefault(message_name, Event())
return self.handlers.setdefault(message_name, Event(message_name))
def subscribe(self, message_name: _K, handler: MESSAGE_HANDLER) -> Event:
def subscribe(self, message_name: Union[_K, Literal["*"]], handler: MESSAGE_HANDLER):
notifier = self.register(message_name)
notifier.subscribe(handler)
return notifier
def _subscribe_all(self, message_names: MESSAGE_NAMES, handler: MESSAGE_HANDLER,
predicate: Optional[PREDICATE] = None) -> List[Event]:
@@ -57,7 +57,7 @@ class MessageHandler(Generic[_T, _K]):
@contextlib.contextmanager
def subscribe_async(self, message_names: MESSAGE_NAMES, predicate: Optional[PREDICATE] = None,
take: Optional[bool] = None) -> ContextManager[Callable[[], Awaitable[_T]]]:
take: Optional[bool] = None) -> Generator[Callable[[], Awaitable[_T]], None, None]:
"""
Subscribe to a set of messages matching a predicate while within a block
@@ -92,6 +92,7 @@ class MessageHandler(Generic[_T, _K]):
finally:
for n in notifiers:
n.unsubscribe(_handler_wrapper)
return None
def wait_for(self, message_names: MESSAGE_NAMES, predicate: Optional[PREDICATE] = None,
timeout: Optional[float] = None, take: Optional[bool] = None) -> Awaitable[_T]:
@@ -107,12 +108,14 @@ class MessageHandler(Generic[_T, _K]):
take = self.take_by_default
notifiers = [self.register(name) for name in message_names]
fut = asyncio.get_event_loop().create_future()
loop = asyncio.get_event_loop_policy().get_event_loop()
fut = loop.create_future()
timeout_task = None
async def _canceller():
await asyncio.sleep(timeout)
fut.set_exception(asyncio.exceptions.TimeoutError("Timed out waiting for packet"))
if not fut.done():
fut.set_exception(asyncio.exceptions.TimeoutError("Timed out waiting for packet"))
for n in notifiers:
n.unsubscribe(_handler)
@@ -125,7 +128,8 @@ class MessageHandler(Generic[_T, _K]):
# Whatever was awaiting this future now owns this message
if take:
message = message.take()
fut.set_result(message)
if not fut.done():
fut.set_result(message)
# Make sure to unregister this handler for all message types
for n in notifiers:
n.unsubscribe(_handler)
@@ -142,7 +146,7 @@ class MessageHandler(Generic[_T, _K]):
# Always try to call wildcard handlers
self._handle_type('*', message)
def _handle_type(self, name: _K, message: _T):
def _handle_type(self, name: Union[_K, Literal["*"]], message: _T):
handler = self.handlers.get(name)
if not handler:
return
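A hedged usage sketch for wait_for() above; the message, block, and field names are placeholders rather than anything this module guarantees:

async def wait_for_sim_chat(handler: MessageHandler) -> None:
    msg = await handler.wait_for(
        ("ChatFromSimulator",),                                  # assumed message name
        predicate=lambda m: m["ChatData"][0]["ChatType"] != 0,   # assumed block / field names
        timeout=5.0,
    )
    print("matched", msg.name)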

View File

@@ -47,7 +47,6 @@ class MsgBlockType:
MBT_SINGLE = 0
MBT_MULTIPLE = 1
MBT_VARIABLE = 2
MBT_String_List = ['Single', 'Multiple', 'Variable']
class PacketFlags(enum.IntFlag):
@@ -55,6 +54,8 @@ class PacketFlags(enum.IntFlag):
RELIABLE = 0x40
RESENT = 0x20
ACK = 0x10
# Not a real flag, just used for display.
EQ = 1 << 10
# frequency for messages
@@ -62,28 +63,23 @@ class PacketFlags(enum.IntFlag):
# = '\xFF\xFF'
# = '\xFF'
# = ''
class MsgFrequency:
FIXED_FREQUENCY_MESSAGE = -1 # marking it
LOW_FREQUENCY_MESSAGE = 4
MEDIUM_FREQUENCY_MESSAGE = 2
HIGH_FREQUENCY_MESSAGE = 1
class MsgFrequency(enum.IntEnum):
FIXED = -1 # marking it
LOW = 4
MEDIUM = 2
HIGH = 1
class MsgTrust:
LL_NOTRUST = 0
LL_TRUSTED = 1
class MsgEncoding(enum.IntEnum):
UNENCODED = 0
ZEROCODED = 1
class MsgEncoding:
LL_UNENCODED = 0
LL_ZEROCODED = 1
class MsgDeprecation:
LL_DEPRECATED = 0
LL_UDPDEPRECATED = 1
LL_UDPBLACKLISTED = 2
LL_NOTDEPRECATED = 3
class MsgDeprecation(enum.IntEnum):
DEPRECATED = 0
UDPDEPRECATED = 1
UDPBLACKLISTED = 2
NOTDEPRECATED = 3
# message variable types

View File

@@ -21,7 +21,8 @@ Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
import typing
from .msgtypes import MsgType, MsgBlockType
from .msgtypes import MsgType, MsgBlockType, MsgFrequency
from ..datatypes import UUID
class MessageTemplateVariable:
@@ -36,7 +37,7 @@ class MessageTemplateVariable:
return f"{self.__class__.__name__}(name={self.name!r}, tp={self.type!r}, size={self.size!r})"
@property
def probably_binary(self):
def probably_binary(self) -> bool:
if self._probably_binary is not None:
return self._probably_binary
@@ -48,7 +49,7 @@ class MessageTemplateVariable:
return self._probably_binary
@property
def probably_text(self):
def probably_text(self) -> bool:
if self._probably_text is not None:
return self._probably_text
@@ -61,6 +62,32 @@ class MessageTemplateVariable:
self._probably_text = self._probably_text and self.name != "NameValue"
return self._probably_text
@property
def default_value(self):
if self.type.is_int:
return 0
elif self.type.is_float:
return 0.0
elif self.type == MsgType.MVT_LLUUID:
return UUID()
elif self.type == MsgType.MVT_BOOL:
return False
elif self.type == MsgType.MVT_VARIABLE:
if self.probably_binary:
return b""
if self.probably_text:
return ""
return b""
elif self.type in (MsgType.MVT_LLVector3, MsgType.MVT_LLVector3d, MsgType.MVT_LLQuaternion):
return 0.0, 0.0, 0.0
elif self.type == MsgType.MVT_LLVector4:
return 0.0, 0.0, 0.0, 0.0
elif self.type == MsgType.MVT_FIXED:
return b"\x00" * self.size
elif self.type == MsgType.MVT_IP_ADDR:
return "0.0.0.0"
return None
class MessageTemplateBlock:
def __init__(self, name):
@@ -70,49 +97,36 @@ class MessageTemplateBlock:
self.block_type: MsgBlockType = MsgBlockType.MBT_SINGLE
self.number = 0
def add_variable(self, var):
def add_variable(self, var: MessageTemplateVariable):
self.variable_map[var.name] = var
self.variables.append(var)
def get_variable(self, name):
def get_variable(self, name) -> MessageTemplateVariable:
return self.variable_map[name]
class MessageTemplate(object):
frequency_strings = {-1: 'fixed', 1: 'high', 2: 'medium', 4: 'low'} # strings for printout
deprecation_strings = ["Deprecated", "UDPDeprecated", "UDPBlackListed", "NotDeprecated"] # using _as_string methods
encoding_strings = ["Unencoded", "Zerocoded"] # etc
trusted_strings = ["Trusted", "NotTrusted"] # etc LDE 24oct2008
class MessageTemplate:
def __init__(self, name):
self.blocks: typing.List[MessageTemplateBlock] = []
self.block_map: typing.Dict[str, MessageTemplateBlock] = {}
# this is the function or object that will handle this type of message
self.received_count = 0
self.name = name
self.frequency = None
self.msg_num = 0
self.msg_freq_num_bytes = None
self.msg_trust = None
self.msg_deprecation = None
self.msg_encoding = None
self.frequency: typing.Optional[MsgFrequency] = None
self.num = 0
# Frequency + msg num as bytes
self.freq_num_bytes = None
self.trusted = False
self.deprecation = None
self.encoding = None
def add_block(self, block):
def add_block(self, block: MessageTemplateBlock):
self.block_map[block.name] = block
self.blocks.append(block)
def get_block(self, name):
def get_block(self, name) -> MessageTemplateBlock:
return self.block_map[name]
def get_msg_freq_num_len(self):
if self.frequency == -1:
if self.frequency == MsgFrequency.FIXED:
return 4
return self.frequency
def get_frequency_as_string(self):
return MessageTemplate.frequency_strings[self.frequency]
def get_deprecation_as_string(self):
return MessageTemplate.deprecation_strings[self.msg_deprecation]

View File

@@ -27,25 +27,35 @@ from .template import MessageTemplate
from .template_parser import MessageTemplateParser
DEFAULT_PARSER = MessageTemplateParser(msg_tmpl)
class TemplateDictionary:
"""the dictionary with all known templates"""
def __init__(self, template_list=None, message_template=None):
if template_list is None:
if message_template is None:
parser = MessageTemplateParser(msg_tmpl)
parser = DEFAULT_PARSER
else:
parser = MessageTemplateParser(message_template)
template_list = parser.message_templates
self.template_list: typing.List[MessageTemplate] = template_list
self.template_list: typing.List[MessageTemplate] = []
# maps name to template
self.message_templates = {}
self.message_templates: typing.Dict[str, MessageTemplate] = {}
# maps (freq,num) to template
self.message_dict = {}
self.load_templates(template_list)
def load_templates(self, template_list):
self.template_list.clear()
self.template_list.extend(template_list)
self.message_templates.clear()
self.message_dict.clear()
self.build_dictionaries(template_list)
self.build_message_ids()
@@ -58,32 +68,32 @@ class TemplateDictionary:
# do a mapping of type to a string for easier reference
frequency_str = ''
if template.frequency == MsgFrequency.FIXED_FREQUENCY_MESSAGE:
if template.frequency == MsgFrequency.FIXED:
frequency_str = "Fixed"
elif template.frequency == MsgFrequency.LOW_FREQUENCY_MESSAGE:
elif template.frequency == MsgFrequency.LOW:
frequency_str = "Low"
elif template.frequency == MsgFrequency.MEDIUM_FREQUENCY_MESSAGE:
elif template.frequency == MsgFrequency.MEDIUM:
frequency_str = "Medium"
elif template.frequency == MsgFrequency.HIGH_FREQUENCY_MESSAGE:
elif template.frequency == MsgFrequency.HIGH:
frequency_str = "High"
self.message_dict[(frequency_str,
template.msg_num)] = template
template.num)] = template
def build_message_ids(self):
for template in list(self.message_templates.values()):
frequency = template.frequency
num_bytes = None
if frequency == MsgFrequency.FIXED_FREQUENCY_MESSAGE:
if frequency == MsgFrequency.FIXED:
# have to do this because Fixed messages are stored as a long in the template
num_bytes = b'\xff\xff\xff' + struct.pack("B", template.msg_num)
elif frequency == MsgFrequency.LOW_FREQUENCY_MESSAGE:
num_bytes = b'\xff\xff' + struct.pack("!H", template.msg_num)
elif frequency == MsgFrequency.MEDIUM_FREQUENCY_MESSAGE:
num_bytes = b'\xff' + struct.pack("B", template.msg_num)
elif frequency == MsgFrequency.HIGH_FREQUENCY_MESSAGE:
num_bytes = struct.pack("B", template.msg_num)
template.msg_freq_num_bytes = num_bytes
num_bytes = b'\xff\xff\xff' + struct.pack("B", template.num)
elif frequency == MsgFrequency.LOW:
num_bytes = b'\xff\xff' + struct.pack("!H", template.num)
elif frequency == MsgFrequency.MEDIUM:
num_bytes = b'\xff' + struct.pack("B", template.num)
elif frequency == MsgFrequency.HIGH:
num_bytes = struct.pack("B", template.num)
template.freq_num_bytes = num_bytes
def get_template_by_name(self, template_name) -> typing.Optional[MessageTemplate]:
return self.message_templates.get(template_name)
@@ -99,3 +109,6 @@ class TemplateDictionary:
def __iter__(self):
return iter(self.template_list)
DEFAULT_TEMPLATE_DICT = TemplateDictionary()
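A hedged sketch of the per-frequency prefixes that build_message_ids() above produces (the message numbers here are made up):

import struct

def sketch_freq_num_bytes(frequency: MsgFrequency, num: int) -> bytes:
    if frequency == MsgFrequency.FIXED:
        return b"\xff\xff\xff" + struct.pack("B", num & 0xFF)
    elif frequency == MsgFrequency.LOW:
        return b"\xff\xff" + struct.pack("!H", num)
    elif frequency == MsgFrequency.MEDIUM:
        return b"\xff" + struct.pack("B", num)
    return struct.pack("B", num)  # HIGH

assert sketch_freq_num_bytes(MsgFrequency.LOW, 300) == b"\xff\xff\x01\x2c"
assert sketch_freq_num_bytes(MsgFrequency.HIGH, 1) == b"\x01"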

View File

@@ -22,7 +22,7 @@ import struct
import re
from . import template
from .msgtypes import MsgFrequency, MsgTrust, MsgEncoding
from .msgtypes import MsgFrequency, MsgEncoding
from .msgtypes import MsgDeprecation, MsgBlockType, MsgType
from ..exc import MessageTemplateParsingError, MessageTemplateNotFound
@@ -112,67 +112,69 @@ class MessageTemplateParser:
frequency = None
freq_str = match.group(2)
if freq_str == 'Low':
frequency = MsgFrequency.LOW_FREQUENCY_MESSAGE
frequency = MsgFrequency.LOW
elif freq_str == 'Medium':
frequency = MsgFrequency.MEDIUM_FREQUENCY_MESSAGE
frequency = MsgFrequency.MEDIUM
elif freq_str == 'High':
frequency = MsgFrequency.HIGH_FREQUENCY_MESSAGE
frequency = MsgFrequency.HIGH
elif freq_str == 'Fixed':
frequency = MsgFrequency.FIXED_FREQUENCY_MESSAGE
frequency = MsgFrequency.FIXED
new_template.frequency = frequency
msg_num = int(match.group(3), 0)
if frequency == MsgFrequency.FIXED_FREQUENCY_MESSAGE:
if frequency == MsgFrequency.FIXED:
# have to do this because Fixed messages are stored as a long in the template
msg_num &= 0xff
msg_num_bytes = struct.pack('!BBBB', 0xff, 0xff, 0xff, msg_num)
elif frequency == MsgFrequency.LOW_FREQUENCY_MESSAGE:
elif frequency == MsgFrequency.LOW:
msg_num_bytes = struct.pack('!BBH', 0xff, 0xff, msg_num)
elif frequency == MsgFrequency.MEDIUM_FREQUENCY_MESSAGE:
elif frequency == MsgFrequency.MEDIUM:
msg_num_bytes = struct.pack('!BB', 0xff, msg_num)
elif frequency == MsgFrequency.HIGH_FREQUENCY_MESSAGE:
elif frequency == MsgFrequency.HIGH:
msg_num_bytes = struct.pack('!B', msg_num)
else:
raise Exception("don't know about frequency %s" % frequency)
new_template.msg_num = msg_num
new_template.msg_freq_num_bytes = msg_num_bytes
new_template.num = msg_num
new_template.freq_num_bytes = msg_num_bytes
msg_trust = None
msg_trust_str = match.group(4)
if msg_trust_str == 'Trusted':
msg_trust = MsgTrust.LL_TRUSTED
msg_trust = True
elif msg_trust_str == 'NotTrusted':
msg_trust = MsgTrust.LL_NOTRUST
msg_trust = False
else:
raise ValueError(f"Invalid trust {msg_trust_str}")
new_template.msg_trust = msg_trust
new_template.trusted = msg_trust
msg_encoding = None
msg_encoding_str = match.group(5)
if msg_encoding_str == 'Unencoded':
msg_encoding = MsgEncoding.LL_UNENCODED
msg_encoding = MsgEncoding.UNENCODED
elif msg_encoding_str == 'Zerocoded':
msg_encoding = MsgEncoding.LL_ZEROCODED
msg_encoding = MsgEncoding.ZEROCODED
else:
raise ValueError(f"Invalid encoding {msg_encoding_str}")
new_template.msg_encoding = msg_encoding
new_template.encoding = msg_encoding
msg_dep = None
msg_dep_str = match.group(7)
if msg_dep_str:
if msg_dep_str == 'Deprecated':
msg_dep = MsgDeprecation.LL_DEPRECATED
msg_dep = MsgDeprecation.DEPRECATED
elif msg_dep_str == 'UDPDeprecated':
msg_dep = MsgDeprecation.LL_UDPDEPRECATED
msg_dep = MsgDeprecation.UDPDEPRECATED
elif msg_dep_str == 'UDPBlackListed':
msg_dep = MsgDeprecation.LL_UDPBLACKLISTED
msg_dep = MsgDeprecation.UDPBLACKLISTED
elif msg_dep_str == 'NotDeprecated':
msg_dep = MsgDeprecation.LL_NOTDEPRECATED
msg_dep = MsgDeprecation.NOTDEPRECATED
else:
msg_dep = MsgDeprecation.LL_NOTDEPRECATED
msg_dep = MsgDeprecation.NOTDEPRECATED
if msg_dep is None:
raise MessageTemplateParsingError("Unknown msg_dep field %s" % match.group(0))
new_template.msg_deprecation = msg_dep
new_template.deprecation = msg_dep
return new_template

View File

@@ -26,7 +26,7 @@ from logging import getLogger
from hippolyzer.lib.base.datatypes import JankStringyBytes
from hippolyzer.lib.base.settings import Settings
from .template import MessageTemplateVariable
from .template_dict import TemplateDictionary
from .template_dict import DEFAULT_TEMPLATE_DICT
from .msgtypes import MsgType, MsgBlockType, PacketLayout
from .data_packer import TemplateDataPacker
from .message import Message, Block
@@ -62,13 +62,13 @@ def _parse_msg_num(reader: se.BufferReader):
class UDPMessageDeserializer:
DEFAULT_TEMPLATE = TemplateDictionary()
DEFAULT_TEMPLATE = DEFAULT_TEMPLATE_DICT
def __init__(self, settings=None):
self.settings = settings or Settings()
self.template_dict = self.DEFAULT_TEMPLATE
def deserialize(self, msg_buff: bytes):
def deserialize(self, msg_buff: bytes) -> Message:
msg = self._parse_message_header(msg_buff)
if not self.settings.ENABLE_DEFERRED_PACKET_PARSING:
try:
@@ -85,6 +85,7 @@ class UDPMessageDeserializer:
reader = se.BufferReader("!", data)
msg: Message = Message("Placeholder")
msg.synthetic = False
msg.send_flags = reader.read(se.U8)
msg.packet_id = reader.read(se.U32)
@@ -125,8 +126,14 @@ class UDPMessageDeserializer:
frequency, num = _parse_msg_num(reader)
current_template = self.template_dict.get_template_by_pair(frequency, num)
if current_template is None:
raise exc.MessageTemplateNotFound("deserializing data")
msg.name = current_template.name
if self.settings.ALLOW_UNKNOWN_MESSAGES:
LOG.warning(f"Unknown message type {frequency}:{num}")
msg.unknown_message = True
msg.name = "UnknownMessage:%d" % num
else:
raise exc.MessageTemplateNotFound("deserializing data", f"{frequency}:{num}")
else:
msg.name = current_template.name
# extra field, see note regarding msg.offset
msg.raw_extra = reader.read_bytes(msg.offset)
@@ -142,6 +149,12 @@ class UDPMessageDeserializer:
# Already parsed if we don't have a raw body
if not raw_body:
return
if msg.unknown_message:
# We can't parse this, we don't know anything about it
msg.deserializer = None
return
msg.raw_body = None
msg.deserializer = None
@@ -156,7 +169,6 @@ class UDPMessageDeserializer:
reader.seek(current_template.get_msg_freq_num_len() + msg.offset)
for tmpl_block in current_template.blocks:
LOG.debug("Parsing %s:%s" % (msg.name, tmpl_block.name))
# EOF?
if not len(reader):
# Seems like even some "Single" blocks are optional?
@@ -179,7 +191,6 @@ class UDPMessageDeserializer:
for i in range(repeat_count):
current_block = Block(tmpl_block.name)
LOG.debug("Adding block %s" % current_block.name)
msg.add_block(current_block)
for tmpl_variable in tmpl_block.variables:
@@ -221,11 +232,17 @@ class UDPMessageDeserializer:
if tmpl_variable.probably_binary:
return unpacked_data
# Truncated strings need to be treated carefully
if tmpl_variable.probably_text and unpacked_data.endswith(b"\x00"):
try:
return unpacked_data.decode("utf8").rstrip("\x00")
except UnicodeDecodeError:
return JankStringyBytes(unpacked_data)
if tmpl_variable.probably_text:
# If it has a null terminator, let's try to decode it first.
# We don't want to do this if there isn't one, because that may change
# the meaning of the data.
if unpacked_data.endswith(b"\x00"):
try:
return unpacked_data.decode("utf8").rstrip("\x00")
except UnicodeDecodeError:
pass
# Failed, return jank stringy bytes
return JankStringyBytes(unpacked_data)
elif tmpl_variable.type in {MsgType.MVT_FIXED, MsgType.MVT_VARIABLE}:
# No idea if this should be bytes or a string... make an object that's sort of both.
return JankStringyBytes(unpacked_data)
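A standalone sketch of the "probably text" rules above, for illustration (the real code wraps the fallback in JankStringyBytes):

def sketch_decode_text_var(unpacked_data: bytes):
    if unpacked_data.endswith(b"\x00"):
        try:
            return unpacked_data.decode("utf8").rstrip("\x00")
        except UnicodeDecodeError:
            pass
    return unpacked_data  # JankStringyBytes(unpacked_data) in the real code

assert sketch_decode_text_var(b"Hello\x00") == "Hello"   # null-terminated, valid UTF-8
assert sketch_decode_text_var(b"Hello") == b"Hello"      # no terminator: left as bytes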

View File

@@ -26,7 +26,7 @@ from .data_packer import TemplateDataPacker
from .message import Message, MsgBlockList
from .msgtypes import MsgType, MsgBlockType
from .template import MessageTemplateVariable, MessageTemplateBlock
from .template_dict import TemplateDictionary
from .template_dict import TemplateDictionary, DEFAULT_TEMPLATE_DICT
from hippolyzer.lib.base import exc
from hippolyzer.lib.base import serialization as se
from hippolyzer.lib.base.datatypes import RawBytes
@@ -35,7 +35,7 @@ logger = getLogger('message.udpserializer')
class UDPMessageSerializer:
DEFAULT_TEMPLATE = TemplateDictionary(None)
DEFAULT_TEMPLATE = DEFAULT_TEMPLATE_DICT
def __init__(self, message_template=None):
if message_template is not None:
@@ -45,7 +45,7 @@ class UDPMessageSerializer:
def serialize(self, msg: Message):
current_template = self.template_dict.get_template_by_name(msg.name)
if current_template is None:
if current_template is None and msg.raw_body is None:
raise exc.MessageSerializationError("message name", "invalid message name")
# Header and trailers are all big-endian
@@ -69,13 +69,13 @@ class UDPMessageSerializer:
# frequency and message number. The template stores it because it doesn't
# change per template.
body_writer = se.BufferWriter("<")
body_writer.write_bytes(current_template.msg_freq_num_bytes)
body_writer.write_bytes(current_template.freq_num_bytes)
body_writer.write_bytes(msg.extra)
# We're going to pop off keys as we go, so shallow copy the dict.
blocks = copy.copy(msg.blocks)
missing_block = None
missing_blocks: List[MessageTemplateBlock] = []
# Iterate based on the order of the blocks in the message template
for tmpl_block in current_template.blocks:
block_list = blocks.pop(tmpl_block.name, None)
@@ -83,13 +83,21 @@ class UDPMessageSerializer:
# omitted by SL. Not an error unless another block containing data follows it.
# Keep track.
if block_list is None:
missing_block = tmpl_block.name
missing_blocks.append(tmpl_block)
logger.debug("No block %s, bailing out" % tmpl_block.name)
continue
# Had a missing block before, but we found one later in the template?
elif missing_block:
raise ValueError(f"Unexpected {tmpl_block.name} block after missing {missing_block}")
self._serialize_block(body_writer, tmpl_block, block_list)
# Had a missing block before, but we specified one defined later in the template?
elif missing_blocks:
if not all(x.block_type == MsgBlockType.MBT_VARIABLE for x in missing_blocks):
raise ValueError(f"Unexpected {tmpl_block.name} block after missing {missing_blocks!r}")
# This is okay, we just need to put empty blocks for all the variable blocks that came before.
# Normally we wouldn't emit these at all, to match SL behavior, but in this case we need the
# empty blocks so the decoder will decode these as the correct block type.
for missing_block in missing_blocks:
self._serialize_block_list(body_writer, missing_block, MsgBlockList())
missing_blocks.clear()
self._serialize_block_list(body_writer, tmpl_block, block_list)
if blocks:
raise KeyError(f"Unexpected {tuple(blocks.keys())!r} blocks in {msg.name}")
@@ -105,8 +113,8 @@ class UDPMessageSerializer:
writer.write(se.U8, len(msg.acks))
return writer.copy_buffer()
def _serialize_block(self, writer: se.BufferWriter, tmpl_block: MessageTemplateBlock,
block_list: MsgBlockList):
def _serialize_block_list(self, writer: se.BufferWriter, tmpl_block: MessageTemplateBlock,
block_list: MsgBlockList):
block_count = len(block_list)
# Multiple block type means there is a static number of blocks
if tmpl_block.block_type == MsgBlockType.MBT_MULTIPLE:

View File

@@ -82,8 +82,9 @@ CAPS_DICT = Union[
class CapsClient:
def __init__(self, caps: Optional[CAPS_DICT] = None):
def __init__(self, caps: Optional[CAPS_DICT] = None, session: Optional[aiohttp.ClientSession] = None) -> None:
self._caps = caps
self._session = session
def _request_fixups(self, cap_or_url: str, headers: Dict, proxy: Optional[bool], ssl: Any):
return cap_or_url, headers, proxy, ssl
@@ -117,6 +118,7 @@ class CapsClient:
session_owned = False
# Use an existing session if we have one to take advantage of connection pooling;
# otherwise create one.
session = session or self._session
if session is None:
session_owned = True
session = aiohttp.ClientSession(

View File

@@ -30,6 +30,7 @@ class UDPPacket:
self.dst_addr = dst_addr
self.data = data
self.direction = direction
self.meta = {}
@property
def outgoing(self):
@@ -45,6 +46,9 @@ class UDPPacket:
return self.dst_addr
return self.src_addr
def __repr__(self):
return f"<{self.__class__.__name__} src_addr={self.src_addr!r} dst_addr={self.dst_addr!r} data={self.data!r}>"
class AbstractUDPTransport(abc.ABC):
__slots__ = ()

View File

@@ -35,19 +35,14 @@ import hippolyzer.lib.base.serialization as se
import hippolyzer.lib.base.templates as tmpls
class Object(recordclass.datatuple): # type: ignore
__options__ = {
"use_weakref": True,
}
__weakref__: Any
class Object(recordclass.RecordClass, use_weakref=True): # type: ignore
LocalID: Optional[int] = None
State: Optional[int] = None
FullID: Optional[UUID] = None
CRC: Optional[int] = None
PCode: Optional[tmpls.PCode] = None
Material: Optional[tmpls.MCode] = None
ClickAction: Optional[int] = None
ClickAction: Optional[tmpls.ClickAction] = None
Scale: Optional[Vector3] = None
ParentID: Optional[int] = None
# Actually contains a weakref proxy
@@ -71,7 +66,7 @@ class Object(recordclass.datatuple): # type: ignore
ProfileBegin: Optional[int] = None
ProfileEnd: Optional[int] = None
ProfileHollow: Optional[int] = None
TextureEntry: Optional[tmpls.TextureEntry] = None
TextureEntry: Optional[tmpls.TextureEntryCollection] = None
TextureAnim: Optional[tmpls.TextureAnim] = None
NameValue: Optional[Any] = None
Data: Optional[Any] = None
@@ -130,12 +125,14 @@ class Object(recordclass.datatuple): # type: ignore
SitName: Optional[str] = None
TextureID: Optional[List[UUID]] = None
RegionHandle: Optional[int] = None
Animations: Optional[List[UUID]] = None
def __init__(self, **_kwargs):
""" set up the object attributes """
self.ExtraParams = self.ExtraParams or {} # Variable 1
self.ObjectCosts = self.ObjectCosts or {}
self.ChildIDs = []
self.Animations = self.Animations or []
# Same as parent, contains weakref proxies.
self.Children: List[Object] = []
@@ -199,6 +196,28 @@ class Object(recordclass.datatuple): # type: ignore
del val["Parent"]
return val
@property
def Ancestors(self) -> List[Object]:
obj = self
ancestors = []
while obj.Parent:
obj = obj.Parent
ancestors.append(obj)
return ancestors
@property
def Descendents(self) -> List[Object]:
new_children = [self]
descendents = []
while new_children:
to_check = new_children[:]
new_children.clear()
for obj in to_check:
for child in obj.Children:
new_children.append(child)
descendents.append(child)
return descendents
def handle_to_gridxy(handle: int) -> Tuple[int, int]:
return (handle >> 32) // 256, (handle & 0xFFffFFff) // 256
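# Illustrative inverse (not part of this diff), to make the arithmetic above concrete: a region
# handle packs the region's global meter coordinates into its high and low 32 bits, so dividing
# each half by 256 recovers grid coordinates.
def gridxy_to_handle(grid_x: int, grid_y: int) -> int:
    return ((grid_x * 256) << 32) | (grid_y * 256)
# e.g. handle_to_gridxy(gridxy_to_handle(1000, 1001)) == (1000, 1001)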
@@ -224,6 +243,7 @@ def normalize_object_update(block: Block, handle: int):
"NameValue": block.deserialize_var("NameValue", make_copy=False),
"TextureAnim": block.deserialize_var("TextureAnim", make_copy=False),
"ExtraParams": block.deserialize_var("ExtraParams", make_copy=False) or {},
"ClickAction": block.deserialize_var("ClickAction", make_copy=False),
"PSBlock": block.deserialize_var("PSBlock", make_copy=False).value,
"UpdateFlags": block.deserialize_var("UpdateFlags", make_copy=False),
"State": block.deserialize_var("State", make_copy=False),
@@ -236,7 +256,7 @@ def normalize_object_update(block: Block, handle: int):
# OwnerID is only set in this packet if a sound is playing. Don't allow
# ObjectUpdates to clobber _real_ OwnerIDs we had from ObjectProperties
# with a null UUID.
if object_data["OwnerID"] == UUID():
if object_data["OwnerID"] == UUID.ZERO:
del object_data["OwnerID"]
del object_data["Flags"]
del object_data["Gain"]
@@ -270,6 +290,9 @@ def normalize_object_update_compressed_data(data: bytes):
# Only used for determining which sections are present
del compressed["Flags"]
# Unlike other ObjectUpdate types, a null value in an ObjectUpdateCompressed
# always means that there is no value, not that the value hasn't changed
# from the client's view. Use the default value when that happens.
ps_block = compressed.pop("PSBlockNew", None)
if ps_block is None:
ps_block = compressed.pop("PSBlock", None)
@@ -278,6 +301,20 @@ def normalize_object_update_compressed_data(data: bytes):
compressed.pop("PSBlock", None)
if compressed["NameValue"] is None:
compressed["NameValue"] = NameValueCollection()
if compressed["Text"] is None:
compressed["Text"] = b""
compressed["TextColor"] = b""
if compressed["MediaURL"] is None:
compressed["MediaURL"] = b""
if compressed["AngularVelocity"] is None:
compressed["AngularVelocity"] = Vector3()
if compressed["SoundFlags"] is None:
compressed["SoundFlags"] = 0
compressed["SoundGain"] = 0.0
compressed["SoundRadius"] = 0.0
compressed["Sound"] = UUID.ZERO
if compressed["TextureEntry"] is None:
compressed["TextureEntry"] = tmpls.TextureEntryCollection()
object_data = {
"PSBlock": ps_block.value,
@@ -286,10 +323,10 @@ def normalize_object_update_compressed_data(data: bytes):
"LocalID": compressed.pop("ID"),
**compressed,
}
if object_data["TextureEntry"] is None:
object_data.pop("TextureEntry")
# Don't clobber OwnerID in case the object has a proper one.
if object_data["OwnerID"] == UUID():
# Don't clobber OwnerID in case the object has a proper one from
# a previous ObjectProperties. OwnerID isn't expected to be populated
# on ObjectUpdates unless an attached sound is playing.
if object_data["OwnerID"] == UUID.ZERO:
del object_data["OwnerID"]
return object_data
@@ -399,8 +436,8 @@ class FastObjectUpdateCompressedDataDeserializer:
"PCode": pcode,
"State": state,
"CRC": crc,
"Material": material,
"ClickAction": click_action,
"Material": tmpls.MCode(material),
"ClickAction": tmpls.ClickAction(click_action),
"Scale": scale,
"Position": pos,
"Rotation": rot,

View File

@@ -10,6 +10,7 @@ from io import SEEK_CUR, SEEK_SET, SEEK_END, RawIOBase, BufferedIOBase
from typing import *
import lazy_object_proxy
import numpy as np
import hippolyzer.lib.base.llsd as llsd
import hippolyzer.lib.base.datatypes as dtypes
@@ -27,6 +28,14 @@ class _Unserializable:
return False
class MissingType:
"""Simple sentinel type like dataclasses._MISSING_TYPE"""
pass
MISSING = MissingType()
UNSERIALIZABLE = _Unserializable()
_T = TypeVar("_T")
@@ -288,7 +297,7 @@ class SerializableBase(abc.ABC):
@classmethod
def default_value(cls) -> Any:
# None may be a valid default, so return MISSING as a sentinel val
return dataclasses.MISSING
return MISSING
class Adapter(SerializableBase, abc.ABC):
@@ -328,18 +337,18 @@ class ForwardSerializable(SerializableBase):
def __init__(self, func: Callable[[], SERIALIZABLE_TYPE]):
super().__init__()
self._func = func
self._wrapped = dataclasses.MISSING
self._wrapped: Union[MissingType, SERIALIZABLE_TYPE] = MISSING
def _ensure_evaled(self):
if self._wrapped is dataclasses.MISSING:
if self._wrapped is MISSING:
self._wrapped = self._func()
def __getattr__(self, attr):
return getattr(self._wrapped, attr)
def default_value(self) -> Any:
if self._wrapped is dataclasses.MISSING:
return dataclasses.MISSING
if self._wrapped is MISSING:
return MISSING
return self._wrapped.default_value()
def serialize(self, val, writer: BufferWriter, ctx: Optional[ParseContext]):
@@ -357,10 +366,10 @@ class Template(SerializableBase):
def __init__(self, template_spec: Dict[str, SERIALIZABLE_TYPE], skip_missing=False):
self._template_spec = template_spec
self._skip_missing = skip_missing
self._size = dataclasses.MISSING
self._size = MISSING
def calc_size(self):
if self._size is not dataclasses.MISSING:
if self._size is not MISSING:
return self._size
sum_bytes = 0
for _, field_type in self._template_spec.items():
@@ -830,7 +839,7 @@ class QuantizedFloat(QuantizedFloatBase):
super().__init__(prim_spec, zero_median=False)
self.lower = lower
self.upper = upper
# We know the range in `QuantizedFloat` when it's constructed, so we can infer
# We know the range in `QuantizedFloat` when it's constructed, so we can infer
# whether or not we should round towards zero in __init__
max_error = (upper - lower) * self.step_mag
midpoint = (upper + lower) / 2.0
@@ -1196,9 +1205,9 @@ class ContextMixin(Generic[_T]):
def _choose_option(self, ctx: Optional[ParseContext]) -> _T:
idx = self._fun(ctx)
if idx not in self._options:
if dataclasses.MISSING not in self._options:
if MISSING not in self._options:
raise KeyError(f"{idx!r} not found in {self._options!r}")
idx = dataclasses.MISSING
idx = MISSING
return self._options[idx]
@@ -1339,6 +1348,12 @@ class TypedBytesBase(SerializableBase, abc.ABC):
return self._spec.default_value()
class TypedBytesGreedy(TypedBytesBase):
def __init__(self, spec, empty_is_none=False, check_trailing_bytes=True, lazy=False):
self._bytes_tmpl = BytesGreedy()
super().__init__(spec, empty_is_none, check_trailing_bytes, lazy=lazy)
class TypedByteArray(TypedBytesBase):
def __init__(self, len_spec, spec, empty_is_none=False, check_trailing_bytes=True, lazy=False):
self._bytes_tmpl = ByteArray(len_spec)
@@ -1436,7 +1451,7 @@ class StringEnumAdapter(Adapter):
class FixedPoint(SerializableBase):
def __init__(self, ser_spec, int_bits, frac_bits, signed=False):
# Should never be used due to how this handles signs :/
assert(not ser_spec.is_signed)
assert (not ser_spec.is_signed)
self._ser_spec: SerializablePrimitive = ser_spec
self._signed = signed
@@ -1446,7 +1461,7 @@ class FixedPoint(SerializableBase):
self._min_val = ((1 << int_bits) * -1) if signed else 0
self._max_val = 1 << int_bits
assert(required_bits == (ser_spec.calc_size() * 8))
assert (required_bits == (ser_spec.calc_size() * 8))
def deserialize(self, reader: Reader, ctx):
fixed_val = float(self._ser_spec.deserialize(reader, ctx))
@@ -1476,8 +1491,8 @@ def _make_undefined_raiser():
return f
def dataclass_field(spec: Union[SERIALIZABLE_TYPE, Callable], *, default=dataclasses.MISSING,
default_factory=dataclasses.MISSING, init=True, repr=True, # noqa
def dataclass_field(spec: Union[SERIALIZABLE_TYPE, Callable], *, default: Any = dataclasses.MISSING,
default_factory: Any = dataclasses.MISSING, init=True, repr=True, # noqa
hash=None, compare=True) -> dataclasses.Field: # noqa
enrich_factory = False
# Lambda, need to defer evaluation of spec until it's actually used.
@@ -1498,7 +1513,7 @@ def dataclass_field(spec: Union[SERIALIZABLE_TYPE, Callable], *, default=datacla
metadata={"spec": spec}, default=default, default_factory=default_factory, init=init,
repr=repr, hash=hash, compare=compare
)
# Need to stuff this on so it knows which field went unspecified.
# Need to stuff this on, so it knows which field went unspecified.
if enrich_factory:
default_factory.field = field
return field
@@ -1565,8 +1580,16 @@ def bitfield_field(bits: int, *, adapter: Optional[Adapter] = None, default=0, i
class BitfieldDataclass(DataclassAdapter):
def __init__(self, data_cls: Type,
prim_spec: Optional[SerializablePrimitive] = None, shift: bool = True):
PRIM_SPEC: ClassVar[Optional[SerializablePrimitive]] = None
def __init__(self, data_cls: Optional[Type] = None,
prim_spec: Optional[SerializablePrimitive] = None, shift: Optional[bool] = None):
if not dataclasses.is_dataclass(data_cls):
raise ValueError(f"{data_cls!r} is not a dataclass")
if prim_spec is None:
prim_spec = getattr(data_cls, 'PRIM_SPEC', None)
if shift is None:
shift = getattr(data_cls, 'SHIFT', True)
super().__init__(data_cls, prim_spec)
self._shift = shift
self._bitfield_spec = self._build_bitfield(data_cls)
@@ -1596,7 +1619,9 @@ class BitfieldDataclass(DataclassAdapter):
class ExprAdapter(Adapter):
def __init__(self, child_spec: SERIALIZABLE_TYPE, decode_func: Callable, encode_func: Callable):
_ID = lambda x: x
def __init__(self, child_spec: SERIALIZABLE_TYPE, decode_func: Callable = _ID, encode_func: Callable = _ID):
super().__init__(child_spec)
self._decode_func = decode_func
self._encode_func = encode_func
@@ -1645,9 +1670,64 @@ class BinaryLLSD(SerializableBase):
writer.write_bytes(llsd.format_binary(val, with_header=False))
class NumPyArray(Adapter):
"""
A 2-dimensional, dynamic-length array of data from numpy. Greedy.
Unlike most other serializers, your endianness _must_ be specified in the dtype!
"""
__slots__ = ['dtype', 'elems']
def __init__(self, child_spec: Optional[SERIALIZABLE_TYPE], dtype: np.dtype, elems: int):
super().__init__(child_spec)
self.dtype = dtype
self.elems = elems
def _pick_dtype(self, endian: str) -> np.dtype:
return self.dtype.newbyteorder('>') if endian != "<" else self.dtype
def decode(self, val: Any, ctx: Optional[ParseContext], pod: bool = False) -> Any:
num_elems = len(val) // self.dtype.itemsize
num_ndims = num_elems // self.elems
buf_array = np.frombuffer(val, dtype=self.dtype, count=num_elems)
return buf_array.reshape((num_ndims, self.elems))
def encode(self, val, ctx: Optional[ParseContext]) -> Any:
val: np.ndarray = np.array(val, dtype=self.dtype).flatten()
return val.tobytes()
class QuantizedNumPyArray(Adapter):
"""Like QuantizedFloat. Only works correctly for unsigned types, no zero midpoint rounding!"""
def __init__(self, child_spec: NumPyArray, lower: float, upper: float):
super().__init__(child_spec)
self.dtype = child_spec.dtype
self.lower = lower
self.upper = upper
self.step_mag = 1.0 / ((2 ** (self.dtype.itemsize * 8)) - 1)
def encode(self, val: Any, ctx: Optional[ParseContext]) -> Any:
val = np.array(val, dtype=np.float64)
val = np.clip(val, self.lower, self.upper)
delta = self.upper - self.lower
if delta == 0.0:
return np.zeros(val.shape, dtype=self.dtype)
val -= self.lower
val /= delta
val /= self.step_mag
return np.rint(val).astype(self.dtype)
def decode(self, val: Any, ctx: Optional[ParseContext], pod: bool = False) -> Any:
val = val.astype(np.float64)
val *= self.step_mag
val *= self.upper - self.lower
val += self.lower
return val
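# Worked example for the two adapters above (illustrative; the dtype and bounds are arbitrary).
# With a little-endian uint16 dtype, step_mag is 1 / 65535, so 0.5 in [0.0, 1.0] encodes to
# 32768 and decodes back to roughly 0.500008. This calls the adapter methods directly rather
# than going through a full serialize round trip.
def _quantization_example() -> None:
    arr_spec = NumPyArray(None, np.dtype('<u2'), elems=3)
    quant = QuantizedNumPyArray(arr_spec, lower=0.0, upper=1.0)
    encoded = quant.encode([[0.0, 0.5, 1.0]], None)  # array([[0, 32768, 65535]], dtype=uint16)
    decoded = quant.decode(encoded, None)            # approx. [[0.0, 0.500008, 1.0]]
    assert decoded.shape == (1, 3)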
def subfield_serializer(msg_name, block_name, var_name):
def f(orig_cls):
global SUBFIELD_SERIALIZERS
SUBFIELD_SERIALIZERS[(msg_name, block_name, var_name)] = orig_cls
return orig_cls
return f
@@ -1844,7 +1924,7 @@ class IntEnumSubfieldSerializer(AdapterInstanceSubfieldSerializer):
val = super().deserialize(ctx_obj, val, pod=pod)
# Don't pretend we were able to deserialize this if we
# had to fall through to the `int` case.
if pod and type(val) == int:
if pod and type(val) is int:
return UNSERIALIZABLE
return val
@@ -1859,7 +1939,6 @@ class IntFlagSubfieldSerializer(AdapterInstanceSubfieldSerializer):
def http_serializer(msg_name):
def f(orig_cls):
global HTTP_SERIALIZERS
HTTP_SERIALIZERS[msg_name] = orig_cls
return orig_cls
return f

View File

@@ -55,6 +55,7 @@ class SettingDescriptor(Generic[_T]):
class Settings:
ENABLE_DEFERRED_PACKET_PARSING: bool = SettingDescriptor(True)
ALLOW_UNKNOWN_MESSAGES: bool = SettingDescriptor(True)
def __init__(self):
self._settings: Dict[str, Any] = {}

File diff suppressed because it is too large

View File

@@ -0,0 +1,45 @@
import asyncio
from typing import Any, Optional, List, Tuple
from hippolyzer.lib.base.message.circuit import Circuit, ConnectionHolder
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.message.message_handler import MessageHandler
from hippolyzer.lib.base.network.transport import AbstractUDPTransport, ADDR_TUPLE, UDPPacket
class MockTransport(AbstractUDPTransport):
def sendto(self, data: Any, addr: Optional[ADDR_TUPLE] = ...) -> None:
pass
def abort(self) -> None:
pass
def close(self) -> None:
pass
def __init__(self):
super().__init__()
self.packets: List[Tuple[bytes, Tuple[str, int]]] = []
def send_packet(self, packet: UDPPacket) -> None:
self.packets.append((packet.data, packet.dst_addr))
class MockHandlingCircuit(Circuit):
def __init__(self, handler: MessageHandler[Message, str]):
super().__init__(("127.0.0.1", 1), ("127.0.0.1", 2), None)
self.handler = handler
def _send_prepared_message(self, message: Message, transport=None):
loop = asyncio.get_event_loop_policy().get_event_loop()
loop.call_soon(self.handler.handle, message)
class MockConnectionHolder(ConnectionHolder):
def __init__(self, circuit, message_handler):
self.circuit = circuit
self.message_handler = message_handler
async def soon(awaitable) -> Message:
return await asyncio.wait_for(awaitable, timeout=1.0)
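# Hypothetical usage of the mocks above (the message name and wiring are illustrative):
async def example_wait_for_message() -> Message:
    handler: MessageHandler[Message, str] = MessageHandler(take_by_default=False)
    circuit = MockHandlingCircuit(handler)
    holder = MockConnectionHolder(circuit, handler)  # hand this to the code under test
    # Anything "sent" on the mock circuit is looped back into the handler asynchronously,
    # so a test can simply await it (with soon() enforcing a one-second timeout).
    return await soon(handler.wait_for(("TransferInfo",)))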

View File

@@ -8,8 +8,10 @@ import dataclasses
from typing import *
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.helpers import create_logged_task
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.message.circuit import ConnectionHolder
from hippolyzer.lib.base.message.msgtypes import PacketFlags
from hippolyzer.lib.base.templates import (
TransferRequestParamsBase,
TransferChannelType,
@@ -94,7 +96,7 @@ class TransferManager:
if params_dict.get("SessionID", dataclasses.MISSING) is None:
params.SessionID = self._session_id
self._connection_holder.circuit.send_message(Message(
self._connection_holder.circuit.send(Message(
'TransferRequest',
Block(
'TransferInfo',
@@ -104,9 +106,10 @@ class TransferManager:
Priority=priority,
Params_=params,
),
flags=PacketFlags.RELIABLE,
))
transfer = Transfer(transfer_id)
asyncio.create_task(self._pump_transfer_replies(transfer))
create_logged_task(self._pump_transfer_replies(transfer), "Transfer Pump")
return transfer
async def _pump_transfer_replies(self, transfer: Transfer):

View File

@@ -1,5 +1,5 @@
from PySide2.QtCore import QMetaObject
from PySide2.QtUiTools import QUiLoader
from PySide6.QtCore import QMetaObject
from PySide6.QtUiTools import QUiLoader
class UiLoader(QUiLoader):

View File

@@ -5,6 +5,7 @@ Body parts and linden clothing layers
from __future__ import annotations
import dataclasses
import enum
import logging
from io import StringIO
from typing import *
@@ -13,14 +14,77 @@ from xml.etree.ElementTree import parse as parse_etree
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.helpers import get_resource_filename
from hippolyzer.lib.base.legacy_inv import InventorySaleInfo, InventoryPermissions
from hippolyzer.lib.base.inventory import InventorySaleInfo, InventoryPermissions
from hippolyzer.lib.base.legacy_schema import SchemaBase, parse_schema_line, SchemaParsingError
import hippolyzer.lib.base.serialization as se
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.templates import WearableType
LOG = logging.getLogger(__name__)
_T = TypeVar("_T")
WEARABLE_VERSION = "LLWearable version 22"
DEFAULT_WEARABLE_TEX = UUID("c228d1cf-4b5d-4ba8-84f4-899a0796aa97")
class AvatarTEIndex(enum.IntEnum):
"""From llavatarappearancedefines.h"""
HEAD_BODYPAINT = 0
UPPER_SHIRT = enum.auto()
LOWER_PANTS = enum.auto()
EYES_IRIS = enum.auto()
HAIR = enum.auto()
UPPER_BODYPAINT = enum.auto()
LOWER_BODYPAINT = enum.auto()
LOWER_SHOES = enum.auto()
HEAD_BAKED = enum.auto()
UPPER_BAKED = enum.auto()
LOWER_BAKED = enum.auto()
EYES_BAKED = enum.auto()
LOWER_SOCKS = enum.auto()
UPPER_JACKET = enum.auto()
LOWER_JACKET = enum.auto()
UPPER_GLOVES = enum.auto()
UPPER_UNDERSHIRT = enum.auto()
LOWER_UNDERPANTS = enum.auto()
SKIRT = enum.auto()
SKIRT_BAKED = enum.auto()
HAIR_BAKED = enum.auto()
LOWER_ALPHA = enum.auto()
UPPER_ALPHA = enum.auto()
HEAD_ALPHA = enum.auto()
EYES_ALPHA = enum.auto()
HAIR_ALPHA = enum.auto()
HEAD_TATTOO = enum.auto()
UPPER_TATTOO = enum.auto()
LOWER_TATTOO = enum.auto()
HEAD_UNIVERSAL_TATTOO = enum.auto()
UPPER_UNIVERSAL_TATTOO = enum.auto()
LOWER_UNIVERSAL_TATTOO = enum.auto()
SKIRT_TATTOO = enum.auto()
HAIR_TATTOO = enum.auto()
EYES_TATTOO = enum.auto()
LEFT_ARM_TATTOO = enum.auto()
LEFT_LEG_TATTOO = enum.auto()
AUX1_TATTOO = enum.auto()
AUX2_TATTOO = enum.auto()
AUX3_TATTOO = enum.auto()
LEFTARM_BAKED = enum.auto()
LEFTLEG_BAKED = enum.auto()
AUX1_BAKED = enum.auto()
AUX2_BAKED = enum.auto()
AUX3_BAKED = enum.auto()
@property
def is_baked(self) -> bool:
return self.name.endswith("_BAKED")
class VisualParamGroup(enum.IntEnum):
TWEAKABLE = 0
ANIMATABLE = 1
TWEAKABLE_NO_TRANSMIT = 2
TRANSMIT_NOT_TWEAKABLE = 3
@dataclasses.dataclass
@@ -29,26 +93,48 @@ class VisualParam:
name: str
value_min: float
value_max: float
value_default: float
group: VisualParamGroup
# These might be `None` if the param isn't meant to be directly edited
edit_group: Optional[str]
wearable: Optional[str]
def dequantize_val(self, val: int) -> float:
"""Dequantize U8 values from AvatarAppearance messages"""
spec = se.QuantizedFloat(se.U8, self.value_min, self.value_max, False)
return spec.decode(val, None)
class VisualParams(List[VisualParam]):
def __init__(self):
def __init__(self, lad_path):
super().__init__()
lad_path = get_resource_filename("lib/base/data/avatar_lad.xml")
with open(lad_path, "rb") as f:
doc = parse_etree(f)
temp_params = []
for param in doc.findall(".//param"):
self.append(VisualParam(
temp_params.append(VisualParam(
id=int(param.attrib["id"]),
name=param.attrib["name"],
group=VisualParamGroup(int(param.get("group", "0"))),
edit_group=param.get("edit_group"),
wearable=param.get("wearable"),
value_min=float(param.attrib["value_min"]),
value_max=float(param.attrib["value_max"]),
value_default=float(param.attrib.get("value_default", 0.0))
))
# Some functionality relies on the list being sorted by ID, though there may be holes.
temp_params.sort(key=lambda x: x.id)
# Remove dupes, only using the last value present (matching indra behavior)
# This is necessary to remove the duplicate eye pop entry...
self.extend({x.id: x for x in temp_params}.values())
@property
def appearance_params(self) -> Iterator[VisualParam]:
for param in self:
if param.group not in (VisualParamGroup.TWEAKABLE, VisualParamGroup.TRANSMIT_NOT_TWEAKABLE):
continue
yield param
def by_name(self, name: str) -> VisualParam:
return [x for x in self if x.name == name][0]
@@ -59,8 +145,45 @@ class VisualParams(List[VisualParam]):
def by_wearable(self, wearable: str) -> List[VisualParam]:
return [x for x in self if x.wearable == wearable]
def by_id(self, vparam_id: int) -> VisualParam:
return [x for x in self if x.id == vparam_id][0]
VISUAL_PARAMS = VisualParams()
def parse_appearance_message(self, message: Message) -> Dict[int, float]:
params = {}
for param, value_block in zip(self.appearance_params, message["VisualParam"]):
params[param.id] = param.dequantize_val(value_block["ParamValue"])
return params
VISUAL_PARAMS = VisualParams(get_resource_filename("lib/base/data/avatar_lad.xml"))
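# Illustrative sketch (not part of this diff): pulling a single parameter out of an incoming
# AvatarAppearance message. "Height" is a parameter name from the stock avatar_lad.xml, and
# `appearance_msg` is assumed to be an AvatarAppearance Message.
def example_avatar_height(appearance_msg: Message) -> float:
    params = VISUAL_PARAMS.parse_appearance_message(appearance_msg)
    return params[VISUAL_PARAMS.by_name("Height").id]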
# See `llpaneleditwearable.cpp`; which TE slots should be set for each wearable type is hardcoded
# in the viewer.
WEARABLE_TEXTURE_SLOTS: Dict[WearableType, Sequence[AvatarTEIndex]] = {
WearableType.SHAPE: (),
WearableType.SKIN: (AvatarTEIndex.HEAD_BODYPAINT, AvatarTEIndex.UPPER_BODYPAINT, AvatarTEIndex.LOWER_BODYPAINT),
WearableType.HAIR: (AvatarTEIndex.HAIR,),
WearableType.EYES: (AvatarTEIndex.EYES_IRIS,),
WearableType.SHIRT: (AvatarTEIndex.UPPER_SHIRT,),
WearableType.PANTS: (AvatarTEIndex.LOWER_PANTS,),
WearableType.SHOES: (AvatarTEIndex.LOWER_SHOES,),
WearableType.SOCKS: (AvatarTEIndex.LOWER_SOCKS,),
WearableType.JACKET: (AvatarTEIndex.UPPER_JACKET, AvatarTEIndex.LOWER_JACKET),
WearableType.GLOVES: (AvatarTEIndex.UPPER_GLOVES,),
WearableType.UNDERSHIRT: (AvatarTEIndex.UPPER_UNDERSHIRT,),
WearableType.UNDERPANTS: (AvatarTEIndex.LOWER_UNDERPANTS,),
WearableType.SKIRT: (AvatarTEIndex.SKIRT,),
WearableType.ALPHA: (AvatarTEIndex.LOWER_ALPHA, AvatarTEIndex.UPPER_ALPHA,
AvatarTEIndex.HEAD_ALPHA, AvatarTEIndex.EYES_ALPHA, AvatarTEIndex.HAIR_ALPHA),
WearableType.TATTOO: (AvatarTEIndex.LOWER_TATTOO, AvatarTEIndex.UPPER_TATTOO, AvatarTEIndex.HEAD_TATTOO),
WearableType.UNIVERSAL: (AvatarTEIndex.HEAD_UNIVERSAL_TATTOO, AvatarTEIndex.UPPER_UNIVERSAL_TATTOO,
AvatarTEIndex.LOWER_UNIVERSAL_TATTOO, AvatarTEIndex.SKIRT_TATTOO,
AvatarTEIndex.HAIR_TATTOO, AvatarTEIndex.EYES_TATTOO, AvatarTEIndex.LEFT_ARM_TATTOO,
AvatarTEIndex.LEFT_LEG_TATTOO, AvatarTEIndex.AUX1_TATTOO, AvatarTEIndex.AUX2_TATTOO,
AvatarTEIndex.AUX3_TATTOO),
WearableType.PHYSICS: (),
}
@dataclasses.dataclass
@@ -71,7 +194,7 @@ class Wearable(SchemaBase):
sale_info: InventorySaleInfo
# VisualParam ID -> val
parameters: Dict[int, float]
# TextureEntry ID -> texture ID
# TextureEntry ID -> texture UUID
textures: Dict[int, UUID]
@classmethod
@@ -146,3 +269,22 @@ class Wearable(SchemaBase):
writer.write(f"textures {len(self.textures)}\n")
for te_id, texture_id in self.textures.items():
writer.write(f"{te_id} {texture_id}\n")
@classmethod
def make_default(cls, w_type: WearableType) -> Self:
instance = cls(
name="New " + w_type.name.replace("_", " ").title(),
permissions=InventoryPermissions.make_default(),
sale_info=InventorySaleInfo.make_default(),
parameters={},
textures={},
wearable_type=w_type,
)
for te_idx in WEARABLE_TEXTURE_SLOTS[w_type]:
instance.textures[te_idx] = DEFAULT_WEARABLE_TEX
for param in VISUAL_PARAMS.by_wearable(w_type.name.lower()):
instance.parameters[param.id] = param.value_default
return instance
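# Illustrative usage of make_default above (not part of this diff):
#     shirt = Wearable.make_default(WearableType.SHIRT)
#     assert AvatarTEIndex.UPPER_SHIRT in shirt.textures  # SHIRT maps to the UPPER_SHIRT slot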

View File

@@ -9,9 +9,10 @@ import random
from typing import *
from hippolyzer.lib.base.datatypes import UUID, RawBytes
from hippolyzer.lib.base.helpers import create_logged_task
from hippolyzer.lib.base.message.data_packer import TemplateDataPacker
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.message.msgtypes import MsgType
from hippolyzer.lib.base.message.msgtypes import MsgType, PacketFlags
from hippolyzer.lib.base.network.transport import Direction
from hippolyzer.lib.base.message.circuit import ConnectionHolder
from hippolyzer.lib.base.templates import XferPacket, XferFilePath, AssetType, XferError
@@ -110,7 +111,7 @@ class XferManager:
direction: Direction = Direction.OUT,
) -> Xfer:
xfer_id = xfer_id if xfer_id is not None else random.getrandbits(64)
self._connection_holder.circuit.send_message(Message(
self._connection_holder.circuit.send(Message(
'RequestXfer',
Block(
'XferID',
@@ -125,7 +126,7 @@ class XferManager:
direction=direction,
))
xfer = Xfer(xfer_id, direction=direction, turbo=turbo)
asyncio.create_task(self._pump_xfer_replies(xfer))
create_logged_task(self._pump_xfer_replies(xfer), "Xfer Pump")
return xfer
async def _pump_xfer_replies(self, xfer: Xfer):
@@ -174,10 +175,11 @@ class XferManager:
to_ack = range(xfer.next_ackable, ack_max)
xfer.next_ackable = ack_max
for ack_id in to_ack:
self._connection_holder.circuit.send_message(Message(
self._connection_holder.circuit.send_reliable(Message(
"ConfirmXferPacket",
Block("XferID", ID=xfer.xfer_id, Packet=ack_id),
direction=xfer.direction,
flags=PacketFlags.RELIABLE,
))
xfer.chunks[packet_id.PacketID] = packet_data
@@ -216,7 +218,7 @@ class XferManager:
else:
inline_data = data
self._connection_holder.circuit.send_message(Message(
self._connection_holder.circuit.send(Message(
"AssetUploadRequest",
Block(
"AssetBlock",
@@ -225,7 +227,8 @@ class XferManager:
Tempfile=temp_file,
StoreLocal=store_local,
AssetData=inline_data,
)
),
flags=PacketFlags.RELIABLE
))
fut = asyncio.Future()
asyncio.create_task(self._pump_asset_upload(xfer, transaction_id, fut))
@@ -267,17 +270,19 @@ class XferManager:
xfer.xfer_id = request_msg["XferID"]["ID"]
packet_id = 0
# TODO: No resend yet. If it's lost, it's lost.
while xfer.chunks:
chunk = xfer.chunks.pop(packet_id)
# EOF if there are no chunks left
packet_val = XferPacket(PacketID=packet_id, IsEOF=not bool(xfer.chunks))
self._connection_holder.circuit.send_message(Message(
# We just send reliably since I don't care to implement the Xfer-specific
# resend-on-unacked nastiness
_ = self._connection_holder.circuit.send_reliable(Message(
"SendXferPacket",
Block("XferID", ID=xfer.xfer_id, Packet_=packet_val),
Block("DataPacket", Data=chunk),
# Send this towards the sender of the RequestXfer
direction=~request_msg.direction,
flags=PacketFlags.RELIABLE,
))
# Don't care about the value, just want to know it was confirmed.
if wait_for_confirm:

View File

@@ -0,0 +1,127 @@
from typing import NamedTuple, Union, Optional, List
import hippolyzer.lib.base.serialization as se
from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.mesh import MeshAsset, LLMeshSerializer
from hippolyzer.lib.base.templates import AssetType
from hippolyzer.lib.client.state import BaseClientRegion
class UploadError(Exception):
pass
class UploadToken(NamedTuple):
linden_cost: int
uploader_url: str
payload: bytes
class MeshUploadDetails(NamedTuple):
mesh_bytes: bytes
num_faces: int
class AssetUploader:
def __init__(self, region: BaseClientRegion):
self._region = region
async def initiate_asset_upload(self, name: str, asset_type: AssetType,
body: bytes, flags: Optional[int] = None) -> UploadToken:
payload = {
"asset_type": asset_type.to_lookup_name(),
"description": "(No Description)",
"everyone_mask": 0,
"group_mask": 0,
"folder_id": UUID.ZERO, # Puts it in the default folder, I guess. Undocumented.
"inventory_type": asset_type.inventory_type.to_lookup_name(),
"name": name,
"next_owner_mask": 581632,
}
if flags is not None:
payload['flags'] = flags
resp_payload = await self._make_newfileagentinventory_req(payload)
return UploadToken(resp_payload["upload_price"], resp_payload["uploader"], body)
async def _make_newfileagentinventory_req(self, payload: dict):
async with self._region.caps_client.post("NewFileAgentInventory", llsd=payload) as resp:
resp.raise_for_status()
resp_payload = await resp.read_llsd()
# Need to sniff the resp payload for this because SL sends a 200 status code on error
if "error" in resp_payload:
raise UploadError(resp_payload)
return resp_payload
async def complete_upload(self, token: UploadToken) -> dict:
async with self._region.caps_client.post(token.uploader_url, data=token.payload) as resp:
resp.raise_for_status()
resp_payload = await resp.read_llsd()
# The actual upload endpoints return 200 on error, so we have to sniff the payload to figure
# out if it actually failed...
if "error" in resp_payload:
raise UploadError(resp_payload)
await self._handle_upload_complete(resp_payload)
return resp_payload
async def _handle_upload_complete(self, resp_payload: dict):
"""
Generic hook called when any asset upload completes.
Could trigger an AIS fetch to send the viewer details about the item we just created,
assuming we were in proxy context.
"""
pass
# The mesh upload flow is a little special, so it gets its own method
async def initiate_mesh_upload(self, name: str, mesh: Union[MeshUploadDetails, MeshAsset],
flags: Optional[int] = None) -> UploadToken:
if isinstance(mesh, MeshAsset):
writer = se.BufferWriter("!")
writer.write(LLMeshSerializer(), mesh)
mesh = MeshUploadDetails(writer.copy_buffer(), len(mesh.segments['high_lod']))
asset_resources = self._build_asset_resources(name, [mesh])
payload = {
'asset_resources': asset_resources,
'asset_type': 'mesh',
'description': '(No Description)',
'everyone_mask': 0,
'folder_id': UUID.ZERO,
'group_mask': 0,
'inventory_type': 'object',
'name': name,
'next_owner_mask': 581632,
'texture_folder_id': UUID.ZERO
}
if flags is not None:
payload['flags'] = flags
resp_payload = await self._make_newfileagentinventory_req(payload)
upload_body = llsd.format_xml(asset_resources)
return UploadToken(resp_payload["upload_price"], resp_payload["uploader"], upload_body)
def _build_asset_resources(self, name: str, meshes: List[MeshUploadDetails]) -> dict:
instances = []
for mesh in meshes:
instances.append({
'face_list': [{
'diffuse_color': [1.0, 1.0, 1.0, 1.0],
'fullbright': False
}] * mesh.num_faces,
'material': 3,
'mesh': 0,
'mesh_name': name,
'physics_shape_type': 2,
'position': [0.0, 0.0, 0.0],
'rotation': [0.7071067690849304, 0.0, 0.0, 0.7071067690849304],
'scale': [1.0, 1.0, 1.0]
})
return {
'instance_list': instances,
'mesh_list': [mesh.mesh_bytes for mesh in meshes],
'metric': 'MUT_Unspecified',
'texture_list': []
}
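# Illustrative upload flow using the class above. `region` is assumed to be a connected
# BaseClientRegion and `j2c_bytes` a valid JPEG2000 texture; AssetType.TEXTURE is assumed
# to be defined in the templates module.
async def example_upload_texture(region: BaseClientRegion, j2c_bytes: bytes) -> dict:
    uploader = AssetUploader(region)
    token = await uploader.initiate_asset_upload("Example Texture", AssetType.TEXTURE, j2c_bytes)
    # token.linden_cost is the L$ price quoted by NewFileAgentInventory before committing
    return await uploader.complete_upload(token)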

View File

@@ -0,0 +1,778 @@
from __future__ import annotations
import asyncio
import hashlib
from importlib.metadata import version
import logging
import uuid
import weakref
import xmlrpc.client
from typing import *
import aiohttp
import multidict
from hippolyzer.lib.base.datatypes import Vector3, StringEnum
from hippolyzer.lib.base.helpers import proxify, get_resource_filename, create_logged_task
from hippolyzer.lib.base.message.circuit import Circuit
from hippolyzer.lib.base.message.llsd_msg_serializer import LLSDMessageSerializer
from hippolyzer.lib.base.message.message import Message, Block
from hippolyzer.lib.base.message.message_dot_xml import MessageDotXML
from hippolyzer.lib.base.message.message_handler import MessageHandler
from hippolyzer.lib.base.message.udpdeserializer import UDPMessageDeserializer
from hippolyzer.lib.base.network.caps_client import CapsClient, CAPS_DICT
from hippolyzer.lib.base.network.transport import ADDR_TUPLE, Direction, SocketUDPTransport, AbstractUDPTransport
from hippolyzer.lib.base.settings import Settings, SettingDescriptor
from hippolyzer.lib.base.templates import RegionHandshakeReplyFlags, ChatType, ThrottleData
from hippolyzer.lib.base.transfer_manager import TransferManager
from hippolyzer.lib.base.xfer_manager import XferManager
from hippolyzer.lib.client.asset_uploader import AssetUploader
from hippolyzer.lib.client.inventory_manager import InventoryManager
from hippolyzer.lib.client.object_manager import ClientObjectManager, ClientWorldObjectManager
from hippolyzer.lib.client.parcel_manager import ParcelManager
from hippolyzer.lib.client.state import BaseClientSession, BaseClientRegion, BaseClientSessionManager
LOG = logging.getLogger(__name__)
class StartLocation(StringEnum):
LAST = "last"
HOME = "home"
class ClientSettings(Settings):
SSL_VERIFY: bool = SettingDescriptor(False)
"""Off by default for now, the cert validation is a big mess due to LL using an internal CA."""
SSL_CERT_PATH: str = SettingDescriptor(get_resource_filename("lib/base/network/data/ca-bundle.crt"))
USER_AGENT: str = SettingDescriptor(f"Hippolyzer/v{version('hippolyzer')}")
SEND_AGENT_UPDATES: bool = SettingDescriptor(True)
"""Generally you want to send these, lots of things will break if you don't send at least one."""
AUTO_REQUEST_PARCELS: bool = SettingDescriptor(True)
"""Automatically request all parcel details when connecting to a region"""
AUTO_REQUEST_MATERIALS: bool = SettingDescriptor(True)
"""Automatically request all materials when connecting to a region"""
class HippoCapsClient(CapsClient):
def __init__(
self,
settings: ClientSettings,
caps: Optional[CAPS_DICT] = None,
session: Optional[aiohttp.ClientSession] = None,
) -> None:
super().__init__(caps, session)
self._settings = settings
def _request_fixups(self, cap_or_url: str, headers: Dict, proxy: Optional[bool], ssl: Any):
headers["User-Agent"] = self._settings.USER_AGENT
return cap_or_url, headers, proxy, self._settings.SSL_VERIFY
class HippoClientProtocol(asyncio.DatagramProtocol):
def __init__(self, session: HippoClientSession):
self.session = proxify(session)
self.message_xml = MessageDotXML()
self.deserializer = UDPMessageDeserializer(
settings=self.session.session_manager.settings,
)
def datagram_received(self, data, source_addr: ADDR_TUPLE):
region = self.session.region_by_circuit_addr(source_addr)
if not region:
logging.warning("Received packet from invalid address %s", source_addr)
return
message = self.deserializer.deserialize(data)
message.direction = Direction.IN
message.sender = source_addr
if not self.message_xml.validate_udp_msg(message.name):
LOG.warning(
f"Received {message.name!r} over UDP, when it should come over the event queue. Discarding."
)
raise PermissionError(f"UDPBanned message {message.name}")
region.circuit.collect_acks(message)
should_handle = True
if message.reliable:
# This is a bit crap. We send an ACK immediately through a PacketAck.
# This is pretty wasteful, we should batch them up and send them on a timer.
# We should ACK even if it's a resend of something we've already handled, maybe
# they never got the ACK.
region.circuit.send_acks((message.packet_id,))
should_handle = region.circuit.track_reliable(message.packet_id)
try:
if should_handle:
self.session.message_handler.handle(message)
except:
LOG.exception("Failed in session message handler")
if should_handle:
region.message_handler.handle(message)
class HippoClientRegion(BaseClientRegion):
def __init__(self, circuit_addr, seed_cap: Optional[str], session: HippoClientSession, handle=None):
super().__init__()
self.caps = multidict.MultiDict()
self.message_handler: MessageHandler[Message, str] = MessageHandler(take_by_default=False)
self.circuit_addr = circuit_addr
self.handle = handle
if seed_cap:
self.caps["Seed"] = seed_cap
self.session: Callable[[], HippoClientSession] = weakref.ref(session)
self.caps_client = HippoCapsClient(session.session_manager.settings, self.caps, session.http_session)
self.xfer_manager = XferManager(proxify(self), self.session().secure_session_id)
self.transfer_manager = TransferManager(proxify(self), session.agent_id, session.id)
self.asset_uploader = AssetUploader(proxify(self))
self.parcel_manager = ParcelManager(proxify(self))
self.objects = ClientObjectManager(self)
self._llsd_serializer = LLSDMessageSerializer()
self._eq_task: Optional[asyncio.Task] = None
self.connected: asyncio.Future = asyncio.Future()
self.message_handler.subscribe("StartPingCheck", self._handle_ping_check)
def update_caps(self, caps: Mapping[str, str]) -> None:
self.caps.update(caps)
@property
def cap_urls(self) -> multidict.MultiDict:
return self.caps.copy()
async def connect(self, main_region: bool = False):
# Disconnect first if we're already connected
if self.circuit and self.circuit.is_alive:
self.disconnect()
if self.connected.done():
self.connected = asyncio.Future()
try:
# TODO: What happens if a circuit code is invalid, again? Does it just refuse to ACK?
await self.circuit.send_reliable(
Message(
"UseCircuitCode",
Block(
"CircuitCode",
Code=self.session().circuit_code,
SessionID=self.session().id,
ID=self.session().agent_id,
),
)
)
self.circuit.is_alive = True
# Clear out any old caps urls except the seed URL, we're about to fetch new caps.
seed_url = self.caps["Seed"]
self.caps.clear()
self.caps["Seed"] = seed_url
# Kick this off and await it later
seed_resp_fut = self.caps_client.post("Seed", llsd=list(self.session().session_manager.SUPPORTED_CAPS))
# Register first so we can handle it even if the ack happens after the message is sent
region_handshake_fut = self.message_handler.wait_for(("RegionHandshake",))
# If we're connecting to the main region, it won't even send us a RegionHandshake until we
# first send a CompleteAgentMovement.
if main_region:
await self.complete_agent_movement()
self.name = str((await region_handshake_fut)["RegionInfo"][0]["SimName"])
self.session().objects.track_region_objects(self.handle)
await self.circuit.send_reliable(
Message(
"RegionHandshakeReply",
Block("AgentData", AgentID=self.session().agent_id, SessionID=self.session().id),
Block(
"RegionInfo",
Flags=(
RegionHandshakeReplyFlags.SUPPORTS_SELF_APPEARANCE
| RegionHandshakeReplyFlags.VOCACHE_CULLING_ENABLED
)
)
)
)
await self.circuit.send_reliable(
Message(
"AgentThrottle",
Block(
"AgentData",
AgentID=self.session().agent_id,
SessionID=self.session().id,
CircuitCode=self.session().circuit_code,
),
Block(
"Throttle",
GenCounter=0,
# Reasonable defaults, I guess
Throttles_=ThrottleData(
resend=207360.0,
land=165376.0,
wind=33075.19921875,
cloud=33075.19921875,
task=682700.75,
texture=682700.75,
asset=269312.0
),
)
)
)
if self.session().session_manager.settings.SEND_AGENT_UPDATES:
# Usually we want to send at least one, since lots of messages will never be sent by the sim
# until we send at least one AgentUpdate. For example, ParcelOverlay and LayerData.
await self.circuit.send_reliable(
Message(
"AgentUpdate",
Block(
'AgentData',
AgentID=self.session().agent_id,
SessionID=self.session().id,
# Don't really care about the other fields.
fill_missing=True,
)
)
)
async with seed_resp_fut as seed_resp:
seed_resp.raise_for_status()
self.update_caps(await seed_resp.read_llsd())
self._eq_task = create_logged_task(self._poll_event_queue(), "EQ Poll")
settings = self.session().session_manager.settings
if settings.AUTO_REQUEST_PARCELS:
_ = create_logged_task(self.parcel_manager.request_dirty_parcels(), "Parcel Request")
if settings.AUTO_REQUEST_MATERIALS:
_ = create_logged_task(self.objects.request_all_materials(), "Request All Materials")
except Exception as e:
# Let consumers who were `await`ing the connected signal know there was an error
if not self.connected.done():
self.connected.set_exception(e)
raise
self.connected.set_result(None)
def disconnect(self) -> None:
"""Simulator has gone away, disconnect. Should be synchronous"""
if self._eq_task is not None:
self._eq_task.cancel()
self._eq_task = None
self.circuit.disconnect()
self.objects.clear()
if self.connected.done():
self.connected = asyncio.Future()
# TODO: cancel XFers and Transfers and whatnot
async def complete_agent_movement(self) -> None:
await self.circuit.send_reliable(
Message(
"CompleteAgentMovement",
Block(
"AgentData",
AgentID=self.session().agent_id,
SessionID=self.session().id,
CircuitCode=self.session().circuit_code
),
)
)
self.session().main_region = self
async def _poll_event_queue(self):
ack: Optional[int] = None
while True:
payload = {"ack": ack, "done": False}
try:
async with self.caps_client.post("EventQueueGet", llsd=payload) as resp:
if resp.status != 200:
await asyncio.sleep(0.1)
continue
polled = await resp.read_llsd()
for event in polled["events"]:
if self._llsd_serializer.can_handle(event["message"]):
msg = self._llsd_serializer.deserialize(event)
else:
msg = Message.from_eq_event(event)
msg.sender = self.circuit_addr
msg.direction = Direction.IN
self.session().message_handler.handle(msg)
self.message_handler.handle(msg)
ack = polled["id"]
await asyncio.sleep(0.001)
except aiohttp.client_exceptions.ServerDisconnectedError:
# This is expected to happen during long-polling, just pick up again where we left off.
await asyncio.sleep(0.001)
async def _handle_ping_check(self, message: Message):
self.circuit.send(
Message(
"CompletePingCheck",
Block("PingID", PingID=message["PingID"]["PingID"]),
)
)
class HippoClientSession(BaseClientSession):
"""Represents a client's view of a remote session"""
REGION_CLS = HippoClientRegion
region_by_handle: Callable[[int], Optional[HippoClientRegion]]
region_by_circuit_addr: Callable[[ADDR_TUPLE], Optional[HippoClientRegion]]
regions: List[HippoClientRegion]
session_manager: HippoClient
main_region: Optional[HippoClientRegion]
def __init__(self, id, secure_session_id, agent_id, circuit_code, session_manager: Optional[HippoClient] = None,
login_data=None):
super().__init__(id, secure_session_id, agent_id, circuit_code, session_manager, login_data=login_data)
self.http_session = session_manager.http_session
self.objects = ClientWorldObjectManager(proxify(self), session_manager.settings, None)
self.inventory = InventoryManager(proxify(self))
self.transport: Optional[SocketUDPTransport] = None
self.protocol: Optional[HippoClientProtocol] = None
self.message_handler.take_by_default = False
for msg_name in ("DisableSimulator", "CloseCircuit"):
self.message_handler.subscribe(msg_name, lambda msg: self.unregister_region(msg.sender))
for msg_name in ("TeleportFinish", "CrossedRegion", "EstablishAgentCommunication"):
self.message_handler.subscribe(msg_name, self._handle_register_region_message)
def register_region(self, circuit_addr: Optional[ADDR_TUPLE] = None, seed_url: Optional[str] = None,
handle: Optional[int] = None) -> HippoClientRegion:
return super().register_region(circuit_addr, seed_url, handle) # type:ignore
def unregister_region(self, circuit_addr: ADDR_TUPLE) -> None:
for i, region in enumerate(self.regions):
if region.circuit_addr == circuit_addr:
self.regions[i].disconnect()
del self.regions[i]
return
raise KeyError(f"No such region for {circuit_addr!r}")
def open_circuit(self, circuit_addr: ADDR_TUPLE):
for region in self.regions:
if region.circuit_addr == circuit_addr:
valid_circuit = False
if not region.circuit or not region.circuit.is_alive:
region.circuit = Circuit(("127.0.0.1", 0), circuit_addr, self.transport)
region.circuit.is_alive = False
valid_circuit = True
if region.circuit and region.circuit.is_alive:
# Whatever, already open
logging.debug("Tried to re-open circuit for %r" % (circuit_addr,))
valid_circuit = True
return valid_circuit
return False
def _handle_register_region_message(self, msg: Message):
# Handle events that inform us about new regions
sim_addr, sim_handle, sim_seed = None, None, None
moving_to_region = False
# Sim is asking us to talk to a neighbour
if msg.name == "EstablishAgentCommunication":
ip_split = msg["EventData"]["sim-ip-and-port"].split(":")
sim_addr = (ip_split[0], int(ip_split[1]))
sim_seed = msg["EventData"]["seed-capability"]
# We teleported or crossed regions; open comms to the new sim
elif msg.name in ("TeleportFinish", "CrossedRegion"):
sim_block = msg.get_blocks("RegionData", msg.get_blocks("Info"))[0]
sim_addr = (sim_block["SimIP"], sim_block["SimPort"])
sim_handle = sim_block["RegionHandle"]
sim_seed = sim_block["SeedCapability"]
moving_to_region = True
# Sim telling us about a neighbour
# elif msg.name == "EnableSimulator":
# sim_block = msg["SimulatorInfo"][0]
# sim_addr = (sim_block["IP"], sim_block["Port"])
# sim_handle = sim_block["Handle"]
# TODO: EnableSimulator is a little weird. It creates a region and establishes a
# circuit, but with no seed cap. The viewer will send UseCircuitCode and all that,
# but it's totally workable to just wait for an EstablishAgentCommunication to do that,
# since that's when the region actually shows up. I guess EnableSimulator just gives the
# viewer some lead time to set up the circuit before the region is actually shown through
# EstablishAgentCommunication? Either way, messing around with regions that don't have seed
# caps is annoying, so let's just not do it.
# Register a region if this message was telling us about a new one
if sim_addr is not None:
region = self.register_region(sim_addr, handle=sim_handle, seed_url=sim_seed)
# We can't actually connect without a sim seed, mind you; when we receive an EnableSimulator
# we have to wait for the EstablishAgentCommunication to actually connect.
need_connect = (region.circuit and region.circuit.is_alive) or moving_to_region
self.open_circuit(sim_addr)
if need_connect:
create_logged_task(region.connect(main_region=moving_to_region), "Region Connect")
elif moving_to_region:
# No need to connect, but we do need to complete agent movement.
create_logged_task(region.complete_agent_movement(), "CompleteAgentMovement")
class HippoClient(BaseClientSessionManager):
"""A simple client, only connects to one region at a time currently."""
SUPPORTED_CAPS: Set[str] = {
"AbuseCategories",
"AcceptFriendship",
"AcceptGroupInvite",
"AgentPreferences",
"AgentProfile",
"AgentState",
"AttachmentResources",
"AvatarPickerSearch",
"AvatarRenderInfo",
"CharacterProperties",
"ChatSessionRequest",
"CopyInventoryFromNotecard",
"CreateInventoryCategory",
"DeclineFriendship",
"DeclineGroupInvite",
"DispatchRegionInfo",
"DirectDelivery",
"EnvironmentSettings",
"EstateAccess",
"DispatchOpenRegionSettings",
"EstateChangeInfo",
"EventQueueGet",
"ExtEnvironment",
"FetchLib2",
"FetchLibDescendents2",
"FetchInventory2",
"FetchInventoryDescendents2",
"IncrementCOFVersion",
"InventoryAPIv3",
"LibraryAPIv3",
"InterestList",
"InventoryThumbnailUpload",
"GetDisplayNames",
"GetExperiences",
"AgentExperiences",
"FindExperienceByName",
"GetExperienceInfo",
"GetAdminExperiences",
"GetCreatorExperiences",
"ExperiencePreferences",
"GroupExperiences",
"UpdateExperience",
"IsExperienceAdmin",
"IsExperienceContributor",
"RegionExperiences",
"ExperienceQuery",
"GetMesh",
"GetMesh2",
"GetMetadata",
"GetObjectCost",
"GetObjectPhysicsData",
"GetTexture",
"GroupAPIv1",
"GroupMemberData",
"GroupProposalBallot",
"HomeLocation",
"LandResources",
"LSLSyntax",
"MapLayer",
"MapLayerGod",
"MeshUploadFlag",
"NavMeshGenerationStatus",
"NewFileAgentInventory",
"ObjectAnimation",
"ObjectMedia",
"ObjectMediaNavigate",
"ObjectNavMeshProperties",
"ParcelPropertiesUpdate",
"ParcelVoiceInfoRequest",
"ProductInfoRequest",
"ProvisionVoiceAccountRequest",
"ReadOfflineMsgs",
"RegionObjects",
"RemoteParcelRequest",
"RenderMaterials",
"RequestTextureDownload",
"ResourceCostSelected",
"RetrieveNavMeshSrc",
"SearchStatRequest",
"SearchStatTracking",
"SendPostcard",
"SendUserReport",
"SendUserReportWithScreenshot",
"ServerReleaseNotes",
"SetDisplayName",
"SimConsoleAsync",
"SimulatorFeatures",
"StartGroupProposal",
"TerrainNavMeshProperties",
"TextureStats",
"UntrustedSimulatorMessage",
"UpdateAgentInformation",
"UpdateAgentLanguage",
"UpdateAvatarAppearance",
"UpdateGestureAgentInventory",
"UpdateGestureTaskInventory",
"UpdateNotecardAgentInventory",
"UpdateNotecardTaskInventory",
"UpdateScriptAgent",
"UpdateScriptTask",
"UpdateSettingsAgentInventory",
"UpdateSettingsTaskInventory",
"UploadAgentProfileImage",
"UploadBakedTexture",
"UserInfo",
"ViewerAsset",
"ViewerBenefits",
"ViewerMetrics",
"ViewerStartAuction",
"ViewerStats",
}
DEFAULT_OPTIONS = {
"inventory-root",
"inventory-skeleton",
"inventory-lib-root",
"inventory-lib-owner",
"inventory-skel-lib",
"initial-outfit",
"gestures",
"display_names",
"event_notifications",
"classified_categories",
"adult_compliant",
"buddy-list",
"newuser-config",
"ui-config",
"advanced-mode",
"max-agent-groups",
"map-server-url",
"voice-config",
"tutorial_setting",
"login-flags",
"global-textures",
# Not an official option, just so this can be tracked.
"pyogp-client",
}
DEFAULT_LOGIN_URI = "https://login.agni.lindenlab.com/cgi-bin/login.cgi"
def __init__(self, options: Optional[Set[str]] = None):
self._username: Optional[str] = None
self._password: Optional[str] = None
self._mac = uuid.getnode()
self._options = options if options is not None else self.DEFAULT_OPTIONS
self.http_session: Optional[aiohttp.ClientSession] = aiohttp.ClientSession(trust_env=True)
self.session: Optional[HippoClientSession] = None
self.settings = ClientSettings()
self._resend_task: Optional[asyncio.Task] = None
@property
def main_region(self) -> Optional[HippoClientRegion]:
if not self.session:
return None
return self.session.main_region
@property
def main_circuit(self) -> Optional[Circuit]:
if not self.main_region:
return None
return self.main_region.circuit
@property
def main_caps_client(self) -> Optional[CapsClient]:
if not self.main_region:
return None
return self.main_region.caps_client
async def aclose(self):
try:
self.logout()
finally:
if self.http_session:
await self.http_session.close()
self.http_session = None
def __del__(self):
# Make sure we don't leak resources if someone was lazy.
try:
self.logout()
finally:
if self.http_session:
try:
asyncio.create_task(self.http_session.close())
except:
pass
self.http_session = None
async def _create_transport(self) -> Tuple[AbstractUDPTransport, HippoClientProtocol]:
loop = asyncio.get_event_loop_policy().get_event_loop()
transport, protocol = await loop.create_datagram_endpoint(
lambda: HippoClientProtocol(self.session),
local_addr=('0.0.0.0', 0))
transport = SocketUDPTransport(transport)
return transport, protocol
async def login(
self,
username: str,
password: str,
login_uri: Optional[str] = None,
agree_to_tos: bool = False,
start_location: Union[StartLocation, str, None] = StartLocation.LAST,
connect: bool = True,
):
if self.session:
raise RuntimeError("Already logged in!")
if not login_uri:
login_uri = self.DEFAULT_LOGIN_URI
if start_location is None:
start_location = StartLocation.LAST
# This isn't a symbolic start location and isn't a URI, must be a sim name.
if start_location not in iter(StartLocation) and not start_location.startswith("uri:"):
start_location = f"uri:{start_location}&128&128&128"
split_username = username.split(" ")
if len(split_username) < 2:
first_name = split_username[0]
last_name = "Resident"
else:
first_name, last_name = split_username
payload = {
"address_size": 64,
"agree_to_tos": int(agree_to_tos),
"channel": "Hippolyzer",
"extended_errors": 1,
"first": first_name,
"last": last_name,
"host_id": "",
"id0": hashlib.md5(str(self._mac).encode("ascii")).hexdigest(),
"mac": hashlib.md5(str(self._mac).encode("ascii")).hexdigest(),
"mfa_hash": "",
"passwd": "$1$" + hashlib.md5(str(password).encode("ascii")).hexdigest(),
# TODO: actually get these
"platform": "lnx",
"platform_string": "Linux 6.6",
# TODO: What is this?
"platform_version": "2.38.0",
"read_critical": 0,
"start": str(start_location),
"token": "",
"version": version("hippolyzer"),
"options": list(self._options),
}
async with self.http_session.post(
login_uri,
data=xmlrpc.client.dumps((payload,), "login_to_simulator"),
headers={"Content-Type": "text/xml", "User-Agent": self.settings.USER_AGENT},
ssl=self.settings.SSL_VERIFY,
) as resp:
resp.raise_for_status()
login_data = xmlrpc.client.loads((await resp.read()).decode("utf8"))[0][0]
self.session = HippoClientSession.from_login_data(login_data, self)
self.session.transport, self.session.protocol = await self._create_transport()
self._resend_task = create_logged_task(self._attempt_resends(), "Circuit Resend")
self.session.message_handler.subscribe("AgentDataUpdate", self._handle_agent_data_update)
self.session.message_handler.subscribe("AgentGroupDataUpdate", self._handle_agent_group_data_update)
assert self.session.open_circuit(self.session.regions[-1].circuit_addr)
if connect:
region = self.session.regions[-1]
await region.connect(main_region=True)
def logout(self):
if not self.session:
return
if self._resend_task:
self._resend_task.cancel()
self._resend_task = None
if self.main_circuit and self.main_circuit.is_alive:
# Don't need to send reliably, there's a good chance the server won't ACK anyway.
self.main_circuit.send(
Message(
"LogoutRequest",
Block("AgentData", AgentID=self.session.agent_id, SessionID=self.session.id),
)
)
session = self.session
self.session = None
for region in session.regions:
region.disconnect()
session.transport.close()
def send_chat(self, message: Union[bytes, str], channel: int = 0, chat_type=ChatType.NORMAL) -> asyncio.Future:
return self.main_circuit.send_reliable(Message(
"ChatFromViewer",
Block("AgentData", SessionID=self.session.id, AgentID=self.session.agent_id),
Block("ChatData", Message=message, Channel=channel, Type=chat_type),
))
def teleport(self, region_handle: int, local_pos=Vector3(0, 0, 0)) -> asyncio.Future:
"""Synchronously requests a teleport, returning a Future for teleport completion"""
teleport_fut = asyncio.Future()
# Send request synchronously, await asynchronously.
send_fut = self.main_circuit.send_reliable(
Message(
'TeleportLocationRequest',
Block('AgentData', AgentID=self.session.agent_id, SessionID=self.session.id),
Block('Info', RegionHandle=region_handle, Position=local_pos, fill_missing=True),
)
)
async def _handle_teleport():
# Subscribe first, we may receive an event before we receive the packet ACK.
with self.session.message_handler.subscribe_async(
("TeleportLocal", "TeleportFailed", "TeleportFinish"),
) as get_tp_done_msg:
try:
await send_fut
except Exception as e:
# Pass along error if we failed to send reliably.
teleport_fut.set_exception(e)
return
# Wait for a message that says we're done with the teleport
msg = await get_tp_done_msg()
if msg.name == "TeleportFailed":
teleport_fut.set_exception(RuntimeError("Failed to teleport"))
elif msg.name == "TeleportLocal":
# Within the sim, nothing else we need to do
teleport_fut.set_result(None)
elif msg.name == "TeleportFinish":
# Non-local TP, wait until we receive the AgentMovementComplete to
# set the finished signal.
# Region should be registered by this point, wait for it to connect
try:
# just fail if it takes longer than 30 seconds for the handshake to complete
await asyncio.wait_for(self.session.region_by_handle(region_handle).connected, 30)
except Exception as e:
teleport_fut.set_exception(e)
return
teleport_fut.set_result(None)
create_logged_task(_handle_teleport(), "Teleport")
return teleport_fut
async def _attempt_resends(self):
while True:
if self.session is None:
break
for region in self.session.regions:
if not region.circuit.is_alive:
continue
region.circuit.resend_unacked()
await asyncio.sleep(0.5)
def _handle_agent_data_update(self, msg: Message):
self.session.active_group = msg["AgentData"]["ActiveGroupID"]
def _handle_agent_group_data_update(self, msg: Message):
self.session.groups.clear()
for block in msg["GroupData"]:
self.session.groups.add(block["GroupID"])
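# Minimal end-to-end sketch of the client above (credentials are placeholders; grids other
# than agni may need a different login_uri):
async def example_session():
    client = HippoClient()
    try:
        await client.login("Example Resident", "hunter2", start_location=StartLocation.HOME)
        await client.send_chat("Hello from Hippolyzer")
    finally:
        await client.aclose()
# e.g. asyncio.run(example_session())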

View File

@@ -0,0 +1,415 @@
from __future__ import annotations
import asyncio
import dataclasses
import gzip
import itertools
import logging
from pathlib import Path
from typing import Union, List, Tuple, Set, Sequence, Dict, TYPE_CHECKING
from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.inventory import InventoryModel, InventoryCategory, InventoryItem, InventoryNodeBase
from hippolyzer.lib.base.message.message import Message, Block
from hippolyzer.lib.base.templates import AssetType, FolderType, InventoryType, Permissions
from hippolyzer.lib.base.templates import WearableType
if TYPE_CHECKING:
from hippolyzer.lib.client.state import BaseClientSession
LOG = logging.getLogger(__name__)
class CannotMoveError(Exception):
def __init__(self):
pass
def _get_node_id(node_or_id: InventoryNodeBase | UUID) -> UUID:
if isinstance(node_or_id, UUID):
return node_or_id
return node_or_id.node_id
class InventoryManager:
def __init__(self, session: BaseClientSession):
self._session = session
self.model: InventoryModel = InventoryModel()
self._load_skeleton()
self._session.message_handler.subscribe("BulkUpdateInventory", self._handle_bulk_update_inventory)
self._session.message_handler.subscribe("UpdateCreateInventoryItem", self._handle_update_create_inventory_item)
self._session.message_handler.subscribe("RemoveInventoryItem", self._handle_remove_inventory_item)
self._session.message_handler.subscribe("RemoveInventoryObjects", self._handle_remove_inventory_objects)
self._session.message_handler.subscribe("RemoveInventoryFolder", self._handle_remove_inventory_folder)
self._session.message_handler.subscribe("MoveInventoryItem", self._handle_move_inventory_item)
self._session.message_handler.subscribe("MoveInventoryFolder", self._handle_move_inventory_folder)
def _load_skeleton(self):
assert not self.model.nodes
skel_cats: List[dict] = self._session.login_data.get('inventory-skeleton', [])
for skel_cat in skel_cats:
self.model.add(InventoryCategory(
name=skel_cat["name"],
cat_id=UUID(skel_cat["folder_id"]),
parent_id=UUID(skel_cat["parent_id"]),
# Don't use the version from the skeleton, this flags the inventory as needing
# completion from the inventory cache. This matches indra's behavior.
version=InventoryCategory.VERSION_NONE,
type=AssetType.CATEGORY,
pref_type=FolderType(skel_cat.get("type_default", FolderType.NONE)),
owner_id=self._session.agent_id,
))
def load_cache(self, path: Union[str, Path]):
# Per indra, rough flow for loading inv on login is:
# 1. Look at inventory skeleton from login response
# 2. Pre-populate model with categories from the skeleton, including their versions
# 3. Read the inventory cache, tracking categories and items separately
# 4. Walk the list of categories in our cache. If the cat exists in the skeleton and the versions
# match, then we may load the category and its descendants from cache.
# 5. Any categories in the skeleton but not in the cache, or those with mismatched versions must be fetched.
# The viewer does this by setting the local version of the cats to -1 and forcing a descendent fetch
# over AIS.
#
# By the time you call this function, you should have already loaded the inventory skeleton
# into the model and set its inventory category versions to VERSION_NONE.
skel_cats: List[dict] = self._session.login_data['inventory-skeleton']
# UUID -> version map for inventory skeleton
skel_versions = {UUID(cat["folder_id"]): cat["version"] for cat in skel_cats}
LOG.info(f"Parsing inv cache at {path}")
cached_categories, cached_items = self._parse_cache(path)
LOG.info(f"Done parsing inv cache at {path}")
loaded_cat_ids: Set[UUID] = set()
for cached_cat in cached_categories:
existing_cat: InventoryCategory = self.model.get(cached_cat.cat_id) # noqa
# Don't clobber an existing cat unless it just has a placeholder version,
# maybe from loading the skeleton?
if existing_cat and existing_cat.version != InventoryCategory.VERSION_NONE:
continue
# Cached cat isn't the same as what the inv server says it should be, can't use it.
if cached_cat.version != skel_versions.get(cached_cat.cat_id):
continue
# Update any existing category in-place, or add if not present
self.model.upsert(cached_cat)
# Any items in this category in our cache file are usable and should be added
loaded_cat_ids.add(cached_cat.cat_id)
for cached_item in cached_items:
# The skeleton doesn't have any items, so if we run into any items they should be exactly the
# same as what we're trying to add. No point clobbering.
if cached_item.item_id in self.model:
continue
# The parent category didn't have a cache hit against the inventory skeleton, can't add!
# We don't even know if this item would be in the current version of its parent cat!
if cached_item.parent_id not in loaded_cat_ids:
continue
self.model.add(cached_item)
self.model.flag_if_dirty()
def _parse_cache(self, path: Union[str, Path]) -> Tuple[List[InventoryCategory], List[InventoryItem]]:
"""Warning, may be incredibly slow due to llsd.parse_notation() behavior"""
categories: List[InventoryCategory] = []
items: List[InventoryItem] = []
# Parse our cached items and categories out of the compressed inventory cache
first_line = True
with gzip.open(path, "rb") as f:
# Line-delimited LLSD notation!
for line in f.readlines():
# TODO: Parsing of invcache is dominated by `parse_notation()`. It's stupidly inefficient.
# TODO: sniff out binary LLSD invcaches
node_llsd = llsd.parse_notation(line)
if first_line:
# First line is the file header
first_line = False
if node_llsd['inv_cache_version'] not in (2, 3):
raise ValueError(f"Unknown cache version: {node_llsd!r}")
continue
if InventoryCategory.ID_ATTR in node_llsd:
if (cat_node := InventoryCategory.from_llsd(node_llsd)) is not None:
categories.append(cat_node)
elif InventoryItem.ID_ATTR in node_llsd:
if (item_node := InventoryItem.from_llsd(node_llsd)) is not None:
items.append(item_node)
else:
LOG.warning(f"Unknown node type in inv cache: {node_llsd!r}")
return categories, items
def _handle_bulk_update_inventory(self, msg: Message):
any_cats = False
for folder_block in msg.get_blocks("FolderData", ()):
if folder_block["FolderID"] == UUID.ZERO:
continue
any_cats = True
self.model.upsert(
InventoryCategory.from_folder_data(folder_block),
# Don't clobber version, we only want to fetch the folder if it's new
# and hasn't just moved.
update_fields={"parent_id", "name", "pref_type"},
)
for item_block in msg.get_blocks("ItemData", ()):
if item_block["ItemID"] == UUID.ZERO:
continue
self.model.upsert(InventoryItem.from_inventory_data(item_block))
if any_cats:
self.model.flag_if_dirty()
def _validate_recipient(self, recipient: UUID):
if self._session.agent_id != recipient:
raise ValueError(f"AgentID Mismatch {self._session.agent_id} != {recipient}")
def _handle_update_create_inventory_item(self, msg: Message):
self._validate_recipient(msg["AgentData"]["AgentID"])
for inventory_block in msg["InventoryData"]:
self.model.upsert(InventoryItem.from_inventory_data(inventory_block))
def _handle_remove_inventory_item(self, msg: Message):
self._validate_recipient(msg["AgentData"]["AgentID"])
for inventory_block in msg["InventoryData"]:
node = self.model.get(inventory_block["ItemID"])
if node:
self.model.unlink(node)
def _handle_remove_inventory_folder(self, msg: Message):
self._validate_recipient(msg["AgentData"]["AgentID"])
for folder_block in msg["FolderData"]:
node = self.model.get(folder_block["FolderID"])
if node:
self.model.unlink(node)
def _handle_remove_inventory_objects(self, msg: Message):
self._validate_recipient(msg["AgentData"]["AgentID"])
for item_block in msg.get_blocks("ItemData", []):
node = self.model.get(item_block["ItemID"])
if node:
self.model.unlink(node)
for folder_block in msg.get_blocks("FolderData", []):
node = self.model.get(folder_block["FolderID"])
if node:
self.model.unlink(node)
def _handle_move_inventory_item(self, msg: Message):
for inventory_block in msg["InventoryData"]:
node = self.model.get(inventory_block["ItemID"])
if not node:
LOG.warning(f"Missing inventory item {inventory_block['ItemID']}")
continue
if inventory_block["NewName"]:
node.name = str(inventory_block["NewName"])
node.parent_id = inventory_block['FolderID']
def _handle_move_inventory_folder(self, msg: Message):
for inventory_block in msg["InventoryData"]:
node = self.model.get(inventory_block["FolderID"])
if not node:
LOG.warning(f"Missing inventory folder {inventory_block['FolderID']}")
continue
node.parent_id = inventory_block['ParentID']
def process_aisv3_response(self, payload: dict):
if "name" in payload:
# Just a rough guess. Assume this response is updating something if there's
# a "name" key.
if InventoryCategory.ID_ATTR_AIS in payload:
if (cat_node := InventoryCategory.from_llsd(payload, flavor="ais")) is not None:
self.model.upsert(cat_node)
elif InventoryItem.ID_ATTR in payload:
if (item_node := InventoryItem.from_llsd(payload, flavor="ais")) is not None:
self.model.upsert(item_node)
else:
LOG.warning(f"Unknown node type in AIS payload: {payload!r}")
# Parse the embedded stuff
embedded_dict = payload.get("_embedded", {})
for category_llsd in embedded_dict.get("categories", {}).values():
self.model.upsert(InventoryCategory.from_llsd(category_llsd, flavor="ais"))
for item_llsd in embedded_dict.get("items", {}).values():
self.model.upsert(InventoryItem.from_llsd(item_llsd, flavor="ais"))
for link_llsd in embedded_dict.get("links", {}).values():
self.model.upsert(InventoryItem.from_llsd(link_llsd, flavor="ais"))
for cat_id, version in payload.get("_updated_category_versions", {}).items():
# The key will be a string, so convert to UUID first
cat_node = self.model.get_category(UUID(cat_id))
cat_node.version = version
# Get rid of anything we were asked to
for node_id in itertools.chain(
payload.get("_broken_links_removed", ()),
payload.get("_removed_items", ()),
payload.get("_category_items_removed", ()),
payload.get("_categories_removed", ()),
):
node = self.model.get(node_id)
if node:
# Presumably this list is exhaustive, so don't unlink children.
self.model.unlink(node, single_only=True)
async def make_ais_request(
self,
method: str,
path: str,
params: dict,
payload: dict | Sequence | dataclasses.MISSING = dataclasses.MISSING,
) -> dict:
caps_client = self._session.main_region.caps_client
async with caps_client.request(method, "InventoryAPIv3", path=path, params=params, llsd=payload) as resp:
if resp.ok or resp.status == 400:
data = await resp.read_llsd()
if err_desc := data.get("error_description", ""):
err_desc: str
if err_desc.startswith("Cannot change parent_id."):
raise CannotMoveError()
resp.raise_for_status()
self.process_aisv3_response(data)
else:
resp.raise_for_status()
return data
async def create_folder(
self,
parent: InventoryCategory | UUID,
name: str,
pref_type: int = AssetType.NONE,
cat_id: UUID | None = None
) -> InventoryCategory:
parent_id = _get_node_id(parent)
payload = {
"categories": [
{
"category_id": cat_id,
"name": name,
"type_default": pref_type,
"parent_id": parent_id
}
]
}
data = await self.make_ais_request("POST", f"/category/{parent_id}", {"tid": UUID.random()}, payload)
return self.model.get_category(data["_created_categories"][0])
async def create_item(
self,
parent: UUID | InventoryCategory,
name: str,
type: AssetType,
inv_type: InventoryType,
wearable_type: WearableType,
transaction_id: UUID,
next_mask: int | Permissions = 0x0008e000,
description: str = '',
) -> InventoryItem:
parent_id = _get_node_id(parent)
with self._session.main_region.message_handler.subscribe_async(
("UpdateCreateInventoryItem",),
predicate=lambda x: x["AgentData"]["TransactionID"] == transaction_id,
take=False,
) as get_msg:
await self._session.main_region.circuit.send_reliable(
Message(
'CreateInventoryItem',
Block('AgentData', AgentID=self._session.agent_id, SessionID=self._session.id),
Block(
'InventoryBlock',
CallbackID=0,
FolderID=parent_id,
TransactionID=transaction_id,
NextOwnerMask=next_mask,
Type=type,
InvType=inv_type,
WearableType=wearable_type,
Name=name,
Description=description,
)
)
)
msg = await asyncio.wait_for(get_msg(), 5.0)
# We assume that _handle_update_create_inventory_item() has already been called internally
# by the time that the `await` returns given asyncio scheduling
return self.model.get_item(msg["InventoryData"]["ItemID"])
async def move(self, node: InventoryNodeBase, new_parent: UUID | InventoryCategory) -> None:
# AIS error messages suggest using the MOVE HTTP method instead of setting a new parent
# via PATCH. MOVE is not implemented in AIS. Instead, we do what the viewer does and use
# legacy UDP messages for reparenting things
new_parent = _get_node_id(new_parent)
msg = Message(
"MoveInventoryFolder",
Block("AgentData", AgentID=self._session.agent_id, SessionID=self._session.id, Stamp=0),
)
if isinstance(node, InventoryItem):
msg.add_block(Block("InventoryData", ItemID=node.node_id, FolderID=new_parent, NewName=b''))
else:
msg.add_block(Block("InventoryData", FolderID=node.node_id, ParentID=new_parent))
# No message to say if this even succeeded. Great.
# TODO: probably need to update category versions for both source and target
await self._session.main_region.circuit.send_reliable(msg)
node.parent_id = new_parent
async def copy(self, node: InventoryNodeBase, destination: UUID | InventoryCategory, contents: bool = True)\
-> InventoryItem | InventoryCategory:
destination = _get_node_id(destination)
if isinstance(node, InventoryItem):
with self._session.main_region.message_handler.subscribe_async(
("BulkUpdateInventory",),
# Not ideal, but there doesn't seem to be an easy way to determine the transaction ID,
# and using the callback ID seems a bit crap.
predicate=lambda x: x["ItemData"]["Name"] == node.name,
take=False,
) as get_msg:
await self._session.main_region.circuit.send_reliable(Message(
'CopyInventoryItem',
Block('AgentData', AgentID=self._session.agent_id, SessionID=self._session.id),
Block(
'InventoryData',
CallbackID=0,
OldAgentID=self._session.agent_id,
OldItemID=node.item_id,
NewFolderID=destination,
NewName=b''
)
))
msg = await asyncio.wait_for(get_msg(), 5.0)
# The BulkUpdateInventory message may not have been handled internally yet, do it manually.
self._handle_bulk_update_inventory(msg)
# Now pull the item out of the inventory
new_item = self.model.get(msg["ItemData"]["ItemID"])
assert new_item is not None
return new_item # type: ignore
elif isinstance(node, InventoryCategory):
# Keep a list of the original descendents in case we're copying a folder into itself
to_copy = list(node.descendents)
# There's not really any way to "copy" a category, we just create a new one with the same properties.
new_cat = await self.create_folder(destination, node.name, node.pref_type)
if contents:
cat_lookup: Dict[UUID, UUID] = {node.node_id: new_cat.node_id}
# Recreate the category hierarchy first, keeping note of the new category IDs.
for node in to_copy:
if isinstance(node, InventoryCategory):
new_parent = cat_lookup[node.parent_id]
cat_lookup[node.node_id] = (await self.copy(node, new_parent, contents=False)).node_id
# Items have to be explicitly copied individually
for node in to_copy:
if isinstance(node, InventoryItem):
new_parent = cat_lookup[node.parent_id]
await self.copy(node, new_parent, contents=False)
return new_cat
else:
raise ValueError(f"Unknown node type: {node!r}")
async def update(self, node: InventoryNodeBase, data: dict) -> None:
path = f"/category/{node.node_id}"
if isinstance(node, InventoryItem):
path = f"/item/{node.node_id}"
await self.make_ais_request("PATCH", path, {}, data)
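A rough usage sketch for the manager above, assuming a logged-in `BaseClientSession` whose `inventory` attribute is this `InventoryManager` and an existing viewer-style invcache file. The parent-folder lookup via `model.nodes` is illustrative only; a real caller would look up a specific category.

from hippolyzer.lib.base.datatypes import UUID

async def prime_and_organize(session, cache_path: str) -> None:
    inv = session.inventory  # InventoryManager, skeleton already loaded on construction
    # Fold in the gzipped, line-delimited notation cache; only categories whose
    # versions match the login skeleton (and their items) are trusted.
    inv.load_cache(cache_path)
    # Pick some already-known node as the parent (illustrative only).
    parent = next(iter(inv.model.nodes.values()))
    # create_folder() POSTs to the AIS "/category/<parent>" endpoint and returns the
    # InventoryCategory once process_aisv3_response() has folded it into the model.
    new_cat = await inv.create_folder(parent, "Sorted stuff", cat_id=UUID.random())
    print("Created category", new_cat.node_id)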

View File

@@ -15,8 +15,10 @@ from typing import *
from hippolyzer.lib.base.datatypes import UUID, Vector3
from hippolyzer.lib.base.helpers import proxify
from hippolyzer.lib.base.inventory import InventoryItem, InventoryModel, InventoryObject
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.message.message_handler import MessageHandler
from hippolyzer.lib.base.message.msgtypes import PacketFlags
from hippolyzer.lib.base.objects import (
normalize_object_update,
normalize_terse_object_update,
@@ -25,21 +27,28 @@ from hippolyzer.lib.base.objects import (
Object, handle_to_global_pos,
)
from hippolyzer.lib.base.settings import Settings
from hippolyzer.lib.base.wearables import VISUAL_PARAMS
from hippolyzer.lib.client.namecache import NameCache, NameCacheEntry
from hippolyzer.lib.client.state import BaseClientSession, BaseClientRegion
from hippolyzer.lib.base.templates import PCode, ObjectStateSerializer
from hippolyzer.lib.base.templates import PCode, ObjectStateSerializer, XferFilePath
from hippolyzer.lib.base import llsd
if TYPE_CHECKING:
from hippolyzer.lib.client.state import BaseClientRegion, BaseClientSession
LOG = logging.getLogger(__name__)
OBJECT_OR_LOCAL = Union[Object, int]
MATERIAL_MAP_TYPE = Dict[UUID, dict]
class UpdateType(enum.IntEnum):
OBJECT_UPDATE = enum.auto()
class ObjectUpdateType(enum.IntEnum):
UPDATE = enum.auto()
PROPERTIES = enum.auto()
FAMILY = enum.auto()
COSTS = enum.auto()
KILL = enum.auto()
ANIMATIONS = enum.auto()
APPEARANCE = enum.auto()
class ClientObjectManager:
@@ -47,12 +56,13 @@ class ClientObjectManager:
Object manager for a specific region
"""
__slots__ = ("_region", "_world_objects", "state")
__slots__ = ("_region", "_world_objects", "state", "__weakref__", "_requesting_all_mats_lock")
def __init__(self, region: BaseClientRegion):
self._region: BaseClientRegion = proxify(region)
self._world_objects: ClientWorldObjectManager = proxify(region.session().objects)
self.state: RegionObjectsState = RegionObjectsState()
self._requesting_all_mats_lock = asyncio.Lock()
def __len__(self):
return len(self.state.localid_lookup)
@@ -70,7 +80,7 @@ class ClientObjectManager:
if self._region.handle is not None:
# We're tracked by the world object manager, tell it to untrack
# any objects that we owned
self._world_objects.clear_region_objects(self._region.handle)
self._world_objects.untrack_region_objects(self._region.handle)
def lookup_localid(self, localid: int) -> Optional[Object]:
return self.state.lookup_localid(localid)
@@ -116,17 +126,17 @@ class ClientObjectManager:
*[Block("ObjectData", ObjectLocalID=x) for x in ids_to_req[:255]],
]
# Selecting causes ObjectProperties to be sent
self._region.circuit.send_message(Message("ObjectSelect", blocks))
self._region.circuit.send_message(Message("ObjectDeselect", blocks))
self._region.circuit.send(Message("ObjectSelect", blocks, flags=PacketFlags.RELIABLE))
self._region.circuit.send(Message("ObjectDeselect", blocks, flags=PacketFlags.RELIABLE))
ids_to_req = ids_to_req[255:]
futures = []
for local_id in local_ids:
if local_id in unselected_ids:
# Need to wait until we get our reply
fut = self.state.register_future(local_id, UpdateType.PROPERTIES)
fut = self.state.register_future(local_id, ObjectUpdateType.PROPERTIES)
else:
# This was selected so we should already have up to date info
# This was selected so we should already have up-to-date info
fut = asyncio.Future()
fut.set_result(self.lookup_localid(local_id))
futures.append(fut)
@@ -150,33 +160,124 @@ class ClientObjectManager:
ids_to_req = local_ids
while ids_to_req:
self._region.circuit.send_message(Message(
self._region.circuit.send(Message(
"RequestMultipleObjects",
Block("AgentData", AgentID=session.agent_id, SessionID=session.id),
*[Block("ObjectData", CacheMissType=0, ID=x) for x in ids_to_req[:255]],
flags=PacketFlags.RELIABLE,
))
ids_to_req = ids_to_req[255:]
futures = []
for local_id in local_ids:
futures.append(self.state.register_future(local_id, UpdateType.OBJECT_UPDATE))
futures.append(self.state.register_future(local_id, ObjectUpdateType.UPDATE))
return futures
async def request_all_materials(self) -> MATERIAL_MAP_TYPE:
"""
Request all materials within the sim
Sigh, yes, this is best practice per indra :(
"""
if self._requesting_all_mats_lock.locked():
# We're already requesting all materials, wait until the lock is free
# and just return what was returned.
async with self._requesting_all_mats_lock:
return self.state.materials
async with self._requesting_all_mats_lock:
async with self._region.caps_client.get("RenderMaterials") as resp:
resp.raise_for_status()
# Clear out all previous materials, this is a complete response.
self.state.materials.clear()
self._process_materials_response(await resp.read())
return self.state.materials
async def request_materials(self, material_ids: Sequence[UUID]) -> MATERIAL_MAP_TYPE:
if self._requesting_all_mats_lock.locked():
# Just wait for the in-flight request for all materials to complete
# if we have one in flight.
async with self._requesting_all_mats_lock:
# Wait for the lock to be released
pass
not_found = set(x for x in material_ids if (x not in self.state.materials))
if not_found:
# Request any materials we don't already have, if there were any
data = {"Zipped": llsd.zip_llsd([x.bytes for x in material_ids])}
async with self._region.caps_client.post("RenderMaterials", data=data) as resp:
resp.raise_for_status()
self._process_materials_response(await resp.read())
# build up a dict of just the requested mats
mats = {}
for mat_id in material_ids:
mats[mat_id] = self.state.materials[mat_id]
return mats
def _process_materials_response(self, response: bytes):
entries = llsd.unzip_llsd(llsd.parse_xml(response)["Zipped"])
for entry in entries:
self.state.materials[UUID(bytes=entry["ID"])] = entry["Material"]
async def request_object_inv(self, obj: Object) -> List[InventoryItem]:
if "RequestTaskInventory" in self._region.cap_urls:
return await self.request_object_inv_via_cap(obj)
else:
return await self.request_object_inv_via_xfer(obj)
async def request_object_inv_via_cap(self, obj: Object) -> List[InventoryItem]:
async with self._region.caps_client.get("RequestTaskInventory", params={"task_id": obj.FullID}) as resp:
resp.raise_for_status()
all_items = [InventoryItem.from_llsd(x) for x in (await resp.read_llsd())["contents"]]
# Synthesize the Contents directory so the items can have a parent
parent = InventoryObject(
obj_id=obj.FullID,
name="Contents",
)
model = InventoryModel()
model.add(parent)
for item in all_items:
model.add(item)
return all_items
async def request_object_inv_via_xfer(self, obj: Object) -> List[InventoryItem]:
session = self._region.session()
with self._region.message_handler.subscribe_async(
('ReplyTaskInventory',), predicate=lambda x: x["InventoryData"]["TaskID"] == obj.FullID
) as get_msg:
await self._region.circuit.send_reliable(Message(
'RequestTaskInventory',
# If no session is passed in we'll use the active session when the coro was created
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
Block('InventoryData', LocalID=obj.LocalID),
))
inv_message = await asyncio.wait_for(get_msg(), timeout=5.0)
# Xfer doesn't need to be immediately awaited, multiple signals can be waited on.
xfer = await self._region.xfer_manager.request(
file_name=inv_message["InventoryData"]["Filename"], file_path=XferFilePath.CACHE)
inv_model = InventoryModel.from_bytes(xfer.reassemble_chunks())
return list(inv_model.all_items)
class ObjectEvent:
__slots__ = ("object", "updated", "update_type")
object: Object
updated: Set[str]
update_type: UpdateType
update_type: ObjectUpdateType
def __init__(self, obj: Object, updated: Set[str], update_type: UpdateType):
def __init__(self, obj: Object, updated: Set[str], update_type: ObjectUpdateType):
self.object = obj
self.updated = updated
self.update_type = update_type
@property
def name(self) -> UpdateType:
def name(self) -> ObjectUpdateType:
return self.update_type
@@ -186,7 +287,7 @@ class ClientWorldObjectManager:
self._session: BaseClientSession = session
self._settings = settings
self.name_cache = name_cache or NameCache()
self.events: MessageHandler[ObjectEvent, UpdateType] = MessageHandler(take_by_default=False)
self.events: MessageHandler[ObjectEvent, ObjectUpdateType] = MessageHandler(take_by_default=False)
self._fullid_lookup: Dict[UUID, Object] = {}
self._avatars: Dict[UUID, Avatar] = {}
self._avatar_objects: Dict[UUID, Object] = {}
@@ -207,6 +308,12 @@ class ClientWorldObjectManager:
self._handle_object_properties_generic)
message_handler.subscribe("ObjectPropertiesFamily",
self._handle_object_properties_generic)
message_handler.subscribe("AvatarAnimation",
self._handle_animation_message)
message_handler.subscribe("ObjectAnimation",
self._handle_animation_message)
message_handler.subscribe("AvatarAppearance",
self._handle_avatar_appearance_message)
def lookup_fullid(self, full_id: UUID) -> Optional[Object]:
return self._fullid_lookup.get(full_id, None)
@@ -220,7 +327,7 @@ class ClientWorldObjectManager:
@property
def all_avatars(self) -> Iterable[Avatar]:
return tuple(self._avatars.values())
return list(self._avatars.values())
def __len__(self):
return len(self._fullid_lookup)
@@ -236,12 +343,14 @@ class ClientWorldObjectManager:
if self._get_region_manager(handle) is None:
self._region_managers[handle] = proxify(self._session.region_by_handle(handle).objects)
def clear_region_objects(self, handle: int):
def untrack_region_objects(self, handle: int):
"""Handle signal that a region object manager was just cleared"""
# Make sure they're gone from our lookup table
for obj in tuple(self._fullid_lookup.values()):
for obj in list(self._fullid_lookup.values()):
if obj.RegionHandle == handle:
del self._fullid_lookup[obj.FullID]
if handle in self._region_managers:
del self._region_managers[handle]
self._rebuild_avatar_objects()
def _get_region_manager(self, handle: int) -> Optional[ClientObjectManager]:
@@ -272,6 +381,10 @@ class ClientWorldObjectManager:
futs.extend(region_mgr.request_object_properties(region_objs))
return futs
async def request_object_inv(self, obj: Object) -> List[InventoryItem]:
region_mgr = self._get_region_manager(obj.RegionHandle)
return await region_mgr.request_object_inv(obj)
async def load_ancestors(self, obj: Object, wait_time: float = 1.0):
"""
Ensure that the entire chain of parents above this object is loaded
@@ -286,16 +399,17 @@ class ClientWorldObjectManager:
obj = obj.Parent
def clear(self):
for handle in tuple(self._region_managers.keys()):
self.untrack_region_objects(handle)
self._avatars.clear()
for region_mgr in self._region_managers.values():
region_mgr.clear()
if self._fullid_lookup:
LOG.warning(f"Had {len(self._fullid_lookup)} objects not tied to a region manager!")
self._fullid_lookup.clear()
self._rebuild_avatar_objects()
self._region_managers.clear()
def _update_existing_object(self, obj: Object, new_properties: dict, update_type: UpdateType):
def _update_existing_object(self, obj: Object, new_properties: dict, update_type: ObjectUpdateType,
msg: Optional[Message]):
old_parent_id = obj.ParentID
new_parent_id = new_properties.get("ParentID", obj.ParentID)
old_local_id = obj.LocalID
@@ -338,23 +452,23 @@ class ClientWorldObjectManager:
LOG.warning(f"Tried to move object {obj!r} to unknown region {new_region_handle}")
if obj.PCode == PCode.AVATAR:
# `Avatar` instances are handled separately. Update all Avatar objects so
# we can deal with the RegionHandle change.
# `Avatar` instances are handled separately. Update all Avatar objects,
# so we can deal with the RegionHandle change.
self._rebuild_avatar_objects()
elif new_parent_id != old_parent_id:
# Parent ID changed, but we're in the same region
new_region_state.handle_object_reparented(obj, old_parent_id=old_parent_id)
if actually_updated_props and new_region_state is not None:
self._run_object_update_hooks(obj, actually_updated_props, update_type)
self._run_object_update_hooks(obj, actually_updated_props, update_type, msg)
def _track_new_object(self, region: RegionObjectsState, obj: Object):
def _track_new_object(self, region: RegionObjectsState, obj: Object, msg: Message):
region.track_object(obj)
self._fullid_lookup[obj.FullID] = obj
if obj.PCode == PCode.AVATAR:
self._avatar_objects[obj.FullID] = obj
self._rebuild_avatar_objects()
self._run_object_update_hooks(obj, set(obj.to_dict().keys()), UpdateType.OBJECT_UPDATE)
self._run_object_update_hooks(obj, set(obj.to_dict().keys()), ObjectUpdateType.UPDATE, msg)
def _kill_object_by_local_id(self, region_state: RegionObjectsState, local_id: int):
obj = region_state.lookup_localid(local_id)
@@ -406,11 +520,11 @@ class ClientWorldObjectManager:
# our view of the world then we want to move it to this region.
obj = self.lookup_fullid(object_data["FullID"])
if obj:
self._update_existing_object(obj, object_data, UpdateType.OBJECT_UPDATE)
self._update_existing_object(obj, object_data, ObjectUpdateType.UPDATE, msg)
else:
if region_state is None:
continue
self._track_new_object(region_state, Object(**object_data))
self._track_new_object(region_state, Object(**object_data), msg)
msg.meta["ObjectUpdateIDs"] = tuple(seen_locals)
def _handle_terse_object_update(self, msg: Message):
@@ -430,7 +544,7 @@ class ClientWorldObjectManager:
# Need the Object as context because decoding state requires PCode.
state_deserializer = ObjectStateSerializer.deserialize
object_data["State"] = state_deserializer(ctx_obj=obj, val=object_data["State"])
self._update_existing_object(obj, object_data, UpdateType.OBJECT_UPDATE)
self._update_existing_object(obj, object_data, ObjectUpdateType.UPDATE, msg)
else:
if region_state:
region_state.missing_locals.add(object_data["LocalID"])
@@ -458,7 +572,7 @@ class ClientWorldObjectManager:
self._update_existing_object(obj, {
"UpdateFlags": update_flags,
"RegionHandle": handle,
}, UpdateType.OBJECT_UPDATE)
}, ObjectUpdateType.UPDATE, msg)
continue
cached_obj_data = self._lookup_cache_entry(handle, block["ID"], block["CRC"])
@@ -466,7 +580,7 @@ class ClientWorldObjectManager:
cached_obj = normalize_object_update_compressed_data(cached_obj_data)
cached_obj["UpdateFlags"] = update_flags
cached_obj["RegionHandle"] = handle
self._track_new_object(region_state, Object(**cached_obj))
self._track_new_object(region_state, Object(**cached_obj), msg)
continue
# Don't know about it and wasn't cached.
@@ -497,11 +611,11 @@ class ClientWorldObjectManager:
LOG.warning(f"Got ObjectUpdateCompressed for unknown region {handle}: {object_data!r}")
obj = self.lookup_fullid(object_data["FullID"])
if obj:
self._update_existing_object(obj, object_data, UpdateType.OBJECT_UPDATE)
self._update_existing_object(obj, object_data, ObjectUpdateType.UPDATE, msg)
else:
if region_state is None:
continue
self._track_new_object(region_state, Object(**object_data))
self._track_new_object(region_state, Object(**object_data), msg)
msg.meta["ObjectUpdateIDs"] = tuple(seen_locals)
def _handle_object_properties_generic(self, packet: Message):
@@ -514,7 +628,7 @@ class ClientWorldObjectManager:
obj = self.lookup_fullid(block["ObjectID"])
if obj:
seen_locals.append(obj.LocalID)
self._update_existing_object(obj, object_properties, UpdateType.PROPERTIES)
self._update_existing_object(obj, object_properties, ObjectUpdateType.PROPERTIES, packet)
else:
LOG.debug(f"Received {packet.name} for unknown {block['ObjectID']}")
packet.meta["ObjectUpdateIDs"] = tuple(seen_locals)
@@ -552,6 +666,59 @@ class ClientWorldObjectManager:
region_state.coarse_locations.update(coarse_locations)
self._rebuild_avatar_objects()
def _handle_animation_message(self, message: Message):
sender_id = message["Sender"]["ID"]
if message.name == "AvatarAnimation":
avatar = self._avatars.get(sender_id)
if not avatar:
LOG.warning(f"Received AvatarAnimation for unknown avatar {sender_id}")
return
if not avatar.Object:
LOG.warning(f"Received AvatarAnimation for avatar with no object {sender_id}")
return
obj = avatar.Object
elif message.name == "ObjectAnimation":
obj = self.lookup_fullid(sender_id)
if not obj:
# This is only a debug message in the viewer, but let's be louder.
LOG.warning(f"Received ObjectAnimation for animesh with no object {sender_id}")
return
else:
LOG.error(f"Unknown animation message type: {message.name}")
return
obj.Animations.clear()
for block in message.blocks.get("AnimationList", []):
obj.Animations.append(block["AnimID"])
self._run_object_update_hooks(obj, {"Animations"}, ObjectUpdateType.ANIMATIONS, message)
def _handle_avatar_appearance_message(self, message: Message):
sender_id: UUID = message["Sender"]["ID"]
if message["Sender"]["IsTrial"]:
return
av = self.lookup_avatar(sender_id)
if not av:
LOG.warning(f"Received AvatarAppearance with no avatar {sender_id}")
return
version = message["AppearanceData"]["CofVersion"]
if version < av.COFVersion:
LOG.warning(f"Ignoring stale appearance for {sender_id}, {version} < {av.COFVersion}")
return
if not message.get_blocks("VisualParam"):
LOG.warning(f"No visual params in AvatarAppearance for {sender_id}")
return
av.COFVersion = version
av.Appearance = VISUAL_PARAMS.parse_appearance_message(message)
av_obj = av.Object
if av_obj:
self._run_object_update_hooks(av_obj, set(), ObjectUpdateType.APPEARANCE, message)
def _process_get_object_cost_response(self, parsed: dict):
if "error" in parsed:
return
@@ -561,18 +728,23 @@ class ClientWorldObjectManager:
LOG.debug(f"Received ObjectCost for unknown {object_id}")
continue
obj.ObjectCosts.update(object_costs)
self._run_object_update_hooks(obj, {"ObjectCosts"}, UpdateType.COSTS)
self._run_object_update_hooks(obj, {"ObjectCosts"}, ObjectUpdateType.COSTS, None)
def _run_object_update_hooks(self, obj: Object, updated_props: Set[str], update_type: UpdateType):
def _run_object_update_hooks(self, obj: Object, updated_props: Set[str], update_type: ObjectUpdateType,
msg: Optional[Message]):
region_state = self._get_region_state(obj.RegionHandle)
region_state.resolve_futures(obj, update_type)
if region_state:
region_state.resolve_futures(obj, update_type)
else:
LOG.warning(f"{obj} not tied to a region state")
if obj.PCode == PCode.AVATAR and "NameValue" in updated_props:
if obj.NameValue:
self.name_cache.update(obj.FullID, obj.NameValue.to_dict())
self.events.handle(ObjectEvent(obj, updated_props, update_type))
def _run_kill_object_hooks(self, obj: Object):
self.events.handle(ObjectEvent(obj, set(), UpdateType.KILL))
self.events.handle(ObjectEvent(obj, set(), ObjectUpdateType.KILL))
def _rebuild_avatar_objects(self):
# Get all avatars known through coarse locations and which region the location was in
@@ -642,13 +814,14 @@ class RegionObjectsState:
__slots__ = (
"handle", "missing_locals", "_orphans", "localid_lookup", "coarse_locations",
"_object_futures"
"_object_futures", "materials"
)
def __init__(self):
self.missing_locals = set()
self.localid_lookup: Dict[int, Object] = {}
self.coarse_locations: Dict[UUID, Vector3] = {}
self.materials: MATERIAL_MAP_TYPE = {}
self._object_futures: Dict[Tuple[int, int], List[asyncio.Future]] = {}
self._orphans: Dict[int, List[int]] = collections.defaultdict(list)
@@ -661,6 +834,7 @@ class RegionObjectsState:
self.coarse_locations.clear()
self.missing_locals.clear()
self.localid_lookup.clear()
self.materials.clear()
def lookup_localid(self, localid: int) -> Optional[Object]:
return self.localid_lookup.get(localid)
@@ -754,7 +928,8 @@ class RegionObjectsState:
def handle_object_reparented(self, obj: Object, old_parent_id: int):
"""Recreate any links to ancestor Objects for obj due to parent changes"""
self._unparent_object(obj, old_parent_id)
self._parent_object(obj, insert_at_head=True)
# Avatars get sent to the _end_ of the child list when reparented
self._parent_object(obj, insert_at_head=obj.PCode != PCode.AVATAR)
def collect_orphans(self, parent_localid: int) -> Sequence[int]:
"""Take ownership of any orphan IDs belonging to parent_localid"""
@@ -779,7 +954,7 @@ class RegionObjectsState:
del self._orphans[parent_id]
return removed
def register_future(self, local_id: int, future_type: UpdateType) -> asyncio.Future[Object]:
def register_future(self, local_id: int, future_type: ObjectUpdateType) -> asyncio.Future[Object]:
fut = asyncio.Future()
fut_key = (local_id, future_type)
local_futs = self._object_futures.get(fut_key, [])
@@ -788,7 +963,7 @@ class RegionObjectsState:
fut.add_done_callback(local_futs.remove)
return fut
def resolve_futures(self, obj: Object, update_type: UpdateType):
def resolve_futures(self, obj: Object, update_type: ObjectUpdateType):
futures = self._object_futures.get((obj.LocalID, update_type), [])
for fut in futures[:]:
fut.set_result(obj)
@@ -822,9 +997,9 @@ class Avatar:
self.FullID: UUID = full_id
self.Object: Optional["Object"] = obj
self.RegionHandle: int = region_handle
# TODO: Allow hooking into getZOffsets FS bridge response
# to fill in the Z axis if it's infinite
self.CoarseLocation = coarse_location
self.Appearance: Dict[int, float] = {}
self.COFVersion: int = -1
self.Valid = True
self.GuessedZ: Optional[float] = None
self._resolved_name = resolved_name
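A minimal sketch of consuming the `events` handler above, assuming `session.objects` is the `ClientWorldObjectManager` and that `MessageHandler.subscribe()` takes a key and a callback, as it does for UDP message names elsewhere in this changeset.

from hippolyzer.lib.client.object_manager import ObjectUpdateType

def watch_object_events(session) -> None:
    objects = session.objects  # ClientWorldObjectManager

    def _on_kill(event):
        # ObjectEvent exposes .object, .updated (changed property names) and
        # .update_type; for KILL events the updated set is empty.
        print("Object killed:", event.object.FullID)

    def _on_animations(event):
        print("Animations playing on", event.object.FullID, event.object.Animations)

    # The events handler is keyed by ObjectUpdateType rather than by message name.
    objects.events.subscribe(ObjectUpdateType.KILL, _on_kill)
    objects.events.subscribe(ObjectUpdateType.ANIMATIONS, _on_animations)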

View File

@@ -0,0 +1,251 @@
import asyncio
import dataclasses
import logging
from typing import *
import numpy as np
from hippolyzer.lib.base.datatypes import UUID, Vector3, Vector2
from hippolyzer.lib.base.message.message import Message, Block
from hippolyzer.lib.base.templates import ParcelGridFlags, ParcelFlags
from hippolyzer.lib.client.state import BaseClientRegion
LOG = logging.getLogger(__name__)
@dataclasses.dataclass
class Parcel:
local_id: int
name: str
flags: ParcelFlags
group_id: UUID
# TODO: More properties
class ParcelManager:
# We expect to receive this number of ParcelOverlay messages
NUM_CHUNKS = 4
# No, we don't support varregion or whatever.
REGION_SIZE = 256
# Basically, the minimum parcel size is 4 on either axis so each "point" in the
# ParcelOverlay represents an area this size
GRID_STEP = 4
GRIDS_PER_EDGE = REGION_SIZE // GRID_STEP
def __init__(self, region: BaseClientRegion):
# dimensions are south to north, west to east
self.overlay = np.zeros((self.GRIDS_PER_EDGE, self.GRIDS_PER_EDGE), dtype=np.uint8)
# 1-indexed parcel list index
self.parcel_indices = np.zeros((self.GRIDS_PER_EDGE, self.GRIDS_PER_EDGE), dtype=np.uint16)
self.parcels: List[Optional[Parcel]] = []
self.overlay_chunks: List[Optional[bytes]] = [None] * self.NUM_CHUNKS
self.overlay_complete = asyncio.Event()
self.parcels_downloaded = asyncio.Event()
self._parcels_dirty: bool = True
self._region = region
self._next_seq = 1
self._region.message_handler.subscribe("ParcelOverlay", self._handle_parcel_overlay)
def _handle_parcel_overlay(self, message: Message):
self.add_overlay_chunk(message["ParcelData"]["Data"], message["ParcelData"]["SequenceID"])
def add_overlay_chunk(self, chunk: bytes, chunk_num: int) -> bool:
self.overlay_chunks[chunk_num] = chunk
# Still have some pending chunks, don't try to parse this yet
if not all(self.overlay_chunks):
return False
new_overlay_data = b"".join(self.overlay_chunks)
self.overlay_chunks = [None] * self.NUM_CHUNKS
self._parcels_dirty = False
if new_overlay_data != self.overlay.data[:]:
# If the raw data doesn't match, then we have to parse again
new_data = np.frombuffer(new_overlay_data, dtype=np.uint8).reshape(self.overlay.shape)
np.copyto(self.overlay, new_data)
self._parse_overlay()
# We could optimize this by just marking specific squares dirty
# if the parcel indices have changed between parses, but I don't care
# to do that.
self._parcels_dirty = True
self.parcels_downloaded.clear()
if not self.overlay_complete.is_set():
self.overlay_complete.set()
return True
@classmethod
def _pos_to_grid_coords(cls, pos: Vector3) -> Tuple[int, int]:
return round(pos.Y // cls.GRID_STEP), round(pos.X // cls.GRID_STEP)
def _parse_overlay(self):
# Zero out all parcel indices
self.parcel_indices[:, :] = 0
next_parcel_idx = 1
for y in range(0, self.GRIDS_PER_EDGE):
for x in range(0, self.GRIDS_PER_EDGE):
# We already have a parcel index for this grid, continue
if self.parcel_indices[y, x]:
continue
# Fill all adjacent grids with this parcel index
self._flood_fill_parcel_index(y, x, next_parcel_idx)
# SL doesn't allow disjoint grids to be part of the same parcel, so
# whatever grid we find next without a parcel index must be a new parcel
next_parcel_idx += 1
# Should have found at least one parcel
assert next_parcel_idx >= 2
# The number of parcels has changed, so we can't reuse the existing parcel objects;
# it's unlikely that only the parcel boundaries moved.
if len(self.parcels) != next_parcel_idx - 1:
# We don't know about any of these parcels yet, fill with none
self.parcels = [None] * (next_parcel_idx - 1)
def _flood_fill_parcel_index(self, start_y, start_x, parcel_idx):
"""Flood fill all neighboring grids with the parcel index, being mindful of parcel boundaries"""
# We know the start grid is assigned to this parcel index
self.parcel_indices[start_y, start_x] = parcel_idx
# Queue of grids to test the neighbors of, start with the start grid.
neighbor_test_queue: List[Tuple[int, int]] = [(start_y, start_x)]
while neighbor_test_queue:
to_test = neighbor_test_queue.pop(0)
test_grid = self.overlay[to_test]
for direction in ((-1, 0), (1, 0), (0, -1), (0, 1)):
new_pos = to_test[0] + direction[0], to_test[1] + direction[1]
if any(x < 0 or x >= self.GRIDS_PER_EDGE for x in new_pos):
# Outside bounds
continue
if self.parcel_indices[new_pos]:
# Already set, skip
continue
if direction[0] == -1 and test_grid & ParcelGridFlags.SOUTH_LINE:
# Test grid is already on a south line, can't go south.
continue
if direction[1] == -1 and test_grid & ParcelGridFlags.WEST_LINE:
# Test grid is already on a west line, can't go west.
continue
grid = self.overlay[new_pos]
if direction[0] == 1 and grid & ParcelGridFlags.SOUTH_LINE:
# Hit a south line going north, this is outside the current parcel
continue
if direction[1] == 1 and grid & ParcelGridFlags.WEST_LINE:
# Hit a west line going east, this is outside the current parcel
continue
# This grid is within the current parcel, set the parcel index
self.parcel_indices[new_pos] = parcel_idx
# Append the grid to the neighbour testing queue
neighbor_test_queue.append(new_pos)
async def request_dirty_parcels(self) -> Tuple[Parcel, ...]:
if self._parcels_dirty:
return await self.request_all_parcels()
return tuple(self.parcels)
async def request_all_parcels(self) -> Tuple[Parcel, ...]:
await self.overlay_complete.wait()
# Because of how we build up the parcel index map, it's safe for us to
# do this instead of keeping track of seen IDs in a set or similar
last_seen_parcel_index = 0
futs = []
for y in range(0, self.GRIDS_PER_EDGE):
for x in range(0, self.GRIDS_PER_EDGE):
parcel_index = self.parcel_indices[y, x]
assert parcel_index != 0
if parcel_index <= last_seen_parcel_index:
continue
assert parcel_index == last_seen_parcel_index + 1
last_seen_parcel_index = parcel_index
# Request a position within the parcel
futs.append(self.request_parcel_properties(
Vector2(x * self.GRID_STEP + 1.0, y * self.GRID_STEP + 1.0)
))
# Wait for all parcel properties to come in
await asyncio.gather(*futs)
self.parcels_downloaded.set()
self._parcels_dirty = False
return tuple(self.parcels)
async def request_parcel_properties(self, pos: Vector2) -> Parcel:
await self.overlay_complete.wait()
seq_id = self._next_seq
# Register a wait on a ParcelProperties matching this seq
parcel_props_fut = self._region.message_handler.wait_for(
("ParcelProperties",),
predicate=lambda msg: msg["ParcelData"]["SequenceID"] == seq_id,
timeout=10.0,
)
# We don't care about when we receive an ack, we only care about when we receive the parcel props
_ = self._region.circuit.send_reliable(Message(
"ParcelPropertiesRequest",
Block("AgentData", AgentID=self._region.session().agent_id, SessionID=self._region.session().id),
Block(
"ParcelData",
SequenceID=seq_id,
West=pos.X,
East=pos.X,
North=pos.Y,
South=pos.Y,
# What does this even mean?
SnapSelection=0,
),
))
self._next_seq += 1
return self._process_parcel_properties(await parcel_props_fut, pos)
def _process_parcel_properties(self, parcel_props: Message, pos: Optional[Vector2] = None) -> Parcel:
data_block = parcel_props["ParcelData"][0]
grid_coord = None
# Parcel indices are one-indexed, convert to zero-indexed.
if pos is not None:
# We have a pos, figure out where in the grid we should look for the parcel index
grid_coord = self._pos_to_grid_coords(pos)
else:
# Need to look at the parcel bitmap to figure out a valid grid coord.
# This is a boolean array where each bit says whether the parcel occupies that grid.
parcel_bitmap = data_block.deserialize_var("Bitmap")
for y in range(self.GRIDS_PER_EDGE):
for x in range(self.GRIDS_PER_EDGE):
if parcel_bitmap[y, x]:
# This is the first grid the parcel occupies per the bitmap
grid_coord = y, x
break
if grid_coord:
break
parcel = Parcel(
local_id=data_block["LocalID"],
name=data_block["Name"],
flags=ParcelFlags(data_block["ParcelFlags"]),
group_id=data_block["GroupID"],
# Parcel UUID isn't in this response :/
)
# I guess the bitmap _could_ be empty, but probably not.
if grid_coord is not None:
parcel_idx = self.parcel_indices[grid_coord] - 1
if len(self.parcels) > parcel_idx >= 0:
# Okay, parcels list is sane, place the parcel in there.
self.parcels[parcel_idx] = parcel
else:
LOG.warning(f"Received ParcelProperties with incomplete overlay for {grid_coord!r}")
return parcel
async def get_parcel_at(self, pos: Vector2, request_if_missing: bool = True) -> Optional[Parcel]:
grid_coord = self._pos_to_grid_coords(pos)
parcel = None
if parcel_idx := self.parcel_indices[grid_coord]:
parcel = self.parcels[parcel_idx - 1]
if request_if_missing and parcel is None:
return await self.request_parcel_properties(pos)
return parcel
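A short usage sketch for the manager above. The `region.parcel_manager` attribute name is an assumption; everything else uses the API defined in this file.

from hippolyzer.lib.base.datatypes import Vector2

async def dump_parcels(region) -> None:
    manager = region.parcel_manager  # assumed attribute holding a ParcelManager
    # The overlay arrives as NUM_CHUNKS ParcelOverlay messages; wait for them all.
    await manager.overlay_complete.wait()
    # Pull ParcelProperties for every parcel found by the flood fill.
    for parcel in await manager.request_all_parcels():
        print(f"{parcel.local_id}: {parcel.name!r} flags={parcel.flags!r}")
    # Or just resolve the parcel under a specific position.
    here = await manager.get_parcel_at(Vector2(128.0, 128.0))
    print("Center of region is in", here.name if here else "an unknown parcel")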

View File

@@ -0,0 +1,51 @@
from typing import NamedTuple, List, Sequence
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.templates import ChatType
class RLVCommand(NamedTuple):
behaviour: str
param: str
options: List[str]
class RLVParser:
@staticmethod
def is_rlv_message(msg: Message) -> bool:
chat: str = msg["ChatData"]["Message"]
chat_type: int = msg["ChatData"]["ChatType"]
return chat and chat.startswith("@") and chat_type == ChatType.OWNER
@staticmethod
def parse_chat(chat: str) -> List[RLVCommand]:
assert chat.startswith("@")
chat = chat.lstrip("@")
commands = []
for command_str in chat.split(","):
if not command_str:
continue
# RLV-style command, `<cmd>(:<option1>;<option2>)?(=<param>)?`
# Roughly (?<behaviour>[^:=]+)(:(?<option>[^=]*))?=(?<param>\w+)
options, _, param = command_str.partition("=")
behaviour, _, options = options.partition(":")
# TODO: Not always correct, commands can specify their own parsing for the option field;
# maybe special-case these?
options = options.split(";") if options else []
commands.append(RLVCommand(behaviour, param, options))
return commands
@staticmethod
def format_chat(commands: Sequence[RLVCommand]) -> str:
assert commands
chat = ""
for command in commands:
if chat:
chat += ","
chat += command.behaviour
if command.options:
chat += ":" + ";".join(command.options)
if command.param:
chat += "=" + command.param
return "@" + chat

View File

@@ -4,24 +4,80 @@ Base classes for common session-related state shared between clients and proxies
from __future__ import annotations
import abc
import logging
import weakref
from typing import *
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.circuit import ConnectionHolder
import multidict
from hippolyzer.lib.base.datatypes import UUID, Vector3
from hippolyzer.lib.base.message.circuit import ConnectionHolder, Circuit
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.message.message_handler import MessageHandler
from hippolyzer.lib.base.network.caps_client import CapsClient
from hippolyzer.lib.base.network.transport import ADDR_TUPLE
from hippolyzer.lib.base.objects import handle_to_global_pos
from hippolyzer.lib.base.xfer_manager import XferManager
from hippolyzer.lib.client.inventory_manager import InventoryManager
if TYPE_CHECKING:
from hippolyzer.lib.client.object_manager import ClientObjectManager, ClientWorldObjectManager
from hippolyzer.lib.client.object_manager import ClientObjectManager, ClientWorldObjectManager
class BaseClientRegion(ConnectionHolder, abc.ABC):
"""Represents a client's view of a remote region"""
# Actually a weakref
handle: Optional[int]
# Actually a weakref
session: Callable[[], BaseClientSession]
objects: ClientObjectManager
xfer_manager: XferManager
caps_client: CapsClient
cap_urls: multidict.MultiDict[str]
circuit_addr: ADDR_TUPLE
circuit: Optional[Circuit]
_name: Optional[str]
def __init__(self):
self._name = None
self.circuit = None
@abc.abstractmethod
def update_caps(self, caps: Mapping[str, str]) -> None:
pass
@property
def name(self):
if self._name:
return self._name
return "Pending %r" % (self.circuit_addr,)
@name.setter
def name(self, val):
self._name = val
@property
def global_pos(self) -> Vector3:
if self.handle is None:
raise ValueError("Can't determine global region position without handle")
return handle_to_global_pos(self.handle)
@property
def is_alive(self):
if not self.circuit:
return False
return self.circuit.is_alive
def mark_dead(self):
logging.info("Marking %r dead" % self)
if self.circuit:
self.circuit.is_alive = False
self.objects.clear()
def __repr__(self):
return "<%s %s (%r)>" % (self.__class__.__name__, self.name, self.handle)
class BaseClientSessionManager:
pass
class BaseClientSession(abc.ABC):
@@ -29,8 +85,105 @@ class BaseClientSession(abc.ABC):
id: UUID
agent_id: UUID
secure_session_id: UUID
active_group: UUID
groups: Set[UUID]
message_handler: MessageHandler[Message, str]
regions: Sequence[BaseClientRegion]
regions: MutableSequence[BaseClientRegion]
region_by_handle: Callable[[int], Optional[BaseClientRegion]]
region_by_circuit_addr: Callable[[ADDR_TUPLE], Optional[BaseClientRegion]]
objects: ClientWorldObjectManager
inventory: InventoryManager
login_data: Dict[str, Any]
REGION_CLS = Type[BaseClientRegion]
def __init__(self, id, secure_session_id, agent_id, circuit_code,
session_manager: Optional[BaseClientSessionManager], login_data=None):
self.login_data = login_data or {}
self.pending = True
self.id: UUID = id
self.secure_session_id: UUID = secure_session_id
self.agent_id: UUID = agent_id
self.circuit_code = circuit_code
self.global_caps = {}
self.session_manager = session_manager
self.active_group: UUID = UUID.ZERO
self.groups: Set[UUID] = set()
self.regions = []
self._main_region = None
self.message_handler: MessageHandler[Message, str] = MessageHandler()
super().__init__()
@classmethod
def from_login_data(cls, login_data, session_manager):
sess = cls(
id=UUID(login_data["session_id"]),
secure_session_id=UUID(login_data["secure_session_id"]),
agent_id=UUID(login_data["agent_id"]),
circuit_code=int(login_data["circuit_code"]),
session_manager=session_manager,
login_data=login_data,
)
appearance_service = login_data.get("agent_appearance_service")
map_image_service = login_data.get("map-server-url")
if appearance_service:
sess.global_caps["AppearanceService"] = appearance_service
if map_image_service:
sess.global_caps["MapImageService"] = map_image_service
# Login data also has details about the initial sim
sess.register_region(
circuit_addr=(login_data["sim_ip"], login_data["sim_port"]),
handle=(login_data["region_x"] << 32) | login_data["region_y"],
seed_url=login_data["seed_capability"],
)
return sess
def register_region(self, circuit_addr: Optional[ADDR_TUPLE] = None, seed_url: Optional[str] = None,
handle: Optional[int] = None) -> BaseClientRegion:
if not any((circuit_addr, seed_url)):
raise ValueError("One of circuit_addr and seed_url must be defined!")
for region in self.regions:
if region.circuit_addr == circuit_addr:
if seed_url and region.cap_urls.get("Seed") != seed_url:
region.update_caps({"Seed": seed_url})
if handle:
region.handle = handle
return region
if seed_url and region.cap_urls.get("Seed") == seed_url:
return region
if not circuit_addr:
raise ValueError("Can't create region without circuit addr!")
logging.info("Registering region for %r" % (circuit_addr,))
region = self.REGION_CLS(circuit_addr, seed_url, self, handle=handle)
self.regions.append(region)
return region
@property
def main_region(self) -> Optional[BaseClientRegion]:
if self._main_region and self._main_region() in self.regions:
return self._main_region()
return None
@main_region.setter
def main_region(self, val: BaseClientRegion):
self._main_region = weakref.ref(val)
def transaction_to_assetid(self, transaction_id: UUID):
return UUID.combine(transaction_id, self.secure_session_id)
def region_by_circuit_addr(self, circuit_addr) -> Optional[BaseClientRegion]:
for region in self.regions:
if region.circuit_addr == circuit_addr and region.circuit:
return region
return None
def region_by_handle(self, handle: int) -> Optional[BaseClientRegion]:
for region in self.regions:
if region.handle == handle:
return region
return None
def __repr__(self):
return "<%s %s>" % (self.__class__.__name__, self.id)

View File

@@ -1,11 +1,14 @@
from __future__ import annotations
from typing import *
import abc
import copy
import dataclasses
import multiprocessing
import pickle
import secrets
import warnings
from typing import *
import outleap
from hippolyzer.lib.base.datatypes import UUID, Vector3
from hippolyzer.lib.base.message.message import Block, Message
@@ -14,10 +17,11 @@ from hippolyzer.lib.proxy import addon_ctx
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.base.network.transport import UDPPacket, Direction
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import SessionManager, Session
from hippolyzer.lib.proxy.task_scheduler import TaskLifeScope
from hippolyzer.lib.base.templates import ChatSourceType, ChatType
if TYPE_CHECKING:
from hippolyzer.lib.proxy.sessions import SessionManager, Session
from hippolyzer.lib.proxy.region import ProxiedRegion
class AssetAliasTracker:
@@ -73,17 +77,17 @@ def show_message(text, session=None) -> None:
direction=Direction.IN,
)
if session:
session.main_region.circuit.send_message(message)
session.main_region.circuit.send(message)
else:
for session in AddonManager.SESSION_MANAGER.sessions:
session.main_region.circuit.send_message(copy.copy(message))
session.main_region.circuit.send(copy.copy(message))
def send_chat(message: Union[bytes, str], channel=0, chat_type=ChatType.NORMAL, session=None):
session = session or addon_ctx.session.get(None) or None
if not session:
raise RuntimeError("Tried to send chat without session")
session.main_region.circuit.send_message(Message(
session.main_region.circuit.send(Message(
"ChatFromViewer",
Block(
"AgentData",
@@ -99,36 +103,32 @@ def send_chat(message: Union[bytes, str], channel=0, chat_type=ChatType.NORMAL,
))
def ais_item_to_inventory_data(ais_item: dict):
return Block(
"InventoryData",
ItemID=ais_item["item_id"],
FolderID=ais_item["parent_id"],
CallbackID=0,
CreatorID=ais_item["permissions"]["creator_id"],
OwnerID=ais_item["permissions"]["owner_id"],
GroupID=ais_item["permissions"]["group_id"],
BaseMask=ais_item["permissions"]["base_mask"],
OwnerMask=ais_item["permissions"]["owner_mask"],
GroupMask=ais_item["permissions"]["group_mask"],
EveryoneMask=ais_item["permissions"]["everyone_mask"],
NextOwnerMask=ais_item["permissions"]["next_owner_mask"],
GroupOwned=0,
AssetID=ais_item["asset_id"],
Type=ais_item["type"],
InvType=ais_item["inv_type"],
Flags=ais_item["flags"],
SaleType=ais_item["sale_info"]["sale_type"],
SalePrice=ais_item["sale_info"]["sale_price"],
Name=ais_item["name"],
Description=ais_item["desc"],
CreationDate=ais_item["created_at"],
# Meaningless here
CRC=secrets.randbits(32),
)
class MetaBaseAddon(abc.ABCMeta):
"""
Metaclass for BaseAddon that prevents class member assignments from clobbering descriptors
Without this things like:
class Foo(BaseAddon):
bar: int = GlobalProperty(0)
Foo.bar = 2
Won't work as you expect!
"""
def __setattr__(self, key: str, value):
try:
existing = object.__getattribute__(self, key)
if existing and isinstance(existing, BaseAddonProperty):
existing.__set__(self, value)
return
except AttributeError:
# If the attribute doesn't exist then it's fine to use the base setattr.
pass
super().__setattr__(key, value)
class BaseAddon(abc.ABC):
class BaseAddon(metaclass=MetaBaseAddon):
def _schedule_task(self, coro: Coroutine, session=None,
region_scoped=False, session_scoped=True, addon_scoped=True):
session = session or addon_ctx.session.get(None) or None
@@ -172,7 +172,7 @@ class BaseAddon(abc.ABC):
pass
def handle_object_updated(self, session: Session, region: ProxiedRegion,
obj: Object, updated_props: Set[str]):
obj: Object, updated_props: Set[str], msg: Optional[Message]):
pass
def handle_object_killed(self, session: Session, region: ProxiedRegion, obj: Object):
@@ -181,20 +181,26 @@ class BaseAddon(abc.ABC):
def handle_region_changed(self, session: Session, region: ProxiedRegion):
pass
def handle_region_registered(self, session: Session, region: ProxiedRegion):
pass
def handle_circuit_created(self, session: Session, region: ProxiedRegion):
pass
def handle_rlv_command(self, session: Session, region: ProxiedRegion, source: UUID,
cmd: str, options: List[str], param: str):
behaviour: str, options: List[str], param: str):
pass
def handle_proxied_packet(self, session_manager: SessionManager, packet: UDPPacket,
session: Optional[Session], region: Optional[ProxiedRegion]):
pass
async def handle_leap_client_added(self, session_manager: SessionManager, leap_client: outleap.LEAPClient):
pass
_T = TypeVar("_T")
_U = TypeVar("_U", Session, SessionManager)
_U = TypeVar("_U", "Session", "SessionManager")
class BaseAddonProperty(abc.ABC, Generic[_T, _U]):
@@ -205,13 +211,17 @@ class BaseAddonProperty(abc.ABC, Generic[_T, _U]):
session_manager.addon_ctx dict, without any namespacing. Can be accessed either
through `AddonClass.property_name` or `addon_instance.property_name`.
"""
__slots__ = ("name", "default")
__slots__ = ("name", "default", "_owner")
def __init__(self, default=dataclasses.MISSING):
self.default = default
self._owner = None
def __set_name__(self, owner, name: str):
self.name = name
# Keep track of which addon "owns" this property so that we can shove
# the data in a bucket specific to that addon name.
self._owner = owner
def _make_default(self) -> _T:
if self.default is not dataclasses.MISSING:
@@ -229,21 +239,23 @@ class BaseAddonProperty(abc.ABC, Generic[_T, _U]):
if ctx_obj is None:
raise AttributeError(
f"{self.__class__} {self.name} accessed outside proper context")
addon_state = ctx_obj.addon_ctx[self._owner.__name__]
# Set a default if we have one, otherwise let the keyerror happen.
# Maybe we should do this at addon initialization instead of on get.
if self.name not in ctx_obj.addon_ctx:
if self.name not in addon_state:
default = self._make_default()
if default is not dataclasses.MISSING:
ctx_obj.addon_ctx[self.name] = default
addon_state[self.name] = default
else:
raise AttributeError(f"{self.name} is not set")
return ctx_obj.addon_ctx[self.name]
return addon_state[self.name]
def __set__(self, _obj, value: _T) -> None:
self._get_context_obj().addon_ctx[self.name] = value
addon_state = self._get_context_obj().addon_ctx[self._owner.__name__]
addon_state[self.name] = value
class SessionProperty(BaseAddonProperty[_T, Session]):
class SessionProperty(BaseAddonProperty[_T, "Session"]):
"""
Property tied to the current session context
@@ -253,7 +265,7 @@ class SessionProperty(BaseAddonProperty[_T, Session]):
return addon_ctx.session.get()
class GlobalProperty(BaseAddonProperty[_T, SessionManager]):
class GlobalProperty(BaseAddonProperty[_T, "SessionManager"]):
"""
Property tied to the global SessionManager context
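To illustrate the namespaced addon properties above, a sketch of an addon using them. The module-level `addons` list and the choice of hook follow common Hippolyzer addon conventions and should be treated as illustrative.

from hippolyzer.lib.proxy.addon_utils import BaseAddon, GlobalProperty, SessionProperty, send_chat

class GreeterAddon(BaseAddon):
    # Stored per-session, under the "GreeterAddon" bucket of the session's addon_ctx.
    greeted: bool = SessionProperty(False)
    # Stored once on the SessionManager, shared across sessions.
    greeting: str = GlobalProperty("hello from the proxy")

    def handle_region_changed(self, session, region):
        # Properties are only readable inside a session context, which the addon
        # manager establishes before invoking hooks like this one.
        if not self.greeted:
            send_chat(self.greeting, session=session)
            self.greeted = True

addons = [GreeterAddon()]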

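A minimal sketch of an addon built on the hooks and properties above. The class, the counter property, and the module-level `addons` list are purely illustrative, and the import path is an assumption:

from typing import List

from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.proxy.addon_utils import BaseAddon, SessionProperty  # assumed path

class RLVCounterAddon(BaseAddon):
    # Stored per-session in addon_ctx, namespaced under this addon's class name
    command_count: int = SessionProperty(0)

    def handle_rlv_command(self, session, region, source: UUID,
                           behaviour: str, options: List[str], param: str):
        # Count behaviours seen this session; returning True would mark the
        # command as handled so the chat message carrying it gets dropped.
        self.command_count += 1
        return False

addons = [RLVCounterAddon()]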
View File

@@ -15,9 +15,13 @@ import time
from types import ModuleType
from typing import *
import outleap
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.helpers import get_mtime
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.network.transport import UDPPacket
from hippolyzer.lib.client.rlv import RLVParser
from hippolyzer.lib.proxy import addon_ctx
from hippolyzer.lib.proxy.task_scheduler import TaskLifeScope, TaskScheduler
@@ -31,13 +35,6 @@ if TYPE_CHECKING:
LOG = logging.getLogger(__name__)
def _get_mtime(path):
try:
return os.stat(path).st_mtime
except:
return None
class BaseInteractionManager:
@abc.abstractmethod
async def open_dir(self, caption: str = '', directory: str = '', filter_str: str = '') -> Optional[str]:
@@ -52,7 +49,8 @@ class BaseInteractionManager:
pass
@abc.abstractmethod
async def save_file(self, caption: str = '', directory: str = '', filter_str: str = '') -> Optional[str]:
async def save_file(self, caption: str = '', directory: str = '', filter_str: str = '',
default_suffix: str = '') -> Optional[str]:
pass
@abc.abstractmethod
@@ -63,6 +61,15 @@ class BaseInteractionManager:
return None
# Used to initialize a REPL environment with commonly desired helpers
REPL_INITIALIZER = r"""
from hippolyzer.lib.base.datatypes import *
from hippolyzer.lib.base.templates import *
from hippolyzer.lib.base.message.message import Block, Message, Direction
from hippolyzer.lib.proxy.addon_utils import send_chat, show_message
"""
class AddonManager:
COMMAND_CHANNEL = 524
@@ -138,6 +145,16 @@ class AddonManager:
if _locals is None:
_locals = stack.frame.f_locals
init_globals = {}
exec(REPL_INITIALIZER, init_globals, None)
# We're modifying the globals of the caller, so be careful that things we imported
# for the REPL initializer don't clobber things that already exist in the caller's globals.
# Making our own mutable copy of the globals dict, mutating that and then passing it
# to embed() is not an option due to https://github.com/prompt-toolkit/ptpython/issues/279
for global_name, global_val in init_globals.items():
if global_name not in _globals:
_globals[global_name] = global_val
async def _wrapper():
coro: Coroutine = ptpython.repl.embed( # noqa: the type signature lies
globals=_globals,
@@ -158,7 +175,10 @@ class AddonManager:
def load_addon_from_path(cls, path, reload=False, raise_exceptions=True):
path = pathlib.Path(path).absolute()
mod_name = "hippolyzer.user_addon_%s" % path.stem
cls.BASE_ADDON_SPECS.append(importlib.util.spec_from_file_location(mod_name, path))
spec = importlib.util.spec_from_file_location(mod_name, path)
if not spec:
raise ValueError(f"Unable to load {path}")
cls.BASE_ADDON_SPECS.append(spec)
addon_dir = os.path.realpath(pathlib.Path(path).parent.absolute())
if addon_dir not in sys.path:
@@ -185,9 +205,9 @@ class AddonManager:
@classmethod
def _check_hotreloads(cls):
"""Mark addons that rely on changed files for reloading"""
for filename, importers in cls.HOTRELOAD_IMPORTERS.items():
mtime = _get_mtime(filename)
if not mtime or mtime == cls.FILE_MTIMES.get(filename, None):
for file_path, importers in cls.HOTRELOAD_IMPORTERS.items():
mtime = get_mtime(file_path)
if not mtime or mtime == cls.FILE_MTIMES.get(file_path, None):
continue
# Mark anything that imported this as dirty too, handling circular
@@ -206,16 +226,21 @@ class AddonManager:
_dirty_importers(importers)
if file_path not in cls.BASE_ADDON_SPECS:
# Make sure we won't reload importers in a loop if this is actually something
# that was dynamically imported, where `hot_reload()` might not be called again!
cls.FILE_MTIMES[file_path] = mtime
@classmethod
def hot_reload(cls, mod: Any, require_addons_loaded=False):
# Solely to trick the type checker because ModuleType doesn't apply where it should
# and Protocols aren't well supported yet.
# and Protocols aren't well-supported yet.
imported_mod: ModuleType = mod
imported_file = imported_mod.__file__
# Mark the caller as having imported (and being dependent on) `module`
stack = inspect.stack()[1]
cls.HOTRELOAD_IMPORTERS[imported_file].add(stack.filename)
cls.FILE_MTIMES[imported_file] = _get_mtime(imported_file)
cls.FILE_MTIMES[imported_file] = get_mtime(imported_file)
importing_spec = next((s for s in cls.BASE_ADDON_SPECS if s.origin == stack.filename), None)
imported_spec = next((s for s in cls.BASE_ADDON_SPECS if s.origin == imported_file), None)
@@ -261,9 +286,12 @@ class AddonManager:
new_addons = {}
for spec in cls.BASE_ADDON_SPECS[:]:
previous_mod = cls.FRESH_ADDON_MODULES.get(spec.name)
# Whether we've EVER successfully loaded this module.
# There may be a `None` entry in the dict if that's the case.
had_mod = spec.name in cls.FRESH_ADDON_MODULES
try:
mtime = _get_mtime(spec.origin)
mtime = get_mtime(spec.origin)
mtime_changed = mtime != cls.FILE_MTIMES.get(spec.origin, None)
if not mtime_changed and had_mod:
continue
@@ -275,20 +303,21 @@ class AddonManager:
# Keep module loaded even if file went away.
continue
if previous_mod:
cls._unload_module(previous_mod)
logging.info("(Re)compiling addon %s" % spec.origin)
old_mod = cls.FRESH_ADDON_MODULES.get(spec.name)
mod = importlib.util.module_from_spec(spec)
sys.modules[spec.name] = mod
spec.loader.exec_module(mod)
cls.FILE_MTIMES[spec.origin] = mtime
cls._unload_module(old_mod)
new_addons[spec.name] = mod
# Make sure module initialization happens after any pending task cancellations
# due to module unloading.
asyncio.get_event_loop().call_soon(cls._init_module, mod)
loop = asyncio.get_event_loop_policy().get_event_loop()
loop.call_soon(cls._init_module, mod)
except Exception as e:
if had_mod:
logging.exception("Exploded trying to reload addon %s" % spec.name)
@@ -320,11 +349,11 @@ class AddonManager:
cls.SCHEDULER.kill_matching_tasks(lifetime_mask=TaskLifeScope.ADDON, creator=addon)
@classmethod
def _call_all_addon_hooks(cls, hook_name, *args, **kwargs):
def _call_all_addon_hooks(cls, hook_name, *args, call_async=False, **kwargs) -> Optional[bool]:
for module in cls.FRESH_ADDON_MODULES.values():
if not module:
continue
ret = cls._call_module_hooks(module, hook_name, *args, **kwargs)
ret = cls._call_module_hooks(module, hook_name, *args, call_async=call_async, **kwargs)
if ret:
return ret
@@ -355,15 +384,15 @@ class AddonManager:
return commands
@classmethod
def _call_module_hooks(cls, module, hook_name, *args, **kwargs):
def _call_module_hooks(cls, module, hook_name, *args, call_async=False, **kwargs):
for addon in cls._get_module_addons(module):
ret = cls._try_call_hook(addon, hook_name, *args, **kwargs)
ret = cls._try_call_hook(addon, hook_name, *args, call_async=call_async, **kwargs)
if ret:
return ret
return cls._try_call_hook(module, hook_name, *args, **kwargs)
return cls._try_call_hook(module, hook_name, *args, call_async=call_async, **kwargs)
@classmethod
def _try_call_hook(cls, addon, hook_name, *args, **kwargs):
def _try_call_hook(cls, addon, hook_name, *args, call_async=False, **kwargs) -> Optional[bool]:
if cls._SUBPROCESS:
return
@@ -373,6 +402,20 @@ class AddonManager:
if not hook_func:
return
try:
if call_async:
old_hook_func = hook_func
# Wrapper so we can invoke an async hook synchronously.
def _wrapper(*w_args, **w_kwargs):
cls.SCHEDULER.schedule_task(
old_hook_func(*w_args, **w_kwargs),
scope=TaskLifeScope.ADDON,
creator=addon,
)
# Fall through to any other handlers as well;
# async handlers don't chain.
return None
hook_func = _wrapper
return hook_func(*args, **kwargs)
except:
logging.exception("Exploded in %r's %s hook" % (addon, hook_name))
@@ -410,26 +453,36 @@ class AddonManager:
raise
return True
if message.name == "ChatFromSimulator" and "ChatData" in message:
chat: str = message["ChatData"]["Message"]
chat_type: int = message["ChatData"]["ChatType"]
# RLV-style OwnerSay?
if chat and chat.startswith("@") and chat_type == 8:
# RLV-style command, `@<cmd>(:<option1>;<option2>)?(=<param>)?`
options, _, param = chat.rpartition("=")
cmd, _, options = options.lstrip("@").partition(":")
options = options.split(";")
if RLVParser.is_rlv_message(message):
# RLV allows putting multiple commands into one message, blindly splitting on ",".
all_cmds_handled = True
chat: str = message["ChatData"]["Message"]
source = message["ChatData"]["SourceID"]
try:
with addon_ctx.push(session, region):
handled = cls._call_all_addon_hooks("handle_rlv_command",
session, region, source, cmd, options, param)
if handled:
region.circuit.drop_message(message)
return True
except:
LOG.exception(f"Failed while handling command {chat!r}")
if not cls._SWALLOW_ADDON_EXCEPTIONS:
raise
for command in RLVParser.parse_chat(chat):
try:
with addon_ctx.push(session, region):
handled = cls._call_all_addon_hooks(
"handle_rlv_command",
session,
region,
source,
command.behaviour,
command.options,
command.param,
)
if handled:
region.circuit.drop_message(message)
else:
all_cmds_handled = False
except:
LOG.exception(f"Failed while handling command {command!r}")
all_cmds_handled = False
if not cls._SWALLOW_ADDON_EXCEPTIONS:
raise
# Drop the chat message if all commands it contained were handled by an addon
if all_cmds_handled:
return True
with addon_ctx.push(session, region):
return cls._call_all_addon_hooks("handle_lludp_message", session, region, message)
@@ -510,9 +563,9 @@ class AddonManager:
@classmethod
def handle_object_updated(cls, session: Session, region: ProxiedRegion,
obj: Object, updated_props: Set[str]):
obj: Object, updated_props: Set[str], msg: Optional[Message]):
with addon_ctx.push(session, region):
return cls._call_all_addon_hooks("handle_object_updated", session, region, obj, updated_props)
return cls._call_all_addon_hooks("handle_object_updated", session, region, obj, updated_props, msg)
@classmethod
def handle_object_killed(cls, session: Session, region: ProxiedRegion, obj: Object):
@@ -526,6 +579,11 @@ class AddonManager:
with addon_ctx.push(session, region):
return cls._call_all_addon_hooks("handle_region_changed", session, region)
@classmethod
def handle_region_registered(cls, session: Session, region: ProxiedRegion):
with addon_ctx.push(session, region):
return cls._call_all_addon_hooks("handle_region_registered", session, region)
@classmethod
def handle_circuit_created(cls, session: Session, region: ProxiedRegion):
with addon_ctx.push(session, region):
@@ -537,3 +595,7 @@ class AddonManager:
with addon_ctx.push(session, region):
return cls._call_all_addon_hooks("handle_proxied_packet", session_manager,
packet, session, region)
@classmethod
def handle_leap_client_added(cls, session_manager: SessionManager, leap_client: outleap.LEAPClient):
return cls._call_all_addon_hooks("handle_leap_client_added", session_manager, leap_client, call_async=True)

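A rough illustration of the parsing that the new multi-command RLV dispatch above relies on; the chat string is invented:

from hippolyzer.lib.client.rlv import RLVParser

chat = "@detach=n,sittp:2.5=force"
for command in RLVParser.parse_chat(chat):
    # Roughly: "detach" with param "n", then "sittp" with option "2.5" and param "force";
    # these are the behaviour / options / param handed to each handle_rlv_command hook.
    print(command.behaviour, command.options, command.param)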
View File

@@ -0,0 +1,39 @@
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.inventory import InventoryItem
from hippolyzer.lib.base.message.message import Message, Block
from hippolyzer.lib.base.network.transport import Direction
from hippolyzer.lib.client.asset_uploader import AssetUploader
class ProxyAssetUploader(AssetUploader):
async def _handle_upload_complete(self, resp_payload: dict):
# Check if this is a failure response first, raising if it is
await super()._handle_upload_complete(resp_payload)
# Fetch enough data from AIS to tell the viewer about the new inventory item
session = self._region.session()
item_id = resp_payload["new_inventory_item"]
ais_req_data = {
"items": [
{
"owner_id": session.agent_id,
"item_id": item_id,
}
]
}
async with self._region.caps_client.post('FetchInventory2', llsd=ais_req_data) as resp:
ais_item = InventoryItem.from_llsd((await resp.read_llsd())["items"][0], flavor="ais")
# Got it, ship it off to the viewer
message = Message(
"UpdateCreateInventoryItem",
Block(
"AgentData",
AgentID=session.agent_id,
SimApproved=1,
TransactionID=UUID.random(),
),
ais_item.to_inventory_data(),
direction=Direction.IN
)
self._region.circuit.send(message)

View File

@@ -0,0 +1,93 @@
from __future__ import annotations
import enum
import typing
from weakref import ref
from typing import *
if TYPE_CHECKING:
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session, SessionManager
def is_asset_server_cap_name(cap_name):
return cap_name and (
cap_name.startswith("GetMesh")
or cap_name.startswith("GetTexture")
or cap_name.startswith("ViewerAsset")
)
class CapType(enum.Enum):
NORMAL = enum.auto()
TEMPORARY = enum.auto()
WRAPPER = enum.auto()
PROXY_ONLY = enum.auto()
@property
def fake(self) -> bool:
return self == CapType.PROXY_ONLY or self == CapType.WRAPPER
class SerializedCapData(typing.NamedTuple):
cap_name: typing.Optional[str] = None
region_addr: typing.Optional[str] = None
session_id: typing.Optional[str] = None
base_url: typing.Optional[str] = None
type: str = "NORMAL"
def __bool__(self):
return bool(self.cap_name or self.session_id)
@property
def asset_server_cap(self):
return is_asset_server_cap_name(self.cap_name)
class CapData(NamedTuple):
cap_name: Optional[str] = None
# Actually they're weakrefs but the type sigs suck.
region: Optional[Callable[[], Optional[ProxiedRegion]]] = None
session: Optional[Callable[[], Optional[Session]]] = None
base_url: Optional[str] = None
type: CapType = CapType.NORMAL
def __bool__(self):
return bool(self.cap_name or self.session)
def serialize(self) -> "SerializedCapData":
return SerializedCapData(
cap_name=self.cap_name,
region_addr=str(self.region().circuit_addr) if self.region and self.region() else None,
session_id=str(self.session().id) if self.session and self.session() else None,
base_url=self.base_url,
type=self.type.name,
)
@classmethod
def deserialize(
cls,
ser_cap_data: "SerializedCapData",
session_mgr: Optional[SessionManager],
) -> "CapData":
cap_session = None
cap_region = None
if session_mgr and ser_cap_data.session_id:
for session in session_mgr.sessions:
if ser_cap_data.session_id == str(session.id):
cap_session = session
if cap_session and ser_cap_data.region_addr:
for region in cap_session.regions:
if ser_cap_data.region_addr == str(region.circuit_addr):
cap_region = region
return cls(
cap_name=ser_cap_data.cap_name,
region=ref(cap_region) if cap_region else None,
session=ref(cap_session) if cap_session else None,
base_url=ser_cap_data.base_url,
type=CapType[ser_cap_data.type],
)
@property
def asset_server_cap(self) -> bool:
return is_asset_server_cap_name(self.cap_name)

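A small round-trip sketch using the classes above, as when cap metadata is handed to another process and re-hydrated later; the cap name and URL are placeholders:

ser = CapData(cap_name="ViewerAsset", base_url="https://example.invalid/cap").serialize()
assert ser.asset_server_cap  # GetMesh*/GetTexture*/ViewerAsset* count as asset server caps

# Without a SessionManager the region/session weakrefs simply stay unresolved
cap_data = CapData.deserialize(ser, session_mgr=None)
print(cap_data.cap_name, cap_data.type)  # ViewerAsset CapType.NORMAL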
View File

@@ -20,7 +20,7 @@ class ProxyCapsClient(CapsClient):
def _get_caps(self) -> Optional[CAPS_DICT]:
if not self._region:
return None
return self._region.caps
return self._region.cap_urls
def _request_fixups(self, cap_or_url: str, headers: Dict, proxy: Optional[bool], ssl: Any):
# We want to proxy this through Hippolyzer
@@ -28,7 +28,8 @@ class ProxyCapsClient(CapsClient):
# We go through the proxy by default; tack on a header letting mitmproxy know the
# request came from us so we can tag the request as injected. The header will be popped
# off before passing through to the server.
headers["X-Hippo-Injected"] = "1"
if "X-Hippo-Injected" not in headers:
headers["X-Hippo-Injected"] = "1"
proxy_port = self._settings.HTTP_PROXY_PORT
proxy = f"http://127.0.0.1:{proxy_port}"
# TODO: set up the SSLContext to validate mitmproxy's cert

View File

@@ -25,7 +25,7 @@ class ProxiedCircuit(Circuit):
except:
logging.exception(f"Failed to serialize: {message.to_dict()!r}")
raise
if self.logging_hook and message.injected:
if self.logging_hook and message.synthetic:
self.logging_hook(message)
return self.send_datagram(serialized, message.direction, transport=transport)
@@ -34,44 +34,46 @@ class ProxiedCircuit(Circuit):
return self.out_injections, self.in_injections
return self.in_injections, self.out_injections
def prepare_message(self, message: Message, direction=None):
def prepare_message(self, message: Message):
if message.finalized:
raise RuntimeError(f"Trying to re-send finalized {message!r}")
direction = direction or getattr(message, 'direction')
fwd_injections, reverse_injections = self._get_injections(direction)
if message.queued:
# This is due to be dropped, nothing should be sending the original
raise RuntimeError(f"Trying to send original of queued {message!r}")
fwd_injections, reverse_injections = self._get_injections(message.direction)
message.finalized = True
# Injected, let's gen an ID
if message.packet_id is None:
message.packet_id = fwd_injections.gen_injectable_id()
message.injected = True
else:
message.synthetic = True
# This message wasn't injected by the proxy so we need to rewrite packet IDs
# to account for IDs the real creator of the packet couldn't have known about.
elif not message.synthetic:
# was_dropped needs the unmodified packet ID
if fwd_injections.was_dropped(message.packet_id) and message.name != "PacketAck":
logging.warning("Attempting to re-send previously dropped %s:%s, did we ack?" %
(message.packet_id, message.name))
message.packet_id = fwd_injections.get_effective_id(message.packet_id)
fwd_injections.track_seen(message.packet_id)
message.finalized = True
if not message.injected:
# This message wasn't injected by the proxy so we need to rewrite packet IDs
# to account for IDs the other parties couldn't have known about.
message.acks = tuple(
reverse_injections.get_original_id(x) for x in message.acks
if not reverse_injections.was_injected(x)
)
if message.name == "PacketAck":
if not self._rewrite_packet_ack(message, reverse_injections):
logging.debug(f"Dropping {direction} ack for injected packets!")
if not self._rewrite_packet_ack(message, reverse_injections) and not message.acks:
logging.debug(f"Dropping {message.direction} ack for injected packets!")
# Let the caller know this shouldn't be sent at all; it's strictly ACKs for
# injected packets.
return False
elif message.name == "StartPingCheck":
self._rewrite_start_ping_check(message, fwd_injections)
if not message.acks:
if message.acks:
message.send_flags |= PacketFlags.ACK
else:
message.send_flags &= ~PacketFlags.ACK
return True
@@ -97,15 +99,18 @@ class ProxiedCircuit(Circuit):
new_id = fwd_injections.get_effective_id(orig_id)
if orig_id != new_id:
logging.debug("Rewrote oldest unacked %s -> %s" % (orig_id, new_id))
# Get a list of unacked IDs for the direction this StartPingCheck is heading
fwd_unacked = (a for (d, a) in self.unacked_reliable.keys() if d == message.direction)
# Use the proxy's oldest unacked ID if it's older than the client's
new_id = min((new_id, *fwd_unacked))
message["PingID"]["OldestUnacked"] = new_id
def drop_message(self, message: Message, orig_direction=None):
def drop_message(self, message: Message):
if message.finalized:
raise RuntimeError(f"Trying to drop finalized {message!r}")
if message.packet_id is None:
return
orig_direction = orig_direction or message.direction
fwd_injections, reverse_injections = self._get_injections(orig_direction)
fwd_injections, reverse_injections = self._get_injections(message.direction)
fwd_injections.mark_dropped(message.packet_id)
message.dropped = True
@@ -113,7 +118,7 @@ class ProxiedCircuit(Circuit):
# Was sent reliably, tell the other end that we saw it and to shut up.
if message.reliable:
self.send_acks([message.packet_id], ~orig_direction)
self.send_acks([message.packet_id], ~message.direction)
# This packet had acks for the other end, send them in a separate PacketAck
effective_acks = tuple(
@@ -121,7 +126,7 @@ class ProxiedCircuit(Circuit):
if not reverse_injections.was_injected(x)
)
if effective_acks:
self.send_acks(effective_acks, orig_direction, packet_id=message.packet_id)
self.send_acks(effective_acks, message.direction, packet_id=message.packet_id)
class InjectionTracker:

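With the direction now read off the message itself, dropping a proxied message from an addon hook reduces to something like this hypothetical handler:

def handle_lludp_message(self, session, region, message):
    if message.name == "ChatFromViewer":
        # No direction argument any more: drop_message() uses message.direction,
        # and still ACKs the sender if the packet was sent reliably.
        region.circuit.drop_message(message)
        # A truthy return stops later addons from seeing the message.
        return True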
View File

@@ -26,6 +26,10 @@ class CommandDetails(NamedTuple):
lifetime: Optional[TaskLifeScope] = None
def parse_bool(val: str) -> bool:
return val.lower() in ('on', 'true', '1', '1.0', 'yes')
def handle_command(command_name: Optional[str] = None, /, *, lifetime: Optional[TaskLifeScope] = None,
single_instance: bool = False, **params: Union[Parameter, callable]):
"""
@@ -61,13 +65,13 @@ def handle_command(command_name: Optional[str] = None, /, *, lifetime: Optional[
# Greedy, takes the rest of the message
if param.sep is None:
param_val = message
message = None
message = ""
else:
message = message.lstrip(param.sep)
if not message:
if param.optional:
break
raise KeyError(f"Missing parameter {param_name}")
if not param.optional:
raise KeyError(f"Missing parameter {param_name}")
continue
param_val, _, message = message.partition(param.sep) # type: ignore
param_vals[param_name] = param.parser(param_val)

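For reference, parse_bool above accepts a fixed set of truthy strings case-insensitively; anything else is False:

parse_bool("ON")     # True
parse_bool("1.0")    # True
parse_bool("0")      # False
parse_bool("maybe")  # False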
View File

@@ -58,7 +58,7 @@ class HTTPAssetRepo(collections.UserDict):
return False
asset = self[asset_id]
flow.response = http.HTTPResponse.make(
flow.response = http.Response.make(
content=asset.data,
headers={
"Content-Type": "application/octet-stream",

Some files were not shown because too many files have changed in this diff.