77 Commits

Author SHA1 Message Date
Salad Dais
d498d1f2c8 v0.11.0 2022-07-18 08:53:24 +00:00
Salad Dais
8c0635bb2a Add classmethod for rebuilding TEs into a TECollection 2022-07-18 06:37:20 +00:00
Salad Dais
309dbeeb52 Add TextureEntry.st_to_uv() to convert between coords 2022-07-18 00:34:56 +00:00
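A hedged sketch of what an ST→UV conversion like this typically involves (the parameter names and rotation convention below are assumptions for illustration, not Hippolyzer's actual API): texture ST coordinates get rotated about the face center, then scaled by the repeats and shifted by the offsets.

```python
import math


def st_to_uv(s, t, repeats_u=1.0, repeats_v=1.0,
             offset_u=0.0, offset_v=0.0, rotation=0.0):
    """Map face-local ST coords to UV given repeats/offsets/rotation."""
    # Recenter so rotation happens about the middle of the face.
    s, t = s - 0.5, t - 0.5
    cos_r, sin_r = math.cos(rotation), math.sin(rotation)
    rs = s * cos_r + t * sin_r
    rt = -s * sin_r + t * cos_r
    # Apply repeats, restore the center, then apply offsets.
    return (rs * repeats_u + 0.5 + offset_u,
            rt * repeats_v + 0.5 + offset_v)
```

With identity parameters the mapping is a no-op, which makes it easy to sanity-check against captured TextureEntry data.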
Salad Dais
4cc87bf81e Add a default value for TextureEntryCollection.realize() num_faces 2022-07-17 01:09:22 +00:00
Salad Dais
f34bb42dcb TextureEntry -> TextureEntryCollection, improve .realize()
The "TextureEntry" name from the message template is kind of a
misnomer; the field actually includes multiple TextureEntries.
2022-07-17 00:45:20 +00:00
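The rename matches the wire format: a serialized TextureEntry field carries one default value per attribute plus per-face exceptions, and `realize()` presumably expands that into a flat per-face view. A minimal sketch of that expansion (the function shape here is illustrative, not Hippolyzer's actual signature):

```python
def realize_faces(default, exceptions, num_faces):
    """Expand a default value plus per-face overrides into a per-face list."""
    # Every face starts at the default...
    faces = [default] * num_faces
    # ...then per-face exceptions override it.
    for face_idx, value in exceptions.items():
        if 0 <= face_idx < num_faces:
            faces[face_idx] = value
    return faces
```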
Salad Dais
59ec99809a Correct TE rotation quantization
Literally everything has its own special float quantization. Argh.
2022-07-16 23:17:34 +00:00
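The pattern behind fixes like this is plain range quantization, where each field has its own range and bit width; a generic sketch (the ranges below are made up, not the actual TE rotation encoding):

```python
def quantize(value, lo, hi, bits=16):
    # Map a float in [lo, hi] onto an unsigned integer grid of 2**bits - 1 steps.
    max_int = (1 << bits) - 1
    q = round((value - lo) / (hi - lo) * max_int)
    return max(0, min(max_int, q))


def dequantize(q, lo, hi, bits=16):
    # Inverse mapping; only exact up to the grid's step size.
    max_int = (1 << bits) - 1
    return lo + (q / max_int) * (hi - lo)
```

The bug class the commit complains about is exactly this: each field picks its own `lo`/`hi`/`bits`, so reusing another field's constants decodes to subtly wrong values.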
Salad Dais
4b963f96d2 Add TextureEntry.realize() to ease indexing into specific faces 2022-07-14 03:10:11 +00:00
Salad Dais
58db8f66de Correct type signatures for TextureEntry 2022-07-10 17:58:13 +00:00
Salad Dais
95623eba58 More InventoryModel fixes 2022-07-10 01:55:34 +00:00
Salad Dais
8dba0617bd Make injecting inventory EQ events easier 2022-07-09 04:21:44 +00:00
Salad Dais
289073be8e Add InventoryModel diffing 2022-07-09 02:48:23 +00:00
Salad Dais
f3c8015366 Support mutable InventoryModels 2022-07-08 22:06:14 +00:00
Salad Dais
99e8118458 Support HIPPO XML directives in injected EQ events 2022-07-05 14:24:35 +00:00
Salad Dais
80745cfd1c Add TextureEntry.unwrap() to ease working with potentially lazy TEs 2022-07-05 03:08:52 +00:00
Salad Dais
92a06bccaf Dequantize OffsetS and OffsetT in TextureEntrys 2022-07-05 02:08:53 +00:00
Salad Dais
fde9ddf4d9 Initial work to support in-flight EQ response pre-emption 2022-07-04 17:57:05 +00:00
Salad Dais
03a56c9982 Auto-load certain symbols in REPL, add docs for REPL 2022-06-27 01:49:27 +00:00
Salad Dais
d07a0df0fd WIP LLMesh -> Collada
First half of the LLMesh -> Collada -> LLMesh transform for #24
2022-06-24 13:15:20 +00:00
Salad Dais
848397fe63 Fix windows build workflow 2022-06-24 07:36:51 +00:00
Salad Dais
0f9246c5c6 Use github.ref_name instead of github.ref 2022-06-24 02:32:50 +00:00
Salad Dais
2e7f887970 v0.10.0 2022-06-24 01:54:37 +00:00
Salad Dais
ef9df6b058 Update Windows bundling action to add artifact to release 2022-06-24 01:12:21 +00:00
Salad Dais
baae0f6d6e Fix TupleCoord negation 2022-06-21 07:15:49 +00:00
Salad Dais
0f369b682d Upgrade to mitmproxy 8.0
Not 8.1 since that drops Python 3.8 support. Closes #26
2022-06-20 15:15:57 +00:00
Salad Dais
1f1e4de254 Add addon for testing object manager conformance against viewer
Closes #18
2022-06-20 12:38:11 +00:00
Salad Dais
75ddc0a5ba Be smarter about object cache miss autorequests 2022-06-20 12:33:12 +00:00
Salad Dais
e4cb168138 Clear up last few event loop warnings 2022-06-20 12:31:08 +00:00
Salad Dais
63aebba754 Clear up some event loop deprecation warnings 2022-06-20 05:55:01 +00:00
Salad Dais
8cf1a43d59 Better defaults when parsing ObjectUpdateCompressed
This helps our view of the cache better match the viewer's VOCache
2022-06-20 03:23:46 +00:00
Salad Dais
bbc8813b61 Add unary minus for TupleCoords 2022-06-19 04:33:20 +00:00
Salad Dais
5b51dbd30f Add workaround instructions for most recent Firestorm release
Closes #25
2022-05-13 23:52:50 +00:00
Salad Dais
295c7972e7 Use windows-2019 runner instead of windows-latest
windows-latest has some weird ACL changes that cause the cx_Freeze
packaging steps to fail.
2022-05-13 23:39:37 +00:00
Salad Dais
b034661c38 Revert "Temporarily stop generating lib_licenses.txt automatically"
This reverts commit f12fd95ee1.
2022-05-13 23:39:09 +00:00
Salad Dais
f12fd95ee1 Temporarily stop generating lib_licenses.txt automatically
Something is busted with pip-licenses in CI. Not sure why, but
it's only needed for Windows builds anyway.
2022-03-12 19:15:59 +00:00
Salad Dais
bc33313fc7 v0.9.0 2022-03-12 18:40:38 +00:00
Salad Dais
affc7fcf89 Clarify comment in proxy object manager 2022-03-05 11:03:28 +00:00
Salad Dais
b8f1593a2c Allow filtering on HTTP status code 2022-03-05 10:50:09 +00:00
Salad Dais
7879f4e118 Split up mitmproxy integration test a bit 2022-03-05 10:49:55 +00:00
Salad Dais
4ba611ae01 Only apply local mesh to selected links 2022-02-28 07:32:46 +00:00
Salad Dais
82ff6d9c64 Add more TeleportFlags 2022-02-28 07:32:22 +00:00
Salad Dais
f603ea6186 Better handle timeouts that have missing cap_data metadata 2021-12-18 20:43:10 +00:00
Salad Dais
fcf6a4568b Better handling for proxied HTTP requests that time out 2021-12-17 19:27:20 +00:00
Salad Dais
2ad6cc1b51 Better handle broken 'LLSD' responses 2021-12-17 00:18:51 +00:00
Salad Dais
025f7d31f2 Make sure .queued is cleared if message take()n twice 2021-12-15 20:17:54 +00:00
Salad Dais
9fdb281e4a Create example addon for simulating packet loss 2021-12-13 06:12:43 +00:00
Salad Dais
11e28bde2a Allow filtering message log on HTTP headers 2021-12-11 15:08:45 +00:00
Salad Dais
1faa6f977c Update docs on send() and send_reliable() 2021-12-10 13:41:20 +00:00
Salad Dais
6866e7397f Clean up cap registration API 2021-12-10 13:22:54 +00:00
Salad Dais
fa0b3a5340 Mark all Messages synthetic unless they came off the wire 2021-12-10 07:30:02 +00:00
Salad Dais
16c808bce8 Match viewer resend behaviour 2021-12-10 07:04:36 +00:00
Salad Dais
ec4b2d0770 Move last of the explicit direction params 2021-12-10 06:50:07 +00:00
Salad Dais
3b610fdfd1 Add awaitable send_reliable() 2021-12-09 05:30:35 +00:00
Salad Dais
8b93c5eefa Rename send_message() to send() 2021-12-09 05:30:12 +00:00
Salad Dais
f4bb9eae8f Fix __contains__ for JankStringyBytes 2021-12-09 03:48:29 +00:00
Salad Dais
ecb14197cf Make message log filter highlight every matched field
Previously only the first match was being highlighted.
2021-12-09 01:14:09 +00:00
Salad Dais
95fd58e25a Begin PySide6 cleanup 2021-12-09 00:02:48 +00:00
Salad Dais
afc333ab49 Improve highlighting of matched fields in message log 2021-12-08 23:50:16 +00:00
Salad Dais
eb6406bca4 Fix ACK collection logic for injected reliable messages 2021-12-08 22:29:29 +00:00
Salad Dais
d486aa130d Add support for specifying flags in message builder 2021-12-08 21:10:06 +00:00
Salad Dais
d66d5226a2 Initial implementation of reliable injected packets
See #17. Not yet tested for real.
2021-12-08 04:49:45 +00:00
Salad Dais
d86da70eeb v0.8.0 2021-12-07 07:16:25 +00:00
Salad Dais
aa0b4b63a9 Update cx_freeze script to handle PySide6 2021-12-07 07:16:25 +00:00
Salad Dais
5f479e46b4 Automatically offer to install the HTTPS certs on first run 2021-12-07 07:16:25 +00:00
Salad Dais
1e55d5a9d8 Continue handling HTTP flows if flow logging fails
If flow beautification for display throws, we don't want
to bypass other handling of the flow.

This fixes a login failure due to SL's login XML-RPC endpoint
returning a Content-Type of "application/llsd+xml\r\n" when it's
actually "application/xml".
2021-12-06 17:01:13 +00:00
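The fix described boils down to treating logging and display beautification as best-effort; a sketch of the pattern (the function names here are illustrative, not the proxy's actual internals):

```python
import logging


def handle_flow(flow, log_flow, continue_handling):
    # Beautification/logging failures must not prevent the flow from
    # being handled: swallow and record the error, then proceed.
    try:
        log_flow(flow)
    except Exception:
        logging.exception("Failed to log flow; continuing")
    continue_handling(flow)
```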
Salad Dais
077a95b5e7 Migrate to PySide6 to support Python 3.10
Update Glymur too
2021-12-06 13:37:31 +00:00
Salad Dais
4f1399cf66 Add note about LinHippoAutoProxy 2021-12-06 12:26:16 +00:00
Salad Dais
9590b30e66 Add note about Python 3.10 support 2021-12-05 20:25:06 +00:00
Salad Dais
34f3ee4c3e Move mtime wrapper to helpers 2021-12-05 18:14:26 +00:00
Salad Dais
7d655543f5 Don't reserialize responses as pretty LLSD-XML
Certain LLSD parsers don't like the empty text nodes it adds around
the root element of the document. Yuck.
2021-12-05 18:12:53 +00:00
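The problem is easy to reproduce with any XML pretty-printer: the added indentation becomes whitespace-only text nodes between elements, which a strict parser may treat as unexpected content. For example, with the standard library:

```python
import xml.dom.minidom as minidom

compact = "<llsd><map><key>ok</key><integer>1</integer></map></llsd>"
pretty = minidom.parseString(compact).toprettyxml(indent="  ")

# The pretty form interleaves whitespace-only text nodes between elements;
# serializing compactly avoids tripping up strict LLSD parsers.
assert "\n" not in compact
assert "\n" in pretty
```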
Salad Dais
5de3ed0d5e Add support for LLSD inventory representations 2021-12-03 05:59:58 +00:00
Salad Dais
74c3287cc0 Add base addon for creating proxy-only caps based on ASGI apps 2021-12-02 06:04:29 +00:00
Salad Dais
3a7f8072a0 Initial implementation of proxy-provided caps
Useful for mocking out a cap while developing the viewer-side
pieces of it.
2021-12-02 03:22:47 +00:00
dependabot[bot]
5fa91580eb Bump mitmproxy from 7.0.2 to 7.0.3 (#21)
Bumps [mitmproxy](https://github.com/mitmproxy/mitmproxy) from 7.0.2 to 7.0.3.
- [Release notes](https://github.com/mitmproxy/mitmproxy/releases)
- [Changelog](https://github.com/mitmproxy/mitmproxy/blob/main/CHANGELOG.md)
- [Commits](https://github.com/mitmproxy/mitmproxy/compare/v7.0.2...v7.0.3)

---
updated-dependencies:
- dependency-name: mitmproxy
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-11-30 05:30:06 -04:00
Salad Dais
d8fbb55438 Improve LLUDP integration tests 2021-11-30 09:25:31 +00:00
Salad Dais
99eb4fed74 Fix _reorient_coord to work correctly for normals again 2021-11-30 09:24:49 +00:00
Salad Dais
6b78b841df Fix range of mesh normals 2021-11-23 01:36:14 +00:00
Salad Dais
dae852db69 Fix filter dialog 2021-11-19 04:30:36 +00:00
75 changed files with 2478 additions and 517 deletions

View File

@@ -2,18 +2,23 @@
# onto the release after it gets created. Don't want actions with repo write.
name: Bundle Windows EXE
on:
# Only trigger on release creation
release:
types:
- created
workflow_dispatch:
env:
target_tag: ${{ github.ref_name }}
jobs:
build:
runs-on: windows-latest
runs-on: windows-2019
permissions:
contents: write
strategy:
matrix:
python-version: [3.9]
@@ -34,14 +39,24 @@ jobs:
pip install cx_freeze
- name: Bundle with cx_Freeze
shell: bash
run: |
python setup_cxfreeze.py build_exe
pip install pip-licenses
pip-licenses --format=plain-vertical --with-license-file --no-license-path --output-file=lib_licenses.txt
python setup_cxfreeze.py finalize_cxfreeze
# Should only be one, but we don't know what it's named
mv ./dist/*.zip hippolyzer-windows-${{ env.target_tag }}.zip
- name: Upload the artifact
uses: actions/upload-artifact@v2
with:
name: hippolyzer-gui-windows-${{ github.sha }}
path: ./dist/**
name: hippolyzer-windows-${{ github.sha }}
path: ./hippolyzer-windows-${{ env.target_tag }}.zip
- uses: ncipollo/release-action@v1.10.0
with:
artifacts: hippolyzer-windows-${{ env.target_tag }}.zip
tag: ${{ env.target_tag }}
token: ${{ secrets.GITHUB_TOKEN }}
allowUpdates: true

View File

@@ -8,7 +8,7 @@ jobs:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: [3.8, 3.9]
python-version: ["3.8", "3.10"]
steps:
- uses: actions/checkout@v2

View File

@@ -83,6 +83,28 @@ SOCKS 5 works correctly on these platforms, so you can just configure it through
the `no_proxy` env var appropriately. For ex. `no_proxy="asset-cdn.glb.agni.lindenlab.com" ./firestorm`.
* Log in!
##### Firestorm
The proxy selection dialog in the most recent Firestorm release is non-functional, as
https://bitbucket.org/lindenlab/viewer/commits/454c7f4543688126b2fa5c0560710f5a1733702e was not pulled in.
As a workaround, you can go to `Debug -> Show Debug Settings` and enter the following values:
| Name | Value |
|---------------------|-----------|
| HttpProxyType | Web |
| BrowserProxyAddress | 127.0.0.1 |
| BrowserProxyEnabled | TRUE |
| BrowserProxyPort | 9062 |
| Socks5ProxyEnabled | TRUE |
| Socks5ProxyHost | 127.0.0.1 |
| Socks5ProxyPort | 9061 |
Or, if you're on Linux, you can also use [LinHippoAutoProxy](https://github.com/SaladDais/LinHippoAutoProxy).
Connections from the in-viewer browser will likely _not_ be run through Hippolyzer when using either of
these workarounds.
### Filtering
By default, the proxy's display filter is configured to ignore many high-frequency messages.
@@ -311,6 +333,13 @@ If you are a viewer developer, please put them in a viewer.
apply the mesh to the local mesh target. It works on attachments too. Useful for testing rigs before a
final, real upload.
## REPL
A quick and dirty REPL is also included for when you want to do ad-hoc introspection of proxy state.
It can be launched at any time by typing `/524 spawn_repl` in chat.
![Screenshot of REPL](https://github.com/SaladDais/Hippolyzer/blob/master/static/repl_screenshot.png?raw=true)
## Potential Changes
* AISv3 wrapper?
@@ -375,6 +404,12 @@ To have your client's traffic proxied through Hippolyzer the general flow is:
* The proxy needs to use content sniffing to figure out which requests are login requests,
so make sure your request would pass `MITMProxyEventManager._is_login_request()`
#### Do I have to do all that?
You might be able to automate some of it on Linux by using
[LinHippoAutoProxy](https://github.com/SaladDais/LinHippoAutoProxy). If you're on Windows or macOS, the
above is your only option.
### Should I use this library to make an SL client in Python?
No. If you just want to write a client in Python, you should instead look at using

View File

@@ -11,7 +11,7 @@ import enum
import os.path
from typing import *
from PySide2 import QtCore, QtGui, QtWidgets
from PySide6 import QtCore, QtGui, QtWidgets
from hippolyzer.lib.base.datatypes import Vector3
from hippolyzer.lib.base.message.message import Block, Message
@@ -80,7 +80,7 @@ class BlueishObjectListGUIAddon(BaseAddon):
raise
def _highlight_object(self, session: Session, obj: Object):
session.main_region.circuit.send_message(Message(
session.main_region.circuit.send(Message(
"ForceObjectSelect",
Block("Header", ResetList=False),
Block("Data", LocalID=obj.LocalID),
@@ -88,7 +88,7 @@ class BlueishObjectListGUIAddon(BaseAddon):
))
def _teleport_to_object(self, session: Session, obj: Object):
session.main_region.circuit.send_message(Message(
session.main_region.circuit.send(Message(
"TeleportLocationRequest",
Block("AgentData", AgentID=session.agent_id, SessionID=session.id),
Block(

View File

@@ -20,13 +20,13 @@ bulk upload, like changing priority or removing a joint.
"""
import asyncio
import os
import pathlib
from abc import abstractmethod
from typing import *
from hippolyzer.lib.base import serialization as se
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.helpers import get_mtime
from hippolyzer.lib.base.llanim import Animation
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.proxy import addon_ctx
@@ -39,13 +39,6 @@ from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session, SessionManager
def _get_mtime(path: str):
try:
return os.stat(path).st_mtime
except:
return None
class LocalAnimAddon(BaseAddon):
# name -> path, only for anims actually from files
local_anim_paths: Dict[str, str] = SessionProperty(dict)
@@ -166,7 +159,7 @@ class LocalAnimAddon(BaseAddon):
cls.local_anim_playing_ids.pop(anim_name, None)
cls.local_anim_bytes.pop(anim_name, None)
region.circuit.send_message(new_msg)
region.circuit.send(new_msg)
print(f"Changing {anim_name} to {next_id}")
@classmethod
@@ -176,7 +169,7 @@ class LocalAnimAddon(BaseAddon):
anim_data = None
if anim_path:
old_mtime = cls.local_anim_mtimes.get(anim_name)
mtime = _get_mtime(anim_path)
mtime = get_mtime(anim_path)
if only_if_changed and old_mtime == mtime:
return

View File

@@ -81,17 +81,16 @@ class MeshUploadInterceptingAddon(BaseAddon):
@handle_command()
async def set_local_mesh_target(self, session: Session, region: ProxiedRegion):
"""Set the currently selected object as the target for local mesh"""
parent_object = region.objects.lookup_localid(session.selected.object_local)
if not parent_object:
"""Set the currently selected objects as the target for local mesh"""
selected_links = [region.objects.lookup_localid(l_id) for l_id in session.selected.object_locals]
selected_links = [o for o in selected_links if o is not None]
if not selected_links:
show_message("Nothing selected")
return
linkset_objects = [parent_object] + parent_object.Children
old_locals = self.local_mesh_target_locals
self.local_mesh_target_locals = [
x.LocalID
for x in linkset_objects
for x in selected_links
if ExtraParamType.MESH in x.ExtraParams
]

View File

@@ -16,18 +16,23 @@ import local_mesh
AddonManager.hot_reload(local_mesh, require_addons_loaded=True)
def _reorient_coord(coord, orientation):
def _reorient_coord(coord, orientation, normals=False):
coords = []
for axis in orientation:
axis_idx = abs(axis) - 1
coords.append(coord[axis_idx] if axis >= 0 else 1.0 - coord[axis_idx])
if normals:
# Normals have a static domain from -1.0 to 1.0, just negate.
new_coord = coord[axis_idx] if axis >= 0 else -coord[axis_idx]
else:
new_coord = coord[axis_idx] if axis >= 0 else 1.0 - coord[axis_idx]
coords.append(new_coord)
if coord.__class__ in (list, tuple):
return coord.__class__(coords)
return coord.__class__(*coords)
def _reorient_coord_list(coord_list, orientation):
return [_reorient_coord(x, orientation) for x in coord_list]
def _reorient_coord_list(coord_list, orientation, normals=False):
return [_reorient_coord(x, orientation, normals) for x in coord_list]
def reorient_mesh(orientation):
@@ -42,7 +47,7 @@ def reorient_mesh(orientation):
# flipping the axes around.
material["Position"] = _reorient_coord_list(material["Position"], orientation)
# Are you even supposed to do this to the normals?
material["Normal"] = _reorient_coord_list(material["Normal"], orientation)
material["Normal"] = _reorient_coord_list(material["Normal"], orientation, normals=True)
return mesh
return _reorienter

View File

@@ -126,14 +126,14 @@ class MessageMirrorAddon(BaseAddon):
# Send the message normally first if we're mirroring
if message.name in MIRROR:
region.circuit.send_message(message)
region.circuit.send(message)
# We're going to send the message on a new circuit, we need to take
# it so we get a new packet ID and clean ACKs
message = message.take()
self._lludp_fixups(target_session, message)
target_region.circuit.send_message(message)
target_region.circuit.send(message)
return True
def _lludp_fixups(self, target_session: Session, message: Message):
@@ -206,7 +206,7 @@ class MessageMirrorAddon(BaseAddon):
return
caps_source = target_region
new_base_url = caps_source.caps.get(cap_data.cap_name)
new_base_url = caps_source.cap_urls.get(cap_data.cap_name)
if not new_base_url:
print("No equiv cap?")
return

View File

@@ -0,0 +1,49 @@
"""
Example of proxy-provided caps
Useful for mocking out a cap that isn't actually implemented by the server
while developing the viewer-side pieces of it.
Implements a cap that accepts an `obj_id` UUID query parameter and returns
the name of the object.
"""
import asyncio
import asgiref.wsgi
from flask import Flask, Response, request
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.proxy import addon_ctx
from hippolyzer.lib.proxy.webapp_cap_addon import WebAppCapAddon
app = Flask("GetObjectNameCapApp")
@app.route('/')
async def get_object_name():
# Should always have the current region, the cap handler is bound to one.
# Just need to pull it from the `addon_ctx` module's global.
obj_mgr = addon_ctx.region.get().objects
obj_id = UUID(request.args['obj_id'])
obj = obj_mgr.lookup_fullid(obj_id)
if not obj:
return Response(f"Couldn't find {obj_id!r}", status=404, mimetype="text/plain")
try:
await asyncio.wait_for(obj_mgr.request_object_properties(obj)[0], 1.0)
except asyncio.TimeoutError:
return Response(f"Timed out requesting {obj_id!r}'s properties", status=500, mimetype="text/plain")
return Response(obj.Name, mimetype="text/plain")
class MockProxyCapExampleAddon(WebAppCapAddon):
# A cap URL with this name will be tied to each region when
# the sim is first connected to. The URL will be returned to the
# viewer in the Seed if the viewer requests it by name.
CAP_NAME = "GetObjectNameExample"
# Any asgi app should be fine.
APP = asgiref.wsgi.WsgiToAsgi(app)
addons = [MockProxyCapExampleAddon()]

View File

@@ -27,7 +27,7 @@ from mitmproxy.http import HTTPFlow
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.jp2_utils import BufferedJp2k
from hippolyzer.lib.base.multiprocessing_utils import ParentProcessWatcher
from hippolyzer.lib.base.templates import TextureEntry
from hippolyzer.lib.base.templates import TextureEntryCollection
from hippolyzer.lib.proxy.addon_utils import AssetAliasTracker, BaseAddon, GlobalProperty, AddonProcess
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.base.message.message import Message
@@ -148,7 +148,7 @@ class MonochromeAddon(BaseAddon):
message["RegionInfo"][field_name] = tracker.get_alias_uuid(val)
@staticmethod
def _make_te_monochrome(tracker: AssetAliasTracker, parsed_te: TextureEntry):
def _make_te_monochrome(tracker: AssetAliasTracker, parsed_te: TextureEntryCollection):
# Need a deepcopy because TEs are owned by the ObjectManager
# and we don't want to change the canonical view.
parsed_te = copy.deepcopy(parsed_te)

View File

@@ -0,0 +1,111 @@
"""
Check object manager state against region ViewerObject cache
Can't look at every object we've tracked and every object in VOCache
and report mismatches due to weird VOCache cache eviction criteria and certain
cacheable objects not being added to the VOCache.
Off the top of my head, animesh objects get explicit KillObjects at extreme
view distances same as avatars, but will still be present in the cache even
though they will not be in gObjectList.
"""
import asyncio
import logging
from typing import *
from hippolyzer.lib.base.objects import normalize_object_update_compressed_data
from hippolyzer.lib.base.templates import ObjectUpdateFlags, PCode
from hippolyzer.lib.proxy.addon_utils import BaseAddon, GlobalProperty
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import SessionManager, Session
from hippolyzer.lib.proxy.vocache import is_valid_vocache_dir, RegionViewerObjectCacheChain
LOG = logging.getLogger(__name__)
class ObjectManagementValidator(BaseAddon):
base_cache_path: Optional[str] = GlobalProperty(None)
orig_auto_request: Optional[bool] = GlobalProperty(None)
def handle_init(self, session_manager: SessionManager):
if self.orig_auto_request is None:
self.orig_auto_request = session_manager.settings.ALLOW_AUTO_REQUEST_OBJECTS
session_manager.settings.ALLOW_AUTO_REQUEST_OBJECTS = False
async def _choose_cache_path():
while not self.base_cache_path:
cache_dir = await AddonManager.UI.open_dir("Choose the base cache directory")
if not cache_dir:
return
if not is_valid_vocache_dir(cache_dir):
continue
self.base_cache_path = cache_dir
if not self.base_cache_path:
self._schedule_task(_choose_cache_path(), session_scoped=False)
def handle_unload(self, session_manager: SessionManager):
session_manager.settings.ALLOW_AUTO_REQUEST_OBJECTS = self.orig_auto_request
def handle_session_init(self, session: Session):
# Use only the specified cache path for the vocache
session.cache_dir = self.base_cache_path
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
if message.name != "DisableSimulator":
return
# Send it off to the client without handling it normally,
# we need to defer region teardown in the proxy
region.circuit.send(message)
self._schedule_task(self._check_cache_before_region_teardown(region))
return True
async def _check_cache_before_region_teardown(self, region: ProxiedRegion):
await asyncio.sleep(0.5)
print("Ok, checking cache differences")
try:
# Index will have been rewritten, so re-read it.
region_cache_chain = RegionViewerObjectCacheChain.for_region(
handle=region.handle,
cache_id=region.cache_id,
cache_dir=self.base_cache_path
)
if not region_cache_chain.region_caches:
print(f"no caches for {region!r}?")
return
all_full_ids = set()
for obj in region.objects.all_objects:
cacheable = True
orig_obj = obj
# Walk along the ancestry checking for things that would make the tree non-cacheable
while obj is not None:
if obj.UpdateFlags & ObjectUpdateFlags.TEMPORARY_ON_REZ:
cacheable = False
if obj.PCode == PCode.AVATAR:
cacheable = False
obj = obj.Parent
if cacheable:
all_full_ids.add(orig_obj.FullID)
for key in all_full_ids:
obj = region.objects.lookup_fullid(key)
cached_data = region_cache_chain.lookup_object_data(obj.LocalID, obj.CRC)
if not cached_data:
continue
orig_dict = obj.to_dict()
parsed_data = normalize_object_update_compressed_data(cached_data)
updated = obj.update_properties(parsed_data)
# Can't compare this yet
updated -= {"TextureEntry"}
if updated:
print(key)
for attr in updated:
print("\t", attr, orig_dict[attr], parsed_data[attr])
finally:
# Ok to teardown region in the proxy now
region.mark_dead()
addons = [ObjectManagementValidator()]

View File

@@ -37,7 +37,7 @@ class PaydayAddon(BaseAddon):
chat_type=ChatType.SHOUT,
)
# Do the traditional money dance.
session.main_region.circuit.send_message(Message(
session.main_region.circuit.send(Message(
"AgentAnimation",
Block("AgentData", AgentID=session.agent_id, SessionID=session.id),
Block("AnimationList", AnimID=UUID("928cae18-e31d-76fd-9cc9-2f55160ff818"), StartAnim=True),

View File

@@ -9,12 +9,12 @@ import asyncio
import struct
from typing import *
from PySide2.QtGui import QImage
from PySide6.QtGui import QImage
from hippolyzer.lib.base.datatypes import UUID, Vector3, Quaternion
from hippolyzer.lib.base.helpers import to_chunks
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.templates import ObjectUpdateFlags, PCode, MCode, MultipleObjectUpdateFlags, TextureEntry
from hippolyzer.lib.base.templates import ObjectUpdateFlags, PCode, MCode, MultipleObjectUpdateFlags, TextureEntryCollection
from hippolyzer.lib.client.object_manager import ObjectEvent, UpdateType
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.addons import AddonManager
@@ -42,7 +42,7 @@ class PixelArtistAddon(BaseAddon):
return
img = QImage()
with open(filename, "rb") as f:
img.loadFromData(f.read(), aformat=None)
img.loadFromData(f.read(), format=None)
img = img.convertToFormat(QImage.Format_RGBA8888)
height = img.height()
width = img.width()
@@ -80,7 +80,7 @@ class PixelArtistAddon(BaseAddon):
# TODO: We don't track the land group or user's active group, so
# "anyone can build" must be on for rezzing to work.
group_id = UUID()
region.circuit.send_message(Message(
region.circuit.send(Message(
'ObjectAdd',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id, GroupID=group_id),
Block(
@@ -124,12 +124,12 @@ class PixelArtistAddon(BaseAddon):
y = i // width
obj = created_prims[prim_idx]
# Set a blank texture on all faces
te = TextureEntry()
te = TextureEntryCollection()
te.Textures[None] = UUID('5748decc-f629-461c-9a36-a35a221fe21f')
# Set the prim color to the color from the pixel
te.Color[None] = pixel_color
# Set the prim texture and color
region.circuit.send_message(Message(
region.circuit.send(Message(
'ObjectImage',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
Block('ObjectData', ObjectLocalID=obj.LocalID, MediaURL=b'', TextureEntry_=te),
@@ -149,7 +149,7 @@ class PixelArtistAddon(BaseAddon):
# Move the "pixels" to their correct position in chunks
for chunk in to_chunks(positioning_blocks, 25):
region.circuit.send_message(Message(
region.circuit.send(Message(
'MultipleObjectUpdate',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
*chunk,

View File

@@ -116,7 +116,7 @@ class RecapitatorAddon(BaseAddon):
except:
logging.exception("Exception while recapitating")
# Tell the viewer about the status of its original upload
region.circuit.send_message(Message(
region.circuit.send(Message(
"AssetUploadComplete",
Block("AssetBlock", UUID=asset_id, Type=asset_block["Type"], Success=success),
direction=Direction.IN,

View File

@@ -0,0 +1,22 @@
import random
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
class SimulatePacketLossAddon(BaseAddon):
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
# Messing with these may kill your circuit
if message.name in {"PacketAck", "StartPingCheck", "CompletePingCheck", "UseCircuitCode",
"CompleteAgentMovement", "AgentMovementComplete"}:
return
# Simulate 30% packet loss
if random.random() > 0.7:
# Do nothing, drop this packet on the floor
return True
return
addons = [SimulatePacketLossAddon()]

View File

@@ -3,7 +3,7 @@ Example of how to request a Transfer
"""
from typing import *
from hippolyzer.lib.base.legacy_inv import InventoryModel, InventoryItem
from hippolyzer.lib.base.inventory import InventoryModel, InventoryItem
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.templates import (
AssetType,
@@ -35,7 +35,7 @@ class TransferExampleAddon(BaseAddon):
async def get_first_script(self, session: Session, region: ProxiedRegion):
"""Get the contents of the first script in the selected object"""
# Ask for the object inventory so we can find a script
region.circuit.send_message(Message(
region.circuit.send(Message(
'RequestTaskInventory',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
Block('InventoryData', LocalID=session.selected.object_local),
@@ -47,7 +47,7 @@ class TransferExampleAddon(BaseAddon):
file_name=inv_message["InventoryData"]["Filename"], file_path=XferFilePath.CACHE)
inv_model = InventoryModel.from_bytes(xfer.reassemble_chunks())
first_script: Optional[InventoryItem] = None
for item in inv_model.items.values():
for item in inv_model.all_items:
if item.type == "lsltext":
first_script = item
if not first_script:

View File

@@ -64,12 +64,12 @@ class TurboObjectInventoryAddon(BaseAddon):
# Any previous requests will have triggered a delete of the inventory file
# by marking it complete on the server-side. Re-send our RequestTaskInventory
# To make sure there's a fresh copy.
region.circuit.send_message(request_msg.take())
region.circuit.send(request_msg.take())
inv_message = await region.message_handler.wait_for(('ReplyTaskInventory',), timeout=5.0)
# No task inventory, send the reply as-is
file_name = inv_message["InventoryData"]["Filename"]
if not file_name:
region.circuit.send_message(inv_message)
region.circuit.send(inv_message)
return
xfer = region.xfer_manager.request(
@@ -87,7 +87,7 @@ class TurboObjectInventoryAddon(BaseAddon):
continue
# Send the original ReplyTaskInventory to the viewer so it knows the file is ready
region.circuit.send_message(inv_message)
region.circuit.send(inv_message)
proxied_xfer = Xfer(data=xfer.reassemble_chunks())
# Wait for the viewer to request the inventory file

View File

@@ -102,7 +102,7 @@ class UploaderAddon(BaseAddon):
ais_item_to_inventory_data(ais_item),
direction=Direction.IN
)
region.circuit.send_message(message)
region.circuit.send(message)
addons = [UploaderAddon()]

View File

@@ -2,7 +2,7 @@
Example of how to request an Xfer
"""
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.legacy_inv import InventoryModel
from hippolyzer.lib.base.inventory import InventoryModel
from hippolyzer.lib.base.templates import XferFilePath, AssetType, InventoryType, WearableType
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.proxy.addon_utils import BaseAddon, show_message
@@ -15,7 +15,7 @@ class XferExampleAddon(BaseAddon):
@handle_command()
async def get_mute_list(self, session: Session, region: ProxiedRegion):
"""Fetch the current user's mute list"""
region.circuit.send_message(Message(
region.circuit.send(Message(
'MuteListRequest',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
Block("MuteData", MuteCRC=0),
@@ -35,7 +35,7 @@ class XferExampleAddon(BaseAddon):
@handle_command()
async def get_task_inventory(self, session: Session, region: ProxiedRegion):
"""Get the inventory of the currently selected object"""
region.circuit.send_message(Message(
region.circuit.send(Message(
'RequestTaskInventory',
# If no session is passed in we'll use the active session when the coro was created
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
@@ -57,7 +57,7 @@ class XferExampleAddon(BaseAddon):
await xfer
inv_model = InventoryModel.from_bytes(xfer.reassemble_chunks())
-item_names = [item.name for item in inv_model.items.values()]
+item_names = [item.name for item in inv_model.all_items]
show_message(item_names)
@handle_command()
@@ -98,7 +98,7 @@ textures 1
data=asset_data,
transaction_id=transaction_id
)
-region.circuit.send_message(Message(
+region.circuit.send(Message(
'CreateInventoryItem',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
Block(

View File

@@ -2,7 +2,7 @@ import enum
import logging
import typing
-from PySide2 import QtCore, QtGui
+from PySide6 import QtCore, QtGui
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.message_logger import FilteringMessageLogger

View File

@@ -7,6 +7,7 @@ import sys
import time
from typing import Optional
import mitmproxy.ctx
import mitmproxy.exceptions
from hippolyzer.lib.base import llsd
@@ -86,10 +87,13 @@ class REPLAddon(BaseAddon):
def run_http_proxy_process(proxy_host, http_proxy_port, flow_context: HTTPFlowContext):
-mitm_loop = asyncio.new_event_loop()
-asyncio.set_event_loop(mitm_loop)
-mitmproxy_master = create_http_proxy(proxy_host, http_proxy_port, flow_context)
-mitmproxy_master.start_server()
-gc.freeze()
-mitm_loop.run_forever()
+async def mitmproxy_loop():
+mitmproxy_master = create_http_proxy(proxy_host, http_proxy_port, flow_context)
+gc.freeze()
+await mitmproxy_master.run()
+asyncio.run(mitmproxy_loop())
def start_proxy(session_manager: SessionManager, extra_addons: Optional[list] = None,
@@ -104,7 +108,7 @@ def start_proxy(session_manager: SessionManager, extra_addons: Optional[list] =
root_log.setLevel(logging.INFO)
logging.basicConfig()
-loop = asyncio.get_event_loop()
+loop = asyncio.get_event_loop_policy().get_event_loop()
udp_proxy_port = session_manager.settings.SOCKS_PROXY_PORT
http_proxy_port = session_manager.settings.HTTP_PROXY_PORT
@@ -131,6 +135,9 @@ def start_proxy(session_manager: SessionManager, extra_addons: Optional[list] =
daemon=True,
)
http_proc.start()
+# These need to be set for mitmproxy's ASGIApp serving code to work.
+mitmproxy.ctx.master = None
+mitmproxy.ctx.log = logging.getLogger("mitmproxy log")
server = SLSOCKS5Server(session_manager)
coro = asyncio.start_server(server.handle_connection, proxy_host, udp_proxy_port)
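The hunk above drops the manual `new_event_loop()` / `run_forever()` bookkeeping in favor of a coroutine driven by `asyncio.run()`, which newer mitmproxy's `Master.run()` coroutine expects. A self-contained sketch of that pattern, with a placeholder `serve()` coroutine standing in for `mitmproxy_master.run()`:

```python
import asyncio

async def serve() -> str:
    # Placeholder for a long-running service coroutine such as
    # mitmproxy_master.run(); here it just yields control once and returns.
    await asyncio.sleep(0)
    return "served"

def run_service() -> str:
    # asyncio.run() creates the loop, runs the coroutine to completion,
    # and tears the loop down -- no manual set_event_loop()/run_forever().
    return asyncio.run(serve())
```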

View File

@@ -18,7 +18,7 @@ from typing import *
import multidict
from qasync import QEventLoop, asyncSlot
-from PySide2 import QtCore, QtWidgets, QtGui
+from PySide6 import QtCore, QtWidgets, QtGui
from hippolyzer.apps.model import MessageLogModel, MessageLogHeader, RegionListModel
from hippolyzer.apps.proxy import start_proxy
@@ -35,6 +35,7 @@ from hippolyzer.lib.base.message.message_formatting import (
)
from hippolyzer.lib.base.message.msgtypes import MsgType
from hippolyzer.lib.base.message.template_dict import DEFAULT_TEMPLATE_DICT
+from hippolyzer.lib.base.settings import SettingDescriptor
from hippolyzer.lib.base.ui_helpers import loadUi
import hippolyzer.lib.base.serialization as se
from hippolyzer.lib.base.network.transport import Direction, SocketUDPTransport
@@ -61,7 +62,7 @@ def show_error_message(error_msg, parent=None):
error_dialog = QtWidgets.QErrorMessage(parent=parent)
# No obvious way to set this to plaintext, yuck...
error_dialog.showMessage(html.escape(error_msg))
-error_dialog.exec_()
+error_dialog.exec()
error_dialog.raise_()
@@ -88,13 +89,13 @@ class GUISessionManager(SessionManager, QtCore.QObject):
self.all_regions = new_regions
-class GUIInteractionManager(BaseInteractionManager, QtCore.QObject):
-def __init__(self, parent):
+class GUIInteractionManager(BaseInteractionManager):
+def __init__(self, parent: QtWidgets.QWidget):
BaseInteractionManager.__init__(self)
-QtCore.QObject.__init__(self, parent=parent)
+self._parent = parent
def main_window_handle(self) -> Any:
-return self.parent()
+return self._parent
def _dialog_async_exec(self, dialog: QtWidgets.QDialog):
future = asyncio.Future()
@@ -106,7 +107,7 @@ class GUIInteractionManager(BaseInteractionManager, QtCore.QObject):
self, caption: str, directory: str, filter_str: str, mode: QtWidgets.QFileDialog.FileMode,
default_suffix: str = '',
) -> Tuple[bool, QtWidgets.QFileDialog]:
-dialog = QtWidgets.QFileDialog(self.parent(), caption=caption, directory=directory, filter=filter_str)
+dialog = QtWidgets.QFileDialog(self._parent, caption=caption, directory=directory, filter=filter_str)
dialog.setFileMode(mode)
if mode == QtWidgets.QFileDialog.FileMode.AnyFile:
dialog.setAcceptMode(QtWidgets.QFileDialog.AcceptMode.AcceptSave)
@@ -154,7 +155,7 @@ class GUIInteractionManager(BaseInteractionManager, QtCore.QObject):
title,
caption,
QtWidgets.QMessageBox.Ok | QtWidgets.QMessageBox.Cancel,
-self.parent(),
+self._parent,
)
fut = asyncio.Future()
msg.finished.connect(lambda r: fut.set_result(r))
@@ -163,6 +164,8 @@ class GUIInteractionManager(BaseInteractionManager, QtCore.QObject):
class GUIProxySettings(ProxySettings):
+FIRST_RUN: bool = SettingDescriptor(True)
"""Persistent settings backed by QSettings"""
def __init__(self, settings: QtCore.QSettings):
super().__init__()
@@ -265,7 +268,7 @@ class MessageLogWindow(QtWidgets.QMainWindow):
self.lineEditFilter.editingFinished.connect(self.setFilter)
self.btnMessageBuilder.clicked.connect(self._sendToMessageBuilder)
self.btnCopyRepr.clicked.connect(self._copyRepr)
-self.actionInstallHTTPSCerts.triggered.connect(self._installHTTPSCerts)
+self.actionInstallHTTPSCerts.triggered.connect(self.installHTTPSCerts)
self.actionManageAddons.triggered.connect(self._manageAddons)
self.actionManageFilters.triggered.connect(self._manageFilters)
self.actionOpenMessageBuilder.triggered.connect(self._openMessageBuilder)
@@ -300,7 +303,7 @@ class MessageLogWindow(QtWidgets.QMainWindow):
def _populateFilterMenu(self):
def _addFilterAction(text, filter_str):
-filter_action = QtWidgets.QAction(text, self)
+filter_action = QtGui.QAction(text, self)
filter_action.triggered.connect(lambda: self.setFilter(filter_str))
self._filterMenu.addAction(filter_action)
@@ -311,13 +314,16 @@ class MessageLogWindow(QtWidgets.QMainWindow):
for preset_name, preset_filter in filters.items():
_addFilterAction(preset_name, preset_filter)
def getFilterDict(self):
return self.settings.FILTERS
def setFilterDict(self, val: dict):
self.settings.FILTERS = val
self._populateFilterMenu()
def _manageFilters(self):
dialog = FilterDialog(self)
-dialog.exec_()
+dialog.exec()
@nonFatalExceptions
def setFilter(self, filter_str=None):
@@ -354,21 +360,20 @@ class MessageLogWindow(QtWidgets.QMainWindow):
beautify=self.checkBeautify.isChecked(),
replacements=buildReplacements(entry.session, entry.region),
)
-highlight_range = None
-if isinstance(req, SpannedString):
-match_result = self.model.filter.match(entry)
-# Match result was a tuple indicating what matched
-if isinstance(match_result, tuple):
-highlight_range = req.spans.get(match_result)
self.textRequest.setPlainText(req)
-if highlight_range:
-cursor = self.textRequest.textCursor()
-cursor.setPosition(highlight_range[0], QtGui.QTextCursor.MoveAnchor)
-cursor.setPosition(highlight_range[1], QtGui.QTextCursor.KeepAnchor)
-highlight_format = QtGui.QTextBlockFormat()
-highlight_format.setBackground(QtCore.Qt.yellow)
-cursor.setBlockFormat(highlight_format)
+# The string has a map of fields and their associated positions within the string,
+# use that to highlight any individual fields the filter matched on.
+if isinstance(req, SpannedString):
+for field in self.model.filter.match(entry, short_circuit=False).fields:
+field_span = req.spans.get(field)
+if not field_span:
+continue
+cursor = self.textRequest.textCursor()
+cursor.setPosition(field_span[0], QtGui.QTextCursor.MoveAnchor)
+cursor.setPosition(field_span[1], QtGui.QTextCursor.KeepAnchor)
+highlight_format = QtGui.QTextBlockFormat()
+highlight_format.setBackground(QtCore.Qt.yellow)
+cursor.setBlockFormat(highlight_format)
resp = entry.response(beautify=self.checkBeautify.isChecked())
if resp:
@@ -441,10 +446,10 @@ class MessageLogWindow(QtWidgets.QMainWindow):
with open(log_file, "wb") as f:
f.write(export_log_entries(self.model))
-def _installHTTPSCerts(self):
+def installHTTPSCerts(self):
msg = QtWidgets.QMessageBox()
-msg.setText("This will install the proxy's HTTPS certificate in the config dir"
-" of any installed viewers, continue?")
+msg.setText("Would you like to install the proxy's HTTPS certificate in the config dir"
+" of any installed viewers so that HTTPS connections will work?")
yes_btn = msg.addButton("Yes", QtWidgets.QMessageBox.NoRole)
msg.addButton("No", QtWidgets.QMessageBox.NoRole)
msg.exec()
@@ -476,7 +481,7 @@ class MessageLogWindow(QtWidgets.QMainWindow):
def _manageAddons(self):
dialog = AddonDialog(self)
-dialog.exec_()
+dialog.exec()
def getAddonList(self) -> List[str]:
return self.sessionManager.settings.ADDON_SCRIPTS
@@ -565,7 +570,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
else:
self.comboUntrusted.addItem(message_name)
-cap_names = sorted(set(itertools.chain(*[r.caps.keys() for r in self.regionModel.regions])))
+cap_names = sorted(set(itertools.chain(*[r.cap_urls.keys() for r in self.regionModel.regions])))
for cap_name in cap_names:
if cap_name.endswith("ProxyWrapper"):
continue
@@ -596,7 +601,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
break
self.textRequest.setPlainText(
f"""{method} [[{cap_name}]]{path}{params} HTTP/1.1
-# {region.caps.get(cap_name, "<unknown URI>")}
+# {region.cap_urls.get(cap_name, "<unknown URI>")}
{headers}
{body}"""
)
@@ -691,13 +696,11 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
msg = HumanMessageSerializer.from_human_string(msg_text, replacements, env, safe=False)
if self.checkLLUDPViaCaps.isChecked():
if msg.direction == Direction.IN:
-region.eq_manager.inject_event(
-self.llsdSerializer.serialize(msg, as_dict=True)
-)
+region.eq_manager.inject_message(msg)
else:
self._sendHTTPRequest(
"POST",
-region.caps["UntrustedSimulatorMessage"],
+region.cap_urls["UntrustedSimulatorMessage"],
{"Content-Type": "application/llsd+xml", "Accept": "application/llsd+xml"},
self.llsdSerializer.serialize(msg),
)
@@ -706,18 +709,25 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
off_circuit = self.checkOffCircuit.isChecked()
if off_circuit:
transport = SocketUDPTransport(socket.socket(socket.AF_INET, socket.SOCK_DGRAM))
-region.circuit.send_message(msg, transport=transport)
+region.circuit.send(msg, transport=transport)
if off_circuit:
transport.close()
-def _sendEQMessage(self, session, region: Optional[ProxiedRegion], msg_text: str, _replacements: dict):
+def _sendEQMessage(self, session, region: Optional[ProxiedRegion], msg_text: str, replacements: dict):
if not session or not region:
raise RuntimeError("Need a valid session and region to send EQ event")
message_line, _, body = (x.strip() for x in msg_text.partition("\n"))
message_name = message_line.rsplit(" ", 1)[-1]
+env = self._buildEnv(session, region)
+def directive_handler(m):
+return self._handleHTTPDirective(env, replacements, False, m)
+body = re.sub(rb"<!HIPPO(\w+)\[\[(.*?)]]>", directive_handler, body.encode("utf8"), flags=re.S)
region.eq_manager.inject_event({
"message": message_name,
-"body": llsd.parse_xml(body.encode("utf8")),
+"body": llsd.parse_xml(body),
})
def _sendHTTPMessage(self, session, region, msg_text: str, replacements: dict):
@@ -741,7 +751,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
cap_name = match.group(1)
cap_url = session.global_caps.get(cap_name)
if not cap_url:
-cap_url = region.caps.get(cap_name)
+cap_url = region.cap_urls.get(cap_name)
if not cap_url:
raise ValueError("Don't have a Cap for %s" % cap_name)
uri = cap_url + match.group(2)
@@ -781,7 +791,10 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
val = subfield_eval(contents.decode("utf8").strip(), globals_={**env, **replacements})
val = _coerce_to_bytes(val)
elif directive == b"REPL":
-val = _coerce_to_bytes(replacements[contents.decode("utf8").strip()])
+repl = replacements[contents.decode("utf8").strip()]
+if callable(repl):
+repl = repl()
+val = _coerce_to_bytes(repl)
else:
raise ValueError(f"Unknown directive {directive}")
@@ -915,6 +928,10 @@ def gui_main():
http_host = None
if window.sessionManager.settings.REMOTELY_ACCESSIBLE:
http_host = "0.0.0.0"
+if settings.FIRST_RUN:
+settings.FIRST_RUN = False
+# Automatically offer to install the HTTPS certs on first run.
+window.installHTTPSCerts()
start_proxy(
session_manager=window.sessionManager,
extra_addon_paths=window.getAddonList(),
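`_sendEQMessage` now runs the same `<!HIPPO...[[...]]>` directive expansion over the event body before parsing it as LLSD. The `re.sub`-with-callable-on-bytes pattern it uses looks like this in isolation (the handler below is a stand-in for the proxy's `_handleHTTPDirective`; it upper-cases the payload rather than evaluating it):

```python
import re

def expand_directives(body: bytes) -> bytes:
    def handler(m: re.Match) -> bytes:
        # m.group(1) is the directive name, m.group(2) its payload.
        # Stand-in behavior: just upper-case the payload.
        return m.group(2).upper()

    # Same pattern shape as the proxy uses: <!HIPPONAME[[payload]]>,
    # with re.S so payloads may span multiple lines.
    return re.sub(rb"<!HIPPO(\w+)\[\[(.*?)]]>", handler, body, flags=re.S)
```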

View File

@@ -0,0 +1,306 @@
# This currently implements basic LLMesh -> Collada.
#
# TODO:
# * inverse, Collada -> LLMesh (for simple cases, maybe using impasse rather than pycollada)
# * round-tripping tests, LLMesh->Collada->LLMesh
# * * Can't really test using Collada->LLMesh->Collada because Collada->LLMesh is almost always
# going to be lossy due to how SL represents vertex data and materials compared to what
# Collada allows.
# * Eventually scrap this and just use GLTF instead once we know we have the semantics correct
# * * Collada was just easier to bootstrap given that it's the only officially supported input format
# * * Collada tooling sucks and even LL is moving away from it
# * * Ensuring LLMesh->Collada and LLMesh->GLTF conversion don't differ semantically is easy via assimp.
import collections
import os.path
import secrets
import statistics
import sys
from typing import Dict, List, Iterable, Optional
import collada
import collada.source
from collada import E
from lxml import etree
import numpy as np
import transformations
from hippolyzer.lib.base.helpers import get_resource_filename
from hippolyzer.lib.base.serialization import BufferReader
from hippolyzer.lib.base.mesh import LLMeshSerializer, MeshAsset, positions_from_domain, SkinSegmentDict
DIR = os.path.dirname(os.path.realpath(__file__))
def mesh_to_collada(ll_mesh: MeshAsset, include_skin=True) -> collada.Collada:
dae = collada.Collada()
axis = collada.asset.UP_AXIS.Z_UP
dae.assetInfo.upaxis = axis
scene = collada.scene.Scene("scene", [llmesh_to_node(ll_mesh, dae, include_skin=include_skin)])
dae.scenes.append(scene)
dae.scene = scene
return dae
def llmesh_to_node(ll_mesh: MeshAsset, dae: collada.Collada, uniq=None,
include_skin=True, node_transform: Optional[np.ndarray] = None) -> collada.scene.Node:
if node_transform is None:
node_transform = np.identity(4)
should_skin = False
skin_seg = ll_mesh.segments.get('skin')
bind_shape_matrix = None
if include_skin and skin_seg:
bind_shape_matrix = np.array(skin_seg["bind_shape_matrix"]).reshape((4, 4))
should_skin = True
# Transform from the skin will be applied on the controller, not the node
node_transform = np.identity(4)
if not uniq:
uniq = secrets.token_urlsafe(4)
geom_nodes = []
node_name = f"mainnode{uniq}"
# TODO: do the other LODs?
for submesh_num, submesh in enumerate(ll_mesh.segments["high_lod"]):
# Make sure none of our IDs collide with those of other nodes
sub_uniq = uniq + str(submesh_num)
range_xyz = positions_from_domain(submesh["Position"], submesh["PositionDomain"])
xyz = np.array([x.data() for x in range_xyz])
range_uv = positions_from_domain(submesh['TexCoord0'], submesh['TexCoord0Domain'])
uv = np.array([x.data() for x in range_uv]).flatten()
norms = np.array([x.data() for x in submesh["Normal"]])
effect = collada.material.Effect(
id=f"effect{sub_uniq}",
params=[],
specular=(0.0, 0.0, 0.0, 0.0),
reflectivity=(0.0, 0.0, 0.0, 0.0),
emission=(0.0, 0.0, 0.0, 0.0),
ambient=(0.0, 0.0, 0.0, 0.0),
reflective=0.0,
shadingtype="blinn",
shininess=0.0,
diffuse=(0.0, 0.0, 0.0),
)
mat = collada.material.Material(f"material{sub_uniq}", f"material{sub_uniq}", effect)
dae.materials.append(mat)
dae.effects.append(effect)
vert_src = collada.source.FloatSource(f"verts-array{sub_uniq}", xyz.flatten(), ("X", "Y", "Z"))
norm_src = collada.source.FloatSource(f"norms-array{sub_uniq}", norms.flatten(), ("X", "Y", "Z"))
# UV maps have to have the same name or they'll behave weirdly when objects are merged.
uv_src = collada.source.FloatSource("uvs-array", np.array(uv), ("U", "V"))
geom = collada.geometry.Geometry(dae, f"geometry{sub_uniq}", "geometry", [vert_src, norm_src, uv_src])
input_list = collada.source.InputList()
input_list.addInput(0, 'VERTEX', f'#verts-array{sub_uniq}', set="0")
input_list.addInput(0, 'NORMAL', f'#norms-array{sub_uniq}', set="0")
input_list.addInput(0, 'TEXCOORD', '#uvs-array', set="0")
tri_idxs = np.array(submesh["TriangleList"]).flatten()
matnode = collada.scene.MaterialNode(f"materialref{sub_uniq}", mat, inputs=[])
tri_set = geom.createTriangleSet(tri_idxs, input_list, f'materialref{sub_uniq}')
geom.primitives.append(tri_set)
dae.geometries.append(geom)
if should_skin:
joint_names = np.array(skin_seg['joint_names'], dtype=object)
joints_source = collada.source.NameSource(f"joint-names{sub_uniq}", joint_names, ("JOINT",))
# PyCollada has a bug where it doesn't set the source URI correctly. Fix it.
accessor = joints_source.xmlnode.find(f"{dae.tag('technique_common')}/{dae.tag('accessor')}")
if not accessor.get('source').startswith('#'):
accessor.set('source', f"#{accessor.get('source')}")
flattened_bind_poses = []
# LLMesh matrices are row-major, convert to col-major for Collada.
for bind_pose in skin_seg['inverse_bind_matrix']:
flattened_bind_poses.append(np.array(bind_pose).reshape((4, 4)).flatten('F'))
flattened_bind_poses = np.array(flattened_bind_poses)
inv_bind_source = _create_mat4_source(f"bind-poses{sub_uniq}", flattened_bind_poses, "TRANSFORM")
weight_joint_idxs = []
weights = []
vert_weight_counts = []
cur_weight_idx = 0
for vert_weights in submesh['Weights']:
vert_weight_counts.append(len(vert_weights))
for vert_weight in vert_weights:
weights.append(vert_weight.weight)
weight_joint_idxs.append(vert_weight.joint_idx)
weight_joint_idxs.append(cur_weight_idx)
cur_weight_idx += 1
weights_source = collada.source.FloatSource(f"skin-weights{sub_uniq}", np.array(weights), ("WEIGHT",))
# We need to make a controller for each material since materials are essentially distinct meshes
# in SL, with their own distinct sets of weights and vertex data.
controller_node = E.controller(
E.skin(
E.bind_shape_matrix(' '.join(str(x) for x in bind_shape_matrix.flatten('F'))),
joints_source.xmlnode,
inv_bind_source.xmlnode,
weights_source.xmlnode,
E.joints(
E.input(semantic="JOINT", source=f"#joint-names{sub_uniq}"),
E.input(semantic="INV_BIND_MATRIX", source=f"#bind-poses{sub_uniq}")
),
E.vertex_weights(
E.input(semantic="JOINT", source=f"#joint-names{sub_uniq}", offset="0"),
E.input(semantic="WEIGHT", source=f"#skin-weights{sub_uniq}", offset="1"),
E.vcount(' '.join(str(x) for x in vert_weight_counts)),
E.v(' '.join(str(x) for x in weight_joint_idxs)),
count=str(len(submesh['Weights']))
),
source=f"#geometry{sub_uniq}"
),
id=f"Armature-{sub_uniq}",
name=node_name
)
controller = collada.controller.Controller.load(dae, {}, controller_node)
dae.controllers.append(controller)
geom_node = collada.scene.ControllerNode(controller, [matnode])
else:
geom_node = collada.scene.GeometryNode(geom, [matnode])
geom_nodes.append(geom_node)
node = collada.scene.Node(
node_name,
children=geom_nodes,
transforms=[collada.scene.MatrixTransform(np.array(node_transform.flatten('F')))],
)
if should_skin:
# We need a skeleton per _mesh asset_ because you could have incongruous skeletons
# within the same linkset.
skel_root = load_skeleton_nodes()
transform_skeleton(skel_root, dae, skin_seg)
skel = collada.scene.Node.load(dae, skel_root, {})
skel.children.append(node)
skel.id = f"Skel-{uniq}"
skel.save()
node = skel
return node
def load_skeleton_nodes() -> etree.ElementBase:
# TODO: this sucks. Can't we construct nodes with the appropriate transformation
# matrices from the data in `avatar_skeleton.xml`?
skel_path = get_resource_filename("lib/base/data/male_collada_joints.xml")
with open(skel_path, 'r') as f:
return etree.fromstring(f.read())
def transform_skeleton(skel_root: etree.ElementBase, dae: collada.Collada, skin_seg: SkinSegmentDict,
include_unreferenced_bones=False):
"""Update skeleton XML nodes to account for joint translations in the mesh"""
# TODO: Use translation component only.
joint_nodes: Dict[str, collada.scene.Node] = {}
for skel_node in skel_root.iter():
# xpath is loathsome so this is easier.
if skel_node.tag != dae.tag('node') or skel_node.get('type') != 'JOINT':
continue
joint_nodes[skel_node.get('name')] = collada.scene.Node.load(dae, skel_node, {})
for joint_name, matrix in zip(skin_seg['joint_names'], skin_seg.get('alt_inverse_bind_matrix', [])):
joint_node = joint_nodes[joint_name]
joint_node.matrix = np.array(matrix).reshape((4, 4)).flatten('F')
# Update the underlying XML element with the new transform matrix
joint_node.save()
if not include_unreferenced_bones:
needed_hierarchy = set()
for skel_node in joint_nodes.values():
skel_node = skel_node.xmlnode
if skel_node.get('name') in skin_seg['joint_names']:
# Add this joint and any ancestors to the list of needed joints
while skel_node is not None:
needed_hierarchy.add(skel_node.get('name'))
skel_node = skel_node.getparent()
for skel_node in joint_nodes.values():
skel_node = skel_node.xmlnode
if skel_node.get('name') not in needed_hierarchy:
skel_node.getparent().remove(skel_node)
pelvis_offset = skin_seg.get('pelvis_offset')
# TODO: should we even do this here? It's not present in the collada, just
# something that's specified in the uploader before conversion to LLMesh.
if pelvis_offset and 'mPelvis' in joint_nodes:
pelvis_node = joint_nodes['mPelvis']
# Column-major!
pelvis_node.matrix[3][2] += pelvis_offset
pelvis_node.save()
def _create_mat4_source(name: str, data: np.ndarray, semantic: str):
# PyCollada has no way to make a source with a float4x4 semantic. Do it a bad way.
# Note that collada demands column-major matrices whereas LLSD mesh has them row-major!
source = collada.source.FloatSource(name, data, tuple(f"M{x}" for x in range(16)))
accessor = source.xmlnode[1][0]
for child in list(accessor):
accessor.remove(child)
accessor.append(E.param(name=semantic, type="float4x4"))
return source
def fix_weird_bind_matrices(skin_seg: SkinSegmentDict):
"""
Fix weird-looking bind matrices to have normal scaling
Not sure why these even happen (weird mesh authoring programs?)
We sometimes get enormous inverse bind matrices (each component 10k+) and tiny
bind shape matrix components. This detects inverse bind matrices
with weird scales and tries to set them to what they "should" be without
the weird inverted scaling.
"""
axis_counters = [collections.Counter() for _ in range(3)]
for joint_inv in skin_seg['inverse_bind_matrix']:
joint_mat = np.array(joint_inv).reshape((4, 4))
joint_scale = transformations.decompose_matrix(joint_mat)[0]
for axis_counter, axis_val in zip(axis_counters, joint_scale):
axis_counter[axis_val] += 1
most_common_inv_scale = []
for axis_counter in axis_counters:
most_common_inv_scale.append(axis_counter.most_common(1)[0][0])
if abs(1.0 - statistics.fmean(most_common_inv_scale)) > 1.0:
# The magnitude of the scales in the inverse bind matrices look very strange.
# The bind matrix itself is probably messed up as well, try to fix it.
skin_seg['bind_shape_matrix'] = fix_llsd_matrix_scale(skin_seg['bind_shape_matrix'], most_common_inv_scale)
if joint_positions := skin_seg.get('alt_inverse_bind_matrix', None):
fix_matrix_list_scale(joint_positions, most_common_inv_scale)
rev_scale = tuple(1.0 / x for x in most_common_inv_scale)
fix_matrix_list_scale(skin_seg['inverse_bind_matrix'], rev_scale)
def fix_matrix_list_scale(source: List[List[float]], scale_fixup: Iterable[float]):
for i, alt_inv_matrix in enumerate(source):
source[i] = fix_llsd_matrix_scale(alt_inv_matrix, scale_fixup)
def fix_llsd_matrix_scale(source: List[float], scale_fixup: Iterable[float]):
matrix = np.array(source).reshape((4, 4))
decomposed = list(transformations.decompose_matrix(matrix))
# Need to handle both the scale and translation matrices
for idx in (0, 3):
decomposed[idx] = tuple(x * y for x, y in zip(decomposed[idx], scale_fixup))
return list(transformations.compose_matrix(*decomposed).flatten('C'))
def main():
# Take an llmesh file as an argument and spit out basename-converted.dae
with open(sys.argv[1], "rb") as f:
reader = BufferReader("<", f.read())
mesh = mesh_to_collada(reader.read(LLMeshSerializer(parse_segment_contents=True)))
mesh.write(sys.argv[1].rsplit(".", 1)[0] + "-converted.dae")
if __name__ == "__main__":
main()
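The converter above repeatedly uses `.flatten('F')` because, per its own comments, LLMesh stores matrices row-major while Collada wants them column-major. A dependency-free sketch of what a column-major ('F'-order) flatten does to a row-major 4x4:

```python
def flatten_col_major(rows):
    # Equivalent to numpy's reshape((4, 4)).flatten('F'): walk down each
    # column before moving to the next, instead of across each row.
    n = len(rows)
    return [rows[r][c] for c in range(n) for r in range(n)]

# A generic translation matrix with the translation in the last column...
row_major = [
    [1, 0, 0, 5],
    [0, 1, 0, 6],
    [0, 0, 1, 7],
    [0, 0, 0, 1],
]
# ...flattens column-major with the translation landing at indices 12-14.
col = flatten_col_major(row_major)
```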

View File

@@ -0,0 +1,485 @@
<!-- from http://wiki.secondlife.com/wiki/Project_Bento_Resources_and_Information collada -->
<node id="Avatar" name="Avatar" type="NODE" xmlns="http://www.collada.org/2005/11/COLLADASchema">
<translate sid="location">0 0 0</translate>
<rotate sid="rotationZ">0 0 1 0</rotate>
<rotate sid="rotationY">0 1 0 0</rotate>
<rotate sid="rotationX">1 0 0 0</rotate>
<scale sid="scale">1 1 1</scale>
<node id="mPelvis" name="mPelvis" sid="mPelvis" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 1.067 0 0 0 1</matrix>
<node id="PELVIS" name="PELVIS" sid="PELVIS" type="JOINT">
<matrix sid="transform">1 0 0 -0.01 0 1 0 0 0 0 1 -0.02 0 0 0 1</matrix>
</node>
<node id="BUTT" name="BUTT" sid="BUTT" type="JOINT">
<matrix sid="transform">1 0 0 -0.06 0 1 0 0 0 0 1 -0.1 0 0 0 1</matrix>
</node>
<node id="mSpine1" name="mSpine1" sid="mSpine1" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0.084 0 0 0 1</matrix>
<node id="mSpine2" name="mSpine2" sid="mSpine2" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 -0.084 0 0 0 1</matrix>
<node id="mTorso" name="mTorso" sid="mTorso" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0.084 0 0 0 1</matrix>
<node id="BELLY" name="BELLY" sid="BELLY" type="JOINT">
<matrix sid="transform">1 0 0 0.028 0 1 0 0 0 0 1 0.04 0 0 0 1</matrix>
</node>
<node id="LEFT_HANDLE" name="LEFT_HANDLE" sid="LEFT_HANDLE" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0.1 0 0 1 0.058 0 0 0 1</matrix>
</node>
<node id="RIGHT_HANDLE" name="RIGHT_HANDLE" sid="RIGHT_HANDLE" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 -0.1 0 0 1 0.058 0 0 0 1</matrix>
</node>
<node id="LOWER_BACK" name="LOWER_BACK" sid="LOWER_BACK" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0.023 0 0 0 1</matrix>
</node>
<node id="mSpine3" name="mSpine3" sid="mSpine3" type="JOINT">
<matrix sid="transform">1 0 0 -0.015 0 1 0 0 0 0 1 0.205 0 0 0 1</matrix>
<node id="mSpine4" name="mSpine4" sid="mSpine4" type="JOINT">
<matrix sid="transform">1 0 0 0.015 0 1 0 0 0 0 1 -0.205 0 0 0 1</matrix>
<node id="mChest" name="mChest" sid="mChest" type="JOINT">
<matrix sid="transform">1 0 0 -0.015 0 1 0 0 0 0 1 0.205 0 0 0 1</matrix>
<node id="CHEST" name="CHEST" sid="CHEST" type="JOINT">
<matrix sid="transform">1 0 0 0.028 0 1 0 0 0 0 1 0.07 0 0 0 1</matrix>
</node>
<node id="LEFT_PEC" name="LEFT_PEC" sid="LEFT_PEC" type="JOINT">
<matrix sid="transform">1 0 0 0.119 0 1 0 0.082 0 0 1 0.042 0 0 0 1</matrix>
</node>
<node id="RIGHT_PEC" name="RIGHT_PEC" sid="RIGHT_PEC" type="JOINT">
<matrix sid="transform">1 0 0 0.119 0 1 0 -0.082 0 0 1 0.042 0 0 0 1</matrix>
</node>
<node id="UPPER_BACK" name="UPPER_BACK" sid="UPPER_BACK" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0.017 0 0 0 1</matrix>
</node>
<node id="mNeck" name="mNeck" sid="mNeck" type="JOINT">
<matrix sid="transform">1 0 0 -0.01 0 1 0 0 0 0 1 0.251 0 0 0 1</matrix>
<node id="NECK" name="NECK" sid="NECK" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0.02 0 0 0 1</matrix>
</node>
<node id="mHead" name="mHead" sid="mHead" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0.076 0 0 0 1</matrix>
<node id="HEAD" name="HEAD" sid="HEAD" type="JOINT">
<matrix sid="transform">1 0 0 0.02 0 1 0 0 0 0 1 0.07 0 0 0 1</matrix>
</node>
<node id="mSkull" name="mSkull" sid="mSkull" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0.079 0 0 0 1</matrix>
</node>
<node id="mEyeRight" name="mEyeRight" sid="mEyeRight" type="JOINT">
<matrix sid="transform">1 0 0 0.098 0 1 0 -0.036 0 0 1 0.079 0 0 0 1</matrix>
</node>
<node id="mEyeLeft" name="mEyeLeft" sid="mEyeLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.098 0 1 0 0.036 0 0 1 0.079 0 0 0 1</matrix>
</node>
<node id="mFaceRoot" name="mFaceRoot" sid="mFaceRoot" type="JOINT">
<matrix sid="transform">1 0 0 0.025 0 1 0 0 0 0 1 0.045 0 0 0 1</matrix>
<node id="mFaceEyeAltRight" name="mFaceEyeAltRight" sid="mFaceEyeAltRight" type="JOINT">
<matrix sid="transform">1 0 0 0.073 0 1 0 -0.036 0 0 1 0.034 0 0 0 1</matrix>
</node>
<node id="mFaceEyeAltLeft" name="mFaceEyeAltLeft" sid="mFaceEyeAltLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.073 0 1 0 0.036 0 0 1 0.034 0 0 0 1</matrix>
</node>
<node id="mFaceForeheadLeft" name="mFaceForeheadLeft" sid="mFaceForeheadLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.061 0 1 0 0.035 0 0 1 0.083 0 0 0 1</matrix>
</node>
<node id="mFaceForeheadRight" name="mFaceForeheadRight" sid="mFaceForeheadRight" type="JOINT">
<matrix sid="transform">1 0 0 0.061 0 1 0 -0.035 0 0 1 0.083 0 0 0 1</matrix>
</node>
<node id="mFaceEyebrowOuterLeft" name="mFaceEyebrowOuterLeft" sid="mFaceEyebrowOuterLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.064 0 1 0 0.051 0 0 1 0.048 0 0 0 1</matrix>
</node>
<node id="mFaceEyebrowCenterLeft" name="mFaceEyebrowCenterLeft" sid="mFaceEyebrowCenterLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.07 0 1 0 0.043 0 0 1 0.056 0 0 0 1</matrix>
</node>
<node id="mFaceEyebrowInnerLeft" name="mFaceEyebrowInnerLeft" sid="mFaceEyebrowInnerLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.075 0 1 0 0.022 0 0 1 0.051 0 0 0 1</matrix>
</node>
<node id="mFaceEyebrowOuterRight" name="mFaceEyebrowOuterRight" sid="mFaceEyebrowOuterRight" type="JOINT">
<matrix sid="transform">1 0 0 0.064 0 1 0 -0.051 0 0 1 0.048 0 0 0 1</matrix>
</node>
<node id="mFaceEyebrowCenterRight" name="mFaceEyebrowCenterRight" sid="mFaceEyebrowCenterRight" type="JOINT">
<matrix sid="transform">1 0 0 0.07 0 1 0 -0.043 0 0 1 0.056 0 0 0 1</matrix>
</node>
<node id="mFaceEyebrowInnerRight" name="mFaceEyebrowInnerRight" sid="mFaceEyebrowInnerRight" type="JOINT">
<matrix sid="transform">1 0 0 0.075 0 1 0 -0.022 0 0 1 0.051 0 0 0 1</matrix>
</node>
<node id="mFaceEyeLidUpperLeft" name="mFaceEyeLidUpperLeft" sid="mFaceEyeLidUpperLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.073 0 1 0 0.036 0 0 1 0.034 0 0 0 1</matrix>
</node>
<node id="mFaceEyeLidLowerLeft" name="mFaceEyeLidLowerLeft" sid="mFaceEyeLidLowerLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.073 0 1 0 0.036 0 0 1 0.034 0 0 0 1</matrix>
</node>
<node id="mFaceEyeLidUpperRight" name="mFaceEyeLidUpperRight" sid="mFaceEyeLidUpperRight" type="JOINT">
<matrix sid="transform">1 0 0 0.073 0 1 0 -0.036 0 0 1 0.034 0 0 0 1</matrix>
</node>
<node id="mFaceEyeLidLowerRight" name="mFaceEyeLidLowerRight" sid="mFaceEyeLidLowerRight" type="JOINT">
<matrix sid="transform">1 0 0 0.073 0 1 0 -0.036 0 0 1 0.034 0 0 0 1</matrix>
</node>
<node id="mFaceEar1Left" name="mFaceEar1Left" sid="mFaceEar1Left" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0.08 0 0 1 0.002 0 0 0 1</matrix>
<node id="mFaceEar2Left" name="mFaceEar2Left" sid="mFaceEar2Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.019 0 1 0 0.018 0 0 1 0.025 0 0 0 1</matrix>
</node>
</node>
<node id="mFaceEar1Right" name="mFaceEar1Right" sid="mFaceEar1Right" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 -0.08 0 0 1 0.002 0 0 0 1</matrix>
<node id="mFaceEar2Right" name="mFaceEar2Right" sid="mFaceEar2Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.019 0 1 0 -0.018 0 0 1 0.025 0 0 0 1</matrix>
</node>
</node>
<node id="mFaceNoseLeft" name="mFaceNoseLeft" sid="mFaceNoseLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.086 0 1 0 0.015 0 0 1 -0.004 0 0 0 1</matrix>
</node>
<node id="mFaceNoseCenter" name="mFaceNoseCenter" sid="mFaceNoseCenter" type="JOINT">
<matrix sid="transform">1 0 0 0.102 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mFaceNoseRight" name="mFaceNoseRight" sid="mFaceNoseRight" type="JOINT">
<matrix sid="transform">1 0 0 0.086 0 1 0 -0.015 0 0 1 -0.004 0 0 0 1</matrix>
</node>
<node id="mFaceCheekLowerLeft" name="mFaceCheekLowerLeft" sid="mFaceCheekLowerLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.05 0 1 0 0.034 0 0 1 -0.031 0 0 0 1</matrix>
</node>
<node id="mFaceCheekUpperLeft" name="mFaceCheekUpperLeft" sid="mFaceCheekUpperLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.07 0 1 0 0.034 0 0 1 -0.005 0 0 0 1</matrix>
</node>
<node id="mFaceCheekLowerRight" name="mFaceCheekLowerRight" sid="mFaceCheekLowerRight" type="JOINT">
<matrix sid="transform">1 0 0 0.05 0 1 0 -0.034 0 0 1 -0.031 0 0 0 1</matrix>
</node>
<node id="mFaceCheekUpperRight" name="mFaceCheekUpperRight" sid="mFaceCheekUpperRight" type="JOINT">
<matrix sid="transform">1 0 0 0.07 0 1 0 -0.034 0 0 1 -0.005 0 0 0 1</matrix>
</node>
<node id="mFaceJaw" name="mFaceJaw" sid="mFaceJaw" type="JOINT">
<matrix sid="transform">1 0 0 -0.001 0 1 0 0 0 0 1 -0.015 0 0 0 1</matrix>
<node id="mFaceChin" name="mFaceChin" sid="mFaceChin" type="JOINT">
<matrix sid="transform">1 0 0 0.074 0 1 0 0 0 0 1 -0.054 0 0 0 1</matrix>
</node>
<node id="mFaceTeethLower" name="mFaceTeethLower" sid="mFaceTeethLower" type="JOINT">
<matrix sid="transform">1 0 0 0.021 0 1 0 0 0 0 1 -0.039 0 0 0 1</matrix>
<node id="mFaceLipLowerLeft" name="mFaceLipLowerLeft" sid="mFaceLipLowerLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.045 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mFaceLipLowerRight" name="mFaceLipLowerRight" sid="mFaceLipLowerRight" type="JOINT">
<matrix sid="transform">1 0 0 0.045 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mFaceLipLowerCenter" name="mFaceLipLowerCenter" sid="mFaceLipLowerCenter" type="JOINT">
<matrix sid="transform">1 0 0 0.045 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mFaceTongueBase" name="mFaceTongueBase" sid="mFaceTongueBase" type="JOINT">
<matrix sid="transform">1 0 0 0.039 0 1 0 0 0 0 1 0.005 0 0 0 1</matrix>
<node id="mFaceTongueTip" name="mFaceTongueTip" sid="mFaceTongueTip" type="JOINT">
<matrix sid="transform">1 0 0 0.022 0 1 0 0 0 0 1 0.007 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
<node id="mFaceJawShaper" name="mFaceJawShaper" sid="mFaceJawShaper" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mFaceForeheadCenter" name="mFaceForeheadCenter" sid="mFaceForeheadCenter" type="JOINT">
<matrix sid="transform">1 0 0 0.069 0 1 0 0 0 0 1 0.065 0 0 0 1</matrix>
</node>
<node id="mFaceNoseBase" name="mFaceNoseBase" sid="mFaceNoseBase" type="JOINT">
<matrix sid="transform">1 0 0 0.094 0 1 0 0 0 0 1 -0.016 0 0 0 1</matrix>
</node>
<node id="mFaceTeethUpper" name="mFaceTeethUpper" sid="mFaceTeethUpper" type="JOINT">
<matrix sid="transform">1 0 0 0.02 0 1 0 0 0 0 1 -0.03 0 0 0 1</matrix>
<node id="mFaceLipUpperLeft" name="mFaceLipUpperLeft" sid="mFaceLipUpperLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.045 0 1 0 0 0 0 1 -0.003 0 0 0 1</matrix>
</node>
<node id="mFaceLipUpperRight" name="mFaceLipUpperRight" sid="mFaceLipUpperRight" type="JOINT">
<matrix sid="transform">1 0 0 0.045 0 1 0 0 0 0 1 -0.003 0 0 0 1</matrix>
</node>
<node id="mFaceLipCornerLeft" name="mFaceLipCornerLeft" sid="mFaceLipCornerLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.028 0 1 0 -0.019 0 0 1 -0.01 0 0 0 1</matrix>
</node>
<node id="mFaceLipCornerRight" name="mFaceLipCornerRight" sid="mFaceLipCornerRight" type="JOINT">
<matrix sid="transform">1 0 0 0.028 0 1 0 0.019 0 0 1 -0.01 0 0 0 1</matrix>
</node>
<node id="mFaceLipUpperCenter" name="mFaceLipUpperCenter" sid="mFaceLipUpperCenter" type="JOINT">
<matrix sid="transform">1 0 0 0.045 0 1 0 0 0 0 1 -0.003 0 0 0 1</matrix>
</node>
</node>
<node id="mFaceEyecornerInnerLeft" name="mFaceEyecornerInnerLeft" sid="mFaceEyecornerInnerLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.075 0 1 0 0.017 0 0 1 0.032 0 0 0 1</matrix>
</node>
<node id="mFaceEyecornerInnerRight" name="mFaceEyecornerInnerRight" sid="mFaceEyecornerInnerRight" type="JOINT">
<matrix sid="transform">1 0 0 0.075 0 1 0 -0.017 0 0 1 0.032 0 0 0 1</matrix>
</node>
<node id="mFaceNoseBridge" name="mFaceNoseBridge" sid="mFaceNoseBridge" type="JOINT">
<matrix sid="transform">1 0 0 0.091 0 1 0 0 0 0 1 0.02 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
<node id="mCollarLeft" name="mCollarLeft" sid="mCollarLeft" type="JOINT">
<matrix sid="transform">1 0 0 -0.021 0 1 0 0.085 0 0 1 0.165 0 0 0 1</matrix>
<node id="L_CLAVICLE" name="L_CLAVICLE" sid="L_CLAVICLE" type="JOINT">
<matrix sid="transform">1 0 0 0.02 0 1 0 0 0 0 1 0.02 0 0 0 1</matrix>
</node>
<node id="mShoulderLeft" name="mShoulderLeft" sid="mShoulderLeft" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0.079 0 0 1 0 0 0 0 1</matrix>
<node id="L_UPPER_ARM" name="L_UPPER_ARM" sid="L_UPPER_ARM" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0.12 0 0 1 0.01 0 0 0 1</matrix>
</node>
<node id="mElbowLeft" name="mElbowLeft" sid="mElbowLeft" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0.248 0 0 1 0 0 0 0 1</matrix>
<node id="L_LOWER_ARM" name="L_LOWER_ARM" sid="L_LOWER_ARM" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0.1 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mWristLeft" name="mWristLeft" sid="mWristLeft" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 0.205 0 0 1 0 0 0 0 1</matrix>
<node id="L_HAND" name="L_HAND" sid="L_HAND" type="JOINT">
<matrix sid="transform">1 0 0 0.01 0 1 0 0.05 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mHandMiddle1Left" name="mHandMiddle1Left" sid="mHandMiddle1Left" type="JOINT">
<matrix sid="transform">1 0 0 0.013 0 1 0 0.101 0 0 1 0.015 0 0 0 1</matrix>
<node id="mHandMiddle2Left" name="mHandMiddle2Left" sid="mHandMiddle2Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.001 0 1 0 0.04 0 0 1 -0.006 0 0 0 1</matrix>
<node id="mHandMiddle3Left" name="mHandMiddle3Left" sid="mHandMiddle3Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.001 0 1 0 0.049 0 0 1 -0.008 0 0 0 1</matrix>
</node>
</node>
</node>
<node id="mHandIndex1Left" name="mHandIndex1Left" sid="mHandIndex1Left" type="JOINT">
<matrix sid="transform">1 0 0 0.038 0 1 0 0.097 0 0 1 0.015 0 0 0 1</matrix>
<node id="mHandIndex2Left" name="mHandIndex2Left" sid="mHandIndex2Left" type="JOINT">
<matrix sid="transform">1 0 0 0.017 0 1 0 0.036 0 0 1 -0.006 0 0 0 1</matrix>
<node id="mHandIndex3Left" name="mHandIndex3Left" sid="mHandIndex3Left" type="JOINT">
<matrix sid="transform">1 0 0 0.014 0 1 0 0.032 0 0 1 -0.006 0 0 0 1</matrix>
</node>
</node>
</node>
<node id="mHandRing1Left" name="mHandRing1Left" sid="mHandRing1Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.01 0 1 0 0.099 0 0 1 0.009 0 0 0 1</matrix>
<node id="mHandRing2Left" name="mHandRing2Left" sid="mHandRing2Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.013 0 1 0 0.038 0 0 1 -0.008 0 0 0 1</matrix>
<node id="mHandRing3Left" name="mHandRing3Left" sid="mHandRing3Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.013 0 1 0 0.04 0 0 1 -0.009 0 0 0 1</matrix>
</node>
</node>
</node>
<node id="mHandPinky1Left" name="mHandPinky1Left" sid="mHandPinky1Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.031 0 1 0 0.095 0 0 1 0.003 0 0 0 1</matrix>
<node id="mHandPinky2Left" name="mHandPinky2Left" sid="mHandPinky2Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.024 0 1 0 0.025 0 0 1 -0.006 0 0 0 1</matrix>
<node id="mHandPinky3Left" name="mHandPinky3Left" sid="mHandPinky3Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.015 0 1 0 0.018 0 0 1 -0.004 0 0 0 1</matrix>
</node>
</node>
</node>
<node id="mHandThumb1Left" name="mHandThumb1Left" sid="mHandThumb1Left" type="JOINT">
<matrix sid="transform">1 0 0 0.031 0 1 0 0.026 0 0 1 0.004 0 0 0 1</matrix>
<node id="mHandThumb2Left" name="mHandThumb2Left" sid="mHandThumb2Left" type="JOINT">
<matrix sid="transform">1 0 0 0.028 0 1 0 0.032 0 0 1 -0.001 0 0 0 1</matrix>
<node id="mHandThumb3Left" name="mHandThumb3Left" sid="mHandThumb3Left" type="JOINT">
<matrix sid="transform">1 0 0 0.023 0 1 0 0.031 0 0 1 -0.001 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
</node>
</node>
</node>
<node id="mCollarRight" name="mCollarRight" sid="mCollarRight" type="JOINT">
<matrix sid="transform">1 0 0 -0.021 0 1 0 -0.085 0 0 1 0.165 0 0 0 1</matrix>
<node id="R_CLAVICLE" name="R_CLAVICLE" sid="R_CLAVICLE" type="JOINT">
<matrix sid="transform">1 0 0 0.02 0 1 0 0 0 0 1 0.02 0 0 0 1</matrix>
</node>
<node id="mShoulderRight" name="mShoulderRight" sid="mShoulderRight" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 -0.079 0 0 1 0 0 0 0 1</matrix>
<node id="R_UPPER_ARM" name="R_UPPER_ARM" sid="R_UPPER_ARM" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 -0.12 0 0 1 0.01 0 0 0 1</matrix>
</node>
<node id="mElbowRight" name="mElbowRight" sid="mElbowRight" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 -0.248 0 0 1 0 0 0 0 1</matrix>
<node id="R_LOWER_ARM" name="R_LOWER_ARM" sid="R_LOWER_ARM" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 -0.1 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mWristRight" name="mWristRight" sid="mWristRight" type="JOINT">
<matrix sid="transform">1 0 0 0 0 1 0 -0.205 0 0 1 0 0 0 0 1</matrix>
<node id="R_HAND" name="R_HAND" sid="R_HAND" type="JOINT">
<matrix sid="transform">1 0 0 0.01 0 1 0 -0.05 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mHandMiddle1Right" name="mHandMiddle1Right" sid="mHandMiddle1Right" type="JOINT">
<matrix sid="transform">1 0 0 0.013 0 1 0 -0.101 0 0 1 0.015 0 0 0 1</matrix>
<node id="mHandMiddle2Right" name="mHandMiddle2Right" sid="mHandMiddle2Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.001 0 1 0 -0.04 0 0 1 -0.006 0 0 0 1</matrix>
<node id="mHandMiddle3Right" name="mHandMiddle3Right" sid="mHandMiddle3Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.001 0 1 0 -0.049 0 0 1 -0.008 0 0 0 1</matrix>
</node>
</node>
</node>
<node id="mHandIndex1Right" name="mHandIndex1Right" sid="mHandIndex1Right" type="JOINT">
<matrix sid="transform">1 0 0 0.038 0 1 0 -0.097 0 0 1 0.015 0 0 0 1</matrix>
<node id="mHandIndex2Right" name="mHandIndex2Right" sid="mHandIndex2Right" type="JOINT">
<matrix sid="transform">1 0 0 0.017 0 1 0 -0.036 0 0 1 -0.006 0 0 0 1</matrix>
<node id="mHandIndex3Right" name="mHandIndex3Right" sid="mHandIndex3Right" type="JOINT">
<matrix sid="transform">1 0 0 0.014 0 1 0 -0.032 0 0 1 -0.006 0 0 0 1</matrix>
</node>
</node>
</node>
<node id="mHandRing1Right" name="mHandRing1Right" sid="mHandRing1Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.01 0 1 0 -0.099 0 0 1 0.009 0 0 0 1</matrix>
<node id="mHandRing2Right" name="mHandRing2Right" sid="mHandRing2Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.013 0 1 0 -0.038 0 0 1 -0.008 0 0 0 1</matrix>
<node id="mHandRing3Right" name="mHandRing3Right" sid="mHandRing3Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.013 0 1 0 -0.04 0 0 1 -0.009 0 0 0 1</matrix>
</node>
</node>
</node>
<node id="mHandPinky1Right" name="mHandPinky1Right" sid="mHandPinky1Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.031 0 1 0 -0.095 0 0 1 0.003 0 0 0 1</matrix>
<node id="mHandPinky2Right" name="mHandPinky2Right" sid="mHandPinky2Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.024 0 1 0 -0.025 0 0 1 -0.006 0 0 0 1</matrix>
<node id="mHandPinky3Right" name="mHandPinky3Right" sid="mHandPinky3Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.015 0 1 0 -0.018 0 0 1 -0.004 0 0 0 1</matrix>
</node>
</node>
</node>
<node id="mHandThumb1Right" name="mHandThumb1Right" sid="mHandThumb1Right" type="JOINT">
<matrix sid="transform">1 0 0 0.031 0 1 0 -0.026 0 0 1 0.004 0 0 0 1</matrix>
<node id="mHandThumb2Right" name="mHandThumb2Right" sid="mHandThumb2Right" type="JOINT">
<matrix sid="transform">1 0 0 0.028 0 1 0 -0.032 0 0 1 -0.001 0 0 0 1</matrix>
<node id="mHandThumb3Right" name="mHandThumb3Right" sid="mHandThumb3Right" type="JOINT">
<matrix sid="transform">1 0 0 0.023 0 1 0 -0.031 0 0 1 -0.001 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
</node>
</node>
</node>
<node id="mWingsRoot" name="mWingsRoot" sid="mWingsRoot" type="JOINT">
<matrix sid="transform">1 0 0 -0.014 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
<node id="mWing1Left" name="mWing1Left" sid="mWing1Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.099 0 1 0 0.105 0 0 1 0.181 0 0 0 1</matrix>
<node id="mWing2Left" name="mWing2Left" sid="mWing2Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.168 0 1 0 0.169 0 0 1 0.067 0 0 0 1</matrix>
<node id="mWing3Left" name="mWing3Left" sid="mWing3Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.181 0 1 0 0.183 0 0 1 0 0 0 0 1</matrix>
<node id="mWing4Left" name="mWing4Left" sid="mWing4Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.171 0 1 0 0.173 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mWing4FanLeft" name="mWing4FanLeft" sid="mWing4FanLeft" type="JOINT">
<matrix sid="transform">1 0 0 -0.171 0 1 0 0.173 0 0 1 0 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
<node id="mWing1Right" name="mWing1Right" sid="mWing1Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.099 0 1 0 -0.105 0 0 1 0.181 0 0 0 1</matrix>
<node id="mWing2Right" name="mWing2Right" sid="mWing2Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.168 0 1 0 -0.169 0 0 1 0.067 0 0 0 1</matrix>
<node id="mWing3Right" name="mWing3Right" sid="mWing3Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.181 0 1 0 -0.183 0 0 1 0 0 0 0 1</matrix>
<node id="mWing4Right" name="mWing4Right" sid="mWing4Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.171 0 1 0 -0.173 0 0 1 0 0 0 0 1</matrix>
</node>
<node id="mWing4FanRight" name="mWing4FanRight" sid="mWing4FanRight" type="JOINT">
<matrix sid="transform">1 0 0 -0.171 0 1 0 -0.173 0 0 1 0 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
</node>
</node>
</node>
</node>
</node>
</node>
</node>
<node id="mHipRight" name="mHipRight" sid="mHipRight" type="JOINT">
<matrix sid="transform">1 0 0 0.034 0 1 0 -0.129 0 0 1 -0.041 0 0 0 1</matrix>
<node id="R_UPPER_LEG" name="R_UPPER_LEG" sid="R_UPPER_LEG" type="JOINT">
<matrix sid="transform">1 0 0 -0.02 0 1 0 0.05 0 0 1 -0.22 0 0 0 1</matrix>
</node>
<node id="mKneeRight" name="mKneeRight" sid="mKneeRight" type="JOINT">
<matrix sid="transform">1 0 0 -0.001 0 1 0 0.049 0 0 1 -0.491 0 0 0 1</matrix>
<node id="R_LOWER_LEG" name="R_LOWER_LEG" sid="R_LOWER_LEG" type="JOINT">
<matrix sid="transform">1 0 0 -0.02 0 1 0 0 0 0 1 -0.2 0 0 0 1</matrix>
</node>
<node id="mAnkleRight" name="mAnkleRight" sid="mAnkleRight" type="JOINT">
<matrix sid="transform">1 0 0 -0.029 0 1 0 0 0 0 1 -0.468 0 0 0 1</matrix>
<node id="R_FOOT" name="R_FOOT" sid="R_FOOT" type="JOINT">
<matrix sid="transform">1 0 0 0.077 0 1 0 0 0 0 1 -0.041 0 0 0 1</matrix>
</node>
<node id="mFootRight" name="mFootRight" sid="mFootRight" type="JOINT">
<matrix sid="transform">1 0 0 0.112 0 1 0 0 0 0 1 -0.061 0 0 0 1</matrix>
<node id="mToeRight" name="mToeRight" sid="mToeRight" type="JOINT">
<matrix sid="transform">1 0 0 0.109 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
</node>
<node id="mHipLeft" name="mHipLeft" sid="mHipLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.034 0 1 0 0.127 0 0 1 -0.041 0 0 0 1</matrix>
<node id="L_UPPER_LEG" name="L_UPPER_LEG" sid="L_UPPER_LEG" type="JOINT">
<matrix sid="transform">1 0 0 -0.02 0 1 0 -0.05 0 0 1 -0.22 0 0 0 1</matrix>
</node>
<node id="mKneeLeft" name="mKneeLeft" sid="mKneeLeft" type="JOINT">
<matrix sid="transform">1 0 0 -0.001 0 1 0 -0.046 0 0 1 -0.491 0 0 0 1</matrix>
<node id="L_LOWER_LEG" name="L_LOWER_LEG" sid="L_LOWER_LEG" type="JOINT">
<matrix sid="transform">1 0 0 -0.02 0 1 0 0 0 0 1 -0.2 0 0 0 1</matrix>
</node>
<node id="mAnkleLeft" name="mAnkleLeft" sid="mAnkleLeft" type="JOINT">
<matrix sid="transform">1 0 0 -0.029 0 1 0 0.001 0 0 1 -0.468 0 0 0 1</matrix>
<node id="L_FOOT" name="L_FOOT" sid="L_FOOT" type="JOINT">
<matrix sid="transform">1 0 0 0.077 0 1 0 0 0 0 1 -0.041 0 0 0 1</matrix>
</node>
<node id="mFootLeft" name="mFootLeft" sid="mFootLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.112 0 1 0 0 0 0 1 -0.061 0 0 0 1</matrix>
<node id="mToeLeft" name="mToeLeft" sid="mToeLeft" type="JOINT">
<matrix sid="transform">1 0 0 0.109 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
</node>
<node id="mTail1" name="mTail1" sid="mTail1" type="JOINT">
<matrix sid="transform">1 0 0 -0.116 0 1 0 0 0 0 1 0.047 0 0 0 1</matrix>
<node id="mTail2" name="mTail2" sid="mTail2" type="JOINT">
<matrix sid="transform">1 0 0 -0.197 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
<node id="mTail3" name="mTail3" sid="mTail3" type="JOINT">
<matrix sid="transform">1 0 0 -0.168 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
<node id="mTail4" name="mTail4" sid="mTail4" type="JOINT">
<matrix sid="transform">1 0 0 -0.142 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
<node id="mTail5" name="mTail5" sid="mTail5" type="JOINT">
<matrix sid="transform">1 0 0 -0.112 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
<node id="mTail6" name="mTail6" sid="mTail6" type="JOINT">
<matrix sid="transform">1 0 0 -0.094 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
</node>
</node>
<node id="mGroin" name="mGroin" sid="mGroin" type="JOINT">
<matrix sid="transform">1 0 0 0.064 0 1 0 0 0 0 1 -0.097 0 0 0 1</matrix>
</node>
<node id="mHindLimbsRoot" name="mHindLimbsRoot" sid="mHindLimbsRoot" type="JOINT">
<matrix sid="transform">1 0 0 -0.2 0 1 0 0 0 0 1 0.084 0 0 0 1</matrix>
<node id="mHindLimb1Left" name="mHindLimb1Left" sid="mHindLimb1Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.204 0 1 0 0.129 0 0 1 -0.125 0 0 0 1</matrix>
<node id="mHindLimb2Left" name="mHindLimb2Left" sid="mHindLimb2Left" type="JOINT">
<matrix sid="transform">1 0 0 0.002 0 1 0 -0.046 0 0 1 -0.491 0 0 0 1</matrix>
<node id="mHindLimb3Left" name="mHindLimb3Left" sid="mHindLimb3Left" type="JOINT">
<matrix sid="transform">1 0 0 -0.03 0 1 0 -0.003 0 0 1 -0.468 0 0 0 1</matrix>
<node id="mHindLimb4Left" name="mHindLimb4Left" sid="mHindLimb4Left" type="JOINT">
<matrix sid="transform">1 0 0 0.112 0 1 0 0 0 0 1 -0.061 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
<node id="mHindLimb1Right" name="mHindLimb1Right" sid="mHindLimb1Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.204 0 1 0 -0.129 0 0 1 -0.125 0 0 0 1</matrix>
<node id="mHindLimb2Right" name="mHindLimb2Right" sid="mHindLimb2Right" type="JOINT">
<matrix sid="transform">1 0 0 0.002 0 1 0 0.046 0 0 1 -0.491 0 0 0 1</matrix>
<node id="mHindLimb3Right" name="mHindLimb3Right" sid="mHindLimb3Right" type="JOINT">
<matrix sid="transform">1 0 0 -0.03 0 1 0 0.003 0 0 1 -0.468 0 0 0 1</matrix>
<node id="mHindLimb4Right" name="mHindLimb4Right" sid="mHindLimb4Right" type="JOINT">
<matrix sid="transform">1 0 0 0.112 0 1 0 0 0 0 1 -0.061 0 0 0 1</matrix>
</node>
</node>
</node>
</node>
</node>
</node>
</node>


@@ -18,6 +18,8 @@ You should have received a copy of the GNU Lesser General Public License
along with this program; if not, write to the Free Software Foundation,
Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""
from __future__ import annotations
import ast
import enum
import hashlib
@@ -58,6 +60,9 @@ class TupleCoord(recordclass.datatuple, _IterableStub): # type: ignore
    def __abs__(self):
        return self.__class__(*(abs(x) for x in self))

    def __neg__(self):
        return self.__class__(*(-x for x in self))

    def __add__(self, other):
        return self.__class__(*(x + y for x, y in zip(self, other)))
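The elementwise operators above lend themselves to a quick illustration. A minimal sketch using a plain `typing.NamedTuple` (the real `TupleCoord` is built on `recordclass.datatuple`; `Vec3` here is a hypothetical stand-in):

```python
from typing import NamedTuple


class Vec3(NamedTuple):
    x: float
    y: float
    z: float

    def __neg__(self):
        # Negate each component
        return Vec3(*(-v for v in self))

    def __add__(self, other):
        # Elementwise addition rather than tuple concatenation
        return Vec3(*(a + b for a, b in zip(self, other)))


assert Vec3(1, 2, 3) + Vec3(4, 5, 6) == Vec3(5, 7, 9)
assert -Vec3(1, -2, 3) == Vec3(-1, 2, -3)
```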
@@ -244,6 +249,7 @@ class Quaternion(TupleCoord):
class UUID(uuid.UUID):
    _NULL_UUID_STR = '00000000-0000-0000-0000-000000000000'
    ZERO: UUID

    __slots__ = ()

    def __init__(self, val: Union[uuid.UUID, str, None] = None, bytes=None, int=None):
@@ -268,12 +274,16 @@ class UUID(uuid.UUID):
        return self.__class__(int=self.int ^ other.int)


UUID.ZERO = UUID()


class JankStringyBytes(bytes):
    """
    Treat bytes as UTF8 if used in string context

    Sinful, but necessary evil for now since templates don't specify what's
-   binary and what's a string.
+   binary and what's a string. There are also certain fields where the value
+   may be either binary _or_ a string, depending on the context.
    """

    __slots__ = ()
@@ -288,6 +298,11 @@ class JankStringyBytes(bytes):
    def __ne__(self, other):
        return not self.__eq__(other)

    def __contains__(self, item):
        if isinstance(item, str):
            return item in str(self)
        return item in bytes(self)


class RawBytes(bytes):
    __slots__ = ()
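The behavior the docstring describes can be sketched with a hypothetical `StringyBytes` class, a simplified stand-in for `JankStringyBytes` (the string-context comparison shown is an assumption about the unshown `__eq__`/`__str__` methods):

```python
class StringyBytes(bytes):
    """bytes that also compare equal to their UTF-8 decoding."""
    __slots__ = ()

    def __str__(self):
        return self.decode("utf8")

    def __eq__(self, other):
        if isinstance(other, str):
            return str(self) == other
        return bytes(self) == other

    def __ne__(self, other):
        return not self.__eq__(other)

    # Defining __eq__ would otherwise set __hash__ to None
    __hash__ = bytes.__hash__

    def __contains__(self, item):
        if isinstance(item, str):
            return item in str(self)
        return item in bytes(self)


val = StringyBytes(b"Hello")
assert val == "Hello" and val == b"Hello"
assert "ell" in val and b"ell" in val
```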


@@ -2,6 +2,8 @@ from __future__ import annotations
import codecs
import functools
import os
import pkg_resources
import re
import weakref
@@ -145,3 +147,10 @@ def to_chunks(chunkable: Sequence[_T], chunk_size: int) -> Generator[_T, None, N
    while chunkable:
        yield chunkable[:chunk_size]
        chunkable = chunkable[chunk_size:]


def get_mtime(path):
    try:
        return os.stat(path).st_mtime
    except:
        return None
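`to_chunks()` above peels fixed-size chunks off the front of a sequence; reproduced standalone with a usage example:

```python
from typing import Generator, Sequence, TypeVar

_T = TypeVar("_T")


def to_chunks(chunkable: Sequence[_T], chunk_size: int) -> Generator[Sequence[_T], None, None]:
    # Repeatedly slice chunk_size elements off the front; the final
    # chunk may be shorter than chunk_size.
    while chunkable:
        yield chunkable[:chunk_size]
        chunkable = chunkable[chunk_size:]


assert list(to_chunks([1, 2, 3, 4, 5], 2)) == [[1, 2], [3, 4], [5]]
assert list(to_chunks("abcdef", 4)) == ["abcd", "ef"]
```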


@@ -7,8 +7,9 @@ from __future__ import annotations
import dataclasses
import datetime as dt
import itertools
import logging
import struct
import typing
import weakref
from io import StringIO
from typing import *
@@ -33,6 +34,17 @@ LOG = logging.getLogger(__name__)
_T = TypeVar("_T")
class SchemaFlagField(SchemaHexInt):
    """Like a hex int, but must be serialized as bytes in LLSD due to being a U32"""

    @classmethod
    def from_llsd(cls, val: Any) -> int:
        return struct.unpack("!I", val)[0]

    @classmethod
    def to_llsd(cls, val: int) -> Any:
        return struct.pack("!I", val)
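The U32 round-trip above is plain `struct` packing; a quick demonstration of the big-endian 4-byte form the field uses:

```python
import struct

flags = 256  # 0x00000100
packed = struct.pack("!I", flags)            # network-order (big-endian) U32
assert packed == b"\x00\x00\x01\x00"
assert struct.unpack("!I", packed)[0] == flags

# Round-trip an arbitrary 32-bit value
assert struct.unpack("!I", struct.pack("!I", 0xDEADBEEF))[0] == 0xDEADBEEF
```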
def _yield_schema_tokens(reader: StringIO):
    in_bracket = False

    # empty str == EOF in Python
@@ -76,7 +88,7 @@ class InventoryBase(SchemaBase):
        if schema_name != cls.SCHEMA_NAME:
            raise ValueError(f"Expected schema name {schema_name!r} to be {cls.SCHEMA_NAME!r}")
-       fields = cls._fields_dict()
+       fields = cls._get_fields_dict()
        obj_dict = {}
        for key, val in tok_iter:
            if key in fields:
@@ -100,7 +112,7 @@ class InventoryBase(SchemaBase):
    def to_writer(self, writer: StringIO):
        writer.write(f"\t{self.SCHEMA_NAME}\t0\n")
        writer.write("\t{\n")
-       for field_name, field in self._fields_dict().items():
+       for field_name, field in self._get_fields_dict().items():
            spec = field.metadata.get("spec")
            # Not meant to be serialized
            if not spec:
@@ -120,10 +132,14 @@ class InventoryBase(SchemaBase):
        writer.write("\t}\n")


class InventoryDifferences(typing.NamedTuple):
    changed: List[InventoryNodeBase]
    removed: List[InventoryNodeBase]


class InventoryModel(InventoryBase):
    def __init__(self):
-       self.containers: Dict[UUID, InventoryContainerBase] = {}
-       self.items: Dict[UUID, InventoryItem] = {}
+       self.nodes: Dict[UUID, InventoryNodeBase] = {}
        self.root: Optional[InventoryContainerBase] = None

    @classmethod
@@ -133,48 +149,113 @@ class InventoryModel(InventoryBase):
            if key == "inv_object":
                obj = InventoryObject.from_reader(reader)
                if obj is not None:
-                   model.add_container(obj)
+                   model.add(obj)
            elif key == "inv_category":
                cat = InventoryCategory.from_reader(reader)
                if cat is not None:
-                   model.add_container(cat)
+                   model.add(cat)
            elif key == "inv_item":
                item = InventoryItem.from_reader(reader)
                if item is not None:
-                   model.add_item(item)
+                   model.add(item)
            else:
                LOG.warning("Unknown key {0}".format(key))
-       model.reparent_nodes()
        return model
    @classmethod
    def from_llsd(cls, llsd_val: List[Dict]) -> InventoryModel:
        model = cls()
        for obj_dict in llsd_val:
            if InventoryCategory.ID_ATTR in obj_dict:
                if (obj := InventoryCategory.from_llsd(obj_dict)) is not None:
                    model.add(obj)
            elif InventoryObject.ID_ATTR in obj_dict:
                if (obj := InventoryObject.from_llsd(obj_dict)) is not None:
                    model.add(obj)
            elif InventoryItem.ID_ATTR in obj_dict:
                if (obj := InventoryItem.from_llsd(obj_dict)) is not None:
                    model.add(obj)
            else:
                LOG.warning(f"Unknown object type {obj_dict!r}")
        return model

    @property
    def ordered_nodes(self) -> Iterable[InventoryNodeBase]:
        yield from self.all_containers
        yield from self.all_items

    @property
    def all_containers(self) -> Iterable[InventoryContainerBase]:
        for node in self.nodes.values():
            if isinstance(node, InventoryContainerBase):
                yield node

    @property
    def all_items(self) -> Iterable[InventoryItem]:
        for node in self.nodes.values():
            if not isinstance(node, InventoryContainerBase):
                yield node

    def __eq__(self, other):
        if not isinstance(other, InventoryModel):
            return False
        return set(self.nodes.values()) == set(other.nodes.values())
    def to_writer(self, writer: StringIO):
-       for container in self.containers.values():
-           container.to_writer(writer)
-       for item in self.items.values():
-           item.to_writer(writer)
+       for node in self.ordered_nodes:
+           node.to_writer(writer)

-   def add_container(self, container: InventoryContainerBase):
-       self.containers[container.node_id] = container
-       container.model = weakref.proxy(self)
+   def to_llsd(self):
+       return list(node.to_llsd() for node in self.ordered_nodes)

-   def add_item(self, item: InventoryItem):
-       self.items[item.item_id] = item
-       item.model = weakref.proxy(self)
+   def add(self, node: InventoryNodeBase):
+       if node.node_id in self.nodes:
+           raise KeyError(f"{node.node_id} already exists in the inventory model")

-   def reparent_nodes(self):
-       self.root = None
-       for container in self.containers.values():
-           container.children.clear()
-           if container.parent_id == UUID():
-               self.root = container
-       for obj in itertools.chain(self.items.values(), self.containers.values()):
-           if not obj.parent_id or obj.parent_id == UUID():
-               continue
-           parent_container = self.containers.get(obj.parent_id)
-           if not parent_container:
-               LOG.warning("{0} had an invalid parent {1}".format(obj, obj.parent_id))
-               continue
-           parent_container.children.append(obj)
+       self.nodes[node.node_id] = node
+       if isinstance(node, InventoryContainerBase):
+           if node.parent_id == UUID.ZERO:
+               self.root = node
+       node.model = weakref.proxy(self)
    def unlink(self, node: InventoryNodeBase) -> Sequence[InventoryNodeBase]:
        """Unlink a node and its descendants from the tree, returning the removed nodes"""
        assert node.model == self
        if node == self.root:
            self.root = None
        unlinked = [node]
        if isinstance(node, InventoryContainerBase):
            for child in node.children:
                unlinked.extend(self.unlink(child))
        self.nodes.pop(node.node_id, None)
        node.model = None
        return unlinked

    def get_differences(self, other: InventoryModel) -> InventoryDifferences:
        # Includes modified things with the same ID
        changed_in_other = []
        removed_in_other = []
        other_keys = set(other.nodes.keys())
        our_keys = set(self.nodes.keys())

        # Removed
        for key in our_keys - other_keys:
            removed_in_other.append(self.nodes[key])

        # Updated
        for key in other_keys.intersection(our_keys):
            other_node = other.nodes[key]
            if other_node != self.nodes[key]:
                changed_in_other.append(other_node)

        # Added
        for key in other_keys - our_keys:
            changed_in_other.append(other.nodes[key])

        return InventoryDifferences(
            changed=changed_in_other,
            removed=removed_in_other,
        )
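The three-way key comparison in `get_differences()` can be sketched with plain dicts mapping node ID to value (`ours`/`theirs` are illustration-only names, not part of the library):

```python
ours = {"a": 1, "b": 2, "c": 3}
theirs = {"b": 20, "c": 3, "d": 4}

# Only in ours -> removed
removed = [ours[k] for k in ours.keys() - theirs.keys()]
# Same ID, different value -> changed
changed = [theirs[k] for k in theirs.keys() & ours.keys() if theirs[k] != ours[k]]
# Only in theirs -> also reported as changed
changed += [theirs[k] for k in theirs.keys() - ours.keys()]

assert removed == [1]
assert sorted(changed) == [4, 20]
```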
@dataclasses.dataclass
@@ -204,16 +285,27 @@ class InventorySaleInfo(InventoryBase):
class InventoryNodeBase(InventoryBase):
    ID_ATTR: ClassVar[str]

    name: str
    parent_id: Optional[UUID] = schema_field(SchemaUUID)
-   model: Optional[InventoryModel] = dataclasses.field(default=None, init=False)
+   model: Optional[InventoryModel] = dataclasses.field(
+       default=None, init=False, hash=False, compare=False, repr=False
+   )

    @property
    def node_id(self) -> UUID:
        return getattr(self, self.ID_ATTR)

    @node_id.setter
    def node_id(self, val: UUID):
        setattr(self, self.ID_ATTR, val)

    @property
-   def parent(self):
-       return self.model.containers.get(self.parent_id)
+   def parent(self) -> Optional[InventoryContainerBase]:
+       return self.model.nodes.get(self.parent_id)

    def unlink(self) -> Sequence[InventoryNodeBase]:
        return self.model.unlink(self)
    @classmethod
    def _obj_from_dict(cls, obj_dict):
@@ -224,12 +316,58 @@ class InventoryNodeBase(InventoryBase):
            return None
        return super()._obj_from_dict(obj_dict)
    def __hash__(self):
        return hash(self.node_id)

    def __iter__(self) -> Iterator[InventoryNodeBase]:
        return iter(())

    def __contains__(self, item) -> bool:
        return item in tuple(self)


@dataclasses.dataclass
class InventoryContainerBase(InventoryNodeBase):
    type: str = schema_field(SchemaStr)
    name: str = schema_field(SchemaMultilineStr)
-   children: List[InventoryNodeBase] = dataclasses.field(default_factory=list, init=False)

+   @property
+   def children(self) -> Sequence[InventoryNodeBase]:
+       return tuple(
+           x for x in self.model.nodes.values()
+           if x.parent_id == self.node_id
+       )

    def __getitem__(self, item: Union[int, str]) -> InventoryNodeBase:
        if isinstance(item, int):
            return self.children[item]
        for child in self.children:
            if child.name == item:
                return child
        raise KeyError(f"{item!r} not found in children")

    def __iter__(self) -> Iterator[InventoryNodeBase]:
        return iter(self.children)

    def get_or_create_subcategory(self, name: str) -> InventoryCategory:
        for child in self:
            if child.name == name and isinstance(child, InventoryCategory):
                return child
        child = InventoryCategory(
            name=name,
            cat_id=UUID.random(),
            parent_id=self.node_id,
            type="category",
            pref_type="-1",
            owner_id=getattr(self, 'owner_id', UUID.ZERO),
            version=1,
        )
        self.model.add(child)
        return child

    # So autogenerated __hash__ doesn't kill our inherited one
    __hash__ = InventoryNodeBase.__hash__
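The `children` property above derives the tree purely from `parent_id` pointers instead of storing child lists; a minimal sketch of the same lookup with plain dicts (all names hypothetical):

```python
# node ID -> record holding a parent pointer; insertion order is preserved
nodes = {
    "root":  {"parent": None},
    "cat1":  {"parent": "root"},
    "item1": {"parent": "cat1"},
    "item2": {"parent": "cat1"},
}


def children(node_id):
    # Scan all nodes for those whose parent pointer matches
    return tuple(k for k, v in nodes.items() if v["parent"] == node_id)


assert children("cat1") == ("item1", "item2")
assert children("root") == ("cat1",)
assert children("item1") == ()
```

Recomputing children on demand means a reparent is just a single `parent_id` assignment, with no child lists to keep in sync.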
@dataclasses.dataclass
@@ -239,17 +377,21 @@ class InventoryObject(InventoryContainerBase):
    obj_id: UUID = schema_field(SchemaUUID)

    __hash__ = InventoryNodeBase.__hash__


@dataclasses.dataclass
class InventoryCategory(InventoryContainerBase):
    ID_ATTR: ClassVar[str] = "cat_id"
-   SCHEMA_NAME: ClassVar[str] = "inv_object"
+   SCHEMA_NAME: ClassVar[str] = "inv_category"

    cat_id: UUID = schema_field(SchemaUUID)
-   pref_type: str = schema_field(SchemaStr)
+   pref_type: str = schema_field(SchemaStr, llsd_name="preferred_type")
    owner_id: UUID = schema_field(SchemaUUID)
    version: int = schema_field(SchemaInt)

    __hash__ = InventoryNodeBase.__hash__
@dataclasses.dataclass
class InventoryItem(InventoryNodeBase):
@@ -259,15 +401,17 @@ class InventoryItem(InventoryNodeBase):
    item_id: UUID = schema_field(SchemaUUID)
    type: str = schema_field(SchemaStr)
    inv_type: str = schema_field(SchemaStr)
-   flags: int = schema_field(SchemaHexInt)
+   flags: int = schema_field(SchemaFlagField)
    name: str = schema_field(SchemaMultilineStr)
    desc: str = schema_field(SchemaMultilineStr)
-   creation_date: dt.datetime = schema_field(SchemaDate)
+   creation_date: dt.datetime = schema_field(SchemaDate, llsd_name="created_at")
    permissions: InventoryPermissions = schema_field(InventoryPermissions)
    sale_info: InventorySaleInfo = schema_field(InventorySaleInfo)
    asset_id: Optional[UUID] = schema_field(SchemaUUID, default=None)
    shadow_id: Optional[UUID] = schema_field(SchemaUUID, default=None)

    __hash__ = InventoryNodeBase.__hash__

    @property
    def true_asset_id(self) -> UUID:
        if self.asset_id is not None:


@@ -1,7 +1,6 @@
import os
import tempfile
from io import BytesIO
from typing import *
import defusedxml.ElementTree
from glymur import jp2box, Jp2k
@@ -10,12 +9,6 @@ from glymur import jp2box, Jp2k
jp2box.ET = defusedxml.ElementTree
-SL_DEFAULT_ENCODE = {
-    "cratios": (1920.0, 480.0, 120.0, 30.0, 10.0),
-    "irreversible": True,
-}


class BufferedJp2k(Jp2k):
    """
    For manipulating JP2K from within a binary buffer.
@@ -24,12 +17,7 @@ class BufferedJp2k(Jp2k):
    based on filename, so this is the least brittle approach.
    """

-   def __init__(self, contents: bytes, encode_kwargs: Optional[Dict] = None):
-       if encode_kwargs is None:
-           self.encode_kwargs = SL_DEFAULT_ENCODE.copy()
-       else:
-           self.encode_kwargs = encode_kwargs
+   def __init__(self, contents: bytes):
        stream = BytesIO(contents)
        self.temp_file = tempfile.NamedTemporaryFile(delete=False)
        stream.seek(0)
@@ -44,11 +32,12 @@ class BufferedJp2k(Jp2k):
        os.remove(self.temp_file.name)
        self.temp_file = None

-   def _write(self, img_array, verbose=False, **kwargs):
-       # Glymur normally only lets you control encode params when a write happens within
-       # the constructor. Keep around the encode params from the constructor and pass
-       # them to successive write calls.
-       return super()._write(img_array, verbose=False, **self.encode_kwargs, **kwargs)
+   def _populate_cparams(self, img_array):
+       if self._cratios is None:
+           self._cratios = (1920.0, 480.0, 120.0, 30.0, 10.0)
+       if self._irreversible is None:
+           self.irreversible = True
+       return super()._populate_cparams(img_array)

    def __bytes__(self):
        with open(self.temp_file.name, "rb") as f:


@@ -31,6 +31,14 @@ class SchemaFieldSerializer(abc.ABC, Generic[_T]):
    def serialize(cls, val: _T) -> str:
        pass

    @classmethod
    def from_llsd(cls, val: Any) -> _T:
        return val

    @classmethod
    def to_llsd(cls, val: _T) -> Any:
        return val


class SchemaDate(SchemaFieldSerializer[dt.datetime]):
    @classmethod
@@ -41,6 +49,14 @@ class SchemaDate(SchemaFieldSerializer[dt.datetime]):
def serialize(cls, val: dt.datetime) -> str:
return str(calendar.timegm(val.utctimetuple()))
@classmethod
def from_llsd(cls, val: Any) -> dt.datetime:
return dt.datetime.utcfromtimestamp(val)
@classmethod
def to_llsd(cls, val: dt.datetime):
return calendar.timegm(val.utctimetuple())
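The epoch conversion `SchemaDate` uses round-trips cleanly at whole-second precision; a standalone sketch of the same pair of calls:

```python
import calendar
import datetime as dt


def date_to_llsd(val: dt.datetime) -> int:
    # Same conversion as SchemaDate.to_llsd(): UTC datetime -> epoch seconds
    return calendar.timegm(val.utctimetuple())


def date_from_llsd(val: int) -> dt.datetime:
    # Inverse, as in SchemaDate.from_llsd()
    return dt.datetime.utcfromtimestamp(val)
```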
class SchemaHexInt(SchemaFieldSerializer[int]):
@classmethod
@@ -95,10 +111,11 @@ class SchemaUUID(SchemaFieldSerializer[UUID]):
def schema_field(spec: Type[Union[SchemaBase, SchemaFieldSerializer]], *, default=dataclasses.MISSING, init=True,
repr=True, hash=None, compare=True) -> dataclasses.Field: # noqa
repr=True, hash=None, compare=True, llsd_name=None) -> dataclasses.Field: # noqa
"""Describe a field in the inventory schema and the shape of its value"""
return dataclasses.field(
metadata={"spec": spec}, default=default, init=init, repr=repr, hash=hash, compare=compare
metadata={"spec": spec, "llsd_name": llsd_name}, default=default,
init=init, repr=repr, hash=hash, compare=compare,
)
@@ -121,8 +138,14 @@ def parse_schema_line(line: str):
@dataclasses.dataclass
class SchemaBase(abc.ABC):
@classmethod
def _fields_dict(cls):
return {f.name: f for f in dataclasses.fields(cls)}
def _get_fields_dict(cls, llsd=False):
fields_dict = {}
for field in dataclasses.fields(cls):
field_name = field.name
if llsd:
field_name = field.metadata.get("llsd_name") or field_name
fields_dict[field_name] = field
return fields_dict
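The `llsd_name` indirection means a field can serialize under a different key than its attribute name. A minimal sketch with a hypothetical dataclass (the field names below are invented for illustration):

```python
import dataclasses


@dataclasses.dataclass
class ExampleSchema:
    # Hypothetical field: stored as `asset_id`, serialized as `asset_uuid`
    asset_id: str = dataclasses.field(default="", metadata={"llsd_name": "asset_uuid"})
    name: str = dataclasses.field(default="", metadata={})


def get_fields_dict(cls, llsd: bool = False) -> dict:
    """Same shape as SchemaBase._get_fields_dict(): key by the LLSD name
    when requested and present, falling back to the attribute name."""
    fields_dict = {}
    for field in dataclasses.fields(cls):
        field_name = field.name
        if llsd:
            field_name = field.metadata.get("llsd_name") or field_name
        fields_dict[field_name] = field
    return fields_dict
```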
@classmethod
def from_str(cls, text: str):
@@ -137,6 +160,30 @@ class SchemaBase(abc.ABC):
def from_bytes(cls, data: bytes):
return cls.from_str(data.decode("utf8"))
@classmethod
def from_llsd(cls, inv_dict: Dict):
fields = cls._get_fields_dict(llsd=True)
obj_dict = {}
for key, val in inv_dict.items():
if key in fields:
field: dataclasses.Field = fields[key]
key = field.name
spec = field.metadata.get("spec")
# Not a real key, an internal var on our dataclass
if not spec:
LOG.warning(f"Internal key {key!r}")
continue
# some kind of nested structure like sale_info
if issubclass(spec, SchemaBase):
obj_dict[key] = spec.from_llsd(val)
elif issubclass(spec, SchemaFieldSerializer):
obj_dict[key] = spec.from_llsd(val)
else:
raise ValueError(f"Unsupported spec for {key!r}, {spec!r}")
else:
LOG.warning(f"Unknown key {key!r}")
return cls._obj_from_dict(obj_dict)
def to_bytes(self) -> bytes:
return self.to_str().encode("utf8")
@@ -146,6 +193,28 @@ class SchemaBase(abc.ABC):
writer.seek(0)
return writer.read()
def to_llsd(self):
obj_dict = {}
for field_name, field in self._get_fields_dict(llsd=True).items():
spec = field.metadata.get("spec")
# Not meant to be serialized
if not spec:
continue
val = getattr(self, field.name)
if val is None:
continue
# Some kind of nested structure like sale_info
if isinstance(val, SchemaBase):
val = val.to_llsd()
elif issubclass(spec, SchemaFieldSerializer):
val = spec.to_llsd(val)
else:
raise ValueError(f"Bad inventory spec {spec!r}")
obj_dict[field_name] = val
return obj_dict
@abc.abstractmethod
def to_writer(self, writer: StringIO):
pass

View File

@@ -270,8 +270,8 @@ LOD_SEGMENT_SERIALIZER = SegmentSerializer({
# Each position represents a single vert.
"Position": se.Collection(None, se.Vector3U16(0.0, 1.0)),
"TexCoord0": se.Collection(None, se.Vector2U16(0.0, 1.0)),
# Normals have a static domain between -1 and 1
"Normal": se.Collection(None, se.Vector3U16(0.0, 1.0)),
# Normals have a static domain between -1 and 1, so just use that.
"Normal": se.Collection(None, se.Vector3U16(-1.0, 1.0)),
"Weights": se.Collection(None, VertexWeights)
})

View File

@@ -1,6 +1,9 @@
from __future__ import annotations
import abc
import asyncio
import copy
import dataclasses
import datetime as dt
import logging
from typing import *
@@ -13,6 +16,14 @@ from .msgtypes import PacketFlags
from .udpserializer import UDPMessageSerializer
@dataclasses.dataclass
class ReliableResendInfo:
last_resent: dt.datetime
message: Message
completed: asyncio.Future = dataclasses.field(default_factory=asyncio.Future)
tries_left: int = 10
class Circuit:
def __init__(self, near_host: Optional[ADDR_TUPLE], far_host: ADDR_TUPLE, transport):
self.near_host: Optional[ADDR_TUPLE] = near_host
@@ -22,6 +33,8 @@ class Circuit:
self.serializer = UDPMessageSerializer()
self.last_packet_at = dt.datetime.now()
self.packet_id_base = 0
self.unacked_reliable: Dict[Tuple[Direction, int], ReliableResendInfo] = {}
self.resend_every: float = 3.0
def _send_prepared_message(self, message: Message, transport=None):
try:
@@ -46,24 +59,69 @@ class Circuit:
raise RuntimeError(f"Trying to re-send finalized {message!r}")
message.packet_id = self.packet_id_base
self.packet_id_base += 1
if not message.acks:
message.send_flags &= PacketFlags.ACK
if message.acks:
message.send_flags |= PacketFlags.ACK
else:
message.send_flags &= ~PacketFlags.ACK
# If it was queued, it's not anymore
message.queued = False
message.finalized = True
def send_message(self, message: Message, transport=None):
def send(self, message: Message, transport=None) -> UDPPacket:
if self.prepare_message(message):
# If the message originates from us then we're responsible for resends.
if message.reliable and message.synthetic:
self.unacked_reliable[(message.direction, message.packet_id)] = ReliableResendInfo(
last_resent=dt.datetime.now(),
message=message,
)
return self._send_prepared_message(message, transport)
# Temporary alias
send_message = send
def send_reliable(self, message: Message, transport=None) -> asyncio.Future:
"""send() wrapper that always sends reliably and allows `await`ing ACK receipt"""
if not message.synthetic:
raise ValueError("Not able to send non-synthetic message reliably!")
message.send_flags |= PacketFlags.RELIABLE
self.send(message, transport)
return self.unacked_reliable[(message.direction, message.packet_id)].completed
def collect_acks(self, message: Message):
effective_acks = list(message.acks)
if message.name == "PacketAck":
effective_acks.extend(x["ID"] for x in message["Packets"])
for ack in effective_acks:
resend_info = self.unacked_reliable.pop((~message.direction, ack), None)
if resend_info:
resend_info.completed.set_result(None)
def resend_unacked(self):
for resend_info in list(self.unacked_reliable.values()):
# Not time to attempt a resend yet
if dt.datetime.now() - resend_info.last_resent < dt.timedelta(seconds=self.resend_every):
continue
msg = copy.copy(resend_info.message)
resend_info.tries_left -= 1
# We were on our last try and we never received an ack
if not resend_info.tries_left:
logging.warning(f"Giving up on unacked {msg.packet_id}")
del self.unacked_reliable[(msg.direction, msg.packet_id)]
resend_info.completed.set_exception(TimeoutError("Exceeded resend limit"))
continue
resend_info.last_resent = dt.datetime.now()
msg.send_flags |= PacketFlags.RESENT
self._send_prepared_message(msg)
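The timing check in the resend loop is the part worth pinning down: an entry is only retried once `resend_every` seconds have elapsed since the last attempt. A cut-down sketch of that predicate:

```python
import dataclasses
import datetime as dt


@dataclasses.dataclass
class ResendInfo:
    # Cut-down version of ReliableResendInfo, minus the message and asyncio.Future
    last_resent: dt.datetime
    tries_left: int = 10


def due_for_resend(info: ResendInfo, now: dt.datetime,
                   resend_every: float = 3.0) -> bool:
    # Same check as resend_unacked(): skip entries retried too recently
    return now - info.last_resent >= dt.timedelta(seconds=resend_every)
```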
def send_acks(self, to_ack: Sequence[int], direction=Direction.OUT, packet_id=None):
logging.debug("%r acking %r" % (direction, to_ack))
# TODO: maybe tack this onto `.acks` for next message?
message = Message('PacketAck', *[Block('Packets', ID=x) for x in to_ack])
message.packet_id = packet_id
message.direction = direction
message.injected = True
self.send_message(message)
self.send(message)
def __repr__(self):
return "<%s %r : %r>" % (self.__class__.__name__, self.near_host, self.host)

View File

@@ -188,7 +188,7 @@ class MsgBlockList(List["Block"]):
class Message:
__slots__ = ("name", "send_flags", "packet_id", "acks", "body_boundaries", "queued",
"offset", "raw_extra", "raw_body", "deserializer", "_blocks", "finalized",
"direction", "meta", "injected", "dropped", "sender")
"direction", "meta", "synthetic", "dropped", "sender")
def __init__(self, name, *args, packet_id=None, flags=0, acks=None, direction=None):
# TODO: Do this on a timer or something.
@@ -213,7 +213,7 @@ class Message:
self.queued: bool = False
self._blocks: BLOCK_DICT = {}
self.meta = {}
self.injected = False
self.synthetic = packet_id is None
self.dropped = False
self.sender: Optional[ADDR_TUPLE] = None
@@ -312,7 +312,7 @@ class Message:
"packet_id": self.packet_id,
"meta": self.meta.copy(),
"dropped": self.dropped,
"injected": self.injected,
"synthetic": self.synthetic,
"direction": self.direction.name,
"send_flags": int(self.send_flags),
"extra": self.extra,
@@ -334,7 +334,7 @@ class Message:
msg.packet_id = dict_val['packet_id']
msg.meta = dict_val['meta']
msg.dropped = dict_val['dropped']
msg.injected = dict_val['injected']
msg.synthetic = dict_val['synthetic']
msg.direction = Direction[dict_val['direction']]
msg.send_flags = dict_val['send_flags']
msg.extra = dict_val['extra']
@@ -386,6 +386,7 @@ class Message:
message_copy.packet_id = None
message_copy.dropped = False
message_copy.finalized = False
message_copy.queued = False
return message_copy
def to_summary(self):

View File

@@ -62,9 +62,16 @@ class HumanMessageSerializer:
continue
if first_line:
direction, message_name = line.split(" ", 1)
first_split = [x for x in line.split(" ") if x]
direction, message_name = first_split[:2]
options = [x.strip("[]") for x in first_split[2:]]
msg = Message(message_name)
msg.direction = Direction[direction.upper()]
for option in options:
if option in PacketFlags.__members__:
msg.send_flags |= PacketFlags[option]
elif re.match(r"^\d+$", option):
msg.send_flags |= int(option)
first_line = False
continue
@@ -137,9 +144,17 @@ class HumanMessageSerializer:
if msg.direction is not None:
string += f'{msg.direction.name} '
string += msg.name
flags = msg.send_flags
for poss_flag in iter(PacketFlags):
if flags & poss_flag:
flags &= ~poss_flag
string += f" [{poss_flag.name}]"
# Make sure flags with unknown meanings don't get lost
if flags:
string += f" [{int(flags)}]"
if msg.packet_id is not None:
string += f'\n# {msg.packet_id}: {PacketFlags(msg.send_flags)!r}'
string += f'{", DROPPED" if msg.dropped else ""}{", INJECTED" if msg.injected else ""}'
string += f'\n# ID: {msg.packet_id}'
string += f'{", DROPPED" if msg.dropped else ""}{", SYNTHETIC" if msg.synthetic else ""}'
if msg.extra:
string += f'\n# EXTRA: {msg.extra!r}'
string += '\n\n'

View File
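The option-parsing and flag-dumping halves above are inverses: named flags are ORed in by member lookup, and unknown bits survive round-trips as a bare integer. A sketch with assumed flag values (the real `PacketFlags` members live in hippolyzer's msgtypes module):

```python
import enum
import re


class PacketFlags(enum.IntFlag):
    # Assumed values for illustration; see msgtypes for the real definitions
    ACK = 0x10
    RESENT = 0x20
    RELIABLE = 0x40
    ZEROCODED = 0x80


def parse_flag_options(options):
    """Same loop as the deserializer: named flags via member lookup,
    bare integers so bits with unknown meanings aren't lost."""
    flags = PacketFlags(0)
    for option in options:
        if option in PacketFlags.__members__:
            flags |= PacketFlags[option]
        elif re.match(r"^\d+$", option):
            flags |= int(option)
    return flags
```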

@@ -107,7 +107,8 @@ class MessageHandler(Generic[_T, _K]):
take = self.take_by_default
notifiers = [self.register(name) for name in message_names]
fut = asyncio.get_event_loop().create_future()
loop = asyncio.get_event_loop_policy().get_event_loop()
fut = loop.create_future()
timeout_task = None
async def _canceller():

View File

@@ -68,7 +68,7 @@ class UDPMessageDeserializer:
self.settings = settings or Settings()
self.template_dict = self.DEFAULT_TEMPLATE
def deserialize(self, msg_buff: bytes):
def deserialize(self, msg_buff: bytes) -> Message:
msg = self._parse_message_header(msg_buff)
if not self.settings.ENABLE_DEFERRED_PACKET_PARSING:
try:
@@ -85,6 +85,7 @@ class UDPMessageDeserializer:
reader = se.BufferReader("!", data)
msg: Message = Message("Placeholder")
msg.synthetic = False
msg.send_flags = reader.read(se.U8)
msg.packet_id = reader.read(se.U32)

View File

@@ -71,7 +71,7 @@ class Object(recordclass.datatuple): # type: ignore
ProfileBegin: Optional[int] = None
ProfileEnd: Optional[int] = None
ProfileHollow: Optional[int] = None
TextureEntry: Optional[tmpls.TextureEntry] = None
TextureEntry: Optional[tmpls.TextureEntryCollection] = None
TextureAnim: Optional[tmpls.TextureAnim] = None
NameValue: Optional[Any] = None
Data: Optional[Any] = None
@@ -270,6 +270,9 @@ def normalize_object_update_compressed_data(data: bytes):
# Only used for determining which sections are present
del compressed["Flags"]
# Unlike other ObjectUpdate types, a null value in an ObjectUpdateCompressed
# always means that there is no value, not that the value hasn't changed
# from the client's view. Use the default value when that happens.
ps_block = compressed.pop("PSBlockNew", None)
if ps_block is None:
ps_block = compressed.pop("PSBlock", None)
@@ -278,6 +281,20 @@ def normalize_object_update_compressed_data(data: bytes):
compressed.pop("PSBlock", None)
if compressed["NameValue"] is None:
compressed["NameValue"] = NameValueCollection()
if compressed["Text"] is None:
compressed["Text"] = b""
compressed["TextColor"] = b""
if compressed["MediaURL"] is None:
compressed["MediaURL"] = b""
if compressed["AngularVelocity"] is None:
compressed["AngularVelocity"] = Vector3()
if compressed["SoundFlags"] is None:
compressed["SoundFlags"] = 0
compressed["SoundGain"] = 0.0
compressed["SoundRadius"] = 0.0
compressed["Sound"] = UUID()
if compressed["TextureEntry"] is None:
compressed["TextureEntry"] = tmpls.TextureEntryCollection()
object_data = {
"PSBlock": ps_block.value,
@@ -286,9 +303,9 @@ def normalize_object_update_compressed_data(data: bytes):
"LocalID": compressed.pop("ID"),
**compressed,
}
if object_data["TextureEntry"] is None:
object_data.pop("TextureEntry")
# Don't clobber OwnerID in case the object has a proper one.
# Don't clobber OwnerID in case the object has a proper one from
# a previous ObjectProperties. OwnerID isn't expected to be populated
# on ObjectUpdates unless an attached sound is playing.
if object_data["OwnerID"] == UUID():
del object_data["OwnerID"]
return object_data

View File

@@ -3,16 +3,18 @@ Serialization templates for structures used in LLUDP and HTTP bodies.
"""
import abc
import collections
import dataclasses
import enum
import importlib
import logging
import math
import zlib
from typing import *
import hippolyzer.lib.base.serialization as se
from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.datatypes import UUID, IntEnum, IntFlag
from hippolyzer.lib.base.datatypes import UUID, IntEnum, IntFlag, Vector3
from hippolyzer.lib.base.namevalue import NameValuesSerializer
try:
@@ -862,7 +864,7 @@ class ShineLevel(IntEnum):
HIGH = 3
@dataclasses.dataclass
@dataclasses.dataclass(unsafe_hash=True)
class BasicMaterials:
# Meaning is technically implementation-dependent, these are in LL data files
Bump: int = se.bitfield_field(bits=5)
@@ -881,7 +883,7 @@ class TexGen(IntEnum):
CYLINDRICAL = 0x6
@dataclasses.dataclass
@dataclasses.dataclass(unsafe_hash=True)
class MediaFlags:
WebPage: bool = se.bitfield_field(bits=1, adapter=se.BoolAdapter())
TexGen: "TexGen" = se.bitfield_field(bits=2, adapter=se.IntEnum(TexGen))
@@ -1039,9 +1041,64 @@ def _te_field(spec: se.SERIALIZABLE_TYPE, first=False, optional=False,
_T = TypeVar("_T")
_TE_FIELD_KEY = Optional[Sequence[int]]
# If this seems weird it's because it is. TE offsets are S16s with `0` as the actual 0
# point, and LL divides by `0x7FFF` to convert back to float. Negative S16s can
# actually go to -0x8000 due to two's complement, creating a larger range for negatives.
TE_S16_COORD = se.QuantizedFloat(se.S16, -1.000030518509476, 1.0, False)
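The asymmetric range is easy to verify numerically; a small sketch of the conversion the comment describes:

```python
def dequantize_te_offset(q: int) -> float:
    """S16 -> float conversion as described above: divide by 0x7FFF, so the
    extra negative value -0x8000 lands slightly below -1.0."""
    assert -0x8000 <= q <= 0x7FFF
    return q / 0x7FFF
```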
class PackedTERotation(se.QuantizedFloat):
"""Another weird one, packed TE rotations have their own special quantization"""
def __init__(self):
super().__init__(se.S16, math.pi * -2, math.pi * 2, zero_median=False)
self.step_mag = 1.0 / (se.U16.max_val + 1)
def _float_to_quantized(self, val: float, lower: float, upper: float):
val = math.fmod(val, upper)
val = super()._float_to_quantized(val, lower, upper)
if val == se.S16.max_val + 1:
val = self.prim_min
return val
@dataclasses.dataclass
class TextureEntry:
"""Representation of a TE for a single face. Not sent over the wire."""
Textures: UUID = UUID('89556747-24cb-43ed-920b-47caed15465f')
Color: bytes = b"\xff\xff\xff\xff"
ScalesS: float = 1.0
ScalesT: float = 1.0
OffsetsS: float = 0.0
OffsetsT: float = 0.0
# In radians
Rotation: float = 0.0
MediaFlags: Optional[MediaFlags] = None
BasicMaterials: Optional[BasicMaterials] = None
Glow: int = 0
Materials: UUID = UUID.ZERO
def st_to_uv(self, st_coord: Vector3) -> Vector3:
"""Convert OpenGL ST coordinates to UV coordinates, accounting for mapping"""
uv = Vector3(st_coord.X - 0.5, st_coord.Y - 0.5)
cos_rot = math.cos(self.Rotation)
sin_rot = math.sin(self.Rotation)
uv = Vector3(
X=uv.X * cos_rot + uv.Y * sin_rot,
Y=-uv.X * sin_rot + uv.Y * cos_rot
)
uv *= Vector3(self.ScalesS, self.ScalesT)
return uv + Vector3(self.OffsetsS + 0.5, self.OffsetsT + 0.5)
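A standalone version of the transform, using plain floats in place of Vector3, makes the order of operations (rotate about the face center, scale, then offset) explicit:

```python
import math


def st_to_uv(s: float, t: float, rot: float = 0.0,
             scale_s: float = 1.0, scale_t: float = 1.0,
             off_s: float = 0.0, off_t: float = 0.0):
    """Plain-float sketch of TextureEntry.st_to_uv()"""
    # Recenter on the face midpoint before rotating
    u, v = s - 0.5, t - 0.5
    cos_r, sin_r = math.cos(rot), math.sin(rot)
    u, v = u * cos_r + v * sin_r, -u * sin_r + v * cos_r
    # Scale, then shift back and apply the offsets
    return u * scale_s + off_s + 0.5, v * scale_t + off_t + 0.5
```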
# Max number of TEs possible according to llprimitive (but not really true!)
# Useful if you don't know how many faces / TEs an object really has because it's
# a mesh or something.
MAX_TES = 45
@dataclasses.dataclass
class TextureEntryCollection:
Textures: Dict[_TE_FIELD_KEY, UUID] = _te_field(
# Plywood texture
se.UUID, first=True, default=UUID('89556747-24cb-43ed-920b-47caed15465f'))
@@ -1049,9 +1106,9 @@ class TextureEntry:
Color: Dict[_TE_FIELD_KEY, bytes] = _te_field(Color4(invert_bytes=True), default=b"\xff\xff\xff\xff")
ScalesS: Dict[_TE_FIELD_KEY, float] = _te_field(se.F32, default=1.0)
ScalesT: Dict[_TE_FIELD_KEY, float] = _te_field(se.F32, default=1.0)
OffsetsS: Dict[_TE_FIELD_KEY, int] = _te_field(se.S16, default=0)
OffsetsT: Dict[_TE_FIELD_KEY, int] = _te_field(se.S16, default=0)
Rotation: Dict[_TE_FIELD_KEY, int] = _te_field(se.S16, default=0)
OffsetsS: Dict[_TE_FIELD_KEY, float] = _te_field(TE_S16_COORD, default=0.0)
OffsetsT: Dict[_TE_FIELD_KEY, float] = _te_field(TE_S16_COORD, default=0.0)
Rotation: Dict[_TE_FIELD_KEY, float] = _te_field(PackedTERotation(), default=0.0)
BasicMaterials: Dict[_TE_FIELD_KEY, "BasicMaterials"] = _te_field(
BUMP_SHINY_FULLBRIGHT, default_factory=lambda: BasicMaterials(Bump=0, FullBright=False, Shiny=0),
)
@@ -1059,11 +1116,59 @@ class TextureEntry:
MEDIA_FLAGS,
default_factory=lambda: MediaFlags(WebPage=False, TexGen=TexGen.DEFAULT, _Unused=0),
)
# TODO: dequantize
Glow: Dict[_TE_FIELD_KEY, int] = _te_field(se.U8, default=0)
Materials: Dict[_TE_FIELD_KEY, UUID] = _te_field(se.UUID, optional=True, default=UUID())
Materials: Dict[_TE_FIELD_KEY, UUID] = _te_field(se.UUID, optional=True, default=UUID.ZERO)
def unwrap(self):
"""Return `self` regardless of whether this is lazy wrapped object or not"""
return self
def realize(self, num_faces: int = MAX_TES) -> List[TextureEntry]:
"""
Turn the "default" vs "exception cases" wire format TE representation to per-face lookups
Makes it easier to get all TE details associated with a specific face
"""
as_dicts = [dict() for _ in range(num_faces)]
for field in dataclasses.fields(self):
key = field.name
vals = getattr(self, key)
# First, give all faces the default value for this key
for te in as_dicts:
te[key] = vals[None]
# Walk over the exception cases and replace the default value
for face_nums, val in vals.items():
# Default case already handled
if face_nums is None:
continue
for face_num in face_nums:
if face_num >= num_faces:
raise ValueError(f"Bad value for num_faces? {face_num} >= {num_faces}")
as_dicts[face_num][key] = val
return [TextureEntry(**x) for x in as_dicts]
@classmethod
def from_tes(cls, tes: List[TextureEntry]) -> "TextureEntryCollection":
instance = cls()
if not tes:
return instance
for field in dataclasses.fields(cls):
te_vals: Dict[Any, List[int]] = collections.defaultdict(list)
for i, te in enumerate(tes):
# Group values by what face they occur on
te_vals[getattr(te, field.name)].append(i)
# Make most common value the "default", everything else is an exception
sorted_vals = sorted(te_vals.items(), key=lambda x: len(x[1]), reverse=True)
default_val = sorted_vals.pop(0)[0]
te_vals = {None: default_val}
for val, face_nums in sorted_vals:
te_vals[tuple(face_nums)] = val
setattr(instance, field.name, te_vals)
return instance
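`from_tes()` inverts `realize()` by grouping per-face values back into the wire format's default-plus-exceptions shape; the grouping itself can be sketched for a single field:

```python
import collections


def pack_face_values(vals):
    """Group a per-face value list the way from_tes() does: the most common
    value becomes the None-keyed default, everything else is keyed by the
    tuple of face numbers it applies to."""
    grouped = collections.defaultdict(list)
    for face_num, val in enumerate(vals):
        grouped[val].append(face_num)
    # Most common value first, matching from_tes()'s sort
    by_count = sorted(grouped.items(), key=lambda x: len(x[1]), reverse=True)
    packed = {None: by_count.pop(0)[0]}
    for val, face_nums in by_count:
        packed[tuple(face_nums)] = val
    return packed
```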
TE_SERIALIZER = se.Dataclass(TextureEntry)
TE_SERIALIZER = se.Dataclass(TextureEntryCollection)
@se.subfield_serializer("ObjectUpdate", "ObjectData", "TextureEntry")
@@ -1600,6 +1705,7 @@ class RegionHandshakeReplyFlags(IntFlag):
@se.flag_field_serializer("TeleportStart", "Info", "TeleportFlags")
@se.flag_field_serializer("TeleportProgress", "Info", "TeleportFlags")
@se.flag_field_serializer("TeleportFinish", "Info", "TeleportFlags")
@se.flag_field_serializer("TeleportLocal", "Info", "TeleportFlags")
@se.flag_field_serializer("TeleportLureRequest", "Info", "TeleportFlags")
class TeleportFlags(IntFlag):
SET_HOME_TO_TARGET = 1 << 0 # newbie leaving prelude (starter area)
@@ -1618,6 +1724,8 @@ class TeleportFlags(IntFlag):
IS_FLYING = 1 << 13
SHOW_RESET_HOME = 1 << 14
FORCE_REDIRECT = 1 << 15
VIA_GLOBAL_COORDS = 1 << 16
WITHIN_REGION = 1 << 17
@se.http_serializer("RenderMaterials")

View File

@@ -94,7 +94,7 @@ class TransferManager:
if params_dict.get("SessionID", dataclasses.MISSING) is None:
params.SessionID = self._session_id
self._connection_holder.circuit.send_message(Message(
self._connection_holder.circuit.send(Message(
'TransferRequest',
Block(
'TransferInfo',

View File

@@ -1,5 +1,5 @@
from PySide2.QtCore import QMetaObject
from PySide2.QtUiTools import QUiLoader
from PySide6.QtCore import QMetaObject
from PySide6.QtUiTools import QUiLoader
class UiLoader(QUiLoader):

View File

@@ -13,7 +13,7 @@ from xml.etree.ElementTree import parse as parse_etree
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.helpers import get_resource_filename
from hippolyzer.lib.base.legacy_inv import InventorySaleInfo, InventoryPermissions
from hippolyzer.lib.base.inventory import InventorySaleInfo, InventoryPermissions
from hippolyzer.lib.base.legacy_schema import SchemaBase, parse_schema_line, SchemaParsingError
from hippolyzer.lib.base.templates import WearableType

View File

@@ -110,7 +110,7 @@ class XferManager:
direction: Direction = Direction.OUT,
) -> Xfer:
xfer_id = xfer_id if xfer_id is not None else random.getrandbits(64)
self._connection_holder.circuit.send_message(Message(
self._connection_holder.circuit.send(Message(
'RequestXfer',
Block(
'XferID',
@@ -174,7 +174,7 @@ class XferManager:
to_ack = range(xfer.next_ackable, ack_max)
xfer.next_ackable = ack_max
for ack_id in to_ack:
self._connection_holder.circuit.send_message(Message(
self._connection_holder.circuit.send(Message(
"ConfirmXferPacket",
Block("XferID", ID=xfer.xfer_id, Packet=ack_id),
direction=xfer.direction,
@@ -216,7 +216,7 @@ class XferManager:
else:
inline_data = data
self._connection_holder.circuit.send_message(Message(
self._connection_holder.circuit.send(Message(
"AssetUploadRequest",
Block(
"AssetBlock",
@@ -272,7 +272,7 @@ class XferManager:
chunk = xfer.chunks.pop(packet_id)
# EOF if there are no chunks left
packet_val = XferPacket(PacketID=packet_id, IsEOF=not bool(xfer.chunks))
self._connection_holder.circuit.send_message(Message(
self._connection_holder.circuit.send(Message(
"SendXferPacket",
Block("XferID", ID=xfer.xfer_id, Packet_=packet_val),
Block("DataPacket", Data=chunk),

View File

@@ -116,8 +116,8 @@ class ClientObjectManager:
*[Block("ObjectData", ObjectLocalID=x) for x in ids_to_req[:255]],
]
# Selecting causes ObjectProperties to be sent
self._region.circuit.send_message(Message("ObjectSelect", blocks))
self._region.circuit.send_message(Message("ObjectDeselect", blocks))
self._region.circuit.send(Message("ObjectSelect", blocks))
self._region.circuit.send(Message("ObjectDeselect", blocks))
ids_to_req = ids_to_req[255:]
futures = []
@@ -150,7 +150,7 @@ class ClientObjectManager:
ids_to_req = local_ids
while ids_to_req:
self._region.circuit.send_message(Message(
self._region.circuit.send(Message(
"RequestMultipleObjects",
Block("AgentData", AgentID=session.agent_id, SessionID=session.id),
*[Block("ObjectData", CacheMissType=0, ID=x) for x in ids_to_req[:255]],

View File

@@ -73,17 +73,17 @@ def show_message(text, session=None) -> None:
direction=Direction.IN,
)
if session:
session.main_region.circuit.send_message(message)
session.main_region.circuit.send(message)
else:
for session in AddonManager.SESSION_MANAGER.sessions:
session.main_region.circuit.send_message(copy.copy(message))
session.main_region.circuit.send(copy.copy(message))
def send_chat(message: Union[bytes, str], channel=0, chat_type=ChatType.NORMAL, session=None):
session = session or addon_ctx.session.get(None) or None
if not session:
raise RuntimeError("Tried to send chat without session")
session.main_region.circuit.send_message(Message(
session.main_region.circuit.send(Message(
"ChatFromViewer",
Block(
"AgentData",
@@ -128,6 +128,17 @@ def ais_item_to_inventory_data(ais_item: dict):
)
def ais_folder_to_inventory_data(ais_folder: dict):
return Block(
"FolderData",
FolderID=ais_folder["cat_id"],
ParentID=ais_folder["parent_id"],
CallbackID=0,
Type=ais_folder["preferred_type"],
Name=ais_folder["name"],
)
class BaseAddon(abc.ABC):
def _schedule_task(self, coro: Coroutine, session=None,
region_scoped=False, session_scoped=True, addon_scoped=True):
@@ -181,6 +192,9 @@ class BaseAddon(abc.ABC):
def handle_region_changed(self, session: Session, region: ProxiedRegion):
pass
def handle_region_registered(self, session: Session, region: ProxiedRegion):
pass
def handle_circuit_created(self, session: Session, region: ProxiedRegion):
pass

View File

@@ -16,6 +16,7 @@ from types import ModuleType
from typing import *
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.helpers import get_mtime
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.network.transport import UDPPacket
from hippolyzer.lib.proxy import addon_ctx
@@ -31,13 +32,6 @@ if TYPE_CHECKING:
LOG = logging.getLogger(__name__)
def _get_mtime(path):
try:
return os.stat(path).st_mtime
except:
return None
class BaseInteractionManager:
@abc.abstractmethod
async def open_dir(self, caption: str = '', directory: str = '', filter_str: str = '') -> Optional[str]:
@@ -64,6 +58,14 @@ class BaseInteractionManager:
return None
# Used to initialize a REPL environment with commonly desired helpers
REPL_INITIALIZER = r"""
from hippolyzer.lib.base.datatypes import *
from hippolyzer.lib.base.message.message import Block, Message, Direction
from hippolyzer.lib.proxy.addon_utils import send_chat, show_message
"""
class AddonManager:
COMMAND_CHANNEL = 524
@@ -139,6 +141,9 @@ class AddonManager:
if _locals is None:
_locals = stack.frame.f_locals
_globals = dict(_globals)
exec(REPL_INITIALIZER, _globals, None)
async def _wrapper():
coro: Coroutine = ptpython.repl.embed( # noqa: the type signature lies
globals=_globals,
@@ -187,7 +192,7 @@ class AddonManager:
def _check_hotreloads(cls):
"""Mark addons that rely on changed files for reloading"""
for filename, importers in cls.HOTRELOAD_IMPORTERS.items():
mtime = _get_mtime(filename)
mtime = get_mtime(filename)
if not mtime or mtime == cls.FILE_MTIMES.get(filename, None):
continue
@@ -216,7 +221,7 @@ class AddonManager:
# Mark the caller as having imported (and being dependent on) `module`
stack = inspect.stack()[1]
cls.HOTRELOAD_IMPORTERS[imported_file].add(stack.filename)
cls.FILE_MTIMES[imported_file] = _get_mtime(imported_file)
cls.FILE_MTIMES[imported_file] = get_mtime(imported_file)
importing_spec = next((s for s in cls.BASE_ADDON_SPECS if s.origin == stack.filename), None)
imported_spec = next((s for s in cls.BASE_ADDON_SPECS if s.origin == imported_file), None)
@@ -264,7 +269,7 @@ class AddonManager:
for spec in cls.BASE_ADDON_SPECS[:]:
had_mod = spec.name in cls.FRESH_ADDON_MODULES
try:
mtime = _get_mtime(spec.origin)
mtime = get_mtime(spec.origin)
mtime_changed = mtime != cls.FILE_MTIMES.get(spec.origin, None)
if not mtime_changed and had_mod:
continue
@@ -288,8 +293,8 @@ class AddonManager:
# Make sure module initialization happens after any pending task cancellations
# due to module unloading.
asyncio.get_event_loop().call_soon(cls._init_module, mod)
loop = asyncio.get_event_loop_policy().get_event_loop()
loop.call_soon(cls._init_module, mod)
except Exception as e:
if had_mod:
logging.exception("Exploded trying to reload addon %s" % spec.name)
@@ -527,6 +532,11 @@ class AddonManager:
with addon_ctx.push(session, region):
return cls._call_all_addon_hooks("handle_region_changed", session, region)
@classmethod
def handle_region_registered(cls, session: Session, region: ProxiedRegion):
with addon_ctx.push(session, region):
return cls._call_all_addon_hooks("handle_region_registered", session, region)
@classmethod
def handle_circuit_created(cls, session: Session, region: ProxiedRegion):
with addon_ctx.push(session, region):

View File

@@ -24,6 +24,10 @@ class CapType(enum.Enum):
WRAPPER = enum.auto()
PROXY_ONLY = enum.auto()
@property
def fake(self) -> bool:
return self == CapType.PROXY_ONLY or self == CapType.WRAPPER
class SerializedCapData(typing.NamedTuple):
cap_name: typing.Optional[str] = None

View File

@@ -20,7 +20,7 @@ class ProxyCapsClient(CapsClient):
def _get_caps(self) -> Optional[CAPS_DICT]:
if not self._region:
return None
return self._region.caps
return self._region.cap_urls
def _request_fixups(self, cap_or_url: str, headers: Dict, proxy: Optional[bool], ssl: Any):
# We want to proxy this through Hippolyzer
@@ -28,7 +28,8 @@ class ProxyCapsClient(CapsClient):
# We go through the proxy by default, tack on a header letting mitmproxy know the
# request came from us so we can tag the request as injected. The header will be popped
# off before passing through to the server.
headers["X-Hippo-Injected"] = "1"
if "X-Hippo-Injected" not in headers:
headers["X-Hippo-Injected"] = "1"
proxy_port = self._settings.HTTP_PROXY_PORT
proxy = f"http://127.0.0.1:{proxy_port}"
# TODO: set up the SSLContext to validate mitmproxy's cert

View File

@@ -25,7 +25,7 @@ class ProxiedCircuit(Circuit):
except:
logging.exception(f"Failed to serialize: {message.to_dict()!r}")
raise
if self.logging_hook and message.injected:
if self.logging_hook and message.synthetic:
self.logging_hook(message)
return self.send_datagram(serialized, message.direction, transport=transport)
@@ -34,47 +34,46 @@ class ProxiedCircuit(Circuit):
return self.out_injections, self.in_injections
return self.in_injections, self.out_injections
def prepare_message(self, message: Message, direction=None):
def prepare_message(self, message: Message):
if message.finalized:
raise RuntimeError(f"Trying to re-send finalized {message!r}")
if message.queued:
# This is due to be dropped, nothing should be sending the original
raise RuntimeError(f"Trying to send original of queued {message!r}")
direction = direction or getattr(message, 'direction')
fwd_injections, reverse_injections = self._get_injections(direction)
fwd_injections, reverse_injections = self._get_injections(message.direction)
message.finalized = True
# Injected, let's gen an ID
if message.packet_id is None:
message.packet_id = fwd_injections.gen_injectable_id()
message.injected = True
else:
message.synthetic = True
# This message wasn't injected by the proxy so we need to rewrite packet IDs
# to account for IDs the real creator of the packet couldn't have known about.
elif not message.synthetic:
# was_dropped needs the unmodified packet ID
if fwd_injections.was_dropped(message.packet_id) and message.name != "PacketAck":
logging.warning("Attempting to re-send previously dropped %s:%s, did we ack?" %
(message.packet_id, message.name))
message.packet_id = fwd_injections.get_effective_id(message.packet_id)
fwd_injections.track_seen(message.packet_id)
message.finalized = True
if not message.injected:
# This message wasn't injected by the proxy so we need to rewrite packet IDs
# to account for IDs the other parties couldn't have known about.
message.acks = tuple(
reverse_injections.get_original_id(x) for x in message.acks
if not reverse_injections.was_injected(x)
)
if message.name == "PacketAck":
if not self._rewrite_packet_ack(message, reverse_injections):
logging.debug(f"Dropping {direction} ack for injected packets!")
if not self._rewrite_packet_ack(message, reverse_injections) and not message.acks:
logging.debug(f"Dropping {message.direction} ack for injected packets!")
# Let caller know this shouldn't be sent at all, it's strictly ACKs for
# injected packets.
return False
elif message.name == "StartPingCheck":
self._rewrite_start_ping_check(message, fwd_injections)
if not message.acks:
if message.acks:
message.send_flags |= PacketFlags.ACK
else:
message.send_flags &= ~PacketFlags.ACK
return True
@@ -100,15 +99,18 @@ class ProxiedCircuit(Circuit):
new_id = fwd_injections.get_effective_id(orig_id)
if orig_id != new_id:
logging.debug("Rewrote oldest unacked %s -> %s" % (orig_id, new_id))
# Get a list of unacked IDs for the direction this StartPingCheck is heading
fwd_unacked = (a for (d, a) in self.unacked_reliable.keys() if d == message.direction)
# Use the proxy's oldest unacked ID if it's older than the client's
new_id = min((new_id, *fwd_unacked))
message["PingID"]["OldestUnacked"] = new_id
def drop_message(self, message: Message, orig_direction=None):
def drop_message(self, message: Message):
if message.finalized:
raise RuntimeError(f"Trying to drop finalized {message!r}")
if message.packet_id is None:
return
orig_direction = orig_direction or message.direction
fwd_injections, reverse_injections = self._get_injections(orig_direction)
fwd_injections, reverse_injections = self._get_injections(message.direction)
fwd_injections.mark_dropped(message.packet_id)
message.dropped = True
@@ -116,7 +118,7 @@ class ProxiedCircuit(Circuit):
# Was sent reliably, tell the other end that we saw it and to shut up.
if message.reliable:
self.send_acks([message.packet_id], ~orig_direction)
self.send_acks([message.packet_id], ~message.direction)
# This packet had acks for the other end, send them in a separate PacketAck
effective_acks = tuple(
@@ -124,7 +126,7 @@ class ProxiedCircuit(Circuit):
if not reverse_injections.was_injected(x)
)
if effective_acks:
self.send_acks(effective_acks, orig_direction, packet_id=message.packet_id)
self.send_acks(effective_acks, message.direction, packet_id=message.packet_id)
class InjectionTracker:
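The `get_effective_id()` / `get_original_id()` calls above can be illustrated with a hypothetical, simplified tracker. This is only a sketch of the ID-shifting idea; the real `InjectionTracker` in this repo also handles drops, ACK bookkeeping, and sequence wraparound, and its internals differ.

```python
import bisect

class ToyInjectionTracker:
    """Hypothetical sketch: map packet IDs across proxy injections.

    When the proxy injects packets with their own sequence IDs, every later
    ID from the real sender must be shifted past them on the wire, and IDs
    coming back (e.g. in ACKs) must be shifted back before forwarding.
    """

    def __init__(self) -> None:
        self._injected: list = []  # sorted wire IDs the proxy used

    def mark_injected(self, wire_id: int) -> None:
        bisect.insort(self._injected, wire_id)

    def was_injected(self, wire_id: int) -> bool:
        return wire_id in self._injected

    def get_effective_id(self, orig_id: int) -> int:
        # Each injected ID at or below the shifted value pushes it up by one
        shifted = orig_id
        for inj in self._injected:
            if inj <= shifted:
                shifted += 1
            else:
                break
        return shifted

    def get_original_id(self, wire_id: int) -> int:
        # Inverse of get_effective_id() for IDs the proxy did not inject
        return wire_id - sum(1 for inj in self._injected if inj <= wire_id)
```

With wire ID 5 injected, the sender's own 5 goes out on the wire as 6, and an incoming reference to wire ID 6 maps back to the sender's 5.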


@@ -83,16 +83,19 @@ class MITMProxyEventManager:
finally:
# If someone has taken this request out of the regular callback flow,
# they'll manually send a callback at some later time.
if not flow.taken:
self.to_proxy_queue.put(("callback", flow.id, flow.get_state()))
if not flow.taken and not flow.resumed:
# Addon hasn't taken ownership of this flow, send it back to mitmproxy
# ourselves.
flow.resume()
def _handle_request(self, flow: HippoHTTPFlow):
url = flow.request.url
cap_data = self.session_manager.resolve_cap(url)
flow.cap_data = cap_data
# Don't do anything special with the proxy's own requests,
# we only pass it through for logging purposes.
if flow.request_injected:
# Don't do anything special with the proxy's own requests unless the requested
# URL can only be handled by the proxy. Ideally we only pass the request through
# for logging purposes.
if flow.request_injected and (not cap_data or not cap_data.type.fake):
return
# The local asset repo gets first bite at the apple
@@ -104,7 +107,7 @@ class MITMProxyEventManager:
AddonManager.handle_http_request(flow)
if cap_data and cap_data.cap_name.endswith("ProxyWrapper"):
orig_cap_name = cap_data.cap_name.rsplit("ProxyWrapper", 1)[0]
orig_cap_url = cap_data.region().caps[orig_cap_name]
orig_cap_url = cap_data.region().cap_urls[orig_cap_name]
split_orig_url = urllib.parse.urlsplit(orig_cap_url)
orig_cap_host = split_orig_url[1]
@@ -135,7 +138,7 @@ class MITMProxyEventManager:
)
elif cap_data and cap_data.asset_server_cap:
# Both the wrapper request and the actual asset server request went through
# the proxy
# the proxy. Don't bother trying the redirect strategy anymore.
self._asset_server_proxied = True
logging.warning("noproxy not used, switching to URI rewrite strategy")
elif cap_data and cap_data.cap_name == "EventQueueGet":
@@ -159,6 +162,17 @@ class MITMProxyEventManager:
"Connection": "close",
},
)
elif cap_data and cap_data.cap_name == "Seed":
# Drop any proxy-only caps from the seed request we send to the server, and
# record those cap names as metadata so we know to send their URLs in the response
parsed_seed: List[str] = llsd.parse_xml(flow.request.content)
flow.metadata['needed_proxy_caps'] = []
for known_cap_name, (known_cap_type, known_cap_url) in cap_data.region().caps.items():
if known_cap_type == CapType.PROXY_ONLY and known_cap_name in parsed_seed:
parsed_seed.remove(known_cap_name)
flow.metadata['needed_proxy_caps'].append(known_cap_name)
if flow.metadata['needed_proxy_caps']:
flow.request.content = llsd.format_xml(parsed_seed)
elif not cap_data:
if self._is_login_request(flow):
# Not strictly a Cap, but makes it easier to filter on.
@@ -198,10 +212,14 @@ class MITMProxyEventManager:
def _handle_response(self, flow: HippoHTTPFlow):
message_logger = self.session_manager.message_logger
if message_logger:
message_logger.log_http_response(flow)
try:
message_logger.log_http_response(flow)
except:
logging.exception("Failed while logging HTTP flow")
# Don't handle responses for requests injected by the proxy
if flow.request_injected:
# Don't process responses for requests or responses injected by the proxy.
# We already processed it, it came from us!
if flow.request_injected or flow.response_injected:
return
status = flow.response.status_code
@@ -262,7 +280,10 @@ class MITMProxyEventManager:
for cap_name in wrappable_caps:
if cap_name in parsed:
parsed[cap_name] = region.register_wrapper_cap(cap_name)
flow.response.content = llsd.format_pretty_xml(parsed)
# Send the client the URLs for any proxy-only caps it requested
for cap_name in flow.metadata['needed_proxy_caps']:
parsed[cap_name] = region.cap_urls[cap_name]
flow.response.content = llsd.format_xml(parsed)
elif cap_data.cap_name == "EventQueueGet":
parsed_eq_resp = llsd.parse_xml(flow.response.content)
if parsed_eq_resp:
@@ -281,13 +302,13 @@ class MITMProxyEventManager:
# HACK: see note in above request handler for EventQueueGet
req_ack_id = llsd.parse_xml(flow.request.content)["ack"]
eq_manager.cache_last_poll_response(req_ack_id, parsed_eq_resp)
flow.response.content = llsd.format_pretty_xml(parsed_eq_resp)
flow.response.content = llsd.format_xml(parsed_eq_resp)
elif cap_data.cap_name in self.UPLOAD_CREATING_CAPS:
if not region:
return
parsed = llsd.parse_xml(flow.response.content)
if "uploader" in parsed:
region.register_temporary_cap(cap_data.cap_name + "Uploader", parsed["uploader"])
region.register_cap(cap_data.cap_name + "Uploader", parsed["uploader"], CapType.TEMPORARY)
except:
logging.exception("OOPS, blew up in HTTP proxy!")


@@ -1,6 +1,8 @@
from __future__ import annotations
import copy
import multiprocessing
import weakref
from typing import *
from typing import Optional
@@ -20,16 +22,18 @@ class HippoHTTPFlow:
Hides the nastiness of writing to flow.metadata so we can pass
state back and forth between the two proxies
"""
__slots__ = ("flow",)
__slots__ = ("flow", "callback_queue", "resumed", "taken")
def __init__(self, flow: HTTPFlow):
def __init__(self, flow: HTTPFlow, callback_queue: Optional[multiprocessing.Queue] = None):
self.flow: HTTPFlow = flow
self.resumed = False
self.taken = False
self.callback_queue = weakref.ref(callback_queue) if callback_queue else None
meta = self.flow.metadata
meta.setdefault("taken", False)
meta.setdefault("can_stream", True)
meta.setdefault("response_injected", False)
meta.setdefault("request_injected", False)
meta.setdefault("cap_data", None)
meta.setdefault("cap_data", CapData())
meta.setdefault("from_browser", False)
@property
@@ -91,12 +95,27 @@ class HippoHTTPFlow:
def take(self) -> HippoHTTPFlow:
"""Don't automatically pass this flow back to mitmproxy"""
self.metadata["taken"] = True
# TODO: Having to explicitly take / release Flows to use them in an async
# context is kind of janky. The HTTP callback handling code should probably
# be made totally async, including the addon hooks. Would coroutine per-callback
# be expensive?
assert not self.taken and not self.resumed
self.taken = True
return self
@property
def taken(self) -> bool:
return self.metadata["taken"]
def resume(self):
"""Release the HTTP flow back to the normal processing flow"""
assert self.callback_queue
assert not self.resumed
self.taken = False
self.resumed = True
self.callback_queue().put(("callback", self.flow.id, self.get_state()))
def preempt(self):
# Must be some flow that we previously resumed, we're racing
# the result from the server end.
assert not self.taken and self.resumed
self.callback_queue().put(("preempt", self.flow.id, self.get_state()))
@property
def is_replay(self) -> bool:
@@ -120,11 +139,14 @@ class HippoHTTPFlow:
flow: Optional[HTTPFlow] = HTTPFlow.from_state(flow_state)
assert flow is not None
cap_data_ser = flow.metadata.get("cap_data_ser")
callback_queue = None
if session_manager:
callback_queue = session_manager.flow_context.to_proxy_queue
if cap_data_ser is not None:
flow.metadata["cap_data"] = CapData.deserialize(cap_data_ser, session_manager)
else:
flow.metadata["cap_data"] = None
return cls(flow)
return cls(flow, callback_queue)
def copy(self) -> HippoHTTPFlow:
# HACK: flow.copy() expects the flow to be fully JSON serializable, but


@@ -7,6 +7,7 @@ import sys
import queue
import typing
import uuid
import weakref
import mitmproxy.certs
import mitmproxy.ctx
@@ -70,7 +71,7 @@ class SLTlsConfig(mitmproxy.addons.tlsconfig.TlsConfig):
)
self.certstore.certs = old_cert_store.certs
def tls_start_server(self, tls_start: tls.TlsStartData):
def tls_start_server(self, tls_start: tls.TlsData):
super().tls_start_server(tls_start)
# Since 2000 the recommendation per RFCs has been to only check SANs and not the CN field.
# Most browsers do this, as does mitmproxy. The viewer does not, and the sim certs have no SAN
@@ -99,7 +100,7 @@ class IPCInterceptionAddon:
"""
def __init__(self, flow_context: HTTPFlowContext):
self.mitmproxy_ready = flow_context.mitmproxy_ready
self.intercepted_flows: typing.Dict[str, HTTPFlow] = {}
self.flows: weakref.WeakValueDictionary[str, HTTPFlow] = weakref.WeakValueDictionary()
self.from_proxy_queue: multiprocessing.Queue = flow_context.from_proxy_queue
self.to_proxy_queue: multiprocessing.Queue = flow_context.to_proxy_queue
self.shutdown_signal: multiprocessing.Event = flow_context.shutdown_signal
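Switching `intercepted_flows` to a `weakref.WeakValueDictionary` means entries vanish on their own once nothing else holds the flow, so there's no need to explicitly `pop()` on callback. A minimal illustration (the `Flow` stand-in class is invented; CPython's refcounting makes the cleanup immediate):

```python
import weakref

class Flow:
    """Stand-in for mitmproxy's HTTPFlow, just to have a weak-referenceable object."""
    def __init__(self, flow_id: str) -> None:
        self.id = flow_id

flows: "weakref.WeakValueDictionary[str, Flow]" = weakref.WeakValueDictionary()
flow = Flow("abc123")
flows[flow.id] = flow

assert "abc123" in flows      # alive while a strong reference exists
del flow                      # drop the last strong reference
assert "abc123" not in flows  # entry disappeared along with the object
```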
@@ -134,11 +135,13 @@ class IPCInterceptionAddon:
await asyncio.sleep(0.001)
continue
if event_type == "callback":
orig_flow = self.intercepted_flows.pop(flow_id)
orig_flow = self.flows[flow_id]
orig_flow.set_state(flow_state)
# Remove the taken flag from the flow if present, the flow by definition
# isn't take()n anymore once it's been passed back to the proxy.
orig_flow.metadata.pop("taken", None)
elif event_type == "preempt":
orig_flow = self.flows.get(flow_id)
if orig_flow:
orig_flow.intercept()
orig_flow.set_state(flow_state)
elif event_type == "replay":
flow: HTTPFlow = HTTPFlow.from_state(flow_state)
# mitmproxy won't replay intercepted flows, this is an old flow so
@@ -160,8 +163,8 @@ class IPCInterceptionAddon:
from_browser = "Mozilla" in flow.request.headers.get("User-Agent", "")
flow.metadata["from_browser"] = from_browser
# Only trust the "injected" header if not from a browser
was_injected = flow.request.headers.pop("X-Hippo-Injected", False)
if was_injected and not from_browser:
was_injected = flow.request.headers.pop("X-Hippo-Injected", "")
if was_injected == "1" and not from_browser:
flow.metadata["request_injected"] = True
# Does this request need the stupid hack around aiohttp's windows proactor bug
@@ -172,13 +175,13 @@ class IPCInterceptionAddon:
def _queue_flow_interception(self, event_type: str, flow: HTTPFlow):
flow.intercept()
self.intercepted_flows[flow.id] = flow
self.flows[flow.id] = flow
self.from_proxy_queue.put((event_type, flow.get_state()), True)
def responseheaders(self, flow: HTTPFlow):
# The response was injected by an earlier handler,
# we don't want to touch this anymore.
if flow.metadata["response_injected"]:
if flow.metadata.get("response_injected"):
return
# Someone fucked up and put a mimetype in Content-Encoding.
@@ -189,7 +192,10 @@ class IPCInterceptionAddon:
flow.response.headers["Content-Encoding"] = "identity"
def response(self, flow: HTTPFlow):
if flow.metadata["response_injected"]:
cap_data: typing.Optional[SerializedCapData] = flow.metadata.get("cap_data")
if flow.metadata.get("response_injected") and cap_data and cap_data.asset_server_cap:
# Don't bother intercepting asset server requests where we injected a response.
# We don't want to log them and they don't need any more processing by user hooks.
return
self._queue_flow_interception("response", flow)
@@ -197,10 +203,10 @@ class IPCInterceptionAddon:
class SLMITMAddon(IPCInterceptionAddon):
def responseheaders(self, flow: HTTPFlow):
super().responseheaders(flow)
cap_data: typing.Optional[SerializedCapData] = flow.metadata["cap_data_ser"]
cap_data: typing.Optional[SerializedCapData] = flow.metadata.get("cap_data_ser")
# Request came from the proxy itself, don't touch it.
if flow.metadata["request_injected"]:
if flow.metadata.get("request_injected"):
return
# This is an asset server response that we're not interested in intercepting.
@@ -209,7 +215,7 @@ class SLMITMAddon(IPCInterceptionAddon):
# Can't stream if we injected our own response or we were asked not to stream
if not flow.metadata["response_injected"] and flow.metadata["can_stream"]:
flow.response.stream = True
elif not cap_data and not flow.metadata["from_browser"]:
elif not cap_data and not flow.metadata.get("from_browser"):
object_name = flow.response.headers.get("X-SecondLife-Object-Name", "")
# Meh. Add some fake Cap data for this so it can be matched on.
if object_name.startswith("#Firestorm LSL Bridge"):
@@ -229,10 +235,6 @@ class SLMITMMaster(mitmproxy.master.Master):
SLMITMAddon(flow_context),
)
def start_server(self):
self.start()
asyncio.ensure_future(self.running())
def create_proxy_master(host, port, flow_context: HTTPFlowContext): # pragma: no cover
opts = mitmproxy.options.Options()


@@ -1,3 +1,4 @@
import asyncio
import logging
import weakref
from typing import Optional, Tuple
@@ -35,6 +36,18 @@ class InterceptingLLUDPProxyProtocol(UDPProxyProtocol):
)
self.message_xml = MessageDotXML()
self.session: Optional[Session] = None
loop = asyncio.get_event_loop_policy().get_event_loop()
self.resend_task = loop.create_task(self.attempt_resends())
async def attempt_resends(self):
while True:
await asyncio.sleep(0.1)
if self.session is None:
continue
for region in self.session.regions:
if not region.circuit or not region.circuit.is_alive:
continue
region.circuit.resend_unacked()
def _ensure_message_allowed(self, msg: Message):
if not self.message_xml.validate_udp_msg(msg.name):
@@ -99,6 +112,9 @@ class InterceptingLLUDPProxyProtocol(UDPProxyProtocol):
LOG.error("No circuit for %r, dropping packet!" % (packet.far_addr,))
return
# Process any ACKs for messages we injected first
region.circuit.collect_acks(message)
if message.name == "AgentMovementComplete":
self.session.main_region = region
if region.handle is None:
@@ -148,7 +164,7 @@ class InterceptingLLUDPProxyProtocol(UDPProxyProtocol):
# Send the message if it wasn't explicitly dropped or sent before
if not message.finalized:
region.circuit.send_message(message)
region.circuit.send(message)
def close(self):
super().close()
@@ -156,3 +172,4 @@ class InterceptingLLUDPProxyProtocol(UDPProxyProtocol):
AddonManager.handle_session_closed(self.session)
self.session_manager.close_session(self.session)
self.session = None
self.resend_task.cancel()


@@ -3,7 +3,7 @@ import ast
import typing
from arpeggio import Optional, ZeroOrMore, EOF, \
ParserPython, PTNodeVisitor, visit_parse_tree, RegExMatch
ParserPython, PTNodeVisitor, visit_parse_tree, RegExMatch, OneOrMore
def literal():
@@ -26,7 +26,9 @@ def literal():
def identifier():
return RegExMatch(r'[a-zA-Z*]([a-zA-Z0-9_*]+)?')
# Identifiers are allowed to have "-". It's not a special character
# in our grammar, and we expect them to show up some places, like header names.
return RegExMatch(r'[a-zA-Z*]([a-zA-Z0-9_*-]+)?')
def field_specifier():
@@ -42,7 +44,7 @@ def unary_expression():
def meta_field_specifier():
return "Meta", ".", identifier
return "Meta", OneOrMore(".", identifier)
def enum_field_specifier():
@@ -69,12 +71,17 @@ def message_filter():
return expression, EOF
MATCH_RESULT = typing.Union[bool, typing.Tuple]
class MatchResult(typing.NamedTuple):
result: bool
fields: typing.List[typing.Tuple]
def __bool__(self):
return self.result
class BaseFilterNode(abc.ABC):
@abc.abstractmethod
def match(self, msg) -> MATCH_RESULT:
def match(self, msg, short_circuit=True) -> MatchResult:
raise NotImplementedError()
@property
@@ -104,18 +111,36 @@ class BinaryFilterNode(BaseFilterNode, abc.ABC):
class UnaryNotFilterNode(UnaryFilterNode):
def match(self, msg) -> MATCH_RESULT:
return not self.node.match(msg)
def match(self, msg, short_circuit=True) -> MatchResult:
# Should we pass fields up here? Maybe not.
return MatchResult(not self.node.match(msg, short_circuit), [])
class OrFilterNode(BinaryFilterNode):
def match(self, msg) -> MATCH_RESULT:
return self.left_node.match(msg) or self.right_node.match(msg)
def match(self, msg, short_circuit=True) -> MatchResult:
left_match = self.left_node.match(msg, short_circuit)
if left_match and short_circuit:
return MatchResult(True, left_match.fields)
right_match = self.right_node.match(msg, short_circuit)
if right_match and short_circuit:
return MatchResult(True, right_match.fields)
if left_match or right_match:
# Fine since fields should be empty when result=False
return MatchResult(True, left_match.fields + right_match.fields)
return MatchResult(False, [])
class AndFilterNode(BinaryFilterNode):
def match(self, msg) -> MATCH_RESULT:
return self.left_node.match(msg) and self.right_node.match(msg)
def match(self, msg, short_circuit=True) -> MatchResult:
left_match = self.left_node.match(msg, short_circuit)
if not left_match:
return MatchResult(False, [])
right_match = self.right_node.match(msg, short_circuit)
if not right_match:
return MatchResult(False, [])
return MatchResult(True, left_match.fields + right_match.fields)
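The short-circuiting above leans on `MatchResult` being truthy iff `.result` is set, so filter nodes can treat results as booleans while still carrying the matched fields. Restating that `NamedTuple` pattern in isolation:

```python
from typing import List, NamedTuple, Tuple

class MatchResult(NamedTuple):
    """Truthiness follows .result so nodes can combine results like booleans."""
    result: bool
    fields: List[Tuple]

    def __bool__(self) -> bool:
        return self.result

left = MatchResult(True, [("ObjectUpdate", "ObjectData", 0, "ID")])
right = MatchResult(False, [])

# `and` / `or` on the tuples behave like plain booleans thanks to __bool__
assert bool(left) and not bool(right)
assert (left and right) is right  # `and` returns the falsy operand
assert MatchResult(True, left.fields + right.fields).fields == left.fields
```

Without `__bool__`, any non-empty tuple would be truthy, making `MatchResult(False, [])` still count as a match.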
class MessageFilterNode(BaseFilterNode):
@@ -124,15 +149,15 @@ class MessageFilterNode(BaseFilterNode):
self.operator = operator
self.value = value
def match(self, msg) -> MATCH_RESULT:
return msg.matches(self)
def match(self, msg, short_circuit=True) -> MatchResult:
return msg.matches(self, short_circuit)
@property
def children(self):
return self.selector, self.operator, self.value
class MetaFieldSpecifier(str):
class MetaFieldSpecifier(tuple):
pass
@@ -158,7 +183,7 @@ class MessageFilterVisitor(PTNodeVisitor):
return LiteralValue(ast.literal_eval(node.value))
def visit_meta_field_specifier(self, _node, children):
return MetaFieldSpecifier(children[0])
return MetaFieldSpecifier(children)
def visit_enum_field_specifier(self, _node, children):
return EnumFieldSpecifier(*children)


@@ -21,7 +21,7 @@ from hippolyzer.lib.base.datatypes import TaggedUnion, UUID, TupleCoord
from hippolyzer.lib.base.helpers import bytes_escape
from hippolyzer.lib.base.message.message_formatting import HumanMessageSerializer
from hippolyzer.lib.proxy.message_filter import MetaFieldSpecifier, compile_filter, BaseFilterNode, MessageFilterNode, \
EnumFieldSpecifier
EnumFieldSpecifier, MatchResult
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.caps import CapType, SerializedCapData
@@ -235,7 +235,7 @@ class AbstractMessageLogEntry(abc.ABC):
obj = self.region.objects.lookup_localid(selected_local)
return obj and obj.FullID
def _get_meta(self, name: str):
def _get_meta(self, name: str) -> typing.Any:
# Slight difference in semantics. Filters are meant to return the same
# thing no matter when they're run, so SelectedLocal and friends resolve
# to the selected items _at the time the message was logged_. To handle
@@ -308,7 +308,9 @@ class AbstractMessageLogEntry(abc.ABC):
def _val_matches(self, operator, val, expected):
if isinstance(expected, MetaFieldSpecifier):
expected = self._get_meta(str(expected))
if len(expected) != 1:
raise ValueError(f"Can only support single-level Meta specifiers, not {expected!r}")
expected = self._get_meta(str(expected[0]))
if not isinstance(expected, (int, float, bytes, str, type(None), tuple)):
if callable(expected):
expected = expected()
@@ -362,12 +364,18 @@ class AbstractMessageLogEntry(abc.ABC):
if matcher.value or matcher.operator:
return False
return self._packet_root_matches(matcher.selector[0])
if len(matcher.selector) == 2 and matcher.selector[0] == "Meta":
return self._val_matches(matcher.operator, self._get_meta(matcher.selector[1]), matcher.value)
if matcher.selector[0] == "Meta":
if len(matcher.selector) == 2:
return self._val_matches(matcher.operator, self._get_meta(matcher.selector[1]), matcher.value)
elif len(matcher.selector) == 3:
meta_dict = self._get_meta(matcher.selector[1])
if not meta_dict or not hasattr(meta_dict, 'get'):
return False
return self._val_matches(matcher.operator, meta_dict.get(matcher.selector[2]), matcher.value)
return None
def matches(self, matcher: "MessageFilterNode"):
return self._base_matches(matcher) or False
def matches(self, matcher: "MessageFilterNode", short_circuit=True) -> "MatchResult":
return MatchResult(self._base_matches(matcher) or False, [])
@property
def seq(self):
@@ -388,6 +396,14 @@ class AbstractMessageLogEntry(abc.ABC):
xmlified = re.sub(rb" <key>", b"<key>", xmlified)
return xmlified.decode("utf8", errors="replace")
@staticmethod
def _format_xml(content):
beautified = minidom.parseString(content).toprettyxml(indent=" ")
# kill blank lines. will break cdata sections. meh.
beautified = re.sub(r'\n\s*\n', '\n', beautified, flags=re.MULTILINE)
return re.sub(r'<([\w]+)>\s*</\1>', r'<\1></\1>',
beautified, flags=re.MULTILINE)
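As a standalone sketch, the `_format_xml` helper above boils down to minidom pretty-printing plus two regex passes, with the same caveat as the comment: CDATA sections will be mangled.

```python
import re
from xml.dom import minidom

def format_xml(content: bytes) -> str:
    # Pretty-print, then strip the blank lines minidom leaves behind for
    # pre-existing whitespace text nodes, then re-collapse empty elements.
    beautified = minidom.parseString(content).toprettyxml(indent="  ")
    beautified = re.sub(r'\n\s*\n', '\n', beautified, flags=re.MULTILINE)
    return re.sub(r'<(\w+)>\s*</\1>', r'<\1></\1>', beautified, flags=re.MULTILINE)

pretty = format_xml(b"<llsd>\n  <map>\n    <key>a</key>\n  </map>\n</llsd>")
assert "<key>a</key>" in pretty
assert "\n\n" not in pretty
```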
class HTTPMessageLogEntry(AbstractMessageLogEntry):
__slots__ = ["flow"]
@@ -400,7 +416,7 @@ class HTTPMessageLogEntry(AbstractMessageLogEntry):
super().__init__(region, session)
# This was a request the proxy made through itself
self.meta["Injected"] = flow.request_injected
self.meta["Synthetic"] = flow.request_injected
@property
def type(self):
@@ -476,13 +492,17 @@ class HTTPMessageLogEntry(AbstractMessageLogEntry):
if not beautified:
content_type = self._guess_content_type(message)
if content_type.startswith("application/llsd"):
beautified = self._format_llsd(llsd.parse(message.content))
try:
beautified = self._format_llsd(llsd.parse(message.content))
except llsd.LLSDParseError:
# Sometimes LL sends plain XML with a Content-Type of application/llsd+xml.
# Try to detect that case and work around it
if content_type == "application/llsd+xml" and message.content.startswith(b'<'):
beautified = self._format_xml(message.content)
else:
raise
elif any(content_type.startswith(x) for x in ("application/xml", "text/xml")):
beautified = minidom.parseString(message.content).toprettyxml(indent=" ")
# kill blank lines. will break cdata sections. meh.
beautified = re.sub(r'\n\s*\n', '\n', beautified, flags=re.MULTILINE)
beautified = re.sub(r'<([\w]+)>\s*</\1>', r'<\1></\1>',
beautified, flags=re.MULTILINE)
beautified = self._format_xml(message.content)
except:
LOG.exception("Failed to beautify message")
@@ -541,6 +561,20 @@ class HTTPMessageLogEntry(AbstractMessageLogEntry):
return "application/xml"
return content_type
def _get_meta(self, name: str) -> typing.Any:
lower_name = name.lower()
if lower_name == "url":
return self.flow.request.url
elif lower_name == "reqheaders":
return self.flow.request.headers
elif lower_name == "respheaders":
return self.flow.response.headers
elif lower_name == "host":
return self.flow.request.host.lower()
elif lower_name == "status":
return self.flow.response.status_code
return super()._get_meta(name)
def to_dict(self):
val = super().to_dict()
val['flow'] = self.flow.get_state()
@@ -613,7 +647,7 @@ class LLUDPMessageLogEntry(AbstractMessageLogEntry):
super().__init__(region, session)
_MESSAGE_META_ATTRS = {
"Injected", "Dropped", "Extra", "Resent", "Zerocoded", "Acks", "Reliable",
"Synthetic", "Dropped", "Extra", "Resent", "Zerocoded", "Acks", "Reliable",
}
def _get_meta(self, name: str):
@@ -671,20 +705,21 @@ class LLUDPMessageLogEntry(AbstractMessageLogEntry):
def request(self, beautify=False, replacements=None):
return HumanMessageSerializer.to_human_string(self.message, replacements, beautify)
def matches(self, matcher):
def matches(self, matcher, short_circuit=True) -> "MatchResult":
base_matched = self._base_matches(matcher)
if base_matched is not None:
return base_matched
return MatchResult(base_matched, [])
if not self._packet_root_matches(matcher.selector[0]):
return False
return MatchResult(False, [])
message = self.message
selector_len = len(matcher.selector)
# name, block_name, var_name(, subfield_name)?
if selector_len not in (3, 4):
return False
return MatchResult(False, [])
found_field_keys = []
for block_name in message.blocks:
if not fnmatch.fnmatchcase(block_name, matcher.selector[1]):
continue
@@ -693,13 +728,13 @@ class LLUDPMessageLogEntry(AbstractMessageLogEntry):
if not fnmatch.fnmatchcase(var_name, matcher.selector[2]):
continue
# So we know where the match happened
span_key = (message.name, block_name, block_num, var_name)
field_key = (message.name, block_name, block_num, var_name)
if selector_len == 3:
# We're just matching on the var existing, not having any particular value
if matcher.value is None:
return span_key
if self._val_matches(matcher.operator, block[var_name], matcher.value):
return span_key
found_field_keys.append(field_key)
elif self._val_matches(matcher.operator, block[var_name], matcher.value):
found_field_keys.append(field_key)
# Need to invoke a special unpacker
elif selector_len == 4:
try:
@@ -710,15 +745,21 @@ class LLUDPMessageLogEntry(AbstractMessageLogEntry):
if isinstance(deserialized, TaggedUnion):
deserialized = deserialized.value
if not isinstance(deserialized, dict):
return False
continue
for key in deserialized.keys():
if fnmatch.fnmatchcase(str(key), matcher.selector[3]):
if matcher.value is None:
return span_key
if self._val_matches(matcher.operator, deserialized[key], matcher.value):
return span_key
# Short-circuiting on individual subfields is fine since
# we only highlight whole fields anyway.
found_field_keys.append(field_key)
break
elif self._val_matches(matcher.operator, deserialized[key], matcher.value):
found_field_keys.append(field_key)
break
return False
if short_circuit and found_field_keys:
return MatchResult(True, found_field_keys)
return MatchResult(bool(found_field_keys), found_field_keys)
@property
def summary(self):


@@ -63,18 +63,25 @@ class ProxyObjectManager(ClientObjectManager):
cache_dir=self._region.session().cache_dir,
)
def request_missed_cached_objects_soon(self):
def request_missed_cached_objects_soon(self, report_only=False):
if self._cache_miss_timer:
self._cache_miss_timer.cancel()
# Basically a debounce. Will only trigger 0.2 seconds after the last time it's invoked, to
# deal with the initial flood of ObjectUpdateCached messages and the natural lag between
# them and the viewer's RequestMultipleObjects messages
self._cache_miss_timer = asyncio.get_event_loop().call_later(
0.2, self._request_missed_cached_objects)
loop = asyncio.get_event_loop_policy().get_event_loop()
self._cache_miss_timer = loop.call_later(0.2, self._request_missed_cached_objects, report_only)
def _request_missed_cached_objects(self):
def _request_missed_cached_objects(self, report_only: bool):
self._cache_miss_timer = None
self.request_objects(self.queued_cache_misses)
if not self.queued_cache_misses:
# All the queued cache misses ended up being satisfied without us
# having to request them, no need to fire off a request.
return
if report_only:
print(f"Would have automatically requested {self.queued_cache_misses!r}")
else:
self.request_objects(self.queued_cache_misses)
self.queued_cache_misses.clear()
def clear(self):
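The debounce in `request_missed_cached_objects_soon()` can be sketched as a tiny standalone helper (the `Debouncer` name is invented), using the same `call_later` / `cancel` pattern:

```python
import asyncio
from typing import Callable, Optional

class Debouncer:
    """Coalesce a burst of trigger() calls into one func() call after a quiet period."""

    def __init__(self, delay: float, func: Callable) -> None:
        self._delay = delay
        self._func = func
        self._handle: Optional[asyncio.TimerHandle] = None

    def trigger(self, *args) -> None:
        # Each trigger cancels any pending call and restarts the timer, so
        # func only fires once things have been quiet for `delay` seconds.
        if self._handle is not None:
            self._handle.cancel()
        self._handle = asyncio.get_running_loop().call_later(self._delay, self._fire, args)

    def _fire(self, args) -> None:
        self._handle = None
        self._func(*args)

async def demo() -> list:
    seen = []
    debouncer = Debouncer(0.1, seen.append)
    for i in range(5):        # rapid-fire triggers, 10 ms apart
        debouncer.trigger(i)
        await asyncio.sleep(0.01)
    await asyncio.sleep(0.3)  # let the timer expire
    return seen               # only the last trigger's argument survives

assert asyncio.run(demo()) == [4]
```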
@@ -110,9 +117,12 @@ class ProxyWorldObjectManager(ClientWorldObjectManager):
)
def _handle_object_update_cached_misses(self, region_handle: int, missing_locals: Set[int]):
region_mgr: Optional[ProxyObjectManager] = self._get_region_manager(region_handle)
if not self._settings.ALLOW_AUTO_REQUEST_OBJECTS:
return
if self._settings.AUTOMATICALLY_REQUEST_MISSING_OBJECTS:
if self._settings.USE_VIEWER_OBJECT_CACHE:
region_mgr.queued_cache_misses |= missing_locals
region_mgr.request_missed_cached_objects_soon(report_only=True)
elif self._settings.AUTOMATICALLY_REQUEST_MISSING_OBJECTS:
# Schedule these local IDs to be requested soon if the viewer doesn't request
# them itself. Ideally we could just mutate the CRC of the ObjectUpdateCached
# to force a CRC cache miss in the viewer, but that appears to cause the viewer
@@ -130,9 +140,9 @@ class ProxyWorldObjectManager(ClientWorldObjectManager):
if obj.PCode == PCode.AVATAR and "ParentID" in updated_props:
if obj.ParentID and not region.objects.lookup_localid(obj.ParentID):
# If an avatar just sat on an object we don't know about, add it to the queued
# cache misses and request if if the viewer doesn't. This should happen
# regardless of the auto-request object setting because otherwise we have no way
# to get a sitting agent's true region location, even if it's ourself.
# cache misses and request it if the viewer doesn't. This should happen
# regardless of the auto-request missing objects setting because otherwise we
# have no way to get a sitting agent's true region location, even if it's ourselves.
region.objects.queued_cache_misses.add(obj.ParentID)
region.objects.request_missed_cached_objects_soon()
AddonManager.handle_object_updated(self._session, region, obj, updated_props)


@@ -11,7 +11,8 @@ import multidict
from hippolyzer.lib.base.datatypes import Vector3, UUID
from hippolyzer.lib.base.helpers import proxify
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.message.llsd_msg_serializer import LLSDMessageSerializer
from hippolyzer.lib.base.message.message import Message, Block
from hippolyzer.lib.base.message.message_handler import MessageHandler
from hippolyzer.lib.base.objects import handle_to_global_pos
from hippolyzer.lib.client.state import BaseClientRegion
@@ -51,10 +52,11 @@ class ProxiedRegion(BaseClientRegion):
self.cache_id: Optional[UUID] = None
self.circuit: Optional[ProxiedCircuit] = None
self.circuit_addr = circuit_addr
self._caps = CapsMultiDict()
self.caps = CapsMultiDict()
# Reverse lookup for URL -> cap data
self._caps_url_lookup: Dict[str, Tuple[CapType, str]] = {}
if seed_cap:
self._caps["Seed"] = (CapType.NORMAL, seed_cap)
self.caps["Seed"] = (CapType.NORMAL, seed_cap)
self.session: Callable[[], Session] = weakref.ref(session)
self.message_handler: MessageHandler[Message, str] = MessageHandler()
self.http_message_handler: MessageHandler[HippoHTTPFlow, str] = MessageHandler()
@@ -77,8 +79,8 @@ class ProxiedRegion(BaseClientRegion):
self._name = val
@property
def caps(self):
return multidict.MultiDict((x, y[1]) for x, y in self._caps.items())
def cap_urls(self) -> multidict.MultiDict[str, str]:
return multidict.MultiDict((x, y[1]) for x, y in self.caps.items())
@property
def global_pos(self) -> Vector3:
@@ -95,12 +97,12 @@ class ProxiedRegion(BaseClientRegion):
def update_caps(self, caps: Mapping[str, str]):
for cap_name, cap_url in caps.items():
if isinstance(cap_url, str) and cap_url.startswith('http'):
self._caps.add(cap_name, (CapType.NORMAL, cap_url))
self.caps.add(cap_name, (CapType.NORMAL, cap_url))
self._recalc_caps()
def _recalc_caps(self):
self._caps_url_lookup.clear()
for name, cap_info in self._caps.items():
for name, cap_info in self.caps.items():
cap_type, cap_url = cap_info
self._caps_url_lookup[cap_url] = (cap_type, name)
@@ -109,32 +111,31 @@ class ProxiedRegion(BaseClientRegion):
Wrap an existing, non-unique cap with a unique URL
caps like ViewerAsset may be the same globally and wouldn't let us infer
which session / region the request was related to without a wrapper
which session / region the request was related to without a wrapper URL
that we inject into the seed response sent to the viewer.
"""
parsed = list(urllib.parse.urlsplit(self._caps[name][1]))
seed_id = self._caps["Seed"][1].split("/")[-1].encode("utf8")
parsed = list(urllib.parse.urlsplit(self.caps[name][1]))
seed_id = self.caps["Seed"][1].split("/")[-1].encode("utf8")
# Give it a unique domain tied to the current Seed URI
parsed[1] = f"{name.lower()}-{hashlib.sha256(seed_id).hexdigest()[:16]}.hippo-proxy.localhost"
# Force the URL to HTTP; we're going to handle the request ourselves, so it doesn't
# need to be secure. This should save expensive TLS context setup for each request.
parsed[0] = "http"
wrapper_url = urllib.parse.urlunsplit(parsed)
self._caps.add(name + "ProxyWrapper", (CapType.WRAPPER, wrapper_url))
self._recalc_caps()
# Register it with "ProxyWrapper" appended so we don't shadow the real cap URL
# in our own view of the caps
self.register_cap(name + "ProxyWrapper", wrapper_url, CapType.WRAPPER)
return wrapper_url
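The wrapper-hostname derivation above can be exercised standalone. This sketch mirrors the scheme in the diff (the `.hippo-proxy.localhost` suffix and hash-prefix length come from the code above; the function name and seed URL are illustrative):

```python
import hashlib
import urllib.parse

def wrap_cap_url(name: str, seed_url: str) -> str:
    """Rewrite a cap onto a unique, seed-derived proxy hostname (sketch)."""
    # Tie the hostname to the current Seed URI's trailing ID
    seed_id = seed_url.split("/")[-1].encode("utf8")
    host = f"{name.lower()}-{hashlib.sha256(seed_id).hexdigest()[:16]}.hippo-proxy.localhost"
    # Plain HTTP: the proxy terminates the request itself, so no TLS is needed.
    return urllib.parse.urlunsplit(("http", host, "/", "", ""))

url = wrap_cap_url("ViewerAsset", "https://sim.example.com/cap/1234abcd")
```

Because the hash is keyed on the Seed ID, the same cap name maps to a different wrapper host per session, which is what lets the proxy infer which session/region a request belongs to.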
def register_proxy_cap(self, name: str):
"""
Register a cap to be completely handled by the proxy
"""
cap_url = f"https://caps.hippo-proxy.localhost/cap/{uuid.uuid4()!s}"
self._caps.add(name, (CapType.PROXY_ONLY, cap_url))
self._recalc_caps()
"""Register a cap to be completely handled by the proxy"""
cap_url = f"http://{uuid.uuid4()!s}.caps.hippo-proxy.localhost"
self.register_cap(name, cap_url, CapType.PROXY_ONLY)
return cap_url
def register_temporary_cap(self, name: str, cap_url: str):
def register_cap(self, name: str, cap_url: str, cap_type: CapType = CapType.NORMAL):
"""Register a Cap that only has meaning the first time it's used"""
self._caps.add(name, (CapType.TEMPORARY, cap_url))
self.caps.add(name, (cap_type, cap_url))
self._recalc_caps()
def resolve_cap(self, url: str, consume=True) -> Optional[Tuple[str, str, CapType]]:
@@ -143,9 +144,9 @@ class ProxiedRegion(BaseClientRegion):
cap_type, name = self._caps_url_lookup[cap_url]
if cap_type == CapType.TEMPORARY and consume:
# Resolving a temporary cap pops it out of the dict
temporary_caps = self._caps.popall(name)
temporary_caps = self.caps.popall(name)
temporary_caps.remove((cap_type, cap_url))
self._caps.extend((name, x) for x in temporary_caps)
self.caps.extend((name, x) for x in temporary_caps)
self._recalc_caps()
return name, cap_url, cap_type
return None
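The temporary-cap consumption above removes only the one matching entry from a multidict-style store, leaving any other URLs registered under the same name. The behavior can be sketched with a plain dict of lists (class and method names are illustrative, not Hippolyzer's API):

```python
from enum import Enum, auto

class CapType(Enum):
    NORMAL = auto()
    TEMPORARY = auto()

class CapStore:
    """Multidict-like cap store where resolving a TEMPORARY cap consumes it (sketch)."""
    def __init__(self):
        self._caps = {}

    def add(self, name, cap_type, url):
        self._caps.setdefault(name, []).append((cap_type, url))

    def resolve(self, url, consume=True):
        for name, entries in self._caps.items():
            for cap_type, cap_url in entries:
                if cap_url != url:
                    continue
                if cap_type is CapType.TEMPORARY and consume:
                    # Pop just this entry; siblings under the same name survive
                    entries.remove((cap_type, cap_url))
                return name, cap_url, cap_type
        return None

store = CapStore()
store.add("TempExample", CapType.TEMPORARY, "http://not.example.com")
store.add("TempExample", CapType.TEMPORARY, "http://not2.example.com")
first = store.resolve("http://not.example.com")
```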
@@ -168,9 +169,26 @@ class EventQueueManager:
self._region = weakref.proxy(region)
self._last_ack: Optional[int] = None
self._last_payload: Optional[Any] = None
self.llsd_message_serializer = LLSDMessageSerializer()
def inject_message(self, message: Message):
self.inject_event(self.llsd_message_serializer.serialize(message, True))
def inject_event(self, event: dict):
self._queued_events.append(event)
if self._region:
circuit: ProxiedCircuit = self._region.circuit
session: Session = self._region.session()
# Inject an outbound PlacesQuery message so we can trigger an inbound PlacesReply
# over the EQ. That will allow us to shove our own event onto the response once it comes in,
# otherwise we have to wait until the EQ legitimately returns 200 due to a new event.
# May or may not work in OpenSim.
circuit.send_message(Message(
'PlacesQuery',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id, QueryID=UUID()),
Block('TransactionData', TransactionID=UUID()),
Block('QueryData', QueryText=b'', QueryFlags=64, Category=-1, SimName=b''),
))
def take_injected_events(self):
events = self._queued_events
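The injection flow above boils down to: queue the synthetic event, nudge the sim into producing a real EQ response (here via a throwaway PlacesQuery), then splice the queued events into that response when it arrives. The queue bookkeeping can be sketched independently of Hippolyzer's classes (all names below are illustrative):

```python
class EventQueueInjector:
    """Queue synthetic event-queue events until the next real EQ response (sketch)."""
    def __init__(self, nudge):
        self._queued = []
        self._nudge = nudge  # e.g. sends a throwaway query to force an EQ reply

    def inject_event(self, event: dict):
        self._queued.append(event)
        self._nudge()

    def take_injected_events(self) -> list:
        # Hand back everything queued so far and start fresh
        events, self._queued = self._queued, []
        return events

nudges = []
injector = EventQueueInjector(lambda: nudges.append(1))
injector.inject_event({"message": "FakeEvent", "body": {}})
taken = injector.take_injected_events()
```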



@@ -99,12 +99,12 @@ class Session(BaseClientSession):
for region in self.regions:
if region.circuit_addr == circuit_addr:
if seed_url and region.caps.get("Seed") != seed_url:
if seed_url and region.cap_urls.get("Seed") != seed_url:
region.update_caps({"Seed": seed_url})
if handle:
region.handle = handle
return region
if seed_url and region.caps.get("Seed") == seed_url:
if seed_url and region.cap_urls.get("Seed") == seed_url:
return region
if not circuit_addr:
@@ -113,6 +113,7 @@ class Session(BaseClientSession):
logging.info("Registering region for %r" % (circuit_addr,))
region = ProxiedRegion(circuit_addr, seed_url, self, handle=handle)
self.regions.append(region)
AddonManager.handle_region_registered(self, region)
return region
def region_by_circuit_addr(self, circuit_addr) -> Optional[ProxiedRegion]:


@@ -63,8 +63,14 @@ class TaskScheduler:
def shutdown(self):
for task_data, task in self.tasks:
task.cancel()
await_all = asyncio.gather(*(task for task_data, task in self.tasks))
asyncio.get_event_loop().run_until_complete(await_all)
try:
event_loop = asyncio.get_running_loop()
await_all = asyncio.gather(*(task for task_data, task in self.tasks))
event_loop.run_until_complete(await_all)
except RuntimeError:
pass
self.tasks.clear()
def _task_done(self, task: asyncio.Task):
for task_details in reversed(self.tasks):
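The shutdown change wraps the event-loop access in a try/except because there may be no usable loop by teardown time. The underlying cancel-then-gather pattern, in isolation, looks like the following (a sketch of the general asyncio idiom, not the scheduler's exact code):

```python
import asyncio

async def shutdown_all(tasks):
    # Cancel everything first, then wait for the cancellations to settle;
    # return_exceptions=True swallows the resulting CancelledErrors.
    for task in tasks:
        task.cancel()
    await asyncio.gather(*tasks, return_exceptions=True)

async def main():
    tasks = [asyncio.create_task(asyncio.sleep(60)) for _ in range(3)]
    await shutdown_all(tasks)
    return [t.cancelled() for t in tasks]

results = asyncio.run(main())
```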


@@ -14,7 +14,7 @@ from hippolyzer.lib.proxy.transport import SOCKS5UDPTransport
class BaseProxyTest(unittest.IsolatedAsyncioTestCase):
def setUp(self) -> None:
async def asyncSetUp(self) -> None:
self.client_addr = ("127.0.0.1", 1)
self.region_addr = ("127.0.0.1", 3)
self.circuit_code = 1234
@@ -37,6 +37,9 @@ class BaseProxyTest(unittest.IsolatedAsyncioTestCase):
self.serializer = UDPMessageSerializer()
self.session.objects.track_region_objects(123)
def tearDown(self) -> None:
self.protocol.close()
async def _wait_drained(self):
await asyncio.sleep(0.001)


@@ -0,0 +1,39 @@
import abc
from mitmproxy.addons import asgiapp
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session, SessionManager
async def serve(app, flow: HippoHTTPFlow):
"""Serve a request based on a Hippolyzer HTTP flow using a provided app"""
await asgiapp.serve(app, flow.flow)
# Send the modified flow object back to mitmproxy
flow.resume()
class WebAppCapAddon(BaseAddon, abc.ABC):
"""
Addon that provides a cap via an ASGI webapp
Handles all registration of the cap URL and routing of the request.
"""
CAP_NAME: str
APP: any
def handle_region_registered(self, session: Session, region: ProxiedRegion):
# Register a fake URL for our cap. This will add the cap URL to the Seed
# response that gets sent back to the client if that cap name was requested.
if self.CAP_NAME not in region.cap_urls:
region.register_proxy_cap(self.CAP_NAME)
def handle_http_request(self, session_manager: SessionManager, flow: HippoHTTPFlow):
if flow.cap_data.cap_name != self.CAP_NAME:
return
# This request may take a while to generate a response for, so take it out of
# the normal HTTP handling flow and handle it in an async task.
# TODO: Make all HTTP handling hooks async so this isn't necessary
self._schedule_task(serve(self.APP, flow.take()))
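`APP` can be any ASGI application. For reference, the minimal shape of an ASGI callable such an addon could serve looks like this (the app body and the stub driver are illustrative, not Hippolyzer or mitmproxy API):

```python
import asyncio

async def app(scope, receive, send):
    """Tiny ASGI app of the kind WebAppCapAddon.APP could point at (sketch)."""
    assert scope["type"] == "http"
    await send({
        "type": "http.response.start",
        "status": 200,
        "headers": [(b"content-type", b"text/plain")],
    })
    await send({"type": "http.response.body", "body": b"hello from a proxy cap"})

async def drive():
    # Drive the app with stub ASGI callables to show the call shape
    sent = []
    async def receive():
        return {"type": "http.request", "body": b"", "more_body": False}
    async def send(event):
        sent.append(event)
    await app({"type": "http", "method": "GET", "path": "/"}, receive, send)
    return sent

events = asyncio.run(drive())
```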


@@ -1,65 +1,68 @@
aiohttp==3.7.4.post0
aiohttp==3.8.1
aiosignal==1.2.0
appdirs==1.4.4
Arpeggio==1.10.2
asgiref==3.4.1
async-timeout==3.0.1
async-timeout==4.0.1
attrs==21.2.0
blinker==1.4
Brotli==1.0.9
certifi==2021.5.30
cffi==1.14.6
chardet==4.0.0
charset-normalizer==2.0.3
click==8.0.1
cryptography==3.4.7
certifi==2021.10.8
cffi==1.15.0
charset-normalizer==2.0.9
click==8.0.3
cryptography==36.0.2
defusedxml==0.7.1
Flask==2.0.1
Glymur==0.9.3
Flask==2.0.2
frozenlist==1.2.0
Glymur==0.9.6
h11==0.12.0
h2==4.0.0
h2==4.1.0
hpack==4.0.0
hyperframe==6.0.1
idna==2.10
itsdangerous==2.0.1
jedi==0.18.0
Jinja2==3.0.1
jedi==0.18.1
Jinja2==3.0.3
kaitaistruct==0.9
lazy-object-proxy==1.6.0
ldap3==2.9
ldap3==2.9.1
llbase==1.2.11
lxml==4.6.3
lxml==4.6.4
MarkupSafe==2.0.1
mitmproxy==7.0.2
msgpack==1.0.2
multidict==5.1.0
numpy==1.21.0
parso==0.8.2
mitmproxy==8.0.0
msgpack==1.0.3
multidict==5.2.0
numpy==1.21.4
parso==0.8.3
passlib==1.7.4
prompt-toolkit==3.0.19
protobuf==3.17.3
ptpython==3.0.19
prompt-toolkit==3.0.23
protobuf==3.18.1
ptpython==3.0.20
publicsuffix2==2.20191221
pyasn1==0.4.8
pycparser==2.20
Pygments==2.9.0
pyOpenSSL==20.0.1
pycparser==2.21
pycollada==0.7.2
Pygments==2.10.0
pyOpenSSL==22.0.0
pyparsing==2.4.7
pyperclip==1.8.2
PySide2==5.15.2
qasync==0.17.0
PySide6==6.2.2
qasync==0.22.0
recordclass==0.14.3
requests==2.26.0
ruamel.yaml==0.17.10
ruamel.yaml==0.17.16
ruamel.yaml.clib==0.2.6
shiboken2==5.15.2
shiboken6==6.2.2
six==1.16.0
sortedcontainers==2.4.0
tornado==6.1
typing-extensions==3.10.0.0
urllib3==1.26.6
transformations==2021.6.6
typing-extensions==4.0.1
urllib3==1.26.7
urwid==2.1.2
wcwidth==0.2.5
Werkzeug==2.0.1
Werkzeug==2.0.2
wsproto==1.0.0
yarl==1.6.3
yarl==1.7.2
zstandard==0.15.2


@@ -25,7 +25,7 @@ from setuptools import setup, find_packages
here = path.abspath(path.dirname(__file__))
version = '0.7.1'
version = '0.11.0'
with open(path.join(here, 'README.md')) as readme_fh:
readme = readme_fh.read()
@@ -44,6 +44,7 @@ setup(
"Programming Language :: Python :: 3 :: Only",
"Programming Language :: Python :: 3.8",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: Implementation :: CPython",
"Topic :: System :: Networking :: Monitoring",
"Topic :: Software Development :: Libraries :: Python Modules",
@@ -66,6 +67,7 @@ setup(
'lib/base/data/static_data.db2',
'lib/base/data/static_index.db2',
'lib/base/data/avatar_lad.xml',
'lib/base/data/male_collada_joints.xml',
'lib/base/data/avatar_skeleton.xml',
'lib/base/data/LICENSE-artwork.txt',
],
@@ -88,15 +90,18 @@ setup(
# requests breaks with newer idna
'idna<3,>=2.5',
# 7.x will be a major change.
'mitmproxy>=7.0.2,<8.0',
'mitmproxy>=8.0.0,<8.1',
# For REPLs
'ptpython<4.0',
# JP2 codec
'Glymur<1.0',
'Glymur<0.9.7',
'numpy<2.0',
# These could be in extras_require if you don't want a GUI.
'pyside2<6.0',
'pyside6',
'qasync',
# Needed for mesh format conversion tooling
'pycollada',
'transformations',
],
tests_require=[
"pytest",


@@ -9,20 +9,21 @@ from cx_Freeze import setup, Executable
# We don't need any of these and they make the archive huge.
TO_DELETE = [
"lib/PySide2/Qt3DRender.pyd",
"lib/PySide2/Qt53DRender.dll",
"lib/PySide2/Qt5Charts.dll",
"lib/PySide2/Qt5Location.dll",
"lib/PySide2/Qt5Pdf.dll",
"lib/PySide2/Qt5Quick.dll",
"lib/PySide2/Qt5WebEngineCore.dll",
"lib/PySide2/QtCharts.pyd",
"lib/PySide2/QtMultimedia.pyd",
"lib/PySide2/QtOpenGLFunctions.pyd",
"lib/PySide2/QtOpenGLFunctions.pyi",
"lib/PySide2/d3dcompiler_47.dll",
"lib/PySide2/opengl32sw.dll",
"lib/PySide2/translations",
"lib/PySide6/Qt6DRender.pyd",
"lib/PySide6/Qt63DRender.dll",
"lib/PySide6/Qt6Charts.dll",
"lib/PySide6/Qt6Location.dll",
"lib/PySide6/Qt6Pdf.dll",
"lib/PySide6/Qt6Quick.dll",
"lib/PySide6/Qt6WebEngineCore.dll",
"lib/PySide6/QtCharts.pyd",
"lib/PySide6/QtMultimedia.pyd",
"lib/PySide6/QtOpenGLFunctions.pyd",
"lib/PySide6/QtOpenGLFunctions.pyi",
"lib/PySide6/d3dcompiler_47.dll",
"lib/PySide6/opengl32sw.dll",
"lib/PySide6/lupdate.exe",
"lib/PySide6/translations",
"lib/aiohttp/_find_header.c",
"lib/aiohttp/_frozenlist.c",
"lib/aiohttp/_helpers.c",
@@ -112,7 +113,7 @@ executables = [
setup(
name="hippolyzer_gui",
version="0.7.1",
version="0.9.0",
description="Hippolyzer GUI",
options=options,
executables=executables,

static/repl_screenshot.png: new binary file (42 KiB), not shown.


@@ -134,3 +134,15 @@ class TestDatatypes(unittest.TestCase):
val = llsd.parse_binary(llsd.format_binary(orig))
self.assertIsInstance(val, UUID)
self.assertEqual(orig, val)
def test_jank_stringy_bytes(self):
val = JankStringyBytes(b"foo\x00")
self.assertTrue("o" in val)
self.assertTrue(b"o" in val)
self.assertFalse(b"z" in val)
self.assertFalse("z" in val)
self.assertEqual("foo", val)
self.assertEqual(b"foo\x00", val)
self.assertNotEqual(b"foo", val)
self.assertEqual(b"foo", JankStringyBytes(b"foo"))
self.assertEqual("foo", JankStringyBytes(b"foo"))
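The assertions above pin down the semantics: equality and containment accept both `str` and `bytes`, and a trailing NUL is ignored only for the `str` comparison (`bytes` comparison stays exact). One way those semantics could be implemented — a sketch, not the library's actual code:

```python
class JankStringyBytes(bytes):
    """bytes that also compare against str, ignoring a trailing NUL (sketch)."""

    def __eq__(self, other):
        if isinstance(other, str):
            # str comparison is lenient about a C-style NUL terminator
            return self.rstrip(b"\x00").decode("utf8", errors="replace") == other
        # bytes comparison is exact, trailing NUL included
        return bytes(self) == other

    def __ne__(self, other):
        return not self.__eq__(other)

    def __hash__(self):
        return bytes.__hash__(self)

    def __contains__(self, item):
        if isinstance(item, str):
            item = item.encode("utf8")
        return bytes.__contains__(self, item)

val = JankStringyBytes(b"foo\x00")
```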


@@ -1,7 +1,8 @@
import copy
import unittest
from hippolyzer.lib.base.datatypes import *
from hippolyzer.lib.base.legacy_inv import InventoryModel
from hippolyzer.lib.base.inventory import InventoryModel
from hippolyzer.lib.base.wearables import Wearable, VISUAL_PARAMS
SIMPLE_INV = """\tinv_object\t0
@@ -44,22 +45,123 @@ SIMPLE_INV = """\tinv_object\t0
class TestLegacyInv(unittest.TestCase):
def setUp(self) -> None:
self.model = InventoryModel.from_str(SIMPLE_INV)
def test_parse(self):
model = InventoryModel.from_str(SIMPLE_INV)
self.assertTrue(UUID('f4d91477-def1-487a-b4f3-6fa201c17376') in model.containers)
self.assertIsNotNone(model.root)
self.assertTrue(UUID('f4d91477-def1-487a-b4f3-6fa201c17376') in self.model.nodes)
self.assertIsNotNone(self.model.root)
def test_serialize(self):
model = InventoryModel.from_str(SIMPLE_INV)
new_model = InventoryModel.from_str(model.to_str())
self.assertEqual(model, new_model)
self.model = InventoryModel.from_str(SIMPLE_INV)
new_model = InventoryModel.from_str(self.model.to_str())
self.assertEqual(self.model, new_model)
def test_item_access(self):
model = InventoryModel.from_str(SIMPLE_INV)
item = model.items[UUID('dd163122-946b-44df-99f6-a6030e2b9597')]
item = self.model.nodes[UUID('dd163122-946b-44df-99f6-a6030e2b9597')]
self.assertEqual(item.name, "New Script")
self.assertEqual(item.sale_info.sale_type, "not")
self.assertEqual(item.model, model)
self.assertEqual(item.model, self.model)
def test_access_children(self):
root = self.model.root
item = tuple(self.model.ordered_nodes)[1]
self.assertEqual((item,), root.children)
def test_access_parent(self):
root = self.model.root
item = tuple(self.model.ordered_nodes)[1]
self.assertEqual(root, item.parent)
self.assertEqual(None, root.parent)
def test_unlink(self):
self.assertEqual(1, len(self.model.root.children))
item = tuple(self.model.ordered_nodes)[1]
self.assertEqual([item], item.unlink())
self.assertEqual(0, len(self.model.root.children))
self.assertEqual(None, item.model)
def test_relink(self):
item = tuple(self.model.ordered_nodes)[1]
for unlinked in item.unlink():
self.model.add(unlinked)
self.assertEqual(self.model, item.model)
self.assertEqual(1, len(self.model.root.children))
def test_eq_excludes_model(self):
item = tuple(self.model.ordered_nodes)[1]
item_copy = copy.copy(item)
item_copy.model = None
self.assertEqual(item, item_copy)
def test_llsd_serialization(self):
self.assertEqual(
self.model.to_llsd(),
[
{
'name': 'Contents',
'obj_id': UUID('f4d91477-def1-487a-b4f3-6fa201c17376'),
'parent_id': UUID('00000000-0000-0000-0000-000000000000'),
'type': 'category'
},
{
'asset_id': UUID('00000000-0000-0000-0000-000000000000'),
'created_at': 1587367239,
'desc': '2020-04-20 04:20:39 lsl2 script',
'flags': b'\x00\x00\x00\x00',
'inv_type': 'script',
'item_id': UUID('dd163122-946b-44df-99f6-a6030e2b9597'),
'name': 'New Script',
'parent_id': UUID('f4d91477-def1-487a-b4f3-6fa201c17376'),
'permissions': {
'base_mask': 2147483647,
'creator_id': UUID('a2e76fcd-9360-4f6d-a924-000000000003'),
'everyone_mask': 0,
'group_id': UUID('00000000-0000-0000-0000-000000000000'),
'group_mask': 0,
'last_owner_id': UUID('a2e76fcd-9360-4f6d-a924-000000000003'),
'next_owner_mask': 581632,
'owner_id': UUID('a2e76fcd-9360-4f6d-a924-000000000003'),
'owner_mask': 2147483647
},
'sale_info': {
'sale_price': 10,
'sale_type': 'not'
},
'type': 'lsltext'
}
]
)
def test_llsd_legacy_equality(self):
new_model = InventoryModel.from_llsd(self.model.to_llsd())
self.assertEqual(self.model, new_model)
new_model.root.name = "foo"
self.assertNotEqual(self.model, new_model)
def test_difference_added(self):
new_model = InventoryModel.from_llsd(self.model.to_llsd())
diff = self.model.get_differences(new_model)
self.assertEqual([], diff.changed)
self.assertEqual([], diff.removed)
new_model.root.name = "foo"
diff = self.model.get_differences(new_model)
self.assertEqual([new_model.root], diff.changed)
self.assertEqual([], diff.removed)
item = new_model.root.children[0]
item.unlink()
diff = self.model.get_differences(new_model)
self.assertEqual([new_model.root], diff.changed)
self.assertEqual([item], diff.removed)
new_item = copy.copy(item)
new_item.node_id = UUID.random()
new_model.add(new_item)
diff = self.model.get_differences(new_model)
self.assertEqual([new_model.root, new_item], diff.changed)
self.assertEqual([item], diff.removed)
GIRL_NEXT_DOOR_SHAPE = """LLWearable version 22


@@ -300,3 +300,14 @@ class HumanReadableMessageTests(unittest.TestCase):
with self.assertRaises(ValueError):
HumanMessageSerializer.from_human_string(val)
def test_flags(self):
val = """
OUT FooMessage [ZEROCODED] [RELIABLE] [1]
[SomeBlock]
foo = 1
"""
msg = HumanMessageSerializer.from_human_string(val)
self.assertEqual(HumanMessageSerializer.to_human_string(msg).strip(), val.strip())


@@ -28,7 +28,8 @@ class MockHandlingCircuit(ProxiedCircuit):
self.handler = handler
def _send_prepared_message(self, message: Message, transport=None):
asyncio.get_event_loop().call_soon(self.handler.handle, message)
loop = asyncio.get_event_loop_policy().get_event_loop()
loop.call_soon(self.handler.handle, message)
class MockConnectionHolder(ConnectionHolder):
@@ -70,7 +71,7 @@ class XferManagerTests(BaseTransferTests):
manager = XferManager(self.server_connection)
xfer = await manager.request(vfile_id=asset_id, vfile_type=AssetType.BODYPART)
self.received_bytes = xfer.reassemble_chunks()
self.server_circuit.send_message(Message(
self.server_circuit.send(Message(
"AssetUploadComplete",
Block("AssetBlock", UUID=asset_id, Type=asset_block["Type"], Success=True),
direction=Direction.IN,
@@ -109,7 +110,7 @@ class TestTransferManager(BaseTransferTests):
self.assertEqual(EstateAssetType.COVENANT, params.EstateAssetType)
data = self.LARGE_PAYLOAD
self.server_circuit.send_message(Message(
self.server_circuit.send(Message(
'TransferInfo',
Block(
'TransferInfo',
@@ -125,7 +126,7 @@ class TestTransferManager(BaseTransferTests):
while True:
chunk = data[:1000]
data = data[1000:]
self.server_circuit.send_message(Message(
self.server_circuit.send(Message(
'TransferPacket',
Block(
'TransferData',


@@ -62,8 +62,8 @@ addons = [ChildAddon()]
class AddonIntegrationTests(BaseProxyTest):
def setUp(self) -> None:
super().setUp()
async def asyncSetUp(self) -> None:
await super().asyncSetUp()
self.addon = MockAddon()
AddonManager.init([], self.session_manager, [self.addon], swallow_addon_exceptions=False)
self.temp_dir = TemporaryDirectory(prefix="addon_test_sources")


@@ -30,8 +30,8 @@ class MockAddon(BaseAddon):
class HTTPIntegrationTests(BaseProxyTest):
def setUp(self) -> None:
super().setUp()
async def asyncSetUp(self) -> None:
await super().asyncSetUp()
self.addon = MockAddon()
AddonManager.init([], self.session_manager, [self.addon])
self.flow_context = self.session_manager.flow_context
@@ -124,8 +124,8 @@ class HTTPIntegrationTests(BaseProxyTest):
class TestCapsClient(BaseProxyTest):
def setUp(self) -> None:
super().setUp()
async def asyncSetUp(self) -> None:
await super().asyncSetUp()
self._setup_default_circuit()
self.caps_client = self.session.main_region.caps_client
@@ -141,29 +141,30 @@ class TestCapsClient(BaseProxyTest):
class TestMITMProxy(BaseProxyTest):
def setUp(self) -> None:
super().setUp()
async def asyncSetUp(self) -> None:
await super().asyncSetUp()
self._setup_default_circuit()
self.caps_client = self.session.main_region.caps_client
def test_mitmproxy_works(self):
proxy_port = 9905
self.session_manager.settings.HTTP_PROXY_PORT = proxy_port
http_proc = multiprocessing.Process(
self.http_proc = multiprocessing.Process(
target=run_http_proxy_process,
args=("127.0.0.1", proxy_port, self.session_manager.flow_context),
daemon=True,
)
http_proc.start()
self.http_proc.start()
self.session_manager.flow_context.mitmproxy_ready.wait(1.0)
http_event_manager = MITMProxyEventManager(self.session_manager, self.session_manager.flow_context)
self.http_event_manager = MITMProxyEventManager(
self.session_manager,
self.session_manager.flow_context
)
def test_mitmproxy_works(self):
async def _request_example_com():
# Pump callbacks from mitmproxy
asyncio.create_task(http_event_manager.run())
asyncio.create_task(self.http_event_manager.run())
try:
async with self.caps_client.get("http://example.com/", timeout=0.5) as resp:
self.assertIn(b"Example Domain", await resp.read())
@@ -173,4 +174,4 @@ class TestMITMProxy(BaseProxyTest):
# Tell the event pump and mitmproxy they need to shut down
self.session_manager.flow_context.shutdown_signal.set()
asyncio.run(_request_example_com())
http_proc.join()
self.http_proc.join()


@@ -12,7 +12,6 @@ from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.message.udpdeserializer import UDPMessageDeserializer
from hippolyzer.lib.base.objects import Object
import hippolyzer.lib.base.serialization as se
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.message_logger import FilteringMessageLogger, LLUDPMessageLogEntry
@@ -48,8 +47,8 @@ class SimpleMessageLogger(FilteringMessageLogger):
class LLUDPIntegrationTests(BaseProxyTest):
def setUp(self) -> None:
super().setUp()
async def asyncSetUp(self) -> None:
await super().asyncSetUp()
self.addon = MockAddon()
self.deserializer = UDPMessageDeserializer()
AddonManager.init([], self.session_manager, [self.addon])
@@ -205,8 +204,8 @@ class LLUDPIntegrationTests(BaseProxyTest):
self.protocol.datagram_received(obj_update, self.region_addr)
await self._wait_drained()
entries = message_logger.entries
self.assertEqual(len(entries), 1)
self.assertEqual(entries[0].name, "ObjectUpdateCompressed")
self.assertEqual(1, len(entries))
self.assertEqual("ObjectUpdateCompressed", entries[0].name)
async def test_filtering_logged_messages(self):
message_logger = SimpleMessageLogger()
@@ -223,8 +222,8 @@ class LLUDPIntegrationTests(BaseProxyTest):
await self._wait_drained()
message_logger.set_filter("ObjectUpdateCompressed")
entries = message_logger.entries
self.assertEqual(len(entries), 1)
self.assertEqual(entries[0].name, "ObjectUpdateCompressed")
self.assertEqual(1, len(entries))
self.assertEqual("ObjectUpdateCompressed", entries[0].name)
async def test_logging_taken_message(self):
message_logger = SimpleMessageLogger()
@@ -262,11 +261,6 @@ class LLUDPIntegrationTests(BaseProxyTest):
# Don't have a serializer, onto the next field
continue
deser = serializer.deserialize(block, orig_val)
# For now we consider returning UNSERIALIZABLE to be acceptable.
# We should probably consider raising instead of returning that.
if deser is se.UNSERIALIZABLE:
continue
new_val = serializer.serialize(block, deser)
if orig_val != new_val:
raise AssertionError(f"{block.name}.{var_name} didn't reserialize correctly,"


@@ -1,13 +1,14 @@
from mitmproxy.test import tflow, tutils
from hippolyzer.lib.proxy.caps import CapType
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.message_logger import HTTPMessageLogEntry
from hippolyzer.lib.proxy.test_utils import BaseProxyTest
class TestHTTPFlows(BaseProxyTest):
def setUp(self) -> None:
super().setUp()
async def asyncSetUp(self) -> None:
await super().asyncSetUp()
self.region = self.session.register_region(
("127.0.0.1", 2),
"https://test.localhost:4/foo",
@@ -18,7 +19,7 @@ class TestHTTPFlows(BaseProxyTest):
"ViewerAsset": "http://assets.example.com",
})
def test_request_formatting(self):
async def test_request_formatting(self):
req = tutils.treq(host="example.com", port=80)
fake_flow = tflow.tflow(req=req, resp=tutils.tresp())
flow = HippoHTTPFlow.from_state(fake_flow.get_state(), self.session_manager)
@@ -32,7 +33,7 @@ content-length: 7\r
\r
content""")
def test_binary_request_formatting(self):
async def test_binary_request_formatting(self):
req = tutils.treq(host="example.com", port=80)
fake_flow = tflow.tflow(req=req, resp=tutils.tresp())
flow = HippoHTTPFlow.from_state(fake_flow.get_state(), self.session_manager)
@@ -46,7 +47,7 @@ X-Hippo-Escaped-Body: 1\r
\r
c\\x00ntent""")
def test_llsd_response_formatting(self):
async def test_llsd_response_formatting(self):
fake_flow = tflow.tflow(req=tutils.treq(), resp=tutils.tresp())
flow = HippoHTTPFlow.from_state(fake_flow.get_state(), self.session_manager)
# Half the time LLSD is sent with a random Content-Type and no PI indicating
@@ -63,7 +64,7 @@ content-length: 33\r
</llsd>
""")
def test_flow_state_serde(self):
async def test_flow_state_serde(self):
fake_flow = tflow.tflow(req=tutils.treq(host="example.com"), resp=tutils.tresp())
flow = HippoHTTPFlow.from_state(fake_flow.get_state(), self.session_manager)
# Make sure cap resolution works correctly
@@ -72,7 +73,7 @@ content-length: 33\r
new_flow = HippoHTTPFlow.from_state(flow_state, self.session_manager)
self.assertIs(self.session, new_flow.cap_data.session())
def test_http_asset_repo(self):
async def test_http_asset_repo(self):
asset_repo = self.session_manager.asset_repo
asset_id = asset_repo.create_asset(b"foobar", one_shot=True)
req = tutils.treq(host="assets.example.com", path=f"/?animatn_id={asset_id}")
@@ -83,9 +84,9 @@ content-length: 33\r
self.assertTrue(asset_repo.try_serve_asset(flow))
self.assertEqual(b"foobar", flow.response.content)
def test_temporary_cap_resolution(self):
self.region.register_temporary_cap("TempExample", "http://not.example.com")
self.region.register_temporary_cap("TempExample", "http://not2.example.com")
async def test_temporary_cap_resolution(self):
self.region.register_cap("TempExample", "http://not.example.com", CapType.TEMPORARY)
self.region.register_cap("TempExample", "http://not2.example.com", CapType.TEMPORARY)
# Resolving the cap should consume it
cap_data = self.session_manager.resolve_cap("http://not.example.com")
self.assertEqual(cap_data.cap_name, "TempExample")


@@ -130,7 +130,7 @@ class MessageFilterTests(unittest.IsolatedAsyncioTestCase):
# Make sure numbers outside 32bit range come through
self.assertTrue(self._filter_matches("Foo.Bar.Foo == 0xFFffFFffFF", msg))
def test_http_flow(self):
async def test_http_flow(self):
session_manager = SessionManager(ProxySettings())
fake_flow = tflow.tflow(req=tutils.treq(), resp=tutils.tresp())
fake_flow.metadata["cap_data_ser"] = SerializedCapData(
@@ -141,7 +141,17 @@ class MessageFilterTests(unittest.IsolatedAsyncioTestCase):
self.assertTrue(self._filter_matches("FakeCap", entry))
self.assertFalse(self._filter_matches("NotFakeCap", entry))
def test_export_import_http_flow(self):
async def test_http_header_filter(self):
session_manager = SessionManager(ProxySettings())
fake_flow = tflow.tflow(req=tutils.treq(), resp=tutils.tresp())
fake_flow.request.headers["Cookie"] = 'foo="bar"'
flow = HippoHTTPFlow.from_state(fake_flow.get_state(), session_manager)
entry = HTTPMessageLogEntry(flow)
# The header map is case-insensitive!
self.assertTrue(self._filter_matches('Meta.ReqHeaders.cookie ~= "foo"', entry))
self.assertFalse(self._filter_matches('Meta.ReqHeaders.foobar ~= "foo"', entry))
async def test_export_import_http_flow(self):
fake_flow = tflow.tflow(req=tutils.treq(), resp=tutils.tresp())
fake_flow.metadata["cap_data_ser"] = SerializedCapData(
cap_name="FakeCap",


@@ -17,19 +17,19 @@ class MockedProxyCircuit(ProxiedCircuit):
self.in_injections = InjectionTracker(0, maxlen=10)
def _send_prepared_message(self, msg: Message, transport=None):
self.sent_simple.append((msg.packet_id, msg.name, msg.direction, msg.injected, msg.acks))
self.sent_simple.append((msg.packet_id, msg.name, msg.direction, msg.synthetic, msg.acks))
self.sent_msgs.append(msg)
class PacketIDTests(unittest.TestCase):
class PacketIDTests(unittest.IsolatedAsyncioTestCase):
def setUp(self) -> None:
self.circuit = MockedProxyCircuit()
def _send_message(self, msg, outgoing=True):
msg.direction = Direction.OUT if outgoing else Direction.IN
return self.circuit.send_message(msg)
return self.circuit.send(msg)
def test_basic(self):
async def test_basic(self):
self._send_message(Message('ChatFromViewer', packet_id=1))
self._send_message(Message('ChatFromViewer', packet_id=2))
@@ -38,7 +38,7 @@ class PacketIDTests(unittest.TestCase):
(2, "ChatFromViewer", Direction.OUT, False, ()),
))
def test_inject(self):
async def test_inject(self):
self._send_message(Message('ChatFromViewer', packet_id=1))
self._send_message(Message('ChatFromViewer'))
self._send_message(Message('ChatFromViewer', packet_id=2))
@@ -49,7 +49,7 @@ class PacketIDTests(unittest.TestCase):
(3, "ChatFromViewer", Direction.OUT, False, ()),
))
def test_max_injected(self):
async def test_max_injected(self):
self._send_message(Message('ChatFromViewer', packet_id=1))
for _ in range(5):
self._send_message(Message('ChatFromViewer'))
@@ -74,7 +74,7 @@ class PacketIDTests(unittest.TestCase):
# Make sure we're still able to get the original ID
self.assertEqual(self.circuit.out_injections.get_original_id(15), 3)
def test_inject_hole_in_sequence(self):
async def test_inject_hole_in_sequence(self):
self._send_message(Message('ChatFromViewer', packet_id=1))
self._send_message(Message('ChatFromViewer'))
self._send_message(Message('ChatFromViewer', packet_id=4))
@@ -87,7 +87,7 @@ class PacketIDTests(unittest.TestCase):
(6, "ChatFromViewer", Direction.OUT, True, ()),
))
def test_inject_misordered(self):
async def test_inject_misordered(self):
self._send_message(Message('ChatFromViewer', packet_id=2))
self._send_message(Message('ChatFromViewer'))
self._send_message(Message('ChatFromViewer', packet_id=1))
@@ -98,7 +98,7 @@ class PacketIDTests(unittest.TestCase):
(1, "ChatFromViewer", Direction.OUT, False, ()),
])
def test_inject_multiple(self):
async def test_inject_multiple(self):
self._send_message(Message('ChatFromViewer', packet_id=1))
self._send_message(Message('ChatFromViewer'))
self._send_message(Message('ChatFromViewer'))
@@ -115,7 +115,7 @@ class PacketIDTests(unittest.TestCase):
(6, "ChatFromViewer", Direction.OUT, True, ()),
])
def test_packet_ack_field_converted(self):
async def test_packet_ack_field_converted(self):
self._send_message(Message('ChatFromViewer', packet_id=1))
self._send_message(Message('ChatFromViewer'))
self._send_message(Message('ChatFromViewer'))
@@ -139,7 +139,7 @@ class PacketIDTests(unittest.TestCase):
(6, "ChatFromViewer", Direction.OUT, True, ()),
])
def test_packet_ack_proxied_message_converted(self):
async def test_packet_ack_proxied_message_converted(self):
self._send_message(Message('ChatFromViewer', packet_id=1))
self._send_message(Message('ChatFromViewer'))
self._send_message(Message('ChatFromViewer'))
@@ -176,12 +176,9 @@ class PacketIDTests(unittest.TestCase):
self.assertEqual(self.circuit.sent_msgs[5]["Packets"][0]["ID"], 2)
def test_drop_proxied_message(self):
async def test_drop_proxied_message(self):
self._send_message(Message('ChatFromViewer', packet_id=1))
self.circuit.drop_message(
Message('ChatFromViewer', packet_id=2, flags=PacketFlags.RELIABLE),
Direction.OUT,
)
self.circuit.drop_message(Message('ChatFromViewer', packet_id=2, flags=PacketFlags.RELIABLE))
self._send_message(Message('ChatFromViewer', packet_id=3))
self.assertSequenceEqual(self.circuit.sent_simple, [
@@ -191,12 +188,9 @@ class PacketIDTests(unittest.TestCase):
])
self.assertEqual(self.circuit.sent_msgs[1]["Packets"][0]["ID"], 2)
def test_unreliable_proxied_message(self):
async def test_unreliable_proxied_message(self):
self._send_message(Message('ChatFromViewer', packet_id=1))
self.circuit.drop_message(
Message('ChatFromViewer', packet_id=2),
Direction.OUT,
)
self.circuit.drop_message(Message('ChatFromViewer', packet_id=2))
self._send_message(Message('ChatFromViewer', packet_id=3))
self.assertSequenceEqual(self.circuit.sent_simple, [
@@ -204,15 +198,12 @@ class PacketIDTests(unittest.TestCase):
(3, "ChatFromViewer", Direction.OUT, False, ()),
])
def test_dropped_proxied_message_acks_sent(self):
async def test_dropped_proxied_message_acks_sent(self):
self._send_message(Message('ChatFromViewer', packet_id=1))
self._send_message(Message('ChatFromViewer', packet_id=2))
self._send_message(Message('ChatFromViewer', packet_id=3))
self._send_message(Message('ChatFromSimulator'), outgoing=False)
self.circuit.drop_message(
Message('ChatFromViewer', packet_id=4, acks=(4,)),
Direction.OUT,
)
self.circuit.drop_message(Message('ChatFromViewer', packet_id=4, acks=(4,)))
self._send_message(Message('ChatFromViewer', packet_id=5))
self.assertSequenceEqual(self.circuit.sent_simple, [
@@ -229,8 +220,8 @@ class PacketIDTests(unittest.TestCase):
# We injected an incoming packet, so "4" is really "3"
self.assertEqual(self.circuit.sent_msgs[4]["Packets"][0]["ID"], 3)
def test_resending_or_dropping(self):
self.circuit.send_message(Message('ChatFromViewer', packet_id=1))
async def test_resending_or_dropping(self):
self.circuit.send(Message('ChatFromViewer', packet_id=1))
to_drop = Message('ChatFromViewer', packet_id=2, flags=PacketFlags.RELIABLE)
self.circuit.drop_message(to_drop)
with self.assertRaises(RuntimeError):
@@ -238,12 +229,72 @@ class PacketIDTests(unittest.TestCase):
self.circuit.drop_message(to_drop)
# Returns a new message without finalized flag
new_msg = to_drop.take()
self.circuit.send_message(new_msg)
self.circuit.send(new_msg)
with self.assertRaises(RuntimeError):
self.circuit.send_message(new_msg)
self.circuit.send(new_msg)
self.assertSequenceEqual(self.circuit.sent_simple, [
(1, "ChatFromViewer", Direction.OUT, False, ()),
(1, "PacketAck", Direction.IN, True, ()),
# ended up getting the same packet ID when injected
(2, "ChatFromViewer", Direction.OUT, True, ()),
])
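The send/drop/`take()` semantics exercised above (a message may be sent or dropped exactly once, and `take()` hands back a fresh, unfinalized copy) can be sketched roughly as follows. This is a simplified stand-in, not hippolyzer's real `Message` or circuit classes:

```python
import copy


class Message:
    """Minimal sketch of finalize-on-send semantics (shape is hypothetical)."""

    def __init__(self, name: str):
        self.name = name
        self.finalized = False

    def take(self) -> "Message":
        # Return an unfinalized copy; the original stays finalized
        new = copy.deepcopy(self)
        new.finalized = False
        return new


class Circuit:
    def send(self, msg: Message) -> None:
        # A finalized message was already sent or dropped once
        if msg.finalized:
            raise RuntimeError("Message was already sent or dropped")
        msg.finalized = True
```

Under this model, re-sending the same object raises, but sending the result of `take()` succeeds, matching the `assertRaises(RuntimeError)` checks in the test.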
async def test_reliable_unacked_queueing(self):
self._send_message(Message('ChatFromViewer', flags=PacketFlags.RELIABLE))
self._send_message(Message('ChatFromViewer', flags=PacketFlags.RELIABLE, packet_id=2))
# Only the first, injected message should be queued for resends
self.assertEqual({(Direction.OUT, 1)}, set(self.circuit.unacked_reliable))
async def test_reliable_resend_cadence(self):
self._send_message(Message('ChatFromViewer', flags=PacketFlags.RELIABLE))
resend_info = self.circuit.unacked_reliable[(Direction.OUT, 1)]
self.circuit.resend_unacked()
# Should have been too soon to retry
self.assertEqual(10, resend_info.tries_left)
# Switch to allowing resends every 0s
self.circuit.resend_every = 0.0
self.circuit.resend_unacked()
self.assertSequenceEqual(self.circuit.sent_simple, [
(1, "ChatFromViewer", Direction.OUT, True, ()),
# Should have resent
(1, "ChatFromViewer", Direction.OUT, True, ()),
])
self.assertEqual(9, resend_info.tries_left)
for _ in range(resend_info.tries_left):
self.circuit.resend_unacked()
# Should have used up all the retry attempts and been kicked out of the retry queue
self.assertEqual(set(), set(self.circuit.unacked_reliable))
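The cadence test above implies a retry queue where each unacked reliable packet carries a `tries_left` budget and is only resent after `resend_every` seconds, being evicted once its retries are exhausted. A minimal sketch of that loop, with hypothetical names (`ResendInfo`, `RetryQueue` are stand-ins, not hippolyzer's actual types):

```python
import time
from dataclasses import dataclass, field


@dataclass
class ResendInfo:
    message: object
    tries_left: int = 10
    last_sent: float = field(default_factory=time.monotonic)


class RetryQueue:
    def __init__(self, resend_every: float = 3.0):
        self.resend_every = resend_every
        self.unacked: dict[int, ResendInfo] = {}

    def resend_unacked(self, send) -> None:
        now = time.monotonic()
        for packet_id, info in list(self.unacked.items()):
            if now - info.last_sent < self.resend_every:
                continue  # too soon to retry this one
            if info.tries_left <= 0:
                # Out of retries, kick it out of the queue
                del self.unacked[packet_id]
                continue
            info.tries_left -= 1
            info.last_sent = now
            send(info.message)
```

Setting `resend_every = 0.0`, as the test does, makes every call to `resend_unacked()` eligible to retry immediately.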
async def test_reliable_ack_collection(self):
msg = Message('ChatFromViewer', flags=PacketFlags.RELIABLE)
fut = self.circuit.send_reliable(msg)
self.assertEqual(1, len(self.circuit.unacked_reliable))
# Shouldn't count, this is an ACK going in the wrong direction!
ack_msg = Message("PacketAck", Block("Packets", ID=msg.packet_id))
self.circuit.collect_acks(ack_msg)
self.assertEqual(1, len(self.circuit.unacked_reliable))
self.assertFalse(fut.done())
# But it should count if the ACK message is heading in
ack_msg.direction = Direction.IN
self.circuit.collect_acks(ack_msg)
self.assertEqual(0, len(self.circuit.unacked_reliable))
self.assertTrue(fut.done())
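The direction check being tested (an outbound `PacketAck` must not acknowledge our own outbound reliable packets; only inbound ACKs count) can be sketched with a future per unacked packet. Class and method names here are illustrative stand-ins:

```python
import asyncio
from enum import Enum


class Direction(Enum):
    OUT = 0
    IN = 1


class AckCollector:
    """Tracks reliable packets awaiting ACKs, resolving one future per packet."""

    def __init__(self):
        self.unacked: dict[int, asyncio.Future] = {}

    def send_reliable(self, packet_id: int) -> asyncio.Future:
        fut = asyncio.get_running_loop().create_future()
        self.unacked[packet_id] = fut
        return fut

    def collect_acks(self, acked_ids, direction: Direction) -> None:
        # Only ACKs coming *in* can acknowledge packets we sent *out*
        if direction != Direction.IN:
            return
        for packet_id in acked_ids:
            fut = self.unacked.pop(packet_id, None)
            if fut is not None and not fut.done():
                fut.set_result(None)
```

The future lets a caller `await` delivery confirmation, which is what `send_reliable()` returning a future enables in the test.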
async def test_start_ping_check(self):
# Should not break if no unacked
self._send_message(Message(
"StartPingCheck",
Block("PingID", PingID=0, OldestUnacked=20),
packet_id=5,
))
injected_msg = Message('ChatFromViewer', flags=PacketFlags.RELIABLE)
self._send_message(injected_msg)
self._send_message(Message(
"StartPingCheck",
Block("PingID", PingID=0, OldestUnacked=20),
packet_id=8,
))
# Oldest unacked should have been replaced with the injected packet's ID, it's older!
self.assertEqual(self.circuit.sent_msgs[2]["PingID"]["OldestUnacked"], injected_msg.packet_id)


@@ -55,8 +55,8 @@ class ObjectTrackingAddon(BaseAddon):
class ObjectManagerTestMixin(BaseProxyTest):
def setUp(self) -> None:
super().setUp()
async def asyncSetUp(self) -> None:
await super().asyncSetUp()
self._setup_default_circuit()
self.region = self.session.main_region
self.message_handler = WrappingMessageHandler(self.region)
@@ -418,13 +418,13 @@ class RegionObjectManagerTests(ObjectManagerTestMixin, unittest.IsolatedAsyncioT
'AngularVelocity': Vector3(0.0, 0.0, 0.0791015625),
'TreeSpecies': None,
'ScratchPad': None,
'Text': None,
'TextColor': None,
'MediaURL': None,
'Sound': None,
'SoundGain': None,
'SoundFlags': None,
'SoundRadius': None,
'Text': b'',
'TextColor': b'',
'MediaURL': b'',
'Sound': UUID(),
'SoundGain': 0.0,
'SoundFlags': 0,
'SoundRadius': 0.0,
'NameValue': [],
'PathCurve': 32,
'ProfileCurve': 0,
@@ -505,8 +505,8 @@ class RegionObjectManagerTests(ObjectManagerTestMixin, unittest.IsolatedAsyncioT
class SessionObjectManagerTests(ObjectManagerTestMixin, unittest.IsolatedAsyncioTestCase):
def setUp(self) -> None:
super().setUp()
async def asyncSetUp(self) -> None:
await super().asyncSetUp()
self.second_region = self.session.register_region(
("127.0.0.1", 9), "https://localhost:5", 124
)


@@ -1,9 +1,11 @@
import math
import unittest
import hippolyzer.lib.base.serialization as se
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.datatypes import UUID, Vector3
from hippolyzer.lib.base.message.message_formatting import HumanMessageSerializer
from hippolyzer.lib.base.templates import TextureEntrySubfieldSerializer, TEFaceBitfield, TextureEntry
from hippolyzer.lib.base.templates import TextureEntrySubfieldSerializer, TEFaceBitfield, TextureEntryCollection, \
PackedTERotation, TextureEntry
EXAMPLE_TE = b'\x89UgG$\xcbC\xed\x92\x0bG\xca\xed\x15F_\x08\xca*\x98:\x18\x02,\r\xf4\x1e\xc6\xf5\x91\x01]\x83\x014' \
b'\x00\x90i+\x10\x80\xa1\xaa\xa2g\x11o\xa8]\xc6\x00\x00\x00\x00\x00\x00\x00\x00\x80?\x00\x00\x00\x80?' \
@@ -12,12 +14,24 @@ EXAMPLE_TE = b'\x89UgG$\xcbC\xed\x92\x0bG\xca\xed\x15F_\x08\xca*\x98:\x18\x02,\r
class TemplateTests(unittest.TestCase):
def test_te_round_trips(self):
deserialized = TextureEntrySubfieldSerializer.deserialize(None, EXAMPLE_TE)
serialized = TextureEntrySubfieldSerializer.serialize(None, deserialized)
self.assertEqual(EXAMPLE_TE, serialized)
def test_realize_te(self):
deserialized: TextureEntryCollection = TextureEntrySubfieldSerializer.deserialize(None, EXAMPLE_TE)
realized = deserialized.realize(4)
self.assertEqual(UUID('ca2a983a-1802-2c0d-f41e-c6f591015d83'), realized[3].Textures)
self.assertEqual(UUID('89556747-24cb-43ed-920b-47caed15465f'), realized[1].Textures)
with self.assertRaises(ValueError):
deserialized.realize(3)
def test_tecollection_from_tes(self):
deserialized: TextureEntryCollection = TextureEntrySubfieldSerializer.deserialize(None, EXAMPLE_TE)
# The TE collection should re-serialize to the same collection when split up and regrouped
self.assertEqual(deserialized, TextureEntryCollection.from_tes(deserialized.realize(4)))
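The `realize()` calls above expand a compact default-plus-overrides representation into one entry per face, raising `ValueError` when an override targets a face outside the requested count. A hypothetical helper illustrating that expansion (not the real `TextureEntryCollection.realize()` implementation):

```python
def realize_faces(fields: dict, num_faces: int) -> list[dict]:
    """Expand {field: {face_index_or_None: value}} into one dict per face.

    The None key holds the default applied to every face; integer keys
    are per-face overrides.
    """
    for per_face in fields.values():
        for key in per_face:
            if key is not None and key >= num_faces:
                raise ValueError(
                    f"Override for face {key} outside 0..{num_faces - 1}"
                )
    return [
        {name: per_face.get(i, per_face[None]) for name, per_face in fields.items()}
        for i in range(num_faces)
    ]
```

This mirrors why `deserialized.realize(3)` fails in the test: the example TE carries an override for face 3, which cannot exist in a 3-face realization.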
def test_face_bitfield_round_trips(self):
test_val = b"\x81\x03"
reader = se.BufferReader("!", test_val)
@@ -37,9 +51,9 @@ class TemplateTests(unittest.TestCase):
'Color': {None: b'\xff\xff\xff\xff'},
'ScalesS': {None: 1.0},
'ScalesT': {None: 1.0},
'OffsetsS': {None: 0},
'OffsetsT': {None: 0},
'Rotation': {None: 0},
'OffsetsS': {None: 0.0},
'OffsetsT': {None: 0.0},
'Rotation': {None: 0.0},
'BasicMaterials': {None: {'Bump': 0, 'FullBright': False, 'Shiny': 'OFF'}},
'MediaFlags': {None: {'WebPage': False, 'TexGen': 'DEFAULT', '_Unused': 0}}, 'Glow': {None: 0},
'Materials': {None: '00000000-0000-0000-0000-000000000000'},
@@ -62,8 +76,56 @@ class TemplateTests(unittest.TestCase):
# Serialization order and format should match indra's exactly
self.assertEqual(EXAMPLE_TE, data_field)
deser = spec.deserialize(None, data_field, pod=True)
self.assertEqual(deser, pod_te)
self.assertEqual(pod_te, deser)
def test_textureentry_defaults(self):
te = TextureEntry()
te = TextureEntryCollection()
self.assertEqual(UUID('89556747-24cb-43ed-920b-47caed15465f'), te.Textures[None])
def test_textureentry_rotation_packing(self):
writer = se.BufferWriter("!")
writer.write(PackedTERotation(), math.pi * 2)
# fmod() makes this loop back around to 0
self.assertEqual(b"\x00\x00", writer.copy_buffer())
writer.clear()
writer.write(PackedTERotation(), -math.pi * 2)
# fmod() makes this loop back around to 0
self.assertEqual(b"\x00\x00", writer.copy_buffer())
writer.clear()
writer.write(PackedTERotation(), 0)
self.assertEqual(b"\x00\x00", writer.copy_buffer())
writer.clear()
# These both map to -32768 because of overflow in the positive case
# that isn't caught by exact equality to math.pi * 2
writer.write(PackedTERotation(), math.pi * 1.999999)
self.assertEqual(b"\x80\x00", writer.copy_buffer())
writer.clear()
writer.write(PackedTERotation(), math.pi * -1.999999)
self.assertEqual(b"\x80\x00", writer.copy_buffer())
writer.clear()
def test_textureentry_rotation_unpacking(self):
reader = se.BufferReader("!", b"\x00\x00")
self.assertEqual(0, reader.read(PackedTERotation()))
reader = se.BufferReader("!", b"\x80\x00")
self.assertEqual(-math.pi * 2, reader.read(PackedTERotation()))
# This quantization method does not allow for any representation of
# F_TWO_PI itself, just a value slightly below it! The float representation
# is ever so slightly different from the C++ version, but it should still
# round-trip correctly.
reader = se.BufferReader("!", b"\x7f\xff")
self.assertEqual(6.282993559581101, reader.read(PackedTERotation()))
writer = se.BufferWriter("!")
writer.write(PackedTERotation(), 6.282993559581101)
self.assertEqual(b"\x7f\xff", writer.copy_buffer())
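The quantization behavior these tests pin down (an `fmod()` wrap into `(-2π, 2π)`, a scale onto a signed 16-bit range, and positive values just under 2π overflowing to -32768) can be sketched directly. This is a reconstruction from the test's expected values, not the actual `PackedTERotation` code:

```python
import math
import struct

TWO_PI = math.pi * 2


def pack_te_rotation(rot: float) -> bytes:
    # Wrap into (-2*pi, 2*pi); exact multiples of 2*pi collapse to 0
    rot = math.fmod(rot, TWO_PI)
    # Quantize to a signed 16-bit int; 32768 overflows to -32768, which
    # is why values just under 2*pi land on the -2*pi code point
    packed = round(rot * 32768 / TWO_PI)
    if packed >= 32768:
        packed -= 65536
    return struct.pack("!h", packed)


def unpack_te_rotation(data: bytes) -> float:
    (packed,) = struct.unpack("!h", data)
    return packed * TWO_PI / 32768
```

Note the asymmetry the tests highlight: `-2π` is representable (code point -32768) but `+2π` itself is not; the largest positive code, 32767, decodes to a value just below 2π.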
def test_textureentry_st_to_uv_coords(self):
te = TextureEntry(ScalesS=0.5, ScalesT=0.5, OffsetsS=-0.25, OffsetsT=0.25, Rotation=math.pi / 2)
self.assertEqual(Vector3(0.25, 0.75), te.st_to_uv(Vector3(0.5, 0.5)))
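The ST-to-UV conversion being tested applies the face's scale, rotation, and offsets to a texture coordinate. One plausible formulation, assuming scaling and rotation happen about the texture center (0.5, 0.5) before the offsets are added; the single test point sits at the center, so it does not distinguish between possible operation orders, and this sketch should not be read as the exact `st_to_uv()` implementation:

```python
import math


def st_to_uv(st, scale_s, scale_t, offset_s, offset_t, rotation):
    # Center the ST coordinate about the texture midpoint
    s, t = st[0] - 0.5, st[1] - 0.5
    # Scale, then rotate about the center, then translate by the offsets
    s *= scale_s
    t *= scale_t
    cos_r, sin_r = math.cos(rotation), math.sin(rotation)
    u = s * cos_r + t * sin_r
    v = -s * sin_r + t * cos_r
    return (u + offset_s + 0.5, v + offset_t + 0.5)
```

For the test's parameters (scale 0.5, offsets -0.25/+0.25, rotation π/2), the center point (0.5, 0.5) maps to (0.25, 0.75): the scale and rotation are no-ops at the center, leaving only the offset translation.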