70 Commits

Author SHA1 Message Date
Salad Dais
b2f0de2db5 v0.5.0 2021-05-21 23:48:29 +00:00
Salad Dais
0b0e031091 Run Flake8 in CI 2021-05-21 19:02:15 +00:00
Salad Dais
4eeac738dc Clean up linter warnings 2021-05-21 19:00:06 +00:00
Salad Dais
d9416363b3 Add flake8 config 2021-05-21 18:58:15 +00:00
Salad Dais
5906140921 Make Monochrome example addon work with bakes on mesh 2021-05-20 20:42:17 +00:00
Salad Dais
58932e585e Add better ObjectUpdate change detection 2021-05-20 20:42:17 +00:00
Salad Dais
b9f8ce0da2 Update readme 2021-05-20 20:42:17 +00:00
Salad Dais
67aa5e6bcd Possible fix for flaky tests 2021-05-19 22:26:18 +00:00
Salad Dais
2a05529ceb Fix bad directive in pytest workflow 2021-05-19 22:20:41 +00:00
Salad Dais
a97aa88cc9 Add integration tests for MITMProxyEventManager 2021-05-19 22:14:27 +00:00
Salad Dais
febc0793f2 Add more HTTP flow tests 2021-05-19 20:44:28 +00:00
Salad Dais
141eb3afcd Add more HTTP request logging tests 2021-05-19 06:11:53 +00:00
Salad Dais
517888b1fa Fix missing import for byte escaping 2021-05-19 01:07:37 +00:00
Salad Dais
376b100ed9 Asset server proxying speedups
Should help with #7, will need to check on Windows.
2021-05-17 07:39:26 +00:00
Salad Dais
07fbec47e1 Fix autocompletion for enums used in subfields 2021-05-17 02:12:37 +00:00
Salad Dais
7836527305 Add NameCache so CoarseLocation-only Avatars can be named 2021-05-17 01:50:40 +00:00
Salad Dais
21b18b7a52 Make new base classes for enum and flag with pretty repr() 2021-05-16 17:35:23 +00:00
Salad Dais
28b09144f2 Add Avatar wrapper class for Avatar PCoded Objects
Must be specifically requested through lookup_avatar or all_avatars
Includes Avatars known either through CoarseLocationUpdates or ObjectUpdates
2021-05-16 00:05:28 +00:00
Salad Dais
1e13fede82 Minor changes to avatar position accessor, add tests 2021-05-15 21:28:29 +00:00
Salad Dais
1bfb719f08 Run tests on PRs 2021-05-15 20:01:04 +00:00
gwigz
e5b63f7550 Add basic support for coarse locations (#8) 2021-05-15 15:40:40 -03:00
Salad Dais
91328ac448 Add bodypart creation example, make short uploads take short path 2021-05-15 05:17:49 +00:00
Salad Dais
46dbacd475 Fix order of arg-only, kwarg-only specifiers 2021-05-14 04:04:35 +00:00
Salad Dais
187742c20a Fix typo in comment 2021-05-14 04:03:00 +00:00
Salad Dais
5eae956750 Add support for asset upload via xfer
Still needed for shapes.
2021-05-14 04:01:33 +00:00
Salad Dais
37e8f8a20e Add TeleportFlags enum 2021-05-14 04:01:33 +00:00
Salad Dais
b3125f3231 Minor changes to Transfer / Xfer 2021-05-13 00:22:16 +00:00
Salad Dais
46fed98d6a Add note about why Connection: close is there
I forgot.
2021-05-12 20:22:47 +00:00
Salad Dais
3b5938cf5c Better inbound RequestXfer filter 2021-05-12 19:57:12 +00:00
Salad Dais
c7aeb03ea4 Allow shape Xfers through 2021-05-12 05:43:41 +00:00
Salad Dais
ab1bd16b5c whitespace cleanup 2021-05-11 22:00:02 +00:00
Salad Dais
0412ca5019 v0.4.1 2021-05-11 18:49:52 +00:00
Salad Dais
4d238c8dc8 Update readme to mention Windows SOCKS wrapper
Closes #6
2021-05-11 18:49:11 +00:00
Salad Dais
3bcc510cfd Handle Windows config dirs in the roaming profile 2021-05-11 09:55:04 +00:00
Salad Dais
0d9593e14c v0.4.0 2021-05-08 01:44:13 +00:00
Salad Dais
28dfe2f1b2 Allow filter identifiers with underscores, fixes enum filters 2021-05-08 01:32:57 +00:00
Salad Dais
c8f7231eae Fix message log match highlighting 2021-05-08 01:27:11 +00:00
Salad Dais
00e9ecb765 Allow flag or enum references in filter expressions 2021-05-08 00:45:02 +00:00
Salad Dais
2892bbeb98 Add note about how object handling could be improved 2021-05-07 23:05:31 +00:00
Salad Dais
28f57a8836 More mesh documentation 2021-05-07 20:09:05 +00:00
Salad Dais
943b8b11d5 Improve KillObject handling
KillObject should kill the hierarchy. This brings us closer
to indra object handling semantics.
2021-05-07 19:47:49 +00:00
Salad Dais
88915dd8d7 Better handling of object LocalID changes 2021-05-07 05:38:27 +00:00
Salad Dais
60b39e27f8 Add note about attachment tp out / in brokenness 2021-05-07 04:49:49 +00:00
Salad Dais
8af87befbd Make it less annoying to pickle messages 2021-05-06 02:41:12 +00:00
Salad Dais
95e34bb07a Add a few tests for HTTP flow wrappers 2021-05-05 22:25:03 +00:00
Salad Dais
106eb5c063 Fix typo in CI YAML 2021-05-05 21:35:07 +00:00
Salad Dais
e7f88eeed9 Add tests for CapsClient 2021-05-05 21:30:01 +00:00
Salad Dais
d07f100452 Update codecov.yml 2021-05-05 17:37:52 +00:00
Salad Dais
02c212e4a6 Highlight matched line when matching on specific var values
Very helpful for debugging ObjectUpdates, which are high-frequency
and have many different objects in a single message.

Just the first line of the var for now. Need to be smarter about
how we build the blocks in the message text if we want to highlight
the whole thing.
2021-05-05 04:15:35 +00:00
Salad Dais
8989843042 v0.3.2 2021-05-04 15:42:27 +00:00
Salad Dais
a217a30133 Log message after addon hooks have run
This used to be the behaviour, but switching from queueing to
immediately adding messages to the log removed the implicit delay.
2021-05-04 03:01:18 +00:00
Salad Dais
8514d7bae8 Update readme 2021-05-04 00:10:17 +00:00
Salad Dais
d9084c3332 Include licenses in Windows bundles 2021-05-04 00:09:07 +00:00
Salad Dais
0f35cc00d5 Allow manually triggering windows build 2021-05-03 23:40:00 +00:00
Salad Dais
a6a7ce8fa3 Correct codecov threshold 2021-05-03 23:36:59 +00:00
Salad Dais
269a1e163b Don't fail commits on coverage dropping 2021-05-03 23:33:33 +00:00
Salad Dais
eb2b6ee870 Package a zip for Windows when a release is made 2021-05-03 23:20:40 +00:00
Salad Dais
79a4f72558 v0.3.1 2021-05-03 17:37:22 +00:00
Salad Dais
6316369e1a Don't fail CI if coverage drops 2021-05-03 17:36:37 +00:00
Salad Dais
1b0272f3b3 WIP cx_Freeze support 2021-05-03 17:28:42 +00:00
Salad Dais
aedc2bf48c Fix CapType resolution 2021-05-03 17:09:57 +00:00
Salad Dais
5d3fd69e35 Add badges 2021-05-03 15:05:37 +00:00
Salad Dais
ae464f2c06 Track code coverage on codecov 2021-05-03 14:49:48 +00:00
Salad Dais
7d303d2bca v0.3 2021-05-03 03:04:22 +00:00
Salad Dais
dda3759028 Speed up Object tracking
Fixes #4
2021-05-03 02:59:50 +00:00
Salad Dais
d4e1a7a070 Fix queue consumption under 3.9 2021-05-03 02:07:03 +00:00
Salad Dais
d401842eef Tuned GC threshold 2021-05-03 01:15:17 +00:00
Salad Dais
1e4060f49c Faster message logging, improved queue usage 2021-05-03 01:14:54 +00:00
Salad Dais
a6c7f996ba Don't override message log's clear() method 2021-05-02 19:04:01 +00:00
Salad Dais
8fb36892cf Split Qt-specific parts out of message logger impl 2021-05-02 18:13:16 +00:00
63 changed files with 2291 additions and 1103 deletions

.github/workflows/bundle_windows.yml

@@ -0,0 +1,46 @@
# Have to manually unzip this (it gets double zipped) and add it
# onto the release after it gets created. Don't want actions with repo write.
name: Bundle Windows EXE

on:
  # Only trigger on release creation
  release:
    types:
      - created
  workflow_dispatch:

jobs:
  build:
    runs-on: windows-latest
    strategy:
      matrix:
        python-version: [3.9]
    steps:
      - uses: actions/checkout@v2
      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v2
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -e .
          pip install cx_freeze
      - name: Bundle with cx_Freeze
        run: |
          python setup_cxfreeze.py build_exe
          pip install pip-licenses
          pip-licenses --format=plain-vertical --with-license-file --no-license-path --output-file=lib_licenses.txt
          python setup_cxfreeze.py finalize_cxfreeze
      - name: Upload the artifact
        uses: actions/upload-artifact@v2
        with:
          name: hippolyzer-gui-windows-${{ github.sha }}
          path: ./dist/**


@@ -6,6 +6,8 @@ on:
  release:
    types:
      - created
  workflow_dispatch:

# based on https://github.com/pypa/gh-action-pypi-publish


@@ -1,6 +1,6 @@
name: Run Python Tests

on: [push, pull_request]

jobs:
  build:
@@ -12,16 +12,35 @@ jobs:
    steps:
      - uses: actions/checkout@v2
      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v2
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -r requirements.txt
          pip install -r requirements-test.txt
      - name: Run Flake8
        run: |
          flake8 .
      - name: Test with pytest
        # Tests are intentionally covered to detect broken tests.
        run: |
          pytest --cov=./hippolyzer --cov=./tests --cov-report=xml
      # Keep this in a workflow without any other secrets in it.
      - name: Upload coverage to Codecov
        uses: codecov/codecov-action@v1
        with:
          token: ${{ secrets.CODECOV_TOKEN }}
          files: ./coverage.xml
          directory: ./coverage/reports/
          flags: unittests
          env_vars: OS,PYTHON
          name: codecov-umbrella
          fail_ci_if_error: false
          path_to_write_report: ./coverage/codecov_report.txt
          verbose: false

.gitignore

@@ -1,6 +1,7 @@
#use glob syntax
syntax: glob
__pycache__
*.pyc
build/*
*.egg-info


@@ -1,5 +1,7 @@
# Hippolyzer
![Python Test Status](https://github.com/SaladDais/Hippolyzer/workflows/Run%20Python%20Tests/badge.svg) [![codecov](https://codecov.io/gh/SaladDais/Hippolyzer/branch/master/graph/badge.svg?token=HCTFA4RAXX)](https://codecov.io/gh/SaladDais/Hippolyzer)
[Hippolyzer](http://wiki.secondlife.com/wiki/Hippo) is a fork of Linden Lab's abandoned
[PyOGP library](http://wiki.secondlife.com/wiki/PyOGP)
targeting modern Python 3, with a focus on debugging issues in Second Life-compatible
@@ -22,6 +24,9 @@ with low-level SL details. See the [Local Animation addon example](https://githu
![Screenshot of proxy GUI](https://github.com/SaladDais/Hippolyzer/blob/master/static/screenshot.png?raw=true)
## Setup
### From Source
* Python 3.8 or above is **required**. If you're unable to upgrade your system Python package due to
being on a stable distro, you can use [pyenv](https://github.com/pyenv/pyenv) to create
a self-contained Python install with the appropriate version.
@@ -32,6 +37,11 @@ with low-level SL details. See the [Local Animation addon example](https://githu
* * Under Windows it's `<virtualenv_dir>\Scripts\activate.bat`
* Run `pip install hippolyzer`, or run `pip install -e .` in a cloned repo to install an editable version
### Binary Windows Builds
Binary Windows builds are available on the [Releases page](https://github.com/SaladDais/Hippolyzer/releases/).
I don't extensively test these, so building from source is recommended.
## Proxy
A proxy is provided with both a CLI and Qt-based interface. The proxy application wraps a
@@ -52,16 +62,27 @@ the [Alchemy](https://github.com/AlchemyViewer/Alchemy) viewer.
On Linux that would be `~/.firestorm_x64/` if you're using Firestorm.
* * Certificate validation can be disabled entirely through the viewer debug setting `NoVerifySSLCert`,
but this is not recommended.
#### Windows
Windows viewers have broken SOCKS 5 proxy support. To work around that, you need to use a wrapper EXE that
makes the viewer correctly talk to Hippolyzer. Follow the instructions on https://github.com/SaladDais/WinHippoAutoProxy
to start the viewer and run it through Hippolyzer.
The proxy should _not_ be configured through the viewer's own preferences panel; it won't work correctly.
#### OS X & Linux
SOCKS 5 works correctly on these platforms, so you can just configure it through the
`preferences -> network -> proxy settings` panel:
* Start the viewer and configure it to use `127.0.0.1:9061` as a SOCKS proxy and `127.0.0.1:9062` as
an HTTP proxy. You **must** select the option in the viewer to use the HTTP proxy for all HTTP
traffic, or logins will fail.
* Optionally, if you want to reduce HTTP proxy lag you can have asset requests bypass the HTTP proxy by setting
the `no_proxy` env var appropriately. For ex. `no_proxy="asset-cdn.glb.agni.lindenlab.com" ./firestorm`.
* Log in!
![Proxy config in firestorm](https://github.com/SaladDais/Hippolyzer/blob/master/static/proxy_config.png?raw=true)
### Filtering
By default, the proxy's display filter is configured to ignore many high-frequency messages.
@@ -85,6 +106,9 @@ agent's session, you can do `(Meta.AgentID == None || Meta.AgentID == "d929385f-
Vectors can also be compared. This will get any ObjectUpdate variant that occurs within a certain range:
`(*ObjectUpdate*.ObjectData.*Data.Position > (110, 50, 100) && *ObjectUpdate*.ObjectData.*Data.Position < (115, 55, 105))`
If you want to compare against an enum or a flag class defined in `templates.py`, you can just specify its name:
`ViewerEffect.Effect.Type == ViewerEffectType.EFFECT_BEAM`
### Logging
Decoded messages are displayed in the log pane, clicking one will show the request and
@@ -289,12 +313,8 @@ If you are a viewer developer, please put them in a viewer.
## Potential Changes
* Make package-able for PyPI
* GitHub action to build binary packages and pull together licenses bundle
* AISv3 wrapper?
* Higher level wrappers for common things? I don't really need these, so only if people want to write them.
* Highlight matched portion of message in log view, if applicable
* * Remember deep filters and return a map of them, have message formatter return text ranges?
* Move things out of `templates.py`, right now most binary serialization stuff lives there
because it's more convenient for me to hot-reload.
* Ability to add menus?
@@ -307,6 +327,19 @@ This package [includes portions of the Second Life(TM) Viewer Artwork](https://g
Copyright (C) 2008 Linden Research, Inc. The viewer artwork is licensed under the Creative Commons
Attribution-Share Alike 3.0 License.
## Contributing
Ensure that any patches are clean with no unnecessary whitespace or formatting changes, and that you
add new tests for any added functionality.
## Philosophy
With a few notable exceptions, Hippolyzer focuses mainly on decomposition of data, and doesn't
provide many high-level abstractions for interpreting or manipulating that data. It's careful
to only do lossless transforms that are just prettier representations of the data sent
over the wire. Hippolyzer's goal is to help people understand how Second Life actually works;
automatically employing abstractions that hide how SL works is counter to that goal.
## For Client Developers
This section is mostly useful if you're developing a new SL-compatible client from scratch. Clients based
@@ -320,18 +353,20 @@ UDP proxy and an HTTP proxy.
To have your client's traffic proxied through Hippolyzer the general flow is:
* Open a TCP connection to Hippolyzer's SOCKS 5 proxy port
* * This should be done once per logical user session, as Hippolyzer assumes a 1:1 mapping of SOCKS TCP
connections to SL sessions
* Send a UDP associate command without authentication
* The proxy will respond with a host / port pair that UDP messages may be sent through
* At this point you will no longer need to use the TCP connection, but it must be kept
alive until you want to break the UDP association
* Whenever you send a UDP packet to a remote host, you'll need to instead send it to the host / port
from the UDP associate response. A SOCKS 5 header must be prepended to the data indicating the ultimate destination
of the packet
* Any received UDP packets will also have a SOCKS 5 header indicating the real source IP and address
* * When in doubt, check `socks_proxy.py`, `packets.py` and the SOCKS 5 RFC for more info on how to deal with SOCKS.
* * <https://github.com/SaladDais/WinHippoAutoProxy/blob/master/winhippoautoproxy/socks5udphooker.cpp> is a simple
example that wraps around `recvfrom()` and `sendto()` and could be used as a starting point.
* All HTTP requests must be sent through Hippolyzer's HTTP proxy port.
* * You may not need to do any extra plumbing to get this to work if your chosen HTTP client
respects the `HTTP_PROXY` environment variable.
* All HTTPS connections will be encrypted with the proxy's TLS key. You'll need to either add it to whatever
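The associate-and-wrap flow above can be sketched in Python. This is a rough illustration, not Hippolyzer's actual client code: the function names are invented, it assumes the default SOCKS port 9061 mentioned in the setup section, and it skips error handling.

```python
import socket
import struct


def socks5_udp_associate(proxy_host: str = "127.0.0.1", proxy_port: int = 9061):
    """Open the per-session TCP connection and request UDP ASSOCIATE."""
    tcp = socket.create_connection((proxy_host, proxy_port))
    # Greeting: SOCKS version 5, one method offered, method 0x00 (no auth)
    tcp.sendall(b"\x05\x01\x00")
    if tcp.recv(2) != b"\x05\x00":
        raise ConnectionError("proxy refused the no-auth method")
    # UDP ASSOCIATE (CMD 0x03), ATYP=IPv4, DST.ADDR/PORT = 0.0.0.0:0
    tcp.sendall(b"\x05\x03\x00\x01" + socket.inet_aton("0.0.0.0") + struct.pack(">H", 0))
    reply = tcp.recv(10)
    # Reply carries the relay address UDP datagrams should be sent to
    relay_host = socket.inet_ntoa(reply[4:8])
    relay_port = struct.unpack(">H", reply[8:10])[0]
    # Keep `tcp` open for as long as you want the UDP association alive
    return tcp, relay_host, relay_port


def wrap_udp_datagram(dest_ip: str, dest_port: int, payload: bytes) -> bytes:
    """Prepend the SOCKS 5 UDP header: RSV(2) + FRAG(1) + ATYP(1) + ADDR + PORT."""
    return (b"\x00\x00\x00\x01" + socket.inet_aton(dest_ip)
            + struct.pack(">H", dest_port) + payload)
```

Received datagrams carry the same header layout, giving the real source address; see the SOCKS 5 RFC for the non-IPv4 address types.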


@@ -9,23 +9,22 @@ class GreetingAddon(BaseAddon):
    @handle_command()
    async def greetings(self, session: Session, region: ProxiedRegion):
        """Greet everyone around you"""
        our_avatar = region.objects.lookup_avatar(session.agent_id)
        if not our_avatar:
            show_message("Don't have an agent object?")
        # Note that this will only have avatars closeish to your camera. The sim sends
        # KillObjects for avatars that get too far away.
        other_avatars = [o for o in region.objects.all_avatars if o.FullID != our_avatar.FullID]
        if not other_avatars:
            show_message("No other avatars?")
        for other_avatar in other_avatars:
            dist = Vector3.dist(our_avatar.RegionPosition, other_avatar.RegionPosition)
            if dist >= 19.0:
                continue
            if other_avatar.Name is None:
                continue
            send_chat(f"Greetings, {other_avatar.Name}!")


addons = [GreetingAddon()]


@@ -23,8 +23,7 @@ import ctypes
import secrets
from typing import *

import mitmproxy.http

from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.datatypes import *


@@ -37,6 +37,22 @@ from hippolyzer.lib.proxy.templates import TextureEntry
glymur.set_option('lib.num_threads', 4)

# These should never be replaced, they're only used as aliases to tell the viewer
# it should fetch the relevant texture from the appearance service
BAKES_ON_MESH_TEXTURE_IDS = {UUID(x) for x in (
    "5a9f4a74-30f2-821c-b88d-70499d3e7183",
    "ae2de45c-d252-50b8-5c6e-19f39ce79317",
    "24daea5f-0539-cfcf-047f-fbc40b2786ba",
    "52cc6bb6-2ee5-e632-d3ad-50197b1dcb8a",
    "43529ce8-7faa-ad92-165a-bc4078371687",
    "09aac1fb-6bce-0bee-7d44-caac6dbb6c63",
    "ff62763f-d60a-9855-890b-0c96f8f8cd98",
    "8e915e25-31d1-cc95-ae08-d58a47488251",
    "9742065b-19b5-297c-858a-29711d539043",
    "03642e83-2bd1-4eb9-34b4-4c47ed586d2d",
    "edd51b77-fc10-ce7a-4b3d-011dfc349e4f",
)}


def _modify_crc(crc_tweak: int, crc_val: int):
    return ctypes.c_uint32(crc_val ^ crc_tweak).value
@@ -137,6 +153,8 @@ class MonochromeAddon(BaseAddon):
        # and we don't want to change the canonical view.
        parsed_te = copy.deepcopy(parsed_te)
        for k, v in parsed_te.Textures.items():
            if v in BAKES_ON_MESH_TEXTURE_IDS:
                continue
            # Replace textures with their alias to bust the viewer cache
            parsed_te.Textures[k] = tracker.get_alias_uuid(v)
        for k, v in parsed_te.Color.items():
@@ -166,6 +184,8 @@ class MonochromeAddon(BaseAddon):
        orig_texture_id = self.mono_tracker.get_orig_uuid(UUID(texture_id))
        if not orig_texture_id:
            return
        if orig_texture_id in BAKES_ON_MESH_TEXTURE_IDS:
            return
        # The request was for a fake texture ID we created, rewrite the request to
        # request the real asset and mark the flow for modification once we receive


@@ -4,10 +4,9 @@ from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.proxy.packets import Direction
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
from hippolyzer.lib.proxy.templates import IMDialogType
from hippolyzer.lib.proxy.templates import IMDialogType, XferFilePath
SUSPICIOUS_PACKETS = {"RequestXfer", "TransferRequest", "UUIDNameRequest",
"UUIDGroupNameRequest", "OpenCircuit"}
SUSPICIOUS_PACKETS = {"TransferRequest", "UUIDNameRequest", "UUIDGroupNameRequest", "OpenCircuit"}
REGULAR_IM_DIALOGS = (IMDialogType.TYPING_STOP, IMDialogType.TYPING_STOP, IMDialogType.NOTHING_SPECIAL)
@@ -29,6 +28,13 @@ class ShieldAddon(BaseAddon):
            else:
                expected_id = from_agent ^ session.agent_id
            msg_block["ID"] = expected_id
        if message.name == "RequestXfer":
            xfer_block = message["XferID"][0]
            # Don't allow Xfers for files, only assets
            if xfer_block["FilePath"] != XferFilePath.NONE or xfer_block["Filename"].strip(b"\x00"):
                show_message(f"Blocked suspicious {message.name} packet")
                region.circuit.drop_message(message)
                return True
addons = [ShieldAddon()]
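The `expected_id` computation above XORs the sender's agent ID with our own, reflecting how IM session IDs combine the two participants' UUIDs. Hippolyzer's own `UUID` type supports `^` directly; with the stdlib `uuid` type the same check can be sketched like this (the specific UUID values are just examples):

```python
import uuid


def xor_uuids(a: uuid.UUID, b: uuid.UUID) -> uuid.UUID:
    # XOR the 128-bit integer forms of both UUIDs
    return uuid.UUID(int=a.int ^ b.int)


agent = uuid.UUID("d929385f-41e3-4a34-a04e-f1fc39f24f12")
other = uuid.UUID("12345678-1234-1234-1234-123456789abc")
session_id = xor_uuids(agent, other)
# XOR is self-inverse: combining the session ID with our own agent ID
# recovers the other participant's ID, which is what gets verified
assert xor_uuids(session_id, agent) == other
```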


@@ -1,6 +1,7 @@
"""
Example of how to request an Xfer
"""
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.legacy_inv import InventoryModel
from hippolyzer.lib.base.message.message import Block
from hippolyzer.lib.proxy.addon_utils import BaseAddon, show_message
@@ -8,7 +9,7 @@ from hippolyzer.lib.proxy.commands import handle_command
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
from hippolyzer.lib.proxy.templates import XferFilePath, AssetType, InventoryType, WearableType
class XferExampleAddon(BaseAddon):
@@ -60,5 +61,61 @@ class XferExampleAddon(BaseAddon):
        item_names = [item.name for item in inv_model.items.values()]
        show_message(item_names)

    @handle_command()
    async def eyes_for_you(self, session: Session, region: ProxiedRegion):
        """Upload an eye bodypart and create an item for it"""
        asset_data = f"""LLWearable version 22
New Eyes
\tpermissions 0
\t{{
\t\tbase_mask\t7fffffff
\t\towner_mask\t7fffffff
\t\tgroup_mask\t00000000
\t\teveryone_mask\t00000000
\t\tnext_owner_mask\t00082000
\t\tcreator_id\t{session.agent_id}
\t\towner_id\t{session.agent_id}
\t\tlast_owner_id\t00000000-0000-0000-0000-000000000000
\t\tgroup_id\t00000000-0000-0000-0000-000000000000
\t}}
\tsale_info\t0
\t{{
\t\tsale_type\tnot
\t\tsale_price\t10
\t}}
type 3
parameters 2
98 0
99 0
textures 1
3 89556747-24cb-43ed-920b-47caed15465f
"""
        # If we want to create an item containing the asset we need to know the transaction id
        # used to create the asset.
        transaction_id = UUID.random()
        await region.xfer_manager.upload_asset(
            AssetType.BODYPART,
            data=asset_data,
            transaction_id=transaction_id,
        )
        region.circuit.send_message(ProxiedMessage(
            'CreateInventoryItem',
            Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
            Block(
                'InventoryBlock',
                CallbackID=0,
                # Null folder ID will put it in the default folder for the type
                FolderID=UUID(),
                TransactionID=transaction_id,
                NextOwnerMask=0x7fFFffFF,
                Type=AssetType.BODYPART,
                InvType=InventoryType.WEARABLE,
                WearableType=WearableType.EYES,
                Name='Eyes For You',
                Description=b''
            ),
        ))


addons = [XferExampleAddon()]

codecov.yml

@@ -0,0 +1,14 @@
coverage:
  precision: 1
  round: down
  range: "50...80"
  status:
    project:
      default:
        # Do not fail commits if the code coverage drops.
        target: 0%
        threshold: 100%
        base: auto
    patch:
      default:
        only_pulls: true


@@ -1,43 +1,15 @@
import collections
import codecs
import copy
import enum
import fnmatch
import io
import logging
import pickle
import queue
import re
import typing
import weakref
from defusedxml import minidom
from PySide2 import QtCore, QtGui
from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.datatypes import *
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.proxy.region import ProxiedRegion, CapType
import hippolyzer.lib.base.serialization as se
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.sessions import Session, BaseMessageLogger
from .message_filter import compile_filter, BaseFilterNode, MessageFilterNode, MetaFieldSpecifier
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.message_logger import FilteringMessageLogger
LOG = logging.getLogger(__name__)
def bytes_unescape(val: bytes) -> bytes:
    # Only in CPython. bytes -> bytes with escape decoding.
    # https://stackoverflow.com/a/23151714
    return codecs.escape_decode(val)[0]  # type: ignore


def bytes_escape(val: bytes) -> bytes:
    # Try to keep newlines as-is
    return re.sub(rb"(?<!\\)\\n", b"\n", codecs.escape_encode(val)[0])  # type: ignore
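As a quick sanity check, this pair round-trips arbitrary bytes, keeping real newlines readable while leaving literal backslash-n sequences escaped. A self-contained sketch (`escape_encode`/`escape_decode` are CPython-internal, as the comment notes):

```python
import codecs
import re


def bytes_unescape(val: bytes) -> bytes:
    # CPython-only escape decoding, per the StackOverflow link above
    return codecs.escape_decode(val)[0]  # type: ignore


def bytes_escape(val: bytes) -> bytes:
    # Turn escaped "\n" sequences back into real newlines; the negative
    # lookbehind leaves literal backslash-n pairs alone
    return re.sub(rb"(?<!\\)\\n", b"\n", codecs.escape_encode(val)[0])  # type: ignore


raw = b"\x00\xffhello\nworld \\n"
# Escaping then unescaping recovers the original bytes exactly
assert bytes_unescape(bytes_escape(raw)) == raw
```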
class MessageLogHeader(enum.IntEnum):
Host = 0
Type = enum.auto()
@@ -46,582 +18,23 @@ class MessageLogHeader(enum.IntEnum):
Summary = enum.auto()
class AbstractMessageLogEntry:
region: typing.Optional[ProxiedRegion]
session: typing.Optional[Session]
name: str
type: str
__slots__ = ["_region", "_session", "_region_name", "_agent_id", "_summary", "meta"]
def __init__(self, region, session):
if region and not isinstance(region, weakref.ReferenceType):
region = weakref.ref(region)
if session and not isinstance(session, weakref.ReferenceType):
session = weakref.ref(session)
self._region: typing.Optional[weakref.ReferenceType] = region
self._session: typing.Optional[weakref.ReferenceType] = session
self._region_name = None
self._agent_id = None
self._summary = None
if self.region:
self._region_name = self.region.name
if self.session:
self._agent_id = self.session.agent_id
agent_obj = None
if self.region is not None:
agent_obj = self.region.objects.lookup_fullid(self.agent_id)
self.meta = {
"RegionName": self.region_name,
"AgentID": self.agent_id,
"SessionID": self.session.id if self.session else None,
"AgentLocal": agent_obj.LocalID if agent_obj is not None else None,
"Method": self.method,
"Type": self.type,
"SelectedLocal": self._current_selected_local(),
"SelectedFull": self._current_selected_full(),
}
def freeze(self):
pass
def cache_summary(self):
self._summary = self.summary
def _current_selected_local(self):
if self.session:
return self.session.selected.object_local
return None
def _current_selected_full(self):
selected_local = self._current_selected_local()
if selected_local is None or self.region is None:
return None
obj = self.region.objects.lookup_localid(selected_local)
return obj and obj.FullID
def _get_meta(self, name: str):
# Slight difference in semantics. Filters are meant to return the same
# thing no matter when they're run, so SelectedLocal and friends resolve
# to the selected items _at the time the message was logged_. To handle
# the case where we want to match on the selected object at the time the
# filter is evaluated, we resolve these here.
if name == "CurrentSelectedLocal":
return self._current_selected_local()
elif name == "CurrentSelectedFull":
return self._current_selected_full()
return self.meta.get(name)
@property
def region(self) -> typing.Optional[ProxiedRegion]:
if self._region:
return self._region()
return None
@property
def session(self) -> typing.Optional[Session]:
if self._session:
return self._session()
return None
@property
def region_name(self) -> str:
region = self.region
if region:
self._region_name = region.name
return self._region_name
# Region may die after a message is logged, need to keep this around.
if self._region_name:
return self._region_name
return ""
@property
def agent_id(self) -> typing.Optional[UUID]:
if self._agent_id:
return self._agent_id
session = self.session
if session:
self._agent_id = session.agent_id
return self._agent_id
return None
@property
def host(self) -> str:
region_name = self.region_name
if not region_name:
return ""
session_str = ""
agent_id = self.agent_id
if agent_id:
session_str = f" ({agent_id})"
return region_name + session_str
def request(self, beautify=False, replacements=None):
return None
def response(self, beautify=False):
return None
def _packet_root_matches(self, pattern):
if fnmatch.fnmatchcase(self.name, pattern):
return True
if fnmatch.fnmatchcase(self.type, pattern):
return True
return False
def _val_matches(self, operator, val, expected):
if isinstance(expected, MetaFieldSpecifier):
expected = self._get_meta(str(expected))
if not isinstance(expected, (int, float, bytes, str, type(None), tuple)):
if callable(expected):
expected = expected()
else:
expected = str(expected)
elif expected is not None:
# Unbox the expected value
expected = expected.value
if not isinstance(val, (int, float, bytes, str, type(None), tuple, TupleCoord)):
val = str(val)
if not operator:
return bool(val)
elif operator == "==":
return val == expected
elif operator == "!=":
return val != expected
elif operator == "^=":
if val is None:
return False
return val.startswith(expected)
elif operator == "$=":
if val is None:
return False
return val.endswith(expected)
elif operator == "~=":
if val is None:
return False
return expected in val
elif operator == "<":
return val < expected
elif operator == "<=":
return val <= expected
elif operator == ">":
return val > expected
elif operator == ">=":
return val >= expected
else:
raise ValueError(f"Unexpected operator {operator!r}")
def _base_matches(self, matcher: "MessageFilterNode") -> typing.Optional[bool]:
if len(matcher.selector) == 1:
# Comparison operators would make no sense here
if matcher.value or matcher.operator:
return False
return self._packet_root_matches(matcher.selector[0])
if len(matcher.selector) == 2 and matcher.selector[0] == "Meta":
return self._val_matches(matcher.operator, self._get_meta(matcher.selector[1]), matcher.value)
return None
def matches(self, matcher: "MessageFilterNode"):
return self._base_matches(matcher) or False
@property
def seq(self):
return ""
@property
def method(self):
return ""
@property
def summary(self):
return ""
@staticmethod
def _format_llsd(parsed):
xmlified = llsd.format_pretty_xml(parsed)
# dedent <key> by 1 for easier visual scanning
xmlified = re.sub(rb" <key>", b"<key>", xmlified)
return xmlified.decode("utf8", errors="replace")
class LLUDPMessageLogEntry(AbstractMessageLogEntry):
__slots__ = ["_message", "_name", "_direction", "_frozen_message", "_seq", "_deserializer"]
def __init__(self, message: ProxiedMessage, region, session):
self._message: ProxiedMessage = message
self._deserializer = None
self._name = message.name
self._direction = message.direction
self._frozen_message: typing.Optional[bytes] = None
self._seq = message.packet_id
super().__init__(region, session)
_MESSAGE_META_ATTRS = {
"Injected", "Dropped", "Extra", "Resent", "Zerocoded", "Acks", "Reliable",
}
def _get_meta(self, name: str):
# These may change between when the message is logged and when we
# actually filter on it, since logging happens before addons.
msg = self.message
if name in self._MESSAGE_META_ATTRS:
return getattr(msg, name.lower(), None)
msg_meta = getattr(msg, "meta", None)
if msg_meta is not None:
if name in msg_meta:
return msg_meta[name]
return super()._get_meta(name)
@property
def message(self):
if self._message:
return self._message
elif self._frozen_message:
message = pickle.loads(self._frozen_message)
message.deserializer = self._deserializer
return message
else:
raise ValueError("Didn't have a fresh or frozen message somehow")
def freeze(self):
self.message.invalidate_caches()
# These are expensive to keep around. pickle them and un-pickle on
# an as-needed basis.
self._deserializer = self.message.deserializer
self.message.deserializer = None
self._frozen_message = pickle.dumps(self._message, protocol=pickle.HIGHEST_PROTOCOL)
self._message = None
@property
def type(self):
return "LLUDP"
@property
def name(self):
if self._message:
self._name = self._message.name
return self._name
@property
def method(self):
if self._message:
self._direction = self._message.direction
return self._direction.name if self._direction is not None else ""
def request(self, beautify=False, replacements=None):
return self.message.to_human_string(replacements, beautify)
def matches(self, matcher):
base_matched = self._base_matches(matcher)
if base_matched is not None:
return base_matched
if not self._packet_root_matches(matcher.selector[0]):
return False
message = self.message
selector_len = len(matcher.selector)
# name, block_name, var_name(, subfield_name)?
if selector_len not in (3, 4):
return False
for block_name in message.blocks:
if not fnmatch.fnmatchcase(block_name, matcher.selector[1]):
continue
for block in message[block_name]:
for var_name in block.vars.keys():
if not fnmatch.fnmatchcase(var_name, matcher.selector[2]):
continue
if selector_len == 3:
if matcher.value is None:
return True
if self._val_matches(matcher.operator, block[var_name], matcher.value):
return True
elif selector_len == 4:
try:
deserialized = block.deserialize_var(var_name)
except KeyError:
continue
# Discard the tag if this is a tagged union, we only want the value
if isinstance(deserialized, TaggedUnion):
deserialized = deserialized.value
if not isinstance(deserialized, dict):
return False
for key in deserialized.keys():
if fnmatch.fnmatchcase(str(key), matcher.selector[3]):
if matcher.value is None:
return True
if self._val_matches(matcher.operator, deserialized[key], matcher.value):
return True
return False
@property
def summary(self):
if self._summary is None:
self._summary = self.message.to_summary()[:500]
return self._summary
@property
def seq(self):
if self._message:
self._seq = self._message.packet_id
return self._seq
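The freeze()/thaw pattern above (pickle the parsed message to shed its expensive caches, lazily unpickle on access) can be sketched standalone. `FrozenHolder` is a hypothetical stand-in for the log entry, not a hippolyzer type:

```python
import pickle


class FrozenHolder:
    """Keep a parsed object either live or as a pickled blob, thawing on demand."""

    def __init__(self, obj):
        self._obj = obj
        self._frozen = None

    def freeze(self):
        # Serialize and drop the live reference to save memory
        self._frozen = pickle.dumps(self._obj, protocol=pickle.HIGHEST_PROTOCOL)
        self._obj = None

    @property
    def obj(self):
        if self._obj is not None:
            return self._obj
        if self._frozen is not None:
            # Thaw a fresh copy; callers get equivalent state each time
            return pickle.loads(self._frozen)
        raise ValueError("Didn't have a fresh or frozen object somehow")


holder = FrozenHolder({"name": "ChatFromViewer", "blocks": [1, 2, 3]})
holder.freeze()
print(holder.obj["name"])  # ChatFromViewer
```

Unlike the real `LLUDPMessageLogEntry`, this sketch doesn't detach and reattach a deserializer around the pickle step; it only shows the memory-saving shape of the pattern.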
class EQMessageLogEntry(AbstractMessageLogEntry):
__slots__ = ["event"]
def __init__(self, event, region, session):
super().__init__(region, session)
self.event = event
@property
def type(self):
return "EQ"
def request(self, beautify=False, replacements=None):
return self._format_llsd(self.event["body"])
@property
def name(self):
return self.event["message"]
@property
def summary(self):
if self._summary is not None:
return self._summary
self._summary = ""
self._summary = llsd.format_notation(self.event["body"]).decode("utf8")[:500]
return self._summary
class HTTPMessageLogEntry(AbstractMessageLogEntry):
__slots__ = ["flow"]
def __init__(self, flow: HippoHTTPFlow):
self.flow: HippoHTTPFlow = flow
cap_data = self.flow.cap_data
region = cap_data and cap_data.region
session = cap_data and cap_data.session
super().__init__(region, session)
# This was a request the proxy made through itself
self.meta["Injected"] = flow.request_injected
@property
def type(self):
return "HTTP"
@property
def name(self):
cap_data = self.flow.cap_data
name = cap_data and cap_data.cap_name
if name:
return name
return self.flow.request.url
@property
def method(self):
return self.flow.request.method
def _format_http_message(self, want_request, beautify):
message = self.flow.request if want_request else self.flow.response
method = self.flow.request.method
buf = io.StringIO()
cap_data = self.flow.cap_data
cap_name = cap_data and cap_data.cap_name
base_url = cap_name and cap_data.base_url
temporary_cap = cap_data and cap_data.type == CapType.TEMPORARY
beautify_url = (beautify and base_url and cap_name and
not temporary_cap and self.session and want_request)
if want_request:
buf.write(message.method)
buf.write(" ")
if beautify_url:
buf.write(f"[[{cap_name}]]{message.url[len(base_url):]}")
else:
buf.write(message.url)
buf.write(" ")
buf.write(message.http_version)
else:
buf.write(message.http_version)
buf.write(" ")
buf.write(str(message.status_code))
buf.write(" ")
buf.write(message.reason)
buf.write("\r\n")
if beautify_url:
buf.write("# ")
buf.write(message.url)
buf.write("\r\n")
headers = copy.deepcopy(message.headers)
for key in tuple(headers.keys()):
if key.lower().startswith("x-hippo-"):
LOG.warning(f"Internal header {key!r} leaked out?")
# If this header actually came from somewhere untrusted, we can't
# include it. It may change the meaning of the message when replayed.
headers[f"X-Untrusted-{key}"] = headers[key]
headers.pop(key)
beautified = None
if beautify and message.content:
try:
serializer = se.HTTP_SERIALIZERS.get(cap_name)
if serializer:
if want_request:
beautified = serializer.deserialize_req_body(method, message.content)
else:
beautified = serializer.deserialize_resp_body(method, message.content)
if beautified is se.UNSERIALIZABLE:
beautified = None
else:
beautified = self._format_llsd(beautified)
headers["X-Hippo-Beautify"] = "1"
if not beautified:
content_type = self._guess_content_type(message)
if content_type.startswith("application/llsd"):
beautified = self._format_llsd(llsd.parse(message.content))
elif any(content_type.startswith(x) for x in ("application/xml", "text/xml")):
beautified = minidom.parseString(message.content).toprettyxml(indent=" ")
# Strip blank lines; this will break CDATA sections, which is acceptable here.
beautified = re.sub(r'\n\s*\n', '\n', beautified, flags=re.MULTILINE)
beautified = re.sub(r'<([\w]+)>\s*</\1>', r'<\1></\1>',
beautified, flags=re.MULTILINE)
except Exception:
LOG.exception("Failed to beautify message")
message_body = beautified or message.content
if isinstance(message_body, bytes):
try:
decoded = message.text
# Valid in many codecs, but unprintable.
if "\x00" in decoded:
raise ValueError("Embedded null")
message_body = decoded
except (UnicodeError, ValueError):
# non-printable characters, return the escaped version.
headers["X-Hippo-Escaped-Body"] = "1"
message_body = bytes_escape(message_body).decode("utf8")
buf.write(bytes(headers).decode("utf8", errors="replace"))
buf.write("\r\n")
buf.write(message_body)
return buf.getvalue()
def request(self, beautify=False, replacements=None):
return self._format_http_message(want_request=True, beautify=beautify)
def response(self, beautify=False):
return self._format_http_message(want_request=False, beautify=beautify)
@property
def summary(self):
if self._summary is not None:
return self._summary
msg = self.flow.response
self._summary = f"{msg.status_code}: "
if not msg.content:
return self._summary
if len(msg.content) > 1000000:
self._summary += "[too large...]"
return self._summary
content_type = self._guess_content_type(msg)
if content_type.startswith("application/llsd"):
notation = llsd.format_notation(llsd.parse(msg.content))
self._summary += notation.decode("utf8")[:500]
return self._summary
def _guess_content_type(self, message):
content_type = message.headers.get("Content-Type", "")
if not message.content or content_type.startswith("application/llsd"):
return content_type
# LLSD sometimes gets sent with `text/plain` or `text/html`, so sniff the body.
if message.content.startswith(rb'<?xml version="1.0" ?><llsd>'):
return "application/llsd+xml"
if message.content.startswith(rb'<llsd>'):
return "application/llsd+xml"
if message.content.startswith(rb'<?xml '):
return "application/xml"
return content_type
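The body-sniffing in `_guess_content_type` can be exercised in isolation. `sniff_llsd_content_type` is a hypothetical free-function restatement of the method above, using the same payload prefixes:

```python
def sniff_llsd_content_type(content: bytes, declared: str = "") -> str:
    """Guess a content type for LLSD bodies mislabeled as text/plain or text/html."""
    if not content or declared.startswith("application/llsd"):
        return declared
    # LLSD payloads sometimes arrive under generic text types; check the prolog.
    if content.startswith(b'<?xml version="1.0" ?><llsd>') or content.startswith(b"<llsd>"):
        return "application/llsd+xml"
    if content.startswith(b"<?xml "):
        return "application/xml"
    return declared


print(sniff_llsd_content_type(b"<llsd><map></map></llsd>"))  # application/llsd+xml
```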
class MessageLogModel(QtCore.QAbstractTableModel, BaseMessageLogger):
class MessageLogModel(QtCore.QAbstractTableModel, FilteringMessageLogger):
def __init__(self, parent=None):
QtCore.QAbstractTableModel.__init__(self, parent)
BaseMessageLogger.__init__(self)
self._raw_entries = collections.deque(maxlen=2000)
self._queued_entries = queue.Queue()
self._filtered_entries = []
self._paused = False
self.filter: typing.Optional[BaseFilterNode] = None
FilteringMessageLogger.__init__(self)
def setFilter(self, filter_str: str):
self.filter = compile_filter(filter_str)
def _begin_insert(self, insert_idx: int):
self.beginInsertRows(QtCore.QModelIndex(), insert_idx, insert_idx)
def _end_insert(self):
self.endInsertRows()
def _begin_reset(self):
self.beginResetModel()
# Keep any entries that've aged out of the raw entries list that
# match the new filter
self._filtered_entries = [
m for m in self._filtered_entries if
m not in self._raw_entries and self.filter.match(m)
]
self._filtered_entries.extend((m for m in self._raw_entries if self.filter.match(m)))
def _end_reset(self):
self.endResetModel()
def setPaused(self, paused: bool):
self._paused = paused
def log_lludp_message(self, session: Session, region: ProxiedRegion, message: ProxiedMessage):
if self._paused:
return
self.queueLogEntry(LLUDPMessageLogEntry(message, region, session))
def log_http_response(self, flow: HippoHTTPFlow):
if self._paused:
return
# These are huge, let's not log them for now.
if flow.cap_data and flow.cap_data.asset_server_cap:
return
self.queueLogEntry(HTTPMessageLogEntry(flow))
def log_eq_event(self, session: Session, region: ProxiedRegion, event: dict):
if self._paused:
return
self.queueLogEntry(EQMessageLogEntry(event, region, session))
def appendQueuedEntries(self):
while not self._queued_entries.empty():
entry: AbstractMessageLogEntry = self._queued_entries.get(block=False)
# Paused, throw it away.
if self._paused:
continue
self._raw_entries.append(entry)
try:
if self.filter.match(entry):
next_idx = len(self._filtered_entries)
self.beginInsertRows(QtCore.QModelIndex(), next_idx, next_idx)
self._filtered_entries.append(entry)
self.endInsertRows()
entry.cache_summary()
# In the common case we don't need to keep around the serialization
# caches anymore. If the filter changes, the caches will be repopulated
# as necessary.
entry.freeze()
except Exception:
LOG.exception("Failed to filter queued message")
def queueLogEntry(self, entry: AbstractMessageLogEntry):
self._queued_entries.put(entry, block=False)
def rowCount(self, parent=None, *args, **kwargs):
return len(self._filtered_entries)
@@ -656,14 +69,6 @@ class MessageLogModel(QtCore.QAbstractTableModel, BaseMessageLogger):
if orientation == QtCore.Qt.Horizontal and role == QtCore.Qt.DisplayRole:
return MessageLogHeader(col).name
def clear(self):
self.beginResetModel()
self._filtered_entries.clear()
while not self._queued_entries.empty():
self._queued_entries.get(block=False)
self._raw_entries.clear()
self.endResetModel()
class RegionListModel(QtCore.QAbstractListModel):
def __init__(self, parent, session_manager):


@@ -136,7 +136,7 @@ def start_proxy(extra_addons: Optional[list] = None, extra_addon_paths: Optional
async_server = loop.run_until_complete(coro)
event_manager = MITMProxyEventManager(session_manager, flow_context)
loop.create_task(event_manager.pump_proxy_events())
loop.create_task(event_manager.run())
addon_paths = sys.argv[1:]
addon_paths.extend(extra_addon_paths)
@@ -144,6 +144,7 @@ def start_proxy(extra_addons: Optional[list] = None, extra_addon_paths: Optional
# Everything in memory at this point should stay
gc.freeze()
gc.set_threshold(5000, 50, 10)
# Serve requests until Ctrl+C is pressed
print(f"SOCKS and HTTP proxies running on {proxy_host}")
@@ -178,10 +179,15 @@ def start_proxy(extra_addons: Optional[list] = None, extra_addon_paths: Optional
def _windows_timeout_killer(pid: int):
time.sleep(2.0)
print(f"Killing hanging event loop")
print("Killing hanging event loop")
os.kill(pid, 9)
def main():
multiprocessing.set_start_method("spawn")
start_proxy()
if __name__ == "__main__":
multiprocessing.freeze_support()
main()


@@ -8,7 +8,6 @@ import json
import logging
import pathlib
import multiprocessing
import os
import re
import signal
import socket
@@ -20,18 +19,11 @@ import multidict
from qasync import QEventLoop
from PySide2 import QtCore, QtWidgets, QtGui
from hippolyzer.apps.model import (
AbstractMessageLogEntry,
LLUDPMessageLogEntry,
MessageLogModel,
MessageLogHeader,
RegionListModel,
bytes_unescape,
bytes_escape,
)
from hippolyzer.apps.model import MessageLogModel, MessageLogHeader, RegionListModel
from hippolyzer.apps.proxy import start_proxy
from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.helpers import bytes_unescape, bytes_escape, get_resource_filename
from hippolyzer.lib.base.message.llsd_msg_serializer import LLSDMessageSerializer
from hippolyzer.lib.base.message.message import Block
from hippolyzer.lib.base.message.msgtypes import MsgType
@@ -43,18 +35,18 @@ from hippolyzer.lib.proxy.ca_utils import setup_ca_everywhere
from hippolyzer.lib.proxy.caps_client import CapsClient
from hippolyzer.lib.proxy.http_proxy import create_proxy_master, HTTPFlowContext
from hippolyzer.lib.proxy.packets import Direction
from hippolyzer.lib.proxy.message import ProxiedMessage, VerbatimHumanVal, proxy_eval
from hippolyzer.lib.proxy.message import ProxiedMessage, VerbatimHumanVal, proxy_eval, SpannedString
from hippolyzer.lib.proxy.message_logger import LLUDPMessageLogEntry, AbstractMessageLogEntry
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session, SessionManager
from hippolyzer.lib.proxy.templates import CAP_TEMPLATES
LOG = logging.getLogger(__name__)
BASE_PATH = os.path.dirname(os.path.abspath(__file__))
MAIN_WINDOW_UI_PATH = os.path.join(BASE_PATH, "proxy_mainwindow.ui")
MESSAGE_BUILDER_UI_PATH = os.path.join(BASE_PATH, "message_builder.ui")
ADDON_DIALOG_UI_PATH = os.path.join(BASE_PATH, "addon_dialog.ui")
FILTER_DIALOG_UI_PATH = os.path.join(BASE_PATH, "filter_dialog.ui")
MAIN_WINDOW_UI_PATH = get_resource_filename("apps/proxy_mainwindow.ui")
MESSAGE_BUILDER_UI_PATH = get_resource_filename("apps/message_builder.ui")
ADDON_DIALOG_UI_PATH = get_resource_filename("apps/addon_dialog.ui")
FILTER_DIALOG_UI_PATH = get_resource_filename("apps/filter_dialog.ui")
def show_error_message(error_msg, parent=None):
@@ -169,6 +161,8 @@ class ProxyGUI(QtWidgets.QMainWindow):
"ViewerAsset GetTexture SetAlwaysRun GetDisplayNames MapImageService MapItemReply".split(" ")
DEFAULT_FILTER = f"!({' || '.join(ignored for ignored in DEFAULT_IGNORE)})"
textRequest: QtWidgets.QTextEdit
def __init__(self):
super().__init__()
loadUi(MAIN_WINDOW_UI_PATH, self)
@@ -242,10 +236,10 @@ class ProxyGUI(QtWidgets.QMainWindow):
filter_str = self.lineEditFilter.text()
else:
self.lineEditFilter.setText(filter_str)
self.model.setFilter(filter_str)
self.model.set_filter(filter_str)
def _setPaused(self, checked):
self.model.setPaused(checked)
self.model.set_paused(checked)
def _messageSelected(self, selected, _deselected):
indexes = selected.indexes()
@@ -271,8 +265,23 @@ class ProxyGUI(QtWidgets.QMainWindow):
beautify=self.checkBeautify.isChecked(),
replacements=self.buildReplacements(entry.session, entry.region),
)
resp = entry.response(beautify=self.checkBeautify.isChecked())
highlight_range = None
if isinstance(req, SpannedString):
match_result = self.model.filter.match(entry)
# Match result was a tuple indicating what matched
if isinstance(match_result, tuple):
highlight_range = req.spans.get(match_result)
self.textRequest.setPlainText(req)
if highlight_range:
cursor = self.textRequest.textCursor()
cursor.setPosition(highlight_range[0], QtGui.QTextCursor.MoveAnchor)
cursor.setPosition(highlight_range[1], QtGui.QTextCursor.KeepAnchor)
highlight_format = QtGui.QTextBlockFormat()
highlight_format.setBackground(QtCore.Qt.yellow)
cursor.setBlockFormat(highlight_format)
resp = entry.response(beautify=self.checkBeautify.isChecked())
if resp:
self.textResponse.show()
self.textResponse.setPlainText(resp)
@@ -796,7 +805,6 @@ def gui_main():
window = ProxyGUI()
timer = QtCore.QTimer(app)
timer.timeout.connect(window.sessionManager.checkRegions)
timer.timeout.connect(window.model.appendQueuedEntries)
timer.start(100)
signal.signal(signal.SIGINT, lambda *args: QtWidgets.QApplication.quit())
window.show()
@@ -809,3 +817,8 @@ def gui_main():
extra_addon_paths=window.getAddonList(),
proxy_host=http_host,
)
if __name__ == "__main__":
multiprocessing.freeze_support()
gui_main()


@@ -299,6 +299,32 @@ class StringEnum(str, enum.Enum):
return self.value
class IntEnum(enum.IntEnum):
# Give a special repr() that'll eval in a REPL.
def __repr__(self):
return f"{self.__class__.__name__}.{self.name}"
class IntFlag(enum.IntFlag):
def __repr__(self):
# Make an ORed together version of the flags based on the POD version
flags = flags_to_pod(type(self), self)
flags = " | ".join(
(f"{self.__class__.__name__}.{v}" if isinstance(v, str) else str(v))
for v in flags
)
return f"({flags})"
def flags_to_pod(flag_cls: Type[enum.IntFlag], val: int) -> Tuple[Union[str, int], ...]:
# Shove any bits not represented in the IntFlag into an int
left_over = val
for flag in iter(flag_cls):
left_over &= ~flag.value
extra = (int(left_over),) if left_over else ()
return tuple(flag.name for flag in iter(flag_cls) if val & flag.value) + extra
class TaggedUnion(recordclass.datatuple): # type: ignore
tag: Any
value: Any
@@ -306,5 +332,6 @@ class TaggedUnion(recordclass.datatuple): # type: ignore
__all__ = [
"Vector3", "Vector4", "Vector2", "Quaternion", "TupleCoord",
"UUID", "RawBytes", "StringEnum", "JankStringyBytes", "TaggedUnion"
"UUID", "RawBytes", "StringEnum", "JankStringyBytes", "TaggedUnion",
"IntEnum", "IntFlag", "flags_to_pod"
]
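`flags_to_pod` splits an integer into named flag members plus a trailing int for any bits the `IntFlag` doesn't represent. A runnable sketch of the same function (`SoundFlags` is an illustrative flag set, not one from the codebase):

```python
import enum
from typing import Tuple, Type, Union


def flags_to_pod(flag_cls: Type[enum.IntFlag], val: int) -> Tuple[Union[str, int], ...]:
    # Shove any bits not represented in the IntFlag into a trailing int
    left_over = val
    for flag in flag_cls:
        left_over &= ~flag.value
    extra = (int(left_over),) if left_over else ()
    return tuple(flag.name for flag in flag_cls if val & flag.value) + extra


class SoundFlags(enum.IntFlag):  # hypothetical flag set for illustration
    LOOP = 0x01
    SYNC_MASTER = 0x02
    QUEUE = 0x10


print(flags_to_pod(SoundFlags, 0x13))  # ('LOOP', 'SYNC_MASTER', 'QUEUE')
print(flags_to_pod(SoundFlags, 0x41))  # ('LOOP', 64)
```

The trailing-int convention keeps the pretty `repr()` honest: unknown bits show up as a literal integer ORed onto the named members rather than being silently dropped.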


@@ -347,7 +347,7 @@ class RegionCapNotAvailable(RegionDomainError):
class RegionMessageError(RegionDomainError):
""" an error raised when a region does not have a connection
over which it can send UDP messages
over which it can send UDP messages
accepts a region object as an attribute


@@ -1,6 +1,9 @@
from __future__ import annotations
import codecs
import functools
import pkg_resources
import re
import weakref
from pprint import PrettyPrinter
from typing import *
@@ -121,3 +124,18 @@ def proxify(obj: Union[Callable[[], _T], weakref.ReferenceType, _T]) -> _T:
if obj is not None and not isinstance(obj, weakref.ProxyTypes):
return weakref.proxy(obj)
return obj
def bytes_unescape(val: bytes) -> bytes:
# Only in CPython. bytes -> bytes with escape decoding.
# https://stackoverflow.com/a/23151714
return codecs.escape_decode(val)[0] # type: ignore
def bytes_escape(val: bytes) -> bytes:
# Try to keep newlines as-is
return re.sub(rb"(?<!\\)\\n", b"\n", codecs.escape_encode(val)[0]) # type: ignore
def get_resource_filename(resource_filename: str):
return pkg_resources.resource_filename("hippolyzer", resource_filename)
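The escape helpers above rely on the CPython-only `codecs.escape_decode`/`escape_encode` pair. A self-contained round-trip check, restating the same logic so it runs on its own:

```python
import codecs
import re


def bytes_unescape(val: bytes) -> bytes:
    # CPython-only: decode backslash escapes in a bytes value
    return codecs.escape_decode(val)[0]  # type: ignore


def bytes_escape(val: bytes) -> bytes:
    # Escape everything, then turn unescaped "\n" sequences back into real newlines
    return re.sub(rb"(?<!\\)\\n", b"\n", codecs.escape_encode(val)[0])  # type: ignore


escaped = bytes_escape(b"line1\nline2\x00")
print(escaped)  # b'line1\nline2\\x00'
assert bytes_unescape(escaped) == b"line1\nline2\x00"
```

Keeping literal newlines while escaping everything else makes multi-line message bodies readable in the log view yet still recoverable byte-for-byte.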


@@ -39,6 +39,7 @@ class MeshAsset:
# These TypedDicts describe the expected shape of the LLSD in the mesh
# header and various segments. They're mainly for type hinting.
class MeshHeaderDict(TypedDict, total=False):
"""Header of the mesh file, includes offsets & sizes for segments' LLSD"""
version: int
creator: UUID
date: dt.datetime
@@ -54,6 +55,7 @@ class MeshHeaderDict(TypedDict, total=False):
class SegmentHeaderDict(TypedDict):
"""Standard shape for segment references within the header"""
offset: int
size: int
@@ -73,6 +75,7 @@ class PhysicsHavokSegmentHeaderDict(PhysicsSegmentHeaderDict, total=False):
class PhysicsCostDataHeaderDict(TypedDict, total=False):
"""Cost of physical representation, populated by server"""
decomposition: float
decomposition_discounted_vertices: int
decomposition_hulls: int
@@ -85,6 +88,7 @@ class PhysicsCostDataHeaderDict(TypedDict, total=False):
class MeshSegmentDict(TypedDict, total=False):
"""Dict of segments unpacked using the MeshHeaderDict"""
high_lod: List[LODSegmentDict]
medium_lod: List[LODSegmentDict]
low_lod: List[LODSegmentDict]
@@ -96,6 +100,7 @@ class MeshSegmentDict(TypedDict, total=False):
class LODSegmentDict(TypedDict, total=False):
"""Represents a single entry within the material list of a LOD segment"""
# Only present if True and no geometry
NoGeometry: bool
# -1.0 - 1.0
@@ -113,17 +118,22 @@ class LODSegmentDict(TypedDict, total=False):
class DomainDict(TypedDict):
"""Description of the real range for quantized coordinates"""
# number of elems depends on what the domain is for, Vec2 or Vec3
Max: List[float]
Min: List[float]
class VertexWeight(recordclass.datatuple): # type: ignore
"""Vertex weight for a specific joint on a specific vertex"""
# index of the joint within the joint_names list in the skin segment
joint_idx: int
# 0.0 - 1.0
weight: float
class SkinSegmentDict(TypedDict, total=False):
"""Rigging information"""
joint_names: List[str]
# model -> world transform matrix for model
bind_shape_matrix: List[float]
@@ -137,14 +147,17 @@ class SkinSegmentDict(TypedDict, total=False):
class PhysicsConvexSegmentDict(DomainDict, total=False):
"""Data for convex hull collisions, populated by the client"""
# Min / Max domain vals are inline, unlike for LODs
HullList: List[int]
# -1.0 - 1.0
# -1.0 - 1.0, dequantized from binary field of U16s
Positions: List[Vector3]
# -1.0 - 1.0
# -1.0 - 1.0, dequantized from binary field of U16s
BoundingVerts: List[Vector3]
class PhysicsHavokSegmentDict(TypedDict, total=False):
"""Cached data for Havok collisions, populated by sim and not used by client."""
HullMassProps: MassPropsDict
MOPP: MOPPDict
MeshDecompMassProps: MassPropsDict
@@ -169,8 +182,11 @@ class MOPPDict(TypedDict, total=False):
def positions_from_domain(positions: Iterable[TupleCoord], domain: DomainDict):
# Used for turning positions into their actual positions within the mesh / domain
# for ex: positions_from_domain(lod["Position"], lod["PositionDomain"])
"""
Used for turning positions into their actual positions within the mesh / domain
for ex: positions_from_domain(lod["Position"], lod["PositionDomain"])
"""
lower = domain['Min']
upper = domain['Max']
return [
@@ -179,7 +195,7 @@ def positions_from_domain(positions: Iterable[TupleCoord], domain: DomainDict):
def positions_to_domain(positions: Iterable[TupleCoord], domain: DomainDict):
# Used for turning positions into their actual positions within the mesh / domain
"""Used for turning positions into their actual positions within the mesh / domain"""
lower = domain['Min']
upper = domain['Max']
return [
@@ -187,7 +203,36 @@ def positions_to_domain(positions: Iterable[TupleCoord], domain: DomainDict):
]
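The mapping behind `positions_from_domain` isn't shown in this hunk. A plausible sketch, assuming coordinates arrive normalized to [-1.0, 1.0] as the field comments state, with bare tuples standing in for `Vector3` and the `DomainDict`'s `Min`/`Max` lists:

```python
from typing import Iterable, List, Tuple

Vec3 = Tuple[float, float, float]


def positions_from_domain(positions: Iterable[Vec3], lower: Vec3, upper: Vec3) -> List[Vec3]:
    """Map positions normalized into [-1, 1] back into the [lower, upper] domain.

    Sketch of the idea only; the real function takes a DomainDict with
    'Min'/'Max' lists and hippolyzer Vector3s rather than bare tuples.
    """
    return [
        tuple(lo + (c + 1.0) / 2.0 * (hi - lo) for c, lo, hi in zip(p, lower, upper))
        for p in positions
    ]


print(positions_from_domain([(-1.0, 0.0, 1.0)], (0.0, 0.0, 0.0), (10.0, 10.0, 10.0)))
# [(0.0, 5.0, 10.0)]
```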
class VertexWeights(se.SerializableBase):
"""Serializer for a list of joint weights on a single vertex"""
INFLUENCE_SER = se.QuantizedFloat(se.U16, 0.0, 1.0)
INFLUENCE_LIMIT = 4
INFLUENCE_TERM = 0xFF
@classmethod
def serialize(cls, vals, writer: se.BufferWriter, ctx=None):
if len(vals) > cls.INFLUENCE_LIMIT:
raise ValueError(f"{vals!r} is too long, can only have {cls.INFLUENCE_LIMIT} influences!")
for val in vals:
joint_idx, influence = val
writer.write(se.U8, joint_idx)
writer.write(cls.INFLUENCE_SER, influence, ctx=ctx)
if len(vals) != cls.INFLUENCE_LIMIT:
writer.write(se.U8, cls.INFLUENCE_TERM)
@classmethod
def deserialize(cls, reader: se.Reader, ctx=None):
influence_list = []
for _ in range(cls.INFLUENCE_LIMIT):
joint_idx = reader.read(se.U8)
if joint_idx == cls.INFLUENCE_TERM:
break
influence_list.append(VertexWeight(joint_idx, reader.read(cls.INFLUENCE_SER, ctx=ctx)))
return influence_list
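The terminator scheme in `VertexWeights` (up to `INFLUENCE_LIMIT` pairs per vertex, with a `0xFF` joint index marking an early end) can be exercised at the byte level. This sketch keeps weights as raw U16s instead of the quantized floats the real serializer uses, and the little-endian layout is an assumption:

```python
import struct

INFLUENCE_LIMIT = 4
INFLUENCE_TERM = 0xFF


def pack_weights(pairs):
    """Pack (joint_idx, raw_u16_weight) pairs; append 0xFF if under the limit."""
    out = bytearray()
    for joint_idx, raw_weight in pairs:
        out += struct.pack("<BH", joint_idx, raw_weight)
    if len(pairs) != INFLUENCE_LIMIT:
        # Only short lists need a terminator; a full list is self-delimiting
        out.append(INFLUENCE_TERM)
    return bytes(out)


def unpack_weights(buf):
    pairs, off = [], 0
    for _ in range(INFLUENCE_LIMIT):
        joint_idx = buf[off]
        off += 1
        if joint_idx == INFLUENCE_TERM:
            break
        (raw_weight,) = struct.unpack_from("<H", buf, off)
        off += 2
        pairs.append((joint_idx, raw_weight))
    return pairs, off


data = pack_weights([(0, 65535), (3, 32768)])
print(unpack_weights(data))  # ([(0, 65535), (3, 32768)], 7)
```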
class SegmentSerializer:
"""Serializer for binary fields within an LLSD object"""
def __init__(self, templates):
self._templates: Dict[str, se.SerializableBase] = templates
@@ -217,33 +262,6 @@ class SegmentSerializer:
return new_segment
class VertexWeights(se.SerializableBase):
INFLUENCE_SER = se.QuantizedFloat(se.U16, 0.0, 1.0)
INFLUENCE_LIMIT = 4
INFLUENCE_TERM = 0xFF
@classmethod
def serialize(cls, vals, writer: se.BufferWriter, ctx=None):
if len(vals) > cls.INFLUENCE_LIMIT:
raise ValueError(f"{vals!r} is too long, can only have {cls.INFLUENCE_LIMIT} influences!")
for val in vals:
joint_idx, influence = val
writer.write(se.U8, joint_idx)
writer.write(cls.INFLUENCE_SER, influence, ctx=ctx)
if len(vals) != cls.INFLUENCE_LIMIT:
writer.write(se.U8, cls.INFLUENCE_TERM)
@classmethod
def deserialize(cls, reader: se.Reader, ctx=None):
influence_list = []
for _ in range(cls.INFLUENCE_LIMIT):
joint_idx = reader.read(se.U8)
if joint_idx == cls.INFLUENCE_TERM:
break
influence_list.append(VertexWeight(joint_idx, reader.read(cls.INFLUENCE_SER, ctx=ctx)))
return influence_list
LOD_SEGMENT_SERIALIZER = SegmentSerializer({
# 16-bit indices to the verts making up the tri. Imposes a 16-bit
# upper limit on verts in any given material in the mesh.
@@ -265,6 +283,7 @@ class LLMeshSerializer(se.SerializableBase):
KNOWN_SEGMENTS = ("lowest_lod", "low_lod", "medium_lod", "high_lod",
"physics_mesh", "physics_convex", "skin", "physics_havok")
# Define unpackers for specific binary fields within the parsed LLSD segments
SEGMENT_TEMPLATES: Dict[str, SegmentSerializer] = {
"lowest_lod": LOD_SEGMENT_SERIALIZER,
"low_lod": LOD_SEGMENT_SERIALIZER,


@@ -19,5 +19,3 @@ You should have received a copy of the GNU Lesser General Public License
along with this program; if not, write to the Free Software Foundation,
Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""


@@ -20,8 +20,8 @@ along with this program; if not, write to the Free Software Foundation,
Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""
import os
from hippolyzer.lib.base.helpers import get_resource_filename
msg_tmpl = open(os.path.join(os.path.dirname(__file__), 'message_template.msg'))
with open(os.path.join(os.path.dirname(__file__), 'message.xml'), "rb") as _f:
msg_tmpl = open(get_resource_filename("lib/base/message/data/message_template.msg"))
with open(get_resource_filename("lib/base/message/data/message.xml"), "rb") as _f:
msg_details = _f.read()


@@ -34,13 +34,13 @@ VAR_TYPE = Union[TupleCoord, bytes, str, float, int, Tuple, UUID]
class Block:
"""
"""
base representation of a block
Block expects a name, and kwargs for variables (var_name = value)
"""
__slots__ = ('name', 'size', 'vars', 'message_name', '_ser_cache', 'fill_missing',)
def __init__(self, name, /, fill_missing=False, **kwargs):
def __init__(self, name, /, *, fill_missing=False, **kwargs):
self.name = name
self.size = 0
self.message_name: Optional[str] = None
@@ -129,24 +129,7 @@ class Block:
continue
# We have a serializer, include the pretty output in the repr,
# using the _ suffix so the builder knows it needs to be serialized.
deserialized = self.deserialize_var(key)
type_name = type(deserialized).__name__
# TODO: replace __repr__ for these in a context manager so nested
# Enums / Flags get handled correctly as well. The point of the
# pretty repr() is to make messages directly paste-able into code.
if isinstance(deserialized, enum.IntEnum):
deserialized = f"{type_name}.{deserialized.name}"
elif isinstance(deserialized, enum.IntFlag):
# Make an ORed together version of the flags based on the POD version
flags = se.flags_to_pod(type(deserialized), deserialized)
flags = " | ".join(
(f"{type_name}.{v}" if isinstance(v, str) else str(v))
for v in flags
)
deserialized = f"({flags})"
else:
deserialized = repr(deserialized)
block_vars[f"{key}_"] = deserialized
block_vars[f"{key}_"] = repr(self.deserialize_var(key))
else:
block_vars = self.vars
@@ -193,12 +176,21 @@ class Message:
# should be set once a packet is sent / dropped to prevent accidental
# re-sending or re-dropping
self.finalized = False
# Whether message is owned by the queue or should be sent immediately
# Whether message is owned by a queue or should be sent immediately
self.queued: bool = False
self._blocks: BLOCK_DICT = {}
self.add_blocks(args)
def __reduce_ex__(self, protocol):
reduced: Tuple[Any] = super().__reduce_ex__(protocol)
# https://docs.python.org/3/library/pickle.html#object.__reduce__
# We need to make some changes to the object state to make it serializable
state_dict: Dict = reduced[2][1]
# Have to remove the deserializer weak ref so we can pickle
state_dict['deserializer'] = None
return reduced
@property
def packet_id(self) -> Optional[int]:
return self._packet_id
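The `__reduce_ex__` override above blanks out the weakref-backed deserializer so the message can pickle. A minimal sketch of the same trick on a slotted class (`Msg` and `Parser` are illustrative, not hippolyzer types):

```python
import pickle
import weakref


class Parser:
    pass


class Msg:
    __slots__ = ("name", "deserializer")

    def __init__(self, name, deserializer=None):
        self.name = name
        self.deserializer = deserializer  # may hold a weakref.proxy, which can't pickle

    def __reduce_ex__(self, protocol):
        reduced = super().__reduce_ex__(protocol)
        # For a slots-only class the state is a (dict_state, slots_state) pair;
        # blank out the weakref-holding attribute so pickling succeeds.
        reduced[2][1]["deserializer"] = None
        return reduced


parser = Parser()
msg = Msg("ChatFromViewer", weakref.proxy(parser))
restored = pickle.loads(pickle.dumps(msg))
print(restored.name, restored.deserializer)  # ChatFromViewer None
```

Mutating the state dict is safe here: `__getstate__` builds a fresh dict of slot values, so the live object's deserializer is untouched.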


@@ -79,8 +79,14 @@ class MessageHandler(Generic[_T]):
notifiers = self._subscribe_all(message_names, _handler_wrapper, predicate=predicate)
async def _get_wrapper():
msg = await msg_queue.get()
# Consumption is completion
msg_queue.task_done()
return msg
try:
yield msg_queue.get
yield _get_wrapper
finally:
for n in notifiers:
n.unsubscribe(_handler_wrapper)
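`_get_wrapper` marks each message done the moment it is consumed, so a `queue.join()` elsewhere can complete once everything queued has been taken. A minimal sketch of the pattern:

```python
import asyncio


async def main():
    queue: asyncio.Queue = asyncio.Queue()
    await queue.put("ObjectUpdate")

    async def get_wrapper():
        msg = await queue.get()
        # Consumption is completion
        queue.task_done()
        return msg

    msg = await get_wrapper()
    await queue.join()  # returns immediately because task_done() was called
    return msg


print(asyncio.run(main()))  # ObjectUpdate
```

Without the `task_done()` call, `queue.join()` would block forever even after every item had been dequeued.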


@@ -66,7 +66,7 @@ class MessageTemplateBlock:
self.variables: typing.List[MessageTemplateVariable] = []
self.variable_map: typing.Dict[str, MessageTemplateVariable] = {}
self.name = name
self.block_type = 0
self.block_type: MsgBlockType = MsgBlockType.MBT_SINGLE
self.number = 0
def add_variable(self, var):


@@ -19,6 +19,3 @@ You should have received a copy of the GNU Lesser General Public License
along with this program; if not, write to the Free Software Foundation,
Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""


@@ -18,108 +18,114 @@ You should have received a copy of the GNU Lesser General Public License
along with this program; if not, write to the Free Software Foundation,
Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""
from __future__ import annotations
import dataclasses
from typing import *
import lazy_object_proxy
import recordclass
from hippolyzer.lib.base.datatypes import Vector3, Quaternion, Vector4
from hippolyzer.lib.base.datatypes import Vector3, Quaternion, Vector4, UUID
class Object:
""" represents an Object
class Object(recordclass.datatuple): # type: ignore
__options__ = {
"fast_new": False,
"use_weakref": True,
}
__weakref__: Any
Initialize the Object class instance
>>> obj = Object()
"""
LocalID: Optional[int] = None
State: Optional[int] = None
FullID: Optional[UUID] = None
CRC: Optional[int] = None
PCode: Optional[int] = None
Material: Optional[int] = None
ClickAction: Optional[int] = None
Scale: Optional[Vector3] = None
ParentID: Optional[int] = None
# Actually contains a weakref proxy
Parent: Optional[Object] = None
UpdateFlags: Optional[int] = None
PathCurve: Optional[int] = None
ProfileCurve: Optional[int] = None
PathBegin: Optional[int] = None
PathEnd: Optional[int] = None
PathScaleX: Optional[int] = None
PathScaleY: Optional[int] = None
PathShearX: Optional[int] = None
PathShearY: Optional[int] = None
PathTwist: Optional[int] = None
PathTwistBegin: Optional[int] = None
PathRadiusOffset: Optional[int] = None
PathTaperX: Optional[int] = None
PathTaperY: Optional[int] = None
PathRevolutions: Optional[int] = None
PathSkew: Optional[int] = None
ProfileBegin: Optional[int] = None
ProfileEnd: Optional[int] = None
ProfileHollow: Optional[int] = None
TextureEntry: Optional[Any] = None
TextureAnim: Optional[Any] = None
NameValue: Optional[Any] = None
Data: Optional[Any] = None
Text: Optional[str] = None
TextColor: Optional[bytes] = None
MediaURL: Optional[Any] = None
PSBlock: Optional[Any] = None
ExtraParams: Optional[Any] = None
Sound: Optional[UUID] = None
OwnerID: Optional[UUID] = None
SoundGain: Optional[float] = None
SoundFlags: Optional[int] = None
SoundRadius: Optional[float] = None
JointType: Optional[int] = None
JointPivot: Optional[int] = None
JointAxisOrAnchor: Optional[int] = None
TreeSpecies: Optional[int] = None
ScratchPad: Optional[bytes] = None
ObjectCosts: Optional[Dict] = None
ChildIDs: Optional[List[int]] = None
# Same as parent, contains weakref proxies.
Children: Optional[List[Object]] = None
__slots__ = (
"LocalID",
"State",
"FullID",
"CRC",
"PCode",
"Material",
"ClickAction",
"Scale",
"ParentID",
"UpdateFlags",
"PathCurve",
"ProfileCurve",
"PathBegin",
"PathEnd",
"PathScaleX",
"PathScaleY",
"PathShearX",
"PathShearY",
"PathTwist",
"PathTwistBegin",
"PathRadiusOffset",
"PathTaperX",
"PathTaperY",
"PathRevolutions",
"PathSkew",
"ProfileBegin",
"ProfileEnd",
"ProfileHollow",
"TextureEntry",
"TextureAnim",
"NameValue",
"Data",
"Text",
"TextColor",
"MediaURL",
"PSBlock",
"ExtraParams",
"Sound",
"OwnerID",
"SoundGain",
"SoundFlags",
"SoundRadius",
"JointType",
"JointPivot",
"JointAxisOrAnchor",
"TreeSpecies",
"ObjectCosts",
"FootCollisionPlane",
"Position",
"Velocity",
"Acceleration",
"Rotation",
"AngularVelocity",
"CreatorID",
"GroupID",
"CreationDate",
"BaseMask",
"OwnerMask",
"GroupMask",
"EveryoneMask",
"NextOwnerMask",
"OwnershipCost",
"SaleType",
"SalePrice",
"AggregatePerms",
"AggregatePermTextures",
"AggregatePermTexturesOwner",
"Category",
"InventorySerial",
"ItemID",
"FolderID",
"FromTaskID",
"LastOwnerID",
"Name",
"Description",
"TouchName",
"SitName",
"TextureID",
"ChildIDs",
"Children",
"Parent",
"ScratchPad",
"__weakref__",
)
FootCollisionPlane: Optional[Vector4] = None
Position: Optional[Vector3] = None
Velocity: Optional[Vector3] = None
Acceleration: Optional[Vector3] = None
Rotation: Optional[Quaternion] = None
AngularVelocity: Optional[Vector3] = None
def __init__(self, *, ID=None, LocalID=None, State=None, FullID=None, CRC=None, PCode=None, Material=None,
# from ObjectProperties
CreatorID: Optional[UUID] = None
GroupID: Optional[UUID] = None
CreationDate: Optional[int] = None
BaseMask: Optional[int] = None
OwnerMask: Optional[int] = None
GroupMask: Optional[int] = None
EveryoneMask: Optional[int] = None
NextOwnerMask: Optional[int] = None
OwnershipCost: Optional[int] = None
# TaxRate
SaleType: Optional[int] = None
SalePrice: Optional[int] = None
AggregatePerms: Optional[int] = None
AggregatePermTextures: Optional[int] = None
AggregatePermTexturesOwner: Optional[int] = None
Category: Optional[int] = None
InventorySerial: Optional[int] = None
ItemID: Optional[UUID] = None
FolderID: Optional[UUID] = None
FromTaskID: Optional[UUID] = None
LastOwnerID: Optional[UUID] = None
Name: Optional[str] = None
Description: Optional[str] = None
TouchName: Optional[str] = None
SitName: Optional[str] = None
TextureID: Optional[Any] = None
def __init__(self, *, LocalID=None, State=None, FullID=None, CRC=None, PCode=None, Material=None,
ClickAction=None, Scale=None, ParentID=None, UpdateFlags=None, PathCurve=None, ProfileCurve=None,
PathBegin=None, PathEnd=None, PathScaleX=None, PathScaleY=None, PathShearX=None, PathShearY=None,
PathTwist=None, PathTwistBegin=None, PathRadiusOffset=None, PathTaperX=None, PathTaperY=None,
@@ -131,7 +137,7 @@ class Object:
AngularVelocity=None, TreeSpecies=None, ObjectCosts=None, ScratchPad=None):
""" set up the object attributes """
self.LocalID = LocalID or ID # U32
self.LocalID = LocalID # U32
self.State = State # U8
self.FullID = FullID # LLUUID
self.CRC = CRC # U32 // TEMPORARY HACK FOR JAMES
@@ -248,18 +254,19 @@ class Object:
updated_properties = set()
for key, val in properties.items():
if hasattr(self, key):
old_val = getattr(self, key, val)
old_val = getattr(self, key, dataclasses.MISSING)
# Don't check equality if we're using a lazy proxy;
# parsing is deferred until we actually use it.
is_proxy = isinstance(val, lazy_object_proxy.Proxy)
if is_proxy or old_val != val:
updated_properties.add(key)
if isinstance(val, lazy_object_proxy.Proxy):
# TODO: be smarter about this. Can we store the raw bytes and
# compare those if it's an unparsed object?
if old_val is not val:
updated_properties.add(key)
else:
if old_val != val:
updated_properties.add(key)
setattr(self, key, val)
return updated_properties
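The `dataclasses.MISSING` change above matters: defaulting `getattr()` to `val` made a never-set attribute compare equal to the incoming value, silently suppressing the update. A minimal standalone sketch of the pattern (illustrative class, dropping the `hasattr()` slot check and the lazy-proxy branch):

```python
import dataclasses

class TrackedObject:
    """Minimal sketch of the change detection above, not Hippolyzer's Object."""
    def __init__(self):
        self.Name = "cube"

    def update_properties(self, properties):
        updated = set()
        for key, val in properties.items():
            # dataclasses.MISSING makes a never-set attribute always count as
            # changed; defaulting getattr() to `val` would make it compare equal.
            old_val = getattr(self, key, dataclasses.MISSING)
            if old_val != val:
                updated.add(key)
            setattr(self, key, val)
        return updated

obj = TrackedObject()
assert obj.update_properties({"Name": "cube"}) == set()
assert obj.update_properties({"Name": "sphere", "Scale": 1.0}) == {"Name", "Scale"}
```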
def to_dict(self):
return {
x: getattr(self, x) for x in dir(self)
if not isinstance(getattr(self.__class__, x, None), property) and
not callable(getattr(self, x)) and not x.startswith("_")
}
return recordclass.asdict(self)


@@ -5,7 +5,6 @@ import enum
import math
import struct
import types
import typing
import weakref
from io import SEEK_CUR, SEEK_SET, SEEK_END, RawIOBase, BufferedIOBase
from typing import *
@@ -1092,15 +1091,6 @@ class IntEnum(Adapter):
return lambda: self.enum_cls(0)
def flags_to_pod(flag_cls: Type[enum.IntFlag], val: int) -> typing.Tuple[Union[str, int], ...]:
# Shove any bits not represented in the IntFlag into an int
left_over = val
for flag in iter(flag_cls):
left_over &= ~flag.value
extra = (int(left_over),) if left_over else ()
return tuple(flag.name for flag in iter(flag_cls) if val & flag.value) + extra
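`flags_to_pod` (being moved into `dtypes` here) turns an integer into flag names plus any leftover unrepresented bits. Restated standalone with a made-up flag class:

```python
import enum

class ObjFlags(enum.IntFlag):
    # Illustrative flags, not Hippolyzer's actual definitions
    PHYSICS = 0x1
    PHANTOM = 0x2

def flags_to_pod(flag_cls, val):
    # Shove any bits not represented in the IntFlag into a trailing int
    left_over = val
    for flag in flag_cls:
        left_over &= ~flag.value
    extra = (int(left_over),) if left_over else ()
    return tuple(f.name for f in flag_cls if val & f.value) + extra

assert flags_to_pod(ObjFlags, 0x3) == ("PHYSICS", "PHANTOM")
assert flags_to_pod(ObjFlags, 0x9) == ("PHYSICS", 8)
```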
class IntFlag(Adapter):
def __init__(self, flag_cls: Type[enum.IntFlag],
flag_spec: Optional[SerializablePrimitive] = None):
@@ -1121,7 +1111,7 @@ class IntFlag(Adapter):
def decode(self, val: Any, ctx: Optional[ParseContext], pod: bool = False) -> Any:
if pod:
return flags_to_pod(self.flag_cls, val)
return dtypes.flags_to_pod(self.flag_cls, val)
return self.flag_cls(val)
def default_value(self) -> Any:
@@ -1613,7 +1603,7 @@ class BufferedLLSDBinaryParser(llsd.HippoLLSDBinaryParser):
byte = self._getc()[0]
except IndexError:
byte = None
raise llsd.LLSDParseError("%s at byte %d: %s" % (message, self._index+offset, byte))
raise llsd.LLSDParseError("%s at byte %d: %s" % (message, self._index + offset, byte))
def _getc(self, num=1):
return self._buffer.read_bytes(num)
@@ -1641,8 +1631,14 @@ def subfield_serializer(msg_name, block_name, var_name):
return f
_ENUM_TYPE = TypeVar("_ENUM_TYPE", bound=Type[dtypes.IntEnum])
_FLAG_TYPE = TypeVar("_FLAG_TYPE", bound=Type[dtypes.IntFlag])
def enum_field_serializer(msg_name, block_name, var_name):
def f(orig_cls):
def f(orig_cls: _ENUM_TYPE) -> _ENUM_TYPE:
if not issubclass(orig_cls, dtypes.IntEnum):
raise ValueError(f"{orig_cls} must be a subclass of Hippolyzer's IntEnum class")
wrapper = subfield_serializer(msg_name, block_name, var_name)
wrapper(IntEnumSubfieldSerializer(orig_cls))
return orig_cls
@@ -1650,7 +1646,9 @@ def enum_field_serializer(msg_name, block_name, var_name):
def flag_field_serializer(msg_name, block_name, var_name):
def f(orig_cls):
def f(orig_cls: _FLAG_TYPE) -> _FLAG_TYPE:
if not issubclass(orig_cls, dtypes.IntFlag):
raise ValueError(f"{orig_cls!r} must be a subclass of Hippolyzer's IntFlag class")
wrapper = subfield_serializer(msg_name, block_name, var_name)
wrapper(IntFlagSubfieldSerializer(orig_cls))
return orig_cls
@@ -1703,7 +1701,7 @@ class BaseSubfieldSerializer(abc.ABC):
"""Guess at which template a val might correspond to"""
if dataclasses.is_dataclass(val):
val = dataclasses.asdict(val) # noqa
if isinstance(val, bytes):
if isinstance(val, (bytes, bytearray)):
template_checker = cls._template_sizes_match
elif isinstance(val, dict):
template_checker = cls._template_keys_match


@@ -22,11 +22,11 @@ Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
class Settings:
def __init__(self, quiet_logging=False, spammy_logging=False, log_tests=True):
""" some lovely configurable settings
""" some lovely configurable settings
These are applied application wide, and can be
overridden at any time in a specific instance
quiet_logging overrides spammy_logging
"""


@@ -1,3 +1,4 @@
import itertools
from pathlib import Path
import shutil
import sys
@@ -32,7 +33,6 @@ def setup_ca_everywhere(mitmproxy_master):
pass
except PermissionError:
pass
return valid_paths
@@ -42,7 +42,8 @@ def _viewer_config_dir_iter():
elif sys.platform == "darwin":
paths = (Path.home() / "Library" / "Application Support").iterdir()
elif sys.platform in ("win32", "msys", "cygwin"):
paths = (Path.home() / "AppData" / "Local").iterdir()
app_data = Path.home() / "AppData"
paths = itertools.chain((app_data / "Local").iterdir(), (app_data / "Roaming").iterdir())
else:
raise Exception("Unknown OS, can't locate viewer config dirs!")
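The fix above makes Windows scan both `AppData/Local` and `AppData/Roaming`, since viewers may keep settings in either. A hypothetical pure helper restating the per-OS dispatch (path arguments instead of `Path.home()`/`iterdir()` so it needs no real filesystem; the non-macOS/Windows branch is outside this hunk and omitted):

```python
from pathlib import Path

def viewer_config_parents(platform: str, home: Path):
    """Hypothetical helper mirroring the per-OS dispatch above."""
    if platform == "darwin":
        dirs = [home / "Library" / "Application Support"]
    elif platform in ("win32", "msys", "cygwin"):
        app_data = home / "AppData"
        # Viewers keep settings under either Local or Roaming, so scan both
        dirs = [app_data / "Local", app_data / "Roaming"]
    else:
        raise Exception("Unknown OS, can't locate viewer config dirs!")
    return dirs

parents = viewer_config_parents("win32", Path("/home/alice"))
assert parents == [Path("/home/alice/AppData/Local"), Path("/home/alice/AppData/Roaming")]
```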


@@ -14,7 +14,7 @@ from hippolyzer.lib.proxy.message import ProxiedMessage
if TYPE_CHECKING:
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import BaseMessageLogger
from hippolyzer.lib.proxy.message_logger import BaseMessageLogger
class ProxiedCircuit:


@@ -51,36 +51,39 @@ class MITMProxyEventManager:
self.llsd_message_serializer = LLSDMessageSerializer()
self._asset_server_proxied = False
async def pump_proxy_events(self):
async def run(self):
while not self.shutdown_signal.is_set():
try:
try:
event_type, flow_state = self.from_proxy_queue.get(False)
except queue.Empty:
await asyncio.sleep(0.001)
continue
flow = HippoHTTPFlow.from_state(flow_state, self.session_manager)
try:
if event_type == "request":
self._handle_request(flow)
# A response was injected early in the cycle, we won't get a response
# callback from mitmproxy so just log it now.
message_logger = self.session_manager.message_logger
if message_logger and flow.response_injected:
message_logger.log_http_response(flow)
elif event_type == "response":
self._handle_response(flow)
else:
raise Exception(f"Unknown mitmproxy event type {event_type}")
finally:
# If someone has taken this request out of the regular callback flow,
# they'll manually send a callback at some later time.
if not flow.taken:
self.to_proxy_queue.put(("callback", flow.id, flow.get_state()))
await self.pump_proxy_event()
except:
logging.exception("Exploded when handling parsed packets")
async def pump_proxy_event(self):
try:
event_type, flow_state = self.from_proxy_queue.get(False)
except queue.Empty:
await asyncio.sleep(0.001)
return
flow = HippoHTTPFlow.from_state(flow_state, self.session_manager)
try:
if event_type == "request":
self._handle_request(flow)
# A response was injected early in the cycle, we won't get a response
# callback from mitmproxy so just log it now.
message_logger = self.session_manager.message_logger
if message_logger and flow.response_injected:
message_logger.log_http_response(flow)
elif event_type == "response":
self._handle_response(flow)
else:
raise Exception(f"Unknown mitmproxy event type {event_type}")
finally:
# If someone has taken this request out of the regular callback flow,
# they'll manually send a callback at some later time.
if not flow.taken:
self.to_proxy_queue.put(("callback", flow.id, flow.get_state()))
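The refactor above splits the loop body into `pump_proxy_event()` so a single event can be pumped in tests. The core pattern is polling a thread-safe `queue.Queue` from asyncio without blocking the event loop; a self-contained sketch with illustrative names:

```python
import asyncio
import queue

async def pump(q: queue.Queue, shutdown: asyncio.Event, handle):
    # get(False) never blocks the loop; queue.Empty triggers a short
    # async sleep so other tasks keep running between polls.
    while not shutdown.is_set():
        try:
            item = q.get(False)
        except queue.Empty:
            await asyncio.sleep(0.001)
            continue
        handle(item)

q = queue.Queue()
for i in range(3):
    q.put(i)
seen = []

async def main():
    shutdown = asyncio.Event()
    task = asyncio.create_task(pump(q, shutdown, seen.append))
    await asyncio.sleep(0.05)
    shutdown.set()
    await task

asyncio.run(main())
assert seen == [0, 1, 2]
```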
def _handle_request(self, flow: HippoHTTPFlow):
url = flow.request.url
cap_data = self.session_manager.resolve_cap(url)
@@ -118,11 +121,14 @@ class MITMProxyEventManager:
else:
flow.response = mitmproxy.http.HTTPResponse.make(
307,
b"Redirecting...",
# Can't provide explanation in the body because this results in failing Range requests under
# mitmproxy that return garbage data. Chances are there's weird interactions
# between HTTP/1.x pipelining and range requests under mitmproxy that no other
# applications have hit. If that's a concern then Connection: close should be used.
b"",
{
"Content-Type": "text/plain",
"Connection": "keep-alive",
"Location": redir_url,
"Connection": "close",
}
)
elif cap_data and cap_data.asset_server_cap:


@@ -5,7 +5,6 @@ import multiprocessing
import os
import re
import sys
import pkg_resources
import queue
import typing
import uuid
@@ -20,6 +19,7 @@ from mitmproxy.addons import core, clientplayback
from mitmproxy.http import HTTPFlow
import OpenSSL
from hippolyzer.lib.base.helpers import get_resource_filename
from hippolyzer.lib.base.multiprocessing_utils import ParentProcessWatcher
orig_sethostflags = OpenSSL.SSL._lib.X509_VERIFY_PARAM_set_hostflags # noqa
@@ -230,7 +230,7 @@ def create_proxy_master(host, port, flow_context: HTTPFlowContext): # pragma: n
os.path.join(opts.confdir, "config.yml"),
)
# Use SL's CA bundle so LL's CA certs won't cause verification errors
ca_bundle = pkg_resources.resource_filename("hippolyzer.lib.base", "network/data/ca-bundle.crt")
ca_bundle = get_resource_filename("lib/base/network/data/ca-bundle.crt")
opts.update(
ssl_verify_upstream_trusted_ca=ca_bundle,
listen_host=host,
@@ -249,9 +249,9 @@ def create_http_proxy(bind_host, port, flow_context: HTTPFlowContext): # pragma
def is_asset_server_cap_name(cap_name):
return cap_name and (
cap_name.startswith("GetMesh") or
cap_name.startswith("GetTexture") or
cap_name.startswith("ViewerAsset")
cap_name.startswith("GetMesh")
or cap_name.startswith("GetTexture")
or cap_name.startswith("ViewerAsset")
)
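Restated standalone (`bool()` added so the function can't return a falsy string, mirroring the original's truthiness semantics):

```python
def is_asset_server_cap_name(cap_name):
    # Caps whose traffic is bulk asset fetches (mesh, textures, assets)
    return bool(cap_name) and (
        cap_name.startswith("GetMesh")
        or cap_name.startswith("GetTexture")
        or cap_name.startswith("ViewerAsset")
    )

assert is_asset_server_cap_name("GetTexture")
assert is_asset_server_cap_name("ViewerAsset")
assert not is_asset_server_cap_name("EventQueueGet")
assert not is_asset_server_cap_name(None)
```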


@@ -129,13 +129,14 @@ class InterceptingLLUDPProxyProtocol(BaseLLUDPProxyProtocol):
LOG.exception("Failed in region message handler")
message_logger = self.session_manager.message_logger
if message_logger:
message_logger.log_lludp_message(self.session, region, message)
handled = AddonManager.handle_lludp_message(
self.session, region, message
)
if message_logger:
message_logger.log_lludp_message(self.session, region, message)
if handled:
return


@@ -71,6 +71,14 @@ def proxy_eval(eval_str: str, globals_=None, locals_=None):
)
TextSpan = Tuple[int, int]
SpanDict = Dict[Tuple[Union[str, int], ...], TextSpan]
class SpannedString(str):
spans: SpanDict = {}
class ProxiedMessage(Message):
__slots__ = ("meta", "injected", "dropped", "direction")
@@ -83,9 +91,10 @@ class ProxiedMessage(Message):
_maybe_reload_templates()
def to_human_string(self, replacements=None, beautify=False,
template: Optional[MessageTemplate] = None):
template: Optional[MessageTemplate] = None) -> SpannedString:
replacements = replacements or {}
_maybe_reload_templates()
spans: SpanDict = {}
string = ""
if self.direction is not None:
string += f'{self.direction.name} '
@@ -101,11 +110,18 @@ class ProxiedMessage(Message):
block_suffix = ""
if template and template.get_block(block_name).block_type == MsgBlockType.MBT_VARIABLE:
block_suffix = ' # Variable'
for block in block_list:
for block_num, block in enumerate(block_list):
string += f"[{block_name}]{block_suffix}\n"
for var_name, val in block.items():
start_len = len(string)
string += self._format_var(block, var_name, val, replacements, beautify)
return string
end_len = len(string)
# Store the spans for each var so we can highlight specific matches
spans[(self.name, block_name, block_num, var_name)] = (start_len, end_len)
string += "\n"
spanned = SpannedString(string)
spanned.spans = spans
return spanned
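`SpannedString` works because attributes can ride along on a `str` subclass instance; the span dict maps `(msg, block, block_num, var)` keys to `(start, end)` offsets in the rendered text for later highlighting. A minimal sketch (message/block/var names here are illustrative):

```python
from typing import Dict, Tuple

TextSpan = Tuple[int, int]

class SpannedString(str):
    # str subclass so the rendered text can carry per-variable spans
    spans: Dict[tuple, TextSpan] = {}

string = "OUT ChatFromViewer\n"
spans = {}
start = len(string)
string += "  Channel = 0"
# Record where this var's text lives so matches can be highlighted
spans[("ChatFromViewer", "ChatData", 0, "Channel")] = (start, len(string))
spanned = SpannedString(string)
spanned.spans = spans
assert spanned.startswith("OUT ChatFromViewer")
assert spanned.spans[("ChatFromViewer", "ChatData", 0, "Channel")] == (19, 32)
```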
def _format_var(self, block, var_name, var_val, replacements, beautify=False):
string = ""
@@ -129,7 +145,7 @@ class ProxiedMessage(Message):
if serializer.AS_HEX and isinstance(var_val, int):
var_data = hex(var_val)
if serializer.ORIG_INLINE:
string += f" #{var_data}\n"
string += f" #{var_data}"
return string
else:
string += "\n"
@@ -146,7 +162,7 @@ class ProxiedMessage(Message):
if "CircuitCode" in var_name or ("Code" in var_name and "Circuit" in block.name):
if var_val == replacements.get("CIRCUIT_CODE"):
var_data = "[[CIRCUIT_CODE]]"
string += f" {field_prefix}{var_name} = {var_data}\n"
string += f" {field_prefix}{var_name} = {var_data}"
return string
@staticmethod


@@ -3,28 +3,30 @@ import ast
import typing
from arpeggio import Optional, ZeroOrMore, EOF, \
ParserPython, PTNodeVisitor, visit_parse_tree
from arpeggio import RegExMatch as _
ParserPython, PTNodeVisitor, visit_parse_tree, RegExMatch
def literal():
return [
# Nightmare. str or bytes literal.
# https://stackoverflow.com/questions/14366401/#comment79795017_14366904
_(r'''b?(\"\"\"|\'\'\'|\"|\')((?<!\\)(\\\\)*\\\1|.)*?\1'''),
_(r'\d+(\.\d+)?'),
RegExMatch(r'''b?(\"\"\"|\'\'\'|\"|\')((?<!\\)(\\\\)*\\\1|.)*?\1'''),
# base16
RegExMatch(r'0x\d+'),
# base10 int or float.
RegExMatch(r'\d+(\.\d+)?'),
"None",
"True",
"False",
# vector3 (tuple)
_(r'\(\s*\d+(\.\d+)?\s*,\s*\d+(\.\d+)?\s*,\s*\d+(\.\d+)?\s*\)'),
RegExMatch(r'\(\s*\d+(\.\d+)?\s*,\s*\d+(\.\d+)?\s*,\s*\d+(\.\d+)?\s*\)'),
# vector4 (tuple)
_(r'\(\s*\d+(\.\d+)?\s*,\s*\d+(\.\d+)?\s*,\s*\d+(\.\d+)?\s*,\s*\d+(\.\d+)?\s*\)'),
RegExMatch(r'\(\s*\d+(\.\d+)?\s*,\s*\d+(\.\d+)?\s*,\s*\d+(\.\d+)?\s*,\s*\d+(\.\d+)?\s*\)'),
]
def identifier():
return _(r'[a-zA-Z*]([a-zA-Z0-9*]+)?')
return RegExMatch(r'[a-zA-Z*]([a-zA-Z0-9_*]+)?')
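The `RegExMatch` patterns above are plain regexes, so they can be exercised directly with the stdlib `re` module. A quick sanity check of the literal rules (note the base16 pattern, `0x\d+` as written, accepts only decimal digits after the `0x` prefix):

```python
import re

# The string-literal and number patterns from the grammar above
STR_LIT = re.compile(r'''b?(\"\"\"|\'\'\'|\"|\')((?<!\\)(\\\\)*\\\1|.)*?\1''')
HEX_LIT = re.compile(r'0x\d+')
NUM_LIT = re.compile(r'\d+(\.\d+)?')

assert STR_LIT.fullmatch('"hello"')
assert STR_LIT.fullmatch("b'payload'")
assert NUM_LIT.fullmatch("3.14")
assert not NUM_LIT.fullmatch("abc")
# Base16 as written misses hex digits a-f
assert HEX_LIT.fullmatch("0x123")
assert not HEX_LIT.fullmatch("0xff")
```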
def field_specifier():
@@ -43,12 +45,16 @@ def meta_field_specifier():
return "Meta", ".", identifier
def enum_field_specifier():
return identifier, ".", identifier
def compare_val():
return [literal, meta_field_specifier]
return [literal, meta_field_specifier, enum_field_specifier]
def binary_expression():
return field_specifier, ["==", "!=", "^=", "$=", "~=", ">", ">=", "<", "<="], compare_val
return field_specifier, ["==", "!=", "^=", "$=", "~=", ">", ">=", "<", "<=", "&"], compare_val
def term():
@@ -63,9 +69,12 @@ def message_filter():
return expression, EOF
MATCH_RESULT = typing.Union[bool, typing.Tuple]
class BaseFilterNode(abc.ABC):
@abc.abstractmethod
def match(self, msg) -> bool:
def match(self, msg) -> MATCH_RESULT:
raise NotImplementedError()
@property
@@ -95,17 +104,17 @@ class BinaryFilterNode(BaseFilterNode, abc.ABC):
class UnaryNotFilterNode(UnaryFilterNode):
def match(self, msg) -> bool:
def match(self, msg) -> MATCH_RESULT:
return not self.node.match(msg)
class OrFilterNode(BinaryFilterNode):
def match(self, msg) -> bool:
def match(self, msg) -> MATCH_RESULT:
return self.left_node.match(msg) or self.right_node.match(msg)
class AndFilterNode(BinaryFilterNode):
def match(self, msg) -> bool:
def match(self, msg) -> MATCH_RESULT:
return self.left_node.match(msg) and self.right_node.match(msg)
@@ -115,7 +124,7 @@ class MessageFilterNode(BaseFilterNode):
self.operator = operator
self.value = value
def match(self, msg) -> bool:
def match(self, msg) -> MATCH_RESULT:
return msg.matches(self)
@property
@@ -127,6 +136,11 @@ class MetaFieldSpecifier(str):
pass
class EnumFieldSpecifier(typing.NamedTuple):
enum_name: str
field_name: str
class LiteralValue:
"""Only exists because we can't return `None` in a visitor, need to box it"""
def __init__(self, value):
@@ -134,23 +148,26 @@ class LiteralValue:
class MessageFilterVisitor(PTNodeVisitor):
def visit_identifier(self, node, children):
def visit_identifier(self, node, _children):
return str(node.value)
def visit_field_specifier(self, node, children):
def visit_field_specifier(self, _node, children):
return children
def visit_literal(self, node, children):
def visit_literal(self, node, _children):
return LiteralValue(ast.literal_eval(node.value))
def visit_meta_field_specifier(self, node, children):
def visit_meta_field_specifier(self, _node, children):
return MetaFieldSpecifier(children[0])
def visit_unary_field_specifier(self, node, children):
def visit_enum_field_specifier(self, _node, children):
return EnumFieldSpecifier(*children)
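The visitor unpacks the two parsed identifiers positionally into the `NamedTuple`. Standalone (the `"PCode"`/`"AVATAR"` values are illustrative, not taken from the grammar):

```python
import typing

class EnumFieldSpecifier(typing.NamedTuple):
    enum_name: str
    field_name: str

# visit_enum_field_specifier receives the two identifier children in order
spec = EnumFieldSpecifier(*["PCode", "AVATAR"])
assert spec.enum_name == "PCode"
assert spec.field_name == "AVATAR"
```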
def visit_unary_field_specifier(self, _node, children):
# Looks like a bare field specifier with no operator
return MessageFilterNode(tuple(children), None, None)
def visit_unary_expression(self, node, children):
def visit_unary_expression(self, _node, children):
if len(children) == 1:
if isinstance(children[0], BaseFilterNode):
return children[0]
@@ -162,10 +179,10 @@ class MessageFilterVisitor(PTNodeVisitor):
else:
raise ValueError(f"Unrecognized unary prefix {children[0]}")
def visit_binary_expression(self, node, children):
def visit_binary_expression(self, _node, children):
return MessageFilterNode(tuple(children[0]), children[1], children[2])
def visit_expression(self, node, children):
def visit_expression(self, _node, children):
if self.debug:
print("Expression {}".format(children))
if len(children) > 1:


@@ -0,0 +1,638 @@
from __future__ import annotations
import collections
import copy
import fnmatch
import io
import logging
import pickle
import re
import typing
import weakref
from defusedxml import minidom
from hippolyzer.lib.base import serialization as se, llsd
from hippolyzer.lib.base.datatypes import TaggedUnion, UUID, TupleCoord
from hippolyzer.lib.base.helpers import bytes_escape
from hippolyzer.lib.proxy.message_filter import MetaFieldSpecifier, compile_filter, BaseFilterNode, MessageFilterNode, \
EnumFieldSpecifier
from hippolyzer.lib.proxy.region import CapType
if typing.TYPE_CHECKING:
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
LOG = logging.getLogger(__name__)
class BaseMessageLogger:
def log_lludp_message(self, session: Session, region: ProxiedRegion, message: ProxiedMessage):
pass
def log_http_response(self, flow: HippoHTTPFlow):
pass
def log_eq_event(self, session: Session, region: ProxiedRegion, event: dict):
pass
class FilteringMessageLogger(BaseMessageLogger):
def __init__(self):
BaseMessageLogger.__init__(self)
self._raw_entries = collections.deque(maxlen=2000)
self._filtered_entries: typing.List[AbstractMessageLogEntry] = []
self._paused = False
self.filter: BaseFilterNode = compile_filter("")
def set_filter(self, filter_str: str):
self.filter = compile_filter(filter_str)
self._begin_reset()
# Keep any entries that've aged out of the raw entries list that
# match the new filter
self._filtered_entries = [
m for m in self._filtered_entries if
m not in self._raw_entries and self.filter.match(m)
]
self._filtered_entries.extend((m for m in self._raw_entries if self.filter.match(m)))
self._end_reset()
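The bounded `deque(maxlen=...)` plus refilter logic above means old matches survive a filter change even after they age out of the raw buffer. The mechanics, sketched with integers standing in for log entries:

```python
import collections

# _raw_entries analogue: bounded deque, oldest entries age out automatically
raw = collections.deque(maxlen=3)
filtered = []

def add(entry, pred):
    raw.append(entry)
    if pred(entry):
        filtered.append(entry)

for n in range(6):
    add(n, lambda e: e % 2 == 0)
assert list(raw) == [3, 4, 5]
assert filtered == [0, 2, 4]

# set_filter analogue: keep aged-out entries that still match the new
# filter, then re-scan the raw deque against it
new_pred = lambda e: e > 1
filtered = [m for m in filtered if m not in raw and new_pred(m)]
filtered.extend(m for m in raw if new_pred(m))
assert filtered == [2, 3, 4, 5]
```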
def set_paused(self, paused: bool):
self._paused = paused
def log_lludp_message(self, session: Session, region: ProxiedRegion, message: ProxiedMessage):
if self._paused:
return
self._add_log_entry(LLUDPMessageLogEntry(message, region, session))
def log_http_response(self, flow: HippoHTTPFlow):
if self._paused:
return
# These are huge, let's not log them for now.
if flow.cap_data and flow.cap_data.asset_server_cap:
return
self._add_log_entry(HTTPMessageLogEntry(flow))
def log_eq_event(self, session: Session, region: ProxiedRegion, event: dict):
if self._paused:
return
self._add_log_entry(EQMessageLogEntry(event, region, session))
# Hooks that Qt models will want to implement
def _begin_insert(self, insert_idx: int):
pass
def _end_insert(self):
pass
def _begin_reset(self):
pass
def _end_reset(self):
pass
def _add_log_entry(self, entry: AbstractMessageLogEntry):
try:
# Paused, throw it away.
if self._paused:
return
self._raw_entries.append(entry)
if self.filter.match(entry):
next_idx = len(self._filtered_entries)
self._begin_insert(next_idx)
self._filtered_entries.append(entry)
self._end_insert()
entry.cache_summary()
# In the common case we don't need to keep around the serialization
# caches anymore. If the filter changes, the caches will be repopulated
# as necessary.
entry.freeze()
except Exception:
LOG.exception("Failed to filter queued message")
def clear(self):
self._begin_reset()
self._filtered_entries.clear()
self._raw_entries.clear()
self._end_reset()
class AbstractMessageLogEntry:
region: typing.Optional[ProxiedRegion]
session: typing.Optional[Session]
name: str
type: str
__slots__ = ["_region", "_session", "_region_name", "_agent_id", "_summary", "meta"]
def __init__(self, region, session):
if region and not isinstance(region, weakref.ReferenceType):
region = weakref.ref(region)
if session and not isinstance(session, weakref.ReferenceType):
session = weakref.ref(session)
self._region: typing.Optional[weakref.ReferenceType] = region
self._session: typing.Optional[weakref.ReferenceType] = session
self._region_name = None
self._agent_id = None
self._summary = None
if self.region:
self._region_name = self.region.name
if self.session:
self._agent_id = self.session.agent_id
agent_obj = None
if self.region is not None:
agent_obj = self.region.objects.lookup_fullid(self.agent_id)
self.meta = {
"RegionName": self.region_name,
"AgentID": self.agent_id,
"SessionID": self.session.id if self.session else None,
"AgentLocal": agent_obj.LocalID if agent_obj is not None else None,
"Method": self.method,
"Type": self.type,
"SelectedLocal": self._current_selected_local(),
"SelectedFull": self._current_selected_full(),
}
def freeze(self):
pass
def cache_summary(self):
self._summary = self.summary
def _current_selected_local(self):
if self.session:
return self.session.selected.object_local
return None
def _current_selected_full(self):
selected_local = self._current_selected_local()
if selected_local is None or self.region is None:
return None
obj = self.region.objects.lookup_localid(selected_local)
return obj and obj.FullID
def _get_meta(self, name: str):
# Slight difference in semantics. Filters are meant to return the same
# thing no matter when they're run, so SelectedLocal and friends resolve
# to the selected items _at the time the message was logged_. To handle
# the case where we want to match on the selected object at the time the
# filter is evaluated, we resolve these here.
if name == "CurrentSelectedLocal":
return self._current_selected_local()
elif name == "CurrentSelectedFull":
return self._current_selected_full()
return self.meta.get(name)
@property
def region(self) -> typing.Optional[ProxiedRegion]:
if self._region:
return self._region()
return None
@property
def session(self) -> typing.Optional[Session]:
if self._session:
return self._session()
return None
@property
def region_name(self) -> str:
region = self.region
if region:
self._region_name = region.name
return self._region_name
# Region may die after a message is logged, need to keep this around.
if self._region_name:
return self._region_name
return ""
@property
def agent_id(self) -> typing.Optional[UUID]:
if self._agent_id:
return self._agent_id
session = self.session
if session:
self._agent_id = session.agent_id
return self._agent_id
return None
@property
def host(self) -> str:
region_name = self.region_name
if not region_name:
return ""
session_str = ""
agent_id = self.agent_id
if agent_id:
session_str = f" ({agent_id})"
return region_name + session_str
def request(self, beautify=False, replacements=None):
return None
def response(self, beautify=False):
return None
def _packet_root_matches(self, pattern):
if fnmatch.fnmatchcase(self.name, pattern):
return True
if fnmatch.fnmatchcase(self.type, pattern):
return True
return False
def _val_matches(self, operator, val, expected):
if isinstance(expected, MetaFieldSpecifier):
expected = self._get_meta(str(expected))
if not isinstance(expected, (int, float, bytes, str, type(None), tuple)):
if callable(expected):
expected = expected()
else:
expected = str(expected)
elif isinstance(expected, EnumFieldSpecifier):
# Local import so we get a fresh copy of the templates module
from hippolyzer.lib.proxy import templates
enum_cls = getattr(templates, expected.enum_name)
expected = enum_cls[expected.field_name]
elif expected is not None:
# Unbox the expected value
expected = expected.value
if not isinstance(val, (int, float, bytes, str, type(None), tuple, TupleCoord)):
val = str(val)
if not operator:
return bool(val)
elif operator == "==":
return val == expected
elif operator == "!=":
return val != expected
elif operator == "^=":
if val is None:
return False
return val.startswith(expected)
elif operator == "$=":
if val is None:
return False
return val.endswith(expected)
elif operator == "~=":
if val is None:
return False
return expected in val
elif operator == "<":
return val < expected
elif operator == "<=":
return val <= expected
elif operator == ">":
return val > expected
elif operator == ">=":
return val >= expected
elif operator == "&":
return val & expected
else:
raise ValueError(f"Unexpected operator {operator!r}")
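A standalone subset of the operator dispatch (just the string and bitwise operators, without the Meta/enum unboxing that precedes it above):

```python
def val_matches(operator, val, expected):
    # Subset of the filter language's comparison operators
    if not operator:
        return bool(val)
    if operator == "==":
        return val == expected
    if operator == "^=":
        return val is not None and val.startswith(expected)
    if operator == "$=":
        return val is not None and val.endswith(expected)
    if operator == "~=":
        return val is not None and expected in val
    if operator == "&":
        return val & expected
    raise ValueError(f"Unexpected operator {operator!r}")

assert val_matches("^=", "GetTexture", "Get")
assert val_matches("$=", "GetTexture", "Texture")
assert val_matches("&", 0b1010, 0b0010) == 2
assert not val_matches("==", "a", "b")
```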
def _base_matches(self, matcher: "MessageFilterNode") -> typing.Optional[bool]:
if len(matcher.selector) == 1:
# Comparison operators would make no sense here
if matcher.value or matcher.operator:
return False
return self._packet_root_matches(matcher.selector[0])
if len(matcher.selector) == 2 and matcher.selector[0] == "Meta":
return self._val_matches(matcher.operator, self._get_meta(matcher.selector[1]), matcher.value)
return None
def matches(self, matcher: "MessageFilterNode"):
return self._base_matches(matcher) or False
@property
def seq(self):
return ""
@property
def method(self):
return ""
@property
def summary(self):
return ""
@staticmethod
def _format_llsd(parsed):
xmlified = llsd.format_pretty_xml(parsed)
# dedent <key> by 1 for easier visual scanning
xmlified = re.sub(rb" <key>", b"<key>", xmlified)
return xmlified.decode("utf8", errors="replace")
class HTTPMessageLogEntry(AbstractMessageLogEntry):
__slots__ = ["flow"]
def __init__(self, flow: HippoHTTPFlow):
self.flow: HippoHTTPFlow = flow
cap_data = self.flow.cap_data
region = cap_data and cap_data.region
session = cap_data and cap_data.session
super().__init__(region, session)
# This was a request the proxy made through itself
self.meta["Injected"] = flow.request_injected
@property
def type(self):
return "HTTP"
@property
def name(self):
cap_data = self.flow.cap_data
name = cap_data and cap_data.cap_name
if name:
return name
return self.flow.request.url
@property
def method(self):
return self.flow.request.method
def _format_http_message(self, want_request, beautify):
message = self.flow.request if want_request else self.flow.response
method = self.flow.request.method
buf = io.StringIO()
cap_data = self.flow.cap_data
cap_name = cap_data and cap_data.cap_name
base_url = cap_name and cap_data.base_url
temporary_cap = cap_data and cap_data.type == CapType.TEMPORARY
beautify_url = (beautify and base_url and cap_name
and not temporary_cap and self.session and want_request)
if want_request:
buf.write(message.method)
buf.write(" ")
if beautify_url:
buf.write(f"[[{cap_name}]]{message.url[len(base_url):]}")
else:
buf.write(message.url)
buf.write(" ")
buf.write(message.http_version)
else:
buf.write(message.http_version)
buf.write(" ")
buf.write(str(message.status_code))
buf.write(" ")
buf.write(message.reason)
buf.write("\r\n")
if beautify_url:
buf.write("# ")
buf.write(message.url)
buf.write("\r\n")
headers = copy.deepcopy(message.headers)
for key in tuple(headers.keys()):
if key.lower().startswith("x-hippo-"):
LOG.warning(f"Internal header {key!r} leaked out?")
# If this header actually came from somewhere untrusted, we can't
# include it. It may change the meaning of the message when replayed.
headers[f"X-Untrusted-{key}"] = headers[key]
headers.pop(key)
beautified = None
if beautify and message.content:
try:
serializer = se.HTTP_SERIALIZERS.get(cap_name)
if serializer:
if want_request:
beautified = serializer.deserialize_req_body(method, message.content)
else:
beautified = serializer.deserialize_resp_body(method, message.content)
if beautified is se.UNSERIALIZABLE:
beautified = None
else:
beautified = self._format_llsd(beautified)
headers["X-Hippo-Beautify"] = "1"
if not beautified:
content_type = self._guess_content_type(message)
if content_type.startswith("application/llsd"):
beautified = self._format_llsd(llsd.parse(message.content))
elif any(content_type.startswith(x) for x in ("application/xml", "text/xml")):
beautified = minidom.parseString(message.content).toprettyxml(indent=" ")
# kill blank lines. will break cdata sections. meh.
beautified = re.sub(r'\n\s*\n', '\n', beautified, flags=re.MULTILINE)
beautified = re.sub(r'<([\w]+)>\s*</\1>', r'<\1></\1>',
beautified, flags=re.MULTILINE)
except:
LOG.exception("Failed to beautify message")
message_body = beautified or message.content
if isinstance(message_body, bytes):
try:
decoded = message.text
# Valid in many codecs, but unprintable.
if "\x00" in decoded:
raise ValueError("Embedded null")
message_body = decoded
except (UnicodeError, ValueError):
# non-printable characters, return the escaped version.
headers["X-Hippo-Escaped-Body"] = "1"
message_body = bytes_escape(message_body).decode("utf8")
buf.write(bytes(headers).decode("utf8", errors="replace"))
buf.write("\r\n")
buf.write(message_body)
return buf.getvalue()
def request(self, beautify=False, replacements=None):
return self._format_http_message(want_request=True, beautify=beautify)
def response(self, beautify=False):
return self._format_http_message(want_request=False, beautify=beautify)
@property
def summary(self):
if self._summary is not None:
return self._summary
msg = self.flow.response
self._summary = f"{msg.status_code}: "
if not msg.content:
return self._summary
if len(msg.content) > 1000000:
self._summary += "[too large...]"
return self._summary
content_type = self._guess_content_type(msg)
if content_type.startswith("application/llsd"):
notation = llsd.format_notation(llsd.parse(msg.content))
self._summary += notation.decode("utf8")[:500]
return self._summary
def _guess_content_type(self, message):
content_type = message.headers.get("Content-Type", "")
if not message.content or content_type.startswith("application/llsd"):
return content_type
# Sometimes gets sent with `text/plain` or `text/html`. Cool.
if message.content.startswith(rb'<?xml version="1.0" ?><llsd>'):
return "application/llsd+xml"
if message.content.startswith(rb'<llsd>'):
return "application/llsd+xml"
if message.content.startswith(rb'<?xml '):
return "application/xml"
return content_type
class EQMessageLogEntry(AbstractMessageLogEntry):
__slots__ = ["event"]
def __init__(self, event, region, session):
super().__init__(region, session)
self.event = event
@property
def type(self):
return "EQ"
def request(self, beautify=False, replacements=None):
return self._format_llsd(self.event["body"])
@property
def name(self):
return self.event["message"]
@property
def summary(self):
if self._summary is not None:
return self._summary
self._summary = ""
self._summary = llsd.format_notation(self.event["body"]).decode("utf8")[:500]
return self._summary
class LLUDPMessageLogEntry(AbstractMessageLogEntry):
__slots__ = ["_message", "_name", "_direction", "_frozen_message", "_seq", "_deserializer"]
def __init__(self, message: ProxiedMessage, region, session):
self._message: ProxiedMessage = message
self._deserializer = None
self._name = message.name
self._direction = message.direction
self._frozen_message: typing.Optional[bytes] = None
self._seq = message.packet_id
super().__init__(region, session)
_MESSAGE_META_ATTRS = {
"Injected", "Dropped", "Extra", "Resent", "Zerocoded", "Acks", "Reliable",
}
def _get_meta(self, name: str):
# These may change between when the message is logged and when we
# actually filter on it, since logging happens before addons.
msg = self.message
if name in self._MESSAGE_META_ATTRS:
return getattr(msg, name.lower(), None)
msg_meta = getattr(msg, "meta", None)
if msg_meta is not None:
if name in msg_meta:
return msg_meta[name]
return super()._get_meta(name)
@property
def message(self):
if self._message:
return self._message
elif self._frozen_message:
message = pickle.loads(self._frozen_message)
message.deserializer = self._deserializer
return message
else:
raise ValueError("Didn't have a fresh or frozen message somehow")
def freeze(self):
self.message.invalidate_caches()
# These are expensive to keep around. pickle them and un-pickle on
# an as-needed basis.
self._deserializer = self.message.deserializer
self._frozen_message = pickle.dumps(self._message, protocol=pickle.HIGHEST_PROTOCOL)
self._message = None
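The freeze pattern above (serialize the expensive live message, drop the reference, lazily unpickle on access) can be sketched in miniature. `LogEntry` here is a stand-in class invented for the example, not hippolyzer's `LLUDPMessageLogEntry`, and it skips the deserializer bookkeeping:

```python
import pickle
from typing import Optional


class LogEntry:
    def __init__(self, message):
        self._message = message
        self._frozen: Optional[bytes] = None

    def freeze(self):
        # Serialize the expensive live object and drop the strong reference
        self._frozen = pickle.dumps(self._message, protocol=pickle.HIGHEST_PROTOCOL)
        self._message = None

    @property
    def message(self):
        if self._message is not None:
            return self._message
        if self._frozen is not None:
            # Thaw on demand; callers pay the unpickle cost only when needed
            return pickle.loads(self._frozen)
        raise ValueError("Didn't have a fresh or frozen message somehow")


entry = LogEntry({"name": "ObjectUpdate"})
entry.freeze()
```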
@property
def type(self):
return "LLUDP"
@property
def name(self):
if self._message:
self._name = self._message.name
return self._name
@property
def method(self):
if self._message:
self._direction = self._message.direction
return self._direction.name if self._direction is not None else ""
def request(self, beautify=False, replacements=None):
return self.message.to_human_string(replacements, beautify)
def matches(self, matcher):
base_matched = self._base_matches(matcher)
if base_matched is not None:
return base_matched
if not self._packet_root_matches(matcher.selector[0]):
return False
message = self.message
selector_len = len(matcher.selector)
# name, block_name, var_name(, subfield_name)?
if selector_len not in (3, 4):
return False
for block_name in message.blocks:
if not fnmatch.fnmatchcase(block_name, matcher.selector[1]):
continue
for block_num, block in enumerate(message[block_name]):
for var_name in block.vars.keys():
if not fnmatch.fnmatchcase(var_name, matcher.selector[2]):
continue
# So we know where the match happened
span_key = (message.name, block_name, block_num, var_name)
if selector_len == 3:
# We're just matching on the var existing, not having any particular value
if matcher.value is None:
return span_key
if self._val_matches(matcher.operator, block[var_name], matcher.value):
return span_key
# Need to invoke a special unpacker
elif selector_len == 4:
try:
deserialized = block.deserialize_var(var_name)
except KeyError:
continue
# Discard the tag if this is a tagged union, we only want the value
if isinstance(deserialized, TaggedUnion):
deserialized = deserialized.value
if not isinstance(deserialized, dict):
return False
for key in deserialized.keys():
if fnmatch.fnmatchcase(str(key), matcher.selector[3]):
if matcher.value is None:
return span_key
if self._val_matches(matcher.operator, deserialized[key], matcher.value):
return span_key
return False
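The selector walk above glob-matches each level (`name, block_name, var_name`) with `fnmatch.fnmatchcase` and returns a "span key" identifying where the match happened. A minimal sketch of that idea, using plain dicts in place of hippolyzer's message objects (names and shapes here are assumptions for illustration only):

```python
import fnmatch


def match_selector(message: dict, selector: tuple):
    msg_name_pat, block_pat, var_pat = selector
    if not fnmatch.fnmatchcase(message["name"], msg_name_pat):
        return False
    for block_name, blocks in message["blocks"].items():
        if not fnmatch.fnmatchcase(block_name, block_pat):
            continue
        for block_num, block in enumerate(blocks):
            for var_name in block:
                if fnmatch.fnmatchcase(var_name, var_pat):
                    # Span key tells the caller exactly where the match happened
                    return (message["name"], block_name, block_num, var_name)
    return False


msg = {"name": "ObjectUpdate", "blocks": {"ObjectData": [{"FullID": "abc"}]}}
span = match_selector(msg, ("Object*", "ObjectData", "Full*"))
```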
@property
def summary(self):
if self._summary is None:
self._summary = self.message.to_summary()[:500]
return self._summary
@property
def seq(self):
if self._message:
self._seq = self._message.packet_id
return self._seq

View File

@@ -0,0 +1,41 @@
from __future__ import annotations
import dataclasses
from typing import *
from hippolyzer.lib.base.datatypes import UUID
if TYPE_CHECKING:
from hippolyzer.lib.proxy.message import ProxiedMessage
@dataclasses.dataclass
class NameCacheEntry:
FirstName: Optional[str] = None
LastName: Optional[str] = None
DisplayName: Optional[str] = None
class NameCache:
# TODO: persist this somewhere across runs
def __init__(self):
self._cache: Dict[UUID, NameCacheEntry] = {}
def lookup(self, uuid: UUID) -> Optional[NameCacheEntry]:
return self._cache.get(uuid)
def update(self, uuid: UUID, vals: dict):
# upsert the cache entry
entry = self._cache.get(uuid) or NameCacheEntry()
entry.LastName = vals.get("LastName") or entry.LastName
entry.FirstName = vals.get("FirstName") or entry.FirstName
entry.DisplayName = vals.get("DisplayName") or entry.DisplayName
self._cache[uuid] = entry
def handle_uuid_name_reply(self, msg: ProxiedMessage):
"""UUID lookup reply handler to be registered by regions"""
for block in msg.blocks["UUIDNameBlock"]:
self.update(block["ID"], {
"FirstName": block["FirstName"],
"LastName": block["LastName"],
})
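The upsert semantics of `NameCache.update` (new values win, but missing or falsy keys keep the old value) can be exercised with a self-contained sketch. It mirrors the class above but swaps hippolyzer's `UUID` keys for plain strings, which is an assumption purely for the example:

```python
import dataclasses
from typing import Dict, Optional


@dataclasses.dataclass
class NameCacheEntry:
    FirstName: Optional[str] = None
    LastName: Optional[str] = None
    DisplayName: Optional[str] = None


class NameCache:
    def __init__(self):
        self._cache: Dict[str, NameCacheEntry] = {}

    def lookup(self, key: str) -> Optional[NameCacheEntry]:
        return self._cache.get(key)

    def update(self, key: str, vals: dict):
        # Upsert: prefer the incoming value, fall back to what we had
        entry = self._cache.get(key) or NameCacheEntry()
        entry.FirstName = vals.get("FirstName") or entry.FirstName
        entry.LastName = vals.get("LastName") or entry.LastName
        entry.DisplayName = vals.get("DisplayName") or entry.DisplayName
        self._cache[key] = entry


cache = NameCache()
cache.update("av-1", {"FirstName": "Salad", "LastName": "Dais"})
cache.update("av-1", {"DisplayName": "salad"})  # merges; legacy name survives
entry = cache.lookup("av-1")
```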

View File

@@ -2,13 +2,15 @@ from __future__ import annotations
import collections
import copy
import enum
import logging
import math
import typing
import weakref
from typing import *
from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.datatypes import UUID, TaggedUnion
from hippolyzer.lib.base.datatypes import UUID, TaggedUnion, Vector3
from hippolyzer.lib.base.helpers import proxify
from hippolyzer.lib.base.message.message import Block
from hippolyzer.lib.base.namevalue import NameValueCollection
@@ -16,6 +18,7 @@ from hippolyzer.lib.base.objects import Object
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.proxy.namecache import NameCache
from hippolyzer.lib.proxy.templates import PCode, ObjectStateSerializer
if TYPE_CHECKING:
@@ -44,8 +47,8 @@ class OrphanManager:
del self._orphans[parent_id]
return removed
def collect_orphans(self, parent: Object) -> typing.Sequence[int]:
return self._orphans.pop(parent.LocalID, [])
def collect_orphans(self, parent_localid: int) -> typing.Sequence[int]:
return self._orphans.pop(parent_localid, [])
def track_orphan(self, obj: Object):
self.track_orphan_by_id(obj.LocalID, obj.ParentID)
@@ -59,21 +62,83 @@ class OrphanManager:
OBJECT_OR_LOCAL = typing.Union[Object, int]
class LocationType(enum.IntEnum):
COARSE = enum.auto()
EXACT = enum.auto()
class Avatar:
"""Wrapper for an avatar known through ObjectUpdate or CoarseLocationUpdate"""
def __init__(
self,
full_id: UUID,
obj: Optional["Object"] = None,
coarse_location: Optional[Vector3] = None,
resolved_name: Optional[str] = None,
):
self.FullID: UUID = full_id
self.Object: Optional["Object"] = obj
self._coarse_location = coarse_location
self._resolved_name = resolved_name
@property
def LocationType(self) -> "LocationType":
if self.Object:
return LocationType.EXACT
return LocationType.COARSE
@property
def RegionPosition(self) -> Vector3:
if self.Object:
return self.Object.RegionPosition
if self._coarse_location is not None:
return self._coarse_location
raise ValueError(f"Avatar {self.FullID} has no known position")
@property
def Name(self) -> Optional[str]:
if self.Object:
nv: Dict[str, str] = self.Object.NameValue.to_dict()
return f"{nv['FirstName']} {nv['LastName']}"
return self._resolved_name
class ObjectManager:
"""Object manager for a specific region"""
"""
Object manager for a specific region
TODO: This model does not make sense given how region->region object handoff works.
The ObjectManager has to notice when an ObjectUpdate for an object came from a
new region and update the associated region itself. It will not receive a KillObject
from the old region in the case of physical region crossings. Right now this means
physical objects or agents that physically cross a sim border get dangling object
references. This is not the case when they teleport, even across a small distance
to a neighbor, as that will send a KillObject in the old sim.
Needs to switch to one manager managing objects for a full session rather than one
manager per region.
"""
def __init__(self, region: ProxiedRegion):
self._localid_lookup: typing.Dict[int, Object] = {}
self._fullid_lookup: typing.Dict[UUID, int] = {}
self._coarse_locations: typing.Dict[UUID, Vector3] = {}
# Objects that we've seen references to but don't have data for
self.missing_locals = set()
self._region: ProxiedRegion = proxify(region)
self._orphan_manager = OrphanManager()
name_cache = None
session = self._region.session()
if session:
name_cache = session.session_manager.name_cache
# Use a local namecache if we don't have a session manager
self.name_cache: Optional[NameCache] = name_cache or NameCache()
message_handler = region.message_handler
message_handler.subscribe("ObjectUpdate", self._handle_object_update)
message_handler.subscribe("ImprovedTerseObjectUpdate",
self._handle_terse_object_update)
message_handler.subscribe("CoarseLocationUpdate",
self._handle_coarse_location_update)
message_handler.subscribe("ObjectUpdateCompressed",
self._handle_object_update_compressed)
message_handler.subscribe("ObjectUpdateCached",
@@ -87,17 +152,35 @@ class ObjectManager:
message_handler.subscribe("KillObject",
self._handle_kill_object)
def __len__(self):
return len(self._localid_lookup)
@property
def all_objects(self) -> typing.Iterable[Object]:
return self._localid_lookup.values()
@property
def all_avatars(self) -> typing.Iterable[Object]:
# This is only avatars within draw distance. Might be useful to have another
# accessor for UUID + pos that's based on CoarseLocationUpdate.
return (o for o in self.all_objects if o.PCode == PCode.AVATAR)
def all_avatars(self) -> typing.Iterable[Avatar]:
av_objects = {o.FullID: o for o in self.all_objects if o.PCode == PCode.AVATAR}
all_ids = set(av_objects.keys()) | self._coarse_locations.keys()
avatars: List[Avatar] = []
for av_id in all_ids:
av_obj = av_objects.get(av_id)
coarse_location = self._coarse_locations.get(av_id)
resolved_name = None
if namecache_entry := self.name_cache.lookup(av_id):
resolved_name = f"{namecache_entry.FirstName} {namecache_entry.LastName}"
avatars.append(Avatar(
full_id=av_id,
coarse_location=coarse_location,
obj=av_obj,
resolved_name=resolved_name,
))
return avatars
def lookup_localid(self, localid) -> typing.Optional[Object]:
def lookup_localid(self, localid: int) -> typing.Optional[Object]:
return self._localid_lookup.get(localid, None)
def lookup_fullid(self, fullid: UUID) -> typing.Optional[Object]:
@@ -106,7 +189,13 @@ class ObjectManager:
return None
return self.lookup_localid(local_id)
def lookup_avatar(self, fullid: UUID) -> typing.Optional[Avatar]:
for avatar in self.all_avatars:
if avatar.FullID == fullid:
return avatar
return None
def _track_object(self, obj: Object):
def _track_object(self, obj: Object, notify: bool = True):
self._localid_lookup[obj.LocalID] = obj
self._fullid_lookup[obj.FullID] = obj.LocalID
# If it was missing, it's not missing anymore.
@@ -115,13 +204,34 @@ class ObjectManager:
self._parent_object(obj)
# Adopt any of our orphaned child objects.
for orphan_local in self._orphan_manager.collect_orphans(obj):
for orphan_local in self._orphan_manager.collect_orphans(obj.LocalID):
child_obj = self.lookup_localid(orphan_local)
# Shouldn't be any dead children in the orphanage
assert child_obj is not None
self._parent_object(child_obj)
self._notify_object_updated(obj, set(obj.to_dict().keys()))
if notify:
self._run_object_update_hooks(obj, set(obj.to_dict().keys()))
def _untrack_object(self, obj: Object):
former_child_ids = obj.ChildIDs[:]
for child_id in former_child_ids:
child_obj = self.lookup_localid(child_id)
assert child_obj is not None
self._unparent_object(child_obj, child_obj.ParentID)
# Place any remaining unkilled children in the orphanage
for child_id in former_child_ids:
self._orphan_manager.track_orphan_by_id(child_id, obj.LocalID)
assert not obj.ChildIDs
# Make sure the parent knows we went away
self._unparent_object(obj, obj.ParentID)
# Do this last in case we only have a weak reference
del self._fullid_lookup[obj.FullID]
del self._localid_lookup[obj.LocalID]
def _parent_object(self, obj: Object, insert_at_head=False):
if obj.ParentID:
@@ -163,9 +273,27 @@ class ObjectManager:
def _update_existing_object(self, obj: Object, new_properties):
new_parent_id = new_properties.get("ParentID", obj.ParentID)
actually_updated_props = set()
if obj.LocalID != new_properties.get("LocalID", obj.LocalID):
# Our LocalID changed, and we deal with linkages to other prims by
# LocalID association. Break any links since our LocalID is changing.
# Could happen if we didn't mark an attachment prim dead and the parent agent
# came back into the sim. Attachment FullIDs do not change across TPs,
# LocalIDs do. This at least lets us partially recover from the bad state.
# Currently known to happen due to physical region crossings, so only debug.
new_localid = new_properties["LocalID"]
LOG.debug(f"Got an update with new LocalID for {obj.FullID}, {obj.LocalID} != {new_localid}. "
f"May have mishandled a KillObject for a prim that left and re-entered region.")
self._untrack_object(obj)
obj.LocalID = new_localid
self._track_object(obj, notify=False)
actually_updated_props |= {"LocalID"}
old_parent_id = obj.ParentID
actually_updated_props = obj.update_properties(new_properties)
actually_updated_props |= obj.update_properties(new_properties)
if new_parent_id != old_parent_id:
self._unparent_object(obj, old_parent_id)
@@ -174,7 +302,7 @@ class ObjectManager:
# Common case where this may be falsy is if we get an ObjectUpdateCached
# that didn't have a changed UpdateFlags field.
if actually_updated_props:
self._notify_object_updated(obj, actually_updated_props)
self._run_object_update_hooks(obj, actually_updated_props)
def _normalize_object_update(self, block: Block):
object_data = {
@@ -192,6 +320,7 @@ class ObjectManager:
"State": block.deserialize_var("State", make_copy=False),
**block.deserialize_var("ObjectData", make_copy=False).value,
}
object_data["LocalID"] = object_data.pop("ID")
# Empty == not updated
if not object_data["TextureEntry"]:
object_data.pop("TextureEntry")
@@ -211,7 +340,7 @@ class ObjectManager:
for block in packet['ObjectData']:
object_data = self._normalize_object_update(block)
seen_locals.append(object_data["ID"])
seen_locals.append(object_data["LocalID"])
obj = self.lookup_fullid(object_data["FullID"])
if obj:
self._update_existing_object(obj, object_data)
@@ -226,6 +355,7 @@ class ObjectManager:
**dict(block.items()),
"TextureEntry": block.deserialize_var("TextureEntry", make_copy=False),
}
object_data["LocalID"] = object_data.pop("ID")
object_data.pop("Data")
# Empty == not updated
if object_data["TextureEntry"] is None:
@@ -236,22 +366,40 @@ class ObjectManager:
seen_locals = []
for block in packet['ObjectData']:
object_data = self._normalize_terse_object_update(block)
obj = self.lookup_localid(object_data["ID"])
obj = self.lookup_localid(object_data["LocalID"])
# Can only update existing object with this message
if obj:
# Need the Object as context because decoding state requires PCode.
state_deserializer = ObjectStateSerializer.deserialize
object_data["State"] = state_deserializer(ctx_obj=obj, val=object_data["State"])
seen_locals.append(object_data["ID"])
seen_locals.append(object_data["LocalID"])
if obj:
self._update_existing_object(obj, object_data)
else:
self.missing_locals.add(object_data["ID"])
LOG.debug(f"Received terse update for unknown object {object_data['ID']}")
self.missing_locals.add(object_data["LocalID"])
LOG.debug(f"Received terse update for unknown object {object_data['LocalID']}")
packet.meta["ObjectUpdateIDs"] = tuple(seen_locals)
def _handle_coarse_location_update(self, packet: ProxiedMessage):
self._coarse_locations.clear()
coarse_locations: typing.Dict[UUID, Vector3] = {}
for agent_block, location_block in zip(packet["AgentData"], packet["Location"]):
x, y, z = location_block["X"], location_block["Y"], location_block["Z"]
coarse_locations[agent_block["AgentID"]] = Vector3(
X=x,
Y=y,
# The z-axis is multiplied by 4 to obtain true Z location
# The z-axis is also limited to 1020m in height
# If z == 255 then the true Z is unknown.
# http://wiki.secondlife.com/wiki/CoarseLocationUpdate
Z=z * 4 if z != 255 else math.inf,
)
self._coarse_locations.update(coarse_locations)
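The Z handling above can be isolated into a small sketch: `CoarseLocationUpdate` packs height into a U8 in 4m steps (so the usable range tops out at 1020m), with 255 as the "height unknown" sentinel, represented as infinity. The function below is illustrative; a plain tuple stands in for hippolyzer's `Vector3`:

```python
import math


def decode_coarse_location(x: int, y: int, z: int) -> tuple:
    # z == 255 is the sentinel for a height above the representable range
    true_z = z * 4 if z != 255 else math.inf
    return (float(x), float(y), true_z)
```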
def _handle_object_update_cached(self, packet: ProxiedMessage):
seen_locals = []
for block in packet['ObjectData']:
@@ -288,6 +436,7 @@ class ObjectManager:
"PSBlock": ps_block.value,
# Parent flag not set means explicitly un-parented
"ParentID": compressed.pop("ParentID", None) or 0,
"LocalID": compressed.pop("ID"),
**compressed,
**dict(block.items()),
"UpdateFlags": block.deserialize_var("UpdateFlags", make_copy=False),
@@ -304,8 +453,8 @@ class ObjectManager:
seen_locals = []
for block in packet['ObjectData']:
object_data = self._normalize_object_update_compressed(block)
obj = self.lookup_localid(object_data["ID"])
seen_locals.append(object_data["ID"])
seen_locals.append(object_data["LocalID"])
obj = self.lookup_localid(object_data["LocalID"])
if obj:
self._update_existing_object(obj, object_data)
else:
@@ -331,33 +480,38 @@ class ObjectManager:
def _handle_kill_object(self, packet: ProxiedMessage):
seen_locals = []
for block in packet["ObjectData"]:
obj = self.lookup_localid(block["ID"])
self._kill_object_by_local_id(block["ID"])
seen_locals.append(block["ID"])
self.missing_locals -= {block["ID"]}
if obj:
AddonManager.handle_object_killed(self._region.session(), self._region, obj)
former_child_ids = obj.ChildIDs[:]
for child_id in former_child_ids:
child_obj = self.lookup_localid(child_id)
assert child_obj is not None
self._unparent_object(child_obj, child_obj.ParentID)
del self._localid_lookup[obj.LocalID]
del self._fullid_lookup[obj.FullID]
# Place any remaining unkilled children in the orphanage
for child_id in former_child_ids:
self._orphan_manager.track_orphan_by_id(child_id, obj.LocalID)
assert not obj.ChildIDs
# Make sure the parent knows we went away
self._unparent_object(obj, obj.ParentID)
else:
logging.debug(f"Received {packet.name} for unknown {block['ID']}")
packet.meta["ObjectUpdateIDs"] = tuple(seen_locals)
def _kill_object_by_local_id(self, local_id: int):
obj = self.lookup_localid(local_id)
self.missing_locals -= {local_id}
child_ids: Sequence[int]
if obj:
AddonManager.handle_object_killed(self._region.session(), self._region, obj)
child_ids = obj.ChildIDs
else:
LOG.debug(f"Tried to kill unknown object {local_id}")
# If it had any orphans, they need to die.
child_ids = self._orphan_manager.collect_orphans(local_id)
# KillObject implicitly kills descendants
# This may mutate child_ids, use the reversed iterator so we don't
# invalidate the iterator during removal.
for child_id in reversed(child_ids):
# indra special-cases avatar PCodes and doesn't mark them dead
# due to cascading kill. Is this correct? Do avatars require
# explicit kill?
child_obj = self.lookup_localid(child_id)
if child_obj and child_obj.PCode == PCode.AVATAR:
continue
self._kill_object_by_local_id(child_id)
# Have to do this last, since untracking will clear child IDs
if obj:
self._untrack_object(obj)
def _handle_get_object_cost(self, flow: HippoHTTPFlow):
parsed = llsd.parse_xml(flow.response.content)
if "error" in parsed:
@@ -368,14 +522,18 @@ class ObjectManager:
LOG.debug(f"Received ObjectCost for unknown {object_id}")
continue
obj.ObjectCosts.update(object_costs)
self._notify_object_updated(obj, {"ObjectCosts"})
self._run_object_update_hooks(obj, {"ObjectCosts"})
def _notify_object_updated(self, obj: Object, updated_props: Set[str]):
def _run_object_update_hooks(self, obj: Object, updated_props: Set[str]):
if obj.PCode == PCode.AVATAR and "NameValue" in updated_props:
if obj.NameValue:
self.name_cache.update(obj.FullID, obj.NameValue.to_dict())
AddonManager.handle_object_updated(self._region.session(), self._region, obj, updated_props)
def clear(self):
self._localid_lookup.clear()
self._fullid_lookup.clear()
self._coarse_locations.clear()
self._orphan_manager.clear()
self.missing_locals.clear()

View File

@@ -14,6 +14,7 @@ from hippolyzer.lib.base.datatypes import Vector3
from hippolyzer.lib.base.message.message_handler import MessageHandler
from hippolyzer.lib.proxy.caps_client import CapsClient
from hippolyzer.lib.proxy.circuit import ProxiedCircuit
from hippolyzer.lib.proxy.namecache import NameCache
from hippolyzer.lib.proxy.objects import ObjectManager
from hippolyzer.lib.proxy.transfer_manager import TransferManager
from hippolyzer.lib.proxy.xfer_manager import XferManager
@@ -60,6 +61,9 @@ class ProxiedRegion:
self.transfer_manager = TransferManager(self)
self.caps_client = CapsClient(self)
self.objects = ObjectManager(self)
if session:
name_cache: NameCache = session.session_manager.name_cache
self.message_handler.subscribe("UUIDNameReply", name_cache.handle_uuid_name_reply)
@property
def name(self):
@@ -103,6 +107,9 @@ class ProxiedRegion:
seed_id = self._caps["Seed"][1].split("/")[-1].encode("utf8")
# Give it a unique domain tied to the current Seed URI
parsed[1] = f"{name}-{hashlib.sha256(seed_id).hexdigest()[:16]}.hippo-proxy.localhost"
# Force the URL to HTTP, we're going to handle the request ourselves so it doesn't need
# to be secure. This should save on expensive TLS context setup for each req.
parsed[0] = "http"
wrapper_url = urllib.parse.urlunsplit(parsed)
self._caps.add(name + "ProxyWrapper", (CapType.WRAPPER, wrapper_url))
return wrapper_url

View File

@@ -12,11 +12,11 @@ from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.proxy.circuit import ProxiedCircuit
from hippolyzer.lib.proxy.http_asset_repo import HTTPAssetRepo
from hippolyzer.lib.proxy.http_proxy import HTTPFlowContext, is_asset_server_cap_name, SerializedCapData
from hippolyzer.lib.proxy.namecache import NameCache
from hippolyzer.lib.proxy.region import ProxiedRegion, CapType
if TYPE_CHECKING:
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.proxy.message_logger import BaseMessageLogger
class Session:
@@ -144,17 +144,6 @@ class Session:
return "<%s %s>" % (self.__class__.__name__, self.id)
class BaseMessageLogger:
def log_lludp_message(self, session: Session, region: ProxiedRegion, message: ProxiedMessage):
pass
def log_http_response(self, flow: HippoHTTPFlow):
pass
def log_eq_event(self, session: Session, region: ProxiedRegion, event: dict):
pass
class SessionManager:
def __init__(self):
self.sessions: List[Session] = []
@@ -163,6 +152,7 @@ class SessionManager:
self.asset_repo = HTTPAssetRepo()
self.message_logger: Optional[BaseMessageLogger] = None
self.addon_ctx: Dict[str, Any] = {}
self.name_cache = NameCache()
def create_session(self, login_data) -> Session:
session = Session.from_login_data(login_data, self)

View File

@@ -162,8 +162,8 @@ class UDPProxyProtocol(asyncio.DatagramProtocol):
data = data[4:]
elif address_type == 3: # Domain name
domain_length = data[0]
address = data[1:1+domain_length]
data = data[1+domain_length:]
address = data[1:1 + domain_length]
data = data[1 + domain_length:]
else:
logging.error("Don't understand addr type %d" % address_type)
return None

View File

@@ -8,7 +8,7 @@ from typing import *
import hippolyzer.lib.base.serialization as se
from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.datatypes import UUID, IntEnum, IntFlag
from hippolyzer.lib.base.namevalue import NameValuesSerializer
try:
@@ -25,7 +25,7 @@ except:
@se.enum_field_serializer("RezObject", "InventoryData", "Type")
@se.enum_field_serializer("RezScript", "InventoryBlock", "Type")
@se.enum_field_serializer("UpdateTaskInventory", "InventoryData", "Type")
class AssetType(enum.IntEnum):
class AssetType(IntEnum):
TEXTURE = 0
SOUND = 1
CALLINGCARD = 2
@@ -103,7 +103,7 @@ class AssetType(enum.IntEnum):
@se.enum_field_serializer("RezObject", "InventoryData", "InvType")
@se.enum_field_serializer("RezScript", "InventoryBlock", "InvType")
@se.enum_field_serializer("UpdateTaskInventory", "InventoryData", "InvType")
class InventoryType(enum.IntEnum):
class InventoryType(IntEnum):
TEXTURE = 0
SOUND = 1
CALLINGCARD = 2
@@ -143,7 +143,7 @@ class InventoryType(enum.IntEnum):
@se.enum_field_serializer("AgentIsNowWearing", "WearableData", "WearableType")
@se.enum_field_serializer("AgentWearablesUpdate", "WearableData", "WearableType")
@se.enum_field_serializer("CreateInventoryItem", "InventoryBlock", "WearableType")
class WearableType(enum.IntEnum):
class WearableType(IntEnum):
SHAPE = 0
SKIN = 1
HAIR = 2
@@ -180,7 +180,7 @@ def _register_permissions_flags(message_name, block_name):
@_register_permissions_flags("RezObject", "InventoryData")
@_register_permissions_flags("RezScript", "InventoryBlock")
@_register_permissions_flags("RezMultipleAttachmentsFromInv", "ObjectData")
class Permissions(enum.IntFlag):
class Permissions(IntFlag):
TRANSFER = (1 << 13)
MODIFY = (1 << 14)
COPY = (1 << 15)
@@ -202,7 +202,7 @@ class Permissions(enum.IntFlag):
@se.flag_field_serializer("RezObject", "InventoryData", "Flags")
@se.flag_field_serializer("UpdateCreateInventoryItem", "InventoryData", "Flags")
@se.flag_field_serializer("UpdateTaskInventory", "InventoryData", "Flags")
class InventoryItemFlags(enum.IntFlag):
class InventoryItemFlags(IntFlag):
# The asset has only one reference in the system. If the
# inventory item is deleted, or the assetid updated, then we
# can remove the old reference.
@@ -233,7 +233,7 @@ class InventoryItemFlags(enum.IntFlag):
@se.enum_field_serializer("ObjectPermissions", "ObjectData", "Field")
class PermissionType(enum.IntEnum):
class PermissionType(IntEnum):
BASE = 0x01
OWNER = 0x02
GROUP = 0x04
@@ -242,7 +242,7 @@ class PermissionType(enum.IntEnum):
@se.enum_field_serializer("TransferRequest", "TransferInfo", "SourceType")
class TransferSourceType(enum.IntEnum):
class TransferSourceType(IntEnum):
UNKNOWN = 0
FILE = enum.auto()
ASSET = enum.auto()
@@ -250,7 +250,7 @@ class TransferSourceType(enum.IntEnum):
SIM_ESTATE = enum.auto()
class EstateAssetType(enum.IntEnum):
class EstateAssetType(IntEnum):
NONE = -1
COVENANT = 0
@@ -308,14 +308,14 @@ class TransferParamsSerializer(se.EnumSwitchedSubfieldSerializer):
@se.enum_field_serializer("TransferPacket", "TransferData", "ChannelType")
@se.enum_field_serializer("TransferRequest", "TransferInfo", "ChannelType")
@se.enum_field_serializer("TransferInfo", "TransferInfo", "ChannelType")
class TransferChannelType(enum.IntEnum):
class TransferChannelType(IntEnum):
UNKNOWN = 0
MISC = enum.auto()
ASSET = enum.auto()
@se.enum_field_serializer("TransferInfo", "TransferInfo", "TargetType")
class TransferTargetType(enum.IntEnum):
class TransferTargetType(IntEnum):
UNKNOWN = 0
FILE = enum.auto()
VFILE = enum.auto()
@@ -323,7 +323,7 @@ class TransferTargetType(enum.IntEnum):
@se.enum_field_serializer("TransferInfo", "TransferInfo", "Status")
@se.enum_field_serializer("TransferPacket", "TransferData", "Status")
class TransferStatus(enum.IntEnum):
class TransferStatus(IntEnum):
OK = 0
DONE = 1
SKIP = 2
@@ -380,7 +380,7 @@ class TransferInfoSerializer(se.BaseSubfieldSerializer):
@se.enum_field_serializer("RequestXfer", "XferID", "FilePath")
class XferFilePath(enum.IntEnum):
class XferFilePath(IntEnum):
NONE = 0
USER_SETTINGS = 1
APP_SETTINGS = 2
@@ -403,7 +403,7 @@ class XferFilePath(enum.IntEnum):
@se.enum_field_serializer("AbortXfer", "XferID", "Result")
class XferError(enum.IntEnum):
class XferError(IntEnum):
FILE_EMPTY = -44
FILE_NOT_FOUND = -43
CANNOT_OPEN_FILE = -42
@@ -423,7 +423,7 @@ class SendXferPacketIDSerializer(se.AdapterSubfieldSerializer):
@se.enum_field_serializer("ViewerEffect", "Effect", "Type")
class ViewerEffectType(enum.IntEnum):
class ViewerEffectType(IntEnum):
TEXT = 0
ICON = enum.auto()
CONNECTOR = enum.auto()
@@ -445,7 +445,7 @@ class ViewerEffectType(enum.IntEnum):
EFFECT_BLOB = enum.auto()
class LookAtTarget(enum.IntEnum):
class LookAtTarget(IntEnum):
NONE = 0
IDLE = enum.auto()
AUTO_LISTEN = enum.auto()
@@ -459,7 +459,7 @@ class LookAtTarget(enum.IntEnum):
CLEAR = enum.auto()
class PointAtTarget(enum.IntEnum):
class PointAtTarget(IntEnum):
NONE = 0
SELECT = enum.auto()
GRAB = enum.auto()
@@ -499,7 +499,7 @@ class ViewerEffectDataSerializer(se.EnumSwitchedSubfieldSerializer):
@se.enum_field_serializer("MoneyTransferRequest", "MoneyData", "TransactionType")
@se.enum_field_serializer("MoneyBalanceReply", "TransactionInfo", "TransactionType")
class MoneyTransactionType(enum.IntEnum):
class MoneyTransactionType(IntEnum):
# _many_ of these codes haven't been used in decades.
# Money transaction failure codes
NULL = 0
@@ -561,7 +561,7 @@ class MoneyTransactionType(enum.IntEnum):
@se.flag_field_serializer("MoneyTransferRequest", "MoneyData", "Flags")
class MoneyTransactionFlags(enum.IntFlag):
class MoneyTransactionFlags(IntFlag):
SOURCE_GROUP = 1
DEST_GROUP = 1 << 1
OWNER_GROUP = 1 << 2
@@ -570,7 +570,7 @@ class MoneyTransactionFlags(enum.IntFlag):
@se.enum_field_serializer("ImprovedInstantMessage", "MessageBlock", "Dialog")
class IMDialogType(enum.IntEnum):
class IMDialogType(IntEnum):
NOTHING_SPECIAL = 0
MESSAGEBOX = 1
GROUP_INVITATION = 3
@@ -728,7 +728,7 @@ class ObjectUpdateDataSerializer(se.SimpleSubfieldSerializer):
@se.enum_field_serializer("ObjectUpdate", "ObjectData", "PCode")
@se.enum_field_serializer("ObjectAdd", "ObjectData", "PCode")
class PCode(enum.IntEnum):
class PCode(IntEnum):
# Should actually be a bitmask, these are just some common ones.
PRIMITIVE = 9
AVATAR = 47
@@ -742,7 +742,7 @@ class PCode(enum.IntEnum):
@se.flag_field_serializer("ObjectUpdateCompressed", "ObjectData", "UpdateFlags")
@se.flag_field_serializer("ObjectUpdateCached", "ObjectData", "UpdateFlags")
@se.flag_field_serializer("ObjectAdd", "ObjectData", "AddFlags")
class ObjectUpdateFlags(enum.IntFlag):
class ObjectUpdateFlags(IntFlag):
USE_PHYSICS = 1 << 0
CREATE_SELECTED = 1 << 1
OBJECT_MODIFY = 1 << 2
@@ -796,7 +796,7 @@ class AttachmentStateAdapter(se.Adapter):
@se.flag_field_serializer("AgentUpdate", "AgentData", "State")
class AgentState(enum.IntFlag):
class AgentState(IntFlag):
TYPING = 1 << 3
EDITING = 1 << 4
@@ -836,7 +836,7 @@ class ImprovedTerseObjectUpdateDataSerializer(se.SimpleSubfieldSerializer):
})
class ShineLevel(enum.IntEnum):
class ShineLevel(IntEnum):
OFF = 0
LOW = 1
MEDIUM = 2
@@ -854,7 +854,7 @@ class BasicMaterials:
BUMP_SHINY_FULLBRIGHT = se.BitfieldDataclass(BasicMaterials, se.U8)
class TexGen(enum.IntEnum):
class TexGen(IntEnum):
DEFAULT = 0
PLANAR = 0x2
# These are unused / not supported
@@ -1056,7 +1056,7 @@ class DPTextureEntrySubfieldSerializer(se.SimpleSubfieldSerializer):
TEMPLATE = DATA_PACKER_TE_TEMPLATE
class TextureAnimMode(enum.IntFlag):
class TextureAnimMode(IntFlag):
ON = 0x01
LOOP = 0x02
REVERSE = 0x04
@@ -1092,7 +1092,7 @@ class TextureIDListSerializer(se.SimpleSubfieldSerializer):
TEMPLATE = se.Collection(None, se.UUID)
class ParticleDataFlags(enum.IntFlag):
class ParticleDataFlags(IntFlag):
INTERP_COLOR = 0x001
INTERP_SCALE = 0x002
BOUNCE = 0x004
@@ -1108,12 +1108,12 @@ class ParticleDataFlags(enum.IntFlag):
DATA_BLEND = 0x20000
class ParticleFlags(enum.IntFlag):
class ParticleFlags(IntFlag):
OBJECT_RELATIVE = 0x1
USE_NEW_ANGLE = 0x2
class ParticleBlendFunc(enum.IntEnum):
class ParticleBlendFunc(IntEnum):
ONE = 0
ZERO = 1
DEST_COLOR = 2
@@ -1150,7 +1150,7 @@ PDATA_BLOCK_TEMPLATE = se.Template({
})
class PartPattern(enum.IntFlag):
class PartPattern(IntFlag):
NONE = 0
DROP = 0x1
EXPLODE = 0x2
@@ -1199,7 +1199,7 @@ class PSBlockSerializer(se.SimpleSubfieldSerializer):
@se.enum_field_serializer("ObjectExtraParams", "ObjectData", "ParamType")
class ExtraParamType(enum.IntEnum):
class ExtraParamType(IntEnum):
FLEXIBLE = 0x10
LIGHT = 0x20
SCULPT = 0x30
@@ -1209,11 +1209,11 @@ class ExtraParamType(enum.IntEnum):
EXTENDED_MESH = 0x70
class ExtendedMeshFlags(enum.IntFlag):
class ExtendedMeshFlags(IntFlag):
ANIMATED_MESH = 0x1
class SculptType(enum.IntEnum):
class SculptType(IntEnum):
NONE = 0
SPHERE = 1
TORUS = 2
@@ -1238,10 +1238,10 @@ EXTRA_PARAM_TEMPLATES = {
"UserForce": se.IfPresent(se.Vector3),
}),
ExtraParamType.LIGHT: se.Template({
"Color": Color4(),
"Radius": se.F32,
"Cutoff": se.F32,
"Falloff": se.F32,
"Color": Color4(),
"Radius": se.F32,
"Cutoff": se.F32,
"Falloff": se.F32,
}),
ExtraParamType.SCULPT: se.Template({
"Texture": se.UUID,
@@ -1283,8 +1283,8 @@ class ObjectUpdateExtraParamsSerializer(se.SimpleSubfieldSerializer):
EMPTY_IS_NONE = True
@se.enum_field_serializer("ObjectUpdate", "ObjectData", "Flags")
class SoundFlags(enum.IntEnum):
@se.flag_field_serializer("ObjectUpdate", "ObjectData", "Flags")
class SoundFlags(IntFlag):
LOOP = 1 << 0
SYNC_MASTER = 1 << 1
SYNC_SLAVE = 1 << 2
@@ -1293,7 +1293,7 @@ class SoundFlags(enum.IntEnum):
STOP = 1 << 5
class CompressedFlags(enum.IntFlag):
class CompressedFlags(IntFlag):
SCRATCHPAD = 1
TREE = 1 << 1
TEXT = 1 << 2
@@ -1381,7 +1381,7 @@ class ObjectUpdateCompressedDataSerializer(se.SimpleSubfieldSerializer):
@se.flag_field_serializer("MultipleObjectUpdate", "ObjectData", "Type")
class MultipleObjectUpdateFlags(enum.IntFlag):
class MultipleObjectUpdateFlags(IntFlag):
POSITION = 0x01
ROTATION = 0x02
SCALE = 0x04
@@ -1401,7 +1401,7 @@ class MultipleObjectUpdateDataSerializer(se.FlagSwitchedSubfieldSerializer):
@se.flag_field_serializer("AgentUpdate", "AgentData", "ControlFlags")
@se.flag_field_serializer("ScriptControlChange", "Data", "Controls")
class AgentControlFlags(enum.IntFlag):
class AgentControlFlags(IntFlag):
AT_POS = 1
AT_NEG = 1 << 1
LEFT_POS = 1 << 2
@@ -1437,14 +1437,14 @@ class AgentControlFlags(enum.IntFlag):
@se.flag_field_serializer("AgentUpdate", "AgentData", "Flags")
class AgentUpdateFlags(enum.IntFlag):
class AgentUpdateFlags(IntFlag):
HIDE_TITLE = 1
CLIENT_AUTOPILOT = 1 << 1
@se.enum_field_serializer("ChatFromViewer", "ChatData", "Type")
@se.enum_field_serializer("ChatFromSimulator", "ChatData", "ChatType")
class ChatType(enum.IntEnum):
class ChatType(IntEnum):
WHISPER = 0
NORMAL = 1
SHOUT = 2
@@ -1461,7 +1461,7 @@ class ChatType(enum.IntEnum):
@se.enum_field_serializer("ChatFromSimulator", "ChatData", "SourceType")
class ChatSourceType(enum.IntEnum):
class ChatSourceType(IntEnum):
SYSTEM = 0
AGENT = 1
OBJECT = 2
@@ -1479,7 +1479,7 @@ class NameValueSerializer(se.SimpleSubfieldSerializer):
@se.enum_field_serializer("SetFollowCamProperties", "CameraProperty", "Type")
class CameraPropertyType(enum.IntEnum):
class CameraPropertyType(IntEnum):
PITCH = 0
FOCUS_OFFSET = enum.auto()
FOCUS_OFFSET_X = enum.auto()
@@ -1506,7 +1506,7 @@ class CameraPropertyType(enum.IntEnum):
@se.enum_field_serializer("DeRezObject", "AgentBlock", "Destination")
class DeRezObjectDestination(enum.IntEnum):
class DeRezObjectDestination(IntEnum):
SAVE_INTO_AGENT_INVENTORY = 0 # deprecated, disabled
ACQUIRE_TO_AGENT_INVENTORY = 1 # try to leave copy in world
SAVE_INTO_TASK_INVENTORY = 2
@@ -1526,7 +1526,7 @@ class DeRezObjectDestination(enum.IntEnum):
@se.flag_field_serializer("SimStats", "RegionInfo", "RegionFlagsExtended")
@se.flag_field_serializer("RegionInfo", "RegionInfo", "RegionFlags")
@se.flag_field_serializer("RegionInfo", "RegionInfo3", "RegionFlagsExtended")
class RegionFlags(enum.IntFlag):
class RegionFlags(IntFlag):
ALLOW_DAMAGE = 1 << 0
ALLOW_LANDMARK = 1 << 1
ALLOW_SET_HOME = 1 << 2
@@ -1562,12 +1562,35 @@ class RegionFlags(enum.IntFlag):
@se.flag_field_serializer("RegionHandshakeReply", "RegionInfo", "Flags")
class RegionHandshakeReplyFlags(enum.IntFlag):
class RegionHandshakeReplyFlags(IntFlag):
VOCACHE_CULLING_ENABLED = 0x1 # ask sim to send all cacheable objects.
VOCACHE_IS_EMPTY = 0x2 # the cache file is empty, no need to send cache probes.
SUPPORTS_SELF_APPEARANCE = 0x4 # inbound AvatarAppearance for self is ok
@se.flag_field_serializer("TeleportStart", "Info", "TeleportFlags")
@se.flag_field_serializer("TeleportProgress", "Info", "TeleportFlags")
@se.flag_field_serializer("TeleportFinish", "Info", "TeleportFlags")
@se.flag_field_serializer("TeleportLureRequest", "Info", "TeleportFlags")
class TeleportFlags(IntFlag):
SET_HOME_TO_TARGET = 1 << 0 # newbie leaving prelude (starter area)
SET_LAST_TO_TARGET = 1 << 1
VIA_LURE = 1 << 2
VIA_LANDMARK = 1 << 3
VIA_LOCATION = 1 << 4
VIA_HOME = 1 << 5
VIA_TELEHUB = 1 << 6
VIA_LOGIN = 1 << 7
VIA_GODLIKE_LURE = 1 << 8
GODLIKE = 1 << 9
NINE_ONE_ONE = 1 << 10 # What is this?
DISABLE_CANCEL = 1 << 11 # Used for llTeleportAgentHome()
VIA_REGION_ID = 1 << 12
IS_FLYING = 1 << 13
SHOW_RESET_HOME = 1 << 14
FORCE_REDIRECT = 1 << 15
@se.http_serializer("RenderMaterials")
class RenderMaterialsSerializer(se.BaseHTTPSerializer):
@classmethod
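The recurring `enum.IntFlag` -> `IntFlag` change throughout this file swaps the stdlib bases for project-local ones, per the "Make new base classes for enum and flag with pretty repr()" commit. A minimal sketch of what such a base might look like; the class name `PrettyIntFlag` and the exact repr format are assumptions, not Hippolyzer's actual implementation:

```python
import enum

class PrettyIntFlag(enum.IntFlag):
    """Hypothetical IntFlag base with a terser repr()."""
    def __repr__(self):
        # Render the set bits as "NAME1|NAME2" instead of the stdlib default
        members = [m.name for m in type(self) if m & self]
        return "|".join(members) or "0"

class SoundFlags(PrettyIntFlag):
    LOOP = 1 << 0
    SYNC_MASTER = 1 << 1
    SYNC_SLAVE = 1 << 2

print(repr(SoundFlags.LOOP | SoundFlags.SYNC_SLAVE))  # LOOP|SYNC_SLAVE
```

Deriving the message-template enums from one such base keeps every `repr()` in the proxy's logs readable without per-class boilerplate.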

View File

@@ -128,7 +128,7 @@ class TransferManager:
elif msg.name == "TransferAbort":
transfer.error_code = msg["TransferID"][0].deserialize_var("Result")
transfer.set_exception(
ConnectionAbortedError(f"Unknown failure")
ConnectionAbortedError("Unknown failure")
)
def _handle_transfer_packet(self, msg: ProxiedMessage, transfer: Transfer):
@@ -136,7 +136,7 @@ class TransferManager:
packet_id: int = transfer_block["Packet"]
packet_data = transfer_block["Data"]
transfer.chunks[packet_id] = packet_data
if transfer_block["Status"] == TransferStatus.DONE:
if transfer_block["Status"] == TransferStatus.DONE and not transfer.done():
transfer.mark_done()
def _handle_transfer_info(self, msg: ProxiedMessage, transfer: Transfer):
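The new `and not transfer.done()` guard above keeps a duplicate DONE packet from marking the transfer complete twice; settling an already-settled `asyncio.Future` raises `InvalidStateError`. A generic illustration of the same guard (not Hippolyzer code):

```python
import asyncio

async def main() -> str:
    fut = asyncio.get_running_loop().create_future()

    def mark_done(result):
        # Mirror of the guard: only settle the future if nobody has yet
        if not fut.done():
            fut.set_result(result)

    mark_done("first")
    mark_done("duplicate")  # silently ignored thanks to the done() check
    return await fut

print(asyncio.run(main()))  # first
```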

View File

@@ -1,10 +1,9 @@
import dataclasses
from typing import *
import pkg_resources
import hippolyzer.lib.base.serialization as se
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.helpers import get_resource_filename
from hippolyzer.lib.proxy.templates import AssetType
@@ -64,5 +63,5 @@ class VFS:
return self._data_fh.read(block.size)
_static_path = pkg_resources.resource_filename("hippolyzer.lib.proxy", "data/static_index.db2")
_static_path = get_resource_filename("lib/proxy/data/static_index.db2")
STATIC_VFS = VFS(_static_path)

View File

@@ -1,15 +1,14 @@
"""
Outbound Xfer only.
sim->viewer Xfer is only legitimately used for terrain so not worth implementing.
Managers for inbound and outbound xfer as well as the AssetUploadRequest flow
"""
from __future__ import annotations
import asyncio
import enum
import random
from typing import *
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.datatypes import UUID, RawBytes
from hippolyzer.lib.base.helpers import proxify
from hippolyzer.lib.base.message.data_packer import TemplateDataPacker
from hippolyzer.lib.base.message.message import Block
@@ -24,7 +23,7 @@ _XFER_MESSAGES = {"AbortXfer", "ConfirmXferPacket", "RequestXfer", "SendXferPack
class Xfer:
def __init__(self, xfer_id: int):
def __init__(self, xfer_id: Optional[int] = None):
super().__init__()
self.xfer_id: Optional[int] = xfer_id
self.chunks: Dict[int, bytes] = {}
@@ -65,6 +64,11 @@ class Xfer:
return self._future.__await__()
class UploadStrategy(enum.IntEnum):
XFER = enum.auto()
ASSET_UPLOAD_REQUEST = enum.auto()
class XferManager:
def __init__(self, region: ProxiedRegion):
self._region: ProxiedRegion = proxify(region)
@@ -141,5 +145,96 @@ class XferManager:
))
xfer.chunks[packet_id.PacketID] = packet_data
if packet_id.IsEOF:
if packet_id.IsEOF and not xfer.done():
xfer.mark_done()
def upload_asset(
self,
asset_type: AssetType,
data: bytes,
store_local: bool = False,
temp_file: bool = False,
transaction_id: Optional[UUID] = None,
upload_strategy: Optional[UploadStrategy] = None,
) -> asyncio.Future[UUID]:
"""Upload an asset through the Xfer upload path"""
if not transaction_id:
transaction_id = UUID.random()
# Small amounts of data can be sent inline, decide based on size
if upload_strategy is None:
if len(data) >= 1150:
upload_strategy = UploadStrategy.XFER
else:
upload_strategy = UploadStrategy.ASSET_UPLOAD_REQUEST
xfer = None
inline_data = b''
if upload_strategy == UploadStrategy.XFER:
# Prepend the expected length field to the first chunk
if not isinstance(data, RawBytes):
data = TemplateDataPacker.pack(len(data), MsgType.MVT_S32) + data
xfer = Xfer()
chunk_num = 0
while data:
    xfer.chunks[chunk_num] = data[:1150]
    data = data[1150:]
    chunk_num += 1
else:
inline_data = data
self._region.circuit.send_message(ProxiedMessage(
"AssetUploadRequest",
Block(
"AssetBlock",
TransactionID=transaction_id,
Type=asset_type,
Tempfile=temp_file,
StoreLocal=store_local,
AssetData=inline_data,
)
))
fut = asyncio.Future()
asyncio.create_task(self._pump_asset_upload(xfer, transaction_id, fut))
return fut
async def _pump_asset_upload(self, xfer: Optional[Xfer], transaction_id: UUID, fut: asyncio.Future):
message_handler = self._region.message_handler
# We'll receive an Xfer request for the asset we're uploading.
# asset ID is determined by hashing secure session ID with chosen transaction ID.
asset_id: UUID = self._region.session().tid_to_assetid(transaction_id)
try:
# Only need to do this if we're using the xfer upload strategy, otherwise all the
# data was already sent in the AssetUploadRequest and we don't expect a RequestXfer.
if xfer is not None:
def request_predicate(request_msg: ProxiedMessage):
return request_msg["XferID"]["VFileID"] == asset_id
msg = await message_handler.wait_for(
'RequestXfer', predicate=request_predicate, timeout=5000)
xfer.xfer_id = msg["XferID"]["ID"]
packet_id = 0
# TODO: No resend yet. If it's lost, it's lost.
while xfer.chunks:
chunk = xfer.chunks.pop(packet_id)
# EOF if there are no chunks left
packet_val = XferPacket(PacketID=packet_id, IsEOF=not bool(xfer.chunks))
self._region.circuit.send_message(ProxiedMessage(
"SendXferPacket",
Block("XferID", ID=xfer.xfer_id, Packet_=packet_val),
Block("DataPacket", Data=chunk),
))
# Don't care about the value, just want to know it was confirmed.
await message_handler.wait_for(
"ConfirmXferPacket", predicate=xfer.is_our_message, timeout=5000)
packet_id += 1
def complete_predicate(complete_msg: ProxiedMessage):
return complete_msg["AssetBlock"]["UUID"] == asset_id
msg = await message_handler.wait_for('AssetUploadComplete', predicate=complete_predicate)
if msg["AssetBlock"]["Success"] == 1:
fut.set_result(asset_id)
else:
fut.set_exception(RuntimeError(f"Xfer for transaction {transaction_id} failed"))
except asyncio.TimeoutError as e:
fut.set_exception(e)
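The comment in `_pump_asset_upload` notes that the asset ID is derived by hashing the secure session ID with the chosen transaction ID. In the Second Life protocol this is, to the best of my knowledge, an MD5-based UUID "combine"; a standalone sketch of what `tid_to_assetid` may compute (the concatenation order of the two UUIDs is an assumption):

```python
import hashlib
import uuid

def tid_to_assetid(transaction_id: uuid.UUID, secure_session_id: uuid.UUID) -> uuid.UUID:
    # MD5 over the raw bytes of both UUIDs yields exactly 16 bytes,
    # which become the derived asset ID.
    digest = hashlib.md5(transaction_id.bytes + secure_session_id.bytes).digest()
    return uuid.UUID(bytes=digest)

tid = uuid.uuid4()
ssid = uuid.uuid4()
asset_id = tid_to_assetid(tid, ssid)
```

Because both sides can compute this independently, the uploader knows which asset ID to wait for before the sim ever sends `AssetUploadComplete`.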

requirements-test.txt Normal file
View File

@@ -0,0 +1,4 @@
aioresponses
pytest
pytest-cov
flake8

View File

@@ -5,3 +5,8 @@ license_files =
[bdist_wheel]
universal = 1
[flake8]
max-line-length = 160
exclude = build/*, .eggs/*
ignore = F405, F403, E501, F841, E722, W503, E741

View File

@@ -25,7 +25,7 @@ from setuptools import setup, find_packages
here = path.abspath(path.dirname(__file__))
version = '0.2.1'
version = '0.5.0'
with open(path.join(here, 'README.md')) as readme_fh:
readme = readme_fh.read()
@@ -50,7 +50,7 @@ setup(
"Topic :: Software Development :: Testing",
],
author='Salad Dais',
author_email='SaladDais@users.noreply.github.com',
author_email='83434023+SaladDais@users.noreply.github.com',
url='https://github.com/SaladDais/Hippolyzer/',
license='LGPLv3',
packages=find_packages(include=["hippolyzer", "hippolyzer.*"]),
@@ -98,5 +98,6 @@ setup(
],
tests_require=[
"pytest",
"aioresponses",
],
)

setup_cxfreeze.py Normal file
View File

@@ -0,0 +1,121 @@
import setuptools # noqa
import os
import shutil
from distutils.core import Command
from pathlib import Path
from cx_Freeze import setup, Executable
# We don't need any of these and they make the archive huge.
TO_DELETE = [
"lib/PySide2/Qt3DRender.pyd",
"lib/PySide2/Qt53DRender.dll",
"lib/PySide2/Qt5Charts.dll",
"lib/PySide2/Qt5Location.dll",
"lib/PySide2/Qt5Pdf.dll",
"lib/PySide2/Qt5Quick.dll",
"lib/PySide2/Qt5WebEngineCore.dll",
"lib/PySide2/QtCharts.pyd",
"lib/PySide2/QtMultimedia.pyd",
"lib/PySide2/QtOpenGLFunctions.pyd",
"lib/PySide2/QtOpenGLFunctions.pyi",
"lib/PySide2/d3dcompiler_47.dll",
"lib/PySide2/opengl32sw.dll",
"lib/PySide2/translations",
"lib/aiohttp/_find_header.c",
"lib/aiohttp/_frozenlist.c",
"lib/aiohttp/_helpers.c",
"lib/aiohttp/_http_parser.c",
"lib/aiohttp/_http_writer.c",
"lib/aiohttp/_websocket.c",
# Improve this to work with different versions.
"lib/aiohttp/python39.dll",
"lib/lazy_object_proxy/python39.dll",
"lib/lxml/python39.dll",
"lib/markupsafe/python39.dll",
"lib/multidict/python39.dll",
"lib/numpy/core/python39.dll",
"lib/numpy/fft/python39.dll",
"lib/numpy/linalg/python39.dll",
"lib/numpy/random/python39.dll",
"lib/python39.dll",
"lib/recordclass/python39.dll",
"lib/regex/python39.dll",
"lib/test",
"lib/yarl/python39.dll",
]
COPY_TO_ZIP = [
"LICENSE.txt",
"README.md",
"NOTICE.md",
# Must have been generated with pip-licenses before. Many dependencies
# require their license to be distributed with their binaries.
"lib_licenses.txt",
]
BASE_DIR = Path(__file__).parent.absolute()
class FinalizeCXFreezeCommand(Command):
description = "Prepare cx_Freeze build dirs and create a zip"
user_options = []
def initialize_options(self) -> None:
pass
def finalize_options(self) -> None:
pass
def run(self):
(BASE_DIR / "dist").mkdir(exist_ok=True)
for path in (BASE_DIR / "build").iterdir():
if path.name.startswith("exe.") and path.is_dir():
for cleanse_suffix in TO_DELETE:
cleanse_path = path / cleanse_suffix
shutil.rmtree(cleanse_path, ignore_errors=True)
try:
os.unlink(cleanse_path)
except:
pass
for to_copy in COPY_TO_ZIP:
shutil.copy(BASE_DIR / to_copy, path / to_copy)
zip_path = BASE_DIR / "dist" / path.name
shutil.make_archive(zip_path, "zip", path)
options = {
"build_exe": {
"packages": [
"passlib",
"_cffi_backend",
"hippolyzer",
],
# exclude packages that are not really needed
"excludes": [
"tkinter",
],
"include_msvcr": True,
}
}
executables = [
Executable(
"hippolyzer/apps/proxy_gui.py",
base=None,
target_name="hippolyzer_gui"
),
]
setup(
name="hippolyzer_gui",
version="0.5.0",
description="Hippolyzer GUI",
options=options,
executables=executables,
cmdclass={
"finalize_cxfreeze": FinalizeCXFreezeCommand,
}
)

Binary file not shown.

View File

View File

@@ -1,16 +1,18 @@
import pkg_resources
import os
import unittest
from hippolyzer.lib.base.mesh import LLMeshSerializer, MeshAsset
import hippolyzer.lib.base.serialization as se
BASE_PATH = os.path.dirname(os.path.abspath(__file__))
class TestMesh(unittest.TestCase):
@classmethod
def setUpClass(cls) -> None:
# Use a rigged cube SLM from the upload process as a test file
slm_file = pkg_resources.resource_filename(__name__, "test_resources/testslm.slm")
slm_file = os.path.join(BASE_PATH, "test_resources", "testslm.slm")
with open(slm_file, "rb") as f:
cls.slm_bytes = f.read()

View File

@@ -126,8 +126,6 @@ class TestMessage(unittest.TestCase):
def test_partial_decode_pickle(self):
msg = self.deserial.deserialize(self.serial.serialize(self.chat_msg))
self.assertEqual(msg.deserializer(), self.deserial)
# Have to remove the weak ref so we can pickle
msg.deserializer = None
msg = pickle.loads(pickle.dumps(msg, protocol=pickle.HIGHEST_PROTOCOL))
# We should still have the raw body at this point

View File

@@ -664,14 +664,13 @@ class NameValueSerializationTests(BaseSerializationTest):
self.assertEqual(test.decode("utf8"), str(reader.read(NameValueSerializer())))
def test_namevalues_stringify(self):
test_list = \
b"Alpha STRING R S 'Twas brillig and the slighy toves/Did gyre and gimble in the wabe\n" + \
b"Beta F32 R S 3.14159\n" + \
b"Gamma S32 R S -12345\n" + \
b"Delta VEC3 R S <1.2, -3.4, 5.6>\n" + \
b"Epsilon U32 R S 12345\n" + \
b"Zeta ASSET R S 041a8591-6f30-42f8-b9f7-7f281351f375\n" + \
b"Eta U64 R S 9223372036854775807"
test_list = b"Alpha STRING R S 'Twas brillig and the slighy toves/Did gyre and gimble in the wabe\n" + \
b"Beta F32 R S 3.14159\n" + \
b"Gamma S32 R S -12345\n" + \
b"Delta VEC3 R S <1.2, -3.4, 5.6>\n" + \
b"Epsilon U32 R S 12345\n" + \
b"Zeta ASSET R S 041a8591-6f30-42f8-b9f7-7f281351f375\n" + \
b"Eta U64 R S 9223372036854775807"
self.writer.clear()
self.writer.write_bytes(test_list)

View File

@@ -39,11 +39,7 @@ class TestDictionary(unittest.TestCase):
self.template_list = parser.message_templates
def test_create_dictionary(self):
try:
_msg_dict = TemplateDictionary(None)
assert False, "Template dictionary fail case list==None not caught"
except:
assert True
TemplateDictionary(None)
def test_get_packet(self):
msg_dict = TemplateDictionary(self.template_list)
@@ -55,7 +51,7 @@ class TestDictionary(unittest.TestCase):
def test_get_packet_pair(self):
msg_dict = TemplateDictionary(self.template_list)
packet = msg_dict.get_template_by_pair('Medium', 8)
assert packet.name == 'ConfirmEnableSimulator', "Frequency-Number pair resulting in incorrect packet"
assert packet.name == 'ConfirmEnableSimulator', "Frequency-Number pair resulting in incorrect packet"
class TestTemplates(unittest.TestCase):
@@ -69,11 +65,8 @@ class TestTemplates(unittest.TestCase):
assert parser.message_templates is not None, "Parsing template file failed"
def test_parser_fail(self):
try:
with self.assertRaises(Exception):
_parser = MessageTemplateParser(None)
assert False, "Fail case TEMPLATE_FILE == NONE not caught"
except:
assert True
def test_parser_version(self):
version = self.parser.version
@@ -111,15 +104,15 @@ class TestTemplates(unittest.TestCase):
block = self.msg_dict['OpenCircuit'].get_block('CircuitInfo')
tp = block.block_type
num = block.number
assert tp == MsgBlockType.MBT_SINGLE, "Expected: Single Returned: " + tp
assert num == 0, "Expected: 0 Returned: " + str(num)
assert tp == MsgBlockType.MBT_SINGLE, "Expected: Single Returned: " + tp
assert num == 0, "Expected: 0 Returned: " + str(num)
def test_block_multiple(self):
block = self.msg_dict['NeighborList'].get_block('NeighborBlock')
tp = block.block_type
num = block.number
assert tp == MsgBlockType.MBT_MULTIPLE, "Expected: Multiple Returned: " + tp
assert num == 4, "Expected: 4 Returned: " + str(num)
assert num == 4, "Expected: 4 Returned: " + str(num)
def test_variable(self):
variable = self.msg_dict['StartPingCheck'].get_block('PingID').get_variable('PingID')
@@ -153,7 +146,7 @@ class TestTemplates(unittest.TestCase):
medium_count = 0
high_count = 0
fixed_count = 0
while True:
while True:
try:
line = next(lines)
except StopIteration:

View File

@@ -86,6 +86,6 @@ class TestDeserializer(unittest.TestCase):
# test the 72 byte ObjectUpdate.ObjectData.ObjectData case
hex_string = '00000000000000000000803f6666da41660000432fffff422233e34100000000000000000000000000000000000000' \
'000000000000000000000000000e33de3c000000000000000000000000'
position = TemplateDataPacker.unpack(unhexlify(hex_string)[16:16+12], MsgType.MVT_LLVector3)
position = TemplateDataPacker.unpack(unhexlify(hex_string)[16:16 + 12], MsgType.MVT_LLVector3)
self.assertEqual(position, (128.00155639648438, 127.99840545654297, 28.399967193603516))
self.assertIsInstance(position, Vector3)
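The 72-byte `ObjectData` case above stores position as 12 bytes at offset 16. Assuming `MVT_LLVector3` packs three little-endian f32s (which the expected values bear out), a plain `struct` equivalent of the unpack is:

```python
import struct
from binascii import unhexlify

hex_string = '00000000000000000000803f6666da41660000432fffff422233e34100000000000000000000000000000000000000' \
             '000000000000000000000000000e33de3c000000000000000000000000'
# Position lives at byte offset 16, as three little-endian floats
x, y, z = struct.unpack("<fff", unhexlify(hex_string)[16:16 + 12])
print(round(x, 3), round(y, 3), round(z, 3))  # 128.002 127.998 28.4
```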

View File

@@ -52,6 +52,7 @@ class BaseIntegrationTest(unittest.IsolatedAsyncioTestCase):
self.session.open_circuit(self.client_addr, self.region_addr,
self.protocol.transport)
self.session.main_region = self.session.regions[-1]
self.session.main_region.handle = 0
def _msg_to_datagram(self, msg: ProxiedMessage, src, dst, direction, socks_header=True):
serialized = self.serializer.serialize(msg)

View File

@@ -0,0 +1,72 @@
from __future__ import annotations
import asyncio
from mitmproxy.test import tflow, tutils
from mitmproxy.http import HTTPFlow
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.http_event_manager import MITMProxyEventManager
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.http_proxy import HTTPFlowContext, SerializedCapData
from hippolyzer.lib.proxy.message_logger import FilteringMessageLogger
from hippolyzer.lib.proxy.sessions import SessionManager
from . import BaseIntegrationTest
class MockAddon(BaseAddon):
def handle_http_request(self, session_manager: SessionManager, flow: HippoHTTPFlow):
flow.metadata["touched_addon"] = True
def handle_http_response(self, session_manager: SessionManager, flow: HippoHTTPFlow):
flow.metadata["touched_addon"] = True
class SimpleMessageLogger(FilteringMessageLogger):
@property
def entries(self):
return self._filtered_entries
class LLUDPIntegrationTests(BaseIntegrationTest):
def setUp(self) -> None:
super().setUp()
self.addon = MockAddon()
AddonManager.init([], self.session_manager, [self.addon])
self.flow_context = HTTPFlowContext()
self.http_event_manager = MITMProxyEventManager(self.session_manager, self.flow_context)
self._setup_circuit()
async def _pump_one_event(self):
# If we don't yield then the new entry won't end up in the queue
await asyncio.sleep(0.001)
await self.http_event_manager.pump_proxy_event()
await asyncio.sleep(0.001)
async def test_http_flow_request(self):
# mimic a request coming in from mitmproxy over the queue
fake_flow = tflow.tflow(req=tutils.treq(host="example.com"))
fake_flow.metadata["cap_data_ser"] = SerializedCapData()
self.flow_context.from_proxy_queue.put(("request", fake_flow.get_state()), True)
await self._pump_one_event()
self.assertTrue(self.flow_context.from_proxy_queue.empty())
self.assertFalse(self.flow_context.to_proxy_queue.empty())
flow_state = self.flow_context.to_proxy_queue.get(True)[2]
mitm_flow: HTTPFlow = HTTPFlow.from_state(flow_state)
# The response sent back to mitmproxy should have been our modified version
self.assertEqual(True, mitm_flow.metadata["touched_addon"])
async def test_http_flow_response(self):
# mimic a response coming in from mitmproxy over the queue
fake_flow = tflow.tflow(req=tutils.treq(host="example.com"), resp=tutils.tresp())
fake_flow.metadata["cap_data_ser"] = SerializedCapData()
self.flow_context.from_proxy_queue.put(("response", fake_flow.get_state()), True)
await self._pump_one_event()
self.assertTrue(self.flow_context.from_proxy_queue.empty())
self.assertFalse(self.flow_context.to_proxy_queue.empty())
flow_state = self.flow_context.to_proxy_queue.get(True)[2]
mitm_flow: HTTPFlow = HTTPFlow.from_state(flow_state)
# The response sent back to mitmproxy should have been our modified version
self.assertEqual(True, mitm_flow.metadata["touched_addon"])
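`_pump_one_event` above sleeps around the pump because an item put on a queue by one coroutine only becomes visible to another after control is yielded back to the event loop. A generic illustration of that yield requirement (not Hippolyzer code):

```python
import asyncio

async def main() -> list:
    q: asyncio.Queue = asyncio.Queue()
    seen: list = []

    async def consumer():
        while True:
            seen.append(await q.get())

    task = asyncio.create_task(consumer())
    q.put_nowait(1)
    # Without yielding here, consumer() never gets scheduled before we look
    await asyncio.sleep(0)
    task.cancel()
    return seen

print(asyncio.run(main()))  # [1]
```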

View File

@@ -13,6 +13,7 @@ from hippolyzer.lib.base.objects import Object
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.proxy.message_logger import FilteringMessageLogger
from hippolyzer.lib.proxy.packets import ProxiedUDPPacket, Direction
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
@@ -32,7 +33,13 @@ class MockAddon(BaseAddon):
def handle_object_updated(self, session: Session, region: ProxiedRegion,
obj: Object, updated_props: Set[str]):
self.events.append(("object_update", session.id, region.circuit_addr, obj.LocalID))
self.events.append(("object_update", session.id, region.circuit_addr, obj.LocalID, updated_props))
class SimpleMessageLogger(FilteringMessageLogger):
@property
def entries(self):
return self._filtered_entries
class LLUDPIntegrationTests(BaseIntegrationTest):
@@ -46,31 +53,31 @@ class LLUDPIntegrationTests(BaseIntegrationTest):
localid = random.getrandbits(32)
return b'\x00\x00\x00\x0c\xba\x00\r\x00\x00\x00\x00\x00\x00\x00\x00\xff\xff\x03\xd0\x04\x00\x10' \
b'\xe6\x00\x89UgG$\xcbC\xed\x92\x0bG\xca\xed\x15F_' + struct.pack("<I", localid) + \
b'\xe6\x00\x12\x12\x10\xbf\x16XB~\x8f\xb4\xfb\x00\x1a\xcd\x9b\xe5' + struct.pack("<I", localid) + \
b'\t\x00\xcdG\x00\x00\x03\x00\x00\x00\x1cB\x00\x00\x1cB\xcd\xcc\xcc=\xedG,' \
b'B\x9e\xb1\x9eBff\xa0A\x00\x00\x00\x00\x00\x00\x00\x00[' \
b'\x8b\xf8\xbe\xc0\x00\x00\x00\x89UgG$\xcbC\xed\x92\x0bG\xca\xed\x15F_\x00\x00\x00\x00\x00' \
b'\x8b\xf8\xbe\xc0\x00\x00\x00k\x9b\xc4\xfe3\nOa\xbb\xe2\xe4\xb2C\xac7\xbd\x00\x00\x00\x00\x00' \
b'\x00\x00\x00\x00\x00\xa2=\x010\x00\x11\x00\x00\x00\x89UgG$\xcbC\xed\x92\x0bG\xca\xed' \
b'\x15F_@ \x00\x00\x00\x00d\x96\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00' \
b'\x00?\x00\x00\x00\x89UgG$\xcbC\xed\x92\x0bG\xca\xed\x15F_\x00\x00\x00\x003\x00ff\x86\xbf' \
b'\x00?\x00\x00\x00\x1c\x9fJoI\x8dH\xa0\x9d\xc4&\'\'\x19=g\x00\x00\x00\x003\x00ff\x86\xbf' \
b'\x00ff\x86?\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x89UgG$\xcbC' \
b'\xed\x92\x0bG\xca\xed\x15F_\x10\x00\x00\x003\x00\x01\x01\x00\x00\x00\x00\xdb\x0f\xc9@\xa6' \
b'\x9b\xc4=\xd0\x04\x00\x10\xe6\x00\x89UgG$\xcbC\xed\x92\x0bG\xca\xed\x15F_\\\x04\x00\x00\t' \
b'\x00\xd3G\x00\x00\x03\x00\x00\x00\x1cB\x00\x00\x1cB\xcd\xcc\xcc=\t\x08\x9cA\xf2\x03' \
b'\xa5Bff\xa0A\x00\x00\x00\x00\x00\x00\x00\x00[' \
b'\x8b\xf8\xbe\xc0\x00\x00\x00\x89UgG$\xcbC\xed\x92\x0bG\xca\xed\x15F_\x00\x00\x00\x00\x00' \
b'\x9b\xc4=\xd0\x04\x00\x10\xe6\x00\xc2\xa62\xe2\x9b\xd7L\xc4\xbb\xd6\x1fKC\xa6\xdf\x8d\\\x04\x00' \
b'\x00\t\x00\xd3G\x00\x00\x03\x00\x00\x00\x1cB\x00\x00\x1cB\xcd\xcc\xcc=\t\x08\x9cA\xf2\x03' \
b'\xa5Bff\xa0A\x00\x00\x00\x00\x00\x00\x00\x00[\x8b\xf8' \
b'\xbe\xc0\x00\x00\x00\x0b\x1b\xa0\xd1\x97=C\xcd\xae\x19\xfd\xc9\xbb\x88\x05\xc3\x00\x00\x00\x00\x00' \
b'\x00\x00\x00\x00\x00\xa2=\x010\x00\x11\x00\x00\x00\x89UgG$\xcbC\xed\x92\x0bG\xca\xed' \
b'\x15F_@ \x00\x00\x00\x00d\x96\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00' \
b'\x00?\x00\x00\x00\x89UgG$\xcbC\xed\x92\x0bG\xca\xed\x15F_\x00\x00\x00\x003\x00ff\x86\xbf' \
b'\x00?\x00\x00\x00\xbd\x8b\xd7h{\xdbM\xbc\x8c3X\xa6\xa6\x0c\x94\xd7\x00\x00\x00\x003\x00ff\x86\xbf' \
b'\x00ff\x86?\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x89UgG$\xcbC' \
b'\xed\x92\x0bG\xca\xed\x15F_\x10\x00\x00\x003\x00\x01\x01\x00\x00\x00\x00\xdb\x0f\xc9@\xa6' \
b'\x9b\xc4=\xd0\x04\x00\x10\xe6\x00\x89UgG$\xcbC\xed\x92\x0bG\xca\xed\x15F_\xe2\x05\x00\x00' \
b'\x9b\xc4=\xd0\x04\x00\x10\xe6\x00\xd1e\xac\xff,NBK\x91d\xbb\x15\\\x0b\xc3\x9c\xe2\x05\x00\x00' \
b'\t\x00\xbbG\x00\x00\x03\x00\x00\x00\x1cB\x00\x00\x1cB\xcd\xcc\xcc=\x0f5\x97AY\x98ZBff' \
b'\xa0A\x00\x00\x00\x00\x00\x00\x00\x00\xe6Y0\xbf\xc0\x00\x00\x00\x89UgG$\xcbC\xed\x92\x0bG' \
b'\xca\xed\x15F_\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xa2=\x010\x00\x11\x00\x00\x00' \
b'\x89UgG$\xcbC\xed\x92\x0bG\xca\xed\x15F_@ ' \
b'#\xce\xf8\xf4\x0cJD.\xb7"\x96\x1cK\xd9\x01\x1b@ ' \
b'\x00\x00\x00\x00d\x96\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00' \
b'?\x00\x00\x00\x89UgG$\xcbC\xed\x92\x0bG\xca\xed\x15F_\x00\x00\x00\x003\x00ff\x86\xbf' \
b'?\x00\x00\x003\xe1\xa1\xcf<\xbdD\xc4\xa0\xe6b\xe9\xbf=\xa2@\x00\x00\x00\x003\x00ff\x86\xbf' \
b'\x00ff\x86?\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x89UgG$\xcbC' \
b'\xed\x92\x0bG\xca\xed\x15F_\x10\x00\x00\x003\x00\x01\x01\x00\x00\x00\x00\xdb\x0f\xc9@\xa6' \
b'\x9b\xc4='
@@ -168,4 +175,31 @@ class LLUDPIntegrationTests(BaseIntegrationTest):
await self._wait_drained()
obj = self.session.regions[0].objects.lookup_localid(1234)
self.assertIsInstance(obj.TextureEntry, lazy_object_proxy.Proxy)
self.assertEqual(obj.TextureEntry.Textures[None], UUID("89556747-24cb-43ed-920b-47caed15465f"))
self.assertEqual(obj.TextureEntry.Textures[None], UUID("1c9f4a6f-498d-48a0-9dc4-262727193d67"))
self.assertEqual(len(self.session.regions[0].objects), 3)
async def test_object_updated_changed_property_list(self):
self._setup_circuit()
# One creating update and one no-op update
obj_update = self._make_objectupdate_compressed(1234)
self.protocol.datagram_received(obj_update, self.region_addr)
obj_update = self._make_objectupdate_compressed(1234)
self.protocol.datagram_received(obj_update, self.region_addr)
await self._wait_drained()
self.assertEqual(len(self.session.regions[0].objects), 3)
object_events = [e for e in self.addon.events if e[0] == "object_update"]
# 3 objects in example packet and we sent it twice
self.assertEqual(len(object_events), 6)
# Only TextureEntry should be marked updated since it's a proxy object
self.assertEqual(object_events[-1][-1], {"TextureEntry"})
async def test_message_logger(self):
message_logger = SimpleMessageLogger()
self.session_manager.message_logger = message_logger
self._setup_circuit()
obj_update = self._make_objectupdate_compressed(1234)
self.protocol.datagram_received(obj_update, self.region_addr)
await self._wait_drained()
entries = message_logger.entries
self.assertEqual(len(entries), 1)
self.assertEqual(entries[0].name, "ObjectUpdateCompressed")

View File

@@ -0,0 +1,65 @@
import unittest
import aiohttp
import aioresponses
from yarl import URL
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.proxy.caps_client import CapsClient
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import SessionManager
class TestCapsClient(unittest.IsolatedAsyncioTestCase):
def setUp(self) -> None:
self.session = SessionManager().create_session({
"session_id": UUID.random(),
"secure_session_id": UUID.random(),
"agent_id": UUID.random(),
"circuit_code": 0,
"sim_ip": "127.0.0.1",
"sim_port": "1",
"seed_capability": "https://test.localhost:4/foo",
})
self.region = ProxiedRegion(("127.0.0.1", 1), "", self.session)
self.caps_client = CapsClient(self.region)
async def test_bare_url_works(self):
with aioresponses.aioresponses() as m:
m.get("https://example.com/", body=b"foo")
async with self.caps_client.get("https://example.com/") as resp:
self.assertEqual(await resp.read(), b"foo")
async def test_own_session_works(self):
with aioresponses.aioresponses() as m:
async with aiohttp.ClientSession() as sess:
m.get("https://example.com/", body=b"foo")
async with self.caps_client.get("https://example.com/", session=sess) as resp:
self.assertEqual(await resp.read(), b"foo")
async def test_read_llsd(self):
with aioresponses.aioresponses() as m:
m.get("https://example.com/", body=b"<llsd><integer>2</integer></llsd>")
async with self.caps_client.get("https://example.com/") as resp:
self.assertEqual(await resp.read_llsd(), 2)
async def test_caps(self):
self.region.update_caps({"Foobar": "https://example.com/"})
with aioresponses.aioresponses() as m:
m.post("https://example.com/baz", body=b"ok")
data = {"hi": "hello"}
headers = {"Foo": "bar"}
async with self.caps_client.post("Foobar", path="baz", llsd=data, headers=headers) as resp:
self.assertEqual(await resp.read(), b"ok")
# Our original dict should not have been touched
self.assertEqual(headers, {"Foo": "bar"})
req_key = ("POST", URL("https://example.com/baz"))
req_body = m.requests[req_key][0].kwargs['data']
self.assertEqual(req_body, b'<?xml version="1.0" ?><llsd><map><key>hi</key><string>hello'
b'</string></map></llsd>')
with self.assertRaises(KeyError):
with self.caps_client.get("BadCap"):
pass

View File

@@ -0,0 +1,109 @@
import unittest
from mitmproxy.test import tflow, tutils
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.message_logger import HTTPMessageLogEntry
from hippolyzer.lib.proxy.sessions import SessionManager
class TestHTTPFlows(unittest.TestCase):
def setUp(self) -> None:
self.session_manager = SessionManager()
self.session = self.session_manager.create_session({
"session_id": UUID.random(),
"secure_session_id": UUID.random(),
"agent_id": UUID.random(),
"circuit_code": 0,
"sim_ip": "127.0.0.1",
"sim_port": "1",
"seed_capability": "https://test.localhost:4/foo",
})
self.region = self.session.register_region(
("127.0.0.1", 2),
"https://test.localhost:4/foo",
handle=90,
)
self.region.update_caps({
"FakeCap": "http://example.com",
"ViewerAsset": "http://assets.example.com",
})
def test_request_formatting(self):
req = tutils.treq(host="example.com", port=80)
fake_flow = tflow.tflow(req=req, resp=tutils.tresp())
flow = HippoHTTPFlow.from_state(fake_flow.get_state(), self.session_manager)
# Make sure cap resolution works correctly
flow.cap_data = self.session_manager.resolve_cap(flow.request.url)
entry = HTTPMessageLogEntry(flow)
self.assertEqual(entry.request(beautify=True), """GET [[FakeCap]]/path HTTP/1.1\r
# http://example.com/path\r
header: qvalue\r
content-length: 7\r
\r
content""")
def test_binary_request_formatting(self):
req = tutils.treq(host="example.com", port=80)
fake_flow = tflow.tflow(req=req, resp=tutils.tresp())
flow = HippoHTTPFlow.from_state(fake_flow.get_state(), self.session_manager)
# This should trigger the escaped body path without changing content-length
flow.request.content = b"c\x00ntent"
entry = HTTPMessageLogEntry(flow)
self.assertEqual(entry.request(beautify=True), """GET http://example.com/path HTTP/1.1\r
header: qvalue\r
content-length: 7\r
X-Hippo-Escaped-Body: 1\r
\r
c\\x00ntent""")
def test_llsd_response_formatting(self):
fake_flow = tflow.tflow(req=tutils.treq(), resp=tutils.tresp())
flow = HippoHTTPFlow.from_state(fake_flow.get_state(), self.session_manager)
# Half the time LLSD is sent with a random Content-Type and no PI indicating
# what flavor of LLSD it is. Make sure the sniffing works correctly.
flow.response.content = b"<llsd><integer>1</integer></llsd>"
entry = HTTPMessageLogEntry(flow)
self.assertEqual(entry.response(beautify=True), """HTTP/1.1 200 OK\r
header-response: svalue\r
content-length: 33\r
\r
<?xml version="1.0" ?>
<llsd>
<integer>1</integer>
</llsd>
""")
def test_flow_state_serde(self):
fake_flow = tflow.tflow(req=tutils.treq(host="example.com"), resp=tutils.tresp())
flow = HippoHTTPFlow.from_state(fake_flow.get_state(), self.session_manager)
# Make sure cap resolution works correctly
flow.cap_data = self.session_manager.resolve_cap(flow.request.url)
flow_state = flow.get_state()
new_flow = HippoHTTPFlow.from_state(flow_state, self.session_manager)
self.assertIs(self.session, new_flow.cap_data.session())
def test_http_asset_repo(self):
asset_repo = self.session_manager.asset_repo
asset_id = asset_repo.create_asset(b"foobar", one_shot=True)
req = tutils.treq(host="assets.example.com", path=f"/?animatn_id={asset_id}")
fake_flow = tflow.tflow(req=req)
flow = HippoHTTPFlow.from_state(fake_flow.get_state(), self.session_manager)
# Have to resolve cap data so the asset repo knows this is an asset server cap
flow.cap_data = self.session_manager.resolve_cap(flow.request.url)
self.assertTrue(asset_repo.try_serve_asset(flow))
self.assertEqual(b"foobar", flow.response.content)
def test_temporary_cap_resolution(self):
self.region.register_temporary_cap("TempExample", "http://not.example.com")
self.region.register_temporary_cap("TempExample", "http://not2.example.com")
# Resolving the cap should consume it
cap_data = self.session_manager.resolve_cap("http://not.example.com")
self.assertEqual(cap_data.cap_name, "TempExample")
# A CapData object should always be returned, but the cap_name field will be None
new_cap_data = self.session_manager.resolve_cap("http://not.example.com")
self.assertIsNone(new_cap_data.cap_name)
# The second temp cap with the same name should still be in there
cap_data = self.session_manager.resolve_cap("http://not2.example.com")
self.assertEqual(cap_data.cap_name, "TempExample")
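The temporary-cap test above relies on resolve-and-consume semantics: resolving a temporary cap's URL removes that URL from the registry, while another URL registered under the same cap name stays resolvable. A minimal sketch of that pattern (hypothetical names, not Hippolyzer's actual API):

```python
class TempCapRegistry:
    """Illustrative registry with consume-on-resolve semantics."""

    def __init__(self):
        self._caps = {}  # URL -> cap name

    def register(self, cap_name: str, url: str) -> None:
        self._caps[url] = cap_name

    def resolve(self, url: str):
        # pop() consumes the entry, so a second resolve of the
        # same URL returns None rather than the cap name.
        return self._caps.pop(url, None)


registry = TempCapRegistry()
registry.register("TempExample", "http://not.example.com")
registry.register("TempExample", "http://not2.example.com")

assert registry.resolve("http://not.example.com") == "TempExample"
# Consumed: resolving the same URL again yields nothing
assert registry.resolve("http://not.example.com") is None
# The second URL registered under the same name is unaffected
assert registry.resolve("http://not2.example.com") == "TempExample"
```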

View File

@@ -1,13 +1,17 @@
import unittest
from mitmproxy.test import tflow, tutils
from hippolyzer.lib.base.datatypes import Vector3
from hippolyzer.lib.base.message.message import Block
from hippolyzer.lib.base.message.udpdeserializer import UDPMessageDeserializer
from hippolyzer.lib.base.settings import Settings
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.http_proxy import SerializedCapData
from hippolyzer.lib.proxy.message import ProxiedMessage as Message
from hippolyzer.lib.proxy.message_logger import LLUDPMessageLogEntry, HTTPMessageLogEntry
from hippolyzer.lib.proxy.message_filter import compile_filter
from hippolyzer.lib.proxy.sessions import SessionManager
OBJECT_UPDATE = b'\xc0\x00\x00\x00Q\x00\x0c\x00\x01\xea\x03\x00\x02\xe6\x03\x00\x01\xbe\xff\x01\x06\xbc\x8e\x0b\x00' \
b'\x01i\x94\x8cjM"\x1bf\xec\xe4\xac1c\x93\xcbKW\x89\x98\x01\t\x03\x00\x01Q@\x88>Q@\x88>Q@\x88><\xa2D' \
@@ -17,7 +21,7 @@ OBJECT_UPDATE = b'\xc0\x00\x00\x00Q\x00\x0c\x00\x01\xea\x03\x00\x02\xe6\x03\x00\
b'\x00\x02d&\x00\x03\x0e\x00\x01\x0e\x00\x01\x19\x00\x01\x80\x00\x01\x80\x00\x01\x80\x00\x01\x80\x00' \
b'\x01\x80\x00\x01\x80\x91\x11\xd2^/\x12\x8f\x81U\xa7@:x\xb3\x0e-\x00\x10\x03\x01\x00\x03\x1e%n\xa2' \
b'\xff\xc5\xe0\x83\x00\x01\x06\x00\x01\r\r\x01\x00\x11\x0e\xdc\x9b\x83\x98\x9aJv\xac\xc3\xdb\xbf7Ta' \
b'\x88\x00"'
class MessageFilterTests(unittest.TestCase):
@@ -46,8 +50,10 @@ class MessageFilterTests(unittest.TestCase):
def test_equality(self):
msg = LLUDPMessageLogEntry(Message("Foo", Block("Bar", Baz=1)), None, None)
self.assertTrue(self._filter_matches("Foo.Bar.Baz == 1", msg))
self.assertTrue(self._filter_matches("Foo.Bar.Baz == 0x1", msg))
msg.message["Bar"]["Baz"] = 2
self.assertFalse(self._filter_matches("Foo.Bar.Baz == 1", msg))
self.assertFalse(self._filter_matches("Foo.Bar.Baz == 0x1", msg))
def test_and(self):
msg = LLUDPMessageLogEntry(Message("Foo", Block("Bar", Baz=1)), None, None)
@@ -95,6 +101,14 @@ class MessageFilterTests(unittest.TestCase):
self.assertFalse(self._filter_matches("Foo.Bar.Baz < (0, 3, 0)", msg))
self.assertTrue(self._filter_matches("Foo.Bar.Baz > (0, 0, 0)", msg))
def test_enum_specifier(self):
# 2 is the enum val for SculptType.TORUS
msg = LLUDPMessageLogEntry(Message("Foo", Block("Bar", Baz=2)), None, None)
self.assertTrue(self._filter_matches("Foo.Bar.Baz == SculptType.TORUS", msg))
# bitwise AND should work as well
self.assertTrue(self._filter_matches("Foo.Bar.Baz & SculptType.TORUS", msg))
self.assertFalse(self._filter_matches("Foo.Bar.Baz == SculptType.SPHERE", msg))
def test_tagged_union_subfield(self):
settings = Settings()
settings.ENABLE_DEFERRED_PACKET_PARSING = False
@@ -105,6 +119,17 @@ class MessageFilterTests(unittest.TestCase):
self.assertTrue(self._filter_matches("ObjectUpdate.ObjectData.ObjectData.Position > (88, 41, 25)", entry))
self.assertTrue(self._filter_matches("ObjectUpdate.ObjectData.ObjectData.Position < (90, 43, 27)", entry))
def test_http_flow(self):
session_manager = SessionManager()
fake_flow = tflow.tflow(req=tutils.treq(), resp=tutils.tresp())
fake_flow.metadata["cap_data_ser"] = SerializedCapData(
cap_name="FakeCap",
)
flow = HippoHTTPFlow.from_state(fake_flow.get_state(), session_manager)
entry = HTTPMessageLogEntry(flow)
self.assertTrue(self._filter_matches("FakeCap", entry))
self.assertFalse(self._filter_matches("NotFakeCap", entry))
if __name__ == "__main__":
unittest.main()
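The LLSD response-formatting test in the previous file depends on content sniffing: as the comment there notes, LLSD is often served with an arbitrary Content-Type and no XML processing instruction, so the body itself has to be inspected. A rough sketch of such a sniffer (an assumption about the general approach, not Hippolyzer's actual implementation):

```python
def sniff_llsd(body: bytes) -> bool:
    """Guess whether a response body is XML-serialized LLSD.

    Ignores the Content-Type header entirely and checks for an
    <llsd> root element, optionally preceded by an XML declaration.
    """
    head = body.lstrip()
    if head.startswith(b"<?xml"):
        # Skip over the processing instruction if one is present
        end = head.find(b"?>")
        if end == -1:
            return False
        head = head[end + 2:].lstrip()
    return head.startswith(b"<llsd")


assert sniff_llsd(b"<llsd><integer>1</integer></llsd>")
assert sniff_llsd(b'<?xml version="1.0" ?>\n<llsd><integer>1</integer></llsd>')
assert not sniff_llsd(b'{"not": "llsd"}')
```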

View File

@@ -294,10 +294,12 @@ class HumanReadableMessageTests(unittest.TestCase):
class TestMessageSubfieldSerializers(unittest.TestCase):
def setUp(self):
self.chat_msg = ProxiedMessage(
'ChatFromViewer',
Block('AgentData',
AgentID=UUID('550e8400-e29b-41d4-a716-446655440000'),
SessionID=UUID('550e8400-e29b-41d4-a716-446655440000')),
Block('ChatData', Message="Chatting\n", Type=1, Channel=0))
def test_pretty_repr(self):
expected_repr = r"""ProxiedMessage('ChatFromViewer',

View File

@@ -1,3 +1,4 @@
import math
import random
import unittest
from typing import *
@@ -12,11 +13,13 @@ from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.objects import ObjectManager
from hippolyzer.lib.proxy.message import ProxiedMessage as Message
from hippolyzer.lib.proxy.templates import PCode
class MockRegion:
def __init__(self, message_handler: MessageHandler):
self.session = lambda: None
self.handle = 123
self.message_handler = message_handler
self.http_message_handler = MessageHandler()
@@ -43,9 +46,11 @@ class ObjectManagerTests(unittest.TestCase):
self.object_addon = ObjectTrackingAddon()
AddonManager.init([], None, [self.object_addon])
def _create_object_update(self, local_id=None, full_id=None, parent_id=None, pos=None, rot=None,
pcode=None, namevalue=None) -> Message:
pos = pos if pos is not None else (1.0, 2.0, 3.0)
rot = rot if rot is not None else (0.0, 0.0, 0.0, 1.0)
pcode = pcode if pcode is not None else PCode.PRIMITIVE
msg = Message(
"ObjectUpdate",
Block("RegionData", RegionHandle=123, TimeDilation=123),
@@ -53,7 +58,7 @@ class ObjectManagerTests(unittest.TestCase):
"ObjectData",
ID=local_id if local_id is not None else random.getrandbits(32),
FullID=full_id if full_id else UUID.random(),
PCode=pcode,
Scale=Vector3(0.5, 0.5, 0.5),
UpdateFlags=268568894,
PathCurve=16,
@@ -61,6 +66,7 @@ class ObjectManagerTests(unittest.TestCase):
ProfileCurve=1,
PathScaleX=100,
PathScaleY=100,
NameValue=namevalue,
TextureEntry=b'\x89UgG$\xcbC\xed\x92\x0bG\xca\xed\x15F_\x00\x00\x00\x00\x00\x00\x00\x00\x80?\x00\x00'
b'\x00\x80?\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00',
@@ -85,8 +91,11 @@ class ObjectManagerTests(unittest.TestCase):
# Run through (de)serializer to fill in any missing vars
return self.deserializer.deserialize(self.serializer.serialize(msg))
def _create_object(self, local_id=None, full_id=None, parent_id=None, pos=None, rot=None,
pcode=None, namevalue=None) -> Object:
msg = self._create_object_update(
local_id=local_id, full_id=full_id, parent_id=parent_id, pos=pos, rot=rot,
pcode=pcode, namevalue=namevalue)
self.message_handler.handle(msg)
return self.object_manager.lookup_fullid(msg["ObjectData"]["FullID"])
@@ -102,6 +111,9 @@ class ObjectManagerTests(unittest.TestCase):
def _kill_object(self, obj: Object):
self.message_handler.handle(self._create_kill_object(obj.LocalID))
def _get_avatar_positions(self) -> Dict[UUID, Vector3]:
return {av.FullID: av.RegionPosition for av in self.object_manager.all_avatars}
def test_basic_tracking(self):
"""Does creating an object result in it being tracked?"""
msg = self._create_object_update()
@@ -122,14 +134,33 @@ class ObjectManagerTests(unittest.TestCase):
self.assertEqual(set(), self.object_manager.missing_locals)
self.assertSequenceEqual([child.LocalID], parent.ChildIDs)
def test_killing_parent_kills_children(self):
_child = self._create_object(local_id=2, parent_id=1)
parent = self._create_object(local_id=1)
self._kill_object(parent)
parent = self._create_object(local_id=1)
# We should not have picked up any children
self.assertSequenceEqual([], parent.ChildIDs)
def test_hierarchy_killed(self):
_child = self._create_object(local_id=3, parent_id=2)
_other_child = self._create_object(local_id=4, parent_id=2)
_parent = self._create_object(local_id=2, parent_id=1)
grandparent = self._create_object(local_id=1)
# KillObject implicitly kills all known descendents at that point
self._kill_object(grandparent)
self.assertEqual(0, len(self.object_manager))
def test_hierarchy_avatar_not_killed(self):
_child = self._create_object(local_id=3, parent_id=2)
_parent = self._create_object(local_id=2, parent_id=1, pcode=PCode.AVATAR)
grandparent = self._create_object(local_id=1)
# KillObject should only "unsit" child avatars (does this require an ObjectUpdate
# or is ParentID=0 implied?)
self._kill_object(grandparent)
self.assertEqual(2, len(self.object_manager))
self.assertIsNotNone(self.object_manager.lookup_localid(2))
def test_attachment_orphan_parent_tracking(self):
"""
@@ -142,15 +173,6 @@ class ObjectManagerTests(unittest.TestCase):
parent = self._create_object(local_id=2, parent_id=1)
self.assertSequenceEqual([child.LocalID], parent.ChildIDs)
def test_unparenting_succeeds(self):
child = self._create_object(local_id=3, parent_id=2)
parent = self._create_object(local_id=2)
@@ -229,6 +251,65 @@ class ObjectManagerTests(unittest.TestCase):
self.assertEqual(parent.RegionPosition, (0.0, 0.0, 0.0))
self.assertEqual(child.RegionPosition, (1.0, 2.0, 0.0))
def test_avatar_locations(self):
agent1_id = UUID.random()
agent2_id = UUID.random()
self.message_handler.handle(Message(
"CoarseLocationUpdate",
Block("AgentData", AgentID=agent1_id),
Block("AgentData", AgentID=agent2_id),
Block("Location", X=1, Y=2, Z=3),
Block("Location", X=2, Y=3, Z=4),
))
self.assertDictEqual(self._get_avatar_positions(), {
# CoarseLocation's Z axis is multiplied by 4
agent1_id: Vector3(1, 2, 12),
agent2_id: Vector3(2, 3, 16),
})
# Simulate an avatar sitting on an object
seat_object = self._create_object(pos=(0, 0, 3))
# If we have a real object pos it should override coarse pos
avatar_obj = self._create_object(full_id=agent1_id, pcode=PCode.AVATAR,
parent_id=seat_object.LocalID, pos=Vector3(0, 0, 2))
self.assertDictEqual(self._get_avatar_positions(), {
# Agent is seated, make sure this is region and not local pos
agent1_id: Vector3(0, 0, 5),
agent2_id: Vector3(2, 3, 16),
})
# If the object is killed and no coarse pos, it shouldn't be in the dict
# CoarseLocationUpdates are expected to be complete, so any agents missing
# are no longer in the sim.
self._kill_object(avatar_obj)
self.message_handler.handle(Message(
"CoarseLocationUpdate",
Block("AgentData", AgentID=agent2_id),
Block("Location", X=2, Y=3, Z=4),
))
self.assertDictEqual(self._get_avatar_positions(), {
agent2_id: Vector3(2, 3, 16),
})
# 255 on Z axis means we can't guess the real Z
self.message_handler.handle(Message(
"CoarseLocationUpdate",
Block("AgentData", AgentID=agent2_id),
Block("Location", X=2, Y=3, Z=math.inf),
))
self.assertDictEqual(self._get_avatar_positions(), {
agent2_id: Vector3(2, 3, math.inf),
})
def test_name_cache(self):
# Receiving an update with a NameValue for an avatar should update NameCache
obj = self._create_object(
pcode=PCode.AVATAR,
namevalue=b'DisplayName STRING RW DS unicodename\n'
b'FirstName STRING RW DS firstname\n'
b'LastName STRING RW DS Resident\n'
b'Title STRING RW DS foo',
)
self.assertEqual(self.object_manager.name_cache.lookup(obj.FullID).FirstName, "firstname")
av = self.object_manager.lookup_avatar(obj.FullID)
self.assertEqual(av.Name, "firstname Resident")
if __name__ == "__main__":
unittest.main()