# Compare commits

39 commits:

`0d9593e14c`, `28dfe2f1b2`, `c8f7231eae`, `00e9ecb765`, `2892bbeb98`, `28f57a8836`, `943b8b11d5`, `88915dd8d7`, `60b39e27f8`, `8af87befbd`, `95e34bb07a`, `106eb5c063`, `e7f88eeed9`, `d07f100452`, `02c212e4a6`, `8989843042`, `a217a30133`, `8514d7bae8`, `d9084c3332`, `0f35cc00d5`, `a6a7ce8fa3`, `269a1e163b`, `eb2b6ee870`, `79a4f72558`, `6316369e1a`, `1b0272f3b3`, `aedc2bf48c`, `5d3fd69e35`, `ae464f2c06`, `7d303d2bca`, `dda3759028`, `d4e1a7a070`, `d401842eef`, `1e4060f49c`, `a6c7f996ba`, `8fb36892cf`, `16c02d8b8c`, `badd4dbc78`, `a63418aaac`
**.github/workflows/bundle_windows.yml** — new file, 46 lines

`@@ -0,0 +1,46 @@`

```yaml
# Have to manually unzip this (it gets double zipped) and add it
# onto the release after it gets created. Don't want actions with repo write.
name: Bundle Windows EXE

on:
  # Only trigger on release creation
  release:
    types:
      - created
  workflow_dispatch:

jobs:
  build:
    runs-on: windows-latest
    strategy:
      matrix:
        python-version: [3.9]

    steps:
    - uses: actions/checkout@v2

    - name: Set up Python ${{ matrix.python-version }}
      uses: actions/setup-python@v2
      with:
        python-version: ${{ matrix.python-version }}

    - name: Install dependencies
      run: |
        python -m pip install --upgrade pip
        pip install -e .
        pip install cx_freeze

    - name: Bundle with cx_Freeze
      run: |
        python setup_cxfreeze.py build_exe
        pip install pip-licenses
        pip-licenses --format=plain-vertical --with-license-file --no-license-path --output-file=lib_licenses.txt
        python setup_cxfreeze.py finalize_cxfreeze

    - name: Upload the artifact
      uses: actions/upload-artifact@v2
      with:
        name: hippolyzer-gui-windows-${{ github.sha }}
        path: ./dist/**
```
**.github/workflows/pypi_publish.yml** — 2 lines changed

`@@ -6,6 +6,8 @@ on:`

```yaml
  release:
    types:
      - created
  workflow_dispatch:

# based on https://github.com/pypa/gh-action-pypi-publish
```
**.github/workflows/pytest.yml** — 24 lines changed

`@@ -12,16 +12,34 @@ jobs:`

```yaml
    steps:
    - uses: actions/checkout@v2

    - name: Set up Python ${{ matrix.python-version }}
      uses: actions/setup-python@v2
      with:
        python-version: ${{ matrix.python-version }}

    - name: Install dependencies
      run: |
        python -m pip install --upgrade pip
        pip install flake8 pytest
        if [ -f requirements.txt ]; then pip install -r requirements.txt; fi
        pip install -r requirements.txt
        pip install -r requirements-test.txt
    - name: Test with pytest
      # Tests are intentionally covered to detect broken tests.
      run: |
        pytest
        pytest --cov=./hippolyzer --cov=./tests --cov-report=xml

    # Keep this in a workflow without any other secrets in it.
    - name: Upload coverage to Codecov
      uses: codecov/codecov-action@v1
      with:
        token: ${{ secrets.CODECOV_TOKEN }}
        files: ./coverage.xml
        directory: ./coverage/reports/
        flags: unittests
        env_vars: OS,PYTHON
        name: codecov-umbrella
        fail_ci_if_error: false
        # We don't care if coverage drops
        continue-on-error: true
        path_to_write_report: ./coverage/codecov_report.txt
        verbose: false
```
**.gitignore** — 1 line changed

`@@ -1,6 +1,7 @@`

```
#use glob syntax
syntax: glob

__pycache__
*.pyc
build/*
*.egg-info
```
**README.md** — 17 lines changed

`@@ -1,5 +1,7 @@`

# Hippolyzer

[](https://codecov.io/gh/SaladDais/Hippolyzer)

[Hippolyzer](http://wiki.secondlife.com/wiki/Hippo) is a fork of Linden Lab's abandoned
[PyOGP library](http://wiki.secondlife.com/wiki/PyOGP)
targeting modern Python 3, with a focus on debugging issues in Second Life-compatible

`@@ -22,6 +24,9 @@ with low-level SL details. See the [Local Animation addon example](https://githu`

## Setup

### From Source

* Python 3.8 or above is **required**. If you're unable to upgrade your system Python package due to
  being on a stable distro, you can use [pyenv](https://github.com/pyenv/pyenv) to create
  a self-contained Python install with the appropriate version.

`@@ -32,6 +37,11 @@ with low-level SL details. See the [Local Animation addon example](https://githu`

  * Under Windows it's `<virtualenv_dir>\Scripts\activate.bat`
* Run `pip install hippolyzer`, or run `pip install -e .` in a cloned repo to install an editable version

### Binary Windows Builds

Binary Windows builds are available on the [Releases page](https://github.com/SaladDais/Hippolyzer/releases/).
I don't extensively test these; building from source is recommended.

## Proxy

A proxy is provided with both a CLI and a Qt-based interface. The proxy application wraps a

@@ -85,6 +95,9 @@ agent's session, you can do `(Meta.AgentID == None || Meta.AgentID == "d929385f-`

Vectors can also be compared. This will get any ObjectUpdate variant that occurs within a certain range:
`(*ObjectUpdate*.ObjectData.*Data.Position > (110, 50, 100) && *ObjectUpdate*.ObjectData.*Data.Position < (115, 55, 105))`

If you want to compare against an enum or a flag class defined in `templates.py`, you can just specify its name:
`ViewerEffect.Effect.Type == ViewerEffectType.EFFECT_BEAM`

### Logging

Decoded messages are displayed in the log pane; clicking one will show the request and

`@@ -289,12 +302,8 @@ If you are a viewer developer, please put them in a viewer.`

## Potential Changes

* Make package-able for PyPI
* GitHub action to build binary packages and pull together a licenses bundle
* AISv3 wrapper?
* Higher-level wrappers for common things? I don't really need these, so only if people want to write them.
* Highlight the matched portion of a message in the log view, if applicable
  * Remember deep filters and return a map of them, have the message formatter return text ranges?
* Move things out of `templates.py`; right now most binary serialization stuff lives there
  because it's more convenient for me to hot-reload.
* Ability to add menus?
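The README's range filter above reads most naturally as a per-component comparison: a position matches only if every axis lies between the two bounds. A minimal sketch of that interpretation (the helper names `vec_lt` and `in_box` are hypothetical, not Hippolyzer API, and per-component semantics is an assumption about how the filter compares vectors):

```python
def vec_lt(a, b):
    # Hypothetical helper: True only if every component of a is less
    # than the matching component of b (NOT Python's lexicographic order).
    return all(x < y for x, y in zip(a, b))


def in_box(pos, lo, hi):
    # Mirrors `Position > lo && Position < hi` from the filter example.
    return vec_lt(lo, pos) and vec_lt(pos, hi)


# A point inside the (110, 50, 100)..(115, 55, 105) box from the README
print(in_box((112, 52, 101), (110, 50, 100), (115, 55, 105)))
```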
**codecov.yml** — new file, 14 lines

`@@ -0,0 +1,14 @@`

```yaml
coverage:
  precision: 1
  round: down
  range: "50...80"
  status:
    project:
      default:
        # Do not fail commits if the code coverage drops.
        target: 0%
        threshold: 100%
        base: auto
    patch:
      default:
        only_pulls: true
```
`@@ -1,43 +1,15 @@`

```python
import collections
import codecs
import copy
import enum
import fnmatch
import io
import logging
import pickle
import queue
import re
import typing
import weakref

from defusedxml import minidom
from PySide2 import QtCore, QtGui

from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.datatypes import *
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.proxy.region import ProxiedRegion, CapType
import hippolyzer.lib.base.serialization as se
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.sessions import Session, BaseMessageLogger

from .message_filter import compile_filter, BaseFilterNode, MessageFilterNode, MetaFieldSpecifier
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.message_logger import FilteringMessageLogger

LOG = logging.getLogger(__name__)


def bytes_unescape(val: bytes) -> bytes:
    # Only in CPython. bytes -> bytes with escape decoding.
    # https://stackoverflow.com/a/23151714
    return codecs.escape_decode(val)[0]  # type: ignore


def bytes_escape(val: bytes) -> bytes:
    # Try to keep newlines as-is
    return re.sub(rb"(?<!\\)\\n", b"\n", codecs.escape_encode(val)[0])  # type: ignore
```
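The two escape helpers round-trip: `bytes_escape` backslash-escapes unprintable bytes but restores literal newlines (the negative lookbehind keeps an escaped backslash-n, i.e. a real `\` followed by `n`, intact), and `bytes_unescape` reverses it. A self-contained sketch of the same pair, using the CPython-only `codecs` internals the module relies on:

```python
import codecs
import re


def bytes_unescape(val: bytes) -> bytes:
    # CPython-only: decode backslash escapes back into raw bytes.
    return codecs.escape_decode(val)[0]  # type: ignore


def bytes_escape(val: bytes) -> bytes:
    # Escape everything, then turn lone \n escapes back into real
    # newlines so multi-line bodies stay readable.
    return re.sub(rb"(?<!\\)\\n", b"\n", codecs.escape_encode(val)[0])  # type: ignore


raw = b"\x00binary\nline"
escaped = bytes_escape(raw)        # null byte escaped, newline kept literal
assert bytes_unescape(escaped) == raw
```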
```python
class MessageLogHeader(enum.IntEnum):
    Host = 0
    Type = enum.auto()
```

`@@ -46,582 +18,23 @@ class MessageLogHeader(enum.IntEnum):`

```python
    Summary = enum.auto()


class AbstractMessageLogEntry:
    region: typing.Optional[ProxiedRegion]
    session: typing.Optional[Session]
    name: str
    type: str

    __slots__ = ["_region", "_session", "_region_name", "_agent_id", "_summary", "meta"]

    def __init__(self, region, session):
        if region and not isinstance(region, weakref.ReferenceType):
            region = weakref.ref(region)
        if session and not isinstance(session, weakref.ReferenceType):
            session = weakref.ref(session)

        self._region: typing.Optional[weakref.ReferenceType] = region
        self._session: typing.Optional[weakref.ReferenceType] = session
        self._region_name = None
        self._agent_id = None
        self._summary = None
        if self.region:
            self._region_name = self.region.name
        if self.session:
            self._agent_id = self.session.agent_id

        agent_obj = None
        if self.region is not None:
            agent_obj = self.region.objects.lookup_fullid(self.agent_id)
        self.meta = {
            "RegionName": self.region_name,
            "AgentID": self.agent_id,
            "SessionID": self.session.id if self.session else None,
            "AgentLocal": agent_obj.LocalID if agent_obj is not None else None,
            "Method": self.method,
            "Type": self.type,
            "SelectedLocal": self._current_selected_local(),
            "SelectedFull": self._current_selected_full(),
        }

    def freeze(self):
        pass

    def cache_summary(self):
        self._summary = self.summary

    def _current_selected_local(self):
        if self.session:
            return self.session.selected.object_local
        return None

    def _current_selected_full(self):
        selected_local = self._current_selected_local()
        if selected_local is None or self.region is None:
            return None
        obj = self.region.objects.lookup_localid(selected_local)
        return obj and obj.FullID

    def _get_meta(self, name: str):
        # Slight difference in semantics. Filters are meant to return the same
        # thing no matter when they're run, so SelectedLocal and friends resolve
        # to the selected items _at the time the message was logged_. To handle
        # the case where we want to match on the selected object at the time the
        # filter is evaluated, we resolve these here.
        if name == "CurrentSelectedLocal":
            return self._current_selected_local()
        elif name == "CurrentSelectedFull":
            return self._current_selected_full()
        return self.meta.get(name)

    @property
    def region(self) -> typing.Optional[ProxiedRegion]:
        if self._region:
            return self._region()
        return None

    @property
    def session(self) -> typing.Optional[Session]:
        if self._session:
            return self._session()
        return None

    @property
    def region_name(self) -> str:
        region = self.region
        if region:
            self._region_name = region.name
            return self._region_name
        # Region may die after a message is logged, need to keep this around.
        if self._region_name:
            return self._region_name

        return ""

    @property
    def agent_id(self) -> typing.Optional[UUID]:
        if self._agent_id:
            return self._agent_id

        session = self.session
        if session:
            self._agent_id = session.agent_id
            return self._agent_id
        return None

    @property
    def host(self) -> str:
        region_name = self.region_name
        if not region_name:
            return ""
        session_str = ""
        agent_id = self.agent_id
        if agent_id:
            session_str = f" ({agent_id})"
        return region_name + session_str

    def request(self, beautify=False, replacements=None):
        return None

    def response(self, beautify=False):
        return None

    def _packet_root_matches(self, pattern):
        if fnmatch.fnmatchcase(self.name, pattern):
            return True
        if fnmatch.fnmatchcase(self.type, pattern):
            return True
        return False

    def _val_matches(self, operator, val, expected):
        if isinstance(expected, MetaFieldSpecifier):
            expected = self._get_meta(str(expected))
            if not isinstance(expected, (int, float, bytes, str, type(None), tuple)):
                if callable(expected):
                    expected = expected()
                else:
                    expected = str(expected)
        elif expected is not None:
            # Unbox the expected value
            expected = expected.value
        if not isinstance(val, (int, float, bytes, str, type(None), tuple, TupleCoord)):
            val = str(val)

        if not operator:
            return bool(val)
        elif operator == "==":
            return val == expected
        elif operator == "!=":
            return val != expected
        elif operator == "^=":
            if val is None:
                return False
            return val.startswith(expected)
        elif operator == "$=":
            if val is None:
                return False
            return val.endswith(expected)
        elif operator == "~=":
            if val is None:
                return False
            return expected in val
        elif operator == "<":
            return val < expected
        elif operator == "<=":
            return val <= expected
        elif operator == ">":
            return val > expected
        elif operator == ">=":
            return val >= expected
        else:
            raise ValueError(f"Unexpected operator {operator!r}")

    def _base_matches(self, matcher: "MessageFilterNode") -> typing.Optional[bool]:
        if len(matcher.selector) == 1:
            # Comparison operators would make no sense here
            if matcher.value or matcher.operator:
                return False
            return self._packet_root_matches(matcher.selector[0])
        if len(matcher.selector) == 2 and matcher.selector[0] == "Meta":
            return self._val_matches(matcher.operator, self._get_meta(matcher.selector[1]), matcher.value)
        return None

    def matches(self, matcher: "MessageFilterNode"):
        return self._base_matches(matcher) or False

    @property
    def seq(self):
        return ""

    @property
    def method(self):
        return ""

    @property
    def summary(self):
        return ""

    @staticmethod
    def _format_llsd(parsed):
        xmlified = llsd.format_pretty_xml(parsed)
        # dedent <key> by 1 for easier visual scanning
        xmlified = re.sub(rb" <key>", b"<key>", xmlified)
        return xmlified.decode("utf8", errors="replace")
```
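`_val_matches` dispatches on a small operator vocabulary: equality, ordering, and the CSS-style string operators `^=` (prefix), `$=` (suffix), and `~=` (containment), with no operator meaning a bare truthiness test. The same table in isolation (standalone sketch; `val_matches` here is a hypothetical reimplementation, not the class method):

```python
import operator as op

# Standalone sketch of the comparison vocabulary used by _val_matches.
OPS = {
    "==": op.eq, "!=": op.ne,
    "<": op.lt, "<=": op.le, ">": op.gt, ">=": op.ge,
    # CSS attribute-selector style string operators; None never matches.
    "^=": lambda v, e: v is not None and v.startswith(e),
    "$=": lambda v, e: v is not None and v.endswith(e),
    "~=": lambda v, e: v is not None and e in v,
}


def val_matches(oper, val, expected):
    if not oper:
        # Bare selector with no operator: just test truthiness.
        return bool(val)
    return OPS[oper](val, expected)


print(val_matches("^=", "ViewerEffect", "Viewer"))
```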
```python
class LLUDPMessageLogEntry(AbstractMessageLogEntry):
    __slots__ = ["_message", "_name", "_direction", "_frozen_message", "_seq", "_deserializer"]

    def __init__(self, message: ProxiedMessage, region, session):
        self._message: ProxiedMessage = message
        self._deserializer = None
        self._name = message.name
        self._direction = message.direction
        self._frozen_message: typing.Optional[bytes] = None
        self._seq = message.packet_id
        super().__init__(region, session)

    _MESSAGE_META_ATTRS = {
        "Injected", "Dropped", "Extra", "Resent", "Zerocoded", "Acks", "Reliable",
    }

    def _get_meta(self, name: str):
        # These may change between when the message is logged and when we
        # actually filter on it, since logging happens before addons.
        msg = self.message
        if name in self._MESSAGE_META_ATTRS:
            return getattr(msg, name.lower(), None)
        msg_meta = getattr(msg, "meta", None)
        if msg_meta is not None:
            if name in msg_meta:
                return msg_meta[name]
        return super()._get_meta(name)

    @property
    def message(self):
        if self._message:
            return self._message
        elif self._frozen_message:
            message = pickle.loads(self._frozen_message)
            message.deserializer = self._deserializer
            return message
        else:
            raise ValueError("Didn't have a fresh or frozen message somehow")

    def freeze(self):
        self.message.invalidate_caches()
        # These are expensive to keep around. pickle them and un-pickle on
        # an as-needed basis.
        self._deserializer = self.message.deserializer
        self.message.deserializer = None
        self._frozen_message = pickle.dumps(self._message, protocol=pickle.HIGHEST_PROTOCOL)
        self._message = None

    @property
    def type(self):
        return "LLUDP"

    @property
    def name(self):
        if self._message:
            self._name = self._message.name
        return self._name

    @property
    def method(self):
        if self._message:
            self._direction = self._message.direction
        return self._direction.name if self._direction is not None else ""

    def request(self, beautify=False, replacements=None):
        return self.message.to_human_string(replacements, beautify)

    def matches(self, matcher):
        base_matched = self._base_matches(matcher)
        if base_matched is not None:
            return base_matched

        if not self._packet_root_matches(matcher.selector[0]):
            return False

        message = self.message

        selector_len = len(matcher.selector)
        # name, block_name, var_name(, subfield_name)?
        if selector_len not in (3, 4):
            return False
        for block_name in message.blocks:
            if not fnmatch.fnmatchcase(block_name, matcher.selector[1]):
                continue
            for block in message[block_name]:
                for var_name in block.vars.keys():
                    if not fnmatch.fnmatchcase(var_name, matcher.selector[2]):
                        continue
                    if selector_len == 3:
                        if matcher.value is None:
                            return True
                        if self._val_matches(matcher.operator, block[var_name], matcher.value):
                            return True
                    elif selector_len == 4:
                        try:
                            deserialized = block.deserialize_var(var_name)
                        except KeyError:
                            continue
                        # Discard the tag if this is a tagged union, we only want the value
                        if isinstance(deserialized, TaggedUnion):
                            deserialized = deserialized.value
                        if not isinstance(deserialized, dict):
                            return False
                        for key in deserialized.keys():
                            if fnmatch.fnmatchcase(str(key), matcher.selector[3]):
                                if matcher.value is None:
                                    return True
                                if self._val_matches(matcher.operator, deserialized[key], matcher.value):
                                    return True

        return False

    @property
    def summary(self):
        if self._summary is None:
            self._summary = self.message.to_summary()[:500]
        return self._summary

    @property
    def seq(self):
        if self._message:
            self._seq = self._message.packet_id
        return self._seq
```
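The `freeze()`/`message` pair above trades CPU for memory: once an entry has been displayed, the live message is pickled into one compact bytes blob and lazily unpickled only if a filter needs to re-examine it. The shape of that pattern in isolation (`LazyFrozen` is a hypothetical minimal class, not part of Hippolyzer):

```python
import pickle


class LazyFrozen:
    # Keep either a live object or its pickled form, never both.
    def __init__(self, obj):
        self._obj = obj
        self._frozen = None

    def freeze(self):
        # Collapse the live object into a single bytes blob.
        if self._obj is not None:
            self._frozen = pickle.dumps(self._obj, protocol=pickle.HIGHEST_PROTOCOL)
            self._obj = None

    @property
    def value(self):
        # Transparently thaw on access; the caller never sees the difference.
        if self._obj is not None:
            return self._obj
        return pickle.loads(self._frozen)


entry = LazyFrozen({"name": "ObjectUpdate", "seq": 42})
entry.freeze()
assert entry.value == {"name": "ObjectUpdate", "seq": 42}
```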
```python
class EQMessageLogEntry(AbstractMessageLogEntry):
    __slots__ = ["event"]

    def __init__(self, event, region, session):
        super().__init__(region, session)
        self.event = event

    @property
    def type(self):
        return "EQ"

    def request(self, beautify=False, replacements=None):
        return self._format_llsd(self.event["body"])

    @property
    def name(self):
        return self.event["message"]

    @property
    def summary(self):
        if self._summary is not None:
            return self._summary
        self._summary = ""
        self._summary = llsd.format_notation(self.event["body"]).decode("utf8")[:500]
        return self._summary


class HTTPMessageLogEntry(AbstractMessageLogEntry):
    __slots__ = ["flow"]

    def __init__(self, flow: HippoHTTPFlow):
        self.flow: HippoHTTPFlow = flow
        cap_data = self.flow.cap_data
        region = cap_data and cap_data.region
        session = cap_data and cap_data.session

        super().__init__(region, session)
        # This was a request the proxy made through itself
        self.meta["Injected"] = flow.request_injected

    @property
    def type(self):
        return "HTTP"

    @property
    def name(self):
        cap_data = self.flow.cap_data
        name = cap_data and cap_data.cap_name
        if name:
            return name
        return self.flow.request.url

    @property
    def method(self):
        return self.flow.request.method

    def _format_http_message(self, want_request, beautify):
        message = self.flow.request if want_request else self.flow.response
        method = self.flow.request.method
        buf = io.StringIO()
        cap_data = self.flow.cap_data
        cap_name = cap_data and cap_data.cap_name
        base_url = cap_name and cap_data.base_url
        temporary_cap = cap_data and cap_data.type == CapType.TEMPORARY
        beautify_url = (beautify and base_url and cap_name and
                        not temporary_cap and self.session and want_request)
        if want_request:
            buf.write(message.method)
            buf.write(" ")
            if beautify_url:
                buf.write(f"[[{cap_name}]]{message.url[len(base_url):]}")
            else:
                buf.write(message.url)
            buf.write(" ")
            buf.write(message.http_version)
        else:
            buf.write(message.http_version)
            buf.write(" ")
            buf.write(str(message.status_code))
            buf.write(" ")
            buf.write(message.reason)
        buf.write("\r\n")
        if beautify_url:
            buf.write("# ")
            buf.write(message.url)
            buf.write("\r\n")

        headers = copy.deepcopy(message.headers)
        for key in tuple(headers.keys()):
            if key.lower().startswith("x-hippo-"):
                LOG.warning(f"Internal header {key!r} leaked out?")
                # If this header actually came from somewhere untrusted, we can't
                # include it. It may change the meaning of the message when replayed.
                headers[f"X-Untrusted-{key}"] = headers[key]
                headers.pop(key)
        beautified = None
        if beautify and message.content:
            try:
                serializer = se.HTTP_SERIALIZERS.get(cap_name)
                if serializer:
                    if want_request:
                        beautified = serializer.deserialize_req_body(method, message.content)
                    else:
                        beautified = serializer.deserialize_resp_body(method, message.content)

                    if beautified is se.UNSERIALIZABLE:
                        beautified = None
                    else:
                        beautified = self._format_llsd(beautified)
                        headers["X-Hippo-Beautify"] = "1"

                if not beautified:
                    content_type = self._guess_content_type(message)
                    if content_type.startswith("application/llsd"):
                        beautified = self._format_llsd(llsd.parse(message.content))
                    elif any(content_type.startswith(x) for x in ("application/xml", "text/xml")):
                        beautified = minidom.parseString(message.content).toprettyxml(indent=" ")
                        # kill blank lines. will break cdata sections. meh.
                        beautified = re.sub(r'\n\s*\n', '\n', beautified, flags=re.MULTILINE)
                        beautified = re.sub(r'<([\w]+)>\s*</\1>', r'<\1></\1>',
                                            beautified, flags=re.MULTILINE)
            except:
                LOG.exception("Failed to beautify message")

        message_body = beautified or message.content
        if isinstance(message_body, bytes):
            try:
                decoded = message.text
                # Valid in many codecs, but unprintable.
                if "\x00" in decoded:
                    raise ValueError("Embedded null")
                message_body = decoded
            except (UnicodeError, ValueError):
                # non-printable characters, return the escaped version.
                headers["X-Hippo-Escaped-Body"] = "1"
                message_body = bytes_escape(message_body).decode("utf8")

        buf.write(bytes(headers).decode("utf8", errors="replace"))
        buf.write("\r\n")

        buf.write(message_body)
        return buf.getvalue()

    def request(self, beautify=False, replacements=None):
        return self._format_http_message(want_request=True, beautify=beautify)

    def response(self, beautify=False):
        return self._format_http_message(want_request=False, beautify=beautify)

    @property
    def summary(self):
        if self._summary is not None:
            return self._summary
        msg = self.flow.response
        self._summary = f"{msg.status_code}: "
        if not msg.content:
            return self._summary
        if len(msg.content) > 1000000:
            self._summary += "[too large...]"
            return self._summary
        content_type = self._guess_content_type(msg)
        if content_type.startswith("application/llsd"):
            notation = llsd.format_notation(llsd.parse(msg.content))
            self._summary += notation.decode("utf8")[:500]
        return self._summary

    def _guess_content_type(self, message):
        content_type = message.headers.get("Content-Type", "")
        if not message.content or content_type.startswith("application/llsd"):
            return content_type
        # Sometimes gets sent with `text/plain` or `text/html`. Cool.
        if message.content.startswith(rb'<?xml version="1.0" ?><llsd>'):
            return "application/llsd+xml"
        if message.content.startswith(rb'<llsd>'):
            return "application/llsd+xml"
        if message.content.startswith(rb'<?xml '):
            return "application/xml"
        return content_type
```
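`_guess_content_type` works around servers that label LLSD or XML bodies as `text/plain` or `text/html`: it trusts a declared LLSD type, otherwise sniffs well-known prefixes of the body. The same idea as a standalone function (`sniff_llsd` is a hypothetical name for this sketch):

```python
def sniff_llsd(content: bytes, declared: str = "") -> str:
    # Trust an explicit LLSD declaration, or an empty body.
    if not content or declared.startswith("application/llsd"):
        return declared
    # Correct mislabeled bodies by looking at well-known prefixes.
    if content.startswith(b'<?xml version="1.0" ?><llsd>') or content.startswith(b"<llsd>"):
        return "application/llsd+xml"
    if content.startswith(b"<?xml "):
        return "application/xml"
    return declared


print(sniff_llsd(b"<llsd><map></map></llsd>", "text/plain"))
```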
```python
class MessageLogModel(QtCore.QAbstractTableModel, BaseMessageLogger):
class MessageLogModel(QtCore.QAbstractTableModel, FilteringMessageLogger):
    def __init__(self, parent=None):
        QtCore.QAbstractTableModel.__init__(self, parent)
        BaseMessageLogger.__init__(self)
        self._raw_entries = collections.deque(maxlen=2000)
        self._queued_entries = queue.Queue()
        self._filtered_entries = []
        self._paused = False
        self.filter: typing.Optional[BaseFilterNode] = None
        FilteringMessageLogger.__init__(self)

    def setFilter(self, filter_str: str):
        self.filter = compile_filter(filter_str)
    def _begin_insert(self, insert_idx: int):
        self.beginInsertRows(QtCore.QModelIndex(), insert_idx, insert_idx)

    def _end_insert(self):
        self.endInsertRows()

    def _begin_reset(self):
        self.beginResetModel()
        # Keep any entries that've aged out of the raw entries list that
        # match the new filter
        self._filtered_entries = [
            m for m in self._filtered_entries if
            m not in self._raw_entries and self.filter.match(m)
        ]
        self._filtered_entries.extend((m for m in self._raw_entries if self.filter.match(m)))

    def _end_reset(self):
        self.endResetModel()

    def setPaused(self, paused: bool):
        self._paused = paused

    def log_lludp_message(self, session: Session, region: ProxiedRegion, message: ProxiedMessage):
        if self._paused:
            return
        self.queueLogEntry(LLUDPMessageLogEntry(message, region, session))

    def log_http_response(self, flow: HippoHTTPFlow):
        if self._paused:
            return
        # These are huge, let's not log them for now.
        if flow.cap_data and flow.cap_data.asset_server_cap:
            return
        self.queueLogEntry(HTTPMessageLogEntry(flow))

    def log_eq_event(self, session: Session, region: ProxiedRegion, event: dict):
        if self._paused:
            return
        self.queueLogEntry(EQMessageLogEntry(event, region, session))

    def appendQueuedEntries(self):
        while not self._queued_entries.empty():
            entry: AbstractMessageLogEntry = self._queued_entries.get(block=False)
            # Paused, throw it away.
            if self._paused:
                continue
            self._raw_entries.append(entry)
            try:
                if self.filter.match(entry):
                    next_idx = len(self._filtered_entries)
                    self.beginInsertRows(QtCore.QModelIndex(), next_idx, next_idx)
                    self._filtered_entries.append(entry)
                    self.endInsertRows()

                entry.cache_summary()
                # In the common case we don't need to keep around the serialization
                # caches anymore. If the filter changes, the caches will be repopulated
                # as necessary.
                entry.freeze()
            except Exception:
                LOG.exception("Failed to filter queued message")

    def queueLogEntry(self, entry: AbstractMessageLogEntry):
        self._queued_entries.put(entry, block=False)

    def rowCount(self, parent=None, *args, **kwargs):
        return len(self._filtered_entries)
```

`@@ -656,14 +69,6 @@ class MessageLogModel(QtCore.QAbstractTableModel, BaseMessageLogger):`

```python
        if orientation == QtCore.Qt.Horizontal and role == QtCore.Qt.DisplayRole:
            return MessageLogHeader(col).name

    def clear(self):
        self.beginResetModel()
        self._filtered_entries.clear()
        while not self._queued_entries.empty():
            self._queued_entries.get(block=False)
        self._raw_entries.clear()
        self.endResetModel()


class RegionListModel(QtCore.QAbstractListModel):
    def __init__(self, parent, session_manager):
```
`@@ -144,6 +144,7 @@ def start_proxy(extra_addons: Optional[list] = None, extra_addon_paths: Optional`

```python
    # Everything in memory at this point should stay
    gc.freeze()
    gc.set_threshold(5000, 50, 10)

    # Serve requests until Ctrl+C is pressed
    print(f"SOCKS and HTTP proxies running on {proxy_host}")
```

`@@ -185,3 +186,8 @@ def _windows_timeout_killer(pid: int):`

```python
def main():
    multiprocessing.set_start_method("spawn")
    start_proxy()


if __name__ == "__main__":
    multiprocessing.freeze_support()
    main()
```
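The `gc.freeze()` call added here moves every object tracked at that point into a permanent generation that CPython's collector skips (available since Python 3.7), and the raised `set_threshold` values make generation-0 collections much less frequent; together they cut GC pauses caused by scanning long-lived proxy state. A minimal illustration of both calls:

```python
import gc

# Objects allocated before freeze() become "permanent" and are
# excluded from future collections (CPython 3.7+).
baseline = [object() for _ in range(1000)]
gc.freeze()
assert gc.get_freeze_count() > 0  # everything tracked so far is now frozen

# Raise the gen-0 allocation threshold so collections run less often.
gc.set_threshold(5000, 50, 10)
assert gc.get_threshold() == (5000, 50, 10)

gc.unfreeze()  # undo, for illustration only
```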
@@ -8,7 +8,6 @@ import json
|
||||
import logging
|
||||
import pathlib
|
||||
import multiprocessing
|
||||
import os
|
||||
import re
|
||||
import signal
|
||||
import socket
|
||||
@@ -20,18 +19,11 @@ import multidict
from qasync import QEventLoop
from PySide2 import QtCore, QtWidgets, QtGui

from hippolyzer.apps.model import (
    AbstractMessageLogEntry,
    LLUDPMessageLogEntry,
    MessageLogModel,
    MessageLogHeader,
    RegionListModel,
    bytes_unescape,
    bytes_escape,
)
from hippolyzer.apps.model import MessageLogModel, MessageLogHeader, RegionListModel
from hippolyzer.apps.proxy import start_proxy
from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.helpers import bytes_unescape, bytes_escape, get_resource_filename
from hippolyzer.lib.base.message.llsd_msg_serializer import LLSDMessageSerializer
from hippolyzer.lib.base.message.message import Block
from hippolyzer.lib.base.message.msgtypes import MsgType
@@ -43,18 +35,18 @@ from hippolyzer.lib.proxy.ca_utils import setup_ca_everywhere
from hippolyzer.lib.proxy.caps_client import CapsClient
from hippolyzer.lib.proxy.http_proxy import create_proxy_master, HTTPFlowContext
from hippolyzer.lib.proxy.packets import Direction
from hippolyzer.lib.proxy.message import ProxiedMessage, VerbatimHumanVal, proxy_eval
from hippolyzer.lib.proxy.message import ProxiedMessage, VerbatimHumanVal, proxy_eval, SpannedString
from hippolyzer.lib.proxy.message_logger import LLUDPMessageLogEntry, AbstractMessageLogEntry
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session, SessionManager
from hippolyzer.lib.proxy.templates import CAP_TEMPLATES

LOG = logging.getLogger(__name__)

BASE_PATH = os.path.dirname(os.path.abspath(__file__))
MAIN_WINDOW_UI_PATH = os.path.join(BASE_PATH, "proxy_mainwindow.ui")
MESSAGE_BUILDER_UI_PATH = os.path.join(BASE_PATH, "message_builder.ui")
ADDON_DIALOG_UI_PATH = os.path.join(BASE_PATH, "addon_dialog.ui")
FILTER_DIALOG_UI_PATH = os.path.join(BASE_PATH, "filter_dialog.ui")
MAIN_WINDOW_UI_PATH = get_resource_filename("apps/proxy_mainwindow.ui")
MESSAGE_BUILDER_UI_PATH = get_resource_filename("apps/message_builder.ui")
ADDON_DIALOG_UI_PATH = get_resource_filename("apps/addon_dialog.ui")
FILTER_DIALOG_UI_PATH = get_resource_filename("apps/filter_dialog.ui")


def show_error_message(error_msg, parent=None):
@@ -169,6 +161,8 @@ class ProxyGUI(QtWidgets.QMainWindow):
        "ViewerAsset GetTexture SetAlwaysRun GetDisplayNames MapImageService MapItemReply".split(" ")
    DEFAULT_FILTER = f"!({' || '.join(ignored for ignored in DEFAULT_IGNORE)})"

    textRequest: QtWidgets.QTextEdit

    def __init__(self):
        super().__init__()
        loadUi(MAIN_WINDOW_UI_PATH, self)
@@ -242,10 +236,10 @@ class ProxyGUI(QtWidgets.QMainWindow):
            filter_str = self.lineEditFilter.text()
        else:
            self.lineEditFilter.setText(filter_str)
        self.model.setFilter(filter_str)
        self.model.set_filter(filter_str)

    def _setPaused(self, checked):
        self.model.setPaused(checked)
        self.model.set_paused(checked)

    def _messageSelected(self, selected, _deselected):
        indexes = selected.indexes()
@@ -271,8 +265,23 @@ class ProxyGUI(QtWidgets.QMainWindow):
            beautify=self.checkBeautify.isChecked(),
            replacements=self.buildReplacements(entry.session, entry.region),
        )
        resp = entry.response(beautify=self.checkBeautify.isChecked())
        highlight_range = None
        if isinstance(req, SpannedString):
            match_result = self.model.filter.match(entry)
            # Match result was a tuple indicating what matched
            if isinstance(match_result, tuple):
                highlight_range = req.spans.get(match_result)

        self.textRequest.setPlainText(req)
        if highlight_range:
            cursor = self.textRequest.textCursor()
            cursor.setPosition(highlight_range[0], QtGui.QTextCursor.MoveAnchor)
            cursor.setPosition(highlight_range[1], QtGui.QTextCursor.KeepAnchor)
            highlight_format = QtGui.QTextBlockFormat()
            highlight_format.setBackground(QtCore.Qt.yellow)
            cursor.setBlockFormat(highlight_format)

        resp = entry.response(beautify=self.checkBeautify.isChecked())
        if resp:
            self.textResponse.show()
            self.textResponse.setPlainText(resp)
@@ -796,7 +805,6 @@ def gui_main():
    window = ProxyGUI()
    timer = QtCore.QTimer(app)
    timer.timeout.connect(window.sessionManager.checkRegions)
    timer.timeout.connect(window.model.appendQueuedEntries)
    timer.start(100)
    signal.signal(signal.SIGINT, lambda *args: QtWidgets.QApplication.quit())
    window.show()
@@ -809,3 +817,8 @@ def gui_main():
        extra_addon_paths=window.getAddonList(),
        proxy_host=http_host,
    )


if __name__ == "__main__":
    multiprocessing.freeze_support()
    gui_main()
@@ -1,6 +1,8 @@
from __future__ import annotations

import functools
import pkg_resources
import re
import weakref
from pprint import PrettyPrinter
from typing import *
@@ -121,3 +123,18 @@ def proxify(obj: Union[Callable[[], _T], weakref.ReferenceType, _T]) -> _T:
    if obj is not None and not isinstance(obj, weakref.ProxyTypes):
        return weakref.proxy(obj)
    return obj


def bytes_unescape(val: bytes) -> bytes:
    # Only in CPython. bytes -> bytes with escape decoding.
    # https://stackoverflow.com/a/23151714
    return codecs.escape_decode(val)[0]  # type: ignore


def bytes_escape(val: bytes) -> bytes:
    # Try to keep newlines as-is
    return re.sub(rb"(?<!\\)\\n", b"\n", codecs.escape_encode(val)[0])  # type: ignore


def get_resource_filename(resource_filename: str):
    return pkg_resources.resource_filename("hippolyzer", resource_filename)
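The `codecs.escape_decode` / `codecs.escape_encode` pair these helpers rely on is an undocumented CPython-only API. A self-contained sketch of the round trip, mirroring the helpers above:

```python
import codecs
import re


def bytes_unescape(val: bytes) -> bytes:
    # CPython-only: decode backslash escapes in a bytes object.
    return codecs.escape_decode(val)[0]  # type: ignore[attr-defined]


def bytes_escape(val: bytes) -> bytes:
    # escape_encode turns b"\n" into b"\\n"; put real newlines back so
    # multi-line payloads stay readable when displayed.
    return re.sub(rb"(?<!\\)\\n", b"\n", codecs.escape_encode(val)[0])  # type: ignore[attr-defined]


raw = b"\x00binary\nline"
escaped = bytes_escape(raw)      # non-printables escaped, newline kept literal
restored = bytes_unescape(escaped)
```

The negative lookbehind in the regex keeps an escaped backslash-n sequence (`\\n`) from being mistaken for a newline escape.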


@@ -39,6 +39,7 @@ class MeshAsset:
# These TypedDicts describe the expected shape of the LLSD in the mesh
# header and various segments. They're mainly for type hinting.
class MeshHeaderDict(TypedDict, total=False):
    """Header of the mesh file, includes offsets & sizes for segments' LLSD"""
    version: int
    creator: UUID
    date: dt.datetime
@@ -54,6 +55,7 @@ class MeshHeaderDict(TypedDict, total=False):


class SegmentHeaderDict(TypedDict):
    """Standard shape for segment references within the header"""
    offset: int
    size: int

@@ -73,6 +75,7 @@ class PhysicsHavokSegmentHeaderDict(PhysicsSegmentHeaderDict, total=False):


class PhysicsCostDataHeaderDict(TypedDict, total=False):
    """Cost of physical representation, populated by server"""
    decomposition: float
    decomposition_discounted_vertices: int
    decomposition_hulls: int
@@ -85,6 +88,7 @@ class PhysicsCostDataHeaderDict(TypedDict, total=False):


class MeshSegmentDict(TypedDict, total=False):
    """Dict of segments unpacked using the MeshHeaderDict"""
    high_lod: List[LODSegmentDict]
    medium_lod: List[LODSegmentDict]
    low_lod: List[LODSegmentDict]
@@ -96,6 +100,7 @@ class MeshSegmentDict(TypedDict, total=False):


class LODSegmentDict(TypedDict, total=False):
    """Represents a single entry within the material list of a LOD segment"""
    # Only present if True and no geometry
    NoGeometry: bool
    # -1.0 - 1.0
@@ -113,17 +118,22 @@ class LODSegmentDict(TypedDict, total=False):


class DomainDict(TypedDict):
    """Description of the real range for quantized coordinates"""
    # number of elems depends on what the domain is for, Vec2 or Vec3
    Max: List[float]
    Min: List[float]


class VertexWeight(recordclass.datatuple):  # type: ignore
    """Vertex weight for a specific joint on a specific vertex"""
    # index of the joint within the joint_names list in the skin segment
    joint_idx: int
    # 0.0 - 1.0
    weight: float


class SkinSegmentDict(TypedDict, total=False):
    """Rigging information"""
    joint_names: List[str]
    # model -> world transform matrix for model
    bind_shape_matrix: List[float]
@@ -137,14 +147,17 @@ class SkinSegmentDict(TypedDict, total=False):


class PhysicsConvexSegmentDict(DomainDict, total=False):
    """Data for convex hull collisions, populated by the client"""
    # Min / Max domain vals are inline, unlike for LODs
    HullList: List[int]
    # -1.0 - 1.0
    # -1.0 - 1.0, dequantized from binary field of U16s
    Positions: List[Vector3]
    # -1.0 - 1.0
    # -1.0 - 1.0, dequantized from binary field of U16s
    BoundingVerts: List[Vector3]


class PhysicsHavokSegmentDict(TypedDict, total=False):
    """Cached data for Havok collisions, populated by sim and not used by client."""
    HullMassProps: MassPropsDict
    MOPP: MOPPDict
    MeshDecompMassProps: MassPropsDict
@@ -169,8 +182,11 @@ class MOPPDict(TypedDict, total=False):


def positions_from_domain(positions: Iterable[TupleCoord], domain: DomainDict):
    # Used for turning positions into their actual positions within the mesh / domain
    # for ex: positions_from_domain(lod["Position"], lod["PositionDomain])
    """
    Used for turning positions into their actual positions within the mesh / domain

    for ex: positions_from_domain(lod["Position"], lod["PositionDomain])
    """
    lower = domain['Min']
    upper = domain['Max']
    return [
@@ -179,7 +195,7 @@ def positions_from_domain(positions: Iterable[TupleCoord], domain: DomainDict):


def positions_to_domain(positions: Iterable[TupleCoord], domain: DomainDict):
    # Used for turning positions into their actual positions within the mesh / domain
    """Used for turning positions into their actual positions within the mesh / domain"""
    lower = domain['Min']
    upper = domain['Max']
    return [
@@ -187,7 +203,36 @@ def positions_to_domain(positions: Iterable[TupleCoord], domain: DomainDict):
    ]
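The bodies of `positions_from_domain` / `positions_to_domain` are truncated in the hunks above. A hedged standalone sketch of what that per-component domain mapping typically looks like (the exact return expressions are an assumption, since the diff cuts them off):

```python
from typing import Iterable, List, Tuple

Coord = Tuple[float, ...]


def positions_from_domain(positions: Iterable[Coord], domain: dict) -> List[Coord]:
    # Rescale normalized 0..1 components into the real [Min, Max] range.
    lower, upper = domain["Min"], domain["Max"]
    return [
        tuple(lo + comp * (hi - lo) for comp, lo, hi in zip(pos, lower, upper))
        for pos in positions
    ]


def positions_to_domain(positions: Iterable[Coord], domain: dict) -> List[Coord]:
    # Inverse mapping: real coordinates back to normalized 0..1 components.
    lower, upper = domain["Min"], domain["Max"]
    return [
        tuple((comp - lo) / (hi - lo) for comp, lo, hi in zip(pos, lower, upper))
        for pos in positions
    ]


domain = {"Min": [-1.0, -1.0, -1.0], "Max": [1.0, 1.0, 1.0]}
real = positions_from_domain([(0.5, 0.0, 1.0)], domain)
```

With the symmetric domain above, `positions_to_domain` undoes `positions_from_domain` exactly.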


class VertexWeights(se.SerializableBase):
    """Serializer for a list of joint weights on a single vertex"""
    INFLUENCE_SER = se.QuantizedFloat(se.U16, 0.0, 1.0)
    INFLUENCE_LIMIT = 4
    INFLUENCE_TERM = 0xFF

    @classmethod
    def serialize(cls, vals, writer: se.BufferWriter, ctx=None):
        if len(vals) > cls.INFLUENCE_LIMIT:
            raise ValueError(f"{vals!r} is too long, can only have {cls.INFLUENCE_LIMIT} influences!")
        for val in vals:
            joint_idx, influence = val
            writer.write(se.U8, joint_idx)
            writer.write(cls.INFLUENCE_SER, influence, ctx=ctx)
        if len(vals) != cls.INFLUENCE_LIMIT:
            writer.write(se.U8, cls.INFLUENCE_TERM)

    @classmethod
    def deserialize(cls, reader: se.Reader, ctx=None):
        influence_list = []
        for _ in range(cls.INFLUENCE_LIMIT):
            joint_idx = reader.read(se.U8)
            if joint_idx == cls.INFLUENCE_TERM:
                break
            influence_list.append(VertexWeight(joint_idx, reader.read(cls.INFLUENCE_SER, ctx=ctx)))
        return influence_list
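The wire format here is a U8 joint index followed by a U16-quantized weight, with a `0xFF` terminator when fewer than four influences are present. A standalone sketch of the same encoding using `struct` (byte order and rounding are assumptions; `se.QuantizedFloat` internals are not reproduced here):

```python
import struct

INFLUENCE_LIMIT = 4
INFLUENCE_TERM = 0xFF


def pack_weights(weights):
    # Each influence: U8 joint index + U16 weight quantized over [0.0, 1.0].
    out = b""
    for joint_idx, weight in weights:
        out += struct.pack("<BH", joint_idx, round(weight * 0xFFFF))
    if len(weights) != INFLUENCE_LIMIT:
        out += struct.pack("<B", INFLUENCE_TERM)  # terminator when under the limit
    return out


def unpack_weights(buf):
    weights, off = [], 0
    for _ in range(INFLUENCE_LIMIT):
        joint_idx = buf[off]
        off += 1
        if joint_idx == INFLUENCE_TERM:
            break
        (quant,) = struct.unpack_from("<H", buf, off)
        off += 2
        weights.append((joint_idx, quant / 0xFFFF))
    return weights


packed = pack_weights([(0, 1.0), (3, 0.5)])
```

Quantizing to U16 loses a little precision, so round-tripped weights are only equal to within about `1/0xFFFF`.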


class SegmentSerializer:
    """Serializer for binary fields within an LLSD object"""
    def __init__(self, templates):
        self._templates: Dict[str, se.SerializableBase] = templates

@@ -217,33 +262,6 @@ class SegmentSerializer:
        return new_segment


class VertexWeights(se.SerializableBase):
    INFLUENCE_SER = se.QuantizedFloat(se.U16, 0.0, 1.0)
    INFLUENCE_LIMIT = 4
    INFLUENCE_TERM = 0xFF

    @classmethod
    def serialize(cls, vals, writer: se.BufferWriter, ctx=None):
        if len(vals) > cls.INFLUENCE_LIMIT:
            raise ValueError(f"{vals!r} is too long, can only have {cls.INFLUENCE_LIMIT} influences!")
        for val in vals:
            joint_idx, influence = val
            writer.write(se.U8, joint_idx)
            writer.write(cls.INFLUENCE_SER, influence, ctx=ctx)
        if len(vals) != cls.INFLUENCE_LIMIT:
            writer.write(se.U8, cls.INFLUENCE_TERM)

    @classmethod
    def deserialize(cls, reader: se.Reader, ctx=None):
        influence_list = []
        for _ in range(cls.INFLUENCE_LIMIT):
            joint_idx = reader.read(se.U8)
            if joint_idx == cls.INFLUENCE_TERM:
                break
            influence_list.append(VertexWeight(joint_idx, reader.read(cls.INFLUENCE_SER, ctx=ctx)))
        return influence_list


LOD_SEGMENT_SERIALIZER = SegmentSerializer({
    # 16-bit indices to the verts making up the tri. Imposes a 16-bit
    # upper limit on verts in any given material in the mesh.
@@ -265,6 +283,7 @@ class LLMeshSerializer(se.SerializableBase):
    KNOWN_SEGMENTS = ("lowest_lod", "low_lod", "medium_lod", "high_lod",
                      "physics_mesh", "physics_convex", "skin", "physics_havok")

    # Define unpackers for specific binary fields within the parsed LLSD segments
    SEGMENT_TEMPLATES: Dict[str, SegmentSerializer] = {
        "lowest_lod": LOD_SEGMENT_SERIALIZER,
        "low_lod": LOD_SEGMENT_SERIALIZER,
@@ -22,6 +22,8 @@ Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
|
||||
|
||||
import os
|
||||
|
||||
msg_tmpl = open(os.path.join(os.path.dirname(__file__), 'message_template.msg'))
|
||||
with open(os.path.join(os.path.dirname(__file__), 'message.xml'), "rb") as _f:
|
||||
from hippolyzer.lib.base.helpers import get_resource_filename
|
||||
|
||||
msg_tmpl = open(get_resource_filename("lib/base/message/data/message_template.msg"))
|
||||
with open(get_resource_filename("lib/base/message/data/message.xml"), "rb") as _f:
|
||||
msg_details = _f.read()
|
||||
|
||||
@@ -193,12 +193,21 @@ class Message:
        # should be set once a packet is sent / dropped to prevent accidental
        # re-sending or re-dropping
        self.finalized = False
        # Whether message is owned by the queue or should be sent immediately
        # Whether message is owned by a queue or should be sent immediately
        self.queued: bool = False
        self._blocks: BLOCK_DICT = {}

        self.add_blocks(args)

    def __reduce_ex__(self, protocol):
        reduced: Tuple[Any] = super().__reduce_ex__(protocol)
        # https://docs.python.org/3/library/pickle.html#object.__reduce__
        # We need to make some changes to the object state to make it serializable
        state_dict: Dict = reduced[2][1]
        # Have to remove the deserializer weak ref so we can pickle
        state_dict['deserializer'] = None
        return reduced
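The `__reduce_ex__` override above nulls out an unpicklable weakref before pickling. A minimal standalone sketch of the same trick (the `deserializer` attribute and class names here are illustrative; the real `Message` uses `__slots__`, which is why it indexes into `reduced[2][1]`):

```python
import pickle
import weakref


class Template:
    pass


TEMPLATE = Template()


class Message:
    def __init__(self, name):
        self.name = name
        # weakref proxies can't be pickled
        self.deserializer = weakref.proxy(TEMPLATE)

    def __reduce_ex__(self, protocol):
        reduced = list(super().__reduce_ex__(protocol))
        # reduced[2] is the instance state; copy it and drop the unpicklable
        # member so pickling succeeds without mutating the live object.
        reduced[2] = {**reduced[2], "deserializer": None}
        return tuple(reduced)


msg = pickle.loads(pickle.dumps(Message("ChatFromViewer")))
```

After the round trip the unpickled object keeps its regular state but comes back with `deserializer` set to `None`, to be re-attached lazily by the consumer.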

    @property
    def packet_id(self) -> Optional[int]:
        return self._packet_id

@@ -79,8 +79,14 @@ class MessageHandler(Generic[_T]):

        notifiers = self._subscribe_all(message_names, _handler_wrapper, predicate=predicate)

        async def _get_wrapper():
            msg = await msg_queue.get()
            # Consumption is completion
            msg_queue.task_done()
            return msg

        try:
            yield msg_queue.get
            yield _get_wrapper
        finally:
            for n in notifiers:
                n.unsubscribe(_handler_wrapper)
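This hunk swaps the raw `msg_queue.get` for a wrapper that marks each message done as it is consumed, so `Queue.join()` can't hang on already-delivered messages. A runnable sketch of that pattern in isolation:

```python
import asyncio


async def main():
    msg_queue: asyncio.Queue = asyncio.Queue()

    async def get_wrapper():
        msg = await msg_queue.get()
        # Consuming the message counts as completing it, so join() won't hang.
        msg_queue.task_done()
        return msg

    await msg_queue.put("ObjectUpdate")
    msg = await get_wrapper()
    # All work is accounted for; join() returns immediately.
    await msg_queue.join()
    return msg


result = asyncio.run(main())
```

Without the `task_done()` call, `join()` would block forever because the queue still counts one unfinished task.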

@@ -18,108 +18,113 @@ You should have received a copy of the GNU Lesser General Public License
along with this program; if not, write to the Free Software Foundation,
Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""
from __future__ import annotations

from typing import *

import lazy_object_proxy
import recordclass

from hippolyzer.lib.base.datatypes import Vector3, Quaternion, Vector4
from hippolyzer.lib.base.datatypes import Vector3, Quaternion, Vector4, UUID


class Object:
    """ represents an Object
class Object(recordclass.datatuple):  # type: ignore
    __options__ = {
        "fast_new": False,
        "use_weakref": True,
    }
    __weakref__: Any

    Initialize the Object class instance
    >>> obj = Object()
    """
    LocalID: Optional[int] = None
    State: Optional[int] = None
    FullID: Optional[UUID] = None
    CRC: Optional[int] = None
    PCode: Optional[int] = None
    Material: Optional[int] = None
    ClickAction: Optional[int] = None
    Scale: Optional[Vector3] = None
    ParentID: Optional[int] = None
    # Actually contains a weakref proxy
    Parent: Optional[Object] = None
    UpdateFlags: Optional[int] = None
    PathCurve: Optional[int] = None
    ProfileCurve: Optional[int] = None
    PathBegin: Optional[int] = None
    PathEnd: Optional[int] = None
    PathScaleX: Optional[int] = None
    PathScaleY: Optional[int] = None
    PathShearX: Optional[int] = None
    PathShearY: Optional[int] = None
    PathTwist: Optional[int] = None
    PathTwistBegin: Optional[int] = None
    PathRadiusOffset: Optional[int] = None
    PathTaperX: Optional[int] = None
    PathTaperY: Optional[int] = None
    PathRevolutions: Optional[int] = None
    PathSkew: Optional[int] = None
    ProfileBegin: Optional[int] = None
    ProfileEnd: Optional[int] = None
    ProfileHollow: Optional[int] = None
    TextureEntry: Optional[Any] = None
    TextureAnim: Optional[Any] = None
    NameValue: Optional[Any] = None
    Data: Optional[Any] = None
    Text: Optional[str] = None
    TextColor: Optional[bytes] = None
    MediaURL: Optional[Any] = None
    PSBlock: Optional[Any] = None
    ExtraParams: Optional[Any] = None
    Sound: Optional[UUID] = None
    OwnerID: Optional[UUID] = None
    SoundGain: Optional[float] = None
    SoundFlags: Optional[int] = None
    SoundRadius: Optional[float] = None
    JointType: Optional[int] = None
    JointPivot: Optional[int] = None
    JointAxisOrAnchor: Optional[int] = None
    TreeSpecies: Optional[int] = None
    ScratchPad: Optional[bytes] = None
    ObjectCosts: Optional[Dict] = None
    ChildIDs: Optional[List[int]] = None
    # Same as parent, contains weakref proxies.
    Children: Optional[List[Object]] = None

    __slots__ = (
        "LocalID",
        "State",
        "FullID",
        "CRC",
        "PCode",
        "Material",
        "ClickAction",
        "Scale",
        "ParentID",
        "UpdateFlags",
        "PathCurve",
        "ProfileCurve",
        "PathBegin",
        "PathEnd",
        "PathScaleX",
        "PathScaleY",
        "PathShearX",
        "PathShearY",
        "PathTwist",
        "PathTwistBegin",
        "PathRadiusOffset",
        "PathTaperX",
        "PathTaperY",
        "PathRevolutions",
        "PathSkew",
        "ProfileBegin",
        "ProfileEnd",
        "ProfileHollow",
        "TextureEntry",
        "TextureAnim",
        "NameValue",
        "Data",
        "Text",
        "TextColor",
        "MediaURL",
        "PSBlock",
        "ExtraParams",
        "Sound",
        "OwnerID",
        "SoundGain",
        "SoundFlags",
        "SoundRadius",
        "JointType",
        "JointPivot",
        "JointAxisOrAnchor",
        "TreeSpecies",
        "ObjectCosts",
        "FootCollisionPlane",
        "Position",
        "Velocity",
        "Acceleration",
        "Rotation",
        "AngularVelocity",
        "CreatorID",
        "GroupID",
        "CreationDate",
        "BaseMask",
        "OwnerMask",
        "GroupMask",
        "EveryoneMask",
        "NextOwnerMask",
        "OwnershipCost",
        "SaleType",
        "SalePrice",
        "AggregatePerms",
        "AggregatePermTextures",
        "AggregatePermTexturesOwner",
        "Category",
        "InventorySerial",
        "ItemID",
        "FolderID",
        "FromTaskID",
        "LastOwnerID",
        "Name",
        "Description",
        "TouchName",
        "SitName",
        "TextureID",
        "ChildIDs",
        "Children",
        "Parent",
        "ScratchPad",
        "__weakref__",
    )
    FootCollisionPlane: Optional[Vector4] = None
    Position: Optional[Vector3] = None
    Velocity: Optional[Vector3] = None
    Acceleration: Optional[Vector3] = None
    Rotation: Optional[Quaternion] = None
    AngularVelocity: Optional[Vector3] = None

    def __init__(self, *, ID=None, LocalID=None, State=None, FullID=None, CRC=None, PCode=None, Material=None,
    # from ObjectProperties
    CreatorID: Optional[UUID] = None
    GroupID: Optional[UUID] = None
    CreationDate: Optional[int] = None
    BaseMask: Optional[int] = None
    OwnerMask: Optional[int] = None
    GroupMask: Optional[int] = None
    EveryoneMask: Optional[int] = None
    NextOwnerMask: Optional[int] = None
    OwnershipCost: Optional[int] = None
    # TaxRate
    SaleType: Optional[int] = None
    SalePrice: Optional[int] = None
    AggregatePerms: Optional[int] = None
    AggregatePermTextures: Optional[int] = None
    AggregatePermTexturesOwner: Optional[int] = None
    Category: Optional[int] = None
    InventorySerial: Optional[int] = None
    ItemID: Optional[UUID] = None
    FolderID: Optional[UUID] = None
    FromTaskID: Optional[UUID] = None
    LastOwnerID: Optional[UUID] = None
    Name: Optional[str] = None
    Description: Optional[str] = None
    TouchName: Optional[str] = None
    SitName: Optional[str] = None
    TextureID: Optional[Any] = None

    def __init__(self, *, LocalID=None, State=None, FullID=None, CRC=None, PCode=None, Material=None,
                 ClickAction=None, Scale=None, ParentID=None, UpdateFlags=None, PathCurve=None, ProfileCurve=None,
                 PathBegin=None, PathEnd=None, PathScaleX=None, PathScaleY=None, PathShearX=None, PathShearY=None,
                 PathTwist=None, PathTwistBegin=None, PathRadiusOffset=None, PathTaperX=None, PathTaperY=None,
@@ -131,7 +136,7 @@ class Object:
                 AngularVelocity=None, TreeSpecies=None, ObjectCosts=None, ScratchPad=None):
        """ set up the object attributes """

        self.LocalID = LocalID or ID  # U32
        self.LocalID = LocalID  # U32
        self.State = State  # U8
        self.FullID = FullID  # LLUUID
        self.CRC = CRC  # U32 // TEMPORARY HACK FOR JAMES
@@ -258,8 +263,4 @@ class Object:
        return updated_properties

    def to_dict(self):
        return {
            x: getattr(self, x) for x in dir(self)
            if not isinstance(getattr(self.__class__, x, None), property) and
            not callable(getattr(self, x)) and not x.startswith("_")
        }
        return recordclass.asdict(self)
@@ -1703,7 +1703,7 @@ class BaseSubfieldSerializer(abc.ABC):
        """Guess at which template a val might correspond to"""
        if dataclasses.is_dataclass(val):
            val = dataclasses.asdict(val)  # noqa
        if isinstance(val, bytes):
        if isinstance(val, (bytes, bytearray)):
            template_checker = cls._template_sizes_match
        elif isinstance(val, dict):
            template_checker = cls._template_keys_match
@@ -277,13 +277,8 @@ class AddonManager:

            # Make sure module initialization happens after any pending task cancellations
            # due to module unloading.
            def _init_soon():
                cls._call_module_hooks(mod, "handle_init", cls.SESSION_MANAGER)
                if not cls._SUBPROCESS:
                    for session in cls.SESSION_MANAGER.sessions:
                        with addon_ctx.push(new_session=session):
                            cls._call_module_hooks(mod, "handle_session_init", session)
            asyncio.get_event_loop().call_soon(_init_soon)

            asyncio.get_event_loop().call_soon(cls._init_module, mod)
        except Exception as e:
            if had_mod:
                logging.exception("Exploded trying to reload addon %s" % spec.name)
@@ -299,6 +294,14 @@ class AddonManager:
        if raise_exceptions and load_exception is not None:
            raise load_exception

    @classmethod
    def _init_module(cls, mod: ModuleType):
        cls._call_module_hooks(mod, "handle_init", cls.SESSION_MANAGER)
        if not cls._SUBPROCESS:
            for session in cls.SESSION_MANAGER.sessions:
                with addon_ctx.push(new_session=session):
                    cls._call_module_hooks(mod, "handle_session_init", session)

    @classmethod
    def _unload_module(cls, old_mod: ModuleType):
        cls._call_module_hooks(old_mod, "handle_unload", cls.SESSION_MANAGER)
@@ -14,7 +14,7 @@ from hippolyzer.lib.proxy.message import ProxiedMessage

if TYPE_CHECKING:
    from hippolyzer.lib.proxy.region import ProxiedRegion
    from hippolyzer.lib.proxy.sessions import BaseMessageLogger
    from hippolyzer.lib.proxy.message_logger import BaseMessageLogger


class ProxiedCircuit:
@@ -5,7 +5,6 @@ import multiprocessing
import os
import re
import sys
import pkg_resources
import queue
import typing
import uuid
@@ -20,6 +19,7 @@ from mitmproxy.addons import core, clientplayback
from mitmproxy.http import HTTPFlow
import OpenSSL

from hippolyzer.lib.base.helpers import get_resource_filename
from hippolyzer.lib.base.multiprocessing_utils import ParentProcessWatcher

orig_sethostflags = OpenSSL.SSL._lib.X509_VERIFY_PARAM_set_hostflags  # noqa
@@ -230,7 +230,7 @@ def create_proxy_master(host, port, flow_context: HTTPFlowContext):  # pragma: no cover
        os.path.join(opts.confdir, "config.yml"),
    )
    # Use SL's CA bundle so LL's CA certs won't cause verification errors
    ca_bundle = pkg_resources.resource_filename("hippolyzer.lib.base", "network/data/ca-bundle.crt")
    ca_bundle = get_resource_filename("lib/base/network/data/ca-bundle.crt")
    opts.update(
        ssl_verify_upstream_trusted_ca=ca_bundle,
        listen_host=host,
@@ -129,13 +129,14 @@ class InterceptingLLUDPProxyProtocol(BaseLLUDPProxyProtocol):
                LOG.exception("Failed in region message handler")

        message_logger = self.session_manager.message_logger
        if message_logger:
            message_logger.log_lludp_message(self.session, region, message)

        handled = AddonManager.handle_lludp_message(
            self.session, region, message
        )

        if message_logger:
            message_logger.log_lludp_message(self.session, region, message)

        if handled:
            return
@@ -5,6 +5,7 @@ import logging
import math
import os
import re
import typing
import uuid
from typing import *

@@ -71,6 +72,14 @@ def proxy_eval(eval_str: str, globals_=None, locals_=None):
    )


TextSpan = Tuple[int, int]
SpanDict = Dict[Tuple[Union[str, int], ...], TextSpan]


class SpannedString(str):
    spans: SpanDict = {}
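Since `SpannedString` is just a `str` subclass carrying a `spans` mapping, consumers can index straight back into the rendered text to find the characters behind a given (message, block, block index, var) key. A small sketch with hand-built spans (the message content here is illustrative):

```python
from typing import Dict, Tuple, Union

TextSpan = Tuple[int, int]
SpanDict = Dict[Tuple[Union[str, int], ...], TextSpan]


class SpannedString(str):
    # Class-level default; instances get their own dict assigned after creation.
    spans: SpanDict = {}


rendered = SpannedString("OUT ChatFromViewer\n[ChatData]\n  Message = 'hi'\n")
start = rendered.index("Message")
rendered.spans = {
    ("ChatFromViewer", "ChatData", 0, "Message"): (start, len(rendered) - 1),
}

span = rendered.spans[("ChatFromViewer", "ChatData", 0, "Message")]
snippet = rendered[span[0]:span[1]]
```

This is the mechanism the GUI uses above: a filter match returns a span key tuple, and the stored character range drives text highlighting.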


class ProxiedMessage(Message):
    __slots__ = ("meta", "injected", "dropped", "direction")

@@ -83,9 +92,10 @@ class ProxiedMessage(Message):
        _maybe_reload_templates()

    def to_human_string(self, replacements=None, beautify=False,
                        template: Optional[MessageTemplate] = None):
                        template: Optional[MessageTemplate] = None) -> SpannedString:
        replacements = replacements or {}
        _maybe_reload_templates()
        spans: SpanDict = {}
        string = ""
        if self.direction is not None:
            string += f'{self.direction.name} '
@@ -101,11 +111,18 @@ class ProxiedMessage(Message):
            block_suffix = ""
            if template and template.get_block(block_name).block_type == MsgBlockType.MBT_VARIABLE:
                block_suffix = ' # Variable'
            for block in block_list:
            for block_num, block in enumerate(block_list):
                string += f"[{block_name}]{block_suffix}\n"
                for var_name, val in block.items():
                    start_len = len(string)
                    string += self._format_var(block, var_name, val, replacements, beautify)
        return string
                    end_len = len(string)
                    # Store the spans for each var so we can highlight specific matches
                    spans[(self.name, block_name, block_num, var_name)] = (start_len, end_len)
                    string += "\n"
        spanned = SpannedString(string)
        spanned.spans = spans
        return spanned

    def _format_var(self, block, var_name, var_val, replacements, beautify=False):
        string = ""
@@ -129,7 +146,7 @@ class ProxiedMessage(Message):
        if serializer.AS_HEX and isinstance(var_val, int):
            var_data = hex(var_val)
        if serializer.ORIG_INLINE:
            string += f" #{var_data}\n"
            string += f" #{var_data}"
            return string
        else:
            string += "\n"
@@ -146,7 +163,7 @@ class ProxiedMessage(Message):
        if "CircuitCode" in var_name or ("Code" in var_name and "Circuit" in block.name):
            if var_val == replacements.get("CIRCUIT_CODE"):
                var_data = "[[CIRCUIT_CODE]]"
        string += f" {field_prefix}{var_name} = {var_data}\n"
        string += f" {field_prefix}{var_name} = {var_data}"
        return string

    @staticmethod
@@ -3,28 +3,30 @@ import ast
import typing

from arpeggio import Optional, ZeroOrMore, EOF, \
    ParserPython, PTNodeVisitor, visit_parse_tree
from arpeggio import RegExMatch as _
    ParserPython, PTNodeVisitor, visit_parse_tree, RegExMatch


def literal():
    return [
        # Nightmare. str or bytes literal.
        # https://stackoverflow.com/questions/14366401/#comment79795017_14366904
        _(r'''b?(\"\"\"|\'\'\'|\"|\')((?<!\\)(\\\\)*\\\1|.)*?\1'''),
        _(r'\d+(\.\d+)?'),
        RegExMatch(r'''b?(\"\"\"|\'\'\'|\"|\')((?<!\\)(\\\\)*\\\1|.)*?\1'''),
        # base16
        RegExMatch(r'0x\d+'),
        # base10 int or float.
        RegExMatch(r'\d+(\.\d+)?'),
        "None",
        "True",
        "False",
        # vector3 (tuple)
        _(r'\(\s*\d+(\.\d+)?\s*,\s*\d+(\.\d+)?\s*,\s*\d+(\.\d+)?\s*\)'),
        RegExMatch(r'\(\s*\d+(\.\d+)?\s*,\s*\d+(\.\d+)?\s*,\s*\d+(\.\d+)?\s*\)'),
        # vector4 (tuple)
        _(r'\(\s*\d+(\.\d+)?\s*,\s*\d+(\.\d+)?\s*,\s*\d+(\.\d+)?\s*,\s*\d+(\.\d+)?\s*\)'),
        RegExMatch(r'\(\s*\d+(\.\d+)?\s*,\s*\d+(\.\d+)?\s*,\s*\d+(\.\d+)?\s*,\s*\d+(\.\d+)?\s*\)'),
    ]


def identifier():
    return _(r'[a-zA-Z*]([a-zA-Z0-9*]+)?')
    return RegExMatch(r'[a-zA-Z*]([a-zA-Z0-9_*]+)?')


def field_specifier():
@@ -43,12 +45,16 @@ def meta_field_specifier():
    return "Meta", ".", identifier


def enum_field_specifier():
    return identifier, ".", identifier


def compare_val():
    return [literal, meta_field_specifier]
    return [literal, meta_field_specifier, enum_field_specifier]


def binary_expression():
    return field_specifier, ["==", "!=", "^=", "$=", "~=", ">", ">=", "<", "<="], compare_val
    return field_specifier, ["==", "!=", "^=", "$=", "~=", ">", ">=", "<", "<=", "&"], compare_val


def term():
@@ -63,9 +69,12 @@ def message_filter():
    return expression, EOF


MATCH_RESULT = typing.Union[bool, typing.Tuple]


class BaseFilterNode(abc.ABC):
    @abc.abstractmethod
    def match(self, msg) -> bool:
    def match(self, msg) -> MATCH_RESULT:
        raise NotImplementedError()

    @property
@@ -95,17 +104,17 @@ class BinaryFilterNode(BaseFilterNode, abc.ABC):


class UnaryNotFilterNode(UnaryFilterNode):
    def match(self, msg) -> bool:
    def match(self, msg) -> MATCH_RESULT:
        return not self.node.match(msg)


class OrFilterNode(BinaryFilterNode):
    def match(self, msg) -> bool:
    def match(self, msg) -> MATCH_RESULT:
        return self.left_node.match(msg) or self.right_node.match(msg)


class AndFilterNode(BinaryFilterNode):
    def match(self, msg) -> bool:
    def match(self, msg) -> MATCH_RESULT:
        return self.left_node.match(msg) and self.right_node.match(msg)
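Because `match` can now return a tuple (truthy) rather than a bare bool, Python's short-circuiting `or` / `and` naturally propagate whichever operand matched, so the span key survives all the way up to the GUI highlighter. A minimal sketch with stub nodes (the node classes here are simplified stand-ins, not the library's real ones):

```python
class StubNode:
    def __init__(self, result):
        self.result = result

    def match(self, msg):
        return self.result


class OrNode:
    def __init__(self, left, right):
        self.left_node, self.right_node = left, right

    def match(self, msg):
        # `or` returns the first truthy operand, so a tuple naming the
        # matched field survives and can later drive highlighting.
        return self.left_node.match(msg) or self.right_node.match(msg)


hit = OrNode(StubNode(False), StubNode(("ChatFromViewer", "ChatData", 0, "Message")))
result = hit.match(None)
```

`isinstance(match_result, tuple)` checks downstream (as in the GUI hunk earlier) then distinguish "matched with a span" from a plain boolean match.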
@@ -115,7 +124,7 @@ class MessageFilterNode(BaseFilterNode):
        self.operator = operator
        self.value = value

    def match(self, msg) -> bool:
    def match(self, msg) -> MATCH_RESULT:
        return msg.matches(self)

    @property
@@ -127,6 +136,11 @@ class MetaFieldSpecifier(str):
    pass


class EnumFieldSpecifier(typing.NamedTuple):
    enum_name: str
    field_name: str


class LiteralValue:
    """Only exists because we can't return `None` in a visitor, need to box it"""
    def __init__(self, value):
@@ -134,23 +148,26 @@ class LiteralValue:


class MessageFilterVisitor(PTNodeVisitor):
    def visit_identifier(self, node, children):
    def visit_identifier(self, node, _children):
        return str(node.value)

    def visit_field_specifier(self, node, children):
    def visit_field_specifier(self, _node, children):
        return children

    def visit_literal(self, node, children):
    def visit_literal(self, node, _children):
        return LiteralValue(ast.literal_eval(node.value))

    def visit_meta_field_specifier(self, node, children):
    def visit_meta_field_specifier(self, _node, children):
        return MetaFieldSpecifier(children[0])

    def visit_unary_field_specifier(self, node, children):
    def visit_enum_field_specifier(self, _node, children):
        return EnumFieldSpecifier(*children)

    def visit_unary_field_specifier(self, _node, children):
        # Looks like a bare field specifier with no operator
        return MessageFilterNode(tuple(children), None, None)

    def visit_unary_expression(self, node, children):
    def visit_unary_expression(self, _node, children):
        if len(children) == 1:
            if isinstance(children[0], BaseFilterNode):
                return children[0]
@@ -162,10 +179,10 @@ class MessageFilterVisitor(PTNodeVisitor):
        else:
            raise ValueError(f"Unrecognized unary prefix {children[0]}")
|
||||
|
||||
def visit_binary_expression(self, node, children):
|
||||
def visit_binary_expression(self, _node, children):
|
||||
return MessageFilterNode(tuple(children[0]), children[1], children[2])
|
||||
|
||||
def visit_expression(self, node, children):
|
||||
def visit_expression(self, _node, children):
|
||||
if self.debug:
|
||||
print("Expression {}".format(children))
|
||||
if len(children) > 1:
|
||||
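The filter node classes in this diff compose into a small boolean evaluation tree over logged messages. A minimal stdlib-only sketch of that composition (class and field names here are illustrative stand-ins; the real `MessageFilterNode` delegates to `msg.matches(self)` and the parsing layer uses arpeggio, both omitted):

```python
import typing


class Node:
    def match(self, msg) -> typing.Union[bool, tuple]:
        raise NotImplementedError()


class Leaf(Node):
    """Stand-in for MessageFilterNode: compares one field of a dict 'message'."""
    def __init__(self, selector, operator, value):
        self.selector, self.operator, self.value = selector, operator, value

    def match(self, msg):
        return msg.get(self.selector[0]) == self.value


class NotNode(Node):
    def __init__(self, node):
        self.node = node

    def match(self, msg):
        return not self.node.match(msg)


class OrNode(Node):
    def __init__(self, left, right):
        self.left, self.right = left, right

    def match(self, msg):
        # Short-circuits, just like OrFilterNode in the diff
        return self.left.match(msg) or self.right.match(msg)


tree = OrNode(Leaf(("Type",), "==", "LLUDP"), NotNode(Leaf(("Type",), "==", "EQ")))
print(tree.match({"Type": "HTTP"}))  # → True (matches via the NotNode branch)
```

The `MATCH_RESULT = typing.Union[bool, typing.Tuple]` change above widens the return type so leaf matches can report *where* they matched (a span tuple is truthy) while still composing with `or`/`and`/`not`.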
638
hippolyzer/lib/proxy/message_logger.py
Normal file
@@ -0,0 +1,638 @@
from __future__ import annotations

import collections
import copy
import fnmatch
import io
import logging
import pickle
import re
import typing
import weakref

from defusedxml import minidom

from hippolyzer.lib.base import serialization as se, llsd
from hippolyzer.lib.base.datatypes import TaggedUnion, UUID, TupleCoord
from hippolyzer.lib.base.helpers import bytes_escape
from hippolyzer.lib.proxy.message_filter import MetaFieldSpecifier, compile_filter, BaseFilterNode, MessageFilterNode, \
    EnumFieldSpecifier
from hippolyzer.lib.proxy.region import CapType

if typing.TYPE_CHECKING:
    from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
    from hippolyzer.lib.proxy.message import ProxiedMessage
    from hippolyzer.lib.proxy.region import ProxiedRegion
    from hippolyzer.lib.proxy.sessions import Session

LOG = logging.getLogger(__name__)


class BaseMessageLogger:
    def log_lludp_message(self, session: Session, region: ProxiedRegion, message: ProxiedMessage):
        pass

    def log_http_response(self, flow: HippoHTTPFlow):
        pass

    def log_eq_event(self, session: Session, region: ProxiedRegion, event: dict):
        pass


class FilteringMessageLogger(BaseMessageLogger):
    def __init__(self):
        BaseMessageLogger.__init__(self)
        self._raw_entries = collections.deque(maxlen=2000)
        self._filtered_entries: typing.List[AbstractMessageLogEntry] = []
        self._paused = False
        self.filter: BaseFilterNode = compile_filter("")

    def set_filter(self, filter_str: str):
        self.filter = compile_filter(filter_str)
        self._begin_reset()
        # Keep any entries that've aged out of the raw entries list that
        # match the new filter
        self._filtered_entries = [
            m for m in self._filtered_entries if
            m not in self._raw_entries and self.filter.match(m)
        ]
        self._filtered_entries.extend(m for m in self._raw_entries if self.filter.match(m))
        self._end_reset()
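The retention trick in `set_filter` is easy to miss: `_raw_entries` is a bounded deque, so entries that have aged out of it but already passed an earlier filter are kept across a re-filter only if they still match the new one. The same shape in isolation, with integers standing in for log entries (the names here are illustrative only):

```python
import collections

raw = collections.deque(maxlen=3)
filtered = []
match = lambda n: n % 2 == 0  # current "filter": evens

for n in range(6):  # 0..5; the deque retains only 3, 4, 5
    raw.append(n)
    if match(n):
        filtered.append(n)

# filtered still holds 0 and 2 even though they fell out of `raw`
new_match = lambda n: n % 2 == 0 and n > 1  # stricter new filter

# Same shape as set_filter(): keep aged-out entries that still match,
# then re-scan the raw deque with the new filter.
filtered = [m for m in filtered if m not in raw and new_match(m)]
filtered.extend(m for m in raw if new_match(m))
print(filtered)  # → [2, 4]
```

`2` survives only because it was filtered in before it aged out of the deque; `0` is dropped by the new filter, and `4` comes back from the re-scan of the deque itself.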
    def set_paused(self, paused: bool):
        self._paused = paused

    def log_lludp_message(self, session: Session, region: ProxiedRegion, message: ProxiedMessage):
        if self._paused:
            return
        self._add_log_entry(LLUDPMessageLogEntry(message, region, session))

    def log_http_response(self, flow: HippoHTTPFlow):
        if self._paused:
            return
        # These are huge, let's not log them for now.
        if flow.cap_data and flow.cap_data.asset_server_cap:
            return
        self._add_log_entry(HTTPMessageLogEntry(flow))

    def log_eq_event(self, session: Session, region: ProxiedRegion, event: dict):
        if self._paused:
            return
        self._add_log_entry(EQMessageLogEntry(event, region, session))

    # Hooks that Qt models will want to implement
    def _begin_insert(self, insert_idx: int):
        pass

    def _end_insert(self):
        pass

    def _begin_reset(self):
        pass

    def _end_reset(self):
        pass

    def _add_log_entry(self, entry: AbstractMessageLogEntry):
        try:
            # Paused, throw it away.
            if self._paused:
                return
            self._raw_entries.append(entry)
            if self.filter.match(entry):
                next_idx = len(self._filtered_entries)
                self._begin_insert(next_idx)
                self._filtered_entries.append(entry)
                self._end_insert()

            entry.cache_summary()
            # In the common case we don't need to keep around the serialization
            # caches anymore. If the filter changes, the caches will be repopulated
            # as necessary.
            entry.freeze()
        except Exception:
            LOG.exception("Failed to filter queued message")

    def clear(self):
        self._begin_reset()
        self._filtered_entries.clear()
        self._raw_entries.clear()
        self._end_reset()


class AbstractMessageLogEntry:
    region: typing.Optional[ProxiedRegion]
    session: typing.Optional[Session]
    name: str
    type: str

    __slots__ = ["_region", "_session", "_region_name", "_agent_id", "_summary", "meta"]

    def __init__(self, region, session):
        if region and not isinstance(region, weakref.ReferenceType):
            region = weakref.ref(region)
        if session and not isinstance(session, weakref.ReferenceType):
            session = weakref.ref(session)

        self._region: typing.Optional[weakref.ReferenceType] = region
        self._session: typing.Optional[weakref.ReferenceType] = session
        self._region_name = None
        self._agent_id = None
        self._summary = None
        if self.region:
            self._region_name = self.region.name
        if self.session:
            self._agent_id = self.session.agent_id

        agent_obj = None
        if self.region is not None:
            agent_obj = self.region.objects.lookup_fullid(self.agent_id)
        self.meta = {
            "RegionName": self.region_name,
            "AgentID": self.agent_id,
            "SessionID": self.session.id if self.session else None,
            "AgentLocal": agent_obj.LocalID if agent_obj is not None else None,
            "Method": self.method,
            "Type": self.type,
            "SelectedLocal": self._current_selected_local(),
            "SelectedFull": self._current_selected_full(),
        }

    def freeze(self):
        pass

    def cache_summary(self):
        self._summary = self.summary

    def _current_selected_local(self):
        if self.session:
            return self.session.selected.object_local
        return None

    def _current_selected_full(self):
        selected_local = self._current_selected_local()
        if selected_local is None or self.region is None:
            return None
        obj = self.region.objects.lookup_localid(selected_local)
        return obj and obj.FullID

    def _get_meta(self, name: str):
        # Slight difference in semantics. Filters are meant to return the same
        # thing no matter when they're run, so SelectedLocal and friends resolve
        # to the selected items _at the time the message was logged_. To handle
        # the case where we want to match on the selected object at the time the
        # filter is evaluated, we resolve these here.
        if name == "CurrentSelectedLocal":
            return self._current_selected_local()
        elif name == "CurrentSelectedFull":
            return self._current_selected_full()
        return self.meta.get(name)

    @property
    def region(self) -> typing.Optional[ProxiedRegion]:
        if self._region:
            return self._region()
        return None

    @property
    def session(self) -> typing.Optional[Session]:
        if self._session:
            return self._session()
        return None
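`AbstractMessageLogEntry` deliberately holds its region and session through `weakref.ref` so that thousands of retained log entries can never keep a dead session alive; the `region`/`session` properties dereference and may come back `None`. The same pattern with a throwaway class:

```python
import gc
import weakref


class Session:
    """Throwaway stand-in for a proxy session object."""
    pass


live = Session()
ref = weakref.ref(live)
assert ref() is live  # dereferencing yields the object while it's alive

del live
gc.collect()  # CPython frees it via refcounting anyway; collect() for other runtimes
print(ref())  # → None: a log entry can detect the session is gone
```

This is why the log-entry code caches `_region_name` and `_agent_id` eagerly in `__init__`: once the weakref goes dead, those are the only copies left to display.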
    @property
    def region_name(self) -> str:
        region = self.region
        if region:
            self._region_name = region.name
            return self._region_name
        # Region may die after a message is logged, need to keep this around.
        if self._region_name:
            return self._region_name

        return ""

    @property
    def agent_id(self) -> typing.Optional[UUID]:
        if self._agent_id:
            return self._agent_id

        session = self.session
        if session:
            self._agent_id = session.agent_id
            return self._agent_id
        return None

    @property
    def host(self) -> str:
        region_name = self.region_name
        if not region_name:
            return ""
        session_str = ""
        agent_id = self.agent_id
        if agent_id:
            session_str = f" ({agent_id})"
        return region_name + session_str

    def request(self, beautify=False, replacements=None):
        return None

    def response(self, beautify=False):
        return None

    def _packet_root_matches(self, pattern):
        if fnmatch.fnmatchcase(self.name, pattern):
            return True
        if fnmatch.fnmatchcase(self.type, pattern):
            return True
        return False

    def _val_matches(self, operator, val, expected):
        if isinstance(expected, MetaFieldSpecifier):
            expected = self._get_meta(str(expected))
            if not isinstance(expected, (int, float, bytes, str, type(None), tuple)):
                if callable(expected):
                    expected = expected()
                else:
                    expected = str(expected)
        elif isinstance(expected, EnumFieldSpecifier):
            # Local import so we get a fresh copy of the templates module
            from hippolyzer.lib.proxy import templates
            enum_cls = getattr(templates, expected.enum_name)
            expected = enum_cls[expected.field_name]
        elif expected is not None:
            # Unbox the expected value
            expected = expected.value
        if not isinstance(val, (int, float, bytes, str, type(None), tuple, TupleCoord)):
            val = str(val)

        if not operator:
            return bool(val)
        elif operator == "==":
            return val == expected
        elif operator == "!=":
            return val != expected
        elif operator == "^=":
            if val is None:
                return False
            return val.startswith(expected)
        elif operator == "$=":
            if val is None:
                return False
            return val.endswith(expected)
        elif operator == "~=":
            if val is None:
                return False
            return expected in val
        elif operator == "<":
            return val < expected
        elif operator == "<=":
            return val <= expected
        elif operator == ">":
            return val > expected
        elif operator == ">=":
            return val >= expected
        elif operator == "&":
            return val & expected
        else:
            raise ValueError(f"Unexpected operator {operator!r}")
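The operator chain in `_val_matches` gives the filter language CSS-attribute-selector-style comparisons, plus the newly added `&` for testing flag bits. A standalone version of just the dispatch (comparison operators `<`/`<=`/`>`/`>=` and the meta/enum resolution are omitted for brevity; this mirrors but does not reproduce the method above):

```python
def val_matches(operator, val, expected):
    if not operator:
        return bool(val)          # bare selector: match on mere presence/truthiness
    if operator == "==":
        return val == expected
    if operator == "!=":
        return val != expected
    if operator == "^=":          # starts-with
        return val is not None and val.startswith(expected)
    if operator == "$=":          # ends-with
        return val is not None and val.endswith(expected)
    if operator == "~=":          # contains
        return val is not None and expected in val
    if operator == "&":           # bitwise AND, handy for packed flag fields
        return val & expected
    raise ValueError(f"Unexpected operator {operator!r}")


print(val_matches("^=", "ObjectUpdateCompressed", "ObjectUpdate"))  # → True
print(val_matches("&", 0b1010, 0b0010))  # → 2 (truthy: the flag bit is set)
```

Note that `&` returns the raw masked integer rather than a bool; since filter nodes only need truthiness, a nonzero mask counts as a match.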
    def _base_matches(self, matcher: "MessageFilterNode") -> typing.Optional[bool]:
        if len(matcher.selector) == 1:
            # Comparison operators would make no sense here
            if matcher.value or matcher.operator:
                return False
            return self._packet_root_matches(matcher.selector[0])
        if len(matcher.selector) == 2 and matcher.selector[0] == "Meta":
            return self._val_matches(matcher.operator, self._get_meta(matcher.selector[1]), matcher.value)
        return None

    def matches(self, matcher: "MessageFilterNode"):
        return self._base_matches(matcher) or False

    @property
    def seq(self):
        return ""

    @property
    def method(self):
        return ""

    @property
    def summary(self):
        return ""

    @staticmethod
    def _format_llsd(parsed):
        xmlified = llsd.format_pretty_xml(parsed)
        # dedent <key> by 1 for easier visual scanning
        xmlified = re.sub(rb" <key>", b"<key>", xmlified)
        return xmlified.decode("utf8", errors="replace")


class HTTPMessageLogEntry(AbstractMessageLogEntry):
    __slots__ = ["flow"]

    def __init__(self, flow: HippoHTTPFlow):
        self.flow: HippoHTTPFlow = flow
        cap_data = self.flow.cap_data
        region = cap_data and cap_data.region
        session = cap_data and cap_data.session

        super().__init__(region, session)
        # This was a request the proxy made through itself
        self.meta["Injected"] = flow.request_injected

    @property
    def type(self):
        return "HTTP"

    @property
    def name(self):
        cap_data = self.flow.cap_data
        name = cap_data and cap_data.cap_name
        if name:
            return name
        return self.flow.request.url

    @property
    def method(self):
        return self.flow.request.method

    def _format_http_message(self, want_request, beautify):
        message = self.flow.request if want_request else self.flow.response
        method = self.flow.request.method
        buf = io.StringIO()
        cap_data = self.flow.cap_data
        cap_name = cap_data and cap_data.cap_name
        base_url = cap_name and cap_data.base_url
        temporary_cap = cap_data and cap_data.type == CapType.TEMPORARY
        beautify_url = (beautify and base_url and cap_name and
                        not temporary_cap and self.session and want_request)
        if want_request:
            buf.write(message.method)
            buf.write(" ")
            if beautify_url:
                buf.write(f"[[{cap_name}]]{message.url[len(base_url):]}")
            else:
                buf.write(message.url)
            buf.write(" ")
            buf.write(message.http_version)
        else:
            buf.write(message.http_version)
            buf.write(" ")
            buf.write(str(message.status_code))
            buf.write(" ")
            buf.write(message.reason)
        buf.write("\r\n")
        if beautify_url:
            buf.write("# ")
            buf.write(message.url)
            buf.write("\r\n")

        headers = copy.deepcopy(message.headers)
        for key in tuple(headers.keys()):
            if key.lower().startswith("x-hippo-"):
                LOG.warning(f"Internal header {key!r} leaked out?")
                # If this header actually came from somewhere untrusted, we can't
                # include it. It may change the meaning of the message when replayed.
                headers[f"X-Untrusted-{key}"] = headers[key]
                headers.pop(key)
        beautified = None
        if beautify and message.content:
            try:
                serializer = se.HTTP_SERIALIZERS.get(cap_name)
                if serializer:
                    if want_request:
                        beautified = serializer.deserialize_req_body(method, message.content)
                    else:
                        beautified = serializer.deserialize_resp_body(method, message.content)

                    if beautified is se.UNSERIALIZABLE:
                        beautified = None
                    else:
                        beautified = self._format_llsd(beautified)
                        headers["X-Hippo-Beautify"] = "1"

                if not beautified:
                    content_type = self._guess_content_type(message)
                    if content_type.startswith("application/llsd"):
                        beautified = self._format_llsd(llsd.parse(message.content))
                    elif any(content_type.startswith(x) for x in ("application/xml", "text/xml")):
                        beautified = minidom.parseString(message.content).toprettyxml(indent=" ")
                        # kill blank lines. will break cdata sections. meh.
                        beautified = re.sub(r'\n\s*\n', '\n', beautified, flags=re.MULTILINE)
                        beautified = re.sub(r'<([\w]+)>\s*</\1>', r'<\1></\1>',
                                            beautified, flags=re.MULTILINE)
            except:
                LOG.exception("Failed to beautify message")

        message_body = beautified or message.content
        if isinstance(message_body, bytes):
            try:
                decoded = message.text
                # Valid in many codecs, but unprintable.
                if "\x00" in decoded:
                    raise ValueError("Embedded null")
                message_body = decoded
            except (UnicodeError, ValueError):
                # non-printable characters, return the escaped version.
                headers["X-Hippo-Escaped-Body"] = "1"
                message_body = bytes_escape(message_body).decode("utf8")

        buf.write(bytes(headers).decode("utf8", errors="replace"))
        buf.write("\r\n")

        buf.write(message_body)
        return buf.getvalue()

    def request(self, beautify=False, replacements=None):
        return self._format_http_message(want_request=True, beautify=beautify)

    def response(self, beautify=False):
        return self._format_http_message(want_request=False, beautify=beautify)

    @property
    def summary(self):
        if self._summary is not None:
            return self._summary
        msg = self.flow.response
        self._summary = f"{msg.status_code}: "
        if not msg.content:
            return self._summary
        if len(msg.content) > 1000000:
            self._summary += "[too large...]"
            return self._summary
        content_type = self._guess_content_type(msg)
        if content_type.startswith("application/llsd"):
            notation = llsd.format_notation(llsd.parse(msg.content))
            self._summary += notation.decode("utf8")[:500]
        return self._summary

    def _guess_content_type(self, message):
        content_type = message.headers.get("Content-Type", "")
        if not message.content or content_type.startswith("application/llsd"):
            return content_type
        # Sometimes gets sent with `text/plain` or `text/html`. Cool.
        if message.content.startswith(rb'<?xml version="1.0" ?><llsd>'):
            return "application/llsd+xml"
        if message.content.startswith(rb'<llsd>'):
            return "application/llsd+xml"
        if message.content.startswith(rb'<?xml '):
            return "application/xml"
        return content_type
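`_guess_content_type` compensates for servers that declare LLSD bodies as generic `text/plain` or `text/html` by sniffing the first bytes of the body. The same decision table as a free function (a sketch mirroring the method above, not the method itself):

```python
def guess_content_type(declared: str, content: bytes) -> str:
    """Trust the declared type unless the body prefix clearly says otherwise."""
    if not content or declared.startswith("application/llsd"):
        return declared
    if content.startswith(b'<?xml version="1.0" ?><llsd>'):
        return "application/llsd+xml"
    if content.startswith(b"<llsd>"):
        return "application/llsd+xml"
    if content.startswith(b"<?xml "):
        return "application/xml"
    return declared


print(guess_content_type("text/plain", b"<llsd><map /></llsd>"))
# → application/llsd+xml
```

Ordering matters: the `<?xml ...><llsd>` prefix must be checked before the bare `<?xml ` prefix, or LLSD-over-XML would be misclassified as plain XML.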
class EQMessageLogEntry(AbstractMessageLogEntry):
    __slots__ = ["event"]

    def __init__(self, event, region, session):
        super().__init__(region, session)
        self.event = event

    @property
    def type(self):
        return "EQ"

    def request(self, beautify=False, replacements=None):
        return self._format_llsd(self.event["body"])

    @property
    def name(self):
        return self.event["message"]

    @property
    def summary(self):
        if self._summary is not None:
            return self._summary
        self._summary = ""
        self._summary = llsd.format_notation(self.event["body"]).decode("utf8")[:500]
        return self._summary


class LLUDPMessageLogEntry(AbstractMessageLogEntry):
    __slots__ = ["_message", "_name", "_direction", "_frozen_message", "_seq", "_deserializer"]

    def __init__(self, message: ProxiedMessage, region, session):
        self._message: ProxiedMessage = message
        self._deserializer = None
        self._name = message.name
        self._direction = message.direction
        self._frozen_message: typing.Optional[bytes] = None
        self._seq = message.packet_id
        super().__init__(region, session)

    _MESSAGE_META_ATTRS = {
        "Injected", "Dropped", "Extra", "Resent", "Zerocoded", "Acks", "Reliable",
    }

    def _get_meta(self, name: str):
        # These may change between when the message is logged and when we
        # actually filter on it, since logging happens before addons.
        msg = self.message
        if name in self._MESSAGE_META_ATTRS:
            return getattr(msg, name.lower(), None)
        msg_meta = getattr(msg, "meta", None)
        if msg_meta is not None:
            if name in msg_meta:
                return msg_meta[name]
        return super()._get_meta(name)

    @property
    def message(self):
        if self._message:
            return self._message
        elif self._frozen_message:
            message = pickle.loads(self._frozen_message)
            message.deserializer = self._deserializer
            return message
        else:
            raise ValueError("Didn't have a fresh or frozen message somehow")

    def freeze(self):
        self.message.invalidate_caches()
        # These are expensive to keep around. pickle them and un-pickle on
        # an as-needed basis.
        self._deserializer = self.message.deserializer
        self._frozen_message = pickle.dumps(self._message, protocol=pickle.HIGHEST_PROTOCOL)
        self._message = None
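`freeze()` trades CPU for memory: once an entry's summary is cached, the parsed message is pickled down to a compact byte string and re-hydrated lazily only if a filter change touches it again. Stripped of the hippolyzer specifics, the pattern looks like this (class name is a made-up stand-in):

```python
import pickle


class FrozenBox:
    """Hold a value either live or as a pickle blob, thawing on demand."""
    def __init__(self, value):
        self._value = value
        self._frozen = None

    def freeze(self):
        self._frozen = pickle.dumps(self._value, protocol=pickle.HIGHEST_PROTOCOL)
        self._value = None  # drop the expensive live object

    @property
    def value(self):
        if self._value is not None:
            return self._value
        if self._frozen is not None:
            return pickle.loads(self._frozen)  # a thawed copy, not the original
        raise ValueError("Didn't have a fresh or frozen value somehow")


box = FrozenBox({"name": "ObjectUpdate", "seq": 7})
box.freeze()
print(box.value)  # → {'name': 'ObjectUpdate', 'seq': 7}
```

Note the thawed value is a fresh copy each access, which is why the real `freeze()` also stashes `deserializer` separately: anything unpicklable or shared has to be re-attached after `pickle.loads`.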
    @property
    def type(self):
        return "LLUDP"

    @property
    def name(self):
        if self._message:
            self._name = self._message.name
        return self._name

    @property
    def method(self):
        if self._message:
            self._direction = self._message.direction
        return self._direction.name if self._direction is not None else ""

    def request(self, beautify=False, replacements=None):
        return self.message.to_human_string(replacements, beautify)

    def matches(self, matcher):
        base_matched = self._base_matches(matcher)
        if base_matched is not None:
            return base_matched

        if not self._packet_root_matches(matcher.selector[0]):
            return False

        message = self.message

        selector_len = len(matcher.selector)
        # name, block_name, var_name(, subfield_name)?
        if selector_len not in (3, 4):
            return False
        for block_name in message.blocks:
            if not fnmatch.fnmatchcase(block_name, matcher.selector[1]):
                continue
            for block_num, block in enumerate(message[block_name]):
                for var_name in block.vars.keys():
                    if not fnmatch.fnmatchcase(var_name, matcher.selector[2]):
                        continue
                    # So we know where the match happened
                    span_key = (message.name, block_name, block_num, var_name)
                    if selector_len == 3:
                        # We're just matching on the var existing, not having any particular value
                        if matcher.value is None:
                            return span_key
                        if self._val_matches(matcher.operator, block[var_name], matcher.value):
                            return span_key
                    # Need to invoke a special unpacker
                    elif selector_len == 4:
                        try:
                            deserialized = block.deserialize_var(var_name)
                        except KeyError:
                            continue
                        # Discard the tag if this is a tagged union, we only want the value
                        if isinstance(deserialized, TaggedUnion):
                            deserialized = deserialized.value
                        if not isinstance(deserialized, dict):
                            return False
                        for key in deserialized.keys():
                            if fnmatch.fnmatchcase(str(key), matcher.selector[3]):
                                if matcher.value is None:
                                    return span_key
                                if self._val_matches(matcher.operator, deserialized[key], matcher.value):
                                    return span_key

        return False
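The `matches()` walk pairs a glob pattern with each level of `Message.Block.Var`, and returns a span-key tuple (which is truthy) instead of a plain `True` so the UI can highlight exactly where the hit occurred. The core idea with plain dicts standing in for message blocks (a simplified sketch, not the real message API):

```python
import fnmatch

# Hypothetical message shape: {block_name: [{var_name: value}, ...]}
message = {"ObjectData": [{"ID": 1234, "State": 0}, {"ID": 5678, "State": 4}]}


def find_span(msg_name, blocks, selector, value=None):
    """Return (msg, block, block_num, var) for the first glob match, else False."""
    for block_name, block_list in blocks.items():
        if not fnmatch.fnmatchcase(block_name, selector[1]):
            continue
        for block_num, block in enumerate(block_list):
            for var_name, var_val in block.items():
                if not fnmatch.fnmatchcase(var_name, selector[2]):
                    continue
                if value is None or var_val == value:
                    return (msg_name, block_name, block_num, var_name)
    return False


print(find_span("ObjectUpdate", message, ("ObjectUpdate", "Object*", "State"), value=4))
# → ('ObjectUpdate', 'ObjectData', 1, 'State')
```

Because the span key includes the block number, a filter hit in the second `ObjectData` block is distinguishable from one in the first, which is what the `MATCH_RESULT` union type in `message_filter.py` exists to carry.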
    @property
    def summary(self):
        if self._summary is None:
            self._summary = self.message.to_summary()[:500]
        return self._summary

    @property
    def seq(self):
        if self._message:
            self._seq = self._message.packet_id
        return self._seq
@@ -44,8 +44,8 @@ class OrphanManager:
             del self._orphans[parent_id]
         return removed

-    def collect_orphans(self, parent: Object) -> typing.Sequence[int]:
-        return self._orphans.pop(parent.LocalID, [])
+    def collect_orphans(self, parent_localid: int) -> typing.Sequence[int]:
+        return self._orphans.pop(parent_localid, [])

     def track_orphan(self, obj: Object):
         self.track_orphan_by_id(obj.LocalID, obj.ParentID)
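The signature change above makes `collect_orphans` key on a bare LocalID rather than an `Object`, so callers can collect children before the parent `Object` even exists locally. Underneath, the orphanage is just a parent-to-children mapping drained with `dict.pop` and a default. A toy version of the bookkeeping (a standalone sketch, not the class from the diff):

```python
class OrphanManager:
    """Track children whose parent prim hasn't arrived yet (toy version)."""
    def __init__(self):
        self._orphans = {}  # parent LocalID -> [child LocalIDs]

    def track_orphan_by_id(self, child_localid, parent_localid):
        self._orphans.setdefault(parent_localid, []).append(child_localid)

    def collect_orphans(self, parent_localid):
        # pop() both returns and removes the bucket, so each orphan is
        # adopted exactly once when its parent shows up.
        return self._orphans.pop(parent_localid, [])


mgr = OrphanManager()
mgr.track_orphan_by_id(201, 100)
mgr.track_orphan_by_id(202, 100)
print(mgr.collect_orphans(100))  # → [201, 202]
print(mgr.collect_orphans(100))  # → [] (already collected)
```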
@@ -60,7 +60,19 @@ OBJECT_OR_LOCAL = typing.Union[Object, int]
|
||||
|
||||
|
||||
class ObjectManager:
|
||||
"""Object manager for a specific region"""
|
||||
"""
|
||||
Object manager for a specific region
|
||||
|
||||
TODO: This model does not make sense given how region->region object handoff works.
|
||||
The ObjectManager has to notice when an ObjectUpdate for an object came from a
|
||||
new region and update the associated region itself. It will not receive a KillObject
|
||||
from the old region in the case of physical region crossings. Right now this means
|
||||
physical objects or agents that physically cross a sim border get dangling object
|
||||
references. This is not the case when they teleport, even across a small distance
|
||||
to a neighbor, as that will send a KillObject in the old sim.
|
||||
Needs to switch to one manager managing objects for a full session rather than one
|
||||
manager per region.
|
||||
"""
|
||||
|
||||
def __init__(self, region: ProxiedRegion):
|
||||
self._localid_lookup: typing.Dict[int, Object] = {}
|
||||
@@ -87,6 +99,9 @@ class ObjectManager:
|
||||
message_handler.subscribe("KillObject",
|
||||
self._handle_kill_object)
|
||||
|
||||
def __len__(self):
|
||||
return len(self._localid_lookup)
|
||||
|
||||
@property
|
||||
def all_objects(self) -> typing.Iterable[Object]:
|
||||
return self._localid_lookup.values()
|
||||
@@ -106,7 +121,7 @@ class ObjectManager:
|
||||
return None
|
||||
return self.lookup_localid(local_id)
|
||||
|
||||
def _track_object(self, obj: Object):
|
||||
def _track_object(self, obj: Object, notify: bool = True):
|
||||
self._localid_lookup[obj.LocalID] = obj
|
||||
self._fullid_lookup[obj.FullID] = obj.LocalID
|
||||
# If it was missing, it's not missing anymore.
|
||||
@@ -115,13 +130,34 @@ class ObjectManager:
|
||||
self._parent_object(obj)
|
||||
|
||||
# Adopt any of our orphaned child objects.
|
||||
for orphan_local in self._orphan_manager.collect_orphans(obj):
|
||||
for orphan_local in self._orphan_manager.collect_orphans(obj.LocalID):
|
||||
child_obj = self.lookup_localid(orphan_local)
|
||||
# Shouldn't be any dead children in the orphanage
|
||||
assert child_obj is not None
|
||||
self._parent_object(child_obj)
|
||||
|
||||
self._notify_object_updated(obj, set(obj.to_dict().keys()))
|
||||
if notify:
|
||||
self._notify_object_updated(obj, set(obj.to_dict().keys()))
|
||||
|
||||
def _untrack_object(self, obj: Object):
|
||||
former_child_ids = obj.ChildIDs[:]
|
||||
for child_id in former_child_ids:
|
||||
child_obj = self.lookup_localid(child_id)
|
||||
assert child_obj is not None
|
||||
self._unparent_object(child_obj, child_obj.ParentID)
|
||||
|
||||
# Place any remaining unkilled children in the orphanage
|
||||
for child_id in former_child_ids:
|
||||
self._orphan_manager.track_orphan_by_id(child_id, obj.LocalID)
|
||||
|
||||
assert not obj.ChildIDs
|
||||
|
||||
# Make sure the parent knows we went away
|
||||
self._unparent_object(obj, obj.ParentID)
|
||||
|
||||
# Do this last in case we only have a weak reference
|
||||
del self._fullid_lookup[obj.FullID]
|
||||
del self._localid_lookup[obj.LocalID]
|
||||
|
||||
def _parent_object(self, obj: Object, insert_at_head=False):
|
||||
if obj.ParentID:
|
||||
@@ -163,9 +199,27 @@ class ObjectManager:
|
||||
|
||||
def _update_existing_object(self, obj: Object, new_properties):
|
||||
new_parent_id = new_properties.get("ParentID", obj.ParentID)
|
||||
|
||||
actually_updated_props = set()
|
||||
|
||||
if obj.LocalID != new_properties.get("LocalID", obj.LocalID):
|
||||
# Our LocalID changed, and we deal with linkages to other prims by
|
||||
# LocalID association. Break any links since our LocalID is changing.
|
||||
# Could happen if we didn't mark an attachment prim dead and the parent agent
|
||||
# came back into the sim. Attachment FullIDs do not change across TPs,
|
||||
# LocalIDs do. This at least lets us partially recover from the bad state.
|
||||
# Currently known to happen due to physical region crossings, so only debug.
|
||||
new_localid = new_properties["LocalID"]
|
||||
LOG.debug(f"Got an update with new LocalID for {obj.FullID}, {obj.LocalID} != {new_localid}. "
|
||||
f"May have mishandled a KillObject for a prim that left and re-entered region.")
|
||||
self._untrack_object(obj)
|
||||
obj.LocalID = new_localid
|
||||
self._track_object(obj, notify=False)
|
||||
actually_updated_props |= {"LocalID"}
|
||||
|
||||
old_parent_id = obj.ParentID
|
||||
|
||||
actually_updated_props = obj.update_properties(new_properties)
|
||||
actually_updated_props |= obj.update_properties(new_properties)
|
||||
|
||||
if new_parent_id != old_parent_id:
|
||||
self._unparent_object(obj, old_parent_id)
|
||||
@@ -192,6 +246,7 @@ class ObjectManager:
|
||||
"State": block.deserialize_var("State", make_copy=False),
|
||||
**block.deserialize_var("ObjectData", make_copy=False).value,
|
||||
}
|
||||
object_data["LocalID"] = object_data.pop("ID")
|
||||
# Empty == not updated
|
||||
if not object_data["TextureEntry"]:
|
||||
object_data.pop("TextureEntry")
|
||||
@@ -211,7 +266,7 @@ class ObjectManager:
|
||||
for block in packet['ObjectData']:
|
||||
object_data = self._normalize_object_update(block)
|
||||
|
||||
seen_locals.append(object_data["ID"])
|
||||
seen_locals.append(object_data["LocalID"])
|
||||
obj = self.lookup_fullid(object_data["FullID"])
|
||||
if obj:
|
||||
self._update_existing_object(obj, object_data)
|
||||
@@ -226,6 +281,7 @@ class ObjectManager:
|
||||
**dict(block.items()),
|
||||
"TextureEntry": block.deserialize_var("TextureEntry", make_copy=False),
|
||||
}
|
||||
object_data["LocalID"] = object_data.pop("ID")
|
||||
object_data.pop("Data")
|
||||
# Empty == not updated
|
||||
if object_data["TextureEntry"] is None:
|
||||
@@ -236,19 +292,19 @@ class ObjectManager:
         seen_locals = []
         for block in packet['ObjectData']:
             object_data = self._normalize_terse_object_update(block)
-            obj = self.lookup_localid(object_data["ID"])
+            obj = self.lookup_localid(object_data["LocalID"])
             # Can only update existing object with this message
             if obj:
                 # Need the Object as context because decoding state requires PCode.
                 state_deserializer = ObjectStateSerializer.deserialize
                 object_data["State"] = state_deserializer(ctx_obj=obj, val=object_data["State"])

-            seen_locals.append(object_data["ID"])
+            seen_locals.append(object_data["LocalID"])
             if obj:
                 self._update_existing_object(obj, object_data)
             else:
-                self.missing_locals.add(object_data["ID"])
-                LOG.debug(f"Received terse update for unknown object {object_data['ID']}")
+                self.missing_locals.add(object_data["LocalID"])
+                LOG.debug(f"Received terse update for unknown object {object_data['LocalID']}")

         packet.meta["ObjectUpdateIDs"] = tuple(seen_locals)
@@ -288,6 +344,7 @@ class ObjectManager:
             "PSBlock": ps_block.value,
             # Parent flag not set means explicitly un-parented
             "ParentID": compressed.pop("ParentID", None) or 0,
+            "LocalID": compressed.pop("ID"),
             **compressed,
             **dict(block.items()),
             "UpdateFlags": block.deserialize_var("UpdateFlags", make_copy=False),
@@ -304,8 +361,8 @@ class ObjectManager:
         seen_locals = []
         for block in packet['ObjectData']:
             object_data = self._normalize_object_update_compressed(block)
-            obj = self.lookup_localid(object_data["ID"])
-            seen_locals.append(object_data["ID"])
+            seen_locals.append(object_data["LocalID"])
+            obj = self.lookup_localid(object_data["LocalID"])
             if obj:
                 self._update_existing_object(obj, object_data)
             else:
@@ -331,33 +388,38 @@ class ObjectManager:
     def _handle_kill_object(self, packet: ProxiedMessage):
         seen_locals = []
         for block in packet["ObjectData"]:
-            obj = self.lookup_localid(block["ID"])
+            self._kill_object_by_local_id(block["ID"])
             seen_locals.append(block["ID"])
-            self.missing_locals -= {block["ID"]}
-            if obj:
-                AddonManager.handle_object_killed(self._region.session(), self._region, obj)
-
-                former_child_ids = obj.ChildIDs[:]
-                for child_id in former_child_ids:
-                    child_obj = self.lookup_localid(child_id)
-                    assert child_obj is not None
-                    self._unparent_object(child_obj, child_obj.ParentID)
-
-                del self._localid_lookup[obj.LocalID]
-                del self._fullid_lookup[obj.FullID]
-
-                # Place any remaining unkilled children in the orphanage
-                for child_id in former_child_ids:
-                    self._orphan_manager.track_orphan_by_id(child_id, obj.LocalID)
-
-                assert not obj.ChildIDs
-
-                # Make sure the parent knows we went away
-                self._unparent_object(obj, obj.ParentID)
-            else:
-                logging.debug(f"Received {packet.name} for unknown {block['ID']}")
         packet.meta["ObjectUpdateIDs"] = tuple(seen_locals)

+    def _kill_object_by_local_id(self, local_id: int):
+        obj = self.lookup_localid(local_id)
+        self.missing_locals -= {local_id}
+        child_ids: Sequence[int]
+        if obj:
+            AddonManager.handle_object_killed(self._region.session(), self._region, obj)
+            child_ids = obj.ChildIDs
+        else:
+            LOG.debug(f"Tried to kill unknown object {local_id}")
+            # If it had any orphans, they need to die.
+            child_ids = self._orphan_manager.collect_orphans(local_id)
+
+        # KillObject implicitly kills descendents
+        # This may mutate child_ids, use the reversed iterator so we don't
+        # invalidate the iterator during removal.
+        for child_id in reversed(child_ids):
+            # indra special-cases avatar PCodes and doesn't mark them dead
+            # due to cascading kill. Is this correct? Do avatars require
+            # explicit kill?
+            child_obj = self.lookup_localid(child_id)
+            if child_obj and child_obj.PCode == PCode.AVATAR:
+                continue
+            self._kill_object_by_local_id(child_id)
+
+        # Have to do this last, since untracking will clear child IDs
+        if obj:
+            self._untrack_object(obj)
+
     def _handle_get_object_cost(self, flow: HippoHTTPFlow):
         parsed = llsd.parse_xml(flow.response.content)
         if "error" in parsed:
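The new `_kill_object_by_local_id` above recurses through a killed object's descendants while special-casing avatars, and iterates over a reversed copy so removals during recursion don't invalidate the iteration. A minimal standalone sketch of that cascading-kill pattern (the `World` class and its lookup dict are simplified stand-ins, not hippolyzer's real API):

```python
import enum


class PCode(enum.IntEnum):
    PRIMITIVE = 9
    AVATAR = 47


class World:
    """Simplified stand-in for ObjectManager's local-ID tracking."""
    def __init__(self):
        self.objects = {}  # local_id -> (pcode, [child local_ids])

    def kill(self, local_id: int):
        entry = self.objects.get(local_id)
        if entry is None:
            return
        _pcode, child_ids = entry
        # Kill descendants first; iterate a reversed copy so removals
        # made during recursion can't invalidate the iteration.
        for child_id in reversed(list(child_ids)):
            child = self.objects.get(child_id)
            # Avatars survive a cascading kill (they just get "unsat").
            if child and child[0] == PCode.AVATAR:
                continue
            self.kill(child_id)
        del self.objects[local_id]


world = World()
world.objects = {
    1: (PCode.PRIMITIVE, [2, 3]),
    2: (PCode.PRIMITIVE, []),
    3: (PCode.AVATAR, []),
}
world.kill(1)
# prim child 2 dies with its parent; avatar child 3 survives
```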
@@ -15,8 +15,7 @@ from hippolyzer.lib.proxy.http_proxy import HTTPFlowContext, is_asset_server_cap
 from hippolyzer.lib.proxy.region import ProxiedRegion, CapType

 if TYPE_CHECKING:
-    from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
-    from hippolyzer.lib.proxy.message import ProxiedMessage
+    from hippolyzer.lib.proxy.message_logger import BaseMessageLogger


 class Session:
@@ -144,17 +143,6 @@ class Session:
         return "<%s %s>" % (self.__class__.__name__, self.id)


-class BaseMessageLogger:
-    def log_lludp_message(self, session: Session, region: ProxiedRegion, message: ProxiedMessage):
-        pass
-
-    def log_http_response(self, flow: HippoHTTPFlow):
-        pass
-
-    def log_eq_event(self, session: Session, region: ProxiedRegion, event: dict):
-        pass
-
-
 class SessionManager:
     def __init__(self):
         self.sessions: List[Session] = []
@@ -1283,8 +1283,8 @@ class ObjectUpdateExtraParamsSerializer(se.SimpleSubfieldSerializer):
     EMPTY_IS_NONE = True


-@se.enum_field_serializer("ObjectUpdate", "ObjectData", "Flags")
-class SoundFlags(enum.IntEnum):
+@se.flag_field_serializer("ObjectUpdate", "ObjectData", "Flags")
+class SoundFlags(enum.IntFlag):
     LOOP = 1 << 0
     SYNC_MASTER = 1 << 1
     SYNC_SLAVE = 1 << 2
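The `IntEnum` to `IntFlag` switch above matters because these bits are OR'd together on the wire: an `IntEnum` only accepts exact member values, while an `IntFlag` accepts any combination. A quick illustration using the same flag values as the diff:

```python
import enum


class SoundFlags(enum.IntFlag):
    LOOP = 1 << 0
    SYNC_MASTER = 1 << 1
    SYNC_SLAVE = 1 << 2


# Combined bits are fine for an IntFlag...
flags = SoundFlags(SoundFlags.LOOP | SoundFlags.SYNC_SLAVE)


# ...but an IntEnum with the same members rejects the combined value.
class SoundFlagsEnum(enum.IntEnum):
    LOOP = 1 << 0
    SYNC_MASTER = 1 << 1
    SYNC_SLAVE = 1 << 2


try:
    SoundFlagsEnum(5)  # LOOP | SYNC_SLAVE has no single member
    raised = False
except ValueError:
    raised = True
```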
@@ -1,10 +1,9 @@
 import dataclasses
 from typing import *

-import pkg_resources
-
 import hippolyzer.lib.base.serialization as se
 from hippolyzer.lib.base.datatypes import UUID
+from hippolyzer.lib.base.helpers import get_resource_filename
 from hippolyzer.lib.proxy.templates import AssetType


@@ -64,5 +63,5 @@ class VFS:
         return self._data_fh.read(block.size)


-_static_path = pkg_resources.resource_filename("hippolyzer.lib.proxy", "data/static_index.db2")
+_static_path = get_resource_filename("lib/proxy/data/static_index.db2")
 STATIC_VFS = VFS(_static_path)
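The `pkg_resources.resource_filename` calls above are swapped for a project helper; `pkg_resources` is deprecated and behaves poorly in frozen (cx_Freeze) builds. hippolyzer's actual `get_resource_filename` isn't shown in this diff, but a helper of this general shape can be built from the import system alone (the two-argument signature here is illustrative, not the real one):

```python
import importlib.util
from pathlib import Path


def get_resource_filename(package: str, resource_path: str) -> str:
    """Resolve resource_path relative to an installed package's directory."""
    spec = importlib.util.find_spec(package)
    if spec is None or spec.origin is None:
        raise ModuleNotFoundError(package)
    return str(Path(spec.origin).parent / resource_path)


# e.g. locate a file shipped inside the stdlib "json" package
tool_py = get_resource_filename("json", "tool.py")
```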
requirements-test.txt | 4  (new file)
@@ -0,0 +1,4 @@
+aioresponses
+pytest
+pytest-cov
+flake8
setup.py | 5
@@ -25,7 +25,7 @@ from setuptools import setup, find_packages

 here = path.abspath(path.dirname(__file__))

-version = '0.2'
+version = '0.4.0'

 with open(path.join(here, 'README.md')) as readme_fh:
     readme = readme_fh.read()
@@ -93,10 +93,11 @@ setup(
         'Glymur<1.0',
         'numpy<2.0',
         # These could be in extras_require if you don't want a GUI.
-        'pyside2',
+        'pyside2<6.0',
         'qasync',
     ],
     tests_require=[
         "pytest",
+        "aioresponses",
     ],
 )
setup_cxfreeze.py | 121  (new file)
@@ -0,0 +1,121 @@
+import setuptools  # noqa
+
+import os
+import shutil
+from distutils.core import Command
+from pathlib import Path
+
+from cx_Freeze import setup, Executable
+
+# We don't need any of these and they make the archive huge.
+TO_DELETE = [
+    "lib/PySide2/Qt3DRender.pyd",
+    "lib/PySide2/Qt53DRender.dll",
+    "lib/PySide2/Qt5Charts.dll",
+    "lib/PySide2/Qt5Location.dll",
+    "lib/PySide2/Qt5Pdf.dll",
+    "lib/PySide2/Qt5Quick.dll",
+    "lib/PySide2/Qt5WebEngineCore.dll",
+    "lib/PySide2/QtCharts.pyd",
+    "lib/PySide2/QtMultimedia.pyd",
+    "lib/PySide2/QtOpenGLFunctions.pyd",
+    "lib/PySide2/QtOpenGLFunctions.pyi",
+    "lib/PySide2/d3dcompiler_47.dll",
+    "lib/PySide2/opengl32sw.dll",
+    "lib/PySide2/translations",
+    "lib/aiohttp/_find_header.c",
+    "lib/aiohttp/_frozenlist.c",
+    "lib/aiohttp/_helpers.c",
+    "lib/aiohttp/_http_parser.c",
+    "lib/aiohttp/_http_writer.c",
+    "lib/aiohttp/_websocket.c",
+    # Improve this to work with different versions.
+    "lib/aiohttp/python39.dll",
+    "lib/lazy_object_proxy/python39.dll",
+    "lib/lxml/python39.dll",
+    "lib/markupsafe/python39.dll",
+    "lib/multidict/python39.dll",
+    "lib/numpy/core/python39.dll",
+    "lib/numpy/fft/python39.dll",
+    "lib/numpy/linalg/python39.dll",
+    "lib/numpy/random/python39.dll",
+    "lib/python39.dll",
+    "lib/recordclass/python39.dll",
+    "lib/regex/python39.dll",
+    "lib/test",
+    "lib/yarl/python39.dll",
+]
+
+COPY_TO_ZIP = [
+    "LICENSE.txt",
+    "README.md",
+    "NOTICE.md",
+    # Must have been generated with pip-licenses before. Many dependencies
+    # require their license to be distributed with their binaries.
+    "lib_licenses.txt",
+]
+
+
+BASE_DIR = Path(__file__).parent.absolute()
+
+
+class FinalizeCXFreezeCommand(Command):
+    description = "Prepare cx_Freeze build dirs and create a zip"
+    user_options = []
+
+    def initialize_options(self) -> None:
+        pass
+
+    def finalize_options(self) -> None:
+        pass
+
+    def run(self):
+        (BASE_DIR / "dist").mkdir(exist_ok=True)
+        for path in (BASE_DIR / "build").iterdir():
+            if path.name.startswith("exe.") and path.is_dir():
+                for cleanse_suffix in TO_DELETE:
+                    cleanse_path = path / cleanse_suffix
+                    shutil.rmtree(cleanse_path, ignore_errors=True)
+                    try:
+                        os.unlink(cleanse_path)
+                    except:
+                        pass
+                for to_copy in COPY_TO_ZIP:
+                    shutil.copy(BASE_DIR / to_copy, path / to_copy)
+                zip_path = BASE_DIR / "dist" / path.name
+                shutil.make_archive(zip_path, "zip", path)
+
+
+options = {
+    "build_exe": {
+        "packages": [
+            "passlib",
+            "_cffi_backend",
+            "hippolyzer",
+        ],
+        # exclude packages that are not really needed
+        "excludes": [
+            "tkinter",
+        ],
+        "include_msvcr": True,
+    }
+}
+
+executables = [
+    Executable(
+        "hippolyzer/apps/proxy_gui.py",
+        base=None,
+        target_name="hippolyzer_gui"
+    ),
+]
+
+setup(
+    name="hippolyzer_gui",
+    version="0.4.0",
+    description="Hippolyzer GUI",
+    options=options,
+    executables=executables,
+    cmdclass={
+        "finalize_cxfreeze": FinalizeCXFreezeCommand,
+    }
+)
@@ -1,16 +1,18 @@
-import pkg_resources
 import os
 import unittest

 from hippolyzer.lib.base.mesh import LLMeshSerializer, MeshAsset
 import hippolyzer.lib.base.serialization as se

+BASE_PATH = os.path.dirname(os.path.abspath(__file__))
+
+
 class TestMesh(unittest.TestCase):

     @classmethod
     def setUpClass(cls) -> None:
         # Use a rigged cube SLM from the upload process as a test file
-        slm_file = pkg_resources.resource_filename(__name__, "test_resources/testslm.slm")
+        slm_file = os.path.join(BASE_PATH, "test_resources", "testslm.slm")
         with open(slm_file, "rb") as f:
             cls.slm_bytes = f.read()

@@ -126,8 +126,6 @@ class TestMessage(unittest.TestCase):
     def test_partial_decode_pickle(self):
         msg = self.deserial.deserialize(self.serial.serialize(self.chat_msg))
         self.assertEqual(msg.deserializer(), self.deserial)
-        # Have to remove the weak ref so we can pickle
-        msg.deserializer = None
         msg = pickle.loads(pickle.dumps(msg, protocol=pickle.HIGHEST_PROTOCOL))

         # We should still have the raw body at this point
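The lines removed above existed because `weakref` objects can't be pickled, so the old test had to drop the message's deserializer reference before round-tripping it. A minimal demonstration of that constraint (`Holder` is a made-up stand-in for the proxied message, not hippolyzer's class):

```python
import pickle
import weakref


class Target:
    pass


class Holder:
    def __init__(self, target):
        # Mirrors a message keeping a weak ref back to its deserializer
        self.deserializer = weakref.ref(target)


target = Target()
holder = Holder(target)

try:
    pickle.dumps(holder)
    raised = False
except TypeError:
    # "cannot pickle 'weakref' object"
    raised = True

# Dropping the weak ref first, as the old test did, makes it picklable
holder.deserializer = None
restored = pickle.loads(pickle.dumps(holder))
```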
@@ -52,6 +52,7 @@ class BaseIntegrationTest(unittest.IsolatedAsyncioTestCase):
         self.session.open_circuit(self.client_addr, self.region_addr,
                                   self.protocol.transport)
         self.session.main_region = self.session.regions[-1]
+        self.session.main_region.handle = 0

     def _msg_to_datagram(self, msg: ProxiedMessage, src, dst, direction, socks_header=True):
         serialized = self.serializer.serialize(msg)
@@ -13,6 +13,7 @@ from hippolyzer.lib.base.objects import Object
 from hippolyzer.lib.proxy.addon_utils import BaseAddon
 from hippolyzer.lib.proxy.addons import AddonManager
 from hippolyzer.lib.proxy.message import ProxiedMessage
+from hippolyzer.lib.proxy.message_logger import FilteringMessageLogger
 from hippolyzer.lib.proxy.packets import ProxiedUDPPacket, Direction
 from hippolyzer.lib.proxy.region import ProxiedRegion
 from hippolyzer.lib.proxy.sessions import Session
@@ -35,6 +36,12 @@ class MockAddon(BaseAddon):
         self.events.append(("object_update", session.id, region.circuit_addr, obj.LocalID))


+class SimpleMessageLogger(FilteringMessageLogger):
+    @property
+    def entries(self):
+        return self._filtered_entries
+
+
 class LLUDPIntegrationTests(BaseIntegrationTest):
     def setUp(self) -> None:
         super().setUp()
@@ -169,3 +176,14 @@ class LLUDPIntegrationTests(BaseIntegrationTest):
         obj = self.session.regions[0].objects.lookup_localid(1234)
         self.assertIsInstance(obj.TextureEntry, lazy_object_proxy.Proxy)
         self.assertEqual(obj.TextureEntry.Textures[None], UUID("89556747-24cb-43ed-920b-47caed15465f"))
+
+    async def test_message_logger(self):
+        message_logger = SimpleMessageLogger()
+        self.session_manager.message_logger = message_logger
+        self._setup_circuit()
+        obj_update = self._make_objectupdate_compressed(1234)
+        self.protocol.datagram_received(obj_update, self.region_addr)
+        await self._wait_drained()
+        entries = message_logger.entries
+        self.assertEqual(len(entries), 1)
+        self.assertEqual(entries[0].name, "ObjectUpdateCompressed")
tests/proxy/test_capsclient.py | 65  (new file)
@@ -0,0 +1,65 @@
+import unittest
+
+import aiohttp
+import aioresponses
+from yarl import URL
+
+from hippolyzer.lib.base.datatypes import UUID
+from hippolyzer.lib.proxy.caps_client import CapsClient
+from hippolyzer.lib.proxy.region import ProxiedRegion
+from hippolyzer.lib.proxy.sessions import SessionManager
+
+
+class TestCapsClient(unittest.IsolatedAsyncioTestCase):
+    def setUp(self) -> None:
+        self.session = self.session = SessionManager().create_session({
+            "session_id": UUID.random(),
+            "secure_session_id": UUID.random(),
+            "agent_id": UUID.random(),
+            "circuit_code": 0,
+            "sim_ip": "127.0.0.1",
+            "sim_port": "1",
+            "seed_capability": "https://test.localhost:4/foo",
+        })
+        self.region = ProxiedRegion(("127.0.0.1", 1), "", self.session)
+        self.caps_client = CapsClient(self.region)
+
+    async def test_bare_url_works(self):
+        with aioresponses.aioresponses() as m:
+            m.get("https://example.com/", body=b"foo")
+            async with self.caps_client.get("https://example.com/") as resp:
+                self.assertEqual(await resp.read(), b"foo")
+
+    async def test_own_session_works(self):
+        with aioresponses.aioresponses() as m:
+            async with aiohttp.ClientSession() as sess:
+                m.get("https://example.com/", body=b"foo")
+                async with self.caps_client.get("https://example.com/", session=sess) as resp:
+                    self.assertEqual(await resp.read(), b"foo")
+
+    async def test_read_llsd(self):
+        with aioresponses.aioresponses() as m:
+            m.get("https://example.com/", body=b"<llsd><integer>2</integer></llsd>")
+            async with self.caps_client.get("https://example.com/") as resp:
+                self.assertEqual(await resp.read_llsd(), 2)
+
+    async def test_caps(self):
+        self.region.update_caps({"Foobar": "https://example.com/"})
+        with aioresponses.aioresponses() as m:
+            m.post("https://example.com/baz", body=b"ok")
+            data = {"hi": "hello"}
+            headers = {"Foo": "bar"}
+            async with self.caps_client.post("Foobar", path="baz", llsd=data, headers=headers) as resp:
+                self.assertEqual(await resp.read(), b"ok")
+
+            # Our original dict should not have been touched
+            self.assertEqual(headers, {"Foo": "bar"})
+
+            req_key = ("POST", URL("https://example.com/baz"))
+            req_body = m.requests[req_key][0].kwargs['data']
+            self.assertEqual(req_body, b'<?xml version="1.0" ?><llsd><map><key>hi</key><string>hello'
+                                       b'</string></map></llsd>')
+
+        with self.assertRaises(KeyError):
+            with self.caps_client.get("BadCap"):
+                pass
tests/proxy/test_httpflows.py | 41  (new file)
@@ -0,0 +1,41 @@
+import unittest
+
+from mitmproxy.test import tflow, tutils
+
+from hippolyzer.lib.base.datatypes import UUID
+from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
+from hippolyzer.lib.proxy.http_proxy import SerializedCapData
+from hippolyzer.lib.proxy.message_logger import HTTPMessageLogEntry
+from hippolyzer.lib.proxy.sessions import SessionManager
+
+
+class TestHTTPFlows(unittest.TestCase):
+    def setUp(self) -> None:
+        self.session_manager = SessionManager()
+        self.session = self.session = self.session_manager.create_session({
+            "session_id": UUID.random(),
+            "secure_session_id": UUID.random(),
+            "agent_id": UUID.random(),
+            "circuit_code": 0,
+            "sim_ip": "127.0.0.1",
+            "sim_port": "1",
+            "seed_capability": "https://test.localhost:4/foo",
+        })
+
+    def test_request_formatting(self):
+        req = tutils.treq(host="example.com", port=80)
+        resp = tutils.tresp()
+        fake_flow = tflow.tflow(req=req, resp=resp)
+        fake_flow.metadata["cap_data_ser"] = SerializedCapData(
+            cap_name="FakeCap",
+            session_id=str(self.session.id),
+            base_url="http://example.com",
+        )
+        flow = HippoHTTPFlow.from_state(fake_flow.get_state(), self.session_manager)
+        entry = HTTPMessageLogEntry(flow)
+        self.assertEqual(entry.request(beautify=True), """GET [[FakeCap]]/path HTTP/1.1\r
+# http://example.com/path\r
+header: qvalue\r
+content-length: 7\r
+\r
+content""")
@@ -1,13 +1,17 @@
 import unittest

+from mitmproxy.test import tflow, tutils
+
 from hippolyzer.lib.base.datatypes import Vector3
 from hippolyzer.lib.base.message.message import Block
 from hippolyzer.lib.base.message.udpdeserializer import UDPMessageDeserializer
 from hippolyzer.lib.base.settings import Settings
+from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
+from hippolyzer.lib.proxy.http_proxy import SerializedCapData
 from hippolyzer.lib.proxy.message import ProxiedMessage as Message
-from hippolyzer.apps.model import LLUDPMessageLogEntry
-from hippolyzer.apps.message_filter import compile_filter
+from hippolyzer.lib.proxy.message_logger import LLUDPMessageLogEntry, HTTPMessageLogEntry
+from hippolyzer.lib.proxy.message_filter import compile_filter
+from hippolyzer.lib.proxy.sessions import SessionManager

 OBJECT_UPDATE = b'\xc0\x00\x00\x00Q\x00\x0c\x00\x01\xea\x03\x00\x02\xe6\x03\x00\x01\xbe\xff\x01\x06\xbc\x8e\x0b\x00' \
                 b'\x01i\x94\x8cjM"\x1bf\xec\xe4\xac1c\x93\xcbKW\x89\x98\x01\t\x03\x00\x01Q@\x88>Q@\x88>Q@\x88><\xa2D' \
@@ -17,7 +21,7 @@ OBJECT_UPDATE = b'\xc0\x00\x00\x00Q\x00\x0c\x00\x01\xea\x03\x00\x02\xe6\x03\x00\
                 b'\x00\x02d&\x00\x03\x0e\x00\x01\x0e\x00\x01\x19\x00\x01\x80\x00\x01\x80\x00\x01\x80\x00\x01\x80\x00' \
                 b'\x01\x80\x00\x01\x80\x91\x11\xd2^/\x12\x8f\x81U\xa7@:x\xb3\x0e-\x00\x10\x03\x01\x00\x03\x1e%n\xa2' \
                 b'\xff\xc5\xe0\x83\x00\x01\x06\x00\x01\r\r\x01\x00\x11\x0e\xdc\x9b\x83\x98\x9aJv\xac\xc3\xdb\xbf7Ta' \
-                b'\x88\x00" '
+                b'\x88\x00"'


 class MessageFilterTests(unittest.TestCase):
@@ -46,8 +50,10 @@ class MessageFilterTests(unittest.TestCase):
     def test_equality(self):
         msg = LLUDPMessageLogEntry(Message("Foo", Block("Bar", Baz=1)), None, None)
         self.assertTrue(self._filter_matches("Foo.Bar.Baz == 1", msg))
+        self.assertTrue(self._filter_matches("Foo.Bar.Baz == 0x1", msg))
         msg.message["Bar"]["Baz"] = 2
         self.assertFalse(self._filter_matches("Foo.Bar.Baz == 1", msg))
+        self.assertFalse(self._filter_matches("Foo.Bar.Baz == 0x1", msg))

     def test_and(self):
         msg = LLUDPMessageLogEntry(Message("Foo", Block("Bar", Baz=1)), None, None)
@@ -95,6 +101,14 @@ class MessageFilterTests(unittest.TestCase):
         self.assertFalse(self._filter_matches("Foo.Bar.Baz < (0, 3, 0)", msg))
         self.assertTrue(self._filter_matches("Foo.Bar.Baz > (0, 0, 0)", msg))

+    def test_enum_specifier(self):
+        # 2 is the enum val for SculptType.TORUS
+        msg = LLUDPMessageLogEntry(Message("Foo", Block("Bar", Baz=2)), None, None)
+        self.assertTrue(self._filter_matches("Foo.Bar.Baz == SculptType.TORUS", msg))
+        # bitwise AND should work as well
+        self.assertTrue(self._filter_matches("Foo.Bar.Baz & SculptType.TORUS", msg))
+        self.assertFalse(self._filter_matches("Foo.Bar.Baz == SculptType.SPHERE", msg))
+
     def test_tagged_union_subfield(self):
         settings = Settings()
         settings.ENABLE_DEFERRED_PACKET_PARSING = False
@@ -105,6 +119,17 @@ class MessageFilterTests(unittest.TestCase):
         self.assertTrue(self._filter_matches("ObjectUpdate.ObjectData.ObjectData.Position > (88, 41, 25)", entry))
         self.assertTrue(self._filter_matches("ObjectUpdate.ObjectData.ObjectData.Position < (90, 43, 27)", entry))

+    def test_http_flow(self):
+        session_manager = SessionManager()
+        fake_flow = tflow.tflow(req=tutils.treq(), resp=tutils.tresp())
+        fake_flow.metadata["cap_data_ser"] = SerializedCapData(
+            cap_name="FakeCap",
+        )
+        flow = HippoHTTPFlow.from_state(fake_flow.get_state(), session_manager)
+        entry = HTTPMessageLogEntry(flow)
+        self.assertTrue(self._filter_matches("FakeCap", entry))
+        self.assertFalse(self._filter_matches("NotFakeCap", entry))
+

 if __name__ == "__main__":
     unittest.main()
@@ -12,11 +12,13 @@ from hippolyzer.lib.proxy.addons import AddonManager
 from hippolyzer.lib.proxy.addon_utils import BaseAddon
 from hippolyzer.lib.proxy.objects import ObjectManager
 from hippolyzer.lib.proxy.message import ProxiedMessage as Message
+from hippolyzer.lib.proxy.templates import PCode


 class MockRegion:
     def __init__(self, message_handler: MessageHandler):
         self.session = lambda: None
         self.handle = 123
         self.message_handler = message_handler
+        self.http_message_handler = MessageHandler()

@@ -43,9 +45,11 @@ class ObjectManagerTests(unittest.TestCase):
         self.object_addon = ObjectTrackingAddon()
         AddonManager.init([], None, [self.object_addon])

-    def _create_object_update(self, local_id=None, full_id=None, parent_id=None, pos=None, rot=None) -> Message:
+    def _create_object_update(self, local_id=None, full_id=None, parent_id=None, pos=None, rot=None,
+                              pcode=None) -> Message:
         pos = pos if pos is not None else (1.0, 2.0, 3.0)
         rot = rot if rot is not None else (0.0, 0.0, 0.0, 1.0)
+        pcode = pcode if pcode is not None else 9
         msg = Message(
             "ObjectUpdate",
             Block("RegionData", RegionHandle=123, TimeDilation=123),
@@ -53,7 +57,7 @@ class ObjectManagerTests(unittest.TestCase):
                 "ObjectData",
                 ID=local_id if local_id is not None else random.getrandbits(32),
                 FullID=full_id if full_id else UUID.random(),
-                PCode=9,
+                PCode=pcode,
                 Scale=Vector3(0.5, 0.5, 0.5),
                 UpdateFlags=268568894,
                 PathCurve=16,
@@ -85,8 +89,9 @@ class ObjectManagerTests(unittest.TestCase):
         # Run through (de)serializer to fill in any missing vars
         return self.deserializer.deserialize(self.serializer.serialize(msg))

-    def _create_object(self, local_id=None, full_id=None, parent_id=None, pos=None, rot=None) -> Object:
-        msg = self._create_object_update(local_id=local_id, full_id=full_id, parent_id=parent_id, pos=pos, rot=rot)
+    def _create_object(self, local_id=None, full_id=None, parent_id=None, pos=None, rot=None, pcode=None) -> Object:
+        msg = self._create_object_update(
+            local_id=local_id, full_id=full_id, parent_id=parent_id, pos=pos, rot=rot, pcode=pcode)
         self.message_handler.handle(msg)
         return self.object_manager.lookup_fullid(msg["ObjectData"]["FullID"])

@@ -122,14 +127,33 @@ class ObjectManagerTests(unittest.TestCase):
         self.assertEqual(set(), self.object_manager.missing_locals)
         self.assertSequenceEqual([child.LocalID], parent.ChildIDs)

-    def test_killing_parent_orphans_children(self):
-        child = self._create_object(local_id=2, parent_id=1)
+    def test_killing_parent_kills_children(self):
+        _child = self._create_object(local_id=2, parent_id=1)
         parent = self._create_object(local_id=1)
-        # This should orphan the child again
         self._kill_object(parent)
-        parent = self._create_object(local_id=1)
-        # Did we pick the orphan back up?
-        self.assertSequenceEqual([child.LocalID], parent.ChildIDs)
+        # We should not have picked up any children
+        self.assertSequenceEqual([], parent.ChildIDs)
+
+    def test_hierarchy_killed(self):
+        _child = self._create_object(local_id=3, parent_id=2)
+        _other_child = self._create_object(local_id=4, parent_id=2)
+        _parent = self._create_object(local_id=2, parent_id=1)
+        grandparent = self._create_object(local_id=1)
+        # KillObject implicitly kills all known descendents at that point
+        self._kill_object(grandparent)
+        self.assertEqual(0, len(self.object_manager))
+
+    def test_hierarchy_avatar_not_killed(self):
+        _child = self._create_object(local_id=3, parent_id=2)
+        _parent = self._create_object(local_id=2, parent_id=1, pcode=PCode.AVATAR)
+        grandparent = self._create_object(local_id=1)
+        # KillObject should only "unsit" child avatars (does this require an ObjectUpdate
+        # or is ParentID=0 implied?)
+        self._kill_object(grandparent)
+        self.assertEqual(2, len(self.object_manager))
+        self.assertIsNotNone(self.object_manager.lookup_localid(2))

     def test_attachment_orphan_parent_tracking(self):
         """
@@ -142,15 +166,6 @@ class ObjectManagerTests(unittest.TestCase):
         parent = self._create_object(local_id=2, parent_id=1)
         self.assertSequenceEqual([child.LocalID], parent.ChildIDs)

-    def test_killing_attachment_parent_orphans_children(self):
-        child = self._create_object(local_id=3, parent_id=2)
-        parent = self._create_object(local_id=2, parent_id=1)
-        # This should orphan the child again
-        self._kill_object(parent)
-        parent = self._create_object(local_id=2, parent_id=1)
-        # Did we pick the orphan back up?
-        self.assertSequenceEqual([child.LocalID], parent.ChildIDs)
-
     def test_unparenting_succeeds(self):
         child = self._create_object(local_id=3, parent_id=2)
         parent = self._create_object(local_id=2)