Compare commits
49 Commits
| Author | SHA1 | Date |
|---|---|---|
|  | b2f0de2db5 |  |
|  | 0b0e031091 |  |
|  | 4eeac738dc |  |
|  | d9416363b3 |  |
|  | 5906140921 |  |
|  | 58932e585e |  |
|  | b9f8ce0da2 |  |
|  | 67aa5e6bcd |  |
|  | 2a05529ceb |  |
|  | a97aa88cc9 |  |
|  | febc0793f2 |  |
|  | 141eb3afcd |  |
|  | 517888b1fa |  |
|  | 376b100ed9 |  |
|  | 07fbec47e1 |  |
|  | 7836527305 |  |
|  | 21b18b7a52 |  |
|  | 28b09144f2 |  |
|  | 1e13fede82 |  |
|  | 1bfb719f08 |  |
|  | e5b63f7550 |  |
|  | 91328ac448 |  |
|  | 46dbacd475 |  |
|  | 187742c20a |  |
|  | 5eae956750 |  |
|  | 37e8f8a20e |  |
|  | b3125f3231 |  |
|  | 46fed98d6a |  |
|  | 3b5938cf5c |  |
|  | c7aeb03ea4 |  |
|  | ab1bd16b5c |  |
|  | 0412ca5019 |  |
|  | 4d238c8dc8 |  |
|  | 3bcc510cfd |  |
|  | 0d9593e14c |  |
|  | 28dfe2f1b2 |  |
|  | c8f7231eae |  |
|  | 00e9ecb765 |  |
|  | 2892bbeb98 |  |
|  | 28f57a8836 |  |
|  | 943b8b11d5 |  |
|  | 88915dd8d7 |  |
|  | 60b39e27f8 |  |
|  | 8af87befbd |  |
|  | 95e34bb07a |  |
|  | 106eb5c063 |  |
|  | e7f88eeed9 |  |
|  | d07f100452 |  |
|  | 02c212e4a6 |  |
`.github/workflows/pytest.yml` (vendored) — 12 changes

```diff
@@ -1,6 +1,6 @@
 name: Run Python Tests

-on: [push]
+on: [push, pull_request]

 jobs:
   build:
@@ -21,9 +21,13 @@ jobs:
       - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install flake8 pytest pytest-cov
-          if [ -f requirements.txt ]; then pip install -r requirements.txt; fi
+          pip install -r requirements.txt
+          pip install -r requirements-test.txt
+      - name: Run Flake8
+        run: |
+          flake8 .
      - name: Test with pytest
        # Tests are intentionally covered to detect broken tests.
        run: |
          pytest --cov=./hippolyzer --cov=./tests --cov-report=xml
@@ -38,7 +42,5 @@ jobs:
          env_vars: OS,PYTHON
          name: codecov-umbrella
          fail_ci_if_error: false
+          # We don't care if coverage drops
+          continue-on-error: true
-          path_to_write_report: ./coverage/codecov_report.txt
-          verbose: false
```
`README.md` — 45 changes

```diff
@@ -62,16 +62,27 @@ the [Alchemy](https://github.com/AlchemyViewer/Alchemy) viewer.
   On Linux that would be `~/.firestorm_x64/` if you're using Firestorm.
+  * Certificate validation can be disabled entirely through the viewer debug setting `NoVerifySSLCert`,
+    but that is not recommended.
+
+#### Windows
+
+Windows viewers have broken SOCKS 5 proxy support. To work around that, you need to use a wrapper EXE
+that makes the viewer talk to Hippolyzer correctly. Follow the instructions on
+https://github.com/SaladDais/WinHippoAutoProxy to start the viewer and run it through Hippolyzer.
+
+The proxy should _not_ be configured through the viewer's own preferences panel; it won't work correctly.
+
+#### OS X & Linux
+
+SOCKS 5 works correctly on these platforms, so you can just configure it through the
+`preferences -> network -> proxy settings` panel:

 * Start the viewer and configure it to use `127.0.0.1:9061` as a SOCKS proxy and `127.0.0.1:9062` as
   an HTTP proxy. You **must** select the option in the viewer to use the HTTP proxy for all HTTP
   traffic, or logins will fail.
 * Optionally, if you want to reduce HTTP proxy lag you can have asset requests bypass the HTTP proxy by setting
-  the `no_proxy` env var appropriately. For ex. `no_proxy="asset-cdn.glb.agni.lindenlab.com" ./firestorm` or
-  `setx /m "no_proxy" "asset-cdn.glb.agni.lindenlab.com"` on Windows.
+  the `no_proxy` env var appropriately. For ex. `no_proxy="asset-cdn.glb.agni.lindenlab.com" ./firestorm`.
 * Log in!

 ### Filtering

 By default, the proxy's display filter is configured to ignore many high-frequency messages.
@@ -95,6 +106,9 @@ agent's session, you can do `(Meta.AgentID == None || Meta.AgentID == "d929385f-
 Vectors can also be compared. This will get any ObjectUpdate variant that occurs within a certain range:
 `(*ObjectUpdate*.ObjectData.*Data.Position > (110, 50, 100) && *ObjectUpdate*.ObjectData.*Data.Position < (115, 55, 105))`

+If you want to compare against an enum or flag class defined in `templates.py`, you can just specify its name:
+`ViewerEffect.Effect.Type == ViewerEffectType.EFFECT_BEAM`
+
 ### Logging

 Decoded messages are displayed in the log pane, clicking one will show the request and
@@ -301,8 +315,6 @@ If you are a viewer developer, please put them in a viewer.

 * AISv3 wrapper?
 * Higher level wrappers for common things? I don't really need these, so only if people want to write them.
-* Highlight matched portion of message in log view, if applicable
-  * Remember deep filters and return a map of them, have message formatter return text ranges?
 * Move things out of `templates.py`, right now most binary serialization stuff lives there
   because it's more convenient for me to hot-reload.
 * Ability to add menus?
@@ -315,6 +327,19 @@ This package [includes portions of the Second Life(TM) Viewer Artwork](https://g
 Copyright (C) 2008 Linden Research, Inc. The viewer artwork is licensed under the Creative Commons
 Attribution-Share Alike 3.0 License.

+## Contributing
+
+Ensure that any patches are clean with no unnecessary whitespace or formatting changes, and that you
+add new tests for any added functionality.
+
+## Philosophy
+
+With a few notable exceptions, Hippolyzer focuses mainly on decomposition of data, and doesn't
+provide many high-level abstractions for interpreting or manipulating that data. It's careful
+to only do lossless transforms on data that are just prettier representations of the data sent
+over the wire. Hippolyzer's goal is to help people understand how Second Life actually works;
+automatically employing abstractions that hide how SL works is counter to that goal.
+
 ## For Client Developers

 This section is mostly useful if you're developing a new SL-compatible client from scratch. Clients based
@@ -328,18 +353,20 @@ UDP proxy and an HTTP proxy.
 To have your client's traffic proxied through Hippolyzer the general flow is:

 * Open a TCP connection to Hippolyzer's SOCKS 5 proxy port
-  * This should be done once per logical user session, as Hippolyzer assumes a 1:1 mapping of SOCKS
+  * This should be done once per logical user session, as Hippolyzer assumes a 1:1 mapping of SOCKS TCP
     connections to SL sessions
 * Send a UDP associate command without authentication
 * The proxy will respond with a host / port pair that UDP messages may be sent through
 * At this point you will no longer need to use the TCP connection, but it must be kept
   alive until you want to break the UDP association
 * Whenever you send a UDP packet to a remote host, you'll need to instead send it to the host / port
   from the UDP associate response. A SOCKS 5 header must be prepended to the data indicating the ultimate destination
   of the packet
 * Any received UDP packets will also have a SOCKS 5 header indicating the real source IP and address
   * When in doubt, check `socks_proxy.py`, `packets.py` and the SOCKS 5 RFC for more info on how to deal with SOCKS.
+  * <https://github.com/SaladDais/WinHippoAutoProxy/blob/master/winhippoautoproxy/socks5udphooker.cpp> is a simple
+    example that wraps around `recvfrom()` and `sendto()` and could be used as a starting point.
 * All HTTP requests must be sent through Hippolyzer's HTTP proxy port.
   * You may not need to do any extra plumbing to get this to work if your chosen HTTP client
     respects the `HTTP_PROXY` environment variable.
 * All HTTPS connections will be encrypted with the proxy's TLS key. You'll need to either add it to whatever
```
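The flow above hinges on prepending an RFC 1928 UDP request header to every datagram sent to the relay. As a minimal illustration (this is not Hippolyzer's own code, and the host/port values are placeholders), the header can be built like so:

```python
import ipaddress
import struct


def socks5_udp_header(dest_host: str, dest_port: int) -> bytes:
    """Build the RFC 1928 UDP request header: RSV(2) | FRAG(1) | ATYP(1) | DST.ADDR | DST.PORT."""
    addr = ipaddress.ip_address(dest_host)
    if addr.version == 4:
        atyp = 0x01  # ATYP 0x01: IPv4 address follows
    else:
        atyp = 0x04  # ATYP 0x04: IPv6 address follows
    # RSV is two zero bytes, FRAG is 0 (no fragmentation), port is network byte order
    return struct.pack("!HBB", 0, 0, atyp) + addr.packed + struct.pack("!H", dest_port)


def wrap_udp_datagram(payload: bytes, dest_host: str, dest_port: int) -> bytes:
    # The wrapped datagram is what you'd sendto() the relay address
    # returned by the UDP associate response.
    return socks5_udp_header(dest_host, dest_port) + payload
```

Inbound datagrams carry the same header shape, so stripping it off is the mirror of this; see `packets.py` for how Hippolyzer itself handles both directions.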
```diff
@@ -9,23 +9,22 @@ class GreetingAddon(BaseAddon):
     @handle_command()
     async def greetings(self, session: Session, region: ProxiedRegion):
         """Greet everyone around you"""
-        agent_obj = region.objects.lookup_fullid(session.agent_id)
-        if not agent_obj:
+        our_avatar = region.objects.lookup_avatar(session.agent_id)
+        if not our_avatar:
             show_message("Don't have an agent object?")

         # Note that this will only have avatars closeish to your camera. The sim sends
         # KillObjects for avatars that get too far away.
-        other_agents = [o for o in region.objects.all_avatars if o.FullID != agent_obj.FullID]
+        other_avatars = [o for o in region.objects.all_avatars if o.FullID != our_avatar.FullID]

-        if not other_agents:
-            show_message("No other agents?")
+        if not other_avatars:
+            show_message("No other avatars?")

-        for other_agent in other_agents:
-            dist = Vector3.dist(agent_obj.Position, other_agent.Position)
+        for other_avatar in other_avatars:
+            dist = Vector3.dist(our_avatar.RegionPosition, other_avatar.RegionPosition)
             if dist >= 19.0:
                 continue
-            nv = other_agent.NameValue.to_dict()
-            send_chat(f"Greetings, {nv['FirstName']} {nv['LastName']}!")
+            if other_avatar.Name is None:
+                continue
+            send_chat(f"Greetings, {other_avatar.Name}!")


 addons = [GreetingAddon()]
```
```diff
@@ -23,8 +23,7 @@ import ctypes
 import secrets
 from typing import *

-import mitmproxy
-from mitmproxy.http import HTTPFlow
+import mitmproxy.http

 from hippolyzer.lib.base import llsd
 from hippolyzer.lib.base.datatypes import *
@@ -37,6 +37,22 @@ from hippolyzer.lib.proxy.templates import TextureEntry

 glymur.set_option('lib.num_threads', 4)

+# These should never be replaced, they're only used as aliases to tell the viewer
+# it should fetch the relevant texture from the appearance service
+BAKES_ON_MESH_TEXTURE_IDS = {UUID(x) for x in (
+    "5a9f4a74-30f2-821c-b88d-70499d3e7183",
+    "ae2de45c-d252-50b8-5c6e-19f39ce79317",
+    "24daea5f-0539-cfcf-047f-fbc40b2786ba",
+    "52cc6bb6-2ee5-e632-d3ad-50197b1dcb8a",
+    "43529ce8-7faa-ad92-165a-bc4078371687",
+    "09aac1fb-6bce-0bee-7d44-caac6dbb6c63",
+    "ff62763f-d60a-9855-890b-0c96f8f8cd98",
+    "8e915e25-31d1-cc95-ae08-d58a47488251",
+    "9742065b-19b5-297c-858a-29711d539043",
+    "03642e83-2bd1-4eb9-34b4-4c47ed586d2d",
+    "edd51b77-fc10-ce7a-4b3d-011dfc349e4f",
+)}
+

 def _modify_crc(crc_tweak: int, crc_val: int):
     return ctypes.c_uint32(crc_val ^ crc_tweak).value
@@ -137,6 +153,8 @@ class MonochromeAddon(BaseAddon):
         # and we don't want to change the canonical view.
         parsed_te = copy.deepcopy(parsed_te)
         for k, v in parsed_te.Textures.items():
+            if v in BAKES_ON_MESH_TEXTURE_IDS:
+                continue
             # Replace textures with their alias to bust the viewer cache
             parsed_te.Textures[k] = tracker.get_alias_uuid(v)
         for k, v in parsed_te.Color.items():
@@ -166,6 +184,8 @@ class MonochromeAddon(BaseAddon):
         orig_texture_id = self.mono_tracker.get_orig_uuid(UUID(texture_id))
         if not orig_texture_id:
             return
+        if orig_texture_id in BAKES_ON_MESH_TEXTURE_IDS:
+            return

         # The request was for a fake texture ID we created, rewrite the request to
         # request the real asset and mark the flow for modification once we receive
```
```diff
@@ -4,10 +4,9 @@ from hippolyzer.lib.proxy.message import ProxiedMessage
 from hippolyzer.lib.proxy.packets import Direction
 from hippolyzer.lib.proxy.region import ProxiedRegion
 from hippolyzer.lib.proxy.sessions import Session
-from hippolyzer.lib.proxy.templates import IMDialogType
+from hippolyzer.lib.proxy.templates import IMDialogType, XferFilePath

-SUSPICIOUS_PACKETS = {"RequestXfer", "TransferRequest", "UUIDNameRequest", "UUIDGroupNameRequest", "OpenCircuit"}
+SUSPICIOUS_PACKETS = {"TransferRequest", "UUIDNameRequest", "UUIDGroupNameRequest", "OpenCircuit"}
 REGULAR_IM_DIALOGS = (IMDialogType.TYPING_STOP, IMDialogType.TYPING_STOP, IMDialogType.NOTHING_SPECIAL)
@@ -29,6 +28,13 @@ class ShieldAddon(BaseAddon):
         else:
             expected_id = from_agent ^ session.agent_id
         msg_block["ID"] = expected_id
+        if message.name == "RequestXfer":
+            xfer_block = message["XferID"][0]
+            # Don't allow Xfers for files, only assets
+            if xfer_block["FilePath"] != XferFilePath.NONE or xfer_block["Filename"].strip(b"\x00"):
+                show_message(f"Blocked suspicious {message.name} packet")
+                region.circuit.drop_message(message)
+                return True


 addons = [ShieldAddon()]
```
```diff
@@ -1,6 +1,7 @@
 """
 Example of how to request an Xfer
 """
+from hippolyzer.lib.base.datatypes import UUID
 from hippolyzer.lib.base.legacy_inv import InventoryModel
 from hippolyzer.lib.base.message.message import Block
 from hippolyzer.lib.proxy.addon_utils import BaseAddon, show_message
@@ -8,7 +9,7 @@ from hippolyzer.lib.proxy.commands import handle_command
 from hippolyzer.lib.proxy.message import ProxiedMessage
 from hippolyzer.lib.proxy.region import ProxiedRegion
 from hippolyzer.lib.proxy.sessions import Session
-from hippolyzer.lib.proxy.templates import XferFilePath
+from hippolyzer.lib.proxy.templates import XferFilePath, AssetType, InventoryType, WearableType


 class XferExampleAddon(BaseAddon):
@@ -60,5 +61,61 @@ class XferExampleAddon(BaseAddon):
         item_names = [item.name for item in inv_model.items.values()]
         show_message(item_names)

+    @handle_command()
+    async def eyes_for_you(self, session: Session, region: ProxiedRegion):
+        """Upload an eye bodypart and create an item for it"""
+        asset_data = f"""LLWearable version 22
+New Eyes
+
+\tpermissions 0
+\t{{
+\t\tbase_mask\t7fffffff
+\t\towner_mask\t7fffffff
+\t\tgroup_mask\t00000000
+\t\teveryone_mask\t00000000
+\t\tnext_owner_mask\t00082000
+\t\tcreator_id\t{session.agent_id}
+\t\towner_id\t{session.agent_id}
+\t\tlast_owner_id\t00000000-0000-0000-0000-000000000000
+\t\tgroup_id\t00000000-0000-0000-0000-000000000000
+\t}}
+\tsale_info\t0
+\t{{
+\t\tsale_type\tnot
+\t\tsale_price\t10
+\t}}
+type 3
+parameters 2
+98 0
+99 0
+textures 1
+3 89556747-24cb-43ed-920b-47caed15465f
+"""
+        # If we want to create an item containing the asset we need to know the transaction id
+        # used to create the asset.
+        transaction_id = UUID.random()
+        await region.xfer_manager.upload_asset(
+            AssetType.BODYPART,
+            data=asset_data,
+            transaction_id=transaction_id
+        )
+        region.circuit.send_message(ProxiedMessage(
+            'CreateInventoryItem',
+            Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
+            Block(
+                'InventoryBlock',
+                CallbackID=0,
+                # Null folder ID will put it in the default folder for the type
+                FolderID=UUID(),
+                TransactionID=transaction_id,
+                NextOwnerMask=0x7fFFffFF,
+                Type=AssetType.BODYPART,
+                InvType=InventoryType.WEARABLE,
+                WearableType=WearableType.EYES,
+                Name='Eyes For You',
+                Description=b''
+            ),
+        ))
+
+
 addons = [XferExampleAddon()]
```
```diff
@@ -5,7 +5,10 @@ coverage:
   status:
     project:
       default:
-        # Do not fail PRs if the code coverage drops.
+        # Do not fail commits if the code coverage drops.
         target: 0%
         threshold: 100%
         base: auto
+    patch:
+      default:
+        only_pulls: true
```
```diff
@@ -136,7 +136,7 @@ def start_proxy(extra_addons: Optional[list] = None, extra_addon_paths: Optional
     async_server = loop.run_until_complete(coro)

     event_manager = MITMProxyEventManager(session_manager, flow_context)
-    loop.create_task(event_manager.pump_proxy_events())
+    loop.create_task(event_manager.run())

     addon_paths = sys.argv[1:]
     addon_paths.extend(extra_addon_paths)
@@ -179,7 +179,7 @@ def start_proxy(extra_addons: Optional[list] = None, extra_addon_paths: Optional

 def _windows_timeout_killer(pid: int):
     time.sleep(2.0)
-    print(f"Killing hanging event loop")
+    print("Killing hanging event loop")
     os.kill(pid, 9)
```
```diff
@@ -35,7 +35,7 @@ from hippolyzer.lib.proxy.ca_utils import setup_ca_everywhere
 from hippolyzer.lib.proxy.caps_client import CapsClient
 from hippolyzer.lib.proxy.http_proxy import create_proxy_master, HTTPFlowContext
 from hippolyzer.lib.proxy.packets import Direction
-from hippolyzer.lib.proxy.message import ProxiedMessage, VerbatimHumanVal, proxy_eval
+from hippolyzer.lib.proxy.message import ProxiedMessage, VerbatimHumanVal, proxy_eval, SpannedString
 from hippolyzer.lib.proxy.message_logger import LLUDPMessageLogEntry, AbstractMessageLogEntry
 from hippolyzer.lib.proxy.region import ProxiedRegion
 from hippolyzer.lib.proxy.sessions import Session, SessionManager
@@ -161,6 +161,8 @@ class ProxyGUI(QtWidgets.QMainWindow):
         "ViewerAsset GetTexture SetAlwaysRun GetDisplayNames MapImageService MapItemReply".split(" ")
     DEFAULT_FILTER = f"!({' || '.join(ignored for ignored in DEFAULT_IGNORE)})"

+    textRequest: QtWidgets.QTextEdit
+
     def __init__(self):
         super().__init__()
         loadUi(MAIN_WINDOW_UI_PATH, self)
@@ -263,8 +265,23 @@ class ProxyGUI(QtWidgets.QMainWindow):
             beautify=self.checkBeautify.isChecked(),
             replacements=self.buildReplacements(entry.session, entry.region),
         )
-        resp = entry.response(beautify=self.checkBeautify.isChecked())
+        highlight_range = None
+        if isinstance(req, SpannedString):
+            match_result = self.model.filter.match(entry)
+            # Match result was a tuple indicating what matched
+            if isinstance(match_result, tuple):
+                highlight_range = req.spans.get(match_result)

         self.textRequest.setPlainText(req)
+        if highlight_range:
+            cursor = self.textRequest.textCursor()
+            cursor.setPosition(highlight_range[0], QtGui.QTextCursor.MoveAnchor)
+            cursor.setPosition(highlight_range[1], QtGui.QTextCursor.KeepAnchor)
+            highlight_format = QtGui.QTextBlockFormat()
+            highlight_format.setBackground(QtCore.Qt.yellow)
+            cursor.setBlockFormat(highlight_format)
+
+        resp = entry.response(beautify=self.checkBeautify.isChecked())
         if resp:
             self.textResponse.show()
             self.textResponse.setPlainText(resp)
```
```diff
@@ -299,6 +299,32 @@ class StringEnum(str, enum.Enum):
         return self.value


+class IntEnum(enum.IntEnum):
+    # Give a special repr() that'll eval in a REPL.
+    def __repr__(self):
+        return f"{self.__class__.__name__}.{self.name}"
+
+
+class IntFlag(enum.IntFlag):
+    def __repr__(self):
+        # Make an ORed together version of the flags based on the POD version
+        flags = flags_to_pod(type(self), self)
+        flags = " | ".join(
+            (f"{self.__class__.__name__}.{v}" if isinstance(v, str) else str(v))
+            for v in flags
+        )
+        return f"({flags})"
+
+
+def flags_to_pod(flag_cls: Type[enum.IntFlag], val: int) -> Tuple[Union[str, int], ...]:
+    # Shove any bits not represented in the IntFlag into an int
+    left_over = val
+    for flag in iter(flag_cls):
+        left_over &= ~flag.value
+    extra = (int(left_over),) if left_over else ()
+    return tuple(flag.name for flag in iter(flag_cls) if val & flag.value) + extra
+
+
 class TaggedUnion(recordclass.datatuple):  # type: ignore
     tag: Any
     value: Any
@@ -306,5 +332,6 @@ class TaggedUnion(recordclass.datatuple):  # type: ignore

 __all__ = [
     "Vector3", "Vector4", "Vector2", "Quaternion", "TupleCoord",
-    "UUID", "RawBytes", "StringEnum", "JankStringyBytes", "TaggedUnion"
+    "UUID", "RawBytes", "StringEnum", "JankStringyBytes", "TaggedUnion",
+    "IntEnum", "IntFlag", "flags_to_pod"
 ]
```
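The `flags_to_pod` helper added above is a pure function, so its behavior is easy to demonstrate standalone. The sketch below reimplements it verbatim and exercises it with a toy flag class (`DemoFlags` is illustrative only, not a Hippolyzer enum): named bits come out as strings and any bits the flag class doesn't name are collected into a trailing int.

```python
import enum
from typing import Tuple, Type, Union


def flags_to_pod(flag_cls: Type[enum.IntFlag], val: int) -> Tuple[Union[str, int], ...]:
    # Shove any bits not represented in the IntFlag into an int
    left_over = val
    for flag in iter(flag_cls):
        left_over &= ~flag.value
    extra = (int(left_over),) if left_over else ()
    return tuple(flag.name for flag in iter(flag_cls) if val & flag.value) + extra


class DemoFlags(enum.IntFlag):  # illustrative only
    PHANTOM = 1
    TEMPORARY = 2


# 0b101 has the PHANTOM bit plus an unnamed bit (4):
# flags_to_pod(DemoFlags, 0b101) -> ("PHANTOM", 4)
```

This is what makes the new `IntFlag.__repr__` paste-able into code: ORing `DemoFlags.PHANTOM | 4` back together reproduces the original value even when unknown bits were set.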
```diff
@@ -347,7 +347,7 @@ class RegionCapNotAvailable(RegionDomainError):

 class RegionMessageError(RegionDomainError):
     """ an error raised when a region does not have a connection
-    over which it can send UDP messages
+    over which it can send UDP messages

     accepts a region object as an attribute
```
```diff
@@ -1,5 +1,6 @@
 from __future__ import annotations

 import codecs
+import functools
 import pkg_resources
 import re
```
```diff
@@ -39,6 +39,7 @@ class MeshAsset:
 # These TypedDicts describe the expected shape of the LLSD in the mesh
 # header and various segments. They're mainly for type hinting.
 class MeshHeaderDict(TypedDict, total=False):
+    """Header of the mesh file, includes offsets & sizes for segments' LLSD"""
     version: int
     creator: UUID
     date: dt.datetime
@@ -54,6 +55,7 @@ class MeshHeaderDict(TypedDict, total=False):


 class SegmentHeaderDict(TypedDict):
+    """Standard shape for segment references within the header"""
     offset: int
     size: int

@@ -73,6 +75,7 @@ class PhysicsHavokSegmentHeaderDict(PhysicsSegmentHeaderDict, total=False):


 class PhysicsCostDataHeaderDict(TypedDict, total=False):
+    """Cost of physical representation, populated by server"""
     decomposition: float
     decomposition_discounted_vertices: int
     decomposition_hulls: int
@@ -85,6 +88,7 @@ class PhysicsCostDataHeaderDict(TypedDict, total=False):


 class MeshSegmentDict(TypedDict, total=False):
+    """Dict of segments unpacked using the MeshHeaderDict"""
     high_lod: List[LODSegmentDict]
     medium_lod: List[LODSegmentDict]
     low_lod: List[LODSegmentDict]
@@ -96,6 +100,7 @@ class MeshSegmentDict(TypedDict, total=False):


 class LODSegmentDict(TypedDict, total=False):
+    """Represents a single entry within the material list of a LOD segment"""
     # Only present if True and no geometry
     NoGeometry: bool
     # -1.0 - 1.0
@@ -113,17 +118,22 @@ class LODSegmentDict(TypedDict, total=False):


 class DomainDict(TypedDict):
+    """Description of the real range for quantized coordinates"""
     # number of elems depends on what the domain is for, Vec2 or Vec3
     Max: List[float]
     Min: List[float]


 class VertexWeight(recordclass.datatuple):  # type: ignore
+    """Vertex weight for a specific joint on a specific vertex"""
     # index of the joint within the joint_names list in the skin segment
     joint_idx: int
     # 0.0 - 1.0
     weight: float


 class SkinSegmentDict(TypedDict, total=False):
+    """Rigging information"""
     joint_names: List[str]
     # model -> world transform matrix for model
     bind_shape_matrix: List[float]
@@ -137,14 +147,17 @@ class SkinSegmentDict(TypedDict, total=False):


 class PhysicsConvexSegmentDict(DomainDict, total=False):
+    """Data for convex hull collisions, populated by the client"""
     # Min / Max domain vals are inline, unlike for LODs
     HullList: List[int]
-    # -1.0 - 1.0
+    # -1.0 - 1.0, dequantized from binary field of U16s
     Positions: List[Vector3]
-    # -1.0 - 1.0
+    # -1.0 - 1.0, dequantized from binary field of U16s
     BoundingVerts: List[Vector3]


 class PhysicsHavokSegmentDict(TypedDict, total=False):
+    """Cached data for Havok collisions, populated by sim and not used by client."""
     HullMassProps: MassPropsDict
     MOPP: MOPPDict
     MeshDecompMassProps: MassPropsDict
@@ -169,8 +182,11 @@ class MOPPDict(TypedDict, total=False):


 def positions_from_domain(positions: Iterable[TupleCoord], domain: DomainDict):
-    # Used for turning positions into their actual positions within the mesh / domain
-    # for ex: positions_from_domain(lod["Position"], lod["PositionDomain"])
+    """
+    Used for turning positions into their actual positions within the mesh / domain
+
+    for ex: positions_from_domain(lod["Position"], lod["PositionDomain"])
+    """
     lower = domain['Min']
     upper = domain['Max']
     return [
@@ -179,7 +195,7 @@ def positions_from_domain(positions: Iterable[TupleCoord], domain: DomainDict):


 def positions_to_domain(positions: Iterable[TupleCoord], domain: DomainDict):
-    # Used for turning positions into their actual positions within the mesh / domain
+    """Used for turning positions into their actual positions within the mesh / domain"""
     lower = domain['Min']
     upper = domain['Max']
     return [
```
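The `positions_from_domain` / `positions_to_domain` pair maps quantized mesh coordinates through the `Min`/`Max` range in a `DomainDict`. A minimal standalone sketch of the "from domain" direction (assuming, as the comments above suggest, that stored components are normalized per axis and are linearly rescaled into the real range; function and parameter names here are illustrative, not Hippolyzer's):

```python
from typing import Iterable, List, Sequence, Tuple


def lerp_from_domain(
    positions: Iterable[Tuple[float, ...]],
    lower: Sequence[float],
    upper: Sequence[float],
) -> List[Tuple[float, ...]]:
    """Map normalized (0.0-1.0) components onto the real range described by the domain."""
    return [
        # Per-axis linear interpolation: lo + component * (hi - lo)
        tuple(lo + comp * (hi - lo) for comp, lo, hi in zip(pos, lower, upper))
        for pos in positions
    ]
```

The "to domain" direction is the inverse map, `(value - lo) / (hi - lo)` per axis, before re-quantizing to U16s.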
```diff
@@ -187,7 +203,36 @@ def positions_to_domain(positions: Iterable[TupleCoord], domain: DomainDict):
     ]


+class VertexWeights(se.SerializableBase):
+    """Serializer for a list of joint weights on a single vertex"""
+    INFLUENCE_SER = se.QuantizedFloat(se.U16, 0.0, 1.0)
+    INFLUENCE_LIMIT = 4
+    INFLUENCE_TERM = 0xFF
+
+    @classmethod
+    def serialize(cls, vals, writer: se.BufferWriter, ctx=None):
+        if len(vals) > cls.INFLUENCE_LIMIT:
+            raise ValueError(f"{vals!r} is too long, can only have {cls.INFLUENCE_LIMIT} influences!")
+        for val in vals:
+            joint_idx, influence = val
+            writer.write(se.U8, joint_idx)
+            writer.write(cls.INFLUENCE_SER, influence, ctx=ctx)
+        if len(vals) != cls.INFLUENCE_LIMIT:
+            writer.write(se.U8, cls.INFLUENCE_TERM)
+
+    @classmethod
+    def deserialize(cls, reader: se.Reader, ctx=None):
+        influence_list = []
+        for _ in range(cls.INFLUENCE_LIMIT):
+            joint_idx = reader.read(se.U8)
+            if joint_idx == cls.INFLUENCE_TERM:
+                break
+            influence_list.append(VertexWeight(joint_idx, reader.read(cls.INFLUENCE_SER, ctx=ctx)))
+        return influence_list
+
+
 class SegmentSerializer:
+    """Serializer for binary fields within an LLSD object"""
     def __init__(self, templates):
         self._templates: Dict[str, se.SerializableBase] = templates

@@ -217,33 +262,6 @@ class SegmentSerializer:
         return new_segment


-class VertexWeights(se.SerializableBase):
-    INFLUENCE_SER = se.QuantizedFloat(se.U16, 0.0, 1.0)
-    INFLUENCE_LIMIT = 4
-    INFLUENCE_TERM = 0xFF
-
-    @classmethod
-    def serialize(cls, vals, writer: se.BufferWriter, ctx=None):
-        if len(vals) > cls.INFLUENCE_LIMIT:
-            raise ValueError(f"{vals!r} is too long, can only have {cls.INFLUENCE_LIMIT} influences!")
-        for val in vals:
-            joint_idx, influence = val
-            writer.write(se.U8, joint_idx)
-            writer.write(cls.INFLUENCE_SER, influence, ctx=ctx)
-        if len(vals) != cls.INFLUENCE_LIMIT:
-            writer.write(se.U8, cls.INFLUENCE_TERM)
-
-    @classmethod
-    def deserialize(cls, reader: se.Reader, ctx=None):
-        influence_list = []
-        for _ in range(cls.INFLUENCE_LIMIT):
-            joint_idx = reader.read(se.U8)
-            if joint_idx == cls.INFLUENCE_TERM:
-                break
-            influence_list.append(VertexWeight(joint_idx, reader.read(cls.INFLUENCE_SER, ctx=ctx)))
-        return influence_list
-
-
 LOD_SEGMENT_SERIALIZER = SegmentSerializer({
     # 16-bit indices to the verts making up the tri. Imposes a 16-bit
     # upper limit on verts in any given material in the mesh.
```
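The relocated `VertexWeights` serializer encodes up to four `(joint index, quantized weight)` pairs per vertex, with a `0xFF` terminator byte when fewer than four influences are present. A standalone sketch of that wire format using `struct` (this is an approximation: byte order and the exact rounding of `se.QuantizedFloat` are assumptions, so treat it as illustrative rather than byte-for-byte identical to Hippolyzer's output):

```python
import struct
from typing import List, Tuple

INFLUENCE_LIMIT = 4
INFLUENCE_TERM = 0xFF


def pack_weights(weights: List[Tuple[int, float]]) -> bytes:
    if len(weights) > INFLUENCE_LIMIT:
        raise ValueError("at most 4 influences per vertex")
    out = b""
    for joint_idx, weight in weights:
        # U8 joint index, then the 0.0-1.0 weight quantized to a U16
        out += struct.pack("<BH", joint_idx, round(weight * 0xFFFF))
    if len(weights) != INFLUENCE_LIMIT:
        out += bytes([INFLUENCE_TERM])  # terminator marks the end of the list
    return out


def unpack_weights(buf: bytes) -> List[Tuple[int, float]]:
    weights, off = [], 0
    for _ in range(INFLUENCE_LIMIT):
        joint_idx = buf[off]
        if joint_idx == INFLUENCE_TERM:
            break
        weight = struct.unpack_from("<H", buf, off + 1)[0] / 0xFFFF
        weights.append((joint_idx, weight))
        off += 3
    return weights
```

Note the asymmetry the real class also has: a vertex with exactly four influences is written with no terminator, so the reader must stop after four pairs on its own.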
```diff
@@ -265,6 +283,7 @@ class LLMeshSerializer(se.SerializableBase):
     KNOWN_SEGMENTS = ("lowest_lod", "low_lod", "medium_lod", "high_lod",
                       "physics_mesh", "physics_convex", "skin", "physics_havok")

+    # Define unpackers for specific binary fields within the parsed LLSD segments
     SEGMENT_TEMPLATES: Dict[str, SegmentSerializer] = {
         "lowest_lod": LOD_SEGMENT_SERIALIZER,
         "low_lod": LOD_SEGMENT_SERIALIZER,
```
```diff
@@ -19,5 +19,3 @@ You should have received a copy of the GNU Lesser General Public License
 along with this program; if not, write to the Free Software Foundation,
 Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
 """
-
-
```
```diff
@@ -20,8 +20,6 @@ along with this program; if not, write to the Free Software Foundation,
 Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
 """

-import os
-
 from hippolyzer.lib.base.helpers import get_resource_filename

 msg_tmpl = open(get_resource_filename("lib/base/message/data/message_template.msg"))
```
@@ -34,13 +34,13 @@ VAR_TYPE = Union[TupleCoord, bytes, str, float, int, Tuple, UUID]
|
||||
|
||||
|
||||
class Block:
|
||||
"""
|
||||
"""
|
||||
base representation of a block
|
||||
Block expects a name, and kwargs for variables (var_name = value)
|
||||
"""
|
||||
__slots__ = ('name', 'size', 'vars', 'message_name', '_ser_cache', 'fill_missing',)
|
||||
|
||||
def __init__(self, name, /, fill_missing=False, **kwargs):
|
||||
def __init__(self, name, /, *, fill_missing=False, **kwargs):
|
||||
self.name = name
|
||||
self.size = 0
|
||||
self.message_name: Optional[str] = None
|
||||
@@ -129,24 +129,7 @@ class Block:
|
||||
continue
|
||||
# We have a serializer, include the pretty output in the repr,
|
||||
# using the _ suffix so the builder knows it needs to be serialized.
|
||||
deserialized = self.deserialize_var(key)
|
||||
type_name = type(deserialized).__name__
|
||||
# TODO: replace __repr__ for these in a context manager so nested
|
||||
# Enums / Flags get handled correctly as well. The point of the
|
||||
# pretty repr() is to make messages directly paste-able into code.
|
||||
if isinstance(deserialized, enum.IntEnum):
|
||||
deserialized = f"{type_name}.{deserialized.name}"
|
||||
elif isinstance(deserialized, enum.IntFlag):
|
||||
# Make an ORed together version of the flags based on the POD version
|
||||
flags = se.flags_to_pod(type(deserialized), deserialized)
|
||||
flags = " | ".join(
|
||||
(f"{type_name}.{v}" if isinstance(v, str) else str(v))
|
||||
for v in flags
|
||||
)
|
||||
deserialized = f"({flags})"
|
||||
else:
|
||||
deserialized = repr(deserialized)
|
||||
block_vars[f"{key}_"] = deserialized
|
||||
block_vars[f"{key}_"] = repr(self.deserialize_var(key))
|
||||
else:
|
||||
block_vars = self.vars
|
||||
|
||||
@@ -193,12 +176,21 @@ class Message:
        # should be set once a packet is sent / dropped to prevent accidental
        # re-sending or re-dropping
        self.finalized = False
        # Whether message is owned by the queue or should be sent immediately
        # Whether message is owned by a queue or should be sent immediately
        self.queued: bool = False
        self._blocks: BLOCK_DICT = {}

        self.add_blocks(args)

    def __reduce_ex__(self, protocol):
        reduced: Tuple[Any] = super().__reduce_ex__(protocol)
        # https://docs.python.org/3/library/pickle.html#object.__reduce__
        # We need to make some changes to the object state to make it serializable
        state_dict: Dict = reduced[2][1]
        # Have to remove the deserializer weak ref so we can pickle
        state_dict['deserializer'] = None
        return reduced

    @property
    def packet_id(self) -> Optional[int]:
        return self._packet_id
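The `__reduce_ex__` trick above — nulling out an unpicklable slot in the reduce tuple's state before pickle serializes it — can be demonstrated standalone. `Thing` and `deserializer` here are illustrative stand-ins, assuming a `__slots__`-only class whose default state is a `(dict_state, slots_dict)` pair:

```python
import pickle
import weakref

class Referent:
    pass

class Thing:
    __slots__ = ("name", "deserializer")

    def __init__(self, name, deserializer=None):
        self.name = name
        self.deserializer = deserializer

    def __reduce_ex__(self, protocol):
        reduced = super().__reduce_ex__(protocol)
        # reduced[2] is the state; for a __slots__-only class that's
        # (None, {slot_name: value}), so index [1] is the slots dict.
        state_dict = reduced[2][1]
        # Drop the weakref: pickling it would raise TypeError otherwise
        state_dict["deserializer"] = None
        return reduced

ref_target = Referent()
thing = Thing("hello", weakref.ref(ref_target))
clone = pickle.loads(pickle.dumps(thing, protocol=pickle.HIGHEST_PROTOCOL))
```

Without the override, `pickle.dumps` would fail on the `weakref.ref`; with it, the clone simply comes back with `deserializer` set to `None`.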
@@ -66,7 +66,7 @@ class MessageTemplateBlock:
        self.variables: typing.List[MessageTemplateVariable] = []
        self.variable_map: typing.Dict[str, MessageTemplateVariable] = {}
        self.name = name
        self.block_type = 0
        self.block_type: MsgBlockType = MsgBlockType.MBT_SINGLE
        self.number = 0

    def add_variable(self, var):
@@ -19,6 +19,3 @@ You should have received a copy of the GNU Lesser General Public License
along with this program; if not, write to the Free Software Foundation,
Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""

@@ -20,6 +20,7 @@ Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""
from __future__ import annotations

import dataclasses
from typing import *

import lazy_object_proxy
@@ -253,12 +254,17 @@ class Object(recordclass.datatuple):  # type: ignore
        updated_properties = set()
        for key, val in properties.items():
            if hasattr(self, key):
                old_val = getattr(self, key, val)
                old_val = getattr(self, key, dataclasses.MISSING)
                # Don't check equality if we're using a lazy proxy,
                # parsing is deferred until we actually use it.
                is_proxy = isinstance(val, lazy_object_proxy.Proxy)
                if is_proxy or old_val != val:
                    updated_properties.add(key)
                if isinstance(val, lazy_object_proxy.Proxy):
                    # TODO: be smarter about this. Can we store the raw bytes and
                    #  compare those if it's an unparsed object?
                    if old_val is not val:
                        updated_properties.add(key)
                else:
                    if old_val != val:
                        updated_properties.add(key)
                setattr(self, key, val)
        return updated_properties
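The hunk above switches the "old value" sentinel from the incoming value to `dataclasses.MISSING`, so a first-time assignment is counted as a change. A minimal stdlib-only sketch of that change-tracking pattern (the lazy-proxy branch is omitted, and `TrackedObject` is an illustrative name):

```python
import dataclasses

class TrackedObject:
    def update_properties(self, properties: dict) -> set:
        """Apply properties, returning the set of keys that actually changed."""
        updated = set()
        for key, val in properties.items():
            # MISSING distinguishes "attribute never set" from "attribute is None",
            # which getattr(self, key, val) could not do.
            old_val = getattr(self, key, dataclasses.MISSING)
            if old_val != val:
                updated.add(key)
            setattr(self, key, val)
        return updated

obj = TrackedObject()
first = obj.update_properties({"ParentID": 0, "Name": None})
second = obj.update_properties({"ParentID": 0, "Name": "cube"})
```

The first call reports both keys as updated even though `Name` was set to `None`; the second reports only the key whose value actually differs.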
@@ -5,7 +5,6 @@ import enum
import math
import struct
import types
import typing
import weakref
from io import SEEK_CUR, SEEK_SET, SEEK_END, RawIOBase, BufferedIOBase
from typing import *

@@ -1092,15 +1091,6 @@ class IntEnum(Adapter):
        return lambda: self.enum_cls(0)


def flags_to_pod(flag_cls: Type[enum.IntFlag], val: int) -> typing.Tuple[Union[str, int], ...]:
    # Shove any bits not represented in the IntFlag into an int
    left_over = val
    for flag in iter(flag_cls):
        left_over &= ~flag.value
    extra = (int(left_over),) if left_over else ()
    return tuple(flag.name for flag in iter(flag_cls) if val & flag.value) + extra
class IntFlag(Adapter):
    def __init__(self, flag_cls: Type[enum.IntFlag],
                 flag_spec: Optional[SerializablePrimitive] = None):

@@ -1121,7 +1111,7 @@ class IntFlag(Adapter):

    def decode(self, val: Any, ctx: Optional[ParseContext], pod: bool = False) -> Any:
        if pod:
            return flags_to_pod(self.flag_cls, val)
            return dtypes.flags_to_pod(self.flag_cls, val)
        return self.flag_cls(val)

    def default_value(self) -> Any:

@@ -1613,7 +1603,7 @@ class BufferedLLSDBinaryParser(llsd.HippoLLSDBinaryParser):
            byte = self._getc()[0]
        except IndexError:
            byte = None
        raise llsd.LLSDParseError("%s at byte %d: %s" % (message, self._index+offset, byte))
        raise llsd.LLSDParseError("%s at byte %d: %s" % (message, self._index + offset, byte))

    def _getc(self, num=1):
        return self._buffer.read_bytes(num)
@@ -1641,8 +1631,14 @@ def subfield_serializer(msg_name, block_name, var_name):
    return f


_ENUM_TYPE = TypeVar("_ENUM_TYPE", bound=Type[dtypes.IntEnum])
_FLAG_TYPE = TypeVar("_FLAG_TYPE", bound=Type[dtypes.IntFlag])


def enum_field_serializer(msg_name, block_name, var_name):
    def f(orig_cls):
    def f(orig_cls: _ENUM_TYPE) -> _ENUM_TYPE:
        if not issubclass(orig_cls, dtypes.IntEnum):
            raise ValueError(f"{orig_cls} must be a subclass of Hippolyzer's IntEnum class")
        wrapper = subfield_serializer(msg_name, block_name, var_name)
        wrapper(IntEnumSubfieldSerializer(orig_cls))
        return orig_cls

@@ -1650,7 +1646,9 @@ def enum_field_serializer(msg_name, block_name, var_name):

def flag_field_serializer(msg_name, block_name, var_name):
    def f(orig_cls):
    def f(orig_cls: _FLAG_TYPE) -> _FLAG_TYPE:
        if not issubclass(orig_cls, dtypes.IntFlag):
            raise ValueError(f"{orig_cls!r} must be a subclass of Hippolyzer's IntFlag class")
        wrapper = subfield_serializer(msg_name, block_name, var_name)
        wrapper(IntFlagSubfieldSerializer(orig_cls))
        return orig_cls
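The `TypeVar` annotations added above let a class decorator register the class for side effects while still telling type checkers that the decorated class comes back unchanged. A standalone sketch of that pattern, with `register_enum` and `ChatType` as invented illustrative names:

```python
import enum
from typing import Dict, Type, TypeVar

_ENUM_TYPE = TypeVar("_ENUM_TYPE", bound=Type[enum.IntEnum])

def register_enum(registry: Dict[str, Type[enum.IntEnum]]):
    def decorator(orig_cls: _ENUM_TYPE) -> _ENUM_TYPE:
        # Runtime guard mirrors the issubclass check in the diff
        if not issubclass(orig_cls, enum.IntEnum):
            raise ValueError(f"{orig_cls!r} must be an IntEnum subclass")
        registry[orig_cls.__name__] = orig_cls
        # Returning the class unchanged is what makes the _ENUM_TYPE
        # annotation useful: the decorated name keeps its original type.
        return orig_cls
    return decorator

REGISTRY: Dict[str, Type[enum.IntEnum]] = {}

@register_enum(REGISTRY)
class ChatType(enum.IntEnum):
    WHISPER = 0
    NORMAL = 1
```

Had the inner function been annotated `(orig_cls) -> Any`, type checkers would lose track of `ChatType`'s members after decoration.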
@@ -22,11 +22,11 @@ Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.

class Settings:
    def __init__(self, quiet_logging=False, spammy_logging=False, log_tests=True):
        """ some lovely configurable settings
        """ some lovely configurable settings

        These are applied application wide, and can be
        overridden at any time in a specific instance

        quiet_logging overrides spammy_logging
        """
@@ -1,3 +1,4 @@
import itertools
from pathlib import Path
import shutil
import sys

@@ -32,7 +33,6 @@ def setup_ca_everywhere(mitmproxy_master):
            pass
        except PermissionError:
            pass

    return valid_paths


@@ -42,7 +42,8 @@ def _viewer_config_dir_iter():
    elif sys.platform == "darwin":
        paths = (Path.home() / "Library" / "Application Support").iterdir()
    elif sys.platform in ("win32", "msys", "cygwin"):
        paths = (Path.home() / "AppData" / "Local").iterdir()
        app_data = Path.home() / "AppData"
        paths = itertools.chain((app_data / "Local").iterdir(), (app_data / "Roaming").iterdir())
    else:
        raise Exception("Unknown OS, can't locate viewer config dirs!")
@@ -51,36 +51,39 @@ class MITMProxyEventManager:
        self.llsd_message_serializer = LLSDMessageSerializer()
        self._asset_server_proxied = False

    async def pump_proxy_events(self):
    async def run(self):
        while not self.shutdown_signal.is_set():
            try:
                try:
                    event_type, flow_state = self.from_proxy_queue.get(False)
                except queue.Empty:
                    await asyncio.sleep(0.001)
                    continue

                flow = HippoHTTPFlow.from_state(flow_state, self.session_manager)
                try:
                    if event_type == "request":
                        self._handle_request(flow)
                        # A response was injected early in the cycle, we won't get a response
                        # callback from mitmproxy so just log it now.
                        message_logger = self.session_manager.message_logger
                        if message_logger and flow.response_injected:
                            message_logger.log_http_response(flow)
                    elif event_type == "response":
                        self._handle_response(flow)
                    else:
                        raise Exception(f"Unknown mitmproxy event type {event_type}")
                finally:
                    # If someone has taken this request out of the regular callback flow,
                    # they'll manually send a callback at some later time.
                    if not flow.taken:
                        self.to_proxy_queue.put(("callback", flow.id, flow.get_state()))
                await self.pump_proxy_event()
            except:
                logging.exception("Exploded when handling parsed packets")

    async def pump_proxy_event(self):
        try:
            event_type, flow_state = self.from_proxy_queue.get(False)
        except queue.Empty:
            await asyncio.sleep(0.001)
            return

        flow = HippoHTTPFlow.from_state(flow_state, self.session_manager)
        try:
            if event_type == "request":
                self._handle_request(flow)
                # A response was injected early in the cycle, we won't get a response
                # callback from mitmproxy so just log it now.
                message_logger = self.session_manager.message_logger
                if message_logger and flow.response_injected:
                    message_logger.log_http_response(flow)
            elif event_type == "response":
                self._handle_response(flow)
            else:
                raise Exception(f"Unknown mitmproxy event type {event_type}")
        finally:
            # If someone has taken this request out of the regular callback flow,
            # they'll manually send a callback at some later time.
            if not flow.taken:
                self.to_proxy_queue.put(("callback", flow.id, flow.get_state()))

    def _handle_request(self, flow: HippoHTTPFlow):
        url = flow.request.url
        cap_data = self.session_manager.resolve_cap(url)
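The event loop above polls a thread-safe `queue.Queue` from async code, sleeping briefly when empty so the coroutine yields instead of busy-waiting. A runnable miniature of that pattern, with the flow handling replaced by a simple list append (all names here are illustrative):

```python
import asyncio
import queue

async def pump_events(q: "queue.Queue[str]", shutdown: asyncio.Event, handled: list):
    # Poll the thread-safe queue without blocking the event loop;
    # a short sleep when idle keeps CPU usage down.
    while not shutdown.is_set():
        try:
            event = q.get(False)
        except queue.Empty:
            await asyncio.sleep(0.001)
            continue
        handled.append(event)
        if event == "shutdown":
            shutdown.set()

def run_pump() -> list:
    q: "queue.Queue[str]" = queue.Queue()
    for item in ("request", "response", "shutdown"):
        q.put(item)
    handled: list = []
    asyncio.run(pump_events(q, asyncio.Event(), handled))
    return handled

result = run_pump()
```

In the real proxy the queue is fed by mitmproxy's thread; `queue.Queue.get(False)` plus `asyncio.sleep` is a simple bridge when you cannot use an `asyncio.Queue` across threads.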
@@ -118,11 +121,14 @@ class MITMProxyEventManager:
            else:
                flow.response = mitmproxy.http.HTTPResponse.make(
                    307,
                    b"Redirecting...",
                    # Can't provide explanation in the body because this results in failing Range requests under
                    # mitmproxy that return garbage data. Chances are there's weird interactions
                    # between HTTP/1.x pipelining and range requests under mitmproxy that no other
                    # applications have hit. If that's a concern then Connection: close should be used.
                    b"",
                    {
                        "Content-Type": "text/plain",
                        "Connection": "keep-alive",
                        "Location": redir_url,
                        "Connection": "close",
                    }
                )
        elif cap_data and cap_data.asset_server_cap:

@@ -249,9 +249,9 @@ def create_http_proxy(bind_host, port, flow_context: HTTPFlowContext):  # pragma

def is_asset_server_cap_name(cap_name):
    return cap_name and (
        cap_name.startswith("GetMesh") or
        cap_name.startswith("GetTexture") or
        cap_name.startswith("ViewerAsset")
        cap_name.startswith("GetMesh")
        or cap_name.startswith("GetTexture")
        or cap_name.startswith("ViewerAsset")
    )
|
||||
@@ -71,6 +71,14 @@ def proxy_eval(eval_str: str, globals_=None, locals_=None):
|
||||
)
|
||||
|
||||
|
||||
TextSpan = Tuple[int, int]
|
||||
SpanDict = Dict[Tuple[Union[str, int], ...], TextSpan]
|
||||
|
||||
|
||||
class SpannedString(str):
|
||||
spans: SpanDict = {}
|
||||
|
||||
|
||||
class ProxiedMessage(Message):
|
||||
__slots__ = ("meta", "injected", "dropped", "direction")
|
||||
|
||||
@@ -83,9 +91,10 @@ class ProxiedMessage(Message):
|
||||
_maybe_reload_templates()
|
||||
|
||||
def to_human_string(self, replacements=None, beautify=False,
|
||||
template: Optional[MessageTemplate] = None):
|
||||
template: Optional[MessageTemplate] = None) -> SpannedString:
|
||||
replacements = replacements or {}
|
||||
_maybe_reload_templates()
|
||||
spans: SpanDict = {}
|
||||
string = ""
|
||||
if self.direction is not None:
|
||||
string += f'{self.direction.name} '
|
||||
@@ -101,11 +110,18 @@ class ProxiedMessage(Message):
|
||||
block_suffix = ""
|
||||
if template and template.get_block(block_name).block_type == MsgBlockType.MBT_VARIABLE:
|
||||
block_suffix = ' # Variable'
|
||||
for block in block_list:
|
||||
for block_num, block in enumerate(block_list):
|
||||
string += f"[{block_name}]{block_suffix}\n"
|
||||
for var_name, val in block.items():
|
||||
start_len = len(string)
|
||||
string += self._format_var(block, var_name, val, replacements, beautify)
|
||||
return string
|
||||
end_len = len(string)
|
||||
# Store the spans for each var so we can highlight specific matches
|
||||
spans[(self.name, block_name, block_num, var_name)] = (start_len, end_len)
|
||||
string += "\n"
|
||||
spanned = SpannedString(string)
|
||||
spanned.spans = spans
|
||||
return spanned
|
||||
|
||||
def _format_var(self, block, var_name, var_val, replacements, beautify=False):
|
||||
string = ""
|
||||
@@ -129,7 +145,7 @@ class ProxiedMessage(Message):
|
||||
if serializer.AS_HEX and isinstance(var_val, int):
|
||||
var_data = hex(var_val)
|
||||
if serializer.ORIG_INLINE:
|
||||
string += f" #{var_data}\n"
|
||||
string += f" #{var_data}"
|
||||
return string
|
||||
else:
|
||||
string += "\n"
|
||||
@@ -146,7 +162,7 @@ class ProxiedMessage(Message):
|
||||
if "CircuitCode" in var_name or ("Code" in var_name and "Circuit" in block.name):
|
||||
if var_val == replacements.get("CIRCUIT_CODE"):
|
||||
var_data = "[[CIRCUIT_CODE]]"
|
||||
string += f" {field_prefix}{var_name} = {var_data}\n"
|
||||
string += f" {field_prefix}{var_name} = {var_data}"
|
||||
return string
|
||||
|
||||
@staticmethod
|
||||
|
||||
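The `SpannedString` change above records where each variable landed in the rendered text so the UI can highlight exactly the matching field. A self-contained sketch of the idea — `render` and its field dict are invented for illustration:

```python
from typing import Dict, Tuple

TextSpan = Tuple[int, int]
SpanDict = Dict[Tuple[str, ...], TextSpan]

class SpannedString(str):
    """A str that remembers where each logical field landed in the text."""
    spans: SpanDict = {}

def render(fields: Dict[str, str]) -> SpannedString:
    spans: SpanDict = {}
    string = ""
    for name, val in fields.items():
        start = len(string)
        string += f"{name} = {val}"
        # Record the (start, end) offsets before appending the newline
        spans[(name,)] = (start, len(string))
        string += "\n"
    spanned = SpannedString(string)
    spanned.spans = spans
    return spanned

s = render({"AgentID": "abc", "SessionID": "def"})
start, end = s.spans[("SessionID",)]
```

Because `SpannedString` subclasses `str`, existing callers that expect a plain string keep working; only span-aware callers look at `.spans`.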
@@ -11,6 +11,9 @@ def literal():
        # Nightmare. str or bytes literal.
        # https://stackoverflow.com/questions/14366401/#comment79795017_14366904
        RegExMatch(r'''b?(\"\"\"|\'\'\'|\"|\')((?<!\\)(\\\\)*\\\1|.)*?\1'''),
        # base16
        RegExMatch(r'0x\d+'),
        # base10 int or float.
        RegExMatch(r'\d+(\.\d+)?'),
        "None",
        "True",

@@ -23,7 +26,7 @@ def literal():

def identifier():
    return RegExMatch(r'[a-zA-Z*]([a-zA-Z0-9*]+)?')
    return RegExMatch(r'[a-zA-Z*]([a-zA-Z0-9_*]+)?')


def field_specifier():

@@ -42,12 +45,16 @@ def meta_field_specifier():
    return "Meta", ".", identifier


def enum_field_specifier():
    return identifier, ".", identifier


def compare_val():
    return [literal, meta_field_specifier]
    return [literal, meta_field_specifier, enum_field_specifier]


def binary_expression():
    return field_specifier, ["==", "!=", "^=", "$=", "~=", ">", ">=", "<", "<="], compare_val
    return field_specifier, ["==", "!=", "^=", "$=", "~=", ">", ">=", "<", "<=", "&"], compare_val


def term():

@@ -62,9 +69,12 @@ def message_filter():
    return expression, EOF
MATCH_RESULT = typing.Union[bool, typing.Tuple]


class BaseFilterNode(abc.ABC):
    @abc.abstractmethod
    def match(self, msg) -> bool:
    def match(self, msg) -> MATCH_RESULT:
        raise NotImplementedError()

    @property

@@ -94,17 +104,17 @@ class BinaryFilterNode(BaseFilterNode, abc.ABC):


class UnaryNotFilterNode(UnaryFilterNode):
    def match(self, msg) -> bool:
    def match(self, msg) -> MATCH_RESULT:
        return not self.node.match(msg)


class OrFilterNode(BinaryFilterNode):
    def match(self, msg) -> bool:
    def match(self, msg) -> MATCH_RESULT:
        return self.left_node.match(msg) or self.right_node.match(msg)


class AndFilterNode(BinaryFilterNode):
    def match(self, msg) -> bool:
    def match(self, msg) -> MATCH_RESULT:
        return self.left_node.match(msg) and self.right_node.match(msg)


@@ -114,7 +124,7 @@ class MessageFilterNode(BaseFilterNode):
        self.operator = operator
        self.value = value

    def match(self, msg) -> bool:
    def match(self, msg) -> MATCH_RESULT:
        return msg.matches(self)
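The widened `MATCH_RESULT` return type works because leaf matchers can return either `False` or a truthy span tuple, and Python's `and`/`or` propagate whichever operand decides the result — so the boolean combinators above keep behaving like booleans while carrying the span of the match through. A small demonstration, with the span key tuple made up for illustration:

```python
from typing import Tuple, Union

MATCH_RESULT = Union[bool, Tuple]

def or_match(left: MATCH_RESULT, right: MATCH_RESULT) -> MATCH_RESULT:
    # `or` returns the first truthy operand, so a span tuple survives intact
    return left or right

def and_match(left: MATCH_RESULT, right: MATCH_RESULT) -> MATCH_RESULT:
    # `and` returns the first falsy operand, or the last operand if all truthy
    return left and right

span = ("ChatFromViewer", "ChatData", 0, "Message")
```

A non-empty tuple is always truthy, so callers that only care about "did it match?" can keep treating the result as a bool.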
    @property

@@ -126,6 +136,11 @@ class MetaFieldSpecifier(str):
    pass


class EnumFieldSpecifier(typing.NamedTuple):
    enum_name: str
    field_name: str


class LiteralValue:
    """Only exists because we can't return `None` in a visitor, need to box it"""
    def __init__(self, value):

@@ -145,6 +160,9 @@ class MessageFilterVisitor(PTNodeVisitor):
    def visit_meta_field_specifier(self, _node, children):
        return MetaFieldSpecifier(children[0])

    def visit_enum_field_specifier(self, _node, children):
        return EnumFieldSpecifier(*children)

    def visit_unary_field_specifier(self, _node, children):
        # Looks like a bare field specifier with no operator
        return MessageFilterNode(tuple(children), None, None)
@@ -15,7 +15,8 @@ from defusedxml import minidom
from hippolyzer.lib.base import serialization as se, llsd
from hippolyzer.lib.base.datatypes import TaggedUnion, UUID, TupleCoord
from hippolyzer.lib.base.helpers import bytes_escape
from hippolyzer.lib.proxy.message_filter import MetaFieldSpecifier, compile_filter, BaseFilterNode, MessageFilterNode
from hippolyzer.lib.proxy.message_filter import MetaFieldSpecifier, compile_filter, BaseFilterNode, MessageFilterNode, \
    EnumFieldSpecifier
from hippolyzer.lib.proxy.region import CapType

if typing.TYPE_CHECKING:

@@ -254,6 +255,11 @@ class AbstractMessageLogEntry:
                expected = expected()
            else:
                expected = str(expected)
        elif isinstance(expected, EnumFieldSpecifier):
            # Local import so we get a fresh copy of the templates module
            from hippolyzer.lib.proxy import templates
            enum_cls = getattr(templates, expected.enum_name)
            expected = enum_cls[expected.field_name]
        elif expected is not None:
            # Unbox the expected value
            expected = expected.value

@@ -286,6 +292,8 @@ class AbstractMessageLogEntry:
            return val > expected
        elif operator == ">=":
            return val >= expected
        elif operator == "&":
            return val & expected
        else:
            raise ValueError(f"Unexpected operator {operator!r}")
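The new `&` operator branch above maps a filter-language operator onto Python's bitwise AND, which makes testing individual flag bits possible in filters. A standalone sketch of the dispatch (only a subset of the real operator list, and `UpdateFlags` is an invented example enum):

```python
import enum

def val_matches(operator: str, val, expected):
    # Filter operators map directly onto Python comparisons;
    # "&" exposes bitwise AND so filters can test single flag bits.
    if operator == "==":
        return val == expected
    elif operator == "!=":
        return val != expected
    elif operator == ">":
        return val > expected
    elif operator == ">=":
        return val >= expected
    elif operator == "&":
        return val & expected
    else:
        raise ValueError(f"Unexpected operator {operator!r}")

class UpdateFlags(enum.IntFlag):
    PHYSICS = 1
    PHANTOM = 2

hit = val_matches("&", UpdateFlags.PHYSICS | UpdateFlags.PHANTOM, UpdateFlags.PHANTOM)
miss = val_matches("&", UpdateFlags.PHYSICS, UpdateFlags.PHANTOM)
```

Note `&` returns the intersected flag value rather than a strict bool; it is truthy exactly when the tested bit is set, which is all the filter machinery needs.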
@@ -359,8 +367,8 @@ class HTTPMessageLogEntry(AbstractMessageLogEntry):
        cap_name = cap_data and cap_data.cap_name
        base_url = cap_name and cap_data.base_url
        temporary_cap = cap_data and cap_data.type == CapType.TEMPORARY
        beautify_url = (beautify and base_url and cap_name and
                        not temporary_cap and self.session and want_request)
        beautify_url = (beautify and base_url and cap_name
                        and not temporary_cap and self.session and want_request)
        if want_request:
            buf.write(message.method)
            buf.write(" ")

@@ -546,7 +554,6 @@ class LLUDPMessageLogEntry(AbstractMessageLogEntry):
        # These are expensive to keep around. pickle them and un-pickle on
        # an as-needed basis.
        self._deserializer = self.message.deserializer
        self.message.deserializer = None
        self._frozen_message = pickle.dumps(self._message, protocol=pickle.HIGHEST_PROTOCOL)
        self._message = None

@@ -586,15 +593,19 @@ class LLUDPMessageLogEntry(AbstractMessageLogEntry):
        for block_name in message.blocks:
            if not fnmatch.fnmatchcase(block_name, matcher.selector[1]):
                continue
            for block in message[block_name]:
            for block_num, block in enumerate(message[block_name]):
                for var_name in block.vars.keys():
                    if not fnmatch.fnmatchcase(var_name, matcher.selector[2]):
                        continue
                    # So we know where the match happened
                    span_key = (message.name, block_name, block_num, var_name)
                    if selector_len == 3:
                        # We're just matching on the var existing, not having any particular value
                        if matcher.value is None:
                            return True
                            return span_key
                        if self._val_matches(matcher.operator, block[var_name], matcher.value):
                            return True
                            return span_key
                    # Need to invoke a special unpacker
                    elif selector_len == 4:
                        try:
                            deserialized = block.deserialize_var(var_name)

@@ -608,9 +619,9 @@ class LLUDPMessageLogEntry(AbstractMessageLogEntry):
                        for key in deserialized.keys():
                            if fnmatch.fnmatchcase(str(key), matcher.selector[3]):
                                if matcher.value is None:
                                    return True
                                    return span_key
                                if self._val_matches(matcher.operator, deserialized[key], matcher.value):
                                    return True
                                    return span_key

        return False
hippolyzer/lib/proxy/namecache.py (new file, +41)
@@ -0,0 +1,41 @@
from __future__ import annotations

import dataclasses
from typing import *

from hippolyzer.lib.base.datatypes import UUID

if TYPE_CHECKING:
    from hippolyzer.lib.proxy.message import ProxiedMessage


@dataclasses.dataclass
class NameCacheEntry:
    FirstName: Optional[str] = None
    LastName: Optional[str] = None
    DisplayName: Optional[str] = None


class NameCache:
    # TODO: persist this somewhere across runs
    def __init__(self):
        self._cache: Dict[UUID, NameCacheEntry] = {}

    def lookup(self, uuid: UUID) -> Optional[NameCacheEntry]:
        return self._cache.get(uuid)

    def update(self, uuid: UUID, vals: dict):
        # upsert the cache entry
        entry = self._cache.get(uuid) or NameCacheEntry()
        entry.LastName = vals.get("LastName") or entry.LastName
        entry.FirstName = vals.get("FirstName") or entry.FirstName
        entry.DisplayName = vals.get("DisplayName") or entry.DisplayName
        self._cache[uuid] = entry

    def handle_uuid_name_reply(self, msg: ProxiedMessage):
        """UUID lookup reply handler to be registered by regions"""
        for block in msg.blocks["UUIDNameBlock"]:
            self.update(block["ID"], {
                "FirstName": block["FirstName"],
                "LastName": block["LastName"],
            })
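The upsert logic in the new `NameCache` only overwrites fields that arrived in the latest reply, so partial replies (legacy name vs. display name) accumulate rather than clobber each other. A runnable sketch using the stdlib `uuid.UUID` in place of Hippolyzer's own `UUID` type:

```python
import dataclasses
import uuid
from typing import Dict, Optional

@dataclasses.dataclass
class NameCacheEntry:
    FirstName: Optional[str] = None
    LastName: Optional[str] = None
    DisplayName: Optional[str] = None

class NameCache:
    def __init__(self):
        self._cache: Dict[uuid.UUID, NameCacheEntry] = {}

    def lookup(self, key: uuid.UUID) -> Optional[NameCacheEntry]:
        return self._cache.get(key)

    def update(self, key: uuid.UUID, vals: dict):
        # Upsert: keep any previously-known fields the new reply omitted
        entry = self._cache.get(key) or NameCacheEntry()
        entry.FirstName = vals.get("FirstName") or entry.FirstName
        entry.LastName = vals.get("LastName") or entry.LastName
        entry.DisplayName = vals.get("DisplayName") or entry.DisplayName
        self._cache[key] = entry

agent = uuid.uuid4()
cache = NameCache()
cache.update(agent, {"FirstName": "Test", "LastName": "Resident"})
cache.update(agent, {"DisplayName": "Tester"})  # earlier fields survive
entry = cache.lookup(agent)
```

The `or` fallbacks mean an explicit empty string also leaves the old value in place, which is the desired behavior for name replies.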
@@ -2,13 +2,15 @@ from __future__ import annotations

import collections
import copy
import enum
import logging
import math
import typing
import weakref
from typing import *

from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.datatypes import UUID, TaggedUnion
from hippolyzer.lib.base.datatypes import UUID, TaggedUnion, Vector3
from hippolyzer.lib.base.helpers import proxify
from hippolyzer.lib.base.message.message import Block
from hippolyzer.lib.base.namevalue import NameValueCollection

@@ -16,6 +18,7 @@ from hippolyzer.lib.base.objects import Object
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.message import ProxiedMessage
from hippolyzer.lib.proxy.namecache import NameCache
from hippolyzer.lib.proxy.templates import PCode, ObjectStateSerializer

if TYPE_CHECKING:
@@ -44,8 +47,8 @@ class OrphanManager:
        del self._orphans[parent_id]
        return removed

    def collect_orphans(self, parent: Object) -> typing.Sequence[int]:
        return self._orphans.pop(parent.LocalID, [])
    def collect_orphans(self, parent_localid: int) -> typing.Sequence[int]:
        return self._orphans.pop(parent_localid, [])

    def track_orphan(self, obj: Object):
        self.track_orphan_by_id(obj.LocalID, obj.ParentID)
@@ -59,21 +62,83 @@ class OrphanManager:
OBJECT_OR_LOCAL = typing.Union[Object, int]


class LocationType(enum.IntEnum):
    COARSE = enum.auto()
    EXACT = enum.auto()


class Avatar:
    """Wrapper for an avatar known through ObjectUpdate or CoarseLocationUpdate"""
    def __init__(
            self,
            full_id: UUID,
            obj: Optional["Object"] = None,
            coarse_location: Optional[Vector3] = None,
            resolved_name: Optional[str] = None,
    ):
        self.FullID: UUID = full_id
        self.Object: Optional["Object"] = obj
        self._coarse_location = coarse_location
        self._resolved_name = resolved_name

    @property
    def LocationType(self) -> "LocationType":
        if self.Object:
            return LocationType.EXACT
        return LocationType.COARSE

    @property
    def RegionPosition(self) -> Vector3:
        if self.Object:
            return self.Object.RegionPosition
        if self._coarse_location is not None:
            return self._coarse_location
        raise ValueError(f"Avatar {self.FullID} has no known position")

    @property
    def Name(self) -> Optional[str]:
        if self.Object:
            nv: Dict[str, str] = self.Object.NameValue.to_dict()
            return f"{nv['FirstName']} {nv['LastName']}"
        return self._resolved_name
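The `Avatar` wrapper above prefers exact data from an `ObjectUpdate` and degrades gracefully to coarse data, failing loudly only when neither is known. A simplified stand-in (plain tuples instead of the library's `Object` and `Vector3` types; all names illustrative):

```python
import enum
from typing import Optional, Tuple

Position = Tuple[float, float, float]

class LocationType(enum.IntEnum):
    COARSE = enum.auto()
    EXACT = enum.auto()

class Avatar:
    def __init__(self, full_id: str,
                 exact_position: Optional[Position] = None,
                 coarse_location: Optional[Position] = None):
        self.FullID = full_id
        self._exact_position = exact_position
        self._coarse_location = coarse_location

    @property
    def LocationType(self) -> LocationType:
        return LocationType.EXACT if self._exact_position else LocationType.COARSE

    @property
    def RegionPosition(self) -> Position:
        # Prefer the exact position from an ObjectUpdate; fall back to the
        # low-resolution CoarseLocationUpdate position; fail loudly otherwise.
        if self._exact_position:
            return self._exact_position
        if self._coarse_location is not None:
            return self._coarse_location
        raise ValueError(f"Avatar {self.FullID} has no known position")

av = Avatar("some-agent-id", coarse_location=(10.0, 20.0, 100.0))
```

Raising instead of returning `None` keeps callers from silently treating "unknown" as the region origin.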
class ObjectManager:
    """Object manager for a specific region"""
    """
    Object manager for a specific region

    TODO: This model does not make sense given how region->region object handoff works.
     The ObjectManager has to notice when an ObjectUpdate for an object came from a
     new region and update the associated region itself. It will not receive a KillObject
     from the old region in the case of physical region crossings. Right now this means
     physical objects or agents that physically cross a sim border get dangling object
     references. This is not the case when they teleport, even across a small distance
     to a neighbor, as that will send a KillObject in the old sim.
     Needs to switch to one manager managing objects for a full session rather than one
     manager per region.
    """

    def __init__(self, region: ProxiedRegion):
        self._localid_lookup: typing.Dict[int, Object] = {}
        self._fullid_lookup: typing.Dict[UUID, int] = {}
        self._coarse_locations: typing.Dict[UUID, Vector3] = {}
        # Objects that we've seen references to but don't have data for
        self.missing_locals = set()
        self._region: ProxiedRegion = proxify(region)
        self._orphan_manager = OrphanManager()
        name_cache = None
        session = self._region.session()
        if session:
            name_cache = session.session_manager.name_cache
        # Use a local namecache if we don't have a session manager
        self.name_cache: Optional[NameCache] = name_cache or NameCache()

        message_handler = region.message_handler
        message_handler.subscribe("ObjectUpdate", self._handle_object_update)
        message_handler.subscribe("ImprovedTerseObjectUpdate",
                                  self._handle_terse_object_update)
        message_handler.subscribe("CoarseLocationUpdate",
                                  self._handle_coarse_location_update)
        message_handler.subscribe("ObjectUpdateCompressed",
                                  self._handle_object_update_compressed)
        message_handler.subscribe("ObjectUpdateCached",
@@ -87,17 +152,35 @@ class ObjectManager:
        message_handler.subscribe("KillObject",
                                  self._handle_kill_object)

    def __len__(self):
        return len(self._localid_lookup)

    @property
    def all_objects(self) -> typing.Iterable[Object]:
        return self._localid_lookup.values()

    @property
    def all_avatars(self) -> typing.Iterable[Object]:
        # This is only avatars within draw distance. Might be useful to have another
        # accessor for UUID + pos that's based on CoarseLocationUpdate.
        return (o for o in self.all_objects if o.PCode == PCode.AVATAR)
    def all_avatars(self) -> typing.Iterable[Avatar]:
        av_objects = {o.FullID: o for o in self.all_objects if o.PCode == PCode.AVATAR}
        all_ids = set(av_objects.keys()) | self._coarse_locations.keys()

    def lookup_localid(self, localid) -> typing.Optional[Object]:
        avatars: List[Avatar] = []
        for av_id in all_ids:
            av_obj = av_objects.get(av_id)
            coarse_location = self._coarse_locations.get(av_id)

            resolved_name = None
            if namecache_entry := self.name_cache.lookup(av_id):
                resolved_name = f"{namecache_entry.FirstName} {namecache_entry.LastName}"
            avatars.append(Avatar(
                full_id=av_id,
                coarse_location=coarse_location,
                obj=av_obj,
                resolved_name=resolved_name,
            ))
        return avatars

    def lookup_localid(self, localid: int) -> typing.Optional[Object]:
        return self._localid_lookup.get(localid, None)
    def lookup_fullid(self, fullid: UUID) -> typing.Optional[Object]:

@@ -106,7 +189,13 @@ class ObjectManager:
            return None
        return self.lookup_localid(local_id)

    def _track_object(self, obj: Object):
    def lookup_avatar(self, fullid: UUID) -> typing.Optional[Avatar]:
        for avatar in self.all_avatars:
            if avatar.FullID == fullid:
                return avatar
        return None

    def _track_object(self, obj: Object, notify: bool = True):
        self._localid_lookup[obj.LocalID] = obj
        self._fullid_lookup[obj.FullID] = obj.LocalID
        # If it was missing, it's not missing anymore.

@@ -115,13 +204,34 @@ class ObjectManager:
        self._parent_object(obj)

        # Adopt any of our orphaned child objects.
        for orphan_local in self._orphan_manager.collect_orphans(obj):
        for orphan_local in self._orphan_manager.collect_orphans(obj.LocalID):
            child_obj = self.lookup_localid(orphan_local)
            # Shouldn't be any dead children in the orphanage
            assert child_obj is not None
            self._parent_object(child_obj)

        self._notify_object_updated(obj, set(obj.to_dict().keys()))
        if notify:
            self._run_object_update_hooks(obj, set(obj.to_dict().keys()))

    def _untrack_object(self, obj: Object):
        former_child_ids = obj.ChildIDs[:]
        for child_id in former_child_ids:
            child_obj = self.lookup_localid(child_id)
            assert child_obj is not None
            self._unparent_object(child_obj, child_obj.ParentID)

        # Place any remaining unkilled children in the orphanage
        for child_id in former_child_ids:
            self._orphan_manager.track_orphan_by_id(child_id, obj.LocalID)

        assert not obj.ChildIDs

        # Make sure the parent knows we went away
        self._unparent_object(obj, obj.ParentID)

        # Do this last in case we only have a weak reference
        del self._fullid_lookup[obj.FullID]
        del self._localid_lookup[obj.LocalID]
    def _parent_object(self, obj: Object, insert_at_head=False):
        if obj.ParentID:

@@ -163,9 +273,27 @@ class ObjectManager:

    def _update_existing_object(self, obj: Object, new_properties):
        new_parent_id = new_properties.get("ParentID", obj.ParentID)

        actually_updated_props = set()

        if obj.LocalID != new_properties.get("LocalID", obj.LocalID):
            # Our LocalID changed, and we deal with linkages to other prims by
            # LocalID association. Break any links since our LocalID is changing.
            # Could happen if we didn't mark an attachment prim dead and the parent agent
            # came back into the sim. Attachment FullIDs do not change across TPs,
            # LocalIDs do. This at least lets us partially recover from the bad state.
            # Currently known to happen due to physical region crossings, so only debug.
            new_localid = new_properties["LocalID"]
            LOG.debug(f"Got an update with new LocalID for {obj.FullID}, {obj.LocalID} != {new_localid}. "
                      f"May have mishandled a KillObject for a prim that left and re-entered region.")
            self._untrack_object(obj)
            obj.LocalID = new_localid
            self._track_object(obj, notify=False)
            actually_updated_props |= {"LocalID"}

        old_parent_id = obj.ParentID

        actually_updated_props = obj.update_properties(new_properties)
        actually_updated_props |= obj.update_properties(new_properties)

        if new_parent_id != old_parent_id:
            self._unparent_object(obj, old_parent_id)

@@ -174,7 +302,7 @@ class ObjectManager:
        # Common case where this may be falsy is if we get an ObjectUpdateCached
        # that didn't have a changed UpdateFlags field.
        if actually_updated_props:
            self._notify_object_updated(obj, actually_updated_props)
            self._run_object_update_hooks(obj, actually_updated_props)

    def _normalize_object_update(self, block: Block):
        object_data = {
@@ -254,6 +382,24 @@ class ObjectManager:
|
||||
|
||||
packet.meta["ObjectUpdateIDs"] = tuple(seen_locals)
|
||||
|
||||
def _handle_coarse_location_update(self, packet: ProxiedMessage):
|
||||
self._coarse_locations.clear()
|
||||
|
||||
coarse_locations: typing.Dict[UUID, Vector3] = {}
|
||||
for agent_block, location_block in zip(packet["AgentData"], packet["Location"]):
|
||||
x, y, z = location_block["X"], location_block["Y"], location_block["Z"]
|
||||
coarse_locations[agent_block["AgentID"]] = Vector3(
|
||||
X=x,
|
||||
Y=y,
|
||||
# The z-axis is multiplied by 4 to obtain true Z location
|
||||
# The z-axis is also limited to 1020m in height
|
||||
# If z == 255 then the true Z is unknown.
|
||||
# http://wiki.secondlife.com/wiki/CoarseLocationUpdate
|
||||
Z=z * 4 if z != 255 else math.inf,
|
||||
)
|
||||
|
||||
self._coarse_locations.update(coarse_locations)
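Editorial aside (not part of the diff): the Z decoding used by `_handle_coarse_location_update` above can be sketched standalone. `decode_coarse_z` is a hypothetical helper name; the 4m quantization, 1020m ceiling, and 255 sentinel come from the comments and the CoarseLocationUpdate wiki page cited in the code.

```python
import math

def decode_coarse_z(packed_z: int) -> float:
    """Decode the U8 Z field of a CoarseLocationUpdate block.

    The wire value is the true Z divided by 4 (so the representable
    ceiling is 255 * 4 = 1020m), and 255 is a sentinel meaning the
    true height is unknown / above the representable range.
    """
    if packed_z == 255:
        return math.inf  # only ">= 1020m" is knowable
    return packed_z * 4.0
```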

     def _handle_object_update_cached(self, packet: ProxiedMessage):
         seen_locals = []
         for block in packet['ObjectData']:
@@ -307,8 +453,8 @@ class ObjectManager:
         seen_locals = []
         for block in packet['ObjectData']:
             object_data = self._normalize_object_update_compressed(block)
-            obj = self.lookup_localid(object_data["LocalID"])
             seen_locals.append(object_data["LocalID"])
+            obj = self.lookup_localid(object_data["LocalID"])
             if obj:
                 self._update_existing_object(obj, object_data)
             else:
@@ -334,33 +480,38 @@ class ObjectManager:
     def _handle_kill_object(self, packet: ProxiedMessage):
         seen_locals = []
         for block in packet["ObjectData"]:
-            obj = self.lookup_localid(block["ID"])
+            self._kill_object_by_local_id(block["ID"])
             seen_locals.append(block["ID"])
-            self.missing_locals -= {block["ID"]}
-            if obj:
-                AddonManager.handle_object_killed(self._region.session(), self._region, obj)
-
-                former_child_ids = obj.ChildIDs[:]
-                for child_id in former_child_ids:
-                    child_obj = self.lookup_localid(child_id)
-                    assert child_obj is not None
-                    self._unparent_object(child_obj, child_obj.ParentID)
-
-                del self._localid_lookup[obj.LocalID]
-                del self._fullid_lookup[obj.FullID]
-
-                # Place any remaining unkilled children in the orphanage
-                for child_id in former_child_ids:
-                    self._orphan_manager.track_orphan_by_id(child_id, obj.LocalID)
-
-                assert not obj.ChildIDs
-
-                # Make sure the parent knows we went away
-                self._unparent_object(obj, obj.ParentID)
-            else:
-                logging.debug(f"Received {packet.name} for unknown {block['ID']}")
         packet.meta["ObjectUpdateIDs"] = tuple(seen_locals)

+    def _kill_object_by_local_id(self, local_id: int):
+        obj = self.lookup_localid(local_id)
+        self.missing_locals -= {local_id}
+        child_ids: Sequence[int]
+        if obj:
+            AddonManager.handle_object_killed(self._region.session(), self._region, obj)
+            child_ids = obj.ChildIDs
+        else:
+            LOG.debug(f"Tried to kill unknown object {local_id}")
+            # If it had any orphans, they need to die.
+            child_ids = self._orphan_manager.collect_orphans(local_id)
+
+        # KillObject implicitly kills descendents
+        # This may mutate child_ids, use the reversed iterator so we don't
+        # invalidate the iterator during removal.
+        for child_id in reversed(child_ids):
+            # indra special-cases avatar PCodes and doesn't mark them dead
+            # due to cascading kill. Is this correct? Do avatars require
+            # explicit kill?
+            child_obj = self.lookup_localid(child_id)
+            if child_obj and child_obj.PCode == PCode.AVATAR:
+                continue
+            self._kill_object_by_local_id(child_id)
+
+        # Have to do this last, since untracking will clear child IDs
+        if obj:
+            self._untrack_object(obj)
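Editorial aside: the traversal in `_kill_object_by_local_id` above iterates `reversed(child_ids)` so that removals performed during recursion don't shift unvisited entries out from under the iterator. That trick can be modeled with plain dicts; every name below is illustrative, not a Hippolyzer API.

```python
# Toy object graph: local ID -> child ID list (stand-in for the manager's
# lookup tables). Object 1 is the root prim of a 4-prim linkset.
objects = {
    1: {"children": [2, 3]},
    2: {"children": []},
    3: {"children": [4]},
    4: {"children": []},
}

def kill(local_id: int, killed: list) -> None:
    obj = objects.get(local_id)
    if obj is None:
        return
    # Reverse iteration: when a recursive call removes a child from this
    # list, the reversed iterator's next index still points at an
    # unvisited element instead of skipping one.
    for child_id in reversed(obj["children"]):
        kill(child_id, killed)
    killed.append(local_id)
    # Unlink ourselves from any parent that still references us
    for other in objects.values():
        if local_id in other["children"]:
            other["children"].remove(local_id)
    del objects[local_id]

killed = []
kill(1, killed)
# Children die before their parents; nothing is skipped despite mutation.
```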

     def _handle_get_object_cost(self, flow: HippoHTTPFlow):
         parsed = llsd.parse_xml(flow.response.content)
         if "error" in parsed:
@@ -371,14 +522,18 @@ class ObjectManager:
                 LOG.debug(f"Received ObjectCost for unknown {object_id}")
                 continue
             obj.ObjectCosts.update(object_costs)
-            self._notify_object_updated(obj, {"ObjectCosts"})
+            self._run_object_update_hooks(obj, {"ObjectCosts"})

-    def _notify_object_updated(self, obj: Object, updated_props: Set[str]):
+    def _run_object_update_hooks(self, obj: Object, updated_props: Set[str]):
+        if obj.PCode == PCode.AVATAR and "NameValue" in updated_props:
+            if obj.NameValue:
+                self.name_cache.update(obj.FullID, obj.NameValue.to_dict())
         AddonManager.handle_object_updated(self._region.session(), self._region, obj, updated_props)

     def clear(self):
         self._localid_lookup.clear()
         self._fullid_lookup.clear()
         self._coarse_locations.clear()
+        self._orphan_manager.clear()
         self.missing_locals.clear()
@@ -14,6 +14,7 @@ from hippolyzer.lib.base.datatypes import Vector3
 from hippolyzer.lib.base.message.message_handler import MessageHandler
 from hippolyzer.lib.proxy.caps_client import CapsClient
 from hippolyzer.lib.proxy.circuit import ProxiedCircuit
+from hippolyzer.lib.proxy.namecache import NameCache
 from hippolyzer.lib.proxy.objects import ObjectManager
 from hippolyzer.lib.proxy.transfer_manager import TransferManager
 from hippolyzer.lib.proxy.xfer_manager import XferManager
@@ -60,6 +61,9 @@ class ProxiedRegion:
         self.transfer_manager = TransferManager(self)
         self.caps_client = CapsClient(self)
         self.objects = ObjectManager(self)
+        if session:
+            name_cache: NameCache = session.session_manager.name_cache
+            self.message_handler.subscribe("UUIDNameReply", name_cache.handle_uuid_name_reply)

     @property
     def name(self):
@@ -103,6 +107,9 @@ class ProxiedRegion:
         seed_id = self._caps["Seed"][1].split("/")[-1].encode("utf8")
         # Give it a unique domain tied to the current Seed URI
         parsed[1] = f"{name}-{hashlib.sha256(seed_id).hexdigest()[:16]}.hippo-proxy.localhost"
+        # Force the URL to HTTP, we're going to handle the request ourselves so it doesn't need
+        # to be secure. This should save on expensive TLS context setup for each req.
+        parsed[0] = "http"
         wrapper_url = urllib.parse.urlunsplit(parsed)
         self._caps.add(name + "ProxyWrapper", (CapType.WRAPPER, wrapper_url))
         return wrapper_url
@@ -12,6 +12,7 @@ from hippolyzer.lib.base.datatypes import UUID
 from hippolyzer.lib.proxy.circuit import ProxiedCircuit
 from hippolyzer.lib.proxy.http_asset_repo import HTTPAssetRepo
 from hippolyzer.lib.proxy.http_proxy import HTTPFlowContext, is_asset_server_cap_name, SerializedCapData
+from hippolyzer.lib.proxy.namecache import NameCache
 from hippolyzer.lib.proxy.region import ProxiedRegion, CapType

 if TYPE_CHECKING:
@@ -151,6 +152,7 @@ class SessionManager:
         self.asset_repo = HTTPAssetRepo()
         self.message_logger: Optional[BaseMessageLogger] = None
         self.addon_ctx: Dict[str, Any] = {}
+        self.name_cache = NameCache()

     def create_session(self, login_data) -> Session:
         session = Session.from_login_data(login_data, self)
@@ -162,8 +162,8 @@ class UDPProxyProtocol(asyncio.DatagramProtocol):
             data = data[4:]
         elif address_type == 3:  # Domain name
             domain_length = data[0]
-            address = data[1:1+domain_length]
-            data = data[1+domain_length:]
+            address = data[1:1 + domain_length]
+            data = data[1 + domain_length:]
         else:
             logging.error("Don't understand addr type %d" % address_type)
             return None
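Editorial aside: the `address_type == 3` branch above parses a SOCKS5 ATYP=0x03 (domain name) address, which RFC 1928 defines as a 1-octet length followed by that many octets of hostname. A standalone sketch (`parse_socks5_domain` is a hypothetical helper name):

```python
def parse_socks5_domain(data: bytes) -> tuple:
    """Split a SOCKS5 ATYP=3 (domain) address off the front of `data`.

    Per RFC 1928 the field is one length octet followed by that many
    octets of hostname; whatever follows (e.g. the 2-byte port) is
    returned untouched as the remainder.
    """
    domain_length = data[0]
    address = data[1:1 + domain_length]
    return address, data[1 + domain_length:]
```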
@@ -8,7 +8,7 @@ from typing import *

 import hippolyzer.lib.base.serialization as se
 from hippolyzer.lib.base import llsd
-from hippolyzer.lib.base.datatypes import UUID
+from hippolyzer.lib.base.datatypes import UUID, IntEnum, IntFlag
 from hippolyzer.lib.base.namevalue import NameValuesSerializer

 try:
@@ -25,7 +25,7 @@ except:
 @se.enum_field_serializer("RezObject", "InventoryData", "Type")
 @se.enum_field_serializer("RezScript", "InventoryBlock", "Type")
 @se.enum_field_serializer("UpdateTaskInventory", "InventoryData", "Type")
-class AssetType(enum.IntEnum):
+class AssetType(IntEnum):
     TEXTURE = 0
     SOUND = 1
     CALLINGCARD = 2
@@ -103,7 +103,7 @@ class AssetType(enum.IntEnum):
 @se.enum_field_serializer("RezObject", "InventoryData", "InvType")
 @se.enum_field_serializer("RezScript", "InventoryBlock", "InvType")
 @se.enum_field_serializer("UpdateTaskInventory", "InventoryData", "InvType")
-class InventoryType(enum.IntEnum):
+class InventoryType(IntEnum):
     TEXTURE = 0
     SOUND = 1
     CALLINGCARD = 2
@@ -143,7 +143,7 @@ class InventoryType(enum.IntEnum):
 @se.enum_field_serializer("AgentIsNowWearing", "WearableData", "WearableType")
 @se.enum_field_serializer("AgentWearablesUpdate", "WearableData", "WearableType")
 @se.enum_field_serializer("CreateInventoryItem", "InventoryBlock", "WearableType")
-class WearableType(enum.IntEnum):
+class WearableType(IntEnum):
     SHAPE = 0
     SKIN = 1
     HAIR = 2
@@ -180,7 +180,7 @@ def _register_permissions_flags(message_name, block_name):
 @_register_permissions_flags("RezObject", "InventoryData")
 @_register_permissions_flags("RezScript", "InventoryBlock")
 @_register_permissions_flags("RezMultipleAttachmentsFromInv", "ObjectData")
-class Permissions(enum.IntFlag):
+class Permissions(IntFlag):
     TRANSFER = (1 << 13)
     MODIFY = (1 << 14)
     COPY = (1 << 15)
@@ -202,7 +202,7 @@ class Permissions(enum.IntFlag):
 @se.flag_field_serializer("RezObject", "InventoryData", "Flags")
 @se.flag_field_serializer("UpdateCreateInventoryItem", "InventoryData", "Flags")
 @se.flag_field_serializer("UpdateTaskInventory", "InventoryData", "Flags")
-class InventoryItemFlags(enum.IntFlag):
+class InventoryItemFlags(IntFlag):
     # The asset has only one reference in the system. If the
     # inventory item is deleted, or the assetid updated, then we
     # can remove the old reference.
@@ -233,7 +233,7 @@ class InventoryItemFlags(enum.IntFlag):


 @se.enum_field_serializer("ObjectPermissions", "ObjectData", "Field")
-class PermissionType(enum.IntEnum):
+class PermissionType(IntEnum):
     BASE = 0x01
     OWNER = 0x02
     GROUP = 0x04
@@ -242,7 +242,7 @@ class PermissionType(enum.IntEnum):


 @se.enum_field_serializer("TransferRequest", "TransferInfo", "SourceType")
-class TransferSourceType(enum.IntEnum):
+class TransferSourceType(IntEnum):
     UNKNOWN = 0
     FILE = enum.auto()
     ASSET = enum.auto()
@@ -250,7 +250,7 @@ class TransferSourceType(enum.IntEnum):
     SIM_ESTATE = enum.auto()


-class EstateAssetType(enum.IntEnum):
+class EstateAssetType(IntEnum):
     NONE = -1
     COVENANT = 0

@@ -308,14 +308,14 @@ class TransferParamsSerializer(se.EnumSwitchedSubfieldSerializer):
 @se.enum_field_serializer("TransferPacket", "TransferData", "ChannelType")
 @se.enum_field_serializer("TransferRequest", "TransferInfo", "ChannelType")
 @se.enum_field_serializer("TransferInfo", "TransferInfo", "ChannelType")
-class TransferChannelType(enum.IntEnum):
+class TransferChannelType(IntEnum):
     UNKNOWN = 0
     MISC = enum.auto()
     ASSET = enum.auto()


 @se.enum_field_serializer("TransferInfo", "TransferInfo", "TargetType")
-class TransferTargetType(enum.IntEnum):
+class TransferTargetType(IntEnum):
     UNKNOWN = 0
     FILE = enum.auto()
     VFILE = enum.auto()
@@ -323,7 +323,7 @@ class TransferTargetType(enum.IntEnum):

 @se.enum_field_serializer("TransferInfo", "TransferInfo", "Status")
 @se.enum_field_serializer("TransferPacket", "TransferData", "Status")
-class TransferStatus(enum.IntEnum):
+class TransferStatus(IntEnum):
     OK = 0
     DONE = 1
     SKIP = 2
@@ -380,7 +380,7 @@ class TransferInfoSerializer(se.BaseSubfieldSerializer):


 @se.enum_field_serializer("RequestXfer", "XferID", "FilePath")
-class XferFilePath(enum.IntEnum):
+class XferFilePath(IntEnum):
     NONE = 0
     USER_SETTINGS = 1
     APP_SETTINGS = 2
@@ -403,7 +403,7 @@ class XferFilePath(enum.IntEnum):


 @se.enum_field_serializer("AbortXfer", "XferID", "Result")
-class XferError(enum.IntEnum):
+class XferError(IntEnum):
     FILE_EMPTY = -44
     FILE_NOT_FOUND = -43
     CANNOT_OPEN_FILE = -42
@@ -423,7 +423,7 @@ class SendXferPacketIDSerializer(se.AdapterSubfieldSerializer):


 @se.enum_field_serializer("ViewerEffect", "Effect", "Type")
-class ViewerEffectType(enum.IntEnum):
+class ViewerEffectType(IntEnum):
     TEXT = 0
     ICON = enum.auto()
     CONNECTOR = enum.auto()
@@ -445,7 +445,7 @@ class ViewerEffectType(enum.IntEnum):
     EFFECT_BLOB = enum.auto()


-class LookAtTarget(enum.IntEnum):
+class LookAtTarget(IntEnum):
     NONE = 0
     IDLE = enum.auto()
     AUTO_LISTEN = enum.auto()
@@ -459,7 +459,7 @@ class LookAtTarget(enum.IntEnum):
     CLEAR = enum.auto()


-class PointAtTarget(enum.IntEnum):
+class PointAtTarget(IntEnum):
     NONE = 0
     SELECT = enum.auto()
     GRAB = enum.auto()
@@ -499,7 +499,7 @@ class ViewerEffectDataSerializer(se.EnumSwitchedSubfieldSerializer):

 @se.enum_field_serializer("MoneyTransferRequest", "MoneyData", "TransactionType")
 @se.enum_field_serializer("MoneyBalanceReply", "TransactionInfo", "TransactionType")
-class MoneyTransactionType(enum.IntEnum):
+class MoneyTransactionType(IntEnum):
     # _many_ of these codes haven't been used in decades.
     # Money transaction failure codes
     NULL = 0
@@ -561,7 +561,7 @@ class MoneyTransactionType(enum.IntEnum):


 @se.flag_field_serializer("MoneyTransferRequest", "MoneyData", "Flags")
-class MoneyTransactionFlags(enum.IntFlag):
+class MoneyTransactionFlags(IntFlag):
     SOURCE_GROUP = 1
     DEST_GROUP = 1 << 1
     OWNER_GROUP = 1 << 2
@@ -570,7 +570,7 @@ class MoneyTransactionFlags(enum.IntFlag):


 @se.enum_field_serializer("ImprovedInstantMessage", "MessageBlock", "Dialog")
-class IMDialogType(enum.IntEnum):
+class IMDialogType(IntEnum):
     NOTHING_SPECIAL = 0
     MESSAGEBOX = 1
     GROUP_INVITATION = 3
@@ -728,7 +728,7 @@ class ObjectUpdateDataSerializer(se.SimpleSubfieldSerializer):

 @se.enum_field_serializer("ObjectUpdate", "ObjectData", "PCode")
 @se.enum_field_serializer("ObjectAdd", "ObjectData", "PCode")
-class PCode(enum.IntEnum):
+class PCode(IntEnum):
     # Should actually be a bitmask, these are just some common ones.
     PRIMITIVE = 9
     AVATAR = 47
@@ -742,7 +742,7 @@ class PCode(enum.IntEnum):
 @se.flag_field_serializer("ObjectUpdateCompressed", "ObjectData", "UpdateFlags")
 @se.flag_field_serializer("ObjectUpdateCached", "ObjectData", "UpdateFlags")
 @se.flag_field_serializer("ObjectAdd", "ObjectData", "AddFlags")
-class ObjectUpdateFlags(enum.IntFlag):
+class ObjectUpdateFlags(IntFlag):
     USE_PHYSICS = 1 << 0
     CREATE_SELECTED = 1 << 1
     OBJECT_MODIFY = 1 << 2
@@ -796,7 +796,7 @@ class AttachmentStateAdapter(se.Adapter):


 @se.flag_field_serializer("AgentUpdate", "AgentData", "State")
-class AgentState(enum.IntFlag):
+class AgentState(IntFlag):
     TYPING = 1 << 3
     EDITING = 1 << 4

@@ -836,7 +836,7 @@ class ImprovedTerseObjectUpdateDataSerializer(se.SimpleSubfieldSerializer):
 })


-class ShineLevel(enum.IntEnum):
+class ShineLevel(IntEnum):
     OFF = 0
     LOW = 1
     MEDIUM = 2
@@ -854,7 +854,7 @@ class BasicMaterials:
 BUMP_SHINY_FULLBRIGHT = se.BitfieldDataclass(BasicMaterials, se.U8)


-class TexGen(enum.IntEnum):
+class TexGen(IntEnum):
     DEFAULT = 0
     PLANAR = 0x2
     # These are unused / not supported
@@ -1056,7 +1056,7 @@ class DPTextureEntrySubfieldSerializer(se.SimpleSubfieldSerializer):
     TEMPLATE = DATA_PACKER_TE_TEMPLATE


-class TextureAnimMode(enum.IntFlag):
+class TextureAnimMode(IntFlag):
     ON = 0x01
     LOOP = 0x02
     REVERSE = 0x04
@@ -1092,7 +1092,7 @@ class TextureIDListSerializer(se.SimpleSubfieldSerializer):
     TEMPLATE = se.Collection(None, se.UUID)


-class ParticleDataFlags(enum.IntFlag):
+class ParticleDataFlags(IntFlag):
     INTERP_COLOR = 0x001
     INTERP_SCALE = 0x002
     BOUNCE = 0x004
@@ -1108,12 +1108,12 @@ class ParticleDataFlags(enum.IntFlag):
     DATA_BLEND = 0x20000


-class ParticleFlags(enum.IntFlag):
+class ParticleFlags(IntFlag):
     OBJECT_RELATIVE = 0x1
     USE_NEW_ANGLE = 0x2


-class ParticleBlendFunc(enum.IntEnum):
+class ParticleBlendFunc(IntEnum):
     ONE = 0
     ZERO = 1
     DEST_COLOR = 2
@@ -1150,7 +1150,7 @@ PDATA_BLOCK_TEMPLATE = se.Template({
 })


-class PartPattern(enum.IntFlag):
+class PartPattern(IntFlag):
     NONE = 0
     DROP = 0x1
     EXPLODE = 0x2
@@ -1199,7 +1199,7 @@ class PSBlockSerializer(se.SimpleSubfieldSerializer):


 @se.enum_field_serializer("ObjectExtraParams", "ObjectData", "ParamType")
-class ExtraParamType(enum.IntEnum):
+class ExtraParamType(IntEnum):
     FLEXIBLE = 0x10
     LIGHT = 0x20
     SCULPT = 0x30
@@ -1209,11 +1209,11 @@ class ExtraParamType(enum.IntEnum):
     EXTENDED_MESH = 0x70


-class ExtendedMeshFlags(enum.IntFlag):
+class ExtendedMeshFlags(IntFlag):
     ANIMATED_MESH = 0x1


-class SculptType(enum.IntEnum):
+class SculptType(IntEnum):
     NONE = 0
     SPHERE = 1
     TORUS = 2
@@ -1238,10 +1238,10 @@ EXTRA_PARAM_TEMPLATES = {
         "UserForce": se.IfPresent(se.Vector3),
     }),
     ExtraParamType.LIGHT: se.Template({
-        "Color": Color4(),
-        "Radius": se.F32,
-        "Cutoff": se.F32,
-        "Falloff": se.F32,
+        "Color": Color4(),
+        "Radius": se.F32,
+        "Cutoff": se.F32,
+        "Falloff": se.F32,
     }),
     ExtraParamType.SCULPT: se.Template({
         "Texture": se.UUID,
@@ -1283,8 +1283,8 @@ class ObjectUpdateExtraParamsSerializer(se.SimpleSubfieldSerializer):
     EMPTY_IS_NONE = True


-@se.enum_field_serializer("ObjectUpdate", "ObjectData", "Flags")
-class SoundFlags(enum.IntEnum):
+@se.flag_field_serializer("ObjectUpdate", "ObjectData", "Flags")
+class SoundFlags(IntFlag):
     LOOP = 1 << 0
     SYNC_MASTER = 1 << 1
     SYNC_SLAVE = 1 << 2
@@ -1293,7 +1293,7 @@ class SoundFlags(enum.IntEnum):
     STOP = 1 << 5


-class CompressedFlags(enum.IntFlag):
+class CompressedFlags(IntFlag):
     SCRATCHPAD = 1
     TREE = 1 << 1
     TEXT = 1 << 2
@@ -1381,7 +1381,7 @@ class ObjectUpdateCompressedDataSerializer(se.SimpleSubfieldSerializer):


 @se.flag_field_serializer("MultipleObjectUpdate", "ObjectData", "Type")
-class MultipleObjectUpdateFlags(enum.IntFlag):
+class MultipleObjectUpdateFlags(IntFlag):
     POSITION = 0x01
     ROTATION = 0x02
     SCALE = 0x04
@@ -1401,7 +1401,7 @@ class MultipleObjectUpdateDataSerializer(se.FlagSwitchedSubfieldSerializer):

 @se.flag_field_serializer("AgentUpdate", "AgentData", "ControlFlags")
 @se.flag_field_serializer("ScriptControlChange", "Data", "Controls")
-class AgentControlFlags(enum.IntFlag):
+class AgentControlFlags(IntFlag):
     AT_POS = 1
     AT_NEG = 1 << 1
     LEFT_POS = 1 << 2
@@ -1437,14 +1437,14 @@ class AgentControlFlags(enum.IntFlag):


 @se.flag_field_serializer("AgentUpdate", "AgentData", "Flags")
-class AgentUpdateFlags(enum.IntFlag):
+class AgentUpdateFlags(IntFlag):
     HIDE_TITLE = 1
     CLIENT_AUTOPILOT = 1 << 1


 @se.enum_field_serializer("ChatFromViewer", "ChatData", "Type")
 @se.enum_field_serializer("ChatFromSimulator", "ChatData", "ChatType")
-class ChatType(enum.IntEnum):
+class ChatType(IntEnum):
     WHISPER = 0
     NORMAL = 1
     SHOUT = 2
@@ -1461,7 +1461,7 @@ class ChatType(enum.IntEnum):


 @se.enum_field_serializer("ChatFromSimulator", "ChatData", "SourceType")
-class ChatSourceType(enum.IntEnum):
+class ChatSourceType(IntEnum):
     SYSTEM = 0
     AGENT = 1
     OBJECT = 2
@@ -1479,7 +1479,7 @@ class NameValueSerializer(se.SimpleSubfieldSerializer):


 @se.enum_field_serializer("SetFollowCamProperties", "CameraProperty", "Type")
-class CameraPropertyType(enum.IntEnum):
+class CameraPropertyType(IntEnum):
     PITCH = 0
     FOCUS_OFFSET = enum.auto()
     FOCUS_OFFSET_X = enum.auto()
@@ -1506,7 +1506,7 @@ class CameraPropertyType(enum.IntEnum):


 @se.enum_field_serializer("DeRezObject", "AgentBlock", "Destination")
-class DeRezObjectDestination(enum.IntEnum):
+class DeRezObjectDestination(IntEnum):
     SAVE_INTO_AGENT_INVENTORY = 0  # deprecated, disabled
     ACQUIRE_TO_AGENT_INVENTORY = 1  # try to leave copy in world
     SAVE_INTO_TASK_INVENTORY = 2
@@ -1526,7 +1526,7 @@ class DeRezObjectDestination(enum.IntEnum):
 @se.flag_field_serializer("SimStats", "RegionInfo", "RegionFlagsExtended")
 @se.flag_field_serializer("RegionInfo", "RegionInfo", "RegionFlags")
 @se.flag_field_serializer("RegionInfo", "RegionInfo3", "RegionFlagsExtended")
-class RegionFlags(enum.IntFlag):
+class RegionFlags(IntFlag):
     ALLOW_DAMAGE = 1 << 0
     ALLOW_LANDMARK = 1 << 1
     ALLOW_SET_HOME = 1 << 2
@@ -1562,12 +1562,35 @@ class RegionFlags(enum.IntFlag):


 @se.flag_field_serializer("RegionHandshakeReply", "RegionInfo", "Flags")
-class RegionHandshakeReplyFlags(enum.IntFlag):
+class RegionHandshakeReplyFlags(IntFlag):
     VOCACHE_CULLING_ENABLED = 0x1  # ask sim to send all cacheable objects.
     VOCACHE_IS_EMPTY = 0x2  # the cache file is empty, no need to send cache probes.
     SUPPORTS_SELF_APPEARANCE = 0x4  # inbound AvatarAppearance for self is ok


+@se.flag_field_serializer("TeleportStart", "Info", "TeleportFlags")
+@se.flag_field_serializer("TeleportProgress", "Info", "TeleportFlags")
+@se.flag_field_serializer("TeleportFinish", "Info", "TeleportFlags")
+@se.flag_field_serializer("TeleportLureRequest", "Info", "TeleportFlags")
+class TeleportFlags(IntFlag):
+    SET_HOME_TO_TARGET = 1 << 0  # newbie leaving prelude (starter area)
+    SET_LAST_TO_TARGET = 1 << 1
+    VIA_LURE = 1 << 2
+    VIA_LANDMARK = 1 << 3
+    VIA_LOCATION = 1 << 4
+    VIA_HOME = 1 << 5
+    VIA_TELEHUB = 1 << 6
+    VIA_LOGIN = 1 << 7
+    VIA_GODLIKE_LURE = 1 << 8
+    GODLIKE = 1 << 9
+    NINE_ONE_ONE = 1 << 10  # What is this?
+    DISABLE_CANCEL = 1 << 11  # Used for llTeleportAgentHome()
+    VIA_REGION_ID = 1 << 12
+    IS_FLYING = 1 << 13
+    SHOW_RESET_HOME = 1 << 14
+    FORCE_REDIRECT = 1 << 15
+
+
 @se.http_serializer("RenderMaterials")
 class RenderMaterialsSerializer(se.BaseHTTPSerializer):
     @classmethod
@@ -128,7 +128,7 @@ class TransferManager:
         elif msg.name == "TransferAbort":
             transfer.error_code = msg["TransferID"][0].deserialize_var("Result")
             transfer.set_exception(
-                ConnectionAbortedError(f"Unknown failure")
+                ConnectionAbortedError("Unknown failure")
             )

     def _handle_transfer_packet(self, msg: ProxiedMessage, transfer: Transfer):
@@ -136,7 +136,7 @@ class TransferManager:
         packet_id: int = transfer_block["Packet"]
         packet_data = transfer_block["Data"]
         transfer.chunks[packet_id] = packet_data
-        if transfer_block["Status"] == TransferStatus.DONE:
+        if transfer_block["Status"] == TransferStatus.DONE and not transfer.done():
             transfer.mark_done()

     def _handle_transfer_info(self, msg: ProxiedMessage, transfer: Transfer):
@@ -1,15 +1,14 @@
 """
-Outbound Xfer only.
-
-sim->viewer Xfer is only legitimately used for terrain so not worth implementing.
+Managers for inbound and outbound xfer as well as the AssetUploadRequest flow
 """
 from __future__ import annotations

 import asyncio
+import enum
 import random
 from typing import *

-from hippolyzer.lib.base.datatypes import UUID
+from hippolyzer.lib.base.datatypes import UUID, RawBytes
 from hippolyzer.lib.base.helpers import proxify
 from hippolyzer.lib.base.message.data_packer import TemplateDataPacker
 from hippolyzer.lib.base.message.message import Block
@@ -24,7 +23,7 @@ _XFER_MESSAGES = {"AbortXfer", "ConfirmXferPacket", "RequestXfer", "SendXferPack


 class Xfer:
-    def __init__(self, xfer_id: int):
+    def __init__(self, xfer_id: Optional[int] = None):
         super().__init__()
         self.xfer_id: Optional[int] = xfer_id
         self.chunks: Dict[int, bytes] = {}
@@ -65,6 +64,11 @@ class Xfer:
         return self._future.__await__()


+class UploadStrategy(enum.IntEnum):
+    XFER = enum.auto()
+    ASSET_UPLOAD_REQUEST = enum.auto()
+
+
 class XferManager:
     def __init__(self, region: ProxiedRegion):
         self._region: ProxiedRegion = proxify(region)
@@ -141,5 +145,96 @@ class XferManager:
         ))

         xfer.chunks[packet_id.PacketID] = packet_data
-        if packet_id.IsEOF:
+        if packet_id.IsEOF and not xfer.done():
             xfer.mark_done()

+    def upload_asset(
+            self,
+            asset_type: AssetType,
+            data: bytes,
+            store_local: bool = False,
+            temp_file: bool = False,
+            transaction_id: Optional[UUID] = None,
+            upload_strategy: Optional[UploadStrategy] = None,
+    ) -> asyncio.Future[UUID]:
+        """Upload an asset through the Xfer upload path"""
+        if not transaction_id:
+            transaction_id = UUID.random()
+
+        # Small amounts of data can be sent inline, decide based on size
+        if upload_strategy is None:
+            if len(data) >= 1150:
+                upload_strategy = UploadStrategy.XFER
+            else:
+                upload_strategy = UploadStrategy.ASSET_UPLOAD_REQUEST
+
+        xfer = None
+        inline_data = b''
+        if upload_strategy == UploadStrategy.XFER:
+            # Prepend the expected length field to the first chunk
+            if not isinstance(data, RawBytes):
+                data = TemplateDataPacker.pack(len(data), MsgType.MVT_S32) + data
+            xfer = Xfer()
+            chunk_num = 0
+            while data:
+                xfer.chunks[chunk_num] = data[:1150]
+                data = data[1150:]
+                chunk_num += 1
+        else:
+            inline_data = data
+
+        self._region.circuit.send_message(ProxiedMessage(
+            "AssetUploadRequest",
+            Block(
+                "AssetBlock",
+                TransactionID=transaction_id,
+                Type=asset_type,
+                Tempfile=temp_file,
+                StoreLocal=store_local,
+                AssetData=inline_data,
+            )
+        ))
+        fut = asyncio.Future()
+        asyncio.create_task(self._pump_asset_upload(xfer, transaction_id, fut))
+        return fut
+
+    async def _pump_asset_upload(self, xfer: Optional[Xfer], transaction_id: UUID, fut: asyncio.Future):
+        message_handler = self._region.message_handler
+        # We'll receive an Xfer request for the asset we're uploading.
+        # asset ID is determined by hashing secure session ID with chosen transaction ID.
+        asset_id: UUID = self._region.session().tid_to_assetid(transaction_id)
+        try:
+            # Only need to do this if we're using the xfer upload strategy, otherwise all the
+            # data was already sent in the AssetUploadRequest and we don't expect a RequestXfer.
+            if xfer is not None:
+                def request_predicate(request_msg: ProxiedMessage):
+                    return request_msg["XferID"]["VFileID"] == asset_id
+                msg = await message_handler.wait_for(
+                    'RequestXfer', predicate=request_predicate, timeout=5000)
+                xfer.xfer_id = msg["XferID"]["ID"]
+
+                packet_id = 0
+                # TODO: No resend yet. If it's lost, it's lost.
+                while xfer.chunks:
+                    chunk = xfer.chunks.pop(packet_id)
+                    # EOF if there are no chunks left
+                    packet_val = XferPacket(PacketID=packet_id, IsEOF=not bool(xfer.chunks))
+                    self._region.circuit.send_message(ProxiedMessage(
+                        "SendXferPacket",
+                        Block("XferID", ID=xfer.xfer_id, Packet_=packet_val),
+                        Block("DataPacket", Data=chunk),
+                    ))
+                    # Don't care about the value, just want to know it was confirmed.
+                    await message_handler.wait_for(
+                        "ConfirmXferPacket", predicate=xfer.is_our_message, timeout=5000)
+                    packet_id += 1
+
+            def complete_predicate(complete_msg: ProxiedMessage):
+                return complete_msg["AssetBlock"]["UUID"] == asset_id
+            msg = await message_handler.wait_for('AssetUploadComplete', predicate=complete_predicate)
+            if msg["AssetBlock"]["Success"] == 1:
+                fut.set_result(asset_id)
+            else:
+                fut.set_exception(RuntimeError(f"Xfer for transaction {transaction_id} failed"))
+        except asyncio.TimeoutError as e:
+            fut.set_exception(e)
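Editorial aside: `upload_asset` above prepends the payload's length as an S32 and splits the result into 1150-byte chunks, one per SendXferPacket. A standalone sketch of that chunking (helper name hypothetical; the little-endian packing is an assumption about how MVT_S32 serializes, in line with the rest of the template message protocol):

```python
import struct

XFER_CHUNK_SIZE = 1150  # max payload per SendXferPacket, per the diff above

def chunk_for_xfer(data: bytes) -> dict:
    """Prefix the payload with its length as a little-endian S32, then
    split it into numbered chunks, mirroring upload_asset()'s loop."""
    data = struct.pack("<i", len(data)) + data
    chunks = {}
    chunk_num = 0
    while data:
        chunks[chunk_num] = data[:XFER_CHUNK_SIZE]
        data = data[XFER_CHUNK_SIZE:]
        chunk_num += 1  # without this, every chunk would overwrite chunk 0
    return chunks
```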
requirements-test.txt (new file)
@@ -0,0 +1,4 @@
+aioresponses
+pytest
+pytest-cov
+flake8
@@ -5,3 +5,8 @@ license_files =

 [bdist_wheel]
 universal = 1
+
+[flake8]
+max-line-length = 160
+exclude = build/*, .eggs/*
+ignore = F405, F403, E501, F841, E722, W503, E741
setup.py
@@ -25,7 +25,7 @@ from setuptools import setup, find_packages

 here = path.abspath(path.dirname(__file__))

-version = '0.3.2'
+version = '0.5.0'

 with open(path.join(here, 'README.md')) as readme_fh:
     readme = readme_fh.read()
@@ -50,7 +50,7 @@ setup(
         "Topic :: Software Development :: Testing",
     ],
     author='Salad Dais',
-    author_email='SaladDais@users.noreply.github.com',
+    author_email='83434023+SaladDais@users.noreply.github.com',
     url='https://github.com/SaladDais/Hippolyzer/',
     license='LGPLv3',
     packages=find_packages(include=["hippolyzer", "hippolyzer.*"]),
@@ -98,5 +98,6 @@ setup(
     ],
     tests_require=[
         "pytest",
+        "aioresponses",
     ],
 )
@@ -111,7 +111,7 @@ executables = [

 setup(
     name="hippolyzer_gui",
-    version="0.3.2",
+    version="0.5.0",
     description="Hippolyzer GUI",
     options=options,
     executables=executables,
Binary file not shown.
Before Width: | Height: | Size: 31 KiB
@@ -126,8 +126,6 @@ class TestMessage(unittest.TestCase):
def test_partial_decode_pickle(self):
msg = self.deserial.deserialize(self.serial.serialize(self.chat_msg))
self.assertEqual(msg.deserializer(), self.deserial)
# Have to remove the weak ref so we can pickle
msg.deserializer = None
msg = pickle.loads(pickle.dumps(msg, protocol=pickle.HIGHEST_PROTOCOL))

# We should still have the raw body at this point

@@ -664,14 +664,13 @@ class NameValueSerializationTests(BaseSerializationTest):
self.assertEqual(test.decode("utf8"), str(reader.read(NameValueSerializer())))

def test_namevalues_stringify(self):
test_list = \
b"Alpha STRING R S 'Twas brillig and the slighy toves/Did gyre and gimble in the wabe\n" + \
b"Beta F32 R S 3.14159\n" + \
b"Gamma S32 R S -12345\n" + \
b"Delta VEC3 R S <1.2, -3.4, 5.6>\n" + \
b"Epsilon U32 R S 12345\n" + \
b"Zeta ASSET R S 041a8591-6f30-42f8-b9f7-7f281351f375\n" + \
b"Eta U64 R S 9223372036854775807"
test_list = b"Alpha STRING R S 'Twas brillig and the slighy toves/Did gyre and gimble in the wabe\n" + \
b"Beta F32 R S 3.14159\n" + \
b"Gamma S32 R S -12345\n" + \
b"Delta VEC3 R S <1.2, -3.4, 5.6>\n" + \
b"Epsilon U32 R S 12345\n" + \
b"Zeta ASSET R S 041a8591-6f30-42f8-b9f7-7f281351f375\n" + \
b"Eta U64 R S 9223372036854775807"

self.writer.clear()
self.writer.write_bytes(test_list)

@@ -39,11 +39,7 @@ class TestDictionary(unittest.TestCase):
self.template_list = parser.message_templates

def test_create_dictionary(self):
try:
_msg_dict = TemplateDictionary(None)
assert False, "Template dictionary fail case list==None not caught"
except:
assert True
TemplateDictionary(None)

def test_get_packet(self):
msg_dict = TemplateDictionary(self.template_list)
@@ -55,7 +51,7 @@ class TestDictionary(unittest.TestCase):
def test_get_packet_pair(self):
msg_dict = TemplateDictionary(self.template_list)
packet = msg_dict.get_template_by_pair('Medium', 8)
assert packet.name == 'ConfirmEnableSimulator', "Frequency-Number pair resulting in incorrect packet"
assert packet.name == 'ConfirmEnableSimulator', "Frequency-Number pair resulting in incorrect packet"


class TestTemplates(unittest.TestCase):
@@ -69,11 +65,8 @@ class TestTemplates(unittest.TestCase):
assert parser.message_templates is not None, "Parsing template file failed"

def test_parser_fail(self):
try:
with self.assertRaises(Exception):
_parser = MessageTemplateParser(None)
assert False, "Fail case TEMPLATE_FILE == NONE not caught"
except:
assert True

def test_parser_version(self):
version = self.parser.version
@@ -111,15 +104,15 @@ class TestTemplates(unittest.TestCase):
block = self.msg_dict['OpenCircuit'].get_block('CircuitInfo')
tp = block.block_type
num = block.number
assert tp == MsgBlockType.MBT_SINGLE, "Expected: Single Returned: " + tp
assert num == 0, "Expected: 0 Returned: " + str(num)
assert tp == MsgBlockType.MBT_SINGLE, "Expected: Single Returned: " + tp
assert num == 0, "Expected: 0 Returned: " + str(num)

def test_block_multiple(self):
block = self.msg_dict['NeighborList'].get_block('NeighborBlock')
tp = block.block_type
num = block.number
assert tp == MsgBlockType.MBT_MULTIPLE, "Expected: Multiple Returned: " + tp
assert num == 4, "Expected: 4 Returned: " + str(num)
assert num == 4, "Expected: 4 Returned: " + str(num)

def test_variable(self):
variable = self.msg_dict['StartPingCheck'].get_block('PingID').get_variable('PingID')
@@ -153,7 +146,7 @@ class TestTemplates(unittest.TestCase):
medium_count = 0
high_count = 0
fixed_count = 0
while True:
while True:
try:
line = next(lines)
except StopIteration:

@@ -86,6 +86,6 @@ class TestDeserializer(unittest.TestCase):
# test the 72 byte ObjectUpdate.ObjectData.ObjectData case
hex_string = '00000000000000000000803f6666da41660000432fffff422233e34100000000000000000000000000000000000000' \
'000000000000000000000000000e33de3c000000000000000000000000'
position = TemplateDataPacker.unpack(unhexlify(hex_string)[16:16+12], MsgType.MVT_LLVector3)
position = TemplateDataPacker.unpack(unhexlify(hex_string)[16:16 + 12], MsgType.MVT_LLVector3)
self.assertEqual(position, (128.00155639648438, 127.99840545654297, 28.399967193603516))
self.assertIsInstance(position, Vector3)

@@ -52,6 +52,7 @@ class BaseIntegrationTest(unittest.IsolatedAsyncioTestCase):
self.session.open_circuit(self.client_addr, self.region_addr,
self.protocol.transport)
self.session.main_region = self.session.regions[-1]
self.session.main_region.handle = 0

def _msg_to_datagram(self, msg: ProxiedMessage, src, dst, direction, socks_header=True):
serialized = self.serializer.serialize(msg)

72
tests/proxy/integration/test_http.py
Normal file
@@ -0,0 +1,72 @@
from __future__ import annotations

import asyncio

from mitmproxy.test import tflow, tutils
from mitmproxy.http import HTTPFlow

from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.http_event_manager import MITMProxyEventManager
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.http_proxy import HTTPFlowContext, SerializedCapData
from hippolyzer.lib.proxy.message_logger import FilteringMessageLogger
from hippolyzer.lib.proxy.sessions import SessionManager

from . import BaseIntegrationTest


class MockAddon(BaseAddon):
def handle_http_request(self, session_manager: SessionManager, flow: HippoHTTPFlow):
flow.metadata["touched_addon"] = True

def handle_http_response(self, session_manager: SessionManager, flow: HippoHTTPFlow):
flow.metadata["touched_addon"] = True


class SimpleMessageLogger(FilteringMessageLogger):
@property
def entries(self):
return self._filtered_entries


class LLUDPIntegrationTests(BaseIntegrationTest):
def setUp(self) -> None:
super().setUp()
self.addon = MockAddon()
AddonManager.init([], self.session_manager, [self.addon])
self.flow_context = HTTPFlowContext()
self.http_event_manager = MITMProxyEventManager(self.session_manager, self.flow_context)
self._setup_circuit()

async def _pump_one_event(self):
# If we don't yield then the new entry won't end up in the queue
await asyncio.sleep(0.001)
await self.http_event_manager.pump_proxy_event()
await asyncio.sleep(0.001)

async def test_http_flow_request(self):
# mimic a request coming in from mitmproxy over the queue
fake_flow = tflow.tflow(req=tutils.treq(host="example.com"))
fake_flow.metadata["cap_data_ser"] = SerializedCapData()
self.flow_context.from_proxy_queue.put(("request", fake_flow.get_state()), True)
await self._pump_one_event()
self.assertTrue(self.flow_context.from_proxy_queue.empty())
self.assertFalse(self.flow_context.to_proxy_queue.empty())
flow_state = self.flow_context.to_proxy_queue.get(True)[2]
mitm_flow: HTTPFlow = HTTPFlow.from_state(flow_state)
# The response sent back to mitmproxy should have been our modified version
self.assertEqual(True, mitm_flow.metadata["touched_addon"])

async def test_http_flow_response(self):
# mimic a request coming in from mitmproxy over the queue
fake_flow = tflow.tflow(req=tutils.treq(host="example.com"), resp=tutils.tresp())
fake_flow.metadata["cap_data_ser"] = SerializedCapData()
self.flow_context.from_proxy_queue.put(("response", fake_flow.get_state()), True)
await self._pump_one_event()
self.assertTrue(self.flow_context.from_proxy_queue.empty())
self.assertFalse(self.flow_context.to_proxy_queue.empty())
flow_state = self.flow_context.to_proxy_queue.get(True)[2]
mitm_flow: HTTPFlow = HTTPFlow.from_state(flow_state)
# The response sent back to mitmproxy should have been our modified version
self.assertEqual(True, mitm_flow.metadata["touched_addon"])
@@ -33,7 +33,7 @@ class MockAddon(BaseAddon):

def handle_object_updated(self, session: Session, region: ProxiedRegion,
obj: Object, updated_props: Set[str]):
self.events.append(("object_update", session.id, region.circuit_addr, obj.LocalID))
self.events.append(("object_update", session.id, region.circuit_addr, obj.LocalID, updated_props))


class SimpleMessageLogger(FilteringMessageLogger):
@@ -53,31 +53,31 @@ class LLUDPIntegrationTests(BaseIntegrationTest):
localid = random.getrandbits(32)

return b'\x00\x00\x00\x0c\xba\x00\r\x00\x00\x00\x00\x00\x00\x00\x00\xff\xff\x03\xd0\x04\x00\x10' \
b'\xe6\x00\x89UgG$\xcbC\xed\x92\x0bG\xca\xed\x15F_' + struct.pack("<I", localid) + \
b'\xe6\x00\x12\x12\x10\xbf\x16XB~\x8f\xb4\xfb\x00\x1a\xcd\x9b\xe5' + struct.pack("<I", localid) + \
b'\t\x00\xcdG\x00\x00\x03\x00\x00\x00\x1cB\x00\x00\x1cB\xcd\xcc\xcc=\xedG,' \
b'B\x9e\xb1\x9eBff\xa0A\x00\x00\x00\x00\x00\x00\x00\x00[' \
b'\x8b\xf8\xbe\xc0\x00\x00\x00\x89UgG$\xcbC\xed\x92\x0bG\xca\xed\x15F_\x00\x00\x00\x00\x00' \
b'\x8b\xf8\xbe\xc0\x00\x00\x00k\x9b\xc4\xfe3\nOa\xbb\xe2\xe4\xb2C\xac7\xbd\x00\x00\x00\x00\x00' \
b'\x00\x00\x00\x00\x00\xa2=\x010\x00\x11\x00\x00\x00\x89UgG$\xcbC\xed\x92\x0bG\xca\xed' \
b'\x15F_@ \x00\x00\x00\x00d\x96\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00' \
b'\x00?\x00\x00\x00\x89UgG$\xcbC\xed\x92\x0bG\xca\xed\x15F_\x00\x00\x00\x003\x00ff\x86\xbf' \
b'\x00?\x00\x00\x00\x1c\x9fJoI\x8dH\xa0\x9d\xc4&\'\'\x19=g\x00\x00\x00\x003\x00ff\x86\xbf' \
b'\x00ff\x86?\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x89UgG$\xcbC' \
b'\xed\x92\x0bG\xca\xed\x15F_\x10\x00\x00\x003\x00\x01\x01\x00\x00\x00\x00\xdb\x0f\xc9@\xa6' \
b'\x9b\xc4=\xd0\x04\x00\x10\xe6\x00\x89UgG$\xcbC\xed\x92\x0bG\xca\xed\x15F_\\\x04\x00\x00\t' \
b'\x00\xd3G\x00\x00\x03\x00\x00\x00\x1cB\x00\x00\x1cB\xcd\xcc\xcc=\t\x08\x9cA\xf2\x03' \
b'\xa5Bff\xa0A\x00\x00\x00\x00\x00\x00\x00\x00[' \
b'\x8b\xf8\xbe\xc0\x00\x00\x00\x89UgG$\xcbC\xed\x92\x0bG\xca\xed\x15F_\x00\x00\x00\x00\x00' \
b'\x9b\xc4=\xd0\x04\x00\x10\xe6\x00\xc2\xa62\xe2\x9b\xd7L\xc4\xbb\xd6\x1fKC\xa6\xdf\x8d\\\x04\x00' \
b'\x00\t\x00\xd3G\x00\x00\x03\x00\x00\x00\x1cB\x00\x00\x1cB\xcd\xcc\xcc=\t\x08\x9cA\xf2\x03' \
b'\xa5Bff\xa0A\x00\x00\x00\x00\x00\x00\x00\x00[\x8b\xf8' \
b'\xbe\xc0\x00\x00\x00\x0b\x1b\xa0\xd1\x97=C\xcd\xae\x19\xfd\xc9\xbb\x88\x05\xc3\x00\x00\x00\x00\x00' \
b'\x00\x00\x00\x00\x00\xa2=\x010\x00\x11\x00\x00\x00\x89UgG$\xcbC\xed\x92\x0bG\xca\xed' \
b'\x15F_@ \x00\x00\x00\x00d\x96\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00' \
b'\x00?\x00\x00\x00\x89UgG$\xcbC\xed\x92\x0bG\xca\xed\x15F_\x00\x00\x00\x003\x00ff\x86\xbf' \
b'\x00?\x00\x00\x00\xbd\x8b\xd7h{\xdbM\xbc\x8c3X\xa6\xa6\x0c\x94\xd7\x00\x00\x00\x003\x00ff\x86\xbf' \
b'\x00ff\x86?\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x89UgG$\xcbC' \
b'\xed\x92\x0bG\xca\xed\x15F_\x10\x00\x00\x003\x00\x01\x01\x00\x00\x00\x00\xdb\x0f\xc9@\xa6' \
b'\x9b\xc4=\xd0\x04\x00\x10\xe6\x00\x89UgG$\xcbC\xed\x92\x0bG\xca\xed\x15F_\xe2\x05\x00\x00' \
b'\x9b\xc4=\xd0\x04\x00\x10\xe6\x00\xd1e\xac\xff,NBK\x91d\xbb\x15\\\x0b\xc3\x9c\xe2\x05\x00\x00' \
b'\t\x00\xbbG\x00\x00\x03\x00\x00\x00\x1cB\x00\x00\x1cB\xcd\xcc\xcc=\x0f5\x97AY\x98ZBff' \
b'\xa0A\x00\x00\x00\x00\x00\x00\x00\x00\xe6Y0\xbf\xc0\x00\x00\x00\x89UgG$\xcbC\xed\x92\x0bG' \
b'\xca\xed\x15F_\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xa2=\x010\x00\x11\x00\x00\x00' \
b'\x89UgG$\xcbC\xed\x92\x0bG\xca\xed\x15F_@ ' \
b'#\xce\xf8\xf4\x0cJD.\xb7"\x96\x1cK\xd9\x01\x1b@ ' \
b'\x00\x00\x00\x00d\x96\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00' \
b'?\x00\x00\x00\x89UgG$\xcbC\xed\x92\x0bG\xca\xed\x15F_\x00\x00\x00\x003\x00ff\x86\xbf' \
b'?\x00\x00\x003\xe1\xa1\xcf<\xbdD\xc4\xa0\xe6b\xe9\xbf=\xa2@\x00\x00\x00\x003\x00ff\x86\xbf' \
b'\x00ff\x86?\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x89UgG$\xcbC' \
b'\xed\x92\x0bG\xca\xed\x15F_\x10\x00\x00\x003\x00\x01\x01\x00\x00\x00\x00\xdb\x0f\xc9@\xa6' \
b'\x9b\xc4='
@@ -175,7 +175,23 @@ class LLUDPIntegrationTests(BaseIntegrationTest):
await self._wait_drained()
obj = self.session.regions[0].objects.lookup_localid(1234)
self.assertIsInstance(obj.TextureEntry, lazy_object_proxy.Proxy)
self.assertEqual(obj.TextureEntry.Textures[None], UUID("89556747-24cb-43ed-920b-47caed15465f"))
self.assertEqual(obj.TextureEntry.Textures[None], UUID("1c9f4a6f-498d-48a0-9dc4-262727193d67"))
self.assertEqual(len(self.session.regions[0].objects), 3)

async def test_object_updated_changed_property_list(self):
self._setup_circuit()
# One creating update and one no-op update
obj_update = self._make_objectupdate_compressed(1234)
self.protocol.datagram_received(obj_update, self.region_addr)
obj_update = self._make_objectupdate_compressed(1234)
self.protocol.datagram_received(obj_update, self.region_addr)
await self._wait_drained()
self.assertEqual(len(self.session.regions[0].objects), 3)
object_events = [e for e in self.addon.events if e[0] == "object_update"]
# 3 objects in example packet and we sent it twice
self.assertEqual(len(object_events), 6)
# Only TextureEntry should be marked updated since it's a proxy object
self.assertEqual(object_events[-1][-1], {"TextureEntry"})

async def test_message_logger(self):
message_logger = SimpleMessageLogger()

65
tests/proxy/test_capsclient.py
Normal file
@@ -0,0 +1,65 @@
import unittest

import aiohttp
import aioresponses
from yarl import URL

from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.proxy.caps_client import CapsClient
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import SessionManager


class TestCapsClient(unittest.IsolatedAsyncioTestCase):
def setUp(self) -> None:
self.session = SessionManager().create_session({
"session_id": UUID.random(),
"secure_session_id": UUID.random(),
"agent_id": UUID.random(),
"circuit_code": 0,
"sim_ip": "127.0.0.1",
"sim_port": "1",
"seed_capability": "https://test.localhost:4/foo",
})
self.region = ProxiedRegion(("127.0.0.1", 1), "", self.session)
self.caps_client = CapsClient(self.region)

async def test_bare_url_works(self):
with aioresponses.aioresponses() as m:
m.get("https://example.com/", body=b"foo")
async with self.caps_client.get("https://example.com/") as resp:
self.assertEqual(await resp.read(), b"foo")

async def test_own_session_works(self):
with aioresponses.aioresponses() as m:
async with aiohttp.ClientSession() as sess:
m.get("https://example.com/", body=b"foo")
async with self.caps_client.get("https://example.com/", session=sess) as resp:
self.assertEqual(await resp.read(), b"foo")

async def test_read_llsd(self):
with aioresponses.aioresponses() as m:
m.get("https://example.com/", body=b"<llsd><integer>2</integer></llsd>")
async with self.caps_client.get("https://example.com/") as resp:
self.assertEqual(await resp.read_llsd(), 2)

async def test_caps(self):
self.region.update_caps({"Foobar": "https://example.com/"})
with aioresponses.aioresponses() as m:
m.post("https://example.com/baz", body=b"ok")
data = {"hi": "hello"}
headers = {"Foo": "bar"}
async with self.caps_client.post("Foobar", path="baz", llsd=data, headers=headers) as resp:
self.assertEqual(await resp.read(), b"ok")

# Our original dict should not have been touched
self.assertEqual(headers, {"Foo": "bar"})

req_key = ("POST", URL("https://example.com/baz"))
req_body = m.requests[req_key][0].kwargs['data']
self.assertEqual(req_body, b'<?xml version="1.0" ?><llsd><map><key>hi</key><string>hello'
b'</string></map></llsd>')

with self.assertRaises(KeyError):
with self.caps_client.get("BadCap"):
pass
109
tests/proxy/test_httpflows.py
Normal file
@@ -0,0 +1,109 @@
import unittest

from mitmproxy.test import tflow, tutils

from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.message_logger import HTTPMessageLogEntry
from hippolyzer.lib.proxy.sessions import SessionManager


class TestHTTPFlows(unittest.TestCase):
def setUp(self) -> None:
self.session_manager = SessionManager()
self.session = self.session = self.session_manager.create_session({
"session_id": UUID.random(),
"secure_session_id": UUID.random(),
"agent_id": UUID.random(),
"circuit_code": 0,
"sim_ip": "127.0.0.1",
"sim_port": "1",
"seed_capability": "https://test.localhost:4/foo",
})
self.region = self.session.register_region(
("127.0.0.1", 2),
"https://test.localhost:4/foo",
handle=90,
)
self.region.update_caps({
"FakeCap": "http://example.com",
"ViewerAsset": "http://assets.example.com",
})

def test_request_formatting(self):
req = tutils.treq(host="example.com", port=80)
fake_flow = tflow.tflow(req=req, resp=tutils.tresp())
flow = HippoHTTPFlow.from_state(fake_flow.get_state(), self.session_manager)
# Make sure cap resolution works correctly
flow.cap_data = self.session_manager.resolve_cap(flow.request.url)
entry = HTTPMessageLogEntry(flow)
self.assertEqual(entry.request(beautify=True), """GET [[FakeCap]]/path HTTP/1.1\r
# http://example.com/path\r
header: qvalue\r
content-length: 7\r
\r
content""")

def test_binary_request_formatting(self):
req = tutils.treq(host="example.com", port=80)
fake_flow = tflow.tflow(req=req, resp=tutils.tresp())
flow = HippoHTTPFlow.from_state(fake_flow.get_state(), self.session_manager)
# This should trigger the escaped body path without changing content-length
flow.request.content = b"c\x00ntent"
entry = HTTPMessageLogEntry(flow)
self.assertEqual(entry.request(beautify=True), """GET http://example.com/path HTTP/1.1\r
header: qvalue\r
content-length: 7\r
X-Hippo-Escaped-Body: 1\r
\r
c\\x00ntent""")

def test_llsd_response_formatting(self):
fake_flow = tflow.tflow(req=tutils.treq(), resp=tutils.tresp())
flow = HippoHTTPFlow.from_state(fake_flow.get_state(), self.session_manager)
# Half the time LLSD is sent with a random Content-Type and no PI indicating
# what flavor of LLSD it is. Make sure the sniffing works correctly.
flow.response.content = b"<llsd><integer>1</integer></llsd>"
entry = HTTPMessageLogEntry(flow)
self.assertEqual(entry.response(beautify=True), """HTTP/1.1 200 OK\r
header-response: svalue\r
content-length: 33\r
\r
<?xml version="1.0" ?>
<llsd>
<integer>1</integer>
</llsd>
""")

def test_flow_state_serde(self):
fake_flow = tflow.tflow(req=tutils.treq(host="example.com"), resp=tutils.tresp())
flow = HippoHTTPFlow.from_state(fake_flow.get_state(), self.session_manager)
# Make sure cap resolution works correctly
flow.cap_data = self.session_manager.resolve_cap(flow.request.url)
flow_state = flow.get_state()
new_flow = HippoHTTPFlow.from_state(flow_state, self.session_manager)
self.assertIs(self.session, new_flow.cap_data.session())

def test_http_asset_repo(self):
asset_repo = self.session_manager.asset_repo
asset_id = asset_repo.create_asset(b"foobar", one_shot=True)
req = tutils.treq(host="assets.example.com", path=f"/?animatn_id={asset_id}")
fake_flow = tflow.tflow(req=req)
flow = HippoHTTPFlow.from_state(fake_flow.get_state(), self.session_manager)
# Have to resolve cap data so the asset repo knows this is an asset server cap
flow.cap_data = self.session_manager.resolve_cap(flow.request.url)
self.assertTrue(asset_repo.try_serve_asset(flow))
self.assertEqual(b"foobar", flow.response.content)

def test_temporary_cap_resolution(self):
self.region.register_temporary_cap("TempExample", "http://not.example.com")
self.region.register_temporary_cap("TempExample", "http://not2.example.com")
# Resolving the cap should consume it
cap_data = self.session_manager.resolve_cap("http://not.example.com")
self.assertEqual(cap_data.cap_name, "TempExample")
# A CapData object should always be returned, but the cap_name field will be None
new_cap_data = self.session_manager.resolve_cap("http://not.example.com")
self.assertIsNone(new_cap_data.cap_name)
# The second temp cap with the same name should still be in there
cap_data = self.session_manager.resolve_cap("http://not2.example.com")
self.assertEqual(cap_data.cap_name, "TempExample")
@@ -1,13 +1,17 @@
import unittest

from mitmproxy.test import tflow, tutils

from hippolyzer.lib.base.datatypes import Vector3
from hippolyzer.lib.base.message.message import Block
from hippolyzer.lib.base.message.udpdeserializer import UDPMessageDeserializer
from hippolyzer.lib.base.settings import Settings
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.http_proxy import SerializedCapData
from hippolyzer.lib.proxy.message import ProxiedMessage as Message
from hippolyzer.lib.proxy.message_logger import LLUDPMessageLogEntry
from hippolyzer.lib.proxy.message_logger import LLUDPMessageLogEntry, HTTPMessageLogEntry
from hippolyzer.lib.proxy.message_filter import compile_filter

from hippolyzer.lib.proxy.sessions import SessionManager

OBJECT_UPDATE = b'\xc0\x00\x00\x00Q\x00\x0c\x00\x01\xea\x03\x00\x02\xe6\x03\x00\x01\xbe\xff\x01\x06\xbc\x8e\x0b\x00' \
b'\x01i\x94\x8cjM"\x1bf\xec\xe4\xac1c\x93\xcbKW\x89\x98\x01\t\x03\x00\x01Q@\x88>Q@\x88>Q@\x88><\xa2D' \
@@ -46,8 +50,10 @@ class MessageFilterTests(unittest.TestCase):
def test_equality(self):
msg = LLUDPMessageLogEntry(Message("Foo", Block("Bar", Baz=1)), None, None)
self.assertTrue(self._filter_matches("Foo.Bar.Baz == 1", msg))
self.assertTrue(self._filter_matches("Foo.Bar.Baz == 0x1", msg))
msg.message["Bar"]["Baz"] = 2
self.assertFalse(self._filter_matches("Foo.Bar.Baz == 1", msg))
self.assertFalse(self._filter_matches("Foo.Bar.Baz == 0x1", msg))

def test_and(self):
msg = LLUDPMessageLogEntry(Message("Foo", Block("Bar", Baz=1)), None, None)
@@ -95,6 +101,14 @@ class MessageFilterTests(unittest.TestCase):
self.assertFalse(self._filter_matches("Foo.Bar.Baz < (0, 3, 0)", msg))
self.assertTrue(self._filter_matches("Foo.Bar.Baz > (0, 0, 0)", msg))

def test_enum_specifier(self):
# 2 is the enum val for SculptType.TORUS
msg = LLUDPMessageLogEntry(Message("Foo", Block("Bar", Baz=2)), None, None)
self.assertTrue(self._filter_matches("Foo.Bar.Baz == SculptType.TORUS", msg))
# bitwise AND should work as well
self.assertTrue(self._filter_matches("Foo.Bar.Baz & SculptType.TORUS", msg))
self.assertFalse(self._filter_matches("Foo.Bar.Baz == SculptType.SPHERE", msg))

def test_tagged_union_subfield(self):
settings = Settings()
settings.ENABLE_DEFERRED_PACKET_PARSING = False
@@ -105,6 +119,17 @@ class MessageFilterTests(unittest.TestCase):
self.assertTrue(self._filter_matches("ObjectUpdate.ObjectData.ObjectData.Position > (88, 41, 25)", entry))
self.assertTrue(self._filter_matches("ObjectUpdate.ObjectData.ObjectData.Position < (90, 43, 27)", entry))

def test_http_flow(self):
session_manager = SessionManager()
fake_flow = tflow.tflow(req=tutils.treq(), resp=tutils.tresp())
fake_flow.metadata["cap_data_ser"] = SerializedCapData(
cap_name="FakeCap",
)
flow = HippoHTTPFlow.from_state(fake_flow.get_state(), session_manager)
entry = HTTPMessageLogEntry(flow)
self.assertTrue(self._filter_matches("FakeCap", entry))
self.assertFalse(self._filter_matches("NotFakeCap", entry))


if __name__ == "__main__":
unittest.main()

@@ -294,10 +294,12 @@ class HumanReadableMessageTests(unittest.TestCase):

class TestMessageSubfieldSerializers(unittest.TestCase):
def setUp(self):
self.chat_msg = ProxiedMessage('ChatFromViewer',
Block('AgentData', AgentID=UUID('550e8400-e29b-41d4-a716-446655440000'),
SessionID=UUID('550e8400-e29b-41d4-a716-446655440000')),
Block('ChatData', Message="Chatting\n", Type=1, Channel=0))
self.chat_msg = ProxiedMessage(
'ChatFromViewer',
Block('AgentData',
AgentID=UUID('550e8400-e29b-41d4-a716-446655440000'),
SessionID=UUID('550e8400-e29b-41d4-a716-446655440000')),
Block('ChatData', Message="Chatting\n", Type=1, Channel=0))

def test_pretty_repr(self):
expected_repr = r"""ProxiedMessage('ChatFromViewer',

@@ -1,3 +1,4 @@
|
||||
import math
|
||||
import random
|
||||
import unittest
|
||||
from typing import *
|
||||
@@ -12,11 +13,13 @@ from hippolyzer.lib.proxy.addons import AddonManager
|
||||
from hippolyzer.lib.proxy.addon_utils import BaseAddon
|
||||
from hippolyzer.lib.proxy.objects import ObjectManager
|
||||
from hippolyzer.lib.proxy.message import ProxiedMessage as Message
|
||||
from hippolyzer.lib.proxy.templates import PCode
|
||||
|
||||
|
||||
class MockRegion:
|
||||
def __init__(self, message_handler: MessageHandler):
|
||||
self.session = lambda: None
|
||||
self.handle = 123
|
||||
self.message_handler = message_handler
|
||||
self.http_message_handler = MessageHandler()
|
||||
|
||||
@@ -43,9 +46,11 @@ class ObjectManagerTests(unittest.TestCase):
|
||||
self.object_addon = ObjectTrackingAddon()
|
||||
AddonManager.init([], None, [self.object_addon])
|
||||
|
||||
def _create_object_update(self, local_id=None, full_id=None, parent_id=None, pos=None, rot=None) -> Message:
|
||||
def _create_object_update(self, local_id=None, full_id=None, parent_id=None, pos=None, rot=None,
|
||||
pcode=None, namevalue=None) -> Message:
|
||||
pos = pos if pos is not None else (1.0, 2.0, 3.0)
|
||||
rot = rot if rot is not None else (0.0, 0.0, 0.0, 1.0)
|
||||
pcode = pcode if pcode is not None else PCode.PRIMITIVE
|
||||
msg = Message(
|
||||
"ObjectUpdate",
|
||||
Block("RegionData", RegionHandle=123, TimeDilation=123),
|
||||
@@ -53,7 +58,7 @@ class ObjectManagerTests(unittest.TestCase):
|
||||
"ObjectData",
|
||||
ID=local_id if local_id is not None else random.getrandbits(32),
|
||||
FullID=full_id if full_id else UUID.random(),
|
||||
PCode=9,
|
||||
PCode=pcode,
|
||||
Scale=Vector3(0.5, 0.5, 0.5),
|
||||
UpdateFlags=268568894,
|
||||
PathCurve=16,
|
||||
@@ -61,6 +66,7 @@ class ObjectManagerTests(unittest.TestCase):
|
||||
ProfileCurve=1,
|
||||
PathScaleX=100,
|
||||
PathScaleY=100,
|
||||
NameValue=namevalue,
|
||||
TextureEntry=b'\x89UgG$\xcbC\xed\x92\x0bG\xca\xed\x15F_\x00\x00\x00\x00\x00\x00\x00\x00\x80?\x00\x00'
|
||||
b'\x00\x80?\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
|
||||
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00',
|
||||
@@ -85,8 +91,11 @@ class ObjectManagerTests(unittest.TestCase):
|
||||
# Run through (de)serializer to fill in any missing vars
|
||||
return self.deserializer.deserialize(self.serializer.serialize(msg))
|
||||
|
||||
def _create_object(self, local_id=None, full_id=None, parent_id=None, pos=None, rot=None) -> Object:
|
||||
msg = self._create_object_update(local_id=local_id, full_id=full_id, parent_id=parent_id, pos=pos, rot=rot)
|
||||
def _create_object(self, local_id=None, full_id=None, parent_id=None, pos=None, rot=None,
|
||||
pcode=None, namevalue=None) -> Object:
|
||||
msg = self._create_object_update(
|
||||
local_id=local_id, full_id=full_id, parent_id=parent_id, pos=pos, rot=rot,
|
||||
pcode=pcode, namevalue=namevalue)
|
||||
self.message_handler.handle(msg)
|
||||
return self.object_manager.lookup_fullid(msg["ObjectData"]["FullID"])
|
||||
|
||||
@@ -102,6 +111,9 @@ class ObjectManagerTests(unittest.TestCase):
|
||||
def _kill_object(self, obj: Object):
|
||||
self.message_handler.handle(self._create_kill_object(obj.LocalID))
|
||||
|
||||
def _get_avatar_positions(self) -> Dict[UUID, Vector3]:
|
||||
return {av.FullID: av.RegionPosition for av in self.object_manager.all_avatars}
|
||||
|
||||
def test_basic_tracking(self):
|
||||
"""Does creating an object result in it being tracked?"""
|
||||
msg = self._create_object_update()
|
||||
@@ -122,14 +134,33 @@ class ObjectManagerTests(unittest.TestCase):
         self.assertEqual(set(), self.object_manager.missing_locals)
         self.assertSequenceEqual([child.LocalID], parent.ChildIDs)
 
-    def test_killing_parent_orphans_children(self):
-        child = self._create_object(local_id=2, parent_id=1)
+    def test_killing_parent_kills_children(self):
+        _child = self._create_object(local_id=2, parent_id=1)
         parent = self._create_object(local_id=1)
-        # This should orphan the child again
         self._kill_object(parent)
         parent = self._create_object(local_id=1)
-        # Did we pick the orphan back up?
-        self.assertSequenceEqual([child.LocalID], parent.ChildIDs)
+        # We should not have picked up any children
+        self.assertSequenceEqual([], parent.ChildIDs)
+
+    def test_hierarchy_killed(self):
+        _child = self._create_object(local_id=3, parent_id=2)
+        _other_child = self._create_object(local_id=4, parent_id=2)
+        _parent = self._create_object(local_id=2, parent_id=1)
+        grandparent = self._create_object(local_id=1)
+        # KillObject implicitly kills all known descendents at that point
+        self._kill_object(grandparent)
+        self.assertEqual(0, len(self.object_manager))
+
+    def test_hierarchy_avatar_not_killed(self):
+        _child = self._create_object(local_id=3, parent_id=2)
+        _parent = self._create_object(local_id=2, parent_id=1, pcode=PCode.AVATAR)
+        grandparent = self._create_object(local_id=1)
+        # KillObject should only "unsit" child avatars (does this require an ObjectUpdate
+        # or is ParentID=0 implied?)
+        self._kill_object(grandparent)
+        self.assertEqual(2, len(self.object_manager))
+        self.assertIsNotNone(self.object_manager.lookup_localid(2))
 
     def test_attachment_orphan_parent_tracking(self):
         """
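The two hierarchy tests above pin down the KillObject semantics being tested: killing an object also kills its known non-avatar descendants, while a seated child avatar is merely unparented ("unsat") and survives. A toy model of that rule; `ToyObject` and `ToyManager` are illustrative stand-ins, not Hippolyzer's actual ObjectManager.

```python
from typing import Dict, Optional

class ToyObject:
    def __init__(self, local_id: int, parent_id: Optional[int] = None, is_avatar: bool = False):
        self.local_id = local_id
        self.parent_id = parent_id
        self.is_avatar = is_avatar

class ToyManager:
    def __init__(self):
        self.objects: Dict[int, ToyObject] = {}

    def add(self, obj: ToyObject):
        self.objects[obj.local_id] = obj

    def kill(self, local_id: int):
        obj = self.objects.pop(local_id, None)
        if obj is None:
            return
        for child in [o for o in self.objects.values() if o.parent_id == local_id]:
            if child.is_avatar:
                child.parent_id = None  # avatars are only unseated, never killed
            else:
                self.kill(child.local_id)  # non-avatars die with their parent

# Mirrors test_hierarchy_avatar_not_killed: avatar (2) sits on grandparent (1).
mgr = ToyManager()
mgr.add(ToyObject(3, parent_id=2))
mgr.add(ToyObject(2, parent_id=1, is_avatar=True))
mgr.add(ToyObject(1))
mgr.kill(1)
print(sorted(mgr.objects))  # [2, 3]
```

Running the same scenario with `is_avatar=False` on object 2 would empty the manager, matching `test_hierarchy_killed`.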
@@ -142,15 +173,6 @@ class ObjectManagerTests(unittest.TestCase):
         parent = self._create_object(local_id=2, parent_id=1)
         self.assertSequenceEqual([child.LocalID], parent.ChildIDs)
 
-    def test_killing_attachment_parent_orphans_children(self):
-        child = self._create_object(local_id=3, parent_id=2)
-        parent = self._create_object(local_id=2, parent_id=1)
-        # This should orphan the child again
-        self._kill_object(parent)
-        parent = self._create_object(local_id=2, parent_id=1)
-        # Did we pick the orphan back up?
-        self.assertSequenceEqual([child.LocalID], parent.ChildIDs)
-
     def test_unparenting_succeeds(self):
         child = self._create_object(local_id=3, parent_id=2)
         parent = self._create_object(local_id=2)
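`test_attachment_orphan_parent_tracking` and the surviving parenting tests depend on orphan handling: a child whose parent hasn't been seen yet is parked under the missing parent's LocalID and adopted once an update for that parent arrives. A rough sketch of the bookkeeping involved; `OrphanTracker` is a hypothetical simplification, not the real ObjectManager logic.

```python
from collections import defaultdict
from typing import Dict, List, Optional

class OrphanTracker:
    def __init__(self):
        self.known: set = set()
        self.children: Dict[int, List[int]] = defaultdict(list)  # parent -> child ids
        self.orphans: Dict[int, List[int]] = defaultdict(list)   # missing parent -> waiting children

    def update(self, local_id: int, parent_id: Optional[int] = None):
        self.known.add(local_id)
        if parent_id is not None:
            if parent_id in self.known:
                self.children[parent_id].append(local_id)
            else:
                # Parent not seen yet: park the child until the parent shows up.
                self.orphans[parent_id].append(local_id)
        # Adopt any children that arrived before we did.
        self.children[local_id].extend(self.orphans.pop(local_id, []))

tracker = OrphanTracker()
tracker.update(3, parent_id=2)   # child arrives first -> orphaned
tracker.update(2)                # parent arrives -> adopts the orphan
print(tracker.children[2])  # [3]
```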
@@ -229,6 +251,65 @@ class ObjectManagerTests(unittest.TestCase):
         self.assertEqual(parent.RegionPosition, (0.0, 0.0, 0.0))
         self.assertEqual(child.RegionPosition, (1.0, 2.0, 0.0))
 
+    def test_avatar_locations(self):
+        agent1_id = UUID.random()
+        agent2_id = UUID.random()
+        self.message_handler.handle(Message(
+            "CoarseLocationUpdate",
+            Block("AgentData", AgentID=agent1_id),
+            Block("AgentData", AgentID=agent2_id),
+            Block("Location", X=1, Y=2, Z=3),
+            Block("Location", X=2, Y=3, Z=4),
+        ))
+        self.assertDictEqual(self._get_avatar_positions(), {
+            # CoarseLocation's Z axis is multiplied by 4
+            agent1_id: Vector3(1, 2, 12),
+            agent2_id: Vector3(2, 3, 16),
+        })
+
+        # Simulate an avatar sitting on an object
+        seat_object = self._create_object(pos=(0, 0, 3))
+        # If we have a real object pos it should override coarse pos
+        avatar_obj = self._create_object(full_id=agent1_id, pcode=PCode.AVATAR,
+                                         parent_id=seat_object.LocalID, pos=Vector3(0, 0, 2))
+        self.assertDictEqual(self._get_avatar_positions(), {
+            # Agent is seated, make sure this is region and not local pos
+            agent1_id: Vector3(0, 0, 5),
+            agent2_id: Vector3(2, 3, 16),
+        })
+
+        # If the object is killed and no coarse pos, it shouldn't be in the dict.
+        # CoarseLocationUpdates are expected to be complete, so any agents missing
+        # are no longer in the sim.
+        self._kill_object(avatar_obj)
+        self.message_handler.handle(Message(
+            "CoarseLocationUpdate",
+            Block("AgentData", AgentID=agent2_id),
+            Block("Location", X=2, Y=3, Z=4),
+        ))
+        self.assertDictEqual(self._get_avatar_positions(), {
+            agent2_id: Vector3(2, 3, 16),
+        })
+
+        # 255 on Z axis means we can't guess the real Z
+        self.message_handler.handle(Message(
+            "CoarseLocationUpdate",
+            Block("AgentData", AgentID=agent2_id),
+            Block("Location", X=2, Y=3, Z=math.inf),
+        ))
+        self.assertDictEqual(self._get_avatar_positions(), {
+            agent2_id: Vector3(2, 3, math.inf),
+        })
+
+    def test_name_cache(self):
+        # Receiving an update with a NameValue for an avatar should update NameCache
+        obj = self._create_object(
+            pcode=PCode.AVATAR,
+            namevalue=b'DisplayName STRING RW DS unicodename\n'
+                      b'FirstName STRING RW DS firstname\n'
+                      b'LastName STRING RW DS Resident\n'
+                      b'Title STRING RW DS foo',
+        )
+        self.assertEqual(self.object_manager.name_cache.lookup(obj.FullID).FirstName, "firstname")
+        av = self.object_manager.lookup_avatar(obj.FullID)
+        self.assertEqual(av.Name, "firstname Resident")
+
 
 if __name__ == "__main__":
     unittest.main()
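The CoarseLocationUpdate assertions above rely on two wire-format conventions: the Location Z byte is in 4-meter units, and 255 is a sentinel for a height the message can't represent (surfaced here as `math.inf`). A sketch of that decoding; `decode_coarse_z` is an illustrative helper, not a Hippolyzer API.

```python
import math

def decode_coarse_z(z_byte: int) -> float:
    """Decode the Z byte of a CoarseLocationUpdate Location block.

    Z is quantized to 4-meter steps; 255 means the avatar is too high
    for the byte to represent, so the real height is unknown.
    """
    if z_byte == 255:
        return math.inf
    return z_byte * 4.0

# Matches the expectations in test_avatar_locations.
print(decode_coarse_z(3))    # 12.0
print(decode_coarse_z(4))    # 16.0
print(decode_coarse_z(255))  # inf
```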
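`test_name_cache` feeds the avatar a NameValue blob in what appears to be a `Name Type Class SendTo Value` line format. A minimal sketch of parsing such a blob into a dict; `parse_namevalues` is illustrative and the field-name interpretation is an assumption, not the library's actual parser.

```python
from typing import Dict

def parse_namevalues(data: bytes) -> Dict[str, str]:
    # Each line: <name> <type> <class> <sendto> <value...>
    # e.g. b"FirstName STRING RW DS firstname"
    values = {}
    for line in data.decode("utf8").splitlines():
        name, _type, _class, _sendto, value = line.split(" ", 4)
        values[name] = value
    return values

nv = parse_namevalues(
    b"FirstName STRING RW DS firstname\n"
    b"LastName STRING RW DS Resident"
)
print(nv["FirstName"] + " " + nv["LastName"])  # firstname Resident
```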