77 Commits

Author SHA1 Message Date
Salad Dais
bc33313fc7 v0.9.0 2022-03-12 18:40:38 +00:00
Salad Dais
affc7fcf89 Clarify comment in proxy object manager 2022-03-05 11:03:28 +00:00
Salad Dais
b8f1593a2c Allow filtering on HTTP status code 2022-03-05 10:50:09 +00:00
Salad Dais
7879f4e118 Split up mitmproxy integration test a bit 2022-03-05 10:49:55 +00:00
Salad Dais
4ba611ae01 Only apply local mesh to selected links 2022-02-28 07:32:46 +00:00
Salad Dais
82ff6d9c64 Add more TeleportFlags 2022-02-28 07:32:22 +00:00
Salad Dais
f603ea6186 Better handle timeouts that have missing cap_data metadata 2021-12-18 20:43:10 +00:00
Salad Dais
fcf6a4568b Better handling for proxied HTTP requests that time out 2021-12-17 19:27:20 +00:00
Salad Dais
2ad6cc1b51 Better handle broken 'LLSD' responses 2021-12-17 00:18:51 +00:00
Salad Dais
025f7d31f2 Make sure .queued is cleared if message take()n twice 2021-12-15 20:17:54 +00:00
Salad Dais
9fdb281e4a Create example addon for simulating packet loss 2021-12-13 06:12:43 +00:00
Salad Dais
11e28bde2a Allow filtering message log on HTTP headers 2021-12-11 15:08:45 +00:00
Salad Dais
1faa6f977c Update docs on send() and send_reliable() 2021-12-10 13:41:20 +00:00
Salad Dais
6866e7397f Clean up cap registration API 2021-12-10 13:22:54 +00:00
Salad Dais
fa0b3a5340 Mark all Messages synthetic unless they came off the wire 2021-12-10 07:30:02 +00:00
Salad Dais
16c808bce8 Match viewer resend behaviour 2021-12-10 07:04:36 +00:00
Salad Dais
ec4b2d0770 Move last of the explicit direction params 2021-12-10 06:50:07 +00:00
Salad Dais
3b610fdfd1 Add awaitable send_reliable() 2021-12-09 05:30:35 +00:00
Salad Dais
8b93c5eefa Rename send_message() to send() 2021-12-09 05:30:12 +00:00
Salad Dais
f4bb9eae8f Fix __contains__ for JankStringyBytes 2021-12-09 03:48:29 +00:00
Salad Dais
ecb14197cf Make message log filter highlight every matched field
Previously only the first match was being highlighted.
2021-12-09 01:14:09 +00:00
Salad Dais
95fd58e25a Begin PySide6 cleanup 2021-12-09 00:02:48 +00:00
Salad Dais
afc333ab49 Improve highlighting of matched fields in message log 2021-12-08 23:50:16 +00:00
Salad Dais
eb6406bca4 Fix ACK collection logic for injected reliable messages 2021-12-08 22:29:29 +00:00
Salad Dais
d486aa130d Add support for specifying flags in message builder 2021-12-08 21:10:06 +00:00
Salad Dais
d66d5226a2 Initial implementation of reliable injected packets
See #17. Not yet tested for real.
2021-12-08 04:49:45 +00:00
Salad Dais
d86da70eeb v0.8.0 2021-12-07 07:16:25 +00:00
Salad Dais
aa0b4b63a9 Update cx_freeze script to handle PySide6 2021-12-07 07:16:25 +00:00
Salad Dais
5f479e46b4 Automatically offer to install the HTTPS certs on first run 2021-12-07 07:16:25 +00:00
Salad Dais
1e55d5a9d8 Continue handling HTTP flows if flow logging fails
If flow beautification for display throws then we don't want
to bypass other handling of the flow.

This fixes a login failure due to SL's login XML-RPC endpoint
returning a Content-Type of "application/llsd+xml\r\n" when it's
actually "application/xml".
2021-12-06 17:01:13 +00:00
Salad Dais
077a95b5e7 Migrate to PySide6 to support Python 3.10
Update Glymur too
2021-12-06 13:37:31 +00:00
Salad Dais
4f1399cf66 Add note about LinHippoAutoProxy 2021-12-06 12:26:16 +00:00
Salad Dais
9590b30e66 Add note about Python 3.10 support 2021-12-05 20:25:06 +00:00
Salad Dais
34f3ee4c3e Move mtime wrapper to helpers 2021-12-05 18:14:26 +00:00
Salad Dais
7d655543f5 Don't reserialize responses as pretty LLSD-XML
Certain LLSD parsers don't like the empty text nodes it adds around
the root element of the document. Yuck.
2021-12-05 18:12:53 +00:00
Salad Dais
5de3ed0d5e Add support for LLSD inventory representations 2021-12-03 05:59:58 +00:00
Salad Dais
74c3287cc0 Add base addon for creating proxy-only caps based on ASGI apps 2021-12-02 06:04:29 +00:00
Salad Dais
3a7f8072a0 Initial implementation of proxy-provided caps
Useful for mocking out a cap while developing the viewer-side
pieces of it.
2021-12-02 03:22:47 +00:00
dependabot[bot]
5fa91580eb Bump mitmproxy from 7.0.2 to 7.0.3 (#21)
Bumps [mitmproxy](https://github.com/mitmproxy/mitmproxy) from 7.0.2 to 7.0.3.
- [Release notes](https://github.com/mitmproxy/mitmproxy/releases)
- [Changelog](https://github.com/mitmproxy/mitmproxy/blob/main/CHANGELOG.md)
- [Commits](https://github.com/mitmproxy/mitmproxy/compare/v7.0.2...v7.0.3)

---
updated-dependencies:
- dependency-name: mitmproxy
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2021-11-30 05:30:06 -04:00
Salad Dais
d8fbb55438 Improve LLUDP integration tests 2021-11-30 09:25:31 +00:00
Salad Dais
99eb4fed74 Fix _reorient_coord to work correctly for normals again 2021-11-30 09:24:49 +00:00
Salad Dais
6b78b841df Fix range of mesh normals 2021-11-23 01:36:14 +00:00
Salad Dais
dae852db69 Fix filter dialog 2021-11-19 04:30:36 +00:00
Salad Dais
0c0de2bcbc v0.7.1 2021-09-04 07:27:20 +00:00
Salad Dais
9f2d2f2194 Pin recordclass version, use requirements.txt for windows build
recordclass had some breaking changes in 0.15
2021-09-04 07:12:45 +00:00
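A pin like the following in requirements.txt expresses the constraint described above (the exact version bound the commit used is assumed):

```
# requirements.txt fragment (illustrative; exact bound assumed)
recordclass<0.15
```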
Salad Dais
c6e0a400a9 v0.7.0 2021-08-10 01:16:20 +00:00
Salad Dais
d01122d542 Call correct method to raise new message log window 2021-08-10 01:11:21 +00:00
Salad Dais
690d6b51b8 Upgrade to mitmproxy 7.0.2
Our fix for `Flow.set_state()` has been upstreamed
2021-08-09 22:16:23 +00:00
Salad Dais
2437a8b14f Add a framework for simple local anim creation, tail animator 2021-08-05 21:08:18 +00:00
Salad Dais
afa601fffe Support session-specific viewer cache directories 2021-08-02 18:23:13 +00:00
Salad Dais
874feff471 Fix incorrect reference to mitmproxy class 2021-08-01 12:16:10 +00:00
Salad Dais
05c53bba9f Add CapsClient to BaseClientSession 2021-08-01 06:39:04 +00:00
Salad Dais
578f1d8c4e Add setting to disable all proxy object autorequests
Will help with #18 by not changing object request behaviour when
running through the proxy.
2021-08-01 06:37:33 +00:00
Salad Dais
7d8e18440a Add local anim mangler support with example
Analogous to local mesh mangler support.
2021-07-31 11:56:17 +00:00
Salad Dais
66e112dd52 Add basic message log import / export feature
Closes #20
2021-07-30 03:13:33 +00:00
Salad Dais
02ac022ab3 Add export formats for message log entries 2021-07-30 01:06:29 +00:00
Salad Dais
33ce74754e Fix mirror_target_agent check in http hooks 2021-07-30 01:06:29 +00:00
Salad Dais
74dd6b977c Add extended to_dict() format for Message class
This will allow proper import / export of message logs.
2021-07-29 10:26:42 +00:00
Salad Dais
387652731a Add Message Mirror example addon 2021-07-29 09:43:20 +00:00
Salad Dais
e4601fd879 Support multiple Message Log windows
Closes #19
2021-07-29 01:00:57 +00:00
Salad Dais
6eb25f96d9 Support logging to a hierarchy of message loggers
Necessary to eventually support multiple message log windows
2021-07-27 02:35:03 +00:00
Salad Dais
22b9eeb5cb Better handling of optional command parameters 2021-07-22 23:59:55 +00:00
Salad Dais
0dbedcb2f5 Improve coverage 2021-07-22 23:58:17 +00:00
Salad Dais
7d9712c16e Fix message dropping and queueing corner cases 2021-07-22 05:08:47 +00:00
Salad Dais
82663c0fc2 Add parse_bool helper function for command parameters 2021-07-21 06:39:29 +00:00
Salad Dais
9fb4884470 Extend TlsLayer.tls_start_server instead of monkeypatching OpenSSL funcs
We have a more elegant way of unsetting `X509_CHECK_FLAG_NEVER_CHECK_SUBJECT`
now that mitmproxy 7.0 is out.

See https://github.com/mitmproxy/mitmproxy/pull/4688
2021-07-19 20:17:31 +00:00
Salad Dais
cf69c42f67 Rework HTTP proxying code to work with mitmproxy 7.0.0 2021-07-18 07:02:45 +00:00
Salad Dais
be658b9026 v0.6.3
Cutting a release before working on mitmproxy upgrade
2021-07-18 06:57:40 +00:00
Salad Dais
c505941595 Improve test for TE serialization 2021-07-18 06:33:55 +00:00
Salad Dais
96f471d6b7 Add initial support for Message-specific Block subclasses 2021-07-07 12:49:32 +00:00
Salad Dais
4238016767 Change readme wording
:)
2021-07-07 12:49:32 +00:00
Salad Dais
a35a67718d Add default_value to MessageTemplateVariable 2021-07-01 21:25:51 +00:00
Salad Dais
c2981b107a Remove CodeQL scanning
Maybe later, doesn't seem to do anything useful out of the box.
2021-06-28 06:00:42 -03:00
Salad Dais
851375499a Add CodeQL scanning 2021-06-28 05:44:02 -03:00
Salad Dais
d064ecd466 Don't raise when reading a new avatar_name_cache.xml 2021-06-25 18:45:42 +00:00
Salad Dais
fda37656c9 Reduce boilerplate for mesh mangling addons
Makes it less annoying to compose separate addons with different manglers
2021-06-24 05:29:23 +00:00
Salad Dais
49a9c6f28f Workaround for failed teleports due to EventQueue timeouts
Closes #16
2021-06-23 16:43:09 +00:00
81 changed files with 2234 additions and 733 deletions


@@ -8,3 +8,5 @@ exclude_lines =
if typing.TYPE_CHECKING:
def __repr__
raise AssertionError
+assert False
+pass


@@ -29,6 +29,7 @@ jobs:
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install -r requirements.txt
pip install -e .
pip install cx_freeze


@@ -8,7 +8,7 @@ jobs:
runs-on: ubuntu-latest
strategy:
matrix:
-python-version: [3.8, 3.9]
+python-version: ["3.8", "3.10"]
steps:
- uses: actions/checkout@v2


@@ -2,7 +2,7 @@
![Python Test Status](https://github.com/SaladDais/Hippolyzer/workflows/Run%20Python%20Tests/badge.svg) [![codecov](https://codecov.io/gh/SaladDais/Hippolyzer/branch/master/graph/badge.svg?token=HCTFA4RAXX)](https://codecov.io/gh/SaladDais/Hippolyzer)
-[Hippolyzer](http://wiki.secondlife.com/wiki/Hippo) is a fork of Linden Lab's abandoned
+[Hippolyzer](http://wiki.secondlife.com/wiki/Hippo) is a revival of Linden Lab's
[PyOGP library](http://wiki.secondlife.com/wiki/PyOGP)
targeting modern Python 3, with a focus on debugging issues in Second Life-compatible
servers and clients. There is a secondary focus on mocking up new features without requiring a
@@ -375,6 +375,12 @@ To have your client's traffic proxied through Hippolyzer the general flow is:
* The proxy needs to use content sniffing to figure out which requests are login requests,
so make sure your request would pass `MITMProxyEventManager._is_login_request()`
+#### Do I have to do all that?
+You might be able to automate some of it on Linux by using
+[LinHippoAutoProxy](https://github.com/SaladDais/LinHippoAutoProxy). If you're on Windows or MacOS the
+above is your only option.
### Should I use this library to make an SL client in Python?
No. If you just want to write a client in Python, you should instead look at using


@@ -0,0 +1,32 @@
"""
Example anim mangler addon, to be used with local anim addon.
You can edit this live to apply various transforms to local anims,
as well as any uploaded anims. Any changes will be reflected in currently
playing local anims.
This example modifies any position keys of an animation's mHipRight joint.
"""
from hippolyzer.lib.base.llanim import Animation
from hippolyzer.lib.proxy.addons import AddonManager
import local_anim
AddonManager.hot_reload(local_anim, require_addons_loaded=True)
def offset_right_hip(anim: Animation):
hip_joint = anim.joints.get("mHipRight")
if hip_joint:
for pos_frame in hip_joint.pos_keyframes:
pos_frame.pos.Z *= 2.5
pos_frame.pos.X *= 5.0
return anim
class ExampleAnimManglerAddon(local_anim.BaseAnimManglerAddon):
ANIM_MANGLERS = [
offset_right_hip,
]
addons = [ExampleAnimManglerAddon()]


@@ -11,7 +11,7 @@ import enum
import os.path
from typing import *
-from PySide2 import QtCore, QtGui, QtWidgets
+from PySide6 import QtCore, QtGui, QtWidgets
from hippolyzer.lib.base.datatypes import Vector3
from hippolyzer.lib.base.message.message import Block, Message
@@ -80,7 +80,7 @@ class BlueishObjectListGUIAddon(BaseAddon):
raise
def _highlight_object(self, session: Session, obj: Object):
-session.main_region.circuit.send_message(Message(
+session.main_region.circuit.send(Message(
"ForceObjectSelect",
Block("Header", ResetList=False),
Block("Data", LocalID=obj.LocalID),
@@ -88,7 +88,7 @@ class BlueishObjectListGUIAddon(BaseAddon):
))
def _teleport_to_object(self, session: Session, obj: Object):
-session.main_region.circuit.send_message(Message(
+session.main_region.circuit.send(Message(
"TeleportLocationRequest",
Block("AgentData", AgentID=session.agent_id, SessionID=session.id),
Block(


@@ -105,7 +105,7 @@ class HorrorAnimatorAddon(BaseAddon):
# send the response back immediately
block = STATIC_VFS[orig_anim_id]
anim_data = STATIC_VFS.read_block(block)
-flow.response = mitmproxy.http.HTTPResponse.make(
+flow.response = mitmproxy.http.Response.make(
200,
_mutate_anim_bytes(anim_data),
{


@@ -5,42 +5,56 @@ Local animations
assuming you loaded something.anim
/524 start_local_anim something
/524 stop_local_anim something
+/524 save_local_anim something
If you want to trigger the animation from an object to simulate llStartAnimation():
llOwnerSay("@start_local_anim:something=force");
Also includes a concept of "anim manglers" similar to the "mesh manglers" of the
local mesh addon. This is useful if you want to test making procedural changes
to animations before uploading them. The manglers will be applied to any uploaded
animations as well.
May also be useful if you need to make ad-hoc changes to a bunch of animations on
bulk upload, like changing priority or removing a joint.
"""
import asyncio
import os
import pathlib
from abc import abstractmethod
from typing import *
from hippolyzer.lib.base import serialization as se
from hippolyzer.lib.base.datatypes import UUID
+from hippolyzer.lib.base.helpers import get_mtime
from hippolyzer.lib.base.llanim import Animation
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.proxy import addon_ctx
from hippolyzer.lib.proxy.addons import AddonManager
-from hippolyzer.lib.proxy.addon_utils import BaseAddon, SessionProperty
+from hippolyzer.lib.proxy.addon_utils import BaseAddon, SessionProperty, GlobalProperty, show_message
from hippolyzer.lib.proxy.commands import handle_command
from hippolyzer.lib.proxy.http_asset_repo import HTTPAssetRepo
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.region import ProxiedRegion
-from hippolyzer.lib.proxy.sessions import Session
-def _get_mtime(path: str):
-try:
-return os.stat(path).st_mtime
-except:
-return None
+from hippolyzer.lib.proxy.sessions import Session, SessionManager
class LocalAnimAddon(BaseAddon):
# name -> path, only for anims actually from files
local_anim_paths: Dict[str, str] = SessionProperty(dict)
# name -> anim bytes
local_anim_bytes: Dict[str, bytes] = SessionProperty(dict)
# name -> mtime or None. Only for anims from files.
local_anim_mtimes: Dict[str, Optional[float]] = SessionProperty(dict)
# name -> current asset ID (changes each play)
local_anim_playing_ids: Dict[str, UUID] = SessionProperty(dict)
anim_manglers: List[Callable[[Animation], Animation]] = GlobalProperty(list)
def handle_init(self, session_manager: SessionManager):
self.remangle_local_anims(session_manager)
def handle_session_init(self, session: Session):
# Reload anims and reload any manglers if we have any
self._schedule_task(self._try_reload_anims(session))
@handle_command()
@@ -66,11 +80,23 @@ class LocalAnimAddon(BaseAddon):
"""Stop a named local animation"""
self.apply_local_anim(session, region, anim_name, new_data=None)
+@handle_command(anim_name=str)
+async def save_local_anim(self, _session: Session, _region: ProxiedRegion, anim_name: str):
+"""Save a named local anim to disk"""
+anim_bytes = self.local_anim_bytes.get(anim_name)
+if not anim_bytes:
+return
+filename = await AddonManager.UI.save_file(filter_str="SL Anim (*.anim)", default_suffix="anim")
+if not filename:
+return
+with open(filename, "wb") as f:
+f.write(anim_bytes)
async def _try_reload_anims(self, session: Session):
while True:
region = session.main_region
if not region:
-await asyncio.sleep(2.0)
+await asyncio.sleep(1.0)
continue
# Loop over local anims we loaded
@@ -80,7 +106,7 @@ class LocalAnimAddon(BaseAddon):
continue
# is playing right now, check if there's a newer version
self.apply_local_anim_from_file(session, region, anim_name, only_if_changed=True)
-await asyncio.sleep(2.0)
+await asyncio.sleep(1.0)
def handle_rlv_command(self, session: Session, region: ProxiedRegion, source: UUID,
cmd: str, options: List[str], param: str):
@@ -127,11 +153,13 @@ class LocalAnimAddon(BaseAddon):
StartAnim=True,
))
cls.local_anim_playing_ids[anim_name] = next_id
cls.local_anim_bytes[anim_name] = new_data
else:
# No data means just stop the anim
cls.local_anim_playing_ids.pop(anim_name, None)
cls.local_anim_bytes.pop(anim_name, None)
-region.circuit.send_message(new_msg)
+region.circuit.send(new_msg)
print(f"Changing {anim_name} to {next_id}")
@classmethod
@@ -141,7 +169,7 @@ class LocalAnimAddon(BaseAddon):
anim_data = None
if anim_path:
old_mtime = cls.local_anim_mtimes.get(anim_name)
-mtime = _get_mtime(anim_path)
+mtime = get_mtime(anim_path)
if only_if_changed and old_mtime == mtime:
return
@@ -156,9 +184,94 @@ class LocalAnimAddon(BaseAddon):
with open(anim_path, "rb") as f:
anim_data = f.read()
anim_data = cls._mangle_anim(anim_data)
else:
print(f"Unknown anim {anim_name!r}")
cls.apply_local_anim(session, region, anim_name, new_data=anim_data)
@classmethod
def _mangle_anim(cls, anim_data: bytes) -> bytes:
if not cls.anim_manglers:
return anim_data
reader = se.BufferReader("<", anim_data)
spec = se.Dataclass(Animation)
anim = reader.read(spec)
for mangler in cls.anim_manglers:
anim = mangler(anim)
writer = se.BufferWriter("<")
writer.write(spec, anim)
return writer.copy_buffer()
@classmethod
def remangle_local_anims(cls, session_manager: SessionManager):
# Anim manglers are global, so we need to re-mangle anims for all sessions
for session in session_manager.sessions:
# Push the context of this session onto the stack so we can access
# session-scoped properties
with addon_ctx.push(new_session=session, new_region=session.main_region):
cls.local_anim_mtimes.clear()
def handle_http_request(self, session_manager: SessionManager, flow: HippoHTTPFlow):
if flow.name == "NewFileAgentInventoryUploader":
# Don't bother looking at this if we have no manglers
if not self.anim_manglers:
return
# This is kind of a crappy match, but these magic bytes shouldn't match any
# upload type SL allows other than animations.
if not flow.request.content or not flow.request.content.startswith(b"\x01\x00\x00\x00"):
return
# Replace the uploaded anim with the mangled version
flow.request.content = self._mangle_anim(flow.request.content)
show_message("Mangled upload request")
class BaseAnimManglerAddon(BaseAddon):
"""Base class for addons that mangle uploaded or file-based local animations"""
ANIM_MANGLERS: List[Callable[[Animation], Animation]]
def handle_init(self, session_manager: SessionManager):
# Add our manglers into the list
LocalAnimAddon.anim_manglers.extend(self.ANIM_MANGLERS)
LocalAnimAddon.remangle_local_anims(session_manager)
def handle_unload(self, session_manager: SessionManager):
# Clean up our manglers before we go away
mangler_list = LocalAnimAddon.anim_manglers
for mangler in self.ANIM_MANGLERS:
if mangler in mangler_list:
mangler_list.remove(mangler)
LocalAnimAddon.remangle_local_anims(session_manager)
class BaseAnimHelperAddon(BaseAddon):
"""
Base class for local creation of procedural animations
Animation generated by build_anim() gets applied to all active sessions
"""
ANIM_NAME: str
def handle_session_init(self, session: Session):
self._reapply_anim(session, session.main_region)
def handle_session_closed(self, session: Session):
LocalAnimAddon.apply_local_anim(session, session.main_region, self.ANIM_NAME, None)
def handle_unload(self, session_manager: SessionManager):
for session in session_manager.sessions:
# TODO: Nasty. Since we need to access session-local attrs we need to set the
# context even though we also explicitly pass session and region.
# Need to rethink the LocalAnimAddon API.
with addon_ctx.push(session, session.main_region):
LocalAnimAddon.apply_local_anim(session, session.main_region, self.ANIM_NAME, None)
@abstractmethod
def build_anim(self) -> Animation:
pass
def _reapply_anim(self, session: Session, region: ProxiedRegion):
LocalAnimAddon.apply_local_anim(session, region, self.ANIM_NAME, self.build_anim().to_bytes())
addons = [LocalAnimAddon()]


@@ -81,17 +81,16 @@ class MeshUploadInterceptingAddon(BaseAddon):
@handle_command()
async def set_local_mesh_target(self, session: Session, region: ProxiedRegion):
-"""Set the currently selected object as the target for local mesh"""
-parent_object = region.objects.lookup_localid(session.selected.object_local)
-if not parent_object:
+"""Set the currently selected objects as the target for local mesh"""
+selected_links = [region.objects.lookup_localid(l_id) for l_id in session.selected.object_locals]
+selected_links = [o for o in selected_links if o is not None]
+if not selected_links:
show_message("Nothing selected")
return
-linkset_objects = [parent_object] + parent_object.Children
old_locals = self.local_mesh_target_locals
self.local_mesh_target_locals = [
x.LocalID
-for x in linkset_objects
+for x in selected_links
if ExtraParamType.MESH in x.ExtraParams
]
@@ -201,7 +200,7 @@ class MeshUploadInterceptingAddon(BaseAddon):
self.local_mesh_mapping = {x["mesh_name"]: x["mesh"] for x in instances}
# Fake a response, we don't want to actually send off the request.
-flow.response = mitmproxy.http.HTTPResponse.make(
+flow.response = mitmproxy.http.Response.make(
200,
b"",
{
@@ -280,4 +279,23 @@ class MeshUploadInterceptingAddon(BaseAddon):
cls._replace_local_mesh(session.main_region, asset_repo, mesh_list)
class BaseMeshManglerAddon(BaseAddon):
"""Base class for addons that mangle uploaded or local mesh"""
MESH_MANGLERS: List[Callable[[MeshAsset], MeshAsset]]
def handle_init(self, session_manager: SessionManager):
# Add our manglers into the list
MeshUploadInterceptingAddon.mesh_manglers.extend(self.MESH_MANGLERS)
# Tell the local mesh plugin that the mangler list changed, and to re-apply
MeshUploadInterceptingAddon.remangle_local_mesh(session_manager)
def handle_unload(self, session_manager: SessionManager):
# Clean up our manglers before we go away
mangler_list = MeshUploadInterceptingAddon.mesh_manglers
for mangler in self.MESH_MANGLERS:
if mangler in mangler_list:
mangler_list.remove(mangler)
MeshUploadInterceptingAddon.remangle_local_mesh(session_manager)
addons = [MeshUploadInterceptingAddon()]


@@ -11,25 +11,28 @@ to add to give a mesh an arbitrary center of rotation / scaling.
from hippolyzer.lib.base.mesh import MeshAsset
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.addon_utils import BaseAddon
-from hippolyzer.lib.proxy.sessions import SessionManager
import local_mesh
AddonManager.hot_reload(local_mesh, require_addons_loaded=True)
-def _reorient_coord(coord, orientation):
+def _reorient_coord(coord, orientation, normals=False):
coords = []
for axis in orientation:
axis_idx = abs(axis) - 1
-coords.append(coord[axis_idx] if axis >= 0 else 1.0 - coord[axis_idx])
+if normals:
+# Normals have a static domain from -1.0 to 1.0, just negate.
+new_coord = coord[axis_idx] if axis >= 0 else -coord[axis_idx]
+else:
+new_coord = coord[axis_idx] if axis >= 0 else 1.0 - coord[axis_idx]
+coords.append(new_coord)
if coord.__class__ in (list, tuple):
return coord.__class__(coords)
return coord.__class__(*coords)
-def _reorient_coord_list(coord_list, orientation):
-return [_reorient_coord(x, orientation) for x in coord_list]
+def _reorient_coord_list(coord_list, orientation, normals=False):
+return [_reorient_coord(x, orientation, normals) for x in coord_list]
def reorient_mesh(orientation):
@@ -37,37 +40,23 @@ def reorient_mesh(orientation):
# X=1, Y=2, Z=3
def _reorienter(mesh: MeshAsset):
for material in mesh.iter_lod_materials():
if "Position" not in material:
# Must be a NoGeometry LOD
continue
# We don't need to use positions_(to/from)_domain here since we're just naively
# flipping the axes around.
material["Position"] = _reorient_coord_list(material["Position"], orientation)
# Are you even supposed to do this to the normals?
-material["Normal"] = _reorient_coord_list(material["Normal"], orientation)
+material["Normal"] = _reorient_coord_list(material["Normal"], orientation, normals=True)
return mesh
return _reorienter
-OUR_MANGLERS = [
-# Negate the X and Y axes on any mesh we upload or create temp
-reorient_mesh((-1, -2, 3)),
-]
+class ExampleMeshManglerAddon(local_mesh.BaseMeshManglerAddon):
+MESH_MANGLERS = [
+# Negate the X and Y axes on any mesh we upload or create temp
+reorient_mesh((-1, -2, 3)),
+]
-class MeshManglerExampleAddon(BaseAddon):
-def handle_init(self, session_manager: SessionManager):
-# Add our manglers into the list
-local_mesh_addon = local_mesh.MeshUploadInterceptingAddon
-local_mesh_addon.mesh_manglers.extend(OUR_MANGLERS)
-# Tell the local mesh plugin that the mangler list changed, and to re-apply
-local_mesh_addon.remangle_local_mesh(session_manager)
-def handle_unload(self, session_manager: SessionManager):
-# Clean up our manglers before we go away
-local_mesh_addon = local_mesh.MeshUploadInterceptingAddon
-mangler_list = local_mesh_addon.mesh_manglers
-for mangler in OUR_MANGLERS:
-if mangler in mangler_list:
-mangler_list.remove(mangler)
-local_mesh_addon.remangle_local_mesh(session_manager)
-addons = [MeshManglerExampleAddon()]
+addons = [ExampleMeshManglerAddon()]
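The axis-reorientation convention used by this diff (X=1, Y=2, Z=3; a negative entry flips that axis) can be sketched standalone. This restates the diff's helper simplified to plain tuples/lists (the real helper also handles vector classes), showing why normals negate while positions reflect within their 0..1 quantized domain:

```python
# Standalone sketch of _reorient_coord from the diff above, simplified to
# plain sequences. orientation entries pick source axes (X=1, Y=2, Z=3);
# a negative entry flips the axis.
def reorient_coord(coord, orientation, normals=False):
    coords = []
    for axis in orientation:
        axis_idx = abs(axis) - 1
        if normals:
            # Normals have a static domain from -1.0 to 1.0, just negate.
            new_coord = coord[axis_idx] if axis >= 0 else -coord[axis_idx]
        else:
            # Positions are quantized into a 0..1 domain, so reflect instead.
            new_coord = coord[axis_idx] if axis >= 0 else 1.0 - coord[axis_idx]
        coords.append(new_coord)
    return type(coord)(coords)

# Negate X and Y, keep Z, as in the example mangler's (-1, -2, 3):
print(reorient_coord((0.25, 0.75, 0.5), (-1, -2, 3)))              # (0.75, 0.25, 0.5)
print(reorient_coord((1.0, 0.0, 0.0), (-1, -2, 3), normals=True))  # (-1.0, -0.0, 0.0)
```

Reflecting a normal with `1.0 - x` instead of `-x` was the bug fixed by "Fix range of mesh normals" above: it would push unit normals out of the -1..1 domain.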


@@ -0,0 +1,244 @@
"""
Message Mirror
Re-routes messages through the circuit of another agent running through this proxy,
rewriting the messages to use the credentials tied to that circuit.
Useful if you need to quickly QA authorization checks on a message handler or script.
Or if you want to chat as two people at once. Whatever.
Also shows some advanced ways of managing / rerouting Messages and HTTP flows.
Fiddle with the values of `SEND_NORMALLY` and `MIRROR` to change how and which
messages get moved to other circuits.
Usage: /524 mirror_to <mirror_agent_uuid>
To Disable: /524 mirror_to
"""
import weakref
from typing import Optional
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.message.template_dict import DEFAULT_TEMPLATE_DICT
from hippolyzer.lib.base.network.transport import Direction
from hippolyzer.lib.proxy.addon_utils import BaseAddon, SessionProperty, show_message
from hippolyzer.lib.proxy.commands import handle_command, Parameter, parse_bool
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.caps import CapData, CapType
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session, SessionManager
# Things that make no sense to mirror, or will make everything explode if mirrored.
SEND_NORMALLY = {
'StartPingCheck', 'CompletePingCheck', 'PacketAck', 'SimulatorViewerTimeMessage', 'SimStats',
'SoundTrigger', 'EventQueueGet', 'GetMesh', 'GetMesh2', 'ParcelDwellRequest', 'ViewerEffect', 'ViewerStats',
'ParcelAccessListRequest', 'FirestormBridge', 'AvatarRenderInfo', 'ParcelPropertiesRequest', 'GetObjectCost',
'RequestMultipleObjects', 'GetObjectPhysicsData', 'GetExperienceInfo', 'RequestTaskInventory', 'AgentRequestSit',
'MuteListRequest', 'UpdateMuteListEntry', 'RemoveMuteListEntry', 'RequestImage',
'AgentThrottle', 'UseCircuitCode', 'AgentWearablesRequest', 'AvatarPickerRequest', 'CloseCircuit',
'CompleteAgentMovement', 'RegionHandshakeReply', 'LogoutRequest', 'ParcelPropertiesRequest',
'ParcelPropertiesRequestByID', 'MapBlockRequest', 'MapLayerRequest', 'MapItemRequest', 'MapNameRequest',
'ParcelAccessListRequest', 'AvatarPropertiesRequest', 'DirFindQuery',
'SetAlwaysRun', 'GetDisplayNames', 'ViewerMetrics', 'AgentResume', 'AgentPause',
'ViewerAsset', 'GetTexture', 'UUIDNameRequest', 'AgentUpdate', 'AgentAnimation',
# Would just be confusing for everyone
'ImprovedInstantMessage',
# Xfer system isn't authed to begin with, and duping Xfers can lead to premature file deletion. Skip.
'RequestXfer', 'ConfirmXferPacket', 'AbortXfer', 'SendXferPacket',
}
# Messages that _must_ be sent normally, but are worth mirroring onto the target session to see how
# they would respond
MIRROR = {
'RequestObjectPropertiesFamily', 'ObjectSelect', 'RequestObjectProperties', 'TransferRequest',
'RequestMultipleObjects', 'RequestTaskInventory', 'FetchInventory2', 'ScriptDialogReply',
'ObjectDeselect', 'GenericMessage', 'ChatFromViewer'
}
for msg_name in DEFAULT_TEMPLATE_DICT.message_templates.keys():
# There are a lot of these.
if msg_name.startswith("Group") and msg_name.endswith("Request"):
MIRROR.add(msg_name)
class MessageMirrorAddon(BaseAddon):
mirror_target_agent: Optional[UUID] = SessionProperty(None)
mirror_use_target_session: bool = SessionProperty(True)
mirror_use_target_agent: bool = SessionProperty(True)
@handle_command(target_agent=Parameter(UUID, optional=True))
async def mirror_to(self, session: Session, _region, target_agent: Optional[UUID] = None):
"""
Send this session's outbound messages over another proxied agent's circuit
"""
if target_agent:
if target_agent == session.agent_id:
show_message("Can't mirror our own session")
target_agent = None
elif not any(s.agent_id == target_agent for s in session.session_manager.sessions):
show_message(f"No active proxied session for agent {target_agent}")
target_agent = None
self.mirror_target_agent = target_agent
if target_agent:
show_message(f"Mirroring to {target_agent}")
else:
show_message("Message mirroring disabled")
@handle_command(enabled=parse_bool)
async def set_mirror_use_target_session(self, _session, _region, enabled):
"""Replace the original session ID with the target session's ID when mirroring"""
self.mirror_use_target_session = enabled
@handle_command(enabled=parse_bool)
async def set_mirror_use_target_agent(self, _session, _region, enabled):
"""Replace the original agent ID with the target agent's ID when mirroring"""
self.mirror_use_target_agent = enabled
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
if message.direction != Direction.OUT:
return
if not self.mirror_target_agent:
return
if message.name in SEND_NORMALLY:
return
target_session = None
for poss_session in session.session_manager.sessions:
if poss_session.agent_id == self.mirror_target_agent:
target_session = poss_session
if not target_session:
print("Couldn't find target session?")
return
target_region = None
for poss_region in target_session.regions:
if poss_region.circuit_addr == region.circuit_addr:
target_region = poss_region
if not target_region:
print("Couldn't find equivalent target region?")
return
# Send the message normally first if we're mirroring
if message.name in MIRROR:
region.circuit.send(message)
# We're going to send the message on a new circuit, we need to take
# it so we get a new packet ID and clean ACKs
message = message.take()
self._lludp_fixups(target_session, message)
target_region.circuit.send(message)
return True
def _lludp_fixups(self, target_session: Session, message: Message):
if "AgentData" in message:
agent_block = message["AgentData"][0]
if "AgentID" in agent_block and self.mirror_use_target_agent:
agent_block["AgentID"] = target_session.agent_id
if "SessionID" in agent_block and self.mirror_use_target_session:
agent_block["SessionID"] = target_session.id
if message.name == "TransferRequest":
transfer_block = message["TransferInfo"][0]
# This is a duplicated message so we need to give it a new ID
transfer_block["TransferID"] = UUID.random()
params = transfer_block.deserialize_var("Params")
# This kind of Transfer might not even use agent credentials
if self.mirror_use_target_agent and hasattr(params, 'AgentID'):
params.AgentID = target_session.agent_id
if self.mirror_use_target_session and hasattr(params, 'SessionID'):
params.SessionID = target_session.id
transfer_block.serialize_var("Params", params)
def handle_http_request(self, session_manager: SessionManager, flow: HippoHTTPFlow):
# Already mirrored, ignore.
if flow.is_replay:
return
cap_data = flow.cap_data
if not cap_data:
return
if cap_data.cap_name in SEND_NORMALLY:
return
if cap_data.asset_server_cap:
return
# This likely doesn't have an exact equivalent in the target session; it's a temporary
# cap like an uploader URL or a stats URL.
if cap_data.type == CapType.TEMPORARY:
return
session: Optional[Session] = cap_data.session and cap_data.session()
if not session:
return
region: Optional[ProxiedRegion] = cap_data.region and cap_data.region()
if not region:
return
# Session-scoped, so we need to know if we have a session before checking
if not self.mirror_target_agent:
return
target_session: Optional[Session] = None
for poss_session in session.session_manager.sessions:
if poss_session.agent_id == self.mirror_target_agent:
target_session = poss_session
if not target_session:
return
caps_source = target_session
target_region: Optional[ProxiedRegion] = None
if region:
target_region = None
for poss_region in target_session.regions:
if poss_region.circuit_addr == region.circuit_addr:
target_region = poss_region
if not target_region:
print("No region in cap?")
return
caps_source = target_region
new_base_url = caps_source.cap_urls.get(cap_data.cap_name)
if not new_base_url:
print("No equiv cap?")
return
if cap_data.cap_name in MIRROR:
flow = flow.copy()
# Have the cap data reflect the new URL we're pointing at
flow.metadata["cap_data"] = CapData(
cap_name=cap_data.cap_name,
region=weakref.ref(target_region) if target_region else None,
session=weakref.ref(target_session),
base_url=new_base_url,
)
# Tack any params onto the new base URL for the cap
new_url = new_base_url + flow.request.url[len(cap_data.base_url):]
flow.request.url = new_url
if cap_data.cap_name in MIRROR:
self._replay_flow(flow, session.session_manager)
def _replay_flow(self, flow: HippoHTTPFlow, session_manager: SessionManager):
# Work around a mitmproxy bug: changing the URL updates the Host header, which may
# cause it to drop the port even when it shouldn't. Fix up the Host header.
if flow.request.port not in (80, 443) and ":" not in flow.request.host_header:
flow.request.host_header = f"{flow.request.host}:{flow.request.port}"
# Should get repopulated when it goes back through the MITM addon
flow.metadata.pop("cap_data_ser", None)
flow.metadata.pop("cap_data", None)
proxy_queue = session_manager.flow_context.to_proxy_queue
proxy_queue.put_nowait(("replay", None, flow.get_state()))
addons = [MessageMirrorAddon()]
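The addon above resolves its mirror target by scanning `session_manager.sessions` for a matching `agent_id`, then matching regions by `circuit_addr`. That lookup logic can be sketched standalone; `FakeSession` and `FakeRegion` below are illustrative dataclass stand-ins, not Hippolyzer's real `Session`/`ProxiedRegion` types:

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

# Stand-ins mirroring only the attributes the addon touches
@dataclass
class FakeRegion:
    circuit_addr: Tuple[str, int]

@dataclass
class FakeSession:
    agent_id: str
    regions: List[FakeRegion] = field(default_factory=list)

def find_mirror_target(
    sessions: List[FakeSession], target_agent: str, circuit_addr: Tuple[str, int]
) -> Tuple[Optional[FakeSession], Optional[FakeRegion]]:
    """Find the session for target_agent and its region on the same circuit addr."""
    target_session = next((s for s in sessions if s.agent_id == target_agent), None)
    if not target_session:
        return None, None
    target_region = next(
        (r for r in target_session.regions if r.circuit_addr == circuit_addr), None
    )
    return target_session, target_region
```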


@@ -0,0 +1,49 @@
"""
Example of proxy-provided caps
Useful for mocking out a cap that isn't actually implemented by the server
while developing the viewer-side pieces of it.
Implements a cap that accepts an `obj_id` UUID query parameter and returns
the name of the object.
"""
import asyncio
import asgiref.wsgi
from flask import Flask, Response, request
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.proxy import addon_ctx
from hippolyzer.lib.proxy.webapp_cap_addon import WebAppCapAddon
app = Flask("GetObjectNameCapApp")
@app.route('/')
async def get_object_name():
# Should always have the current region, since the cap handler is bound to one.
# Just need to pull it from the `addon_ctx` module's global.
obj_mgr = addon_ctx.region.get().objects
obj_id = UUID(request.args['obj_id'])
obj = obj_mgr.lookup_fullid(obj_id)
if not obj:
return Response(f"Couldn't find {obj_id!r}", status=404, mimetype="text/plain")
try:
await asyncio.wait_for(obj_mgr.request_object_properties(obj)[0], 1.0)
except asyncio.TimeoutError:
return Response(f"Timed out requesting {obj_id!r}'s properties", status=500, mimetype="text/plain")
return Response(obj.Name, mimetype="text/plain")
class MockProxyCapExampleAddon(WebAppCapAddon):
# A cap URL with this name will be tied to each region when
# the sim is first connected to. The URL will be returned to the
# viewer in the Seed if the viewer requests it by name.
CAP_NAME = "GetObjectNameExample"
# Any ASGI app should be fine.
APP = asgiref.wsgi.WsgiToAsgi(app)
addons = [MockProxyCapExampleAddon()]
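Once the proxy serves this cap, the viewer side is just a GET with the `obj_id` query parameter. A sketch of how a test client might build that request URL (the host and cap token below are made up for illustration):

```python
import urllib.parse

# Hypothetical cap URL as it might be handed out via the Seed
cap_url = "https://simhost.example.com/cap/00000000-0000-0000-0000-000000000000"
obj_id = "a2e76fcd-9360-4f6d-a924-000000000003"

# The viewer (or a test client) GETs the cap with obj_id as a query param
request_url = cap_url + "?" + urllib.parse.urlencode({"obj_id": obj_id})
```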


@@ -37,7 +37,7 @@ class PaydayAddon(BaseAddon):
chat_type=ChatType.SHOUT,
)
# Do the traditional money dance.
session.main_region.circuit.send_message(Message(
session.main_region.circuit.send(Message(
"AgentAnimation",
Block("AgentData", AgentID=session.agent_id, SessionID=session.id),
Block("AnimationList", AnimID=UUID("928cae18-e31d-76fd-9cc9-2f55160ff818"), StartAnim=True),


@@ -9,7 +9,7 @@ import asyncio
import struct
from typing import *
from PySide2.QtGui import QImage
from PySide6.QtGui import QImage
from hippolyzer.lib.base.datatypes import UUID, Vector3, Quaternion
from hippolyzer.lib.base.helpers import to_chunks
@@ -42,7 +42,7 @@ class PixelArtistAddon(BaseAddon):
return
img = QImage()
with open(filename, "rb") as f:
img.loadFromData(f.read(), aformat=None)
img.loadFromData(f.read(), format=None)
img = img.convertToFormat(QImage.Format_RGBA8888)
height = img.height()
width = img.width()
@@ -80,7 +80,7 @@ class PixelArtistAddon(BaseAddon):
# TODO: We don't track the land group or user's active group, so
# "anyone can build" must be on for rezzing to work.
group_id = UUID()
region.circuit.send_message(Message(
region.circuit.send(Message(
'ObjectAdd',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id, GroupID=group_id),
Block(
@@ -129,7 +129,7 @@ class PixelArtistAddon(BaseAddon):
# Set the prim color to the color from the pixel
te.Color[None] = pixel_color
# Set the prim texture and color
region.circuit.send_message(Message(
region.circuit.send(Message(
'ObjectImage',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
Block('ObjectData', ObjectLocalID=obj.LocalID, MediaURL=b'', TextureEntry_=te),
@@ -149,7 +149,7 @@ class PixelArtistAddon(BaseAddon):
# Move the "pixels" to their correct position in chunks
for chunk in to_chunks(positioning_blocks, 25):
region.circuit.send_message(Message(
region.circuit.send(Message(
'MultipleObjectUpdate',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
*chunk,


@@ -116,7 +116,7 @@ class RecapitatorAddon(BaseAddon):
except:
logging.exception("Exception while recapitating")
# Tell the viewer about the status of its original upload
region.circuit.send_message(Message(
region.circuit.send(Message(
"AssetUploadComplete",
Block("AssetBlock", UUID=asset_id, Type=asset_block["Type"], Success=success),
direction=Direction.IN,


@@ -0,0 +1,22 @@
import random
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
class SimulatePacketLossAddon(BaseAddon):
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
# Messing with these may kill your circuit
if message.name in {"PacketAck", "StartPingCheck", "CompletePingCheck", "UseCircuitCode",
"CompleteAgentMovement", "AgentMovementComplete"}:
return
# Simulate 30% packet loss
if random.random() > 0.7:
# Do nothing, drop this packet on the floor
return True
return
addons = [SimulatePacketLossAddon()]
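The check `random.random() > 0.7` drops roughly 30% of eligible packets, since `random()` is uniform on [0, 1). A quick standalone sketch confirming that rate empirically (the `should_drop()` helper is illustrative, not part of the addon):

```python
import random

def should_drop(rng: random.Random, loss_rate: float = 0.3) -> bool:
    # Mirrors the addon's check: random() > 0.7 drops ~30% of packets
    return rng.random() > (1.0 - loss_rate)

rng = random.Random(42)  # fixed seed so the measured rate is reproducible
drops = sum(should_drop(rng) for _ in range(100_000))
# Empirical drop rate should land very close to 0.3
assert abs(drops / 100_000 - 0.3) < 0.01
```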


@@ -0,0 +1,55 @@
"""
Tail animation generator
Demonstrates programmatic generation of local motions using BaseAnimHelperAddon
You can use this to create an animation with a script, fiddle with it until it
looks right, then finally save it with /524 save_local_anim <ANIM_NAME>.
The built animation is automatically applied to all active sessions when loaded,
and is re-generated whenever the script is edited. Unloading the script stops
the animations.
"""
from hippolyzer.lib.base.anim_utils import shift_keyframes, smooth_rot
from hippolyzer.lib.base.datatypes import Quaternion
from hippolyzer.lib.base.llanim import Animation, Joint
from hippolyzer.lib.proxy.addons import AddonManager
import local_anim
AddonManager.hot_reload(local_anim, require_addons_loaded=True)
class TailAnimator(local_anim.BaseAnimHelperAddon):
# Should be unique
ANIM_NAME = "tail_anim"
def build_anim(self) -> Animation:
anim = Animation(
base_priority=5,
duration=5.0,
loop_out_point=5.0,
loop=True,
)
# Iterate along tail joints 1 through 6
for joint_num in range(1, 7):
# Give joints further along the tail a wider range of motion
start_rot = Quaternion.from_euler(0.2, -0.3, 0.15 * joint_num)
end_rot = Quaternion.from_euler(-0.2, -0.3, -0.15 * joint_num)
rot_keyframes = [
# Tween between start_rot and end_rot, using smooth interpolation.
# SL's keyframes only allow linear interpolation, which doesn't look great
# for natural motions. `smooth_rot()` gets around that by generating
# smooth inter frames for SL to linearly interpolate between.
*smooth_rot(start_rot, end_rot, inter_frames=10, time=0.0, duration=2.5),
*smooth_rot(end_rot, start_rot, inter_frames=10, time=2.5, duration=2.5),
]
anim.joints[f"mTail{joint_num}"] = Joint(
priority=5,
# Each joint's frames should be ahead of the previous joint's by 2 frames
rot_keyframes=shift_keyframes(rot_keyframes, joint_num * 2),
)
return anim
addons = [TailAnimator()]
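The idea behind `smooth_rot()` — generating extra inter frames so a purely linear interpolator approximates a smooth ease — can be shown with a scalar smoothstep. This is a sketch of the concept only, using a made-up helper, not `smooth_rot()`'s actual quaternion implementation:

```python
from typing import List, Tuple

def eased_inter_frames(start: float, end: float, inter_frames: int,
                       time: float, duration: float) -> List[Tuple[float, float]]:
    """Generate (time, value) keyframes along a smoothstep ease from start
    to end, for a player that only interpolates linearly between frames."""
    frames = []
    for i in range(inter_frames + 1):
        t = i / inter_frames
        eased = t * t * (3 - 2 * t)  # smoothstep: slow in, slow out
        frames.append((time + t * duration, start + (end - start) * eased))
    return frames
```

With enough inter frames, the piecewise-linear playback between these keyframes is visually indistinguishable from the smooth curve.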


@@ -3,7 +3,7 @@ Example of how to request a Transfer
"""
from typing import *
from hippolyzer.lib.base.legacy_inv import InventoryModel, InventoryItem
from hippolyzer.lib.base.inventory import InventoryModel, InventoryItem
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.templates import (
AssetType,
@@ -35,7 +35,7 @@ class TransferExampleAddon(BaseAddon):
async def get_first_script(self, session: Session, region: ProxiedRegion):
"""Get the contents of the first script in the selected object"""
# Ask for the object inventory so we can find a script
region.circuit.send_message(Message(
region.circuit.send(Message(
'RequestTaskInventory',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
Block('InventoryData', LocalID=session.selected.object_local),


@@ -64,12 +64,12 @@ class TurboObjectInventoryAddon(BaseAddon):
# Any previous requests will have triggered a delete of the inventory file
# by marking it complete on the server-side. Re-send our RequestTaskInventory
# to make sure there's a fresh copy.
region.circuit.send_message(request_msg.take())
region.circuit.send(request_msg.take())
inv_message = await region.message_handler.wait_for(('ReplyTaskInventory',), timeout=5.0)
# No task inventory, send the reply as-is
file_name = inv_message["InventoryData"]["Filename"]
if not file_name:
region.circuit.send_message(inv_message)
region.circuit.send(inv_message)
return
xfer = region.xfer_manager.request(
@@ -87,7 +87,7 @@ class TurboObjectInventoryAddon(BaseAddon):
continue
# Send the original ReplyTaskInventory to the viewer so it knows the file is ready
region.circuit.send_message(inv_message)
region.circuit.send(inv_message)
proxied_xfer = Xfer(data=xfer.reassemble_chunks())
# Wait for the viewer to request the inventory file


@@ -102,7 +102,7 @@ class UploaderAddon(BaseAddon):
ais_item_to_inventory_data(ais_item),
direction=Direction.IN
)
region.circuit.send_message(message)
region.circuit.send(message)
addons = [UploaderAddon()]


@@ -2,7 +2,7 @@
Example of how to request an Xfer
"""
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.legacy_inv import InventoryModel
from hippolyzer.lib.base.inventory import InventoryModel
from hippolyzer.lib.base.templates import XferFilePath, AssetType, InventoryType, WearableType
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.proxy.addon_utils import BaseAddon, show_message
@@ -15,7 +15,7 @@ class XferExampleAddon(BaseAddon):
@handle_command()
async def get_mute_list(self, session: Session, region: ProxiedRegion):
"""Fetch the current user's mute list"""
region.circuit.send_message(Message(
region.circuit.send(Message(
'MuteListRequest',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
Block("MuteData", MuteCRC=0),
@@ -35,7 +35,7 @@ class XferExampleAddon(BaseAddon):
@handle_command()
async def get_task_inventory(self, session: Session, region: ProxiedRegion):
"""Get the inventory of the currently selected object"""
region.circuit.send_message(Message(
region.circuit.send(Message(
'RequestTaskInventory',
# If no session is passed in we'll use the active session when the coro was created
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
@@ -98,7 +98,7 @@ textures 1
data=asset_data,
transaction_id=transaction_id
)
region.circuit.send_message(Message(
region.circuit.send(Message(
'CreateInventoryItem',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
Block(


@@ -2,7 +2,7 @@ import enum
import logging
import typing
from PySide2 import QtCore, QtGui
from PySide6 import QtCore, QtGui
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.message_logger import FilteringMessageLogger
@@ -19,9 +19,9 @@ class MessageLogHeader(enum.IntEnum):
class MessageLogModel(QtCore.QAbstractTableModel, FilteringMessageLogger):
def __init__(self, parent=None):
def __init__(self, parent=None, maxlen=2000):
QtCore.QAbstractTableModel.__init__(self, parent)
FilteringMessageLogger.__init__(self)
FilteringMessageLogger.__init__(self, maxlen=maxlen)
def _begin_insert(self, insert_idx: int):
self.beginInsertRows(QtCore.QModelIndex(), insert_idx, insert_idx)


@@ -7,6 +7,7 @@ import sys
import time
from typing import Optional
import mitmproxy.ctx
import mitmproxy.exceptions
from hippolyzer.lib.base import llsd
@@ -43,7 +44,7 @@ class SelectionManagerAddon(BaseAddon):
LOG.debug(f"Don't know about selected {local_id}, requesting object")
needed_objects.add(local_id)
if needed_objects:
if needed_objects and session.session_manager.settings.ALLOW_AUTO_REQUEST_OBJECTS:
region.objects.request_objects(needed_objects)
# ParcelDwellRequests are sent whenever "about land" is opened. This gives us a
# decent mechanism for selecting parcels.
@@ -89,7 +90,6 @@ def run_http_proxy_process(proxy_host, http_proxy_port, flow_context: HTTPFlowCo
mitmproxy_master = create_http_proxy(proxy_host, http_proxy_port, flow_context)
mitmproxy_master.start_server()
gc.freeze()
flow_context.mitmproxy_ready.set()
mitm_loop.run_forever()
@@ -120,7 +120,7 @@ def start_proxy(session_manager: SessionManager, extra_addons: Optional[list] =
if sys.argv[1] == "--setup-ca":
try:
mitmproxy_master = create_http_proxy(proxy_host, http_proxy_port, flow_context)
except mitmproxy.exceptions.ServerException:
except mitmproxy.exceptions.MitmproxyException:
# Proxy already running, create the master so we don't try to bind to a port
mitmproxy_master = create_proxy_master(proxy_host, http_proxy_port, flow_context)
setup_ca(sys.argv[2], mitmproxy_master)
@@ -132,6 +132,9 @@ def start_proxy(session_manager: SessionManager, extra_addons: Optional[list] =
daemon=True,
)
http_proc.start()
# These need to be set for mitmproxy's ASGIApp serving code to work.
mitmproxy.ctx.master = None
mitmproxy.ctx.log = logging.getLogger("mitmproxy log")
server = SLSOCKS5Server(session_manager)
coro = asyncio.start_server(server.handle_connection, proxy_host, udp_proxy_port)


@@ -17,8 +17,8 @@ import urllib.parse
from typing import *
import multidict
from qasync import QEventLoop
from PySide2 import QtCore, QtWidgets, QtGui
from qasync import QEventLoop, asyncSlot
from PySide6 import QtCore, QtWidgets, QtGui
from hippolyzer.apps.model import MessageLogModel, MessageLogHeader, RegionListModel
from hippolyzer.apps.proxy import start_proxy
@@ -35,6 +35,7 @@ from hippolyzer.lib.base.message.message_formatting import (
)
from hippolyzer.lib.base.message.msgtypes import MsgType
from hippolyzer.lib.base.message.template_dict import DEFAULT_TEMPLATE_DICT
from hippolyzer.lib.base.settings import SettingDescriptor
from hippolyzer.lib.base.ui_helpers import loadUi
import hippolyzer.lib.base.serialization as se
from hippolyzer.lib.base.network.transport import Direction, SocketUDPTransport
@@ -42,7 +43,8 @@ from hippolyzer.lib.proxy.addons import BaseInteractionManager, AddonManager
from hippolyzer.lib.proxy.ca_utils import setup_ca_everywhere
from hippolyzer.lib.proxy.caps_client import ProxyCapsClient
from hippolyzer.lib.proxy.http_proxy import create_proxy_master, HTTPFlowContext
from hippolyzer.lib.proxy.message_logger import LLUDPMessageLogEntry, AbstractMessageLogEntry
from hippolyzer.lib.proxy.message_logger import LLUDPMessageLogEntry, AbstractMessageLogEntry, WrappingMessageLogger, \
import_log_entries, export_log_entries
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session, SessionManager
from hippolyzer.lib.proxy.settings import ProxySettings
@@ -60,7 +62,7 @@ def show_error_message(error_msg, parent=None):
error_dialog = QtWidgets.QErrorMessage(parent=parent)
# No obvious way to set this to plaintext, yuck...
error_dialog.showMessage(html.escape(error_msg))
error_dialog.exec_()
error_dialog.exec()
error_dialog.raise_()
@@ -68,11 +70,11 @@ class GUISessionManager(SessionManager, QtCore.QObject):
regionAdded = QtCore.Signal(ProxiedRegion)
regionRemoved = QtCore.Signal(ProxiedRegion)
def __init__(self, settings, model):
def __init__(self, settings):
SessionManager.__init__(self, settings)
QtCore.QObject.__init__(self)
self.all_regions = []
self.message_logger = model
self.message_logger = WrappingMessageLogger()
def checkRegions(self):
new_regions = itertools.chain(*[s.regions for s in self.sessions])
@@ -87,13 +89,13 @@ class GUISessionManager(SessionManager, QtCore.QObject):
self.all_regions = new_regions
class GUIInteractionManager(BaseInteractionManager, QtCore.QObject):
def __init__(self, parent):
class GUIInteractionManager(BaseInteractionManager):
def __init__(self, parent: QtWidgets.QWidget):
BaseInteractionManager.__init__(self)
QtCore.QObject.__init__(self, parent=parent)
self._parent = parent
def main_window_handle(self) -> Any:
return self.parent()
return self._parent
def _dialog_async_exec(self, dialog: QtWidgets.QDialog):
future = asyncio.Future()
@@ -101,12 +103,16 @@ class GUIInteractionManager(BaseInteractionManager, QtCore.QObject):
dialog.open()
return future
async def _file_dialog(self, caption: str, directory: str, filter_str: str, mode: QtWidgets.QFileDialog.FileMode) \
-> Tuple[bool, QtWidgets.QFileDialog]:
dialog = QtWidgets.QFileDialog(self.parent(), caption=caption, directory=directory, filter=filter_str)
async def _file_dialog(
self, caption: str, directory: str, filter_str: str, mode: QtWidgets.QFileDialog.FileMode,
default_suffix: str = '',
) -> Tuple[bool, QtWidgets.QFileDialog]:
dialog = QtWidgets.QFileDialog(self._parent, caption=caption, directory=directory, filter=filter_str)
dialog.setFileMode(mode)
if mode == QtWidgets.QFileDialog.FileMode.AnyFile:
dialog.setAcceptMode(QtWidgets.QFileDialog.AcceptMode.AcceptSave)
if default_suffix:
dialog.setDefaultSuffix(default_suffix)
res = await self._dialog_async_exec(dialog)
return res, dialog
@@ -134,9 +140,10 @@ class GUIInteractionManager(BaseInteractionManager, QtCore.QObject):
return None
return dialog.selectedFiles()[0]
async def save_file(self, caption: str = '', directory: str = '', filter_str: str = '') -> Optional[str]:
async def save_file(self, caption: str = '', directory: str = '', filter_str: str = '',
default_suffix: str = '') -> Optional[str]:
res, dialog = await self._file_dialog(
caption, directory, filter_str, QtWidgets.QFileDialog.FileMode.AnyFile
caption, directory, filter_str, QtWidgets.QFileDialog.FileMode.AnyFile, default_suffix,
)
if not res or not dialog.selectedFiles():
return None
@@ -148,7 +155,7 @@ class GUIInteractionManager(BaseInteractionManager, QtCore.QObject):
title,
caption,
QtWidgets.QMessageBox.Ok | QtWidgets.QMessageBox.Cancel,
self.parent(),
self._parent,
)
fut = asyncio.Future()
msg.finished.connect(lambda r: fut.set_result(r))
@@ -156,6 +163,24 @@ class GUIInteractionManager(BaseInteractionManager, QtCore.QObject):
return (await fut) == QtWidgets.QMessageBox.Ok
class GUIProxySettings(ProxySettings):
FIRST_RUN: bool = SettingDescriptor(True)
"""Persistent settings backed by QSettings"""
def __init__(self, settings: QtCore.QSettings):
super().__init__()
self._settings_obj = settings
def get_setting(self, name: str) -> Any:
val: Any = self._settings_obj.value(name, defaultValue=dataclasses.MISSING)
if val is dataclasses.MISSING:
return val
return json.loads(val)
def set_setting(self, name: str, val: Any):
self._settings_obj.setValue(name, json.dumps(val))
def nonFatalExceptions(f):
@functools.wraps(f)
def _wrapper(self, *args, **kwargs):
@@ -169,7 +194,35 @@ def nonFatalExceptions(f):
return _wrapper
class ProxyGUI(QtWidgets.QMainWindow):
def buildReplacements(session: Session, region: ProxiedRegion):
if not session or not region:
return {}
selected = session.selected
agent_object = region.objects.lookup_fullid(session.agent_id)
selected_local = selected.object_local
selected_object = None
if selected_local:
# We may or may not have an object for this
selected_object = region.objects.lookup_localid(selected_local)
return {
"SELECTED_LOCAL": selected_local,
"SELECTED_FULL": selected_object.FullID if selected_object else None,
"SELECTED_PARCEL_LOCAL": selected.parcel_local,
"SELECTED_PARCEL_FULL": selected.parcel_full,
"SELECTED_SCRIPT_ITEM": selected.script_item,
"SELECTED_TASK_ITEM": selected.task_item,
"AGENT_ID": session.agent_id,
"AGENT_LOCAL": agent_object.LocalID if agent_object else None,
"SESSION_ID": session.id,
"AGENT_POS": agent_object.Position if agent_object else None,
"NULL_KEY": UUID(),
"RANDOM_KEY": UUID.random,
"CIRCUIT_CODE": session.circuit_code,
"REGION_HANDLE": region.handle,
}
class MessageLogWindow(QtWidgets.QMainWindow):
DEFAULT_IGNORE = "StartPingCheck CompletePingCheck PacketAck SimulatorViewerTimeMessage SimStats " \
"AgentUpdate AgentAnimation AvatarAnimation ViewerEffect CoarseLocationUpdate LayerData " \
"CameraConstraint ObjectUpdateCached RequestMultipleObjects ObjectUpdate ObjectUpdateCompressed " \
@@ -183,26 +236,39 @@ class ProxyGUI(QtWidgets.QMainWindow):
textRequest: QtWidgets.QTextEdit
def __init__(self):
super().__init__()
def __init__(
self, settings: GUIProxySettings, session_manager: GUISessionManager,
log_live_messages: bool, parent: Optional[QtWidgets.QWidget] = None,
):
super().__init__(parent=parent)
loadUi(MAIN_WINDOW_UI_PATH, self)
if parent:
self.setWindowTitle("Message Log")
self.menuBar.setEnabled(False) # type: ignore
self.menuBar.hide() # type: ignore
self._selectedEntry: Optional[AbstractMessageLogEntry] = None
self.settings = GUIProxySettings(QtCore.QSettings("SaladDais", "hippolyzer"))
self.model = MessageLogModel(parent=self.tableView)
self.settings = settings
self.sessionManager = session_manager
if log_live_messages:
self.model = MessageLogModel(parent=self.tableView)
session_manager.message_logger.loggers.append(self.model)
else:
self.model = MessageLogModel(parent=self.tableView, maxlen=None)
self.tableView.setModel(self.model)
self.model.rowsAboutToBeInserted.connect(self.beforeInsert)
self.model.rowsInserted.connect(self.afterInsert)
self.tableView.selectionModel().selectionChanged.connect(self._messageSelected)
self.checkBeautify.clicked.connect(self._showSelectedMessage)
self.checkPause.clicked.connect(self._setPaused)
self._setFilter(self.DEFAULT_FILTER)
self.setFilter(self.DEFAULT_FILTER)
self.btnClearLog.clicked.connect(self.model.clear)
self.lineEditFilter.editingFinished.connect(self._setFilter)
self.lineEditFilter.editingFinished.connect(self.setFilter)
self.btnMessageBuilder.clicked.connect(self._sendToMessageBuilder)
self.btnCopyRepr.clicked.connect(self._copyRepr)
self.actionInstallHTTPSCerts.triggered.connect(self._installHTTPSCerts)
self.actionInstallHTTPSCerts.triggered.connect(self.installHTTPSCerts)
self.actionManageAddons.triggered.connect(self._manageAddons)
self.actionManageFilters.triggered.connect(self._manageFilters)
self.actionOpenMessageBuilder.triggered.connect(self._openMessageBuilder)
@@ -213,15 +279,14 @@ class ProxyGUI(QtWidgets.QMainWindow):
self.actionProxyRemotelyAccessible.triggered.connect(self._setProxyRemotelyAccessible)
self.actionUseViewerObjectCache.triggered.connect(self._setUseViewerObjectCache)
self.actionRequestMissingObjects.triggered.connect(self._setRequestMissingObjects)
self.actionOpenNewMessageLogWindow.triggered.connect(self._openNewMessageLogWindow)
self.actionImportLogEntries.triggered.connect(self._importLogEntries)
self.actionExportLogEntries.triggered.connect(self._exportLogEntries)
self._filterMenu = QtWidgets.QMenu()
self._populateFilterMenu()
self.toolButtonFilter.setMenu(self._filterMenu)
self.sessionManager = GUISessionManager(self.settings, self.model)
self.interactionManager = GUIInteractionManager(self)
AddonManager.UI = self.interactionManager
self._shouldScrollOnInsert = True
self.tableView.horizontalHeader().resizeSection(MessageLogHeader.Host, 80)
self.tableView.horizontalHeader().resizeSection(MessageLogHeader.Method, 60)
@@ -230,10 +295,16 @@ class ProxyGUI(QtWidgets.QMainWindow):
self.textResponse.hide()
def closeEvent(self, event) -> None:
loggers = self.sessionManager.message_logger.loggers
if self.model in loggers:
loggers.remove(self.model)
super().closeEvent(event)
def _populateFilterMenu(self):
def _addFilterAction(text, filter_str):
filter_action = QtWidgets.QAction(text, self)
filter_action.triggered.connect(lambda: self._setFilter(filter_str))
filter_action = QtGui.QAction(text, self)
filter_action.triggered.connect(lambda: self.setFilter(filter_str))
self._filterMenu.addAction(filter_action)
self._filterMenu.clear()
@@ -243,16 +314,19 @@ class ProxyGUI(QtWidgets.QMainWindow):
for preset_name, preset_filter in filters.items():
_addFilterAction(preset_name, preset_filter)
def getFilterDict(self):
return self.settings.FILTERS
def setFilterDict(self, val: dict):
self.settings.FILTERS = val
self._populateFilterMenu()
def _manageFilters(self):
dialog = FilterDialog(self)
dialog.exec_()
dialog.exec()
@nonFatalExceptions
def _setFilter(self, filter_str=None):
def setFilter(self, filter_str=None):
if filter_str is None:
filter_str = self.lineEditFilter.text()
else:
@@ -284,23 +358,22 @@ class ProxyGUI(QtWidgets.QMainWindow):
return
req = entry.request(
beautify=self.checkBeautify.isChecked(),
replacements=self.buildReplacements(entry.session, entry.region),
replacements=buildReplacements(entry.session, entry.region),
)
highlight_range = None
if isinstance(req, SpannedString):
match_result = self.model.filter.match(entry)
# Match result was a tuple indicating what matched
if isinstance(match_result, tuple):
highlight_range = req.spans.get(match_result)
self.textRequest.setPlainText(req)
if highlight_range:
cursor = self.textRequest.textCursor()
cursor.setPosition(highlight_range[0], QtGui.QTextCursor.MoveAnchor)
cursor.setPosition(highlight_range[1], QtGui.QTextCursor.KeepAnchor)
highlight_format = QtGui.QTextBlockFormat()
highlight_format.setBackground(QtCore.Qt.yellow)
cursor.setBlockFormat(highlight_format)
# The string has a map of fields and their associated positions within the string,
# use that to highlight any individual fields the filter matched on.
if isinstance(req, SpannedString):
for field in self.model.filter.match(entry, short_circuit=False).fields:
field_span = req.spans.get(field)
if not field_span:
continue
cursor = self.textRequest.textCursor()
cursor.setPosition(field_span[0], QtGui.QTextCursor.MoveAnchor)
cursor.setPosition(field_span[1], QtGui.QTextCursor.KeepAnchor)
highlight_format = QtGui.QTextBlockFormat()
highlight_format.setBackground(QtCore.Qt.yellow)
cursor.setBlockFormat(highlight_format)
resp = entry.response(beautify=self.checkBeautify.isChecked())
if resp:
@@ -324,7 +397,7 @@ class ProxyGUI(QtWidgets.QMainWindow):
win.show()
msg = self._selectedEntry
beautify = self.checkBeautify.isChecked()
replacements = self.buildReplacements(msg.session, msg.region)
replacements = buildReplacements(msg.session, msg.region)
win.setMessageText(msg.request(beautify=beautify, replacements=replacements))
@nonFatalExceptions
@@ -340,37 +413,43 @@ class ProxyGUI(QtWidgets.QMainWindow):
win = MessageBuilderWindow(self, self.sessionManager)
win.show()
def buildReplacements(self, session: Session, region: ProxiedRegion):
if not session or not region:
return {}
selected = session.selected
agent_object = region.objects.lookup_fullid(session.agent_id)
selected_local = selected.object_local
selected_object = None
if selected_local:
# We may or may not have an object for this
selected_object = region.objects.lookup_localid(selected_local)
return {
"SELECTED_LOCAL": selected_local,
"SELECTED_FULL": selected_object.FullID if selected_object else None,
"SELECTED_PARCEL_LOCAL": selected.parcel_local,
"SELECTED_PARCEL_FULL": selected.parcel_full,
"SELECTED_SCRIPT_ITEM": selected.script_item,
"SELECTED_TASK_ITEM": selected.task_item,
"AGENT_ID": session.agent_id,
"AGENT_LOCAL": agent_object.LocalID if agent_object else None,
"SESSION_ID": session.id,
"AGENT_POS": agent_object.Position if agent_object else None,
"NULL_KEY": UUID(),
"RANDOM_KEY": UUID.random,
"CIRCUIT_CODE": session.circuit_code,
"REGION_HANDLE": region.handle,
}
def _openNewMessageLogWindow(self):
win: QtWidgets.QMainWindow = MessageLogWindow(
self.settings, self.sessionManager, log_live_messages=True, parent=self)
win.setFilter(self.lineEditFilter.text())
win.show()
win.activateWindow()
def _installHTTPSCerts(self):
@asyncSlot()
async def _importLogEntries(self):
log_file = await AddonManager.UI.open_file(
caption="Import Log Entries", filter_str="Hippolyzer Logs (*.hippolog)"
)
if not log_file:
return
win = MessageLogWindow(self.settings, self.sessionManager, log_live_messages=False, parent=self)
win.setFilter(self.lineEditFilter.text())
with open(log_file, "rb") as f:
entries = import_log_entries(f.read())
for entry in entries:
win.model.add_log_entry(entry)
win.show()
win.activateWindow()
@asyncSlot()
async def _exportLogEntries(self):
log_file = await AddonManager.UI.save_file(
caption="Export Log Entries", filter_str="Hippolyzer Logs (*.hippolog)", default_suffix="hippolog",
)
if not log_file:
return
with open(log_file, "wb") as f:
f.write(export_log_entries(self.model))
def installHTTPSCerts(self):
msg = QtWidgets.QMessageBox()
msg.setText("This will install the proxy's HTTPS certificate in the config dir"
" of any installed viewers, continue?")
msg.setText("Would you like to install the proxy's HTTPS certificate in the config dir"
" of any installed viewers so that HTTPS connections will work?")
yes_btn = msg.addButton("Yes", QtWidgets.QMessageBox.NoRole)
msg.addButton("No", QtWidgets.QMessageBox.NoRole)
msg.exec()
@@ -402,7 +481,7 @@ class ProxyGUI(QtWidgets.QMainWindow):
def _manageAddons(self):
dialog = AddonDialog(self)
dialog.exec_()
dialog.exec()
def getAddonList(self) -> List[str]:
return self.sessionManager.settings.ADDON_SCRIPTS
@@ -491,7 +570,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
else:
self.comboUntrusted.addItem(message_name)
cap_names = sorted(set(itertools.chain(*[r.caps.keys() for r in self.regionModel.regions])))
cap_names = sorted(set(itertools.chain(*[r.cap_urls.keys() for r in self.regionModel.regions])))
for cap_name in cap_names:
if cap_name.endswith("ProxyWrapper"):
continue
@@ -522,7 +601,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
break
self.textRequest.setPlainText(
f"""{method} [[{cap_name}]]{path}{params} HTTP/1.1
# {region.caps.get(cap_name, "<unknown URI>")}
# {region.cap_urls.get(cap_name, "<unknown URI>")}
{headers}
{body}"""
)
@@ -575,24 +654,9 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
if var.name in ("TaskID", "ObjectID"):
return VerbatimHumanVal("[[SELECTED_FULL]]")
if var.type.is_int:
return 0
elif var.type.is_float:
return 0.0
elif var.type == MsgType.MVT_LLUUID:
return UUID()
elif var.type == MsgType.MVT_BOOL:
return False
elif var.type == MsgType.MVT_VARIABLE:
return ""
elif var.type in (MsgType.MVT_LLVector3, MsgType.MVT_LLVector3d, MsgType.MVT_LLQuaternion):
return VerbatimHumanVal("(0.0, 0.0, 0.0)")
elif var.type == MsgType.MVT_LLVector4:
return VerbatimHumanVal("(0.0, 0.0, 0.0, 0.0)")
elif var.type == MsgType.MVT_FIXED:
return b"\x00" * var.size
elif var.type == MsgType.MVT_IP_ADDR:
return "0.0.0.0"
default_val = var.default_value
if default_val is not None:
return default_val
return VerbatimHumanVal("")
@nonFatalExceptions
@@ -600,7 +664,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
session, region = self._getTarget()
msg_text = self.textRequest.toPlainText()
replacements = self.parent().buildReplacements(session, region)
replacements = buildReplacements(session, region)
if re.match(r"\A\s*(in|out)\s+", msg_text, re.I):
sender_func = self._sendLLUDPMessage
@@ -632,13 +696,13 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
msg = HumanMessageSerializer.from_human_string(msg_text, replacements, env, safe=False)
if self.checkLLUDPViaCaps.isChecked():
if msg.direction == Direction.IN:
region.eq_manager.queue_event(
region.eq_manager.inject_event(
self.llsdSerializer.serialize(msg, as_dict=True)
)
else:
self._sendHTTPRequest(
"POST",
region.caps["UntrustedSimulatorMessage"],
region.cap_urls["UntrustedSimulatorMessage"],
{"Content-Type": "application/llsd+xml", "Accept": "application/llsd+xml"},
self.llsdSerializer.serialize(msg),
)
@@ -647,7 +711,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
off_circuit = self.checkOffCircuit.isChecked()
if off_circuit:
transport = SocketUDPTransport(socket.socket(socket.AF_INET, socket.SOCK_DGRAM))
region.circuit.send_message(msg, transport=transport)
region.circuit.send(msg, transport=transport)
if off_circuit:
transport.close()
@@ -656,7 +720,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
raise RuntimeError("Need a valid session and region to send EQ event")
message_line, _, body = (x.strip() for x in msg_text.partition("\n"))
message_name = message_line.rsplit(" ", 1)[-1]
region.eq_manager.queue_event({
region.eq_manager.inject_event({
"message": message_name,
"body": llsd.parse_xml(body.encode("utf8")),
})
@@ -682,7 +746,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
cap_name = match.group(1)
cap_url = session.global_caps.get(cap_name)
if not cap_url:
cap_url = region.caps.get(cap_name)
cap_url = region.cap_urls.get(cap_name)
if not cap_url:
raise ValueError("Don't have a Cap for %s" % cap_name)
uri = cap_url + match.group(2)
@@ -749,7 +813,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
class AddonDialog(QtWidgets.QDialog):
listAddons: QtWidgets.QListWidget
def __init__(self, parent: ProxyGUI):
def __init__(self, parent: MessageLogWindow):
super().__init__()
loadUi(ADDON_DIALOG_UI_PATH, self)
@@ -800,7 +864,7 @@ class AddonDialog(QtWidgets.QDialog):
class FilterDialog(QtWidgets.QDialog):
listFilters: QtWidgets.QListWidget
def __init__(self, parent: ProxyGUI):
def __init__(self, parent: MessageLogWindow):
super().__init__()
loadUi(FILTER_DIALOG_UI_PATH, self)
@@ -838,29 +902,16 @@ class FilterDialog(QtWidgets.QDialog):
self.listFilters.takeItem(idx)
class GUIProxySettings(ProxySettings):
"""Persistent settings backed by QSettings"""
def __init__(self, settings: QtCore.QSettings):
super().__init__()
self._settings_obj = settings
def get_setting(self, name: str) -> Any:
val: Any = self._settings_obj.value(name, defaultValue=dataclasses.MISSING)
if val is dataclasses.MISSING:
return val
return json.loads(val)
def set_setting(self, name: str, val: Any):
self._settings_obj.setValue(name, json.dumps(val))
def gui_main():
multiprocessing.set_start_method('spawn')
QtCore.QCoreApplication.setAttribute(QtCore.Qt.AA_ShareOpenGLContexts)
app = QtWidgets.QApplication(sys.argv)
loop = QEventLoop(app)
asyncio.set_event_loop(loop)
window = ProxyGUI()
settings = GUIProxySettings(QtCore.QSettings("SaladDais", "hippolyzer"))
session_manager = GUISessionManager(settings)
window = MessageLogWindow(settings, session_manager, log_live_messages=True)
AddonManager.UI = GUIInteractionManager(window)
timer = QtCore.QTimer(app)
timer.timeout.connect(window.sessionManager.checkRegions)
timer.start(100)
@@ -869,6 +920,10 @@ def gui_main():
http_host = None
if window.sessionManager.settings.REMOTELY_ACCESSIBLE:
http_host = "0.0.0.0"
if settings.FIRST_RUN:
settings.FIRST_RUN = False
# Automatically offer to install the HTTPS certs on first run.
window.installHTTPSCerts()
start_proxy(
session_manager=window.sessionManager,
extra_addon_paths=window.getAddonList(),

View File

@@ -256,6 +256,10 @@
<bool>true</bool>
</property>
<addaction name="actionOpenMessageBuilder"/>
<addaction name="actionOpenNewMessageLogWindow"/>
<addaction name="separator"/>
<addaction name="actionImportLogEntries"/>
<addaction name="actionExportLogEntries"/>
<addaction name="separator"/>
<addaction name="actionInstallHTTPSCerts"/>
<addaction name="actionManageAddons"/>
@@ -323,6 +327,21 @@
<string>Force the proxy to request objects that it doesn't know about due to cache misses</string>
</property>
</action>
<action name="actionOpenNewMessageLogWindow">
<property name="text">
<string>Open New Message Log Window</string>
</property>
</action>
<action name="actionImportLogEntries">
<property name="text">
<string>Import Log Entries</string>
</property>
</action>
<action name="actionExportLogEntries">
<property name="text">
<string>Export Log Entries</string>
</property>
</action>
</widget>
<resources/>
<connections/>

View File

@@ -0,0 +1,91 @@
"""
Assorted utilities to make creating animations from scratch easier
"""
import copy
from typing import List, Union
from hippolyzer.lib.base.datatypes import Vector3, Quaternion
from hippolyzer.lib.base.llanim import PosKeyframe, RotKeyframe
def smooth_step(t: float):
t = max(0.0, min(1.0, t))
return t * t * (3 - 2 * t)
def rot_interp(r0: Quaternion, r1: Quaternion, t: float):
"""
Bad quaternion interpolation
TODO: This is definitely not correct, yet it seems to work OK. Implement proper slerp.
"""
# Ignore W
r0 = r0.data(3)
r1 = r1.data(3)
return Quaternion(*map(lambda pair: ((pair[0] * (1.0 - t)) + (pair[1] * t)), zip(r0, r1)))
def unique_frames(frames: List[Union[PosKeyframe, RotKeyframe]]):
"""Drop frames where time and coordinate are exact duplicates of another frame"""
new_frames = []
for frame in frames:
# TODO: fudge factor for float comparison instead
if frame not in new_frames:
new_frames.append(frame)
return new_frames
def shift_keyframes(frames: List[Union[PosKeyframe, RotKeyframe]], num: int):
"""
Shift keyframes around by `num` frames
Assumes keyframes occur at a fixed cadence and that the first and last keyframes are at the same coordinate.
"""
# Get rid of duplicate frames
frames = unique_frames(frames)
pop_idx = -1
insert_idx = 0
if num < 0:
insert_idx = len(frames) - 1
pop_idx = 0
num = -num
old_times = [f.time for f in frames]
new_frames = frames.copy()
# Drop last, duped frame. We'll copy the first frame to replace it later
new_frames.pop(-1)
for _ in range(num):
new_frames.insert(insert_idx, new_frames.pop(pop_idx))
# Put first frame back on the end
new_frames.append(copy.copy(new_frames[0]))
assert len(old_times) == len(new_frames)
assert new_frames[0] == new_frames[-1]
# Make the times of the shifted keyframes match up with the previous timeline
for old_time, new_frame in zip(old_times, new_frames):
new_frame.time = old_time
return new_frames
def smooth_pos(start: Vector3, end: Vector3, inter_frames: int, time: float, duration: float) -> List[PosKeyframe]:
"""Generate keyframes to smoothly interpolate between two positions"""
frames = [PosKeyframe(time=time, pos=start)]
for i in range(0, inter_frames):
t = (i + 1) / (inter_frames + 1)
smooth_t = smooth_step(t)
pos = Vector3(smooth_t, smooth_t, smooth_t).interpolate(start, end)
frames.append(PosKeyframe(time=time + (t * duration), pos=pos))
return frames + [PosKeyframe(time=time + duration, pos=end)]
def smooth_rot(start: Quaternion, end: Quaternion, inter_frames: int, time: float, duration: float)\
-> List[RotKeyframe]:
"""Generate keyframes to smoothly interpolate between two rotations"""
frames = [RotKeyframe(time=time, rot=start)]
for i in range(0, inter_frames):
t = (i + 1) / (inter_frames + 1)
smooth_t = smooth_step(t)
frames.append(RotKeyframe(time=time + (t * duration), rot=rot_interp(start, end, smooth_t)))
return frames + [RotKeyframe(time=time + duration, rot=end)]
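The easing behind `smooth_pos()` can be sketched in plain Python with no hippolyzer imports; the `lerp` helper below mirrors what `Vector3.interpolate` is assumed to do, component-wise:

```python
def smooth_step(t: float) -> float:
    # Clamp to [0, 1], then apply the classic 3t^2 - 2t^3 easing curve.
    t = max(0.0, min(1.0, t))
    return t * t * (3 - 2 * t)

def lerp(a: float, b: float, t: float) -> float:
    # Linear interpolation between two scalars (illustrative stand-in
    # for per-component Vector3 interpolation).
    return a + (b - a) * t

# Ease a value from 0.0 to 10.0 over 4 intermediate frames, the way
# smooth_pos() spaces its inter_frames between start and end.
samples = [lerp(0.0, 10.0, smooth_step((i + 1) / 5)) for i in range(4)]
```

The samples accelerate away from the start and decelerate into the end, which is what keeps generated keyframes from looking robotic.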

View File

@@ -273,7 +273,8 @@ class JankStringyBytes(bytes):
Treat bytes as UTF8 if used in string context
Sinful, but necessary evil for now since templates don't specify what's
binary and what's a string.
binary and what's a string. There are also certain fields where the value
may be either binary _or_ a string, depending on the context.
"""
__slots__ = ()
@@ -288,12 +289,28 @@ class JankStringyBytes(bytes):
def __ne__(self, other):
return not self.__eq__(other)
def __contains__(self, item):
if isinstance(item, str):
return item in str(self)
return item in bytes(self)
class RawBytes(bytes):
__slots__ = ()
pass
_T = TypeVar("_T")
class Pretty(Generic[_T]):
"""Wrapper for var values so Messages will know to serialize"""
__slots__ = ("value",)
def __init__(self, value: _T):
self.value: _T = value
class StringEnum(str, enum.Enum):
def __str__(self):
return self.value
@@ -333,5 +350,5 @@ class TaggedUnion(recordclass.datatuple): # type: ignore
__all__ = [
"Vector3", "Vector4", "Vector2", "Quaternion", "TupleCoord",
"UUID", "RawBytes", "StringEnum", "JankStringyBytes", "TaggedUnion",
"IntEnum", "IntFlag", "flags_to_pod"
"IntEnum", "IntFlag", "flags_to_pod", "Pretty"
]
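The `__contains__` fix for `JankStringyBytes` above can be illustrated with a minimal stand-in (the real class lives in hippolyzer; this sketch only shows the str-vs-bytes dispatch):

```python
class StringyBytes(bytes):
    """Minimal sketch: bytes that also answer `in` queries for str needles."""
    __slots__ = ()

    def __contains__(self, item):
        # Decode ourselves as UTF-8 when given a str needle, otherwise
        # fall back to ordinary bytes containment.
        if isinstance(item, str):
            return item in self.decode("utf8")
        return item in bytes(self)

b = StringyBytes("héllo wörld".encode("utf8"))
```

Without the override, `"wörld" in b` would raise `TypeError` because `bytes.__contains__` rejects str operands.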

View File

@@ -2,6 +2,8 @@ from __future__ import annotations
import codecs
import functools
import os
import pkg_resources
import re
import weakref
@@ -145,3 +147,10 @@ def to_chunks(chunkable: Sequence[_T], chunk_size: int) -> Generator[_T, None, N
while chunkable:
yield chunkable[:chunk_size]
chunkable = chunkable[chunk_size:]
def get_mtime(path):
try:
return os.stat(path).st_mtime
except OSError:
return None

View File

@@ -9,6 +9,7 @@ import dataclasses
import datetime as dt
import itertools
import logging
import struct
import weakref
from io import StringIO
from typing import *
@@ -33,6 +34,17 @@ LOG = logging.getLogger(__name__)
_T = TypeVar("_T")
class SchemaFlagField(SchemaHexInt):
"""Like a hex int, but must be serialized as bytes in LLSD due to being a U32"""
@classmethod
def from_llsd(cls, val: Any) -> int:
return struct.unpack("!I", val)[0]
@classmethod
def to_llsd(cls, val: int) -> Any:
return struct.pack("!I", val)
def _yield_schema_tokens(reader: StringIO):
in_bracket = False
# empty str == EOF in Python
@@ -76,7 +88,7 @@ class InventoryBase(SchemaBase):
if schema_name != cls.SCHEMA_NAME:
raise ValueError(f"Expected schema name {schema_name!r} to be {cls.SCHEMA_NAME!r}")
fields = cls._fields_dict()
fields = cls._get_fields_dict()
obj_dict = {}
for key, val in tok_iter:
if key in fields:
@@ -100,7 +112,7 @@ class InventoryBase(SchemaBase):
def to_writer(self, writer: StringIO):
writer.write(f"\t{self.SCHEMA_NAME}\t0\n")
writer.write("\t{\n")
for field_name, field in self._fields_dict().items():
for field_name, field in self._get_fields_dict().items():
spec = field.metadata.get("spec")
# Not meant to be serialized
if not spec:
@@ -147,12 +159,38 @@ class InventoryModel(InventoryBase):
model.reparent_nodes()
return model
@classmethod
def from_llsd(cls, llsd_val: List[Dict]) -> InventoryModel:
model = cls()
for obj_dict in llsd_val:
if InventoryCategory.ID_ATTR in obj_dict:
if (obj := InventoryCategory.from_llsd(obj_dict)) is not None:
model.add_container(obj)
elif InventoryObject.ID_ATTR in obj_dict:
if (obj := InventoryObject.from_llsd(obj_dict)) is not None:
model.add_container(obj)
elif InventoryItem.ID_ATTR in obj_dict:
if (obj := InventoryItem.from_llsd(obj_dict)) is not None:
model.add_item(obj)
else:
LOG.warning(f"Unknown object type {obj_dict!r}")
model.reparent_nodes()
return model
def to_writer(self, writer: StringIO):
for container in self.containers.values():
container.to_writer(writer)
for item in self.items.values():
item.to_writer(writer)
def to_llsd(self):
vals = []
for container in self.containers.values():
vals.append(container.to_llsd())
for item in self.items.values():
vals.append(item.to_llsd())
return vals
def add_container(self, container: InventoryContainerBase):
self.containers[container.node_id] = container
container.model = weakref.proxy(self)
@@ -246,7 +284,7 @@ class InventoryCategory(InventoryContainerBase):
SCHEMA_NAME: ClassVar[str] = "inv_object"
cat_id: UUID = schema_field(SchemaUUID)
pref_type: str = schema_field(SchemaStr)
pref_type: str = schema_field(SchemaStr, llsd_name="preferred_type")
owner_id: UUID = schema_field(SchemaUUID)
version: int = schema_field(SchemaInt)
@@ -259,10 +297,10 @@ class InventoryItem(InventoryNodeBase):
item_id: UUID = schema_field(SchemaUUID)
type: str = schema_field(SchemaStr)
inv_type: str = schema_field(SchemaStr)
flags: int = schema_field(SchemaHexInt)
flags: int = schema_field(SchemaFlagField)
name: str = schema_field(SchemaMultilineStr)
desc: str = schema_field(SchemaMultilineStr)
creation_date: dt.datetime = schema_field(SchemaDate)
creation_date: dt.datetime = schema_field(SchemaDate, llsd_name="created_at")
permissions: InventoryPermissions = schema_field(InventoryPermissions)
sale_info: InventorySaleInfo = schema_field(InventorySaleInfo)
asset_id: Optional[UUID] = schema_field(SchemaUUID, default=None)
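`SchemaFlagField`'s LLSD round trip above is just a big-endian U32 pack/unpack; as a standalone check of that encoding:

```python
import struct

def flags_to_llsd(val: int) -> bytes:
    # LLSD has no unsigned 32-bit integer type, so a U32 flag word
    # travels as 4 big-endian bytes.
    return struct.pack("!I", val)

def flags_from_llsd(val: bytes) -> int:
    # Inverse: 4 big-endian bytes back to an int.
    return struct.unpack("!I", val)[0]
```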

View File

@@ -1,7 +1,6 @@
import os
import tempfile
from io import BytesIO
from typing import *
import defusedxml.ElementTree
from glymur import jp2box, Jp2k
@@ -10,12 +9,6 @@ from glymur import jp2box, Jp2k
jp2box.ET = defusedxml.ElementTree
SL_DEFAULT_ENCODE = {
"cratios": (1920.0, 480.0, 120.0, 30.0, 10.0),
"irreversible": True,
}
class BufferedJp2k(Jp2k):
"""
For manipulating JP2K from within a binary buffer.
@@ -24,12 +17,7 @@ class BufferedJp2k(Jp2k):
based on filename, so this is the least brittle approach.
"""
def __init__(self, contents: bytes, encode_kwargs: Optional[Dict] = None):
if encode_kwargs is None:
self.encode_kwargs = SL_DEFAULT_ENCODE.copy()
else:
self.encode_kwargs = encode_kwargs
def __init__(self, contents: bytes):
stream = BytesIO(contents)
self.temp_file = tempfile.NamedTemporaryFile(delete=False)
stream.seek(0)
@@ -44,11 +32,12 @@ class BufferedJp2k(Jp2k):
os.remove(self.temp_file.name)
self.temp_file = None
def _write(self, img_array, verbose=False, **kwargs):
# Glymur normally only lets you control encode params when a write happens within
# the constructor. Keep around the encode params from the constructor and pass
# them to successive write calls.
return super()._write(img_array, verbose=False, **self.encode_kwargs, **kwargs)
def _populate_cparams(self, img_array):
if self._cratios is None:
self._cratios = (1920.0, 480.0, 120.0, 30.0, 10.0)
if self._irreversible is None:
self._irreversible = True
return super()._populate_cparams(img_array)
def __bytes__(self):
with open(self.temp_file.name, "rb") as f:

View File

@@ -31,6 +31,14 @@ class SchemaFieldSerializer(abc.ABC, Generic[_T]):
def serialize(cls, val: _T) -> str:
pass
@classmethod
def from_llsd(cls, val: Any) -> _T:
return val
@classmethod
def to_llsd(cls, val: _T) -> Any:
return val
class SchemaDate(SchemaFieldSerializer[dt.datetime]):
@classmethod
@@ -41,6 +49,14 @@ class SchemaDate(SchemaFieldSerializer[dt.datetime]):
def serialize(cls, val: dt.datetime) -> str:
return str(calendar.timegm(val.utctimetuple()))
@classmethod
def from_llsd(cls, val: Any) -> dt.datetime:
return dt.datetime.utcfromtimestamp(val)
@classmethod
def to_llsd(cls, val: dt.datetime):
return calendar.timegm(val.utctimetuple())
class SchemaHexInt(SchemaFieldSerializer[int]):
@classmethod
@@ -95,10 +111,11 @@ class SchemaUUID(SchemaFieldSerializer[UUID]):
def schema_field(spec: Type[Union[SchemaBase, SchemaFieldSerializer]], *, default=dataclasses.MISSING, init=True,
repr=True, hash=None, compare=True) -> dataclasses.Field: # noqa
repr=True, hash=None, compare=True, llsd_name=None) -> dataclasses.Field: # noqa
"""Describe a field in the inventory schema and the shape of its value"""
return dataclasses.field(
metadata={"spec": spec}, default=default, init=init, repr=repr, hash=hash, compare=compare
metadata={"spec": spec, "llsd_name": llsd_name}, default=default,
init=init, repr=repr, hash=hash, compare=compare,
)
@@ -121,8 +138,14 @@ def parse_schema_line(line: str):
@dataclasses.dataclass
class SchemaBase(abc.ABC):
@classmethod
def _fields_dict(cls):
return {f.name: f for f in dataclasses.fields(cls)}
def _get_fields_dict(cls, llsd=False):
fields_dict = {}
for field in dataclasses.fields(cls):
field_name = field.name
if llsd:
field_name = field.metadata.get("llsd_name") or field_name
fields_dict[field_name] = field
return fields_dict
@classmethod
def from_str(cls, text: str):
@@ -137,6 +160,30 @@ class SchemaBase(abc.ABC):
def from_bytes(cls, data: bytes):
return cls.from_str(data.decode("utf8"))
@classmethod
def from_llsd(cls, inv_dict: Dict):
fields = cls._get_fields_dict(llsd=True)
obj_dict = {}
for key, val in inv_dict.items():
if key in fields:
field: dataclasses.Field = fields[key]
key = field.name
spec = field.metadata.get("spec")
# Not a real key, an internal var on our dataclass
if not spec:
LOG.warning(f"Internal key {key!r}")
continue
# some kind of nested structure like sale_info
if issubclass(spec, SchemaBase):
obj_dict[key] = spec.from_llsd(val)
elif issubclass(spec, SchemaFieldSerializer):
obj_dict[key] = spec.from_llsd(val)
else:
raise ValueError(f"Unsupported spec for {key!r}, {spec!r}")
else:
LOG.warning(f"Unknown key {key!r}")
return cls._obj_from_dict(obj_dict)
def to_bytes(self) -> bytes:
return self.to_str().encode("utf8")
@@ -146,6 +193,28 @@ class SchemaBase(abc.ABC):
writer.seek(0)
return writer.read()
def to_llsd(self):
obj_dict = {}
for field_name, field in self._get_fields_dict(llsd=True).items():
spec = field.metadata.get("spec")
# Not meant to be serialized
if not spec:
continue
val = getattr(self, field.name)
if val is None:
continue
# Some kind of nested structure like sale_info
if isinstance(val, SchemaBase):
val = val.to_llsd()
elif issubclass(spec, SchemaFieldSerializer):
val = spec.to_llsd(val)
else:
raise ValueError(f"Bad inventory spec {spec!r}")
obj_dict[field_name] = val
return obj_dict
@abc.abstractmethod
def to_writer(self, writer: StringIO):
pass
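The `llsd_name` mapping added above boils down to reading an alternate key name out of dataclass field metadata. A standalone sketch of the pattern (helper and class names here are illustrative, not hippolyzer's):

```python
import dataclasses

def named_field(llsd_name=None, **kwargs):
    # Stash the wire name in field metadata; fall back to the Python name.
    return dataclasses.field(metadata={"llsd_name": llsd_name}, **kwargs)

@dataclasses.dataclass
class Category:
    # "preferred_type" is the key used on the wire; "pref_type" in Python.
    pref_type: str = named_field(llsd_name="preferred_type", default="")
    version: int = named_field(default=0)

def fields_dict(cls, llsd=False):
    # Key the fields by wire name when serializing LLSD, Python name otherwise.
    out = {}
    for f in dataclasses.fields(cls):
        name = (f.metadata.get("llsd_name") or f.name) if llsd else f.name
        out[name] = f
    return out
```

Keeping the mapping in `metadata` means both serializers share one field definition instead of maintaining parallel name tables.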

View File

@@ -270,8 +270,8 @@ LOD_SEGMENT_SERIALIZER = SegmentSerializer({
# Each position represents a single vert.
"Position": se.Collection(None, se.Vector3U16(0.0, 1.0)),
"TexCoord0": se.Collection(None, se.Vector2U16(0.0, 1.0)),
# Normals have a static domain between -1 and 1
"Normal": se.Collection(None, se.Vector3U16(0.0, 1.0)),
# Normals have a static domain between -1 and 1, so just use that.
"Normal": se.Collection(None, se.Vector3U16(-1.0, 1.0)),
"Weights": se.Collection(None, VertexWeights)
})

View File

@@ -1,6 +1,9 @@
from __future__ import annotations
import abc
import asyncio
import copy
import dataclasses
import datetime as dt
import logging
from typing import *
@@ -13,6 +16,14 @@ from .msgtypes import PacketFlags
from .udpserializer import UDPMessageSerializer
@dataclasses.dataclass
class ReliableResendInfo:
last_resent: dt.datetime
message: Message
completed: asyncio.Future = dataclasses.field(default_factory=asyncio.Future)
tries_left: int = 10
class Circuit:
def __init__(self, near_host: Optional[ADDR_TUPLE], far_host: ADDR_TUPLE, transport):
self.near_host: Optional[ADDR_TUPLE] = near_host
@@ -22,6 +33,8 @@ class Circuit:
self.serializer = UDPMessageSerializer()
self.last_packet_at = dt.datetime.now()
self.packet_id_base = 0
self.unacked_reliable: Dict[Tuple[Direction, int], ReliableResendInfo] = {}
self.resend_every: float = 3.0
def _send_prepared_message(self, message: Message, transport=None):
try:
@@ -46,22 +59,69 @@ class Circuit:
raise RuntimeError(f"Trying to re-send finalized {message!r}")
message.packet_id = self.packet_id_base
self.packet_id_base += 1
if not message.acks:
message.send_flags &= PacketFlags.ACK
if message.acks:
message.send_flags |= PacketFlags.ACK
else:
message.send_flags &= ~PacketFlags.ACK
# If it was queued, it's not anymore
message.queued = False
message.finalized = True
def send_message(self, message: Message, transport=None):
def send(self, message: Message, transport=None) -> UDPPacket:
if self.prepare_message(message):
# If the message originates from us then we're responsible for resends.
if message.reliable and message.synthetic:
self.unacked_reliable[(message.direction, message.packet_id)] = ReliableResendInfo(
last_resent=dt.datetime.now(),
message=message,
)
return self._send_prepared_message(message, transport)
# Temporary alias
send_message = send
def send_reliable(self, message: Message, transport=None) -> asyncio.Future:
"""send() wrapper that always sends reliably and allows `await`ing ACK receipt"""
if not message.synthetic:
raise ValueError("Not able to send non-synthetic message reliably!")
message.send_flags |= PacketFlags.RELIABLE
self.send(message, transport)
return self.unacked_reliable[(message.direction, message.packet_id)].completed
def collect_acks(self, message: Message):
effective_acks = list(message.acks)
if message.name == "PacketAck":
effective_acks.extend(x["ID"] for x in message["Packets"])
for ack in effective_acks:
resend_info = self.unacked_reliable.pop((~message.direction, ack), None)
if resend_info:
resend_info.completed.set_result(None)
def resend_unacked(self):
for resend_info in list(self.unacked_reliable.values()):
# Not time to attempt a resend yet
if dt.datetime.now() - resend_info.last_resent < dt.timedelta(seconds=self.resend_every):
continue
msg = copy.copy(resend_info.message)
resend_info.tries_left -= 1
# We were on our last try and we never received an ack
if not resend_info.tries_left:
logging.warning(f"Giving up on unacked {msg.packet_id}")
del self.unacked_reliable[(msg.direction, msg.packet_id)]
resend_info.completed.set_exception(TimeoutError("Exceeded resend limit"))
continue
resend_info.last_resent = dt.datetime.now()
msg.send_flags |= PacketFlags.RESENT
self._send_prepared_message(msg)
def send_acks(self, to_ack: Sequence[int], direction=Direction.OUT, packet_id=None):
logging.debug("%r acking %r", direction, to_ack)
# TODO: maybe tack this onto `.acks` for next message?
message = Message('PacketAck', *[Block('Packets', ID=x) for x in to_ack])
message.packet_id = packet_id
message.direction = direction
message.injected = True
self.send_message(message)
self.send(message)
def __repr__(self):
return "<%s %r : %r>" % (self.__class__.__name__, self.near_host, self.host)
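The reliable-send machinery above pairs each outgoing packet id with an `asyncio.Future` that is resolved when the matching ack arrives. A stripped-down sketch of that pattern (names are illustrative; the real `Circuit` also handles resends and retry limits):

```python
import asyncio

class MiniCircuit:
    def __init__(self):
        self.unacked = {}
        self.next_id = 0

    def send_reliable(self):
        # Each reliable send registers a future keyed by its packet id.
        packet_id = self.next_id
        self.next_id += 1
        fut = asyncio.get_running_loop().create_future()
        self.unacked[packet_id] = fut
        return packet_id, fut

    def collect_ack(self, packet_id):
        # Resolving the future wakes anyone awaiting the send.
        fut = self.unacked.pop(packet_id, None)
        if fut is not None:
            fut.set_result(None)

async def demo():
    circuit = MiniCircuit()
    pid, fut = circuit.send_reliable()
    circuit.collect_ack(pid)  # simulate the ack coming back
    await fut                 # completes immediately once acked
    return pid

pid = asyncio.run(demo())
```

A resend loop like `resend_unacked()` would walk `unacked` on a timer and `set_exception(TimeoutError(...))` once the retry budget is exhausted.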

View File

@@ -32,6 +32,7 @@ from typing import *
from hippolyzer.lib.base.datatypes import *
import hippolyzer.lib.base.serialization as se
import hippolyzer.lib.base.templates as templates
from hippolyzer.lib.base.datatypes import Pretty
from hippolyzer.lib.base.message.msgtypes import PacketFlags
from hippolyzer.lib.base.network.transport import Direction, ADDR_TUPLE
@@ -62,11 +63,12 @@ class Block:
Block expects a name, and kwargs for variables (var_name = value)
"""
__slots__ = ('name', 'size', 'vars', 'message_name', '_ser_cache', 'fill_missing',)
PARENT_MESSAGE_NAME: ClassVar[Optional[str]] = None
def __init__(self, name, /, *, fill_missing=False, **kwargs):
self.name = name
self.size = 0
self.message_name: Optional[str] = None
self.message_name: Optional[str] = self.PARENT_MESSAGE_NAME
self.vars: Dict[str, VAR_TYPE] = {}
self._ser_cache: Dict[str, Any] = {}
self.fill_missing = fill_missing
@@ -83,6 +85,9 @@ class Block:
return self.vars[name]
def __setitem__(self, key, value):
if isinstance(value, Pretty):
return self.serialize_var(key, value.value)
# These don't pickle well since they're likely to get hot-reloaded
if isinstance(value, (enum.IntEnum, enum.IntFlag)):
value = int(value)
@@ -181,9 +186,9 @@ class MsgBlockList(List["Block"]):
class Message:
__slots__ = ("name", "send_flags", "_packet_id", "acks", "body_boundaries", "queued",
__slots__ = ("name", "send_flags", "packet_id", "acks", "body_boundaries", "queued",
"offset", "raw_extra", "raw_body", "deserializer", "_blocks", "finalized",
"direction", "meta", "injected", "dropped", "sender")
"direction", "meta", "synthetic", "dropped", "sender")
def __init__(self, name, *args, packet_id=None, flags=0, acks=None, direction=None):
# TODO: Do this on a timer or something.
@@ -191,7 +196,7 @@ class Message:
self.name = name
self.send_flags = flags
self._packet_id: Optional[int] = packet_id # aka, sequence number
self.packet_id: Optional[int] = packet_id # aka, sequence number
self.acks = acks if acks is not None else tuple()
self.body_boundaries = (-1, -1)
@@ -208,22 +213,12 @@ class Message:
self.queued: bool = False
self._blocks: BLOCK_DICT = {}
self.meta = {}
self.injected = False
self.synthetic = packet_id is None
self.dropped = False
self.sender: Optional[ADDR_TUPLE] = None
self.add_blocks(args)
@property
def packet_id(self) -> Optional[int]:
return self._packet_id
@packet_id.setter
def packet_id(self, val: Optional[int]):
self._packet_id = val
# Changing packet ID clears the finalized flag
self.finalized = False
def add_blocks(self, block_list):
# can have a list of blocks if it is multiple or variable
for block in block_list:
@@ -296,7 +291,7 @@ class Message:
if self.raw_body and self.deserializer():
self.deserializer().parse_message_body(self)
def to_dict(self):
def to_dict(self, extended=False):
""" A dict representation of a message.
This is the form used for templated messages sent via EQ.
@@ -312,6 +307,18 @@ class Message:
new_vars[var_name] = val
dict_blocks.append(new_vars)
if extended:
base_repr.update({
"packet_id": self.packet_id,
"meta": self.meta.copy(),
"dropped": self.dropped,
"synthetic": self.synthetic,
"direction": self.direction.name,
"send_flags": int(self.send_flags),
"extra": self.extra,
"acks": self.acks,
})
return base_repr
@classmethod
@@ -321,6 +328,17 @@ class Message:
msg.create_block_list(block_type)
for block in blocks:
msg.add_block(Block(block_type, **block))
if 'packet_id' in dict_val:
# extended format
msg.packet_id = dict_val['packet_id']
msg.meta = dict_val['meta']
msg.dropped = dict_val['dropped']
msg.synthetic = dict_val['synthetic']
msg.direction = Direction[dict_val['direction']]
msg.send_flags = dict_val['send_flags']
msg.extra = dict_val['extra']
msg.acks = dict_val['acks']
return msg
def invalidate_caches(self):
@@ -359,12 +377,16 @@ class Message:
message_copy = copy.deepcopy(self)
# Set the queued flag so the original will be dropped and acks will be sent
self.queued = True
if not self.finalized:
self.queued = True
# Original was dropped so let's make sure we have clean acks and packet id
message_copy.acks = tuple()
message_copy.send_flags &= ~PacketFlags.ACK
message_copy.packet_id = None
message_copy.dropped = False
message_copy.finalized = False
message_copy.queued = False
return message_copy
def to_summary(self):

View File

@@ -62,9 +62,16 @@ class HumanMessageSerializer:
continue
if first_line:
direction, message_name = line.split(" ", 1)
first_split = [x for x in line.split(" ") if x]
direction, message_name = first_split[:2]
options = [x.strip("[]") for x in first_split[2:]]
msg = Message(message_name)
msg.direction = Direction[direction.upper()]
for option in options:
if option in PacketFlags.__members__:
msg.send_flags |= PacketFlags[option]
elif re.match(r"^\d+$", option):
msg.send_flags |= int(option)
first_line = False
continue
@@ -137,9 +144,17 @@ class HumanMessageSerializer:
if msg.direction is not None:
string += f'{msg.direction.name} '
string += msg.name
flags = msg.send_flags
for poss_flag in iter(PacketFlags):
if flags & poss_flag:
flags &= ~poss_flag
string += f" [{poss_flag.name}]"
# Make sure flags with unknown meanings don't get lost
if flags:
string += f" [{int(flags)}]"
if msg.packet_id is not None:
string += f'\n# {msg.packet_id}: {PacketFlags(msg.send_flags)!r}'
string += f'{", DROPPED" if msg.dropped else ""}{", INJECTED" if msg.injected else ""}'
string += f'\n# ID: {msg.packet_id}'
string += f'{", DROPPED" if msg.dropped else ""}{", SYNTHETIC" if msg.synthetic else ""}'
if msg.extra:
string += f'\n# EXTRA: {msg.extra!r}'
string += '\n\n'
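Decomposing `send_flags` into named `[FLAG]` tokens, as the serializer above does, is a small `IntFlag` loop. A standalone sketch (the flag values follow the LLUDP header layout but are illustrative here):

```python
import enum

class PacketFlags(enum.IntFlag):
    ZEROCODED = 0x80
    RELIABLE = 0x40
    RESENT = 0x20
    ACK = 0x10

def flag_tokens(flags: int) -> list:
    tokens = []
    remaining = int(flags)
    for flag in PacketFlags:
        if remaining & flag.value:
            remaining &= ~flag.value
            tokens.append(f"[{flag.name}]")
    if remaining:
        # Preserve bits we have no name for, so round trips are lossless.
        tokens.append(f"[{remaining}]")
    return tokens
```

The trailing numeric token is what keeps "flags with unknown meanings" from being silently dropped when a message is re-serialized.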

View File

@@ -22,6 +22,7 @@ Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
import typing
from .msgtypes import MsgType, MsgBlockType
from ..datatypes import UUID
class MessageTemplateVariable:
@@ -61,6 +62,32 @@ class MessageTemplateVariable:
self._probably_text = self._probably_text and self.name != "NameValue"
return self._probably_text
@property
def default_value(self):
if self.type.is_int:
return 0
elif self.type.is_float:
return 0.0
elif self.type == MsgType.MVT_LLUUID:
return UUID()
elif self.type == MsgType.MVT_BOOL:
return False
elif self.type == MsgType.MVT_VARIABLE:
if self.probably_binary:
return b""
if self.probably_text:
return ""
return b""
elif self.type in (MsgType.MVT_LLVector3, MsgType.MVT_LLVector3d, MsgType.MVT_LLQuaternion):
return 0.0, 0.0, 0.0
elif self.type == MsgType.MVT_LLVector4:
return 0.0, 0.0, 0.0, 0.0
elif self.type == MsgType.MVT_FIXED:
return b"\x00" * self.size
elif self.type == MsgType.MVT_IP_ADDR:
return "0.0.0.0"
return None
class MessageTemplateBlock:
def __init__(self, name):

View File

@@ -68,7 +68,7 @@ class UDPMessageDeserializer:
self.settings = settings or Settings()
self.template_dict = self.DEFAULT_TEMPLATE
def deserialize(self, msg_buff: bytes):
def deserialize(self, msg_buff: bytes) -> Message:
msg = self._parse_message_header(msg_buff)
if not self.settings.ENABLE_DEFERRED_PACKET_PARSING:
try:
@@ -85,6 +85,7 @@ class UDPMessageDeserializer:
reader = se.BufferReader("!", data)
msg: Message = Message("Placeholder")
msg.synthetic = False
msg.send_flags = reader.read(se.U8)
msg.packet_id = reader.read(se.U32)

View File

@@ -1600,6 +1600,7 @@ class RegionHandshakeReplyFlags(IntFlag):
@se.flag_field_serializer("TeleportStart", "Info", "TeleportFlags")
@se.flag_field_serializer("TeleportProgress", "Info", "TeleportFlags")
@se.flag_field_serializer("TeleportFinish", "Info", "TeleportFlags")
@se.flag_field_serializer("TeleportLocal", "Info", "TeleportFlags")
@se.flag_field_serializer("TeleportLureRequest", "Info", "TeleportFlags")
class TeleportFlags(IntFlag):
SET_HOME_TO_TARGET = 1 << 0 # newbie leaving prelude (starter area)
@@ -1618,6 +1619,8 @@ class TeleportFlags(IntFlag):
IS_FLYING = 1 << 13
SHOW_RESET_HOME = 1 << 14
FORCE_REDIRECT = 1 << 15
VIA_GLOBAL_COORDS = 1 << 16
WITHIN_REGION = 1 << 17
@se.http_serializer("RenderMaterials")

View File

@@ -94,7 +94,7 @@ class TransferManager:
if params_dict.get("SessionID", dataclasses.MISSING) is None:
params.SessionID = self._session_id
self._connection_holder.circuit.send_message(Message(
self._connection_holder.circuit.send(Message(
'TransferRequest',
Block(
'TransferInfo',

View File

@@ -1,5 +1,5 @@
from PySide2.QtCore import QMetaObject
from PySide2.QtUiTools import QUiLoader
from PySide6.QtCore import QMetaObject
from PySide6.QtUiTools import QUiLoader
class UiLoader(QUiLoader):

View File

@@ -13,7 +13,7 @@ from xml.etree.ElementTree import parse as parse_etree
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.helpers import get_resource_filename
from hippolyzer.lib.base.legacy_inv import InventorySaleInfo, InventoryPermissions
from hippolyzer.lib.base.inventory import InventorySaleInfo, InventoryPermissions
from hippolyzer.lib.base.legacy_schema import SchemaBase, parse_schema_line, SchemaParsingError
from hippolyzer.lib.base.templates import WearableType

View File

@@ -110,7 +110,7 @@ class XferManager:
direction: Direction = Direction.OUT,
) -> Xfer:
xfer_id = xfer_id if xfer_id is not None else random.getrandbits(64)
self._connection_holder.circuit.send_message(Message(
self._connection_holder.circuit.send(Message(
'RequestXfer',
Block(
'XferID',
@@ -174,7 +174,7 @@ class XferManager:
to_ack = range(xfer.next_ackable, ack_max)
xfer.next_ackable = ack_max
for ack_id in to_ack:
self._connection_holder.circuit.send_message(Message(
self._connection_holder.circuit.send(Message(
"ConfirmXferPacket",
Block("XferID", ID=xfer.xfer_id, Packet=ack_id),
direction=xfer.direction,
@@ -216,7 +216,7 @@ class XferManager:
else:
inline_data = data
self._connection_holder.circuit.send_message(Message(
self._connection_holder.circuit.send(Message(
"AssetUploadRequest",
Block(
"AssetBlock",
@@ -272,7 +272,7 @@ class XferManager:
chunk = xfer.chunks.pop(packet_id)
# EOF if there are no chunks left
packet_val = XferPacket(PacketID=packet_id, IsEOF=not bool(xfer.chunks))
self._connection_holder.circuit.send_message(Message(
self._connection_holder.circuit.send(Message(
"SendXferPacket",
Block("XferID", ID=xfer.xfer_id, Packet_=packet_val),
Block("DataPacket", Data=chunk),

View File

@@ -116,8 +116,8 @@ class ClientObjectManager:
*[Block("ObjectData", ObjectLocalID=x) for x in ids_to_req[:255]],
]
# Selecting causes ObjectProperties to be sent
self._region.circuit.send_message(Message("ObjectSelect", blocks))
self._region.circuit.send_message(Message("ObjectDeselect", blocks))
self._region.circuit.send(Message("ObjectSelect", blocks))
self._region.circuit.send(Message("ObjectDeselect", blocks))
ids_to_req = ids_to_req[255:]
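The `[:255]` / `[255:]` slicing above exists because a single ObjectSelect/ObjectDeselect message holds at most 255 ObjectData blocks. The batching idiom in isolation (hypothetical helper, not part of the codebase):

```python
def batch(ids, size=255):
    """Yield successive slices of at most `size` local IDs."""
    while ids:
        yield ids[:size]
        ids = ids[size:]

# 600 object IDs split into message-sized batches of 255, 255, 90
batches = list(batch(list(range(600))))
```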
futures = []
@@ -150,7 +150,7 @@ class ClientObjectManager:
ids_to_req = local_ids
while ids_to_req:
self._region.circuit.send_message(Message(
self._region.circuit.send(Message(
"RequestMultipleObjects",
Block("AgentData", AgentID=session.agent_id, SessionID=session.id),
*[Block("ObjectData", CacheMissType=0, ID=x) for x in ids_to_req[:255]],

View File

@@ -10,6 +10,7 @@ from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.circuit import ConnectionHolder
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.message.message_handler import MessageHandler
from hippolyzer.lib.base.network.caps_client import CapsClient
from hippolyzer.lib.base.network.transport import ADDR_TUPLE
if TYPE_CHECKING:
@@ -18,10 +19,11 @@ if TYPE_CHECKING:
class BaseClientRegion(ConnectionHolder, abc.ABC):
"""Represents a client's view of a remote region"""
# Actually a weakref
handle: Optional[int]
# Actually a weakref
session: Callable[[], BaseClientSession]
objects: ClientObjectManager
caps_client: CapsClient
class BaseClientSession(abc.ABC):

View File

@@ -73,17 +73,17 @@ def show_message(text, session=None) -> None:
direction=Direction.IN,
)
if session:
session.main_region.circuit.send_message(message)
session.main_region.circuit.send(message)
else:
for session in AddonManager.SESSION_MANAGER.sessions:
session.main_region.circuit.send_message(copy.copy(message))
session.main_region.circuit.send(copy.copy(message))
def send_chat(message: Union[bytes, str], channel=0, chat_type=ChatType.NORMAL, session=None):
session = session or addon_ctx.session.get(None) or None
if not session:
raise RuntimeError("Tried to send chat without session")
session.main_region.circuit.send_message(Message(
session.main_region.circuit.send(Message(
"ChatFromViewer",
Block(
"AgentData",
@@ -181,6 +181,9 @@ class BaseAddon(abc.ABC):
def handle_region_changed(self, session: Session, region: ProxiedRegion):
pass
def handle_region_registered(self, session: Session, region: ProxiedRegion):
pass
def handle_circuit_created(self, session: Session, region: ProxiedRegion):
pass

View File

@@ -16,6 +16,7 @@ from types import ModuleType
from typing import *
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.helpers import get_mtime
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.network.transport import UDPPacket
from hippolyzer.lib.proxy import addon_ctx
@@ -31,13 +32,6 @@ if TYPE_CHECKING:
LOG = logging.getLogger(__name__)
def _get_mtime(path):
try:
return os.stat(path).st_mtime
except:
return None
class BaseInteractionManager:
@abc.abstractmethod
async def open_dir(self, caption: str = '', directory: str = '', filter_str: str = '') -> Optional[str]:
@@ -52,7 +46,8 @@ class BaseInteractionManager:
pass
@abc.abstractmethod
async def save_file(self, caption: str = '', directory: str = '', filter_str: str = '') -> Optional[str]:
async def save_file(self, caption: str = '', directory: str = '', filter_str: str = '',
default_suffix: str = '') -> Optional[str]:
pass
@abc.abstractmethod
@@ -186,7 +181,7 @@ class AddonManager:
def _check_hotreloads(cls):
"""Mark addons that rely on changed files for reloading"""
for filename, importers in cls.HOTRELOAD_IMPORTERS.items():
mtime = _get_mtime(filename)
mtime = get_mtime(filename)
if not mtime or mtime == cls.FILE_MTIMES.get(filename, None):
continue
@@ -215,7 +210,7 @@ class AddonManager:
# Mark the caller as having imported (and being dependent on) `module`
stack = inspect.stack()[1]
cls.HOTRELOAD_IMPORTERS[imported_file].add(stack.filename)
cls.FILE_MTIMES[imported_file] = _get_mtime(imported_file)
cls.FILE_MTIMES[imported_file] = get_mtime(imported_file)
importing_spec = next((s for s in cls.BASE_ADDON_SPECS if s.origin == stack.filename), None)
imported_spec = next((s for s in cls.BASE_ADDON_SPECS if s.origin == imported_file), None)
@@ -263,7 +258,7 @@ class AddonManager:
for spec in cls.BASE_ADDON_SPECS[:]:
had_mod = spec.name in cls.FRESH_ADDON_MODULES
try:
mtime = _get_mtime(spec.origin)
mtime = get_mtime(spec.origin)
mtime_changed = mtime != cls.FILE_MTIMES.get(spec.origin, None)
if not mtime_changed and had_mod:
continue
@@ -526,6 +521,11 @@ class AddonManager:
with addon_ctx.push(session, region):
return cls._call_all_addon_hooks("handle_region_changed", session, region)
@classmethod
def handle_region_registered(cls, session: Session, region: ProxiedRegion):
with addon_ctx.push(session, region):
return cls._call_all_addon_hooks("handle_region_registered", session, region)
@classmethod
def handle_circuit_created(cls, session: Session, region: ProxiedRegion):
with addon_ctx.push(session, region):

View File

@@ -0,0 +1,93 @@
from __future__ import annotations
import enum
import typing
from weakref import ref
from typing import *
if TYPE_CHECKING:
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session, SessionManager
def is_asset_server_cap_name(cap_name):
return cap_name and (
cap_name.startswith("GetMesh")
or cap_name.startswith("GetTexture")
or cap_name.startswith("ViewerAsset")
)
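The predicate above treats any cap whose name starts with `GetMesh`, `GetTexture`, or `ViewerAsset` (covering versioned variants like `GetMesh2`) as an asset-server cap, and short-circuits safely on `None` or empty names:

```python
def is_asset_server_cap_name(cap_name):
    return cap_name and (
        cap_name.startswith("GetMesh")
        or cap_name.startswith("GetTexture")
        or cap_name.startswith("ViewerAsset")
    )

assert is_asset_server_cap_name("GetMesh2")
assert is_asset_server_cap_name("ViewerAsset")
assert not is_asset_server_cap_name("Seed")
assert not is_asset_server_cap_name(None)  # `and` short-circuits, no AttributeError
```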
class CapType(enum.Enum):
NORMAL = enum.auto()
TEMPORARY = enum.auto()
WRAPPER = enum.auto()
PROXY_ONLY = enum.auto()
@property
def fake(self) -> bool:
return self == CapType.PROXY_ONLY or self == CapType.WRAPPER
class SerializedCapData(typing.NamedTuple):
cap_name: typing.Optional[str] = None
region_addr: typing.Optional[str] = None
session_id: typing.Optional[str] = None
base_url: typing.Optional[str] = None
type: str = "NORMAL"
def __bool__(self):
return bool(self.cap_name or self.session_id)
@property
def asset_server_cap(self):
return is_asset_server_cap_name(self.cap_name)
class CapData(NamedTuple):
cap_name: Optional[str] = None
# Actually they're weakrefs but the type sigs suck.
region: Optional[Callable[[], Optional[ProxiedRegion]]] = None
session: Optional[Callable[[], Optional[Session]]] = None
base_url: Optional[str] = None
type: CapType = CapType.NORMAL
def __bool__(self):
return bool(self.cap_name or self.session)
def serialize(self) -> "SerializedCapData":
return SerializedCapData(
cap_name=self.cap_name,
region_addr=str(self.region().circuit_addr) if self.region and self.region() else None,
session_id=str(self.session().id) if self.session and self.session() else None,
base_url=self.base_url,
type=self.type.name,
)
@classmethod
def deserialize(
cls,
ser_cap_data: "SerializedCapData",
session_mgr: Optional[SessionManager],
) -> "CapData":
cap_session = None
cap_region = None
if session_mgr and ser_cap_data.session_id:
for session in session_mgr.sessions:
if ser_cap_data.session_id == str(session.id):
cap_session = session
if cap_session and ser_cap_data.region_addr:
for region in cap_session.regions:
if ser_cap_data.region_addr == str(region.circuit_addr):
cap_region = region
return cls(
cap_name=ser_cap_data.cap_name,
region=ref(cap_region) if cap_region else None,
session=ref(cap_session) if cap_session else None,
base_url=ser_cap_data.base_url,
type=CapType[ser_cap_data.type],
)
@property
def asset_server_cap(self) -> bool:
return is_asset_server_cap_name(self.cap_name)
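`CapData.region` and `CapData.session` hold weakrefs (callables returning the target or `None` once it is gone), which is why `serialize()` guards with `self.region and self.region()`. The pattern in isolation, with a hypothetical stand-in class:

```python
from weakref import ref

class Region:  # hypothetical stand-in for ProxiedRegion
    circuit_addr = ("127.0.0.1", 13000)

region = Region()
region_ref = ref(region)       # store a callable, not the object itself
assert region_ref() is region  # region alive: dereference succeeds

del region                     # once the region is garbage-collected...
assert region_ref() is None    # ...the weakref dereferences to None
```

Storing weakrefs keeps a long-lived `CapData` from pinning a dead region or session in memory after the client disconnects.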

View File

@@ -20,7 +20,7 @@ class ProxyCapsClient(CapsClient):
def _get_caps(self) -> Optional[CAPS_DICT]:
if not self._region:
return None
return self._region.caps
return self._region.cap_urls
def _request_fixups(self, cap_or_url: str, headers: Dict, proxy: Optional[bool], ssl: Any):
# We want to proxy this through Hippolyzer

View File

@@ -25,7 +25,7 @@ class ProxiedCircuit(Circuit):
except:
logging.exception(f"Failed to serialize: {message.to_dict()!r}")
raise
if self.logging_hook and message.injected:
if self.logging_hook and message.synthetic:
self.logging_hook(message)
return self.send_datagram(serialized, message.direction, transport=transport)
@@ -34,44 +34,46 @@ class ProxiedCircuit(Circuit):
return self.out_injections, self.in_injections
return self.in_injections, self.out_injections
def prepare_message(self, message: Message, direction=None):
def prepare_message(self, message: Message):
if message.finalized:
raise RuntimeError(f"Trying to re-send finalized {message!r}")
direction = direction or getattr(message, 'direction')
fwd_injections, reverse_injections = self._get_injections(direction)
if message.queued:
# This is due to be dropped, nothing should be sending the original
raise RuntimeError(f"Trying to send original of queued {message!r}")
fwd_injections, reverse_injections = self._get_injections(message.direction)
message.finalized = True
# Injected, let's gen an ID
if message.packet_id is None:
message.packet_id = fwd_injections.gen_injectable_id()
message.injected = True
else:
message.synthetic = True
# This message wasn't injected by the proxy so we need to rewrite packet IDs
# to account for IDs the real creator of the packet couldn't have known about.
elif not message.synthetic:
# was_dropped needs the unmodified packet ID
if fwd_injections.was_dropped(message.packet_id) and message.name != "PacketAck":
logging.warning("Attempting to re-send previously dropped %s:%s, did we ack?" %
(message.packet_id, message.name))
message.packet_id = fwd_injections.get_effective_id(message.packet_id)
fwd_injections.track_seen(message.packet_id)
message.finalized = True
if not message.injected:
# This message wasn't injected by the proxy so we need to rewrite packet IDs
# to account for IDs the other parties couldn't have known about.
message.acks = tuple(
reverse_injections.get_original_id(x) for x in message.acks
if not reverse_injections.was_injected(x)
)
if message.name == "PacketAck":
if not self._rewrite_packet_ack(message, reverse_injections):
logging.debug(f"Dropping {direction} ack for injected packets!")
if not self._rewrite_packet_ack(message, reverse_injections) and not message.acks:
logging.debug(f"Dropping {message.direction} ack for injected packets!")
# Let caller know this shouldn't be sent at all, it's strictly ACKs for
# injected packets.
return False
elif message.name == "StartPingCheck":
self._rewrite_start_ping_check(message, fwd_injections)
if not message.acks:
if message.acks:
message.send_flags |= PacketFlags.ACK
else:
message.send_flags &= ~PacketFlags.ACK
return True
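The ID rewriting above exists because proxy-injected packets consume sequence IDs that the real sender never saw, so IDs on real packets (and the ACKs referring to them) must be shifted to compensate. A deliberately simplified toy model of that bookkeeping (not the real `InjectionTracker`):

```python
class InjectionTracker:
    """Toy model: each injected ID at or below a real packet's ID
    shifts that packet's effective ID up by one."""
    def __init__(self):
        self.injected = set()

    def gen_injectable_id(self, last_seen):
        # Pick the next ID past everything seen plus prior injections
        new_id = last_seen + len(self.injected) + 1
        self.injected.add(new_id)
        return new_id

    def get_effective_id(self, orig_id):
        effective = orig_id
        for inj in sorted(self.injected):
            if inj <= effective:
                effective += 1
        return effective

t = InjectionTracker()
assert t.gen_injectable_id(last_seen=10) == 11  # proxy injects ID 11
assert t.get_effective_id(11) == 12             # real packet 11 shifts past it
assert t.get_effective_id(5) == 5               # earlier IDs are untouched
```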
@@ -97,15 +99,18 @@ class ProxiedCircuit(Circuit):
new_id = fwd_injections.get_effective_id(orig_id)
if orig_id != new_id:
logging.debug("Rewrote oldest unacked %s -> %s" % (orig_id, new_id))
# Get a list of unacked IDs for the direction this StartPingCheck is heading
fwd_unacked = (a for (d, a) in self.unacked_reliable.keys() if d == message.direction)
# Use the proxy's oldest unacked ID if it's older than the client's
new_id = min((new_id, *fwd_unacked))
message["PingID"]["OldestUnacked"] = new_id
def drop_message(self, message: Message, orig_direction=None):
def drop_message(self, message: Message):
if message.finalized:
raise RuntimeError(f"Trying to drop finalized {message!r}")
if message.packet_id is None:
return
orig_direction = orig_direction or message.direction
fwd_injections, reverse_injections = self._get_injections(orig_direction)
fwd_injections, reverse_injections = self._get_injections(message.direction)
fwd_injections.mark_dropped(message.packet_id)
message.dropped = True
@@ -113,7 +118,7 @@ class ProxiedCircuit(Circuit):
# Was sent reliably, tell the other end that we saw it and to shut up.
if message.reliable:
self.send_acks([message.packet_id], ~orig_direction)
self.send_acks([message.packet_id], ~message.direction)
# This packet had acks for the other end, send them in a separate PacketAck
effective_acks = tuple(
@@ -121,7 +126,7 @@ class ProxiedCircuit(Circuit):
if not reverse_injections.was_injected(x)
)
if effective_acks:
self.send_acks(effective_acks, orig_direction, packet_id=message.packet_id)
self.send_acks(effective_acks, message.direction, packet_id=message.packet_id)
class InjectionTracker:

View File

@@ -26,6 +26,10 @@ class CommandDetails(NamedTuple):
lifetime: Optional[TaskLifeScope] = None
def parse_bool(val: str) -> bool:
return val.lower() in ('on', 'true', '1', '1.0', 'yes')
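`parse_bool` accepts the common truthy spellings case-insensitively; any other string is `False`:

```python
def parse_bool(val: str) -> bool:
    return val.lower() in ('on', 'true', '1', '1.0', 'yes')

assert parse_bool("Yes") and parse_bool("TRUE") and parse_bool("1.0")
assert not parse_bool("off") and not parse_bool("nope")
```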
def handle_command(command_name: Optional[str] = None, /, *, lifetime: Optional[TaskLifeScope] = None,
single_instance: bool = False, **params: Union[Parameter, callable]):
"""
@@ -61,13 +65,13 @@ def handle_command(command_name: Optional[str] = None, /, *, lifetime: Optional[
# Greedy, takes the rest of the message
if param.sep is None:
param_val = message
message = None
message = ""
else:
message = message.lstrip(param.sep)
if not message:
if param.optional:
break
raise KeyError(f"Missing parameter {param_name}")
if not param.optional:
raise KeyError(f"Missing parameter {param_name}")
continue
param_val, _, message = message.partition(param.sep) # type: ignore
param_vals[param_name] = param.parser(param_val)
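The parser walks the command message left to right: a parameter with `sep=None` greedily consumes the rest (now leaving `message = ""` rather than `None`, the fix above), while separated parameters peel one token off the front with `str.partition`. The core idiom:

```python
message = "set_height 1.85 extra words"
# Peel one space-separated token off the front, keep the remainder
param_val, _, message = message.partition(" ")
assert param_val == "set_height"
# A greedy (sep=None) parameter would then take everything that's left,
# leaving an empty string rather than None for any later optional params
assert message == "1.85 extra words"
```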

View File

@@ -58,7 +58,7 @@ class HTTPAssetRepo(collections.UserDict):
return False
asset = self[asset_id]
flow.response = http.HTTPResponse.make(
flow.response = http.Response.make(
content=asset.data,
headers={
"Content-Type": "application/octet-stream",

View File

@@ -18,8 +18,9 @@ from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.llsd_msg_serializer import LLSDMessageSerializer
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.region import ProxiedRegion, CapType
from hippolyzer.lib.proxy.sessions import SessionManager, CapData, Session
from hippolyzer.lib.proxy.caps import CapData, CapType
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import SessionManager, Session
from hippolyzer.lib.proxy.http_proxy import HTTPFlowContext
@@ -82,16 +83,19 @@ class MITMProxyEventManager:
finally:
# If someone has taken this request out of the regular callback flow,
# they'll manually send a callback at some later time.
if not flow.taken:
self.to_proxy_queue.put(("callback", flow.id, flow.get_state()))
if not flow.taken and not flow.resumed:
# Addon hasn't taken ownership of this flow, send it back to mitmproxy
# ourselves.
flow.resume()
def _handle_request(self, flow: HippoHTTPFlow):
url = flow.request.url
cap_data = self.session_manager.resolve_cap(url)
flow.cap_data = cap_data
# Don't do anything special with the proxy's own requests,
# we only pass it through for logging purposes.
if flow.request_injected:
# Don't do anything special with the proxy's own requests unless the requested
# URL can only be handled by the proxy. Ideally we only pass the request through
# for logging purposes.
if flow.request_injected and (not cap_data or not cap_data.type.fake):
return
# The local asset repo gets first bite at the apple
@@ -103,7 +107,7 @@ class MITMProxyEventManager:
AddonManager.handle_http_request(flow)
if cap_data and cap_data.cap_name.endswith("ProxyWrapper"):
orig_cap_name = cap_data.cap_name.rsplit("ProxyWrapper", 1)[0]
orig_cap_url = cap_data.region().caps[orig_cap_name]
orig_cap_url = cap_data.region().cap_urls[orig_cap_name]
split_orig_url = urllib.parse.urlsplit(orig_cap_url)
orig_cap_host = split_orig_url[1]
@@ -120,7 +124,7 @@ class MITMProxyEventManager:
if not flow.can_stream or self._asset_server_proxied:
flow.request.url = redir_url
else:
flow.response = mitmproxy.http.HTTPResponse.make(
flow.response = mitmproxy.http.Response.make(
307,
# Can't provide explanation in the body because this results in failing Range requests under
# mitmproxy that return garbage data. Chances are there's weird interactions
@@ -134,9 +138,41 @@ class MITMProxyEventManager:
)
elif cap_data and cap_data.asset_server_cap:
# Both the wrapper request and the actual asset server request went through
# the proxy
# the proxy. Don't bother trying the redirect strategy anymore.
self._asset_server_proxied = True
logging.warning("noproxy not used, switching to URI rewrite strategy")
elif cap_data and cap_data.cap_name == "EventQueueGet":
# HACK: The sim's EQ acking mechanism doesn't seem to actually work.
# If the client drops the connection due to a timeout before we can
# proxy back the response, it will be lost forever. Keep around
# the last EQ response we got so we can re-send it if the client repeats
# its previous request.
req_ack_id = llsd.parse_xml(flow.request.content)["ack"]
eq_manager = cap_data.region().eq_manager
cached_resp = eq_manager.get_cached_poll_response(req_ack_id)
if cached_resp:
logging.warning("Had to serve a cached EventQueueGet due to client desync")
flow.response = mitmproxy.http.Response.make(
200,
llsd.format_xml(cached_resp),
{
"Content-Type": "application/llsd+xml",
# So we can differentiate these in the log
"X-Hippo-Fake-EQ": "1",
"Connection": "close",
},
)
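A minimal sketch of the ack-keyed cache this relies on (a hypothetical shape for `eq_manager`'s cache, inferred from the `get_cached_poll_response()` / `cache_last_poll_response()` calls in this diff):

```python
class PollResponseCache:
    """Remember the last poll response keyed by the request's ack ID so a
    retried request carrying the same ack can be answered from cache."""
    def __init__(self):
        self._last_ack = None
        self._last_resp = None

    def cache_last_poll_response(self, ack_id, resp):
        self._last_ack, self._last_resp = ack_id, resp

    def get_cached_poll_response(self, ack_id):
        # A repeat of the *same* ack means the client never saw our reply
        if ack_id is not None and ack_id == self._last_ack:
            return self._last_resp
        return None

cache = PollResponseCache()
cache.cache_last_poll_response(7, {"events": [], "id": 8})
assert cache.get_cached_poll_response(7) == {"events": [], "id": 8}
assert cache.get_cached_poll_response(8) is None  # new ack, no replay needed
```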
elif cap_data and cap_data.cap_name == "Seed":
# Drop any proxy-only caps from the seed request we send to the server,
# and add those cap names as metadata so we know to send their URLs in the response
parsed_seed: List[str] = llsd.parse_xml(flow.request.content)
flow.metadata['needed_proxy_caps'] = []
for known_cap_name, (known_cap_type, known_cap_url) in cap_data.region().caps.items():
if known_cap_type == CapType.PROXY_ONLY and known_cap_name in parsed_seed:
parsed_seed.remove(known_cap_name)
flow.metadata['needed_proxy_caps'].append(known_cap_name)
if flow.metadata['needed_proxy_caps']:
flow.request.content = llsd.format_xml(parsed_seed)
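Filtering proxy-only caps out of the Seed request is plain list surgery on the parsed LLSD; a standalone sketch of the same bookkeeping, using hypothetical cap names and a simplified `name -> (is_proxy_only, url)` mapping:

```python
# Caps the client asked for in its Seed request (parsed LLSD is a list of names)
requested = ["Seed", "GetMesh2", "ProxyOnlyExampleCap"]
# Caps registered on the region; names and URLs here are invented for the example
known = {
    "GetMesh2": (False, "https://sim.example/mesh"),
    "ProxyOnlyExampleCap": (True, "https://proxy.local/example"),
}

needed_proxy_caps = []
for name, (proxy_only, _url) in known.items():
    if proxy_only and name in requested:
        requested.remove(name)          # don't ask the sim for caps it can't serve
        needed_proxy_caps.append(name)  # remember to splice our URL into the response

assert requested == ["Seed", "GetMesh2"]
assert needed_proxy_caps == ["ProxyOnlyExampleCap"]
```

On the response side (further down in this diff), the remembered names are used to add the proxy's own URLs back into the cap map before it reaches the client.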
elif not cap_data:
if self._is_login_request(flow):
# Not strictly a Cap, but makes it easier to filter on.
@@ -145,7 +181,7 @@ class MITMProxyEventManager:
if cap_data and cap_data.type == CapType.PROXY_ONLY:
# A proxy addon was supposed to respond itself, but it didn't.
if not flow.taken and not flow.response_injected:
flow.response = mitmproxy.http.HTTPResponse.make(
flow.response = mitmproxy.http.Response.make(
500,
b"Proxy didn't handle proxy-only Cap correctly",
{
@@ -176,10 +212,14 @@ class MITMProxyEventManager:
def _handle_response(self, flow: HippoHTTPFlow):
message_logger = self.session_manager.message_logger
if message_logger:
message_logger.log_http_response(flow)
try:
message_logger.log_http_response(flow)
except:
logging.exception("Failed while logging HTTP flow")
# Don't handle responses for requests injected by the proxy
if flow.request_injected:
# Don't process responses for requests or responses injected by the proxy.
# We already processed it, it came from us!
if flow.request_injected or flow.response_injected:
return
status = flow.response.status_code
@@ -240,7 +280,10 @@ class MITMProxyEventManager:
for cap_name in wrappable_caps:
if cap_name in parsed:
parsed[cap_name] = region.register_wrapper_cap(cap_name)
flow.response.content = llsd.format_pretty_xml(parsed)
# Send the client the URLs for any proxy-only caps it requested
for cap_name in flow.metadata['needed_proxy_caps']:
parsed[cap_name] = region.cap_urls[cap_name]
flow.response.content = llsd.format_xml(parsed)
elif cap_data.cap_name == "EventQueueGet":
parsed_eq_resp = llsd.parse_xml(flow.response.content)
if parsed_eq_resp:
@@ -251,18 +294,21 @@ class MITMProxyEventManager:
new_events.append(event)
# Add on any fake events that've been queued by addons
eq_manager = cap_data.region().eq_manager
new_events.extend(eq_manager.take_events())
new_events.extend(eq_manager.take_injected_events())
parsed_eq_resp["events"] = new_events
# Empty event list is an error, need to return undef instead.
if old_events and not new_events:
# Need at least one event or the viewer will refuse to ack!
new_events.append({"message": "NOP", "body": {}})
flow.response.content = llsd.format_pretty_xml(parsed_eq_resp)
parsed_eq_resp = None
# HACK: see note in above request handler for EventQueueGet
req_ack_id = llsd.parse_xml(flow.request.content)["ack"]
eq_manager.cache_last_poll_response(req_ack_id, parsed_eq_resp)
flow.response.content = llsd.format_xml(parsed_eq_resp)
elif cap_data.cap_name in self.UPLOAD_CREATING_CAPS:
if not region:
return
parsed = llsd.parse_xml(flow.response.content)
if "uploader" in parsed:
region.register_temporary_cap(cap_data.cap_name + "Uploader", parsed["uploader"])
region.register_cap(cap_data.cap_name + "Uploader", parsed["uploader"], CapType.TEMPORARY)
except:
logging.exception("OOPS, blew up in HTTP proxy!")

View File

@@ -1,13 +1,18 @@
from __future__ import annotations
import copy
import multiprocessing
import weakref
from typing import *
from typing import Optional
import mitmproxy.http
from mitmproxy.http import HTTPFlow
from hippolyzer.lib.proxy.caps import CapData
if TYPE_CHECKING:
from hippolyzer.lib.proxy.sessions import CapData, SessionManager
from hippolyzer.lib.proxy.sessions import SessionManager
class HippoHTTPFlow:
@@ -17,24 +22,26 @@ class HippoHTTPFlow:
Hides the nastiness of writing to flow.metadata so we can pass
state back and forth between the two proxies
"""
__slots__ = ("flow",)
__slots__ = ("flow", "callback_queue", "resumed", "taken")
def __init__(self, flow: HTTPFlow):
def __init__(self, flow: HTTPFlow, callback_queue: Optional[multiprocessing.Queue] = None):
self.flow: HTTPFlow = flow
self.resumed = False
self.taken = False
self.callback_queue = weakref.ref(callback_queue) if callback_queue else None
meta = self.flow.metadata
meta.setdefault("taken", False)
meta.setdefault("can_stream", True)
meta.setdefault("response_injected", False)
meta.setdefault("request_injected", False)
meta.setdefault("cap_data", None)
meta.setdefault("cap_data", CapData())
meta.setdefault("from_browser", False)
@property
def request(self) -> mitmproxy.http.HTTPRequest:
def request(self) -> mitmproxy.http.Request:
return self.flow.request
@property
def response(self) -> Optional[mitmproxy.http.HTTPResponse]:
def response(self) -> Optional[mitmproxy.http.Response]:
return self.flow.response
@property
@@ -42,7 +49,7 @@ class HippoHTTPFlow:
return self.flow.id
@response.setter
def response(self, val: Optional[mitmproxy.http.HTTPResponse]):
def response(self, val: Optional[mitmproxy.http.Response]):
self.flow.metadata["response_injected"] = True
self.flow.response = val
@@ -88,12 +95,21 @@ class HippoHTTPFlow:
def take(self) -> HippoHTTPFlow:
"""Don't automatically pass this flow back to mitmproxy"""
self.metadata["taken"] = True
# TODO: Having to explicitly take / release Flows to use them in an async
# context is kind of janky. The HTTP callback handling code should probably
# be made totally async, including the addon hooks. Would coroutine per-callback
# be expensive?
assert not self.taken and not self.resumed
self.taken = True
return self
@property
def taken(self) -> bool:
return self.metadata["taken"]
def resume(self):
"""Release the HTTP flow back to the normal processing flow"""
assert self.callback_queue
assert not self.resumed
self.taken = False
self.resumed = True
self.callback_queue().put(("callback", self.flow.id, self.get_state()))
@property
def is_replay(self) -> bool:
@@ -113,15 +129,18 @@ class HippoHTTPFlow:
return state
@classmethod
def from_state(cls, flow_state: Dict, session_manager: SessionManager) -> HippoHTTPFlow:
def from_state(cls, flow_state: Dict, session_manager: Optional[SessionManager]) -> HippoHTTPFlow:
flow: Optional[HTTPFlow] = HTTPFlow.from_state(flow_state)
assert flow is not None
cap_data_ser = flow.metadata.get("cap_data_ser")
callback_queue = None
if session_manager:
callback_queue = session_manager.flow_context.to_proxy_queue
if cap_data_ser is not None:
flow.metadata["cap_data"] = session_manager.deserialize_cap_data(cap_data_ser)
flow.metadata["cap_data"] = CapData.deserialize(cap_data_ser, session_manager)
else:
flow.metadata["cap_data"] = None
return cls(flow)
return cls(flow, callback_queue)
def copy(self) -> HippoHTTPFlow:
# HACK: flow.copy() expects the flow to be fully JSON serializable, but

View File

@@ -1,5 +1,4 @@
import asyncio
import functools
import logging
import multiprocessing
import os
@@ -15,42 +14,30 @@ import mitmproxy.log
import mitmproxy.master
import mitmproxy.options
import mitmproxy.proxy
from mitmproxy.addons import core, clientplayback
from mitmproxy.addons import core, clientplayback, proxyserver, next_layer, disable_h2c
from mitmproxy.http import HTTPFlow
from mitmproxy.proxy.layers import tls
import OpenSSL
from hippolyzer.lib.base.helpers import get_resource_filename
from hippolyzer.lib.base.multiprocessing_utils import ParentProcessWatcher
orig_sethostflags = OpenSSL.SSL._lib.X509_VERIFY_PARAM_set_hostflags # noqa
@functools.wraps(orig_sethostflags)
def _sethostflags_wrapper(param, flags):
# Since 2000 the recommendation per RFCs has been to only check SANs and not the CN field.
# Most browsers do this, as does mitmproxy. The viewer does not, and the sim certs have no SAN
# field. Just monkeypatch out this flag since mitmproxy's internals are in flux and there's
# no good way to stop setting this flag currently.
return orig_sethostflags(
param,
flags & (~OpenSSL.SSL._lib.X509_CHECK_FLAG_NEVER_CHECK_SUBJECT) # noqa
)
OpenSSL.SSL._lib.X509_VERIFY_PARAM_set_hostflags = _sethostflags_wrapper # noqa
from hippolyzer.lib.proxy.caps import SerializedCapData
class SLCertStore(mitmproxy.certs.CertStore):
def get_cert(self, commonname: typing.Optional[bytes], sans: typing.List[bytes], *args):
cert, privkey, chain = super().get_cert(commonname, sans, *args)
x509: OpenSSL.crypto.X509 = cert.x509
def get_cert(self, commonname: typing.Optional[str], sans: typing.List[str], *args, **kwargs):
entry = super().get_cert(commonname, sans, *args, **kwargs)
cert, privkey, chain = entry.cert, entry.privatekey, entry.chain_file
x509 = cert.to_pyopenssl()
# The cert must have a subject key ID or the viewer will reject it.
for i in range(0, x509.get_extension_count()):
ext = x509.get_extension(i)
# This cert already has a subject key id, pass through.
if ext.get_short_name() == b"subjectKeyIdentifier":
return cert, privkey, chain
return entry
# Need to add a subject key ID onto this cert or the viewer will reject it.
# The viewer doesn't actually use the subject key ID for its intended purpose,
# so a random, unique value is fine.
x509.add_extensions([
OpenSSL.crypto.X509Extension(
b"subjectKeyIdentifier",
@@ -58,17 +45,24 @@ class SLCertStore(mitmproxy.certs.CertStore):
uuid.uuid4().hex.encode("utf8"),
),
])
x509.sign(privkey, "sha256") # type: ignore
return cert, privkey, chain
x509.sign(OpenSSL.crypto.PKey.from_cryptography_key(privkey), "sha256") # type: ignore
new_entry = mitmproxy.certs.CertStoreEntry(
mitmproxy.certs.Cert.from_pyopenssl(x509), privkey, chain
)
# Replace the cert that was created in the base `get_cert()` with our modified cert
self.certs[(commonname, tuple(sans))] = new_entry
self.expire_queue.pop(-1)
self.expire(new_entry)
return new_entry
class SLProxyConfig(mitmproxy.proxy.ProxyConfig):
def configure(self, options, updated) -> None:
super().configure(options, updated)
class SLTlsConfig(mitmproxy.addons.tlsconfig.TlsConfig):
def running(self):
super().running()
old_cert_store = self.certstore
# Replace the cert store with one that knows how to add
# a subject key ID extension.
self.certstore = SLCertStore( # noqa
self.certstore = SLCertStore(
default_privatekey=old_cert_store.default_privatekey,
default_ca=old_cert_store.default_ca,
default_chain_file=old_cert_store.default_chain_file,
@@ -76,6 +70,18 @@ class SLProxyConfig(mitmproxy.proxy.ProxyConfig):
)
self.certstore.certs = old_cert_store.certs
def tls_start_server(self, tls_start: tls.TlsStartData):
super().tls_start_server(tls_start)
# Since 2000 the recommendation per RFCs has been to only check SANs and not the CN field.
# Most browsers do this, as does mitmproxy. The viewer does not, and the sim certs have no SAN
# field. set the host verification flags to remove the flag that disallows falling back to
# checking the CN (X509_CHECK_FLAG_NEVER_CHECK_SUBJECT)
param = OpenSSL.SSL._lib.SSL_get0_param(tls_start.ssl_conn._ssl) # noqa
# get_hostflags() doesn't seem to be exposed, just set the usual flags without
# the problematic `X509_CHECK_FLAG_NEVER_CHECK_SUBJECT` flag.
flags = OpenSSL.SSL._lib.X509_CHECK_FLAG_NO_PARTIAL_WILDCARDS # noqa
OpenSSL.SSL._lib.X509_VERIFY_PARAM_set_hostflags(param, flags) # noqa
class HTTPFlowContext:
def __init__(self):
@@ -92,12 +98,13 @@ class IPCInterceptionAddon:
flow which is merged in and resumed.
"""
def __init__(self, flow_context: HTTPFlowContext):
self.mitmproxy_ready = flow_context.mitmproxy_ready
self.intercepted_flows: typing.Dict[str, HTTPFlow] = {}
self.from_proxy_queue: multiprocessing.Queue = flow_context.from_proxy_queue
self.to_proxy_queue: multiprocessing.Queue = flow_context.to_proxy_queue
self.shutdown_signal: multiprocessing.Event = flow_context.shutdown_signal
def log(self, entry: mitmproxy.log.LogEntry):
def add_log(self, entry: mitmproxy.log.LogEntry):
if entry.level == "debug":
logging.debug(entry.msg)
elif entry.level in ("alert", "info"):
@@ -112,6 +119,8 @@ class IPCInterceptionAddon:
def running(self):
# register to pump the events or something here
asyncio.create_task(self._pump_callbacks())
# Tell the main process mitmproxy is ready to handle requests
self.mitmproxy_ready.set()
async def _pump_callbacks(self):
watcher = ParentProcessWatcher(self.shutdown_signal)
@@ -127,9 +136,6 @@ class IPCInterceptionAddon:
if event_type == "callback":
orig_flow = self.intercepted_flows.pop(flow_id)
orig_flow.set_state(flow_state)
# Remove the taken flag from the flow if present, the flow by definition
# isn't take()n anymore once it's been passed back to the proxy.
orig_flow.metadata.pop("taken", None)
elif event_type == "replay":
flow: HTTPFlow = HTTPFlow.from_state(flow_state)
# mitmproxy won't replay intercepted flows, this is an old flow so
@@ -169,7 +175,7 @@ class IPCInterceptionAddon:
def responseheaders(self, flow: HTTPFlow):
# The response was injected earlier in an earlier handler,
# we don't want to touch this anymore.
if flow.metadata["response_injected"]:
if flow.metadata.get("response_injected"):
return
# Someone fucked up and put a mimetype in Content-Encoding.
@@ -180,7 +186,10 @@ class IPCInterceptionAddon:
flow.response.headers["Content-Encoding"] = "identity"
def response(self, flow: HTTPFlow):
if flow.metadata["response_injected"]:
cap_data: typing.Optional[SerializedCapData] = flow.metadata.get("cap_data")
if flow.metadata.get("response_injected") and cap_data and cap_data.asset_server_cap:
# Don't bother intercepting asset server requests where we injected a response.
# We don't want to log them and they don't need any more processing by user hooks.
return
self._queue_flow_interception("response", flow)
@@ -188,10 +197,10 @@ class IPCInterceptionAddon:
class SLMITMAddon(IPCInterceptionAddon):
def responseheaders(self, flow: HTTPFlow):
super().responseheaders(flow)
cap_data: typing.Optional[SerializedCapData] = flow.metadata["cap_data_ser"]
cap_data: typing.Optional[SerializedCapData] = flow.metadata.get("cap_data_ser")
# Request came from the proxy itself, don't touch it.
if flow.metadata["request_injected"]:
if flow.metadata.get("request_injected"):
return
# This is an asset server response that we're not interested in intercepting.
@@ -200,7 +209,7 @@ class SLMITMAddon(IPCInterceptionAddon):
# Can't stream if we injected our own response or we were asked not to stream
if not flow.metadata["response_injected"] and flow.metadata["can_stream"]:
flow.response.stream = True
elif not cap_data and not flow.metadata["from_browser"]:
elif not cap_data and not flow.metadata.get("from_browser"):
object_name = flow.response.headers.get("X-SecondLife-Object-Name", "")
# Meh. Add some fake Cap data for this so it can be matched on.
if object_name.startswith("#Firestorm LSL Bridge"):
@@ -213,7 +222,11 @@ class SLMITMMaster(mitmproxy.master.Master):
self.addons.add(
core.Core(),
clientplayback.ClientPlayback(),
SLMITMAddon(flow_context)
disable_h2c.DisableH2C(),
proxyserver.Proxyserver(),
next_layer.NextLayer(),
SLTlsConfig(),
SLMITMAddon(flow_context),
)
def start_server(self):
@@ -242,30 +255,4 @@ def create_proxy_master(host, port, flow_context: HTTPFlowContext): # pragma: n
def create_http_proxy(bind_host, port, flow_context: HTTPFlowContext): # pragma: no cover
master = create_proxy_master(bind_host, port, flow_context)
pconf = SLProxyConfig(master.options)
server = mitmproxy.proxy.server.ProxyServer(pconf)
master.server = server
return master
def is_asset_server_cap_name(cap_name):
return cap_name and (
cap_name.startswith("GetMesh")
or cap_name.startswith("GetTexture")
or cap_name.startswith("ViewerAsset")
)
class SerializedCapData(typing.NamedTuple):
cap_name: typing.Optional[str] = None
region_addr: typing.Optional[str] = None
session_id: typing.Optional[str] = None
base_url: typing.Optional[str] = None
type: str = "NORMAL"
def __bool__(self):
return bool(self.cap_name or self.session_id)
@property
def asset_server_cap(self):
return is_asset_server_cap_name(self.cap_name)
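The `__bool__` override above makes an all-`None` `SerializedCapData` falsy, even though a plain `NamedTuple` instance is always truthy. A condensed, runnable sketch of that behavior (fields trimmed down from the class above):

```python
import typing


class SerializedCapData(typing.NamedTuple):
    cap_name: typing.Optional[str] = None
    session_id: typing.Optional[str] = None

    def __bool__(self):
        # Truthy only if the flow could be tied to a cap or session
        return bool(self.cap_name or self.session_id)

    @property
    def asset_server_cap(self) -> bool:
        # Asset-fetching caps share a handful of well-known name prefixes
        return bool(self.cap_name) and self.cap_name.startswith(
            ("GetMesh", "GetTexture", "ViewerAsset")
        )


# An empty instance reads naturally as "no cap resolved"
assert not SerializedCapData()
assert SerializedCapData(cap_name="GetMesh2")
assert SerializedCapData(cap_name="ViewerAsset").asset_server_cap
```

This is why guards like `if flow.metadata.get("response_injected") and cap_data and ...` can test the tuple directly rather than checking individual fields.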

View File

@@ -1,3 +1,4 @@
import asyncio
import logging
import weakref
from typing import Optional, Tuple
@@ -35,6 +36,17 @@ class InterceptingLLUDPProxyProtocol(UDPProxyProtocol):
)
self.message_xml = MessageDotXML()
self.session: Optional[Session] = None
self.resend_task = asyncio.get_event_loop().create_task(self.attempt_resends())
async def attempt_resends(self):
while True:
await asyncio.sleep(0.1)
if self.session is None:
continue
for region in self.session.regions:
if not region.circuit or not region.circuit.is_alive:
continue
region.circuit.resend_unacked()
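The resend loop above is a standard periodic-sweep pattern: a long-lived task wakes on an interval and gives each live circuit a chance to resend unACKed packets. A minimal sketch with a stand-in circuit (the real `resend_unacked()` semantics live in the circuit class, not shown here):

```python
import asyncio


class FakeCircuit:
    """Stand-in for a region circuit; just counts resend sweeps."""
    def __init__(self):
        self.is_alive = True
        self.sweeps = 0

    def resend_unacked(self):
        self.sweeps += 1


async def pump_resends(circuits, interval=0.01, rounds=3):
    # Same shape as attempt_resends() above, but bounded so it terminates:
    # periodically walk the circuits and skip any that aren't alive.
    for _ in range(rounds):
        await asyncio.sleep(interval)
        for circuit in circuits:
            if not circuit.is_alive:
                continue
            circuit.resend_unacked()


live, dead = FakeCircuit(), FakeCircuit()
dead.is_alive = False
asyncio.run(pump_resends([live, dead]))
assert live.sweeps == 3
assert dead.sweeps == 0
```

Cancelling the task in `close()` (as the diff does with `self.resend_task.cancel()`) is what keeps the `while True` version from outliving its protocol.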
def _ensure_message_allowed(self, msg: Message):
if not self.message_xml.validate_udp_msg(msg.name):
@@ -99,6 +111,9 @@ class InterceptingLLUDPProxyProtocol(UDPProxyProtocol):
LOG.error("No circuit for %r, dropping packet!" % (packet.far_addr,))
return
# Process any ACKs for messages we injected first
region.circuit.collect_acks(message)
if message.name == "AgentMovementComplete":
self.session.main_region = region
if region.handle is None:
@@ -131,7 +146,7 @@ class InterceptingLLUDPProxyProtocol(UDPProxyProtocol):
# This message is owned by an async handler, drop it so it doesn't get
# sent with the normal flow.
if message.queued and not message.dropped:
if message.queued:
region.circuit.drop_message(message)
# Shouldn't mutate the message past this point, so log it now.
@@ -146,8 +161,9 @@ class InterceptingLLUDPProxyProtocol(UDPProxyProtocol):
elif message.name == "RegionHandshake":
region.name = str(message["RegionInfo"][0]["SimName"])
if not message.dropped:
region.circuit.send_message(message)
# Send the message if it wasn't explicitly dropped or sent before
if not message.finalized:
region.circuit.send(message)
def close(self):
super().close()
@@ -155,3 +171,4 @@ class InterceptingLLUDPProxyProtocol(UDPProxyProtocol):
AddonManager.handle_session_closed(self.session)
self.session_manager.close_session(self.session)
self.session = None
self.resend_task.cancel()

View File

@@ -3,7 +3,7 @@ import ast
import typing
from arpeggio import Optional, ZeroOrMore, EOF, \
ParserPython, PTNodeVisitor, visit_parse_tree, RegExMatch
ParserPython, PTNodeVisitor, visit_parse_tree, RegExMatch, OneOrMore
def literal():
@@ -12,7 +12,7 @@ def literal():
# https://stackoverflow.com/questions/14366401/#comment79795017_14366904
RegExMatch(r'''b?(\"\"\"|\'\'\'|\"|\')((?<!\\)(\\\\)*\\\1|.)*?\1'''),
# base16
RegExMatch(r'0x\d+'),
RegExMatch(r'0x[0-9a-fA-F]+'),
# base10 int or float.
RegExMatch(r'\d+(\.\d+)?'),
"None",
@@ -26,7 +26,9 @@ def literal():
def identifier():
return RegExMatch(r'[a-zA-Z*]([a-zA-Z0-9_*]+)?')
# Identifiers are allowed to have "-". It's not a special character
# in our grammar, and we expect them to show up some places, like header names.
return RegExMatch(r'[a-zA-Z*]([a-zA-Z0-9_*-]+)?')
def field_specifier():
@@ -42,7 +44,7 @@ def unary_expression():
def meta_field_specifier():
return "Meta", ".", identifier
return "Meta", OneOrMore(".", identifier)
def enum_field_specifier():
@@ -69,12 +71,17 @@ def message_filter():
return expression, EOF
MATCH_RESULT = typing.Union[bool, typing.Tuple]
class MatchResult(typing.NamedTuple):
result: bool
fields: typing.List[typing.Tuple]
def __bool__(self):
return self.result
class BaseFilterNode(abc.ABC):
@abc.abstractmethod
def match(self, msg) -> MATCH_RESULT:
def match(self, msg, short_circuit=True) -> MatchResult:
raise NotImplementedError()
@property
@@ -104,18 +111,36 @@ class BinaryFilterNode(BaseFilterNode, abc.ABC):
class UnaryNotFilterNode(UnaryFilterNode):
def match(self, msg) -> MATCH_RESULT:
return not self.node.match(msg)
def match(self, msg, short_circuit=True) -> MatchResult:
# Should we pass fields up here? Maybe not.
return MatchResult(not self.node.match(msg, short_circuit), [])
class OrFilterNode(BinaryFilterNode):
def match(self, msg) -> MATCH_RESULT:
return self.left_node.match(msg) or self.right_node.match(msg)
def match(self, msg, short_circuit=True) -> MatchResult:
left_match = self.left_node.match(msg, short_circuit)
if left_match and short_circuit:
return MatchResult(True, left_match.fields)
right_match = self.right_node.match(msg, short_circuit)
if right_match and short_circuit:
return MatchResult(True, right_match.fields)
if left_match or right_match:
# Fine since fields should be empty when result=False
return MatchResult(True, left_match.fields + right_match.fields)
return MatchResult(False, [])
class AndFilterNode(BinaryFilterNode):
def match(self, msg) -> MATCH_RESULT:
return self.left_node.match(msg) and self.right_node.match(msg)
def match(self, msg, short_circuit=True) -> MatchResult:
left_match = self.left_node.match(msg, short_circuit)
if not left_match:
return MatchResult(False, [])
right_match = self.right_node.match(msg, short_circuit)
if not right_match:
return MatchResult(False, [])
return MatchResult(True, left_match.fields + right_match.fields)
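The `AndFilterNode` change above is the crux of the "highlight every matched field" fix: `and` must evaluate both sides so it can merge the matched-field lists, whereas the old boolean version stopped at the first truthy result. A self-contained sketch with hypothetical leaf nodes (not the project's real node classes):

```python
import typing


class MatchResult(typing.NamedTuple):
    result: bool
    fields: typing.List[tuple]

    def __bool__(self):
        return self.result


class Leaf:
    """Hypothetical leaf node returning a fixed result."""
    def __init__(self, result: bool, fields: list):
        self._res = MatchResult(result, fields)

    def match(self, msg, short_circuit=True) -> MatchResult:
        return self._res


class And:
    def __init__(self, left, right):
        self.left, self.right = left, right

    def match(self, msg, short_circuit=True) -> MatchResult:
        left = self.left.match(msg, short_circuit)
        if not left:
            return MatchResult(False, [])
        right = self.right.match(msg, short_circuit)
        if not right:
            return MatchResult(False, [])
        # Both sides matched: merge their fields for highlighting
        return MatchResult(True, left.fields + right.fields)


both = And(Leaf(True, [("a",)]), Leaf(True, [("b",)]))
assert both.match(None).fields == [("a",), ("b",)]
assert not And(Leaf(False, []), Leaf(True, [("b",)])).match(None)
```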
class MessageFilterNode(BaseFilterNode):
@@ -124,15 +149,15 @@ class MessageFilterNode(BaseFilterNode):
self.operator = operator
self.value = value
def match(self, msg) -> MATCH_RESULT:
return msg.matches(self)
def match(self, msg, short_circuit=True) -> MatchResult:
return msg.matches(self, short_circuit)
@property
def children(self):
return self.selector, self.operator, self.value
class MetaFieldSpecifier(str):
class MetaFieldSpecifier(tuple):
pass
@@ -158,7 +183,7 @@ class MessageFilterVisitor(PTNodeVisitor):
return LiteralValue(ast.literal_eval(node.value))
def visit_meta_field_specifier(self, _node, children):
return MetaFieldSpecifier(children[0])
return MetaFieldSpecifier(children)
def visit_enum_field_specifier(self, _node, children):
return EnumFieldSpecifier(*children)

View File

@@ -1,8 +1,11 @@
from __future__ import annotations
import abc
import ast
import collections
import copy
import fnmatch
import gzip
import io
import logging
import pickle
@@ -13,16 +16,16 @@ import weakref
from defusedxml import minidom
from hippolyzer.lib.base import serialization as se, llsd
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.datatypes import TaggedUnion, UUID, TupleCoord
from hippolyzer.lib.base.helpers import bytes_escape
from hippolyzer.lib.base.message.message_formatting import HumanMessageSerializer
from hippolyzer.lib.proxy.message_filter import MetaFieldSpecifier, compile_filter, BaseFilterNode, MessageFilterNode, \
EnumFieldSpecifier
from hippolyzer.lib.proxy.region import CapType
EnumFieldSpecifier, MatchResult
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.caps import CapType, SerializedCapData
if typing.TYPE_CHECKING:
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
@@ -30,24 +33,42 @@ LOG = logging.getLogger(__name__)
class BaseMessageLogger:
paused: bool
def log_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
pass
if self.paused:
return False
return self.add_log_entry(LLUDPMessageLogEntry(message, region, session))
def log_http_response(self, flow: HippoHTTPFlow):
pass
if self.paused:
return False
# These are huge, let's not log them for now.
if flow.cap_data and flow.cap_data.asset_server_cap:
return False
return self.add_log_entry(HTTPMessageLogEntry(flow))
def log_eq_event(self, session: Session, region: ProxiedRegion, event: dict):
if self.paused:
return False
return self.add_log_entry(EQMessageLogEntry(event, region, session))
@abc.abstractmethod
def add_log_entry(self, entry: AbstractMessageLogEntry):
pass
class FilteringMessageLogger(BaseMessageLogger):
def __init__(self):
def __init__(self, maxlen=2000):
BaseMessageLogger.__init__(self)
self._raw_entries = collections.deque(maxlen=2000)
self._raw_entries = collections.deque(maxlen=maxlen)
self._filtered_entries: typing.List[AbstractMessageLogEntry] = []
self._paused = False
self.paused = False
self.filter: BaseFilterNode = compile_filter("")
def __iter__(self) -> typing.Iterator[AbstractMessageLogEntry]:
return iter(self._filtered_entries)
def set_filter(self, filter_str: str):
self.filter = compile_filter(filter_str)
self._begin_reset()
@@ -61,25 +82,7 @@ class FilteringMessageLogger(BaseMessageLogger):
self._end_reset()
def set_paused(self, paused: bool):
self._paused = paused
def log_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
if self._paused:
return
self._add_log_entry(LLUDPMessageLogEntry(message, region, session))
def log_http_response(self, flow: HippoHTTPFlow):
if self._paused:
return
# These are huge, let's not log them for now.
if flow.cap_data and flow.cap_data.asset_server_cap:
return
self._add_log_entry(HTTPMessageLogEntry(flow))
def log_eq_event(self, session: Session, region: ProxiedRegion, event: dict):
if self._paused:
return
self._add_log_entry(EQMessageLogEntry(event, region, session))
self.paused = paused
# Hooks that Qt models will want to implement
def _begin_insert(self, insert_idx: int):
@@ -94,25 +97,21 @@ class FilteringMessageLogger(BaseMessageLogger):
def _end_reset(self):
pass
def _add_log_entry(self, entry: AbstractMessageLogEntry):
def add_log_entry(self, entry: AbstractMessageLogEntry):
try:
# Paused, throw it away.
if self._paused:
return
if self.paused:
return False
self._raw_entries.append(entry)
if self.filter.match(entry):
next_idx = len(self._filtered_entries)
self._begin_insert(next_idx)
self._filtered_entries.append(entry)
self._end_insert()
entry.cache_summary()
# In the common case we don't need to keep around the serialization
# caches anymore. If the filter changes, the caches will be repopulated
# as necessary.
entry.freeze()
return True
except Exception:
LOG.exception("Failed to filter queued message")
return False
def clear(self):
self._begin_reset()
@@ -121,7 +120,27 @@ class FilteringMessageLogger(BaseMessageLogger):
self._end_reset()
class AbstractMessageLogEntry:
class WrappingMessageLogger(BaseMessageLogger):
def __init__(self):
self.loggers: typing.List[BaseMessageLogger] = []
@property
def paused(self):
return all(x.paused for x in self.loggers)
def add_log_entry(self, entry: AbstractMessageLogEntry):
logged = False
for logger in self.loggers:
if logger.add_log_entry(entry):
logged = True
# At least one logger ended up keeping the message around, so let's
# cache the summary before we freeze the message.
if logged:
entry.cache_summary()
entry.freeze()
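The `WrappingMessageLogger` above is a fan-out: every child logger sees the entry, and the shared `cache_summary()` / `freeze()` step happens once, only if someone kept it. A runnable sketch of the same shape (simplified, without the freeze step):

```python
class RecordingLogger:
    """Minimal child logger: keeps entries unless paused."""
    def __init__(self, paused=False):
        self.paused = paused
        self.entries = []

    def add_log_entry(self, entry) -> bool:
        if self.paused:
            return False
        self.entries.append(entry)
        return True


class WrappingLogger:
    def __init__(self, loggers):
        self.loggers = loggers

    @property
    def paused(self):
        # Only "paused" if every child is paused
        return all(x.paused for x in self.loggers)

    def add_log_entry(self, entry) -> bool:
        # Materialize the list first so every child runs;
        # a bare generator in any() would short-circuit.
        return any([lg.add_log_entry(entry) for lg in self.loggers])


active, paused = RecordingLogger(), RecordingLogger(paused=True)
wrapper = WrappingLogger([active, paused])
assert wrapper.add_log_entry("msg")
assert active.entries == ["msg"] and paused.entries == []
assert not wrapper.paused
```

Note the list comprehension inside `any()`: it guarantees every child gets the entry even after one accepts it, which matches the `for` loop in the diff above.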
class AbstractMessageLogEntry(abc.ABC):
region: typing.Optional[ProxiedRegion]
session: typing.Optional[Session]
name: str
@@ -129,7 +148,7 @@ class AbstractMessageLogEntry:
__slots__ = ["_region", "_session", "_region_name", "_agent_id", "_summary", "meta"]
def __init__(self, region, session):
def __init__(self, region: typing.Optional[ProxiedRegion], session: typing.Optional[Session]):
if region and not isinstance(region, weakref.ReferenceType):
region = weakref.ref(region)
if session and not isinstance(session, weakref.ReferenceType):
@@ -159,6 +178,45 @@ class AbstractMessageLogEntry:
"SelectedFull": self._current_selected_full(),
}
def to_dict(self) -> dict:
meta = self.meta.copy()
def _dehydrate_meta_uuid(key: str):
if meta[key]:
meta[key] = str(meta[key])
_dehydrate_meta_uuid("AgentID")
_dehydrate_meta_uuid("SelectedFull")
_dehydrate_meta_uuid("SessionID")
return {
"type": self.type,
"region_name": self.region_name,
"agent_id": str(self.agent_id) if self.agent_id is not None else None,
"summary": self.summary,
"meta": meta,
}
@classmethod
@abc.abstractmethod
def from_dict(cls, val: dict):
pass
def apply_dict(self, val: dict) -> None:
self._region_name = val['region_name']
self._agent_id = UUID(val['agent_id']) if val['agent_id'] else None
self._summary = val['summary']
meta = val['meta'].copy()
def _hydrate_meta_uuid(key: str):
if meta[key]:
meta[key] = UUID(meta[key])
_hydrate_meta_uuid("AgentID")
_hydrate_meta_uuid("SelectedFull")
_hydrate_meta_uuid("SessionID")
self.meta.update(meta)
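The `_dehydrate_meta_uuid` / `_hydrate_meta_uuid` helpers above exist because UUIDs must become plain strings before the entry dict can be round-tripped through `repr()` / `ast.literal_eval`. A standalone sketch of the round trip, using the stdlib `uuid.UUID` as a stand-in for the project's UUID type:

```python
import uuid

UUID_KEYS = ("AgentID", "SessionID")


def dehydrate_meta(meta: dict) -> dict:
    # Stringify UUIDs so the dict only contains literal-safe types
    out = meta.copy()
    for key in UUID_KEYS:
        if out.get(key):
            out[key] = str(out[key])
    return out


def hydrate_meta(meta: dict) -> dict:
    out = meta.copy()
    for key in UUID_KEYS:
        if out.get(key):
            out[key] = uuid.UUID(out[key])
    return out


agent = uuid.uuid4()
flat = dehydrate_meta({"AgentID": agent, "SessionID": None})
assert isinstance(flat["AgentID"], str)
restored = hydrate_meta(flat)
assert restored["AgentID"] == agent
assert restored["SessionID"] is None
```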
def freeze(self):
pass
@@ -177,7 +235,7 @@ class AbstractMessageLogEntry:
obj = self.region.objects.lookup_localid(selected_local)
return obj and obj.FullID
def _get_meta(self, name: str):
def _get_meta(self, name: str) -> typing.Any:
# Slight difference in semantics. Filters are meant to return the same
# thing no matter when they're run, so SelectedLocal and friends resolve
# to the selected items _at the time the message was logged_. To handle
@@ -250,7 +308,9 @@ class AbstractMessageLogEntry:
def _val_matches(self, operator, val, expected):
if isinstance(expected, MetaFieldSpecifier):
expected = self._get_meta(str(expected))
if len(expected) != 1:
raise ValueError(f"Can only support single-level Meta specifiers, not {expected!r}")
expected = self._get_meta(str(expected[0]))
if not isinstance(expected, (int, float, bytes, str, type(None), tuple)):
if callable(expected):
expected = expected()
@@ -304,12 +364,18 @@ class AbstractMessageLogEntry:
if matcher.value or matcher.operator:
return False
return self._packet_root_matches(matcher.selector[0])
if len(matcher.selector) == 2 and matcher.selector[0] == "Meta":
return self._val_matches(matcher.operator, self._get_meta(matcher.selector[1]), matcher.value)
if matcher.selector[0] == "Meta":
if len(matcher.selector) == 2:
return self._val_matches(matcher.operator, self._get_meta(matcher.selector[1]), matcher.value)
elif len(matcher.selector) == 3:
meta_dict = self._get_meta(matcher.selector[1])
if not meta_dict or not hasattr(meta_dict, 'get'):
return False
return self._val_matches(matcher.operator, meta_dict.get(matcher.selector[2]), matcher.value)
return None
def matches(self, matcher: "MessageFilterNode"):
return self._base_matches(matcher) or False
def matches(self, matcher: "MessageFilterNode", short_circuit=True) -> "MatchResult":
return MatchResult(self._base_matches(matcher) or False, [])
@property
def seq(self):
@@ -330,6 +396,14 @@ class AbstractMessageLogEntry:
xmlified = re.sub(rb" <key>", b"<key>", xmlified)
return xmlified.decode("utf8", errors="replace")
@staticmethod
def _format_xml(content):
beautified = minidom.parseString(content).toprettyxml(indent=" ")
# kill blank lines. will break cdata sections. meh.
beautified = re.sub(r'\n\s*\n', '\n', beautified, flags=re.MULTILINE)
return re.sub(r'<([\w]+)>\s*</\1>', r'<\1></\1>',
beautified, flags=re.MULTILINE)
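`minidom.toprettyxml()` is notorious for injecting whitespace-only lines, which is what the two regex passes above clean up. A runnable version of the helper showing the effect on a whitespace-only element:

```python
import re
from xml.dom import minidom


def format_xml(content: bytes) -> str:
    beautified = minidom.parseString(content).toprettyxml(indent="  ")
    # Collapse the blank lines pretty-printing introduces
    # (this can mangle CDATA sections, as the comment above concedes)
    beautified = re.sub(r'\n\s*\n', '\n', beautified, flags=re.MULTILINE)
    # Re-join open/close tags that only have whitespace between them
    return re.sub(r'<([\w]+)>\s*</\1>', r'<\1></\1>',
                  beautified, flags=re.MULTILINE)


pretty = format_xml(b"<root><a>\n  </a><b>1</b></root>")
# The whitespace-only <a> element collapses back to a tight pair
assert "<a></a>" in pretty
```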
class HTTPMessageLogEntry(AbstractMessageLogEntry):
__slots__ = ["flow"]
@@ -342,7 +416,7 @@ class HTTPMessageLogEntry(AbstractMessageLogEntry):
super().__init__(region, session)
# This was a request the proxy made through itself
self.meta["Injected"] = flow.request_injected
self.meta["Synthetic"] = flow.request_injected
@property
def type(self):
@@ -418,13 +492,17 @@ class HTTPMessageLogEntry(AbstractMessageLogEntry):
if not beautified:
content_type = self._guess_content_type(message)
if content_type.startswith("application/llsd"):
beautified = self._format_llsd(llsd.parse(message.content))
try:
beautified = self._format_llsd(llsd.parse(message.content))
except llsd.LLSDParseError:
# Sometimes LL sends plain XML with a Content-Type of application/llsd+xml.
# Try to detect that case and work around it
if content_type == "application/llsd+xml" and message.content.startswith(b'<'):
beautified = self._format_xml(message.content)
else:
raise
elif any(content_type.startswith(x) for x in ("application/xml", "text/xml")):
beautified = minidom.parseString(message.content).toprettyxml(indent=" ")
# kill blank lines. will break cdata sections. meh.
beautified = re.sub(r'\n\s*\n', '\n', beautified, flags=re.MULTILINE)
beautified = re.sub(r'<([\w]+)>\s*</\1>', r'<\1></\1>',
beautified, flags=re.MULTILINE)
beautified = self._format_xml(message.content)
except:
LOG.exception("Failed to beautify message")
@@ -483,6 +561,40 @@ class HTTPMessageLogEntry(AbstractMessageLogEntry):
return "application/xml"
return content_type
def _get_meta(self, name: str) -> typing.Any:
lower_name = name.lower()
if lower_name == "url":
return self.flow.request.url
elif lower_name == "reqheaders":
return self.flow.request.headers
elif lower_name == "respheaders":
return self.flow.response.headers
elif lower_name == "host":
return self.flow.request.host.lower()
elif lower_name == "status":
return self.flow.response.status_code
return super()._get_meta(name)
def to_dict(self):
val = super().to_dict()
val['flow'] = self.flow.get_state()
cap_data = val['flow'].get('metadata', {}).get('cap_data_ser')
if cap_data is not None:
# Have to convert this from a namedtuple to a dict to make
# it importable
cap_dict = cap_data._asdict() # noqa
val['flow']['metadata']['cap_data_ser'] = cap_dict
return val
@classmethod
def from_dict(cls, val: dict):
cap_data = val['flow'].get('metadata', {}).get('cap_data_ser')
if cap_data:
val['flow']['metadata']['cap_data_ser'] = SerializedCapData(**cap_data)
ev = cls(HippoHTTPFlow.from_state(val['flow'], None))
ev.apply_dict(val)
return ev
class EQMessageLogEntry(AbstractMessageLogEntry):
__slots__ = ["event"]
@@ -510,6 +622,17 @@ class EQMessageLogEntry(AbstractMessageLogEntry):
self._summary = llsd.format_notation(self.event["body"]).decode("utf8")[:500]
return self._summary
def to_dict(self) -> dict:
val = super().to_dict()
val['event'] = llsd.format_notation(self.event)
return val
@classmethod
def from_dict(cls, val: dict):
ev = cls(llsd.parse_notation(val['event']), None, None)
ev.apply_dict(val)
return ev
class LLUDPMessageLogEntry(AbstractMessageLogEntry):
__slots__ = ["_message", "_name", "_direction", "_frozen_message", "_seq", "_deserializer"]
@@ -524,7 +647,7 @@ class LLUDPMessageLogEntry(AbstractMessageLogEntry):
super().__init__(region, session)
_MESSAGE_META_ATTRS = {
"Injected", "Dropped", "Extra", "Resent", "Zerocoded", "Acks", "Reliable",
"Synthetic", "Dropped", "Extra", "Resent", "Zerocoded", "Acks", "Reliable",
}
def _get_meta(self, name: str):
@@ -582,20 +705,21 @@ class LLUDPMessageLogEntry(AbstractMessageLogEntry):
def request(self, beautify=False, replacements=None):
return HumanMessageSerializer.to_human_string(self.message, replacements, beautify)
def matches(self, matcher):
def matches(self, matcher, short_circuit=True) -> "MatchResult":
base_matched = self._base_matches(matcher)
if base_matched is not None:
return base_matched
return MatchResult(base_matched, [])
if not self._packet_root_matches(matcher.selector[0]):
return False
return MatchResult(False, [])
message = self.message
selector_len = len(matcher.selector)
# name, block_name, var_name(, subfield_name)?
if selector_len not in (3, 4):
return False
return MatchResult(False, [])
found_field_keys = []
for block_name in message.blocks:
if not fnmatch.fnmatchcase(block_name, matcher.selector[1]):
continue
@@ -604,13 +728,13 @@ class LLUDPMessageLogEntry(AbstractMessageLogEntry):
if not fnmatch.fnmatchcase(var_name, matcher.selector[2]):
continue
# So we know where the match happened
span_key = (message.name, block_name, block_num, var_name)
field_key = (message.name, block_name, block_num, var_name)
if selector_len == 3:
# We're just matching on the var existing, not having any particular value
if matcher.value is None:
return span_key
if self._val_matches(matcher.operator, block[var_name], matcher.value):
return span_key
found_field_keys.append(field_key)
elif self._val_matches(matcher.operator, block[var_name], matcher.value):
found_field_keys.append(field_key)
# Need to invoke a special unpacker
elif selector_len == 4:
try:
@@ -621,15 +745,21 @@ class LLUDPMessageLogEntry(AbstractMessageLogEntry):
if isinstance(deserialized, TaggedUnion):
deserialized = deserialized.value
if not isinstance(deserialized, dict):
return False
continue
for key in deserialized.keys():
if fnmatch.fnmatchcase(str(key), matcher.selector[3]):
if matcher.value is None:
return span_key
if self._val_matches(matcher.operator, deserialized[key], matcher.value):
return span_key
# Short-circuiting the check on individual subfields is fine since
# we only highlight whole fields anyway.
found_field_keys.append(field_key)
break

elif self._val_matches(matcher.operator, deserialized[key], matcher.value):
found_field_keys.append(field_key)
break
return False
if short_circuit and found_field_keys:
return MatchResult(True, found_field_keys)
return MatchResult(bool(found_field_keys), found_field_keys)
@property
def summary(self):
@@ -642,3 +772,30 @@ class LLUDPMessageLogEntry(AbstractMessageLogEntry):
if self._message:
self._seq = self._message.packet_id
return self._seq
def to_dict(self):
val = super().to_dict()
val['message'] = llsd.format_notation(self.message.to_dict(extended=True))
return val
@classmethod
def from_dict(cls, val: dict):
ev = cls(Message.from_dict(llsd.parse_notation(val['message'])), None, None)
ev.apply_dict(val)
return ev
def export_log_entries(entries: typing.Iterable[AbstractMessageLogEntry]) -> bytes:
return gzip.compress(repr([e.to_dict() for e in entries]).encode("utf8"))
_TYPE_CLASSES = {
"HTTP": HTTPMessageLogEntry,
"LLUDP": LLUDPMessageLogEntry,
"EQ": EQMessageLogEntry,
}
def import_log_entries(data: bytes) -> typing.List[AbstractMessageLogEntry]:
entries = ast.literal_eval(gzip.decompress(data).decode("utf8"))
return [_TYPE_CLASSES[e['type']].from_dict(e) for e in entries]
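`export_log_entries()` / `import_log_entries()` above persist entries as a gzipped `repr()` of plain dicts, parsed back with `ast.literal_eval` rather than `pickle` (which can execute arbitrary code on load). The round trip in isolation:

```python
import ast
import gzip


def export_entries(entries: list) -> bytes:
    # repr() + literal_eval() safely round-trips plain Python
    # containers and scalars, unlike pickle
    return gzip.compress(repr(entries).encode("utf8"))


def import_entries(data: bytes) -> list:
    return ast.literal_eval(gzip.decompress(data).decode("utf8"))


entries = [{"type": "LLUDP", "summary": "AgentUpdate", "meta": {"Acks": 2}}]
assert import_entries(export_entries(entries)) == entries
```

This is also why `to_dict()` has to dehydrate UUIDs and namedtuples first: `literal_eval` only accepts literal syntax, so every value must repr as a plain literal.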

View File

@@ -32,6 +32,9 @@ class ProxyNameCache(NameCache):
with open(namecache_file, "rb") as f:
namecache_bytes = f.read()
agents = llsd.parse_xml(namecache_bytes)["agents"]
# Can be `None` if the file was just created
if not agents:
continue
for agent_id, agent_data in agents.items():
# Don't set display name if they just have the default
display_name = None

View File

@@ -57,7 +57,11 @@ class ProxyObjectManager(ClientObjectManager):
LOG.warning(f"Tried to load cache for {self._region} without a handle")
return
self.cache_loaded = True
self.object_cache = RegionViewerObjectCacheChain.for_region(handle, self._region.cache_id)
self.object_cache = RegionViewerObjectCacheChain.for_region(
handle=handle,
cache_id=self._region.cache_id,
cache_dir=self._region.session().cache_dir,
)
def request_missed_cached_objects_soon(self):
if self._cache_miss_timer:
@@ -106,6 +110,8 @@ class ProxyWorldObjectManager(ClientWorldObjectManager):
)
def _handle_object_update_cached_misses(self, region_handle: int, missing_locals: Set[int]):
if not self._settings.ALLOW_AUTO_REQUEST_OBJECTS:
return
if self._settings.AUTOMATICALLY_REQUEST_MISSING_OBJECTS:
# Schedule these local IDs to be requested soon if the viewer doesn't request
# them itself. Ideally we could just mutate the CRC of the ObjectUpdateCached
@@ -120,14 +126,15 @@ class ProxyWorldObjectManager(ClientWorldObjectManager):
def _run_object_update_hooks(self, obj: Object, updated_props: Set[str], update_type: UpdateType):
super()._run_object_update_hooks(obj, updated_props, update_type)
region = self._session.region_by_handle(obj.RegionHandle)
if obj.PCode == PCode.AVATAR and "ParentID" in updated_props:
if obj.ParentID and not region.objects.lookup_localid(obj.ParentID):
# If an avatar just sat on an object we don't know about, add it to the queued
# cache misses and request if if the viewer doesn't. This should happen
# regardless of the auto-request object setting because otherwise we have no way
# to get a sitting agent's true region location, even if it's ourself.
region.objects.queued_cache_misses.add(obj.ParentID)
region.objects.request_missed_cached_objects_soon()
if self._settings.ALLOW_AUTO_REQUEST_OBJECTS:
if obj.PCode == PCode.AVATAR and "ParentID" in updated_props:
if obj.ParentID and not region.objects.lookup_localid(obj.ParentID):
# If an avatar just sat on an object we don't know about, add it to the queued
# cache misses and request it if the viewer doesn't. This should happen
# regardless of the auto-request missing objects setting because otherwise we
# have no way to get a sitting agent's true region location, even if it's ourselves.
region.objects.queued_cache_misses.add(obj.ParentID)
region.objects.request_missed_cached_objects_soon()
AddonManager.handle_object_updated(self._session, region, obj, updated_props)
def _run_kill_object_hooks(self, obj: Object):

View File

@@ -1,6 +1,5 @@
from __future__ import annotations
import enum
import logging
import hashlib
import uuid
@@ -18,6 +17,7 @@ from hippolyzer.lib.base.objects import handle_to_global_pos
from hippolyzer.lib.client.state import BaseClientRegion
from hippolyzer.lib.proxy.caps_client import ProxyCapsClient
from hippolyzer.lib.proxy.circuit import ProxiedCircuit
from hippolyzer.lib.proxy.caps import CapType
from hippolyzer.lib.proxy.object_manager import ProxyObjectManager
from hippolyzer.lib.base.transfer_manager import TransferManager
from hippolyzer.lib.base.xfer_manager import XferManager
@@ -27,13 +27,6 @@ if TYPE_CHECKING:
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
class CapType(enum.Enum):
NORMAL = enum.auto()
TEMPORARY = enum.auto()
WRAPPER = enum.auto()
PROXY_ONLY = enum.auto()
class CapsMultiDict(multidict.MultiDict[Tuple[CapType, str]]):
# TODO: Make a view object for this that's just name -> URL
# deriving from MultiMapping[_T] so we don't have to do
@@ -58,10 +51,11 @@ class ProxiedRegion(BaseClientRegion):
self.cache_id: Optional[UUID] = None
self.circuit: Optional[ProxiedCircuit] = None
self.circuit_addr = circuit_addr
self._caps = CapsMultiDict()
self.caps = CapsMultiDict()
# Reverse lookup for URL -> cap data
self._caps_url_lookup: Dict[str, Tuple[CapType, str]] = {}
if seed_cap:
self._caps["Seed"] = (CapType.NORMAL, seed_cap)
self.caps["Seed"] = (CapType.NORMAL, seed_cap)
self.session: Callable[[], Session] = weakref.ref(session)
self.message_handler: MessageHandler[Message, str] = MessageHandler()
self.http_message_handler: MessageHandler[HippoHTTPFlow, str] = MessageHandler()
@@ -84,8 +78,8 @@ class ProxiedRegion(BaseClientRegion):
self._name = val
@property
def caps(self):
return multidict.MultiDict((x, y[1]) for x, y in self._caps.items())
def cap_urls(self) -> multidict.MultiDict[str, str]:
return multidict.MultiDict((x, y[1]) for x, y in self.caps.items())
@property
def global_pos(self) -> Vector3:
@@ -102,12 +96,12 @@ class ProxiedRegion(BaseClientRegion):
def update_caps(self, caps: Mapping[str, str]):
for cap_name, cap_url in caps.items():
if isinstance(cap_url, str) and cap_url.startswith('http'):
self._caps.add(cap_name, (CapType.NORMAL, cap_url))
self.caps.add(cap_name, (CapType.NORMAL, cap_url))
self._recalc_caps()
def _recalc_caps(self):
self._caps_url_lookup.clear()
for name, cap_info in self._caps.items():
for name, cap_info in self.caps.items():
cap_type, cap_url = cap_info
self._caps_url_lookup[cap_url] = (cap_type, name)
@@ -116,32 +110,31 @@ class ProxiedRegion(BaseClientRegion):
Wrap an existing, non-unique cap with a unique URL
caps like ViewerAsset may be the same globally and wouldn't let us infer
which session / region the request was related to without a wrapper
which session / region the request was related to without a wrapper URL
that we inject into the seed response sent to the viewer.
"""
parsed = list(urllib.parse.urlsplit(self._caps[name][1]))
seed_id = self._caps["Seed"][1].split("/")[-1].encode("utf8")
parsed = list(urllib.parse.urlsplit(self.caps[name][1]))
seed_id = self.caps["Seed"][1].split("/")[-1].encode("utf8")
# Give it a unique domain tied to the current Seed URI
parsed[1] = f"{name.lower()}-{hashlib.sha256(seed_id).hexdigest()[:16]}.hippo-proxy.localhost"
# Force the URL to HTTP, we're going to handle the request ourselves so it doesn't need
# to be secure. This should save on expensive TLS context setup for each req.
parsed[0] = "http"
wrapper_url = urllib.parse.urlunsplit(parsed)
self._caps.add(name + "ProxyWrapper", (CapType.WRAPPER, wrapper_url))
self._recalc_caps()
# Register it with "ProxyWrapper" appended so we don't shadow the real cap URL
# in our own view of the caps
self.register_cap(name + "ProxyWrapper", wrapper_url, CapType.WRAPPER)
return wrapper_url
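The wrapper-cap logic above derives a deterministic, per-seed hostname so the proxy can map an otherwise global cap URL back to one region and session. A standalone sketch of just the URL rewrite (example URLs are illustrative, not real grid endpoints):

```python
import hashlib
import urllib.parse


def make_wrapper_url(name: str, cap_url: str, seed_url: str) -> str:
    parsed = list(urllib.parse.urlsplit(cap_url))
    seed_id = seed_url.split("/")[-1].encode("utf8")
    # Unique host per (cap name, seed), stable across requests
    digest = hashlib.sha256(seed_id).hexdigest()[:16]
    parsed[1] = f"{name.lower()}-{digest}.hippo-proxy.localhost"
    # Plain HTTP: the proxy answers this itself, so skipping TLS
    # avoids a TLS context setup per request
    parsed[0] = "http"
    return urllib.parse.urlunsplit(parsed)


url = make_wrapper_url(
    "ViewerAsset",
    "https://asset-cdn.example.com/viewerasset",
    "https://sim.example.com/cap/abcd1234",
)
assert url.startswith("http://viewerasset-")
assert url.endswith(".hippo-proxy.localhost/viewerasset")
```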
def register_proxy_cap(self, name: str):
"""
Register a cap to be completely handled by the proxy
"""
cap_url = f"https://caps.hippo-proxy.localhost/cap/{uuid.uuid4()!s}"
self._caps.add(name, (CapType.PROXY_ONLY, cap_url))
self._recalc_caps()
"""Register a cap to be completely handled by the proxy"""
cap_url = f"http://{uuid.uuid4()!s}.caps.hippo-proxy.localhost"
self.register_cap(name, cap_url, CapType.PROXY_ONLY)
return cap_url
def register_temporary_cap(self, name: str, cap_url: str):
def register_cap(self, name: str, cap_url: str, cap_type: CapType = CapType.NORMAL):
"""Register a Cap that only has meaning the first time it's used"""
self._caps.add(name, (CapType.TEMPORARY, cap_url))
self.caps.add(name, (cap_type, cap_url))
self._recalc_caps()
def resolve_cap(self, url: str, consume=True) -> Optional[Tuple[str, str, CapType]]:
@@ -150,9 +143,9 @@ class ProxiedRegion(BaseClientRegion):
cap_type, name = self._caps_url_lookup[cap_url]
if cap_type == CapType.TEMPORARY and consume:
# Resolving a temporary cap pops it out of the dict
temporary_caps = self._caps.popall(name)
temporary_caps = self.caps.popall(name)
temporary_caps.remove((cap_type, cap_url))
self._caps.extend((name, x) for x in temporary_caps)
self.caps.extend((name, x) for x in temporary_caps)
self._recalc_caps()
return name, cap_url, cap_type
return None
@@ -162,6 +155,7 @@ class ProxiedRegion(BaseClientRegion):
if self.circuit:
self.circuit.is_alive = False
self.objects.clear()
self.eq_manager.clear()
def __repr__(self):
return "<%s %s>" % (self.__class__.__name__, self.name)
@@ -172,11 +166,27 @@ class EventQueueManager:
# TODO: Per-EQ InjectionTracker so we can inject fake responses on 499
self._queued_events = []
self._region = weakref.proxy(region)
self._last_ack: Optional[int] = None
self._last_payload: Optional[Any] = None
def queue_event(self, event: dict):
def inject_event(self, event: dict):
self._queued_events.append(event)
def take_events(self):
def take_injected_events(self):
events = self._queued_events
self._queued_events = []
return events
def cache_last_poll_response(self, req_ack: int, payload: Any):
self._last_ack = req_ack
self._last_payload = payload
def get_cached_poll_response(self, req_ack: Optional[int]) -> Optional[Any]:
if self._last_ack == req_ack:
return self._last_payload
return None
def clear(self):
self._queued_events.clear()
self._last_ack = None
self._last_payload = None
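The poll-response caching added above (hand back the same payload when the client retries a poll with the same ack) can be sketched in isolation; `PollResponseCache` is a hypothetical name for this illustration:

```python
from typing import Any, Optional


class PollResponseCache:
    """Minimal sketch of the EQ poll response caching shown above."""
    def __init__(self) -> None:
        self._last_ack: Optional[int] = None
        self._last_payload: Optional[Any] = None

    def cache(self, req_ack: int, payload: Any) -> None:
        self._last_ack = req_ack
        self._last_payload = payload

    def get(self, req_ack: Optional[int]) -> Optional[Any]:
        # A client retrying with the same ack gets the same payload back,
        # anything else gets nothing and must be handled fresh
        if self._last_ack == req_ack:
            return self._last_payload
        return None

    def clear(self) -> None:
        self._last_ack = None
        self._last_payload = None
```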


@@ -16,10 +16,11 @@ from hippolyzer.lib.client.state import BaseClientSession
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.circuit import ProxiedCircuit
from hippolyzer.lib.proxy.http_asset_repo import HTTPAssetRepo
from hippolyzer.lib.proxy.http_proxy import HTTPFlowContext, is_asset_server_cap_name, SerializedCapData
from hippolyzer.lib.proxy.http_proxy import HTTPFlowContext
from hippolyzer.lib.proxy.caps import is_asset_server_cap_name, CapData, CapType
from hippolyzer.lib.proxy.namecache import ProxyNameCache
from hippolyzer.lib.proxy.object_manager import ProxyWorldObjectManager
from hippolyzer.lib.proxy.region import ProxiedRegion, CapType
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.settings import ProxySettings
if TYPE_CHECKING:
@@ -46,6 +47,8 @@ class Session(BaseClientSession):
self.message_handler: MessageHandler[Message, str] = MessageHandler()
self.http_message_handler: MessageHandler[HippoHTTPFlow, str] = MessageHandler()
self.objects = ProxyWorldObjectManager(self, session_manager.settings, session_manager.name_cache)
# Base path of a newview-type cache directory for this session
self.cache_dir: Optional[str] = None
self._main_region = None
@property
@@ -96,12 +99,12 @@ class Session(BaseClientSession):
for region in self.regions:
if region.circuit_addr == circuit_addr:
if seed_url and region.caps.get("Seed") != seed_url:
if seed_url and region.cap_urls.get("Seed") != seed_url:
region.update_caps({"Seed": seed_url})
if handle:
region.handle = handle
return region
if seed_url and region.caps.get("Seed") == seed_url:
if seed_url and region.cap_urls.get("Seed") == seed_url:
return region
if not circuit_addr:
@@ -110,6 +113,7 @@ class Session(BaseClientSession):
logging.info("Registering region for %r" % (circuit_addr,))
region = ProxiedRegion(circuit_addr, seed_url, self, handle=handle)
self.regions.append(region)
AddonManager.handle_region_registered(self, region)
return region
def region_by_circuit_addr(self, circuit_addr) -> Optional[ProxiedRegion]:
@@ -211,50 +215,6 @@ class SessionManager:
return cap_data
return CapData()
def deserialize_cap_data(self, ser_cap_data: "SerializedCapData") -> "CapData":
cap_session = None
cap_region = None
if ser_cap_data.session_id:
for session in self.sessions:
if ser_cap_data.session_id == str(session.id):
cap_session = session
if cap_session and ser_cap_data.region_addr:
for region in cap_session.regions:
if ser_cap_data.region_addr == str(region.circuit_addr):
cap_region = region
return CapData(
cap_name=ser_cap_data.cap_name,
region=ref(cap_region) if cap_region else None,
session=ref(cap_session) if cap_session else None,
base_url=ser_cap_data.base_url,
type=CapType[ser_cap_data.type],
)
class CapData(NamedTuple):
cap_name: Optional[str] = None
# Actually they're weakrefs but the type sigs suck.
region: Optional[Callable[[], Optional[ProxiedRegion]]] = None
session: Optional[Callable[[], Optional[Session]]] = None
base_url: Optional[str] = None
type: CapType = CapType.NORMAL
def __bool__(self):
return bool(self.cap_name or self.session)
def serialize(self) -> "SerializedCapData":
return SerializedCapData(
cap_name=self.cap_name,
region_addr=str(self.region().circuit_addr) if self.region and self.region() else None,
session_id=str(self.session().id) if self.session and self.session() else None,
base_url=self.base_url,
type=self.type.name,
)
@property
def asset_server_cap(self) -> bool:
return is_asset_server_cap_name(self.cap_name)
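The weakref-based serialization above can be illustrated in miniature: only dereference the stored weakref when serializing, and fall back to `None` if the referent has been garbage-collected. `CapRef` and `FakeRegion` are stand-ins invented for this sketch:

```python
import weakref
from typing import Callable, NamedTuple, Optional


class FakeRegion:
    """Stand-in for ProxiedRegion, just enough state for the sketch."""
    circuit_addr = ("127.0.0.1", 13000)


class CapRef(NamedTuple):
    cap_name: Optional[str] = None
    # Actually a weakref; the Callable type is the same trick CapData uses
    region: Optional[Callable[[], Optional[FakeRegion]]] = None

    def serialize(self) -> dict:
        # Only dereference the weakref if the region is still alive
        alive = self.region() if self.region else None
        return {
            "cap_name": self.cap_name,
            "region_addr": str(alive.circuit_addr) if alive else None,
        }
```

Holding weakrefs here means a serialized log entry never keeps a dead region or session alive just because a cap once pointed at it.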
@dataclasses.dataclass
class SelectionModel:


@@ -28,6 +28,9 @@ class ProxySettings(Settings):
PROXY_BIND_ADDR: str = EnvSettingDescriptor("127.0.0.1", "HIPPO_BIND_HOST", str)
REMOTELY_ACCESSIBLE: bool = SettingDescriptor(False)
USE_VIEWER_OBJECT_CACHE: bool = SettingDescriptor(False)
# Whether the proxy is allowed to make automatic internal object requests at all
ALLOW_AUTO_REQUEST_OBJECTS: bool = SettingDescriptor(True)
# Whether the viewer should request any directly referenced objects it didn't know about.
AUTOMATICALLY_REQUEST_MISSING_OBJECTS: bool = SettingDescriptor(False)
ADDON_SCRIPTS: List[str] = SettingDescriptor(list)
FILTERS: Dict[str, str] = SettingDescriptor(dict)


@@ -37,6 +37,9 @@ class BaseProxyTest(unittest.IsolatedAsyncioTestCase):
self.serializer = UDPMessageSerializer()
self.session.objects.track_region_objects(123)
def tearDown(self) -> None:
self.protocol.close()
async def _wait_drained(self):
await asyncio.sleep(0.001)


@@ -58,6 +58,7 @@ from __future__ import annotations
import io
import logging
import pathlib
from pathlib import Path
from typing import *
@@ -82,6 +83,7 @@ class ViewerObjectCache:
@classmethod
def from_path(cls, base_path: Union[str, Path]):
base_path = pathlib.Path(base_path)
cache = cls(base_path)
with open(cache.base_path / "object.cache", "rb") as fh:
reader = se.BufferReader("<", fh.read())
@@ -143,6 +145,10 @@ class ViewerObjectCacheEntry(recordclass.datatuple): # type: ignore
data: bytes
def is_valid_vocache_dir(cache_dir):
return (pathlib.Path(cache_dir) / "objectcache" / "object.cache").exists()
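The new `is_valid_vocache_dir()` helper can be exercised against a scratch directory; the function body below is the same check as above:

```python
import pathlib
import tempfile


def is_valid_vocache_dir(cache_dir) -> bool:
    # A usable viewer object cache dir must contain objectcache/object.cache
    return (pathlib.Path(cache_dir) / "objectcache" / "object.cache").exists()


with tempfile.TemporaryDirectory() as tmp:
    assert not is_valid_vocache_dir(tmp)
    oc_dir = pathlib.Path(tmp) / "objectcache"
    oc_dir.mkdir()
    (oc_dir / "object.cache").write_bytes(b"")
    assert is_valid_vocache_dir(tmp)
```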
class RegionViewerObjectCache:
"""Parser and container for .slc files"""
def __init__(self, cache_id: UUID, entries: List[ViewerObjectCacheEntry]):
@@ -201,7 +207,7 @@ class RegionViewerObjectCacheChain:
return None
@classmethod
def for_region(cls, handle: int, cache_id: UUID):
def for_region(cls, handle: int, cache_id: UUID, cache_dir: Optional[str] = None):
"""
Get a cache chain for a specific region, called on region connection
@@ -209,8 +215,13 @@ class RegionViewerObjectCacheChain:
so we have to try every region object cache file for every viewer installed.
"""
caches = []
for cache_dir in iter_viewer_cache_dirs():
if not (cache_dir / "objectcache" / "object.cache").exists():
if cache_dir is None:
cache_dirs = iter_viewer_cache_dirs()
else:
cache_dirs = [pathlib.Path(cache_dir)]
for cache_dir in cache_dirs:
if not is_valid_vocache_dir(cache_dir):
continue
cache = ViewerObjectCache.from_path(cache_dir / "objectcache")
if cache:


@@ -0,0 +1,46 @@
import abc
from typing import Any
from mitmproxy.addons import asgiapp
from mitmproxy.controller import DummyReply
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session, SessionManager
async def serve(app, flow: HippoHTTPFlow):
"""Serve a request based on a Hippolyzer HTTP flow using a provided app"""
# Shove this on mitmproxy's flow object so asgiapp doesn't explode when it tries
# to commit the flow reply. Our take / commit semantics are different from mitmproxy's,
# so we ignore what mitmproxy sets here anyhow.
flow.flow.reply = DummyReply()
flow.flow.reply.take()
await asgiapp.serve(app, flow.flow)
flow.flow.reply = None
# Send the modified flow object back to mitmproxy
flow.resume()
class WebAppCapAddon(BaseAddon, abc.ABC):
"""
Addon that provides a cap via an ASGI webapp
Handles all registration of the cap URL and routing of the request.
"""
CAP_NAME: str
APP: Any
def handle_region_registered(self, session: Session, region: ProxiedRegion):
# Register a fake URL for our cap. This will add the cap URL to the Seed
# response that gets sent back to the client if that cap name was requested.
if self.CAP_NAME not in region.cap_urls:
region.register_proxy_cap(self.CAP_NAME)
def handle_http_request(self, session_manager: SessionManager, flow: HippoHTTPFlow):
if flow.cap_data.cap_name != self.CAP_NAME:
return
# This request may take a while to generate a response for, so take it out of the
# normal HTTP handling flow and handle it in an async task.
# TODO: Make all HTTP handling hooks async so this isn't necessary
self._schedule_task(serve(self.APP, flow.take()))


@@ -1,69 +1,66 @@
aiohttp==3.7.4.post0
aiohttp==3.8.1
aiosignal==1.2.0
appdirs==1.4.4
Arpeggio==1.10.2
asgiref==3.3.4
async-timeout==3.0.1
attrs==20.3.0
black==21.4b2
asgiref==3.4.1
async-timeout==4.0.1
attrs==21.2.0
blinker==1.4
Brotli==1.0.9
certifi==2020.12.5
cffi==1.14.5
chardet==4.0.0
click==7.1.2
cryptography==3.3.2
certifi==2021.10.8
cffi==1.15.0
charset-normalizer==2.0.9
click==8.0.3
cryptography==3.4.8
defusedxml==0.7.1
Flask==1.1.2
Glymur==0.9.3
Flask==2.0.2
frozenlist==1.2.0
Glymur==0.9.6
h11==0.12.0
h2==4.0.0
h2==4.1.0
hpack==4.0.0
hyperframe==6.0.1
idna==2.10
itsdangerous==1.1.0
jedi==0.18.0
Jinja2==2.11.3
itsdangerous==2.0.1
jedi==0.18.1
Jinja2==3.0.3
kaitaistruct==0.9
lazy-object-proxy==1.6.0
ldap3==2.8.1
llbase==1.2.10
lxml==4.6.3
MarkupSafe==1.1.1
mitmproxy==6.0.2
msgpack==1.0.2
multidict==5.1.0
mypy-extensions==0.4.3
numpy==1.20.2
parso==0.8.2
ldap3==2.9.1
llbase==1.2.11
lxml==4.6.4
MarkupSafe==2.0.1
mitmproxy==7.0.4
msgpack==1.0.3
multidict==5.2.0
numpy==1.21.4
parso==0.8.3
passlib==1.7.4
pathspec==0.8.1
prompt-toolkit==3.0.18
protobuf==3.14.0
ptpython==3.0.17
prompt-toolkit==3.0.23
protobuf==3.18.1
ptpython==3.0.20
publicsuffix2==2.20191221
pyasn1==0.4.8
pycparser==2.20
Pygments==2.8.1
pycparser==2.21
Pygments==2.10.0
pyOpenSSL==20.0.1
pyparsing==2.4.7
pyperclip==1.8.2
PySide2==5.15.2
qasync==0.15.0
PySide6==6.2.2
qasync==0.22.0
recordclass==0.14.3
regex==2021.4.4
requests==2.25.1
ruamel.yaml==0.16.13
ruamel.yaml.clib==0.2.2
shiboken2==5.15.2
six==1.15.0
sortedcontainers==2.3.0
toml==0.10.2
requests==2.26.0
ruamel.yaml==0.17.16
ruamel.yaml.clib==0.2.6
shiboken6==6.2.2
six==1.16.0
sortedcontainers==2.4.0
tornado==6.1
typing-extensions==3.7.4.3
urllib3==1.26.5
typing-extensions==4.0.1
urllib3==1.26.7
urwid==2.1.2
wcwidth==0.2.5
Werkzeug==1.0.1
Werkzeug==2.0.2
wsproto==1.0.0
yarl==1.6.3
zstandard==0.14.1
yarl==1.7.2
zstandard==0.15.2


@@ -25,7 +25,7 @@ from setuptools import setup, find_packages
here = path.abspath(path.dirname(__file__))
version = '0.6.2'
version = '0.9.0'
with open(path.join(here, 'README.md')) as readme_fh:
readme = readme_fh.read()
@@ -44,6 +44,7 @@ setup(
"Programming Language :: Python :: 3 :: Only",
"Programming Language :: Python :: 3.8",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: Implementation :: CPython",
"Topic :: System :: Networking :: Monitoring",
"Topic :: Software Development :: Libraries :: Python Modules",
@@ -82,20 +83,20 @@ setup(
'llbase>=1.2.5',
'defusedxml',
'aiohttp<4.0.0',
'recordclass',
'recordclass<0.15',
'lazy-object-proxy',
'arpeggio',
# requests breaks with newer idna
'idna<3,>=2.5',
# 7.x will be a major change.
'mitmproxy<7.0.0',
'mitmproxy>=7.0.2,<8.0',
# For REPLs
'ptpython<4.0',
# JP2 codec
'Glymur<1.0',
'Glymur<0.9.7',
'numpy<2.0',
# These could be in extras_require if you don't want a GUI.
'pyside2<6.0',
'pyside6',
'qasync',
],
tests_require=[


@@ -9,20 +9,21 @@ from cx_Freeze import setup, Executable
# We don't need any of these and they make the archive huge.
TO_DELETE = [
"lib/PySide2/Qt3DRender.pyd",
"lib/PySide2/Qt53DRender.dll",
"lib/PySide2/Qt5Charts.dll",
"lib/PySide2/Qt5Location.dll",
"lib/PySide2/Qt5Pdf.dll",
"lib/PySide2/Qt5Quick.dll",
"lib/PySide2/Qt5WebEngineCore.dll",
"lib/PySide2/QtCharts.pyd",
"lib/PySide2/QtMultimedia.pyd",
"lib/PySide2/QtOpenGLFunctions.pyd",
"lib/PySide2/QtOpenGLFunctions.pyi",
"lib/PySide2/d3dcompiler_47.dll",
"lib/PySide2/opengl32sw.dll",
"lib/PySide2/translations",
"lib/PySide6/Qt6DRender.pyd",
"lib/PySide6/Qt63DRender.dll",
"lib/PySide6/Qt6Charts.dll",
"lib/PySide6/Qt6Location.dll",
"lib/PySide6/Qt6Pdf.dll",
"lib/PySide6/Qt6Quick.dll",
"lib/PySide6/Qt6WebEngineCore.dll",
"lib/PySide6/QtCharts.pyd",
"lib/PySide6/QtMultimedia.pyd",
"lib/PySide6/QtOpenGLFunctions.pyd",
"lib/PySide6/QtOpenGLFunctions.pyi",
"lib/PySide6/d3dcompiler_47.dll",
"lib/PySide6/opengl32sw.dll",
"lib/PySide6/lupdate.exe",
"lib/PySide6/translations",
"lib/aiohttp/_find_header.c",
"lib/aiohttp/_frozenlist.c",
"lib/aiohttp/_helpers.c",
@@ -112,7 +113,7 @@ executables = [
setup(
name="hippolyzer_gui",
version="0.6.2",
version="0.9.0",
description="Hippolyzer GUI",
options=options,
executables=executables,


@@ -50,4 +50,4 @@ class TestCapsClient(unittest.IsolatedAsyncioTestCase):
with self.assertRaises(KeyError):
with self.caps_client.get("BadCap"):
pass
assert False


@@ -134,3 +134,15 @@ class TestDatatypes(unittest.TestCase):
val = llsd.parse_binary(llsd.format_binary(orig))
self.assertIsInstance(val, UUID)
self.assertEqual(orig, val)
def test_jank_stringy_bytes(self):
val = JankStringyBytes(b"foo\x00")
self.assertTrue("o" in val)
self.assertTrue(b"o" in val)
self.assertFalse(b"z" in val)
self.assertFalse("z" in val)
self.assertEqual("foo", val)
self.assertEqual(b"foo\x00", val)
self.assertNotEqual(b"foo", val)
self.assertEqual(b"foo", JankStringyBytes(b"foo"))
self.assertEqual("foo", JankStringyBytes(b"foo"))
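One way a `bytes` subclass could satisfy the equality and containment behaviour tested above is to decode as UTF-8 and ignore a trailing NUL when compared against `str`. This is a sketch, not necessarily how `JankStringyBytes` is actually implemented:

```python
class StringyBytes(bytes):
    """bytes that also compare and contain like their decoded str form."""
    def _as_str(self) -> str:
        # Trailing NUL terminators (common in LLUDP string fields) are ignored
        return self.rstrip(b"\x00").decode("utf8")

    def __eq__(self, other):
        if isinstance(other, str):
            return self._as_str() == other
        return bytes(self) == other

    def __ne__(self, other):
        return not self.__eq__(other)

    def __contains__(self, item):
        if isinstance(item, str):
            item = item.encode("utf8")
        return bytes.__contains__(self, item)

    # Overriding __eq__ kills the inherited hash, so restore it explicitly
    __hash__ = bytes.__hash__
```

Note the asymmetry the tests rely on: `StringyBytes(b"foo\x00")` equals both `"foo"` and `b"foo\x00"`, but not `b"foo"`, because the NUL is only stripped for `str` comparisons.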


@@ -1,7 +1,7 @@
import unittest
from hippolyzer.lib.base.datatypes import *
from hippolyzer.lib.base.legacy_inv import InventoryModel
from hippolyzer.lib.base.inventory import InventoryModel
from hippolyzer.lib.base.wearables import Wearable, VISUAL_PARAMS
SIMPLE_INV = """\tinv_object\t0
@@ -61,6 +61,51 @@ class TestLegacyInv(unittest.TestCase):
self.assertEqual(item.sale_info.sale_type, "not")
self.assertEqual(item.model, model)
def test_llsd_serialization(self):
model = InventoryModel.from_str(SIMPLE_INV)
self.assertEqual(
model.to_llsd(),
[
{
'name': 'Contents',
'obj_id': UUID('f4d91477-def1-487a-b4f3-6fa201c17376'),
'parent_id': UUID('00000000-0000-0000-0000-000000000000'),
'type': 'category'
},
{
'asset_id': UUID('00000000-0000-0000-0000-000000000000'),
'created_at': 1587367239,
'desc': '2020-04-20 04:20:39 lsl2 script',
'flags': b'\x00\x00\x00\x00',
'inv_type': 'script',
'item_id': UUID('dd163122-946b-44df-99f6-a6030e2b9597'),
'name': 'New Script',
'parent_id': UUID('f4d91477-def1-487a-b4f3-6fa201c17376'),
'permissions': {
'base_mask': 2147483647,
'creator_id': UUID('a2e76fcd-9360-4f6d-a924-000000000003'),
'everyone_mask': 0,
'group_id': UUID('00000000-0000-0000-0000-000000000000'),
'group_mask': 0,
'last_owner_id': UUID('a2e76fcd-9360-4f6d-a924-000000000003'),
'next_owner_mask': 581632,
'owner_id': UUID('a2e76fcd-9360-4f6d-a924-000000000003'),
'owner_mask': 2147483647
},
'sale_info': {
'sale_price': 10,
'sale_type': 'not'
},
'type': 'lsltext'
}
]
)
def test_llsd_legacy_equality(self):
model = InventoryModel.from_str(SIMPLE_INV)
new_model = InventoryModel.from_llsd(model.to_llsd())
self.assertEqual(model, new_model)
GIRL_NEXT_DOOR_SHAPE = """LLWearable version 22
Girl Next Door - C2 - med - Adam n Eve


@@ -146,6 +146,12 @@ class TestMessage(unittest.TestCase):
new_msg = Message.from_dict(self.chat_msg.to_dict())
self.assertEqual(pickle.dumps(self.chat_msg), pickle.dumps(new_msg))
def test_todict_extended(self):
self.chat_msg.packet_id = 5
new_msg = Message.from_dict(self.chat_msg.to_dict(extended=True))
self.assertEqual(5, new_msg.packet_id)
self.assertEqual(pickle.dumps(self.chat_msg), pickle.dumps(new_msg))
def test_todict_multiple_blocks(self):
chat_msg = self.chat_msg
# If we dupe the ChatData block it should survive to_dict()
@@ -294,3 +300,14 @@ class HumanReadableMessageTests(unittest.TestCase):
with self.assertRaises(ValueError):
HumanMessageSerializer.from_human_string(val)
def test_flags(self):
val = """
OUT FooMessage [ZEROCODED] [RELIABLE] [1]
[SomeBlock]
foo = 1
"""
msg = HumanMessageSerializer.from_human_string(val)
self.assertEqual(HumanMessageSerializer.to_human_string(msg).strip(), val.strip())


@@ -791,7 +791,3 @@ class SubfieldSerializationTests(BaseSerializationTest):
self.assertEqual(ser.serialize(None, FooFlags.FOO), 1)
self.assertEqual(ser.serialize(None, 3), 3)
self.assertEqual(ser.serialize(None, 7), 7)
if __name__ == "__main__":
unittest.main()


@@ -70,7 +70,7 @@ class XferManagerTests(BaseTransferTests):
manager = XferManager(self.server_connection)
xfer = await manager.request(vfile_id=asset_id, vfile_type=AssetType.BODYPART)
self.received_bytes = xfer.reassemble_chunks()
self.server_circuit.send_message(Message(
self.server_circuit.send(Message(
"AssetUploadComplete",
Block("AssetBlock", UUID=asset_id, Type=asset_block["Type"], Success=True),
direction=Direction.IN,
@@ -109,7 +109,7 @@ class TestTransferManager(BaseTransferTests):
self.assertEqual(EstateAssetType.COVENANT, params.EstateAssetType)
data = self.LARGE_PAYLOAD
self.server_circuit.send_message(Message(
self.server_circuit.send(Message(
'TransferInfo',
Block(
'TransferInfo',
@@ -125,7 +125,7 @@ class TestTransferManager(BaseTransferTests):
while True:
chunk = data[:1000]
data = data[1000:]
self.server_circuit.send_message(Message(
self.server_circuit.send(Message(
'TransferPacket',
Block(
'TransferData',


@@ -6,9 +6,8 @@ import multiprocessing
from urllib.parse import urlparse
import aioresponses
from mitmproxy.net import http
from mitmproxy.test import tflow, tutils
from mitmproxy.http import HTTPFlow
from mitmproxy.http import HTTPFlow, Headers
from yarl import URL
from hippolyzer.apps.proxy import run_http_proxy_process
@@ -17,8 +16,7 @@ from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.http_event_manager import MITMProxyEventManager
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.http_proxy import SerializedCapData
from hippolyzer.lib.proxy.message_logger import FilteringMessageLogger
from hippolyzer.lib.proxy.caps import SerializedCapData
from hippolyzer.lib.proxy.sessions import SessionManager
from hippolyzer.lib.proxy.test_utils import BaseProxyTest
@@ -31,12 +29,6 @@ class MockAddon(BaseAddon):
flow.metadata["touched_addon"] = True
class SimpleMessageLogger(FilteringMessageLogger):
@property
def entries(self):
return self._filtered_entries
class HTTPIntegrationTests(BaseProxyTest):
def setUp(self) -> None:
super().setUp()
@@ -88,7 +80,7 @@ class HTTPIntegrationTests(BaseProxyTest):
fake_flow = tflow.tflow(
req=tutils.treq(host="example.com", content=b'<llsd><string>getZOffsets|'),
resp=tutils.tresp(
headers=http.Headers((
headers=Headers((
(b"X-SecondLife-Object-Name", b"#Firestorm LSL Bridge v99999"),
(b"X-SecondLife-Owner-Key", str(self.session.agent_id).encode("utf8")),
)),
@@ -153,25 +145,26 @@ class TestMITMProxy(BaseProxyTest):
super().setUp()
self._setup_default_circuit()
self.caps_client = self.session.main_region.caps_client
def test_mitmproxy_works(self):
proxy_port = 9905
self.session_manager.settings.HTTP_PROXY_PORT = proxy_port
http_proc = multiprocessing.Process(
self.http_proc = multiprocessing.Process(
target=run_http_proxy_process,
args=("127.0.0.1", proxy_port, self.session_manager.flow_context),
daemon=True,
)
http_proc.start()
self.http_proc.start()
self.session_manager.flow_context.mitmproxy_ready.wait(1.0)
http_event_manager = MITMProxyEventManager(self.session_manager, self.session_manager.flow_context)
self.http_event_manager = MITMProxyEventManager(
self.session_manager,
self.session_manager.flow_context
)
def test_mitmproxy_works(self):
async def _request_example_com():
# Pump callbacks from mitmproxy
asyncio.create_task(http_event_manager.run())
asyncio.create_task(self.http_event_manager.run())
try:
async with self.caps_client.get("http://example.com/", timeout=0.5) as resp:
self.assertIn(b"Example Domain", await resp.read())
@@ -181,4 +174,4 @@ class TestMITMProxy(BaseProxyTest):
# Tell the event pump and mitmproxy they need to shut down
self.session_manager.flow_context.shutdown_signal.set()
asyncio.run(_request_example_com())
http_proc.join()
self.http_proc.join()


@@ -12,7 +12,6 @@ from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.message.udpdeserializer import UDPMessageDeserializer
from hippolyzer.lib.base.objects import Object
import hippolyzer.lib.base.serialization as se
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.message_logger import FilteringMessageLogger, LLUDPMessageLogEntry
@@ -205,8 +204,8 @@ class LLUDPIntegrationTests(BaseProxyTest):
self.protocol.datagram_received(obj_update, self.region_addr)
await self._wait_drained()
entries = message_logger.entries
self.assertEqual(len(entries), 1)
self.assertEqual(entries[0].name, "ObjectUpdateCompressed")
self.assertEqual(1, len(entries))
self.assertEqual("ObjectUpdateCompressed", entries[0].name)
async def test_filtering_logged_messages(self):
message_logger = SimpleMessageLogger()
@@ -223,8 +222,8 @@ class LLUDPIntegrationTests(BaseProxyTest):
await self._wait_drained()
message_logger.set_filter("ObjectUpdateCompressed")
entries = message_logger.entries
self.assertEqual(len(entries), 1)
self.assertEqual(entries[0].name, "ObjectUpdateCompressed")
self.assertEqual(1, len(entries))
self.assertEqual("ObjectUpdateCompressed", entries[0].name)
async def test_logging_taken_message(self):
message_logger = SimpleMessageLogger()
@@ -262,11 +261,6 @@ class LLUDPIntegrationTests(BaseProxyTest):
# Don't have a serializer, onto the next field
continue
deser = serializer.deserialize(block, orig_val)
# For now we consider returning UNSERIALIZABLE to be acceptable.
# We should probably consider raising instead of returning that.
if deser is se.UNSERIALIZABLE:
continue
new_val = serializer.serialize(block, deser)
if orig_val != new_val:
raise AssertionError(f"{block.name}.{var_name} didn't reserialize correctly,"


@@ -26,7 +26,13 @@ class ExampleCommandHandler:
y=str,
)
async def own_name(self, _session, _region, y):
self.bar = y
pass
@handle_command(
x=Parameter(str, optional=True),
)
async def optional(self, _session, _region, x=42):
self.bar = x
class TestCommandHandlers(unittest.IsolatedAsyncioTestCase):
@@ -47,9 +53,20 @@ class TestCommandHandlers(unittest.IsolatedAsyncioTestCase):
async def test_own_name(self):
self.assertEqual(self.handler.own_name.command.name, "own_name")
async def test_missing_param(self):
with self.assertRaises(KeyError):
await self.handler.foo(None, None, "")
async def test_optional_param(self):
await self.handler.optional(None, None, "foo") # type: ignore
self.assertEqual(self.handler.bar, "foo")
await self.handler.optional(None, None, "") # type: ignore
# Should have picked up the default value
self.assertEqual(self.handler.bar, 42)
async def test_bad_command(self):
with self.assertRaises(ValueError):
class _BadCommandHandler:
@handle_command("foobaz")
def bad_command(self, session, region):
pass
assert False


@@ -1,5 +1,6 @@
from mitmproxy.test import tflow, tutils
from hippolyzer.lib.proxy.caps import CapType
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.message_logger import HTTPMessageLogEntry
from hippolyzer.lib.proxy.test_utils import BaseProxyTest
@@ -84,8 +85,8 @@ content-length: 33\r
self.assertEqual(b"foobar", flow.response.content)
def test_temporary_cap_resolution(self):
self.region.register_temporary_cap("TempExample", "http://not.example.com")
self.region.register_temporary_cap("TempExample", "http://not2.example.com")
self.region.register_cap("TempExample", "http://not.example.com", CapType.TEMPORARY)
self.region.register_cap("TempExample", "http://not2.example.com", CapType.TEMPORARY)
# Resolving the cap should consume it
cap_data = self.session_manager.resolve_cap("http://not.example.com")
self.assertEqual(cap_data.cap_name, "TempExample")


@@ -2,13 +2,14 @@ import unittest
from mitmproxy.test import tflow, tutils
from hippolyzer.lib.base.datatypes import Vector3
from hippolyzer.lib.base.datatypes import Vector3, UUID
from hippolyzer.lib.base.message.message import Block, Message as Message
from hippolyzer.lib.base.message.udpdeserializer import UDPMessageDeserializer
from hippolyzer.lib.base.settings import Settings
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.http_proxy import SerializedCapData
from hippolyzer.lib.proxy.message_logger import LLUDPMessageLogEntry, HTTPMessageLogEntry
from hippolyzer.lib.proxy.caps import SerializedCapData
from hippolyzer.lib.proxy.message_logger import LLUDPMessageLogEntry, HTTPMessageLogEntry, export_log_entries, \
import_log_entries
from hippolyzer.lib.proxy.message_filter import compile_filter
from hippolyzer.lib.proxy.sessions import SessionManager
from hippolyzer.lib.proxy.settings import ProxySettings
@@ -24,7 +25,7 @@ OBJECT_UPDATE = b'\xc0\x00\x00\x00Q\x00\x0c\x00\x01\xea\x03\x00\x02\xe6\x03\x00\
b'\x88\x00"'
class MessageFilterTests(unittest.TestCase):
class MessageFilterTests(unittest.IsolatedAsyncioTestCase):
def _filter_matches(self, filter_str, message):
compiled = compile_filter(filter_str)
return compiled.match(message)
@@ -118,6 +119,17 @@ class MessageFilterTests(unittest.TestCase):
self.assertTrue(self._filter_matches("ObjectUpdate.ObjectData.ObjectData.Position > (88, 41, 25)", entry))
self.assertTrue(self._filter_matches("ObjectUpdate.ObjectData.ObjectData.Position < (90, 43, 27)", entry))
def test_import_export_message(self):
msg = LLUDPMessageLogEntry(Message(
"Foo",
Block("Bar", Baz=1, Quux=UUID.random(), Foo=0xFFffFFffFF)
), None, None)
msg.freeze()
msg = import_log_entries(export_log_entries([msg]))[0]
self.assertTrue(self._filter_matches("Foo.Bar.Baz == 1", msg))
# Make sure numbers outside 32bit range come through
self.assertTrue(self._filter_matches("Foo.Bar.Foo == 0xFFffFFffFF", msg))
def test_http_flow(self):
session_manager = SessionManager(ProxySettings())
fake_flow = tflow.tflow(req=tutils.treq(), resp=tutils.tresp())
@@ -129,6 +141,21 @@ class MessageFilterTests(unittest.TestCase):
self.assertTrue(self._filter_matches("FakeCap", entry))
self.assertFalse(self._filter_matches("NotFakeCap", entry))
def test_http_header_filter(self):
session_manager = SessionManager(ProxySettings())
fake_flow = tflow.tflow(req=tutils.treq(), resp=tutils.tresp())
fake_flow.request.headers["Cookie"] = 'foo="bar"'
flow = HippoHTTPFlow.from_state(fake_flow.get_state(), session_manager)
entry = HTTPMessageLogEntry(flow)
# The header map is case-insensitive!
self.assertTrue(self._filter_matches('Meta.ReqHeaders.cookie ~= "foo"', entry))
self.assertFalse(self._filter_matches('Meta.ReqHeaders.foobar ~= "foo"', entry))
if __name__ == "__main__":
unittest.main()
def test_export_import_http_flow(self):
fake_flow = tflow.tflow(req=tutils.treq(), resp=tutils.tresp())
fake_flow.metadata["cap_data_ser"] = SerializedCapData(
cap_name="FakeCap",
)
flow = HippoHTTPFlow.from_state(fake_flow.get_state(), None)
new_entry = import_log_entries(export_log_entries([HTTPMessageLogEntry(flow)]))[0]
self.assertEqual("FakeCap", new_entry.name)


@@ -17,17 +17,17 @@ class MockedProxyCircuit(ProxiedCircuit):
self.in_injections = InjectionTracker(0, maxlen=10)
def _send_prepared_message(self, msg: Message, transport=None):
self.sent_simple.append((msg.packet_id, msg.name, msg.direction, msg.injected, msg.acks))
self.sent_simple.append((msg.packet_id, msg.name, msg.direction, msg.synthetic, msg.acks))
self.sent_msgs.append(msg)
class PacketIDTests(unittest.TestCase):
class PacketIDTests(unittest.IsolatedAsyncioTestCase):
def setUp(self) -> None:
self.circuit = MockedProxyCircuit()
def _send_message(self, msg, outgoing=True):
msg.direction = Direction.OUT if outgoing else Direction.IN
return self.circuit.send_message(msg)
return self.circuit.send(msg)
def test_basic(self):
self._send_message(Message('ChatFromViewer', packet_id=1))
@@ -178,10 +178,7 @@ class PacketIDTests(unittest.TestCase):
def test_drop_proxied_message(self):
self._send_message(Message('ChatFromViewer', packet_id=1))
self.circuit.drop_message(
Message('ChatFromViewer', packet_id=2, flags=PacketFlags.RELIABLE),
Direction.OUT,
)
self.circuit.drop_message(Message('ChatFromViewer', packet_id=2, flags=PacketFlags.RELIABLE))
self._send_message(Message('ChatFromViewer', packet_id=3))
self.assertSequenceEqual(self.circuit.sent_simple, [
@@ -193,10 +190,7 @@ class PacketIDTests(unittest.TestCase):
def test_unreliable_proxied_message(self):
self._send_message(Message('ChatFromViewer', packet_id=1))
self.circuit.drop_message(
Message('ChatFromViewer', packet_id=2),
Direction.OUT,
)
self.circuit.drop_message(Message('ChatFromViewer', packet_id=2))
self._send_message(Message('ChatFromViewer', packet_id=3))
self.assertSequenceEqual(self.circuit.sent_simple, [
@@ -209,10 +203,7 @@ class PacketIDTests(unittest.TestCase):
self._send_message(Message('ChatFromViewer', packet_id=2))
self._send_message(Message('ChatFromViewer', packet_id=3))
self._send_message(Message('ChatFromSimulator'), outgoing=False)
self.circuit.drop_message(
Message('ChatFromViewer', packet_id=4, acks=(4,)),
Direction.OUT,
)
self.circuit.drop_message(Message('ChatFromViewer', packet_id=4, acks=(4,)))
self._send_message(Message('ChatFromViewer', packet_id=5))
self.assertSequenceEqual(self.circuit.sent_simple, [
@@ -230,20 +221,80 @@ class PacketIDTests(unittest.TestCase):
self.assertEqual(self.circuit.sent_msgs[4]["Packets"][0]["ID"], 3)
def test_resending_or_dropping(self):
self.circuit.send_message(Message('ChatFromViewer', packet_id=1))
self.circuit.send(Message('ChatFromViewer', packet_id=1))
to_drop = Message('ChatFromViewer', packet_id=2, flags=PacketFlags.RELIABLE)
self.circuit.drop_message(to_drop)
with self.assertRaises(RuntimeError):
# Re-dropping the same message should raise
self.circuit.drop_message(to_drop)
# Clears finalized flag
to_drop.packet_id = None
self.circuit.send_message(to_drop)
# Returns a new message without finalized flag
new_msg = to_drop.take()
self.circuit.send(new_msg)
with self.assertRaises(RuntimeError):
self.circuit.send_message(to_drop)
self.circuit.send(new_msg)
self.assertSequenceEqual(self.circuit.sent_simple, [
(1, "ChatFromViewer", Direction.OUT, False, ()),
(1, "PacketAck", Direction.IN, True, ()),
# ended up getting the same packet ID when injected
(2, "ChatFromViewer", Direction.OUT, True, ()),
])
def test_reliable_unacked_queueing(self):
self._send_message(Message('ChatFromViewer', flags=PacketFlags.RELIABLE))
self._send_message(Message('ChatFromViewer', flags=PacketFlags.RELIABLE, packet_id=2))
# Only the first, injected message should be queued for resends
self.assertEqual({(Direction.OUT, 1)}, set(self.circuit.unacked_reliable))
def test_reliable_resend_cadence(self):
self._send_message(Message('ChatFromViewer', flags=PacketFlags.RELIABLE))
resend_info = self.circuit.unacked_reliable[(Direction.OUT, 1)]
self.circuit.resend_unacked()
# Should have been too soon to retry
self.assertEqual(10, resend_info.tries_left)
# Switch to allowing resends every 0s
self.circuit.resend_every = 0.0
self.circuit.resend_unacked()
self.assertSequenceEqual(self.circuit.sent_simple, [
(1, "ChatFromViewer", Direction.OUT, True, ()),
# Should have resent
(1, "ChatFromViewer", Direction.OUT, True, ()),
])
self.assertEqual(9, resend_info.tries_left)
for _ in range(resend_info.tries_left):
self.circuit.resend_unacked()
# Should have used up all the retry attempts and been kicked out of the retry queue
self.assertEqual(set(), set(self.circuit.unacked_reliable))
def test_reliable_ack_collection(self):
msg = Message('ChatFromViewer', flags=PacketFlags.RELIABLE)
fut = self.circuit.send_reliable(msg)
self.assertEqual(1, len(self.circuit.unacked_reliable))
# Shouldn't count; this is an ACK going in the wrong direction!
ack_msg = Message("PacketAck", Block("Packets", ID=msg.packet_id))
self.circuit.collect_acks(ack_msg)
self.assertEqual(1, len(self.circuit.unacked_reliable))
self.assertFalse(fut.done())
# But it should count if the ACK message is heading in
ack_msg.direction = Direction.IN
self.circuit.collect_acks(ack_msg)
self.assertEqual(0, len(self.circuit.unacked_reliable))
self.assertTrue(fut.done())
def test_start_ping_check(self):
# Should not break if no unacked
self._send_message(Message(
"StartPingCheck",
Block("PingID", PingID=0, OldestUnacked=20),
packet_id=5,
))
injected_msg = Message('ChatFromViewer', flags=PacketFlags.RELIABLE)
self._send_message(injected_msg)
self._send_message(Message(
"StartPingCheck",
Block("PingID", PingID=0, OldestUnacked=20),
packet_id=8,
))
# Oldest unacked should have been replaced with the injected packet's ID, it's older!
self.assertEqual(self.circuit.sent_msgs[2]["PingID"]["OldestUnacked"], injected_msg.packet_id)
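The resend tests above revolve around per-packet bookkeeping: each reliable packet carries a retry budget (`tries_left`) and is only retransmitted once `resend_every` seconds have passed since its last send. A minimal standalone sketch of that pattern follows; the class and method names here are hypothetical stand-ins, not hippolyzer's actual implementation.

```python
import time
from dataclasses import dataclass, field


@dataclass
class ResendInfo:
    """Bookkeeping for one unacked reliable packet (names are illustrative)."""
    tries_left: int = 10
    last_sent: float = field(default_factory=time.monotonic)


class ToyCircuit:
    def __init__(self, resend_every: float = 3.0):
        self.resend_every = resend_every
        # packet_id -> ResendInfo for every reliable packet awaiting an ACK
        self.unacked_reliable: dict[int, ResendInfo] = {}

    def send_reliable(self, packet_id: int) -> None:
        self.unacked_reliable[packet_id] = ResendInfo()

    def resend_unacked(self) -> None:
        now = time.monotonic()
        for packet_id, info in list(self.unacked_reliable.items()):
            if now - info.last_sent < self.resend_every:
                continue  # too soon to retry this packet
            info.tries_left -= 1
            info.last_sent = now
            # ... retransmit the packet here ...
            if info.tries_left <= 0:
                # All retries used up: kick it out of the retry queue
                del self.unacked_reliable[packet_id]

    def collect_ack(self, packet_id: int) -> None:
        # An ACK for the packet stops any further resends
        self.unacked_reliable.pop(packet_id, None)
```

With `resend_every = 0.0`, one `resend_unacked()` call drops `tries_left` from 10 to 9, and nine more calls exhaust the budget and evict the entry, mirroring the cadence the test asserts.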
@@ -5,10 +5,10 @@ from hippolyzer.lib.base.datatypes import UUID
 from hippolyzer.lib.base.message.message_formatting import HumanMessageSerializer
 from hippolyzer.lib.base.templates import TextureEntrySubfieldSerializer, TEFaceBitfield, TextureEntry
 
-EXAMPLE_TE = b"\x89UgG$\xcbC\xed\x92\x0bG\xca\xed\x15F_\x08\xe7\xb2\x98\x04\xca\x10;\x85\x94\x05Lj\x8d\xd4" \
-             b"\x0b\x1f\x01B\xcb\xe6|\x1d,\xa7sc\xa6\x1a\xa2L\xb1u\x01\x00\x00\x00\x00\x00\x00\x00\x00\x80?" \
-             b"\x00\x00\x00\x80?\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
-             b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00"
+EXAMPLE_TE = b'\x89UgG$\xcbC\xed\x92\x0bG\xca\xed\x15F_\x08\xca*\x98:\x18\x02,\r\xf4\x1e\xc6\xf5\x91\x01]\x83\x014' \
+             b'\x00\x90i+\x10\x80\xa1\xaa\xa2g\x11o\xa8]\xc6\x00\x00\x00\x00\x00\x00\x00\x00\x80?\x00\x00\x00\x80?' \
+             b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00' \
+             b'\x00\x00\x00\x00\x00\x00\x00'
 
 
 class TemplateTests(unittest.TestCase):
@@ -58,13 +58,12 @@ class TemplateTests(unittest.TestCase):
         str_msg = HumanMessageSerializer.to_human_string(msg, beautify=True)
         msg = HumanMessageSerializer.from_human_string(str_msg)
         spec = msg["ObjectData"][0].get_serializer("TextureEntry")
-        deser = spec.deserialize(None, msg["ObjectData"]["TextureEntry"], pod=True)
+        data_field = msg["ObjectData"]["TextureEntry"]
+        # Serialization order and format should match indra's exactly
+        self.assertEqual(EXAMPLE_TE, data_field)
+        deser = spec.deserialize(None, data_field, pod=True)
         self.assertEqual(deser, pod_te)
 
     def test_textureentry_defaults(self):
         te = TextureEntry()
         self.assertEqual(UUID('89556747-24cb-43ed-920b-47caed15465f'), te.Textures[None])
 
 
 if __name__ == "__main__":
     unittest.main()
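The ACK-collection test earlier in this page depends on `send_reliable()` returning an awaitable that resolves only when a matching *inbound* ACK arrives; an ACK travelling in the wrong direction must be ignored. A rough, self-contained sketch of that pattern, assuming asyncio futures; the `AckTracker` class and its method signatures are hypothetical, not hippolyzer's API:

```python
import asyncio


class AckTracker:
    """Maps unacked reliable packet IDs to futures awaiting their ACK."""

    def __init__(self) -> None:
        self.pending: dict[int, asyncio.Future] = {}

    def send_reliable(self, packet_id: int) -> asyncio.Future:
        fut = asyncio.get_running_loop().create_future()
        self.pending[packet_id] = fut
        return fut

    def collect_ack(self, packet_id: int, inbound: bool) -> None:
        if not inbound:
            # An outbound ACK can't acknowledge our own outbound packet
            return
        fut = self.pending.pop(packet_id, None)
        if fut is not None and not fut.done():
            fut.set_result(None)


async def demo() -> bool:
    tracker = AckTracker()
    fut = tracker.send_reliable(1)
    tracker.collect_ack(1, inbound=False)  # wrong direction: ignored
    assert not fut.done()
    tracker.collect_ack(1, inbound=True)   # inbound ACK resolves the future
    await fut
    return fut.done()
```

Keeping the future out of the `pending` dict once resolved also gives the "0 unacked after ACK" invariant that `test_reliable_ack_collection` checks.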