45 Commits

Author SHA1 Message Date
Salad Dais
be658b9026 v0.6.3
Cutting a release before working on mitmproxy upgrade
2021-07-18 06:57:40 +00:00
Salad Dais
c505941595 Improve test for TE serialization 2021-07-18 06:33:55 +00:00
Salad Dais
96f471d6b7 Add initial support for Message-specific Block subclasses 2021-07-07 12:49:32 +00:00
Salad Dais
4238016767 Change readme wording
:)
2021-07-07 12:49:32 +00:00
Salad Dais
a35a67718d Add default_value to MessageTemplateVariable 2021-07-01 21:25:51 +00:00
Salad Dais
c2981b107a Remove CodeQL scanning
Maybe later, doesn't seem to do anything useful out of the box.
2021-06-28 06:00:42 -03:00
Salad Dais
851375499a Add CodeQL scanning 2021-06-28 05:44:02 -03:00
Salad Dais
d064ecd466 Don't raise when reading a new avatar_name_cache.xml 2021-06-25 18:45:42 +00:00
Salad Dais
fda37656c9 Reduce boilerplate for mesh mangling addons
Makes it less annoying to compose separate addons with different manglers
2021-06-24 05:29:23 +00:00
Salad Dais
49a9c6f28f Workaround for failed teleports due to EventQueue timeouts
Closes #16
2021-06-23 16:43:09 +00:00
Salad Dais
050ac5e3a9 v0.6.2 2021-06-19 03:06:39 +00:00
Salad Dais
fe0d3132e4 Update shield addon 2021-06-18 20:49:31 +00:00
Salad Dais
d7f18e05be Fix typo 2021-06-18 20:49:20 +00:00
Salad Dais
9bf4240411 Allow tagging UDPPackets with arbitrary metadata
The metadata should propagate to any Messages deserialized
from the packet as well.
2021-06-18 20:31:15 +00:00
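A rough sketch of what the new tagging surface looks like, based only on the diffs further down (`UDPPacket.meta` and `Message.meta` are plain dicts; `handle_proxied_packet` is the addon hook shown below, while the message-side hook name here is illustrative):
```python
from hippolyzer.lib.proxy.addon_utils import BaseAddon


class PacketTaggingAddon(BaseAddon):
    def handle_proxied_packet(self, session_manager, packet, session, region):
        # Stash arbitrary metadata on the raw packet...
        packet.meta["tagged_src"] = packet.src_addr

    def handle_lludp_message(self, session, region, message):
        # ...and expect it to show up on any Message built from that packet.
        if "tagged_src" in message.meta:
            print("message came from", message.meta["tagged_src"])
```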
Salad Dais
76df9a0424 Streamline template dictionary use 2021-06-17 21:28:22 +00:00
Salad Dais
a91bc67a43 v0.6.1 2021-06-16 14:27:26 +00:00
Salad Dais
48180b85d1 Export proxy test utils for use in addon test suites 2021-06-15 18:48:05 +00:00
Salad Dais
77d3bf2fe1 Make ObjectCacheChain handle invalid caches properly 2021-06-14 14:17:21 +00:00
Salad Dais
d8ec9ee77a Add hooks to allow swapping out transports 2021-06-14 13:48:30 +00:00
Salad Dais
0b46b95f81 Minor API changes 2021-06-14 13:33:17 +00:00
Salad Dais
73e66c56e5 Clarify addon state management example addon 2021-06-13 12:06:04 +00:00
Salad Dais
fd2a4d8dce Remove incorrect comment from JPEG2000 test 2021-06-13 10:23:18 +00:00
Salad Dais
2209ebdd0c Add unit tests for JPEG2000 utils 2021-06-13 10:20:18 +00:00
Salad Dais
ccfb641cc2 Add pixel artist example addon 2021-06-12 15:44:26 +00:00
Salad Dais
220d8ddf65 Add confirmation helper for InteractionManager API 2021-06-12 15:15:34 +00:00
Salad Dais
235bc8e09e Change TextureEntry type signatures to play nicer with type checker 2021-06-12 15:15:03 +00:00
Salad Dais
41fd67577a Add ability to wait on object-related events 2021-06-12 10:43:16 +00:00
Salad Dais
8347b341f5 Give default values for TextureEntry fields 2021-06-12 10:26:52 +00:00
Salad Dais
9d5599939e Add MCode enum definition 2021-06-12 08:54:34 +00:00
Salad Dais
1fd6decf91 Add integration tests for addon (un)loading 2021-06-11 19:44:53 +00:00
Salad Dais
4ddc6aa852 Remove unloaded addon scripts from sys.modules 2021-06-11 19:44:35 +00:00
Salad Dais
ab89f6bc14 Add integration test for asset server wrapper cap 2021-06-11 17:53:55 +00:00
Salad Dais
cb8c1cfe91 Only generate lowercase hostnames in register_wrapper_cap()
Hostnames are case-insensitive, and passing a URL through urlparse()
will always give you a lowercase domain name.
2021-06-11 17:52:03 +00:00
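The urlparse() behaviour the message leans on is plain standard library; a quick illustration (the URL itself is made up):
```python
from urllib.parse import urlparse

parsed = urlparse("https://Sim-Asset-Host.Example.COM:12043/cap/1234")
# .hostname comes back lowercased (and without the port), so a wrapper cap
# registered under a mixed-case hostname would never match the parsed URL.
print(parsed.hostname)  # -> "sim-asset-host.example.com"
print(parsed.netloc)    # -> "Sim-Asset-Host.Example.COM:12043" (raw netloc keeps its case)
```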
Salad Dais
52679bf708 HTTPAssetRepo: Don't throw when trying to serve invalid UUID 2021-06-11 17:51:45 +00:00
Salad Dais
a21c0439e9 Test for mitmproxy handling HTTPS requests as well 2021-06-10 23:32:38 +00:00
Salad Dais
216ffb3777 Add integration test for mitmproxy interception 2021-06-10 23:22:59 +00:00
Salad Dais
d4c30d998d Allow handling Firestorm Bridge responses, use to guess avatar Z pos 2021-06-09 02:02:09 +00:00
Salad Dais
003f37c3d3 Auto-request unknown objects when an avatar sits on them
We need to know about an avatar's parent to get their exact position,
because the Object.Position field is always relative to the parent.
2021-06-08 23:44:08 +00:00
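A toy illustration of the parent-relative positioning described above; this is not the library's actual RegionPosition code, and it skips the rotation step (the real calculation also rotates the child offset by the parent's rotation):
```python
def approx_region_position(obj, lookup_localid):
    """Walk up the parent chain, accumulating parent-relative offsets."""
    pos = obj.Position
    parent_id = obj.ParentID
    while parent_id:
        parent = lookup_localid(parent_id)
        if parent is None:
            # This is the case the commit addresses: a seated avatar whose seat
            # isn't in the object cache has to trigger an object request first.
            raise KeyError(f"Parent {parent_id} unknown, request it before positioning")
        pos = parent.Position + pos  # simplified: parent rotation ignored
        parent_id = parent.ParentID
    return pos
```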
Salad Dais
d64a07c04c Better guard to prevent accidental lazy serializable hydration 2021-06-08 18:57:57 +00:00
Salad Dais
82b156813b Add more name accessors to Avatar class 2021-06-08 18:57:24 +00:00
Salad Dais
b71da8f5a4 Add option to automatically request missing cached objects 2021-06-08 18:41:44 +00:00
Salad Dais
5618bcbac1 Add new persistent (Proxy)Settings object, use to pass down settings 2021-06-08 16:55:19 +00:00
Salad Dais
24abc36df2 Correct AgentState enum definition 2021-06-07 12:56:39 +00:00
Salad Dais
9ceea8324a Fix templates.py reloading by importing importlib 2021-06-07 12:56:21 +00:00
Salad Dais
29653c350f Bundle addon examples with Windows build 2021-06-07 11:40:45 +00:00
67 changed files with 1255 additions and 542 deletions

View File

@@ -1,5 +1,6 @@
[run]
omit =
concurrency = multiprocessing
[report]
exclude_lines =
pragma: no cover

View File

@@ -23,6 +23,7 @@ jobs:
python -m pip install --upgrade pip
pip install -r requirements.txt
pip install -r requirements-test.txt
sudo apt-get install libopenjp2-7
- name: Run Flake8
run: |
flake8 .

View File

@@ -2,7 +2,7 @@
![Python Test Status](https://github.com/SaladDais/Hippolyzer/workflows/Run%20Python%20Tests/badge.svg) [![codecov](https://codecov.io/gh/SaladDais/Hippolyzer/branch/master/graph/badge.svg?token=HCTFA4RAXX)](https://codecov.io/gh/SaladDais/Hippolyzer)
[Hippolyzer](http://wiki.secondlife.com/wiki/Hippo) is a fork of Linden Lab's abandoned
[Hippolyzer](http://wiki.secondlife.com/wiki/Hippo) is a revival of Linden Lab's
[PyOGP library](http://wiki.secondlife.com/wiki/PyOGP)
targeting modern Python 3, with a focus on debugging issues in Second Life-compatible
servers and clients. There is a secondary focus on mocking up new features without requiring a
@@ -224,7 +224,7 @@ OUT ObjectAdd
```
The repeat spinner at the bottom of the window lets you send a message multiple times.
an `i` variable is put into the eval context and can be used to vary messages accros repeats.
an `i` variable is put into the eval context and can be used to vary messages across repeats.
With repeat set to two:
```

View File

@@ -9,7 +9,7 @@ from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
class PropertyHelloWorldAddon(BaseAddon):
class AddonStateHelloWorldAddon(BaseAddon):
# How to say hello, value shared across sessions and will be the same
# regardless of which session is active when accessed.
# "hello_greeting" is added to session_manager.addon_ctx's dict and will survive reloads
@@ -28,7 +28,11 @@ class PropertyHelloWorldAddon(BaseAddon):
# Shared across sessions and will die if the addon is reloaded
self.hello_punctuation = "!"
@handle_command(greeting=Parameter(str, sep=None))
@handle_command(
# Use the longer-form `Parameter()` for declaring this because
# this field should be greedy and take the rest of the message (no separator.)
greeting=Parameter(str, sep=None),
)
async def set_hello_greeting(self, _session: Session, _region: ProxiedRegion, greeting: str):
"""Set the person to say hello to"""
self.hello_greeting = greeting
@@ -38,7 +42,10 @@ class PropertyHelloWorldAddon(BaseAddon):
"""Set the person to say hello to"""
self.hello_person = person
@handle_command(punctuation=Parameter(str, sep=None))
@handle_command(
# Punctuation should have no whitespace, so using a simple parameter is OK.
punctuation=str,
)
async def set_hello_punctuation(self, _session: Session, _region: ProxiedRegion, punctuation: str):
"""Set the punctuation to use for saying hello"""
self.hello_punctuation = punctuation
@@ -47,8 +54,8 @@ class PropertyHelloWorldAddon(BaseAddon):
async def say_hello(self, _session: Session, _region: ProxiedRegion):
"""Say hello using the configured hello variables"""
# These aren't instance properties, they can be accessed via the class as well.
hello_person = PropertyHelloWorldAddon.hello_person
hello_person = AddonStateHelloWorldAddon.hello_person
send_chat(f"{self.hello_greeting} {hello_person}{self.hello_punctuation}")
addons = [PropertyHelloWorldAddon()]
addons = [AddonStateHelloWorldAddon()]

View File

@@ -280,4 +280,23 @@ class MeshUploadInterceptingAddon(BaseAddon):
cls._replace_local_mesh(session.main_region, asset_repo, mesh_list)
class BaseMeshManglerAddon(BaseAddon):
"""Base class for addons that mangle uploaded or local mesh"""
MESH_MANGLERS: List[Callable[[MeshAsset], MeshAsset]]
def handle_init(self, session_manager: SessionManager):
# Add our manglers into the list
MeshUploadInterceptingAddon.mesh_manglers.extend(self.MESH_MANGLERS)
# Tell the local mesh plugin that the mangler list changed, and to re-apply
MeshUploadInterceptingAddon.remangle_local_mesh(session_manager)
def handle_unload(self, session_manager: SessionManager):
# Clean up our manglers before we go away
mangler_list = MeshUploadInterceptingAddon.mesh_manglers
for mangler in self.MESH_MANGLERS:
if mangler in mangler_list:
mangler_list.remove(mangler)
MeshUploadInterceptingAddon.remangle_local_mesh(session_manager)
addons = [MeshUploadInterceptingAddon()]

View File

@@ -11,8 +11,6 @@ to add to give a mesh an arbitrary center of rotation / scaling.
from hippolyzer.lib.base.mesh import MeshAsset
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.sessions import SessionManager
import local_mesh
AddonManager.hot_reload(local_mesh, require_addons_loaded=True)
@@ -37,6 +35,9 @@ def reorient_mesh(orientation):
# X=1, Y=2, Z=3
def _reorienter(mesh: MeshAsset):
for material in mesh.iter_lod_materials():
if "Position" not in material:
# Must be a NoGeometry LOD
continue
# We don't need to use positions_(to/from)_domain here since we're just naively
# flipping the axes around.
material["Position"] = _reorient_coord_list(material["Position"], orientation)
@@ -46,28 +47,11 @@ def reorient_mesh(orientation):
return _reorienter
OUR_MANGLERS = [
# Negate the X and Y axes on any mesh we upload or create temp
reorient_mesh((-1, -2, 3)),
]
class ExampleMeshManglerAddon(local_mesh.BaseMeshManglerAddon):
MESH_MANGLERS = [
# Negate the X and Y axes on any mesh we upload or create temp
reorient_mesh((-1, -2, 3)),
]
class MeshManglerExampleAddon(BaseAddon):
def handle_init(self, session_manager: SessionManager):
# Add our manglers into the list
local_mesh_addon = local_mesh.MeshUploadInterceptingAddon
local_mesh_addon.mesh_manglers.extend(OUR_MANGLERS)
# Tell the local mesh plugin that the mangler list changed, and to re-apply
local_mesh_addon.remangle_local_mesh(session_manager)
def handle_unload(self, session_manager: SessionManager):
# Clean up our manglers before we go away
local_mesh_addon = local_mesh.MeshUploadInterceptingAddon
mangler_list = local_mesh_addon.mesh_manglers
for mangler in OUR_MANGLERS:
if mangler in mangler_list:
mangler_list.remove(mangler)
local_mesh_addon.remangle_local_mesh(session_manager)
addons = [MeshManglerExampleAddon()]
addons = [ExampleMeshManglerAddon()]

View File

@@ -4,7 +4,7 @@ Do the money dance whenever someone in the sim pays you directly
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.templates import MoneyTransactionType, PCode, ChatType
from hippolyzer.lib.base.templates import MoneyTransactionType, ChatType
from hippolyzer.lib.proxy.addon_utils import send_chat, BaseAddon
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
@@ -27,8 +27,8 @@ class PaydayAddon(BaseAddon):
return
# Check if they're likely to be in the sim
sender_obj = region.objects.lookup_fullid(sender)
if not sender_obj or sender_obj.PCode != PCode.AVATAR:
sender_obj = region.objects.lookup_avatar(sender)
if not sender_obj:
return
amount = transaction_block['Amount']

View File

@@ -0,0 +1,161 @@
"""
Import a small image (like a nintendo sprite) and create it out of cube prims
Inefficient and doesn't even do line fill, expect it to take `width * height`
prims for whatever image you import!
"""
import asyncio
import struct
from typing import *
from PySide2.QtGui import QImage
from hippolyzer.lib.base.datatypes import UUID, Vector3, Quaternion
from hippolyzer.lib.base.helpers import to_chunks
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.templates import ObjectUpdateFlags, PCode, MCode, MultipleObjectUpdateFlags, TextureEntry
from hippolyzer.lib.client.object_manager import ObjectEvent, UpdateType
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.commands import handle_command
from hippolyzer.lib.base.network.transport import Direction
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
JUST_CREATED_FLAGS = (ObjectUpdateFlags.CREATE_SELECTED | ObjectUpdateFlags.OBJECT_YOU_OWNER)
PRIM_SCALE = 0.2
class PixelArtistAddon(BaseAddon):
@handle_command()
async def import_pixel_art(self, session: Session, region: ProxiedRegion):
"""
Import a small image (like a nintendo sprite) and create it out of cube prims
"""
filename = await AddonManager.UI.open_file(
"Open an image",
filter_str="Images (*.png *.jpg *.jpeg *.bmp)",
)
if not filename:
return
img = QImage()
with open(filename, "rb") as f:
img.loadFromData(f.read(), aformat=None)
img = img.convertToFormat(QImage.Format_RGBA8888)
height = img.height()
width = img.width()
pixels: List[Optional[bytes]] = []
needed_prims = 0
for y in range(height):
for x in range(width):
color: int = img.pixel(x, y)
# This will be ARGB, SL wants RGBA
alpha = (color & 0xFF000000) >> 24
color = color & 0x00FFFFFF
if alpha > 20:
# Repack RGBA to the bytes format we use for colors
pixels.append(struct.pack("!I", (color << 8) | alpha))
needed_prims += 1
else:
# Pretty transparent, skip it
pixels.append(None)
if not await AddonManager.UI.confirm("Confirm prim use", f"This will take {needed_prims} prims"):
return
agent_obj = region.objects.lookup_fullid(session.agent_id)
agent_pos = agent_obj.RegionPosition
created_prims = []
# Watch for any newly created prims, this is basically what the viewer does to find
# prims that it just created with the build tool.
with session.objects.events.subscribe_async(
(UpdateType.OBJECT_UPDATE,),
predicate=lambda e: e.object.UpdateFlags & JUST_CREATED_FLAGS and "LocalID" in e.updated
) as get_events:
# Create a pool of prims to use for building the pixel art
for _ in range(needed_prims):
# TODO: We don't track the land group or user's active group, so
# "anyone can build" must be on for rezzing to work.
group_id = UUID()
region.circuit.send_message(Message(
'ObjectAdd',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id, GroupID=group_id),
Block(
'ObjectData',
PCode=PCode.PRIMITIVE,
Material=MCode.WOOD,
AddFlags=ObjectUpdateFlags.CREATE_SELECTED,
PathCurve=16,
ProfileCurve=1,
PathScaleX=100,
PathScaleY=100,
BypassRaycast=1,
RayStart=agent_obj.RegionPosition + Vector3(0, 0, 2),
RayEnd=agent_obj.RegionPosition + Vector3(0, 0, 2),
RayTargetID=UUID(),
RayEndIsIntersection=0,
Scale=Vector3(PRIM_SCALE, PRIM_SCALE, PRIM_SCALE),
Rotation=Quaternion(0.0, 0.0, 0.0, 1.0),
fill_missing=True,
),
))
# Don't spam a ton of creates at once
await asyncio.sleep(0.02)
# Read any creation events that queued up while we were creating the objects
# So we can figure out the newly-created objects' IDs
for _ in range(needed_prims):
evt: ObjectEvent = await asyncio.wait_for(get_events(), 1.0)
created_prims.append(evt.object)
# Drawing origin starts at the top left, should be positioned just above the
# avatar on Z and centered on Y.
top_left = Vector3(0, (width * PRIM_SCALE) * -0.5, (height * PRIM_SCALE) + 2.0) + agent_pos
positioning_blocks = []
prim_idx = 0
for i, pixel_color in enumerate(pixels):
# Transparent, skip
if pixel_color is None:
continue
x = i % width
y = i // width
obj = created_prims[prim_idx]
# Set a blank texture on all faces
te = TextureEntry()
te.Textures[None] = UUID('5748decc-f629-461c-9a36-a35a221fe21f')
# Set the prim color to the color from the pixel
te.Color[None] = pixel_color
# Set the prim texture and color
region.circuit.send_message(Message(
'ObjectImage',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
Block('ObjectData', ObjectLocalID=obj.LocalID, MediaURL=b'', TextureEntry_=te),
direction=Direction.OUT,
))
# Save the repositioning data for later since it uses a different message,
# but it can be set in batches.
positioning_blocks.append(Block(
'ObjectData',
ObjectLocalID=obj.LocalID,
Type=MultipleObjectUpdateFlags.POSITION,
Data_={'POSITION': top_left + Vector3(0, x * PRIM_SCALE, y * -PRIM_SCALE)},
))
await asyncio.sleep(0.01)
# We actually used a prim for this, so increment the index
prim_idx += 1
# Move the "pixels" to their correct position in chunks
for chunk in to_chunks(positioning_blocks, 25):
region.circuit.send_message(Message(
'MultipleObjectUpdate',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
*chunk,
direction=Direction.OUT,
))
await asyncio.sleep(0.01)
addons = [PixelArtistAddon()]

View File

@@ -70,7 +70,7 @@ class RecapitatorAddon(BaseAddon):
async def _proxy_bodypart_upload(self, session: Session, region: ProxiedRegion, message: Message):
asset_block = message["AssetBlock"]
# Asset will already be in the viewer's VFS as the expected asset ID, calculate it.
asset_id = session.tid_to_assetid(asset_block["TransactionID"])
asset_id = session.transaction_to_assetid(asset_block["TransactionID"])
success = False
try:
# Xfer the asset from the viewer if it wasn't small enough to fit in AssetData

View File

@@ -29,10 +29,11 @@ class SerializationSanityChecker(BaseAddon):
self.deserializer = UDPMessageDeserializer()
def handle_proxied_packet(self, session_manager: SessionManager, packet: UDPPacket,
session: Optional[Session], region: Optional[ProxiedRegion],
message: Optional[Message]):
session: Optional[Session], region: Optional[ProxiedRegion]):
# Well this doesn't even parse as a message, can't do anything about it.
if message is None:
try:
message = self.deserializer.deserialize(packet.data)
except:
LOG.error(f"Received unparseable message from {packet.src_addr!r}: {packet.data!r}")
return
try:

View File

@@ -6,7 +6,13 @@ from hippolyzer.lib.base.network.transport import Direction
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
SUSPICIOUS_PACKETS = {"TransferRequest", "UUIDNameRequest", "UUIDGroupNameRequest", "OpenCircuit"}
SUSPICIOUS_PACKETS = {
"TransferRequest",
"UUIDNameRequest",
"UUIDGroupNameRequest",
"OpenCircuit",
"AddCircuitCode",
}
REGULAR_IM_DIALOGS = (IMDialogType.TYPING_STOP, IMDialogType.TYPING_STOP, IMDialogType.NOTHING_SPECIAL)

View File

@@ -40,7 +40,7 @@ class TransferExampleAddon(BaseAddon):
Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
Block('InventoryData', LocalID=session.selected.object_local),
))
inv_message = await region.message_handler.wait_for('ReplyTaskInventory', timeout=5.0)
inv_message = await region.message_handler.wait_for(('ReplyTaskInventory',), timeout=5.0)
# Xfer the inventory file and look for a script
xfer = await region.xfer_manager.request(

View File

@@ -65,7 +65,7 @@ class TurboObjectInventoryAddon(BaseAddon):
# by marking it complete on the server-side. Re-send our RequestTaskInventory
# To make sure there's a fresh copy.
region.circuit.send_message(request_msg.take())
inv_message = await region.message_handler.wait_for('ReplyTaskInventory', timeout=5.0)
inv_message = await region.message_handler.wait_for(('ReplyTaskInventory',), timeout=5.0)
# No task inventory, send the reply as-is
file_name = inv_message["InventoryData"]["Filename"]
if not file_name:

View File

@@ -22,7 +22,7 @@ class XferExampleAddon(BaseAddon):
))
# Wait for any MuteListUpdate, dropping it before it reaches the viewer
update_msg = await region.message_handler.wait_for('MuteListUpdate', timeout=5.0)
update_msg = await region.message_handler.wait_for(('MuteListUpdate',), timeout=5.0)
mute_file_name = update_msg["MuteData"]["Filename"]
if not mute_file_name:
show_message("Nobody muted?")
@@ -42,7 +42,7 @@ class XferExampleAddon(BaseAddon):
Block('InventoryData', LocalID=session.selected.object_local),
))
inv_message = await region.message_handler.wait_for('ReplyTaskInventory', timeout=5.0)
inv_message = await region.message_handler.wait_for(('ReplyTaskInventory',), timeout=5.0)
# Xfer doesn't need to be immediately awaited, multiple signals can be waited on.
xfer = region.xfer_manager.request(

View File

@@ -20,6 +20,7 @@ from hippolyzer.lib.proxy.lludp_proxy import SLSOCKS5Server
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import SessionManager, Session
from hippolyzer.lib.proxy.settings import ProxySettings
LOG = logging.getLogger(__name__)
@@ -88,11 +89,12 @@ def run_http_proxy_process(proxy_host, http_proxy_port, flow_context: HTTPFlowCo
mitmproxy_master = create_http_proxy(proxy_host, http_proxy_port, flow_context)
mitmproxy_master.start_server()
gc.freeze()
flow_context.mitmproxy_ready.set()
mitm_loop.run_forever()
def start_proxy(extra_addons: Optional[list] = None, extra_addon_paths: Optional[list] = None,
session_manager=None, proxy_host=None):
def start_proxy(session_manager: SessionManager, extra_addons: Optional[list] = None,
extra_addon_paths: Optional[list] = None, proxy_host=None):
extra_addons = extra_addons or []
extra_addon_paths = extra_addon_paths or []
extra_addons.append(SelectionManagerAddon())
@@ -105,12 +107,11 @@ def start_proxy(extra_addons: Optional[list] = None, extra_addon_paths: Optional
loop = asyncio.get_event_loop()
udp_proxy_port = int(os.environ.get("HIPPO_UDP_PORT", 9061))
http_proxy_port = int(os.environ.get("HIPPO_HTTP_PORT", 9062))
udp_proxy_port = session_manager.settings.SOCKS_PROXY_PORT
http_proxy_port = session_manager.settings.HTTP_PROXY_PORT
if proxy_host is None:
proxy_host = os.environ.get("HIPPO_BIND_HOST", "127.0.0.1")
proxy_host = session_manager.settings.PROXY_BIND_ADDR
session_manager = session_manager or SessionManager()
flow_context = session_manager.flow_context
session_manager.name_cache.load_viewer_caches()
@@ -186,7 +187,7 @@ def _windows_timeout_killer(pid: int):
def main():
multiprocessing.set_start_method("spawn")
start_proxy()
start_proxy(SessionManager(ProxySettings()))
if __name__ == "__main__":

View File

@@ -1,5 +1,6 @@
import asyncio
import base64
import dataclasses
import email
import functools
import html
@@ -33,10 +34,10 @@ from hippolyzer.lib.base.message.message_formatting import (
SpannedString,
)
from hippolyzer.lib.base.message.msgtypes import MsgType
from hippolyzer.lib.base.message.template_dict import TemplateDictionary
from hippolyzer.lib.base.message.template_dict import DEFAULT_TEMPLATE_DICT
from hippolyzer.lib.base.ui_helpers import loadUi
import hippolyzer.lib.base.serialization as se
from hippolyzer.lib.base.network.transport import Direction, WrappingUDPTransport
from hippolyzer.lib.base.network.transport import Direction, SocketUDPTransport
from hippolyzer.lib.proxy.addons import BaseInteractionManager, AddonManager
from hippolyzer.lib.proxy.ca_utils import setup_ca_everywhere
from hippolyzer.lib.proxy.caps_client import ProxyCapsClient
@@ -44,6 +45,7 @@ from hippolyzer.lib.proxy.http_proxy import create_proxy_master, HTTPFlowContext
from hippolyzer.lib.proxy.message_logger import LLUDPMessageLogEntry, AbstractMessageLogEntry
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session, SessionManager
from hippolyzer.lib.proxy.settings import ProxySettings
from hippolyzer.lib.proxy.templates import CAP_TEMPLATES
LOG = logging.getLogger(__name__)
@@ -66,8 +68,8 @@ class GUISessionManager(SessionManager, QtCore.QObject):
regionAdded = QtCore.Signal(ProxiedRegion)
regionRemoved = QtCore.Signal(ProxiedRegion)
def __init__(self, model):
SessionManager.__init__(self)
def __init__(self, settings, model):
SessionManager.__init__(self, settings)
QtCore.QObject.__init__(self)
self.all_regions = []
self.message_logger = model
@@ -140,6 +142,19 @@ class GUIInteractionManager(BaseInteractionManager, QtCore.QObject):
return None
return dialog.selectedFiles()[0]
async def confirm(self, title: str, caption: str) -> bool:
msg = QtWidgets.QMessageBox(
QtWidgets.QMessageBox.Icon.Question,
title,
caption,
QtWidgets.QMessageBox.Ok | QtWidgets.QMessageBox.Cancel,
self.parent(),
)
fut = asyncio.Future()
msg.finished.connect(lambda r: fut.set_result(r))
msg.open()
return (await fut) == QtWidgets.QMessageBox.Ok
def nonFatalExceptions(f):
@functools.wraps(f)
@@ -172,9 +187,9 @@ class ProxyGUI(QtWidgets.QMainWindow):
super().__init__()
loadUi(MAIN_WINDOW_UI_PATH, self)
self.settings = QtCore.QSettings("SaladDais", "hippolyzer")
self._selectedEntry: Optional[AbstractMessageLogEntry] = None
self.settings = GUIProxySettings(QtCore.QSettings("SaladDais", "hippolyzer"))
self.model = MessageLogModel(parent=self.tableView)
self.tableView.setModel(self.model)
self.model.rowsAboutToBeInserted.connect(self.beforeInsert)
@@ -191,18 +206,19 @@ class ProxyGUI(QtWidgets.QMainWindow):
self.actionManageAddons.triggered.connect(self._manageAddons)
self.actionManageFilters.triggered.connect(self._manageFilters)
self.actionOpenMessageBuilder.triggered.connect(self._openMessageBuilder)
self.actionProxyRemotelyAccessible.setChecked(
self.settings.value("RemotelyAccessible", False, type=bool))
self.actionUseViewerObjectCache.setChecked(
self.settings.value("UseViewerObjectCache", False, type=bool))
self.actionProxyRemotelyAccessible.setChecked(self.settings.REMOTELY_ACCESSIBLE)
self.actionUseViewerObjectCache.setChecked(self.settings.USE_VIEWER_OBJECT_CACHE)
self.actionRequestMissingObjects.setChecked(self.settings.AUTOMATICALLY_REQUEST_MISSING_OBJECTS)
self.actionProxyRemotelyAccessible.triggered.connect(self._setProxyRemotelyAccessible)
self.actionUseViewerObjectCache.triggered.connect(self._setUseViewerObjectCache)
self.actionRequestMissingObjects.triggered.connect(self._setRequestMissingObjects)
self._filterMenu = QtWidgets.QMenu()
self._populateFilterMenu()
self.toolButtonFilter.setMenu(self._filterMenu)
self.sessionManager = GUISessionManager(self.model)
self.sessionManager = GUISessionManager(self.settings, self.model)
self.interactionManager = GUIInteractionManager(self)
AddonManager.UI = self.interactionManager
@@ -223,15 +239,12 @@ class ProxyGUI(QtWidgets.QMainWindow):
self._filterMenu.clear()
_addFilterAction("Default", self.DEFAULT_FILTER)
filters = self.getFilterDict()
filters = self.settings.FILTERS
for preset_name, preset_filter in filters.items():
_addFilterAction(preset_name, preset_filter)
def getFilterDict(self):
return json.loads(str(self.settings.value("Filters", "{}")))
def setFilterDict(self, val: dict):
self.settings.setValue("Filters", json.dumps(val))
self.settings.FILTERS = val
self._populateFilterMenu()
def _manageFilters(self):
@@ -376,24 +389,26 @@ class ProxyGUI(QtWidgets.QMainWindow):
msg.exec()
def _setProxyRemotelyAccessible(self, checked: bool):
self.settings.setValue("RemotelyAccessible", checked)
self.sessionManager.settings.REMOTELY_ACCESSIBLE = checked
msg = QtWidgets.QMessageBox()
msg.setText("Remote accessibility setting changes will take effect on next run")
msg.exec()
def _setUseViewerObjectCache(self, checked: bool):
self.settings.setValue("UseViewerObjectCache", checked)
self.sessionManager.use_viewer_object_cache = checked
self.sessionManager.settings.USE_VIEWER_OBJECT_CACHE = checked
def _setRequestMissingObjects(self, checked: bool):
self.sessionManager.settings.AUTOMATICALLY_REQUEST_MISSING_OBJECTS = checked
def _manageAddons(self):
dialog = AddonDialog(self)
dialog.exec_()
def getAddonList(self) -> List[str]:
return json.loads(str(self.settings.value("Addons", "[]")))
return self.sessionManager.settings.ADDON_SCRIPTS
def setAddonList(self, val: List[str]):
self.settings.setValue("Addons", json.dumps(val))
self.sessionManager.settings.ADDON_SCRIPTS = val
BANNED_HEADERS = ("content-length", "host")
@@ -431,7 +446,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
def __init__(self, parent, session_manager):
super().__init__(parent=parent)
loadUi(MESSAGE_BUILDER_UI_PATH, self)
self.templateDict = TemplateDictionary()
self.templateDict = DEFAULT_TEMPLATE_DICT
self.llsdSerializer = LLSDMessageSerializer()
self.sessionManager: SessionManager = session_manager
self.regionModel = RegionListModel(self, self.sessionManager)
@@ -560,24 +575,9 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
if var.name in ("TaskID", "ObjectID"):
return VerbatimHumanVal("[[SELECTED_FULL]]")
if var.type.is_int:
return 0
elif var.type.is_float:
return 0.0
elif var.type == MsgType.MVT_LLUUID:
return UUID()
elif var.type == MsgType.MVT_BOOL:
return False
elif var.type == MsgType.MVT_VARIABLE:
return ""
elif var.type in (MsgType.MVT_LLVector3, MsgType.MVT_LLVector3d, MsgType.MVT_LLQuaternion):
return VerbatimHumanVal("(0.0, 0.0, 0.0)")
elif var.type == MsgType.MVT_LLVector4:
return VerbatimHumanVal("(0.0, 0.0, 0.0, 0.0)")
elif var.type == MsgType.MVT_FIXED:
return b"\x00" * var.size
elif var.type == MsgType.MVT_IP_ADDR:
return "0.0.0.0"
default_val = var.default_value
if default_val is not None:
return default_val
return VerbatimHumanVal("")
@nonFatalExceptions
@@ -617,7 +617,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
msg = HumanMessageSerializer.from_human_string(msg_text, replacements, env, safe=False)
if self.checkLLUDPViaCaps.isChecked():
if msg.direction == Direction.IN:
region.eq_manager.queue_event(
region.eq_manager.inject_event(
self.llsdSerializer.serialize(msg, as_dict=True)
)
else:
@@ -631,7 +631,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
transport = None
off_circuit = self.checkOffCircuit.isChecked()
if off_circuit:
transport = WrappingUDPTransport(socket.socket(socket.AF_INET, socket.SOCK_DGRAM))
transport = SocketUDPTransport(socket.socket(socket.AF_INET, socket.SOCK_DGRAM))
region.circuit.send_message(msg, transport=transport)
if off_circuit:
transport.close()
@@ -641,7 +641,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
raise RuntimeError("Need a valid session and region to send EQ event")
message_line, _, body = (x.strip() for x in msg_text.partition("\n"))
message_name = message_line.rsplit(" ", 1)[-1]
region.eq_manager.queue_event({
region.eq_manager.inject_event({
"message": message_name,
"body": llsd.parse_xml(body.encode("utf8")),
})
@@ -719,7 +719,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
return val
def _sendHTTPRequest(self, method, uri, headers, body):
caps_client = ProxyCapsClient()
caps_client = ProxyCapsClient(self.sessionManager.settings)
async def _send_request():
req = caps_client.request(method, uri, headers=headers, data=body)
@@ -823,6 +823,22 @@ class FilterDialog(QtWidgets.QDialog):
self.listFilters.takeItem(idx)
class GUIProxySettings(ProxySettings):
"""Persistent settings backed by QSettings"""
def __init__(self, settings: QtCore.QSettings):
super().__init__()
self._settings_obj = settings
def get_setting(self, name: str) -> Any:
val: Any = self._settings_obj.value(name, defaultValue=dataclasses.MISSING)
if val is dataclasses.MISSING:
return val
return json.loads(val)
def set_setting(self, name: str, val: Any):
self._settings_obj.setValue(name, json.dumps(val))
def gui_main():
multiprocessing.set_start_method('spawn')
QtCore.QCoreApplication.setAttribute(QtCore.Qt.AA_ShareOpenGLContexts)
@@ -835,11 +851,8 @@ def gui_main():
timer.start(100)
signal.signal(signal.SIGINT, lambda *args: QtWidgets.QApplication.quit())
window.show()
remote_access = window.settings.value("RemotelyAccessible", False, type=bool)
use_vocache = window.settings.value("UseViewerObjectCache", False, type=bool)
window.sessionManager.use_viewer_object_cache = use_vocache
http_host = None
if remote_access:
if window.sessionManager.settings.REMOTELY_ACCESSIBLE:
http_host = "0.0.0.0"
start_proxy(
session_manager=window.sessionManager,

View File

@@ -263,6 +263,7 @@
<addaction name="separator"/>
<addaction name="actionProxyRemotelyAccessible"/>
<addaction name="actionUseViewerObjectCache"/>
<addaction name="actionRequestMissingObjects"/>
</widget>
<addaction name="menuFile"/>
</widget>
@@ -311,6 +312,17 @@
<string>Can help make the proxy aware of certain objects, but can cause slowdowns</string>
</property>
</action>
<action name="actionRequestMissingObjects">
<property name="checkable">
<bool>true</bool>
</property>
<property name="text">
<string>Automatically Request Missing Objects</string>
</property>
<property name="toolTip">
<string>Force the proxy to request objects that it doesn't know about due to cache misses</string>
</property>
</action>
</widget>
<resources/>
<connections/>

View File

@@ -294,6 +294,17 @@ class RawBytes(bytes):
pass
_T = TypeVar("_T")
class Pretty(Generic[_T]):
"""Wrapper for var values so Messages will know to serialize"""
__slots__ = ("value",)
def __init__(self, value: _T):
self.value: _T = value
class StringEnum(str, enum.Enum):
def __str__(self):
return self.value
@@ -333,5 +344,5 @@ class TaggedUnion(recordclass.datatuple): # type: ignore
__all__ = [
"Vector3", "Vector4", "Vector2", "Quaternion", "TupleCoord",
"UUID", "RawBytes", "StringEnum", "JankStringyBytes", "TaggedUnion",
"IntEnum", "IntFlag", "flags_to_pod"
"IntEnum", "IntFlag", "flags_to_pod", "Pretty"
]

View File

@@ -139,3 +139,9 @@ def bytes_escape(val: bytes) -> bytes:
def get_resource_filename(resource_filename: str):
return pkg_resources.resource_filename("hippolyzer", resource_filename)
def to_chunks(chunkable: Sequence[_T], chunk_size: int) -> Generator[_T, None, None]:
while chunkable:
yield chunkable[:chunk_size]
chunkable = chunkable[chunk_size:]

View File

@@ -77,4 +77,4 @@ class ConnectionHolder(abc.ABC):
lifetime of a session (due to region restarts, etc.)
"""
circuit: Optional[Circuit]
message_handler: MessageHandler[Message]
message_handler: MessageHandler[Message, str]

View File

@@ -5,14 +5,13 @@ from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.message.data_packer import LLSDDataPacker
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.message.template import MessageTemplateVariable
from hippolyzer.lib.base.message.template_dict import TemplateDictionary
from hippolyzer.lib.base.message.template_dict import TemplateDictionary, DEFAULT_TEMPLATE_DICT
VAR_PAIR = Tuple[dict, MessageTemplateVariable]
class LLSDMessageSerializer:
DEFAULT_TEMPLATE = TemplateDictionary()
DEFAULT_TEMPLATE = DEFAULT_TEMPLATE_DICT
def __init__(self, message_template=None, message_cls: Type[Message] = Message):
if message_template is not None:

View File

@@ -22,6 +22,7 @@ from __future__ import annotations
import copy
import enum
import importlib
import itertools
import logging
import os
@@ -31,6 +32,7 @@ from typing import *
from hippolyzer.lib.base.datatypes import *
import hippolyzer.lib.base.serialization as se
import hippolyzer.lib.base.templates as templates
from hippolyzer.lib.base.datatypes import Pretty
from hippolyzer.lib.base.message.msgtypes import PacketFlags
from hippolyzer.lib.base.network.transport import Direction, ADDR_TUPLE
@@ -61,11 +63,12 @@ class Block:
Block expects a name, and kwargs for variables (var_name = value)
"""
__slots__ = ('name', 'size', 'vars', 'message_name', '_ser_cache', 'fill_missing',)
PARENT_MESSAGE_NAME: ClassVar[Optional[str]] = None
def __init__(self, name, /, *, fill_missing=False, **kwargs):
self.name = name
self.size = 0
self.message_name: Optional[str] = None
self.message_name: Optional[str] = self.PARENT_MESSAGE_NAME
self.vars: Dict[str, VAR_TYPE] = {}
self._ser_cache: Dict[str, Any] = {}
self.fill_missing = fill_missing
@@ -82,6 +85,9 @@ class Block:
return self.vars[name]
def __setitem__(self, key, value):
if isinstance(value, Pretty):
return self.serialize_var(key, value.value)
# These don't pickle well since they're likely to get hot-reloaded
if isinstance(value, (enum.IntEnum, enum.IntFlag)):
value = int(value)

View File

@@ -28,28 +28,28 @@ from hippolyzer.lib.base.events import Event
LOG = logging.getLogger(__name__)
_T = TypeVar("_T")
_K = TypeVar("_K", bound=Hashable)
MESSAGE_HANDLER = Callable[[_T], Any]
PREDICATE = Callable[[_T], bool]
MESSAGE_NAMES = Union[str, Iterable[str]]
MESSAGE_NAMES = Iterable[_K]
class MessageHandler(Generic[_T]):
def __init__(self):
self.handlers: Dict[str, Event] = {}
class MessageHandler(Generic[_T, _K]):
def __init__(self, take_by_default: bool = True):
self.handlers: Dict[_K, Event] = {}
self.take_by_default = take_by_default
def register(self, message_name: str) -> Event:
def register(self, message_name: _K) -> Event:
LOG.debug('Creating a monitor for %s' % message_name)
return self.handlers.setdefault(message_name, Event())
def subscribe(self, message_name: str, handler: MESSAGE_HANDLER) -> Event:
def subscribe(self, message_name: _K, handler: MESSAGE_HANDLER) -> Event:
notifier = self.register(message_name)
notifier.subscribe(handler)
return notifier
def _subscribe_all(self, message_names: MESSAGE_NAMES, handler: MESSAGE_HANDLER,
predicate: Optional[PREDICATE] = None) -> List[Event]:
if isinstance(message_names, str):
message_names = (message_names,)
notifiers = [self.register(name) for name in message_names]
for n in notifiers:
n.subscribe(handler, predicate=predicate)
@@ -57,7 +57,7 @@ class MessageHandler(Generic[_T]):
@contextlib.contextmanager
def subscribe_async(self, message_names: MESSAGE_NAMES, predicate: Optional[PREDICATE] = None,
take: bool = True) -> ContextManager[Callable[[], Awaitable[_T]]]:
take: Optional[bool] = None) -> ContextManager[Callable[[], Awaitable[_T]]]:
"""
Subscribe to a set of message matching predicate while within a block
@@ -69,6 +69,8 @@ class MessageHandler(Generic[_T]):
If a subscriber is just an observer that will never drop or modify a message, take=False
may be used and messages will be sent as usual.
"""
if take is None:
take = self.take_by_default
msg_queue = asyncio.Queue()
def _handler_wrapper(message: _T):
@@ -91,8 +93,8 @@ class MessageHandler(Generic[_T]):
for n in notifiers:
n.unsubscribe(_handler_wrapper)
def wait_for(self, message_names: MESSAGE_NAMES,
predicate: Optional[PREDICATE] = None, timeout=None, take=True) -> Awaitable[_T]:
def wait_for(self, message_names: MESSAGE_NAMES, predicate: Optional[PREDICATE] = None,
timeout: Optional[float] = None, take: Optional[bool] = None) -> Awaitable[_T]:
"""
Wait for a single instance one of message_names matching predicate
@@ -101,8 +103,8 @@ class MessageHandler(Generic[_T]):
sequence of packets, since multiple packets may come in after the future has already
been marked completed, causing some to be missed.
"""
if isinstance(message_names, str):
message_names = (message_names,)
if take is None:
take = self.take_by_default
notifiers = [self.register(name) for name in message_names]
fut = asyncio.get_event_loop().create_future()
@@ -132,7 +134,7 @@ class MessageHandler(Generic[_T]):
notifier.subscribe(_handler, predicate=predicate)
return fut
def is_handled(self, message_name: str):
def is_handled(self, message_name: _K):
return message_name in self.handlers
def handle(self, message: _T):
@@ -140,7 +142,7 @@ class MessageHandler(Generic[_T]):
# Always try to call wildcard handlers
self._handle_type('*', message)
def _handle_type(self, name: str, message: _T):
def _handle_type(self, name: _K, message: _T):
handler = self.handlers.get(name)
if not handler:
return

View File

@@ -22,6 +22,7 @@ Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
import typing
from .msgtypes import MsgType, MsgBlockType
from ..datatypes import UUID
class MessageTemplateVariable:
@@ -61,6 +62,32 @@ class MessageTemplateVariable:
self._probably_text = self._probably_text and self.name != "NameValue"
return self._probably_text
@property
def default_value(self):
if self.type.is_int:
return 0
elif self.type.is_float:
return 0.0
elif self.type == MsgType.MVT_LLUUID:
return UUID()
elif self.type == MsgType.MVT_BOOL:
return False
elif self.type == MsgType.MVT_VARIABLE:
if self.probably_binary:
return b""
if self.probably_text:
return ""
return b""
elif self.type in (MsgType.MVT_LLVector3, MsgType.MVT_LLVector3d, MsgType.MVT_LLQuaternion):
return 0.0, 0.0, 0.0
elif self.type == MsgType.MVT_LLVector4:
return 0.0, 0.0, 0.0, 0.0
elif self.type == MsgType.MVT_FIXED:
return b"\x00" * self.size
elif self.type == MsgType.MVT_IP_ADDR:
return "0.0.0.0"
return None
class MessageTemplateBlock:
def __init__(self, name):

View File

@@ -27,25 +27,35 @@ from .template import MessageTemplate
from .template_parser import MessageTemplateParser
DEFAULT_PARSER = MessageTemplateParser(msg_tmpl)
class TemplateDictionary:
"""the dictionary with all known templates"""
def __init__(self, template_list=None, message_template=None):
if template_list is None:
if message_template is None:
parser = MessageTemplateParser(msg_tmpl)
parser = DEFAULT_PARSER
else:
parser = MessageTemplateParser(message_template)
template_list = parser.message_templates
self.template_list: typing.List[MessageTemplate] = template_list
self.template_list: typing.List[MessageTemplate] = []
# maps name to template
self.message_templates = {}
# maps (freq,num) to template
self.message_dict = {}
self.load_templates(template_list)
def load_templates(self, template_list):
self.template_list.clear()
self.template_list.extend(template_list)
self.message_templates.clear()
self.message_dict.clear()
self.build_dictionaries(template_list)
self.build_message_ids()
@@ -99,3 +109,6 @@ class TemplateDictionary:
def __iter__(self):
return iter(self.template_list)
DEFAULT_TEMPLATE_DICT = TemplateDictionary()

View File

@@ -26,7 +26,7 @@ from logging import getLogger
from hippolyzer.lib.base.datatypes import JankStringyBytes
from hippolyzer.lib.base.settings import Settings
from .template import MessageTemplateVariable
from .template_dict import TemplateDictionary
from .template_dict import DEFAULT_TEMPLATE_DICT
from .msgtypes import MsgType, MsgBlockType, PacketLayout
from .data_packer import TemplateDataPacker
from .message import Message, Block
@@ -62,7 +62,7 @@ def _parse_msg_num(reader: se.BufferReader):
class UDPMessageDeserializer:
DEFAULT_TEMPLATE = TemplateDictionary()
DEFAULT_TEMPLATE = DEFAULT_TEMPLATE_DICT
def __init__(self, settings=None):
self.settings = settings or Settings()

View File

@@ -26,7 +26,7 @@ from .data_packer import TemplateDataPacker
from .message import Message, MsgBlockList
from .msgtypes import MsgType, MsgBlockType
from .template import MessageTemplateVariable, MessageTemplateBlock
from .template_dict import TemplateDictionary
from .template_dict import TemplateDictionary, DEFAULT_TEMPLATE_DICT
from hippolyzer.lib.base import exc
from hippolyzer.lib.base import serialization as se
from hippolyzer.lib.base.datatypes import RawBytes
@@ -35,7 +35,7 @@ logger = getLogger('message.udpserializer')
class UDPMessageSerializer:
DEFAULT_TEMPLATE = TemplateDictionary(None)
DEFAULT_TEMPLATE = DEFAULT_TEMPLATE_DICT
def __init__(self, message_template=None):
if message_template is not None:

View File

@@ -30,6 +30,7 @@ class UDPPacket:
self.dst_addr = dst_addr
self.data = data
self.direction = direction
self.meta = {}
@property
def outgoing(self):
@@ -58,7 +59,7 @@ class AbstractUDPTransport(abc.ABC):
pass
class WrappingUDPTransport(AbstractUDPTransport):
class SocketUDPTransport(AbstractUDPTransport):
def __init__(self, transport: Union[asyncio.DatagramTransport, socket.socket]):
super().__init__()
self.transport = transport

View File

@@ -45,8 +45,8 @@ class Object(recordclass.datatuple): # type: ignore
State: Optional[int] = None
FullID: Optional[UUID] = None
CRC: Optional[int] = None
PCode: Optional[int] = None
Material: Optional[int] = None
PCode: Optional[tmpls.PCode] = None
Material: Optional[tmpls.MCode] = None
ClickAction: Optional[int] = None
Scale: Optional[Vector3] = None
ParentID: Optional[int] = None
@@ -182,14 +182,14 @@ class Object(recordclass.datatuple): # type: ignore
old_val = getattr(self, key, dataclasses.MISSING)
# Don't check equality if we're using a lazy proxy,
# parsing is deferred until we actually use it.
if isinstance(val, lazy_object_proxy.Proxy):
if any(isinstance(x, lazy_object_proxy.Proxy) for x in (old_val, val)):
# TODO: be smarter about this. Can we store the raw bytes and
# compare those if it's an unparsed object?
if old_val is not val:
updated_properties.add(key)
is_updated = old_val is not val
else:
if old_val != val:
updated_properties.add(key)
is_updated = old_val != val
if is_updated:
updated_properties.add(key)
setattr(self, key, val)
return updated_properties

View File

@@ -19,81 +19,48 @@ along with this program; if not, write to the Free Software Foundation,
Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""
from __future__ import annotations
import dataclasses
from typing import *
_T = TypeVar("_T")
class SettingDescriptor(Generic[_T]):
__slots__ = ("name", "default")
def __init__(self, default: Union[Callable[[], _T], _T]):
self.default = default
self.name: Optional[str] = None
def __set_name__(self, owner: Settings, name: str):
self.name = name
def _make_default(self) -> _T:
if callable(self.default):
return self.default()
return self.default
def __get__(self, obj: Settings, owner: Optional[Type] = None) -> _T:
val: Union[_T, dataclasses.MISSING] = obj.get_setting(self.name)
if val is dataclasses.MISSING:
val = self._make_default()
return val
def __set__(self, obj: Settings, value: _T) -> None:
obj.set_setting(self.name, value)
class Settings:
def __init__(self, quiet_logging=False, spammy_logging=False, log_tests=True):
""" some lovely configurable settings
ENABLE_DEFERRED_PACKET_PARSING: bool = SettingDescriptor(True)
These are applied application wide, and can be
overridden at any time in a specific instance
def __init__(self):
self._settings: Dict[str, Any] = {}
quiet_logging overrides spammy_logging
"""
def get_setting(self, name: str) -> Any:
return self._settings.get(name, dataclasses.MISSING)
self.quiet_logging = quiet_logging
self.spammy_logging = spammy_logging
# toggle handling udp packets
self.HANDLE_PACKETS = True
self.HANDLE_OUTGOING_PACKETS = False
# toggle parsing all/handled packets
self.ENABLE_DEFERRED_PACKET_PARSING = True
# ~~~~~~~~~~~~~~~~~~
# Logging behaviors
# ~~~~~~~~~~~~~~~~~~
# being a test tool, and an immature one at that,
# enable fine granularity in the logging, but
# make sure we can tone it down as well
self.LOG_VERBOSE = True
self.ENABLE_BYTES_TO_HEX_LOGGING = False
self.ENABLE_CAPS_LOGGING = True
self.ENABLE_CAPS_LLSD_LOGGING = False
self.ENABLE_EQ_LOGGING = True
self.ENABLE_UDP_LOGGING = True
self.ENABLE_OBJECT_LOGGING = True
self.LOG_SKIPPED_PACKETS = True
self.ENABLE_HOST_LOGGING = True
self.LOG_COROUTINE_SPAWNS = True
self.PROXY_LOGGING = False
# allow disabling logging of certain packets
self.DISABLE_SPAMMERS = True
self.UDP_SPAMMERS = ['PacketAck', 'AgentUpdate']
# toggle handling a region's event queue
self.ENABLE_REGION_EVENT_QUEUE = True
# how many seconds to wait between polling
# a region's event queue
self.REGION_EVENT_QUEUE_POLL_INTERVAL = 1
if self.spammy_logging:
self.ENABLE_BYTES_TO_HEX_LOGGING = True
self.ENABLE_CAPS_LLSD_LOGGING = True
self.DISABLE_SPAMMERS = False
# override the defaults
if self.quiet_logging:
self.LOG_VERBOSE = False
self.ENABLE_BYTES_TO_HEX_LOGGING = False
self.ENABLE_CAPS_LOGGING = False
self.ENABLE_CAPS_LLSD_LOGGING = False
self.ENABLE_EQ_LOGGING = False
self.ENABLE_UDP_LOGGING = False
self.LOG_SKIPPED_PACKETS = False
self.ENABLE_OBJECT_LOGGING = False
self.ENABLE_HOST_LOGGING = False
self.LOG_COROUTINE_SPAWNS = False
self.DISABLE_SPAMMERS = True
# ~~~~~~~~~~~~~~~~~~~~~~
# Test related settings
# ~~~~~~~~~~~~~~~~~~~~~~
if log_tests:
self.ENABLE_LOGGING_IN_TESTS = True
else:
self.ENABLE_LOGGING_IN_TESTS = False
def set_setting(self, name: str, val: Any):
self._settings[name] = val
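For readers skimming the diff above, a condensed, self-contained restatement of the descriptor-backed settings pattern, showing how attribute reads and writes route through `get_setting`/`set_setting`. Only `SettingDescriptor` and the two accessors come from the diff; the class name, field names, and defaults below are illustrative:
```python
import dataclasses
from typing import Any, Dict


class SettingDescriptor:
    """Condensed version of the descriptor introduced in the diff above."""
    def __init__(self, default):
        self.default = default
        self.name = None

    def __set_name__(self, owner, name):
        self.name = name

    def __get__(self, obj, owner=None):
        val = obj.get_setting(self.name)
        if val is dataclasses.MISSING:
            return self.default() if callable(self.default) else self.default
        return val

    def __set__(self, obj, value):
        obj.set_setting(self.name, value)


class ExampleSettings:
    HTTP_PROXY_PORT: int = SettingDescriptor(9062)
    ADDON_SCRIPTS: list = SettingDescriptor(list)  # callable default -> fresh list per read

    def __init__(self):
        self._settings: Dict[str, Any] = {}

    def get_setting(self, name: str) -> Any:
        return self._settings.get(name, dataclasses.MISSING)

    def set_setting(self, name: str, val: Any):
        self._settings[name] = val


settings = ExampleSettings()
assert settings.HTTP_PROXY_PORT == 9062   # falls back to the descriptor default
settings.HTTP_PROXY_PORT = 9999           # routed through set_setting()
assert settings.HTTP_PROXY_PORT == 9999   # a QSettings-backed subclass would persist this
```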

View File

@@ -742,6 +742,21 @@ class PCode(IntEnum):
TREE = 255
@se.enum_field_serializer("ObjectUpdate", "ObjectData", "Material")
@se.enum_field_serializer("ObjectAdd", "ObjectData", "Material")
@se.enum_field_serializer("ObjectMaterial", "ObjectData", "Material")
class MCode(IntEnum):
# Seems like this is normally stored in a U8 with the high nybble masked off?
# What's in the high nybble, anything?
STONE = 0
METAL = 1
WOOD = 3
FLESH = 4
PLASTIC = 5
RUBBER = 6
LIGHT = 7
@se.flag_field_serializer("ObjectUpdate", "ObjectData", "UpdateFlags")
@se.flag_field_serializer("ObjectUpdateCompressed", "ObjectData", "UpdateFlags")
@se.flag_field_serializer("ObjectUpdateCached", "ObjectData", "UpdateFlags")
@@ -801,7 +816,7 @@ class AttachmentStateAdapter(se.Adapter):
@se.flag_field_serializer("AgentUpdate", "AgentData", "State")
class AgentState(IntFlag):
TYPING = 1 << 3
TYPING = 1 << 2
EDITING = 1 << 4
@@ -1007,28 +1022,45 @@ class TEExceptionField(se.SerializableBase):
return dict
def _te_dataclass_field(spec: se.SERIALIZABLE_TYPE, first=False, optional=False):
return se.dataclass_field(TEExceptionField(spec, first=first, optional=optional))
def _te_field(spec: se.SERIALIZABLE_TYPE, first=False, optional=False,
default_factory=dataclasses.MISSING, default=dataclasses.MISSING):
if default_factory is not dataclasses.MISSING:
new_default_factory = lambda: {None: default_factory()}
elif default is not None:
new_default_factory = lambda: {None: default}
else:
new_default_factory = dataclasses.MISSING
return se.dataclass_field(
TEExceptionField(spec, first=first, optional=optional),
default_factory=new_default_factory,
)
_T = TypeVar("_T")
TE_FIELD_TYPE = Dict[Optional[Sequence[int]], _T]
_TE_FIELD_KEY = Optional[Sequence[int]]
@dataclasses.dataclass
class TextureEntry:
Textures: TE_FIELD_TYPE[UUID] = _te_dataclass_field(se.UUID, first=True)
Textures: Dict[_TE_FIELD_KEY, UUID] = _te_field(
# Plywood texture
se.UUID, first=True, default=UUID('89556747-24cb-43ed-920b-47caed15465f'))
# Bytes are inverted so fully opaque white is \x00\x00\x00\x00
Color: TE_FIELD_TYPE[bytes] = _te_dataclass_field(Color4(invert_bytes=True))
ScalesS: TE_FIELD_TYPE[float] = _te_dataclass_field(se.F32)
ScalesT: TE_FIELD_TYPE[float] = _te_dataclass_field(se.F32)
OffsetsS: TE_FIELD_TYPE[int] = _te_dataclass_field(se.S16)
OffsetsT: TE_FIELD_TYPE[int] = _te_dataclass_field(se.S16)
Rotation: TE_FIELD_TYPE[int] = _te_dataclass_field(se.S16)
BasicMaterials: TE_FIELD_TYPE["BasicMaterials"] = _te_dataclass_field(BUMP_SHINY_FULLBRIGHT)
MediaFlags: TE_FIELD_TYPE["MediaFlags"] = _te_dataclass_field(MEDIA_FLAGS)
Glow: TE_FIELD_TYPE[int] = _te_dataclass_field(se.U8)
Materials: TE_FIELD_TYPE[UUID] = _te_dataclass_field(se.UUID, optional=True)
Color: Dict[_TE_FIELD_KEY, bytes] = _te_field(Color4(invert_bytes=True), default=b"\xff\xff\xff\xff")
ScalesS: Dict[_TE_FIELD_KEY, float] = _te_field(se.F32, default=1.0)
ScalesT: Dict[_TE_FIELD_KEY, float] = _te_field(se.F32, default=1.0)
OffsetsS: Dict[_TE_FIELD_KEY, int] = _te_field(se.S16, default=0)
OffsetsT: Dict[_TE_FIELD_KEY, int] = _te_field(se.S16, default=0)
Rotation: Dict[_TE_FIELD_KEY, int] = _te_field(se.S16, default=0)
BasicMaterials: Dict[_TE_FIELD_KEY, "BasicMaterials"] = _te_field(
BUMP_SHINY_FULLBRIGHT, default_factory=lambda: BasicMaterials(Bump=0, FullBright=False, Shiny=0),
)
MediaFlags: Dict[_TE_FIELD_KEY, "MediaFlags"] = _te_field(
MEDIA_FLAGS,
default_factory=lambda: MediaFlags(WebPage=False, TexGen=TexGen.DEFAULT, _Unused=0),
)
Glow: Dict[_TE_FIELD_KEY, int] = _te_field(se.U8, default=0)
Materials: Dict[_TE_FIELD_KEY, UUID] = _te_field(se.UUID, optional=True, default=UUID())
TE_SERIALIZER = se.Dataclass(TextureEntry)
@@ -1326,7 +1358,7 @@ class ObjectUpdateCompressedDataSerializer(se.SimpleSubfieldSerializer):
# point if an object with parents set to an avatar.
"State": ObjectStateAdapter(se.U8),
"CRC": se.U32,
"Material": se.U8,
"Material": se.IntEnum(MCode, se.U8),
"ClickAction": se.U8,
"Scale": se.Vector3,
"Position": se.Vector3,

View File

@@ -246,7 +246,7 @@ class XferManager:
def complete_predicate(complete_msg: Message):
return complete_msg["AssetBlock"]["UUID"] == asset_id
msg = await message_handler.wait_for('AssetUploadComplete', predicate=complete_predicate)
msg = await message_handler.wait_for(('AssetUploadComplete',), predicate=complete_predicate)
if msg["AssetBlock"]["Success"] == 1:
fut.set_result(asset_id)
else:
@@ -263,7 +263,7 @@ class XferManager:
):
message_handler = self._connection_holder.message_handler
request_msg = await message_handler.wait_for(
'RequestXfer', predicate=request_predicate, timeout=5.0)
('RequestXfer',), predicate=request_predicate, timeout=5.0)
xfer.xfer_id = request_msg["XferID"]["ID"]
packet_id = 0
@@ -282,5 +282,5 @@ class XferManager:
# Don't care about the value, just want to know it was confirmed.
if wait_for_confirm:
await message_handler.wait_for(
"ConfirmXferPacket", predicate=xfer.is_our_message, timeout=5.0)
("ConfirmXferPacket",), predicate=xfer.is_our_message, timeout=5.0)
packet_id += 1

View File

@@ -39,7 +39,7 @@ class NameCache:
def create_subscriptions(
self,
message_handler: MessageHandler[Message],
message_handler: MessageHandler[Message, str],
):
message_handler.subscribe("UUIDNameReply", self._handle_uuid_name_reply)

View File

@@ -16,6 +16,7 @@ from typing import *
from hippolyzer.lib.base.datatypes import UUID, Vector3
from hippolyzer.lib.base.helpers import proxify
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.message.message_handler import MessageHandler
from hippolyzer.lib.base.objects import (
normalize_object_update,
normalize_terse_object_update,
@@ -23,6 +24,7 @@ from hippolyzer.lib.base.objects import (
normalize_object_update_compressed,
Object, handle_to_global_pos,
)
from hippolyzer.lib.base.settings import Settings
from hippolyzer.lib.client.namecache import NameCache, NameCacheEntry
from hippolyzer.lib.client.state import BaseClientSession, BaseClientRegion
from hippolyzer.lib.base.templates import PCode, ObjectStateSerializer
@@ -37,6 +39,7 @@ class UpdateType(enum.IntEnum):
PROPERTIES = enum.auto()
FAMILY = enum.auto()
COSTS = enum.auto()
KILL = enum.auto()
class ClientObjectManager:
@@ -63,11 +66,11 @@ class ClientObjectManager:
return self.state.missing_locals
def clear(self):
self.state.clear()
if self._region.handle is not None:
# We're tracked by the world object manager, tell it to untrack
# any objects that we owned
self._world_objects.clear_region_objects(self._region.handle)
self.state.clear()
def lookup_localid(self, localid: int) -> Optional[Object]:
return self.state.lookup_localid(localid)
@@ -110,12 +113,12 @@ class ClientObjectManager:
while ids_to_req:
blocks = [
Block("AgentData", AgentID=session.agent_id, SessionID=session.id),
*[Block("ObjectData", ObjectLocalID=x) for x in ids_to_req[:100]],
*[Block("ObjectData", ObjectLocalID=x) for x in ids_to_req[:255]],
]
# Selecting causes ObjectProperties to be sent
self._region.circuit.send_message(Message("ObjectSelect", blocks))
self._region.circuit.send_message(Message("ObjectDeselect", blocks))
ids_to_req = ids_to_req[100:]
ids_to_req = ids_to_req[255:]
futures = []
for local_id in local_ids:
@@ -150,9 +153,9 @@ class ClientObjectManager:
self._region.circuit.send_message(Message(
"RequestMultipleObjects",
Block("AgentData", AgentID=session.agent_id, SessionID=session.id),
*[Block("ObjectData", CacheMissType=0, ID=x) for x in ids_to_req[:100]],
*[Block("ObjectData", CacheMissType=0, ID=x) for x in ids_to_req[:255]],
))
ids_to_req = ids_to_req[100:]
ids_to_req = ids_to_req[255:]
futures = []
for local_id in local_ids:
@@ -160,15 +163,34 @@ class ClientObjectManager:
return futures
class ObjectEvent:
__slots__ = ("object", "updated", "update_type")
object: Object
updated: Set[str]
update_type: UpdateType
def __init__(self, obj: Object, updated: Set[str], update_type: UpdateType):
self.object = obj
self.updated = updated
self.update_type = update_type
@property
def name(self) -> UpdateType:
return self.update_type
class ClientWorldObjectManager:
"""Manages Objects for a session's whole world"""
def __init__(self, session: BaseClientSession, name_cache: Optional[NameCache]):
def __init__(self, session: BaseClientSession, settings: Settings, name_cache: Optional[NameCache]):
self._session: BaseClientSession = session
self._settings = settings
self.name_cache = name_cache or NameCache()
self.events: MessageHandler[ObjectEvent, UpdateType] = MessageHandler(take_by_default=False)
self._fullid_lookup: Dict[UUID, Object] = {}
self._avatars: Dict[UUID, Avatar] = {}
self._avatar_objects: Dict[UUID, Object] = {}
self._region_managers: Dict[int, ClientObjectManager] = {}
self.name_cache = name_cache or NameCache()
message_handler = self._session.message_handler
message_handler.subscribe("ObjectUpdate", self._handle_object_update)
message_handler.subscribe("ImprovedTerseObjectUpdate",
@@ -215,13 +237,12 @@ class ClientWorldObjectManager:
self._region_managers[handle] = proxify(self._session.region_by_handle(handle).objects)
def clear_region_objects(self, handle: int):
"""Signal that a region object manager is being cleared"""
region_mgr = self._region_managers.get(handle)
if region_mgr is None:
return
# Make sure they're gone from our lookup table first
for obj in region_mgr.all_objects:
del self._fullid_lookup[obj.FullID]
"""Handle signal that a region object manager was just cleared"""
# Make sure they're gone from our lookup table
for obj in tuple(self._fullid_lookup.values()):
if obj.RegionHandle == handle:
del self._fullid_lookup[obj.FullID]
self._rebuild_avatar_objects()
def _get_region_manager(self, handle: int) -> Optional[ClientObjectManager]:
return self._region_managers.get(handle)
@@ -452,15 +473,17 @@ class ClientWorldObjectManager:
missing_locals.add(block["ID"])
if region_state:
region_state.missing_locals.update(missing_locals)
self._handle_object_update_cached_misses(handle, missing_locals)
if missing_locals:
self._handle_object_update_cached_misses(handle, missing_locals)
msg.meta["ObjectUpdateIDs"] = tuple(seen_locals)
def _handle_object_update_cached_misses(self, region_handle: int, local_ids: Set[int]):
def _handle_object_update_cached_misses(self, region_handle: int, missing_locals: Set[int]):
"""Handle an ObjectUpdateCached that referenced some un-cached local IDs"""
region_mgr = self._get_region_manager(region_handle)
region_mgr.request_objects(local_ids)
region_mgr.request_objects(missing_locals)
# noinspection PyUnusedLocal
def _lookup_cache_entry(self, handle: int, local_id: int, crc: int) -> Optional[bytes]:
def _lookup_cache_entry(self, region_handle: int, local_id: int, crc: int) -> Optional[bytes]:
return None
def _handle_object_update_compressed(self, msg: Message):
@@ -546,9 +569,10 @@ class ClientWorldObjectManager:
if obj.PCode == PCode.AVATAR and "NameValue" in updated_props:
if obj.NameValue:
self.name_cache.update(obj.FullID, obj.NameValue.to_dict())
self.events.handle(ObjectEvent(obj, updated_props, update_type))
def _run_kill_object_hooks(self, obj: Object):
pass
self.events.handle(ObjectEvent(obj, set(), UpdateType.KILL))
def _rebuild_avatar_objects(self):
# Get all avatars known through coarse locations and which region the location was in
@@ -574,6 +598,9 @@ class ClientWorldObjectManager:
coarse_handle, coarse_location = coarse_pair
av.CoarseLocation = coarse_location
av.RegionHandle = coarse_handle
# If we have a real value for Z then throw away any stale guesses
if av.CoarseLocation.Z != math.inf:
av.GuessedZ = None
if av_obj:
av.Object = av_obj
av.RegionHandle = av_obj.RegionHandle
@@ -799,6 +826,7 @@ class Avatar:
# to fill in the Z axis if it's infinite
self.CoarseLocation = coarse_location
self.Valid = True
self.GuessedZ: Optional[float] = None
self._resolved_name = resolved_name
@property
@@ -814,6 +842,9 @@ class Avatar:
if self.Object and self.Object.AncestorsKnown:
return self.Object.RegionPosition
if self.CoarseLocation is not None:
if self.CoarseLocation.Z == math.inf and self.GuessedZ is not None:
coarse = self.CoarseLocation
return Vector3(coarse.X, coarse.Y, self.GuessedZ)
return self.CoarseLocation
raise ValueError(f"Avatar {self.FullID} has no known position")
@@ -833,6 +864,18 @@ class Avatar:
return None
return self._resolved_name.preferred_name
@property
def DisplayName(self) -> Optional[str]:
if not self._resolved_name:
return None
return self._resolved_name.display_name
@property
def LegacyName(self) -> Optional[str]:
if not self._resolved_name:
return None
return self._resolved_name.legacy_name
def __repr__(self):
loc_str = str(self.RegionPosition) if self.LocationType != LocationType.NONE else "?"
return f"<{self.__class__.__name__} {self.FullID} {self.Name!r} @ {loc_str}>"

View File

@@ -29,7 +29,7 @@ class BaseClientSession(abc.ABC):
id: UUID
agent_id: UUID
secure_session_id: UUID
message_handler: MessageHandler[Message]
message_handler: MessageHandler[Message, str]
regions: Sequence[BaseClientRegion]
region_by_handle: Callable[[int], Optional[BaseClientRegion]]
region_by_circuit_addr: Callable[[ADDR_TUPLE], Optional[BaseClientRegion]]

View File

@@ -181,13 +181,15 @@ class BaseAddon(abc.ABC):
def handle_region_changed(self, session: Session, region: ProxiedRegion):
pass
def handle_circuit_created(self, session: Session, region: ProxiedRegion):
pass
def handle_rlv_command(self, session: Session, region: ProxiedRegion, source: UUID,
cmd: str, options: List[str], param: str):
pass
def handle_proxied_packet(self, session_manager: SessionManager, packet: UDPPacket,
session: Optional[Session], region: Optional[ProxiedRegion],
message: Optional[Message]):
session: Optional[Session], region: Optional[ProxiedRegion]):
pass

View File

@@ -16,15 +16,15 @@ from types import ModuleType
from typing import *
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.network.transport import UDPPacket
from hippolyzer.lib.proxy import addon_ctx
from hippolyzer.lib.proxy.task_scheduler import TaskLifeScope, TaskScheduler
if TYPE_CHECKING:
from hippolyzer.lib.proxy.commands import CommandDetails, WrappedCommandCallable
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.proxy.objects import Object
from hippolyzer.lib.base.network.transport import UDPPacket
from hippolyzer.lib.proxy.object_manager import Object
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session, SessionManager
@@ -55,6 +55,10 @@ class BaseInteractionManager:
async def save_file(self, caption: str = '', directory: str = '', filter_str: str = '') -> Optional[str]:
pass
@abc.abstractmethod
async def confirm(self, title: str, caption: str) -> bool:
pass
def main_window_handle(self) -> Any:
return None
@@ -97,9 +101,14 @@ class AddonManager:
@classmethod
def shutdown(cls):
to_pop = []
for mod in cls.FRESH_ADDON_MODULES.values():
to_pop.append(mod)
cls._call_module_hooks(mod, "handle_unload", cls.SESSION_MANAGER)
cls.SCHEDULER.shutdown()
for mod in to_pop:
if isinstance(mod, ModuleType):
sys.modules.pop(mod.__name__, None)
@classmethod
def have_active_repl(cls):
@@ -169,6 +178,7 @@ class AddonManager:
old_mod = cls.FRESH_ADDON_MODULES.pop(specs[0].name, None)
if old_mod:
cls._unload_module(old_mod)
sys.modules.pop(old_mod.__name__, None)
if reload:
cls._reload_addons()
@@ -516,9 +526,14 @@ class AddonManager:
with addon_ctx.push(session, region):
return cls._call_all_addon_hooks("handle_region_changed", session, region)
@classmethod
def handle_circuit_created(cls, session: Session, region: ProxiedRegion):
with addon_ctx.push(session, region):
return cls._call_all_addon_hooks("handle_circuit_created", session, region)
@classmethod
def handle_proxied_packet(cls, session_manager: SessionManager, packet: UDPPacket,
session: Optional[Session], region: Optional[ProxiedRegion],
message: Optional[Message]):
return cls._call_all_addon_hooks("handle_proxied_packet", session_manager,
packet, session, region, message)
session: Optional[Session], region: Optional[ProxiedRegion]):
with addon_ctx.push(session, region):
return cls._call_all_addon_hooks("handle_proxied_packet", session_manager,
packet, session, region)

View File

@@ -1,20 +1,21 @@
from __future__ import annotations
import os
import re
import sys
from typing import *
from hippolyzer.lib.base.network.caps_client import CapsClient, CAPS_DICT
from hippolyzer.lib.proxy.settings import ProxySettings
if TYPE_CHECKING:
from hippolyzer.lib.proxy.region import ProxiedRegion
class ProxyCapsClient(CapsClient):
def __init__(self, region: Optional[ProxiedRegion] = None):
def __init__(self, settings: ProxySettings, region: Optional[ProxiedRegion] = None):
super().__init__(None)
self._region = region
self._settings = settings
def _get_caps(self) -> Optional[CAPS_DICT]:
if not self._region:
@@ -28,8 +29,7 @@ class ProxyCapsClient(CapsClient):
# request came from us so we can tag the request as injected. The header will be popped
# off before passing through to the server.
headers["X-Hippo-Injected"] = "1"
# TODO: Have a setting for this
proxy_port = int(os.environ.get("HIPPO_HTTP_PORT", 9062))
proxy_port = self._settings.HTTP_PROXY_PORT
proxy = f"http://127.0.0.1:{proxy_port}"
# TODO: set up the SSLContext to validate mitmproxy's cert
ssl = ssl or False

View File

@@ -48,7 +48,11 @@ class HTTPAssetRepo(collections.UserDict):
asset_id = None
for name, val in flow.request.query.items():
if name.endswith("_id"):
asset_id = UUID(val)
try:
asset_id = UUID(val)
break
except ValueError:
pass
if not asset_id or asset_id not in self.data:
return False
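A rough sketch of the lookup behavior above, with a plain dict standing in for flow.request.query (the helper name and UUID values are made up):

from typing import Optional

from hippolyzer.lib.base.datatypes import UUID


def find_asset_id(query: dict) -> Optional[UUID]:
    # Take the first "*_id" query param that parses as a UUID, skipping malformed values
    for name, val in query.items():
        if name.endswith("_id"):
            try:
                return UUID(val)
            except ValueError:
                pass
    return None


assert find_asset_id({"texture_id": "not-a-uuid"}) is None
assert find_asset_id({"item_id": "11111111-2222-3333-4444-555555555555"}) is not None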

View File

@@ -14,6 +14,7 @@ import defusedxml.xmlrpc
import mitmproxy.http
from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.llsd_msg_serializer import LLSDMessageSerializer
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
@@ -136,6 +137,27 @@ class MITMProxyEventManager:
# the proxy
self._asset_server_proxied = True
logging.warning("noproxy not used, switching to URI rewrite strategy")
elif cap_data and cap_data.cap_name == "EventQueueGet":
# HACK: The sim's EQ acking mechanism doesn't seem to actually work.
# If the client drops the connection due to timeout before we can
# proxy back the response then it will be lost forever. Keep around
# the last EQ response we got so we can re-send it if the client repeats
# its previous request.
req_ack_id = llsd.parse_xml(flow.request.content)["ack"]
eq_manager = cap_data.region().eq_manager
cached_resp = eq_manager.get_cached_poll_response(req_ack_id)
if cached_resp:
logging.warning("Had to serve a cached EventQueueGet due to client desync")
flow.response = mitmproxy.http.HTTPResponse.make(
200,
llsd.format_xml(cached_resp),
{
"Content-Type": "application/llsd+xml",
# So we can differentiate these in the log
"X-Hippo-Fake-EQ": "1",
"Connection": "close",
},
)
elif not cap_data:
if self._is_login_request(flow):
# Not strictly a Cap, but makes it easier to filter on.
@@ -181,69 +203,92 @@ class MITMProxyEventManager:
if flow.request_injected:
return
if AddonManager.handle_http_response(flow):
return
status = flow.response.status_code
cap_data: Optional[CapData] = flow.metadata["cap_data"]
if cap_data:
if status != 200:
if status == 200 and cap_data and cap_data.cap_name == "FirestormBridge":
# Fake FirestormBridge cap based on a bridge-like response coming from
# a non-browser HTTP request. Figure out what session it belongs to
# so it can be handled in the session and region HTTP MessageHandlers
agent_id_str = flow.response.headers.get("X-SecondLife-Owner-Key", "")
if not agent_id_str:
return
agent_id = UUID(agent_id_str)
for session in self.session_manager.sessions:
if session.pending:
continue
if session.agent_id == agent_id:
# Enrich the flow with the session and region info
cap_data = CapData(
cap_name="FirestormBridge",
region=weakref.ref(session.main_region),
session=weakref.ref(session),
)
flow.cap_data = cap_data
break
if cap_data.cap_name == "LoginRequest":
self._handle_login_flow(flow)
if AddonManager.handle_http_response(flow):
return
if status != 200 or not cap_data:
return
if cap_data.cap_name == "LoginRequest":
self._handle_login_flow(flow)
return
try:
session = cap_data.session and cap_data.session()
if not session:
return
try:
session = cap_data.session and cap_data.session()
if not session:
return
session.http_message_handler.handle(flow)
session.http_message_handler.handle(flow)
region = cap_data.region and cap_data.region()
region = cap_data.region and cap_data.region()
if not region:
return
region.http_message_handler.handle(flow)
if cap_data.cap_name == "Seed":
parsed = llsd.parse_xml(flow.response.content)
logging.debug("Got seed cap for %r : %r" % (cap_data, parsed))
region.update_caps(parsed)
# On LL's grid these URIs aren't unique across sessions or regions,
# so we get request attribution by replacing them with a unique
# alias URI.
logging.debug("Replacing GetMesh caps with wrapped versions")
wrappable_caps = {"GetMesh2", "GetMesh", "GetTexture", "ViewerAsset"}
for cap_name in wrappable_caps:
if cap_name in parsed:
parsed[cap_name] = region.register_wrapper_cap(cap_name)
flow.response.content = llsd.format_pretty_xml(parsed)
elif cap_data.cap_name == "EventQueueGet":
parsed_eq_resp = llsd.parse_xml(flow.response.content)
if parsed_eq_resp:
old_events = parsed_eq_resp["events"]
new_events = []
for event in old_events:
if not self._handle_eq_event(cap_data.session(), region, event):
new_events.append(event)
# Add on any fake events that've been queued by addons
eq_manager = cap_data.region().eq_manager
new_events.extend(eq_manager.take_injected_events())
parsed_eq_resp["events"] = new_events
# Empty event list is an error, need to return undef instead.
if old_events and not new_events:
parsed_eq_resp = None
# HACK: see note in above request handler for EventQueueGet
req_ack_id = llsd.parse_xml(flow.request.content)["ack"]
eq_manager.cache_last_poll_response(req_ack_id, parsed_eq_resp)
flow.response.content = llsd.format_pretty_xml(parsed_eq_resp)
elif cap_data.cap_name in self.UPLOAD_CREATING_CAPS:
if not region:
return
region.http_message_handler.handle(flow)
if cap_data.cap_name == "Seed":
parsed = llsd.parse_xml(flow.response.content)
logging.debug("Got seed cap for %r : %r" % (cap_data, parsed))
region.update_caps(parsed)
# On LL's grid these URIs aren't unique across sessions or regions,
# so we get request attribution by replacing them with a unique
# alias URI.
logging.debug("Replacing GetMesh caps with wrapped versions")
wrappable_caps = {"GetMesh2", "GetMesh", "GetTexture", "ViewerAsset"}
for cap_name in wrappable_caps:
if cap_name in parsed:
parsed[cap_name] = region.register_wrapper_cap(cap_name)
flow.response.content = llsd.format_pretty_xml(parsed)
elif cap_data.cap_name == "EventQueueGet":
parsed_eq_resp = llsd.parse_xml(flow.response.content)
if parsed_eq_resp:
old_events = parsed_eq_resp["events"]
new_events = []
for event in old_events:
if not self._handle_eq_event(cap_data.session(), region, event):
new_events.append(event)
# Add on any fake events that've been queued by addons
eq_manager = cap_data.region().eq_manager
new_events.extend(eq_manager.take_events())
parsed_eq_resp["events"] = new_events
if old_events and not new_events:
# Need at least one event or the viewer will refuse to ack!
new_events.append({"message": "NOP", "body": {}})
flow.response.content = llsd.format_pretty_xml(parsed_eq_resp)
elif cap_data.cap_name in self.UPLOAD_CREATING_CAPS:
if not region:
return
parsed = llsd.parse_xml(flow.response.content)
if "uploader" in parsed:
region.register_temporary_cap(cap_data.cap_name + "Uploader", parsed["uploader"])
except:
logging.exception("OOPS, blew up in HTTP proxy!")
parsed = llsd.parse_xml(flow.response.content)
if "uploader" in parsed:
region.register_temporary_cap(cap_data.cap_name + "Uploader", parsed["uploader"])
except:
logging.exception("OOPS, blew up in HTTP proxy!")
def _handle_login_flow(self, flow: HippoHTTPFlow):
resp = xmlrpc.client.loads(flow.response.content)[0][0] # type: ignore
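A compact standalone sketch of the re-send behavior described in the EventQueueGet HACK above; it mirrors the cache_last_poll_response()/get_cached_poll_response() helpers added to EventQueueManager later in this changeset (the class name here is illustrative):

from typing import Any, Optional


class EQPollCache:
    """Remember the last poll response, keyed by the "ack" id from the request body."""
    def __init__(self) -> None:
        self._last_ack: Optional[int] = None
        self._last_payload: Optional[Any] = None

    def store(self, req_ack: Optional[int], payload: Any) -> None:
        self._last_ack = req_ack
        self._last_payload = payload

    def lookup(self, req_ack: Optional[int]) -> Optional[Any]:
        # Only a repeat poll with the same ack gets the cached body back
        return self._last_payload if req_ack == self._last_ack else None


cache = EQPollCache()
cache.store(41, {"id": 42, "events": [{"message": "NOP", "body": {}}]})
assert cache.lookup(41) is not None  # client retried the same poll: re-serve it
assert cache.lookup(42) is None      # a new ack means nothing is cached for it yet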

View File

@@ -82,6 +82,7 @@ class HTTPFlowContext:
self.from_proxy_queue = multiprocessing.Queue()
self.to_proxy_queue = multiprocessing.Queue()
self.shutdown_signal = multiprocessing.Event()
self.mitmproxy_ready = multiprocessing.Event()
class IPCInterceptionAddon:

View File

@@ -5,7 +5,6 @@ from typing import Optional, Tuple
from hippolyzer.lib.base.message.message_dot_xml import MessageDotXML
from hippolyzer.lib.base.message.udpdeserializer import UDPMessageDeserializer
from hippolyzer.lib.base.message.udpserializer import UDPMessageSerializer
from hippolyzer.lib.base.settings import Settings
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.base.network.transport import UDPPacket
from hippolyzer.lib.base.message.message import Message
@@ -26,17 +25,16 @@ class SLSOCKS5Server(SOCKS5Server):
return lambda: InterceptingLLUDPProxyProtocol(source_addr, self.session_manager)
class BaseLLUDPProxyProtocol(UDPProxyProtocol):
def __init__(self, source_addr: Tuple[str, int]):
class InterceptingLLUDPProxyProtocol(UDPProxyProtocol):
def __init__(self, source_addr: Tuple[str, int], session_manager: SessionManager):
super().__init__(source_addr)
self.settings = Settings()
self.settings.ENABLE_DEFERRED_PACKET_PARSING = True
self.settings.HANDLE_PACKETS = False
self.session_manager: SessionManager = session_manager
self.serializer = UDPMessageSerializer()
self.deserializer = UDPMessageDeserializer(
settings=self.settings,
settings=self.session_manager.settings,
)
self.message_xml = MessageDotXML()
self.session: Optional[Session] = None
def _ensure_message_allowed(self, msg: Message):
if not self.message_xml.validate_udp_msg(msg.name):
@@ -45,37 +43,22 @@ class BaseLLUDPProxyProtocol(UDPProxyProtocol):
)
raise PermissionError(f"UDPBanned message {msg.name}")
class InterceptingLLUDPProxyProtocol(BaseLLUDPProxyProtocol):
def __init__(self, source_addr: Tuple[str, int], session_manager: SessionManager):
super().__init__(source_addr)
self.session_manager: SessionManager = session_manager
self.session: Optional[Session] = None
def _handle_proxied_packet(self, packet: UDPPacket):
message: Optional[Message] = None
def handle_proxied_packet(self, packet: UDPPacket):
region: Optional[ProxiedRegion] = None
# Try to do an initial region lookup so we have it for handle_proxied_packet()
if self.session:
region = self.session.region_by_circuit_addr(packet.far_addr)
deserialize_exc = None
try:
message = self.deserializer.deserialize(packet.data)
message.direction = packet.direction
message.sender = packet.src_addr
except Exception as e:
# Hang onto this since handle_proxied_packet doesn't need a parseable
# message. If that hook doesn't handle the packet then re-raise.
deserialize_exc = e
# The proxied packet handler is allowed to mutate `packet.data` before
# the message gets parsed.
if AddonManager.handle_proxied_packet(self.session_manager, packet,
self.session, region, message):
# Swallow any error raised by above message deserialization, it was handled.
self.session, region):
return
if deserialize_exc is not None:
# handle_proxied_packet() didn't deal with the error, so it's fatal.
raise deserialize_exc
message = self.deserializer.deserialize(packet.data)
message.direction = packet.direction
message.sender = packet.src_addr
message.meta.update(packet.meta)
assert message is not None
# Check for UDP bans on inbound messages
@@ -125,7 +108,7 @@ class InterceptingLLUDPProxyProtocol(BaseLLUDPProxyProtocol):
if message.name == "RegionHandshake":
region.cache_id = message["RegionInfo"]["CacheID"]
self.session.objects.track_region_objects(region.handle)
if self.session_manager.use_viewer_object_cache:
if self.session_manager.settings.USE_VIEWER_OBJECT_CACHE:
try:
region.objects.load_cache()
except:

View File

@@ -17,8 +17,8 @@ if TYPE_CHECKING:
class ProxyNameCache(NameCache):
def create_subscriptions(
self,
message_handler: MessageHandler[Message],
http_message_handler: Optional[MessageHandler[HippoHTTPFlow]] = None,
message_handler: MessageHandler[Message, str],
http_message_handler: Optional[MessageHandler[HippoHTTPFlow, str]] = None,
):
super().create_subscriptions(message_handler)
if http_message_handler is not None:
@@ -32,6 +32,9 @@ class ProxyNameCache(NameCache):
with open(namecache_file, "rb") as f:
namecache_bytes = f.read()
agents = llsd.parse_xml(namecache_bytes)["agents"]
# Can be `None` if the file was just created
if not agents:
continue
for agent_id, agent_data in agents.items():
# Don't set display name if they just have the default
display_name = None

View File

@@ -1,9 +1,13 @@
from __future__ import annotations
import asyncio
import logging
from typing import *
from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.templates import PCode
from hippolyzer.lib.client.namecache import NameCache
from hippolyzer.lib.client.object_manager import (
ClientObjectManager,
@@ -13,6 +17,7 @@ from hippolyzer.lib.client.object_manager import (
from hippolyzer.lib.base.objects import Object
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.settings import ProxySettings
from hippolyzer.lib.proxy.vocache import RegionViewerObjectCacheChain
if TYPE_CHECKING:
@@ -31,15 +36,21 @@ class ProxyObjectManager(ClientObjectManager):
def __init__(
self,
region: ProxiedRegion,
use_vo_cache: bool = False
may_use_vo_cache: bool = False
):
super().__init__(region)
self.use_vo_cache = use_vo_cache
self.may_use_vo_cache = may_use_vo_cache
self.cache_loaded = False
self.object_cache = RegionViewerObjectCacheChain([])
self._cache_miss_timer: Optional[asyncio.TimerHandle] = None
self.queued_cache_misses: Set[int] = set()
region.message_handler.subscribe(
"RequestMultipleObjects",
self._handle_request_multiple_objects,
)
def load_cache(self):
if not self.use_vo_cache or self.cache_loaded:
if not self.may_use_vo_cache or self.cache_loaded:
return
handle = self._region.handle
if not handle:
@@ -48,33 +59,75 @@ class ProxyObjectManager(ClientObjectManager):
self.cache_loaded = True
self.object_cache = RegionViewerObjectCacheChain.for_region(handle, self._region.cache_id)
def request_missed_cached_objects_soon(self):
if self._cache_miss_timer:
self._cache_miss_timer.cancel()
# Basically debounce. Will only trigger 0.2 seconds after the last time it's invoked to
# deal with the initial flood of ObjectUpdateCached and the natural lag time between that
# and the viewer's RequestMultipleObjects messages
self._cache_miss_timer = asyncio.get_event_loop().call_later(
0.2, self._request_missed_cached_objects)
def _request_missed_cached_objects(self):
self._cache_miss_timer = None
self.request_objects(self.queued_cache_misses)
self.queued_cache_misses.clear()
def clear(self):
super().clear()
self.object_cache = RegionViewerObjectCacheChain([])
self.cache_loaded = False
self.queued_cache_misses.clear()
if self._cache_miss_timer:
self._cache_miss_timer.cancel()
self._cache_miss_timer = None
def _is_localid_selected(self, localid: int):
return localid in self._region.session().selected.object_locals
def _handle_request_multiple_objects(self, msg: Message):
# Remove any queued cache misses that the viewer just requested for itself
self.queued_cache_misses -= {b["ID"] for b in msg["ObjectData"]}
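The request_missed_cached_objects_soon() debounce above boils down to this standalone sketch (the class and callback names are illustrative; only asyncio's call_later is assumed):

import asyncio
from typing import Callable, Optional, Set


class CacheMissDebouncer:
    """Collect local IDs and flush them 0.2s after the *last* add, not the first."""
    def __init__(self, flush_cb: Callable[[Set[int]], None]) -> None:
        self._flush_cb = flush_cb
        self._timer: Optional[asyncio.TimerHandle] = None
        self.pending: Set[int] = set()

    def add(self, local_ids: Set[int]) -> None:
        self.pending |= local_ids
        if self._timer:
            # Re-arm: a new batch arrived before the previous timer fired
            self._timer.cancel()
        self._timer = asyncio.get_event_loop().call_later(0.2, self._flush)

    def _flush(self) -> None:
        self._timer = None
        self._flush_cb(self.pending)
        self.pending = set()

Each add() pushes the flush another 0.2 seconds out, so the initial flood of ObjectUpdateCached misses collapses into a single request once the viewer's own RequestMultipleObjects traffic has had a chance to drain the set.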
class ProxyWorldObjectManager(ClientWorldObjectManager):
_session: Session
_settings: ProxySettings
def __init__(self, session: Session, name_cache: Optional[NameCache]):
super().__init__(session, name_cache)
def __init__(self, session: Session, settings: ProxySettings, name_cache: Optional[NameCache]):
super().__init__(session, settings, name_cache)
session.http_message_handler.subscribe(
"GetObjectCost",
self._handle_get_object_cost
)
session.http_message_handler.subscribe(
"FirestormBridge",
self._handle_firestorm_bridge_request,
)
def _handle_object_update_cached_misses(self, region_handle: int, local_ids: Set[int]):
# Don't do anything automatically. People have to manually ask for
# missed objects to be fetched.
pass
def _handle_object_update_cached_misses(self, region_handle: int, missing_locals: Set[int]):
if self._settings.AUTOMATICALLY_REQUEST_MISSING_OBJECTS:
# Schedule these local IDs to be requested soon if the viewer doesn't request
# them itself. Ideally we could just mutate the CRC of the ObjectUpdateCached
# to force a CRC cache miss in the viewer, but that appears to cause the viewer
# to drop the resulting ObjectUpdateCompressed when the CRC doesn't match?
# It was causing all objects to go missing even though the ObjectUpdateCompressed
# was received.
region_mgr: Optional[ProxyObjectManager] = self._get_region_manager(region_handle)
region_mgr.queued_cache_misses |= missing_locals
region_mgr.request_missed_cached_objects_soon()
def _run_object_update_hooks(self, obj: Object, updated_props: Set[str], update_type: UpdateType):
super()._run_object_update_hooks(obj, updated_props, update_type)
region = self._session.region_by_handle(obj.RegionHandle)
if obj.PCode == PCode.AVATAR and "ParentID" in updated_props:
if obj.ParentID and not region.objects.lookup_localid(obj.ParentID):
# If an avatar just sat on an object we don't know about, add it to the queued
# cache misses and request it if the viewer doesn't. This should happen
# regardless of the auto-request object setting because otherwise we have no way
# to get a sitting agent's true region location, even if it's our own avatar.
region.objects.queued_cache_misses.add(obj.ParentID)
region.objects.request_missed_cached_objects_soon()
AddonManager.handle_object_updated(self._session, region, obj, updated_props)
def _run_kill_object_hooks(self, obj: Object):
@@ -82,10 +135,34 @@ class ProxyWorldObjectManager(ClientWorldObjectManager):
region = self._session.region_by_handle(obj.RegionHandle)
AddonManager.handle_object_killed(self._session, region, obj)
def _lookup_cache_entry(self, handle: int, local_id: int, crc: int) -> Optional[bytes]:
region_mgr: Optional[ProxyObjectManager] = self._get_region_manager(handle)
def _lookup_cache_entry(self, region_handle: int, local_id: int, crc: int) -> Optional[bytes]:
region_mgr: Optional[ProxyObjectManager] = self._get_region_manager(region_handle)
return region_mgr.object_cache.lookup_object_data(local_id, crc)
def _handle_get_object_cost(self, flow: HippoHTTPFlow):
parsed = llsd.parse_xml(flow.response.content)
self._process_get_object_cost_response(parsed)
def _handle_firestorm_bridge_request(self, flow: HippoHTTPFlow):
"""
Pull guessed avatar Z offsets from Firestorm Bridge requests
CoarseLocationUpdate packets can only represent heights up to 1024, so
viewers typically use an LSL bridge to get avatar heights beyond that range
and combine it with their X and Y coords from CoarseLocationUpdate packets.
"""
if not flow.request.content.startswith(b'<llsd><string>getZOffsets|'):
return
parsed: str = llsd.parse_xml(flow.response.content)
if not parsed:
return
# av_1_id, 1025.001, av_2_id, 3000.0, ...
split = parsed.split(", ")
for av_id, z_offset in zip(split[0::2], split[1::2]):
av_id = UUID(av_id)
z_offset = float(z_offset)
av = self.lookup_avatar(av_id)
if not av:
continue
av.GuessedZ = z_offset

View File

@@ -49,7 +49,7 @@ class CapsMultiDict(multidict.MultiDict[Tuple[CapType, str]]):
class ProxiedRegion(BaseClientRegion):
def __init__(self, circuit_addr, seed_cap: str, session, handle=None):
def __init__(self, circuit_addr, seed_cap: str, session: Session, handle=None):
# A client may make a Seed request twice, and may get back two (valid!) sets of
# Cap URIs. We need to be able to look up both, so MultiDict is necessary.
self.handle: Optional[int] = handle
@@ -63,11 +63,12 @@ class ProxiedRegion(BaseClientRegion):
if seed_cap:
self._caps["Seed"] = (CapType.NORMAL, seed_cap)
self.session: Callable[[], Session] = weakref.ref(session)
self.message_handler: MessageHandler[Message] = MessageHandler()
self.http_message_handler: MessageHandler[HippoHTTPFlow] = MessageHandler()
self.message_handler: MessageHandler[Message, str] = MessageHandler()
self.http_message_handler: MessageHandler[HippoHTTPFlow, str] = MessageHandler()
self.eq_manager = EventQueueManager(self)
self.caps_client = ProxyCapsClient(proxify(self))
self.objects: ProxyObjectManager = ProxyObjectManager(self, use_vo_cache=True)
settings = session.session_manager.settings
self.caps_client = ProxyCapsClient(settings, proxify(self))
self.objects: ProxyObjectManager = ProxyObjectManager(self, may_use_vo_cache=True)
self.xfer_manager = XferManager(proxify(self), self.session().secure_session_id)
self.transfer_manager = TransferManager(proxify(self), session.agent_id, session.id)
self._recalc_caps()
@@ -120,7 +121,7 @@ class ProxiedRegion(BaseClientRegion):
parsed = list(urllib.parse.urlsplit(self._caps[name][1]))
seed_id = self._caps["Seed"][1].split("/")[-1].encode("utf8")
# Give it a unique domain tied to the current Seed URI
parsed[1] = f"{name}-{hashlib.sha256(seed_id).hexdigest()[:16]}.hippo-proxy.localhost"
parsed[1] = f"{name.lower()}-{hashlib.sha256(seed_id).hexdigest()[:16]}.hippo-proxy.localhost"
# Force the URL to HTTP; we're going to handle the request ourselves, so it doesn't need
# to be secure. This should save on expensive TLS context setup for each req.
parsed[0] = "http"
@@ -161,6 +162,7 @@ class ProxiedRegion(BaseClientRegion):
if self.circuit:
self.circuit.is_alive = False
self.objects.clear()
self.eq_manager.clear()
def __repr__(self):
return "<%s %s>" % (self.__class__.__name__, self.name)
@@ -171,11 +173,27 @@ class EventQueueManager:
# TODO: Per-EQ InjectionTracker so we can inject fake responses on 499
self._queued_events = []
self._region = weakref.proxy(region)
self._last_ack: Optional[int] = None
self._last_payload: Optional[Any] = None
def queue_event(self, event: dict):
def inject_event(self, event: dict):
self._queued_events.append(event)
def take_events(self):
def take_injected_events(self):
events = self._queued_events
self._queued_events = []
return events
def cache_last_poll_response(self, req_ack: int, payload: Any):
self._last_ack = req_ack
self._last_payload = payload
def get_cached_poll_response(self, req_ack: Optional[int]) -> Optional[Any]:
if self._last_ack == req_ack:
return self._last_payload
return None
def clear(self):
self._queued_events.clear()
self._last_ack = None
self._last_payload = None

View File

@@ -13,12 +13,14 @@ from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.message.message_handler import MessageHandler
from hippolyzer.lib.client.state import BaseClientSession
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.circuit import ProxiedCircuit
from hippolyzer.lib.proxy.http_asset_repo import HTTPAssetRepo
from hippolyzer.lib.proxy.http_proxy import HTTPFlowContext, is_asset_server_cap_name, SerializedCapData
from hippolyzer.lib.proxy.namecache import ProxyNameCache
from hippolyzer.lib.proxy.object_manager import ProxyWorldObjectManager
from hippolyzer.lib.proxy.region import ProxiedRegion, CapType
from hippolyzer.lib.proxy.settings import ProxySettings
if TYPE_CHECKING:
from hippolyzer.lib.proxy.message_logger import BaseMessageLogger
@@ -27,7 +29,7 @@ if TYPE_CHECKING:
class Session(BaseClientSession):
def __init__(self, session_id, secure_session_id, agent_id, circuit_code,
login_data=None, session_manager: Optional[SessionManager] = None):
session_manager: Optional[SessionManager], login_data=None):
self.login_data = login_data or {}
self.pending = True
self.id: UUID = session_id
@@ -41,9 +43,9 @@ class Session(BaseClientSession):
self.selected: SelectionModel = SelectionModel()
self.regions: List[ProxiedRegion] = []
self.started_at = datetime.datetime.now()
self.message_handler: MessageHandler[Message] = MessageHandler()
self.http_message_handler: MessageHandler[HippoHTTPFlow] = MessageHandler()
self.objects = ProxyWorldObjectManager(self, session_manager.name_cache)
self.message_handler: MessageHandler[Message, str] = MessageHandler()
self.http_message_handler: MessageHandler[HippoHTTPFlow, str] = MessageHandler()
self.objects = ProxyWorldObjectManager(self, session_manager.settings, session_manager.name_cache)
self._main_region = None
@property
@@ -59,8 +61,8 @@ class Session(BaseClientSession):
secure_session_id=UUID(login_data["secure_session_id"]),
agent_id=UUID(login_data["agent_id"]),
circuit_code=int(login_data["circuit_code"]),
login_data=login_data,
session_manager=session_manager,
login_data=login_data,
)
appearance_service = login_data.get("agent_appearance_service")
map_image_service = login_data.get("map-server-url")
@@ -135,6 +137,7 @@ class Session(BaseClientSession):
)
region.circuit = ProxiedCircuit(
near_addr, circuit_addr, transport, logging_hook=logging_hook)
AddonManager.handle_circuit_created(self, region)
return True
if region.circuit and region.circuit.is_alive:
# Whatever, already open
@@ -160,7 +163,7 @@ class Session(BaseClientSession):
return CapData(cap_name, ref(region), ref(self), base_url, cap_type)
return None
def tid_to_assetid(self, transaction_id: UUID):
def transaction_to_assetid(self, transaction_id: UUID):
return UUID.combine(transaction_id, self.secure_session_id)
def __repr__(self):
@@ -168,7 +171,8 @@ class Session(BaseClientSession):
class SessionManager:
def __init__(self):
def __init__(self, settings: ProxySettings):
self.settings: ProxySettings = settings
self.sessions: List[Session] = []
self.shutdown_signal = multiprocessing.Event()
self.flow_context = HTTPFlowContext()
@@ -176,7 +180,6 @@ class SessionManager:
self.message_logger: Optional[BaseMessageLogger] = None
self.addon_ctx: Dict[str, Any] = {}
self.name_cache = ProxyNameCache()
self.use_viewer_object_cache: bool = False
def create_session(self, login_data) -> Session:
session = Session.from_login_data(login_data, self)

View File

@@ -0,0 +1,33 @@
import os
from typing import *
from hippolyzer.lib.base.settings import Settings, SettingDescriptor
_T = TypeVar("_T")
class EnvSettingDescriptor(SettingDescriptor):
"""A setting that prefers to pull its value from the environment"""
__slots__ = ("_env_name", "_env_callable")
def __init__(self, default: Union[Callable[[], _T], _T], env_name: str, spec: Callable[[str], _T]):
super().__init__(default)
self._env_name = env_name
self._env_callable = spec
def __get__(self, obj, owner=None) -> _T:
val = os.getenv(self._env_name)
if val is not None:
return self._env_callable(val)
return super().__get__(obj, owner)
class ProxySettings(Settings):
SOCKS_PROXY_PORT: int = EnvSettingDescriptor(9061, "HIPPO_UDP_PORT", int)
HTTP_PROXY_PORT: int = EnvSettingDescriptor(9062, "HIPPO_HTTP_PORT", int)
PROXY_BIND_ADDR: str = EnvSettingDescriptor("127.0.0.1", "HIPPO_BIND_HOST", str)
REMOTELY_ACCESSIBLE: bool = SettingDescriptor(False)
USE_VIEWER_OBJECT_CACHE: bool = SettingDescriptor(False)
AUTOMATICALLY_REQUEST_MISSING_OBJECTS: bool = SettingDescriptor(False)
ADDON_SCRIPTS: List[str] = SettingDescriptor(list)
FILTERS: Dict[str, str] = SettingDescriptor(dict)
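A quick usage sketch of how the descriptor above resolves values, assuming HIPPO_HTTP_PORT starts out unset in the environment:

import os

from hippolyzer.lib.proxy.settings import ProxySettings

settings = ProxySettings()
print(settings.HTTP_PROXY_PORT)   # 9062, the declared default

os.environ["HIPPO_HTTP_PORT"] = "9900"
print(settings.HTTP_PROXY_PORT)   # 9900, coerced to int; the environment wins when set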

View File

@@ -207,12 +207,12 @@ class UDPProxyProtocol(asyncio.DatagramProtocol):
)
try:
self._handle_proxied_packet(src_packet)
self.handle_proxied_packet(src_packet)
except:
logging.exception("Barfed while handling UDP packet!")
raise
def _handle_proxied_packet(self, packet):
def handle_proxied_packet(self, packet):
self.transport.send_packet(packet)
def close(self):

View File

@@ -0,0 +1,80 @@
import asyncio
import unittest
from typing import Any, Optional, List, Tuple
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.message.udpserializer import UDPMessageSerializer
from hippolyzer.lib.base.network.transport import UDPPacket, AbstractUDPTransport, ADDR_TUPLE
from hippolyzer.lib.proxy.lludp_proxy import InterceptingLLUDPProxyProtocol
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import SessionManager
from hippolyzer.lib.proxy.settings import ProxySettings
from hippolyzer.lib.proxy.transport import SOCKS5UDPTransport
class BaseProxyTest(unittest.IsolatedAsyncioTestCase):
def setUp(self) -> None:
self.client_addr = ("127.0.0.1", 1)
self.region_addr = ("127.0.0.1", 3)
self.circuit_code = 1234
self.session_manager = SessionManager(ProxySettings())
self.session = self.session_manager.create_session({
"session_id": UUID.random(),
"secure_session_id": UUID.random(),
"agent_id": UUID.random(),
"circuit_code": self.circuit_code,
"sim_ip": self.region_addr[0],
"sim_port": self.region_addr[1],
"region_x": 0,
"region_y": 123,
"seed_capability": "https://test.localhost:4/foo",
})
self.transport = MockTransport()
self.protocol = InterceptingLLUDPProxyProtocol(
self.client_addr, self.session_manager)
self.protocol.transport = self.transport
self.serializer = UDPMessageSerializer()
self.session.objects.track_region_objects(123)
async def _wait_drained(self):
await asyncio.sleep(0.001)
def _setup_default_circuit(self):
self._setup_region_circuit(self.session.regions[-1])
self.session.main_region = self.session.regions[-1]
def _setup_region_circuit(self, region: ProxiedRegion):
# Not going to send a UseCircuitCode, so have to pretend we already did the
# client -> region NAT hole-punching
self.protocol.session = self.session
self.protocol.far_to_near_map[region.circuit_addr] = self.client_addr
self.session_manager.claim_session(self.session.id)
self.session.open_circuit(self.client_addr, region.circuit_addr,
self.protocol.transport)
def _msg_to_packet(self, msg: Message, src, dst) -> UDPPacket:
return UDPPacket(src_addr=src, dst_addr=dst, data=self.serializer.serialize(msg),
direction=msg.direction)
def _msg_to_datagram(self, msg: Message, src, dst, socks_header=True):
packet = self._msg_to_packet(msg, src, dst)
return SOCKS5UDPTransport.serialize(packet, force_socks_header=socks_header)
class MockTransport(AbstractUDPTransport):
def sendto(self, data: Any, addr: Optional[ADDR_TUPLE] = ...) -> None:
pass
def abort(self) -> None:
pass
def close(self) -> None:
pass
def __init__(self):
super().__init__()
self.packets: List[Tuple[bytes, Tuple[str, int]]] = []
def send_packet(self, packet: UDPPacket) -> None:
self.packets.append((packet.data, packet.dst_addr))

View File

@@ -1,10 +1,10 @@
import socket
import struct
from hippolyzer.lib.base.network.transport import WrappingUDPTransport, UDPPacket
from hippolyzer.lib.base.network.transport import SocketUDPTransport, UDPPacket
class SOCKS5UDPTransport(WrappingUDPTransport):
class SOCKS5UDPTransport(SocketUDPTransport):
HEADER_STRUCT = struct.Struct("!HBB4sH")
@classmethod
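For reference, HEADER_STRUCT above matches the RFC 1928 SOCKS5 UDP request header for IPv4 destinations (RSV, FRAG, ATYP, DST.ADDR, DST.PORT); a small packing sketch with a made-up destination:

import socket
import struct

SOCKS5_UDP_HEADER = struct.Struct("!HBB4sH")  # RSV, FRAG, ATYP, DST.ADDR (IPv4), DST.PORT


def make_udp_header(dst_host: str, dst_port: int) -> bytes:
    # RSV must be 0, FRAG is 0 for unfragmented datagrams, ATYP 0x01 means IPv4
    return SOCKS5_UDP_HEADER.pack(0, 0, 0x01, socket.inet_aton(dst_host), dst_port)


assert len(make_udp_header("127.0.0.1", 13000)) == SOCKS5_UDP_HEADER.size  # 10 bytes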

View File

@@ -212,7 +212,9 @@ class RegionViewerObjectCacheChain:
for cache_dir in iter_viewer_cache_dirs():
if not (cache_dir / "objectcache" / "object.cache").exists():
continue
caches.append(ViewerObjectCache.from_path(cache_dir / "objectcache"))
cache = ViewerObjectCache.from_path(cache_dir / "objectcache")
if cache:
caches.append(cache)
regions = []
for cache in caches:
region = cache.read_region(handle)

View File

@@ -9,4 +9,4 @@ universal = 1
[flake8]
max-line-length = 160
exclude = build/*, .eggs/*
ignore = F405, F403, E501, F841, E722, W503, E741
ignore = F405, F403, E501, F841, E722, W503, E741, E731

View File

@@ -25,7 +25,7 @@ from setuptools import setup, find_packages
here = path.abspath(path.dirname(__file__))
version = '0.6.0'
version = '0.6.3'
with open(path.join(here, 'README.md')) as readme_fh:
readme = readme_fh.read()

View File

@@ -82,6 +82,7 @@ class FinalizeCXFreezeCommand(Command):
pass
for to_copy in COPY_TO_ZIP:
shutil.copy(BASE_DIR / to_copy, path / to_copy)
shutil.copytree(BASE_DIR / "addon_examples", path / "addon_examples")
zip_path = BASE_DIR / "dist" / path.name
shutil.make_archive(zip_path, "zip", path)
@@ -111,7 +112,7 @@ executables = [
setup(
name="hippolyzer_gui",
version="0.6.0",
version="0.6.3",
description="Hippolyzer GUI",
options=options,
executables=executables,

tests/base/test_jp2.py (new file, 40 lines)
View File

@@ -0,0 +1,40 @@
import os.path
import unittest
import glymur
from glymur.codestream import CMEsegment
from hippolyzer.lib.base.jp2_utils import BufferedJp2k
BASE_PATH = os.path.dirname(os.path.abspath(__file__))
@unittest.skipIf(glymur.jp2k.opj2.OPENJP2 is None, "OpenJPEG library missing")
class TestJP2Utils(unittest.TestCase):
@classmethod
def setUpClass(cls) -> None:
with open(os.path.join(BASE_PATH, "test_resources", "plywood.j2c"), "rb") as f:
cls.j2c_bytes = f.read()
def test_load_j2c(self):
j = BufferedJp2k(contents=self.j2c_bytes)
j.parse()
# Last segment in the header is the comment section
com: CMEsegment = j.codestream.segment[-1]
self.assertEqual("CME", com.marker_id)
# In this case the comment is the encoder version
self.assertEqual(b'Kakadu-3.0.3', com.ccme)
def test_read_j2c_data(self):
j = BufferedJp2k(self.j2c_bytes)
pixels = j[::]
self.assertEqual((512, 512, 3), pixels.shape)
def test_save_j2c_data(self):
j = BufferedJp2k(self.j2c_bytes)
pixels = j[::]
j[::] = pixels
new_j2c_bytes = bytes(j)
self.assertNotEqual(self.j2c_bytes, new_j2c_bytes)
# Glymur will have replaced the CME section with its own
self.assertIn(b"Created by OpenJPEG", new_j2c_bytes)

View File

@@ -46,7 +46,6 @@ class TestMessage(unittest.TestCase):
self.serial = UDPMessageSerializer()
settings = Settings()
settings.ENABLE_DEFERRED_PACKET_PARSING = True
settings.HANDLE_PACKETS = False
self.deserial = UDPMessageDeserializer(settings=settings)
def test_block(self):
@@ -170,7 +169,7 @@ class TestMessage(unittest.TestCase):
class TestMessageHandlers(unittest.IsolatedAsyncioTestCase):
def setUp(self) -> None:
self.message_handler: MessageHandler[Message] = MessageHandler()
self.message_handler: MessageHandler[Message, str] = MessageHandler()
def _fake_received_message(self, msg: Message):
self.message_handler.handle(msg)
@@ -204,7 +203,7 @@ class TestMessageHandlers(unittest.IsolatedAsyncioTestCase):
self.assertEqual(len(foo_handlers), 0)
async def test_subscription_no_take(self):
with self.message_handler.subscribe_async("Foo", take=False) as get_msg:
with self.message_handler.subscribe_async(("Foo",), take=False) as get_msg:
msg = Message("Foo", Block("Bar", Baz=1, Biz=1))
self._fake_received_message(msg)
# Should not copy
@@ -213,7 +212,7 @@ class TestMessageHandlers(unittest.IsolatedAsyncioTestCase):
self.assertFalse(msg.queued)
async def test_wait_for(self):
fut = self.message_handler.wait_for("Foo", timeout=0.001, take=False)
fut = self.message_handler.wait_for(("Foo",), timeout=0.001, take=False)
foo_handlers = self.message_handler.handlers['Foo']
# We are subscribed
self.assertEqual(len(foo_handlers), 1)
@@ -227,7 +226,7 @@ class TestMessageHandlers(unittest.IsolatedAsyncioTestCase):
self.assertEqual(len(foo_handlers), 0)
async def test_wait_for_take(self):
fut = self.message_handler.wait_for("Foo", timeout=0.001)
fut = self.message_handler.wait_for(("Foo",), timeout=0.001)
foo_handlers = self.message_handler.handlers['Foo']
# We are subscribed
self.assertEqual(len(foo_handlers), 1)

Binary file not shown.

View File

@@ -32,32 +32,6 @@ class TestEvents(unittest.TestCase):
def test_base_settings(self):
settings = Settings()
self.assertEqual(settings.quiet_logging, False)
self.assertEqual(settings.HANDLE_PACKETS, True)
self.assertEqual(settings.LOG_VERBOSE, True)
self.assertEqual(settings.ENABLE_BYTES_TO_HEX_LOGGING, False)
self.assertEqual(settings.ENABLE_CAPS_LOGGING, True)
self.assertEqual(settings.ENABLE_CAPS_LLSD_LOGGING, False)
self.assertEqual(settings.ENABLE_EQ_LOGGING, True)
self.assertEqual(settings.ENABLE_UDP_LOGGING, True)
self.assertEqual(settings.ENABLE_OBJECT_LOGGING, True)
self.assertEqual(settings.LOG_SKIPPED_PACKETS, True)
self.assertEqual(settings.ENABLE_HOST_LOGGING, True)
self.assertEqual(settings.LOG_COROUTINE_SPAWNS, True)
self.assertEqual(settings.DISABLE_SPAMMERS, True)
self.assertEqual(settings.UDP_SPAMMERS, ['PacketAck', 'AgentUpdate'])
def test_quiet_settings(self):
settings = Settings(True)
self.assertEqual(settings.quiet_logging, True)
self.assertEqual(settings.HANDLE_PACKETS, True)
self.assertEqual(settings.LOG_VERBOSE, False)
self.assertEqual(settings.ENABLE_BYTES_TO_HEX_LOGGING, False)
self.assertEqual(settings.ENABLE_CAPS_LOGGING, False)
self.assertEqual(settings.ENABLE_CAPS_LLSD_LOGGING, False)
self.assertEqual(settings.ENABLE_EQ_LOGGING, False)
self.assertEqual(settings.ENABLE_UDP_LOGGING, False)
self.assertEqual(settings.ENABLE_OBJECT_LOGGING, False)
self.assertEqual(settings.LOG_SKIPPED_PACKETS, False)
self.assertEqual(settings.ENABLE_HOST_LOGGING, False)
self.assertEqual(settings.LOG_COROUTINE_SPAWNS, False)
self.assertEqual(settings.ENABLE_DEFERRED_PACKET_PARSING, True)
settings.ENABLE_DEFERRED_PACKET_PARSING = False
self.assertEqual(settings.ENABLE_DEFERRED_PACKET_PARSING, False)

View File

@@ -23,7 +23,7 @@ from hippolyzer.lib.base.xfer_manager import XferManager
class MockHandlingCircuit(ProxiedCircuit):
def __init__(self, handler: MessageHandler[Message]):
def __init__(self, handler: MessageHandler[Message, str]):
super().__init__(("127.0.0.1", 1), ("127.0.0.1", 2), None)
self.handler = handler
@@ -42,8 +42,8 @@ class BaseTransferTests(unittest.IsolatedAsyncioTestCase):
LARGE_PAYLOAD = b"foobar" * 500
def setUp(self) -> None:
self.server_message_handler: MessageHandler[Message] = MessageHandler()
self.client_message_handler: MessageHandler[Message] = MessageHandler()
self.server_message_handler: MessageHandler[Message, str] = MessageHandler()
self.client_message_handler: MessageHandler[Message, str] = MessageHandler()
# The client side should send messages to the server side's message handler
# and vice-versa
self.client_circuit = MockHandlingCircuit(self.server_message_handler)
@@ -60,7 +60,7 @@ class XferManagerTests(BaseTransferTests):
self.received_bytes: Optional[bytes] = None
async def _handle_vfile_upload(self):
msg = await self.server_message_handler.wait_for('AssetUploadRequest', timeout=0.01)
msg = await self.server_message_handler.wait_for(('AssetUploadRequest',), timeout=0.01)
asset_block = msg["AssetBlock"]
transaction_id = asset_block["TransactionID"]
asset_id = UUID.combine(transaction_id, self.secure_session_id)
@@ -102,7 +102,7 @@ class TestTransferManager(BaseTransferTests):
)
async def _handle_covenant_download(self):
msg = await self.server_message_handler.wait_for('TransferRequest', timeout=0.01)
msg = await self.server_message_handler.wait_for(('TransferRequest',), timeout=0.01)
self.assertEqual(TransferSourceType.SIM_ESTATE, msg["TransferInfo"]["SourceType"])
tid = msg["TransferInfo"]["TransferID"]
params: TransferRequestParamsSimEstate = msg["TransferInfo"][0].deserialize_var("Params")

View File

@@ -1,77 +0,0 @@
import asyncio
from typing import *
import unittest
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.udpserializer import UDPMessageSerializer
from hippolyzer.lib.base.network.transport import AbstractUDPTransport, UDPPacket, ADDR_TUPLE
from hippolyzer.lib.proxy.lludp_proxy import InterceptingLLUDPProxyProtocol
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import SessionManager
from hippolyzer.lib.proxy.transport import SOCKS5UDPTransport
class MockTransport(AbstractUDPTransport):
def sendto(self, data: Any, addr: Optional[ADDR_TUPLE] = ...) -> None:
pass
def abort(self) -> None:
pass
def close(self) -> None:
pass
def __init__(self):
super().__init__()
self.packets: List[Tuple[bytes, Tuple[str, int]]] = []
def send_packet(self, packet: UDPPacket) -> None:
self.packets.append((packet.data, packet.dst_addr))
class BaseProxyTest(unittest.IsolatedAsyncioTestCase):
def setUp(self) -> None:
self.client_addr = ("127.0.0.1", 1)
self.region_addr = ("127.0.0.1", 3)
self.circuit_code = 1234
self.session_manager = SessionManager()
self.session = self.session_manager.create_session({
"session_id": UUID.random(),
"secure_session_id": UUID.random(),
"agent_id": UUID.random(),
"circuit_code": self.circuit_code,
"sim_ip": self.region_addr[0],
"sim_port": self.region_addr[1],
"region_x": 0,
"region_y": 123,
"seed_capability": "https://test.localhost:4/foo",
})
self.transport = MockTransport()
self.protocol = InterceptingLLUDPProxyProtocol(
self.client_addr, self.session_manager)
self.protocol.transport = self.transport
self.serializer = UDPMessageSerializer()
self.session.objects.track_region_objects(123)
async def _wait_drained(self):
await asyncio.sleep(0.001)
def _setup_default_circuit(self):
self._setup_region_circuit(self.session.regions[-1])
self.session.main_region = self.session.regions[-1]
def _setup_region_circuit(self, region: ProxiedRegion):
# Not going to send a UseCircuitCode, so have to pretend we already did the
# client -> region NAT hole-punching
self.protocol.session = self.session
self.protocol.far_to_near_map[region.circuit_addr] = self.client_addr
self.session_manager.claim_session(self.session.id)
self.session.open_circuit(self.client_addr, region.circuit_addr,
self.protocol.transport)
def _msg_to_datagram(self, msg: Message, src, dst, direction, socks_header=True):
serialized = self.serializer.serialize(msg)
packet = UDPPacket(src_addr=src, dst_addr=dst, data=serialized,
direction=direction)
return SOCKS5UDPTransport.serialize(packet, force_socks_header=socks_header)

View File

@@ -1,5 +1,10 @@
from __future__ import annotations
import asyncio
import sys
from pathlib import Path
from tempfile import TemporaryDirectory
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.proxy import addon_ctx
from hippolyzer.lib.proxy.addon_utils import (
@@ -10,11 +15,9 @@ from hippolyzer.lib.proxy.addon_utils import (
)
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.commands import handle_command
from hippolyzer.lib.base.network.transport import Direction
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
from .. import BaseProxyTest
from hippolyzer.lib.proxy.test_utils import BaseProxyTest
class MockAddon(BaseAddon):
@@ -29,14 +32,47 @@ class MockAddon(BaseAddon):
show_message(bar)
PARENT_ADDON_SOURCE = """
from hippolyzer.lib.proxy.addon_utils import BaseAddon
class ParentAddon(BaseAddon):
baz = None
@classmethod
def foo(cls):
cls.baz = 1
addons = [ParentAddon()]
"""
CHILD_ADDON_SOURCE = """
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.addons import AddonManager
import parent_addon
AddonManager.hot_reload(parent_addon)
class ChildAddon(BaseAddon):
def handle_init(self, session_manager):
parent_addon.ParentAddon.foo()
addons = [ChildAddon()]
"""
class AddonIntegrationTests(BaseProxyTest):
def setUp(self) -> None:
super().setUp()
self.addon = MockAddon()
AddonManager.init([], self.session_manager, [self.addon], swallow_addon_exceptions=False)
self.temp_dir = TemporaryDirectory(prefix="addon_test_sources")
self.child_path = Path(self.temp_dir.name) / "child_addon.py"
self.parent_path = Path(self.temp_dir.name) / "parent_addon.py"
def tearDown(self) -> None:
AddonManager.shutdown()
self.temp_dir.cleanup()
def _fake_command(self, command: str) -> None:
msg = Message(
@@ -44,9 +80,8 @@ class AddonIntegrationTests(BaseProxyTest):
Block("AgentData", AgentID=self.session.agent_id, SessionID=self.session.id),
Block("ChatData", Message=command, Channel=AddonManager.COMMAND_CHANNEL, fill_missing=True),
)
packet = self._msg_to_datagram(msg, src=self.client_addr,
dst=self.region_addr, direction=Direction.OUT)
self.protocol.datagram_received(packet, self.client_addr)
packet = self._msg_to_packet(msg, src=self.client_addr, dst=self.region_addr)
self.protocol.handle_proxied_packet(packet)
async def test_simple_command_setting_params(self):
self._setup_default_circuit()
@@ -76,3 +111,28 @@ class AddonIntegrationTests(BaseProxyTest):
# Should have sent out the two injected packets for inbound and outbound chat
# But not the original chatfromviewer from our command.
self.assertEqual(len(self.transport.packets), 2)
async def test_loading_addons(self):
with open(self.parent_path, "w") as f:
f.write(PARENT_ADDON_SOURCE)
with open(self.child_path, "w") as f:
f.write(CHILD_ADDON_SOURCE)
AddonManager.load_addon_from_path(str(self.parent_path), reload=True)
AddonManager.load_addon_from_path(str(self.child_path), reload=True)
# Wait for the init hooks to run
await asyncio.sleep(0.001)
# Should be able to import this by name now
import parent_addon # noqa
# ChildAddon calls a classmethod that mutates this
self.assertEqual(1, parent_addon.ParentAddon.baz)
async def test_unloading_addons(self):
with open(self.parent_path, "w") as f:
f.write(PARENT_ADDON_SOURCE)
AddonManager.load_addon_from_path(str(self.parent_path), reload=True)
# Wait for the init hooks to run
await asyncio.sleep(0.001)
# Unloading should remove the module from sys.modules again
AddonManager.unload_addon_from_path(str(self.parent_path), reload=True)
await asyncio.sleep(0.001)
self.assertNotIn('hippolyzer.user_addon_parent_addon', sys.modules)

View File

@@ -1,21 +1,26 @@
from __future__ import annotations
import asyncio
import math
import multiprocessing
from urllib.parse import urlparse
import aioresponses
from mitmproxy.net import http
from mitmproxy.test import tflow, tutils
from mitmproxy.http import HTTPFlow
from yarl import URL
from hippolyzer.apps.proxy import run_http_proxy_process
from hippolyzer.lib.base.datatypes import Vector3
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.http_event_manager import MITMProxyEventManager
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.http_proxy import HTTPFlowContext, SerializedCapData
from hippolyzer.lib.proxy.http_proxy import SerializedCapData
from hippolyzer.lib.proxy.message_logger import FilteringMessageLogger
from hippolyzer.lib.proxy.sessions import SessionManager
from .. import BaseProxyTest
from hippolyzer.lib.proxy.test_utils import BaseProxyTest
class MockAddon(BaseAddon):
@@ -32,12 +37,12 @@ class SimpleMessageLogger(FilteringMessageLogger):
return self._filtered_entries
class LLUDPIntegrationTests(BaseProxyTest):
class HTTPIntegrationTests(BaseProxyTest):
def setUp(self) -> None:
super().setUp()
self.addon = MockAddon()
AddonManager.init([], self.session_manager, [self.addon])
self.flow_context = HTTPFlowContext()
self.flow_context = self.session_manager.flow_context
self.http_event_manager = MITMProxyEventManager(self.session_manager, self.flow_context)
self._setup_default_circuit()
@@ -73,12 +78,63 @@ class LLUDPIntegrationTests(BaseProxyTest):
# The response sent back to mitmproxy should have been our modified version
self.assertEqual(True, mitm_flow.metadata["touched_addon"])
async def test_firestorm_bridge_avatar_z_pos(self):
# Simulate an avatar with a non-finite Z pos in a CoarseLocation
self.session.main_region.objects.state.coarse_locations.update({
self.session.agent_id: Vector3(1, 2, math.inf),
})
self.session.objects._rebuild_avatar_objects()
# GuessedZ should be picked up for the avatar based on the bridge request
fake_flow = tflow.tflow(
req=tutils.treq(host="example.com", content=b'<llsd><string>getZOffsets|'),
resp=tutils.tresp(
headers=http.Headers((
(b"X-SecondLife-Object-Name", b"#Firestorm LSL Bridge v99999"),
(b"X-SecondLife-Owner-Key", str(self.session.agent_id).encode("utf8")),
)),
content=f"<llsd><string>{self.session.agent_id}, 2000.0</string></llsd>".encode("utf8")
),
)
fake_flow.metadata["cap_data_ser"] = SerializedCapData("FirestormBridge")
fake_flow.metadata["from_browser"] = False
self.session_manager.flow_context.from_proxy_queue.put(("response", fake_flow.get_state()), True)
await self._pump_one_event()
av = tuple(self.session.objects.all_avatars)[0]
self.assertEqual(Vector3(1, 2, 2000.0), av.RegionPosition)
async def test_asset_server_proxy_wrapper_caps(self):
# We support "wrapper caps" that disambiguate otherwise ambiguous caps.
# The URL provided by the sim may not be unique across regions or sessions,
# in the case of ViewerAsset on agni, so we generate a unique hostname
# as an alias and send that to the viewer instead.
region = self.session.main_region
region.update_caps({
"ViewerAsset": "http://assets.local/foo",
})
wrapper_url = region.register_wrapper_cap("ViewerAsset")
parsed = urlparse(wrapper_url)
fake_flow = tflow.tflow(req=tutils.treq(
host=parsed.hostname,
path="/foo/baz?asset_id=bar",
port=80,
))
fake_flow.metadata["cap_data_ser"] = SerializedCapData()
self.flow_context.from_proxy_queue.put(("request", fake_flow.get_state()), True)
await self._pump_one_event()
flow_state = self.flow_context.to_proxy_queue.get(True)[2]
mitm_flow: HTTPFlow = HTTPFlow.from_state(flow_state)
self.assertIsNotNone(mitm_flow.response)
self.assertEqual(307, mitm_flow.response.status_code)
self.assertEqual(
"http://assets.local/foo/baz?asset_id=bar",
mitm_flow.response.headers["Location"],
)
class TestCapsClient(BaseProxyTest):
def setUp(self) -> None:
super().setUp()
self._setup_default_circuit()
self.caps = {}
self.caps_client = self.session.main_region.caps_client
async def test_requests_proxied_by_default(self):
@@ -90,3 +146,39 @@ class TestCapsClient(BaseProxyTest):
# Request should have been proxied, with a header marking it
self.assertEqual(kwargs['headers']["X-Hippo-Injected"], "1")
self.assertEqual(kwargs['proxy'], "http://127.0.0.1:9062")
class TestMITMProxy(BaseProxyTest):
def setUp(self) -> None:
super().setUp()
self._setup_default_circuit()
self.caps_client = self.session.main_region.caps_client
def test_mitmproxy_works(self):
proxy_port = 9905
self.session_manager.settings.HTTP_PROXY_PORT = proxy_port
http_proc = multiprocessing.Process(
target=run_http_proxy_process,
args=("127.0.0.1", proxy_port, self.session_manager.flow_context),
daemon=True,
)
http_proc.start()
self.session_manager.flow_context.mitmproxy_ready.wait(1.0)
http_event_manager = MITMProxyEventManager(self.session_manager, self.session_manager.flow_context)
async def _request_example_com():
# Pump callbacks from mitmproxy
asyncio.create_task(http_event_manager.run())
try:
async with self.caps_client.get("http://example.com/", timeout=0.5) as resp:
self.assertIn(b"Example Domain", await resp.read())
async with self.caps_client.get("https://example.com/", timeout=0.5) as resp:
self.assertIn(b"Example Domain", await resp.read())
finally:
# Tell the event pump and mitmproxy they need to shut down
self.session_manager.flow_context.shutdown_signal.set()
asyncio.run(_request_example_com())
http_proc.join()

View File

@@ -19,8 +19,7 @@ from hippolyzer.lib.proxy.message_logger import FilteringMessageLogger, LLUDPMes
from hippolyzer.lib.base.network.transport import Direction
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
from .. import BaseProxyTest
from hippolyzer.lib.proxy.test_utils import BaseProxyTest
class MockAddon(BaseAddon):
@@ -98,7 +97,7 @@ class LLUDPIntegrationTests(BaseProxyTest):
packet_id=1,
)
datagram = self._msg_to_datagram(msg, self.client_addr, self.region_addr,
Direction.OUT, socks_header=True)
socks_header=True)
self.protocol.datagram_received(datagram, self.client_addr)
await self._wait_drained()
self.assertFalse(self.session.pending)
@@ -116,7 +115,7 @@ class LLUDPIntegrationTests(BaseProxyTest):
packet_id=1,
)
datagram = self._msg_to_datagram(msg, self.client_addr, self.region_addr,
Direction.OUT, socks_header=True)
socks_header=True)
self.protocol.datagram_received(datagram, source_addr=self.client_addr)
await self._wait_drained()
# Packet got dropped completely
@@ -133,7 +132,7 @@ class LLUDPIntegrationTests(BaseProxyTest):
packet_id=1,
)
datagram = self._msg_to_datagram(msg, self.client_addr, (self.region_addr[0], 9),
Direction.OUT, socks_header=True)
socks_header=True)
self.protocol.datagram_received(datagram, source_addr=self.client_addr)
await self._wait_drained()
# The session claim will still work
@@ -247,7 +246,7 @@ class LLUDPIntegrationTests(BaseProxyTest):
async def test_session_message_handler(self):
self._setup_default_circuit()
obj_update = self._make_objectupdate_compressed(1234)
fut = self.session.message_handler.wait_for('ObjectUpdateCompressed')
fut = self.session.message_handler.wait_for(('ObjectUpdateCompressed',))
self.protocol.datagram_received(obj_update, self.region_addr)
self.assertEqual("ObjectUpdateCompressed", (await fut).name)

View File

@@ -2,8 +2,7 @@ from mitmproxy.test import tflow, tutils
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.message_logger import HTTPMessageLogEntry
from . import BaseProxyTest
from hippolyzer.lib.proxy.test_utils import BaseProxyTest
class TestHTTPFlows(BaseProxyTest):

View File

@@ -11,6 +11,7 @@ from hippolyzer.lib.proxy.http_proxy import SerializedCapData
from hippolyzer.lib.proxy.message_logger import LLUDPMessageLogEntry, HTTPMessageLogEntry
from hippolyzer.lib.proxy.message_filter import compile_filter
from hippolyzer.lib.proxy.sessions import SessionManager
from hippolyzer.lib.proxy.settings import ProxySettings
OBJECT_UPDATE = b'\xc0\x00\x00\x00Q\x00\x0c\x00\x01\xea\x03\x00\x02\xe6\x03\x00\x01\xbe\xff\x01\x06\xbc\x8e\x0b\x00' \
b'\x01i\x94\x8cjM"\x1bf\xec\xe4\xac1c\x93\xcbKW\x89\x98\x01\t\x03\x00\x01Q@\x88>Q@\x88>Q@\x88><\xa2D' \
@@ -111,7 +112,6 @@ class MessageFilterTests(unittest.TestCase):
def test_tagged_union_subfield(self):
settings = Settings()
settings.ENABLE_DEFERRED_PACKET_PARSING = False
settings.HANDLE_PACKETS = False
deser = UDPMessageDeserializer(settings=settings)
update_msg = deser.deserialize(OBJECT_UPDATE)
entry = LLUDPMessageLogEntry(update_msg, None, None)
@@ -119,7 +119,7 @@ class MessageFilterTests(unittest.TestCase):
self.assertTrue(self._filter_matches("ObjectUpdate.ObjectData.ObjectData.Position < (90, 43, 27)", entry))
def test_http_flow(self):
session_manager = SessionManager()
session_manager = SessionManager(ProxySettings())
fake_flow = tflow.tflow(req=tutils.treq(), resp=tutils.tresp())
fake_flow.metadata["cap_data_ser"] = SerializedCapData(
cap_name="FakeCap",

View File

@@ -15,9 +15,7 @@ from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.vocache import RegionViewerObjectCacheChain, RegionViewerObjectCache, ViewerObjectCacheEntry
from . import BaseProxyTest
from hippolyzer.lib.proxy.test_utils import BaseProxyTest
OBJECT_UPDATE_COMPRESSED_DATA = (
b"\x12\x12\x10\xbf\x16XB~\x8f\xb4\xfb\x00\x1a\xcd\x9b\xe5\xd2\x04\x00\x00\t\x00\xcdG\x00\x00"
@@ -141,6 +139,19 @@ class ObjectManagerTestMixin(BaseProxyTest):
def _kill_object(self, local_id: int):
self.message_handler.handle(self._create_kill_object(local_id))
def _create_object_update_cached(self, local_id: int, region_handle: int = 123,
crc: int = 22, flags: int = 4321):
return Message(
'ObjectUpdateCached',
Block("RegionData", TimeDilation=102, RegionHandle=region_handle),
Block(
"ObjectData",
ID=local_id,
CRC=crc,
UpdateFlags=flags,
)
)
def _get_avatar_positions(self) -> Dict[UUID, Vector3]:
return {av.FullID: av.RegionPosition for av in self.region_object_manager.all_avatars}
@@ -451,16 +462,7 @@ class RegionObjectManagerTests(ObjectManagerTestMixin, unittest.IsolatedAsyncioT
)
])
])
cache_msg = Message(
'ObjectUpdateCached',
Block("RegionData", TimeDilation=102, RegionHandle=123),
Block(
"ObjectData",
ID=1234,
CRC=22,
UpdateFlags=4321,
)
)
cache_msg = self._create_object_update_cached(1234, flags=4321)
obj = self.region_object_manager.lookup_localid(1234)
self.assertIsNone(obj)
self.region_object_manager.load_cache()
@@ -623,3 +625,41 @@ class SessionObjectManagerTests(ObjectManagerTestMixin, unittest.IsolatedAsyncio
await self.session.objects.load_ancestors(parentless)
with self.assertRaises(asyncio.TimeoutError):
await self.session.objects.load_ancestors(orphaned, wait_time=0.005)
async def test_auto_request_objects(self):
self.session_manager.settings.AUTOMATICALLY_REQUEST_MISSING_OBJECTS = True
self.message_handler.handle(self._create_object_update_cached(1234))
self.message_handler.handle(self._create_object_update_cached(1235))
self.assertEqual({1234, 1235}, self.region_object_manager.queued_cache_misses)
# Pretend viewer sent out its own RequestMultipleObjects
self.message_handler.handle(Message(
'RequestMultipleObjects',
Block("RegionData", SessionID=self.session.id, AgentID=self.session.agent_id),
Block(
"ObjectData",
ID=1234,
)
))
# Proxy should have killed its pending request for 1234
self.assertEqual({1235}, self.region_object_manager.queued_cache_misses)
async def test_auto_request_avatar_seats(self):
# Avatars' parent links should always be requested regardless of
# object auto-request setting's value.
seat_id = 999
av = self._create_object(pcode=PCode.AVATAR, parent_id=seat_id)
self.assertEqual({seat_id}, self.region_object_manager.queued_cache_misses)
# Need to wait for it to decide it's worth requesting
await asyncio.sleep(0.22)
self.assertEqual(set(), self.region_object_manager.queued_cache_misses)
# Make sure we sent a request after the timeout
req_msg = self.deserializer.deserialize(self.transport.packets[-1][0])
self.assertEqual("RequestMultipleObjects", req_msg.name)
self.assertEqual(
[{'CacheMissType': 0, 'ID': seat_id}],
req_msg.to_dict()['body']['ObjectData'],
)
# The parent should not be requested again if an unrelated property like position changes
self._create_object(local_id=av.LocalID, full_id=av.FullID,
pcode=PCode.AVATAR, parent_id=seat_id, pos=(1, 2, 9))
self.assertEqual(set(), self.region_object_manager.queued_cache_misses)
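# Illustrative sketch of the behaviour the two tests above exercise, not the real
# object manager's internals: cache misses are queued, misses the viewer requests
# itself are dropped, and whatever remains is flushed as a single request after a
# short delay (~0.2s, inferred from the test's asyncio.sleep(0.22)). Class and
# method names here are hypothetical, and it assumes it runs inside an event loop.
import asyncio
from typing import Callable, Set

class CacheMissQueue:
    def __init__(self, send_request: Callable[[Set[int]], None], delay: float = 0.2):
        self.queued_cache_misses: Set[int] = set()
        self._send_request = send_request
        self._delay = delay

    def queue_miss(self, local_id: int) -> None:
        # Remember the miss and schedule a flush a little later
        self.queued_cache_misses.add(local_id)
        asyncio.get_running_loop().call_later(self._delay, self._flush)

    def viewer_requested(self, local_ids) -> None:
        # The viewer asked for these itself; don't send a duplicate request
        self.queued_cache_misses -= set(local_ids)

    def _flush(self) -> None:
        if self.queued_cache_misses:
            self._send_request(set(self.queued_cache_misses))
            self.queued_cache_misses.clear()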

View File

@@ -3,12 +3,12 @@ import unittest
import hippolyzer.lib.base.serialization as se
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message_formatting import HumanMessageSerializer
from hippolyzer.lib.base.templates import TextureEntrySubfieldSerializer, TEFaceBitfield
from hippolyzer.lib.base.templates import TextureEntrySubfieldSerializer, TEFaceBitfield, TextureEntry
EXAMPLE_TE = b"\x89UgG$\xcbC\xed\x92\x0bG\xca\xed\x15F_\x08\xe7\xb2\x98\x04\xca\x10;\x85\x94\x05Lj\x8d\xd4" \
b"\x0b\x1f\x01B\xcb\xe6|\x1d,\xa7sc\xa6\x1a\xa2L\xb1u\x01\x00\x00\x00\x00\x00\x00\x00\x00\x80?" \
b"\x00\x00\x00\x80?\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00" \
b"\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00"
EXAMPLE_TE = b'\x89UgG$\xcbC\xed\x92\x0bG\xca\xed\x15F_\x08\xca*\x98:\x18\x02,\r\xf4\x1e\xc6\xf5\x91\x01]\x83\x014' \
b'\x00\x90i+\x10\x80\xa1\xaa\xa2g\x11o\xa8]\xc6\x00\x00\x00\x00\x00\x00\x00\x00\x80?\x00\x00\x00\x80?' \
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00' \
b'\x00\x00\x00\x00\x00\x00\x00'
class TemplateTests(unittest.TestCase):
@@ -31,8 +31,8 @@ class TemplateTests(unittest.TestCase):
pod_te = {
'Textures': {
None: '89556747-24cb-43ed-920b-47caed15465f',
(3,): 'e7b29804-ca10-3b85-9405-4c6a8dd40b1f',
(0,): '42cbe67c-1d2c-a773-63a6-1aa24cb17501'
(3,): 'ca2a983a-1802-2c0d-f41e-c6f591015d83',
(0,): '34009069-2b10-80a1-aaa2-67116fa85dc6'
},
'Color': {None: b'\xff\xff\xff\xff'},
'ScalesS': {None: 1.0},
@@ -58,9 +58,16 @@ class TemplateTests(unittest.TestCase):
str_msg = HumanMessageSerializer.to_human_string(msg, beautify=True)
msg = HumanMessageSerializer.from_human_string(str_msg)
spec = msg["ObjectData"][0].get_serializer("TextureEntry")
deser = spec.deserialize(None, msg["ObjectData"]["TextureEntry"], pod=True)
data_field = msg["ObjectData"]["TextureEntry"]
# Serialization order and format should match indra's exactly
self.assertEqual(EXAMPLE_TE, data_field)
deser = spec.deserialize(None, data_field, pod=True)
self.assertEqual(deser, pod_te)
def test_textureentry_defaults(self):
te = TextureEntry()
self.assertEqual(UUID('89556747-24cb-43ed-920b-47caed15465f'), te.Textures[None])
if __name__ == "__main__":
unittest.main()