48 Commits

Author SHA1 Message Date
Salad Dais
5a5b471fe4 v0.15.1 2024-01-10 16:12:23 +00:00
Salad Dais
ff0f20d1dd Correct parcel bitmap parsing 2024-01-10 07:27:50 +00:00
Salad Dais
4898c852c1 Cache render materials in proxy object manager 2024-01-09 13:42:45 +00:00
Salad Dais
adf5295e2b Add start of ProxyParcelManager 2024-01-09 13:41:37 +00:00
Salad Dais
7514baaa5f Add serializer for ParcelProperty bitmaps 2024-01-09 13:40:52 +00:00
Salad Dais
0ba1a779ef Allow handling EQ events through message_handler in proxy 2024-01-09 13:40:07 +00:00
Salad Dais
3ea8a27914 Bitten by YAML floatification... 2024-01-09 12:26:30 +00:00
Salad Dais
2451ad3674 v0.15.0 2024-01-09 12:19:53 +00:00
Salad Dais
25804df238 Windows build needs mitmproxy-windows 2024-01-09 12:09:18 +00:00
Salad Dais
474173ba54 Update workflow python versions 2024-01-09 09:21:12 +00:00
Salad Dais
049a3b703f Update requirements 2024-01-09 09:19:15 +00:00
Salad Dais
ac77fde892 Update mitmproxy, change required Python to 3.10 2024-01-09 09:17:05 +00:00
Salad Dais
6ee9b22923 Start updating Windows release bundling 2024-01-09 08:53:33 +00:00
Salad Dais
f355138cd2 Update requirements 2024-01-08 22:43:08 +00:00
dependabot[bot]
478d135d1f Bump pygments from 2.10.0 to 2.15.0 (#40)
Bumps [pygments](https://github.com/pygments/pygments) from 2.10.0 to 2.15.0.
- [Release notes](https://github.com/pygments/pygments/releases)
- [Changelog](https://github.com/pygments/pygments/blob/master/CHANGES)
- [Commits](https://github.com/pygments/pygments/compare/2.10.0...2.15.0)

---
updated-dependencies:
- dependency-name: pygments
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-08 17:03:58 -04:00
dependabot[bot]
80c9acdabe Bump tornado from 6.1 to 6.3.3 (#41)
Bumps [tornado](https://github.com/tornadoweb/tornado) from 6.1 to 6.3.3.
- [Changelog](https://github.com/tornadoweb/tornado/blob/master/docs/releases.rst)
- [Commits](https://github.com/tornadoweb/tornado/compare/v6.1.0...v6.3.3)

---
updated-dependencies:
- dependency-name: tornado
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-08 17:03:23 -04:00
dependabot[bot]
d4eaa7c543 Bump urllib3 from 1.26.7 to 1.26.18 (#38)
Bumps [urllib3](https://github.com/urllib3/urllib3) from 1.26.7 to 1.26.18.
- [Release notes](https://github.com/urllib3/urllib3/releases)
- [Changelog](https://github.com/urllib3/urllib3/blob/main/CHANGES.rst)
- [Commits](https://github.com/urllib3/urllib3/compare/1.26.7...1.26.18)

---
updated-dependencies:
- dependency-name: urllib3
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-08 17:00:00 -04:00
dependabot[bot]
2571550da4 Bump aiohttp from 3.8.3 to 3.9.0 (#37)
Bumps [aiohttp](https://github.com/aio-libs/aiohttp) from 3.8.3 to 3.9.0.
- [Release notes](https://github.com/aio-libs/aiohttp/releases)
- [Changelog](https://github.com/aio-libs/aiohttp/blob/master/CHANGES.rst)
- [Commits](https://github.com/aio-libs/aiohttp/compare/v3.8.3...v3.9.0)

---
updated-dependencies:
- dependency-name: aiohttp
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-08 16:55:37 -04:00
Salad Dais
b3ee3a3506 Add packet stats addon example 2024-01-08 00:03:45 +00:00
Salad Dais
11feccd93b Add support for Material inventory types 2024-01-07 17:47:09 +00:00
Salad Dais
bb6ce5c013 Handle binary LLSD headers generated by indra 2024-01-07 17:46:54 +00:00
Salad Dais
a35aa9046e v0.14.3 2024-01-07 08:00:21 +00:00
Salad Dais
6c32da878d Handle (and ignore by default) the new GenericStreamingMessage
This is _enormously_ spammy, good god. Apparently related to PBR.
2024-01-07 07:51:52 +00:00
Salad Dais
49c54bc896 Automatically request all materials by default 2024-01-06 21:50:29 +00:00
Salad Dais
4c9fa38ffb Move material management to ClientObjectManager 2024-01-06 21:40:49 +00:00
Salad Dais
2856e78f16 Start adding MaterialManager for RenderMaterials 2024-01-06 20:40:04 +00:00
Salad Dais
33884925f4 enum.IntFlag -> IntFlag 2024-01-06 20:39:29 +00:00
Salad Dais
a11ef96d9a Serve inbound Xfers reliably 2024-01-05 02:53:05 +00:00
Salad Dais
7b6239d66a Add more parcel enums 2024-01-05 02:49:51 +00:00
Salad Dais
2c3bd140ff Update MapImageFlags 2024-01-04 22:24:36 +00:00
Salad Dais
9d2087a0fb Add ParcelManager to HippoClient 2024-01-04 21:45:54 +00:00
Salad Dais
67db8110a1 Fix ParcelOverlay data template 2024-01-04 20:01:32 +00:00
Salad Dais
ab1c56ff3e Start writing client parcel manager 2024-01-04 19:51:47 +00:00
Salad Dais
142f2e42ca Clean up message template code 2024-01-04 19:08:09 +00:00
Salad Dais
e7764c1665 Display templated EQ messages as templated messages
This makes them less annoying to read, and allows us to use
subfield serializers to pretty-print their contents.
2024-01-04 18:00:14 +00:00
Salad Dais
582cfea47c Send AgentUpdate after connecting to main region 2024-01-03 07:53:47 +00:00
Salad Dais
6f38d84a1c Add ParcelOverlay serializers 2024-01-03 07:51:51 +00:00
Salad Dais
1fc46e66bc Support __add__ and __radd__ on JankStringyBytes 2023-12-31 15:58:05 +00:00
Salad Dais
167673aa08 Be nicer about zero-length strings in Messages 2023-12-31 15:52:15 +00:00
Salad Dais
5ad8ee986f Keep track of user's groups in their session 2023-12-31 15:28:00 +00:00
Salad Dais
e9d7ee7e8e ObjectUpdateType.OBJECT_UPDATE -> ObjectUpdateType.UPDATE 2023-12-31 14:57:28 +00:00
Salad Dais
d21c3ec004 Update templates 2023-12-31 14:55:46 +00:00
Salad Dais
01c6931d53 v0.14.2 2023-12-24 18:05:05 +00:00
Salad Dais
493563bb6f Add a few asset type lookups 2023-12-24 06:47:04 +00:00
Salad Dais
ca5c71402b Bump Python requirement to 3.9 2023-12-24 05:57:14 +00:00
Salad Dais
ad765a1ede Load inventory cache in a background thread
llsd.parse_notation() is slow as hell, no way around it.
2023-12-24 05:55:56 +00:00
Salad Dais
9adee14e0f Allow non-byte legacy schema flag fields 2023-12-23 15:40:00 +00:00
Salad Dais
57c4bd0e7c Improve AIS support 2023-12-22 21:25:05 +00:00
49 changed files with 1470 additions and 290 deletions

View File

@@ -1,5 +1,3 @@
# Have to manually unzip this (it gets double zipped) and add it
# onto the release after it gets created. Don't want actions with repo write.
name: Bundle Windows EXE
@@ -9,8 +7,12 @@ on:
types:
- created
workflow_dispatch:
inputs:
ref_name:
description: Name to use for the release
env:
target_tag: ${{ github.ref_name }}
target_tag: ${{ github.ref_name || github.event.inputs.ref_name }}
sha: ${{ github.sha || github.event.inputs.ref_name }}
jobs:
@@ -21,7 +23,7 @@ jobs:
contents: write
strategy:
matrix:
python-version: [3.9]
python-version: ["3.11"]
steps:
- uses: actions/checkout@v2
@@ -51,10 +53,11 @@ jobs:
- name: Upload the artifact
uses: actions/upload-artifact@v2
with:
name: hippolyzer-windows-${{ github.sha }}
name: hippolyzer-windows-${{ env.sha }}
path: ./hippolyzer-windows-${{ env.target_tag }}.zip
- uses: ncipollo/release-action@v1.10.0
if: github.event_name != 'workflow_dispatch'
with:
artifacts: hippolyzer-windows-${{ env.target_tag }}.zip
tag: ${{ env.target_tag }}

View File

@@ -19,7 +19,7 @@ jobs:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
with:
python-version: 3.9
python-version: "3.10"
- name: Install dependencies
run: |

View File

@@ -14,7 +14,7 @@ jobs:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: ["3.8", "3.11"]
python-version: ["3.10", "3.12"]
steps:
- uses: actions/checkout@v2

View File

@@ -27,7 +27,7 @@ with low-level SL details. See the [Local Animation addon example](https://githu
### From Source
* Python 3.8 or above is **required**. If you're unable to upgrade your system Python package due to
* Python 3.10 or above is **required**. If you're unable to upgrade your system Python package due to
being on a stable distro, you can use [pyenv](https://github.com/pyenv/pyenv) to create
a self-contained Python install with the appropriate version.
* [Create a clean Python 3 virtualenv](https://packaging.python.org/guides/installing-using-pip-and-virtual-environments/#creating-a-virtual-environment)
@@ -325,7 +325,7 @@ The REPL is fully async aware and allows awaiting events without blocking:
```python
>>> from hippolyzer.lib.client.object_manager import ObjectUpdateType
>>> evt = await session.objects.events.wait_for((ObjectUpdateType.OBJECT_UPDATE,), timeout=2.0)
>>> evt = await session.objects.events.wait_for((ObjectUpdateType.UPDATE,), timeout=2.0)
>>> evt.updated
{'Position'}
```

View File

@@ -0,0 +1,21 @@
import collections
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.proxy.addon_utils import BaseAddon, GlobalProperty
from hippolyzer.lib.proxy.commands import handle_command
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session
class PacketStatsAddon(BaseAddon):
packet_stats: collections.Counter = GlobalProperty(collections.Counter)
def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
self.packet_stats[message.name] += 1
@handle_command()
async def print_packet_stats(self, _session: Session, _region: ProxiedRegion):
print(self.packet_stats.most_common(10))
addons = [PacketStatsAddon()]

View File

@@ -72,14 +72,13 @@ class PixelArtistAddon(BaseAddon):
# Watch for any newly created prims, this is basically what the viewer does to find
# prims that it just created with the build tool.
with session.objects.events.subscribe_async(
(ObjectUpdateType.OBJECT_UPDATE,),
(ObjectUpdateType.UPDATE,),
predicate=lambda e: e.object.UpdateFlags & JUST_CREATED_FLAGS and "LocalID" in e.updated
) as get_events:
# Create a pool of prims to use for building the pixel art
for _ in range(needed_prims):
# TODO: We don't track the land group or user's active group, so
# "anyone can build" must be on for rezzing to work.
group_id = UUID()
# TODO: Can't get land group atm, just tries to rez with the user's active group
group_id = session.active_group
region.circuit.send(Message(
'ObjectAdd',
Block('AgentData', AgentID=session.agent_id, SessionID=session.id, GroupID=group_id),

View File

@@ -3,6 +3,7 @@ A simple client that just says hello to people
"""
import asyncio
import pprint
from contextlib import aclosing
import os
@@ -19,7 +20,7 @@ async def amain():
return
if message["ChatData"]["SourceType"] != ChatSourceType.AGENT:
return
if "hello" not in str(message["ChatData"]["Message"]).lower():
if "hello" not in message["ChatData"]["Message"].lower():
return
await client.send_chat(f'Hello {message["ChatData"]["FromName"]}!', chat_type=ChatType.SHOUT)
@@ -30,9 +31,13 @@ async def amain():
start_location=os.environ.get("HIPPO_START_LOCATION", "last"),
)
print("I'm here")
# Wait until we have details about parcels and print them
await client.main_region.parcel_manager.parcels_downloaded.wait()
pprint.pprint(client.main_region.parcel_manager.parcels)
await client.send_chat("Hello World!", chat_type=ChatType.SHOUT)
client.session.message_handler.subscribe("ChatFromSimulator", _respond_to_chat)
# Example of how to work with caps
async with client.main_caps_client.get("SimulatorFeatures") as features_resp:
print("Features:", await features_resp.read_llsd())

View File

@@ -77,6 +77,15 @@ class SelectionManagerAddon(BaseAddon):
selected.task_item = parsed["item-id"]
class AgentUpdaterAddon(BaseAddon):
def handle_eq_event(self, session: Session, region: ProxiedRegion, event: dict):
if event['message'] != 'AgentGroupDataUpdate':
return
session.groups.clear()
for group in event['body']['GroupData']:
session.groups.add(group['GroupID'])
class REPLAddon(BaseAddon):
@handle_command()
async def spawn_repl(self, session: Session, region: ProxiedRegion):
@@ -103,6 +112,7 @@ def start_proxy(session_manager: SessionManager, extra_addons: Optional[list] =
extra_addon_paths = extra_addon_paths or []
extra_addons.append(SelectionManagerAddon())
extra_addons.append(REPLAddon())
extra_addons.append(AgentUpdaterAddon())
root_log = logging.getLogger()
root_log.addHandler(logging.StreamHandler())

View File

@@ -234,7 +234,7 @@ class MessageLogWindow(QtWidgets.QMainWindow):
"ParcelDwellReply ParcelAccessListReply AttachedSoundGainChange " \
"ParcelPropertiesRequest ParcelProperties GetObjectCost GetObjectPhysicsData ObjectImage " \
"ViewerAsset GetTexture SetAlwaysRun GetDisplayNames MapImageService MapItemReply " \
"AgentFOV".split(" ")
"AgentFOV GenericStreamingMessage".split(" ")
DEFAULT_FILTER = f"!({' || '.join(ignored for ignored in DEFAULT_IGNORE)})"
textRequest: QtWidgets.QTextEdit
@@ -576,7 +576,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
message_names = sorted(x.name for x in self.templateDict)
for message_name in message_names:
if self.templateDict[message_name].msg_trust:
if self.templateDict[message_name].trusted:
self.comboTrusted.addItem(message_name)
else:
self.comboUntrusted.addItem(message_name)

View File

@@ -317,6 +317,22 @@ class JankStringyBytes(bytes):
return item in str(self)
return item in bytes(self)
def __add__(self, other):
if isinstance(other, bytes):
return bytes(self) + other
return str(self) + other
def __radd__(self, other):
if isinstance(other, bytes):
return other + bytes(self)
return other + str(self)
def lower(self):
return str(self).lower()
def upper(self):
return str(self).upper()
class RawBytes(bytes):
__slots__ = ()
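For illustration, a small usage sketch of the new concatenation support (assumptions: JankStringyBytes is importable from hippolyzer.lib.base.datatypes, and str() yields the decoded text, as the existing __contains__/lower()/upper() methods imply):

```python
from hippolyzer.lib.base.datatypes import JankStringyBytes  # assumed module path

jank = JankStringyBytes(b"Hello")
assert jank + b" World" == b"Hello World"   # bytes operand -> bytes result
assert jank + " World" == "Hello World"     # str operand -> str result
assert "Say: " + jank == "Say: Hello"       # __radd__ keeps str-on-the-left working
```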

View File

@@ -17,7 +17,6 @@ import inspect
import logging
import secrets
import struct
import typing
import weakref
from io import StringIO
from typing import *
@@ -49,6 +48,10 @@ class SchemaFlagField(SchemaHexInt):
"""Like a hex int, but must be serialized as bytes in LLSD due to being a U32"""
@classmethod
def from_llsd(cls, val: Any, flavor: str) -> int:
# Sometimes values in S32 range will just come through normally
if isinstance(val, int):
return val
if flavor == "legacy":
return struct.unpack("!I", val)[0]
return val
@@ -190,7 +193,7 @@ class InventoryBase(SchemaBase):
writer.write("\t}\n")
class InventoryDifferences(typing.NamedTuple):
class InventoryDifferences(NamedTuple):
changed: List[InventoryNodeBase]
removed: List[InventoryNodeBase]
@@ -400,7 +403,6 @@ class InventoryNodeBase(InventoryBase, _HasName):
@dataclasses.dataclass
class InventoryContainerBase(InventoryNodeBase):
# TODO: Not a string in AIS
type: AssetType = schema_field(SchemaEnumField(AssetType))
@property
@@ -461,7 +463,6 @@ class InventoryCategory(InventoryContainerBase):
VERSION_NONE: ClassVar[int] = -1
cat_id: UUID = schema_field(SchemaUUID)
# TODO: not a string in AIS
pref_type: FolderType = schema_field(SchemaEnumField(FolderType), llsd_name="preferred_type")
name: str = schema_field(SchemaMultilineStr)
owner_id: Optional[UUID] = schema_field(SchemaUUID, default=None)
@@ -488,6 +489,18 @@ class InventoryCategory(InventoryContainerBase):
type=AssetType.CATEGORY,
)
@classmethod
def _get_fields_dict(cls, llsd_flavor: Optional[str] = None):
fields = super()._get_fields_dict(llsd_flavor)
if llsd_flavor == "ais":
# AIS is smart enough to know that all categories are asset type category...
fields.pop("type")
# These have different names though
fields["type_default"] = fields.pop("preferred_type")
fields["agent_id"] = fields.pop("owner_id")
fields["category_id"] = fields.pop("cat_id")
return fields
__hash__ = InventoryNodeBase.__hash__
@@ -573,5 +586,12 @@ class InventoryItem(InventoryNodeBase):
creation_date=block["CreationDate"],
)
def to_llsd(self, flavor: str = "legacy"):
val = super().to_llsd(flavor=flavor)
if flavor == "ais":
# There's little chance this differs from owner ID, just place it.
val["agent_id"] = val["permissions"]["owner_id"]
return val
INVENTORY_TYPES: Tuple[Type[InventoryNodeBase], ...] = (InventoryCategory, InventoryObject, InventoryItem)

View File

@@ -104,6 +104,13 @@ class SchemaStr(SchemaFieldSerializer[str]):
class SchemaUUID(SchemaFieldSerializer[UUID]):
@classmethod
def from_llsd(cls, val: Any, flavor: str) -> UUID:
# FetchInventory2 will return a string, but we want a UUID. Returning a UUID
# here isn't a problem because it'll just be cast back to a string later if
# that's what's wanted.
return UUID(val)
@classmethod
def deserialize(cls, val: str) -> UUID:
return UUID(val)
@@ -157,11 +164,11 @@ def parse_schema_line(line: str):
@dataclasses.dataclass
class SchemaBase(abc.ABC):
@classmethod
def _get_fields_dict(cls, llsd=False):
def _get_fields_dict(cls, llsd_flavor: Optional[str] = None):
fields_dict = {}
for field in dataclasses.fields(cls):
field_name = field.name
if llsd:
if llsd_flavor:
field_name = field.metadata.get("llsd_name") or field_name
fields_dict[field_name] = field
return fields_dict
@@ -181,7 +188,7 @@ class SchemaBase(abc.ABC):
@classmethod
def from_llsd(cls, inv_dict: Dict, flavor: str = "legacy"):
fields = cls._get_fields_dict(llsd=True)
fields = cls._get_fields_dict(llsd_flavor=flavor)
obj_dict = {}
for key, val in inv_dict.items():
if key in fields:
@@ -205,7 +212,10 @@ class SchemaBase(abc.ABC):
else:
raise ValueError(f"Unsupported spec for {key!r}, {spec!r}")
else:
LOG.warning(f"Unknown key {key!r}")
if flavor != "ais":
# AIS has a number of different fields that are irrelevant depending on
# what exactly sent the payload
LOG.warning(f"Unknown key {key!r}")
return cls._obj_from_dict(obj_dict)
def to_bytes(self) -> bytes:
@@ -219,7 +229,7 @@ class SchemaBase(abc.ABC):
def to_llsd(self, flavor: str = "legacy"):
obj_dict = {}
for field_name, field in self._get_fields_dict(llsd=True).items():
for field_name, field in self._get_fields_dict(llsd_flavor=flavor).items():
spec = field.metadata.get("spec")
# Not meant to be serialized
if not spec:

View File

@@ -160,8 +160,12 @@ class HippoLLSDBinaryParser(base_llsd.serde_binary.LLSDBinaryParser):
return bytes_val
# Python uses one, C++ uses the other, and everyone's unhappy.
_BINARY_HEADERS = (b'<? LLSD/Binary ?>', b'<?llsd/binary?>')
def parse_binary(data: bytes):
if data.startswith(b'<?llsd/binary?>'):
if any(data.startswith(x) for x in _BINARY_HEADERS):
data = data.split(b'\n', 1)[1]
return HippoLLSDBinaryParser().parse(data)
@@ -187,7 +191,7 @@ def parse(data: bytes):
# content-type is usually nonsense.
try:
data = data.lstrip()
if data.startswith(b'<?llsd/binary?>'):
if any(data.startswith(x) for x in _BINARY_HEADERS):
return parse_binary(data)
elif data.startswith(b'<'):
return parse_xml(data)
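A minimal sketch of what the header change buys us; the payload is a hand-rolled binary-LLSD integer (b"i" plus a big-endian int32), and both header spellings should now parse:

```python
from hippolyzer.lib.base import llsd

payload = b"i\x00\x00\x00\x2a"  # the integer 42 in binary LLSD
assert llsd.parse(b"<?llsd/binary?>\n" + payload) == 42    # python-llsd style header
assert llsd.parse(b"<? LLSD/Binary ?>\n" + payload) == 42  # indra style header
```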

View File

@@ -5802,6 +5802,25 @@ version 2.0
}
}
// GenericStreamingMessage
// Optimized generic message for streaming arbitrary data to viewer
// Avoid payloads over 7KB (8KB ceiling)
// Method -- magic number indicating method to use to decode payload:
// 0x4175 - GLTF material override data
// Payload -- data to be decoded
{
GenericStreamingMessage High 31 Trusted Unencoded
{
MethodData Single
{ Method U16 }
}
{
DataBlock Single
{ Data Variable 2 }
}
}
// LargeGenericMessage
// Similar to the above messages, but can handle larger payloads and serialized
// LLSD. Uses HTTP transport
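For context, a hedged proxy-addon sketch that watches the new message; it mirrors the PacketStatsAddon example from this changeset, and 0x4175 is the GLTF material override method number documented above:

```python
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session

GLTF_OVERRIDE_METHOD = 0x4175


class StreamingSnifferAddon(BaseAddon):
    def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
        if message.name != "GenericStreamingMessage":
            return
        if message["MethodData"]["Method"] != GLTF_OVERRIDE_METHOD:
            return
        # Payload is treated as an opaque blob here; the real override data is LLSD.
        print("GLTF override payload:", len(message["DataBlock"]["Data"]), "bytes")


addons = [StreamingSnifferAddon()]
```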

View File

@@ -29,7 +29,10 @@ from hippolyzer.lib.base.message.msgtypes import MsgType
PACKER = Callable[[Any], bytes]
UNPACKER = Callable[[bytes], Any]
LLSD_PACKER = Callable[[Any], Any]
LLSD_UNPACKER = Callable[[Any], Any]
SPEC = Tuple[UNPACKER, PACKER]
LLSD_SPEC = Tuple[LLSD_UNPACKER, LLSD_PACKER]
def _pack_string(pack_string):
@@ -64,6 +67,21 @@ def _make_tuplecoord_spec(typ: Type[TupleCoord], struct_fmt: str,
return lambda x: typ(*struct_obj.unpack(x)), _packer
def _make_llsd_tuplecoord_spec(typ: Type[TupleCoord], needed_elems: Optional[int] = None):
if needed_elems is None:
# Number of elems needed matches the number in the coord type
def _packer(x):
return list(x)
else:
# Special case, we only want to pack some of the components.
# Mostly for Quaternion since we don't actually need to send W.
def _packer(x):
if not isinstance(x, TupleCoord):
x = typ(*x)
return list(x.data(needed_elems))
return lambda x: typ(*x), _packer
def _unpack_specs(cls):
cls.UNPACKERS = {k: v[0] for (k, v) in cls.SPECS.items()}
cls.PACKERS = {k: v[1] for (k, v) in cls.SPECS.items()}
@@ -110,10 +128,15 @@ class TemplateDataPacker:
class LLSDDataPacker(TemplateDataPacker):
# Some template var types aren't directly representable in LLSD, so they
# get encoded to binary fields.
SPECS = {
SPECS: Dict[MsgType, LLSD_SPEC] = {
MsgType.MVT_IP_ADDR: (socket.inet_ntoa, socket.inet_aton),
# LLSD ints are technically bound to S32 range.
MsgType.MVT_U32: _make_struct_spec('!I'),
MsgType.MVT_U64: _make_struct_spec('!Q'),
MsgType.MVT_S64: _make_struct_spec('!q'),
# These are arrays in LLSD, we need to turn them into coords.
MsgType.MVT_LLVector3: _make_llsd_tuplecoord_spec(Vector3),
MsgType.MVT_LLVector3d: _make_llsd_tuplecoord_spec(Vector3),
MsgType.MVT_LLVector4: _make_llsd_tuplecoord_spec(Vector4),
MsgType.MVT_LLQuaternion: _make_llsd_tuplecoord_spec(Quaternion, needed_elems=3)
}
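A hedged sketch of the new coord specs in use (assumptions: the data_packer module path, and that PACKERS/UNPACKERS are populated on LLSDDataPacker by _unpack_specs just as they are for TemplateDataPacker):

```python
from hippolyzer.lib.base.datatypes import Quaternion, Vector3
from hippolyzer.lib.base.message.data_packer import LLSDDataPacker  # assumed path
from hippolyzer.lib.base.message.msgtypes import MsgType

# Quaternions drop W when packed for LLSD transport...
assert LLSDDataPacker.PACKERS[MsgType.MVT_LLQuaternion](Quaternion(0, 0, 0, 1)) == [0, 0, 0]
# ...and LLSD arrays come back as coord types when unpacked.
vec = LLSDDataPacker.UNPACKERS[MsgType.MVT_LLVector3]([1.0, 2.0, 3.0])
assert isinstance(vec, Vector3)
```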

View File

@@ -341,6 +341,21 @@ class Message:
msg.acks = dict_val['acks']
return msg
@classmethod
def from_eq_event(cls, event) -> Message:
# If this isn't a templated message (like some EQ-only events are),
# then we wrap it in a synthetic `Message` so that the API for handling
# both EQ-only and templated message events can be the same. Ick.
msg = cls(event["message"])
if isinstance(event["body"], dict):
msg.add_block(Block("EventData", **event["body"]))
else:
# Shouldn't be any events that have anything other than a dict
# as a body, but just to be sure...
msg.add_block(Block("EventData", Data=event["body"]))
msg.synthetic = True
return msg
def invalidate_caches(self):
# Don't have any caches if we haven't even parsed
if self.raw_body:
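A quick sketch of the new helper with a made-up EQ-only event (block/field access follows the same pattern as the other examples in this changeset):

```python
from hippolyzer.lib.base.message.message import Message

event = {"message": "SomeEQOnlyEvent", "body": {"Foo": 1, "Bar": "baz"}}
msg = Message.from_eq_event(event)
assert msg.name == "SomeEQOnlyEvent"
assert msg.synthetic
assert msg["EventData"]["Foo"] == 1
```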

View File

@@ -47,7 +47,6 @@ class MsgBlockType:
MBT_SINGLE = 0
MBT_MULTIPLE = 1
MBT_VARIABLE = 2
MBT_String_List = ['Single', 'Multiple', 'Variable']
class PacketFlags(enum.IntFlag):
@@ -55,6 +54,8 @@ class PacketFlags(enum.IntFlag):
RELIABLE = 0x40
RESENT = 0x20
ACK = 0x10
# Not a real flag, just used for display.
EQ = 1 << 10
# frequency for messages
@@ -62,28 +63,23 @@ class PacketFlags(enum.IntFlag):
# = '\xFF\xFF'
# = '\xFF'
# = ''
class MsgFrequency:
FIXED_FREQUENCY_MESSAGE = -1 # marking it
LOW_FREQUENCY_MESSAGE = 4
MEDIUM_FREQUENCY_MESSAGE = 2
HIGH_FREQUENCY_MESSAGE = 1
class MsgFrequency(enum.IntEnum):
FIXED = -1 # marking it
LOW = 4
MEDIUM = 2
HIGH = 1
class MsgTrust:
LL_NOTRUST = 0
LL_TRUSTED = 1
class MsgEncoding(enum.IntEnum):
UNENCODED = 0
ZEROCODED = 1
class MsgEncoding:
LL_UNENCODED = 0
LL_ZEROCODED = 1
class MsgDeprecation:
LL_DEPRECATED = 0
LL_UDPDEPRECATED = 1
LL_UDPBLACKLISTED = 2
LL_NOTDEPRECATED = 3
class MsgDeprecation(enum.IntEnum):
DEPRECATED = 0
UDPDEPRECATED = 1
UDPBLACKLISTED = 2
NOTDEPRECATED = 3
# message variable types

View File

@@ -21,7 +21,7 @@ Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
import typing
from .msgtypes import MsgType, MsgBlockType
from .msgtypes import MsgType, MsgBlockType, MsgFrequency
from ..datatypes import UUID
@@ -105,26 +105,19 @@ class MessageTemplateBlock:
return self.variable_map[name]
class MessageTemplate(object):
frequency_strings = {-1: 'fixed', 1: 'high', 2: 'medium', 4: 'low'} # strings for printout
deprecation_strings = ["Deprecated", "UDPDeprecated", "UDPBlackListed", "NotDeprecated"] # using _as_string methods
encoding_strings = ["Unencoded", "Zerocoded"] # etc
trusted_strings = ["Trusted", "NotTrusted"] # etc LDE 24oct2008
class MessageTemplate:
def __init__(self, name):
self.blocks: typing.List[MessageTemplateBlock] = []
self.block_map: typing.Dict[str, MessageTemplateBlock] = {}
# this is the function or object that will handle this type of message
self.received_count = 0
self.name = name
self.frequency = None
self.msg_num = 0
self.msg_freq_num_bytes = None
self.msg_trust = None
self.msg_deprecation = None
self.msg_encoding = None
self.frequency: typing.Optional[MsgFrequency] = None
self.num = 0
# Frequency + msg num as bytes
self.freq_num_bytes = None
self.trusted = False
self.deprecation = None
self.encoding = None
def add_block(self, block):
self.block_map[block.name] = block
@@ -134,12 +127,6 @@ class MessageTemplate(object):
return self.block_map[name]
def get_msg_freq_num_len(self):
if self.frequency == -1:
if self.frequency == MsgFrequency.FIXED:
return 4
return self.frequency
def get_frequency_as_string(self):
return MessageTemplate.frequency_strings[self.frequency]
def get_deprecation_as_string(self):
return MessageTemplate.deprecation_strings[self.msg_deprecation]

View File

@@ -68,32 +68,32 @@ class TemplateDictionary:
# do a mapping of type to a string for easier reference
frequency_str = ''
if template.frequency == MsgFrequency.FIXED_FREQUENCY_MESSAGE:
if template.frequency == MsgFrequency.FIXED:
frequency_str = "Fixed"
elif template.frequency == MsgFrequency.LOW_FREQUENCY_MESSAGE:
elif template.frequency == MsgFrequency.LOW:
frequency_str = "Low"
elif template.frequency == MsgFrequency.MEDIUM_FREQUENCY_MESSAGE:
elif template.frequency == MsgFrequency.MEDIUM:
frequency_str = "Medium"
elif template.frequency == MsgFrequency.HIGH_FREQUENCY_MESSAGE:
elif template.frequency == MsgFrequency.HIGH:
frequency_str = "High"
self.message_dict[(frequency_str,
template.msg_num)] = template
template.num)] = template
def build_message_ids(self):
for template in list(self.message_templates.values()):
frequency = template.frequency
num_bytes = None
if frequency == MsgFrequency.FIXED_FREQUENCY_MESSAGE:
if frequency == MsgFrequency.FIXED:
# have to do this because Fixed messages are stored as a long in the template
num_bytes = b'\xff\xff\xff' + struct.pack("B", template.msg_num)
elif frequency == MsgFrequency.LOW_FREQUENCY_MESSAGE:
num_bytes = b'\xff\xff' + struct.pack("!H", template.msg_num)
elif frequency == MsgFrequency.MEDIUM_FREQUENCY_MESSAGE:
num_bytes = b'\xff' + struct.pack("B", template.msg_num)
elif frequency == MsgFrequency.HIGH_FREQUENCY_MESSAGE:
num_bytes = struct.pack("B", template.msg_num)
template.msg_freq_num_bytes = num_bytes
num_bytes = b'\xff\xff\xff' + struct.pack("B", template.num)
elif frequency == MsgFrequency.LOW:
num_bytes = b'\xff\xff' + struct.pack("!H", template.num)
elif frequency == MsgFrequency.MEDIUM:
num_bytes = b'\xff' + struct.pack("B", template.num)
elif frequency == MsgFrequency.HIGH:
num_bytes = struct.pack("B", template.num)
template.freq_num_bytes = num_bytes
def get_template_by_name(self, template_name) -> typing.Optional[MessageTemplate]:
return self.message_templates.get(template_name)

View File

@@ -22,7 +22,7 @@ import struct
import re
from . import template
from .msgtypes import MsgFrequency, MsgTrust, MsgEncoding
from .msgtypes import MsgFrequency, MsgEncoding
from .msgtypes import MsgDeprecation, MsgBlockType, MsgType
from ..exc import MessageTemplateParsingError, MessageTemplateNotFound
@@ -112,67 +112,69 @@ class MessageTemplateParser:
frequency = None
freq_str = match.group(2)
if freq_str == 'Low':
frequency = MsgFrequency.LOW_FREQUENCY_MESSAGE
frequency = MsgFrequency.LOW
elif freq_str == 'Medium':
frequency = MsgFrequency.MEDIUM_FREQUENCY_MESSAGE
frequency = MsgFrequency.MEDIUM
elif freq_str == 'High':
frequency = MsgFrequency.HIGH_FREQUENCY_MESSAGE
frequency = MsgFrequency.HIGH
elif freq_str == 'Fixed':
frequency = MsgFrequency.FIXED_FREQUENCY_MESSAGE
frequency = MsgFrequency.FIXED
new_template.frequency = frequency
msg_num = int(match.group(3), 0)
if frequency == MsgFrequency.FIXED_FREQUENCY_MESSAGE:
if frequency == MsgFrequency.FIXED:
# have to do this because Fixed messages are stored as a long in the template
msg_num &= 0xff
msg_num_bytes = struct.pack('!BBBB', 0xff, 0xff, 0xff, msg_num)
elif frequency == MsgFrequency.LOW_FREQUENCY_MESSAGE:
elif frequency == MsgFrequency.LOW:
msg_num_bytes = struct.pack('!BBH', 0xff, 0xff, msg_num)
elif frequency == MsgFrequency.MEDIUM_FREQUENCY_MESSAGE:
elif frequency == MsgFrequency.MEDIUM:
msg_num_bytes = struct.pack('!BB', 0xff, msg_num)
elif frequency == MsgFrequency.HIGH_FREQUENCY_MESSAGE:
elif frequency == MsgFrequency.HIGH:
msg_num_bytes = struct.pack('!B', msg_num)
else:
raise Exception("don't know about frequency %s" % frequency)
new_template.msg_num = msg_num
new_template.msg_freq_num_bytes = msg_num_bytes
new_template.num = msg_num
new_template.freq_num_bytes = msg_num_bytes
msg_trust = None
msg_trust_str = match.group(4)
if msg_trust_str == 'Trusted':
msg_trust = MsgTrust.LL_TRUSTED
msg_trust = True
elif msg_trust_str == 'NotTrusted':
msg_trust = MsgTrust.LL_NOTRUST
msg_trust = False
else:
raise ValueError(f"Invalid trust {msg_trust_str}")
new_template.msg_trust = msg_trust
new_template.trusted = msg_trust
msg_encoding = None
msg_encoding_str = match.group(5)
if msg_encoding_str == 'Unencoded':
msg_encoding = MsgEncoding.LL_UNENCODED
msg_encoding = MsgEncoding.UNENCODED
elif msg_encoding_str == 'Zerocoded':
msg_encoding = MsgEncoding.LL_ZEROCODED
msg_encoding = MsgEncoding.ZEROCODED
else:
raise ValueError(f"Invalid encoding {msg_encoding_str}")
new_template.msg_encoding = msg_encoding
new_template.encoding = msg_encoding
msg_dep = None
msg_dep_str = match.group(7)
if msg_dep_str:
if msg_dep_str == 'Deprecated':
msg_dep = MsgDeprecation.LL_DEPRECATED
msg_dep = MsgDeprecation.DEPRECATED
elif msg_dep_str == 'UDPDeprecated':
msg_dep = MsgDeprecation.LL_UDPDEPRECATED
msg_dep = MsgDeprecation.UDPDEPRECATED
elif msg_dep_str == 'UDPBlackListed':
msg_dep = MsgDeprecation.LL_UDPBLACKLISTED
msg_dep = MsgDeprecation.UDPBLACKLISTED
elif msg_dep_str == 'NotDeprecated':
msg_dep = MsgDeprecation.LL_NOTDEPRECATED
msg_dep = MsgDeprecation.NOTDEPRECATED
else:
msg_dep = MsgDeprecation.LL_NOTDEPRECATED
msg_dep = MsgDeprecation.NOTDEPRECATED
if msg_dep is None:
raise MessageTemplateParsingError("Unknown msg_dep field %s" % match.group(0))
new_template.msg_deprecation = msg_dep
new_template.deprecation = msg_dep
return new_template

View File

@@ -220,11 +220,17 @@ class UDPMessageDeserializer:
if tmpl_variable.probably_binary:
return unpacked_data
# Truncated strings need to be treated carefully
if tmpl_variable.probably_text and unpacked_data.endswith(b"\x00"):
try:
return unpacked_data.decode("utf8").rstrip("\x00")
except UnicodeDecodeError:
return JankStringyBytes(unpacked_data)
if tmpl_variable.probably_text:
# If it has a null terminator, let's try to decode it first.
# We don't want to do this if there isn't one, because that may change
# the meaning of the data.
if unpacked_data.endswith(b"\x00"):
try:
return unpacked_data.decode("utf8").rstrip("\x00")
except UnicodeDecodeError:
pass
# Failed, return jank stringy bytes
return JankStringyBytes(unpacked_data)
elif tmpl_variable.type in {MsgType.MVT_FIXED, MsgType.MVT_VARIABLE}:
# No idea if this should be bytes or a string... make an object that's sort of both.
return JankStringyBytes(unpacked_data)

View File

@@ -69,7 +69,7 @@ class UDPMessageSerializer:
# frequency and message number. The template stores it because it doesn't
# change per template.
body_writer = se.BufferWriter("<")
body_writer.write_bytes(current_template.msg_freq_num_bytes)
body_writer.write_bytes(current_template.freq_num_bytes)
body_writer.write_bytes(msg.extra)
# We're going to pop off keys as we go, so shallow copy the dict.

View File

@@ -1583,12 +1583,13 @@ class BitfieldDataclass(DataclassAdapter):
PRIM_SPEC: ClassVar[Optional[SerializablePrimitive]] = None
def __init__(self, data_cls: Optional[Type] = None,
prim_spec: Optional[SerializablePrimitive] = None, shift: bool = True):
prim_spec: Optional[SerializablePrimitive] = None, shift: Optional[bool] = None):
if not dataclasses.is_dataclass(data_cls):
raise ValueError(f"{data_cls!r} is not a dataclass")
if prim_spec is None:
prim_spec = getattr(data_cls, 'PRIM_SPEC', None)
if shift is None:
shift = getattr(data_cls, 'SHIFT', True)
super().__init__(data_cls, prim_spec)
self._shift = shift
self._bitfield_spec = self._build_bitfield(data_cls)

View File

@@ -12,11 +12,14 @@ import math
import zlib
from typing import *
import numpy as np
import hippolyzer.lib.base.serialization as se
from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.datatypes import UUID, IntEnum, IntFlag, Vector3, Quaternion
from hippolyzer.lib.base.helpers import BiDiDict
from hippolyzer.lib.base.namevalue import NameValuesSerializer
from hippolyzer.lib.base.serialization import ParseContext
class LookupIntEnum(IntEnum):
@@ -37,13 +40,15 @@ class LookupIntEnum(IntEnum):
_ASSET_TYPE_BIDI: BiDiDict[str] = BiDiDict({
"animation": "animatn",
"callingcard": "callcard",
"texture_tga": "txtr_tga",
"image_tga": "img_tga",
"sound_wav": "snd_wav",
"lsl_text": "lsltext",
"lsl_bytecode": "lslbyte",
"texture_tga": "txtr_tga",
"image_tga": "img_tga",
"image_jpeg": "jpg",
"sound_wav": "snd_wav",
"folder_link": "link_f",
"unknown": "invalid",
"none": "-1",
})
@@ -88,6 +93,7 @@ class AssetType(LookupIntEnum):
RESERVED_5 = 54
RESERVED_6 = 55
SETTINGS = 56
MATERIAL = 57
UNKNOWN = 255
NONE = -1
@@ -123,6 +129,7 @@ class AssetType(LookupIntEnum):
AssetType.PERSON: InventoryType.PERSON,
AssetType.MESH: InventoryType.MESH,
AssetType.SETTINGS: InventoryType.SETTINGS,
AssetType.MATERIAL: InventoryType.MATERIAL,
}.get(self, AssetType.NONE)
@@ -164,6 +171,7 @@ class InventoryType(LookupIntEnum):
WIDGET = 23
PERSON = 24
SETTINGS = 25
MATERIAL = 26
UNKNOWN = 255
NONE = -1
@@ -233,10 +241,11 @@ class FolderType(LookupIntEnum):
# Note: We actually *never* create folders with that type. This is used for icon override only.
MARKETPLACE_VERSION = 55
SETTINGS = 56
MATERIAL = 57
# Firestorm folders, may not actually exist in legacy schema
FIRESTORM = 57
PHOENIX = 58
RLV = 59
FIRESTORM = 58
PHOENIX = 59
RLV = 60
# Opensim folders
MY_SUITCASE = 100
NONE = -1
@@ -351,6 +360,7 @@ class ParcelInfoFlags(IntFlag):
class MapImageFlags(IntFlag):
# No clue, honestly. I guess there's potentially different image types you could request.
LAYER = 1 << 1
RETURN_NONEXISTENT = 0x10000
@se.enum_field_serializer("MapBlockReply", "Data", "Access")
@@ -1950,8 +1960,8 @@ class AvatarPropertiesFlags(IntFlag):
@se.flag_field_serializer("AvatarGroupsReply", "GroupData", "GroupPowers")
@se.flag_field_serializer("AvatarGroupDataUpdate", "GroupData", "GroupPowers")
@se.flag_field_serializer("AvatarDataUpdate", "AgentDataData", "GroupPowers")
@se.flag_field_serializer("AgentGroupDataUpdate", "GroupData", "GroupPowers")
@se.flag_field_serializer("AgentDataUpdate", "AgentData", "GroupPowers")
@se.flag_field_serializer("GroupProfileReply", "GroupData", "PowersMask")
@se.flag_field_serializer("GroupRoleDataReply", "RoleData", "Powers")
class GroupPowerFlags(IntFlag):
@@ -2132,6 +2142,43 @@ class ScriptPermissions(IntFlag):
CHANGE_ENVIRONMENT = 1 << 18
@se.flag_field_serializer("ParcelProperties", "ParcelData", "ParcelFlags")
class ParcelFlags(IntFlag):
ALLOW_FLY = 1 << 0 # Can start flying
ALLOW_OTHER_SCRIPTS = 1 << 1 # Scripts by others can run.
FOR_SALE = 1 << 2 # Can buy this land
FOR_SALE_OBJECTS = 1 << 7 # Can buy all objects on this land
ALLOW_LANDMARK = 1 << 3 # Always true/deprecated
ALLOW_TERRAFORM = 1 << 4
ALLOW_DAMAGE = 1 << 5
CREATE_OBJECTS = 1 << 6
# 7 is moved above
USE_ACCESS_GROUP = 1 << 8
USE_ACCESS_LIST = 1 << 9
USE_BAN_LIST = 1 << 10
USE_PASS_LIST = 1 << 11
SHOW_DIRECTORY = 1 << 12
ALLOW_DEED_TO_GROUP = 1 << 13
CONTRIBUTE_WITH_DEED = 1 << 14
SOUND_LOCAL = 1 << 15 # Hear sounds in this parcel only
SELL_PARCEL_OBJECTS = 1 << 16 # Objects on land are included as part of the land when the land is sold
ALLOW_PUBLISH = 1 << 17 # Allow publishing of parcel information on the web
MATURE_PUBLISH = 1 << 18 # The information on this parcel is mature
URL_WEB_PAGE = 1 << 19 # The "media URL" is an HTML page
URL_RAW_HTML = 1 << 20 # The "media URL" is a raw HTML string like <H1>Foo</H1>
RESTRICT_PUSHOBJECT = 1 << 21 # Restrict push object to either on agent or on scripts owned by parcel owner
DENY_ANONYMOUS = 1 << 22 # Deny all non identified/transacted accounts
# DENY_IDENTIFIED = 1 << 23 # Deny identified accounts
# DENY_TRANSACTED = 1 << 24 # Deny transacted accounts
ALLOW_GROUP_SCRIPTS = 1 << 25 # Allow scripts owned by group
CREATE_GROUP_OBJECTS = 1 << 26 # Allow object creation by group members or objects
ALLOW_ALL_OBJECT_ENTRY = 1 << 27 # Allow all objects to enter a parcel
ALLOW_GROUP_OBJECT_ENTRY = 1 << 28 # Only allow group (and owner) objects to enter the parcel
ALLOW_VOICE_CHAT = 1 << 29 # Allow residents to use voice chat on this parcel
USE_ESTATE_VOICE_CHAN = 1 << 30
DENY_AGEUNVERIFIED = 1 << 31 # Prevent residents who aren't age-verified from entering
@se.enum_field_serializer("UpdateMuteListEntry", "MuteData", "MuteType")
class MuteType(IntEnum):
BY_NAME = 0
@@ -2162,20 +2209,130 @@ class MuteFlags(IntFlag):
return 0xF
class CreationDateAdapter(se.Adapter):
class DateAdapter(se.Adapter):
def __init__(self, multiplier: int = 1):
super(DateAdapter, self).__init__(None)
self._multiplier = multiplier
def decode(self, val: Any, ctx: Optional[se.ParseContext], pod: bool = False) -> Any:
return datetime.datetime.fromtimestamp(val / 1_000_000).isoformat()
return datetime.datetime.fromtimestamp(val / self._multiplier).isoformat()
def encode(self, val: Any, ctx: Optional[se.ParseContext]) -> Any:
return int(datetime.datetime.fromisoformat(val).timestamp() * 1_000_000)
return int(datetime.datetime.fromisoformat(val).timestamp() * self._multiplier)
@se.enum_field_serializer("MeanCollisionAlert", "MeanCollision", "Type")
class MeanCollisionType(IntEnum):
INVALID = 0
BUMP = enum.auto()
LLPUSHOBJECT = enum.auto()
SELECTED_OBJECT_COLLIDE = enum.auto()
SCRIPTED_OBJECT_COLLIDE = enum.auto()
PHYSICAL_OBJECT_COLLIDE = enum.auto()
@se.subfield_serializer("ObjectProperties", "ObjectData", "CreationDate")
class CreationDateSerializer(se.AdapterSubfieldSerializer):
ADAPTER = CreationDateAdapter(None)
ADAPTER = DateAdapter(1_000_000)
ORIG_INLINE = True
@se.subfield_serializer("MeanCollisionAlert", "MeanCollision", "Time")
@se.subfield_serializer("ParcelProperties", "ParcelData", "ClaimDate")
class DateSerializer(se.AdapterSubfieldSerializer):
ADAPTER = DateAdapter()
ORIG_INLINE = True
class ParcelGridType(IntEnum):
PUBLIC = 0x00
OWNED = 0x01 # Presumably non-linden owned land
GROUP = 0x02
SELF = 0x03
FOR_SALE = 0x04
AUCTION = 0x05
class ParcelGridFlags(IntFlag):
UNUSED = 0x8
HIDDEN_AVS = 0x10
SOUND_LOCAL = 0x20
WEST_LINE = 0x40
SOUTH_LINE = 0x80
@dataclasses.dataclass
class ParcelGridInfo(se.BitfieldDataclass):
PRIM_SPEC: ClassVar[se.SerializablePrimitive] = se.U8
SHIFT: ClassVar[bool] = False
Type: Union[ParcelGridType, int] = se.bitfield_field(bits=3, adapter=se.IntEnum(ParcelGridType))
Flags: ParcelGridFlags = se.bitfield_field(bits=5, adapter=se.IntFlag(ParcelGridFlags))
@se.subfield_serializer("ParcelOverlay", "ParcelData", "Data")
class ParcelOverlaySerializer(se.SimpleSubfieldSerializer):
TEMPLATE = se.Collection(None, se.BitfieldDataclass(ParcelGridInfo))
class BitmapAdapter(se.Adapter):
def __init__(self, shape: Tuple[int, int]):
super().__init__(None)
self._shape = shape
def encode(self, val: Any, ctx: Optional[ParseContext]) -> Any:
if val and isinstance(val[0], bytes):
return b''.join(val)
return np.packbits(np.array(val, dtype=np.uint8).flatten(), bitorder="little").tobytes()
def decode(self, val: Any, ctx: Optional[ParseContext], pod: bool = False) -> Any:
if pod:
return [val[i:i + (self._shape[1] // 8)] for i in range(0, len(val), (self._shape[1] // 8))]
parcel_bitmap = np.frombuffer(val, dtype=np.uint8)
# This is a boolean array where each bit says whether the parcel occupies that grid.
return np.unpackbits(parcel_bitmap, bitorder="little").reshape(self._shape)
@se.subfield_serializer("ParcelProperties", "ParcelData", "Bitmap")
class ParcelPropertiesBitmapSerializer(se.AdapterSubfieldSerializer):
"""Bitmap that describes which grids a parcel occupies"""
ADAPTER = BitmapAdapter((256 // 4, 256 // 4))
@se.enum_field_serializer("ParcelProperties", "ParcelData", "LandingType")
class LandingType(IntEnum):
NONE = 0
LANDING_POINT = 1
DIRECT = 2
@se.enum_field_serializer("ParcelProperties", "ParcelData", "Status")
class LandOwnershipStatus(IntEnum):
LEASED = 0
LEASE_PENDING = 1
ABANDONED = 2
NONE = -1
@se.enum_field_serializer("ParcelProperties", "ParcelData", "Category")
class LandCategory(IntEnum):
NONE = 0
LINDEN = enum.auto()
ADULT = enum.auto()
ARTS = enum.auto()
BUSINESS = enum.auto()
EDUCATIONAL = enum.auto()
GAMING = enum.auto()
HANGOUT = enum.auto()
NEWCOMER = enum.auto()
PARK = enum.auto()
RESIDENTIAL = enum.auto()
SHOPPING = enum.auto()
STAGE = enum.auto()
OTHER = enum.auto()
ANY = -1
@se.http_serializer("RenderMaterials")
class RenderMaterialsSerializer(se.BaseHTTPSerializer):
@classmethod
@@ -2211,7 +2368,7 @@ class RetrieveNavMeshSrcSerializer(se.BaseHTTPSerializer):
# Beta puppetry stuff, subject to change!
class PuppetryEventMask(enum.IntFlag):
class PuppetryEventMask(IntFlag):
POSITION = 1 << 0
POSITION_IN_PARENT_FRAME = 1 << 1
ROTATION = 1 << 2
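To make the Bitmap shape above concrete: a 256m region with 4m parcel grids is 64x64 cells, so the ParcelProperties Bitmap is 4096 bits (512 bytes). A minimal numpy check mirroring BitmapAdapter.decode():

```python
import numpy as np

raw = bytes(64 * 64 // 8)  # 512 bytes of zeroes, i.e. "parcel occupies no grids"
grid = np.unpackbits(np.frombuffer(raw, dtype=np.uint8), bitorder="little").reshape((64, 64))
assert grid.shape == (64, 64) and not grid.any()
```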

View File

@@ -39,3 +39,7 @@ class MockConnectionHolder(ConnectionHolder):
def __init__(self, circuit, message_handler):
self.circuit = circuit
self.message_handler = message_handler
async def soon(awaitable) -> Message:
return await asyncio.wait_for(awaitable, timeout=1.0)

View File

@@ -269,12 +269,13 @@ class XferManager:
xfer.xfer_id = request_msg["XferID"]["ID"]
packet_id = 0
# TODO: No resend yet. If it's lost, it's lost.
while xfer.chunks:
chunk = xfer.chunks.pop(packet_id)
# EOF if there are no chunks left
packet_val = XferPacket(PacketID=packet_id, IsEOF=not bool(xfer.chunks))
self._connection_holder.circuit.send(Message(
# We just send reliably since I don't care to implement the Xfer-specific
# resend-on-unacked nastiness
_ = self._connection_holder.circuit.send_reliable(Message(
"SendXferPacket",
Block("XferID", ID=xfer.xfer_id, Packet_=packet_val),
Block("DataPacket", Data=chunk),

View File

@@ -29,6 +29,7 @@ from hippolyzer.lib.base.xfer_manager import XferManager
from hippolyzer.lib.client.asset_uploader import AssetUploader
from hippolyzer.lib.client.inventory_manager import InventoryManager
from hippolyzer.lib.client.object_manager import ClientObjectManager, ClientWorldObjectManager
from hippolyzer.lib.client.parcel_manager import ParcelManager
from hippolyzer.lib.client.state import BaseClientSession, BaseClientRegion, BaseClientSessionManager
@@ -41,10 +42,16 @@ class StartLocation(StringEnum):
class ClientSettings(Settings):
# Off by default for now, the cert validation is a big mess due to LL using an internal CA.
SSL_VERIFY: bool = SettingDescriptor(False)
"""Off by default for now, the cert validation is a big mess due to LL using an internal CA."""
SSL_CERT_PATH: str = SettingDescriptor(get_resource_filename("lib/base/network/data/ca-bundle.crt"))
USER_AGENT: str = SettingDescriptor(f"Hippolyzer/v{version('hippolyzer')}")
SEND_AGENT_UPDATES: bool = SettingDescriptor(True)
"""Generally you want to send these, lots of things will break if you don't send at least one."""
AUTO_REQUEST_PARCELS: bool = SettingDescriptor(True)
"""Automatically request all parcel details when connecting to a region"""
AUTO_REQUEST_MATERIALS: bool = SettingDescriptor(True)
"""Automatically request all materials when connecting to a region"""
class HippoCapsClient(CapsClient):
@@ -106,7 +113,7 @@ class HippoClientProtocol(asyncio.DatagramProtocol):
class HippoClientRegion(BaseClientRegion):
def __init__(self, circuit_addr, seed_cap: str, session: HippoClientSession, handle=None):
def __init__(self, circuit_addr, seed_cap: Optional[str], session: HippoClientSession, handle=None):
super().__init__()
self.caps = multidict.MultiDict()
self.message_handler: MessageHandler[Message, str] = MessageHandler(take_by_default=False)
@@ -119,6 +126,7 @@ class HippoClientRegion(BaseClientRegion):
self.xfer_manager = XferManager(proxify(self), self.session().secure_session_id)
self.transfer_manager = TransferManager(proxify(self), session.agent_id, session.id)
self.asset_uploader = AssetUploader(proxify(self))
self.parcel_manager = ParcelManager(proxify(self))
self.objects = ClientObjectManager(self)
self._llsd_serializer = LLSDMessageSerializer()
self._eq_task: Optional[asyncio.Task] = None
@@ -203,11 +211,34 @@ class HippoClientRegion(BaseClientRegion):
)
)
)
if self.session().session_manager.settings.SEND_AGENT_UPDATES:
# Usually we want to send at least one, since lots of messages will never be sent by the sim
# until we send at least one AgentUpdate. For example, ParcelOverlay and LayerData.
await self.circuit.send_reliable(
Message(
"AgentUpdate",
Block(
'AgentData',
AgentID=self.session().agent_id,
SessionID=self.session().id,
# Don't really care about the other fields.
fill_missing=True,
)
)
)
async with seed_resp_fut as seed_resp:
seed_resp.raise_for_status()
self.update_caps(await seed_resp.read_llsd())
self._eq_task = asyncio.create_task(self._poll_event_queue())
settings = self.session().session_manager.settings
if settings.AUTO_REQUEST_PARCELS:
_ = asyncio.create_task(self.parcel_manager.request_dirty_parcels())
if settings.AUTO_REQUEST_MATERIALS:
_ = asyncio.create_task(self.objects.request_all_materials())
except Exception as e:
# Let consumers who were `await`ing the connected signal know there was an error
if not self.connected.done():
@@ -254,17 +285,7 @@ class HippoClientRegion(BaseClientRegion):
if self._llsd_serializer.can_handle(event["message"]):
msg = self._llsd_serializer.deserialize(event)
else:
# If this isn't a templated message (like some EQ-only events are),
# then we wrap it in a synthetic `Message` so that the API for handling
# both EQ-only and templated message events can be the same. Ick.
msg = Message(event["message"])
if isinstance(event["body"], dict):
msg.add_block(Block("EventData", **event["body"]))
else:
# Shouldn't be any events that have anything other than a dict
# as a body, but just to be sure...
msg.add_block(Block("EventData", Data=event["body"]))
msg.synthetic = True
msg = Message.from_eq_event(event)
msg.sender = self.circuit_addr
msg.direction = Direction.IN
self.session().message_handler.handle(msg)
@@ -289,6 +310,7 @@ class HippoClientSession(BaseClientSession):
region_by_circuit_addr: Callable[[ADDR_TUPLE], Optional[HippoClientRegion]]
regions: List[HippoClientRegion]
session_manager: HippoClient
main_region: Optional[HippoClientRegion]
def __init__(self, id, secure_session_id, agent_id, circuit_code, session_manager: Optional[HippoClient] = None,
login_data=None):
@@ -581,7 +603,8 @@ class HippoClient(BaseClientSessionManager):
password: str,
login_uri: Optional[str] = None,
agree_to_tos: bool = False,
start_location: Union[StartLocation, str, None] = StartLocation.LAST
start_location: Union[StartLocation, str, None] = StartLocation.LAST,
connect: bool = True,
):
if self.session:
raise RuntimeError("Already logged in!")
@@ -638,10 +661,13 @@ class HippoClient(BaseClientSessionManager):
self.session.transport, self.session.protocol = await self._create_transport()
self._resend_task = asyncio.create_task(self._attempt_resends())
self.session.message_handler.subscribe("AgentDataUpdate", self._handle_agent_data_update)
self.session.message_handler.subscribe("AgentGroupDataUpdate", self._handle_agent_group_data_update)
assert self.session.open_circuit(self.session.regions[-1].circuit_addr)
region = self.session.regions[-1]
await region.connect(main_region=True)
if connect:
region = self.session.regions[-1]
await region.connect(main_region=True)
def logout(self):
if not self.session:
@@ -729,3 +755,11 @@ class HippoClient(BaseClientSessionManager):
continue
region.circuit.resend_unacked()
await asyncio.sleep(0.5)
def _handle_agent_data_update(self, msg: Message):
self.session.active_group = msg["AgentData"]["ActiveGroupID"]
def _handle_agent_group_data_update(self, msg: Message):
self.session.groups.clear()
for block in msg["GroupData"]:
self.session.groups.add(block["GroupID"])
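A hedged sketch of how the new settings and group tracking fit together (attribute names inferred from the diff above; nothing here is required, the defaults already behave this way):

```python
def configure(client):
    # HippoClient is the session_manager referenced above, so its ClientSettings
    # instance is assumed to live on `client.settings`.
    client.settings.SEND_AGENT_UPDATES = True       # sim withholds ParcelOverlay/LayerData until one is sent
    client.settings.AUTO_REQUEST_PARCELS = True     # schedules parcel_manager.request_dirty_parcels() on connect
    client.settings.AUTO_REQUEST_MATERIALS = False  # skip the bulk RenderMaterials fetch
    # After login, client.session.active_group and client.session.groups are kept
    # current by the new AgentDataUpdate / AgentGroupDataUpdate handlers.
```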

View File

@@ -86,6 +86,7 @@ class InventoryManager:
self.model.add(cached_item)
def _parse_cache(self, path: Union[str, Path]) -> Tuple[List[InventoryCategory], List[InventoryItem]]:
"""Warning, may be incredibly slow due to llsd.parse_notation() behavior"""
categories: List[InventoryCategory] = []
items: List[InventoryItem] = []
# Parse our cached items and categories out of the compressed inventory cache

View File

@@ -28,6 +28,7 @@ from hippolyzer.lib.base.objects import (
from hippolyzer.lib.base.settings import Settings
from hippolyzer.lib.client.namecache import NameCache, NameCacheEntry
from hippolyzer.lib.base.templates import PCode, ObjectStateSerializer
from hippolyzer.lib.base import llsd
if TYPE_CHECKING:
from hippolyzer.lib.client.state import BaseClientRegion, BaseClientSession
@@ -35,10 +36,11 @@ if TYPE_CHECKING:
LOG = logging.getLogger(__name__)
OBJECT_OR_LOCAL = Union[Object, int]
MATERIAL_MAP_TYPE = Dict[UUID, dict]
class ObjectUpdateType(enum.IntEnum):
OBJECT_UPDATE = enum.auto()
UPDATE = enum.auto()
PROPERTIES = enum.auto()
FAMILY = enum.auto()
COSTS = enum.auto()
@@ -50,12 +52,13 @@ class ClientObjectManager:
Object manager for a specific region
"""
__slots__ = ("_region", "_world_objects", "state", "__weakref__")
__slots__ = ("_region", "_world_objects", "state", "__weakref__", "_requesting_all_mats_lock")
def __init__(self, region: BaseClientRegion):
self._region: BaseClientRegion = proxify(region)
self._world_objects: ClientWorldObjectManager = proxify(region.session().objects)
self.state: RegionObjectsState = RegionObjectsState()
self._requesting_all_mats_lock = asyncio.Lock()
def __len__(self):
return len(self.state.localid_lookup)
@@ -163,9 +166,56 @@ class ClientObjectManager:
futures = []
for local_id in local_ids:
futures.append(self.state.register_future(local_id, ObjectUpdateType.OBJECT_UPDATE))
futures.append(self.state.register_future(local_id, ObjectUpdateType.UPDATE))
return futures
async def request_all_materials(self) -> MATERIAL_MAP_TYPE:
"""
Request all materials within the sim
Sigh, yes, this is best practice per indra :(
"""
if self._requesting_all_mats_lock.locked():
# We're already requesting all materials, wait until the lock is free
# and just return what was returned.
async with self._requesting_all_mats_lock:
return self.state.materials
async with self._requesting_all_mats_lock:
async with self._region.caps_client.get("RenderMaterials") as resp:
resp.raise_for_status()
# Clear out all previous materials, this is a complete response.
self.state.materials.clear()
self._process_materials_response(await resp.read())
return self.state.materials
async def request_materials(self, material_ids: Sequence[UUID]) -> MATERIAL_MAP_TYPE:
if self._requesting_all_mats_lock.locked():
# Just wait for the in-flight request for all materials to complete
# if we have one in flight.
async with self._requesting_all_mats_lock:
# Wait for the lock to be released
pass
not_found = set(x for x in material_ids if (x not in self.state.materials))
if not_found:
# Request any materials we don't already have, if there were any
data = {"Zipped": llsd.zip_llsd([x.bytes for x in material_ids])}
async with self._region.caps_client.post("RenderMaterials", data=data) as resp:
resp.raise_for_status()
self._process_materials_response(await resp.read())
# build up a dict of just the requested mats
mats = {}
for mat_id in material_ids:
mats[mat_id] = self.state.materials[mat_id]
return mats
def _process_materials_response(self, response: bytes):
entries = llsd.unzip_llsd(llsd.parse_xml(response)["Zipped"])
for entry in entries:
self.state.materials[UUID(bytes=entry["ID"])] = entry["Material"]
class ObjectEvent:
__slots__ = ("object", "updated", "update_type")
@@ -361,7 +411,7 @@ class ClientWorldObjectManager:
if obj.PCode == PCode.AVATAR:
self._avatar_objects[obj.FullID] = obj
self._rebuild_avatar_objects()
self._run_object_update_hooks(obj, set(obj.to_dict().keys()), ObjectUpdateType.OBJECT_UPDATE, msg)
self._run_object_update_hooks(obj, set(obj.to_dict().keys()), ObjectUpdateType.UPDATE, msg)
def _kill_object_by_local_id(self, region_state: RegionObjectsState, local_id: int):
obj = region_state.lookup_localid(local_id)
@@ -413,7 +463,7 @@ class ClientWorldObjectManager:
# our view of the world then we want to move it to this region.
obj = self.lookup_fullid(object_data["FullID"])
if obj:
self._update_existing_object(obj, object_data, ObjectUpdateType.OBJECT_UPDATE, msg)
self._update_existing_object(obj, object_data, ObjectUpdateType.UPDATE, msg)
else:
if region_state is None:
continue
@@ -437,7 +487,7 @@ class ClientWorldObjectManager:
# Need the Object as context because decoding state requires PCode.
state_deserializer = ObjectStateSerializer.deserialize
object_data["State"] = state_deserializer(ctx_obj=obj, val=object_data["State"])
self._update_existing_object(obj, object_data, ObjectUpdateType.OBJECT_UPDATE, msg)
self._update_existing_object(obj, object_data, ObjectUpdateType.UPDATE, msg)
else:
if region_state:
region_state.missing_locals.add(object_data["LocalID"])
@@ -465,7 +515,7 @@ class ClientWorldObjectManager:
self._update_existing_object(obj, {
"UpdateFlags": update_flags,
"RegionHandle": handle,
}, ObjectUpdateType.OBJECT_UPDATE, msg)
}, ObjectUpdateType.UPDATE, msg)
continue
cached_obj_data = self._lookup_cache_entry(handle, block["ID"], block["CRC"])
@@ -504,7 +554,7 @@ class ClientWorldObjectManager:
LOG.warning(f"Got ObjectUpdateCompressed for unknown region {handle}: {object_data!r}")
obj = self.lookup_fullid(object_data["FullID"])
if obj:
self._update_existing_object(obj, object_data, ObjectUpdateType.OBJECT_UPDATE, msg)
self._update_existing_object(obj, object_data, ObjectUpdateType.UPDATE, msg)
else:
if region_state is None:
continue
@@ -654,13 +704,14 @@ class RegionObjectsState:
__slots__ = (
"handle", "missing_locals", "_orphans", "localid_lookup", "coarse_locations",
"_object_futures"
"_object_futures", "materials"
)
def __init__(self):
self.missing_locals = set()
self.localid_lookup: Dict[int, Object] = {}
self.coarse_locations: Dict[UUID, Vector3] = {}
self.materials: MATERIAL_MAP_TYPE = {}
self._object_futures: Dict[Tuple[int, int], List[asyncio.Future]] = {}
self._orphans: Dict[int, List[int]] = collections.defaultdict(list)
@@ -673,6 +724,7 @@ class RegionObjectsState:
self.coarse_locations.clear()
self.missing_locals.clear()
self.localid_lookup.clear()
self.materials.clear()
def lookup_localid(self, localid: int) -> Optional[Object]:
return self.localid_lookup.get(localid)
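A hedged usage sketch of the new material requests (assumes an already-connected HippoClient as in the hello-client example; client.main_region.objects is the region's ClientObjectManager):

```python
async def dump_materials(client):
    # Fetch and cache every RenderMaterial in the sim (a complete response clears old state)
    materials = await client.main_region.objects.request_all_materials()
    print(f"Sim has {len(materials)} render materials")
    # Targeted requests only hit the cap for IDs we don't already have cached
    subset = await client.main_region.objects.request_materials(list(materials)[:2])
    return subset
```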

View File

@@ -0,0 +1,251 @@
import asyncio
import dataclasses
import logging
from typing import *
import numpy as np
from hippolyzer.lib.base.datatypes import UUID, Vector3, Vector2
from hippolyzer.lib.base.message.message import Message, Block
from hippolyzer.lib.base.templates import ParcelGridFlags, ParcelFlags
from hippolyzer.lib.client.state import BaseClientRegion
LOG = logging.getLogger(__name__)
@dataclasses.dataclass
class Parcel:
local_id: int
name: str
flags: ParcelFlags
group_id: UUID
# TODO: More properties
class ParcelManager:
# We expect to receive this number of ParcelOverlay messages
NUM_CHUNKS = 4
# No, we don't support varregion or whatever.
REGION_SIZE = 256
# Basically, the minimum parcel size is 4 on either axis so each "point" in the
# ParcelOverlay represents an area this size
GRID_STEP = 4
GRIDS_PER_EDGE = REGION_SIZE // GRID_STEP
def __init__(self, region: BaseClientRegion):
# dimensions are south to north, west to east
self.overlay = np.zeros((self.GRIDS_PER_EDGE, self.GRIDS_PER_EDGE), dtype=np.uint8)
# 1-indexed parcel list index
self.parcel_indices = np.zeros((self.GRIDS_PER_EDGE, self.GRIDS_PER_EDGE), dtype=np.uint16)
self.parcels: List[Optional[Parcel]] = []
self.overlay_chunks: List[Optional[bytes]] = [None] * self.NUM_CHUNKS
self.overlay_complete = asyncio.Event()
self.parcels_downloaded = asyncio.Event()
self._parcels_dirty: bool = True
self._region = region
self._next_seq = 1
self._region.message_handler.subscribe("ParcelOverlay", self._handle_parcel_overlay)
def _handle_parcel_overlay(self, message: Message):
self.add_overlay_chunk(message["ParcelData"]["Data"], message["ParcelData"]["SequenceID"])
def add_overlay_chunk(self, chunk: bytes, chunk_num: int) -> bool:
self.overlay_chunks[chunk_num] = chunk
# Still have some pending chunks, don't try to parse this yet
if not all(self.overlay_chunks):
return False
new_overlay_data = b"".join(self.overlay_chunks)
self.overlay_chunks = [None] * self.NUM_CHUNKS
self._parcels_dirty = False
if new_overlay_data != self.overlay.data[:]:
# If the raw data doesn't match, then we have to parse again
new_data = np.frombuffer(new_overlay_data, dtype=np.uint8).reshape(self.overlay.shape)
np.copyto(self.overlay, new_data)
self._parse_overlay()
# We could optimize this by just marking specific squares dirty
# if the parcel indices have changed between parses, but I don't care
# to do that.
self._parcels_dirty = True
self.parcels_downloaded.clear()
if not self.overlay_complete.is_set():
self.overlay_complete.set()
return True
@classmethod
def _pos_to_grid_coords(cls, pos: Vector3) -> Tuple[int, int]:
return round(pos.Y // cls.GRID_STEP), round(pos.X // cls.GRID_STEP)
def _parse_overlay(self):
# Zero out all parcel indices
self.parcel_indices[:, :] = 0
next_parcel_idx = 1
for y in range(0, self.GRIDS_PER_EDGE):
for x in range(0, self.GRIDS_PER_EDGE):
# We already have a parcel index for this grid, continue
if self.parcel_indices[y, x]:
continue
# Fill all adjacent grids with this parcel index
self._flood_fill_parcel_index(y, x, next_parcel_idx)
# SL doesn't allow disjoint grids to be part of the same parcel, so
# whatever grid we find next without a parcel index must be a new parcel
next_parcel_idx += 1
# Should have found at least one parcel
assert next_parcel_idx >= 2
# The number of parcels has changed, so we can't reuse the existing parcel objects;
# it's unlikely that only the parcel boundaries have moved.
if len(self.parcels) != next_parcel_idx - 1:
# We don't know about any of these parcels yet, fill with none
self.parcels = [None] * (next_parcel_idx - 1)
def _flood_fill_parcel_index(self, start_y, start_x, parcel_idx):
"""Flood fill all neighboring grids with the parcel index, being mindful of parcel boundaries"""
# We know the start grid is assigned to this parcel index
self.parcel_indices[start_y, start_x] = parcel_idx
# Queue of grids to test the neighbors of, start with the start grid.
neighbor_test_queue: List[Tuple[int, int]] = [(start_y, start_x)]
while neighbor_test_queue:
to_test = neighbor_test_queue.pop(0)
test_grid = self.overlay[to_test]
for direction in ((-1, 0), (1, 0), (0, -1), (0, 1)):
new_pos = to_test[0] + direction[0], to_test[1] + direction[1]
if any(x < 0 or x >= self.GRIDS_PER_EDGE for x in new_pos):
# Outside bounds
continue
if self.parcel_indices[new_pos]:
# Already set, skip
continue
if direction[0] == -1 and test_grid & ParcelGridFlags.SOUTH_LINE:
# Test grid is already on a south line, can't go south.
continue
if direction[1] == -1 and test_grid & ParcelGridFlags.WEST_LINE:
# Test grid is already on a west line, can't go west.
continue
grid = self.overlay[new_pos]
if direction[0] == 1 and grid & ParcelGridFlags.SOUTH_LINE:
# Hit a south line going north, this is outside the current parcel
continue
if direction[1] == 1 and grid & ParcelGridFlags.WEST_LINE:
# Hit a west line going east, this is outside the current parcel
continue
# This grid is within the current parcel, set the parcel index
self.parcel_indices[new_pos] = parcel_idx
# Append the grid to the neighbor test queue
neighbor_test_queue.append(new_pos)
async def request_dirty_parcels(self) -> Tuple[Parcel, ...]:
if self._parcels_dirty:
return await self.request_all_parcels()
return tuple(self.parcels)
async def request_all_parcels(self) -> Tuple[Parcel, ...]:
await self.overlay_complete.wait()
# Because of how we build up the parcel index map, it's safe for us to
# do this instead of keeping track of seen IDs in a set or similar
last_seen_parcel_index = 0
futs = []
for y in range(0, self.GRIDS_PER_EDGE):
for x in range(0, self.GRIDS_PER_EDGE):
parcel_index = self.parcel_indices[y, x]
assert parcel_index != 0
if parcel_index <= last_seen_parcel_index:
continue
assert parcel_index == last_seen_parcel_index + 1
last_seen_parcel_index = parcel_index
# Request a position within the parcel
futs.append(self.request_parcel_properties(
Vector2(x * self.GRID_STEP + 1.0, y * self.GRID_STEP + 1.0)
))
# Wait for all parcel properties to come in
await asyncio.gather(*futs)
self.parcels_downloaded.set()
self._parcels_dirty = False
return tuple(self.parcels)
async def request_parcel_properties(self, pos: Vector2) -> Parcel:
await self.overlay_complete.wait()
seq_id = self._next_seq
# Register a wait on a ParcelProperties matching this seq
parcel_props_fut = self._region.message_handler.wait_for(
("ParcelProperties",),
predicate=lambda msg: msg["ParcelData"]["SequenceID"] == seq_id,
timeout=10.0,
)
# We don't care when the ack comes in, only when we receive the parcel props
_ = self._region.circuit.send_reliable(Message(
"ParcelPropertiesRequest",
Block("AgentData", AgentID=self._region.session().agent_id, SessionID=self._region.session().id),
Block(
"ParcelData",
SequenceID=seq_id,
West=pos.X,
East=pos.X,
North=pos.Y,
South=pos.Y,
# What does this even mean?
SnapSelection=0,
),
))
self._next_seq += 1
return self._process_parcel_properties(await parcel_props_fut, pos)
def _process_parcel_properties(self, parcel_props: Message, pos: Optional[Vector2] = None) -> Parcel:
data_block = parcel_props["ParcelData"][0]
grid_coord = None
# Parcel indices are one-indexed, convert to zero-indexed.
if pos is not None:
# We have a pos, figure out where in the grid we should look for the parcel index
grid_coord = self._pos_to_grid_coords(pos)
else:
# Need to look at the parcel bitmap to figure out a valid grid coord.
# This is a boolean array where each bit says whether the parcel occupies that grid.
parcel_bitmap = data_block.deserialize_var("Bitmap")
for y in range(self.GRIDS_PER_EDGE):
for x in range(self.GRIDS_PER_EDGE):
if parcel_bitmap[y, x]:
# This is the first grid the parcel occupies per the bitmap
grid_coord = y, x
break
if grid_coord:
break
parcel = Parcel(
local_id=data_block["LocalID"],
name=data_block["Name"],
flags=ParcelFlags(data_block["ParcelFlags"]),
group_id=data_block["GroupID"],
# Parcel UUID isn't in this response :/
)
# I guess the bitmap _could_ be empty, but probably not.
if grid_coord is not None:
parcel_idx = self.parcel_indices[grid_coord] - 1
if len(self.parcels) > parcel_idx >= 0:
# Okay, parcels list is sane, place the parcel in there.
self.parcels[parcel_idx] = parcel
else:
LOG.warning(f"Received ParcelProperties with incomplete overlay for {grid_coord!r}")
return parcel
async def get_parcel_at(self, pos: Vector2, request_if_missing: bool = True) -> Optional[Parcel]:
grid_coord = self._pos_to_grid_coords(pos)
parcel = None
if parcel_idx := self.parcel_indices[grid_coord]:
parcel = self.parcels[parcel_idx - 1]
if request_if_missing and parcel is None:
return await self.request_parcel_properties(pos)
return parcel
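The manager above drives everything from the ParcelOverlay subscription: once all NUM_CHUNKS chunks arrive, the SOUTH_LINE/WEST_LINE flags are flood-filled into per-grid parcel indices, after which parcels can be fetched by position. A minimal usage sketch follows, assuming `region` is an already-connected BaseClientRegion-style object; it only exercises the public methods defined above.

from hippolyzer.lib.base.datatypes import Vector2
from hippolyzer.lib.client.parcel_manager import ParcelManager

async def dump_parcels(region) -> None:
    # The constructor subscribes to ParcelOverlay; chunks are assembled automatically.
    manager = ParcelManager(region)
    # Block until all overlay chunks have arrived and been flood-filled into indices.
    await manager.overlay_complete.wait()
    # Fetch ParcelProperties for every parcel the overlay knows about.
    for parcel in await manager.request_all_parcels():
        print(parcel.local_id, parcel.name, parcel.flags)
    # Or look up a single parcel by region-local position, requesting it if unknown.
    parcel = await manager.get_parcel_at(Vector2(128.0, 128.0))
    print(parcel.name if parcel else "no parcel here")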

View File

@@ -82,6 +82,8 @@ class BaseClientSession(abc.ABC):
id: UUID
agent_id: UUID
secure_session_id: UUID
active_group: UUID
groups: Set[UUID]
message_handler: MessageHandler[Message, str]
regions: MutableSequence[BaseClientRegion]
region_by_handle: Callable[[int], Optional[BaseClientRegion]]
@@ -100,6 +102,8 @@ class BaseClientSession(abc.ABC):
self.circuit_code = circuit_code
self.global_caps = {}
self.session_manager = session_manager
self.active_group: UUID = UUID.ZERO
self.groups: Set[UUID] = set()
self.regions = []
self._main_region = None
self.message_handler: MessageHandler[Message, str] = MessageHandler()

View File

@@ -16,6 +16,8 @@ import mitmproxy.http
from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.llsd_msg_serializer import LLSDMessageSerializer
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.network.transport import Direction
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.caps import CapData, CapType
@@ -32,6 +34,9 @@ def apply_security_monkeypatches():
apply_security_monkeypatches()
LOG = logging.getLogger(__name__)
class MITMProxyEventManager:
"""
Handles HTTP request and response events from the mitmproxy process
@@ -58,7 +63,7 @@ class MITMProxyEventManager:
try:
await self.pump_proxy_event()
except:
logging.exception("Exploded when handling parsed packets")
LOG.exception("Exploded when handling parsed packets")
async def pump_proxy_event(self):
try:
@@ -140,7 +145,7 @@ class MITMProxyEventManager:
# Both the wrapper request and the actual asset server request went through
# the proxy. Don't bother trying the redirect strategy anymore.
self._asset_server_proxied = True
logging.warning("noproxy not used, switching to URI rewrite strategy")
LOG.warning("noproxy not used, switching to URI rewrite strategy")
elif cap_data and cap_data.cap_name == "EventQueueGet":
# HACK: The sim's EQ acking mechanism doesn't seem to actually work.
# if the client drops the connection due to timeout before we can
@@ -151,7 +156,7 @@ class MITMProxyEventManager:
eq_manager = cap_data.region().eq_manager
cached_resp = eq_manager.get_cached_poll_response(req_ack_id)
if cached_resp:
logging.warning("Had to serve a cached EventQueueGet due to client desync")
LOG.warning("Had to serve a cached EventQueueGet due to client desync")
flow.response = mitmproxy.http.Response.make(
200,
llsd.format_xml(cached_resp),
@@ -215,7 +220,7 @@ class MITMProxyEventManager:
try:
message_logger.log_http_response(flow)
except:
logging.exception("Failed while logging HTTP flow")
LOG.exception("Failed while logging HTTP flow")
# Don't process responses for requests or responses injected by the proxy.
# We already processed it, it came from us!
@@ -274,13 +279,13 @@ class MITMProxyEventManager:
if cap_data.cap_name == "Seed":
parsed = llsd.parse_xml(flow.response.content)
logging.debug("Got seed cap for %r : %r" % (cap_data, parsed))
LOG.debug("Got seed cap for %r : %r" % (cap_data, parsed))
region.update_caps(parsed)
# On LL's grid these URIs aren't unique across sessions or regions,
# so we get request attribution by replacing them with a unique
# alias URI.
logging.debug("Replacing GetMesh caps with wrapped versions")
LOG.debug("Replacing GetMesh caps with wrapped versions")
wrappable_caps = {"GetMesh2", "GetMesh", "GetTexture", "ViewerAsset"}
for cap_name in wrappable_caps:
if cap_name in parsed:
@@ -315,7 +320,7 @@ class MITMProxyEventManager:
if "uploader" in parsed:
region.register_cap(cap_data.cap_name + "Uploader", parsed["uploader"], CapType.TEMPORARY)
except:
logging.exception("OOPS, blew up in HTTP proxy!")
LOG.exception("OOPS, blew up in HTTP proxy!")
def _handle_login_flow(self, flow: HippoHTTPFlow):
resp = xmlrpc.client.loads(flow.response.content)[0][0] # type: ignore
@@ -324,20 +329,30 @@ class MITMProxyEventManager:
flow.cap_data = CapData("LoginRequest", session=weakref.ref(sess))
def _handle_eq_event(self, session: Session, region: ProxiedRegion, event: Dict[str, Any]):
logging.debug("Event received on %r: %r" % (self, event))
LOG.debug("Event received on %r: %r" % (self, event))
message_logger = self.session_manager.message_logger
if message_logger:
message_logger.log_eq_event(session, region, event)
if self.llsd_message_serializer.can_handle(event["message"]):
msg = self.llsd_message_serializer.deserialize(event)
else:
msg = Message.from_eq_event(event)
msg.sender = region.circuit_addr
msg.direction = Direction.IN
try:
region.message_handler.handle(msg)
except:
LOG.exception("Failed while handling EQ message")
handle_event = AddonManager.handle_eq_event(session, region, event)
if handle_event is True:
# Addon handled the event and didn't want it sent to the viewer
return True
msg = None
# Handle events that inform us about new regions
sim_addr, sim_handle, sim_seed = None, None, None
if self.llsd_message_serializer.can_handle(event["message"]):
msg = self.llsd_message_serializer.deserialize(event)
# Sim is asking us to talk to a neighbour
if event["message"] == "EstablishAgentCommunication":
ip_split = event["body"]["sim-ip-and-port"].split(":")
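Because _handle_eq_event now converts each event into a Message and pushes it through region.message_handler (per the change above), event-queue messages can be consumed with the same subscription API as UDP messages. A small hedged sketch, assuming `region` is a live ProxiedRegion; the handler body is illustrative only.

from hippolyzer.lib.base.message.message import Message

def _on_parcel_properties(msg: Message) -> None:
    # ParcelProperties arrives over the EQ but is dispatched here like any other message.
    print("Saw parcel:", msg["ParcelData"]["Name"])

# Assumed setup: `region` is an existing ProxiedRegion for the current session.
region.message_handler.subscribe("ParcelProperties", _on_parcel_properties)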

View File

@@ -8,6 +8,7 @@ import queue
import typing
import uuid
import weakref
from typing import Iterable
import mitmproxy.certs
import mitmproxy.ctx
@@ -15,7 +16,10 @@ import mitmproxy.log
import mitmproxy.master
import mitmproxy.options
import mitmproxy.proxy
from cryptography import x509
from cryptography.x509 import GeneralNames
from mitmproxy.addons import core, clientplayback, proxyserver, next_layer, disable_h2c
from mitmproxy.certs import CertStoreEntry
from mitmproxy.http import HTTPFlow
from mitmproxy.proxy.layers import tls
import OpenSSL
@@ -26,9 +30,16 @@ from hippolyzer.lib.proxy.caps import SerializedCapData
class SLCertStore(mitmproxy.certs.CertStore):
def get_cert(self, commonname: typing.Optional[str], sans: typing.List[str], *args, **kwargs):
def get_cert(
self,
commonname: str | None,
sans: Iterable[x509.GeneralName],
organization: str | None = None,
*args,
**kwargs
) -> CertStoreEntry:
entry = super().get_cert(commonname, sans, *args, **kwargs)
cert, privkey, chain = entry.cert, entry.privatekey, entry.chain_file
cert, privkey, chain, chain_certs = entry.cert, entry.privatekey, entry.chain_file, entry.chain_certs
x509 = cert.to_pyopenssl()
# The cert must have a subject key ID or the viewer will reject it.
for i in range(0, x509.get_extension_count()):
@@ -48,10 +59,10 @@ class SLCertStore(mitmproxy.certs.CertStore):
])
x509.sign(OpenSSL.crypto.PKey.from_cryptography_key(privkey), "sha256") # type: ignore
new_entry = mitmproxy.certs.CertStoreEntry(
mitmproxy.certs.Cert.from_pyopenssl(x509), privkey, chain
mitmproxy.certs.Cert.from_pyopenssl(x509), privkey, chain, chain_certs,
)
# Replace the cert that was created in the base `get_cert()` with our modified cert
self.certs[(commonname, tuple(sans))] = new_entry
self.certs[(commonname, GeneralNames(sans))] = new_entry
self.expire_queue.pop(-1)
self.expire(new_entry)
return new_entry

View File

@@ -1,5 +1,5 @@
import asyncio
import datetime as dt
import logging
from hippolyzer.lib.base.helpers import get_mtime
from hippolyzer.lib.client.inventory_manager import InventoryManager
@@ -12,6 +12,8 @@ class ProxyInventoryManager(InventoryManager):
super().__init__(session)
newest_cache = None
newest_timestamp = dt.datetime(year=1970, month=1, day=1, tzinfo=dt.timezone.utc)
# So consumers know when the inventory should be complete
self.cache_loaded: asyncio.Event = asyncio.Event()
# Look for the newest version of the cached inventory and use that.
# Not foolproof, but close enough if we're not sure what viewer is being used.
for cache_dir in iter_viewer_cache_dirs():
@@ -26,7 +28,8 @@ class ProxyInventoryManager(InventoryManager):
newest_cache = inv_cache_path
if newest_cache:
try:
self.load_cache(newest_cache)
except:
logging.exception("Failed to load invcache")
cache_load_fut = asyncio.ensure_future(asyncio.to_thread(self.load_cache, newest_cache))
# Meh. Don't care if it fails.
cache_load_fut.add_done_callback(lambda *args: self.cache_loaded.set())
else:
self.cache_loaded.set()
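The change above moves the blocking inventory-cache parse off the event loop: load_cache runs in a worker thread via asyncio.to_thread, and cache_loaded is set from the future's done callback whether or not the load succeeded. A stripped-down sketch of that pattern, with the actual parsing stubbed out as an assumption:

import asyncio

class CacheLoader:
    def __init__(self, cache_path: str):
        # Consumers await this to know when the (attempted) cache load has finished.
        self.cache_loaded = asyncio.Event()
        fut = asyncio.ensure_future(asyncio.to_thread(self._load_cache, cache_path))
        # Signal completion even if loading failed; failures are treated as non-fatal.
        fut.add_done_callback(lambda _fut: self.cache_loaded.set())

    def _load_cache(self, path: str) -> None:
        # Stand-in for the real, potentially slow, on-disk parse.
        pass

async def main() -> None:
    # Hypothetical cache path, for illustration only.
    loader = CacheLoader("inv_cache.gz")
    await loader.cache_loaded.wait()

asyncio.run(main())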

View File

@@ -161,6 +161,8 @@ class InterceptingLLUDPProxyProtocol(UDPProxyProtocol):
region.mark_dead()
elif message.name == "RegionHandshake":
region.name = str(message["RegionInfo"][0]["SimName"])
elif message.name == "AgentDataUpdate" and self.session:
self.session.active_group = message["AgentData"]["ActiveGroupID"]
# Send the message if it wasn't explicitly dropped or sent before
if not message.finalized:

View File

@@ -16,10 +16,14 @@ import weakref
from defusedxml import minidom
from hippolyzer.lib.base import serialization as se, llsd
from hippolyzer.lib.base.message.llsd_msg_serializer import LLSDMessageSerializer
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.datatypes import TaggedUnion, UUID, TupleCoord
from hippolyzer.lib.base.helpers import bytes_escape
from hippolyzer.lib.base.message.message_formatting import HumanMessageSerializer
from hippolyzer.lib.base.message.msgtypes import PacketFlags
from hippolyzer.lib.base.message.template_dict import DEFAULT_TEMPLATE_DICT
from hippolyzer.lib.base.network.transport import Direction
from hippolyzer.lib.proxy.message_filter import MetaFieldSpecifier, compile_filter, BaseFilterNode, MessageFilterNode, \
EnumFieldSpecifier, MatchResult
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
@@ -614,6 +618,19 @@ class EQMessageLogEntry(AbstractMessageLogEntry):
return "EQ"
def request(self, beautify=False, replacements=None):
# TODO: This is a bit of a hack! Templated messages can be sent over the EQ, so let's
# display them as template messages if that's what they are.
if self.event['message'] in DEFAULT_TEMPLATE_DICT.message_templates:
msg = LLSDMessageSerializer().deserialize(self.event)
msg.synthetic = True
msg.send_flags = PacketFlags.EQ
msg.direction = Direction.IN
# Annoyingly, templated messages sent over the EQ often carry extra fields that aren't
# specified in the template; ParcelProperties is one example. Luckily, we don't really
# care about extra fields, we just may not be able to automatically decode U32 and
# friends without the template's hint about their types.
return HumanMessageSerializer.to_human_string(msg, replacements, beautify)
return f'EQ {self.event["message"]}\n\n{self._format_llsd(self.event["body"])}'
@property

View File

@@ -48,6 +48,7 @@ class ProxyObjectManager(ClientObjectManager):
"RequestMultipleObjects",
self._handle_request_multiple_objects,
)
region.http_message_handler.subscribe("RenderMaterials", self._handle_render_materials)
def load_cache(self):
if not self.may_use_vo_cache or self.cache_loaded:
@@ -100,6 +101,13 @@ class ProxyObjectManager(ClientObjectManager):
# Remove any queued cache misses that the viewer just requested for itself
self.queued_cache_misses -= {b["ID"] for b in msg["ObjectData"]}
def _handle_render_materials(self, flow: HippoHTTPFlow):
if flow.response.status_code != 200:
return
if flow.request.method not in ("GET", "POST"):
return
self._process_materials_response(flow.response.content)
class ProxyWorldObjectManager(ClientWorldObjectManager):
_session: Session
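With RenderMaterials responses cached onto the region state (the materials map added to RegionObjectsState earlier in this diff), consumers can request legacy materials through the object manager and read cached entries back from state.materials. A hedged sketch based on the ClientObjectManager API exercised in the material tests further down; the manager/region wiring is assumed to already exist.

from hippolyzer.lib.base.datatypes import UUID

async def fetch_materials(manager) -> None:
    # Fetch and cache every material the region currently knows about.
    await manager.request_all_materials()
    # Or fetch specific materials by ID; returns a mapping of just the requested entries.
    mats = await manager.request_materials((UUID(int=1),))
    for mat_id, material in mats.items():
        print(mat_id, material)
    # Cached copies stay available on the region's object state afterwards.
    print(list(manager.state.materials.keys()))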

View File

@@ -0,0 +1,18 @@
from typing import *
from hippolyzer.lib.base.helpers import proxify
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.client.parcel_manager import ParcelManager
if TYPE_CHECKING:
from hippolyzer.lib.proxy.region import ProxiedRegion
class ProxyParcelManager(ParcelManager):
def __init__(self, region: "ProxiedRegion"):
super().__init__(proxify(region))
# Handle ParcelProperties messages that we didn't specifically ask for
self._region.message_handler.subscribe("ParcelProperties", self._handle_parcel_properties)
def _handle_parcel_properties(self, msg: Message):
self._process_parcel_properties(msg)
return None

View File

@@ -21,6 +21,7 @@ from hippolyzer.lib.proxy.object_manager import ProxyObjectManager
from hippolyzer.lib.base.transfer_manager import TransferManager
from hippolyzer.lib.base.xfer_manager import XferManager
from hippolyzer.lib.proxy.asset_uploader import ProxyAssetUploader
from hippolyzer.lib.proxy.parcel_manager import ProxyParcelManager
if TYPE_CHECKING:
from hippolyzer.lib.proxy.sessions import Session
@@ -67,6 +68,7 @@ class ProxiedRegion(BaseClientRegion):
self.xfer_manager = XferManager(proxify(self), self.session().secure_session_id)
self.transfer_manager = TransferManager(proxify(self), session.agent_id, session.id)
self.asset_uploader = ProxyAssetUploader(proxify(self))
self.parcel_manager = ProxyParcelManager(proxify(self))
self._recalc_caps()
@property

View File

@@ -1,70 +1,77 @@
aiohttp==3.8.3
aiosignal==1.2.0
aiohttp==3.9.1
aioquic==0.9.25
aiosignal==1.3.1
appdirs==1.4.4
Arpeggio==1.10.2
asgiref==3.4.1
async-timeout==4.0.1
attrs==21.2.0
blinker==1.4
Brotli==1.0.9
certifi==2022.12.7
cffi==1.15.0
charset-normalizer==2.0.9
click==8.0.3
cryptography==36.0.2
Arpeggio==2.0.2
asgiref==3.7.2
attrs==23.2.0
blinker==1.7.0
Brotli==1.1.0
certifi==2023.11.17
cffi==1.16.0
click==8.1.7
cryptography==41.0.7
dataclasses-json==0.6.3
defusedxml==0.7.1
Flask==2.0.2
frozenlist==1.3.3
Flask==2.3.3
frozenlist==1.4.1
gltflib==1.0.13
Glymur==0.9.6
h11==0.12.0
h11==0.14.0
h2==4.1.0
hpack==4.0.0
hyperframe==6.0.1
idna==2.10
itsdangerous==2.0.1
jedi==0.18.1
Jinja2==3.0.3
kaitaistruct==0.9
lazy-object-proxy==1.6.0
itsdangerous==2.1.2
jedi==0.19.1
Jinja2==3.1.2
kaitaistruct==0.10
lazy-object-proxy==1.10.0
ldap3==2.9.1
llsd~=1.0.0
lxml==4.9.2
MarkupSafe==2.0.1
mitmproxy==8.0.0
msgpack==1.0.3
multidict==5.2.0
numpy==1.24.2
outleap~=0.4.1
llsd==1.0.0
lxml==5.1.0
MarkupSafe==2.1.3
marshmallow==3.20.1
mitmproxy==10.2.1
mitmproxy_rs==0.5.1
msgpack==1.0.7
multidict==6.0.4
mypy-extensions==1.0.0
numpy==1.26.3
outleap==0.5.1
packaging==23.2
parso==0.8.3
passlib==1.7.4
prompt-toolkit==3.0.23
protobuf==3.18.1
ptpython==3.0.20
prompt-toolkit==3.0.43
protobuf==4.25.1
ptpython==3.0.25
publicsuffix2==2.20191221
pyasn1==0.4.8
pyasn1==0.5.1
pyasn1-modules==0.3.0
pycollada==0.8
pycparser==2.21
pycollada==0.7.2
Pygments==2.10.0
pyOpenSSL==22.0.0
pyparsing==2.4.7
Pygments==2.17.2
pylsqpack==0.3.18
pyOpenSSL==23.3.0
pyparsing==3.1.1
pyperclip==1.8.2
PySide6-Essentials==6.4.2
qasync==0.22.0
PySide6-Essentials==6.6.1
python-dateutil==2.8.2
qasync==0.27.1
recordclass==0.18.2
requests==2.26.0
ruamel.yaml==0.17.21
ruamel.yaml.clib==0.2.7
shiboken6==6.4.2
ruamel.yaml==0.18.5
ruamel.yaml.clib==0.2.8
service-identity==23.1.0
shiboken6==6.6.1
six==1.16.0
sortedcontainers==2.4.0
tornado==6.1
transformations==2021.6.6
typing-extensions==4.0.1
urllib3==1.26.7
urwid==2.1.2
wcwidth==0.2.5
Werkzeug==2.0.2
wsproto==1.0.0
yarl==1.8.2
zstandard<0.18.0
tornado==6.4
transformations==2024.6.1
typing-inspect==0.9.0
typing_extensions==4.9.0
urwid-mitmproxy==2.1.2.1
wcwidth==0.2.13
Werkzeug==2.3.8
wsproto==1.2.0
yarl==1.9.4
zstandard==0.22.0

View File

@@ -25,7 +25,7 @@ from setuptools import setup, find_packages
here = path.abspath(path.dirname(__file__))
version = '0.14.1'
version = '0.15.1'
with open(path.join(here, 'README.md')) as readme_fh:
readme = readme_fh.read()
@@ -42,8 +42,6 @@ setup(
"Operating System :: POSIX",
"Operating System :: Microsoft :: Windows",
"Programming Language :: Python :: 3 :: Only",
"Programming Language :: Python :: 3.8",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: Implementation :: CPython",
@@ -80,7 +78,7 @@ setup(
}
},
zip_safe=False,
python_requires='>=3.8',
python_requires='>=3.10',
install_requires=[
'llsd<1.1.0',
'defusedxml',
@@ -101,8 +99,8 @@ setup(
# Proxy-specific stuff
'outleap<1.0',
'arpeggio',
# 7.x will be a major change.
'mitmproxy>=8.0.0,<8.1',
# 11.x will be a major change.
'mitmproxy>=10.0.0,<11',
'Werkzeug<3.0',
# For REPLs
'ptpython<4.0',

View File

@@ -1,3 +1,5 @@
import glob
import setuptools # noqa
import os
@@ -32,20 +34,20 @@ TO_DELETE = [
"lib/aiohttp/_http_writer.c",
"lib/aiohttp/_websocket.c",
# Improve this to work with different versions.
"lib/aiohttp/python39.dll",
"lib/lazy_object_proxy/python39.dll",
"lib/lxml/python39.dll",
"lib/markupsafe/python39.dll",
"lib/multidict/python39.dll",
"lib/numpy/core/python39.dll",
"lib/numpy/fft/python39.dll",
"lib/numpy/linalg/python39.dll",
"lib/numpy/random/python39.dll",
"lib/python39.dll",
"lib/recordclass/python39.dll",
"lib/regex/python39.dll",
"lib/aiohttp/python3*.dll",
"lib/lazy_object_proxy/python3*.dll",
"lib/lxml/python3*.dll",
"lib/markupsafe/python3*.dll",
"lib/multidict/python3*.dll",
"lib/numpy/core/python3*.dll",
"lib/numpy/fft/python3*.dll",
"lib/numpy/linalg/python3*.dll",
"lib/numpy/random/python3*.dll",
"lib/python3*.dll",
"lib/recordclass/python3*.dll",
"lib/regex/python3*.dll",
"lib/test",
"lib/yarl/python39.dll",
"lib/yarl/python3*.dll",
]
COPY_TO_ZIP = [
@@ -77,11 +79,12 @@ class FinalizeCXFreezeCommand(Command):
if path.name.startswith("exe.") and path.is_dir():
for cleanse_suffix in TO_DELETE:
cleanse_path = path / cleanse_suffix
shutil.rmtree(cleanse_path, ignore_errors=True)
try:
os.unlink(cleanse_path)
except:
pass
for globbed in glob.glob(str(cleanse_path)):
shutil.rmtree(globbed, ignore_errors=True)
try:
os.unlink(globbed)
except:
pass
for to_copy in COPY_TO_ZIP:
shutil.copy(BASE_DIR / to_copy, path / to_copy)
shutil.copytree(BASE_DIR / "addon_examples", path / "addon_examples")
@@ -95,6 +98,7 @@ options = {
"passlib",
"_cffi_backend",
"hippolyzer",
"mitmproxy_windows",
],
# exclude packages that are not really needed
"excludes": [

View File

@@ -54,6 +54,7 @@ INV_CATEGORY = """\tinv_category\t0
\t\ttype\tlsltext
\t\tpref_type\tlsltext
\t\tname\tScripts|
\t\towner_id\ta2e76fcd-9360-4f6d-a924-000000000003
\t}
"""
@@ -160,6 +161,22 @@ class TestLegacyInv(unittest.TestCase):
]
)
def test_llsd_serialization_ais(self):
model = InventoryModel.from_str(INV_CATEGORY)
self.assertEqual(
[
{
'agent_id': UUID('a2e76fcd-9360-4f6d-a924-000000000003'),
'category_id': UUID('f4d91477-def1-487a-b4f3-6fa201c17376'),
'name': 'Scripts',
'parent_id': UUID('00000000-0000-0000-0000-000000000000'),
'type_default': 10,
'version': -1
}
],
model.to_llsd("ais")
)
def test_llsd_legacy_equality(self):
new_model = InventoryModel.from_llsd(self.model.to_llsd())
self.assertEqual(self.model, new_model)

View File

@@ -27,7 +27,7 @@ from hippolyzer.lib.base.message.data import msg_tmpl
from hippolyzer.lib.base.message.template import MessageTemplate, MessageTemplateBlock, MessageTemplateVariable
from hippolyzer.lib.base.message.template_dict import TemplateDictionary
from hippolyzer.lib.base.message.template_parser import MessageTemplateParser
from hippolyzer.lib.base.message.msgtypes import MsgFrequency, MsgTrust, MsgEncoding, \
from hippolyzer.lib.base.message.msgtypes import MsgFrequency, MsgEncoding, \
MsgDeprecation, MsgBlockType, MsgType
@@ -45,8 +45,8 @@ class TestDictionary(unittest.TestCase):
msg_dict = TemplateDictionary(self.template_list)
packet = msg_dict.get_template_by_name('ConfirmEnableSimulator')
assert packet is not None, "get_packet failed"
assert packet.frequency == MsgFrequency.MEDIUM_FREQUENCY_MESSAGE, "Incorrect frequency"
assert packet.msg_num == 8, "Incorrect message number for ConfirmEnableSimulator"
assert packet.frequency == MsgFrequency.MEDIUM, "Incorrect frequency"
assert packet.num == 8, "Incorrect message number for ConfirmEnableSimulator"
def test_get_packet_pair(self):
msg_dict = TemplateDictionary(self.template_list)
@@ -76,29 +76,29 @@ class TestTemplates(unittest.TestCase):
template = self.msg_dict['CompletePingCheck']
name = template.name
freq = template.frequency
num = template.msg_num
trust = template.msg_trust
enc = template.msg_encoding
num = template.num
trust = template.trusted
enc = template.encoding
assert name == 'CompletePingCheck', "Expected: CompletePingCheck Returned: " + name
assert freq == MsgFrequency.HIGH_FREQUENCY_MESSAGE, "Expected: High Returned: " + freq
assert freq == MsgFrequency.HIGH, "Expected: High Returned: " + freq
assert num == 2, "Expected: 2 Returned: " + str(num)
assert trust == MsgTrust.LL_NOTRUST, "Expected: NotTrusted Returned: " + trust
assert enc == MsgEncoding.LL_UNENCODED, "Expected: Unencoded Returned: " + enc
assert not trust, "Expected: NotTrusted Returned: " + trust
assert enc == MsgEncoding.UNENCODED, "Expected: Unencoded Returned: " + enc
def test_deprecated(self):
template = self.msg_dict['ObjectPosition']
dep = template.msg_deprecation
assert dep == MsgDeprecation.LL_DEPRECATED, "Expected: Deprecated Returned: " + str(dep)
dep = template.deprecation
assert dep == MsgDeprecation.DEPRECATED, "Expected: Deprecated Returned: " + str(dep)
def test_template_fixed(self):
template = self.msg_dict['PacketAck']
num = template.msg_num
num = template.num
assert num == 251, "Expected: 251 Returned: " + str(num)
def test_blacklisted(self):
template = self.msg_dict['TeleportFinish']
self.assertEqual(template.msg_deprecation,
MsgDeprecation.LL_UDPBLACKLISTED)
self.assertEqual(template.deprecation,
MsgDeprecation.UDPBLACKLISTED)
def test_block(self):
block = self.msg_dict['OpenCircuit'].get_block('CircuitInfo')
@@ -167,7 +167,7 @@ class TestTemplates(unittest.TestCase):
frequency_counter = {"low": 0, 'medium': 0, "high": 0, 'fixed': 0}
for template in list(self.msg_dict.message_templates.values()):
frequency_counter[template.get_frequency_as_string()] += 1
frequency_counter[template.frequency.name.lower()] += 1
self.assertEqual(low_count, frequency_counter["low"])
self.assertEqual(medium_count, frequency_counter["medium"])
self.assertEqual(high_count, frequency_counter["high"])

View File

@@ -0,0 +1,39 @@
from typing import Mapping, Optional
import multidict
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.message.message_handler import MessageHandler
from hippolyzer.lib.base.network.caps_client import CapsClient
from hippolyzer.lib.base.test_utils import MockHandlingCircuit
from hippolyzer.lib.client.hippo_client import ClientSettings
from hippolyzer.lib.client.object_manager import ClientWorldObjectManager
from hippolyzer.lib.client.state import BaseClientRegion, BaseClientSession, BaseClientSessionManager
class MockClientRegion(BaseClientRegion):
def __init__(self, caps_urls: Optional[dict] = None):
super().__init__()
self.handle = None
self.circuit_addr = ("127.0.0.1", 1)
self.message_handler: MessageHandler[Message, str] = MessageHandler(take_by_default=False)
self.circuit = MockHandlingCircuit(self.message_handler)
self._name = "Test"
self.cap_urls = multidict.MultiDict()
if caps_urls:
self.cap_urls.update(caps_urls)
self.caps_client = CapsClient(self.cap_urls)
def session(self):
return MockClientSession(UUID.ZERO, UUID.ZERO, UUID.ZERO, 0, None)
def update_caps(self, caps: Mapping[str, str]) -> None:
pass
class MockClientSession(BaseClientSession):
def __init__(self, id, secure_session_id, agent_id, circuit_code,
session_manager: Optional[BaseClientSessionManager]):
super().__init__(id, secure_session_id, agent_id, circuit_code, session_manager)
self.objects = ClientWorldObjectManager(self, ClientSettings(), None)

View File

@@ -14,7 +14,7 @@ from hippolyzer.lib.base.message.message_handler import MessageHandler
from hippolyzer.lib.base.message.msgtypes import PacketFlags
from hippolyzer.lib.base.message.udpdeserializer import UDPMessageDeserializer
from hippolyzer.lib.base.network.transport import AbstractUDPTransport, UDPPacket, Direction
from hippolyzer.lib.base.test_utils import MockTransport, MockConnectionHolder
from hippolyzer.lib.base.test_utils import MockTransport, MockConnectionHolder, soon
from hippolyzer.lib.client.hippo_client import HippoClient, HippoClientProtocol
@@ -72,10 +72,6 @@ class MockHippoClient(HippoClient):
return MockServerTransport(self.server), protocol
async def _soon(get_msg) -> Message:
return await asyncio.wait_for(get_msg(), timeout=1.0)
class TestHippoClient(unittest.IsolatedAsyncioTestCase):
FAKE_LOGIN_URI = "http://127.0.0.1:1/login.cgi"
FAKE_LOGIN_RESP = {
@@ -130,8 +126,8 @@ class TestHippoClient(unittest.IsolatedAsyncioTestCase):
with self.server_handler.subscribe_async(
("*",),
) as get_msg:
assert (await _soon(get_msg)).name == "UseCircuitCode"
assert (await _soon(get_msg)).name == "CompleteAgentMovement"
assert (await soon(get_msg())).name == "UseCircuitCode"
assert (await soon(get_msg())).name == "CompleteAgentMovement"
self.server.circuit.send(Message(
'RegionHandshake',
Block('RegionInfo', fill_missing=True),
@@ -139,8 +135,8 @@ class TestHippoClient(unittest.IsolatedAsyncioTestCase):
Block('RegionInfo3', fill_missing=True),
Block('RegionInfo4', fill_missing=True),
))
assert (await _soon(get_msg)).name == "RegionHandshakeReply"
assert (await _soon(get_msg)).name == "AgentThrottle"
assert (await soon(get_msg())).name == "RegionHandshakeReply"
assert (await soon(get_msg())).name == "AgentThrottle"
await login_task
async def test_login(self):
@@ -149,15 +145,15 @@ class TestHippoClient(unittest.IsolatedAsyncioTestCase):
("*",),
) as get_msg:
self.client.logout()
assert (await _soon(get_msg)).name == "LogoutRequest"
assert (await soon(get_msg())).name == "LogoutRequest"
async def test_eq(self):
await self._log_client_in(self.client)
with self.client.session.message_handler.subscribe_async(
("ViewerFrozenMessage", "NotTemplated"),
) as get_msg:
assert (await _soon(get_msg)).name == "ViewerFrozenMessage"
msg = await _soon(get_msg)
assert (await soon(get_msg())).name == "ViewerFrozenMessage"
msg = await soon(get_msg())
assert msg.name == "NotTemplated"
assert msg["EventData"]["foo"]["bar"] == 1
@@ -179,5 +175,5 @@ class TestHippoClient(unittest.IsolatedAsyncioTestCase):
self.server_transport.send_packet(packet)
self.server_circuit.send(Message("AgentDataUpdate", Block("AgentData", fill_missing=True)))
assert (await _soon(get_msg)).name == "ChatFromSimulator"
assert (await _soon(get_msg)).name == "AgentDataUpdate"
assert (await soon(get_msg())).name == "ChatFromSimulator"
assert (await soon(get_msg())).name == "AgentDataUpdate"

View File

@@ -0,0 +1,69 @@
import unittest
from typing import Any
import aioresponses
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base import llsd
from hippolyzer.lib.client.object_manager import ClientObjectManager
from . import MockClientRegion
class MaterialManagerTest(unittest.IsolatedAsyncioTestCase):
FAKE_CAPS = {
"RenderMaterials": "http://127.0.0.1:8023"
}
GET_RENDERMATERIALS_BODY = [
{'ID': UUID(int=1).bytes,
'Material': {'AlphaMaskCutoff': 0, 'DiffuseAlphaMode': 1, 'EnvIntensity': 0,
'NormMap': UUID(int=4), 'NormOffsetX': 0, 'NormOffsetY': 0,
'NormRepeatX': 10000, 'NormRepeatY': 10000, 'NormRotation': 0, 'SpecColor': [255, 255, 255, 255],
'SpecExp': 51, 'SpecMap': UUID(int=5), 'SpecOffsetX': 0,
'SpecOffsetY': 0, 'SpecRepeatX': 10000, 'SpecRepeatY': 10000, 'SpecRotation': 0}},
{'ID': UUID(int=2).bytes,
'Material': {'AlphaMaskCutoff': 0, 'DiffuseAlphaMode': 0, 'EnvIntensity': 0,
'NormMap': UUID(int=6), 'NormOffsetX': 0, 'NormOffsetY': 0,
'NormRepeatX': 10000, 'NormRepeatY': -10000, 'NormRotation': 0,
'SpecColor': [255, 255, 255, 255], 'SpecExp': 51,
'SpecMap': UUID(int=7), 'SpecOffsetX': 0, 'SpecOffsetY': 0,
'SpecRepeatX': 10000, 'SpecRepeatY': -10000, 'SpecRotation': 0}},
{'ID': UUID(int=3).bytes,
'Material': {'AlphaMaskCutoff': 0, 'DiffuseAlphaMode': 1, 'EnvIntensity': 50,
'NormMap': UUID.ZERO, 'NormOffsetX': 0, 'NormOffsetY': 0,
'NormRepeatX': 10000, 'NormRepeatY': 10000, 'NormRotation': 0, 'SpecColor': [255, 255, 255, 255],
'SpecExp': 200, 'SpecMap': UUID(int=8), 'SpecOffsetX': 0,
'SpecOffsetY': 0, 'SpecRepeatX': 10000, 'SpecRepeatY': 10000, 'SpecRotation': 0}},
]
def _make_rendermaterials_resp(self, resp: Any) -> bytes:
return llsd.format_xml({"Zipped": llsd.zip_llsd(resp)})
async def asyncSetUp(self):
self.aio_mock = aioresponses.aioresponses()
self.aio_mock.start()
# Requesting all materials
self.aio_mock.get(
self.FAKE_CAPS['RenderMaterials'],
body=self._make_rendermaterials_resp(self.GET_RENDERMATERIALS_BODY)
)
# Specific material request
self.aio_mock.post(
self.FAKE_CAPS['RenderMaterials'],
body=self._make_rendermaterials_resp([self.GET_RENDERMATERIALS_BODY[0]])
)
self.region = MockClientRegion(self.FAKE_CAPS)
self.manager = ClientObjectManager(self.region)
async def asyncTearDown(self):
self.aio_mock.stop()
async def test_fetch_all_materials(self):
await self.manager.request_all_materials()
self.assertListEqual([UUID(int=1), UUID(int=2), UUID(int=3)], list(self.manager.state.materials.keys()))
async def test_fetch_some_materials(self):
mats = await self.manager.request_materials((UUID(int=1),))
self.assertListEqual([UUID(int=1)], list(mats.keys()))
self.assertListEqual([UUID(int=1)], list(self.manager.state.materials.keys()))

View File

@@ -0,0 +1,333 @@
import asyncio
import collections
import unittest
from typing import Dict
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Block, Message
import hippolyzer.lib.base.serialization as se
from hippolyzer.lib.base.templates import ParcelGridInfo, ParcelGridType, ParcelGridFlags, \
ParcelPropertiesBitmapSerializer
from hippolyzer.lib.base.test_utils import soon
from hippolyzer.lib.client.parcel_manager import ParcelManager
from . import MockClientRegion
OVERLAY_CHUNKS = (
b'\xc2\x82\x82\xc2\x82\x82\x82\x82\x82\x82\x82\x82\x82\x82\x82\x82\x82\x82\x82\x82\x82\x82\x82\x82'
b'\x82\x82\x82\x82\x82\x82\x82\x82\x82\x82\x82\x82\x82\x82\x82\x82\x82\x82\x82\x82\x82\x82\x82\x82'
b'\x82\x82\x82\x82\x82\x82\x82\x82\x82\x82\x82\x82\x82\x82\x82\xc2B\x02\x02B\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x82B\x02\x02B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\xc2\x82\x82\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02',
b'B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02',
b'B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02',
b'B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'B\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02'
b'\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02\x02',
)
BITMAPS = (
b'\x07\x00\x00\x00\x00\x00\x00\x00\x07\x00\x00\x00\x00\x00\x00\x00\x07\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00',
b'\xf8\xff\xff\xff\xff\xff\xff\x7f\xf8\xff\xff\xff\xff\xff\xff\xff\xf8\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff'
b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff'
b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff'
b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff'
b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff'
b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff'
b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff'
b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff'
b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff'
b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff'
b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff'
b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff'
b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff'
b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff'
b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff'
b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff'
b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff'
b'\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff'
b'\xff\xff\xff\xff\xff\xff\xff\xff',
b'\x00\x00\x00\x00\x00\x00\x00\x80\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
b'\x00\x00\x00\x00\x00\x00\x00\x00',
)
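
# The tests below drive ParcelManager purely through synthetic ParcelOverlay and
# ParcelProperties messages fed into the mock region's message handler.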
class TestParcelOverlay(unittest.IsolatedAsyncioTestCase):
    async def asyncSetUp(self):
        self.region = MockClientRegion()
        self.parcel_manager = ParcelManager(self.region)
        self.handler = self.region.message_handler
        self.test_msgs = []
        for i, chunk in enumerate(OVERLAY_CHUNKS):
            self.test_msgs.append(Message(
                'ParcelOverlay',
                Block('ParcelData', SequenceID=i, Data=chunk),
            ))

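    # Each decoded overlay entry is a ParcelGridInfo: a ParcelGridType plus
    # ParcelGridFlags border-line bits for that grid cell.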
    def test_low_level_parse(self):
        spec = se.BitfieldDataclass(ParcelGridInfo)
        reader = se.BufferReader("<", OVERLAY_CHUNKS[0])
        self.assertEqual(
            ParcelGridInfo(ParcelGridType.GROUP, ParcelGridFlags.SOUTH_LINE | ParcelGridFlags.WEST_LINE),
            reader.read(spec),
        )
        self.assertEqual(
            ParcelGridInfo(ParcelGridType.GROUP, ParcelGridFlags.SOUTH_LINE),
            reader.read(spec),
        )

    def _get_parcel_areas(self) -> Dict[int, int]:
        c = collections.Counter()
        for parcel_idx in self.parcel_manager.parcel_indices.flatten():
            c[parcel_idx] += self.parcel_manager.GRID_STEP
        return dict(c.items())

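    # Replaying every ParcelOverlay chunk should mark the overlay complete and
    # yield the expected per-parcel totals.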
    async def test_handle_overlay(self):
        self.assertFalse(self.parcel_manager.overlay_complete.is_set())
        for msg in self.test_msgs:
            self.handler.handle(msg)
        self.assertTrue(self.parcel_manager.overlay_complete.is_set())
        self.assertDictEqual({1: 36, 2: 16344, 3: 4}, self._get_parcel_areas())

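    # request_dirty_parcels() won't finish until ParcelProperties replies arrive,
    # so the replies below are injected while the request task is still in flight.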
    async def test_request_parcel_properties(self):
        for msg in self.test_msgs:
            self.handler.handle(msg)
        req_task = asyncio.create_task(self.parcel_manager.request_dirty_parcels())
        # HACK: Wait for requests to be sent out
        await asyncio.sleep(0.01)
        for i in range(3):
            self.handler.handle(Message(
                "ParcelProperties",
                Block(
                    "ParcelData",
                    LocalID=i + 1,
                    SequenceID=i + 1,
                    Name=str(i + 1),
                    GroupID=UUID.ZERO,
                    ParcelFlags=0,
                    Bitmap=BITMAPS[i],
                ),
            ))
        await soon(req_task)
        self.assertEqual(3, len(self.parcel_manager.parcels))
        self.assertEqual("1", self.parcel_manager.parcels[0].name)

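    # Cross-check the overlay-derived parcel indices against the per-parcel
    # ownership bitmaps returned in ParcelProperties: each grid cell should be
    # set in exactly the bitmap of the parcel that owns it.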
    async def test_parcel_bitmap_equivalence(self):
        for msg in self.test_msgs:
            self.handler.handle(msg)
        serializer = ParcelPropertiesBitmapSerializer()
        bitmaps = [serializer.deserialize(None, x) for x in BITMAPS]
        for y in range(ParcelManager.GRID_STEP):
            for x in range(ParcelManager.GRID_STEP):
                parcel_idx = self.parcel_manager.parcel_indices[y, x] - 1
                for i, bitmap in enumerate(bitmaps):
                    bmp_set = bitmap[y, x]
                    if bmp_set and parcel_idx != i:
                        raise AssertionError(f"Parcel {parcel_idx} unexpectedly set in Bitmap {i} at {y, x}")
                    elif not bmp_set and parcel_idx == i:
                        raise AssertionError(f"Parcel {parcel_idx} not set in Bitmap {i} at {y, x}")


@@ -667,7 +667,7 @@ class SessionObjectManagerTests(ObjectManagerTestMixin, unittest.IsolatedAsyncio
     async def test_handle_object_update_event(self):
         with self.session.objects.events.subscribe_async(
-            message_names=(ObjectUpdateType.OBJECT_UPDATE,),
+            message_names=(ObjectUpdateType.UPDATE,),
             predicate=lambda e: e.object.UpdateFlags & JUST_CREATED_FLAGS and "LocalID" in e.updated,
         ) as get_events:
             self._create_object(local_id=999)
@@ -676,7 +676,7 @@ class SessionObjectManagerTests(ObjectManagerTestMixin, unittest.IsolatedAsyncio
     async def test_handle_object_update_predicate(self):
         with self.session.objects.events.subscribe_async(
-            message_names=(ObjectUpdateType.OBJECT_UPDATE,),
+            message_names=(ObjectUpdateType.UPDATE,),
         ) as get_events:
             self._create_object(local_id=999)
             evt = await asyncio.wait_for(get_events(), 1.0)
@@ -684,10 +684,10 @@ class SessionObjectManagerTests(ObjectManagerTestMixin, unittest.IsolatedAsyncio
     async def test_handle_object_update_events_two_subscribers(self):
         with self.session.objects.events.subscribe_async(
-            message_names=(ObjectUpdateType.OBJECT_UPDATE,),
+            message_names=(ObjectUpdateType.UPDATE,),
         ) as get_events:
             with self.session.objects.events.subscribe_async(
-                message_names=(ObjectUpdateType.OBJECT_UPDATE,),
+                message_names=(ObjectUpdateType.UPDATE,),
             ) as get_events2:
                 self._create_object(local_id=999)
                 evt = await asyncio.wait_for(get_events(), 1.0)
@@ -697,10 +697,10 @@ class SessionObjectManagerTests(ObjectManagerTestMixin, unittest.IsolatedAsyncio
     async def test_handle_object_update_events_two_subscribers_timeout(self):
         with self.session.objects.events.subscribe_async(
-            message_names=(ObjectUpdateType.OBJECT_UPDATE,),
+            message_names=(ObjectUpdateType.UPDATE,),
         ) as get_events:
             with self.session.objects.events.subscribe_async(
-                message_names=(ObjectUpdateType.OBJECT_UPDATE,),
+                message_names=(ObjectUpdateType.UPDATE,),
             ) as get_events2:
                 self._create_object(local_id=999)
                 evt = asyncio.wait_for(get_events(), 0.01)