Compare commits
80 Commits
SHA1:

722e8eeabf a6a26a9999 a6328d5aee 4e76ebe7cf c0a26ffb57 7dfb10cb51 de33906db5 605337b280
235cd4929f 220a02543e 8ac47c2397 d384978322 f02a479834 b5e8b36173 08a39f4df7 61ec51beec
9adbdcdcc8 e7b05f72ca 75f2f363a4 cc1bb9ac1d d498d1f2c8 8c0635bb2a 309dbeeb52 4cc87bf81e
f34bb42dcb 59ec99809a 4b963f96d2 58db8f66de 95623eba58 8dba0617bd 289073be8e f3c8015366
99e8118458 80745cfd1c 92a06bccaf fde9ddf4d9 03a56c9982 d07a0df0fd 848397fe63 0f9246c5c6
2e7f887970 ef9df6b058 baae0f6d6e 0f369b682d 1f1e4de254 75ddc0a5ba e4cb168138 63aebba754
8cf1a43d59 bbc8813b61 5b51dbd30f 295c7972e7 b034661c38 f12fd95ee1 bc33313fc7 affc7fcf89
b8f1593a2c 7879f4e118 4ba611ae01 82ff6d9c64 f603ea6186 fcf6a4568b 2ad6cc1b51 025f7d31f2
9fdb281e4a 11e28bde2a 1faa6f977c 6866e7397f fa0b3a5340 16c808bce8 ec4b2d0770 3b610fdfd1
8b93c5eefa f4bb9eae8f ecb14197cf 95fd58e25a afc333ab49 eb6406bca4 d486aa130d d66d5226a2
21  .github/workflows/bundle_windows.yml  (vendored)

@@ -2,18 +2,23 @@
 # onto the release after it gets created. Don't want actions with repo write.
 name: Bundle Windows EXE

 on:
   # Only trigger on release creation
   release:
     types:
       - created
   workflow_dispatch:

+env:
+  target_tag: ${{ github.ref_name }}
+
 jobs:
   build:
-    runs-on: windows-latest
+    runs-on: windows-2019
+    permissions:
+      contents: write
     strategy:
       matrix:
         python-version: [3.9]
@@ -34,14 +39,24 @@ jobs:
         pip install cx_freeze

     - name: Bundle with cx_Freeze
       shell: bash
       run: |
         python setup_cxfreeze.py build_exe
+        pip install pip-licenses
+        pip-licenses --format=plain-vertical --with-license-file --no-license-path --output-file=lib_licenses.txt
+        python setup_cxfreeze.py finalize_cxfreeze
+        # Should only be one, but we don't know what it's named
+        mv ./dist/*.zip hippolyzer-windows-${{ env.target_tag }}.zip

     - name: Upload the artifact
       uses: actions/upload-artifact@v2
       with:
-        name: hippolyzer-gui-windows-${{ github.sha }}
-        path: ./dist/**
+        name: hippolyzer-windows-${{ github.sha }}
+        path: ./hippolyzer-windows-${{ env.target_tag }}.zip
+
+    - uses: ncipollo/release-action@v1.10.0
+      with:
+        artifacts: hippolyzer-windows-${{ env.target_tag }}.zip
+        tag: ${{ env.target_tag }}
+        token: ${{ secrets.GITHUB_TOKEN }}
+        allowUpdates: true
38  README.md

@@ -83,6 +83,28 @@ SOCKS 5 works correctly on these platforms, so you can just configure it through
 the `no_proxy` env var appropriately. For ex. `no_proxy="asset-cdn.glb.agni.lindenlab.com" ./firestorm`.
 * Log in!

+##### Firestorm
+
+The proxy selection dialog in the most recent Firestorm release is non-functional, as
+https://bitbucket.org/lindenlab/viewer/commits/454c7f4543688126b2fa5c0560710f5a1733702e was not pulled in.
+
+As a workaround, you can go to `Debug -> Show Debug Settings` and enter the following values:
+
+| Name                | Value     |
+|---------------------|-----------|
+| HttpProxyType       | Web       |
+| BrowserProxyAddress | 127.0.0.1 |
+| BrowserProxyEnabled | TRUE      |
+| BrowserProxyPort    | 9062      |
+| Socks5ProxyEnabled  | TRUE      |
+| Socks5ProxyHost     | 127.0.0.1 |
+| Socks5ProxyPort     | 9061      |
+
+Or, if you're on Linux, you can also use [LinHippoAutoProxy](https://github.com/SaladDais/LinHippoAutoProxy).
+
+Connections from the in-viewer browser will likely _not_ be run through Hippolyzer when using either of
+these workarounds.
+
 ### Filtering

 By default, the proxy's display filter is configured to ignore many high-frequency messages.
@@ -311,6 +333,22 @@ If you are a viewer developer, please put them in a viewer.
 apply the mesh to the local mesh target. It works on attachments too. Useful for testing rigs before a
 final, real upload.

+## REPL
+
+A quick and dirty REPL is also included for when you want to do ad-hoc introspection of proxy state.
+It can be launched at any time by typing `/524 spawn_repl` in chat.
+
+The REPL is fully async aware and allows awaiting events without blocking:
+
+```python
+>>> from hippolyzer.lib.client.object_manager import ObjectUpdateType
+>>> evt = await session.objects.events.wait_for((ObjectUpdateType.OBJECT_UPDATE,), timeout=2.0)
+>>> evt.updated
+{'Position'}
+```
+
 ## Potential Changes

 * AISv3 wrapper?
@@ -80,7 +80,7 @@ class BlueishObjectListGUIAddon(BaseAddon):
             raise

     def _highlight_object(self, session: Session, obj: Object):
-        session.main_region.circuit.send_message(Message(
+        session.main_region.circuit.send(Message(
             "ForceObjectSelect",
             Block("Header", ResetList=False),
             Block("Data", LocalID=obj.LocalID),
@@ -88,7 +88,7 @@ class BlueishObjectListGUIAddon(BaseAddon):
         ))

     def _teleport_to_object(self, session: Session, obj: Object):
-        session.main_region.circuit.send_message(Message(
+        session.main_region.circuit.send(Message(
             "TeleportLocationRequest",
             Block("AgentData", AgentID=session.agent_id, SessionID=session.id),
             Block(
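These two hunks reflect a rename that recurs throughout this comparison: `circuit.send_message(...)` becomes `circuit.send(...)`, with the same arguments. A minimal sketch of the new call shape, assuming an addon context where `session` and `region` are provided as in the addon examples elsewhere in this diff (the chat message itself is illustrative):

```python
# Minimal sketch, assuming `session` and `region` as passed to addon hooks.
# `circuit.send` is the renamed form of the old `circuit.send_message`.
from hippolyzer.lib.base.message.message import Message, Block

def say_hello(session, region):
    region.circuit.send(Message(
        "ChatFromViewer",
        Block("AgentData", AgentID=session.agent_id, SessionID=session.id),
        Block("ChatData", Message="Hello!", Type=1, Channel=0),
    ))
```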
158  addon_examples/demo_autoattacher.py  (new file)

@@ -0,0 +1,158 @@
"""
Detect receipt of a marketplace order for a demo, and auto-attach the most appropriate object
"""

import asyncio
import re
from typing import List, Tuple, Dict, Optional, Sequence

from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Message, Block
from hippolyzer.lib.base.templates import InventoryType, Permissions, FolderType
from hippolyzer.lib.proxy.addon_utils import BaseAddon, show_message
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session


MARKETPLACE_TRANSACTION_ID = UUID('ffffffff-ffff-ffff-ffff-ffffffffffff')


class DemoAutoAttacher(BaseAddon):
    def handle_eq_event(self, session: Session, region: ProxiedRegion, event: dict):
        if event["message"] != "BulkUpdateInventory":
            return
        # Check that this update even possibly came from the marketplace
        if event["body"]["AgentData"][0]["TransactionID"] != MARKETPLACE_TRANSACTION_ID:
            return
        # Make sure that the transaction targeted our real received items folder
        folders = event["body"]["FolderData"]
        received_folder = folders[0]
        if received_folder["Name"] != "Received Items":
            return
        skel = session.login_data['inventory-skeleton']
        actual_received = [x for x in skel if x['type_default'] == FolderType.INBOX]
        assert actual_received
        if UUID(actual_received[0]['folder_id']) != received_folder["FolderID"]:
            show_message(f"Strange received folder ID spoofing? {folders!r}")
            return

        if not re.match(r".*\bdemo\b.*", folders[1]["Name"], flags=re.I):
            return
        # Alright, so we have a demo... thing from the marketplace. What now?
        items = event["body"]["ItemData"]
        object_items = [x for x in items if x["InvType"] == InventoryType.OBJECT]
        if not object_items:
            return
        self._schedule_task(self._attach_best_object(session, region, object_items))

    async def _attach_best_object(self, session: Session, region: ProxiedRegion, object_items: List[Dict]):
        own_body_type = await self._guess_own_body(session, region)
        show_message(f"Trying to find demo for {own_body_type}")
        guess_patterns = self.BODY_CLOTHING_PATTERNS.get(own_body_type)
        to_attach = []
        if own_body_type and guess_patterns:
            matching_items = self._get_matching_items(object_items, guess_patterns)
            if matching_items:
                # Only take the first one
                to_attach.append(matching_items[0])
        if not to_attach:
            # Don't know what body's being used or couldn't figure out what item
            # would work best with our body. Just attach the first object in the folder.
            to_attach.append(object_items[0])

        # Also attach whatever HUDs, maybe we need them.
        for hud in self._get_matching_items(object_items, ("hud",)):
            if hud not in to_attach:
                to_attach.append(hud)

        region.circuit.send(Message(
            'RezMultipleAttachmentsFromInv',
            Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
            Block('HeaderData', CompoundMsgID=UUID.random(), TotalObjects=len(to_attach), FirstDetachAll=0),
            *[Block(
                'ObjectData',
                ItemID=o["ItemID"],
                OwnerID=session.agent_id,
                # 128 = "add", uses whatever attachmentpt was defined on the object
                AttachmentPt=128,
                ItemFlags_=(),
                GroupMask_=(),
                EveryoneMask_=(),
                NextOwnerMask_=(Permissions.COPY | Permissions.MOVE),
                Name=o["Name"],
                Description=o["Description"],
            ) for o in to_attach]
        ))

    def _get_matching_items(self, items: List[dict], patterns: Sequence[str]):
        # Loop over patterns to search for our body type, in order of preference
        matched = []
        for guess_pattern in patterns:
            # Check each item for that pattern
            for item in items:
                if re.match(rf".*\b{guess_pattern}\b.*", item["Name"], re.I):
                    matched.append(item)
        return matched

    # We scan the agent's attached objects to guess what kind of body they use
    BODY_PREFIXES = {
        "-Belleza- Jake ": "jake",
        "-Belleza- Freya ": "freya",
        "-Belleza- Isis ": "isis",
        "-Belleza- Venus ": "venus",
        "[Signature] Gianni Body": "gianni",
        "[Signature] Geralt Body": "geralt",
        "Maitreya Mesh Body - Lara": "maitreya",
        "Slink Physique Hourglass Petite": "hg_petite",
        "Slink Physique Mesh Body Hourglass": "hourglass",
        "Slink Physique Original Petite": "phys_petite",
        "Slink Physique Mesh Body Original": "physique",
        "[BODY] Legacy (f)": "legacy_f",
        "[BODY] Legacy (m)": "legacy_m",
        "[Signature] Alice Body": "sig_alice",
        "Slink Physique MALE Mesh Body": "slink_male",
        "AESTHETIC - [Mesh Body]": "aesthetic",
    }

    # Different bodies' clothes have different naming conventions according to different merchants.
    # These are common naming patterns we use to choose objects to attach, in order of preference.
    BODY_CLOTHING_PATTERNS: Dict[str, Tuple[str, ...]] = {
        "jake": ("jake", "belleza"),
        "freya": ("freya", "belleza"),
        "isis": ("isis", "belleza"),
        "venus": ("venus", "belleza"),
        "gianni": ("gianni", "signature", "sig"),
        "geralt": ("geralt", "signature", "sig"),
        "hg_petite": ("hourglass petite", "hg petite", "hourglass", "hg", "slink"),
        "hourglass": ("hourglass", "hg", "slink"),
        "phys_petite": ("physique petite", "phys petite", "physique", "phys", "slink"),
        "physique": ("physique", "phys", "slink"),
        "legacy_f": ("legacy",),
        "legacy_m": ("legacy",),
        "sig_alice": ("alice", "signature"),
        "slink_male": ("physique", "slink"),
        "aesthetic": ("aesthetic",),
    }

    async def _guess_own_body(self, session: Session, region: ProxiedRegion) -> Optional[str]:
        agent_obj = region.objects.lookup_fullid(session.agent_id)
        if not agent_obj:
            return None
        # We probably won't know the names for all of our attachments, request them.
        # Could be obviated by looking at the COF, not worth it for this.
        try:
            await asyncio.wait(region.objects.request_object_properties(agent_obj.Children), timeout=0.5)
        except asyncio.TimeoutError:
            # We expect that we just won't ever receive some property requests, that's fine
            pass

        for prefix, body_type in self.BODY_PREFIXES.items():
            for obj in agent_obj.Children:
                if not obj.Name:
                    continue
                if obj.Name.startswith(prefix):
                    return body_type
        return None


addons = [DemoAutoAttacher()]
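For a sense of how the preference-ordered matching above behaves, here is a self-contained sketch of the same logic as `_get_matching_items` (the item names are made-up examples, not from the repo):

```python
import re
from typing import List, Sequence

def get_matching_items(items: List[dict], patterns: Sequence[str]) -> List[dict]:
    # Same preference-ordered scan as DemoAutoAttacher._get_matching_items:
    # items matching an earlier pattern sort ahead of later-pattern matches.
    matched = []
    for guess_pattern in patterns:
        for item in items:
            if re.match(rf".*\b{guess_pattern}\b.*", item["Name"], re.I):
                matched.append(item)
    return matched

# Hypothetical demo folder contents
items = [{"Name": "Cute Jacket (Legacy)"}, {"Name": "Cute Jacket (Gianni)"}]
# For a "gianni" body the Gianni variant wins because "gianni" precedes the
# generic "signature"/"sig" fallbacks in BODY_CLOTHING_PATTERNS.
print(get_matching_items(items, ("gianni", "signature", "sig"))[0]["Name"])
# -> Cute Jacket (Gianni)
```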
119  addon_examples/get_task_inventory_cap.py  (new file)

@@ -0,0 +1,119 @@
"""
Loading task inventory doesn't actually need to be slow.

By using a cap instead of the slow xfer path and sending the LLSD inventory
model we get 15x speedups even when mocking things behind the scenes by using
a hacked up version of xfer. See turbo_object_inventory.py
"""

import asyncio

import asgiref.wsgi
from typing import *

from flask import Flask, Response, request

from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.inventory import InventoryModel, InventoryObject
from hippolyzer.lib.base.message.message import Message, Block
from hippolyzer.lib.base.templates import XferFilePath
from hippolyzer.lib.proxy import addon_ctx
from hippolyzer.lib.proxy.webapp_cap_addon import WebAppCapAddon

app = Flask("GetTaskInventoryCapApp")


@app.route('/', methods=["GET"])
async def get_task_inventory():
    # Should always have the current region, the cap handler is bound to one.
    # Just need to pull it from the `addon_ctx` module's global.
    region = addon_ctx.region.get()
    session = addon_ctx.session.get()
    obj_id = UUID(request.args["task_id"])
    obj = region.objects.lookup_fullid(obj_id)
    if not obj:
        return Response(f"Couldn't find {obj_id}", status=404, mimetype="text/plain")
    request_msg = Message(
        'RequestTaskInventory',
        Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
        Block('InventoryData', LocalID=obj.LocalID),
    )
    # Keep around a dict of chunks we saw previously in case we have to restart
    # an Xfer due to missing chunks. We don't expect chunks to change across Xfers
    # so this can be used to recover from dropped SendXferPackets in subsequent attempts
    existing_chunks: Dict[int, bytes] = {}
    for _ in range(3):
        # Any previous requests will have triggered a delete of the inventory file
        # by marking it complete on the server-side. Re-send our RequestTaskInventory
        # to make sure there's a fresh copy.
        region.circuit.send(request_msg.take())
        inv_message = await region.message_handler.wait_for(
            ('ReplyTaskInventory',),
            predicate=lambda x: x["InventoryData"]["TaskID"] == obj.FullID,
            timeout=5.0,
        )
        # No task inventory, send the reply as-is
        file_name = inv_message["InventoryData"]["Filename"]
        if not file_name:
            # The "Contents" folder always has to be there, if we don't put it here
            # then the viewer will have to lie about it being there itself.
            return Response(
                llsd.format_xml({
                    "inventory": [
                        InventoryObject(
                            name="Contents",
                            parent_id=UUID.ZERO,
                            type="category",
                            obj_id=obj_id
                        ).to_llsd()
                    ],
                    "inv_serial": inv_message["InventoryData"]["Serial"],
                }),
                headers={"Content-Type": "application/llsd+xml"},
                status=200,
            )

        last_serial = request.args.get("last_serial", None)
        if last_serial:
            last_serial = int(last_serial)
            if inv_message["InventoryData"]["Serial"] == last_serial:
                # Nothing has changed since the version of the inventory they say they have, say so.
                return Response("", status=304)

        xfer = region.xfer_manager.request(
            file_name=file_name,
            file_path=XferFilePath.CACHE,
            turbo=True,
        )
        xfer.chunks.update(existing_chunks)
        try:
            await xfer
        except asyncio.TimeoutError:
            # We likely failed the request due to missing chunks, store
            # the chunks that we _did_ get for the next attempt.
            existing_chunks.update(xfer.chunks)
            continue

        inv_model = InventoryModel.from_str(xfer.reassemble_chunks().decode("utf8"))

        return Response(
            llsd.format_xml({
                "inventory": inv_model.to_llsd(),
                "inv_serial": inv_message["InventoryData"]["Serial"],
            }),
            headers={"Content-Type": "application/llsd+xml"},
        )
    raise asyncio.TimeoutError("Failed to get inventory after 3 tries")


class GetTaskInventoryCapExampleAddon(WebAppCapAddon):
    # A cap URL with this name will be tied to each region when
    # the sim is first connected to. The URL will be returned to the
    # viewer in the Seed if the viewer requests it by name.
    CAP_NAME = "GetTaskInventoryExample"
    # Any asgi app should be fine.
    APP = asgiref.wsgi.WsgiToAsgi(app)


addons = [GetTaskInventoryCapExampleAddon()]
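As a hedged illustration of how a consumer might call this cap: the handler above takes a `task_id` query parameter, an optional `last_serial`, and short-circuits with a 304 when the inventory serial is unchanged. A minimal sketch (the cap URL below is hypothetical; the real one is granted per-region via the Seed, and `requests` is just an ordinary HTTP client used for illustration):

```python
# Minimal sketch, assuming `cap_url` was obtained from the region's Seed
# response for the "GetTaskInventoryExample" cap (URL here is hypothetical).
import requests

cap_url = "https://simhost.example/cap/00000000-0000-0000-0000-000000000000"
resp = requests.get(cap_url, params={"task_id": "<object full UUID>", "last_serial": "7"})
if resp.status_code == 304:
    print("Inventory unchanged since serial 7; reuse the cached copy")
else:
    # Body is application/llsd+xml with "inventory" and "inv_serial" keys
    print(resp.headers["Content-Type"], len(resp.content))
```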
@@ -29,6 +29,7 @@ from hippolyzer.lib.base.datatypes import UUID
 from hippolyzer.lib.base.helpers import get_mtime
 from hippolyzer.lib.base.llanim import Animation
 from hippolyzer.lib.base.message.message import Block, Message
+from hippolyzer.lib.base.message.msgtypes import PacketFlags
 from hippolyzer.lib.proxy import addon_ctx
 from hippolyzer.lib.proxy.addons import AddonManager
 from hippolyzer.lib.proxy.addon_utils import BaseAddon, SessionProperty, GlobalProperty, show_message
@@ -133,6 +134,7 @@ class LocalAnimAddon(BaseAddon):
                 AgentID=session.agent_id,
                 SessionID=session.id,
             ),
+            flags=PacketFlags.RELIABLE,
         )

         # Stop any old version of the anim that might be playing first
@@ -159,7 +161,7 @@ class LocalAnimAddon(BaseAddon):
             cls.local_anim_playing_ids.pop(anim_name, None)
             cls.local_anim_bytes.pop(anim_name, None)

-        region.circuit.send_message(new_msg)
+        region.circuit.send(new_msg)
         print(f"Changing {anim_name} to {next_id}")

     @classmethod
@@ -81,17 +81,16 @@ class MeshUploadInterceptingAddon(BaseAddon):

     @handle_command()
     async def set_local_mesh_target(self, session: Session, region: ProxiedRegion):
-        """Set the currently selected object as the target for local mesh"""
-        parent_object = region.objects.lookup_localid(session.selected.object_local)
-        if not parent_object:
+        """Set the currently selected objects as the target for local mesh"""
+        selected_links = [region.objects.lookup_localid(l_id) for l_id in session.selected.object_locals]
+        selected_links = [o for o in selected_links if o is not None]
+        if not selected_links:
             show_message("Nothing selected")
             return
-        linkset_objects = [parent_object] + parent_object.Children

         old_locals = self.local_mesh_target_locals
         self.local_mesh_target_locals = [
             x.LocalID
-            for x in linkset_objects
+            for x in selected_links
             if ExtraParamType.MESH in x.ExtraParams
         ]
@@ -126,14 +126,14 @@ class MessageMirrorAddon(BaseAddon):

         # Send the message normally first if we're mirroring
         if message.name in MIRROR:
-            region.circuit.send_message(message)
+            region.circuit.send(message)

         # We're going to send the message on a new circuit, we need to take
         # it so we get a new packet ID and clean ACKs
         message = message.take()

         self._lludp_fixups(target_session, message)
-        target_region.circuit.send_message(message)
+        target_region.circuit.send(message)
         return True

     def _lludp_fixups(self, target_session: Session, message: Message):
@@ -27,7 +27,7 @@ from mitmproxy.http import HTTPFlow
 from hippolyzer.lib.base.datatypes import UUID
 from hippolyzer.lib.base.jp2_utils import BufferedJp2k
 from hippolyzer.lib.base.multiprocessing_utils import ParentProcessWatcher
-from hippolyzer.lib.base.templates import TextureEntry
+from hippolyzer.lib.base.templates import TextureEntryCollection
 from hippolyzer.lib.proxy.addon_utils import AssetAliasTracker, BaseAddon, GlobalProperty, AddonProcess
 from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
 from hippolyzer.lib.base.message.message import Message
@@ -148,7 +148,7 @@ class MonochromeAddon(BaseAddon):
             message["RegionInfo"][field_name] = tracker.get_alias_uuid(val)

     @staticmethod
-    def _make_te_monochrome(tracker: AssetAliasTracker, parsed_te: TextureEntry):
+    def _make_te_monochrome(tracker: AssetAliasTracker, parsed_te: TextureEntryCollection):
         # Need a deepcopy because TEs are owned by the ObjectManager
         # and we don't want to change the canonical view.
         parsed_te = copy.deepcopy(parsed_te)
111  addon_examples/object_management_validator.py  (new file)

@@ -0,0 +1,111 @@
"""
Check object manager state against region ViewerObject cache

Can't look at every object we've tracked and every object in VOCache
and report mismatches due to weird VOCache cache eviction criteria and certain
cacheable objects not being added to the VOCache.

Off the top of my head, animesh objects get explicit KillObjects at extreme
view distances same as avatars, but will still be present in the cache even
though they will not be in gObjectList.
"""
import asyncio
import logging
from typing import *

from hippolyzer.lib.base.objects import normalize_object_update_compressed_data
from hippolyzer.lib.base.templates import ObjectUpdateFlags, PCode
from hippolyzer.lib.proxy.addon_utils import BaseAddon, GlobalProperty
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import SessionManager, Session
from hippolyzer.lib.proxy.vocache import is_valid_vocache_dir, RegionViewerObjectCacheChain

LOG = logging.getLogger(__name__)


class ObjectManagementValidator(BaseAddon):
    base_cache_path: Optional[str] = GlobalProperty(None)
    orig_auto_request: Optional[bool] = GlobalProperty(None)

    def handle_init(self, session_manager: SessionManager):
        if self.orig_auto_request is None:
            self.orig_auto_request = session_manager.settings.ALLOW_AUTO_REQUEST_OBJECTS
        session_manager.settings.ALLOW_AUTO_REQUEST_OBJECTS = False

        async def _choose_cache_path():
            while not self.base_cache_path:
                cache_dir = await AddonManager.UI.open_dir("Choose the base cache directory")
                if not cache_dir:
                    return
                if not is_valid_vocache_dir(cache_dir):
                    continue
                self.base_cache_path = cache_dir

        if not self.base_cache_path:
            self._schedule_task(_choose_cache_path(), session_scoped=False)

    def handle_unload(self, session_manager: SessionManager):
        session_manager.settings.ALLOW_AUTO_REQUEST_OBJECTS = self.orig_auto_request

    def handle_session_init(self, session: Session):
        # Use only the specified cache path for the vocache
        session.cache_dir = self.base_cache_path

    def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
        if message.name != "DisableSimulator":
            return
        # Send it off to the client without handling it normally,
        # we need to defer region teardown in the proxy
        region.circuit.send(message)
        self._schedule_task(self._check_cache_before_region_teardown(region))
        return True

    async def _check_cache_before_region_teardown(self, region: ProxiedRegion):
        await asyncio.sleep(0.5)
        print("Ok, checking cache differences")
        try:
            # Index will have been rewritten, so re-read it.
            region_cache_chain = RegionViewerObjectCacheChain.for_region(
                handle=region.handle,
                cache_id=region.cache_id,
                cache_dir=self.base_cache_path
            )
            if not region_cache_chain.region_caches:
                print(f"no caches for {region!r}?")
                return
            all_full_ids = set()
            for obj in region.objects.all_objects:
                cacheable = True
                orig_obj = obj
                # Walk along the ancestry checking for things that would make the tree non-cacheable
                while obj is not None:
                    if obj.UpdateFlags & ObjectUpdateFlags.TEMPORARY_ON_REZ:
                        cacheable = False
                    if obj.PCode == PCode.AVATAR:
                        cacheable = False
                    obj = obj.Parent
                if cacheable:
                    all_full_ids.add(orig_obj.FullID)

            for key in all_full_ids:
                obj = region.objects.lookup_fullid(key)
                cached_data = region_cache_chain.lookup_object_data(obj.LocalID, obj.CRC)
                if not cached_data:
                    continue
                orig_dict = obj.to_dict()
                parsed_data = normalize_object_update_compressed_data(cached_data)
                updated = obj.update_properties(parsed_data)
                # Can't compare this yet
                updated -= {"TextureEntry"}
                if updated:
                    print(key)
                    for attr in updated:
                        print("\t", attr, orig_dict[attr], parsed_data[attr])
        finally:
            # Ok to teardown region in the proxy now
            region.mark_dead()


addons = [ObjectManagementValidator()]
@@ -37,7 +37,7 @@ class PaydayAddon(BaseAddon):
             chat_type=ChatType.SHOUT,
         )
         # Do the traditional money dance.
-        session.main_region.circuit.send_message(Message(
+        session.main_region.circuit.send(Message(
             "AgentAnimation",
             Block("AgentData", AgentID=session.agent_id, SessionID=session.id),
             Block("AnimationList", AnimID=UUID("928cae18-e31d-76fd-9cc9-2f55160ff818"), StartAnim=True),
@@ -14,8 +14,9 @@ from PySide6.QtGui import QImage
 from hippolyzer.lib.base.datatypes import UUID, Vector3, Quaternion
 from hippolyzer.lib.base.helpers import to_chunks
 from hippolyzer.lib.base.message.message import Block, Message
-from hippolyzer.lib.base.templates import ObjectUpdateFlags, PCode, MCode, MultipleObjectUpdateFlags, TextureEntry
-from hippolyzer.lib.client.object_manager import ObjectEvent, UpdateType
+from hippolyzer.lib.base.templates import ObjectUpdateFlags, PCode, MCode, MultipleObjectUpdateFlags, \
+    TextureEntryCollection, JUST_CREATED_FLAGS
+from hippolyzer.lib.client.object_manager import ObjectEvent, ObjectUpdateType
 from hippolyzer.lib.proxy.addon_utils import BaseAddon
 from hippolyzer.lib.proxy.addons import AddonManager
 from hippolyzer.lib.proxy.commands import handle_command
@@ -24,7 +25,6 @@ from hippolyzer.lib.proxy.region import ProxiedRegion
 from hippolyzer.lib.proxy.sessions import Session


-JUST_CREATED_FLAGS = (ObjectUpdateFlags.CREATE_SELECTED | ObjectUpdateFlags.OBJECT_YOU_OWNER)
 PRIM_SCALE = 0.2


@@ -72,7 +72,7 @@ class PixelArtistAddon(BaseAddon):
         # Watch for any newly created prims, this is basically what the viewer does to find
         # prims that it just created with the build tool.
         with session.objects.events.subscribe_async(
-            (UpdateType.OBJECT_UPDATE,),
+            (ObjectUpdateType.OBJECT_UPDATE,),
             predicate=lambda e: e.object.UpdateFlags & JUST_CREATED_FLAGS and "LocalID" in e.updated
         ) as get_events:
             # Create a pool of prims to use for building the pixel art
@@ -80,7 +80,7 @@ class PixelArtistAddon(BaseAddon):
             # TODO: We don't track the land group or user's active group, so
             # "anyone can build" must be on for rezzing to work.
             group_id = UUID()
-            region.circuit.send_message(Message(
+            region.circuit.send(Message(
                 'ObjectAdd',
                 Block('AgentData', AgentID=session.agent_id, SessionID=session.id, GroupID=group_id),
                 Block(
@@ -124,12 +124,12 @@ class PixelArtistAddon(BaseAddon):
             y = i // width
             obj = created_prims[prim_idx]
             # Set a blank texture on all faces
-            te = TextureEntry()
+            te = TextureEntryCollection()
             te.Textures[None] = UUID('5748decc-f629-461c-9a36-a35a221fe21f')
             # Set the prim color to the color from the pixel
             te.Color[None] = pixel_color
             # Set the prim texture and color
-            region.circuit.send_message(Message(
+            region.circuit.send(Message(
                 'ObjectImage',
                 Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
                 Block('ObjectData', ObjectLocalID=obj.LocalID, MediaURL=b'', TextureEntry_=te),
@@ -149,7 +149,7 @@ class PixelArtistAddon(BaseAddon):

             # Move the "pixels" to their correct position in chunks
             for chunk in to_chunks(positioning_blocks, 25):
-                region.circuit.send_message(Message(
+                region.circuit.send(Message(
                     'MultipleObjectUpdate',
                     Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
                     *chunk,
@@ -116,7 +116,7 @@ class RecapitatorAddon(BaseAddon):
         except:
             logging.exception("Exception while recapitating")
         # Tell the viewer about the status of its original upload
-        region.circuit.send_message(Message(
+        region.circuit.send(Message(
             "AssetUploadComplete",
             Block("AssetBlock", UUID=asset_id, Type=asset_block["Type"], Success=success),
             direction=Direction.IN,
22  addon_examples/simulate_packet_loss.py  (new file)

@@ -0,0 +1,22 @@
import random

from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.proxy.region import ProxiedRegion
from hippolyzer.lib.proxy.sessions import Session


class SimulatePacketLossAddon(BaseAddon):
    def handle_lludp_message(self, session: Session, region: ProxiedRegion, message: Message):
        # Messing with these may kill your circuit
        if message.name in {"PacketAck", "StartPingCheck", "CompletePingCheck", "UseCircuitCode",
                            "CompleteAgentMovement", "AgentMovementComplete"}:
            return
        # Simulate 30% packet loss
        if random.random() > 0.7:
            # Do nothing, drop this packet on the floor
            return True
        return


addons = [SimulatePacketLossAddon()]
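The `random.random() > 0.7` check drops each eligible packet with probability 0.3. If you wanted reproducible loss patterns while debugging resend behavior, a hedged variation (the rate and seed below are illustrative, not from the repo):

```python
# Minimal sketch: a seeded RNG makes the drop pattern repeatable run-to-run.
import random

DROP_RATE = 0.3  # matches the 30% used by SimulatePacketLossAddon
rng = random.Random(1234)  # fixed seed so runs are comparable

def should_drop() -> bool:
    return rng.random() < DROP_RATE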
@@ -35,7 +35,7 @@ class TransferExampleAddon(BaseAddon):
     async def get_first_script(self, session: Session, region: ProxiedRegion):
         """Get the contents of the first script in the selected object"""
         # Ask for the object inventory so we can find a script
-        region.circuit.send_message(Message(
+        region.circuit.send(Message(
             'RequestTaskInventory',
             Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
             Block('InventoryData', LocalID=session.selected.object_local),
@@ -47,7 +47,7 @@ class TransferExampleAddon(BaseAddon):
             file_name=inv_message["InventoryData"]["Filename"], file_path=XferFilePath.CACHE)
         inv_model = InventoryModel.from_bytes(xfer.reassemble_chunks())
         first_script: Optional[InventoryItem] = None
-        for item in inv_model.items.values():
+        for item in inv_model.all_items:
             if item.type == "lsltext":
                 first_script = item
         if not first_script:
@@ -64,12 +64,12 @@ class TurboObjectInventoryAddon(BaseAddon):
             # Any previous requests will have triggered a delete of the inventory file
             # by marking it complete on the server-side. Re-send our RequestTaskInventory
             # to make sure there's a fresh copy.
-            region.circuit.send_message(request_msg.take())
+            region.circuit.send(request_msg.take())
             inv_message = await region.message_handler.wait_for(('ReplyTaskInventory',), timeout=5.0)
             # No task inventory, send the reply as-is
             file_name = inv_message["InventoryData"]["Filename"]
             if not file_name:
-                region.circuit.send_message(inv_message)
+                region.circuit.send(inv_message)
                 return

             xfer = region.xfer_manager.request(
@@ -87,7 +87,7 @@ class TurboObjectInventoryAddon(BaseAddon):
                 continue

             # Send the original ReplyTaskInventory to the viewer so it knows the file is ready
-            region.circuit.send_message(inv_message)
+            region.circuit.send(inv_message)
             proxied_xfer = Xfer(data=xfer.reassemble_chunks())

             # Wait for the viewer to request the inventory file
@@ -102,7 +102,7 @@ class UploaderAddon(BaseAddon):
             ais_item_to_inventory_data(ais_item),
             direction=Direction.IN
         )
-        region.circuit.send_message(message)
+        region.circuit.send(message)


 addons = [UploaderAddon()]
@@ -15,7 +15,7 @@ class XferExampleAddon(BaseAddon):
     @handle_command()
     async def get_mute_list(self, session: Session, region: ProxiedRegion):
         """Fetch the current user's mute list"""
-        region.circuit.send_message(Message(
+        region.circuit.send(Message(
             'MuteListRequest',
             Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
             Block("MuteData", MuteCRC=0),
@@ -35,7 +35,7 @@ class XferExampleAddon(BaseAddon):
     @handle_command()
     async def get_task_inventory(self, session: Session, region: ProxiedRegion):
         """Get the inventory of the currently selected object"""
-        region.circuit.send_message(Message(
+        region.circuit.send(Message(
             'RequestTaskInventory',
             # If no session is passed in we'll use the active session when the coro was created
             Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
@@ -57,7 +57,7 @@ class XferExampleAddon(BaseAddon):
         await xfer

         inv_model = InventoryModel.from_bytes(xfer.reassemble_chunks())
-        item_names = [item.name for item in inv_model.items.values()]
+        item_names = [item.name for item in inv_model.all_items]
         show_message(item_names)

     @handle_command()
@@ -98,7 +98,7 @@ textures 1
             data=asset_data,
             transaction_id=transaction_id
         )
-        region.circuit.send_message(Message(
+        region.circuit.send(Message(
             'CreateInventoryItem',
             Block('AgentData', AgentID=session.agent_id, SessionID=session.id),
             Block(
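The `items.values()` to `all_items` change above suggests `InventoryModel` now exposes a unified accessor over its items; a hedged sketch of the new iteration style, using only names that appear in these hunks:

```python
# Minimal sketch, assuming an InventoryModel already parsed from a task
# inventory Xfer as in the hunks above. `all_items` replaces iterating
# `items.values()` directly.
from typing import List
from hippolyzer.lib.base.inventory import InventoryModel

def script_names(inv_model: InventoryModel) -> List[str]:
    # Collect the names of all LSL scripts in the object's inventory
    return [item.name for item in inv_model.all_items if item.type == "lsltext"]
```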
@@ -87,10 +87,13 @@ class REPLAddon(BaseAddon):
 def run_http_proxy_process(proxy_host, http_proxy_port, flow_context: HTTPFlowContext):
-    mitm_loop = asyncio.new_event_loop()
-    asyncio.set_event_loop(mitm_loop)
-    mitmproxy_master = create_http_proxy(proxy_host, http_proxy_port, flow_context)
-    mitmproxy_master.start_server()
-    gc.freeze()
-    mitm_loop.run_forever()
+    async def mitmproxy_loop():
+        mitmproxy_master = create_http_proxy(proxy_host, http_proxy_port, flow_context)
+        gc.freeze()
+        await mitmproxy_master.run()
+
+    asyncio.run(mitmproxy_loop())


 def start_proxy(session_manager: SessionManager, extra_addons: Optional[list] = None,
@@ -105,7 +108,7 @@ def start_proxy(session_manager: SessionManager, extra_addons: Optional[list] =
     root_log.setLevel(logging.INFO)
     logging.basicConfig()

-    loop = asyncio.get_event_loop()
+    loop = asyncio.get_event_loop_policy().get_event_loop()

     udp_proxy_port = session_manager.settings.SOCKS_PROXY_PORT
     http_proxy_port = session_manager.settings.HTTP_PROXY_PORT
@@ -62,7 +62,7 @@ def show_error_message(error_msg, parent=None):
     error_dialog = QtWidgets.QErrorMessage(parent=parent)
     # No obvious way to set this to plaintext, yuck...
     error_dialog.showMessage(html.escape(error_msg))
-    error_dialog.exec_()
+    error_dialog.exec()
     error_dialog.raise_()


@@ -89,13 +89,13 @@ class GUISessionManager(SessionManager, QtCore.QObject):
         self.all_regions = new_regions


-class GUIInteractionManager(BaseInteractionManager, QtCore.QObject):
-    def __init__(self, parent):
+class GUIInteractionManager(BaseInteractionManager):
+    def __init__(self, parent: QtWidgets.QWidget):
         BaseInteractionManager.__init__(self)
-        QtCore.QObject.__init__(self, parent=parent)
+        self._parent = parent

     def main_window_handle(self) -> Any:
-        return self.parent()
+        return self._parent

     def _dialog_async_exec(self, dialog: QtWidgets.QDialog):
         future = asyncio.Future()
@@ -107,7 +107,7 @@ class GUIInteractionManager(BaseInteractionManager):
         self, caption: str, directory: str, filter_str: str, mode: QtWidgets.QFileDialog.FileMode,
         default_suffix: str = '',
     ) -> Tuple[bool, QtWidgets.QFileDialog]:
-        dialog = QtWidgets.QFileDialog(self.parent(), caption=caption, directory=directory, filter=filter_str)
+        dialog = QtWidgets.QFileDialog(self._parent, caption=caption, directory=directory, filter=filter_str)
         dialog.setFileMode(mode)
         if mode == QtWidgets.QFileDialog.FileMode.AnyFile:
             dialog.setAcceptMode(QtWidgets.QFileDialog.AcceptMode.AcceptSave)
@@ -155,7 +155,7 @@ class GUIInteractionManager(BaseInteractionManager):
             title,
             caption,
             QtWidgets.QMessageBox.Ok | QtWidgets.QMessageBox.Cancel,
-            self.parent(),
+            self._parent,
         )
         fut = asyncio.Future()
         msg.finished.connect(lambda r: fut.set_result(r))
@@ -323,7 +323,7 @@ class MessageLogWindow(QtWidgets.QMainWindow):

     def _manageFilters(self):
         dialog = FilterDialog(self)
-        dialog.exec_()
+        dialog.exec()

     @nonFatalExceptions
     def setFilter(self, filter_str=None):
@@ -360,21 +360,20 @@ class MessageLogWindow(QtWidgets.QMainWindow):
             beautify=self.checkBeautify.isChecked(),
             replacements=buildReplacements(entry.session, entry.region),
         )
-        highlight_range = None
-        if isinstance(req, SpannedString):
-            match_result = self.model.filter.match(entry)
-            # Match result was a tuple indicating what matched
-            if isinstance(match_result, tuple):
-                highlight_range = req.spans.get(match_result)
-
         self.textRequest.setPlainText(req)
-        if highlight_range:
-            cursor = self.textRequest.textCursor()
-            cursor.setPosition(highlight_range[0], QtGui.QTextCursor.MoveAnchor)
-            cursor.setPosition(highlight_range[1], QtGui.QTextCursor.KeepAnchor)
-            highlight_format = QtGui.QTextBlockFormat()
-            highlight_format.setBackground(QtCore.Qt.yellow)
-            cursor.setBlockFormat(highlight_format)
+        # The string has a map of fields and their associated positions within the string,
+        # use that to highlight any individual fields the filter matched on.
+        if isinstance(req, SpannedString):
+            for field in self.model.filter.match(entry, short_circuit=False).fields:
+                field_span = req.spans.get(field)
+                if not field_span:
+                    continue
+                cursor = self.textRequest.textCursor()
+                cursor.setPosition(field_span[0], QtGui.QTextCursor.MoveAnchor)
+                cursor.setPosition(field_span[1], QtGui.QTextCursor.KeepAnchor)
+                highlight_format = QtGui.QTextBlockFormat()
+                highlight_format.setBackground(QtCore.Qt.yellow)
+                cursor.setBlockFormat(highlight_format)

         resp = entry.response(beautify=self.checkBeautify.isChecked())
         if resp:
@@ -482,7 +481,7 @@ class MessageLogWindow(QtWidgets.QMainWindow):

     def _manageAddons(self):
         dialog = AddonDialog(self)
-        dialog.exec_()
+        dialog.exec()

     def getAddonList(self) -> List[str]:
         return self.sessionManager.settings.ADDON_SCRIPTS
@@ -697,9 +696,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
         msg = HumanMessageSerializer.from_human_string(msg_text, replacements, env, safe=False)
         if self.checkLLUDPViaCaps.isChecked():
             if msg.direction == Direction.IN:
-                region.eq_manager.inject_event(
-                    self.llsdSerializer.serialize(msg, as_dict=True)
-                )
+                region.eq_manager.inject_message(msg)
             else:
                 self._sendHTTPRequest(
                     "POST",
@@ -712,18 +709,25 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
         off_circuit = self.checkOffCircuit.isChecked()
         if off_circuit:
             transport = SocketUDPTransport(socket.socket(socket.AF_INET, socket.SOCK_DGRAM))
-        region.circuit.send_message(msg, transport=transport)
+        region.circuit.send(msg, transport=transport)
         if off_circuit:
             transport.close()

-    def _sendEQMessage(self, session, region: Optional[ProxiedRegion], msg_text: str, _replacements: dict):
+    def _sendEQMessage(self, session, region: Optional[ProxiedRegion], msg_text: str, replacements: dict):
         if not session or not region:
             raise RuntimeError("Need a valid session and region to send EQ event")
         message_line, _, body = (x.strip() for x in msg_text.partition("\n"))
         message_name = message_line.rsplit(" ", 1)[-1]

+        env = self._buildEnv(session, region)
+
+        def directive_handler(m):
+            return self._handleHTTPDirective(env, replacements, False, m)
+
+        body = re.sub(rb"<!HIPPO(\w+)\[\[(.*?)]]>", directive_handler, body.encode("utf8"), flags=re.S)
+
         region.eq_manager.inject_event({
             "message": message_name,
-            "body": llsd.parse_xml(body.encode("utf8")),
+            "body": llsd.parse_xml(body),
         })

     def _sendHTTPMessage(self, session, region, msg_text: str, replacements: dict):
@@ -787,7 +791,10 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
             val = subfield_eval(contents.decode("utf8").strip(), globals_={**env, **replacements})
             val = _coerce_to_bytes(val)
         elif directive == b"REPL":
-            val = _coerce_to_bytes(replacements[contents.decode("utf8").strip()])
+            repl = replacements[contents.decode("utf8").strip()]
+            if callable(repl):
+                repl = repl()
+            val = _coerce_to_bytes(repl)
         else:
             raise ValueError(f"Unknown directive {directive}")
306  hippolyzer/lib/base/colladatools.py  (new file)

@@ -0,0 +1,306 @@
# This currently implements basic LLMesh -> Collada.
#
# TODO:
# * inverse, Collada -> LLMesh (for simple cases, maybe using impasse rather than pycollada)
# * round-tripping tests, LLMesh->Collada->LLMesh
# * * Can't really test using Collada->LLMesh->Collada because Collada->LLMesh is almost always
#     going to be lossy due to how SL represents vertex data and materials compared to what
#     Collada allows.
# * Eventually scrap this and just use GLTF instead once we know we have the semantics correct
# * * Collada was just easier to bootstrap given that it's the only officially supported input format
# * * Collada tooling sucks and even LL is moving away from it
# * * Ensuring LLMesh->Collada and LLMesh->GLTF conversion don't differ semantically is easy via assimp.

import collections
import os.path
import secrets
import statistics
import sys
from typing import Dict, List, Iterable, Optional

import collada
import collada.source
from collada import E
from lxml import etree
import numpy as np
import transformations

from hippolyzer.lib.base.helpers import get_resource_filename
from hippolyzer.lib.base.serialization import BufferReader
from hippolyzer.lib.base.mesh import LLMeshSerializer, MeshAsset, positions_from_domain, SkinSegmentDict

DIR = os.path.dirname(os.path.realpath(__file__))


def mesh_to_collada(ll_mesh: MeshAsset, include_skin=True) -> collada.Collada:
    dae = collada.Collada()
    axis = collada.asset.UP_AXIS.Z_UP
    dae.assetInfo.upaxis = axis
    scene = collada.scene.Scene("scene", [llmesh_to_node(ll_mesh, dae, include_skin=include_skin)])

    dae.scenes.append(scene)
    dae.scene = scene
    return dae


def llmesh_to_node(ll_mesh: MeshAsset, dae: collada.Collada, uniq=None,
                   include_skin=True, node_transform: Optional[np.ndarray] = None) -> collada.scene.Node:
    if node_transform is None:
        node_transform = np.identity(4)

    should_skin = False
    skin_seg = ll_mesh.segments.get('skin')
    bind_shape_matrix = None
    if include_skin and skin_seg:
        bind_shape_matrix = np.array(skin_seg["bind_shape_matrix"]).reshape((4, 4))
        should_skin = True
        # Transform from the skin will be applied on the controller, not the node
        node_transform = np.identity(4)

    if not uniq:
        uniq = secrets.token_urlsafe(4)

    geom_nodes = []
    node_name = f"mainnode{uniq}"
    # TODO: do the other LODs?
    for submesh_num, submesh in enumerate(ll_mesh.segments["high_lod"]):
        # Make sure none of our IDs collide with those of other nodes
        sub_uniq = uniq + str(submesh_num)

        range_xyz = positions_from_domain(submesh["Position"], submesh["PositionDomain"])
        xyz = np.array([x.data() for x in range_xyz])

        range_uv = positions_from_domain(submesh['TexCoord0'], submesh['TexCoord0Domain'])
        uv = np.array([x.data() for x in range_uv]).flatten()

        norms = np.array([x.data() for x in submesh["Normal"]])

        effect = collada.material.Effect(
            id=f"effect{sub_uniq}",
            params=[],
            specular=(0.0, 0.0, 0.0, 0.0),
            reflectivity=(0.0, 0.0, 0.0, 0.0),
            emission=(0.0, 0.0, 0.0, 0.0),
            ambient=(0.0, 0.0, 0.0, 0.0),
            reflective=0.0,
            shadingtype="blinn",
            shininess=0.0,
            diffuse=(0.0, 0.0, 0.0),
        )
        mat = collada.material.Material(f"material{sub_uniq}", f"material{sub_uniq}", effect)

        dae.materials.append(mat)
        dae.effects.append(effect)

        vert_src = collada.source.FloatSource(f"verts-array{sub_uniq}", xyz.flatten(), ("X", "Y", "Z"))
        norm_src = collada.source.FloatSource(f"norms-array{sub_uniq}", norms.flatten(), ("X", "Y", "Z"))
        # UV maps have to have the same name or they'll behave weirdly when objects are merged.
        uv_src = collada.source.FloatSource("uvs-array", np.array(uv), ("U", "V"))

        geom = collada.geometry.Geometry(dae, f"geometry{sub_uniq}", "geometry", [vert_src, norm_src, uv_src])

        input_list = collada.source.InputList()
        input_list.addInput(0, 'VERTEX', f'#verts-array{sub_uniq}', set="0")
        input_list.addInput(0, 'NORMAL', f'#norms-array{sub_uniq}', set="0")
        input_list.addInput(0, 'TEXCOORD', '#uvs-array', set="0")

        tri_idxs = np.array(submesh["TriangleList"]).flatten()
        matnode = collada.scene.MaterialNode(f"materialref{sub_uniq}", mat, inputs=[])
        tri_set = geom.createTriangleSet(tri_idxs, input_list, f'materialref{sub_uniq}')
        geom.primitives.append(tri_set)
        dae.geometries.append(geom)

        if should_skin:
            joint_names = np.array(skin_seg['joint_names'], dtype=object)
            joints_source = collada.source.NameSource(f"joint-names{sub_uniq}", joint_names, ("JOINT",))
            # PyCollada has a bug where it doesn't set the source URI correctly. Fix it.
            accessor = joints_source.xmlnode.find(f"{dae.tag('technique_common')}/{dae.tag('accessor')}")
            if not accessor.get('source').startswith('#'):
                accessor.set('source', f"#{accessor.get('source')}")

            flattened_bind_poses = []
            # LLMesh matrices are row-major, convert to col-major for Collada.
            for bind_pose in skin_seg['inverse_bind_matrix']:
                flattened_bind_poses.append(np.array(bind_pose).reshape((4, 4)).flatten('F'))
            flattened_bind_poses = np.array(flattened_bind_poses)
            inv_bind_source = _create_mat4_source(f"bind-poses{sub_uniq}", flattened_bind_poses, "TRANSFORM")

            weight_joint_idxs = []
            weights = []
            vert_weight_counts = []
            cur_weight_idx = 0
            for vert_weights in submesh['Weights']:
                vert_weight_counts.append(len(vert_weights))
                for vert_weight in vert_weights:
                    weights.append(vert_weight.weight)
                    weight_joint_idxs.append(vert_weight.joint_idx)
                    weight_joint_idxs.append(cur_weight_idx)
                    cur_weight_idx += 1

            weights_source = collada.source.FloatSource(f"skin-weights{sub_uniq}", np.array(weights), ("WEIGHT",))
            # We need to make a controller for each material since materials are essentially distinct meshes
            # in SL, with their own distinct sets of weights and vertex data.
            controller_node = E.controller(
                E.skin(
                    E.bind_shape_matrix(' '.join(str(x) for x in bind_shape_matrix.flatten('F'))),
                    joints_source.xmlnode,
                    inv_bind_source.xmlnode,
                    weights_source.xmlnode,
                    E.joints(
                        E.input(semantic="JOINT", source=f"#joint-names{sub_uniq}"),
                        E.input(semantic="INV_BIND_MATRIX", source=f"#bind-poses{sub_uniq}")
                    ),
                    E.vertex_weights(
                        E.input(semantic="JOINT", source=f"#joint-names{sub_uniq}", offset="0"),
                        E.input(semantic="WEIGHT", source=f"#skin-weights{sub_uniq}", offset="1"),
                        E.vcount(' '.join(str(x) for x in vert_weight_counts)),
                        E.v(' '.join(str(x) for x in weight_joint_idxs)),
                        count=str(len(submesh['Weights']))
                    ),
                    source=f"#geometry{sub_uniq}"
                ),
                id=f"Armature-{sub_uniq}",
                name=node_name
            )
            controller = collada.controller.Controller.load(dae, {}, controller_node)
            dae.controllers.append(controller)
            geom_node = collada.scene.ControllerNode(controller, [matnode])
        else:
            geom_node = collada.scene.GeometryNode(geom, [matnode])

        geom_nodes.append(geom_node)

    node = collada.scene.Node(
        node_name,
        children=geom_nodes,
        transforms=[collada.scene.MatrixTransform(np.array(node_transform.flatten('F')))],
    )
    if should_skin:
        # We need a skeleton per _mesh asset_ because you could have incongruous skeletons
        # within the same linkset.
        skel_root = load_skeleton_nodes()
        transform_skeleton(skel_root, dae, skin_seg)
        skel = collada.scene.Node.load(dae, skel_root, {})
        skel.children.append(node)
        skel.id = f"Skel-{uniq}"
        skel.save()
        node = skel
    return node


def load_skeleton_nodes() -> etree.ElementBase:
    # TODO: this sucks. Can't we construct nodes with the appropriate transformation
    #  matrices from the data in `avatar_skeleton.xml`?
    skel_path = get_resource_filename("lib/base/data/male_collada_joints.xml")
    with open(skel_path, 'r') as f:
        return etree.fromstring(f.read())


def transform_skeleton(skel_root: etree.ElementBase, dae: collada.Collada, skin_seg: SkinSegmentDict,
                       include_unreferenced_bones=False):
    """Update skeleton XML nodes to account for joint translations in the mesh"""
    # TODO: Use translation component only.
    joint_nodes: Dict[str, collada.scene.Node] = {}
    for skel_node in skel_root.iter():
        # xpath is loathsome so this is easier.
        if skel_node.tag != dae.tag('node') or skel_node.get('type') != 'JOINT':
            continue
        joint_nodes[skel_node.get('name')] = collada.scene.Node.load(dae, skel_node, {})
    for joint_name, matrix in zip(skin_seg['joint_names'], skin_seg.get('alt_inverse_bind_matrix', [])):
        joint_node = joint_nodes[joint_name]
        joint_node.matrix = np.array(matrix).reshape((4, 4)).flatten('F')
        # Update the underlying XML element with the new transform matrix
        joint_node.save()

    if not include_unreferenced_bones:
        needed_heirarchy = set()
        for skel_node in joint_nodes.values():
            skel_node = skel_node.xmlnode
            if skel_node.get('name') in skin_seg['joint_names']:
                # Add this joint and any ancestors to the list of needed joints
                while skel_node is not None:
                    needed_heirarchy.add(skel_node.get('name'))
                    skel_node = skel_node.getparent()

        for skel_node in joint_nodes.values():
            skel_node = skel_node.xmlnode
            if skel_node.get('name') not in needed_heirarchy:
                skel_node.getparent().remove(skel_node)

    pelvis_offset = skin_seg.get('pelvis_offset')

    # TODO: should we even do this here? It's not present in the collada, just
    #  something that's specified in the uploader before conversion to LLMesh.
    if pelvis_offset and 'mPelvis' in joint_nodes:
        pelvis_node = joint_nodes['mPelvis']
        # Column-major!
        pelvis_node.matrix[3][2] += pelvis_offset
        pelvis_node.save()


def _create_mat4_source(name: str, data: np.ndarray, semantic: str):
    # PyCollada has no way to make a source with a float4x4 semantic. Do it a bad way.
    # Note that collada demands column-major matrices whereas LLSD mesh has them row-major!
    source = collada.source.FloatSource(name, data, tuple(f"M{x}" for x in range(16)))
    accessor = source.xmlnode[1][0]
    for child in list(accessor):
        accessor.remove(child)
    accessor.append(E.param(name=semantic, type="float4x4"))
    return source


def fix_weird_bind_matrices(skin_seg: SkinSegmentDict):
    """
    Fix weird-looking bind matrices to have normal scaling

    Not sure why these even happen (weird mesh authoring programs?)
    Sometimes get enormous inverse bind matrices (each component 10k+) and tiny
    bind shape matrix components. This detects inverse bind shape matrices
    with weird scales and tries to set them to what they "should" be without
    the weird inverted scaling.
    """
    axis_counters = [collections.Counter() for _ in range(3)]
    for joint_inv in skin_seg['inverse_bind_matrix']:
        joint_mat = np.array(joint_inv).reshape((4, 4))
        joint_scale = transformations.decompose_matrix(joint_mat)[0]
        for axis_counter, axis_val in zip(axis_counters, joint_scale):
            axis_counter[axis_val] += 1
    most_common_inv_scale = []
    for axis_counter in axis_counters:
        most_common_inv_scale.append(axis_counter.most_common(1)[0][0])

    if abs(1.0 - statistics.fmean(most_common_inv_scale)) > 1.0:
        # The magnitude of the scales in the inverse bind matrices look very strange.
        # The bind matrix itself is probably messed up as well, try to fix it.
        skin_seg['bind_shape_matrix'] = fix_llsd_matrix_scale(skin_seg['bind_shape_matrix'], most_common_inv_scale)
        if joint_positions := skin_seg.get('alt_inverse_bind_matrix', None):
            fix_matrix_list_scale(joint_positions, most_common_inv_scale)
        rev_scale = tuple(1.0 / x for x in most_common_inv_scale)
        fix_matrix_list_scale(skin_seg['inverse_bind_matrix'], rev_scale)


def fix_matrix_list_scale(source: List[List[float]], scale_fixup: Iterable[float]):
    for i, alt_inv_matrix in enumerate(source):
        source[i] = fix_llsd_matrix_scale(alt_inv_matrix, scale_fixup)


def fix_llsd_matrix_scale(source: List[float], scale_fixup: Iterable[float]):
    matrix = np.array(source).reshape((4, 4))
    decomposed = list(transformations.decompose_matrix(matrix))
    # Need to handle both the scale and translation matrices
    for idx in (0, 3):
        decomposed[idx] = tuple(x * y for x, y in zip(decomposed[idx], scale_fixup))
    return list(transformations.compose_matrix(*decomposed).flatten('C'))


def main():
    # Take an llmesh file as an argument and spit out basename-converted.dae
    with open(sys.argv[1], "rb") as f:
        reader = BufferReader("<", f.read())

    mesh = mesh_to_collada(reader.read(LLMeshSerializer(parse_segment_contents=True)))
    mesh.write(sys.argv[1].rsplit(".", 1)[0] + "-converted.dae")


if __name__ == "__main__":
    main()
485
hippolyzer/lib/base/data/male_collada_joints.xml
Normal file
485
hippolyzer/lib/base/data/male_collada_joints.xml
Normal file
@@ -0,0 +1,485 @@
|
||||
<!-- from http://wiki.secondlife.com/wiki/Project_Bento_Resources_and_Information collada -->
|
||||
<node id="Avatar" name="Avatar" type="NODE" xmlns="http://www.collada.org/2005/11/COLLADASchema">
|
||||
<translate sid="location">0 0 0</translate>
|
||||
<rotate sid="rotationZ">0 0 1 0</rotate>
|
||||
<rotate sid="rotationY">0 1 0 0</rotate>
|
||||
<rotate sid="rotationX">1 0 0 0</rotate>
|
||||
<scale sid="scale">1 1 1</scale>
|
||||
<node id="mPelvis" name="mPelvis" sid="mPelvis" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 1.067 0 0 0 1</matrix>
|
||||
<node id="PELVIS" name="PELVIS" sid="PELVIS" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.01 0 1 0 0 0 0 1 -0.02 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="BUTT" name="BUTT" sid="BUTT" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.06 0 1 0 0 0 0 1 -0.1 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mSpine1" name="mSpine1" sid="mSpine1" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0.084 0 0 0 1</matrix>
|
||||
<node id="mSpine2" name="mSpine2" sid="mSpine2" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 -0.084 0 0 0 1</matrix>
|
||||
<node id="mTorso" name="mTorso" sid="mTorso" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0.084 0 0 0 1</matrix>
|
||||
<node id="BELLY" name="BELLY" sid="BELLY" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.028 0 1 0 0 0 0 1 0.04 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="LEFT_HANDLE" name="LEFT_HANDLE" sid="LEFT_HANDLE" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0 0 1 0 0.1 0 0 1 0.058 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="RIGHT_HANDLE" name="RIGHT_HANDLE" sid="RIGHT_HANDLE" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0 0 1 0 -0.1 0 0 1 0.058 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="LOWER_BACK" name="LOWER_BACK" sid="LOWER_BACK" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0.023 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mSpine3" name="mSpine3" sid="mSpine3" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.015 0 1 0 0 0 0 1 0.205 0 0 0 1</matrix>
|
||||
<node id="mSpine4" name="mSpine4" sid="mSpine4" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.015 0 1 0 0 0 0 1 -0.205 0 0 0 1</matrix>
|
||||
<node id="mChest" name="mChest" sid="mChest" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.015 0 1 0 0 0 0 1 0.205 0 0 0 1</matrix>
|
||||
<node id="CHEST" name="CHEST" sid="CHEST" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.028 0 1 0 0 0 0 1 0.07 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="LEFT_PEC" name="LEFT_PEC" sid="LEFT_PEC" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.119 0 1 0 0.082 0 0 1 0.042 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="RIGHT_PEC" name="RIGHT_PEC" sid="RIGHT_PEC" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.119 0 1 0 -0.082 0 0 1 0.042 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="UPPER_BACK" name="UPPER_BACK" sid="UPPER_BACK" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0.017 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mNeck" name="mNeck" sid="mNeck" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.01 0 1 0 0 0 0 1 0.251 0 0 0 1</matrix>
|
||||
<node id="NECK" name="NECK" sid="NECK" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0.02 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mHead" name="mHead" sid="mHead" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0.076 0 0 0 1</matrix>
|
||||
<node id="HEAD" name="HEAD" sid="HEAD" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.02 0 1 0 0 0 0 1 0.07 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mSkull" name="mSkull" sid="mSkull" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0.079 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mEyeRight" name="mEyeRight" sid="mEyeRight" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.098 0 1 0 -0.036 0 0 1 0.079 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mEyeLeft" name="mEyeLeft" sid="mEyeLeft" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.098 0 1 0 0.036 0 0 1 0.079 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mFaceRoot" name="mFaceRoot" sid="mFaceRoot" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.025 0 1 0 0 0 0 1 0.045 0 0 0 1</matrix>
|
||||
<node id="mFaceEyeAltRight" name="mFaceEyeAltRight" sid="mFaceEyeAltRight" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.073 0 1 0 -0.036 0 0 1 0.034 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mFaceEyeAltLeft" name="mFaceEyeAltLeft" sid="mFaceEyeAltLeft" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.073 0 1 0 0.036 0 0 1 0.034 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mFaceForeheadLeft" name="mFaceForeheadLeft" sid="mFaceForeheadLeft" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.061 0 1 0 0.035 0 0 1 0.083 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mFaceForeheadRight" name="mFaceForeheadRight" sid="mFaceForeheadRight" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.061 0 1 0 -0.035 0 0 1 0.083 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mFaceEyebrowOuterLeft" name="mFaceEyebrowOuterLeft" sid="mFaceEyebrowOuterLeft" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.064 0 1 0 0.051 0 0 1 0.048 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mFaceEyebrowCenterLeft" name="mFaceEyebrowCenterLeft" sid="mFaceEyebrowCenterLeft" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.07 0 1 0 0.043 0 0 1 0.056 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mFaceEyebrowInnerLeft" name="mFaceEyebrowInnerLeft" sid="mFaceEyebrowInnerLeft" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.075 0 1 0 0.022 0 0 1 0.051 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mFaceEyebrowOuterRight" name="mFaceEyebrowOuterRight" sid="mFaceEyebrowOuterRight" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.064 0 1 0 -0.051 0 0 1 0.048 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mFaceEyebrowCenterRight" name="mFaceEyebrowCenterRight" sid="mFaceEyebrowCenterRight" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.07 0 1 0 -0.043 0 0 1 0.056 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mFaceEyebrowInnerRight" name="mFaceEyebrowInnerRight" sid="mFaceEyebrowInnerRight" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.075 0 1 0 -0.022 0 0 1 0.051 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mFaceEyeLidUpperLeft" name="mFaceEyeLidUpperLeft" sid="mFaceEyeLidUpperLeft" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.073 0 1 0 0.036 0 0 1 0.034 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mFaceEyeLidLowerLeft" name="mFaceEyeLidLowerLeft" sid="mFaceEyeLidLowerLeft" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.073 0 1 0 0.036 0 0 1 0.034 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mFaceEyeLidUpperRight" name="mFaceEyeLidUpperRight" sid="mFaceEyeLidUpperRight" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.073 0 1 0 -0.036 0 0 1 0.034 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mFaceEyeLidLowerRight" name="mFaceEyeLidLowerRight" sid="mFaceEyeLidLowerRight" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.073 0 1 0 -0.036 0 0 1 0.034 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mFaceEar1Left" name="mFaceEar1Left" sid="mFaceEar1Left" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0 0 1 0 0.08 0 0 1 0.002 0 0 0 1</matrix>
|
||||
<node id="mFaceEar2Left" name="mFaceEar2Left" sid="mFaceEar2Left" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.019 0 1 0 0.018 0 0 1 0.025 0 0 0 1</matrix>
|
||||
</node>
|
||||
</node>
|
||||
<node id="mFaceEar1Right" name="mFaceEar1Right" sid="mFaceEar1Right" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0 0 1 0 -0.08 0 0 1 0.002 0 0 0 1</matrix>
|
||||
<node id="mFaceEar2Right" name="mFaceEar2Right" sid="mFaceEar2Right" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.019 0 1 0 -0.018 0 0 1 0.025 0 0 0 1</matrix>
|
||||
</node>
|
||||
</node>
|
||||
<node id="mFaceNoseLeft" name="mFaceNoseLeft" sid="mFaceNoseLeft" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.086 0 1 0 0.015 0 0 1 -0.004 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mFaceNoseCenter" name="mFaceNoseCenter" sid="mFaceNoseCenter" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.102 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mFaceNoseRight" name="mFaceNoseRight" sid="mFaceNoseRight" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.086 0 1 0 -0.015 0 0 1 -0.004 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mFaceCheekLowerLeft" name="mFaceCheekLowerLeft" sid="mFaceCheekLowerLeft" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.05 0 1 0 0.034 0 0 1 -0.031 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mFaceCheekUpperLeft" name="mFaceCheekUpperLeft" sid="mFaceCheekUpperLeft" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.07 0 1 0 0.034 0 0 1 -0.005 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mFaceCheekLowerRight" name="mFaceCheekLowerRight" sid="mFaceCheekLowerRight" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.05 0 1 0 -0.034 0 0 1 -0.031 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mFaceCheekUpperRight" name="mFaceCheekUpperRight" sid="mFaceCheekUpperRight" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.07 0 1 0 -0.034 0 0 1 -0.005 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mFaceJaw" name="mFaceJaw" sid="mFaceJaw" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.001 0 1 0 0 0 0 1 -0.015 0 0 0 1</matrix>
|
||||
<node id="mFaceChin" name="mFaceChin" sid="mFaceChin" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.074 0 1 0 0 0 0 1 -0.054 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mFaceTeethLower" name="mFaceTeethLower" sid="mFaceTeethLower" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.021 0 1 0 0 0 0 1 -0.039 0 0 0 1</matrix>
|
||||
<node id="mFaceLipLowerLeft" name="mFaceLipLowerLeft" sid="mFaceLipLowerLeft" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.045 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mFaceLipLowerRight" name="mFaceLipLowerRight" sid="mFaceLipLowerRight" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.045 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mFaceLipLowerCenter" name="mFaceLipLowerCenter" sid="mFaceLipLowerCenter" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.045 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mFaceTongueBase" name="mFaceTongueBase" sid="mFaceTongueBase" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.039 0 1 0 0 0 0 1 0.005 0 0 0 1</matrix>
|
||||
<node id="mFaceTongueTip" name="mFaceTongueTip" sid="mFaceTongueTip" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.022 0 1 0 0 0 0 1 0.007 0 0 0 1</matrix>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
<node id="mFaceJawShaper" name="mFaceJawShaper" sid="mFaceJawShaper" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mFaceForeheadCenter" name="mFaceForeheadCenter" sid="mFaceForeheadCenter" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.069 0 1 0 0 0 0 1 0.065 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mFaceNoseBase" name="mFaceNoseBase" sid="mFaceNoseBase" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.094 0 1 0 0 0 0 1 -0.016 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mFaceTeethUpper" name="mFaceTeethUpper" sid="mFaceTeethUpper" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.02 0 1 0 0 0 0 1 -0.03 0 0 0 1</matrix>
|
||||
<node id="mFaceLipUpperLeft" name="mFaceLipUpperLeft" sid="mFaceLipUpperLeft" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.045 0 1 0 0 0 0 1 -0.003 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mFaceLipUpperRight" name="mFaceLipUpperRight" sid="mFaceLipUpperRight" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.045 0 1 0 0 0 0 1 -0.003 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mFaceLipCornerLeft" name="mFaceLipCornerLeft" sid="mFaceLipCornerLeft" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.028 0 1 0 -0.019 0 0 1 -0.01 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mFaceLipCornerRight" name="mFaceLipCornerRight" sid="mFaceLipCornerRight" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.028 0 1 0 0.019 0 0 1 -0.01 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mFaceLipUpperCenter" name="mFaceLipUpperCenter" sid="mFaceLipUpperCenter" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.045 0 1 0 0 0 0 1 -0.003 0 0 0 1</matrix>
|
||||
</node>
|
||||
</node>
|
||||
<node id="mFaceEyecornerInnerLeft" name="mFaceEyecornerInnerLeft" sid="mFaceEyecornerInnerLeft" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.075 0 1 0 0.017 0 0 1 0.032 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mFaceEyecornerInnerRight" name="mFaceEyecornerInnerRight" sid="mFaceEyecornerInnerRight" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.075 0 1 0 -0.017 0 0 1 0.032 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mFaceNoseBridge" name="mFaceNoseBridge" sid="mFaceNoseBridge" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.091 0 1 0 0 0 0 1 0.02 0 0 0 1</matrix>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
<node id="mCollarLeft" name="mCollarLeft" sid="mCollarLeft" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.021 0 1 0 0.085 0 0 1 0.165 0 0 0 1</matrix>
|
||||
<node id="L_CLAVICLE" name="L_CLAVICLE" sid="L_CLAVICLE" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.02 0 1 0 0 0 0 1 0.02 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mShoulderLeft" name="mShoulderLeft" sid="mShoulderLeft" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0 0 1 0 0.079 0 0 1 0 0 0 0 1</matrix>
|
||||
<node id="L_UPPER_ARM" name="L_UPPER_ARM" sid="L_UPPER_ARM" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0 0 1 0 0.12 0 0 1 0.01 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mElbowLeft" name="mElbowLeft" sid="mElbowLeft" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0 0 1 0 0.248 0 0 1 0 0 0 0 1</matrix>
|
||||
<node id="L_LOWER_ARM" name="L_LOWER_ARM" sid="L_LOWER_ARM" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0 0 1 0 0.1 0 0 1 0 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mWristLeft" name="mWristLeft" sid="mWristLeft" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0 0 1 0 0.205 0 0 1 0 0 0 0 1</matrix>
|
||||
<node id="L_HAND" name="L_HAND" sid="L_HAND" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.01 0 1 0 0.05 0 0 1 0 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mHandMiddle1Left" name="mHandMiddle1Left" sid="mHandMiddle1Left" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.013 0 1 0 0.101 0 0 1 0.015 0 0 0 1</matrix>
|
||||
<node id="mHandMiddle2Left" name="mHandMiddle2Left" sid="mHandMiddle2Left" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.001 0 1 0 0.04 0 0 1 -0.006 0 0 0 1</matrix>
|
||||
<node id="mHandMiddle3Left" name="mHandMiddle3Left" sid="mHandMiddle3Left" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.001 0 1 0 0.049 0 0 1 -0.008 0 0 0 1</matrix>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
<node id="mHandIndex1Left" name="mHandIndex1Left" sid="mHandIndex1Left" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.038 0 1 0 0.097 0 0 1 0.015 0 0 0 1</matrix>
|
||||
<node id="mHandIndex2Left" name="mHandIndex2Left" sid="mHandIndex2Left" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.017 0 1 0 0.036 0 0 1 -0.006 0 0 0 1</matrix>
|
||||
<node id="mHandIndex3Left" name="mHandIndex3Left" sid="mHandIndex3Left" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.014 0 1 0 0.032 0 0 1 -0.006 0 0 0 1</matrix>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
<node id="mHandRing1Left" name="mHandRing1Left" sid="mHandRing1Left" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.01 0 1 0 0.099 0 0 1 0.009 0 0 0 1</matrix>
|
||||
<node id="mHandRing2Left" name="mHandRing2Left" sid="mHandRing2Left" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.013 0 1 0 0.038 0 0 1 -0.008 0 0 0 1</matrix>
|
||||
<node id="mHandRing3Left" name="mHandRing3Left" sid="mHandRing3Left" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.013 0 1 0 0.04 0 0 1 -0.009 0 0 0 1</matrix>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
<node id="mHandPinky1Left" name="mHandPinky1Left" sid="mHandPinky1Left" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.031 0 1 0 0.095 0 0 1 0.003 0 0 0 1</matrix>
|
||||
<node id="mHandPinky2Left" name="mHandPinky2Left" sid="mHandPinky2Left" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.024 0 1 0 0.025 0 0 1 -0.006 0 0 0 1</matrix>
|
||||
<node id="mHandPinky3Left" name="mHandPinky3Left" sid="mHandPinky3Left" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.015 0 1 0 0.018 0 0 1 -0.004 0 0 0 1</matrix>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
<node id="mHandThumb1Left" name="mHandThumb1Left" sid="mHandThumb1Left" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.031 0 1 0 0.026 0 0 1 0.004 0 0 0 1</matrix>
|
||||
<node id="mHandThumb2Left" name="mHandThumb2Left" sid="mHandThumb2Left" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.028 0 1 0 0.032 0 0 1 -0.001 0 0 0 1</matrix>
|
||||
<node id="mHandThumb3Left" name="mHandThumb3Left" sid="mHandThumb3Left" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.023 0 1 0 0.031 0 0 1 -0.001 0 0 0 1</matrix>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
<node id="mCollarRight" name="mCollarRight" sid="mCollarRight" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.021 0 1 0 -0.085 0 0 1 0.165 0 0 0 1</matrix>
|
||||
<node id="R_CLAVICLE" name="R_CLAVICLE" sid="R_CLAVICLE" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.02 0 1 0 0 0 0 1 0.02 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mShoulderRight" name="mShoulderRight" sid="mShoulderRight" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0 0 1 0 -0.079 0 0 1 0 0 0 0 1</matrix>
|
||||
<node id="R_UPPER_ARM" name="R_UPPER_ARM" sid="R_UPPER_ARM" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0 0 1 0 -0.12 0 0 1 0.01 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mElbowRight" name="mElbowRight" sid="mElbowRight" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0 0 1 0 -0.248 0 0 1 0 0 0 0 1</matrix>
|
||||
<node id="R_LOWER_ARM" name="R_LOWER_ARM" sid="R_LOWER_ARM" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0 0 1 0 -0.1 0 0 1 0 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mWristRight" name="mWristRight" sid="mWristRight" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0 0 1 0 -0.205 0 0 1 0 0 0 0 1</matrix>
|
||||
<node id="R_HAND" name="R_HAND" sid="R_HAND" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.01 0 1 0 -0.05 0 0 1 0 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mHandMiddle1Right" name="mHandMiddle1Right" sid="mHandMiddle1Right" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.013 0 1 0 -0.101 0 0 1 0.015 0 0 0 1</matrix>
|
||||
<node id="mHandMiddle2Right" name="mHandMiddle2Right" sid="mHandMiddle2Right" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.001 0 1 0 -0.04 0 0 1 -0.006 0 0 0 1</matrix>
|
||||
<node id="mHandMiddle3Right" name="mHandMiddle3Right" sid="mHandMiddle3Right" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.001 0 1 0 -0.049 0 0 1 -0.008 0 0 0 1</matrix>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
<node id="mHandIndex1Right" name="mHandIndex1Right" sid="mHandIndex1Right" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.038 0 1 0 -0.097 0 0 1 0.015 0 0 0 1</matrix>
|
||||
<node id="mHandIndex2Right" name="mHandIndex2Right" sid="mHandIndex2Right" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.017 0 1 0 -0.036 0 0 1 -0.006 0 0 0 1</matrix>
|
||||
<node id="mHandIndex3Right" name="mHandIndex3Right" sid="mHandIndex3Right" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.014 0 1 0 -0.032 0 0 1 -0.006 0 0 0 1</matrix>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
<node id="mHandRing1Right" name="mHandRing1Right" sid="mHandRing1Right" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.01 0 1 0 -0.099 0 0 1 0.009 0 0 0 1</matrix>
|
||||
<node id="mHandRing2Right" name="mHandRing2Right" sid="mHandRing2Right" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.013 0 1 0 -0.038 0 0 1 -0.008 0 0 0 1</matrix>
|
||||
<node id="mHandRing3Right" name="mHandRing3Right" sid="mHandRing3Right" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.013 0 1 0 -0.04 0 0 1 -0.009 0 0 0 1</matrix>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
<node id="mHandPinky1Right" name="mHandPinky1Right" sid="mHandPinky1Right" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.031 0 1 0 -0.095 0 0 1 0.003 0 0 0 1</matrix>
|
||||
<node id="mHandPinky2Right" name="mHandPinky2Right" sid="mHandPinky2Right" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.024 0 1 0 -0.025 0 0 1 -0.006 0 0 0 1</matrix>
|
||||
<node id="mHandPinky3Right" name="mHandPinky3Right" sid="mHandPinky3Right" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.015 0 1 0 -0.018 0 0 1 -0.004 0 0 0 1</matrix>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
<node id="mHandThumb1Right" name="mHandThumb1Right" sid="mHandThumb1Right" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.031 0 1 0 -0.026 0 0 1 0.004 0 0 0 1</matrix>
|
||||
<node id="mHandThumb2Right" name="mHandThumb2Right" sid="mHandThumb2Right" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.028 0 1 0 -0.032 0 0 1 -0.001 0 0 0 1</matrix>
|
||||
<node id="mHandThumb3Right" name="mHandThumb3Right" sid="mHandThumb3Right" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.023 0 1 0 -0.031 0 0 1 -0.001 0 0 0 1</matrix>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
<node id="mWingsRoot" name="mWingsRoot" sid="mWingsRoot" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.014 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
|
||||
<node id="mWing1Left" name="mWing1Left" sid="mWing1Left" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.099 0 1 0 0.105 0 0 1 0.181 0 0 0 1</matrix>
|
||||
<node id="mWing2Left" name="mWing2Left" sid="mWing2Left" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.168 0 1 0 0.169 0 0 1 0.067 0 0 0 1</matrix>
|
||||
<node id="mWing3Left" name="mWing3Left" sid="mWing3Left" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.181 0 1 0 0.183 0 0 1 0 0 0 0 1</matrix>
|
||||
<node id="mWing4Left" name="mWing4Left" sid="mWing4Left" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.171 0 1 0 0.173 0 0 1 0 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mWing4FanLeft" name="mWing4FanLeft" sid="mWing4FanLeft" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.171 0 1 0 0.173 0 0 1 0 0 0 0 1</matrix>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
<node id="mWing1Right" name="mWing1Right" sid="mWing1Right" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.099 0 1 0 -0.105 0 0 1 0.181 0 0 0 1</matrix>
|
||||
<node id="mWing2Right" name="mWing2Right" sid="mWing2Right" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.168 0 1 0 -0.169 0 0 1 0.067 0 0 0 1</matrix>
|
||||
<node id="mWing3Right" name="mWing3Right" sid="mWing3Right" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.181 0 1 0 -0.183 0 0 1 0 0 0 0 1</matrix>
|
||||
<node id="mWing4Right" name="mWing4Right" sid="mWing4Right" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.171 0 1 0 -0.173 0 0 1 0 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mWing4FanRight" name="mWing4FanRight" sid="mWing4FanRight" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.171 0 1 0 -0.173 0 0 1 0 0 0 0 1</matrix>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
<node id="mHipRight" name="mHipRight" sid="mHipRight" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.034 0 1 0 -0.129 0 0 1 -0.041 0 0 0 1</matrix>
|
||||
<node id="R_UPPER_LEG" name="R_UPPER_LEG" sid="R_UPPER_LEG" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.02 0 1 0 0.05 0 0 1 -0.22 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mKneeRight" name="mKneeRight" sid="mKneeRight" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.001 0 1 0 0.049 0 0 1 -0.491 0 0 0 1</matrix>
|
||||
<node id="R_LOWER_LEG" name="R_LOWER_LEG" sid="R_LOWER_LEG" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.02 0 1 0 0 0 0 1 -0.2 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mAnkleRight" name="mAnkleRight" sid="mAnkleRight" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.029 0 1 0 0 0 0 1 -0.468 0 0 0 1</matrix>
|
||||
<node id="R_FOOT" name="R_FOOT" sid="R_FOOT" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.077 0 1 0 0 0 0 1 -0.041 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mFootRight" name="mFootRight" sid="mFootRight" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.112 0 1 0 0 0 0 1 -0.061 0 0 0 1</matrix>
|
||||
<node id="mToeRight" name="mToeRight" sid="mToeRight" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.109 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
<node id="mHipLeft" name="mHipLeft" sid="mHipLeft" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.034 0 1 0 0.127 0 0 1 -0.041 0 0 0 1</matrix>
|
||||
<node id="L_UPPER_LEG" name="L_UPPER_LEG" sid="L_UPPER_LEG" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.02 0 1 0 -0.05 0 0 1 -0.22 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mKneeLeft" name="mKneeLeft" sid="mKneeLeft" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.001 0 1 0 -0.046 0 0 1 -0.491 0 0 0 1</matrix>
|
||||
<node id="L_LOWER_LEG" name="L_LOWER_LEG" sid="L_LOWER_LEG" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.02 0 1 0 0 0 0 1 -0.2 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mAnkleLeft" name="mAnkleLeft" sid="mAnkleLeft" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.029 0 1 0 0.001 0 0 1 -0.468 0 0 0 1</matrix>
|
||||
<node id="L_FOOT" name="L_FOOT" sid="L_FOOT" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.077 0 1 0 0 0 0 1 -0.041 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mFootLeft" name="mFootLeft" sid="mFootLeft" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.112 0 1 0 0 0 0 1 -0.061 0 0 0 1</matrix>
|
||||
<node id="mToeLeft" name="mToeLeft" sid="mToeLeft" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.109 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
<node id="mTail1" name="mTail1" sid="mTail1" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.116 0 1 0 0 0 0 1 0.047 0 0 0 1</matrix>
|
||||
<node id="mTail2" name="mTail2" sid="mTail2" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.197 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
|
||||
<node id="mTail3" name="mTail3" sid="mTail3" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.168 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
|
||||
<node id="mTail4" name="mTail4" sid="mTail4" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.142 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
|
||||
<node id="mTail5" name="mTail5" sid="mTail5" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.112 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
|
||||
<node id="mTail6" name="mTail6" sid="mTail6" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.094 0 1 0 0 0 0 1 0 0 0 0 1</matrix>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
<node id="mGroin" name="mGroin" sid="mGroin" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.064 0 1 0 0 0 0 1 -0.097 0 0 0 1</matrix>
|
||||
</node>
|
||||
<node id="mHindLimbsRoot" name="mHindLimbsRoot" sid="mHindLimbsRoot" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.2 0 1 0 0 0 0 1 0.084 0 0 0 1</matrix>
|
||||
<node id="mHindLimb1Left" name="mHindLimb1Left" sid="mHindLimb1Left" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.204 0 1 0 0.129 0 0 1 -0.125 0 0 0 1</matrix>
|
||||
<node id="mHindLimb2Left" name="mHindLimb2Left" sid="mHindLimb2Left" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.002 0 1 0 -0.046 0 0 1 -0.491 0 0 0 1</matrix>
|
||||
<node id="mHindLimb3Left" name="mHindLimb3Left" sid="mHindLimb3Left" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.03 0 1 0 -0.003 0 0 1 -0.468 0 0 0 1</matrix>
|
||||
<node id="mHindLimb4Left" name="mHindLimb4Left" sid="mHindLimb4Left" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.112 0 1 0 0 0 0 1 -0.061 0 0 0 1</matrix>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
<node id="mHindLimb1Right" name="mHindLimb1Right" sid="mHindLimb1Right" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.204 0 1 0 -0.129 0 0 1 -0.125 0 0 0 1</matrix>
|
||||
<node id="mHindLimb2Right" name="mHindLimb2Right" sid="mHindLimb2Right" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.002 0 1 0 0.046 0 0 1 -0.491 0 0 0 1</matrix>
|
||||
<node id="mHindLimb3Right" name="mHindLimb3Right" sid="mHindLimb3Right" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 -0.03 0 1 0 0.003 0 0 1 -0.468 0 0 0 1</matrix>
|
||||
<node id="mHindLimb4Right" name="mHindLimb4Right" sid="mHindLimb4Right" type="JOINT">
|
||||
<matrix sid="transform">1 0 0 0.112 0 1 0 0 0 0 1 -0.061 0 0 0 1</matrix>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
</node>
|
||||
@@ -18,6 +18,8 @@ You should have received a copy of the GNU Lesser General Public License
|
||||
along with this program; if not, write to the Free Software Foundation,
|
||||
Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
|
||||
"""
|
||||
from __future__ import annotations
|
||||
|
||||
import ast
|
||||
import enum
|
||||
import hashlib
|
||||
@@ -58,6 +60,9 @@ class TupleCoord(recordclass.datatuple, _IterableStub): # type: ignore
|
||||
def __abs__(self):
|
||||
return self.__class__(*(abs(x) for x in self))
|
||||
|
||||
def __neg__(self):
|
||||
return self.__class__(*(-x for x in self))
|
||||
|
||||
def __add__(self, other):
|
||||
return self.__class__(*(x + y for x, y in zip(self, other)))
|
||||
|
||||
@@ -244,6 +249,7 @@ class Quaternion(TupleCoord):
|
||||
|
||||
class UUID(uuid.UUID):
|
||||
_NULL_UUID_STR = '00000000-0000-0000-0000-000000000000'
|
||||
ZERO: UUID
|
||||
__slots__ = ()
|
||||
|
||||
def __init__(self, val: Union[uuid.UUID, str, None] = None, bytes=None, int=None):
|
||||
@@ -268,12 +274,16 @@ class UUID(uuid.UUID):
|
||||
return self.__class__(int=self.int ^ other.int)
|
||||
|
||||
|
||||
UUID.ZERO = UUID()
|
||||
|
||||
|
||||
class JankStringyBytes(bytes):
|
||||
"""
|
||||
Treat bytes as UTF8 if used in string context
|
||||
|
||||
Sinful, but necessary evil for now since templates don't specify what's
|
||||
binary and what's a string.
|
||||
binary and what's a string. There are also certain fields where the value
|
||||
may be either binary _or_ a string, depending on the context.
|
||||
"""
|
||||
__slots__ = ()
|
||||
|
||||
@@ -288,6 +298,11 @@ class JankStringyBytes(bytes):
|
||||
def __ne__(self, other):
|
||||
return not self.__eq__(other)
|
||||
|
||||
def __contains__(self, item):
|
||||
if isinstance(item, str):
|
||||
return item in str(self)
|
||||
return item in bytes(self)
|
||||
|
||||
|
||||
class RawBytes(bytes):
|
||||
__slots__ = ()
|
||||
|
||||
@@ -4,6 +4,7 @@ import codecs
|
||||
import functools
|
||||
import os
|
||||
|
||||
import lazy_object_proxy
|
||||
import pkg_resources
|
||||
import re
|
||||
import weakref
|
||||
@@ -19,7 +20,7 @@ def _with_patched_multidict(f):
|
||||
# There's no way to tell pprint "hey, this is a dict,
|
||||
# this is how you access its items." A lot of the formatting logic
|
||||
# is in the module-level `_safe_repr()` which we don't want to mess with.
|
||||
# Instead, pretend our MultiDict has dict's __repr__ and while we're inside
|
||||
# Instead, pretend our MultiDict has dict's __repr__ while we're inside
|
||||
# calls to pprint. Hooray.
|
||||
orig_repr = MultiDict.__repr__
|
||||
if orig_repr is dict.__repr__:
|
||||
@@ -67,6 +68,9 @@ class HippoPrettyPrinter(PrettyPrinter):
|
||||
return f"({reprs})"
|
||||
|
||||
def pformat(self, obj: object, *args, **kwargs) -> str:
|
||||
# Unwrap lazy object proxies before pprinting them
|
||||
if isinstance(obj, lazy_object_proxy.Proxy):
|
||||
obj = obj.__wrapped__
|
||||
if isinstance(obj, (bytes, str)):
|
||||
return self._str_format(obj)
|
||||
return self._base_pformat(obj, *args, **kwargs)
|
||||
|
||||
@@ -7,9 +7,9 @@ from __future__ import annotations
|
||||
|
||||
import dataclasses
|
||||
import datetime as dt
|
||||
import itertools
|
||||
import logging
|
||||
import struct
|
||||
import typing
|
||||
import weakref
|
||||
from io import StringIO
|
||||
from typing import *
|
||||
@@ -132,10 +132,14 @@ class InventoryBase(SchemaBase):
|
||||
writer.write("\t}\n")
|
||||
|
||||
|
||||
class InventoryDifferences(typing.NamedTuple):
|
||||
changed: List[InventoryNodeBase]
|
||||
removed: List[InventoryNodeBase]
|
||||
|
||||
|
||||
class InventoryModel(InventoryBase):
|
||||
def __init__(self):
|
||||
self.containers: Dict[UUID, InventoryContainerBase] = {}
|
||||
self.items: Dict[UUID, InventoryItem] = {}
|
||||
self.nodes: Dict[UUID, InventoryNodeBase] = {}
|
||||
self.root: Optional[InventoryContainerBase] = None
|
||||
|
||||
@classmethod
|
||||
@@ -145,18 +149,17 @@ class InventoryModel(InventoryBase):
|
||||
if key == "inv_object":
|
||||
obj = InventoryObject.from_reader(reader)
|
||||
if obj is not None:
|
||||
model.add_container(obj)
|
||||
model.add(obj)
|
||||
elif key == "inv_category":
|
||||
cat = InventoryCategory.from_reader(reader)
|
||||
if cat is not None:
|
||||
model.add_container(cat)
|
||||
model.add(cat)
|
||||
elif key == "inv_item":
|
||||
item = InventoryItem.from_reader(reader)
|
||||
if item is not None:
|
||||
model.add_item(item)
|
||||
model.add(item)
|
||||
else:
|
||||
LOG.warning("Unknown key {0}".format(key))
|
||||
model.reparent_nodes()
|
||||
return model
|
||||
|
||||
@classmethod
|
||||
@@ -165,54 +168,94 @@ class InventoryModel(InventoryBase):
|
||||
for obj_dict in llsd_val:
|
||||
if InventoryCategory.ID_ATTR in obj_dict:
|
||||
if (obj := InventoryCategory.from_llsd(obj_dict)) is not None:
|
||||
model.add_container(obj)
|
||||
model.add(obj)
|
||||
elif InventoryObject.ID_ATTR in obj_dict:
|
||||
if (obj := InventoryObject.from_llsd(obj_dict)) is not None:
|
||||
model.add_container(obj)
|
||||
model.add(obj)
|
||||
elif InventoryItem.ID_ATTR in obj_dict:
|
||||
if (obj := InventoryItem.from_llsd(obj_dict)) is not None:
|
||||
model.add_item(obj)
|
||||
model.add(obj)
|
||||
else:
|
||||
LOG.warning(f"Unknown object type {obj_dict!r}")
|
||||
model.reparent_nodes()
|
||||
return model
|
||||
|
||||
@property
|
||||
def ordered_nodes(self) -> Iterable[InventoryNodeBase]:
|
||||
yield from self.all_containers
|
||||
yield from self.all_items
|
||||
|
||||
@property
|
||||
def all_containers(self) -> Iterable[InventoryContainerBase]:
|
||||
for node in self.nodes.values():
|
||||
if isinstance(node, InventoryContainerBase):
|
||||
yield node
|
||||
|
||||
@property
|
||||
def all_items(self) -> Iterable[InventoryItem]:
|
||||
for node in self.nodes.values():
|
||||
if not isinstance(node, InventoryContainerBase):
|
||||
yield node
|
||||
|
||||
def __eq__(self, other):
|
||||
if not isinstance(other, InventoryModel):
|
||||
return False
|
||||
return set(self.nodes.values()) == set(other.nodes.values())
|
||||
|
||||
def to_writer(self, writer: StringIO):
|
||||
for container in self.containers.values():
|
||||
container.to_writer(writer)
|
||||
for item in self.items.values():
|
||||
item.to_writer(writer)
|
||||
for node in self.ordered_nodes:
|
||||
node.to_writer(writer)
|
||||
|
||||
def to_llsd(self):
|
||||
vals = []
|
||||
for container in self.containers.values():
|
||||
vals.append(container.to_llsd())
|
||||
for item in self.items.values():
|
||||
vals.append(item.to_llsd())
|
||||
return vals
|
||||
return list(node.to_llsd() for node in self.ordered_nodes)
|
||||
|
||||
def add_container(self, container: InventoryContainerBase):
|
||||
self.containers[container.node_id] = container
|
||||
container.model = weakref.proxy(self)
|
||||
def add(self, node: InventoryNodeBase):
|
||||
if node.node_id in self.nodes:
|
||||
raise KeyError(f"{node.node_id} already exists in the inventory model")
|
||||
|
||||
def add_item(self, item: InventoryItem):
|
||||
self.items[item.item_id] = item
|
||||
item.model = weakref.proxy(self)
|
||||
self.nodes[node.node_id] = node
|
||||
if isinstance(node, InventoryContainerBase):
|
||||
if node.parent_id == UUID.ZERO:
|
||||
self.root = node
|
||||
node.model = weakref.proxy(self)
|
||||
|
||||
def reparent_nodes(self):
|
||||
self.root = None
|
||||
for container in self.containers.values():
|
||||
container.children.clear()
|
||||
if container.parent_id == UUID():
|
||||
self.root = container
|
||||
for obj in itertools.chain(self.items.values(), self.containers.values()):
|
||||
if not obj.parent_id or obj.parent_id == UUID():
|
||||
continue
|
||||
parent_container = self.containers.get(obj.parent_id)
|
||||
if not parent_container:
|
||||
LOG.warning("{0} had an invalid parent {1}".format(obj, obj.parent_id))
|
||||
continue
|
||||
parent_container.children.append(obj)
|
||||
def unlink(self, node: InventoryNodeBase) -> Sequence[InventoryNodeBase]:
|
||||
"""Unlink a node and its descendants from the tree, returning the removed nodes"""
|
||||
assert node.model == self
|
||||
if node == self.root:
|
||||
self.root = None
|
||||
unlinked = [node]
|
||||
if isinstance(node, InventoryContainerBase):
|
||||
for child in node.children:
|
||||
unlinked.extend(self.unlink(child))
|
||||
self.nodes.pop(node.node_id, None)
|
||||
node.model = None
|
||||
return unlinked
|
||||
|
||||
def get_differences(self, other: InventoryModel) -> InventoryDifferences:
|
||||
# Includes modified things with the same ID
|
||||
changed_in_other = []
|
||||
removed_in_other = []
|
||||
|
||||
other_keys = set(other.nodes.keys())
|
||||
our_keys = set(self.nodes.keys())
|
||||
|
||||
# Removed
|
||||
for key in our_keys - other_keys:
|
||||
removed_in_other.append(self.nodes[key])
|
||||
|
||||
# Updated
|
||||
for key in other_keys.intersection(our_keys):
|
||||
other_node = other.nodes[key]
|
||||
if other_node != self.nodes[key]:
|
||||
changed_in_other.append(other_node)
|
||||
|
||||
# Added
|
||||
for key in other_keys - our_keys:
|
||||
changed_in_other.append(other.nodes[key])
|
||||
return InventoryDifferences(
|
||||
changed=changed_in_other,
|
||||
removed=removed_in_other,
|
||||
)
|
||||
|
||||
|
||||
@dataclasses.dataclass
|
||||
@@ -242,16 +285,27 @@ class InventorySaleInfo(InventoryBase):
|
||||
class InventoryNodeBase(InventoryBase):
|
||||
ID_ATTR: ClassVar[str]
|
||||
|
||||
name: str
|
||||
|
||||
parent_id: Optional[UUID] = schema_field(SchemaUUID)
|
||||
model: Optional[InventoryModel] = dataclasses.field(default=None, init=False)
|
||||
model: Optional[InventoryModel] = dataclasses.field(
|
||||
default=None, init=False, hash=False, compare=False, repr=False
|
||||
)
|
||||
|
||||
@property
|
||||
def node_id(self) -> UUID:
|
||||
return getattr(self, self.ID_ATTR)
|
||||
|
||||
@node_id.setter
|
||||
def node_id(self, val: UUID):
|
||||
setattr(self, self.ID_ATTR, val)
|
||||
|
||||
@property
|
||||
def parent(self):
|
||||
return self.model.containers.get(self.parent_id)
|
||||
def parent(self) -> Optional[InventoryContainerBase]:
|
||||
return self.model.nodes.get(self.parent_id)
|
||||
|
||||
def unlink(self) -> Sequence[InventoryNodeBase]:
|
||||
return self.model.unlink(self)
|
||||
|
||||
@classmethod
|
||||
def _obj_from_dict(cls, obj_dict):
|
||||
@@ -262,12 +316,58 @@ class InventoryNodeBase(InventoryBase):
|
||||
return None
|
||||
return super()._obj_from_dict(obj_dict)
|
||||
|
||||
def __hash__(self):
|
||||
return hash(self.node_id)
|
||||
|
||||
def __iter__(self) -> Iterator[InventoryNodeBase]:
|
||||
return iter(())
|
||||
|
||||
def __contains__(self, item) -> bool:
|
||||
return item in tuple(self)
|
||||
|
||||
|
||||
@dataclasses.dataclass
|
||||
class InventoryContainerBase(InventoryNodeBase):
|
||||
type: str = schema_field(SchemaStr)
|
||||
name: str = schema_field(SchemaMultilineStr)
|
||||
children: List[InventoryNodeBase] = dataclasses.field(default_factory=list, init=False)
|
||||
|
||||
@property
|
||||
def children(self) -> Sequence[InventoryNodeBase]:
|
||||
return tuple(
|
||||
x for x in self.model.nodes.values()
|
||||
if x.parent_id == self.node_id
|
||||
)
|
||||
|
||||
def __getitem__(self, item: Union[int, str]) -> InventoryNodeBase:
|
||||
if isinstance(item, int):
|
||||
return self.children[item]
|
||||
|
||||
for child in self.children:
|
||||
if child.name == item:
|
||||
return child
|
||||
raise KeyError(f"{item!r} not found in children")
|
||||
|
||||
def __iter__(self) -> Iterator[InventoryNodeBase]:
|
||||
return iter(self.children)
|
||||
|
||||
def get_or_create_subcategory(self, name: str) -> InventoryCategory:
|
||||
for child in self:
|
||||
if child.name == name and isinstance(child, InventoryCategory):
|
||||
return child
|
||||
child = InventoryCategory(
|
||||
name=name,
|
||||
cat_id=UUID.random(),
|
||||
parent_id=self.node_id,
|
||||
type="category",
|
||||
pref_type="-1",
|
||||
owner_id=getattr(self, 'owner_id', UUID.ZERO),
|
||||
version=1,
|
||||
)
|
||||
self.model.add(child)
|
||||
return child
|
||||
|
||||
# So autogenerated __hash__ doesn't kill our inherited one
|
||||
__hash__ = InventoryNodeBase.__hash__
|
||||
|
||||
|
||||
@dataclasses.dataclass
|
||||
@@ -277,17 +377,21 @@ class InventoryObject(InventoryContainerBase):
|
||||
|
||||
obj_id: UUID = schema_field(SchemaUUID)
|
||||
|
||||
__hash__ = InventoryNodeBase.__hash__
|
||||
|
||||
|
||||
@dataclasses.dataclass
|
||||
class InventoryCategory(InventoryContainerBase):
|
||||
ID_ATTR: ClassVar[str] = "cat_id"
|
||||
SCHEMA_NAME: ClassVar[str] = "inv_object"
|
||||
SCHEMA_NAME: ClassVar[str] = "inv_category"
|
||||
|
||||
cat_id: UUID = schema_field(SchemaUUID)
|
||||
pref_type: str = schema_field(SchemaStr, llsd_name="preferred_type")
|
||||
owner_id: UUID = schema_field(SchemaUUID)
|
||||
version: int = schema_field(SchemaInt)
|
||||
|
||||
__hash__ = InventoryNodeBase.__hash__
|
||||
|
||||
|
||||
@dataclasses.dataclass
|
||||
class InventoryItem(InventoryNodeBase):
|
||||
@@ -306,6 +410,8 @@ class InventoryItem(InventoryNodeBase):
|
||||
asset_id: Optional[UUID] = schema_field(SchemaUUID, default=None)
|
||||
shadow_id: Optional[UUID] = schema_field(SchemaUUID, default=None)
|
||||
|
||||
__hash__ = InventoryNodeBase.__hash__
|
||||
|
||||
@property
|
||||
def true_asset_id(self) -> UUID:
|
||||
if self.asset_id is not None:
|
||||
|
||||
@@ -1,6 +1,9 @@
|
||||
from __future__ import annotations
|
||||
|
||||
import abc
|
||||
import asyncio
|
||||
import copy
|
||||
import dataclasses
|
||||
import datetime as dt
|
||||
import logging
|
||||
from typing import *
|
||||
@@ -13,6 +16,14 @@ from .msgtypes import PacketFlags
|
||||
from .udpserializer import UDPMessageSerializer
|
||||
|
||||
|
||||
@dataclasses.dataclass
|
||||
class ReliableResendInfo:
|
||||
last_resent: dt.datetime
|
||||
message: Message
|
||||
completed: asyncio.Future = dataclasses.field(default_factory=asyncio.Future)
|
||||
tries_left: int = 10
|
||||
|
||||
|
||||
class Circuit:
|
||||
def __init__(self, near_host: Optional[ADDR_TUPLE], far_host: ADDR_TUPLE, transport):
|
||||
self.near_host: Optional[ADDR_TUPLE] = near_host
|
||||
@@ -22,6 +33,8 @@ class Circuit:
|
||||
self.serializer = UDPMessageSerializer()
|
||||
self.last_packet_at = dt.datetime.now()
|
||||
self.packet_id_base = 0
|
||||
self.unacked_reliable: Dict[Tuple[Direction, int], ReliableResendInfo] = {}
|
||||
self.resend_every: float = 3.0
|
||||
|
||||
def _send_prepared_message(self, message: Message, transport=None):
|
||||
try:
|
||||
@@ -46,24 +59,69 @@ class Circuit:
|
||||
raise RuntimeError(f"Trying to re-send finalized {message!r}")
|
||||
message.packet_id = self.packet_id_base
|
||||
self.packet_id_base += 1
|
||||
if not message.acks:
|
||||
message.send_flags &= PacketFlags.ACK
|
||||
if message.acks:
|
||||
message.send_flags |= PacketFlags.ACK
|
||||
else:
|
||||
message.send_flags &= ~PacketFlags.ACK
|
||||
# If it was queued, it's not anymore
|
||||
message.queued = False
|
||||
message.finalized = True
|
||||
|
||||
def send_message(self, message: Message, transport=None):
|
||||
def send(self, message: Message, transport=None) -> UDPPacket:
|
||||
if self.prepare_message(message):
|
||||
# If the message originates from us then we're responsible for resends.
|
||||
if message.reliable and message.synthetic:
|
||||
self.unacked_reliable[(message.direction, message.packet_id)] = ReliableResendInfo(
|
||||
last_resent=dt.datetime.now(),
|
||||
message=message,
|
||||
)
|
||||
return self._send_prepared_message(message, transport)
|
||||
|
||||
# Temporary alias
|
||||
send_message = send
|
||||
|
||||
def send_reliable(self, message: Message, transport=None) -> asyncio.Future:
|
||||
"""send() wrapper that always sends reliably and allows `await`ing ACK receipt"""
|
||||
if not message.synthetic:
|
||||
raise ValueError("Not able to send non-synthetic message reliably!")
|
||||
message.send_flags |= PacketFlags.RELIABLE
|
||||
self.send(message, transport)
|
||||
return self.unacked_reliable[(message.direction, message.packet_id)].completed
|
||||
|
||||
def collect_acks(self, message: Message):
|
||||
effective_acks = list(message.acks)
|
||||
if message.name == "PacketAck":
|
||||
effective_acks.extend(x["ID"] for x in message["Packets"])
|
||||
for ack in effective_acks:
|
||||
resend_info = self.unacked_reliable.pop((~message.direction, ack), None)
|
||||
if resend_info:
|
||||
resend_info.completed.set_result(None)
|
||||
|
||||
def resend_unacked(self):
|
||||
for resend_info in list(self.unacked_reliable.values()):
|
||||
# Not time to attempt a resend yet
|
||||
if dt.datetime.now() - resend_info.last_resent < dt.timedelta(seconds=self.resend_every):
|
||||
continue
|
||||
|
||||
msg = copy.copy(resend_info.message)
|
||||
resend_info.tries_left -= 1
|
||||
# We were on our last try and we never received an ack
|
||||
if not resend_info.tries_left:
|
||||
logging.warning(f"Giving up on unacked {msg.packet_id}")
|
||||
del self.unacked_reliable[(msg.direction, msg.packet_id)]
|
||||
resend_info.completed.set_exception(TimeoutError("Exceeded resend limit"))
|
||||
continue
|
||||
resend_info.last_resent = dt.datetime.now()
|
||||
msg.send_flags |= PacketFlags.RESENT
|
||||
self._send_prepared_message(msg)
|
||||
|
||||
def send_acks(self, to_ack: Sequence[int], direction=Direction.OUT, packet_id=None):
|
||||
logging.debug("%r acking %r" % (direction, to_ack))
|
||||
# TODO: maybe tack this onto `.acks` for next message?
|
||||
message = Message('PacketAck', *[Block('Packets', ID=x) for x in to_ack])
|
||||
message.packet_id = packet_id
|
||||
message.direction = direction
|
||||
message.injected = True
|
||||
self.send_message(message)
|
||||
self.send(message)
|
||||
|
||||
def __repr__(self):
|
||||
return "<%s %r : %r>" % (self.__class__.__name__, self.near_host, self.host)
|
||||
|
||||
@@ -3002,6 +3002,16 @@ version 2.0
|
||||
RegionInfo3 Variable
|
||||
{ RegionFlagsExtended U64 }
|
||||
}
|
||||
{
|
||||
RegionInfo5 Variable
|
||||
{ ChatWhisperRange F32 }
|
||||
{ ChatNormalRange F32 }
|
||||
{ ChatShoutRange F32 }
|
||||
{ ChatWhisperOffset F32 }
|
||||
{ ChatNormalOffset F32 }
|
||||
{ ChatShoutOffset F32 }
|
||||
{ ChatFlags U32 }
|
||||
}
|
||||
}
|
||||
|
||||
// GodUpdateRegionInfo
|
||||
@@ -5792,6 +5802,28 @@ version 2.0
|
||||
}
|
||||
}
|
||||
|
||||
// LargeGenericMessage
|
||||
// Similar to the above messages, but can handle larger payloads and serialized
|
||||
// LLSD. Uses HTTP transport
|
||||
{
|
||||
LargeGenericMessage Low 430 NotTrusted Unencoded UDPDeprecated
|
||||
{
|
||||
AgentData Single
|
||||
{ AgentID LLUUID }
|
||||
{ SessionID LLUUID }
|
||||
{ TransactionID LLUUID }
|
||||
}
|
||||
{
|
||||
MethodData Single
|
||||
{ Method Variable 1 }
|
||||
{ Invoice LLUUID }
|
||||
}
|
||||
{
|
||||
ParamList Variable
|
||||
{ Parameter Variable 2 }
|
||||
}
|
||||
}
|
||||
|
||||
// ***************************************************************************
|
||||
// Requests for possessions, acquisition, money, etc
|
||||
// ***************************************************************************
|
||||
|
||||
@@ -188,7 +188,7 @@ class MsgBlockList(List["Block"]):
|
||||
class Message:
|
||||
__slots__ = ("name", "send_flags", "packet_id", "acks", "body_boundaries", "queued",
|
||||
"offset", "raw_extra", "raw_body", "deserializer", "_blocks", "finalized",
|
||||
"direction", "meta", "injected", "dropped", "sender")
|
||||
"direction", "meta", "synthetic", "dropped", "sender")
|
||||
|
||||
def __init__(self, name, *args, packet_id=None, flags=0, acks=None, direction=None):
|
||||
# TODO: Do this on a timer or something.
|
||||
@@ -213,7 +213,7 @@ class Message:
|
||||
self.queued: bool = False
|
||||
self._blocks: BLOCK_DICT = {}
|
||||
self.meta = {}
|
||||
self.injected = False
|
||||
self.synthetic = packet_id is None
|
||||
self.dropped = False
|
||||
self.sender: Optional[ADDR_TUPLE] = None
|
||||
|
||||
@@ -312,7 +312,7 @@ class Message:
|
||||
"packet_id": self.packet_id,
|
||||
"meta": self.meta.copy(),
|
||||
"dropped": self.dropped,
|
||||
"injected": self.injected,
|
||||
"synthetic": self.synthetic,
|
||||
"direction": self.direction.name,
|
||||
"send_flags": int(self.send_flags),
|
||||
"extra": self.extra,
|
||||
@@ -334,7 +334,7 @@ class Message:
|
||||
msg.packet_id = dict_val['packet_id']
|
||||
msg.meta = dict_val['meta']
|
||||
msg.dropped = dict_val['dropped']
|
||||
msg.injected = dict_val['injected']
|
||||
msg.synthetic = dict_val['synthetic']
|
||||
msg.direction = Direction[dict_val['direction']]
|
||||
msg.send_flags = dict_val['send_flags']
|
||||
msg.extra = dict_val['extra']
|
||||
@@ -386,6 +386,7 @@ class Message:
|
||||
message_copy.packet_id = None
|
||||
message_copy.dropped = False
|
||||
message_copy.finalized = False
|
||||
message_copy.queued = False
|
||||
return message_copy
|
||||
|
||||
def to_summary(self):
|
||||
|
||||
@@ -62,9 +62,16 @@ class HumanMessageSerializer:
|
||||
continue
|
||||
|
||||
if first_line:
|
||||
direction, message_name = line.split(" ", 1)
|
||||
first_split = [x for x in line.split(" ") if x]
|
||||
direction, message_name = first_split[:2]
|
||||
options = [x.strip("[]") for x in first_split[2:]]
|
||||
msg = Message(message_name)
|
||||
msg.direction = Direction[direction.upper()]
|
||||
for option in options:
|
||||
if option in PacketFlags.__members__:
|
||||
msg.send_flags |= PacketFlags[option]
|
||||
elif re.match(r"^\d+$", option):
|
||||
msg.send_flags |= int(option)
|
||||
first_line = False
|
||||
continue
|
||||
|
||||
@@ -137,9 +144,17 @@ class HumanMessageSerializer:
|
||||
if msg.direction is not None:
|
||||
string += f'{msg.direction.name} '
|
||||
string += msg.name
|
||||
flags = msg.send_flags
|
||||
for poss_flag in iter(PacketFlags):
|
||||
if flags & poss_flag:
|
||||
flags &= ~poss_flag
|
||||
string += f" [{poss_flag.name}]"
|
||||
# Make sure flags with unknown meanings don't get lost
|
||||
if flags:
|
||||
string += f" [{int(flags)}]"
|
||||
if msg.packet_id is not None:
|
||||
string += f'\n# {msg.packet_id}: {PacketFlags(msg.send_flags)!r}'
|
||||
string += f'{", DROPPED" if msg.dropped else ""}{", INJECTED" if msg.injected else ""}'
|
||||
string += f'\n# ID: {msg.packet_id}'
|
||||
string += f'{", DROPPED" if msg.dropped else ""}{", SYNTHETIC" if msg.synthetic else ""}'
|
||||
if msg.extra:
|
||||
string += f'\n# EXTRA: {msg.extra!r}'
|
||||
string += '\n\n'
|
||||
|
||||
@@ -107,12 +107,14 @@ class MessageHandler(Generic[_T, _K]):
|
||||
take = self.take_by_default
|
||||
notifiers = [self.register(name) for name in message_names]
|
||||
|
||||
fut = asyncio.get_event_loop().create_future()
|
||||
loop = asyncio.get_event_loop_policy().get_event_loop()
|
||||
fut = loop.create_future()
|
||||
timeout_task = None
|
||||
|
||||
async def _canceller():
|
||||
await asyncio.sleep(timeout)
|
||||
fut.set_exception(asyncio.exceptions.TimeoutError("Timed out waiting for packet"))
|
||||
if not fut.done():
|
||||
fut.set_exception(asyncio.exceptions.TimeoutError("Timed out waiting for packet"))
|
||||
for n in notifiers:
|
||||
n.unsubscribe(_handler)
|
||||
|
||||
@@ -125,7 +127,8 @@ class MessageHandler(Generic[_T, _K]):
|
||||
# Whatever was awaiting this future now owns this message
|
||||
if take:
|
||||
message = message.take()
|
||||
fut.set_result(message)
|
||||
if not fut.done():
|
||||
fut.set_result(message)
|
||||
# Make sure to unregister this handler for all message types
|
||||
for n in notifiers:
|
||||
n.unsubscribe(_handler)
|
||||
|
||||
@@ -68,7 +68,7 @@ class UDPMessageDeserializer:
        self.settings = settings or Settings()
        self.template_dict = self.DEFAULT_TEMPLATE

    def deserialize(self, msg_buff: bytes):
    def deserialize(self, msg_buff: bytes) -> Message:
        msg = self._parse_message_header(msg_buff)
        if not self.settings.ENABLE_DEFERRED_PACKET_PARSING:
            try:
@@ -85,6 +85,7 @@ class UDPMessageDeserializer:
        reader = se.BufferReader("!", data)

        msg: Message = Message("Placeholder")
        msg.synthetic = False
        msg.send_flags = reader.read(se.U8)
        msg.packet_id = reader.read(se.U32)


@@ -71,7 +71,7 @@ class Object(recordclass.datatuple):  # type: ignore
    ProfileBegin: Optional[int] = None
    ProfileEnd: Optional[int] = None
    ProfileHollow: Optional[int] = None
    TextureEntry: Optional[tmpls.TextureEntry] = None
    TextureEntry: Optional[tmpls.TextureEntryCollection] = None
    TextureAnim: Optional[tmpls.TextureAnim] = None
    NameValue: Optional[Any] = None
    Data: Optional[Any] = None
@@ -270,6 +270,9 @@ def normalize_object_update_compressed_data(data: bytes):
    # Only used for determining which sections are present
    del compressed["Flags"]

    # Unlike other ObjectUpdate types, a null value in an ObjectUpdateCompressed
    # always means that there is no value, not that the value hasn't changed
    # from the client's view. Use the default value when that happens.
    ps_block = compressed.pop("PSBlockNew", None)
    if ps_block is None:
        ps_block = compressed.pop("PSBlock", None)
@@ -278,6 +281,20 @@ def normalize_object_update_compressed_data(data: bytes):
    compressed.pop("PSBlock", None)
    if compressed["NameValue"] is None:
        compressed["NameValue"] = NameValueCollection()
    if compressed["Text"] is None:
        compressed["Text"] = b""
        compressed["TextColor"] = b""
    if compressed["MediaURL"] is None:
        compressed["MediaURL"] = b""
    if compressed["AngularVelocity"] is None:
        compressed["AngularVelocity"] = Vector3()
    if compressed["SoundFlags"] is None:
        compressed["SoundFlags"] = 0
        compressed["SoundGain"] = 0.0
        compressed["SoundRadius"] = 0.0
        compressed["Sound"] = UUID()
    if compressed["TextureEntry"] is None:
        compressed["TextureEntry"] = tmpls.TextureEntryCollection()

    object_data = {
        "PSBlock": ps_block.value,
@@ -286,9 +303,9 @@ def normalize_object_update_compressed_data(data: bytes):
        "LocalID": compressed.pop("ID"),
        **compressed,
    }
    if object_data["TextureEntry"] is None:
        object_data.pop("TextureEntry")
    # Don't clobber OwnerID in case the object has a proper one.
    # Don't clobber OwnerID in case the object has a proper one from
    # a previous ObjectProperties. OwnerID isn't expected to be populated
    # on ObjectUpdates unless an attached sound is playing.
    if object_data["OwnerID"] == UUID():
        del object_data["OwnerID"]
    return object_data
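A toy illustration of the null-means-default rule described in the comment above (field names taken from this hunk, values hypothetical):

```python
compressed = {"Text": None, "MediaURL": None, "SoundFlags": None}

# None in an ObjectUpdateCompressed means "no value at all", so it gets
# replaced with the field's default rather than treated as "unchanged".
if compressed["Text"] is None:
    compressed["Text"] = b""
if compressed["SoundFlags"] is None:
    compressed["SoundFlags"] = 0  # the related sound fields get defaults too
```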
@@ -1339,6 +1339,12 @@ class TypedBytesBase(SerializableBase, abc.ABC):
        return self._spec.default_value()


class TypedBytesGreedy(TypedBytesBase):
    def __init__(self, spec, empty_is_none=False, check_trailing_bytes=True, lazy=False):
        self._bytes_tmpl = BytesGreedy()
        super().__init__(spec, empty_is_none, check_trailing_bytes, lazy=lazy)


class TypedByteArray(TypedBytesBase):
    def __init__(self, len_spec, spec, empty_is_none=False, check_trailing_bytes=True, lazy=False):
        self._bytes_tmpl = ByteArray(len_spec)

@@ -3,16 +3,18 @@ Serialization templates for structures used in LLUDP and HTTP bodies.
"""

import abc
import collections
import dataclasses
import enum
import importlib
import logging
import math
import zlib
from typing import *

import hippolyzer.lib.base.serialization as se
from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.datatypes import UUID, IntEnum, IntFlag
from hippolyzer.lib.base.datatypes import UUID, IntEnum, IntFlag, Vector3
from hippolyzer.lib.base.namevalue import NameValuesSerializer

try:
@@ -144,6 +146,50 @@ class InventoryType(IntEnum):
        }.get(lower, lower)

class FolderType(IntEnum):
    TEXTURE = 0
    SOUND = 1
    CALLINGCARD = 2
    LANDMARK = 3
    CLOTHING = 5
    OBJECT = 6
    NOTECARD = 7
    # We'd really like to change this to 9 since AT_CATEGORY is 8,
    # but "My Inventory" has been type 8 for a long time.
    ROOT_INVENTORY = 8
    LSL_TEXT = 10
    BODYPART = 13
    TRASH = 14
    SNAPSHOT_CATEGORY = 15
    LOST_AND_FOUND = 16
    ANIMATION = 20
    GESTURE = 21
    FAVORITE = 23
    ENSEMBLE_START = 26
    ENSEMBLE_END = 45
    # This range is reserved for special clothing folder types.
    CURRENT_OUTFIT = 46
    OUTFIT = 47
    MY_OUTFITS = 48
    MESH = 49
    # "received items" for MP
    INBOX = 50
    OUTBOX = 51
    BASIC_ROOT = 52
    MARKETPLACE_LISTINGS = 53
    MARKETPLACE_STOCK = 54
    # Note: We actually *never* create folders with that type. This is used for icon override only.
    MARKETPLACE_VERSION = 55
    SETTINGS = 56
    # Firestorm folders, may not actually exist
    FIRESTORM = 57
    PHOENIX = 58
    RLV = 59
    # Opensim folders
    MY_SUITCASE = 100
    NONE = -1


@se.enum_field_serializer("AgentIsNowWearing", "WearableData", "WearableType")
@se.enum_field_serializer("AgentWearablesUpdate", "WearableData", "WearableType")
@se.enum_field_serializer("CreateInventoryItem", "InventoryBlock", "WearableType")
@@ -177,6 +223,7 @@ def _register_permissions_flags(message_name, block_name):

@se.flag_field_serializer("ObjectPermissions", "ObjectData", "Mask")
@_register_permissions_flags("ObjectProperties", "ObjectData")
@_register_permissions_flags("ObjectPropertiesFamily", "ObjectData")
@_register_permissions_flags("UpdateCreateInventoryItem", "InventoryData")
@_register_permissions_flags("UpdateTaskInventory", "InventoryData")
@_register_permissions_flags("CreateInventoryItem", "InventoryBlock")
@@ -201,11 +248,74 @@ class Permissions(IntFlag):
    RESERVED = 1 << 31

@se.enum_field_serializer("ObjectSaleInfo", "ObjectData", "SaleType")
|
||||
@se.enum_field_serializer("ObjectProperties", "ObjectData", "SaleType")
|
||||
@se.enum_field_serializer("ObjectPropertiesFamily", "ObjectData", "SaleType")
|
||||
@se.enum_field_serializer("ObjectBuy", "ObjectData", "SaleType")
|
||||
@se.enum_field_serializer("RezScript", "InventoryBlock", "SaleType")
|
||||
@se.enum_field_serializer("RezObject", "InventoryData", "SaleType")
|
||||
@se.enum_field_serializer("UpdateTaskInventory", "InventoryData", "SaleType")
|
||||
@se.enum_field_serializer("UpdateCreateInventoryItem", "InventoryData", "SaleType")
|
||||
class SaleInfo(IntEnum):
|
||||
NOT = 0
|
||||
ORIGINAL = 1
|
||||
COPY = 2
|
||||
CONTENTS = 3
|
||||
|
||||
|
||||
@se.flag_field_serializer("ParcelInfoReply", "Data", "Flags")
|
||||
class ParcelInfoFlags(IntFlag):
|
||||
MATURE = 1 << 0
|
||||
# You should never see adult without mature
|
||||
ADULT = 1 << 1
|
||||
GROUP_OWNED = 1 << 2
|
||||
|
||||
|
||||
@se.flag_field_serializer("MapItemRequest", "AgentData", "Flags")
|
||||
@se.flag_field_serializer("MapNameRequest", "AgentData", "Flags")
|
||||
@se.flag_field_serializer("MapBlockRequest", "AgentData", "Flags")
|
||||
@se.flag_field_serializer("MapItemReply", "AgentData", "Flags")
|
||||
@se.flag_field_serializer("MapNameReply", "AgentData", "Flags")
|
||||
@se.flag_field_serializer("MapBlockReply", "AgentData", "Flags")
|
||||
class MapImageFlags(IntFlag):
|
||||
# No clue, honestly. I guess there's potentially different image types you could request.
|
||||
LAYER = 1 << 1
|
||||
|
||||
|
||||
@se.enum_field_serializer("MapBlockReply", "Data", "Access")
|
||||
@se.enum_field_serializer("RegionInfo", "RegionInfo", "SimAccess")
|
||||
class SimAccess(IntEnum):
|
||||
# Treated as 'unknown', usually ends up being SIM_ACCESS_PG
|
||||
MIN = 0
|
||||
PG = 13
|
||||
MATURE = 21
|
||||
ADULT = 42
|
||||
DOWN = 254
|
||||
|
||||
|
||||
@se.enum_field_serializer("MapItemRequest", "RequestData", "ItemType")
|
||||
@se.enum_field_serializer("MapItemReply", "RequestData", "ItemType")
|
||||
class MapItemType(IntEnum):
|
||||
TELEHUB = 0x01
|
||||
PG_EVENT = 0x02
|
||||
MATURE_EVENT = 0x03
|
||||
# No longer supported, 2009-03-02 KLW
|
||||
DEPRECATED_POPULAR = 0x04
|
||||
DEPRECATED_AGENT_COUNT = 0x05
|
||||
AGENT_LOCATIONS = 0x06
|
||||
LAND_FOR_SALE = 0x07
|
||||
CLASSIFIED = 0x08
|
||||
ADULT_EVENT = 0x09
|
||||
LAND_FOR_SALE_ADULT = 0x0a
|
||||
|
||||
|
||||
@se.flag_field_serializer("RezObject", "RezData", "ItemFlags")
|
||||
@se.flag_field_serializer("RezMultipleAttachmentsFromInv", "ObjectData", "ItemFlags")
|
||||
@se.flag_field_serializer("RezObject", "InventoryData", "Flags")
|
||||
@se.flag_field_serializer("RezScript", "InventoryBlock", "Flags")
|
||||
@se.flag_field_serializer("UpdateCreateInventoryItem", "InventoryData", "Flags")
|
||||
@se.flag_field_serializer("UpdateTaskInventory", "InventoryData", "Flags")
|
||||
@se.flag_field_serializer("ChangeInventoryItemFlags", "InventoryData", "Flags")
|
||||
class InventoryItemFlags(IntFlag):
|
||||
# The asset has only one reference in the system. If the
|
||||
# inventory item is deleted, or the assetid updated, then we
|
||||
@@ -232,7 +342,8 @@ class InventoryItemFlags(IntFlag):
|
||||
OBJECT_HAS_MULTIPLE_ITEMS = 0x200000
|
||||
|
||||
@property
|
||||
def attachment_point(self):
|
||||
def subtype(self):
|
||||
"""Subtype of the given item type, could be an attachment point or setting type, etc."""
|
||||
return self & 0xFF
|
||||
|
||||
|
||||
@@ -761,6 +872,7 @@ class MCode(IntEnum):
@se.flag_field_serializer("ObjectUpdateCompressed", "ObjectData", "UpdateFlags")
@se.flag_field_serializer("ObjectUpdateCached", "ObjectData", "UpdateFlags")
@se.flag_field_serializer("ObjectAdd", "ObjectData", "AddFlags")
@se.flag_field_serializer("ObjectDuplicate", "SharedData", "DuplicateFlags")
class ObjectUpdateFlags(IntFlag):
    USE_PHYSICS = 1 << 0
    CREATE_SELECTED = 1 << 1
@@ -796,6 +908,9 @@ class ObjectUpdateFlags(IntFlag):
    ZLIB_COMPRESSED_REPRECATED = 1 << 31


JUST_CREATED_FLAGS = (ObjectUpdateFlags.CREATE_SELECTED | ObjectUpdateFlags.OBJECT_YOU_OWNER)


class AttachmentStateAdapter(se.Adapter):
    # Encoded attachment point ID for attached objects
    # nibbles are swapped around because old attachment nums only used to live
@@ -840,6 +955,15 @@ class ObjectStateSerializer(se.AdapterSubfieldSerializer):
    ORIG_INLINE = True


@se.subfield_serializer("ObjectUpdate", "RegionData", "TimeDilation")
@se.subfield_serializer("ObjectUpdateCompressed", "RegionData", "TimeDilation")
@se.subfield_serializer("ObjectUpdateCached", "RegionData", "TimeDilation")
@se.subfield_serializer("ImprovedTerseObjectUpdate", "RegionData", "TimeDilation")
class TimeDilationSerializer(se.AdapterSubfieldSerializer):
    ADAPTER = se.QuantizedFloat(se.U16, 0.0, 1.0, False)
    ORIG_INLINE = True


@se.subfield_serializer("ImprovedTerseObjectUpdate", "ObjectData", "Data")
class ImprovedTerseObjectUpdateDataSerializer(se.SimpleSubfieldSerializer):
    TEMPLATE = se.Template({
@@ -862,12 +986,12 @@ class ShineLevel(IntEnum):
    HIGH = 3


@dataclasses.dataclass
@dataclasses.dataclass(unsafe_hash=True)
class BasicMaterials:
    # Meaning is technically implementation-dependent, these are in LL data files
    Bump: int = se.bitfield_field(bits=5)
    FullBright: bool = se.bitfield_field(bits=1, adapter=se.BoolAdapter())
    Shiny: int = se.bitfield_field(bits=2, adapter=se.IntEnum(ShineLevel))
    Bump: int = se.bitfield_field(bits=5, default=0)
    FullBright: bool = se.bitfield_field(bits=1, adapter=se.BoolAdapter(), default=False)
    Shiny: int = se.bitfield_field(bits=2, adapter=se.IntEnum(ShineLevel), default=0)


BUMP_SHINY_FULLBRIGHT = se.BitfieldDataclass(BasicMaterials, se.U8)
@@ -881,12 +1005,12 @@ class TexGen(IntEnum):
    CYLINDRICAL = 0x6


@dataclasses.dataclass
@dataclasses.dataclass(unsafe_hash=True)
class MediaFlags:
    WebPage: bool = se.bitfield_field(bits=1, adapter=se.BoolAdapter())
    TexGen: "TexGen" = se.bitfield_field(bits=2, adapter=se.IntEnum(TexGen))
    WebPage: bool = se.bitfield_field(bits=1, adapter=se.BoolAdapter(), default=False)
    TexGen: "TexGen" = se.bitfield_field(bits=2, adapter=se.IntEnum(TexGen), default=TexGen.DEFAULT)
    # Probably unused but show it just in case
    _Unused: int = se.bitfield_field(bits=5)
    _Unused: int = se.bitfield_field(bits=5, default=0)


# Not shifted so enum definitions can match indra
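One plausible reason for the `unsafe_hash=True` change above: `TextureEntryCollection.from_tes()` later in this changeset groups per-face values in a dict keyed by the value itself, which requires `BasicMaterials` and `MediaFlags` instances to be hashable. A hedged sketch:

```python
import collections

groups = collections.defaultdict(list)
# Plain mutable dataclasses with eq=True set __hash__ to None, so this
# would raise "TypeError: unhashable type" without unsafe_hash=True.
groups[BasicMaterials()].append(0)
groups[BasicMaterials()].append(1)
assert groups[BasicMaterials()] == [0, 1]
```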
@@ -1039,9 +1163,64 @@ def _te_field(spec: se.SERIALIZABLE_TYPE, first=False, optional=False,
_T = TypeVar("_T")
_TE_FIELD_KEY = Optional[Sequence[int]]

# If this seems weird it's because it is. TE offsets are S16s with `0` as the actual 0
# point, and LL divides by `0x7FFF` to convert back to float. Negative S16s can
# actually go to -0x8000 due to two's complement, creating a larger range for negatives.
TE_S16_COORD = se.QuantizedFloat(se.S16, -1.000030518509476, 1.0, False)

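A worked example of that asymmetry, as a hedged standalone sketch (the real conversion lives in `se.QuantizedFloat`, whose rounding details may differ):

```python
def s16_to_te_offset(raw: int) -> float:
    # LL decodes by dividing by 0x7FFF, so the S16 extremes map to:
    return raw / 0x7FFF

print(s16_to_te_offset(0x7FFF))   # 1.0 exactly
print(s16_to_te_offset(-0x8000))  # -1.000030518509476, the lower bound above
```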
class PackedTERotation(se.QuantizedFloat):
    """Another weird one, packed TE rotations have their own special quantization"""

    def __init__(self):
        super().__init__(se.S16, math.pi * -2, math.pi * 2, zero_median=False)
        self.step_mag = 1.0 / (se.U16.max_val + 1)

    def _float_to_quantized(self, val: float, lower: float, upper: float):
        val = math.fmod(val, upper)
        val = super()._float_to_quantized(val, lower, upper)
        if val == se.S16.max_val + 1:
            val = self.prim_min
        return val


@dataclasses.dataclass
class TextureEntry:
    """Representation of a TE for a single face. Not sent over the wire."""
    Textures: UUID = UUID('89556747-24cb-43ed-920b-47caed15465f')
    Color: bytes = b"\xff\xff\xff\xff"
    ScalesS: float = 1.0
    ScalesT: float = 1.0
    OffsetsS: float = 0.0
    OffsetsT: float = 0.0
    # In radians
    Rotation: float = 0.0
    MediaFlags: "MediaFlags" = dataclasses.field(default_factory=MediaFlags)
    BasicMaterials: "BasicMaterials" = dataclasses.field(default_factory=BasicMaterials)
    Glow: float = 0.0
    Materials: UUID = UUID.ZERO

    def st_to_uv(self, st_coord: Vector3) -> Vector3:
        """Convert OpenGL ST coordinates to UV coordinates, accounting for mapping"""
        uv = Vector3(st_coord.X - 0.5, st_coord.Y - 0.5)
        cos_rot = math.cos(self.Rotation)
        sin_rot = math.sin(self.Rotation)
        uv = Vector3(
            X=uv.X * cos_rot + uv.Y * sin_rot,
            Y=-uv.X * sin_rot + uv.Y * cos_rot
        )
        uv *= Vector3(self.ScalesS, self.ScalesT)
        return uv + Vector3(self.OffsetsS + 0.5, self.OffsetsT + 0.5)


# Max number of TEs possible according to llprimitive (but not really true!)
# Useful if you don't know how many faces / TEs an object really has because it's mesh
# or something.
MAX_TES = 45

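To make the mapping concrete, a small hedged usage sketch of `st_to_uv` (assuming `Vector3` supports the arithmetic used above): the transform centers the ST coordinate, rotates, scales, then re-offsets.

```python
import math

te = TextureEntry()  # defaults give an identity mapping
assert te.st_to_uv(Vector3(0.25, 0.75)) == Vector3(0.25, 0.75)

te = TextureEntry(Rotation=math.pi / 2)  # texture rotated 90 degrees
uv = te.st_to_uv(Vector3(0.25, 0.75))
assert math.isclose(uv.X, 0.75, abs_tol=1e-9) and math.isclose(uv.Y, 0.75, abs_tol=1e-9)
```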
@dataclasses.dataclass
class TextureEntryCollection:
    Textures: Dict[_TE_FIELD_KEY, UUID] = _te_field(
        # Plywood texture
        se.UUID, first=True, default=UUID('89556747-24cb-43ed-920b-47caed15465f'))
@@ -1049,21 +1228,65 @@ class TextureEntry:
    Color: Dict[_TE_FIELD_KEY, bytes] = _te_field(Color4(invert_bytes=True), default=b"\xff\xff\xff\xff")
    ScalesS: Dict[_TE_FIELD_KEY, float] = _te_field(se.F32, default=1.0)
    ScalesT: Dict[_TE_FIELD_KEY, float] = _te_field(se.F32, default=1.0)
    OffsetsS: Dict[_TE_FIELD_KEY, int] = _te_field(se.S16, default=0)
    OffsetsT: Dict[_TE_FIELD_KEY, int] = _te_field(se.S16, default=0)
    Rotation: Dict[_TE_FIELD_KEY, int] = _te_field(se.S16, default=0)
    OffsetsS: Dict[_TE_FIELD_KEY, float] = _te_field(TE_S16_COORD, default=0.0)
    OffsetsT: Dict[_TE_FIELD_KEY, float] = _te_field(TE_S16_COORD, default=0.0)
    Rotation: Dict[_TE_FIELD_KEY, float] = _te_field(PackedTERotation(), default=0.0)
    BasicMaterials: Dict[_TE_FIELD_KEY, "BasicMaterials"] = _te_field(
        BUMP_SHINY_FULLBRIGHT, default_factory=lambda: BasicMaterials(Bump=0, FullBright=False, Shiny=0),
        BUMP_SHINY_FULLBRIGHT, default_factory=BasicMaterials,
    )
    MediaFlags: Dict[_TE_FIELD_KEY, "MediaFlags"] = _te_field(
        MEDIA_FLAGS,
        default_factory=lambda: MediaFlags(WebPage=False, TexGen=TexGen.DEFAULT, _Unused=0),
    )
    Glow: Dict[_TE_FIELD_KEY, int] = _te_field(se.U8, default=0)
    Materials: Dict[_TE_FIELD_KEY, UUID] = _te_field(se.UUID, optional=True, default=UUID())
    MediaFlags: Dict[_TE_FIELD_KEY, "MediaFlags"] = _te_field(MEDIA_FLAGS, default_factory=MediaFlags)
    Glow: Dict[_TE_FIELD_KEY, float] = _te_field(se.QuantizedFloat(se.U8, 0.0, 1.0), default=0.0)
    Materials: Dict[_TE_FIELD_KEY, UUID] = _te_field(se.UUID, optional=True, default=UUID.ZERO)

    def unwrap(self):
        """Return `self` regardless of whether this is a lazy wrapped object or not"""
        return self

    def realize(self, num_faces: int = MAX_TES) -> List[TextureEntry]:
        """
        Turn the "default" vs "exception cases" wire format TE representation to per-face lookups

        Makes it easier to get all TE details associated with a specific face
        """
        as_dicts = [dict() for _ in range(num_faces)]
        for field in dataclasses.fields(self):
            key = field.name
            vals = getattr(self, key)
            # First give all faces the default value for this key
            for te in as_dicts:
                te[key] = vals[None]
            # Walk over the exception cases and replace the default value
            for face_nums, val in vals.items():
                # Default case already handled
                if face_nums is None:
                    continue
                for face_num in face_nums:
                    if face_num >= num_faces:
                        raise ValueError(f"Bad value for num_faces? {face_num} >= {num_faces}")
                    as_dicts[face_num][key] = val
        return [TextureEntry(**x) for x in as_dicts]

    @classmethod
    def from_tes(cls, tes: List[TextureEntry]) -> "TextureEntryCollection":
        instance = cls()
        if not tes:
            return instance

        for field in dataclasses.fields(cls):
            te_vals: Dict[Any, List[int]] = collections.defaultdict(list)
            for i, te in enumerate(tes):
                # Group values by what face they occur on
                te_vals[getattr(te, field.name)].append(i)
            # Make most common value the "default", everything else is an exception
            sorted_vals = sorted(te_vals.items(), key=lambda x: len(x[1]), reverse=True)
            default_val = sorted_vals.pop(0)[0]
            te_vals = {None: default_val}
            for val, face_nums in sorted_vals:
                te_vals[tuple(face_nums)] = val
            setattr(instance, field.name, te_vals)
        return instance


TE_SERIALIZER = se.Dataclass(TextureEntry)
TE_SERIALIZER = se.Dataclass(TextureEntryCollection)


@se.subfield_serializer("ObjectUpdate", "ObjectData", "TextureEntry")
@@ -1072,7 +1295,7 @@ TE_SERIALIZER = se.Dataclass(TextureEntry)
@se.subfield_serializer("ObjectImage", "ObjectData", "TextureEntry")
class TextureEntrySubfieldSerializer(se.SimpleSubfieldSerializer):
    EMPTY_IS_NONE = True
    TEMPLATE = TE_SERIALIZER
    TEMPLATE = se.TypedBytesGreedy(TE_SERIALIZER, empty_is_none=True, lazy=True)

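A hedged round-trip sketch showing how the collection form and the per-face form relate (the exact dict layout is an assumption based on the code above):

```python
import dataclasses

collection = TextureEntryCollection()
tes = collection.realize(num_faces=8)  # eight independent TextureEntry objects
tes[3] = dataclasses.replace(tes[3], Glow=0.5)

rebuilt = TextureEntryCollection.from_tes(tes)
# The most common value becomes the default, face 3 becomes an exception case:
assert rebuilt.Glow == {None: 0.0, (3,): 0.5}
```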
DATA_PACKER_TE_TEMPLATE = se.TypedByteArray(
@@ -1555,6 +1778,7 @@ class DeRezObjectDestination(IntEnum):
@se.flag_field_serializer("SimStats", "RegionInfo", "RegionFlagsExtended")
@se.flag_field_serializer("RegionInfo", "RegionInfo", "RegionFlags")
@se.flag_field_serializer("RegionInfo", "RegionInfo3", "RegionFlagsExtended")
@se.flag_field_serializer("MapBlockReply", "Data", "RegionFlags")
class RegionFlags(IntFlag):
    ALLOW_DAMAGE = 1 << 0
    ALLOW_LANDMARK = 1 << 1
@@ -1600,6 +1824,7 @@ class RegionHandshakeReplyFlags(IntFlag):
@se.flag_field_serializer("TeleportStart", "Info", "TeleportFlags")
@se.flag_field_serializer("TeleportProgress", "Info", "TeleportFlags")
@se.flag_field_serializer("TeleportFinish", "Info", "TeleportFlags")
@se.flag_field_serializer("TeleportLocal", "Info", "TeleportFlags")
@se.flag_field_serializer("TeleportLureRequest", "Info", "TeleportFlags")
class TeleportFlags(IntFlag):
    SET_HOME_TO_TARGET = 1 << 0  # newbie leaving prelude (starter area)
@@ -1618,6 +1843,158 @@ class TeleportFlags(IntFlag):
    IS_FLYING = 1 << 13
    SHOW_RESET_HOME = 1 << 14
    FORCE_REDIRECT = 1 << 15
    VIA_GLOBAL_COORDS = 1 << 16
    WITHIN_REGION = 1 << 17


@se.flag_field_serializer("AvatarPropertiesReply", "PropertiesData", "Flags")
class AvatarPropertiesFlags(IntFlag):
    ALLOW_PUBLISH = 1 << 0  # whether profile is externally visible or not
    MATURE_PUBLISH = 1 << 1  # profile is "mature"
    IDENTIFIED = 1 << 2  # whether avatar has provided payment info
    TRANSACTED = 1 << 3  # whether avatar has actively used payment info
    ONLINE = 1 << 4  # the online status of this avatar, if known.
    AGEVERIFIED = 1 << 5  # whether avatar has been age-verified


@se.flag_field_serializer("AvatarGroupsReply", "GroupData", "GroupPowers")
@se.flag_field_serializer("AvatarGroupDataUpdate", "GroupData", "GroupPowers")
@se.flag_field_serializer("AvatarDataUpdate", "AgentDataData", "GroupPowers")
class GroupPowerFlags(IntFlag):
    MEMBER_INVITE = 1 << 1  # Invite member
    MEMBER_EJECT = 1 << 2  # Eject member from group
    MEMBER_OPTIONS = 1 << 3  # Toggle "Open enrollment" and change "Signup Fee"
    MEMBER_VISIBLE_IN_DIR = 1 << 47

    # Roles
    ROLE_CREATE = 1 << 4  # Create new roles
    ROLE_DELETE = 1 << 5  # Delete roles
    ROLE_PROPERTIES = 1 << 6  # Change Role Names, Titles, and Descriptions (Of roles the user is in, only, or any role in group?)
    ROLE_ASSIGN_MEMBER_LIMITED = 1 << 7  # Assign Member to a Role that the assigner is in
    ROLE_ASSIGN_MEMBER = 1 << 8  # Assign Member to Role
    ROLE_REMOVE_MEMBER = 1 << 9  # Remove Member from Role
    ROLE_CHANGE_ACTIONS = 1 << 10  # Change actions a role can perform

    # Group Identity
    GROUP_CHANGE_IDENTITY = 1 << 11  # Charter, insignia, 'Show In Group List', 'Publish on the web', 'Mature', all 'Show Member In Group Profile' checkboxes

    # Parcel Management
    LAND_DEED = 1 << 12  # Deed Land and Buy Land for Group
    LAND_RELEASE = 1 << 13  # Release Land (to Gov. Linden)
    LAND_SET_SALE_INFO = 1 << 14  # Set for sale info (Toggle "For Sale", Set Price, Set Target, Toggle "Sell objects with the land")
    LAND_DIVIDE_JOIN = 1 << 15  # Divide and Join Parcels

    # Parcel Identity
    LAND_FIND_PLACES = 1 << 17  # Toggle "Show in Find Places" and Set Category.
    LAND_CHANGE_IDENTITY = 1 << 18  # Change Parcel Identity: Parcel Name, Parcel Description, Snapshot, 'Publish on the web', and 'Mature' checkbox
    LAND_SET_LANDING_POINT = 1 << 19  # Set Landing Point

    # Parcel Settings
    LAND_CHANGE_MEDIA = 1 << 20  # Change Media Settings
    LAND_EDIT = 1 << 21  # Toggle Edit Land
    LAND_OPTIONS = 1 << 22  # Toggle Set Home Point, Fly, Outside Scripts, Create/Edit Objects, Landmark, and Damage checkboxes

    # Parcel Powers
    LAND_ALLOW_EDIT_LAND = 1 << 23  # Bypass Edit Land Restriction
    LAND_ALLOW_FLY = 1 << 24  # Bypass Fly Restriction
    LAND_ALLOW_CREATE = 1 << 25  # Bypass Create/Edit Objects Restriction
    LAND_ALLOW_LANDMARK = 1 << 26  # Bypass Landmark Restriction
    LAND_ALLOW_SET_HOME = 1 << 28  # Bypass Set Home Point Restriction
    LAND_ALLOW_HOLD_EVENT = 1 << 41  # Allowed to hold events on group-owned land
    LAND_ALLOW_ENVIRONMENT = 1 << 46  # Allowed to change the environment

    # Parcel Access
    LAND_MANAGE_ALLOWED = 1 << 29  # Manage Allowed List
    LAND_MANAGE_BANNED = 1 << 30  # Manage Banned List
    LAND_MANAGE_PASSES = 1 << 31  # Change Sell Pass Settings
    LAND_ADMIN = 1 << 32  # Eject and Freeze Users on the land

    # Parcel Content
    LAND_RETURN_GROUP_SET = 1 << 33  # Return objects on parcel that are set to group
    LAND_RETURN_NON_GROUP = 1 << 34  # Return objects on parcel that are not set to group
    LAND_RETURN_GROUP_OWNED = 1 << 48  # Return objects on parcel that are owned by the group

    LAND_GARDENING = 1 << 35  # Parcel Gardening - plant and move linden trees

    # Object Management
    OBJECT_DEED = 1 << 36  # Deed Object
    OBJECT_MANIPULATE = 1 << 38  # Manipulate Group Owned Objects (Move, Copy, Mod)
    OBJECT_SET_SALE = 1 << 39  # Set Group Owned Object for Sale

    # Accounting
    ACCOUNTING_ACCOUNTABLE = 1 << 40  # Pay Group Liabilities and Receive Group Dividends

    # Notices
    NOTICES_SEND = 1 << 42  # Send Notices
    NOTICES_RECEIVE = 1 << 43  # Receive Notices and View Notice History

    # Proposals
    # TODO: _DEPRECATED suffix as part of vote removal - DEV-24856:
    PROPOSAL_START = 1 << 44  # Start Proposal
    # TODO: _DEPRECATED suffix as part of vote removal - DEV-24856:
    PROPOSAL_VOTE = 1 << 45  # Vote on Proposal

    # Group chat moderation related
    SESSION_JOIN = 1 << 16  # can join session
    SESSION_VOICE = 1 << 27  # can hear/talk
    SESSION_MODERATOR = 1 << 37  # can mute people's session

    EXPERIENCE_ADMIN = 1 << 49  # has admin rights to any experiences owned by this group
    EXPERIENCE_CREATOR = 1 << 50  # can sign scripts for experiences owned by this group

    # Group Banning
    GROUP_BAN_ACCESS = 1 << 51  # Allows access to ban / un-ban agents from a group.

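Since GroupPowers arrives as a single 64-bit integer on the wire, checking a specific ability is just a flag test. A hedged sketch (`raw_group_powers` is a stand-in for the wire value):

```python
raw_group_powers = (1 << 32) | (1 << 42)  # hypothetical value off the wire
powers = GroupPowerFlags(raw_group_powers)
if GroupPowerFlags.LAND_ADMIN in powers:
    print("Can eject and freeze on group-owned land")
```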
@se.flag_field_serializer("RequestObjectPropertiesFamily", "ObjectData", "RequestFlags")
|
||||
@se.flag_field_serializer("ObjectPropertiesFamily", "ObjectData", "RequestFlags")
|
||||
class ObjectPropertiesFamilyRequestFlags(IntFlag):
|
||||
BUG_REPORT = 1 << 0
|
||||
COMPLAINT_REPORT = 1 << 1
|
||||
OBJECT_PAY = 1 << 2
|
||||
|
||||
|
||||
@se.enum_field_serializer("RequestImage", "RequestImage", "Type")
|
||||
class RequestImageType(IntEnum):
|
||||
NORMAL = 0
|
||||
AVATAR_BAKE = 1
|
||||
|
||||
|
||||
@se.enum_field_serializer("ImageData", "ImageID", "Codec")
|
||||
class ImageCodec(IntEnum):
|
||||
INVALID = 0
|
||||
RGB = 1
|
||||
J2C = 2
|
||||
BMP = 3
|
||||
TGA = 4
|
||||
JPEG = 5
|
||||
DXT = 6
|
||||
PNG = 7
|
||||
|
||||
|
||||
@se.enum_field_serializer("LayerData", "LayerID", "Type")
|
||||
class LayerDataType(IntEnum):
|
||||
LAND_LAYER_CODE = ord('L')
|
||||
WIND_LAYER_CODE = ord('7')
|
||||
CLOUD_LAYER_CODE = ord('8')
|
||||
WATER_LAYER_CODE = ord('W')
|
||||
|
||||
# <FS:CR> Aurora Sim
|
||||
# Extended land layer for Aurora Sim
|
||||
AURORA_LAND_LAYER_CODE = ord('M')
|
||||
AURORA_WATER_LAYER_CODE = ord('X')
|
||||
AURORA_WIND_LAYER_CODE = ord('9')
|
||||
AURORA_CLOUD_LAYER_CODE = ord(':')
|
||||
|
||||
|
||||
@se.enum_field_serializer("ModifyLand", "ModifyBlock", "Action")
|
||||
class ModifyLandAction(IntEnum):
|
||||
LEVEL = 0
|
||||
RAISE = 1
|
||||
LOWER = 2
|
||||
SMOOTH = 3
|
||||
NOISE = 4
|
||||
REVERT = 5
|
||||
|
||||
|
||||
@se.http_serializer("RenderMaterials")
|
||||
|
||||
@@ -10,6 +10,7 @@ from typing import *
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.message.circuit import ConnectionHolder
from hippolyzer.lib.base.message.msgtypes import PacketFlags
from hippolyzer.lib.base.templates import (
    TransferRequestParamsBase,
    TransferChannelType,
@@ -94,7 +95,7 @@ class TransferManager:
        if params_dict.get("SessionID", dataclasses.MISSING) is None:
            params.SessionID = self._session_id

        self._connection_holder.circuit.send_message(Message(
        self._connection_holder.circuit.send(Message(
            'TransferRequest',
            Block(
                'TransferInfo',
@@ -104,6 +105,7 @@ class TransferManager:
                Priority=priority,
                Params_=params,
            ),
            flags=PacketFlags.RELIABLE,
        ))
        transfer = Transfer(transfer_id)
        asyncio.create_task(self._pump_transfer_replies(transfer))
@@ -11,7 +11,7 @@ from typing import *
from hippolyzer.lib.base.datatypes import UUID, RawBytes
from hippolyzer.lib.base.message.data_packer import TemplateDataPacker
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.message.msgtypes import MsgType
from hippolyzer.lib.base.message.msgtypes import MsgType, PacketFlags
from hippolyzer.lib.base.network.transport import Direction
from hippolyzer.lib.base.message.circuit import ConnectionHolder
from hippolyzer.lib.base.templates import XferPacket, XferFilePath, AssetType, XferError
@@ -110,7 +110,7 @@ class XferManager:
        direction: Direction = Direction.OUT,
    ) -> Xfer:
        xfer_id = xfer_id if xfer_id is not None else random.getrandbits(64)
        self._connection_holder.circuit.send_message(Message(
        self._connection_holder.circuit.send(Message(
            'RequestXfer',
            Block(
                'XferID',
@@ -174,10 +174,11 @@ class XferManager:
            to_ack = range(xfer.next_ackable, ack_max)
            xfer.next_ackable = ack_max
            for ack_id in to_ack:
                self._connection_holder.circuit.send_message(Message(
                self._connection_holder.circuit.send_reliable(Message(
                    "ConfirmXferPacket",
                    Block("XferID", ID=xfer.xfer_id, Packet=ack_id),
                    direction=xfer.direction,
                    flags=PacketFlags.RELIABLE,
                ))

        xfer.chunks[packet_id.PacketID] = packet_data
@@ -216,7 +217,7 @@ class XferManager:
        else:
            inline_data = data

        self._connection_holder.circuit.send_message(Message(
        self._connection_holder.circuit.send(Message(
            "AssetUploadRequest",
            Block(
                "AssetBlock",
@@ -225,7 +226,8 @@ class XferManager:
                Tempfile=temp_file,
                StoreLocal=store_local,
                AssetData=inline_data,
            )
            ),
            flags=PacketFlags.RELIABLE
        ))
        fut = asyncio.Future()
        asyncio.create_task(self._pump_asset_upload(xfer, transaction_id, fut))
@@ -272,12 +274,13 @@ class XferManager:
        chunk = xfer.chunks.pop(packet_id)
        # EOF if there are no chunks left
        packet_val = XferPacket(PacketID=packet_id, IsEOF=not bool(xfer.chunks))
        self._connection_holder.circuit.send_message(Message(
        self._connection_holder.circuit.send(Message(
            "SendXferPacket",
            Block("XferID", ID=xfer.xfer_id, Packet_=packet_val),
            Block("DataPacket", Data=chunk),
            # Send this towards the sender of the RequestXfer
            direction=~request_msg.direction,
            flags=PacketFlags.RELIABLE,
        ))
        # Don't care about the value, just want to know it was confirmed.
        if wait_for_confirm:
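These hunks split sends into two flavors: `send()` with `PacketFlags.RELIABLE` only marks the packet so the receiver acks it, while `send_reliable()` — judging by the resend loop added to the LLUDP proxy protocol later in this diff — presumably also registers the message for retransmission until its ack is collected. A hedged sketch of the calling convention (`circuit` and `xfer_id` are stand-ins):

```python
# Marked reliable so the receiver acks it, but not tracked by us:
circuit.send(Message(
    "ConfirmXferPacket",
    Block("XferID", ID=xfer_id, Packet=0),
    flags=PacketFlags.RELIABLE,
))

# Assumed: also queued for retransmission until the matching ack arrives.
circuit.send_reliable(Message(
    "ConfirmXferPacket",
    Block("XferID", ID=xfer_id, Packet=0),
))
```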
@@ -17,6 +17,7 @@ from hippolyzer.lib.base.datatypes import UUID, Vector3
from hippolyzer.lib.base.helpers import proxify
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.message.message_handler import MessageHandler
from hippolyzer.lib.base.message.msgtypes import PacketFlags
from hippolyzer.lib.base.objects import (
    normalize_object_update,
    normalize_terse_object_update,
@@ -34,7 +35,7 @@ LOG = logging.getLogger(__name__)
OBJECT_OR_LOCAL = Union[Object, int]


class UpdateType(enum.IntEnum):
class ObjectUpdateType(enum.IntEnum):
    OBJECT_UPDATE = enum.auto()
    PROPERTIES = enum.auto()
    FAMILY = enum.auto()
@@ -116,15 +117,15 @@ class ClientObjectManager:
                *[Block("ObjectData", ObjectLocalID=x) for x in ids_to_req[:255]],
            ]
            # Selecting causes ObjectProperties to be sent
            self._region.circuit.send_message(Message("ObjectSelect", blocks))
            self._region.circuit.send_message(Message("ObjectDeselect", blocks))
            self._region.circuit.send(Message("ObjectSelect", blocks, flags=PacketFlags.RELIABLE))
            self._region.circuit.send(Message("ObjectDeselect", blocks, flags=PacketFlags.RELIABLE))
            ids_to_req = ids_to_req[255:]

        futures = []
        for local_id in local_ids:
            if local_id in unselected_ids:
                # Need to wait until we get our reply
                fut = self.state.register_future(local_id, UpdateType.PROPERTIES)
                fut = self.state.register_future(local_id, ObjectUpdateType.PROPERTIES)
            else:
                # This was selected so we should already have up to date info
                fut = asyncio.Future()
@@ -150,16 +151,17 @@ class ClientObjectManager:

        ids_to_req = local_ids
        while ids_to_req:
            self._region.circuit.send_message(Message(
            self._region.circuit.send(Message(
                "RequestMultipleObjects",
                Block("AgentData", AgentID=session.agent_id, SessionID=session.id),
                *[Block("ObjectData", CacheMissType=0, ID=x) for x in ids_to_req[:255]],
                flags=PacketFlags.RELIABLE,
            ))
            ids_to_req = ids_to_req[255:]

        futures = []
        for local_id in local_ids:
            futures.append(self.state.register_future(local_id, UpdateType.OBJECT_UPDATE))
            futures.append(self.state.register_future(local_id, ObjectUpdateType.OBJECT_UPDATE))
        return futures


@@ -168,15 +170,15 @@ class ObjectEvent:

    object: Object
    updated: Set[str]
    update_type: UpdateType
    update_type: ObjectUpdateType

    def __init__(self, obj: Object, updated: Set[str], update_type: UpdateType):
    def __init__(self, obj: Object, updated: Set[str], update_type: ObjectUpdateType):
        self.object = obj
        self.updated = updated
        self.update_type = update_type

    @property
    def name(self) -> UpdateType:
    def name(self) -> ObjectUpdateType:
        return self.update_type

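With the rename applied, waiting on a property reply looks roughly like this (a hedged sketch; `state` stands in for the manager's `RegionObjectsState`):

```python
import asyncio

async def fetch_properties(state, local_id: int):
    # The ObjectSelect/ObjectDeselect pair sent above triggers an
    # ObjectProperties reply, which resolves this registered future.
    fut = state.register_future(local_id, ObjectUpdateType.PROPERTIES)
    return await asyncio.wait_for(fut, timeout=5.0)
```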
@@ -186,7 +188,7 @@ class ClientWorldObjectManager:
        self._session: BaseClientSession = session
        self._settings = settings
        self.name_cache = name_cache or NameCache()
        self.events: MessageHandler[ObjectEvent, UpdateType] = MessageHandler(take_by_default=False)
        self.events: MessageHandler[ObjectEvent, ObjectUpdateType] = MessageHandler(take_by_default=False)
        self._fullid_lookup: Dict[UUID, Object] = {}
        self._avatars: Dict[UUID, Avatar] = {}
        self._avatar_objects: Dict[UUID, Object] = {}
@@ -295,7 +297,7 @@ class ClientWorldObjectManager:
        self._rebuild_avatar_objects()
        self._region_managers.clear()

    def _update_existing_object(self, obj: Object, new_properties: dict, update_type: UpdateType):
    def _update_existing_object(self, obj: Object, new_properties: dict, update_type: ObjectUpdateType):
        old_parent_id = obj.ParentID
        new_parent_id = new_properties.get("ParentID", obj.ParentID)
        old_local_id = obj.LocalID
@@ -354,7 +356,7 @@ class ClientWorldObjectManager:
        if obj.PCode == PCode.AVATAR:
            self._avatar_objects[obj.FullID] = obj
            self._rebuild_avatar_objects()
        self._run_object_update_hooks(obj, set(obj.to_dict().keys()), UpdateType.OBJECT_UPDATE)
        self._run_object_update_hooks(obj, set(obj.to_dict().keys()), ObjectUpdateType.OBJECT_UPDATE)

    def _kill_object_by_local_id(self, region_state: RegionObjectsState, local_id: int):
        obj = region_state.lookup_localid(local_id)
@@ -406,7 +408,7 @@ class ClientWorldObjectManager:
            # our view of the world then we want to move it to this region.
            obj = self.lookup_fullid(object_data["FullID"])
            if obj:
                self._update_existing_object(obj, object_data, UpdateType.OBJECT_UPDATE)
                self._update_existing_object(obj, object_data, ObjectUpdateType.OBJECT_UPDATE)
            else:
                if region_state is None:
                    continue
@@ -430,7 +432,7 @@ class ClientWorldObjectManager:
                # Need the Object as context because decoding state requires PCode.
                state_deserializer = ObjectStateSerializer.deserialize
                object_data["State"] = state_deserializer(ctx_obj=obj, val=object_data["State"])
                self._update_existing_object(obj, object_data, UpdateType.OBJECT_UPDATE)
                self._update_existing_object(obj, object_data, ObjectUpdateType.OBJECT_UPDATE)
            else:
                if region_state:
                    region_state.missing_locals.add(object_data["LocalID"])
@@ -458,7 +460,7 @@ class ClientWorldObjectManager:
                self._update_existing_object(obj, {
                    "UpdateFlags": update_flags,
                    "RegionHandle": handle,
                }, UpdateType.OBJECT_UPDATE)
                }, ObjectUpdateType.OBJECT_UPDATE)
                continue

            cached_obj_data = self._lookup_cache_entry(handle, block["ID"], block["CRC"])
@@ -497,7 +499,7 @@ class ClientWorldObjectManager:
                LOG.warning(f"Got ObjectUpdateCompressed for unknown region {handle}: {object_data!r}")
            obj = self.lookup_fullid(object_data["FullID"])
            if obj:
                self._update_existing_object(obj, object_data, UpdateType.OBJECT_UPDATE)
                self._update_existing_object(obj, object_data, ObjectUpdateType.OBJECT_UPDATE)
            else:
                if region_state is None:
                    continue
@@ -514,7 +516,7 @@ class ClientWorldObjectManager:
            obj = self.lookup_fullid(block["ObjectID"])
            if obj:
                seen_locals.append(obj.LocalID)
                self._update_existing_object(obj, object_properties, UpdateType.PROPERTIES)
                self._update_existing_object(obj, object_properties, ObjectUpdateType.PROPERTIES)
            else:
                LOG.debug(f"Received {packet.name} for unknown {block['ObjectID']}")
        packet.meta["ObjectUpdateIDs"] = tuple(seen_locals)
@@ -561,9 +563,9 @@ class ClientWorldObjectManager:
                LOG.debug(f"Received ObjectCost for unknown {object_id}")
                continue
            obj.ObjectCosts.update(object_costs)
            self._run_object_update_hooks(obj, {"ObjectCosts"}, UpdateType.COSTS)
            self._run_object_update_hooks(obj, {"ObjectCosts"}, ObjectUpdateType.COSTS)

    def _run_object_update_hooks(self, obj: Object, updated_props: Set[str], update_type: UpdateType):
    def _run_object_update_hooks(self, obj: Object, updated_props: Set[str], update_type: ObjectUpdateType):
        region_state = self._get_region_state(obj.RegionHandle)
        region_state.resolve_futures(obj, update_type)
        if obj.PCode == PCode.AVATAR and "NameValue" in updated_props:
@@ -572,7 +574,7 @@ class ClientWorldObjectManager:
        self.events.handle(ObjectEvent(obj, updated_props, update_type))

    def _run_kill_object_hooks(self, obj: Object):
        self.events.handle(ObjectEvent(obj, set(), UpdateType.KILL))
        self.events.handle(ObjectEvent(obj, set(), ObjectUpdateType.KILL))

    def _rebuild_avatar_objects(self):
        # Get all avatars known through coarse locations and which region the location was in
@@ -779,7 +781,7 @@ class RegionObjectsState:
            del self._orphans[parent_id]
        return removed

    def register_future(self, local_id: int, future_type: UpdateType) -> asyncio.Future[Object]:
    def register_future(self, local_id: int, future_type: ObjectUpdateType) -> asyncio.Future[Object]:
        fut = asyncio.Future()
        fut_key = (local_id, future_type)
        local_futs = self._object_futures.get(fut_key, [])
@@ -788,7 +790,7 @@ class RegionObjectsState:
        fut.add_done_callback(local_futs.remove)
        return fut

    def resolve_futures(self, obj: Object, update_type: UpdateType):
    def resolve_futures(self, obj: Object, update_type: ObjectUpdateType):
        futures = self._object_futures.get((obj.LocalID, update_type), [])
        for fut in futures[:]:
            fut.set_result(obj)
@@ -73,17 +73,17 @@ def show_message(text, session=None) -> None:
        direction=Direction.IN,
    )
    if session:
        session.main_region.circuit.send_message(message)
        session.main_region.circuit.send(message)
    else:
        for session in AddonManager.SESSION_MANAGER.sessions:
            session.main_region.circuit.send_message(copy.copy(message))
            session.main_region.circuit.send(copy.copy(message))


def send_chat(message: Union[bytes, str], channel=0, chat_type=ChatType.NORMAL, session=None):
    session = session or addon_ctx.session.get(None) or None
    if not session:
        raise RuntimeError("Tried to send chat without session")
    session.main_region.circuit.send_message(Message(
    session.main_region.circuit.send(Message(
        "ChatFromViewer",
        Block(
            "AgentData",
@@ -128,6 +128,17 @@ def ais_item_to_inventory_data(ais_item: dict):
    )


def ais_folder_to_inventory_data(ais_folder: dict):
    return Block(
        "FolderData",
        FolderID=ais_folder["cat_id"],
        ParentID=ais_folder["parent_id"],
        CallbackID=0,
        Type=ais_folder["preferred_type"],
        Name=ais_folder["name"],
    )


class BaseAddon(abc.ABC):
    def _schedule_task(self, coro: Coroutine, session=None,
                       region_scoped=False, session_scoped=True, addon_scoped=True):
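A hedged usage sketch for the new helper, with a made-up AIS payload (real AIS category dicts carry more keys than shown, and the stand-in values here are hypothetical):

```python
ais_folder = {
    "cat_id": UUID(),                   # stand-in IDs; a real AIS response fills these
    "parent_id": UUID(),
    "preferred_type": FolderType.NONE,  # see the FolderType enum added in this diff
    "name": "New Folder",
}
block = ais_folder_to_inventory_data(ais_folder)
# The Block can then be dropped into an inventory update message,
# presumably something like BulkUpdateInventory.
```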
@@ -58,6 +58,15 @@ class BaseInteractionManager:
        return None


# Used to initialize a REPL environment with commonly desired helpers
REPL_INITIALIZER = r"""
from hippolyzer.lib.base.datatypes import *
from hippolyzer.lib.base.templates import *
from hippolyzer.lib.base.message.message import Block, Message, Direction
from hippolyzer.lib.proxy.addon_utils import send_chat, show_message
"""


class AddonManager:
    COMMAND_CHANNEL = 524

@@ -133,6 +142,16 @@ class AddonManager:
        if _locals is None:
            _locals = stack.frame.f_locals

        init_globals = {}
        exec(REPL_INITIALIZER, init_globals, None)
        # We're modifying the globals of the caller; be careful that things we imported
        # for the REPL initializer don't clobber things that already exist in the caller's globals.
        # Making our own mutable copy of the globals dict, mutating that and then passing it
        # to embed() is not an option due to https://github.com/prompt-toolkit/ptpython/issues/279
        for global_name, global_val in init_globals.items():
            if global_name not in _globals:
                _globals[global_name] = global_val

        async def _wrapper():
            coro: Coroutine = ptpython.repl.embed(  # noqa: the type signature lies
                globals=_globals,
@@ -256,6 +275,9 @@ class AddonManager:

        new_addons = {}
        for spec in cls.BASE_ADDON_SPECS[:]:
            previous_mod = cls.FRESH_ADDON_MODULES.get(spec.name)
            # Whether we've EVER successfully loaded this module;
            # there may be a `None` entry in the dict if that's the case.
            had_mod = spec.name in cls.FRESH_ADDON_MODULES
            try:
                mtime = get_mtime(spec.origin)
@@ -270,20 +292,21 @@ class AddonManager:
                    # Keep module loaded even if file went away.
                    continue

                if previous_mod:
                    cls._unload_module(previous_mod)

                logging.info("(Re)compiling addon %s" % spec.origin)
                old_mod = cls.FRESH_ADDON_MODULES.get(spec.name)
                mod = importlib.util.module_from_spec(spec)
                sys.modules[spec.name] = mod
                spec.loader.exec_module(mod)
                cls.FILE_MTIMES[spec.origin] = mtime
                cls._unload_module(old_mod)

                new_addons[spec.name] = mod

                # Make sure module initialization happens after any pending task cancellations
                # due to module unloading.
                asyncio.get_event_loop().call_soon(cls._init_module, mod)
                loop = asyncio.get_event_loop_policy().get_event_loop()
                loop.call_soon(cls._init_module, mod)
            except Exception as e:
                if had_mod:
                    logging.exception("Exploded trying to reload addon %s" % spec.name)
@@ -28,7 +28,8 @@ class ProxyCapsClient(CapsClient):
        # We go through the proxy by default, tack on a header letting mitmproxy know the
        # request came from us so we can tag the request as injected. The header will be popped
        # off before passing through to the server.
        headers["X-Hippo-Injected"] = "1"
        if "X-Hippo-Injected" not in headers:
            headers["X-Hippo-Injected"] = "1"
        proxy_port = self._settings.HTTP_PROXY_PORT
        proxy = f"http://127.0.0.1:{proxy_port}"
        # TODO: set up the SSLContext to validate mitmproxy's cert
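The same header contract can be honored by ad-hoc clients going through the proxy. A hedged aiohttp sketch (the port argument and the disabled cert check are assumptions; the TODO above notes mitmproxy's cert isn't validated yet):

```python
import aiohttp

async def fetch_via_hippolyzer(url: str, proxy_port: int) -> bytes:
    async with aiohttp.ClientSession() as session:
        async with session.get(
            url,
            headers={"X-Hippo-Injected": "1"},   # tag the request as ours
            proxy=f"http://127.0.0.1:{proxy_port}",
            ssl=False,                           # mitmproxy's cert isn't validated yet
        ) as resp:
            return await resp.read()
```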
@@ -25,7 +25,7 @@ class ProxiedCircuit(Circuit):
        except:
            logging.exception(f"Failed to serialize: {message.to_dict()!r}")
            raise
        if self.logging_hook and message.injected:
        if self.logging_hook and message.synthetic:
            self.logging_hook(message)
        return self.send_datagram(serialized, message.direction, transport=transport)

@@ -34,47 +34,46 @@ class ProxiedCircuit(Circuit):
            return self.out_injections, self.in_injections
        return self.in_injections, self.out_injections

    def prepare_message(self, message: Message, direction=None):
    def prepare_message(self, message: Message):
        if message.finalized:
            raise RuntimeError(f"Trying to re-send finalized {message!r}")
        if message.queued:
            # This is due to be dropped, nothing should be sending the original
            raise RuntimeError(f"Trying to send original of queued {message!r}")
        direction = direction or getattr(message, 'direction')
        fwd_injections, reverse_injections = self._get_injections(direction)
        fwd_injections, reverse_injections = self._get_injections(message.direction)

        message.finalized = True

        # Injected, let's gen an ID
        if message.packet_id is None:
            message.packet_id = fwd_injections.gen_injectable_id()
            message.injected = True
        else:
            message.synthetic = True
        # This message wasn't injected by the proxy so we need to rewrite packet IDs
        # to account for IDs the real creator of the packet couldn't have known about.
        elif not message.synthetic:
            # was_dropped needs the unmodified packet ID
            if fwd_injections.was_dropped(message.packet_id) and message.name != "PacketAck":
                logging.warning("Attempting to re-send previously dropped %s:%s, did we ack?" %
                                (message.packet_id, message.name))
            message.packet_id = fwd_injections.get_effective_id(message.packet_id)
            fwd_injections.track_seen(message.packet_id)

        message.finalized = True

        if not message.injected:
            # This message wasn't injected by the proxy so we need to rewrite packet IDs
            # to account for IDs the other parties couldn't have known about.
            message.acks = tuple(
                reverse_injections.get_original_id(x) for x in message.acks
                if not reverse_injections.was_injected(x)
            )

        if message.name == "PacketAck":
            if not self._rewrite_packet_ack(message, reverse_injections):
                logging.debug(f"Dropping {direction} ack for injected packets!")
            if not self._rewrite_packet_ack(message, reverse_injections) and not message.acks:
                logging.debug(f"Dropping {message.direction} ack for injected packets!")
                # Let caller know this shouldn't be sent at all, it's strictly ACKs for
                # injected packets.
                return False
        elif message.name == "StartPingCheck":
            self._rewrite_start_ping_check(message, fwd_injections)

        if not message.acks:
        if message.acks:
            message.send_flags |= PacketFlags.ACK
        else:
            message.send_flags &= ~PacketFlags.ACK
        return True

@@ -100,15 +99,18 @@ class ProxiedCircuit(Circuit):
        new_id = fwd_injections.get_effective_id(orig_id)
        if orig_id != new_id:
            logging.debug("Rewrote oldest unacked %s -> %s" % (orig_id, new_id))
        # Get a list of unacked IDs for the direction this StartPingCheck is heading
        fwd_unacked = (a for (d, a) in self.unacked_reliable.keys() if d == message.direction)
        # Use the proxy's oldest unacked ID if it's older than the client's
        new_id = min((new_id, *fwd_unacked))
        message["PingID"]["OldestUnacked"] = new_id

    def drop_message(self, message: Message, orig_direction=None):
    def drop_message(self, message: Message):
        if message.finalized:
            raise RuntimeError(f"Trying to drop finalized {message!r}")
        if message.packet_id is None:
            return
        orig_direction = orig_direction or message.direction
        fwd_injections, reverse_injections = self._get_injections(orig_direction)
        fwd_injections, reverse_injections = self._get_injections(message.direction)

        fwd_injections.mark_dropped(message.packet_id)
        message.dropped = True
@@ -116,7 +118,7 @@ class ProxiedCircuit(Circuit):

        # Was sent reliably, tell the other end that we saw it and to shut up.
        if message.reliable:
            self.send_acks([message.packet_id], ~orig_direction)
            self.send_acks([message.packet_id], ~message.direction)

        # This packet had acks for the other end, send them in a separate PacketAck
        effective_acks = tuple(
@@ -124,7 +126,7 @@ class ProxiedCircuit(Circuit):
            if not reverse_injections.was_injected(x)
        )
        if effective_acks:
            self.send_acks(effective_acks, orig_direction, packet_id=message.packet_id)
            self.send_acks(effective_acks, message.direction, packet_id=message.packet_id)


class InjectionTracker:
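The ID rewriting above exists because injected packets consume sequence numbers the real endpoints never saw. A toy model of the idea, under that assumption (the real `InjectionTracker` also handles drops and wraparound):

```python
class ToyInjectionTracker:
    def __init__(self) -> None:
        self.injected_ids: set[int] = set()

    def gen_injectable_id(self, last_seen: int) -> int:
        # Claim the next ID after everything the real endpoints have used
        new_id = last_seen + 1 + len(self.injected_ids)
        self.injected_ids.add(new_id)
        return new_id

    def get_effective_id(self, orig_id: int) -> int:
        # A non-injected ID shifts up by the number of IDs we stole below it
        return orig_id + sum(1 for i in self.injected_ids if i <= orig_id)
```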
@@ -308,7 +308,7 @@ class MITMProxyEventManager:
                    return
                parsed = llsd.parse_xml(flow.response.content)
                if "uploader" in parsed:
                    region.register_temporary_cap(cap_data.cap_name + "Uploader", parsed["uploader"])
                    region.register_cap(cap_data.cap_name + "Uploader", parsed["uploader"], CapType.TEMPORARY)
            except:
                logging.exception("OOPS, blew up in HTTP proxy!")


@@ -33,7 +33,7 @@ class HippoHTTPFlow:
        meta.setdefault("can_stream", True)
        meta.setdefault("response_injected", False)
        meta.setdefault("request_injected", False)
        meta.setdefault("cap_data", None)
        meta.setdefault("cap_data", CapData())
        meta.setdefault("from_browser", False)

    @property
@@ -111,6 +111,12 @@ class HippoHTTPFlow:
        self.resumed = True
        self.callback_queue().put(("callback", self.flow.id, self.get_state()))

    def preempt(self):
        # Must be some flow that we previously resumed, we're racing
        # the result from the server end.
        assert not self.taken and self.resumed
        self.callback_queue().put(("preempt", self.flow.id, self.get_state()))

    @property
    def is_replay(self) -> bool:
        return bool(self.flow.is_replay)
@@ -7,6 +7,7 @@ import sys
import queue
import typing
import uuid
import weakref

import mitmproxy.certs
import mitmproxy.ctx
@@ -70,7 +71,7 @@ class SLTlsConfig(mitmproxy.addons.tlsconfig.TlsConfig):
        )
        self.certstore.certs = old_cert_store.certs

    def tls_start_server(self, tls_start: tls.TlsStartData):
    def tls_start_server(self, tls_start: tls.TlsData):
        super().tls_start_server(tls_start)
        # Since 2000 the recommendation per RFCs has been to only check SANs and not the CN field.
        # Most browsers do this, as does mitmproxy. The viewer does not, and the sim certs have no SAN
@@ -99,7 +100,7 @@ class IPCInterceptionAddon:
    """
    def __init__(self, flow_context: HTTPFlowContext):
        self.mitmproxy_ready = flow_context.mitmproxy_ready
        self.intercepted_flows: typing.Dict[str, HTTPFlow] = {}
        self.flows: weakref.WeakValueDictionary[str, HTTPFlow] = weakref.WeakValueDictionary()
        self.from_proxy_queue: multiprocessing.Queue = flow_context.from_proxy_queue
        self.to_proxy_queue: multiprocessing.Queue = flow_context.to_proxy_queue
        self.shutdown_signal: multiprocessing.Event = flow_context.shutdown_signal
@@ -134,8 +135,13 @@ class IPCInterceptionAddon:
                    await asyncio.sleep(0.001)
                    continue
                if event_type == "callback":
                    orig_flow = self.intercepted_flows.pop(flow_id)
                    orig_flow = self.flows[flow_id]
                    orig_flow.set_state(flow_state)
                elif event_type == "preempt":
                    orig_flow = self.flows.get(flow_id)
                    if orig_flow:
                        orig_flow.intercept()
                        orig_flow.set_state(flow_state)
                elif event_type == "replay":
                    flow: HTTPFlow = HTTPFlow.from_state(flow_state)
                    # mitmproxy won't replay intercepted flows, this is an old flow so
@@ -157,8 +163,8 @@ class IPCInterceptionAddon:
        from_browser = "Mozilla" in flow.request.headers.get("User-Agent", "")
        flow.metadata["from_browser"] = from_browser
        # Only trust the "injected" header if not from a browser
        was_injected = flow.request.headers.pop("X-Hippo-Injected", False)
        if was_injected and not from_browser:
        was_injected = flow.request.headers.pop("X-Hippo-Injected", "")
        if was_injected == "1" and not from_browser:
            flow.metadata["request_injected"] = True

        # Does this request need the stupid hack around aiohttp's windows proactor bug
@@ -169,13 +175,13 @@ class IPCInterceptionAddon:

    def _queue_flow_interception(self, event_type: str, flow: HTTPFlow):
        flow.intercept()
        self.intercepted_flows[flow.id] = flow
        self.flows[flow.id] = flow
        self.from_proxy_queue.put((event_type, flow.get_state()), True)

    def responseheaders(self, flow: HTTPFlow):
        # The response was injected by an earlier handler,
        # we don't want to touch this anymore.
        if flow.metadata["response_injected"]:
        if flow.metadata.get("response_injected"):
            return

        # Someone fucked up and put a mimetype in Content-Encoding.
@@ -187,7 +193,7 @@ class IPCInterceptionAddon:

    def response(self, flow: HTTPFlow):
        cap_data: typing.Optional[SerializedCapData] = flow.metadata.get("cap_data")
        if flow.metadata["response_injected"] and cap_data and cap_data.asset_server_cap:
        if flow.metadata.get("response_injected") and cap_data and cap_data.asset_server_cap:
            # Don't bother intercepting asset server requests where we injected a response.
            # We don't want to log them and they don't need any more processing by user hooks.
            return
@@ -197,10 +203,10 @@ class IPCInterceptionAddon:
class SLMITMAddon(IPCInterceptionAddon):
    def responseheaders(self, flow: HTTPFlow):
        super().responseheaders(flow)
        cap_data: typing.Optional[SerializedCapData] = flow.metadata["cap_data_ser"]
        cap_data: typing.Optional[SerializedCapData] = flow.metadata.get("cap_data_ser")

        # Request came from the proxy itself, don't touch it.
        if flow.metadata["request_injected"]:
        if flow.metadata.get("request_injected"):
            return

        # This is an asset server response that we're not interested in intercepting.
@@ -209,7 +215,7 @@ class SLMITMAddon(IPCInterceptionAddon):
        # Can't stream if we injected our own response or we were asked not to stream
        if not flow.metadata["response_injected"] and flow.metadata["can_stream"]:
            flow.response.stream = True
        elif not cap_data and not flow.metadata["from_browser"]:
        elif not cap_data and not flow.metadata.get("from_browser"):
            object_name = flow.response.headers.get("X-SecondLife-Object-Name", "")
            # Meh. Add some fake Cap data for this so it can be matched on.
            if object_name.startswith("#Firestorm LSL Bridge"):
@@ -229,10 +235,6 @@ class SLMITMMaster(mitmproxy.master.Master):
            SLMITMAddon(flow_context),
        )

    def start_server(self):
        self.start()
        asyncio.ensure_future(self.running())


def create_proxy_master(host, port, flow_context: HTTPFlowContext):  # pragma: no cover
    opts = mitmproxy.options.Options()
@@ -1,3 +1,4 @@
import asyncio
import logging
import weakref
from typing import Optional, Tuple
@@ -35,6 +36,18 @@ class InterceptingLLUDPProxyProtocol(UDPProxyProtocol):
        )
        self.message_xml = MessageDotXML()
        self.session: Optional[Session] = None
        loop = asyncio.get_event_loop_policy().get_event_loop()
        self.resend_task = loop.create_task(self.attempt_resends())

    async def attempt_resends(self):
        while True:
            await asyncio.sleep(0.1)
            if self.session is None:
                continue
            for region in self.session.regions:
                if not region.circuit or not region.circuit.is_alive:
                    continue
                region.circuit.resend_unacked()

    def _ensure_message_allowed(self, msg: Message):
        if not self.message_xml.validate_udp_msg(msg.name):
@@ -99,6 +112,9 @@ class InterceptingLLUDPProxyProtocol(UDPProxyProtocol):
            LOG.error("No circuit for %r, dropping packet!" % (packet.far_addr,))
            return

        # Process any ACKs for messages we injected first
        region.circuit.collect_acks(message)

        if message.name == "AgentMovementComplete":
            self.session.main_region = region
            if region.handle is None:
@@ -148,7 +164,7 @@ class InterceptingLLUDPProxyProtocol(UDPProxyProtocol):

        # Send the message if it wasn't explicitly dropped or sent before
        if not message.finalized:
            region.circuit.send_message(message)
            region.circuit.send(message)

    def close(self):
        super().close()
@@ -156,3 +172,4 @@ class InterceptingLLUDPProxyProtocol(UDPProxyProtocol):
        AddonManager.handle_session_closed(self.session)
        self.session_manager.close_session(self.session)
        self.session = None
        self.resend_task.cancel()
@@ -3,7 +3,7 @@ import ast
|
||||
import typing
|
||||
|
||||
from arpeggio import Optional, ZeroOrMore, EOF, \
|
||||
ParserPython, PTNodeVisitor, visit_parse_tree, RegExMatch
|
||||
ParserPython, PTNodeVisitor, visit_parse_tree, RegExMatch, OneOrMore
|
||||
|
||||
|
||||
def literal():
|
||||
@@ -26,7 +26,9 @@ def literal():
|
||||
|
||||
|
||||
def identifier():
|
||||
return RegExMatch(r'[a-zA-Z*]([a-zA-Z0-9_*]+)?')
|
||||
# Identifiers are allowed to have "-". It's not a special character
|
||||
# in our grammar, and we expect them to show up some places, like header names.
|
||||
return RegExMatch(r'[a-zA-Z*]([a-zA-Z0-9_*-]+)?')
|
||||
|
||||
|
||||
def field_specifier():
|
||||
@@ -42,7 +44,7 @@ def unary_expression():
|
||||
|
||||
|
||||
def meta_field_specifier():
|
||||
return "Meta", ".", identifier
|
||||
return "Meta", OneOrMore(".", identifier)
|
||||
|
||||
|
||||
def enum_field_specifier():
|
||||
@@ -69,12 +71,17 @@ def message_filter():
|
||||
return expression, EOF
|
||||
|
||||
|
||||
MATCH_RESULT = typing.Union[bool, typing.Tuple]
|
||||
class MatchResult(typing.NamedTuple):
|
||||
result: bool
|
||||
fields: typing.List[typing.Tuple]
|
||||
|
||||
def __bool__(self):
|
||||
return self.result
|
||||
|
||||
|
||||
class BaseFilterNode(abc.ABC):
|
||||
@abc.abstractmethod
|
||||
def match(self, msg) -> MATCH_RESULT:
|
||||
def match(self, msg, short_circuit=True) -> MatchResult:
|
||||
raise NotImplementedError()
|
||||
|
||||
@property
|
||||
@@ -104,18 +111,36 @@ class BinaryFilterNode(BaseFilterNode, abc.ABC):
|
||||
|
||||
|
||||
class UnaryNotFilterNode(UnaryFilterNode):
|
||||
def match(self, msg) -> MATCH_RESULT:
|
||||
return not self.node.match(msg)
|
||||
def match(self, msg, short_circuit=True) -> MatchResult:
|
||||
# Should we pass fields up here? Maybe not.
|
||||
return MatchResult(not self.node.match(msg, short_circuit), [])
|
||||
|
||||
|
||||
class OrFilterNode(BinaryFilterNode):
|
||||
def match(self, msg) -> MATCH_RESULT:
|
||||
return self.left_node.match(msg) or self.right_node.match(msg)
|
||||
def match(self, msg, short_circuit=True) -> MatchResult:
|
||||
left_match = self.left_node.match(msg, short_circuit)
|
||||
if left_match and short_circuit:
|
||||
return MatchResult(True, left_match.fields)
|
||||
|
||||
right_match = self.right_node.match(msg, short_circuit)
|
||||
if right_match and short_circuit:
|
||||
return MatchResult(True, right_match.fields)
|
||||
|
||||
if left_match or right_match:
|
||||
# Fine since fields should be empty when result=False
|
||||
return MatchResult(True, left_match.fields + right_match.fields)
|
||||
return MatchResult(False, [])
|
||||
|
||||
|
||||
class AndFilterNode(BinaryFilterNode):
|
||||
def match(self, msg) -> MATCH_RESULT:
|
||||
return self.left_node.match(msg) and self.right_node.match(msg)
|
||||
def match(self, msg, short_circuit=True) -> MatchResult:
|
||||
left_match = self.left_node.match(msg, short_circuit)
|
||||
if not left_match:
|
||||
return MatchResult(False, [])
|
||||
right_match = self.right_node.match(msg, short_circuit)
|
||||
if not right_match:
|
||||
return MatchResult(False, [])
|
||||
return MatchResult(True, left_match.fields + right_match.fields)
|
||||
|
||||
|
||||
class MessageFilterNode(BaseFilterNode):
|
||||
@@ -124,15 +149,15 @@ class MessageFilterNode(BaseFilterNode):
|
||||
self.operator = operator
|
||||
self.value = value
|
||||
|
||||
def match(self, msg) -> MATCH_RESULT:
|
||||
return msg.matches(self)
|
||||
def match(self, msg, short_circuit=True) -> MatchResult:
|
||||
return msg.matches(self, short_circuit)
|
||||
|
||||
@property
|
||||
def children(self):
|
||||
return self.selector, self.operator, self.value
|
||||
|
||||
|
||||
class MetaFieldSpecifier(str):
|
||||
class MetaFieldSpecifier(tuple):
|
||||
pass
|
||||
|
||||
|
||||
@@ -158,7 +183,7 @@ class MessageFilterVisitor(PTNodeVisitor):
|
||||
return LiteralValue(ast.literal_eval(node.value))
|
||||
|
||||
def visit_meta_field_specifier(self, _node, children):
|
||||
return MetaFieldSpecifier(children[0])
|
||||
return MetaFieldSpecifier(children)
|
||||
|
||||
def visit_enum_field_specifier(self, _node, children):
|
||||
return EnumFieldSpecifier(*children)
|
||||
|
||||
@@ -21,7 +21,7 @@ from hippolyzer.lib.base.datatypes import TaggedUnion, UUID, TupleCoord
|
||||
from hippolyzer.lib.base.helpers import bytes_escape
|
||||
from hippolyzer.lib.base.message.message_formatting import HumanMessageSerializer
|
||||
from hippolyzer.lib.proxy.message_filter import MetaFieldSpecifier, compile_filter, BaseFilterNode, MessageFilterNode, \
|
||||
EnumFieldSpecifier
|
||||
EnumFieldSpecifier, MatchResult
|
||||
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
|
||||
from hippolyzer.lib.proxy.caps import CapType, SerializedCapData
|
||||
|
||||
@@ -235,7 +235,7 @@ class AbstractMessageLogEntry(abc.ABC):
|
||||
obj = self.region.objects.lookup_localid(selected_local)
|
||||
return obj and obj.FullID
|
||||
|
||||
def _get_meta(self, name: str):
|
||||
def _get_meta(self, name: str) -> typing.Any:
|
||||
# Slight difference in semantics. Filters are meant to return the same
|
||||
# thing no matter when they're run, so SelectedLocal and friends resolve
|
||||
# to the selected items _at the time the message was logged_. To handle
|
||||
@@ -308,7 +308,9 @@ class AbstractMessageLogEntry(abc.ABC):
|
||||
|
||||
def _val_matches(self, operator, val, expected):
|
||||
if isinstance(expected, MetaFieldSpecifier):
|
||||
expected = self._get_meta(str(expected))
|
||||
if len(expected) != 1:
|
||||
raise ValueError(f"Can only support single-level Meta specifiers, not {expected!r}")
|
||||
expected = self._get_meta(str(expected[0]))
|
||||
if not isinstance(expected, (int, float, bytes, str, type(None), tuple)):
|
||||
if callable(expected):
|
||||
expected = expected()
|
||||
@@ -362,12 +364,18 @@ class AbstractMessageLogEntry(abc.ABC):
|
||||
if matcher.value or matcher.operator:
|
||||
return False
|
||||
return self._packet_root_matches(matcher.selector[0])
|
||||
if len(matcher.selector) == 2 and matcher.selector[0] == "Meta":
|
||||
return self._val_matches(matcher.operator, self._get_meta(matcher.selector[1]), matcher.value)
|
||||
if matcher.selector[0] == "Meta":
|
||||
if len(matcher.selector) == 2:
|
||||
return self._val_matches(matcher.operator, self._get_meta(matcher.selector[1]), matcher.value)
|
||||
elif len(matcher.selector) == 3:
|
||||
meta_dict = self._get_meta(matcher.selector[1])
|
||||
if not meta_dict or not hasattr(meta_dict, 'get'):
|
||||
return False
|
||||
return self._val_matches(matcher.operator, meta_dict.get(matcher.selector[2]), matcher.value)
|
||||
return None
|
||||
|
||||
def matches(self, matcher: "MessageFilterNode"):
|
||||
return self._base_matches(matcher) or False
|
||||
def matches(self, matcher: "MessageFilterNode", short_circuit=True) -> "MatchResult":
|
||||
return MatchResult(self._base_matches(matcher) or False, [])
|
||||
|
||||
@property
|
||||
def seq(self):
|
||||
@@ -388,6 +396,14 @@ class AbstractMessageLogEntry(abc.ABC):
|
||||
xmlified = re.sub(rb" <key>", b"<key>", xmlified)
|
||||
return xmlified.decode("utf8", errors="replace")
|
||||
|
||||
@staticmethod
|
||||
def _format_xml(content):
|
||||
beautified = minidom.parseString(content).toprettyxml(indent=" ")
|
||||
# kill blank lines. will break cdata sections. meh.
|
||||
beautified = re.sub(r'\n\s*\n', '\n', beautified, flags=re.MULTILINE)
|
||||
return re.sub(r'<([\w]+)>\s*</\1>', r'<\1></\1>',
|
||||
beautified, flags=re.MULTILINE)
|
||||
|
||||
|
||||
class HTTPMessageLogEntry(AbstractMessageLogEntry):
|
||||
__slots__ = ["flow"]
|
||||
@@ -400,7 +416,7 @@ class HTTPMessageLogEntry(AbstractMessageLogEntry):
|
||||
|
||||
super().__init__(region, session)
|
||||
# This was a request the proxy made through itself
|
||||
self.meta["Injected"] = flow.request_injected
|
||||
self.meta["Synthetic"] = flow.request_injected
|
||||
|
||||
@property
|
||||
def type(self):
|
||||
@@ -476,13 +492,17 @@ class HTTPMessageLogEntry(AbstractMessageLogEntry):
|
||||
if not beautified:
|
||||
content_type = self._guess_content_type(message)
|
||||
if content_type.startswith("application/llsd"):
|
||||
beautified = self._format_llsd(llsd.parse(message.content))
|
||||
try:
|
||||
beautified = self._format_llsd(llsd.parse(message.content))
|
||||
except llsd.LLSDParseError:
|
||||
# Sometimes LL sends plain XML with a Content-Type of application/llsd+xml.
|
||||
# Try to detect that case and work around it
|
||||
if content_type == "application/llsd+xml" and message.content.startswith(b'<'):
|
||||
beautified = self._format_xml(message.content)
|
||||
else:
|
||||
raise
|
||||
elif any(content_type.startswith(x) for x in ("application/xml", "text/xml")):
|
||||
beautified = minidom.parseString(message.content).toprettyxml(indent=" ")
|
||||
# kill blank lines. will break cdata sections. meh.
|
||||
beautified = re.sub(r'\n\s*\n', '\n', beautified, flags=re.MULTILINE)
|
||||
beautified = re.sub(r'<([\w]+)>\s*</\1>', r'<\1></\1>',
|
||||
beautified, flags=re.MULTILINE)
|
||||
beautified = self._format_xml(message.content)
|
||||
except:
|
||||
LOG.exception("Failed to beautify message")
|
||||
|
||||
@@ -541,6 +561,20 @@ class HTTPMessageLogEntry(AbstractMessageLogEntry):
|
||||
return "application/xml"
|
||||
return content_type
|
||||
|
||||
def _get_meta(self, name: str) -> typing.Any:
|
||||
lower_name = name.lower()
|
||||
if lower_name == "url":
|
||||
return self.flow.request.url
|
||||
elif lower_name == "reqheaders":
|
||||
return self.flow.request.headers
|
||||
elif lower_name == "respheaders":
|
||||
return self.flow.response.headers
|
||||
elif lower_name == "host":
|
||||
return self.flow.request.host.lower()
|
||||
elif lower_name == "status":
|
||||
return self.flow.response.status_code
|
||||
return super()._get_meta(name)
|
||||
|
||||
def to_dict(self):
|
||||
val = super().to_dict()
|
||||
val['flow'] = self.flow.get_state()
|
||||
@@ -613,7 +647,7 @@ class LLUDPMessageLogEntry(AbstractMessageLogEntry):
|
||||
super().__init__(region, session)
|
||||
|
||||
_MESSAGE_META_ATTRS = {
|
||||
"Injected", "Dropped", "Extra", "Resent", "Zerocoded", "Acks", "Reliable",
|
||||
"Synthetic", "Dropped", "Extra", "Resent", "Zerocoded", "Acks", "Reliable",
|
||||
}
|
||||
|
||||
def _get_meta(self, name: str):
|
||||
@@ -671,20 +705,21 @@ class LLUDPMessageLogEntry(AbstractMessageLogEntry):
|
||||
def request(self, beautify=False, replacements=None):
|
||||
return HumanMessageSerializer.to_human_string(self.message, replacements, beautify)
|
||||
|
||||
def matches(self, matcher):
|
||||
def matches(self, matcher, short_circuit=True) -> "MatchResult":
|
||||
base_matched = self._base_matches(matcher)
|
||||
if base_matched is not None:
|
||||
return base_matched
|
||||
return MatchResult(base_matched, [])
|
||||
|
||||
if not self._packet_root_matches(matcher.selector[0]):
|
||||
return False
|
||||
return MatchResult(False, [])
|
||||
|
||||
message = self.message
|
||||
|
||||
selector_len = len(matcher.selector)
|
||||
# name, block_name, var_name(, subfield_name)?
|
||||
if selector_len not in (3, 4):
|
||||
return False
|
||||
return MatchResult(False, [])
|
||||
found_field_keys = []
|
||||
for block_name in message.blocks:
|
||||
if not fnmatch.fnmatchcase(block_name, matcher.selector[1]):
|
||||
continue
|
||||
@@ -693,13 +728,13 @@ class LLUDPMessageLogEntry(AbstractMessageLogEntry):
|
||||
if not fnmatch.fnmatchcase(var_name, matcher.selector[2]):
|
||||
continue
|
||||
# So we know where the match happened
|
||||
span_key = (message.name, block_name, block_num, var_name)
|
||||
field_key = (message.name, block_name, block_num, var_name)
|
||||
if selector_len == 3:
|
||||
# We're just matching on the var existing, not having any particular value
|
||||
if matcher.value is None:
|
||||
return span_key
|
||||
if self._val_matches(matcher.operator, block[var_name], matcher.value):
|
||||
return span_key
|
||||
found_field_keys.append(field_key)
|
||||
elif self._val_matches(matcher.operator, block[var_name], matcher.value):
|
||||
found_field_keys.append(field_key)
|
||||
# Need to invoke a special unpacker
|
||||
elif selector_len == 4:
|
||||
try:
|
||||
@@ -710,15 +745,21 @@ class LLUDPMessageLogEntry(AbstractMessageLogEntry):
|
||||
if isinstance(deserialized, TaggedUnion):
|
||||
deserialized = deserialized.value
|
||||
if not isinstance(deserialized, dict):
|
||||
return False
|
||||
continue
|
||||
for key in deserialized.keys():
|
||||
if fnmatch.fnmatchcase(str(key), matcher.selector[3]):
|
||||
if matcher.value is None:
|
||||
return span_key
|
||||
if self._val_matches(matcher.operator, deserialized[key], matcher.value):
|
||||
return span_key
|
||||
# Short-circuiting checking individual subfields is fine since
|
||||
# we only highlight fields anyway.
|
||||
found_field_keys.append(field_key)
|
||||
break
|
||||
elif self._val_matches(matcher.operator, deserialized[key], matcher.value):
|
||||
found_field_keys.append(field_key)
|
||||
break
|
||||
|
||||
return False
|
||||
if short_circuit and found_field_keys:
|
||||
return MatchResult(True, found_field_keys)
|
||||
return MatchResult(bool(found_field_keys), found_field_keys)
|
||||
|
||||
@property
|
||||
def summary(self):
|
||||
|
||||
@@ -11,7 +11,7 @@ from hippolyzer.lib.base.templates import PCode
|
||||
from hippolyzer.lib.client.namecache import NameCache
|
||||
from hippolyzer.lib.client.object_manager import (
|
||||
ClientObjectManager,
|
||||
UpdateType, ClientWorldObjectManager,
|
||||
ObjectUpdateType, ClientWorldObjectManager,
|
||||
)
|
||||
|
||||
from hippolyzer.lib.base.objects import Object
|
||||
@@ -63,18 +63,25 @@ class ProxyObjectManager(ClientObjectManager):
|
||||
cache_dir=self._region.session().cache_dir,
|
||||
)
|
||||
|
||||
def request_missed_cached_objects_soon(self):
|
||||
def request_missed_cached_objects_soon(self, report_only=False):
|
||||
if self._cache_miss_timer:
|
||||
self._cache_miss_timer.cancel()
|
||||
# Basically debounce. Will only trigger 0.2 seconds after the last time it's invoked to
|
||||
# deal with the initial flood of ObjectUpdateCached and the natural lag time between that
|
||||
# and the viewers' RequestMultipleObjects messages
|
||||
self._cache_miss_timer = asyncio.get_event_loop().call_later(
|
||||
0.2, self._request_missed_cached_objects)
|
||||
loop = asyncio.get_event_loop_policy().get_event_loop()
|
||||
self._cache_miss_timer = loop.call_later(0.2, self._request_missed_cached_objects, report_only)
|
||||
|
||||
def _request_missed_cached_objects(self):
|
||||
def _request_missed_cached_objects(self, report_only: bool):
|
||||
self._cache_miss_timer = None
|
||||
self.request_objects(self.queued_cache_misses)
|
||||
if not self.queued_cache_misses:
|
||||
# All the queued cache misses ended up being satisfied without us
|
||||
# having to request them, no need to fire off a request.
|
||||
return
|
||||
if report_only:
|
||||
print(f"Would have automatically requested {self.queued_cache_misses!r}")
|
||||
else:
|
||||
self.request_objects(self.queued_cache_misses)
|
||||
self.queued_cache_misses.clear()
|
||||
|
||||
def clear(self):
|
||||
@@ -110,9 +117,12 @@ class ProxyWorldObjectManager(ClientWorldObjectManager):
|
||||
)
|
||||
|
||||
def _handle_object_update_cached_misses(self, region_handle: int, missing_locals: Set[int]):
|
||||
region_mgr: Optional[ProxyObjectManager] = self._get_region_manager(region_handle)
|
||||
if not self._settings.ALLOW_AUTO_REQUEST_OBJECTS:
|
||||
return
|
||||
if self._settings.AUTOMATICALLY_REQUEST_MISSING_OBJECTS:
|
||||
if self._settings.USE_VIEWER_OBJECT_CACHE:
|
||||
region_mgr.queued_cache_misses |= missing_locals
|
||||
region_mgr.request_missed_cached_objects_soon(report_only=True)
|
||||
elif self._settings.AUTOMATICALLY_REQUEST_MISSING_OBJECTS:
|
||||
# Schedule these local IDs to be requested soon if the viewer doesn't request
|
||||
# them itself. Ideally we could just mutate the CRC of the ObjectUpdateCached
|
||||
# to force a CRC cache miss in the viewer, but that appears to cause the viewer
|
||||
@@ -123,16 +133,16 @@ class ProxyWorldObjectManager(ClientWorldObjectManager):
|
||||
region_mgr.queued_cache_misses |= missing_locals
|
||||
region_mgr.request_missed_cached_objects_soon()
|
||||
|
||||
def _run_object_update_hooks(self, obj: Object, updated_props: Set[str], update_type: UpdateType):
|
||||
def _run_object_update_hooks(self, obj: Object, updated_props: Set[str], update_type: ObjectUpdateType):
|
||||
super()._run_object_update_hooks(obj, updated_props, update_type)
|
||||
region = self._session.region_by_handle(obj.RegionHandle)
|
||||
if self._settings.ALLOW_AUTO_REQUEST_OBJECTS:
|
||||
if obj.PCode == PCode.AVATAR and "ParentID" in updated_props:
|
||||
if obj.ParentID and not region.objects.lookup_localid(obj.ParentID):
|
||||
# If an avatar just sat on an object we don't know about, add it to the queued
|
||||
# cache misses and request if if the viewer doesn't. This should happen
|
||||
# regardless of the auto-request object setting because otherwise we have no way
|
||||
# to get a sitting agent's true region location, even if it's ourself.
|
||||
# cache misses and request it if the viewer doesn't. This should happen
|
||||
# regardless of the auto-request missing objects setting because otherwise we
|
||||
# have no way to get a sitting agent's true region location, even if it's ourselves.
|
||||
region.objects.queued_cache_misses.add(obj.ParentID)
|
||||
region.objects.request_missed_cached_objects_soon()
|
||||
AddonManager.handle_object_updated(self._session, region, obj, updated_props)
|
||||
|
||||
@@ -11,7 +11,8 @@ import multidict
|
||||
|
||||
from hippolyzer.lib.base.datatypes import Vector3, UUID
|
||||
from hippolyzer.lib.base.helpers import proxify
|
||||
from hippolyzer.lib.base.message.message import Message
|
||||
from hippolyzer.lib.base.message.llsd_msg_serializer import LLSDMessageSerializer
|
||||
from hippolyzer.lib.base.message.message import Message, Block
|
||||
from hippolyzer.lib.base.message.message_handler import MessageHandler
|
||||
from hippolyzer.lib.base.objects import handle_to_global_pos
|
||||
from hippolyzer.lib.client.state import BaseClientRegion
|
||||
@@ -110,7 +111,8 @@ class ProxiedRegion(BaseClientRegion):
|
||||
Wrap an existing, non-unique cap with a unique URL
|
||||
|
||||
caps like ViewerAsset may be the same globally and wouldn't let us infer
|
||||
which session / region the request was related to without a wrapper
|
||||
which session / region the request was related to without a wrapper URL
|
||||
that we inject into the seed response sent to the viewer.
|
||||
"""
|
||||
parsed = list(urllib.parse.urlsplit(self.caps[name][1]))
|
||||
seed_id = self.caps["Seed"][1].split("/")[-1].encode("utf8")
|
||||
@@ -120,22 +122,24 @@ class ProxiedRegion(BaseClientRegion):
|
||||
# to be secure. This should save on expensive TLS context setup for each req.
|
||||
parsed[0] = "http"
|
||||
wrapper_url = urllib.parse.urlunsplit(parsed)
|
||||
self.caps.add(name + "ProxyWrapper", (CapType.WRAPPER, wrapper_url))
|
||||
self._recalc_caps()
|
||||
# Register it with "ProxyWrapper" appended so we don't shadow the real cap URL
|
||||
# in our own view of the caps
|
||||
self.register_cap(name + "ProxyWrapper", wrapper_url, CapType.WRAPPER)
|
||||
return wrapper_url
|
||||
|
||||
def register_proxy_cap(self, name: str):
|
||||
"""
|
||||
Register a cap to be completely handled by the proxy
|
||||
"""
|
||||
"""Register a cap to be completely handled by the proxy"""
|
||||
if name in self.caps:
|
||||
# If we have an existing cap then we should just use that.
|
||||
cap_data = self.caps[name]
|
||||
if cap_data[1] == CapType.PROXY_ONLY:
|
||||
return cap_data[0]
|
||||
cap_url = f"http://{uuid.uuid4()!s}.caps.hippo-proxy.localhost"
|
||||
self.caps.add(name, (CapType.PROXY_ONLY, cap_url))
|
||||
self._recalc_caps()
|
||||
self.register_cap(name, cap_url, CapType.PROXY_ONLY)
|
||||
return cap_url
|
||||
|
||||
def register_temporary_cap(self, name: str, cap_url: str):
|
||||
"""Register a Cap that only has meaning the first time it's used"""
|
||||
self.caps.add(name, (CapType.TEMPORARY, cap_url))
|
||||
def register_cap(self, name: str, cap_url: str, cap_type: CapType = CapType.NORMAL):
|
||||
self.caps.add(name, (cap_type, cap_url))
|
||||
self._recalc_caps()
|
||||
|
||||
def resolve_cap(self, url: str, consume=True) -> Optional[Tuple[str, str, CapType]]:
|
||||
@@ -169,9 +173,26 @@ class EventQueueManager:
|
||||
self._region = weakref.proxy(region)
|
||||
self._last_ack: Optional[int] = None
|
||||
self._last_payload: Optional[Any] = None
|
||||
self.llsd_message_serializer = LLSDMessageSerializer()
|
||||
|
||||
def inject_message(self, message: Message):
|
||||
self.inject_event(self.llsd_message_serializer.serialize(message, True))
|
||||
|
||||
def inject_event(self, event: dict):
|
||||
self._queued_events.append(event)
|
||||
if self._region:
|
||||
circuit: ProxiedCircuit = self._region.circuit
|
||||
session: Session = self._region.session()
|
||||
# Inject an outbound PlacesQuery message so we can trigger an inbound PlacesReply
|
||||
# over the EQ. That will allow us to shove our own event onto the response once it comes in,
|
||||
# otherwise we have to wait until the EQ legitimately returns 200 due to a new event.
|
||||
# May or may not work in OpenSim.
|
||||
circuit.send_message(Message(
|
||||
'PlacesQuery',
|
||||
Block('AgentData', AgentID=session.agent_id, SessionID=session.id, QueryID=UUID()),
|
||||
Block('TransactionData', TransactionID=UUID()),
|
||||
Block('QueryData', QueryText=b'', QueryFlags=64, Category=-1, SimName=b''),
|
||||
))
|
||||
|
||||
def take_injected_events(self):
|
||||
events = self._queued_events
|
||||
|
||||
@@ -63,8 +63,14 @@ class TaskScheduler:
|
||||
def shutdown(self):
|
||||
for task_data, task in self.tasks:
|
||||
task.cancel()
|
||||
await_all = asyncio.gather(*(task for task_data, task in self.tasks))
|
||||
asyncio.get_event_loop().run_until_complete(await_all)
|
||||
|
||||
try:
|
||||
event_loop = asyncio.get_running_loop()
|
||||
await_all = asyncio.gather(*(task for task_data, task in self.tasks))
|
||||
event_loop.run_until_complete(await_all)
|
||||
except RuntimeError:
|
||||
pass
|
||||
self.tasks.clear()
|
||||
|
||||
def _task_done(self, task: asyncio.Task):
|
||||
for task_details in reversed(self.tasks):
|
||||
|
||||
@@ -14,7 +14,7 @@ from hippolyzer.lib.proxy.transport import SOCKS5UDPTransport
|
||||
|
||||
|
||||
class BaseProxyTest(unittest.IsolatedAsyncioTestCase):
|
||||
def setUp(self) -> None:
|
||||
async def asyncSetUp(self) -> None:
|
||||
self.client_addr = ("127.0.0.1", 1)
|
||||
self.region_addr = ("127.0.0.1", 3)
|
||||
self.circuit_code = 1234
|
||||
@@ -37,6 +37,9 @@ class BaseProxyTest(unittest.IsolatedAsyncioTestCase):
|
||||
self.serializer = UDPMessageSerializer()
|
||||
self.session.objects.track_region_objects(123)
|
||||
|
||||
def tearDown(self) -> None:
|
||||
self.protocol.close()
|
||||
|
||||
async def _wait_drained(self):
|
||||
await asyncio.sleep(0.001)
|
||||
|
||||
|
||||
@@ -1,7 +1,6 @@
|
||||
import abc
|
||||
|
||||
from mitmproxy.addons import asgiapp
|
||||
from mitmproxy.controller import DummyReply
|
||||
|
||||
from hippolyzer.lib.proxy.addon_utils import BaseAddon
|
||||
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
|
||||
@@ -11,13 +10,7 @@ from hippolyzer.lib.proxy.sessions import Session, SessionManager
|
||||
|
||||
async def serve(app, flow: HippoHTTPFlow):
|
||||
"""Serve a request based on a Hippolyzer HTTP flow using a provided app"""
|
||||
# Shove this on mitmproxy's flow object so asgiapp doesn't explode when it tries
|
||||
# to commit the flow reply. Our take / commit semantics are different than mitmproxy
|
||||
# proper, so we ignore what mitmproxy sets here anyhow.
|
||||
flow.flow.reply = DummyReply()
|
||||
flow.flow.reply.take()
|
||||
await asgiapp.serve(app, flow.flow)
|
||||
flow.flow.reply = None
|
||||
# Send the modified flow object back to mitmproxy
|
||||
flow.resume()
|
||||
|
||||
@@ -34,7 +27,10 @@ class WebAppCapAddon(BaseAddon, abc.ABC):
|
||||
def handle_region_registered(self, session: Session, region: ProxiedRegion):
|
||||
# Register a fake URL for our cap. This will add the cap URL to the Seed
|
||||
# response that gets sent back to the client if that cap name was requested.
|
||||
if self.CAP_NAME not in region.cap_urls:
|
||||
region.register_proxy_cap(self.CAP_NAME)
|
||||
|
||||
def handle_session_init(self, session: Session):
|
||||
for region in session.regions:
|
||||
region.register_proxy_cap(self.CAP_NAME)
|
||||
|
||||
def handle_http_request(self, session_manager: SessionManager, flow: HippoHTTPFlow):
|
||||
|
||||
@@ -11,7 +11,7 @@ certifi==2021.10.8
|
||||
cffi==1.15.0
|
||||
charset-normalizer==2.0.9
|
||||
click==8.0.3
|
||||
cryptography==3.4.8
|
||||
cryptography==36.0.2
|
||||
defusedxml==0.7.1
|
||||
Flask==2.0.2
|
||||
frozenlist==1.2.0
|
||||
@@ -30,7 +30,7 @@ ldap3==2.9.1
|
||||
llbase==1.2.11
|
||||
lxml==4.6.4
|
||||
MarkupSafe==2.0.1
|
||||
mitmproxy==7.0.4
|
||||
mitmproxy==8.0.0
|
||||
msgpack==1.0.3
|
||||
multidict==5.2.0
|
||||
numpy==1.21.4
|
||||
@@ -42,8 +42,9 @@ ptpython==3.0.20
|
||||
publicsuffix2==2.20191221
|
||||
pyasn1==0.4.8
|
||||
pycparser==2.21
|
||||
pycollada==0.7.2
|
||||
Pygments==2.10.0
|
||||
pyOpenSSL==20.0.1
|
||||
pyOpenSSL==22.0.0
|
||||
pyparsing==2.4.7
|
||||
pyperclip==1.8.2
|
||||
PySide6==6.2.2
|
||||
@@ -56,6 +57,7 @@ shiboken6==6.2.2
|
||||
six==1.16.0
|
||||
sortedcontainers==2.4.0
|
||||
tornado==6.1
|
||||
transformations==2021.6.6
|
||||
typing-extensions==4.0.1
|
||||
urllib3==1.26.7
|
||||
urwid==2.1.2
|
||||
|
||||
8
setup.py
8
setup.py
@@ -25,7 +25,7 @@ from setuptools import setup, find_packages
|
||||
|
||||
here = path.abspath(path.dirname(__file__))
|
||||
|
||||
version = '0.8.0'
|
||||
version = '0.11.2'
|
||||
|
||||
with open(path.join(here, 'README.md')) as readme_fh:
|
||||
readme = readme_fh.read()
|
||||
@@ -67,6 +67,7 @@ setup(
|
||||
'lib/base/data/static_data.db2',
|
||||
'lib/base/data/static_index.db2',
|
||||
'lib/base/data/avatar_lad.xml',
|
||||
'lib/base/data/male_collada_joints.xml',
|
||||
'lib/base/data/avatar_skeleton.xml',
|
||||
'lib/base/data/LICENSE-artwork.txt',
|
||||
],
|
||||
@@ -89,7 +90,7 @@ setup(
|
||||
# requests breaks with newer idna
|
||||
'idna<3,>=2.5',
|
||||
# 7.x will be a major change.
|
||||
'mitmproxy>=7.0.2,<8.0',
|
||||
'mitmproxy>=8.0.0,<8.1',
|
||||
# For REPLs
|
||||
'ptpython<4.0',
|
||||
# JP2 codec
|
||||
@@ -98,6 +99,9 @@ setup(
|
||||
# These could be in extras_require if you don't want a GUI.
|
||||
'pyside6',
|
||||
'qasync',
|
||||
# Needed for mesh format conversion tooling
|
||||
'pycollada',
|
||||
'transformations',
|
||||
],
|
||||
tests_require=[
|
||||
"pytest",
|
||||
|
||||
@@ -113,7 +113,7 @@ executables = [
|
||||
|
||||
setup(
|
||||
name="hippolyzer_gui",
|
||||
version="0.8.0",
|
||||
version="0.9.0",
|
||||
description="Hippolyzer GUI",
|
||||
options=options,
|
||||
executables=executables,
|
||||
|
||||
BIN
static/repl_screenshot.png
Normal file
BIN
static/repl_screenshot.png
Normal file
Binary file not shown.
|
After Width: | Height: | Size: 42 KiB |
@@ -134,3 +134,15 @@ class TestDatatypes(unittest.TestCase):
|
||||
val = llsd.parse_binary(llsd.format_binary(orig))
|
||||
self.assertIsInstance(val, UUID)
|
||||
self.assertEqual(orig, val)
|
||||
|
||||
def test_jank_stringy_bytes(self):
|
||||
val = JankStringyBytes(b"foo\x00")
|
||||
self.assertTrue("o" in val)
|
||||
self.assertTrue(b"o" in val)
|
||||
self.assertFalse(b"z" in val)
|
||||
self.assertFalse("z" in val)
|
||||
self.assertEqual("foo", val)
|
||||
self.assertEqual(b"foo\x00", val)
|
||||
self.assertNotEqual(b"foo", val)
|
||||
self.assertEqual(b"foo", JankStringyBytes(b"foo"))
|
||||
self.assertEqual("foo", JankStringyBytes(b"foo"))
|
||||
|
||||
@@ -1,3 +1,4 @@
|
||||
import copy
|
||||
import unittest
|
||||
|
||||
from hippolyzer.lib.base.datatypes import *
|
||||
@@ -44,27 +45,58 @@ SIMPLE_INV = """\tinv_object\t0
|
||||
|
||||
|
||||
class TestLegacyInv(unittest.TestCase):
|
||||
def setUp(self) -> None:
|
||||
self.model = InventoryModel.from_str(SIMPLE_INV)
|
||||
|
||||
def test_parse(self):
|
||||
model = InventoryModel.from_str(SIMPLE_INV)
|
||||
self.assertTrue(UUID('f4d91477-def1-487a-b4f3-6fa201c17376') in model.containers)
|
||||
self.assertIsNotNone(model.root)
|
||||
self.assertTrue(UUID('f4d91477-def1-487a-b4f3-6fa201c17376') in self.model.nodes)
|
||||
self.assertIsNotNone(self.model.root)
|
||||
|
||||
def test_serialize(self):
|
||||
model = InventoryModel.from_str(SIMPLE_INV)
|
||||
new_model = InventoryModel.from_str(model.to_str())
|
||||
self.assertEqual(model, new_model)
|
||||
self.model = InventoryModel.from_str(SIMPLE_INV)
|
||||
new_model = InventoryModel.from_str(self.model.to_str())
|
||||
self.assertEqual(self.model, new_model)
|
||||
|
||||
def test_item_access(self):
|
||||
model = InventoryModel.from_str(SIMPLE_INV)
|
||||
item = model.items[UUID('dd163122-946b-44df-99f6-a6030e2b9597')]
|
||||
item = self.model.nodes[UUID('dd163122-946b-44df-99f6-a6030e2b9597')]
|
||||
self.assertEqual(item.name, "New Script")
|
||||
self.assertEqual(item.sale_info.sale_type, "not")
|
||||
self.assertEqual(item.model, model)
|
||||
self.assertEqual(item.model, self.model)
|
||||
|
||||
def test_access_children(self):
|
||||
root = self.model.root
|
||||
item = tuple(self.model.ordered_nodes)[1]
|
||||
self.assertEqual((item,), root.children)
|
||||
|
||||
def test_access_parent(self):
|
||||
root = self.model.root
|
||||
item = tuple(self.model.ordered_nodes)[1]
|
||||
self.assertEqual(root, item.parent)
|
||||
self.assertEqual(None, root.parent)
|
||||
|
||||
def test_unlink(self):
|
||||
self.assertEqual(1, len(self.model.root.children))
|
||||
item = tuple(self.model.ordered_nodes)[1]
|
||||
self.assertEqual([item], item.unlink())
|
||||
self.assertEqual(0, len(self.model.root.children))
|
||||
self.assertEqual(None, item.model)
|
||||
|
||||
def test_relink(self):
|
||||
item = tuple(self.model.ordered_nodes)[1]
|
||||
for unlinked in item.unlink():
|
||||
self.model.add(unlinked)
|
||||
self.assertEqual(self.model, item.model)
|
||||
self.assertEqual(1, len(self.model.root.children))
|
||||
|
||||
def test_eq_excludes_model(self):
|
||||
item = tuple(self.model.ordered_nodes)[1]
|
||||
item_copy = copy.copy(item)
|
||||
item_copy.model = None
|
||||
self.assertEqual(item, item_copy)
|
||||
|
||||
def test_llsd_serialization(self):
|
||||
model = InventoryModel.from_str(SIMPLE_INV)
|
||||
self.assertEqual(
|
||||
model.to_llsd(),
|
||||
self.model.to_llsd(),
|
||||
[
|
||||
{
|
||||
'name': 'Contents',
|
||||
@@ -102,9 +134,34 @@ class TestLegacyInv(unittest.TestCase):
|
||||
)
|
||||
|
||||
def test_llsd_legacy_equality(self):
|
||||
model = InventoryModel.from_str(SIMPLE_INV)
|
||||
new_model = InventoryModel.from_llsd(model.to_llsd())
|
||||
self.assertEqual(model, new_model)
|
||||
new_model = InventoryModel.from_llsd(self.model.to_llsd())
|
||||
self.assertEqual(self.model, new_model)
|
||||
new_model.root.name = "foo"
|
||||
self.assertNotEqual(self.model, new_model)
|
||||
|
||||
def test_difference_added(self):
|
||||
new_model = InventoryModel.from_llsd(self.model.to_llsd())
|
||||
diff = self.model.get_differences(new_model)
|
||||
self.assertEqual([], diff.changed)
|
||||
self.assertEqual([], diff.removed)
|
||||
|
||||
new_model.root.name = "foo"
|
||||
diff = self.model.get_differences(new_model)
|
||||
self.assertEqual([new_model.root], diff.changed)
|
||||
self.assertEqual([], diff.removed)
|
||||
|
||||
item = new_model.root.children[0]
|
||||
item.unlink()
|
||||
diff = self.model.get_differences(new_model)
|
||||
self.assertEqual([new_model.root], diff.changed)
|
||||
self.assertEqual([item], diff.removed)
|
||||
|
||||
new_item = copy.copy(item)
|
||||
new_item.node_id = UUID.random()
|
||||
new_model.add(new_item)
|
||||
diff = self.model.get_differences(new_model)
|
||||
self.assertEqual([new_model.root, new_item], diff.changed)
|
||||
self.assertEqual([item], diff.removed)
|
||||
|
||||
|
||||
GIRL_NEXT_DOOR_SHAPE = """LLWearable version 22
|
||||
|
||||
@@ -300,3 +300,14 @@ class HumanReadableMessageTests(unittest.TestCase):
|
||||
|
||||
with self.assertRaises(ValueError):
|
||||
HumanMessageSerializer.from_human_string(val)
|
||||
|
||||
def test_flags(self):
|
||||
val = """
|
||||
OUT FooMessage [ZEROCODED] [RELIABLE] [1]
|
||||
|
||||
[SomeBlock]
|
||||
foo = 1
|
||||
"""
|
||||
|
||||
msg = HumanMessageSerializer.from_human_string(val)
|
||||
self.assertEqual(HumanMessageSerializer.to_human_string(msg).strip(), val.strip())
|
||||
|
||||
@@ -28,7 +28,8 @@ class MockHandlingCircuit(ProxiedCircuit):
|
||||
self.handler = handler
|
||||
|
||||
def _send_prepared_message(self, message: Message, transport=None):
|
||||
asyncio.get_event_loop().call_soon(self.handler.handle, message)
|
||||
loop = asyncio.get_event_loop_policy().get_event_loop()
|
||||
loop.call_soon(self.handler.handle, message)
|
||||
|
||||
|
||||
class MockConnectionHolder(ConnectionHolder):
|
||||
@@ -70,7 +71,7 @@ class XferManagerTests(BaseTransferTests):
|
||||
manager = XferManager(self.server_connection)
|
||||
xfer = await manager.request(vfile_id=asset_id, vfile_type=AssetType.BODYPART)
|
||||
self.received_bytes = xfer.reassemble_chunks()
|
||||
self.server_circuit.send_message(Message(
|
||||
self.server_circuit.send(Message(
|
||||
"AssetUploadComplete",
|
||||
Block("AssetBlock", UUID=asset_id, Type=asset_block["Type"], Success=True),
|
||||
direction=Direction.IN,
|
||||
@@ -109,7 +110,7 @@ class TestTransferManager(BaseTransferTests):
|
||||
self.assertEqual(EstateAssetType.COVENANT, params.EstateAssetType)
|
||||
data = self.LARGE_PAYLOAD
|
||||
|
||||
self.server_circuit.send_message(Message(
|
||||
self.server_circuit.send(Message(
|
||||
'TransferInfo',
|
||||
Block(
|
||||
'TransferInfo',
|
||||
@@ -125,7 +126,7 @@ class TestTransferManager(BaseTransferTests):
|
||||
while True:
|
||||
chunk = data[:1000]
|
||||
data = data[1000:]
|
||||
self.server_circuit.send_message(Message(
|
||||
self.server_circuit.send(Message(
|
||||
'TransferPacket',
|
||||
Block(
|
||||
'TransferData',
|
||||
|
||||
@@ -62,8 +62,8 @@ addons = [ChildAddon()]
|
||||
|
||||
|
||||
class AddonIntegrationTests(BaseProxyTest):
|
||||
def setUp(self) -> None:
|
||||
super().setUp()
|
||||
async def asyncSetUp(self) -> None:
|
||||
await super().asyncSetUp()
|
||||
self.addon = MockAddon()
|
||||
AddonManager.init([], self.session_manager, [self.addon], swallow_addon_exceptions=False)
|
||||
self.temp_dir = TemporaryDirectory(prefix="addon_test_sources")
|
||||
|
||||
@@ -30,8 +30,8 @@ class MockAddon(BaseAddon):
|
||||
|
||||
|
||||
class HTTPIntegrationTests(BaseProxyTest):
|
||||
def setUp(self) -> None:
|
||||
super().setUp()
|
||||
async def asyncSetUp(self) -> None:
|
||||
await super().asyncSetUp()
|
||||
self.addon = MockAddon()
|
||||
AddonManager.init([], self.session_manager, [self.addon])
|
||||
self.flow_context = self.session_manager.flow_context
|
||||
@@ -124,8 +124,8 @@ class HTTPIntegrationTests(BaseProxyTest):
|
||||
|
||||
|
||||
class TestCapsClient(BaseProxyTest):
|
||||
def setUp(self) -> None:
|
||||
super().setUp()
|
||||
async def asyncSetUp(self) -> None:
|
||||
await super().asyncSetUp()
|
||||
self._setup_default_circuit()
|
||||
self.caps_client = self.session.main_region.caps_client
|
||||
|
||||
@@ -141,29 +141,30 @@ class TestCapsClient(BaseProxyTest):
|
||||
|
||||
|
||||
class TestMITMProxy(BaseProxyTest):
|
||||
def setUp(self) -> None:
|
||||
super().setUp()
|
||||
async def asyncSetUp(self) -> None:
|
||||
await super().asyncSetUp()
|
||||
self._setup_default_circuit()
|
||||
self.caps_client = self.session.main_region.caps_client
|
||||
|
||||
def test_mitmproxy_works(self):
|
||||
proxy_port = 9905
|
||||
self.session_manager.settings.HTTP_PROXY_PORT = proxy_port
|
||||
|
||||
http_proc = multiprocessing.Process(
|
||||
self.http_proc = multiprocessing.Process(
|
||||
target=run_http_proxy_process,
|
||||
args=("127.0.0.1", proxy_port, self.session_manager.flow_context),
|
||||
daemon=True,
|
||||
)
|
||||
http_proc.start()
|
||||
|
||||
self.http_proc.start()
|
||||
self.session_manager.flow_context.mitmproxy_ready.wait(1.0)
|
||||
|
||||
http_event_manager = MITMProxyEventManager(self.session_manager, self.session_manager.flow_context)
|
||||
self.http_event_manager = MITMProxyEventManager(
|
||||
self.session_manager,
|
||||
self.session_manager.flow_context
|
||||
)
|
||||
|
||||
def test_mitmproxy_works(self):
|
||||
async def _request_example_com():
|
||||
# Pump callbacks from mitmproxy
|
||||
asyncio.create_task(http_event_manager.run())
|
||||
asyncio.create_task(self.http_event_manager.run())
|
||||
try:
|
||||
async with self.caps_client.get("http://example.com/", timeout=0.5) as resp:
|
||||
self.assertIn(b"Example Domain", await resp.read())
|
||||
@@ -173,4 +174,4 @@ class TestMITMProxy(BaseProxyTest):
|
||||
# Tell the event pump and mitmproxy they need to shut down
|
||||
self.session_manager.flow_context.shutdown_signal.set()
|
||||
asyncio.run(_request_example_com())
|
||||
http_proc.join()
|
||||
self.http_proc.join()
|
||||
|
||||
@@ -47,8 +47,8 @@ class SimpleMessageLogger(FilteringMessageLogger):
|
||||
|
||||
|
||||
class LLUDPIntegrationTests(BaseProxyTest):
|
||||
def setUp(self) -> None:
|
||||
super().setUp()
|
||||
async def asyncSetUp(self) -> None:
|
||||
await super().asyncSetUp()
|
||||
self.addon = MockAddon()
|
||||
self.deserializer = UDPMessageDeserializer()
|
||||
AddonManager.init([], self.session_manager, [self.addon])
|
||||
@@ -204,8 +204,8 @@ class LLUDPIntegrationTests(BaseProxyTest):
|
||||
self.protocol.datagram_received(obj_update, self.region_addr)
|
||||
await self._wait_drained()
|
||||
entries = message_logger.entries
|
||||
self.assertEqual(len(entries), 1)
|
||||
self.assertEqual(entries[0].name, "ObjectUpdateCompressed")
|
||||
self.assertEqual(1, len(entries))
|
||||
self.assertEqual("ObjectUpdateCompressed", entries[0].name)
|
||||
|
||||
async def test_filtering_logged_messages(self):
|
||||
message_logger = SimpleMessageLogger()
|
||||
@@ -222,8 +222,8 @@ class LLUDPIntegrationTests(BaseProxyTest):
|
||||
await self._wait_drained()
|
||||
message_logger.set_filter("ObjectUpdateCompressed")
|
||||
entries = message_logger.entries
|
||||
self.assertEqual(len(entries), 1)
|
||||
self.assertEqual(entries[0].name, "ObjectUpdateCompressed")
|
||||
self.assertEqual(1, len(entries))
|
||||
self.assertEqual("ObjectUpdateCompressed", entries[0].name)
|
||||
|
||||
async def test_logging_taken_message(self):
|
||||
message_logger = SimpleMessageLogger()
|
||||
|
||||
@@ -1,13 +1,14 @@
|
||||
from mitmproxy.test import tflow, tutils
|
||||
|
||||
from hippolyzer.lib.proxy.caps import CapType
|
||||
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
|
||||
from hippolyzer.lib.proxy.message_logger import HTTPMessageLogEntry
|
||||
from hippolyzer.lib.proxy.test_utils import BaseProxyTest
|
||||
|
||||
|
||||
class TestHTTPFlows(BaseProxyTest):
|
||||
def setUp(self) -> None:
|
||||
super().setUp()
|
||||
async def asyncSetUp(self) -> None:
|
||||
await super().asyncSetUp()
|
||||
self.region = self.session.register_region(
|
||||
("127.0.0.1", 2),
|
||||
"https://test.localhost:4/foo",
|
||||
@@ -18,7 +19,7 @@ class TestHTTPFlows(BaseProxyTest):
|
||||
"ViewerAsset": "http://assets.example.com",
|
||||
})
|
||||
|
||||
def test_request_formatting(self):
|
||||
async def test_request_formatting(self):
|
||||
req = tutils.treq(host="example.com", port=80)
|
||||
fake_flow = tflow.tflow(req=req, resp=tutils.tresp())
|
||||
flow = HippoHTTPFlow.from_state(fake_flow.get_state(), self.session_manager)
|
||||
@@ -32,7 +33,7 @@ content-length: 7\r
|
||||
\r
|
||||
content""")
|
||||
|
||||
def test_binary_request_formatting(self):
|
||||
async def test_binary_request_formatting(self):
|
||||
req = tutils.treq(host="example.com", port=80)
|
||||
fake_flow = tflow.tflow(req=req, resp=tutils.tresp())
|
||||
flow = HippoHTTPFlow.from_state(fake_flow.get_state(), self.session_manager)
|
||||
@@ -46,7 +47,7 @@ X-Hippo-Escaped-Body: 1\r
|
||||
\r
|
||||
c\\x00ntent""")
|
||||
|
||||
def test_llsd_response_formatting(self):
|
||||
async def test_llsd_response_formatting(self):
|
||||
fake_flow = tflow.tflow(req=tutils.treq(), resp=tutils.tresp())
|
||||
flow = HippoHTTPFlow.from_state(fake_flow.get_state(), self.session_manager)
|
||||
# Half the time LLSD is sent with a random Content-Type and no PI indicating
|
||||
@@ -63,7 +64,7 @@ content-length: 33\r
|
||||
</llsd>
|
||||
""")
|
||||
|
||||
def test_flow_state_serde(self):
|
||||
async def test_flow_state_serde(self):
|
||||
fake_flow = tflow.tflow(req=tutils.treq(host="example.com"), resp=tutils.tresp())
|
||||
flow = HippoHTTPFlow.from_state(fake_flow.get_state(), self.session_manager)
|
||||
# Make sure cap resolution works correctly
|
||||
@@ -72,7 +73,7 @@ content-length: 33\r
|
||||
new_flow = HippoHTTPFlow.from_state(flow_state, self.session_manager)
|
||||
self.assertIs(self.session, new_flow.cap_data.session())
|
||||
|
||||
def test_http_asset_repo(self):
|
||||
async def test_http_asset_repo(self):
|
||||
asset_repo = self.session_manager.asset_repo
|
||||
asset_id = asset_repo.create_asset(b"foobar", one_shot=True)
|
||||
req = tutils.treq(host="assets.example.com", path=f"/?animatn_id={asset_id}")
|
||||
@@ -83,9 +84,9 @@ content-length: 33\r
|
||||
self.assertTrue(asset_repo.try_serve_asset(flow))
|
||||
self.assertEqual(b"foobar", flow.response.content)
|
||||
|
||||
def test_temporary_cap_resolution(self):
|
||||
self.region.register_temporary_cap("TempExample", "http://not.example.com")
|
||||
self.region.register_temporary_cap("TempExample", "http://not2.example.com")
|
||||
async def test_temporary_cap_resolution(self):
|
||||
self.region.register_cap("TempExample", "http://not.example.com", CapType.TEMPORARY)
|
||||
self.region.register_cap("TempExample", "http://not2.example.com", CapType.TEMPORARY)
|
||||
# Resolving the cap should consume it
|
||||
cap_data = self.session_manager.resolve_cap("http://not.example.com")
|
||||
self.assertEqual(cap_data.cap_name, "TempExample")
|
||||
|
||||
@@ -130,7 +130,7 @@ class MessageFilterTests(unittest.IsolatedAsyncioTestCase):
|
||||
# Make sure numbers outside 32bit range come through
|
||||
self.assertTrue(self._filter_matches("Foo.Bar.Foo == 0xFFffFFffFF", msg))
|
||||
|
||||
def test_http_flow(self):
|
||||
async def test_http_flow(self):
|
||||
session_manager = SessionManager(ProxySettings())
|
||||
fake_flow = tflow.tflow(req=tutils.treq(), resp=tutils.tresp())
|
||||
fake_flow.metadata["cap_data_ser"] = SerializedCapData(
|
||||
@@ -141,7 +141,17 @@ class MessageFilterTests(unittest.IsolatedAsyncioTestCase):
|
||||
self.assertTrue(self._filter_matches("FakeCap", entry))
|
||||
self.assertFalse(self._filter_matches("NotFakeCap", entry))
|
||||
|
||||
def test_export_import_http_flow(self):
|
||||
async def test_http_header_filter(self):
|
||||
session_manager = SessionManager(ProxySettings())
|
||||
fake_flow = tflow.tflow(req=tutils.treq(), resp=tutils.tresp())
|
||||
fake_flow.request.headers["Cookie"] = 'foo="bar"'
|
||||
flow = HippoHTTPFlow.from_state(fake_flow.get_state(), session_manager)
|
||||
entry = HTTPMessageLogEntry(flow)
|
||||
# The header map is case-insensitive!
|
||||
self.assertTrue(self._filter_matches('Meta.ReqHeaders.cookie ~= "foo"', entry))
|
||||
self.assertFalse(self._filter_matches('Meta.ReqHeaders.foobar ~= "foo"', entry))
|
||||
|
||||
async def test_export_import_http_flow(self):
|
||||
fake_flow = tflow.tflow(req=tutils.treq(), resp=tutils.tresp())
|
||||
fake_flow.metadata["cap_data_ser"] = SerializedCapData(
|
||||
cap_name="FakeCap",
|
||||
|
||||
@@ -17,19 +17,19 @@ class MockedProxyCircuit(ProxiedCircuit):
|
||||
self.in_injections = InjectionTracker(0, maxlen=10)
|
||||
|
||||
def _send_prepared_message(self, msg: Message, transport=None):
|
||||
self.sent_simple.append((msg.packet_id, msg.name, msg.direction, msg.injected, msg.acks))
|
||||
self.sent_simple.append((msg.packet_id, msg.name, msg.direction, msg.synthetic, msg.acks))
|
||||
self.sent_msgs.append(msg)
|
||||
|
||||
|
||||
class PacketIDTests(unittest.TestCase):
|
||||
class PacketIDTests(unittest.IsolatedAsyncioTestCase):
|
||||
def setUp(self) -> None:
|
||||
self.circuit = MockedProxyCircuit()
|
||||
|
||||
def _send_message(self, msg, outgoing=True):
|
||||
msg.direction = Direction.OUT if outgoing else Direction.IN
|
||||
return self.circuit.send_message(msg)
|
||||
return self.circuit.send(msg)
|
||||
|
||||
def test_basic(self):
|
||||
async def test_basic(self):
|
||||
self._send_message(Message('ChatFromViewer', packet_id=1))
|
||||
self._send_message(Message('ChatFromViewer', packet_id=2))
|
||||
|
||||
@@ -38,7 +38,7 @@ class PacketIDTests(unittest.TestCase):
|
||||
(2, "ChatFromViewer", Direction.OUT, False, ()),
|
||||
))
|
||||
|
||||
def test_inject(self):
|
||||
async def test_inject(self):
|
||||
self._send_message(Message('ChatFromViewer', packet_id=1))
|
||||
self._send_message(Message('ChatFromViewer'))
|
||||
self._send_message(Message('ChatFromViewer', packet_id=2))
|
||||
@@ -49,7 +49,7 @@ class PacketIDTests(unittest.TestCase):
|
||||
(3, "ChatFromViewer", Direction.OUT, False, ()),
|
||||
))
|
||||
|
||||
def test_max_injected(self):
|
||||
async def test_max_injected(self):
|
||||
self._send_message(Message('ChatFromViewer', packet_id=1))
|
||||
for _ in range(5):
|
||||
self._send_message(Message('ChatFromViewer'))
|
||||
@@ -74,7 +74,7 @@ class PacketIDTests(unittest.TestCase):
|
||||
# Make sure we're still able to get the original ID
|
||||
self.assertEqual(self.circuit.out_injections.get_original_id(15), 3)
|
||||
|
||||
def test_inject_hole_in_sequence(self):
|
||||
async def test_inject_hole_in_sequence(self):
|
||||
self._send_message(Message('ChatFromViewer', packet_id=1))
|
||||
self._send_message(Message('ChatFromViewer'))
|
||||
self._send_message(Message('ChatFromViewer', packet_id=4))
|
||||
@@ -87,7 +87,7 @@ class PacketIDTests(unittest.TestCase):
|
||||
(6, "ChatFromViewer", Direction.OUT, True, ()),
|
||||
))
|
||||
|
||||
def test_inject_misordered(self):
|
||||
async def test_inject_misordered(self):
|
||||
self._send_message(Message('ChatFromViewer', packet_id=2))
|
||||
self._send_message(Message('ChatFromViewer'))
|
||||
self._send_message(Message('ChatFromViewer', packet_id=1))
|
||||
@@ -98,7 +98,7 @@ class PacketIDTests(unittest.TestCase):
|
||||
(1, "ChatFromViewer", Direction.OUT, False, ()),
|
||||
])
|
||||
|
||||
def test_inject_multiple(self):
|
||||
async def test_inject_multiple(self):
|
||||
self._send_message(Message('ChatFromViewer', packet_id=1))
|
||||
self._send_message(Message('ChatFromViewer'))
|
||||
self._send_message(Message('ChatFromViewer'))
|
||||
@@ -115,7 +115,7 @@ class PacketIDTests(unittest.TestCase):
|
||||
(6, "ChatFromViewer", Direction.OUT, True, ()),
|
||||
])
|
||||
|
||||
def test_packet_ack_field_converted(self):
|
||||
async def test_packet_ack_field_converted(self):
|
||||
self._send_message(Message('ChatFromViewer', packet_id=1))
|
||||
self._send_message(Message('ChatFromViewer'))
|
||||
self._send_message(Message('ChatFromViewer'))
|
||||
@@ -139,7 +139,7 @@ class PacketIDTests(unittest.TestCase):
|
||||
(6, "ChatFromViewer", Direction.OUT, True, ()),
|
||||
])
|
||||
|
||||
def test_packet_ack_proxied_message_converted(self):
|
||||
async def test_packet_ack_proxied_message_converted(self):
|
||||
self._send_message(Message('ChatFromViewer', packet_id=1))
|
||||
self._send_message(Message('ChatFromViewer'))
|
||||
self._send_message(Message('ChatFromViewer'))
|
||||
@@ -176,12 +176,9 @@ class PacketIDTests(unittest.TestCase):
|
||||
|
||||
self.assertEqual(self.circuit.sent_msgs[5]["Packets"][0]["ID"], 2)
|
||||
|
||||
def test_drop_proxied_message(self):
|
||||
async def test_drop_proxied_message(self):
|
||||
self._send_message(Message('ChatFromViewer', packet_id=1))
|
||||
self.circuit.drop_message(
|
||||
Message('ChatFromViewer', packet_id=2, flags=PacketFlags.RELIABLE),
|
||||
Direction.OUT,
|
||||
)
|
||||
self.circuit.drop_message(Message('ChatFromViewer', packet_id=2, flags=PacketFlags.RELIABLE))
|
||||
self._send_message(Message('ChatFromViewer', packet_id=3))
|
||||
|
||||
self.assertSequenceEqual(self.circuit.sent_simple, [
|
||||
@@ -191,12 +188,9 @@ class PacketIDTests(unittest.TestCase):
|
||||
])
|
||||
self.assertEqual(self.circuit.sent_msgs[1]["Packets"][0]["ID"], 2)
|
||||
|
||||
def test_unreliable_proxied_message(self):
|
||||
async def test_unreliable_proxied_message(self):
|
||||
self._send_message(Message('ChatFromViewer', packet_id=1))
|
||||
self.circuit.drop_message(
|
||||
Message('ChatFromViewer', packet_id=2),
|
||||
Direction.OUT,
|
||||
)
|
||||
self.circuit.drop_message(Message('ChatFromViewer', packet_id=2))
|
||||
self._send_message(Message('ChatFromViewer', packet_id=3))
|
||||
|
||||
self.assertSequenceEqual(self.circuit.sent_simple, [
|
||||
@@ -204,15 +198,12 @@ class PacketIDTests(unittest.TestCase):
|
||||
(3, "ChatFromViewer", Direction.OUT, False, ()),
|
||||
])
|
||||
|
||||
def test_dropped_proxied_message_acks_sent(self):
|
||||
async def test_dropped_proxied_message_acks_sent(self):
|
||||
self._send_message(Message('ChatFromViewer', packet_id=1))
|
||||
self._send_message(Message('ChatFromViewer', packet_id=2))
|
||||
self._send_message(Message('ChatFromViewer', packet_id=3))
|
||||
self._send_message(Message('ChatFromSimulator'), outgoing=False)
|
||||
self.circuit.drop_message(
|
||||
Message('ChatFromViewer', packet_id=4, acks=(4,)),
|
||||
Direction.OUT,
|
||||
)
|
||||
self.circuit.drop_message(Message('ChatFromViewer', packet_id=4, acks=(4,)))
|
||||
self._send_message(Message('ChatFromViewer', packet_id=5))
|
||||
|
||||
self.assertSequenceEqual(self.circuit.sent_simple, [
|
||||
@@ -229,8 +220,8 @@ class PacketIDTests(unittest.TestCase):
|
||||
# We injected an incoming packet, so "4" is really "3"
|
||||
self.assertEqual(self.circuit.sent_msgs[4]["Packets"][0]["ID"], 3)
|
||||
|
||||
def test_resending_or_dropping(self):
|
||||
self.circuit.send_message(Message('ChatFromViewer', packet_id=1))
|
||||
async def test_resending_or_dropping(self):
|
||||
self.circuit.send(Message('ChatFromViewer', packet_id=1))
|
||||
to_drop = Message('ChatFromViewer', packet_id=2, flags=PacketFlags.RELIABLE)
|
||||
self.circuit.drop_message(to_drop)
|
||||
with self.assertRaises(RuntimeError):
|
||||
@@ -238,12 +229,72 @@ class PacketIDTests(unittest.TestCase):
|
||||
self.circuit.drop_message(to_drop)
|
||||
# Returns a new message without finalized flag
|
||||
new_msg = to_drop.take()
|
||||
self.circuit.send_message(new_msg)
|
||||
self.circuit.send(new_msg)
|
||||
with self.assertRaises(RuntimeError):
|
||||
self.circuit.send_message(new_msg)
|
||||
self.circuit.send(new_msg)
|
||||
self.assertSequenceEqual(self.circuit.sent_simple, [
|
||||
(1, "ChatFromViewer", Direction.OUT, False, ()),
|
||||
(1, "PacketAck", Direction.IN, True, ()),
|
||||
# ended up getting the same packet ID when injected
|
||||
(2, "ChatFromViewer", Direction.OUT, True, ()),
|
||||
])
|
||||
|
||||
async def test_reliable_unacked_queueing(self):
|
||||
self._send_message(Message('ChatFromViewer', flags=PacketFlags.RELIABLE))
|
||||
self._send_message(Message('ChatFromViewer', flags=PacketFlags.RELIABLE, packet_id=2))
|
||||
# Only the first, injected message should be queued for resends
|
||||
self.assertEqual({(Direction.OUT, 1)}, set(self.circuit.unacked_reliable))
|
||||
|
||||
async def test_reliable_resend_cadence(self):
|
||||
self._send_message(Message('ChatFromViewer', flags=PacketFlags.RELIABLE))
|
||||
resend_info = self.circuit.unacked_reliable[(Direction.OUT, 1)]
|
||||
self.circuit.resend_unacked()
|
||||
# Should have been too soon to retry
|
||||
self.assertEqual(10, resend_info.tries_left)
|
||||
# Switch to allowing resends every 0s
|
||||
self.circuit.resend_every = 0.0
|
||||
self.circuit.resend_unacked()
|
||||
self.assertSequenceEqual(self.circuit.sent_simple, [
|
||||
(1, "ChatFromViewer", Direction.OUT, True, ()),
|
||||
# Should have resent
|
||||
(1, "ChatFromViewer", Direction.OUT, True, ()),
|
||||
])
|
||||
self.assertEqual(9, resend_info.tries_left)
|
||||
for _ in range(resend_info.tries_left):
|
||||
self.circuit.resend_unacked()
|
||||
# Should have used up all the retry attempts and been kicked out of the retry queue
|
||||
self.assertEqual(set(), set(self.circuit.unacked_reliable))
|
||||
|
||||
async def test_reliable_ack_collection(self):
|
||||
msg = Message('ChatFromViewer', flags=PacketFlags.RELIABLE)
|
||||
fut = self.circuit.send_reliable(msg)
|
||||
self.assertEqual(1, len(self.circuit.unacked_reliable))
|
||||
# Shouldn't count, this is an ACK going in the wrong direction!
|
||||
ack_msg = Message("PacketAck", Block("Packets", ID=msg.packet_id))
|
||||
self.circuit.collect_acks(ack_msg)
|
||||
self.assertEqual(1, len(self.circuit.unacked_reliable))
|
||||
self.assertFalse(fut.done())
|
||||
# But it should count if the ACK message is heading in
|
||||
ack_msg.direction = Direction.IN
|
||||
self.circuit.collect_acks(ack_msg)
|
||||
self.assertEqual(0, len(self.circuit.unacked_reliable))
|
||||
self.assertTrue(fut.done())
|
||||
|
||||
async def test_start_ping_check(self):
|
||||
# Should not break if no unacked
|
||||
self._send_message(Message(
|
||||
"StartPingCheck",
|
||||
Block("PingID", PingID=0, OldestUnacked=20),
|
||||
packet_id=5,
|
||||
))
|
||||
|
||||
injected_msg = Message('ChatFromViewer', flags=PacketFlags.RELIABLE)
|
||||
self._send_message(injected_msg)
|
||||
|
||||
self._send_message(Message(
|
||||
"StartPingCheck",
|
||||
Block("PingID", PingID=0, OldestUnacked=20),
|
||||
packet_id=8,
|
||||
))
|
||||
# Oldest unacked should have been replaced with the injected packet's ID, it's older!
|
||||
self.assertEqual(self.circuit.sent_msgs[2]["PingID"]["OldestUnacked"], injected_msg.packet_id)
|
||||
|
||||
@@ -10,7 +10,8 @@ from hippolyzer.lib.base.message.message import Block, Message as Message
from hippolyzer.lib.base.message.udpdeserializer import UDPMessageDeserializer
from hippolyzer.lib.base.message.udpserializer import UDPMessageSerializer
from hippolyzer.lib.base.objects import Object, normalize_object_update_compressed_data
-from hippolyzer.lib.base.templates import ExtraParamType, PCode
+from hippolyzer.lib.base.templates import ExtraParamType, PCode, JUST_CREATED_FLAGS
+from hippolyzer.lib.client.object_manager import ObjectUpdateType
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.region import ProxiedRegion
@@ -55,8 +56,8 @@ class ObjectTrackingAddon(BaseAddon):


class ObjectManagerTestMixin(BaseProxyTest):
-    def setUp(self) -> None:
-        super().setUp()
+    async def asyncSetUp(self) -> None:
+        await super().asyncSetUp()
        self._setup_default_circuit()
        self.region = self.session.main_region
        self.message_handler = WrappingMessageHandler(self.region)
@@ -288,7 +289,7 @@ class RegionObjectManagerTests(ObjectManagerTestMixin, unittest.IsolatedAsyncioT
        self.message_handler.handle(msg)
        events = self.object_addon.events
        self.assertEqual(2, len(events))
-        self.assertEqual({"Position"}, events[1][2])
+        self.assertEqual({"Position", "TextureEntry"}, events[1][2])

    def test_region_position(self):
        parent = self._create_object(pos=(0.0, 1.0, 0.0))
@@ -418,13 +419,13 @@ class RegionObjectManagerTests(ObjectManagerTestMixin, unittest.IsolatedAsyncioT
            'AngularVelocity': Vector3(0.0, 0.0, 0.0791015625),
            'TreeSpecies': None,
            'ScratchPad': None,
-            'Text': None,
-            'TextColor': None,
-            'MediaURL': None,
-            'Sound': None,
-            'SoundGain': None,
-            'SoundFlags': None,
-            'SoundRadius': None,
+            'Text': b'',
+            'TextColor': b'',
+            'MediaURL': b'',
+            'Sound': UUID(),
+            'SoundGain': 0.0,
+            'SoundFlags': 0,
+            'SoundRadius': 0.0,
            'NameValue': [],
            'PathCurve': 32,
            'ProfileCurve': 0,
@@ -505,8 +506,8 @@ class RegionObjectManagerTests(ObjectManagerTestMixin, unittest.IsolatedAsyncioT


class SessionObjectManagerTests(ObjectManagerTestMixin, unittest.IsolatedAsyncioTestCase):
-    def setUp(self) -> None:
-        super().setUp()
+    async def asyncSetUp(self) -> None:
+        await super().asyncSetUp()
        self.second_region = self.session.register_region(
            ("127.0.0.1", 9), "https://localhost:5", 124
        )
@@ -663,3 +664,45 @@ class SessionObjectManagerTests(ObjectManagerTestMixin, unittest.IsolatedAsyncio
        self._create_object(local_id=av.LocalID, full_id=av.FullID,
                            pcode=PCode.AVATAR, parent_id=seat_id, pos=(1, 2, 9))
        self.assertEqual(set(), self.region_object_manager.queued_cache_misses)

    async def test_handle_object_update_event(self):
        with self.session.objects.events.subscribe_async(
                message_names=(ObjectUpdateType.OBJECT_UPDATE,),
                predicate=lambda e: e.object.UpdateFlags & JUST_CREATED_FLAGS and "LocalID" in e.updated,
        ) as get_events:
            self._create_object(local_id=999)
            evt = await asyncio.wait_for(get_events(), 1.0)
            self.assertEqual(999, evt.object.LocalID)

    async def test_handle_object_update_predicate(self):
        with self.session.objects.events.subscribe_async(
                message_names=(ObjectUpdateType.OBJECT_UPDATE,),
        ) as get_events:
            self._create_object(local_id=999)
            evt = await asyncio.wait_for(get_events(), 1.0)
            self.assertEqual(999, evt.object.LocalID)

    async def test_handle_object_update_events_two_subscribers(self):
        with self.session.objects.events.subscribe_async(
                message_names=(ObjectUpdateType.OBJECT_UPDATE,),
        ) as get_events:
            with self.session.objects.events.subscribe_async(
                    message_names=(ObjectUpdateType.OBJECT_UPDATE,),
            ) as get_events2:
                self._create_object(local_id=999)
                evt = await asyncio.wait_for(get_events(), 1.0)
                evt2 = await asyncio.wait_for(get_events2(), 1.0)
                self.assertEqual(999, evt.object.LocalID)
                self.assertEqual(evt, evt2)

    async def test_handle_object_update_events_two_subscribers_timeout(self):
        with self.session.objects.events.subscribe_async(
                message_names=(ObjectUpdateType.OBJECT_UPDATE,),
        ) as get_events:
            with self.session.objects.events.subscribe_async(
                    message_names=(ObjectUpdateType.OBJECT_UPDATE,),
            ) as get_events2:
                self._create_object(local_id=999)
                evt = asyncio.wait_for(get_events(), 0.01)
                evt2 = asyncio.wait_for(get_events2(), 0.01)
                await asyncio.gather(evt, evt2)

@@ -1,9 +1,11 @@
import math
import unittest

import hippolyzer.lib.base.serialization as se
-from hippolyzer.lib.base.datatypes import UUID
+from hippolyzer.lib.base.datatypes import UUID, Vector3
from hippolyzer.lib.base.message.message_formatting import HumanMessageSerializer
-from hippolyzer.lib.base.templates import TextureEntrySubfieldSerializer, TEFaceBitfield, TextureEntry
+from hippolyzer.lib.base.templates import TextureEntrySubfieldSerializer, TEFaceBitfield, TextureEntryCollection, \
+    PackedTERotation, TextureEntry

EXAMPLE_TE = b'\x89UgG$\xcbC\xed\x92\x0bG\xca\xed\x15F_\x08\xca*\x98:\x18\x02,\r\xf4\x1e\xc6\xf5\x91\x01]\x83\x014' \
             b'\x00\x90i+\x10\x80\xa1\xaa\xa2g\x11o\xa8]\xc6\x00\x00\x00\x00\x00\x00\x00\x00\x80?\x00\x00\x00\x80?' \
@@ -12,12 +14,24 @@ EXAMPLE_TE = b'\x89UgG$\xcbC\xed\x92\x0bG\xca\xed\x15F_\x08\xca*\x98:\x18\x02,\r


class TemplateTests(unittest.TestCase):

    def test_te_round_trips(self):
        deserialized = TextureEntrySubfieldSerializer.deserialize(None, EXAMPLE_TE)
        serialized = TextureEntrySubfieldSerializer.serialize(None, deserialized)
        self.assertEqual(EXAMPLE_TE, serialized)

+    def test_realize_te(self):
+        deserialized: TextureEntryCollection = TextureEntrySubfieldSerializer.deserialize(None, EXAMPLE_TE)
+        realized = deserialized.realize(4)
+        self.assertEqual(UUID('ca2a983a-1802-2c0d-f41e-c6f591015d83'), realized[3].Textures)
+        self.assertEqual(UUID('89556747-24cb-43ed-920b-47caed15465f'), realized[1].Textures)
+        with self.assertRaises(ValueError):
+            deserialized.realize(3)
+
+    def test_tecollection_from_tes(self):
+        deserialized: TextureEntryCollection = TextureEntrySubfieldSerializer.deserialize(None, EXAMPLE_TE)
+        # The TE collection should re-serialize to the same collection when split up and regrouped
+        self.assertEqual(deserialized, TextureEntryCollection.from_tes(deserialized.realize(4)))
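        # As exercised above, realize() appears to expand the
        # defaults-plus-per-face-exceptions TextureEntryCollection into one
        # concrete TextureEntry per face:
        #
        #     faces = deserialized.realize(num_faces)
        #     faces[3].Textures  # face-specific override, defaults filled in
        #
        # and a face count lower than the highest overridden face index is
        # rejected (hence realize(3) raising ValueError).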
    def test_face_bitfield_round_trips(self):
        test_val = b"\x81\x03"
        reader = se.BufferReader("!", test_val)
@@ -37,9 +51,9 @@ class TemplateTests(unittest.TestCase):
            'Color': {None: b'\xff\xff\xff\xff'},
            'ScalesS': {None: 1.0},
            'ScalesT': {None: 1.0},
-            'OffsetsS': {None: 0},
-            'OffsetsT': {None: 0},
-            'Rotation': {None: 0},
+            'OffsetsS': {None: 0.0},
+            'OffsetsT': {None: 0.0},
+            'Rotation': {None: 0.0},
            'BasicMaterials': {None: {'Bump': 0, 'FullBright': False, 'Shiny': 'OFF'}},
            'MediaFlags': {None: {'WebPage': False, 'TexGen': 'DEFAULT', '_Unused': 0}}, 'Glow': {None: 0},
            'Materials': {None: '00000000-0000-0000-0000-000000000000'},
@@ -62,8 +76,56 @@ class TemplateTests(unittest.TestCase):
        # Serialization order and format should match indra's exactly
        self.assertEqual(EXAMPLE_TE, data_field)
        deser = spec.deserialize(None, data_field, pod=True)
-        self.assertEqual(deser, pod_te)
+        self.assertEqual(pod_te, deser)

    def test_textureentry_defaults(self):
-        te = TextureEntry()
+        te = TextureEntryCollection()
        self.assertEqual(UUID('89556747-24cb-43ed-920b-47caed15465f'), te.Textures[None])

    def test_textureentry_rotation_packing(self):
        writer = se.BufferWriter("!")
        writer.write(PackedTERotation(), math.pi * 2)
        # fmod() makes this loop back around to 0
        self.assertEqual(b"\x00\x00", writer.copy_buffer())
        writer.clear()

        writer.write(PackedTERotation(), -math.pi * 2)
        # fmod() makes this loop back around to 0
        self.assertEqual(b"\x00\x00", writer.copy_buffer())
        writer.clear()

        writer.write(PackedTERotation(), 0)
        self.assertEqual(b"\x00\x00", writer.copy_buffer())
        writer.clear()

        # These both map to -32768: the positive case overflows because it
        # isn't caught by the exact-equality check against math.pi * 2
        writer.write(PackedTERotation(), math.pi * 1.999999)
        self.assertEqual(b"\x80\x00", writer.copy_buffer())
        writer.clear()

        writer.write(PackedTERotation(), math.pi * -1.999999)
        self.assertEqual(b"\x80\x00", writer.copy_buffer())
        writer.clear()

    def test_textureentry_rotation_unpacking(self):
        reader = se.BufferReader("!", b"\x00\x00")
        self.assertEqual(0, reader.read(PackedTERotation()))

        reader = se.BufferReader("!", b"\x80\x00")
        self.assertEqual(-math.pi * 2, reader.read(PackedTERotation()))

        # This quantization method does not allow for any representation of
        # F_TWO_PI itself, just a value slightly below it! The float representation
        # is ever so slightly different from the C++ version, but it should still
        # round-trip correctly.
        reader = se.BufferReader("!", b"\x7f\xff")
        self.assertEqual(6.282993559581101, reader.read(PackedTERotation()))

        writer = se.BufferWriter("!")
        writer.write(PackedTERotation(), 6.282993559581101)
        self.assertEqual(b"\x7f\xff", writer.copy_buffer())
    def test_textureentry_st_to_uv_coords(self):
        te = TextureEntry(ScalesS=0.5, ScalesT=0.5, OffsetsS=-0.25, OffsetsT=0.25, Rotation=math.pi / 2)
        self.assertEqual(Vector3(0.25, 0.75), te.st_to_uv(Vector3(0.5, 0.5)))
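        # Worked by hand, assuming the usual center-relative ST transform
        # (translate so the face center is the origin, scale, rotate, then
        # offset): the input (0.5, 0.5) is exactly the face center, so the
        # 0.5 scales and the pi/2 rotation are no-ops and the result is just
        # (0.5 + OffsetsS, 0.5 + OffsetsT) = (0.25, 0.75).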