32 Commits

Author SHA1 Message Date
Salad Dais
46e598cded Don't use setup.py for bundling 2025-05-26 19:15:33 +00:00
Salad Dais
ce130c4831 Use a newer cx_Freeze 2025-05-26 18:50:37 +00:00
Salad Dais
b6ac988601 Always fetch tags so SCM versioning works 2025-05-19 23:22:05 +00:00
Salad Dais
c8dbbef8fc Let's use newer Python versions 2025-05-19 23:14:40 +00:00
Salad Dais
a974f167d1 Update requirements and package dirs 2025-05-19 23:05:34 +00:00
Salad Dais
2d3b3daf10 Start switching to pyproject.toml 2025-05-19 22:49:05 +00:00
Salad Dais
1d54c70164 Update uses of recordclass and utcfromtimestamp() 2025-05-16 22:47:17 +00:00
Salad Dais
6dafe32f6a Update version to v0.15.6
I forgot I have to manually do it in this repo.
2025-04-18 04:33:00 +00:00
Salad Dais
3149d3610f Pin cx_freeze version 2025-04-18 04:30:11 +00:00
Salad Dais
f8f3bcfc36 Make PyPI stop whining about attestations 2025-04-18 04:26:42 +00:00
Salad Dais
8548cce4e5 Use new upload-artifact action 2025-04-18 04:19:52 +00:00
Salad Dais
ad2aca1803 Upgrade mitmproxy 2025-04-18 01:44:23 +00:00
Salad Dais
8cf500ce44 Be more verbose if we can't parse legacy schema 2025-04-18 01:43:10 +00:00
Salad Dais
ceda7f370e Update message template to upstream 2024-12-11 22:59:27 +00:00
Salad Dais
0692a10253 Add support for JankStringyBytes in LLSD 2024-12-11 22:58:56 +00:00
Salad Dais
c1c2a96295 Fix some event handling quirks 2024-12-11 22:56:50 +00:00
Salad Dais
b4be9fa757 Better handle resent reliable messages 2024-10-29 07:31:59 +00:00
Salad Dais
a8967f0b7d Handle unknown messages better 2024-10-29 07:31:35 +00:00
Salad Dais
10af5cc250 Handle more JankStringyBytes ops 2024-10-29 07:15:24 +00:00
Salad Dais
0ea1b0324e v0.15.2 2024-03-14 02:04:25 +00:00
Salad Dais
4ece6efe60 Fix #45, add support for attachment block in AvatarAppearance
This is just a guess based on what the data looks like. The message
template may not be representative of the actual template LL is using
and they may remove it at any time, but this seems close enough
to what is actually being used.

Also it stops the message from spamming me about unparsed data.
2024-03-14 01:44:00 +00:00
Salad Dais
15bc8e0ed2 Log when applying deferred inv calls 2024-02-20 04:56:25 +00:00
dependabot[bot]
33fad6339f Bump aiohttp from 3.9.1 to 3.9.2 (#43)
Bumps [aiohttp](https://github.com/aio-libs/aiohttp) from 3.9.1 to 3.9.2.
- [Release notes](https://github.com/aio-libs/aiohttp/releases)
- [Changelog](https://github.com/aio-libs/aiohttp/blob/master/CHANGES.rst)
- [Commits](https://github.com/aio-libs/aiohttp/compare/v3.9.1...v3.9.2)

---
updated-dependencies:
- dependency-name: aiohttp
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-19 22:46:10 -04:00
dependabot[bot]
93916104db Bump jinja2 from 3.1.2 to 3.1.3 (#42)
Bumps [jinja2](https://github.com/pallets/jinja) from 3.1.2 to 3.1.3.
- [Release notes](https://github.com/pallets/jinja/releases)
- [Changelog](https://github.com/pallets/jinja/blob/main/CHANGES.rst)
- [Commits](https://github.com/pallets/jinja/compare/3.1.2...3.1.3)

---
updated-dependencies:
- dependency-name: jinja2
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-01-21 19:45:05 -04:00
Salad Dais
3bb4fb0640 Basic AIS response handling in proxy 2024-01-19 04:37:14 +00:00
Salad Dais
c9495763e5 Defer inventory update processing til cache is loaded 2024-01-18 05:08:36 +00:00
Salad Dais
a7825a881c Start improving InventoryManager 2024-01-16 01:56:34 +00:00
Salad Dais
a6bbd97b98 Make sure asyncio.Tasks always have their exceptions logged 2024-01-15 22:24:16 +00:00
Salad Dais
3500212da0 Start handling messages in InventoryManager 2024-01-14 07:04:28 +00:00
Salad Dais
01ea9d7879 Improve MessageHandler resiliency 2024-01-14 07:00:20 +00:00
Salad Dais
f19e1b8bfb Upgrade to outleap 0.6.1 2024-01-10 20:34:21 +00:00
Salad Dais
f2202556d7 Mark as compatible with Python 3.12 2024-01-10 16:20:03 +00:00
41 changed files with 6952 additions and 6431 deletions

View File

@@ -23,11 +23,14 @@ jobs:
contents: write
strategy:
matrix:
python-version: ["3.11"]
python-version: ["3.12"]
steps:
- uses: actions/checkout@v2
- uses: actions/checkout@v4
- name: Get history and tags for SCM versioning to work
run: |
git fetch --prune --unshallow
git fetch --depth=1 origin +refs/tags/*:refs/tags/*
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v2
with:
@@ -51,7 +54,7 @@ jobs:
mv ./dist/*.zip hippolyzer-windows-${{ env.target_tag }}.zip
- name: Upload the artifact
uses: actions/upload-artifact@v2
uses: actions/upload-artifact@v4
with:
name: hippolyzer-windows-${{ env.sha }}
path: ./hippolyzer-windows-${{ env.target_tag }}.zip

View File

@@ -16,18 +16,22 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/checkout@v4
- name: Get history and tags for SCM versioning to work
run: |
git fetch --prune --unshallow
git fetch --depth=1 origin +refs/tags/*:refs/tags/*
- uses: actions/setup-python@v2
with:
python-version: "3.10"
python-version: "3.12"
- name: Install dependencies
run: |
python -m pip install --upgrade pip setuptools wheel
python -m pip install --upgrade pip setuptools wheel build
if [ -f requirements.txt ]; then pip install -r requirements.txt; fi
- name: Build
run: >-
python setup.py sdist bdist_wheel
python -m build
# We do this, since failures on test.pypi aren't that bad
- name: Publish to Test PyPI
if: startsWith(github.event.ref, 'refs/tags') || github.event_name == 'release'
@@ -36,6 +40,7 @@ jobs:
user: __token__
password: ${{ secrets.TEST_PYPI_API_TOKEN }}
repository_url: https://test.pypi.org/legacy/
attestations: false
- name: Publish to PyPI
if: startsWith(github.event.ref, 'refs/tags') || github.event_name == 'release'
@@ -43,3 +48,4 @@ jobs:
with:
user: __token__
password: ${{ secrets.PYPI_API_TOKEN }}
attestations: false

View File

@@ -14,11 +14,14 @@ jobs:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: ["3.10", "3.12"]
python-version: ["3.12", "3.13"]
steps:
- uses: actions/checkout@v2
- uses: actions/checkout@v4
- name: Get history and tags for SCM versioning to work
run: |
git fetch --prune --unshallow
git fetch --depth=1 origin +refs/tags/*:refs/tags/*
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v2
with:

View File

@@ -27,7 +27,7 @@ with low-level SL details. See the [Local Animation addon example](https://githu
### From Source
* Python 3.10 or above is **required**. If you're unable to upgrade your system Python package due to
* Python 3.12 or above is **required**. If you're unable to upgrade your system Python package due to
being on a stable distro, you can use [pyenv](https://github.com/pyenv/pyenv) to create
a self-contained Python install with the appropriate version.
* [Create a clean Python 3 virtualenv](https://packaging.python.org/guides/installing-using-pip-and-virtual-environments/#creating-a-virtual-environment)

View File

@@ -24,7 +24,7 @@ from hippolyzer.apps.model import MessageLogModel, MessageLogHeader, RegionListM
from hippolyzer.apps.proxy import start_proxy
from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.helpers import bytes_unescape, bytes_escape, get_resource_filename
from hippolyzer.lib.base.helpers import bytes_unescape, bytes_escape, get_resource_filename, create_logged_task
from hippolyzer.lib.base.message.llsd_msg_serializer import LLSDMessageSerializer
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.message.message_formatting import (
@@ -826,7 +826,7 @@ class MessageBuilderWindow(QtWidgets.QMainWindow):
# enough for the full response to pass through the proxy
await resp.read()
asyncio.create_task(_send_request())
create_logged_task(_send_request(), "Send HTTP Request")
class AddonDialog(QtWidgets.QDialog):

View File

@@ -304,6 +304,9 @@ class JankStringyBytes(bytes):
def __str__(self):
return self.rstrip(b"\x00").decode("utf8", errors="replace")
def __bool__(self):
return not (super().__eq__(b"") or super().__eq__(b"\x00"))
def __eq__(self, other):
if isinstance(other, str):
return str(self) == other
@@ -319,12 +322,12 @@ class JankStringyBytes(bytes):
def __add__(self, other):
if isinstance(other, bytes):
return bytes(self) + other
return JankStringyBytes(bytes(self) + other)
return str(self) + other
def __radd__(self, other):
if isinstance(other, bytes):
return other + bytes(self)
return JankStringyBytes(other + bytes(self))
return other + str(self)
def lower(self):
@@ -333,6 +336,20 @@ class JankStringyBytes(bytes):
def upper(self):
return str(self).upper()
def startswith(self, __prefix, __start=None, __end=None):
if __start or __end:
raise RuntimeError("Can't handle __start or __end")
if isinstance(__prefix, str):
return str(self).startswith(__prefix)
return super().startswith(__prefix)
def endswith(self, __prefix, __start=None, __end=None):
if __start or __end:
raise RuntimeError("Can't handle __start or __end")
if isinstance(__prefix, str):
return str(self).endswith(__prefix)
return super().endswith(__prefix)
class RawBytes(bytes):
__slots__ = ()
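The `JankStringyBytes` changes above extend a `bytes` subclass so it answers both str-flavored and bytes-flavored questions. A minimal standalone sketch of the pattern (a simplified `StringyBytes`, not hippolyzer's actual class):

```python
class StringyBytes(bytes):
    """bytes subclass that compares and matches like a str when asked to.

    Simplified sketch of the pattern in the diff above, not the real class.
    """
    __slots__ = ()

    def __str__(self):
        # Strip trailing NULs the wire format may include
        return self.rstrip(b"\x00").decode("utf8", errors="replace")

    def __bool__(self):
        # b"" and a lone NUL both count as "empty"
        return not (super().__eq__(b"") or super().__eq__(b"\x00"))

    def __eq__(self, other):
        if isinstance(other, str):
            return str(self) == other
        return super().__eq__(other)

    # Defining __eq__ would otherwise make instances unhashable
    __hash__ = bytes.__hash__

    def startswith(self, prefix, start=None, end=None):
        if isinstance(prefix, str):
            return str(self).startswith(prefix)
        # Delegate to bytes; calling self.startswith() here would recurse forever
        return super().startswith(prefix)
```

Note the `super()` delegation in the bytes branch of `startswith()`: routing back through `self` would be infinite recursion, which is the pitfall in this kind of dual-typed wrapper.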

View File

@@ -19,16 +19,19 @@ along with this program; if not, write to the Free Software Foundation,
Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""
import asyncio
from logging import getLogger
import logging
logger = getLogger('utilities.events')
from hippolyzer.lib.base.helpers import create_logged_task
LOG = logging.getLogger(__name__)
class Event:
""" an object containing data which will be passed out to all subscribers """
def __init__(self):
def __init__(self, name=None):
self.subscribers = []
self.name = name
def subscribe(self, handler, *args, one_shot=False, predicate=None, **kwargs):
""" establish the subscribers (handlers) to this event """
@@ -38,7 +41,8 @@ class Event:
return self
def _handler_key(self, handler):
@staticmethod
def _handler_key(handler):
return handler[:3]
def unsubscribe(self, handler, *args, **kwargs):
@@ -52,24 +56,30 @@ class Event:
raise ValueError(f"Handler {handler!r} is not subscribed to this event.")
return self
def _create_async_wrapper(self, handler, args, inner_args, kwargs):
# Note that unsubscription may be delayed due to asyncio scheduling :)
async def _run_handler_wrapper():
unsubscribe = await handler(args, *inner_args, **kwargs)
if unsubscribe:
_ = self.unsubscribe(handler, *inner_args, **kwargs)
return _run_handler_wrapper
def notify(self, args):
for handler in self.subscribers[:]:
handler, inner_args, kwargs, one_shot, predicate = handler
for subscriber in self.subscribers[:]:
handler, inner_args, kwargs, one_shot, predicate = subscriber
if predicate and not predicate(args):
continue
if one_shot:
self.unsubscribe(handler, *inner_args, **kwargs)
if asyncio.iscoroutinefunction(handler):
# Note that unsubscription may be delayed due to asyncio scheduling :)
async def _run_handler_wrapper():
unsubscribe = await handler(args, *inner_args, **kwargs)
if unsubscribe:
_ = self.unsubscribe(handler, *inner_args, **kwargs)
asyncio.create_task(_run_handler_wrapper())
create_logged_task(self._create_async_wrapper(handler, args, inner_args, kwargs)(), self.name, LOG)
else:
if handler(args, *inner_args, **kwargs) and not one_shot:
self.unsubscribe(handler, *inner_args, **kwargs)
try:
if handler(args, *inner_args, **kwargs) and not one_shot:
self.unsubscribe(handler, *inner_args, **kwargs)
except:
# One handler failing shouldn't prevent notification of other handlers.
LOG.exception(f"Failed in handler for {self.name}")
def __len__(self):
return len(self.subscribers)
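The `notify()` change above isolates each synchronous handler in a try/except, so one failing subscriber can't prevent notification of the rest. The guard in isolation, over a plain list of callables:

```python
import logging

LOG = logging.getLogger("events-sketch")

def notify_all(handlers, args):
    """Call every handler with args; a raising handler is logged, not fatal."""
    results = []
    for handler in list(handlers):  # copy: handlers may unsubscribe mid-iteration
        try:
            results.append(handler(args))
        except Exception:
            # One handler failing shouldn't prevent notification of other handlers.
            LOG.exception("Failed in handler")
            results.append(None)
    return results

seen = []
def good(x):
    seen.append(x)
    return "ok"

def bad(x):
    raise ValueError("boom")

out = notify_all([good, bad, good], 42)
```

All three handlers run even though the middle one raises; `out` records a `None` for the failure.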

View File

@@ -1,7 +1,9 @@
from __future__ import annotations
import asyncio
import codecs
import functools
import logging
import os
import lazy_object_proxy
@@ -165,3 +167,31 @@ def get_mtime(path):
return os.stat(path).st_mtime
except:
return None
def fut_logger(name: str, logger: logging.Logger, fut: asyncio.Future, *args) -> None:
"""Callback suitable for exception logging in `Future.add_done_callback()`"""
if not fut.cancelled() and fut.exception():
if isinstance(fut.exception(), asyncio.CancelledError):
# Don't really care if the task was just cancelled
return
logger.exception(f"Failed in task for {name}", exc_info=fut.exception())
def add_future_logger(
fut: asyncio.Future,
name: Optional[str] = None,
logger: Optional[logging.Logger] = None,
):
"""Add a logger to Futures that will never be directly `await`ed, logging exceptions"""
fut.add_done_callback(functools.partial(fut_logger, name, logger or logging.getLogger()))
def create_logged_task(
coro: Coroutine,
name: Optional[str] = None,
logger: Optional[logging.Logger] = None,
) -> asyncio.Task:
task = asyncio.create_task(coro, name=name)
add_future_logger(task, name, logger)
return task
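`create_logged_task()` pairs `asyncio.create_task()` with a done-callback so fire-and-forget tasks can't fail silently. A self-contained sketch of the same two helpers:

```python
import asyncio
import functools
import logging

def fut_logger(name, logger, fut):
    """Done-callback: log the exception of a task nobody will await."""
    if fut.cancelled():
        return
    exc = fut.exception()
    if exc is not None and not isinstance(exc, asyncio.CancelledError):
        logger.error("Failed in task for %s", name, exc_info=exc)

def create_logged_task(coro, name=None, logger=None):
    """asyncio.create_task(), but exceptions always end up in a logger."""
    task = asyncio.create_task(coro, name=name)
    task.add_done_callback(
        functools.partial(fut_logger, name, logger or logging.getLogger())
    )
    return task

async def main():
    async def boom():
        raise RuntimeError("oops")
    task = create_logged_task(boom(), "demo task")
    await asyncio.sleep(0)  # give the task a chance to run (and fail)
    return task

task = asyncio.run(main())
```

Because the callback retrieves `fut.exception()`, the interpreter's "Task exception was never retrieved" warning is also suppressed.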

View File

@@ -11,6 +11,7 @@ It's typically only used for object contents now.
from __future__ import annotations
import abc
import asyncio
import dataclasses
import datetime as dt
import inspect
@@ -202,6 +203,7 @@ class InventoryModel(InventoryBase):
def __init__(self):
self.nodes: Dict[UUID, InventoryNodeBase] = {}
self.root: Optional[InventoryContainerBase] = None
self.any_dirty = asyncio.Event()
@classmethod
def from_reader(cls, reader: StringIO, read_header=False) -> InventoryModel:
@@ -232,7 +234,7 @@ class InventoryModel(InventoryBase):
if (obj := inv_type.from_llsd(obj_dict, flavor)) is not None:
model.add(obj)
break
LOG.warning(f"Unknown object type {obj_dict!r}")
return model
@property
@@ -246,6 +248,12 @@ class InventoryModel(InventoryBase):
if isinstance(node, InventoryContainerBase):
yield node
@property
def dirty_categories(self) -> Iterable[InventoryCategory]:
for node in self.nodes.values():
if isinstance(node, InventoryCategory) and node.version == InventoryCategory.VERSION_NONE:
yield node
@property
def all_items(self) -> Iterable[InventoryItem]:
for node in self.nodes.values():
@@ -273,6 +281,29 @@ class InventoryModel(InventoryBase):
if node.parent_id == UUID.ZERO:
self.root = node
node.model = weakref.proxy(self)
return node
def update(self, node: InventoryNodeBase, update_fields: Optional[Iterable[str]] = None) -> InventoryNodeBase:
"""Update an existing node, optionally only updating specific fields"""
if node.node_id not in self.nodes:
raise KeyError(f"{node.node_id} not in the inventory model")
orig_node = self.nodes[node.node_id]
if node.__class__ != orig_node.__class__:
raise ValueError(f"Tried to update {orig_node!r} from non-matching {node!r}")
if not update_fields:
# Update everything but the model parameter
update_fields = node.get_field_names()
for field_name in update_fields:
setattr(orig_node, field_name, getattr(node, field_name))
return orig_node
def upsert(self, node: InventoryNodeBase, update_fields: Optional[Iterable[str]] = None) -> InventoryNodeBase:
"""Add or update a node"""
if node.node_id in self.nodes:
return self.update(node, update_fields)
return self.add(node)
def unlink(self, node: InventoryNodeBase, single_only: bool = False) -> Sequence[InventoryNodeBase]:
"""Unlink a node and its descendants from the tree, returning the removed nodes"""
@@ -313,6 +344,10 @@ class InventoryModel(InventoryBase):
removed=removed_in_other,
)
def flag_if_dirty(self):
if any(self.dirty_categories):
self.any_dirty.set()
def __getitem__(self, item: UUID) -> InventoryNodeBase:
return self.nodes[item]
@@ -367,6 +402,10 @@ class InventoryNodeBase(InventoryBase, _HasName):
default=None, init=False, hash=False, compare=False, repr=False
)
@classmethod
def get_field_names(cls) -> Set[str]:
return set(cls._get_fields_dict().keys()) - {"model"}
@property
def node_id(self) -> UUID:
return getattr(self, self.ID_ATTR)
@@ -459,6 +498,8 @@ class InventoryObject(InventoryContainerBase):
@dataclasses.dataclass
class InventoryCategory(InventoryContainerBase):
ID_ATTR: ClassVar[str] = "cat_id"
# AIS calls this something else...
ID_ATTR_AIS: ClassVar[str] = "category_id"
SCHEMA_NAME: ClassVar[str] = "inv_category"
VERSION_NONE: ClassVar[int] = -1
@@ -489,12 +530,24 @@ class InventoryCategory(InventoryContainerBase):
type=AssetType.CATEGORY,
)
@classmethod
def from_llsd(cls, inv_dict: Dict, flavor: str = "legacy"):
if flavor == "ais" and "type" not in inv_dict:
inv_dict = inv_dict.copy()
inv_dict["type"] = AssetType.CATEGORY
return super().from_llsd(inv_dict, flavor)
def to_llsd(self, flavor: str = "legacy"):
payload = super().to_llsd(flavor)
if flavor == "ais":
# AIS already knows the inventory type is category
payload.pop("type", None)
return payload
@classmethod
def _get_fields_dict(cls, llsd_flavor: Optional[str] = None):
fields = super()._get_fields_dict(llsd_flavor)
if llsd_flavor == "ais":
# AIS is smart enough to know that all categories are asset type category...
fields.pop("type")
# These have different names though
fields["type_default"] = fields.pop("preferred_type")
fields["agent_id"] = fields.pop("owner_id")
@@ -562,10 +615,13 @@ class InventoryItem(InventoryNodeBase):
def from_inventory_data(cls, block: Block):
return cls(
item_id=block["ItemID"],
parent_id=block["ParentID"],
# Might be under one of two names
parent_id=block.get("ParentID", block["FolderID"]),
permissions=InventoryPermissions(
creator_id=block["CreatorID"],
owner_id=block["OwnerID"],
# Unknown, not sent in this schema
last_owner_id=block.get("LastOwnerID", UUID.ZERO),
group_id=block["GroupID"],
base_mask=block["BaseMask"],
owner_mask=block["OwnerMask"],
@@ -573,7 +629,8 @@ class InventoryItem(InventoryNodeBase):
everyone_mask=block["EveryoneMask"],
next_owner_mask=block["NextOwnerMask"],
),
asset_id=block["AssetID"],
# May be missing in UpdateInventoryItem
asset_id=block.get("AssetID"),
type=AssetType(block["Type"]),
inv_type=InventoryType(block["InvType"]),
flags=block["Flags"],
@@ -591,7 +648,46 @@ class InventoryItem(InventoryNodeBase):
if flavor == "ais":
# There's little chance this differs from owner ID, just place it.
val["agent_id"] = val["permissions"]["owner_id"]
if val["type"] == AssetType.LINK:
# For link items, there is no asset, only a linked ID.
val["linked_id"] = val.pop("asset_id")
# These don't exist either
val.pop("permissions", None)
val.pop("sale_info", None)
return val
@classmethod
def from_llsd(cls, inv_dict: Dict, flavor: str = "legacy"):
if flavor == "ais" and "linked_id" in inv_dict:
# Links get represented differently than other items for whatever reason.
# This is incredibly annoying, under *NIX there's nothing really special about symlinks.
inv_dict = inv_dict.copy()
# Fill this in since it needs to be there
if "permissions" not in inv_dict:
inv_dict["permissions"] = InventoryPermissions(
base_mask=0xFFffFFff,
owner_mask=0xFFffFFff,
group_mask=0xFFffFFff,
everyone_mask=0,
next_owner_mask=0xFFffFFff,
creator_id=UUID.ZERO,
owner_id=UUID.ZERO,
last_owner_id=UUID.ZERO,
group_id=UUID.ZERO,
).to_llsd("ais")
if "sale_info" not in inv_dict:
inv_dict["sale_info"] = InventorySaleInfo(
sale_type=SaleType.NOT,
sale_price=0,
).to_llsd("ais")
if "type" not in inv_dict:
inv_dict["type"] = AssetType.LINK
# In the context of symlinks, asset id means linked item ID.
# This is also how indra stores symlinks. Why the asymmetry in AIS if none of the
# consumers actually want it? Who knows.
inv_dict["asset_id"] = inv_dict.pop("linked_id")
return super().from_llsd(inv_dict, flavor)
INVENTORY_TYPES: Tuple[Type[InventoryNodeBase], ...] = (InventoryCategory, InventoryObject, InventoryItem)
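The `update()`/`upsert()` additions above follow a common partial-update pattern: copy (a subset of) the incoming node's fields onto the existing instance, or add it outright. The same pattern on a hypothetical standalone `Node` dataclass, not hippolyzer's real node types:

```python
import dataclasses
from typing import Dict, Iterable, Optional

@dataclasses.dataclass
class Node:
    node_id: str
    name: str = ""
    version: int = -1

def upsert(nodes: Dict[str, Node], node: Node,
           update_fields: Optional[Iterable[str]] = None) -> Node:
    """Add node, or copy (a subset of) its fields onto the existing entry."""
    orig = nodes.get(node.node_id)
    if orig is None:
        nodes[node.node_id] = node
        return node
    if update_fields is None:
        # Update everything except the identity field
        update_fields = [f.name for f in dataclasses.fields(node)
                         if f.name != "node_id"]
    for field_name in update_fields:
        setattr(orig, field_name, getattr(node, field_name))
    return orig

nodes: Dict[str, Node] = {}
a = upsert(nodes, Node("cat1", name="Objects", version=3))
b = upsert(nodes, Node("cat1", name="Objects", version=7),
           update_fields=["version"])
```

Returning the original instance (rather than replacing it) preserves identity, so anything holding a reference to the node sees the updated fields.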

View File

@@ -46,7 +46,7 @@ class SchemaFieldSerializer(abc.ABC, Generic[_T]):
class SchemaDate(SchemaFieldSerializer[dt.datetime]):
@classmethod
def deserialize(cls, val: str) -> dt.datetime:
return dt.datetime.utcfromtimestamp(int(val))
return dt.datetime.fromtimestamp(int(val), dt.timezone.utc)
@classmethod
def serialize(cls, val: dt.datetime) -> str:
@@ -54,7 +54,7 @@ class SchemaDate(SchemaFieldSerializer[dt.datetime]):
@classmethod
def from_llsd(cls, val: Any, flavor: str) -> dt.datetime:
return dt.datetime.utcfromtimestamp(val)
return dt.datetime.fromtimestamp(val, dt.timezone.utc)
@classmethod
def to_llsd(cls, val: dt.datetime, flavor: str):
@@ -190,32 +190,36 @@ class SchemaBase(abc.ABC):
def from_llsd(cls, inv_dict: Dict, flavor: str = "legacy"):
fields = cls._get_fields_dict(llsd_flavor=flavor)
obj_dict = {}
for key, val in inv_dict.items():
if key in fields:
field: dataclasses.Field = fields[key]
key = field.name
spec = field.metadata.get("spec")
# Not a real key, an internal var on our dataclass
if not spec:
LOG.warning(f"Internal key {key!r}")
continue
try:
for key, val in inv_dict.items():
if key in fields:
field: dataclasses.Field = fields[key]
key = field.name
spec = field.metadata.get("spec")
# Not a real key, an internal var on our dataclass
if not spec:
LOG.warning(f"Internal key {key!r}")
continue
spec_cls = spec
if not inspect.isclass(spec_cls):
spec_cls = spec_cls.__class__
spec_cls = spec
if not inspect.isclass(spec_cls):
spec_cls = spec_cls.__class__
# some kind of nested structure like sale_info
if issubclass(spec_cls, SchemaBase):
obj_dict[key] = spec.from_llsd(val, flavor)
elif issubclass(spec_cls, SchemaFieldSerializer):
obj_dict[key] = spec.from_llsd(val, flavor)
# some kind of nested structure like sale_info
if issubclass(spec_cls, SchemaBase):
obj_dict[key] = spec.from_llsd(val, flavor)
elif issubclass(spec_cls, SchemaFieldSerializer):
obj_dict[key] = spec.from_llsd(val, flavor)
else:
raise ValueError(f"Unsupported spec for {key!r}, {spec!r}")
else:
raise ValueError(f"Unsupported spec for {key!r}, {spec!r}")
else:
if flavor != "ais":
# AIS has a number of different fields that are irrelevant depending on
# what exactly sent the payload
LOG.warning(f"Unknown key {key!r}")
if flavor != "ais":
# AIS has a number of different fields that are irrelevant depending on
# what exactly sent the payload
LOG.warning(f"Unknown key {key!r}")
except:
LOG.error(f"Failed to parse inventory schema: {inv_dict!r}")
raise
return cls._obj_from_dict(obj_dict)
def to_bytes(self) -> bytes:
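The `SchemaDate` change earlier in this file swaps the deprecated, naive `datetime.utcfromtimestamp()` (deprecated since Python 3.12) for an explicitly UTC-aware `fromtimestamp()`. For example:

```python
import datetime as dt

ts = 0
# Deprecated since Python 3.12, and returns a *naive* datetime:
#   old = dt.datetime.utcfromtimestamp(ts)
# Replacement: timezone-aware UTC datetime
new = dt.datetime.fromtimestamp(ts, dt.timezone.utc)
```

The aware result compares and arithmetics correctly against other aware datetimes, which naive UTC values silently did not.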

View File

@@ -16,10 +16,12 @@ from hippolyzer.lib.base.datatypes import *
class HippoLLSDBaseFormatter(base_llsd.base.LLSDBaseFormatter):
UUID: callable
ARRAY: callable
BINARY: callable
def __init__(self):
super().__init__()
self.type_map[UUID] = self.UUID
self.type_map[JankStringyBytes] = self.BINARY
self.type_map[Vector2] = self.TUPLECOORD
self.type_map[Vector3] = self.TUPLECOORD
self.type_map[Vector4] = self.TUPLECOORD
@@ -101,7 +103,7 @@ def _format_binary_recurse(something) -> bytes:
raise LLSDSerializationError(str(exc), something)
elif isinstance(something, uuid.UUID):
return b'u' + something.bytes
elif isinstance(something, binary):
elif isinstance(something, (binary, JankStringyBytes)):
return b'b' + struct.pack('!i', len(something)) + something
elif is_string(something):
if is_unicode(something):

File diff suppressed because it is too large

View File

@@ -75,8 +75,8 @@ class Block:
for var_name, val in kwargs.items():
self[var_name] = val
def get_variable(self, var_name):
return self.vars.get(var_name)
def get(self, var_name, default: Optional[VAR_TYPE] = None) -> Optional[VAR_TYPE]:
return self.vars.get(var_name, default)
def __contains__(self, item):
return item in self.vars
@@ -188,7 +188,7 @@ class MsgBlockList(List["Block"]):
class Message:
__slots__ = ("name", "send_flags", "packet_id", "acks", "body_boundaries", "queued",
"offset", "raw_extra", "raw_body", "deserializer", "_blocks", "finalized",
"direction", "meta", "synthetic", "dropped", "sender")
"direction", "meta", "synthetic", "dropped", "sender", "unknown_message")
def __init__(self, name, *args, packet_id=None, flags=0, acks=None, direction=None):
# TODO: Do this on a timer or something.
@@ -200,6 +200,7 @@ class Message:
self.acks = acks if acks is not None else tuple()
self.body_boundaries = (-1, -1)
self.unknown_message = False
self.offset = 0
self.raw_extra = b""
self.direction: Direction = direction if direction is not None else Direction.OUT
@@ -288,7 +289,7 @@ class Message:
def ensure_parsed(self):
# This is a little magic, think about whether we want this.
if self.raw_body and self.deserializer():
if self.raw_body and self.deserializer and self.deserializer():
self.deserializer().parse_message_body(self)
def to_dict(self, extended=False):
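The `Block.get()` addition above gives message blocks dict-style fallbacks, which other diffs in this change rely on (e.g. `block.get("ParentID", block["FolderID"])` in the inventory code). A minimal stand-in, not the real `Block`:

```python
class Block:
    """Tiny stand-in for the message Block above: dict-backed vars with .get()."""
    def __init__(self, **kwargs):
        self.vars = dict(kwargs)

    def __getitem__(self, var_name):
        return self.vars[var_name]

    def get(self, var_name, default=None):
        return self.vars.get(var_name, default)

    def __contains__(self, item):
        return item in self.vars

# Some payloads carry FolderID instead of ParentID; .get() makes the
# fallback a one-liner instead of a try/except.
block = Block(FolderID="f-123", OwnerID="o-456")
parent_id = block.get("ParentID", block["FolderID"])
```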

View File

@@ -31,7 +31,7 @@ _T = TypeVar("_T")
_K = TypeVar("_K", bound=Hashable)
MESSAGE_HANDLER = Callable[[_T], Any]
PREDICATE = Callable[[_T], bool]
# TODO: Can't do `Iterable[Union[_K, Literal["*"]]` apparently?
# TODO: Can't do `Iterable[Union[_K, Literal["*"]]]` apparently?
MESSAGE_NAMES = Iterable[Union[_K, str]]
@@ -42,7 +42,7 @@ class MessageHandler(Generic[_T, _K]):
def register(self, message_name: _K) -> Event:
LOG.debug('Creating a monitor for %s' % message_name)
return self.handlers.setdefault(message_name, Event())
return self.handlers.setdefault(message_name, Event(message_name))
def subscribe(self, message_name: Union[_K, Literal["*"]], handler: MESSAGE_HANDLER):
notifier = self.register(message_name)

View File

@@ -37,7 +37,7 @@ class MessageTemplateVariable:
return f"{self.__class__.__name__}(name={self.name!r}, tp={self.type!r}, size={self.size!r})"
@property
def probably_binary(self):
def probably_binary(self) -> bool:
if self._probably_binary is not None:
return self._probably_binary
@@ -49,7 +49,7 @@ class MessageTemplateVariable:
return self._probably_binary
@property
def probably_text(self):
def probably_text(self) -> bool:
if self._probably_text is not None:
return self._probably_text
@@ -97,11 +97,11 @@ class MessageTemplateBlock:
self.block_type: MsgBlockType = MsgBlockType.MBT_SINGLE
self.number = 0
def add_variable(self, var):
def add_variable(self, var: MessageTemplateVariable):
self.variable_map[var.name] = var
self.variables.append(var)
def get_variable(self, name):
def get_variable(self, name) -> MessageTemplateVariable:
return self.variable_map[name]
@@ -119,11 +119,11 @@ class MessageTemplate:
self.deprecation = None
self.encoding = None
def add_block(self, block):
def add_block(self, block: MessageTemplateBlock):
self.block_map[block.name] = block
self.blocks.append(block)
def get_block(self, name):
def get_block(self, name) -> MessageTemplateBlock:
return self.block_map[name]
def get_msg_freq_num_len(self):

View File

@@ -43,7 +43,7 @@ class TemplateDictionary:
self.template_list: typing.List[MessageTemplate] = []
# maps name to template
self.message_templates = {}
self.message_templates: typing.Dict[str, MessageTemplate] = {}
# maps (freq,num) to template
self.message_dict = {}

View File

@@ -126,8 +126,14 @@ class UDPMessageDeserializer:
frequency, num = _parse_msg_num(reader)
current_template = self.template_dict.get_template_by_pair(frequency, num)
if current_template is None:
raise exc.MessageTemplateNotFound("deserializing data", f"{frequency}:{num}")
msg.name = current_template.name
if self.settings.ALLOW_UNKNOWN_MESSAGES:
LOG.warning(f"Unknown message type {frequency}:{num}")
msg.unknown_message = True
msg.name = "UnknownMessage:%d" % num
else:
raise exc.MessageTemplateNotFound("deserializing data", f"{frequency}:{num}")
else:
msg.name = current_template.name
# extra field, see note regarding msg.offset
msg.raw_extra = reader.read_bytes(msg.offset)
@@ -143,6 +149,12 @@ class UDPMessageDeserializer:
# Already parsed if we don't have a raw body
if not raw_body:
return
if msg.unknown_message:
# We can't parse this, we don't know anything about it
msg.deserializer = None
return
msg.raw_body = None
msg.deserializer = None
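The deserializer change above tolerates unknown message numbers instead of raising, gated on `ALLOW_UNKNOWN_MESSAGES`. The lookup logic can be sketched with a hypothetical `resolve_message_name()` helper over a plain dict keyed by `(frequency, num)`:

```python
def resolve_message_name(templates, frequency, num, allow_unknown=True):
    """Return (message_name, is_unknown) for a (frequency, num) pair."""
    name = templates.get((frequency, num))
    if name is not None:
        return name, False
    if not allow_unknown:
        raise LookupError(f"No template for {frequency}:{num}")
    # Placeholder name; the body stays raw since we can't parse it
    return "UnknownMessage:%d" % num, True
```

Unknown messages keep their raw body and get no deserializer, so they pass through the proxy unparsed instead of killing the packet.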

View File

@@ -45,7 +45,7 @@ class UDPMessageSerializer:
def serialize(self, msg: Message):
current_template = self.template_dict.get_template_by_name(msg.name)
if current_template is None:
if current_template is None and msg.raw_body is None:
raise exc.MessageSerializationError("message name", "invalid message name")
# Header and trailers are all big-endian

View File

@@ -1728,7 +1728,6 @@ class QuantizedNumPyArray(Adapter):
def subfield_serializer(msg_name, block_name, var_name):
def f(orig_cls):
global SUBFIELD_SERIALIZERS
SUBFIELD_SERIALIZERS[(msg_name, block_name, var_name)] = orig_cls
return orig_cls
return f
@@ -1940,7 +1939,6 @@ class IntFlagSubfieldSerializer(AdapterInstanceSubfieldSerializer):
def http_serializer(msg_name):
def f(orig_cls):
global HTTP_SERIALIZERS
HTTP_SERIALIZERS[msg_name] = orig_cls
return orig_cls
return f

View File

@@ -55,6 +55,7 @@ class SettingDescriptor(Generic[_T]):
class Settings:
ENABLE_DEFERRED_PACKET_PARSING: bool = SettingDescriptor(True)
ALLOW_UNKNOWN_MESSAGES: bool = SettingDescriptor(True)
def __init__(self):
self._settings: Dict[str, Any] = {}

View File

@@ -1822,9 +1822,20 @@ class ChatSourceType(IntEnum):
UNKNOWN = 3
@dataclasses.dataclass
class ThrottleData:
resend: float = se.dataclass_field(se.F32)
land: float = se.dataclass_field(se.F32)
wind: float = se.dataclass_field(se.F32)
cloud: float = se.dataclass_field(se.F32)
task: float = se.dataclass_field(se.F32)
texture: float = se.dataclass_field(se.F32)
asset: float = se.dataclass_field(se.F32)
@se.subfield_serializer("AgentThrottle", "Throttle", "Throttles")
class AgentThrottlesSerializer(se.SimpleSubfieldSerializer):
TEMPLATE = se.Collection(None, se.F32)
TEMPLATE = se.Dataclass(ThrottleData)
@se.subfield_serializer("ObjectUpdate", "ObjectData", "NameValue")
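The `AgentThrottle` change above replaces a bare collection of F32s with a named `ThrottleData` dataclass, so each throttle lane gets a field name. On the wire the payload is just seven 32-bit floats; a standalone sketch using `struct` (the byte order here is illustrative, the real `se.F32` serializer defines its own):

```python
import struct
from dataclasses import astuple, dataclass

@dataclass
class ThrottleData:
    # Seven throttle lanes, in template order
    resend: float = 0.0
    land: float = 0.0
    wind: float = 0.0
    cloud: float = 0.0
    task: float = 0.0
    texture: float = 0.0
    asset: float = 0.0

def pack_throttles(t: ThrottleData) -> bytes:
    # 7 x little-endian float32 = 28 bytes
    return struct.pack("<7f", *astuple(t))

def unpack_throttles(data: bytes) -> ThrottleData:
    return ThrottleData(*struct.unpack("<7f", data))

sample = ThrottleData(resend=150000.0, texture=500000.0)
wire = pack_throttles(sample)
```

Naming the lanes means addon code can write `throttles.texture` instead of indexing into an anonymous list of floats.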

View File

@@ -8,6 +8,7 @@ import dataclasses
from typing import *
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.helpers import create_logged_task
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.message.circuit import ConnectionHolder
from hippolyzer.lib.base.message.msgtypes import PacketFlags
@@ -108,7 +109,7 @@ class TransferManager:
flags=PacketFlags.RELIABLE,
))
transfer = Transfer(transfer_id)
asyncio.create_task(self._pump_transfer_replies(transfer))
create_logged_task(self._pump_transfer_replies(transfer), "Transfer Pump")
return transfer
async def _pump_transfer_replies(self, transfer: Transfer):

View File

@@ -5,6 +5,7 @@ Body parts and linden clothing layers
from __future__ import annotations
import dataclasses
import enum
import logging
from io import StringIO
from typing import *
@@ -21,6 +22,60 @@ LOG = logging.getLogger(__name__)
_T = TypeVar("_T")
WEARABLE_VERSION = "LLWearable version 22"
DEFAULT_WEARABLE_TEX = UUID("c228d1cf-4b5d-4ba8-84f4-899a0796aa97")
class AvatarTEIndex(enum.IntEnum):
"""From llavatarappearancedefines.h"""
HEAD_BODYPAINT = 0
UPPER_SHIRT = enum.auto()
LOWER_PANTS = enum.auto()
EYES_IRIS = enum.auto()
HAIR = enum.auto()
UPPER_BODYPAINT = enum.auto()
LOWER_BODYPAINT = enum.auto()
LOWER_SHOES = enum.auto()
HEAD_BAKED = enum.auto()
UPPER_BAKED = enum.auto()
LOWER_BAKED = enum.auto()
EYES_BAKED = enum.auto()
LOWER_SOCKS = enum.auto()
UPPER_JACKET = enum.auto()
LOWER_JACKET = enum.auto()
UPPER_GLOVES = enum.auto()
UPPER_UNDERSHIRT = enum.auto()
LOWER_UNDERPANTS = enum.auto()
SKIRT = enum.auto()
SKIRT_BAKED = enum.auto()
HAIR_BAKED = enum.auto()
LOWER_ALPHA = enum.auto()
UPPER_ALPHA = enum.auto()
HEAD_ALPHA = enum.auto()
EYES_ALPHA = enum.auto()
HAIR_ALPHA = enum.auto()
HEAD_TATTOO = enum.auto()
UPPER_TATTOO = enum.auto()
LOWER_TATTOO = enum.auto()
HEAD_UNIVERSAL_TATTOO = enum.auto()
UPPER_UNIVERSAL_TATTOO = enum.auto()
LOWER_UNIVERSAL_TATTOO = enum.auto()
SKIRT_TATTOO = enum.auto()
HAIR_TATTOO = enum.auto()
EYES_TATTOO = enum.auto()
LEFT_ARM_TATTOO = enum.auto()
LEFT_LEG_TATTOO = enum.auto()
AUX1_TATTOO = enum.auto()
AUX2_TATTOO = enum.auto()
AUX3_TATTOO = enum.auto()
LEFTARM_BAKED = enum.auto()
LEFTLEG_BAKED = enum.auto()
AUX1_BAKED = enum.auto()
AUX2_BAKED = enum.auto()
AUX3_BAKED = enum.auto()
@property
def is_baked(self) -> bool:
return self.name.endswith("_BAKED")
@dataclasses.dataclass
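In the `AvatarTEIndex` enum above, `enum.auto()` continues numbering from the last explicit value, and `is_baked` keys off the member name. A tiny excerpt-style sketch (only three members; in the full table above, `HEAD_BAKED` really does land at index 8):

```python
import enum

class TEIndex(enum.IntEnum):
    """Three-member excerpt of the AvatarTEIndex pattern, not the full table."""
    HEAD_BODYPAINT = 0
    UPPER_SHIRT = enum.auto()  # continues from the previous value: 1
    HEAD_BAKED = 8             # bake channels sit later in the real table

    @property
    def is_baked(self) -> bool:
        # Bake-channel members are identified purely by naming convention
        return self.name.endswith("_BAKED")
```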

View File

@@ -9,6 +9,7 @@ import random
from typing import *
from hippolyzer.lib.base.datatypes import UUID, RawBytes
from hippolyzer.lib.base.helpers import create_logged_task
from hippolyzer.lib.base.message.data_packer import TemplateDataPacker
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.message.msgtypes import MsgType, PacketFlags
@@ -125,7 +126,7 @@ class XferManager:
direction=direction,
))
xfer = Xfer(xfer_id, direction=direction, turbo=turbo)
asyncio.create_task(self._pump_xfer_replies(xfer))
create_logged_task(self._pump_xfer_replies(xfer), "Xfer Pump")
return xfer
async def _pump_xfer_replies(self, xfer: Xfer):


@@ -13,7 +13,7 @@ import aiohttp
import multidict
from hippolyzer.lib.base.datatypes import Vector3, StringEnum
from hippolyzer.lib.base.helpers import proxify, get_resource_filename
from hippolyzer.lib.base.helpers import proxify, get_resource_filename, create_logged_task
from hippolyzer.lib.base.message.circuit import Circuit
from hippolyzer.lib.base.message.llsd_msg_serializer import LLSDMessageSerializer
from hippolyzer.lib.base.message.message import Message, Block
@@ -23,7 +23,7 @@ from hippolyzer.lib.base.message.udpdeserializer import UDPMessageDeserializer
from hippolyzer.lib.base.network.caps_client import CapsClient, CAPS_DICT
from hippolyzer.lib.base.network.transport import ADDR_TUPLE, Direction, SocketUDPTransport, AbstractUDPTransport
from hippolyzer.lib.base.settings import Settings, SettingDescriptor
from hippolyzer.lib.base.templates import RegionHandshakeReplyFlags, ChatType
from hippolyzer.lib.base.templates import RegionHandshakeReplyFlags, ChatType, ThrottleData
from hippolyzer.lib.base.transfer_manager import TransferManager
from hippolyzer.lib.base.xfer_manager import XferManager
from hippolyzer.lib.client.asset_uploader import AssetUploader
@@ -108,8 +108,9 @@ class HippoClientProtocol(asyncio.DatagramProtocol):
if should_handle:
self.session.message_handler.handle(message)
except:
LOG.exception("Failed in region message handler")
region.message_handler.handle(message)
LOG.exception("Failed in session message handler")
if should_handle:
region.message_handler.handle(message)
class HippoClientRegion(BaseClientRegion):
@@ -189,7 +190,7 @@ class HippoClientRegion(BaseClientRegion):
"RegionInfo",
Flags=(
RegionHandshakeReplyFlags.SUPPORTS_SELF_APPEARANCE
| RegionHandshakeReplyFlags.VOCACHE_IS_EMPTY
| RegionHandshakeReplyFlags.VOCACHE_CULLING_ENABLED
)
)
)
@@ -207,7 +208,15 @@ class HippoClientRegion(BaseClientRegion):
"Throttle",
GenCounter=0,
# Reasonable defaults, I guess
Throttles_=[207360.0, 165376.0, 33075.19921875, 33075.19921875, 682700.75, 682700.75, 269312.0],
Throttles_=ThrottleData(
resend=207360.0,
land=165376.0,
wind=33075.19921875,
cloud=33075.19921875,
task=682700.75,
texture=682700.75,
asset=269312.0
),
)
)
)
@@ -231,13 +240,13 @@ class HippoClientRegion(BaseClientRegion):
seed_resp.raise_for_status()
self.update_caps(await seed_resp.read_llsd())
self._eq_task = asyncio.create_task(self._poll_event_queue())
self._eq_task = create_logged_task(self._poll_event_queue(), "EQ Poll")
settings = self.session().session_manager.settings
if settings.AUTO_REQUEST_PARCELS:
_ = asyncio.create_task(self.parcel_manager.request_dirty_parcels())
_ = create_logged_task(self.parcel_manager.request_dirty_parcels(), "Parcel Request")
if settings.AUTO_REQUEST_MATERIALS:
_ = asyncio.create_task(self.objects.request_all_materials())
_ = create_logged_task(self.objects.request_all_materials(), "Request All Materials")
except Exception as e:
# Let consumers who were `await`ing the connected signal know there was an error
@@ -276,21 +285,25 @@ class HippoClientRegion(BaseClientRegion):
ack: Optional[int] = None
while True:
payload = {"ack": ack, "done": False}
async with self.caps_client.post("EventQueueGet", llsd=payload) as resp:
if resp.status != 200:
await asyncio.sleep(0.1)
continue
polled = await resp.read_llsd()
for event in polled["events"]:
if self._llsd_serializer.can_handle(event["message"]):
msg = self._llsd_serializer.deserialize(event)
else:
msg = Message.from_eq_event(event)
msg.sender = self.circuit_addr
msg.direction = Direction.IN
self.session().message_handler.handle(msg)
self.message_handler.handle(msg)
ack = polled["id"]
try:
async with self.caps_client.post("EventQueueGet", llsd=payload) as resp:
if resp.status != 200:
await asyncio.sleep(0.1)
continue
polled = await resp.read_llsd()
for event in polled["events"]:
if self._llsd_serializer.can_handle(event["message"]):
msg = self._llsd_serializer.deserialize(event)
else:
msg = Message.from_eq_event(event)
msg.sender = self.circuit_addr
msg.direction = Direction.IN
self.session().message_handler.handle(msg)
self.message_handler.handle(msg)
ack = polled["id"]
await asyncio.sleep(0.001)
except aiohttp.client_exceptions.ServerDisconnectedError:
# This is expected to happen during long-polling, just pick up again where we left off.
await asyncio.sleep(0.001)
async def _handle_ping_check(self, message: Message):
@@ -391,10 +404,10 @@ class HippoClientSession(BaseClientSession):
need_connect = (region.circuit and region.circuit.is_alive) or moving_to_region
self.open_circuit(sim_addr)
if need_connect:
asyncio.create_task(region.connect(main_region=moving_to_region))
create_logged_task(region.connect(main_region=moving_to_region), "Region Connect")
elif moving_to_region:
# No need to connect, but we do need to complete agent movement.
asyncio.create_task(region.complete_agent_movement())
create_logged_task(region.complete_agent_movement(), "CompleteAgentMovement")
class HippoClient(BaseClientSessionManager):
@@ -660,7 +673,7 @@ class HippoClient(BaseClientSessionManager):
self.session = HippoClientSession.from_login_data(login_data, self)
self.session.transport, self.session.protocol = await self._create_transport()
self._resend_task = asyncio.create_task(self._attempt_resends())
self._resend_task = create_logged_task(self._attempt_resends(), "Circuit Resend")
self.session.message_handler.subscribe("AgentDataUpdate", self._handle_agent_data_update)
self.session.message_handler.subscribe("AgentGroupDataUpdate", self._handle_agent_group_data_update)
@@ -742,7 +755,7 @@ class HippoClient(BaseClientSessionManager):
return
teleport_fut.set_result(None)
asyncio.create_task(_handle_teleport())
create_logged_task(_handle_teleport(), "Teleport")
return teleport_fut


@@ -1,6 +1,7 @@
from __future__ import annotations
import gzip
import itertools
import logging
from pathlib import Path
from typing import Union, List, Tuple, Set
@@ -8,6 +9,7 @@ from typing import Union, List, Tuple, Set
from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.inventory import InventoryModel, InventoryCategory, InventoryItem
from hippolyzer.lib.base.message.message import Message
from hippolyzer.lib.base.templates import AssetType, FolderType
from hippolyzer.lib.client.state import BaseClientSession
@@ -20,6 +22,10 @@ class InventoryManager:
self._session = session
self.model: InventoryModel = InventoryModel()
self._load_skeleton()
self._session.message_handler.subscribe("BulkUpdateInventory", self._handle_bulk_update_inventory)
self._session.message_handler.subscribe("UpdateCreateInventoryItem", self._handle_update_create_inventory_item)
self._session.message_handler.subscribe("RemoveInventoryItem", self._handle_remove_inventory_item)
self._session.message_handler.subscribe("MoveInventoryItem", self._handle_move_inventory_item)
def _load_skeleton(self):
assert not self.model.nodes
@@ -68,10 +74,8 @@ class InventoryManager:
# Cached cat isn't the same as what the inv server says it should be, can't use it.
if cached_cat.version != skel_versions.get(cached_cat.cat_id):
continue
if existing_cat:
# Remove the category so that we can replace it, but leave any children in place
self.model.unlink(existing_cat, single_only=True)
self.model.add(cached_cat)
# Update any existing category in-place, or add if not present
self.model.upsert(cached_cat)
# Any items in this category in our cache file are usable and should be added
loaded_cat_ids.add(cached_cat.cat_id)
@@ -81,10 +85,13 @@ class InventoryManager:
if cached_item.item_id in self.model:
continue
# The parent category didn't have a cache hit against the inventory skeleton, can't add!
# We don't even know if this item would be in the current version of its parent cat!
if cached_item.parent_id not in loaded_cat_ids:
continue
self.model.add(cached_item)
self.model.flag_if_dirty()
def _parse_cache(self, path: Union[str, Path]) -> Tuple[List[InventoryCategory], List[InventoryItem]]:
"""Warning, may be incredibly slow due to llsd.parse_notation() behavior"""
categories: List[InventoryCategory] = []
@@ -112,3 +119,90 @@ class InventoryManager:
else:
LOG.warning(f"Unknown node type in inv cache: {node_llsd!r}")
return categories, items
def _handle_bulk_update_inventory(self, msg: Message):
any_cats = False
for folder_block in msg["FolderData"]:
if folder_block["FolderID"] == UUID.ZERO:
continue
any_cats = True
self.model.upsert(
InventoryCategory.from_folder_data(folder_block),
# Don't clobber version, we only want to fetch the folder if it's new
# and hasn't just moved.
update_fields={"parent_id", "name", "pref_type"},
)
for item_block in msg["ItemData"]:
if item_block["ItemID"] == UUID.ZERO:
continue
self.model.upsert(InventoryItem.from_inventory_data(item_block))
if any_cats:
self.model.flag_if_dirty()
def _validate_recipient(self, recipient: UUID):
if self._session.agent_id != recipient:
raise ValueError(f"AgentID Mismatch {self._session.agent_id} != {recipient}")
def _handle_update_create_inventory_item(self, msg: Message):
self._validate_recipient(msg["AgentData"]["AgentID"])
for inventory_block in msg["InventoryData"]:
self.model.upsert(InventoryItem.from_inventory_data(inventory_block))
def _handle_remove_inventory_item(self, msg: Message):
self._validate_recipient(msg["AgentData"]["AgentID"])
for inventory_block in msg["InventoryData"]:
node = self.model.get(inventory_block["ItemID"])
if node:
self.model.unlink(node)
def _handle_remove_inventory_folder(self, msg: Message):
self._validate_recipient(msg["AgentData"]["AgentID"])
for folder_block in msg["FolderData"]:
node = self.model.get(folder_block["FolderID"])
if node:
self.model.unlink(node)
def _handle_move_inventory_item(self, msg: Message):
for inventory_block in msg["InventoryData"]:
node = self.model.get(inventory_block["ItemID"])
if not node:
LOG.warning(f"Missing inventory item {inventory_block['ItemID']}")
continue
if inventory_block["NewName"]:
node.name = str(inventory_block["NewName"])
node.parent_id = inventory_block['FolderID']
def process_aisv3_response(self, payload: dict):
if "name" in payload:
# Just a rough guess. Assume this response is updating something if there's
# a "name" key.
if InventoryCategory.ID_ATTR_AIS in payload:
if (cat_node := InventoryCategory.from_llsd(payload, flavor="ais")) is not None:
self.model.upsert(cat_node)
elif InventoryItem.ID_ATTR in payload:
if (item_node := InventoryItem.from_llsd(payload, flavor="ais")) is not None:
self.model.upsert(item_node)
else:
LOG.warning(f"Unknown node type in AIS payload: {payload!r}")
# Parse the embedded stuff
embedded_dict = payload.get("_embedded", {})
for category_llsd in embedded_dict.get("categories", {}).values():
self.model.upsert(InventoryCategory.from_llsd(category_llsd, flavor="ais"))
for item_llsd in embedded_dict.get("items", {}).values():
self.model.upsert(InventoryItem.from_llsd(item_llsd, flavor="ais"))
for link_llsd in embedded_dict.get("links", {}).values():
self.model.upsert(InventoryItem.from_llsd(link_llsd, flavor="ais"))
# Get rid of anything we were asked to
for node_id in itertools.chain(
payload.get("_broken_links_removed", ()),
payload.get("_removed_items", ()),
payload.get("_category_items_removed", ()),
payload.get("_categories_removed", ()),
):
node = self.model.get(node_id)
if node:
# Presumably this list is exhaustive, so don't unlink children.
self.model.unlink(node, single_only=True)
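The `upsert` + `update_fields` pattern used above (update an existing node in place, insert otherwise, and only touch the listed fields on update so things like `version` survive partial updates) can be sketched generically; the model internals below are assumptions, not the real `InventoryModel`:

```python
import dataclasses
from typing import Dict, Optional, Set


@dataclasses.dataclass
class Category:
    cat_id: str
    parent_id: str
    name: str
    version: int = 0


class Model:
    """Hypothetical stand-in for InventoryModel's upsert behaviour."""

    def __init__(self):
        self.nodes: Dict[str, Category] = {}

    def upsert(self, node: Category,
               update_fields: Optional[Set[str]] = None):
        existing = self.nodes.get(node.cat_id)
        if existing is None or update_fields is None:
            # New node, or the caller wants a full replace.
            self.nodes[node.cat_id] = node
            return
        # Update in place, copying only the requested fields so fields
        # like `version` aren't clobbered by a partial update.
        for field in update_fields:
            setattr(existing, field, getattr(node, field))


model = Model()
model.upsert(Category("c1", "root", "Clothing", version=5))
# A BulkUpdateInventory-style partial update: parent/name only, version kept.
model.upsert(Category("c1", "root2", "Outfits"),
             update_fields={"parent_id", "name"})
print(model.nodes["c1"].version)  # 5
print(model.nodes["c1"].name)     # Outfits
```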


@@ -24,7 +24,7 @@ from mitmproxy.http import HTTPFlow
from mitmproxy.proxy.layers import tls
import OpenSSL
from hippolyzer.lib.base.helpers import get_resource_filename
from hippolyzer.lib.base.helpers import get_resource_filename, create_logged_task
from hippolyzer.lib.base.multiprocessing_utils import ParentProcessWatcher
from hippolyzer.lib.proxy.caps import SerializedCapData
@@ -116,21 +116,9 @@ class IPCInterceptionAddon:
self.to_proxy_queue: multiprocessing.Queue = flow_context.to_proxy_queue
self.shutdown_signal: multiprocessing.Event = flow_context.shutdown_signal
def add_log(self, entry: mitmproxy.log.LogEntry):
if entry.level == "debug":
logging.debug(entry.msg)
elif entry.level in ("alert", "info"):
# TODO: All mitmproxy infos are basically debugs, should
# probably give these dedicated loggers
logging.debug(entry.msg)
elif entry.level == "warn":
logging.warning(entry.msg)
elif entry.level == "error":
logging.error(entry.msg)
def running(self):
# register to pump the events or something here
asyncio.create_task(self._pump_callbacks())
create_logged_task(self._pump_callbacks(), "Pump HTTP proxy callbacks")
# Tell the main process mitmproxy is ready to handle requests
self.mitmproxy_ready.set()


@@ -1,19 +1,58 @@
import asyncio
import datetime as dt
import functools
import logging
from typing import *
from hippolyzer.lib.base.helpers import get_mtime
from hippolyzer.lib.base import llsd
from hippolyzer.lib.base.helpers import get_mtime, create_logged_task
from hippolyzer.lib.client.inventory_manager import InventoryManager
from hippolyzer.lib.client.state import BaseClientSession
from hippolyzer.lib.proxy.http_flow import HippoHTTPFlow
from hippolyzer.lib.proxy.viewer_settings import iter_viewer_cache_dirs
if TYPE_CHECKING:
from hippolyzer.lib.proxy.sessions import Session
LOG = logging.getLogger(__name__)
class ProxyInventoryManager(InventoryManager):
def __init__(self, session: BaseClientSession):
_session: "Session"
def __init__(self, session: "Session"):
# These handlers all need their processing deferred until the cache has been loaded.
# Since the cache is loaded asynchronously, the viewer may parse its own cache faster,
# get ahead of us, and start requesting inventory details we can't do anything with yet.
self._handle_update_create_inventory_item = self._wrap_with_cache_defer(
self._handle_update_create_inventory_item
)
self._handle_remove_inventory_item = self._wrap_with_cache_defer(
self._handle_remove_inventory_item
)
self._handle_remove_inventory_folder = self._wrap_with_cache_defer(
self._handle_remove_inventory_folder
)
self._handle_bulk_update_inventory = self._wrap_with_cache_defer(
self._handle_bulk_update_inventory
)
self._handle_move_inventory_item = self._wrap_with_cache_defer(
self._handle_move_inventory_item
)
self.process_aisv3_response = self._wrap_with_cache_defer(
self.process_aisv3_response
)
# Call the base constructor after wrapping, because it registers handlers against
# specific methods, so the methods must be wrapped before they're registered.
# Handlers are registered by method reference, not by name!
super().__init__(session)
session.http_message_handler.subscribe("InventoryAPIv3", self._handle_aisv3_flow)
newest_cache = None
newest_timestamp = dt.datetime(year=1970, month=1, day=1, tzinfo=dt.timezone.utc)
# So consumers know when the inventory should be complete
self.cache_loaded: asyncio.Event = asyncio.Event()
self._cache_deferred_calls: List[Tuple[Callable[..., None], Tuple]] = []
# Look for the newest version of the cached inventory and use that.
# Not foolproof, but close enough if we're not sure what viewer is being used.
for cache_dir in iter_viewer_cache_dirs():
@@ -31,5 +70,38 @@ class ProxyInventoryManager(InventoryManager):
cache_load_fut = asyncio.ensure_future(asyncio.to_thread(self.load_cache, newest_cache))
# Meh. Don't care if it fails.
cache_load_fut.add_done_callback(lambda *args: self.cache_loaded.set())
create_logged_task(self._apply_deferred_after_loaded(), "Apply deferred inventory", LOG)
else:
self.cache_loaded.set()
async def _apply_deferred_after_loaded(self):
await self.cache_loaded.wait()
LOG.info("Applying deferred inventory calls")
deferred_calls = self._cache_deferred_calls[:]
self._cache_deferred_calls.clear()
for func, args in deferred_calls:
try:
func(*args)
except:
LOG.exception("Failed to apply deferred inventory call")
def _wrap_with_cache_defer(self, func: Callable[..., None]):
@functools.wraps(func)
def wrapped(*inner_args):
if not self.cache_loaded.is_set():
self._cache_deferred_calls.append((func, inner_args))
else:
func(*inner_args)
return wrapped
def _handle_aisv3_flow(self, flow: HippoHTTPFlow):
if flow.response.status_code < 200 or flow.response.status_code >= 300:
# Probably not a success
return
content_type = flow.response.headers.get("Content-Type", "")
if "llsd" not in content_type:
# Okay, probably still some kind of error...
return
# Try and add anything from the response into the model
self.process_aisv3_response(llsd.parse(flow.response.content))


@@ -16,6 +16,7 @@ from hippolyzer.lib.base.events import Event
from hippolyzer.lib.base.message.message_handler import MessageHandler
from hippolyzer.lib.base.objects import handle_to_gridxy
from .connection import VivoxConnection, VivoxMessage
from ..base.helpers import create_logged_task
LOG = logging.getLogger(__name__)
RESP_LOG = logging.getLogger(__name__ + ".responses")
@@ -79,7 +80,7 @@ class VoiceClient:
self._pos = Vector3(0, 0, 0)
self.vivox_conn: Optional[VivoxConnection] = None
self._poll_task = asyncio.create_task(self._poll_messages())
self._poll_task = create_logged_task(self._poll_messages(), "Poll Vivox messages")
self.event_handler: MessageHandler[VivoxMessage, str] = MessageHandler(take_by_default=False)
self.event_handler.subscribe(
@@ -352,7 +353,7 @@ class VoiceClient:
RESP_LOG.debug("%s %s %s %r" % ("Request", request_id, msg_type, data))
asyncio.create_task(self.vivox_conn.send_request(request_id, msg_type, data))
create_logged_task(self.vivox_conn.send_request(request_id, msg_type, data), "Send Vivox message")
future = asyncio.Future()
self._pending_req_futures[request_id] = future
return future

pyproject.toml (new file)

@@ -0,0 +1,69 @@
[build-system]
requires = ["setuptools>=64", "setuptools-scm>=8"]
build-backend = "setuptools.build_meta"
[project]
name = "hippolyzer"
dynamic = ["version"]
description = "Analysis tools for SL-compatible virtual worlds"
readme = "README.md"
license = "LGPL-3.0-only"
requires-python = ">=3.12"
authors = [
{ name = "Salad Dais", email = "83434023+SaladDais@users.noreply.github.com" },
]
classifiers = [
"Operating System :: MacOS",
"Operating System :: Microsoft :: Windows",
"Operating System :: POSIX",
"Programming Language :: Python :: 3 :: Only",
"Programming Language :: Python :: Implementation :: CPython",
"Programming Language :: Python :: 3.12",
"Programming Language :: Python :: 3.13",
"Topic :: Software Development :: Libraries :: Python Modules",
"Topic :: Software Development :: Testing",
"Topic :: System :: Networking :: Monitoring",
]
dependencies = [
"aiohttp<4.0.0",
"arpeggio",
"defusedxml",
"gltflib",
"Glymur<0.9.7",
"idna<3,>=2.5",
"lazy-object-proxy",
"llsd<1.1.0",
"mitmproxy>=11.0.0,<12",
"numpy<2.0",
"outleap<1.0",
"ptpython<4.0",
"pycollada",
"pyside6-essentials",
"qasync",
"recordclass>=0.23.1,<0.24",
"transformations",
"Werkzeug<4.0",
]
[tool.setuptools.packages.find]
where = ["."]
include = ["hippolyzer*"]
namespaces = false
[project.scripts]
hippolyzer-cli = "hippolyzer.apps.proxy:main"
hippolyzer-gui = "hippolyzer.apps.proxy_gui:gui_main"
[project.urls]
Homepage = "https://github.com/SaladDais/Hippolyzer/"
[tool.black]
line-length = 160
[tool.pytest.ini_options]
minversion = "6.0"
[tool.isort]
profile = "black"
[tool.setuptools_scm]


@@ -1,77 +1,81 @@
aiohttp==3.9.1
aioquic==0.9.25
aiosignal==1.3.1
aiohappyeyeballs==2.6.1
aiohttp==3.11.18
aioquic==1.2.0
aiosignal==1.3.2
appdirs==1.4.4
argon2-cffi==23.1.0
argon2-cffi-bindings==21.2.0
Arpeggio==2.0.2
asgiref==3.7.2
attrs==23.2.0
blinker==1.7.0
asgiref==3.8.1
attrs==25.3.0
blinker==1.9.0
Brotli==1.1.0
certifi==2023.11.17
cffi==1.16.0
click==8.1.7
cryptography==41.0.7
dataclasses-json==0.6.3
certifi==2025.4.26
cffi==1.17.1
click==8.2.0
cryptography==44.0.3
dataclasses-json==0.6.7
defusedxml==0.7.1
Flask==2.3.3
frozenlist==1.4.1
Flask==3.1.0
frozenlist==1.6.0
gltflib==1.0.13
Glymur==0.9.6
h11==0.14.0
h2==4.1.0
hpack==4.0.0
hyperframe==6.0.1
hpack==4.1.0
hyperframe==6.1.0
idna==2.10
itsdangerous==2.1.2
jedi==0.19.1
Jinja2==3.1.2
itsdangerous==2.2.0
jedi==0.19.2
Jinja2==3.1.6
kaitaistruct==0.10
lazy-object-proxy==1.10.0
lazy-object-proxy==1.11.0
ldap3==2.9.1
llsd==1.0.0
lxml==5.1.0
MarkupSafe==2.1.3
marshmallow==3.20.1
mitmproxy==10.2.1
mitmproxy_rs==0.5.1
msgpack==1.0.7
multidict==6.0.4
mypy-extensions==1.0.0
numpy==1.26.3
outleap==0.5.1
packaging==23.2
parso==0.8.3
lxml==5.4.0
MarkupSafe==3.0.2
marshmallow==3.26.1
mitmproxy==11.1.3
mitmproxy_linux==0.11.5
mitmproxy_rs==0.11.5
msgpack==1.1.0
multidict==6.4.4
mypy_extensions==1.1.0
numpy==1.26.4
outleap==0.7.1
packaging==25.0
parso==0.8.4
passlib==1.7.4
prompt-toolkit==3.0.43
protobuf==4.25.1
ptpython==3.0.25
prompt_toolkit==3.0.51
propcache==0.3.1
ptpython==3.0.30
publicsuffix2==2.20191221
pyasn1==0.5.1
pyasn1-modules==0.3.0
pycollada==0.8
pycparser==2.21
Pygments==2.17.2
pylsqpack==0.3.18
pyOpenSSL==23.3.0
pyparsing==3.1.1
pyperclip==1.8.2
PySide6-Essentials==6.6.1
python-dateutil==2.8.2
pyasn1==0.6.1
pyasn1_modules==0.4.2
pycollada==0.9
pycparser==2.22
Pygments==2.19.1
pylsqpack==0.3.22
pyOpenSSL==25.0.0
pyparsing==3.2.1
pyperclip==1.9.0
PySide6_Essentials==6.9.0
python-dateutil==2.9.0.post0
qasync==0.27.1
recordclass==0.18.2
ruamel.yaml==0.18.5
ruamel.yaml.clib==0.2.8
service-identity==23.1.0
shiboken6==6.6.1
six==1.16.0
recordclass==0.23.1
ruamel.yaml==0.18.10
service-identity==24.2.0
setuptools==80.7.1
shiboken6==6.9.0
six==1.17.0
sortedcontainers==2.4.0
tornado==6.4
transformations==2024.6.1
tornado==6.4.2
transformations==2025.1.1
typing-inspect==0.9.0
typing_extensions==4.9.0
urwid-mitmproxy==2.1.2.1
typing_extensions==4.13.2
urwid==2.6.16
wcwidth==0.2.13
Werkzeug==2.3.8
Werkzeug==3.1.3
wsproto==1.2.0
yarl==1.9.4
zstandard==0.22.0
yarl==1.20.0
zstandard==0.23.0


@@ -10,3 +10,10 @@ universal = 1
max-line-length = 160
exclude = build/*, .eggs/*
ignore = F405, F403, E501, F841, E722, W503, E741, E731
[options.extras_require]
test =
pytest
aioresponses
pytest-cov
flake8

setup.py

@@ -1,115 +1,6 @@
"""
Copyright 2008, Linden Research, Inc.
See NOTICE.md for previous contributors
Copyright 2021, Salad Dais
All Rights Reserved.
#!/usr/bin/env python3
This program is free software; you can redistribute it and/or
modify it under the terms of the GNU Lesser General Public
License as published by the Free Software Foundation; either
version 3 of the License, or (at your option) any later version.
from setuptools import setup
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
Lesser General Public License for more details.
You should have received a copy of the GNU Lesser General Public License
along with this program; if not, write to the Free Software Foundation,
Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
"""
from os import path
from setuptools import setup, find_packages
here = path.abspath(path.dirname(__file__))
version = '0.15.1'
with open(path.join(here, 'README.md')) as readme_fh:
readme = readme_fh.read()
setup(
name='hippolyzer',
version=version,
description="Analysis tools for SL-compatible virtual worlds",
long_description=readme,
long_description_content_type="text/markdown",
classifiers=[
"License :: OSI Approved :: GNU Lesser General Public License v3 or later (LGPLv3+)",
"Operating System :: MacOS",
"Operating System :: POSIX",
"Operating System :: Microsoft :: Windows",
"Programming Language :: Python :: 3 :: Only",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: Implementation :: CPython",
"Topic :: System :: Networking :: Monitoring",
"Topic :: Software Development :: Libraries :: Python Modules",
"Topic :: Software Development :: Testing",
],
author='Salad Dais',
author_email='83434023+SaladDais@users.noreply.github.com',
url='https://github.com/SaladDais/Hippolyzer/',
license='LGPLv3',
packages=find_packages(include=["hippolyzer", "hippolyzer.*"]),
package_data={
'hippolyzer': [
'apps/message_builder.ui',
'apps/proxy_mainwindow.ui',
'apps/filter_dialog.ui',
'apps/addon_dialog.ui',
'lib/base/message/data/message_template.msg',
'lib/base/message/data/message.xml',
'lib/base/network/data/ca-bundle.crt',
'lib/base/data/static_data.db2',
'lib/base/data/static_index.db2',
'lib/base/data/avatar_lad.xml',
'lib/base/data/male_collada_joints.xml',
'lib/base/data/avatar_skeleton.xml',
'lib/base/data/LICENSE-artwork.txt',
],
},
entry_points={
'console_scripts': {
'hippolyzer-gui = hippolyzer.apps.proxy_gui:gui_main',
'hippolyzer-cli = hippolyzer.apps.proxy:main',
}
},
zip_safe=False,
python_requires='>=3.10',
install_requires=[
'llsd<1.1.0',
'defusedxml',
'aiohttp<4.0.0',
# Newer recordclasses break!
'recordclass>0.15,<0.18.3',
'lazy-object-proxy',
# requests breaks with newer idna
'idna<3,>=2.5',
# Needed for mesh format conversion tooling
'pycollada',
'transformations',
'gltflib',
# JP2 codec
'Glymur<0.9.7',
'numpy<2.0',
# Proxy-specific stuff
'outleap<1.0',
'arpeggio',
# 11.x will be a major change.
'mitmproxy>=10.0.0,<11',
'Werkzeug<3.0',
# For REPLs
'ptpython<4.0',
# These could be in extras_require if you don't want a GUI.
'pyside6-essentials',
'qasync',
],
tests_require=[
"pytest",
"aioresponses",
],
)
if __name__ == "__main__":
setup()


@@ -163,3 +163,6 @@ class TestDatatypes(unittest.TestCase):
self.assertNotEqual(b"foo", val)
self.assertEqual(b"foo", JankStringyBytes(b"foo"))
self.assertEqual("foo", JankStringyBytes(b"foo"))
self.assertFalse(JankStringyBytes(b""))
self.assertFalse(JankStringyBytes(b"\x00"))
self.assertTrue(JankStringyBytes(b"\x01"))
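The truthiness assertions above imply that `JankStringyBytes` treats NUL-only payloads as empty and compares equal to both `str` and `bytes`. A guess at the semantics (a sketch, not the actual hippolyzer class):

```python
class StringyBytes(bytes):
    """Hypothetical sketch of JankStringyBytes-style semantics."""

    def __eq__(self, other):
        if isinstance(other, str):
            # Compare against the decoded form, ignoring a trailing NUL.
            return self.rstrip(b"\x00").decode("utf-8", "replace") == other
        return bytes(self) == other

    def __ne__(self, other):
        return not self.__eq__(other)

    # Defining __eq__ clears the inherited hash; restore bytes hashing.
    __hash__ = bytes.__hash__

    def __bool__(self):
        # b"" and b"\x00" both count as "no value".
        return bool(self.rstrip(b"\x00"))


print(StringyBytes(b"foo") == "foo")  # True
print(bool(StringyBytes(b"\x00")))    # False
print(bool(StringyBytes(b"\x01")))    # True
```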


@@ -49,3 +49,15 @@ class TestEvents(unittest.IsolatedAsyncioTestCase):
await called.wait()
mock.assert_called_with("foo")
self.assertNotIn(_mock_wrapper, [x[0] for x in self.event.subscribers])
async def test_multiple_subscribers(self):
called = asyncio.Event()
called2 = asyncio.Event()
self.event.subscribe(lambda *args: called.set())
self.event.subscribe(lambda *args: called2.set())
self.event.notify(None)
self.assertTrue(called.is_set())
self.assertTrue(called2.is_set())


@@ -181,6 +181,8 @@ class TestMessageHandlers(unittest.IsolatedAsyncioTestCase):
self.message_handler.handle(msg)
async def test_subscription(self):
called = asyncio.Event()
called2 = asyncio.Event()
with self.message_handler.subscribe_async(
message_names=("Foo",),
predicate=lambda m: m["Bar"]["Baz"] == 1,
@@ -192,6 +194,10 @@ class TestMessageHandlers(unittest.IsolatedAsyncioTestCase):
msg3 = Message("Foo", Block("Bar", Baz=1, Biz=3))
self._fake_received_message(msg1)
self._fake_received_message(msg2)
self.message_handler.subscribe("Foo", lambda *args: called.set())
self.message_handler.subscribe("Foo", lambda *args: called2.set())
self._fake_received_message(msg3)
received = []
while True:
@@ -199,14 +205,15 @@ class TestMessageHandlers(unittest.IsolatedAsyncioTestCase):
received.append(await asyncio.wait_for(get_msg(), 0.001))
except asyncio.exceptions.TimeoutError:
break
self.assertEqual(len(foo_handlers), 1)
self.assertEqual(len(foo_handlers), 3)
self.assertListEqual(received, [msg1, msg3])
# The message should have been take()n, making a copy
self.assertIsNot(msg1, received[0])
# take() was called, so this should have been marked queued
self.assertTrue(msg1.queued)
# Leaving the block should have unsubscribed automatically
self.assertEqual(len(foo_handlers), 0)
self.assertEqual(len(foo_handlers), 2)
self.assertTrue(called.is_set())
async def test_subscription_no_take(self):
with self.message_handler.subscribe_async(("Foo",), take=False) as get_msg:


@@ -50,6 +50,8 @@ OBJECT_UPDATE = binascii.unhexlify(''.join(OBJECT_UPDATE.split()))
COARSE_LOCATION_UPDATE = b'\x00\x00\x00\x00E\x00\xff\x06\x00\xff\xff\xff\xff\x00'
UNKNOWN_PACKET = b'\x00\x00\x00\x00E\x00\xff\xf0\x00\xff\xff\xff\xff\x00'
class TestPacketDecode(unittest.TestCase):
@@ -110,3 +112,12 @@ class TestPacketDecode(unittest.TestCase):
parsed = deserializer.deserialize(message)
logging.debug("Parsed blocks: %r " % (list(parsed.blocks.keys()),))
self.assertEqual(message, serializer.serialize(parsed))
def test_unknown_packet_roundtrips(self):
message = UNKNOWN_PACKET
deserializer = UDPMessageDeserializer(settings=self.settings)
serializer = UDPMessageSerializer()
parsed = deserializer.deserialize(message)
logging.debug("Parsed blocks: %r " % (list(parsed.blocks.keys()),))
self.assertEqual("UnknownMessage:240", parsed.name)
self.assertEqual(message, serializer.serialize(parsed))


@@ -4,6 +4,7 @@ import unittest
from typing import *
from hippolyzer.lib.base.datatypes import UUID
from hippolyzer.lib.base.helpers import create_logged_task
from hippolyzer.lib.base.message.message import Block, Message
from hippolyzer.lib.base.message.message_handler import MessageHandler
from hippolyzer.lib.base.templates import (
@@ -61,7 +62,7 @@ class XferManagerTests(BaseTransferTests):
))
async def test_small_xfer_upload(self):
asyncio.create_task(self._handle_vfile_upload())
_ = create_logged_task(self._handle_vfile_upload())
await asyncio.wait_for(self.xfer_manager.upload_asset(
AssetType.BODYPART, self.SMALL_PAYLOAD
), timeout=0.1)
@@ -69,7 +70,7 @@ class XferManagerTests(BaseTransferTests):
async def test_large_xfer_upload(self):
# Larger payloads take a different path
asyncio.create_task(self._handle_vfile_upload())
_ = create_logged_task(self._handle_vfile_upload())
await asyncio.wait_for(self.xfer_manager.upload_asset(
AssetType.BODYPART, self.LARGE_PAYLOAD
), timeout=0.1)
@@ -125,7 +126,7 @@ class TestTransferManager(BaseTransferTests):
packet_num += 1
async def test_simple_transfer(self):
asyncio.create_task(self._handle_covenant_download())
_ = create_logged_task(self._handle_covenant_download())
transfer: Transfer = await asyncio.wait_for(self.transfer_manager.request(
source_type=TransferSourceType.SIM_ESTATE,
params=TransferRequestParamsSimEstate(


@@ -12,6 +12,7 @@ from yarl import URL
from hippolyzer.apps.proxy import run_http_proxy_process
from hippolyzer.lib.base.datatypes import Vector3
from hippolyzer.lib.base.helpers import create_logged_task
from hippolyzer.lib.proxy.addon_utils import BaseAddon
from hippolyzer.lib.proxy.addons import AddonManager
from hippolyzer.lib.proxy.http_event_manager import MITMProxyEventManager
@@ -164,7 +165,7 @@ class TestMITMProxy(BaseProxyTest):
def test_mitmproxy_works(self):
async def _request_example_com():
# Pump callbacks from mitmproxy
asyncio.create_task(self.http_event_manager.run())
_ = create_logged_task(self.http_event_manager.run())
try:
async with self.caps_client.get("http://example.com/", timeout=0.5) as resp:
self.assertIn(b"Example Domain", await resp.read())


@@ -21,6 +21,9 @@ from hippolyzer.lib.proxy.sessions import Session
from hippolyzer.lib.proxy.test_utils import BaseProxyTest
UNKNOWN_PACKET = b'\x00\x00\x00\x00E\x00\xff\xf0\x00\xff\xff\xff\xff\x00'
class MockAddon(BaseAddon):
def __init__(self):
self.events = []
@@ -242,6 +245,21 @@ class LLUDPIntegrationTests(BaseProxyTest):
self.assertEqual(entry.name, "UndoLand")
self.assertEqual(entry.message.dropped, True)
async def test_logging_unknown_message(self):
message_logger = SimpleMessageLogger()
self.session_manager.message_logger = message_logger
self._setup_default_circuit()
self.protocol.datagram_received(UNKNOWN_PACKET, self.region_addr)
await self._wait_drained()
entries = message_logger.entries
self.assertEqual(len(entries), 1)
entry: LLUDPMessageLogEntry = entries[0] # type: ignore
# Freezing shouldn't affect this
entry.freeze()
self.assertEqual(entry.name, "UnknownMessage:240")
self.assertEqual(entry.message.dropped, False)
self.assertEqual(entry.message.unknown_message, True)
async def test_session_message_handler(self):
self._setup_default_circuit()
obj_update = self._make_objectupdate_compressed(1234)


@@ -6,6 +6,7 @@ from typing import *
from unittest import mock
from hippolyzer.lib.base.datatypes import *
from hippolyzer.lib.base.helpers import create_logged_task
from hippolyzer.lib.base.message.message import Block, Message as Message
from hippolyzer.lib.base.message.udpdeserializer import UDPMessageDeserializer
from hippolyzer.lib.base.message.udpserializer import UDPMessageSerializer
@@ -620,7 +621,7 @@ class SessionObjectManagerTests(ObjectManagerTestMixin, unittest.IsolatedAsyncio
async def _create_after():
await asyncio.sleep(0.001)
self._create_object(region_handle=123, local_id=child.ParentID)
asyncio.create_task(_create_after())
_ = create_logged_task(_create_after())
await self.session.objects.load_ancestors(child)
await self.session.objects.load_ancestors(parentless)