Block 2 — Gitea webhooks → gateway (#77):
- New server/webhooks.py: translates push/issue/PR/comment events to Matrix messages
- Gateway integration: POST /api/webhook/gitea endpoint
- Bot filtering (hermes, kimi, manus), HMAC signature verification
- 17/17 tests pass

Block 3 — Self-triggering research (#78):
- _evaluate_research_trigger() in bridge.py
- Pattern matching for question-like thoughts ("I wonder", "How does", etc.)
- Cooldown (10 min), seed-type filter, active-lock safeguards
- _extract_research_topic() extracts a concise topic from thought content
- 6 new tests in test_bridge.py (14 → 17 total)

Block 4 — Model fallback chain (#79):
- New server/ollama_client.py: resilient Ollama client
- Configurable model_chain with auto-retry and model health tracking
- Integrated into ResearchEngine (replaces raw httpx, backward compatible)
- health_check() and status() for monitoring
- 11/11 tests pass; 21/21 research tests still pass

Block 5 — Bridge as SensoryBus subscriber (#80):
- register_on_bus() subscribes to 7 SensoryBus event types
- Adapter methods translate SensoryEvent → Matrix protocol messages
- Ready for Timmy dashboard integration via get_sensory_bus()
- 3 new bus integration tests in test_bridge.py (17 total)

PROTOCOL.md updated with all new capabilities.
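The signature check mentioned in Block 2 can be sketched as follows. Gitea sends a hex HMAC-SHA256 digest of the raw request body in the X-Gitea-Signature header; the function name here is illustrative, not the actual code in server/webhooks.py:

```python
import hashlib
import hmac

def verify_gitea_signature(secret: str, body: bytes, signature: str) -> bool:
    """Check Gitea's X-Gitea-Signature header (hex HMAC-SHA256 of the raw body)."""
    expected = hmac.new(secret.encode(), body, hashlib.sha256).hexdigest()
    # compare_digest does a constant-time comparison to avoid timing leaks
    return hmac.compare_digest(expected, signature)
```

Rejecting requests that fail this check is what keeps the POST /api/webhook/gitea endpoint from accepting forged events.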
Matrix Server Infrastructure
Backend services for The Matrix 3D world.
Components
Gateway (gateway.py) — Issue #70
Central WebSocket hub that routes messages between all Matrix participants:
Timmy's Dashboard ──ws──→ ┐
Perplexity Bot    ──ws──→ ├─ Gateway ──ws──→ Matrix 3D Client (iPad)
Future agents     ──ws──→ ┘
Quick start:
pip install websockets
python server/gateway.py --port 8765
With auth:
python server/gateway.py --port 8765 --token my-secret
REST status:
pip install aiohttp # optional
curl http://localhost:8766/api/gateway/status
The gateway is a pure message router — it doesn't interpret messages, just
forwards them based on target_id (directed) or broadcasts to all clients.
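A minimal sketch of that routing rule (the message shape, field names, and queue-based delivery are assumptions inferred from this description, not the actual gateway.py internals):

```python
import asyncio
import json

class Router:
    """Pure message router: forward by target_id, else broadcast (sketch)."""

    def __init__(self):
        self.clients = {}  # client_id -> outbound asyncio.Queue

    def register(self, client_id: str) -> asyncio.Queue:
        q = asyncio.Queue()
        self.clients[client_id] = q
        return q

    async def route(self, sender_id: str, raw: str) -> None:
        msg = json.loads(raw)
        target = msg.get("target_id")
        if target is not None:
            # Directed: deliver only to the named client, if connected
            if target in self.clients:
                await self.clients[target].put(msg)
        else:
            # Broadcast: everyone except the sender
            for cid, q in self.clients.items():
                if cid != sender_id:
                    await q.put(msg)
```

Because the router never inspects message payloads, new agent types can join without any gateway changes.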
Cognitive Bridge (bridge.py) — Issue #69
Translates Timmy's cognitive events into Matrix protocol messages:
| Timmy Event | Matrix Action |
|---|---|
| think_once() fires | agent_behavior:ponder + bark with thought excerpt |
| Mood changes | agent_state with mapped state + glow |
| Energy drops below 0.3 | agent_state:idle override |
| Gitea issue filed | agent_behavior:place + bark |
| Visitor enters | agent_state:active + agent_behavior:converse |
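One way the mood/energy mapping in the table could be structured (only the below-0.3 idle override comes from the table; the mood names and their mapped states are illustrative placeholders, not bridge.py's actual table):

```python
# Illustrative mood -> agent_state mapping. Only the 0.3 idle override
# is documented above; the mood names here are assumed examples.
MOOD_TO_STATE = {
    "curious": "active",
    "calm": "idle",
    "excited": "active",
}

def map_state(mood: str, energy: float) -> str:
    if energy < 0.3:
        return "idle"  # low energy overrides any mood-derived state
    return MOOD_TO_STATE.get(mood, "idle")
```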
Standalone mode (polls presence.json):
python server/bridge.py --gateway ws://localhost:8765 --presence ~/.timmy/presence.json
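The presence file's schema isn't documented here; a plausible minimal presence.json, with every field an assumption inferred from the on_thought/on_state_change payloads shown under integrated mode, might look like:

```json
{
  "mood": "curious",
  "engagement": "deep",
  "energy": 0.9,
  "last_thought": {
    "content": "...",
    "seed_type": "existential"
  }
}
```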
Integrated mode (imported into Timmy's dashboard):
from server.bridge import CognitiveBridge
bridge = CognitiveBridge(gateway_url="ws://localhost:8765")
await bridge.connect()
# Call from thinking.py._broadcast()
await bridge.on_thought({"content": "...", "seed_type": "existential"})
# Call from cognitive_state.py on change
await bridge.on_state_change({"mood": "curious", "engagement": "deep", "energy": 0.9})
Interview Bot (../bot/interview.py)
Existing bot that connects to the gateway as "perplexity" and interviews Timmy.
Testing
# Gateway unit tests (10 tests)
python server/test_gateway.py
# Bridge integration tests (8 tests)
python server/test_bridge.py
Connecting the Matrix 3D Client
Pass the gateway URL when loading the Matrix:
https://the-matrix.example.com/?ws=ws://gateway-host:8765&token=my-secret
The client will connect to the gateway instead of Timmy's dashboard directly. In this topology, the gateway is the central hub and all communication flows through it.