# Agent Development Guide

## Development Environment

This project uses:

- **mise** (mise.jdx.dev) - tool version manager and task runner
- **uv** - fast Python package installer
- **ruff** - linter and formatter (line-length 88, target Python 3.10)
- **pytest** - test runner with strict marker enforcement

### Setup

```bash
mise run install  # Install dependencies
# Or: uv sync --all-extras  # includes mic, websocket support
```

### Available Commands

```bash
# Testing
mise run test      # Run all tests
mise run test-cov  # Run tests with coverage report
pytest tests/test_foo.py::TestClass::test_method  # Run single test

# Linting & Formatting
mise run lint      # Run ruff linter
mise run lint-fix  # Run ruff with auto-fix
mise run format    # Run ruff formatter

# CI
mise run ci        # Full CI pipeline (topics-init + lint + test-cov)
```

### Running a Single Test

```bash
# Run a specific test function
pytest tests/test_eventbus.py::TestEventBusInit::test_init_creates_empty_subscribers

# Run all tests in a file
pytest tests/test_eventbus.py

# Run tests matching a pattern
pytest -k "test_subscribe"
```

### Git Hooks

Install hooks at the start of a session:

```bash
ls -la .git/hooks/pre-commit  # Verify installed
hk init --mise                # Install if missing
mise run pre-commit           # Run manually
```

## Code Style Guidelines

### Imports (three sections, alphabetical within each)

```python
# 1. Standard library
import os
import threading
from abc import ABC, abstractmethod
from collections import defaultdict
from collections.abc import Callable
from dataclasses import dataclass, field
from typing import Any

# 2. Third-party
import pytest

# 3. Local project
from engine.events import EventType
```

### Type Hints

- Use type hints for all function signatures (parameters and return)
- Use `|` for unions (Python 3.10+): `EventType | None`
- Use `dict[K, V]`, `list[V]` (generic syntax): `dict[str, list[int]]`
- Use `Callable[[ArgType], ReturnType]` for callbacks

```python
def subscribe(self, event_type: EventType, callback: Callable[[Any], None]) -> None: ...

def get_sensor_value(self, sensor_name: str) -> float | None:
    return self._state.get(f"sensor.{sensor_name}")
```

### Naming Conventions

- **Classes**: `PascalCase` (e.g., `EventBus`, `EffectPlugin`)
- **Functions/methods**: `snake_case` (e.g., `get_event_bus`, `process_partial`)
- **Constants**: `SCREAMING_SNAKE_CASE` (e.g., `CURSOR_OFF`)
- **Private methods**: `_snake_case` prefix (e.g., `_initialize`)
- **Type variables**: `PascalCase` (e.g., `T`, `EffectT`)

### Dataclasses

Use `@dataclass` for simple data containers:

```python
@dataclass
class EffectContext:
    terminal_width: int
    terminal_height: int
    scroll_cam: int
    ticker_height: int = 0
    _state: dict[str, Any] = field(default_factory=dict, repr=False)
```

### Abstract Base Classes

Use ABC for interface enforcement:

```python
class EffectPlugin(ABC):
    name: str
    config: EffectConfig

    @abstractmethod
    def process(self, buf: list[str], ctx: EffectContext) -> list[str]: ...

    @abstractmethod
    def configure(self, config: EffectConfig) -> None: ...
```

### Error Handling

- Catch specific exceptions, not bare `Exception`
- Use `try/except` with fallbacks for optional features
- Silent pass in event callbacks to prevent one handler from breaking others

```python
# Good: specific exception
try:
    term_size = os.get_terminal_size()
except OSError:
    term_width = 80

# Good: silent pass in callbacks
for callback in callbacks:
    try:
        callback(event)
    except Exception:
        pass
```

### Thread Safety

Use locks for shared state:

```python
class EventBus:
    def __init__(self):
        self._lock = threading.Lock()

    def publish(self, event_type: EventType, event: Any = None) -> None:
        with self._lock:
            callbacks = list(self._subscribers.get(event_type, []))
```

### Comments

- **DO NOT ADD comments** unless explicitly required
- Let code be self-documenting with good naming
- Use docstrings only for public APIs or complex logic

### Testing Patterns

Follow pytest conventions:

```python
class TestEventBusSubscribe:
    """Tests for EventBus.subscribe method."""

    def test_subscribe_adds_callback(self):
        """subscribe() adds a callback for an event type."""
        bus = EventBus()

        def callback(e):
            return None

        bus.subscribe(EventType.NTFY_MESSAGE, callback)
        assert bus.subscriber_count(EventType.NTFY_MESSAGE) == 1
```

- Use classes to group related tests (e.g., `TestEventBusInit`, `TestEventBusSubscribe`)
- Test docstrings follow the `"method() expected behavior"` pattern
- Rely on pytest's assertion introspection for descriptive failure messages

## Workflow Rules

### Before Committing

1. Run tests: `mise run test`
2. Run linter: `mise run lint`
3. Review changes: `git diff`

### On Failing Tests

- **Out-of-date test**: Update the test to match the new expected behavior
- **Correctly failing test**: Fix the implementation, not the test

**Never** modify a test to make it pass without understanding why it failed.

## Testing

Tests live in `tests/` and follow the pattern `test_*.py`.

Run all tests:

```bash
mise run test
```

Run with coverage:

```bash
mise run test-cov
```

The project uses pytest with strict marker enforcement.
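For reference, strict marker enforcement means pytest rejects any `@pytest.mark` name not registered in configuration. A minimal sketch of what that looks like in `pyproject.toml` (the exact options and marker list in this repo may differ; only the `benchmark` marker is confirmed by this guide):

```toml
[tool.pytest.ini_options]
addopts = "--strict-markers"
markers = [
    "benchmark: performance regression tests",
]
```

With `--strict-markers`, a typo such as `@pytest.mark.benchmork` fails collection instead of silently creating a new marker.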
Test configuration is in `pyproject.toml` under `[tool.pytest.ini_options]`.

### Test Coverage Strategy

Current coverage: 56% (463 tests)

Key areas with lower coverage (acceptable for now):

- **app.py** (8%): Main entry point - integration heavy, requires terminal
- **scroll.py** (10%): Terminal-dependent rendering logic (unused)

Key areas with good coverage:

- **display/backends/null.py** (95%): Easy to test headlessly
- **display/backends/terminal.py** (96%): Uses mocking
- **display/backends/multi.py** (100%): Simple forwarding logic
- **effects/performance.py** (99%): Pure Python logic
- **eventbus.py** (96%): Simple event system
- **effects/controller.py** (95%): Effects command handling

Areas needing more tests:

- **websocket.py** (48%): Network I/O, hard to test in CI
- **ntfy.py** (50%): Network I/O, hard to test in CI
- **mic.py** (61%): Audio I/O, hard to test in CI

Note: Terminal-dependent modules (scroll, layers render) are harder to test in CI.

Performance regression tests are in `tests/test_benchmark.py` with `@pytest.mark.benchmark`.

## Architecture Notes

- **ntfy.py** - standalone notification poller with zero internal dependencies
- **sensors/** - Sensor framework (MicSensor, OscillatorSensor) for real-time input
- **eventbus.py** - provides thread-safe event publishing for decoupled communication
- **effects/** - plugin architecture with performance monitoring
- The new pipeline architecture: source → render → effects → display

#### Canvas & Camera

- **Canvas** (`engine/canvas.py`): 2D rendering surface with dirty region tracking
- **Camera** (`engine/camera.py`): Viewport controller for scrolling content

The Canvas tracks dirty regions automatically when content is written (via `put_region`, `put_text`, `fill`), enabling partial buffer updates for optimized effect processing.
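The dirty-region tracking described above can be sketched roughly as follows. This is an illustrative stand-in, not the real Canvas: only the method names `put_text` and `fill` come from this guide, and `take_dirty` plus row-level (rather than region-level) granularity are assumptions for brevity.

```python
from dataclasses import dataclass, field


@dataclass
class Canvas:
    width: int
    height: int
    _rows: list[list[str]] = field(init=False, repr=False)
    _dirty_rows: set[int] = field(init=False, default_factory=set, repr=False)

    def __post_init__(self) -> None:
        self._rows = [[" "] * self.width for _ in range(self.height)]

    def put_text(self, x: int, y: int, text: str) -> None:
        # Write text and mark the touched row dirty
        if not 0 <= y < self.height:
            return
        for i, ch in enumerate(text):
            if 0 <= x + i < self.width:
                self._rows[y][x + i] = ch
        self._dirty_rows.add(y)

    def fill(self, ch: str) -> None:
        for y in range(self.height):
            self._rows[y] = [ch] * self.width
        self._dirty_rows.update(range(self.height))

    def take_dirty(self) -> set[int]:
        # Hand the dirty set to effects/display, then reset it for the next frame
        dirty, self._dirty_rows = self._dirty_rows, set()
        return dirty
```

Because writes mark their own rows, downstream effects only need to reprocess `take_dirty()` rows instead of the full buffer.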
### Pipeline Architecture

The Stage-based pipeline architecture provides capability-based dependency resolution:

- **Stage** (`engine/pipeline/core.py`): Base class for pipeline stages
- **Pipeline** (`engine/pipeline/controller.py`): Executes stages with capability-based dependency resolution
- **PipelineConfig** (`engine/pipeline/controller.py`): Configuration for a pipeline instance
- **StageRegistry** (`engine/pipeline/registry.py`): Discovers and registers stages
- **Stage Adapters** (`engine/pipeline/adapters.py`): Wrap existing components as stages

#### Pipeline Configuration

The `PipelineConfig` dataclass configures pipeline behavior:

```python
@dataclass
class PipelineConfig:
    source: str = "headlines"   # Data source identifier
    display: str = "terminal"   # Display backend identifier
    camera: str = "vertical"    # Camera mode identifier
    effects: list[str] = field(default_factory=list)  # List of effect names
    enable_metrics: bool = True  # Enable performance metrics
```

**Available sources**: `headlines`, `poetry`, `empty`, `list`, `image`, `metrics`, `cached`, `transform`, `composite`, `pipeline-inspect`

**Available displays**: `terminal`, `null`, `replay`, `websocket`, `pygame`, `moderngl`, `multi`

**Available camera modes**: `FEED`, `SCROLL`, `HORIZONTAL`, `OMNI`, `FLOATING`, `BOUNCE`, `RADIAL`

#### Capability-Based Dependencies

Stages declare capabilities (what they provide) and dependencies (what they need). The Pipeline resolves dependencies using prefix matching:

- `"source"` matches `"source.headlines"`, `"source.poetry"`, etc.
- `"camera.state"` matches the camera state capability
- This allows flexible composition without hardcoding specific stage names

#### Minimum Capabilities

The pipeline requires these minimum capabilities to function:

- `"source"` - Data source capability
- `"render.output"` - Rendered content capability
- `"display.output"` - Display output capability
- `"camera.state"` - Camera state for viewport filtering

These are injected automatically if missing by the `ensure_minimum_capabilities()` method.

#### Sensor Framework

- **Sensor** (`engine/sensors/__init__.py`): Base class for real-time input sensors
- **SensorRegistry**: Discovers available sensors
- **SensorStage**: Pipeline adapter that provides sensor values to effects
- **MicSensor** (`engine/sensors/mic.py`): Self-contained microphone input
- **OscillatorSensor** (`engine/sensors/oscillator.py`): Test sensor for development
- **PipelineMetricsSensor** (`engine/sensors/pipeline_metrics.py`): Exposes pipeline metrics as sensor values

Sensors support param bindings to drive effect parameters in real time.
#### Pipeline Introspection

- **PipelineIntrospectionSource** (`engine/data_sources/pipeline_introspection.py`): Renders a live ASCII visualization of the pipeline DAG with metrics
- **PipelineIntrospectionDemo** (`engine/pipeline/pipeline_introspection_demo.py`): 3-phase demo controller for effect animation

Preset: `pipeline-inspect` - Live pipeline introspection with DAG and performance metrics

#### Partial Update Support

Effect plugins can opt in to partial buffer updates for performance optimization:

- Set `supports_partial_updates = True` on the effect class
- Implement the `process_partial(buf, ctx, partial)` method
- The `PartialUpdate` dataclass indicates which regions changed

### Preset System

Presets use TOML format (no external dependencies):

- Built-in: `engine/presets.toml`
- User config: `~/.config/mainline/presets.toml`
- Local override: `./presets.toml`

Components:

- **Preset loader** (`engine/pipeline/preset_loader.py`): Loads and validates presets
- **PipelinePreset** (`engine/pipeline/presets.py`): Dataclass for preset configuration

Functions:

- `validate_preset()` - Validate preset structure
- `validate_signal_path()` - Detect circular dependencies
- `generate_preset_toml()` - Generate a skeleton preset

### Display System

- **Display abstraction** (`engine/display/`): swap display backends via the Display protocol
  - `display/backends/terminal.py` - ANSI terminal output
  - `display/backends/websocket.py` - broadcasts to web clients via WebSocket
  - `display/backends/null.py` - headless display for testing
  - `display/backends/multi.py` - forwards to multiple displays simultaneously
  - `display/backends/moderngl.py` - GPU-accelerated OpenGL rendering (optional)
  - `display/__init__.py` - DisplayRegistry for backend discovery
- **WebSocket display** (`engine/display/backends/websocket.py`): real-time frame broadcasting to web browsers
  - WebSocket server on port 8765
  - HTTP server on port 8766 (serves the HTML client)
  - Client at `client/index.html` with ANSI color parsing and fullscreen support
- **Display modes** (`--display` flag):
  - `terminal` - Default ANSI terminal output
  - `websocket` - Web browser display (requires the websockets package)
  - `moderngl` - GPU-accelerated rendering (requires the moderngl package)

### Effect Plugin System

- **EffectPlugin ABC** (`engine/effects/types.py`): abstract base class for effects
- All effects must inherit from EffectPlugin and implement `process()` and `configure()`
- Runtime discovery via `effects_plugins/__init__.py` using `issubclass()` checks
- **EffectRegistry** (`engine/effects/registry.py`): manages registered effects
- **EffectChain** (`engine/effects/chain.py`): chains effects in pipeline order

### Command & Control

- C&C uses separate ntfy topics for commands and responses
  - `NTFY_CC_CMD_TOPIC` - commands from cmdline.py
  - `NTFY_CC_RESP_TOPIC` - responses back to cmdline.py
- Effects controller handles `/effects` commands (list, on/off, intensity, reorder, stats)

### Pipeline Documentation

The rendering pipeline is documented in `docs/PIPELINE.md` using Mermaid diagrams.

**IMPORTANT**: When making significant architectural changes to the rendering pipeline (new layers, effects, display backends), update `docs/PIPELINE.md` to reflect the changes:

1. Edit `docs/PIPELINE.md` with the new architecture
2. If adding new SVG diagrams, render them manually using an external tool (e.g., Mermaid Live Editor)
3. Commit both the markdown and any new diagram files

### Pipeline Mutation API

The Pipeline class supports dynamic mutation at runtime via the mutation API:

**Core Methods:**

- `add_stage(name, stage, initialize=True)` - Add a stage to the pipeline
- `remove_stage(name, cleanup=True)` - Remove a stage and rebuild the execution order
- `replace_stage(name, new_stage, preserve_state=True)` - Replace a stage with another
- `swap_stages(name1, name2)` - Swap two stages
- `move_stage(name, after=None, before=None)` - Move a stage in the execution order
- `enable_stage(name)` - Enable a stage
- `disable_stage(name)` - Disable a stage

**New Methods (Issue #35):**

- `cleanup_stage(name)` - Clean up a specific stage without removing it
- `remove_stage_safe(name, cleanup=True)` - Alias for `remove_stage` that explicitly rebuilds
- `can_hot_swap(name)` - Check whether a stage can be safely hot-swapped
  - Returns False for stages that are the sole provider of a minimum capability
  - Returns True for swappable stages

**WebSocket Commands:**

Commands can be sent via WebSocket to mutate the pipeline at runtime:

```json
{"action": "remove_stage", "stage": "stage_name"}
{"action": "swap_stages", "stage1": "name1", "stage2": "name2"}
{"action": "enable_stage", "stage": "stage_name"}
{"action": "disable_stage", "stage": "stage_name"}
{"action": "cleanup_stage", "stage": "stage_name"}
{"action": "can_hot_swap", "stage": "stage_name"}
```

**Implementation Files:**

- `engine/pipeline/controller.py` - Pipeline class with mutation methods
- `engine/app/pipeline_runner.py` - `_handle_pipeline_mutation()` function
- `engine/pipeline/ui.py` - `execute_command()` with docstrings
- `tests/test_pipeline_mutation_commands.py` - Integration tests

## Skills Library

A skills library MCP server (`skills`) is available for capturing and tracking learned knowledge. Skills are stored in `~/.skills/`.

### Workflow

**Before starting work:**

1. Run `local_skills_list_skills` to see available skills
2. Use `local_skills_peek_skill({name: "skill-name"})` to preview relevant skills
3. Use `local_skills_skill_slice({name: "skill-name", query: "your question"})` to get relevant sections

**While working:**

- If a skill was wrong or incomplete: `local_skills_update_skill` → `local_skills_record_assessment` → `local_skills_report_outcome({quality: 1})`
- If a skill worked correctly: `local_skills_report_outcome({quality: 4})` (normal) or `quality: 5` (perfect)

**End of session:**

- Run `local_skills_reflect_on_session({context_summary: "what you did"})` to identify new skills to capture
- Use `local_skills_create_skill` to add new skills
- Use `local_skills_record_assessment` to score them

### Useful Tools

- `local_skills_review_stale_skills()` - Skills due for review (negative `days_until_due`)
- `local_skills_skills_report()` - Overview of the entire collection
- `local_skills_validate_skill({name: "skill-name"})` - Load a skill for review with sources

### Agent Skills

This project also has Agent Skills (SKILL.md files) in `.opencode/skills/`. Use the `skill` tool to load them:

- `skill({name: "mainline-architecture"})` - Pipeline stages, capability resolution
- `skill({name: "mainline-effects"})` - How to add new effect plugins
- `skill({name: "mainline-display"})` - Display backend implementation
- `skill({name: "mainline-sources"})` - Adding new RSS feeds
- `skill({name: "mainline-presets"})` - Creating pipeline presets
- `skill({name: "mainline-sensors"})` - Sensor framework usage
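## Appendix: Effect Plugin Sketch

Putting together the EffectPlugin contract (Code Style Guidelines) and the partial-update opt-in (Partial Update Support), a new effect might look roughly like this. The stand-in dataclasses are reduced to what the sketch needs, the `InvertEffect` class is hypothetical, and `PartialUpdate` carrying a `dirty_rows` set is an assumption; only `supports_partial_updates`, `process`, `process_partial`, and `configure` come from this guide.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass, field


# Stand-ins for the real engine types, reduced to what this sketch needs
@dataclass
class EffectConfig:
    intensity: float = 1.0


@dataclass
class EffectContext:
    terminal_width: int
    terminal_height: int
    scroll_cam: int = 0


@dataclass
class PartialUpdate:
    dirty_rows: set[int] = field(default_factory=set)  # assumed shape


class EffectPlugin(ABC):
    name: str
    config: EffectConfig

    @abstractmethod
    def process(self, buf: list[str], ctx: EffectContext) -> list[str]: ...

    @abstractmethod
    def configure(self, config: EffectConfig) -> None: ...


class InvertEffect(EffectPlugin):
    """Hypothetical effect: reverses each line of the buffer."""

    name = "invert"
    supports_partial_updates = True  # opt in to partial processing

    def configure(self, config: EffectConfig) -> None:
        self.config = config

    def process(self, buf: list[str], ctx: EffectContext) -> list[str]:
        return [line[::-1] for line in buf]

    def process_partial(
        self, buf: list[str], ctx: EffectContext, partial: PartialUpdate
    ) -> list[str]:
        # Only reprocess rows the canvas marked dirty; pass the rest through
        return [
            line[::-1] if y in partial.dirty_rows else line
            for y, line in enumerate(buf)
        ]
```

The pay-off of the opt-in is visible in `process_partial`: unchanged rows are forwarded untouched, so a mostly static frame costs almost nothing.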