forked from genewildish/Mainline
MAJOR REFACTORING: Consolidate duplicated pipeline code and standardize on capability-based dependency resolution. This is a significant but backwards-compatible restructuring that improves maintainability and extensibility.

## ARCHITECTURE CHANGES

### Data Sources Consolidation

- Move engine/sources_v2.py → engine/data_sources/sources.py
- Move engine/pipeline_sources/ → engine/data_sources/
- Create unified DataSource ABC with common interface:
  * fetch() - idempotent data retrieval
  * get_items() - cached access with automatic refresh
  * refresh() - force cache invalidation
  * is_dynamic - indicate streaming vs static sources
- Support for SourceItem dataclass (content, source, timestamp, metadata)

### Display Backend Improvements

- Update all 7 display backends to use new import paths
- Terminal: Improve dimension detection and handling
- WebSocket: Better error handling and client lifecycle
- Sixel: Refactor graphics rendering
- Pygame: Modernize event handling
- Kitty: Add protocol support for inline images
- Multi: Ensure proper forwarding to all backends
- Null: Maintain testing backend functionality

### Pipeline Adapter Consolidation

- Refactor adapter stages for clarity and flexibility
- RenderStage now handles both item-based and buffer-based rendering
- Add SourceItemsToBufferStage for converting data source items
- Improve DataSourceStage to work with all source types
- Add DisplayStage wrapper for display backends

### Camera & Viewport Refinements

- Update Camera class for new architecture
- Improve viewport dimension detection
- Better handling of resize events across backends

### New Effect Plugins

- border.py: Frame rendering effect with configurable style
- crop.py: Viewport clipping effect for selective display
- tint.py: Color filtering effect for atmosphere

### Tests & Quality

- Add test_border_effect.py with comprehensive border tests
- Add test_crop_effect.py with viewport clipping tests
- Add test_tint_effect.py with color filtering tests
- Update test_pipeline.py for new architecture
- Update test_pipeline_introspection.py for new data source location
- All 463 tests pass with 56% coverage
- Linting: All checks pass with ruff

### Removals (Code Cleanup)

- Delete engine/benchmark.py (deprecated performance testing)
- Delete engine/pipeline_sources/__init__.py (moved to data_sources)
- Delete engine/sources_v2.py (replaced by data_sources/sources.py)
- Update AGENTS.md to reflect new structure

### Import Path Updates

- Update engine/pipeline/controller.py::create_default_pipeline()
  * Old: from engine.sources_v2 import HeadlinesDataSource
  * New: from engine.data_sources.sources import HeadlinesDataSource
- All display backends import from new locations
- All tests import from new locations

## BACKWARDS COMPATIBILITY

This refactoring is intended to be backwards compatible:

- Pipeline execution unchanged (DAG-based with capability matching)
- Effect plugins unchanged (EffectPlugin interface same)
- Display protocol unchanged (Display duck-typing works as before)
- Config system unchanged (presets.toml format same)

## TESTING

- 463 tests pass (0 failures, 19 skipped)
- Full linting check passes
- Manual testing on demo, poetry, websocket modes
- All new effect plugins tested

## FILES CHANGED

- 24 files modified/added/deleted
- 723 insertions, 1,461 deletions (net -738 LOC - cleanup!)
- No breaking changes to public APIs
- All transitive imports updated correctly
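The unified DataSource interface described under "Data Sources Consolidation" can be sketched roughly as follows. This is an illustrative sketch, not the actual engine/data_sources/sources.py: the caching strategy and the exact SourceItem field types are assumptions; only fetch(), get_items(), refresh(), is_dynamic, and the SourceItem fields come from the summary above.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass, field
from typing import Any


@dataclass
class SourceItem:
    """One unit of data flowing out of a source (field types assumed)."""

    content: str
    source: str
    timestamp: float
    metadata: dict[str, Any] = field(default_factory=dict)


class DataSource(ABC):
    """Sketch of the unified DataSource ABC (illustrative, not the real code)."""

    # Streaming sources set this True so every get_items() call re-fetches.
    is_dynamic: bool = False

    def __init__(self) -> None:
        self._cache: list[SourceItem] | None = None

    @abstractmethod
    def fetch(self) -> list[SourceItem]:
        """Idempotent data retrieval from the underlying backend."""

    def get_items(self) -> list[SourceItem]:
        """Cached access with automatic refresh on first use."""
        if self._cache is None or self.is_dynamic:
            self._cache = self.fetch()
        return self._cache

    def refresh(self) -> None:
        """Force cache invalidation; the next get_items() re-fetches."""
        self._cache = None
```

A concrete source then only implements fetch(); get_items() provides cached access, and refresh() forces a re-fetch on the next call.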
307 lines
9.6 KiB
Python
"""
|
|
Pipeline core - Unified Stage abstraction and PipelineContext.
|
|
|
|
This module provides the foundation for a clean, dependency-managed pipeline:
|
|
- Stage: Base class for all pipeline components (sources, effects, displays, cameras)
|
|
- PipelineContext: Dependency injection context for runtime data exchange
|
|
- Capability system: Explicit capability declarations with duck-typing support
|
|
- DataType: PureData-style inlet/outlet typing for validation
|
|
"""
|
|
|
|
from abc import ABC, abstractmethod
|
|
from collections.abc import Callable
|
|
from dataclasses import dataclass, field
|
|
from enum import Enum, auto
|
|
from typing import TYPE_CHECKING, Any
|
|
|
|
if TYPE_CHECKING:
|
|
from engine.pipeline.params import PipelineParams
|
|
|
|
|
|
class DataType(Enum):
    """PureData-style data types for inlet/outlet validation.

    Each type represents a specific data format that flows through the pipeline.
    This enables compile-time-like validation of connections.

    Examples:
        SOURCE_ITEMS: List[SourceItem] - raw items from sources
        ITEM_TUPLES: List[tuple] - (title, source, timestamp) tuples
        TEXT_BUFFER: List[str] - rendered ANSI buffer for display
        RAW_TEXT: str - raw text strings
        PIL_IMAGE: PIL Image object
    """

    SOURCE_ITEMS = auto()  # List[SourceItem] - from DataSource
    ITEM_TUPLES = auto()  # List[tuple] - (title, source, ts)
    TEXT_BUFFER = auto()  # List[str] - ANSI buffer
    RAW_TEXT = auto()  # str - raw text
    PIL_IMAGE = auto()  # PIL Image object
    ANY = auto()  # Accepts any type
    NONE = auto()  # No data (terminator)


@dataclass
class StageConfig:
    """Configuration for a single stage."""

    name: str
    category: str
    enabled: bool = True
    optional: bool = False
    params: dict[str, Any] = field(default_factory=dict)


class Stage(ABC):
    """Abstract base class for all pipeline stages.

    A Stage is a single component in the rendering pipeline. Stages can be:
    - Sources: Data providers (headlines, poetry, pipeline viz)
    - Effects: Post-processors (noise, fade, glitch, hud)
    - Displays: Output backends (terminal, pygame, websocket)
    - Cameras: Viewport controllers (vertical, horizontal, omni)
    - Overlays: UI elements that compose on top (HUD)

    Stages declare:
    - capabilities: What they provide to other stages
    - dependencies: What they need from other stages
    - stage_type: Category of stage (source, effect, overlay, display)
    - render_order: Execution order within category
    - is_overlay: If True, output is composited on top, not passed downstream

    Duck-typing is supported: any class with the required methods can act as a Stage.
    """

    name: str
    category: str  # "source", "effect", "overlay", "display", "camera"
    optional: bool = False  # If True, pipeline continues even if stage fails

    @property
    def stage_type(self) -> str:
        """Category of stage for ordering.

        Valid values: "source", "effect", "overlay", "display", "camera"
        Defaults to category for backwards compatibility.
        """
        return self.category

    @property
    def render_order(self) -> int:
        """Execution order within stage_type group.

        Higher values execute later. Useful for ordering overlays
        or effects that need specific execution order.
        """
        return 0

    @property
    def is_overlay(self) -> bool:
        """If True, this stage's output is composited on top of the buffer.

        Overlay stages don't pass their output to the next stage.
        Instead, their output is layered on top of the final buffer.
        Use this for HUD, status displays, and similar UI elements.
        """
        return False

    @property
    def inlet_types(self) -> set[DataType]:
        """Return set of data types this stage accepts.

        PureData-style inlet typing. If the connected upstream stage's
        outlet_type is not in this set, the pipeline will raise an error.

        Examples:
        - Source stages: {DataType.NONE} (no input needed)
        - Transform stages: {DataType.ITEM_TUPLES, DataType.TEXT_BUFFER}
        - Display stages: {DataType.TEXT_BUFFER}
        """
        return {DataType.ANY}

    @property
    def outlet_types(self) -> set[DataType]:
        """Return set of data types this stage produces.

        PureData-style outlet typing. Downstream stages must accept
        this type in their inlet_types.

        Examples:
        - Source stages: {DataType.SOURCE_ITEMS}
        - Transform stages: {DataType.TEXT_BUFFER}
        - Display stages: {DataType.NONE} (consumes data)
        """
        return {DataType.ANY}

    @property
    def capabilities(self) -> set[str]:
        """Return set of capabilities this stage provides.

        Examples:
        - "source.headlines"
        - "effect.noise"
        - "display.output"
        - "camera"
        """
        return {f"{self.category}.{self.name}"}

    @property
    def dependencies(self) -> set[str]:
        """Return set of capability names this stage needs.

        Examples:
        - {"display.output"}
        - {"source.headlines"}
        - {"camera"}
        """
        return set()

    def init(self, ctx: "PipelineContext") -> bool:
        """Initialize stage with pipeline context.

        Args:
            ctx: PipelineContext for accessing services

        Returns:
            True if initialization succeeded, False otherwise
        """
        return True

    @abstractmethod
    def process(self, data: Any, ctx: "PipelineContext") -> Any:
        """Process input data and return output.

        Args:
            data: Input data from previous stage (or initial data for first stage)
            ctx: PipelineContext for accessing services and state

        Returns:
            Processed data for next stage
        """
        ...

    def cleanup(self) -> None:  # noqa: B027
        """Clean up resources when pipeline shuts down."""
        pass

    def get_config(self) -> StageConfig:
        """Return current configuration of this stage."""
        return StageConfig(
            name=self.name,
            category=self.category,
            optional=self.optional,
        )

    def set_enabled(self, enabled: bool) -> None:
        """Enable or disable this stage."""
        self._enabled = enabled  # type: ignore[attr-defined]

    def is_enabled(self) -> bool:
        """Check if stage is enabled."""
        return getattr(self, "_enabled", True)


@dataclass
class StageResult:
    """Result of stage processing, including success/failure info."""

    success: bool
    data: Any
    error: str | None = None
    stage_name: str = ""


class PipelineContext:
    """Dependency injection context passed through the pipeline.

    Provides:
    - services: Named services (display, config, event_bus, etc.)
    - state: Runtime state shared between stages
    - params: PipelineParams for animation-driven config

    Services can be injected at construction time or lazily resolved.
    """

    def __init__(
        self,
        services: dict[str, Any] | None = None,
        initial_state: dict[str, Any] | None = None,
    ):
        self.services: dict[str, Any] = services or {}
        self.state: dict[str, Any] = initial_state or {}
        self._params: PipelineParams | None = None

        # Lazy resolvers for common services
        self._lazy_resolvers: dict[str, Callable[[], Any]] = {
            "config": self._resolve_config,
            "event_bus": self._resolve_event_bus,
        }

    def _resolve_config(self) -> Any:
        from engine.config import get_config

        return get_config()

    def _resolve_event_bus(self) -> Any:
        from engine.eventbus import get_event_bus

        return get_event_bus()

    def get(self, key: str, default: Any = None) -> Any:
        """Get a service or state value by key.

        First checks services, then state, then lazy resolution.
        """
        if key in self.services:
            return self.services[key]
        if key in self.state:
            return self.state[key]
        if key in self._lazy_resolvers:
            try:
                return self._lazy_resolvers[key]()
            except Exception:
                return default
        return default

    def set(self, key: str, value: Any) -> None:
        """Set a service or state value."""
        self.services[key] = value

    def set_state(self, key: str, value: Any) -> None:
        """Set a runtime state value."""
        self.state[key] = value

    def get_state(self, key: str, default: Any = None) -> Any:
        """Get a runtime state value."""
        return self.state.get(key, default)

    @property
    def params(self) -> "PipelineParams | None":
        """Get current pipeline params (for animation)."""
        return self._params

    @params.setter
    def params(self, value: "PipelineParams") -> None:
        """Set pipeline params (from animation controller)."""
        self._params = value

    def has_capability(self, capability: str) -> bool:
        """Check if a capability is available."""
        return capability in self.services or capability in self._lazy_resolvers


class StageError(Exception):
    """Raised when a stage fails to process."""

    def __init__(self, stage_name: str, message: str, is_optional: bool = False):
        self.stage_name = stage_name
        self.message = message
        self.is_optional = is_optional
        super().__init__(f"Stage '{stage_name}' failed: {message}")


def create_stage_error(
    stage_name: str, error: Exception, is_optional: bool = False
) -> StageError:
    """Helper to create a StageError from an exception."""
    return StageError(stage_name, str(error), is_optional)