Real-time stop sign accountability
StopSign AI watches a live intersection feed, detects vehicles with YOLO, and measures true stop compliance from capture-time data carried with every frame. Every hop in the pipeline is tuned for low latency and instrumented for observability.
Pipeline at a glance
Frames take a deterministic path from the curb to the browser. Each boundary is backed by Redis queues, explicit health checks, and Prometheus metrics so issues are easy to trace.
RTSP Camera
Network camera or sample MP4 feed streaming into the pipeline via RTSP.
rtsp_to_redis
Encodes frames as JPEG, wraps each in an SSFM header, and LPUSHes them into Redis with FIFO semantics (see the frame-envelope sketch after this list).
Redis · RAW
Deterministic queueing keeps capture order intact while smoothing network jitter.
video_analyzer
YOLO inference, Kalman-smoothed tracking, and stop-zone scoring feed evidence into Postgres and MinIO.
Redis · PROCESSED
Annotated frames with timestamps stay ready for streaming without blocking the analyzer.
ffmpeg_service
FFmpeg (NVENC or libx264) assembles HLS playlists, guarded by a watchdog and readiness probes.
web_server
FastAPI + FastHTML + htmx deliver the live player, dashboards, and developer tooling.
Operators
Operators watch the HLS stream in the browser, review recent passes, and adjust stop zones without redeploying.
PostgreSQL
Stores vehicle pass records and compliance scores, and serves the trend queries behind the insights.
MinIO
Holds annotated JPEG clips and exposes them through signed URLs in the UI.
Grafana + Prometheus
Dashboards visualize FPS, inference latency, queue depth, and HLS freshness.
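Every queue hop shares one framing contract: a small SSFM header carrying the capture timestamp, followed by the JPEG payload. The exact SSFM field layout is not documented here, so the sketch below assumes a minimal magic + timestamp + length header; the struct format and helper names are illustrative, not the actual wire format.

```python
import struct
import time

# Assumed SSFM-style envelope: a magic marker, the capture timestamp as a
# float64, and the JPEG payload length, followed by the JPEG bytes themselves.
MAGIC = b"SSFM"
HEADER = struct.Struct(">4sdI")

def pack_frame(jpeg_bytes: bytes, capture_ts: float | None = None) -> bytes:
    """Prepend a fixed-size header so the capture time survives every hop."""
    ts = time.time() if capture_ts is None else capture_ts
    return HEADER.pack(MAGIC, ts, len(jpeg_bytes)) + jpeg_bytes

def unpack_frame(blob: bytes) -> tuple[float, bytes]:
    """Recover the capture timestamp and JPEG payload from a queued message."""
    magic, ts, length = HEADER.unpack_from(blob)
    if magic != MAGIC:
        raise ValueError("not an SSFM-framed message")
    return ts, blob[HEADER.size:HEADER.size + length]
```

Because the capture timestamp rides inside every message, the analyzer can score stop duration against the moment a frame was captured rather than the moment it was processed, no matter how long the frame waited in a queue.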
What each service owns
rtsp_to_redis
Frame ingestion & SSFM packaging
- LPUSHes JPEG frames with SSFM headers so capture timestamps survive downstream hops.
- Bounded queues (FRAME_BUFFER_SIZE) absorb network bursts without letting frames go stale.
- Exports Prometheus counters/timers plus runtime status mixins for health probes.
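A minimal sketch of that ingestion loop, assuming OpenCV for the RTSP capture, a Redis list named `raw_frames`, and the SSFM-style envelope sketched earlier; the queue key, JPEG quality, and defaults are illustrative, not the service's real settings.

```python
import os
import struct
import time

import cv2
import redis

FRAME_BUFFER_SIZE = int(os.getenv("FRAME_BUFFER_SIZE", "100"))
RAW_QUEUE = "raw_frames"                       # assumed key name
HEADER = struct.Struct(">4sdI")                # assumed SSFM-style envelope

r = redis.Redis(host=os.getenv("REDIS_HOST", "redis"))
cap = cv2.VideoCapture(os.getenv("RTSP_URL", "rtsp://camera/stream"))

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        time.sleep(0.1)                        # camera hiccup: retry instead of crashing
        continue
    capture_ts = time.time()
    ok, jpeg = cv2.imencode(".jpg", frame, [cv2.IMWRITE_JPEG_QUALITY, 85])
    if not ok:
        continue
    payload = jpeg.tobytes()
    message = HEADER.pack(b"SSFM", capture_ts, len(payload)) + payload
    # LPUSH the newest frame to the head, then LTRIM to keep only the newest
    # FRAME_BUFFER_SIZE entries: bursts drop the oldest frames instead of
    # letting the backlog grow stale.
    pipe = r.pipeline()
    pipe.lpush(RAW_QUEUE, message)
    pipe.ltrim(RAW_QUEUE, 0, FRAME_BUFFER_SIZE - 1)
    pipe.execute()
```

Pairing `LPUSH` at the head with `BRPOP` at the tail on the consumer side gives the FIFO ordering the analyzer relies on.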
video_analyzer
Computer vision core
- Runs Ultralytics YOLO models (configured via YOLO_MODEL_NAME/YOLO_DEVICE).
- CarTracker + Kalman filtering fuse per-frame detections into smooth trajectories for reliable stop detection.
- Persists scores to Postgres, ships annotated evidence to MinIO, and surfaces live insights.
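A compressed sketch of the analysis hop under the same assumptions: Ultralytics YOLO for detection, a hand-drawn stop-zone polygon, and the envelope above. It deliberately skips the CarTracker/Kalman smoothing, scoring, and the Postgres/MinIO writes.

```python
import struct

import cv2
import numpy as np
import redis
from ultralytics import YOLO

HEADER = struct.Struct(">4sdI")                # assumed SSFM-style envelope
STOP_ZONE = np.array([[410, 380], [620, 380], [640, 470], [390, 470]], dtype=np.int32)  # example polygon

r = redis.Redis(host="redis")
model = YOLO("yolov8n.pt")                     # stand-in for YOLO_MODEL_NAME

while True:
    _, blob = r.brpop("raw_frames")            # oldest frame first (FIFO)
    _, capture_ts, length = HEADER.unpack_from(blob)
    frame = cv2.imdecode(np.frombuffer(blob[HEADER.size:HEADER.size + length], np.uint8), cv2.IMREAD_COLOR)

    for box in model(frame, verbose=False)[0].boxes:
        x1, y1, x2, y2 = box.xyxy[0].tolist()
        # Use the bottom-center of the box as the vehicle's ground contact point;
        # the real tracker smooths this over many frames before scoring a stop.
        inside = cv2.pointPolygonTest(STOP_ZONE, ((x1 + x2) / 2, y2), False) >= 0
        color = (0, 0, 255) if inside else (0, 255, 0)
        cv2.rectangle(frame, (int(x1), int(y1)), (int(x2), int(y2)), color, 2)

    ok, jpeg = cv2.imencode(".jpg", frame)
    if ok:
        payload = jpeg.tobytes()
        r.lpush("processed_frames", HEADER.pack(b"SSFM", capture_ts, len(payload)) + payload)
```

In the real service a single point-in-polygon hit is not a stop: the tracker has to see the smoothed trajectory dwell in the zone before a pass is scored and persisted.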
ffmpeg_service
HLS edge
- Consumes processed frames from Redis and renders annotated video at 15 FPS.
- Configurable FFmpeg encoders (NVENC, libx264) with presets tuned for low latency.
- A watchdog restarts the stream when segment freshness drifts; `/ready` and `/health` endpoints expose the same signal.
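One way to drive the HLS edge is to pipe the annotated JPEGs into FFmpeg over stdin. The arguments below are a plausible low-latency libx264 configuration, not the service's exact command; the NVENC path simply swaps `libx264` for `h264_nvenc`, and the output path is assumed.

```python
import subprocess

cmd = [
    "ffmpeg",
    "-f", "image2pipe", "-c:v", "mjpeg", "-framerate", "15", "-i", "pipe:0",  # JPEGs on stdin at 15 FPS
    "-c:v", "libx264", "-preset", "veryfast", "-tune", "zerolatency",
    "-g", "30", "-pix_fmt", "yuv420p",
    "-f", "hls",
    "-hls_time", "2",                             # ~2 s segments
    "-hls_list_size", "6",                        # short rolling playlist
    "-hls_flags", "delete_segments+append_list",  # prune old segments as it goes
    "/hls/stream.m3u8",                           # assumed output location
]

proc = subprocess.Popen(cmd, stdin=subprocess.PIPE)

def feed(jpeg_bytes: bytes) -> None:
    """Write one annotated JPEG frame to the encoder's stdin."""
    proc.stdin.write(jpeg_bytes)
    proc.stdin.flush()
```

The watchdog's job is then simple: if the playlist or its newest segment stops getting fresher, exit so the orchestrator restarts the encoder.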
web_server
Experience + APIs
- FastAPI + FastHTML pages powered by htmx for live updates without heavy JS.
- Interactive records view, live HLS.js player, and `/debug` zone editor for calibration.
- Caches insights, proxies media from MinIO, and exposes `/health/stream` for monitors.
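A sketch of what a freshness probe like `/health/stream` can look like, assuming the playlist lives at a known path and a fixed freshness budget; the real endpoint may check more signals.

```python
import os
import time

from fastapi import FastAPI
from fastapi.responses import JSONResponse

app = FastAPI()

PLAYLIST = "/hls/stream.m3u8"    # assumed path to the live playlist
MAX_AGE_SECONDS = 15             # assumed freshness budget

@app.get("/health/stream")
def stream_health() -> JSONResponse:
    """Report whether the HLS playlist has been updated recently."""
    try:
        age = time.time() - os.path.getmtime(PLAYLIST)
    except FileNotFoundError:
        return JSONResponse({"status": "down", "reason": "playlist missing"}, status_code=503)
    if age > MAX_AGE_SECONDS:
        return JSONResponse({"status": "stale", "age_seconds": round(age, 1)}, status_code=503)
    return JSONResponse({"status": "ok", "age_seconds": round(age, 1)})
```

External monitors only need the status code: anything other than 200 means the stream is not fresh enough to trust.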
Operational guardrails
Observability
- Prometheus exporters on every service feed the Grafana dashboards shipped in `static/` (see the metrics sketch after this list).
- Health surface: `/healthz` for liveness, `/ready` for freshness, `/health/stream` for external probes.
- ServiceStatus mixins report queue depth, Redis/DB connectivity, and error counters for triage.
- Insights cache highlights live trends (peak hour, average stop time, fastest vehicle).
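Under the hood these are ordinary `prometheus_client` metrics scraped over HTTP; the names below are illustrative stand-ins for the counters, gauges, and histograms each service actually registers.

```python
import time

from prometheus_client import Counter, Gauge, Histogram, start_http_server

FRAMES_TOTAL = Counter("frames_processed_total", "Frames pulled off the raw queue")
QUEUE_DEPTH = Gauge("redis_queue_depth", "Pending frames in the raw queue")
INFER_SECONDS = Histogram("inference_seconds", "Model inference latency in seconds")

start_http_server(8000)          # Prometheus scrapes http://<service>:8000/metrics

while True:
    with INFER_SECONDS.time():   # times the body of the block
        time.sleep(0.03)         # stand-in for model(frame)
    FRAMES_TOTAL.inc()
    QUEUE_DEPTH.set(12)          # stand-in for r.llen("raw_frames")
```

Grafana can then chart FPS as `rate(frames_processed_total[1m])`, latency from the histogram quantiles, and backlog straight from the gauge.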
Resilience
- Analyzer catch-up trims Redis backlogs when frames age past ANALYZER_CATCHUP_SEC (see the sketch after this list).
- FFmpeg watchdog exits when HLS segments age beyond playlist thresholds so orchestrators restart cleanly.
- Single-source config (`config/config.yaml`) hot-reloads across services and persists via volumes.
- Debug UI + CLI tools (`tools/set_stop_zone.py`) let operators retune stop zones without downtime.
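The catch-up behavior boils down to discarding frames whose capture timestamp has aged past the budget before analyzing the next one; a sketch under the same assumed envelope and queue name, with `ANALYZER_CATCHUP_SEC` read from the environment for illustration.

```python
import os
import struct
import time

import redis

HEADER = struct.Struct(">4sdI")                  # assumed SSFM-style envelope
CATCHUP_SEC = float(os.getenv("ANALYZER_CATCHUP_SEC", "2.0"))

def next_fresh_frame(r: redis.Redis, queue: str = "raw_frames") -> bytes | None:
    """Trim stale backlog and return the first frame young enough to analyze."""
    while True:
        blob = r.rpop(queue)                     # oldest frame sits at the tail
        if blob is None:
            return None                          # backlog drained; caller waits for new frames
        _, capture_ts, _ = HEADER.unpack_from(blob)
        if time.time() - capture_ts <= CATCHUP_SEC:
            return blob                          # fresh enough to run through the model
        # Stale frame: drop it and keep trimming until the analyzer catches up.
```

Dropping stale frames keeps the annotated stream close to real time after a GPU stall or restart, at the cost of a brief gap in coverage.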
Build & extend it
The repository doubles as a reference implementation for real-time computer vision pipelines. Everything from configuration to deployment can be modified without touching production footage.
- `docker/local/docker-compose.yml` spins up the full stack with Redis, Postgres, and MinIO dependencies.
- `Makefile` automates setup (`make setup`), streaming (`make stream-local`), and linting.
- A `sample_data/` video lets you replay the pipeline offline; `uv` manages Python dependencies reproducibly.
- Documentation lives under `docs/` covering architecture, health modeling, and deployment strategy.
Next steps
Explore the code, adapt the stop-zone logic to your intersection, or plug in new models—the stack is modular by design.