From aa9d3e87d3ee6f4e995c4afca2032f8ac2a325ef Mon Sep 17 00:00:00 2001 From: Claude Date: Sat, 31 Jan 2026 02:56:51 +0000 Subject: [PATCH] docs: reposition HomeSec as local-first AI security platform - Add "Local-first AI security for your home" tagline - Rewrite intro to focus on value proposition (smart alerts, privacy) - Add "Why HomeSec?" comparison table vs cloud/basic NVRs - Rename "Pipeline at a glance" to "How It Works" with clearer steps - Replace technical "Highlights" with benefit-focused "Features" - Remove "pipeline" terminology from user-facing sections - Reorder sections: value first, technical details later - Update Table of Contents to match new structure https://claude.ai/code/session_01DLKoAQmcfCapgGJTagGifd --- README.md | 142 +++++++++++++++++++++++++++++++----------------------- 1 file changed, 81 insertions(+), 61 deletions(-) diff --git a/README.md b/README.md index 156dc714..12f9cabd 100644 --- a/README.md +++ b/README.md @@ -1,64 +1,33 @@ # HomeSec +**Local-first AI security for your home.** + [![PyPI](https://img.shields.io/pypi/v/homesec)](https://pypi.org/project/homesec/) [![License: Apache 2.0](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](LICENSE) [![Python: 3.10+](https://img.shields.io/badge/python-3.10%2B-blue)](https://www.python.org/) [![Typing: Typed](https://img.shields.io/badge/typing-typed-2b825b)](https://peps.python.org/pep-0561/) [![codecov](https://codecov.io/gh/lan17/HomeSec/branch/main/graph/badge.svg)](https://codecov.io/gh/lan17/HomeSec) -HomeSec is a self-hosted, extensible video pipeline for home security cameras. You can connect cameras directly via RTSP, receive clips over FTP, or implement your own ClipSource. From there, the pipeline filters events with AI and sends smart notifications. Your footage stays private and off third-party clouds. - -## Design Principles - -- **Local-Only Data Processing**: Video footage remains on the local network by default. 
Cloud usage (Storage, VLM/OpenAI) is strictly opt-in. -- **Modular Architecture**: All major components (sources, filters, analyzers, notifiers) are decoupled plugins defined by strict interfaces. If you want to use a different AI model or storage backend, you can swap it out with a few lines of Python. -- **Resilience**: The primary resilience feature is backing up clips to storage. The pipeline handles intermittent stream failures and network instability without crashing or stalling. - -## Pipeline at a glance - - - -```mermaid -graph TD - %% Layout Wrapper for horizontal alignment - subgraph Wrapper [" "] - direction LR - style Wrapper fill:none,stroke:none - - S[Clip Source] - - subgraph Pipeline [Media Processing Pipeline] - direction TB - C(Clip File) --> U([Upload to Storage]) - C --> F([Detect objects: YOLO]) - F -->|Detected objects| AI{Trigger classes filter} - AI -->|Yes| V([VLM Analysis]) - AI -->|No| D([Discard]) - V -->|Risk level, detected objects| P{Alert Policy filter} - P -->|No| D - P -->|YES| N[Notifiers] - end - - S -->|New Clip File| Pipeline - - PG[(Postgres)] - Pipeline -.->|State & Events| PG - end -``` +HomeSec turns your IP cameras into an intelligent security system. It detects people, vehicles, and objects with on-device AI, then uses vision language models to understand what's actually happening — not just "motion detected" but "person with package at front door." -- **Parallel Processing**: Upload and filter run in parallel. -- **Resilience**: Upload failures do not block alerts; filter failures stop expensive VLM calls. -- **State**: Metadata is stored in Postgres (`clip_states` + `clip_events`) for full observability. +Get smart alerts for what matters. Your footage stays on your network. 
+- **No cloud required** — runs entirely on your hardware +- **No subscriptions** — own your security, forever +- **Home Assistant ready** — MQTT integration out of the box +- **Fully extensible** — swap any component with Python plugins ## Table of Contents -- [Highlights](#highlights) -- [Pipeline at a glance](#pipeline-at-a-glance) +- [Why HomeSec?](#why-homesec) +- [How It Works](#how-it-works) +- [Features](#features) - [Quickstart](#quickstart) - [30-Second Start (Docker)](#30-second-start-docker) - [Manual Setup](#manual-setup) + - [Developer Setup](#developer-setup) - [Configuration](#configuration) + - [Configuration Examples](#configuration-examples) - [Commands](#commands) - [Plugins](#plugins) - [Built-in plugins](#built-in-plugins) @@ -69,17 +38,68 @@ graph TD - [Contributing](#contributing) - [License](#license) -## Highlights +## Why HomeSec? -- Multiple pluggable video clip sources: [RTSP](https://en.wikipedia.org/wiki/Real-Time_Streaming_Protocol) motion detection, [FTP](https://en.wikipedia.org/wiki/File_Transfer_Protocol) uploads, or a watched folder -- Parallel upload + filter ([YOLO](https://en.wikipedia.org/wiki/You_Only_Look_Once)) with frame sampling and early exit -- OpenAI-compatible VLM analysis with structured output -- Policy-driven alerts with per-camera overrides -- Fan-out notifiers (MQTT for Home Assistant, SendGrid email) -- Postgres-backed state + events with graceful degradation -- Health endpoint plus optional Postgres telemetry logging +| | Cloud NVRs | Basic Local NVR | HomeSec | +|---|:---:|:---:|:---:| +| No subscription fees | :x: | :white_check_mark: | :white_check_mark: | +| Footage stays private | :x: | :white_check_mark: | :white_check_mark: | +| AI object detection | :white_check_mark: | Some | :white_check_mark: | +| Scene understanding (VLM) | Limited | :x: | :white_check_mark: | +| Home Assistant integration | Varies | Varies | :white_check_mark: | +| Extensible with plugins | :x: | :x: | :white_check_mark: | 
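Since the Home Assistant integration goes through MQTT, a small consumer script illustrates what that looks like from the subscriber side. This is a hedged sketch: the `homesec/alerts/#` topic, the broker host, and the payload fields (`camera`, `summary`, `risk`) are illustrative assumptions, not HomeSec's actual schema.

```python
# Hypothetical sketch of consuming HomeSec alerts over MQTT. The topic
# name and payload fields are illustrative, not HomeSec's real schema.
import json


def format_alert(payload: bytes) -> str:
    """Render an alert payload (assumed to be JSON) as a one-line summary."""
    alert = json.loads(payload)
    return f"[{alert['camera']}] {alert['summary']} (risk: {alert['risk']})"


if __name__ == "__main__":
    # Requires the third-party paho-mqtt package (>= 2.0 for the
    # CallbackAPIVersion argument).
    import paho.mqtt.client as mqtt

    def on_message(client, userdata, message):
        print(format_alert(message.payload))

    client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
    client.on_message = on_message
    client.connect("homeassistant.local", 1883)  # hypothetical broker host
    client.subscribe("homesec/alerts/#")         # hypothetical topic
    client.loop_forever()
```

In a Home Assistant setup you would more likely point an MQTT trigger at the alert topic directly; the script above is just the smallest standalone way to see the messages.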
+**What makes HomeSec different:** Two-stage AI analysis. Most systems stop at "person detected." HomeSec goes further — using vision language models to understand context: Is this a delivery? A family member? Someone unfamiliar lingering? You get alerts that actually mean something. +## How It Works + +```mermaid +graph TD + subgraph Wrapper [" "] + direction LR + style Wrapper fill:none,stroke:none + + S[Camera Feed] + + subgraph Processing [" "] + direction TB + C(New Clip) --> U([Backup to Storage]) + C --> F([Detect Objects: YOLO]) + F -->|Objects found| AI{Matches trigger?} + AI -->|Yes| V([Analyze Scene: VLM]) + AI -->|No| D([Skip]) + V -->|Assessment| P{Alert worthy?} + P -->|No| D + P -->|Yes| N[Send Alert] + end + + S -->|Motion detected| Processing + + PG[(Database)] + Processing -.->|Events| PG + end +``` + +1. **Capture** — Camera detects motion, creates clip +2. **Detect** — YOLO identifies objects (person, car, dog, etc.) +3. **Analyze** — VLM understands the scene in context +4. **Alert** — Smart notification via MQTT or email + +### Design Principles + +- **Local-Only by Default**: Video footage stays on your network. Cloud storage (Dropbox) and cloud AI (OpenAI) are opt-in. +- **Modular Architecture**: All components (sources, filters, analyzers, notifiers) are plugins. Swap out any piece with a few lines of Python. +- **Resilience**: Handles camera disconnects and network issues gracefully. Clips are backed up before processing. 
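The four-step flow above amounts to a gating function: stage one (YOLO) is cheap and runs on every clip, while stage two (the VLM) only runs when a trigger class appears. A minimal illustration of that gate, using hypothetical names (`Detection`, `TRIGGER_CLASSES`, `process_clip`) rather than HomeSec's actual API:

```python
# Illustrative sketch of the two-stage gate described above; the real
# orchestration lives inside HomeSec, and these names are hypothetical.
from dataclasses import dataclass


@dataclass
class Detection:
    label: str
    confidence: float


# Example trigger classes; in HomeSec these come from configuration.
TRIGGER_CLASSES = {"person", "car"}


def process_clip(detections: list[Detection]) -> str:
    # Stage 1: cheap object detection gates the expensive VLM call.
    if not any(d.label in TRIGGER_CLASSES for d in detections):
        return "skip"
    # Stage 2 (not shown): a VLM analyzes the scene, then an alert
    # policy decides whether the assessment warrants a notification.
    return "analyze"


print(process_clip([Detection("dog", 0.9)]))     # -> skip
print(process_clip([Detection("person", 0.8)]))  # -> analyze
```

The point of the gate is cost control: frames with no trigger class never reach the VLM, which is what keeps per-clip analysis affordable.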
+ +## Features + +- **Smart alerts, not noise** — AI filters out false positives so you only hear about what matters +- **Understands context** — VLM analysis knows the difference between a delivery and a stranger lingering +- **Works with your cameras** — RTSP streams, FTP uploads, or any video source via plugins +- **Home Assistant native** — MQTT notifications integrate seamlessly +- **Privacy by design** — cloud storage and cloud AI are opt-in, never required +- **Built to extend** — write custom sources, filters, storage backends, and notifiers in Python +- **Observable** — health endpoints plus Postgres-backed event logging ## Quickstart @@ -111,7 +131,7 @@ For standard production usage without Docker Compose: # Download example config & env curl -O https://raw.githubusercontent.com/lan17/homesec/main/config/example.yaml mv example.yaml config.yaml - + curl -O https://raw.githubusercontent.com/lan17/homesec/main/.env.example mv .env.example .env @@ -140,7 +160,7 @@ If you are contributing or running from source: 3. **Run** ```bash - uv run python -m homesec.cli run --config config/config.yaml + uv run python -m homesec.cli run --config config.yaml ``` @@ -218,7 +238,7 @@ homesec --help ### Commands -**Run the pipeline:** +**Run HomeSec:** ```bash homesec run --config config.yaml ``` @@ -239,9 +259,9 @@ Use `homesec --help` for detailed options on each command. ### Extensible by design -We designed HomeSec to be modular. Each major capability is an interface (`ClipSource`, `StorageBackend`, `ObjectFilter`, `VLMAnalyzer`, `AlertPolicy`, `Notifier`) defined in `src/homesec/interfaces.py`. This means you can swap out components (like replacing YOLO with a different detector) without changing the core pipeline. - -HomeSec uses a plugin architecture where every component is discovered at runtime via entry points. +HomeSec is built to be modular. 
Each major capability is an interface (`ClipSource`, `StorageBackend`, `ObjectFilter`, `VLMAnalyzer`, `AlertPolicy`, `Notifier`) defined in `src/homesec/interfaces.py`. Swap out components — like replacing YOLO with a different detector — without touching the core. + +Plugins are discovered at runtime via entry points. ### Built-in plugins @@ -269,9 +289,9 @@ All interfaces are defined in [`src/homesec/interfaces.py`](src/homesec/interfac ### Writing a custom plugin -Extending HomeSec is designed to be easy. You can write custom sources, filters, storage backends, and more. +Extending HomeSec is straightforward. You can write custom sources, filters, storage backends, and more. -👉 **See [PLUGIN_DEVELOPMENT.md](PLUGIN_DEVELOPMENT.md) for a complete guide.** +See [PLUGIN_DEVELOPMENT.md](PLUGIN_DEVELOPMENT.md) for a complete guide. ## Observability @@ -292,7 +312,7 @@ Extending HomeSec is designed to be easy. You can write custom sources, filters, - Run tests: `make test` - Run type checking (strict): `make typecheck` - Run both: `make check` -- Run the pipeline: `make run` +- Run HomeSec: `make run` ### Notes
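As a concrete starting point, here is a minimal sketch of a custom notifier. The `Notifier` interface itself lives in `src/homesec/interfaces.py`; the method name `notify` and the alert-dict shape used below are assumptions for illustration, so consult PLUGIN_DEVELOPMENT.md for the real contract.

```python
# Minimal custom-notifier sketch. HomeSec's actual Notifier interface is
# defined in src/homesec/interfaces.py; the method name and alert shape
# here are illustrative assumptions.
import logging

logger = logging.getLogger("homesec.notifiers.log")


class LogNotifier:
    """Writes alerts to the application log instead of MQTT or email."""

    def format_alert(self, alert: dict) -> str:
        # Assumed fields; substitute whatever the real alert object carries.
        return f"ALERT {alert.get('camera', '?')}: {alert.get('summary', '')}"

    def notify(self, alert: dict) -> None:
        logger.info(self.format_alert(alert))
```

Because plugins are discovered at runtime via entry points, a class like this would be registered under a HomeSec entry-point group in your plugin's `pyproject.toml`; the exact group name is documented in PLUGIN_DEVELOPMENT.md.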