Observability tools tell you what broke.
Obsvty shows you why it broke and how to fix it, based on your code.
Obsvty is an open-source platform that connects observability data (logs, metrics, traces) with code changes and language models (LLMs) to generate actionable, contextual, and secure insights.
All this with:
- 🧩 Modular architecture: use any LLM, version control, or alert destination.
- 🔒 Privacy-first: sensitive data never leaves your environment.
- 📦 Auto technical documentation: your docs update as your code and infra change.
- 🌱 Easy to run and contribute: `docker-compose up` and you're set.
- About
- Features
- Getting Started
- Installation
- Usage
- Architecture
- Development
- Roadmap
- Contributing
- License
- Contact
Most observability tools stop at the question:
"Where is the error?"
But engineers need to know:
"Which commit caused this? Which line of code should I review? What is the practical fix suggestion?"
Obsvty bridges this gap by correlating:
- Traces/logs (OTLP) → Commits/PRs → LLM suggestions
🔍 Detected insight:
- Metric: average latency of /checkout rose from 120ms → 480ms
- Commit: d34db33f (added synchronous card validation)
- Suggestion (LLM): "Move validation to an async queue. See example in docs/async-payment.md"
- Alert sent to #eng-alerts (Slack)

This is smart observability: not just data, but action.
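Concretely, an insight like the one above can be thought of as a small data structure. The sketch below is purely illustrative, not the project's actual schema; every field name is assumed:

```python
from dataclasses import dataclass


@dataclass
class Insight:
    """Illustrative shape of a correlated insight (all field names are assumed)."""

    metric: str          # e.g. "avg latency of /checkout"
    change: str          # e.g. "120ms -> 480ms"
    commit_sha: str      # suspected commit, e.g. "d34db33f"
    commit_summary: str  # e.g. "added synchronous card validation"
    suggestion: str      # LLM-generated fix, with a doc reference
    alert_channel: str   # e.g. "#eng-alerts"
```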
- Contextual Insights: Connects observability data with code changes and context
- Privacy-First: All sensitive data stays within your environment
- Modular Architecture: Support for any LLM, version control, or alert system
- Automatic Documentation: Docs update as your code and infrastructure change
- OpenTelemetry Integration: Native support for OTLP gRPC protocol
- Extensible Plugin System: Easy to add support for new services and tools
We are building the first functional end-to-end flow:
```
[OTLP] → [Compression + Sanitization] → [Modular LLM] → [Alert + Doc + Chat]
                         ↑
            [GitHub: commit, PR, diff]
```
- Send traces/logs via OTLP.
- Receive a Slack alert with a commit-contextualized suggestion.
- Open a chat (Streamlit) with the full context: trace + code + recommendation.
- Confirm that no sensitive data was sent to the LLM (a sanitizer sketch follows this list).
- Run everything locally with `docker-compose up`.
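The "Compression + Sanitization" stage is what backs the privacy-first claim: attributes are scrubbed before any LLM prompt is built. As a minimal sketch only, assuming a simple deny-list approach (the pipeline's real rules will differ):

```python
import re

# Hypothetical deny-list for illustration; the real pipeline's rules differ.
SENSITIVE_KEYS = {"card_number", "password", "authorization", "api_key"}
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")


def sanitize_attributes(attributes: dict[str, str]) -> dict[str, str]:
    """Redact sensitive keys and mask e-mail addresses in span attributes."""
    clean: dict[str, str] = {}
    for key, value in attributes.items():
        if key.lower() in SENSITIVE_KEYS:
            clean[key] = "[REDACTED]"
        else:
            clean[key] = EMAIL_RE.sub("[EMAIL]", value)
    return clean


print(sanitize_attributes({"user": "jane@example.com", "card_number": "4111..."}))
# {'user': '[EMAIL]', 'card_number': '[REDACTED]'}
```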
- Language: Python (3.11+)
- Ingestion: OTLP gRPC (OpenTelemetry)
- Storage: DuckDB (lightweight, no external dependencies)
- LLM: Any OpenAI-compatible provider (Ollama, OpenAI, Anthropic, etc.)
- Frontend: Streamlit (fast, iterative prototype)
- Extensibility: Abstract interfaces for plugins (Git, LLM, Alerts, Docs)
```python
from abc import ABC


class GitProvider(ABC): ...
class LLMEngine(ABC): ...
class AlertPlugin(ABC): ...
class DocGenerator(ABC): ...
```

Want to add support for GitLab? Confluence? A new local model? Just implement the interface.
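For instance, a GitLab plugin would subclass `GitProvider`. A minimal sketch, assuming hypothetical `get_commit`/`get_diff` methods (the real interface's signatures live in the codebase) and the `python-gitlab` client:

```python
from abc import ABC, abstractmethod

import gitlab  # pip install python-gitlab


class GitProvider(ABC):
    # Hypothetical methods for illustration; the real interface differs.
    @abstractmethod
    def get_commit(self, sha: str) -> dict: ...

    @abstractmethod
    def get_diff(self, sha: str) -> str: ...


class GitLabProvider(GitProvider):
    """Sketch of a GitLab-backed implementation of the Git interface."""

    def __init__(self, url: str, token: str, project_id: int) -> None:
        self.project = gitlab.Gitlab(url, private_token=token).projects.get(project_id)

    def get_commit(self, sha: str) -> dict:
        # Raw commit metadata (author, message, timestamps, ...)
        return self.project.commits.get(sha).attributes

    def get_diff(self, sha: str) -> str:
        # Concatenate the per-file diffs for the commit
        return "\n".join(d["diff"] for d in self.project.commits.get(sha).diff())
```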
- Python 3.11+
- Poetry 1.7+
- Docker (optional)
- Clone the repository:

  ```bash
  git clone https://github.com/thorgus-services/obsvty.git
  cd obsvty
  ```

- Install dependencies:

  ```bash
  poetry install
  ```

- Generate OTLP proto stubs:

  ```bash
  python generate_protos.py
  ```
The project uses Tox for standardized development tasks:
```bash
# List all available environments
poetry run tox -l

# Run linting checks
poetry run tox -e lint

# Run type checking
poetry run tox -e type

# Run unit tests
poetry run tox -e unit

# Run security checks
poetry run tox -e security

# Run all checks at once
poetry run tox
```

This project follows a standardized Python toolchain configuration for consistent, secure, and maintainable codebases.
- Poetry for dependency management with precise version constraints
- Runtime, dev, and test dependencies are properly separated
- The `poetry.lock` file ensures deterministic builds
- Ruff for formatting and linting (replaces Black and Flake8)
- Enforces consistent import ordering and grouping
- Disallows unused imports and variables
- Code formatting with line length of 88 characters
- Mypy for type checking with strict mode enabled for core packages
- Pytest for testing with coverage requirements (≥80% in core)
- Safety for dependency vulnerability scanning
- Bandit for security issue detection in Python code
- Tox for standardized environments (replaces Invoke/tasks.py)
- `lint` environment: code quality checks with Ruff
- `type` environment: type checking with Mypy
- `unit` environment: unit tests with Pytest
- `security` environment: combined Safety and Bandit scanning
The CI pipeline includes:
- Ruff format and lint check
- Mypy type checking
- Bandit security scan
- Safety dependency vulnerability check
- Pytest with coverage requirements (≥80% in core)
- Build and package verification with Poetry
```bash
# Install dependencies
poetry install

# Generate OTLP proto stubs
python generate_protos.py

# Run lint, typecheck, and tests
poetry run tox -e lint && poetry run tox -e type && poetry run tox -e unit

# Run all checks at once (lint, type, unit tests)
poetry run tox

# Run security checks
poetry run tox -e security

# Run individual checks
poetry run tox -e lint      # Linting only
poetry run tox -e type      # Type checking only
poetry run tox -e unit      # Unit tests only
```

```bash
# Regenerate stubs from a specific ref (branch or tag)
python generate_protos.py --ref main --force

# With network timeout and validation
python generate_protos.py --ref v1.1.0 --timeout 20 --force
```

Copy `.env.example` to `.env` and adjust values if needed:
```bash
# Main OTLP configuration (new standard)
OTLP_HOST=localhost
OTLP_PORT=4317
OTLP_MAX_MESSAGE_LENGTH=4194304
OTLP_BUFFER_MAX_SIZE=1000

# Backward compatibility (existing implementation)
OTLP_GRPC_HOST=0.0.0.0
OTLP_GRPC_PORT=4317
OTLP_GRPC_MAX_BUFFER_SIZE=10000
OTLP_GRPC_MAX_MESSAGE_LENGTH=4194304
OTLP_GRPC_ENABLE_REFLECTION=false
OTLP_GRPC_ENABLE_LOGS_SERVICE=false

LOG_LEVEL=INFO
```

The project uses the `OtlpGrpcSettings` Pydantic model for validated configuration management:

- `OTLP_HOST`: host address for the gRPC server (default: `localhost`)
- `OTLP_PORT`: port number for the gRPC server (default: `4317`)
- `OTLP_MAX_MESSAGE_LENGTH`: maximum message size in bytes (default: 4 MB)
- `OTLP_BUFFER_MAX_SIZE`: maximum size of the trace buffer (default: `1000`)
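A settings model along these lines backs that validation. The following is a sketch assuming `pydantic-settings` v2; the actual `OtlpGrpcSettings` definition lives in the codebase and may differ:

```python
from pydantic_settings import BaseSettings, SettingsConfigDict


class OtlpGrpcSettings(BaseSettings):
    """Sketch of the validated settings model; see the real one in the codebase."""

    model_config = SettingsConfigDict(env_prefix="OTLP_")

    host: str = "localhost"              # OTLP_HOST
    port: int = 4317                     # OTLP_PORT
    max_message_length: int = 4_194_304  # OTLP_MAX_MESSAGE_LENGTH (4 MB)
    buffer_max_size: int = 1000          # OTLP_BUFFER_MAX_SIZE


settings = OtlpGrpcSettings()  # values are read from the environment
print(f"{settings.host}:{settings.port}")
```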
To start the OTLP gRPC server:
```bash
python -m obsvty
```

The server will load configuration from environment variables and start on the configured endpoint.
To connect your own OTLP client to the server, ensure environment variables are set:
```python
import os

from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter

# Read from environment variables
host = os.getenv("OTLP_HOST", "localhost")
port = os.getenv("OTLP_PORT", "4317")
endpoint = f"{host}:{port}"

# Create the exporter
otlp_exporter = OTLPSpanExporter(
    endpoint=endpoint,
    insecure=True,  # For development
)
```

For a complete example, see `examples/otlp_client.py`.
- Ports (in `src/obsvty/application/ports/`): `TraceIngestionPort`, `TraceBatchIngestionPort`, `TraceStoragePort`
- Services (in `src/obsvty/domain/services/`): `otlp_processing.py` with the `process_otlp_data()` function
- Composition root: `src/obsvty/main.py` with `create_application(buffer_size)` and `main(port, buffer_size)`
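As a rough sketch of what such a port boundary looks like (method names and signatures here are assumed; the real ones are in `src/obsvty/application/ports/`):

```python
from abc import ABC, abstractmethod
from typing import Any


class TraceStoragePort(ABC):
    """Illustrative storage boundary; real method signatures differ."""

    @abstractmethod
    def save(self, trace: dict[str, Any]) -> None:
        """Persist one ingested trace."""

    @abstractmethod
    def find(self, trace_id: str) -> dict[str, Any] | None:
        """Look up a stored trace by its id."""
```

A DuckDB-backed adapter, for example, would implement this interface, keeping storage swappable without touching the domain services.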
Run the package entrypoint:
```bash
python -m obsvty
```

Setup validation tests are in `tests/unit/test_setup_validation.py` and include:
- Directory structure validation
- Dependency version pinning check
- Proto/stub generation validation
- Dockerfile presence
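For illustration, a simplified version of the directory-structure check might look like this (the real tests in `tests/unit/test_setup_validation.py` cover more):

```python
from pathlib import Path


def test_expected_directories_exist() -> None:
    # Resolve the repo root relative to tests/unit/<this file>
    repo_root = Path(__file__).resolve().parents[2]
    for expected in ("src/obsvty", "tests/unit"):
        assert (repo_root / expected).is_dir(), f"missing directory: {expected}"
```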
Run tests with coverage:
```bash
pytest --cov=src --cov-fail-under=80
```

| Phase | Name | Goal |
|---|---|---|
| M0 | Bootstrapping | Repo, CI, modular structure |
| M1 | Observability Core | OTLP + compression + detection |
| M2 | AI Brain | Secure LLM + modular workflow |
| M3 | Context Connect | GitHub + Slack + auto doc |
| M4 | Insight Chat | UI with contextual chat |
| M5 | First Release | Community launch |
```bash
git clone https://github.com/thorgus-services/obsvty.git
cd obsvty
docker-compose up
```
⚠️ Still under construction! We are in phase M0/M1. The runnable version will be released in the coming weeks.
Obsvty was born as a project from the community, for the community.
We welcome contributions from everyone! Check out our CONTRIBUTING.md file for more details on how to get started.
- 🧪 Test the MVP as soon as it's released
- 🧩 Write a plugin (e.g., GitLab, Jira, Confluence)
- 🧠 Suggest improvements for trace compression or anomaly detection
- 📚 Improve documentation or write tutorials
See CONTRIBUTING.md to get started.
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
- Open an Issue
- Contact me directly
Obsvty: because understanding the why is as important as seeing the what.