feat: native OTLP export via configure_otlp() #54

ellucas-creator wants to merge 1 commit into main.
Add `bmasterai/otlp.py` — a zero-breaking-change OTLP layer that hooks into `AgentMonitor` and emits OTel spans + metrics to any compatible backend.

What's new:

- `bmasterai.configure_otlp(endpoint, service_name, headers, use_http, ...)`
- Wraps `track_agent_start`/`track_agent_stop` → root span per agent lifecycle
- Wraps `track_llm_call` → child span with token/model/latency attrs
- Wraps `track_task_duration` → child span per task
- Wraps `track_error` → counter + span event
- Wraps `record_custom_metric` → OTel counter

Metrics exported:

- `bmasterai.llm.tokens_used` (counter; labels: agent_id, model)
- `bmasterai.llm.call_duration` (histogram, ms)
- `bmasterai.task.duration` (histogram, ms)
- `bmasterai.agent.errors` (counter; labels: agent_id, error_type)
- `bmasterai.custom.metric` (counter; passthrough labels)

Optional dependencies (zero install required for existing users):

- `pip install 'bmasterai[otlp]'` for the gRPC transport
- `pip install 'bmasterai[otlp-http]'` for the HTTP/protobuf transport

Supported backends: Grafana Tempo, Jaeger, Honeycomb, Datadog, New Relic, Prometheus (OTLP bridge), any OTel collector.

Examples:

- `examples/otlp-export/agent_with_otlp.py`
- `examples/otlp-export/README.md`
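The "wraps track_* → span" approach described above can be sketched in plain Python. Everything here is illustrative: the `AgentMonitor` stand-in, the wrapper internals, and the `SPANS` list (standing in for an OTel tracer) are assumptions, not the module's actual code.

```python
from functools import wraps

class AgentMonitor:
    """Stand-in for bmasterai's monitor; the real signature may differ."""
    def track_llm_call(self, agent_id, model, tokens_used, duration_ms):
        return {"agent_id": agent_id, "model": model}

SPANS = []  # stand-in for an OTel tracer's span output

def _hook_llm_call(method):
    @wraps(method)
    def wrapper(self, agent_id, model, tokens_used, duration_ms, **kw):
        result = method(self, agent_id, model, tokens_used, duration_ms, **kw)
        # Emit a child-span record carrying token/model/latency attributes,
        # mirroring the track_llm_call → llm.call mapping in this PR.
        SPANS.append({
            "name": "llm.call",
            "attributes": {
                "agent_id": agent_id,
                "model": model,
                "tokens_used": tokens_used,
                "duration_ms": duration_ms,
            },
        })
        return result
    return wrapper

# Monkey-patch the method in place — existing callers are unaffected,
# which is what makes this a zero-breaking-change layer.
AgentMonitor.track_llm_call = _hook_llm_call(AgentMonitor.track_llm_call)

monitor = AgentMonitor()
monitor.track_llm_call("agent-1", "gpt-4o", tokens_used=1200, duration_ms=830.5)
print(SPANS[0]["name"])  # prints "llm.call"
```

Because the wrapper calls through to the original method and returns its result unchanged, existing monitoring behavior is preserved whether or not export is configured.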
**Claude** finished @ellucas-creator's task

**Code review complete.** I've reviewed this PR, which adds native OTLP export functionality to bmasterai. Here's my analysis:

**Overall assessment:** This is a well-structured PR that adds valuable OpenTelemetry integration. The implementation is clean, follows good practices, and maintains backward compatibility. However, there are several areas for improvement.

✅ Strengths
## Native OTLP Support
Adds zero-breaking-change native OTLP export to bmasterai. One call to `configure_otlp()` before your first monitor call, and all agent events automatically flow to any OTel-compatible backend.

### Usage
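A minimal sketch of the one-call setup. The parameter names (`endpoint`, `service_name`, `headers`, `use_http`) come from this PR's commit message; the values shown, and the fallback stub, are assumptions for illustration.

```python
# Sketch of the one-call setup; values are assumptions, not defaults.
try:
    from bmasterai import configure_otlp
except ImportError:
    # Mirror the library's documented behavior: without the optional
    # dependency installed, OTLP configuration is a silent no-op.
    def configure_otlp(**kwargs):
        return None

configure_otlp(
    endpoint="localhost:4317",        # OTel collector gRPC endpoint (assumed)
    service_name="my-agent-service",
    headers={"x-api-key": "secret"},  # vendor auth headers, if any
    use_http=False,                   # True would select HTTP/protobuf
)
# All subsequent AgentMonitor track_* calls also emit spans and metrics.
```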
### What gets exported

**Spans (traces)**

- `agent.<agent_id>`: from `track_agent_start` / `track_agent_stop`
- `llm.call`: from `track_llm_call`; includes model, tokens, latency, reasoning_steps
- `task.<task_name>`: from `track_task_duration`
- error events: from `track_error`; added to the active agent span

**Metrics**

- `bmasterai.llm.tokens_used`
- `bmasterai.llm.call_duration`
- `bmasterai.task.duration`
- `bmasterai.agent.errors`
- `bmasterai.custom.metric`

### Install
Zero install impact for existing users — `opentelemetry-sdk` is optional. If it is not installed, all OTLP calls are no-ops.
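One common way to implement such a no-op guard is an import probe at module load. This is a sketch of the general pattern, not the module's actual code; `configure_otlp_sketch` is a hypothetical name.

```python
# Sketch of an optional-dependency guard; not the module's actual code.
try:
    from opentelemetry import trace  # provided by the [otlp] extras
    _OTEL_AVAILABLE = True
except ImportError:
    _OTEL_AVAILABLE = False

def configure_otlp_sketch(endpoint="localhost:4317", service_name="bmasterai"):
    """Return True if exporting was enabled, False for a silent no-op."""
    if not _OTEL_AVAILABLE:
        # Existing users without the extra installed see no change at all.
        return False
    # ...set up the TracerProvider, OTLP exporter, meters, and monitor hooks...
    return True
```

The guard runs once at import time, so the per-call overhead for users without the extra is a single boolean check.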
### Supported backends
Grafana Tempo, Jaeger, Honeycomb, Datadog, New Relic, Prometheus (OTLP bridge), any OTel collector.
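For the generic "any OTel collector" case, a minimal receiver configuration might look like the following. This is a sketch of standard OpenTelemetry Collector configuration, not part of this PR; the ports shown are the conventional OTLP defaults.

```
# Minimal OTel Collector config accepting both transports this PR can emit.
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317   # gRPC transport ('bmasterai[otlp]')
      http:
        endpoint: 0.0.0.0:4318   # HTTP/protobuf ('bmasterai[otlp-http]')

exporters:
  debug: {}                      # print telemetry; swap for tempo/jaeger/etc.

service:
  pipelines:
    traces:
      receivers: [otlp]
      exporters: [debug]
    metrics:
      receivers: [otlp]
      exporters: [debug]
```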
### Files changed

- `src/bmasterai/otlp.py`: new OTLP module (hooks, span management, instruments)
- `src/bmasterai/monitoring.py`: hook calls added to all `track_*` methods
- `src/bmasterai/__init__.py`: `configure_otlp` exported from top level
- `pyproject.toml`: `[otlp]` and `[otlp-http]` optional dependency groups
- `examples/otlp-export/`: working example + README with backend configs