fix(issues): Exclude gen_ai ops from consecutive HTTP detector #112517

Merged
mrduncan merged 1 commit into master from mrduncan/gen-ai-spans
Apr 8, 2026

Conversation

@mrduncan
Member

@mrduncan mrduncan commented Apr 8, 2026

The consecutive HTTP detector only excluded HTTP spans parented to gen_ai.chat spans. Agentic AI workflows use gen_ai.invoke_agent and other gen_ai.* ops, causing false positive detections on inherently sequential LLM call loops.

Broaden the filter from gen_ai.chat to any span with a gen_ai.* prefix.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@github-actions github-actions bot added the Scope: Backend Automatically applied to PRs that change backend components label Apr 8, 2026
@mrduncan mrduncan changed the title fix(issues): Exclude HTTP spans under all gen_ai.* ops from consecutive HTTP detector fix(issues): Exclude gen_ai ops from consecutive HTTP detector Apr 8, 2026
@mrduncan mrduncan marked this pull request as ready for review April 8, 2026 21:30
@mrduncan mrduncan requested a review from a team as a code owner April 8, 2026 21:30
@mrduncan mrduncan merged commit 4e9ed4a into master Apr 8, 2026
66 checks passed
@mrduncan mrduncan deleted the mrduncan/gen-ai-spans branch April 8, 2026 21:32
george-sentry pushed a commit that referenced this pull request Apr 9, 2026
