
[bridge] Fix off-by-one in sliding window size for Gemma2, Gemma3, and GPT-OSS #2656

Open
cuichenx wants to merge 2 commits into main from chcui/fix-sliding-window-off-by-one

Conversation


cuichenx (Contributor) commented Mar 5, 2026

Summary

  • Fix off-by-one error in sliding window attention size for Gemma2, Gemma3, and GPT-OSS bridges
  • HuggingFace sliding_window is inclusive (tokens within the window are attended to), while Megatron/FlashAttention window_size is exclusive — subtract 1 to align semantics
  • GPT-OSS now reads sliding_window from the HF config instead of hardcoding 128
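
The inclusive/exclusive mismatch can be illustrated with a small hypothetical helper. The `(left, right)` tuple convention mirrors FlashAttention's `window_size` argument; the function itself is not part of the bridge code, just a sketch of the conversion the fix performs:

```python
def hf_to_flash_window_size(sliding_window):
    """Map a HuggingFace `sliding_window` value (inclusive: it counts the
    tokens inside the attended window) to a FlashAttention-style
    `(left, right)` tuple (exclusive: it counts neighbors to each side,
    not the current token itself)."""
    if sliding_window is None:
        # No sliding window configured: fall back to full causal attention.
        return (-1, -1)
    # The off-by-one fix: subtract 1 when crossing from inclusive
    # (HF) to exclusive (Megatron/FlashAttention) semantics.
    return (sliding_window - 1, 0)

# Gemma2's HF config ships sliding_window=4096; GPT-OSS ships 128.
print(hf_to_flash_window_size(4096))  # (4095, 0)
print(hf_to_flash_window_size(128))   # (127, 0)
```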

Test plan

  • Verify Gemma2 conversion + inference produces matching logits
  • Verify Gemma3 conversion + inference produces matching logits
  • Verify GPT-OSS conversion + inference produces matching logits

Made with Cursor

Thanks to @returnL for catching this in NVIDIA/Megatron-LM#2771 (review)

Summary by CodeRabbit

  • Bug Fixes
    • Adjusted sliding window size calculations for Gemma2 and Gemma3 model attention mechanisms.
    • Updated GPT-OSS bridge to use dynamic sliding window configuration instead of fixed size.

…d GPT-OSS

HuggingFace sliding_window is inclusive (tokens within window are attended to),
while Megatron/FlashAttention window_size is exclusive. Subtract 1 to align
semantics. Also make GPT-OSS read sliding_window from the HF config instead of
hardcoding 128.
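
A minimal sketch of the GPT-OSS side of the change, assuming the HF config exposes a `sliding_window` attribute as described above (`SimpleNamespace` stands in for the real config object; the surrounding bridge code is not shown):

```python
from types import SimpleNamespace

# Stand-in for the HuggingFace GPT-OSS config; upstream currently sets 128,
# but reading it from the config keeps the bridge correct if that changes.
hf_config = SimpleNamespace(sliding_window=128)

# Before: window_size = (128, 0)   # hardcoded, and off by one
# After:  derive it from the config and subtract 1 for exclusive semantics.
window_size = (hf_config.sliding_window - 1, 0)
print(window_size)  # (127, 0)
```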

Made-with: Cursor

coderabbitai bot commented Mar 5, 2026

No actionable comments were generated in the recent review. 🎉

ℹ️ Recent review info
⚙️ Run configuration

Configuration used: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: b5a21f4f-1aac-42b3-8306-df13784fc2e2

📥 Commits

Reviewing files that changed from the base of the PR and between 394037d and e70253f.

📒 Files selected for processing (3)
  • src/megatron/bridge/models/gemma/gemma2_bridge.py
  • src/megatron/bridge/models/gemma/gemma3_provider.py
  • src/megatron/bridge/models/gpt_oss/gpt_oss_bridge.py

📝 Walkthrough

Walkthrough

The changes adjust sliding window size calculations in three model bridge implementations. In Gemma2Bridge, Gemma3TEDotProductAttention, and GPT-OSS bridge, window_size is modified to subtract 1 from the computed or configured window dimensions, affecting local attention behavior.

Changes

  • Gemma Model Bridges (src/megatron/bridge/models/gemma/gemma2_bridge.py, src/megatron/bridge/models/gemma/gemma3_provider.py): Adjusted sliding window size calculations to subtract 1 from configured window dimensions in attention layer configuration.
  • GPT-OSS Bridge (src/megatron/bridge/models/gpt_oss/gpt_oss_bridge.py): Changed window_size from a fixed value to the dynamically computed (sliding_window - 1).

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~8 minutes

Suggested labels

r0.3.0

🚥 Pre-merge checks | ✅ 2 | ❌ 2

❌ Failed checks (2 warnings)
  • Docstring Coverage ⚠️ Warning — Docstring coverage is 66.67%, below the required threshold of 80.00%. Resolution: write docstrings for the functions missing them.
  • Test Results For Major Changes ⚠️ Warning — The PR contains numerical changes affecting sliding window attention computation but lacks actual test results demonstrating no regression. Resolution: add test results or GitHub Actions CI logs showing successful test execution and logit validation.

✅ Passed checks (2 passed)
  • Description Check ✅ Passed — Check skipped: CodeRabbit’s high-level summary is enabled.
  • Title Check ✅ Passed — The title clearly and concisely describes the main change: fixing an off-by-one error in sliding window size across three model bridges (Gemma2, Gemma3, and GPT-OSS).

