DRAFT: LLM refactor plan + model-info/capabilities initialization design #12
base: main
Conversation
Co-authored-by: openhands <openhands@all-hands.dev>
Co-authored-by: openhands <openhands@all-hands.dev>
- Avoid inline import; keep imports at top for clarity

Co-authored-by: openhands <openhands@all-hands.dev>

- Thin LLM facade with focused modules (formatters, options, tools, transport, invokers)
- Future-ready for streaming, async, stateful Responses API
- Extract+cache model_info; pure derivations for token limits and capabilities
- Group Anthropic-specific logic (cache markers, tokens, reasoning) behind small helpers

Co-authored-by: openhands <openhands@all-hands.dev>
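The facade and model-info handling described in the commit above could look roughly like the sketch below. This is a minimal illustration only, assuming a cached `ModelInfo` record and pure derivation functions; the names `ModelInfo`, `derive_token_limits`, `derive_capabilities`, and the fallback values are assumptions, not the module's actual API.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class ModelInfo:
    max_input_tokens: int | None = None
    max_output_tokens: int | None = None
    supports_function_calling: bool = False
    supports_reasoning: bool = False


def derive_token_limits(info: ModelInfo) -> tuple[int, int]:
    # Pure derivation: fall back to conservative defaults (illustrative values)
    # only when the provider reports nothing.
    return (info.max_input_tokens or 4096, info.max_output_tokens or 1024)


def derive_capabilities(info: ModelInfo) -> dict[str, bool]:
    # Pure derivation of feature flags from the cached model_info.
    return {
        "native_tool_calling": info.supports_function_calling,
        "reasoning": info.supports_reasoning,
    }


class LLM:
    # Thin facade: formatting, options, tools, and transport would live in
    # focused helper modules; the facade only wires cached info to derivations.
    def __init__(self, model: str, info: ModelInfo):
        self.model = model
        self.max_input_tokens, self.max_output_tokens = derive_token_limits(info)
        self.capabilities = derive_capabilities(info)
```

Keeping the derivations pure makes them trivially testable and means capabilities can never drift from the cached model_info they were computed from.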
…t Anthropic grouping

- clone(): reuse profile only if model/base_url unchanged; otherwise re-resolve
- LRU cache keyed by (normalized_model, base_url)
- default http when scheme missing; no extra flags

Co-authored-by: openhands <openhands@all-hands.dev>
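A hedged sketch of the resolution caching described in this commit, assuming an `lru_cache` keyed by `(normalized_model, base_url)`; the names `resolve_profile`, `LLMConfig`, and the `_normalize_*` helpers are illustrative, not the real implementation.

```python
from dataclasses import dataclass, replace
from functools import lru_cache


def _normalize_model(model: str) -> str:
    return model.strip().lower()


def _normalize_base_url(base_url: str | None) -> str | None:
    if base_url is None:
        return None
    # Default to http when the scheme is missing; no extra flags involved.
    return base_url if "://" in base_url else f"http://{base_url}"


@lru_cache(maxsize=64)
def resolve_profile(normalized_model: str, base_url: str | None) -> dict:
    # Stand-in for the (potentially expensive) model_info / profile lookup.
    return {"model": normalized_model, "base_url": base_url}


@dataclass(frozen=True)
class LLMConfig:
    model: str
    base_url: str | None = None

    @property
    def profile(self) -> dict:
        return resolve_profile(
            _normalize_model(self.model), _normalize_base_url(self.base_url)
        )

    def clone(self, **overrides) -> "LLMConfig":
        # If model/base_url are unchanged, the cache key is identical and the
        # cached profile is reused; otherwise resolve_profile runs again.
        return replace(self, **overrides)
```

Keying the cache on the normalized pair means clone() calls that touch only unrelated fields hit the cached profile, while any change to model or base_url transparently re-resolves.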
…rify Responses path has no Anthropic

Co-authored-by: openhands <openhands@all-hands.dev>

…; keep Chat example minimal

Co-authored-by: openhands <openhands@all-hands.dev>

…ction

Co-authored-by: openhands <openhands@all-hands.dev>

…egments, live switch flow)

Co-authored-by: openhands <openhands@all-hands.dev>
…nfigure of state.agent.llm; no new Agent copies; persistence via base_state save

Co-authored-by: openhands <openhands@all-hands.dev>
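A hypothetical sketch of the live-switch flow this commit describes: the LLM held by `state.agent` is reconfigured in place (no new Agent copies) and the change is persisted through the existing base-state save. The names `ConversationState`, `Agent`, `switch_llm`, and `save_base_state` are assumptions for illustration, not the project's real API.

```python
from dataclasses import dataclass, field


@dataclass
class LLM:
    model: str
    base_url: str | None = None


@dataclass
class Agent:
    llm: LLM


@dataclass
class ConversationState:
    agent: Agent
    saved_snapshots: list = field(default_factory=list)

    def save_base_state(self) -> None:
        # Stand-in for the real base-state persistence path.
        self.saved_snapshots.append((self.agent.llm.model, self.agent.llm.base_url))


def switch_llm(state: ConversationState, model: str, base_url: str | None = None) -> None:
    # Reconfigure the LLM held by the existing agent; no new Agent copies.
    state.agent.llm = LLM(model=model, base_url=base_url)
    # Persist the change through the usual base-state save.
    state.save_base_state()
```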
Design docs for refactoring the LLM module with a focus on simplicity, readability, and future extensibility (streaming, async, stateful Responses API).
Summary
Files
Notes