Change max context size - local llm #1548
Replies: 6 comments 2 replies
Context window size is controlled by the AI runtime (Claude Code, OpenCode, etc.), not by GSD directly. GSD's architecture assumes 200K tokens per agent, so for local LLMs with different context limits the assumed window and the actual window can diverge.

*Generated by Claude Code*
No, Claude Code does not currently let you configure a custom "max context size" in its settings.
@Aashish-po: correct that Claude Code does not expose a "max context size" setting, but to add precision to your response: the relevant config lives on the GSD side, in the context-window size GSD assumes when budgeting prompts.
Yes, to a point. GSD has a configurable assumed context window, but the real hard limit still comes from your runtime/model. So for a local 262K model, you can raise GSD's assumed window, but GSD cannot make the model exceed what the runtime actually supports.
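To make that relationship concrete, here is a minimal sketch; the function name and figures are illustrative, not GSD's actual API. The usable window is always the smaller of GSD's assumed size and the runtime's hard limit.

```python
def effective_window(assumed_tokens: int, runtime_max_tokens: int) -> int:
    """The usable context is capped by whichever limit is smaller:
    the tool's assumed window or the runtime/model's hard limit."""
    return min(assumed_tokens, runtime_max_tokens)

# Raising the assumed window beyond what the runtime supports has no effect:
print(effective_window(300_000, 262_144))  # -> 262144
# A conservative assumption leaves the model's extra headroom unused:
print(effective_window(200_000, 262_144))  # -> 200000
```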
OK, after re-reading my initial post I think I wasn't clear enough, my bad. What I meant is that 200K is the assumed max context size for both Claude Code and GSD, but when running local models the actual max context size could be different, either shorter or longer. So I wanted to know if it is possible to change it in GSD (and maybe in Claude Code too, if you know how) to align it with the actual max context size of the underlying local model. I'll try setting it in config.json, thanks.
To directly answer your follow-up @marcorobiati: yes, the assumed context window can be set in GSD's config.json. This does not change what the model can actually hold; it just tells GSD how much room to assume when constructing prompts. The runtime hard limit is still set by your model and Claude Code configuration.
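As a sketch only, a config.json override might look like the following; the key name `context_window` is a hypothetical placeholder, since the exact setting name depends on your GSD version:

```json
{
  "context_window": 262144
}
```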
Is it possible to set a different max context size? This would be useful when using LLMs locally. In my case I'm currently running a local model with a 262144-token max context, so a single agent could work with a larger-than-default context, but it also means that if 4 agents work in parallel they would get 65536 tokens each, not 200K.
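The arithmetic behind that concern can be checked in a couple of lines (this assumes, as the question does, that parallel agents split one shared window evenly):

```python
def per_agent_budget(total_context: int, agents: int) -> int:
    """Evenly split a model's context window across parallel agents."""
    return total_context // agents

print(per_agent_budget(262_144, 1))  # one agent gets the full window: 262144
print(per_agent_budget(262_144, 4))  # four parallel agents: 65536 each
```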