Hello authors,
Thank you for releasing the code for your excellent work. I am currently attempting to reproduce the results of the paper.
However, I have encountered an issue with the maximum context length:
The prompts for several data points exceed 36,000 tokens. Since Qwen2.5-Coder-32B served with vLLM has a default maximum context length of 32,768 tokens, these prompts are silently truncated during reproduction.
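For reference, this is how I identified the affected data points; a minimal sketch assuming the standard Hugging Face tokenizer, where `prompts` is a placeholder for the rendered prompts from your pipeline (not a name from your code):

```python
from transformers import AutoTokenizer

# Tokenizer for the model under test; 32768 is Qwen2.5's default
# max_position_embeddings, which vLLM picks up as the max model length.
tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen2.5-Coder-32B-Instruct")
MAX_MODEL_LEN = 32768

# Placeholder: fill with the rendered prompts from the benchmark pipeline.
prompts: list[str] = []

for i, prompt in enumerate(prompts):
    n_tokens = len(tokenizer.encode(prompt))
    if n_tokens > MAX_MODEL_LEN:
        print(f"prompt {i}: {n_tokens} tokens (> {MAX_MODEL_LEN}, will be truncated)")
```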
Did your team encounter this issue with very long prompts during your experiments? It would be very helpful if you could share the strategy or technique you used to handle these instances.
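For context, the only workaround I have found so far is extending the context window with static YaRN rope scaling, sketched below. This is my own guess rather than anything from the paper; the scaling factor, the target length, and the use of `hf_overrides` (supported in recent vLLM versions; older versions take a `rope_scaling` engine argument or an edit to the model's `config.json` instead) are all assumptions on my side:

```python
from vllm import LLM

# Extend the usable context with static YaRN scaling. A factor of 2.0
# doubles the 32,768-token native window, which covers the ~36,000-token
# prompts; both values are my assumptions, not the authors' settings.
llm = LLM(
    model="Qwen/Qwen2.5-Coder-32B-Instruct",
    max_model_len=65536,
    hf_overrides={
        "rope_scaling": {
            "rope_type": "yarn",
            "factor": 2.0,
            "original_max_position_embeddings": 32768,
        }
    },
)
```

Since static YaRN applies the scaling to every request, it can slightly degrade quality on short prompts, so I would also be interested to know whether you used a different approach.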
Thank you very much for your time and assistance.