Context Length Issue with Qwen2.5-Coder-32B on Long Prompts #11

Description

@cbacode

Hello authors,

Thank you for releasing the code for your excellent work. I am currently attempting to reproduce the paper's results.

However, I have encountered an issue with the maximum context length:

The prompts for several data points exceed 36,000 tokens. Since Qwen2.5-Coder-32B officially supports a maximum context length of approximately 32,000 tokens, serving it with vLLM leads to silent truncation of these prompts during reproduction.
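For anyone hitting the same problem, a quick way to locate the offending data points is to count tokens per prompt before sending anything to vLLM. A minimal sketch (the helper name, the tokenizer wiring, and the 32,768 limit are my assumptions, not part of the repository):

```python
def find_overlong_prompts(prompts, count_tokens, max_len=32768):
    """Return (index, token_count) pairs for prompts exceeding max_len tokens."""
    overlong = []
    for i, prompt in enumerate(prompts):
        n = count_tokens(prompt)
        if n > max_len:
            overlong.append((i, n))
    return overlong

# In a real run, count_tokens would wrap the model's tokenizer, e.g.:
#   from transformers import AutoTokenizer
#   tok = AutoTokenizer.from_pretrained("Qwen/Qwen2.5-Coder-32B-Instruct")
#   count_tokens = lambda p: len(tok(p).input_ids)
```

This at least turns the silent truncation into an explicit list of affected examples, which makes it possible to report how many data points are impacted.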

Did your team encounter this issue with very long prompts during your experiments? It would be incredibly helpful if you could share the strategy or technique you used to handle these instances.
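In case it helps others, one workaround I am aware of (I cannot confirm it is what the authors did) is extending the model's context window with YaRN rope scaling when launching vLLM, along the lines of what the Qwen2.5 model card describes. A hedged sketch; the exact flag syntax and scaling parameters should be verified against your vLLM version's documentation:

```shell
# Sketch: serve Qwen2.5-Coder-32B with a context window beyond 32k tokens
# via YaRN rope scaling. The factor and max-model-len values here are
# illustrative assumptions, not values confirmed by the authors.
vllm serve Qwen/Qwen2.5-Coder-32B-Instruct \
    --max-model-len 65536 \
    --rope-scaling '{"rope_type": "yarn", "factor": 2.0, "original_max_position_embeddings": 32768}'
```

Note that static YaRN scaling can reportedly affect quality on short prompts, so it may change results relative to the paper's setup.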

Thank you very much for your time and assistance.

Metadata


Labels

    bug (Something isn't working)
