Image tokens not always included when running Qwen3-VL #233

@vincentamato

Description

I was interacting with a user in Discord who was trying to run Qwen3-VL and they kept getting this error:

Error: Error in iterating prediction stream: ValueError: Image features and image tokens do not match: tokens: 0, features 240

I'm pretty sure this means that their prompt was not including the image tokens for some reason. I confirmed that they had the latest runtime (v0.30.0) and that their application was up to date. Interestingly, the user had no issues with Qwen2.5-VL. After some debugging, I asked the user to move their ~/.cache/lm-studio folder out of their home directory and then reinstall the LM Studio application. This solved the problem, and they were able to chat with Qwen3-VL.
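For context on why a missing image token produces this exact message, here is a minimal sketch of the kind of consistency check that raises it: the runtime counts image placeholder tokens in the tokenized prompt and compares that count against the number of image feature vectors from the vision encoder. All names here are illustrative assumptions, not LM Studio's actual internals.

```python
# Hypothetical sketch of the tokens-vs-features consistency check.
# IMAGE_TOKEN_ID and check_image_tokens are illustrative, not a real API.

IMAGE_TOKEN_ID = 151655  # assumed Qwen-VL-style image placeholder id


def check_image_tokens(input_ids: list[int], num_image_features: int) -> None:
    """Raise if the prompt's image placeholders don't match the vision features."""
    n_tokens = sum(1 for t in input_ids if t == IMAGE_TOKEN_ID)
    if n_tokens != num_image_features:
        raise ValueError(
            f"Image features and image tokens do not match: "
            f"tokens: {n_tokens}, features {num_image_features}"
        )


# A prompt template that silently drops the image placeholders reproduces
# the reported "tokens: 0, features 240" case:
prompt_without_placeholders = [1, 2, 3]  # no IMAGE_TOKEN_ID present
try:
    check_image_tokens(prompt_without_placeholders, 240)
except ValueError as e:
    print(e)
```

Under this reading, a stale cached prompt template (one without the image placeholder tokens) in ~/.cache/lm-studio would explain both the error and why regenerating the cache fixed it.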

I'm not sure what was happening or why regenerating the cache folder seemed to work, so I wanted to put this on your radar. Our messages are in the model-discussion channel in the Discord server if you want to see more context about what we tried while debugging. Throughout our conversation I was unable to reproduce the error on my end.

Metadata

Assignees

Labels

bug (Something isn't working), fixed-in-next-release (The next release of LM Studio fixes this issue)

Type

No type

Projects

No projects

Milestone

No milestone

Relationships

None yet

Development

No branches or pull requests