However, the generated qwen1.5-1.8b-chat-rot_q4_0.mllm file is 1.0 GB in size, which does not match the file size of the official version hosted on ModelScope:
🔗 mllmTeam/Qwen1.5-1.8B-Chat on ModelScope
My generated file: qwen1.5-1.8b-chat-rot_q4_0.mllm (1.0 GB)
Official ModelScope file: qwen1.5-1.8b-chat-rot_q4_0.mllm (3.17 GB)
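For context, here is a back-of-the-envelope size check. The block layout below is the common ggml-style Q4_0 scheme (32 weights packed into an 18-byte block); whether mllm uses exactly this layout is an assumption, not confirmed from its source. Under that assumption, ~1 GB is roughly what a 4-bit quantized 1.8B-parameter model should occupy, while ~3 GB is closer to fp16 storage:

```python
# Rough file-size estimates for a ~1.8B-parameter model.
# Assumption: Q4_0 packs 32 weights into an 18-byte block
# (16 bytes of 4-bit values + a 2-byte fp16 scale), i.e. 4.5 bits/weight;
# fp16 stores 2 bytes per weight. Neither is confirmed from mllm itself.
PARAMS = 1.8e9

def gib(nbytes: float) -> float:
    """Convert a byte count to GiB."""
    return nbytes / 2**30

q4_0_bytes = PARAMS / 32 * 18   # 18 bytes per 32-weight block
fp16_bytes = PARAMS * 2         # 2 bytes per weight

print(f"Q4_0 estimate: {gib(q4_0_bytes):.2f} GiB")
print(f"fp16 estimate: {gib(fp16_bytes):.2f} GiB")
```

Under these assumptions the Q4_0 estimate lands near 1 GiB and the fp16 estimate near 3.4 GiB, so the two files may simply be stored at different precisions.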
Are there additional quantization or export steps I might have missed? How did you obtain the official model files?
Thank you for your help!