I want to deploy it via ollama, so I first converted it to a .gguf file with llama.cpp's convert_hf_to_gguf.py, but I got a KeyError on "<|user|>": the script references that token, yet it is not present in the added_tokens_decoder of the model's tokenizer_config.json. After commenting that line out, the conversion finished normally, and `ollama create` also completed successfully. However, when I run the model with `ollama run`, I get tensor count errors.
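
For reference, here is a minimal sketch of how I understand the situation can be checked locally, assuming the standard Hugging Face tokenizer_config.json layout and a placeholder model path:

```python
import json

# Hypothetical path to the local Hugging Face model directory -- adjust as needed.
MODEL_DIR = "path/to/model"

with open(f"{MODEL_DIR}/tokenizer_config.json", encoding="utf-8") as f:
    cfg = json.load(f)

# added_tokens_decoder maps token IDs to metadata dicts whose "content" field is the token string.
declared = {entry["content"] for entry in cfg.get("added_tokens_decoder", {}).values()}

# "<|user|>" is the token whose lookup raised the KeyError during conversion.
print("<|user|>", "is declared" if "<|user|>" in declared else "is MISSING from added_tokens_decoder")
```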

I wonder what happens when the <|user|> token lookup is removed, and how should I fix this?