
Convert to GGUF and run with Ollama #10

@Aniwine

Description

I want to deploy the model via Ollama, so I first converted it to a .gguf file with llama.cpp's convert_hf_to_gguf.py. The script failed with KeyError: "<|user|>": that token is referenced in convert_hf_to_gguf.py, but it is not present in the added_tokens_decoder of my tokenizer_config.json (see the check after the screenshot). After commenting that line out, the conversion finished normally, and ollama create also completed successfully. But when I run the model with ollama run, I get a tensor count error:
[screenshot of the ollama tensor count error]
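For reference, this is roughly how I confirmed that the token is missing from the tokenizer config (the path below is just a placeholder for my local model directory):

```python
import json
from pathlib import Path

# Placeholder path to the local Hugging Face model directory
model_dir = Path("path/to/my-model")

with open(model_dir / "tokenizer_config.json", encoding="utf-8") as f:
    config = json.load(f)

# added_tokens_decoder maps token IDs to token metadata,
# e.g. {"32000": {"content": "<|user|>", "special": true, ...}}
added = config.get("added_tokens_decoder", {})
contents = {entry.get("content") for entry in added.values()}

print("<|user|>" in contents)  # prints False for my model, matching the KeyError
```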

What does removing the <|user|> entry actually change, and how should I fix this properly?
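One thing I am considering trying instead of commenting out the converter line, assuming <|user|> already exists in the model's vocabulary, is to register it as an additional special token and re-save the tokenizer before converting. This is only a sketch of the idea (the path is a placeholder), and I am not sure it is the right fix:

```python
from transformers import AutoTokenizer

# Placeholder path to the local Hugging Face model directory
model_dir = "path/to/my-model"

tokenizer = AutoTokenizer.from_pretrained(model_dir, trust_remote_code=True)

# If "<|user|>" is already in the vocabulary, this should reuse its existing ID
# and only mark it as special, so the embedding size stays unchanged.
num_added = tokenizer.add_special_tokens(
    {"additional_special_tokens": ["<|user|>"]}
)
print("newly added tokens:", num_added)  # expect 0 if the token already exists

# Rewrites tokenizer_config.json, including added_tokens_decoder
tokenizer.save_pretrained(model_dir)
```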
