
Added support for llama.cpp and fixed serverless chat complete #57

Open
twesterm wants to merge 6 commits into vast-ai:main from twesterm:llama_cpp

Conversation

@twesterm

No description provided.

@twesterm twesterm changed the title from "Llama cpp" to "Added support for llama.cpp and fixed serverless chat complete" on Nov 11, 2025
@twesterm
Author

A cheap model to test with: https://huggingface.co/mradermacher/Rocinante-12B-v1.1-GGUF
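For context, a llama.cpp server exposes an OpenAI-compatible `/v1/chat/completions` endpoint, so a GGUF model like the one above can be exercised with plain stdlib Python. This is a hypothetical sketch, not code from this PR: the server URL, port, and quantization filename are assumptions.

```python
import json
import urllib.request


def build_chat_request(model: str, prompt: str) -> bytes:
    """Build an OpenAI-style chat-completion request body."""
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 64,
    }
    return json.dumps(body).encode("utf-8")


def chat_complete(base_url: str, model: str, prompt: str) -> str:
    """POST to a llama.cpp server, e.g. one started with:
    llama-server -m Rocinante-12B-v1.1.Q4_K_M.gguf --port 8080
    (filename is an example; pick any quant from the HF repo)."""
    req = urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=build_chat_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]
```

Usage would look like `chat_complete("http://localhost:8080", "rocinante", "hello")`, assuming a local llama.cpp server is listening on that port.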

@Colter-Downing
Contributor

Looks like these changes have been implemented into main. The lib/backend change is included, and the openai client has been refactored. Thanks for the contribution!!

