Haollama-proxy

Haollama-proxy is a Python-based proxy between Home Assistant's Ollama integration and the actual Ollama server, for tweaking the queries and responses on the fly.

Its purpose is to:

  • [DONE] Remove the extra <think></think> tags that Qwen3 produces (see the sketch after this list)
  • [TODO] Enable tweaking Home Assistant's hard coded LLM prompt
  • [TODO] Allow controlling devices outside of Home Assistant
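
For illustration, here is a minimal sketch of the first item: a small HTTP proxy that forwards /api/chat requests to an Ollama server and strips the <think></think> block from the reply before handing it back to Home Assistant. The endpoint, ports, and response shape are assumptions based on Ollama's public API, not taken from the actual haollama-proxy code.

```python
import re

import requests
from flask import Flask, jsonify, request

app = Flask(__name__)
OLLAMA_URL = "http://localhost:11434"  # assumption: default Ollama address

# Qwen3 emits its reasoning inside <think>...</think>; drop it entirely.
THINK_RE = re.compile(r"<think>.*?</think>\s*", re.DOTALL)


@app.route("/api/chat", methods=["POST"])
def chat():
    payload = request.get_json(force=True)
    payload["stream"] = False  # simplification: buffer the whole reply

    # Forward the query unchanged to the real Ollama server.
    resp = requests.post(f"{OLLAMA_URL}/api/chat", json=payload, timeout=300)
    data = resp.json()

    # Strip the <think> block before returning the response to Home Assistant.
    message = data.get("message", {})
    if "content" in message:
        message["content"] = THINK_RE.sub("", message["content"])
    return jsonify(data)


if __name__ == "__main__":
    # Point Home Assistant's Ollama integration at this port instead of 11434.
    app.run(host="0.0.0.0", port=11435)  # assumption: proxy port
```

Pointing Home Assistant's Ollama integration at the proxy's address instead of the Ollama server is what puts the proxy in the request path; the same hook is where prompt tweaking and external device control could later be added.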
