A local AI chatbot powered by open models. It combines OpenHermes for text generation and LLaVA for image recognition, all running locally through llama.cpp, with no external API calls required.
- User management
- Conversation history
- Offline conversations
- Image recognition (temporary and stored images)
- Multiple personalities
- Conversational AI using OpenHermes
- Image understanding with LLaVA
- Local inference via llama.cpp
- Python backend for user, conversation, and message CRUD, model management, and prompt orchestration
- React + Vite frontend
- Modular structure (backend and frontend separated for easy customization)
- SQLite (sqlite3) for user, conversation, and message storage
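The schema itself lives in `server/db/` and isn't reproduced here; as a rough sketch of what this storage layer could look like, the three sqlite3 tables might be defined along these lines (table and column names are assumptions, not the project's actual definitions):

```python
import sqlite3

# Hypothetical schema sketch -- the real tables in server/db/ may differ.
SCHEMA = """
CREATE TABLE IF NOT EXISTS users (
    id       INTEGER PRIMARY KEY AUTOINCREMENT,
    username TEXT UNIQUE NOT NULL
);
CREATE TABLE IF NOT EXISTS conversations (
    id      INTEGER PRIMARY KEY AUTOINCREMENT,
    user_id INTEGER NOT NULL REFERENCES users(id),
    title   TEXT
);
CREATE TABLE IF NOT EXISTS messages (
    id              INTEGER PRIMARY KEY AUTOINCREMENT,
    conversation_id INTEGER NOT NULL REFERENCES conversations(id),
    role            TEXT NOT NULL,   -- "user" or "assistant"
    content         TEXT NOT NULL,
    created_at      TEXT DEFAULT CURRENT_TIMESTAMP
);
"""

def init_db(path: str = ":memory:") -> sqlite3.Connection:
    """Open the SQLite database and create the tables if missing."""
    conn = sqlite3.connect(path)
    conn.executescript(SCHEMA)
    return conn
```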
chat-bot/
├── client/ # React + Vite frontend
├── server/ # Python backend
│ ├── controller/ # HTTP handlers / controllers (API routes -> services)
│ ├── service/ # Model management & inference logic
│ ├── uploads/
│ ├── db/
│ ├── utils/
│ ├── prompts/
│ ├── models/
│ └── app.py
git clone https://github.com/damian5/chat-bot.git
cd chat-bot
See server/README.md for full setup.
See client/README.md for full setup.
- Some browsers require a secure context to access the camera. If you want to use that feature, you will have to serve the app over HTTPS.
1. Install mkcert
brew install mkcert
brew install nss # if you use Firefox
mkcert --install
You can choose the URL you want to use; I recommend your local hostname for simplicity. You can configure it in System Preferences > Sharing > Local Hostname.
mkcert <ip_address | local_hostname> 127.0.0.1 ::1
Note: If you want to access the app through another device, like your phone, you will need to send the root certificate to that device and install it. You can find its location with:
mkcert -CAROOT
├── client/
│ ├── cert.pem
│ ├── cert-key.pem
├── server/
│ ├── cert.pem
│ ├── cert-key.pem
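Assuming the client reads the certificate pair from its own directory, enabling HTTPS for the Vite dev server could look roughly like this in `client/vite.config.ts` (the file paths and options are a sketch, not the project's actual config):

```typescript
import fs from "node:fs";
import { defineConfig } from "vite";

export default defineConfig({
  server: {
    https: {
      // Certificate pair generated by mkcert and copied into client/
      key: fs.readFileSync("./cert-key.pem"),
      cert: fs.readFileSync("./cert.pem"),
    },
    // Listen on all interfaces so other devices (e.g. your phone)
    // on the network can reach the dev server.
    host: true,
  },
});
```

The server side would need an equivalent change so the backend is also reachable over HTTPS, otherwise the browser may block mixed content.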
Here’s a quick look at the chatbot interface:
MIT
Built with ❤️ by Damian