This project is a chatbot designed to assist the residents of the city of Anthony by providing accurate and helpful information based on the data available in the Redis vector store. The chatbot uses LangChain and ChatOllama for natural language processing and retrieval.
- Docker
- Docker Compose
- Clone the repository:

  ```sh
  git clone https://github.com/CaptainZiboo/anthony
  cd anthony
  ```

- Ensure you have the necessary Docker images and volumes:

  ```sh
  docker-compose pull
  ```
- Build the custom Docker image for the ollama service:

  ```sh
  docker-compose build
  ```
- Start the services:

  ```sh
  docker-compose up -d
  ```
This will start the following services:
- ollama: The main service for the chatbot, which will pull the necessary models (llama3.1 and mxbai-embed-large).
- redis-stack: The Redis stack for storing vector data.
- redis-insight: A web interface for managing Redis.
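For orientation, the three services can be pictured as a compose layout like the hypothetical sketch below. The image names, build context, and port mappings here are illustrative assumptions; the actual docker-compose.yml in the repository is authoritative.

```yaml
# Hypothetical sketch only — see the repository's docker-compose.yml.
services:
  ollama:
    build: .              # custom image built by `docker-compose build`
    ports:
      - "3000:3000"       # chatbot web UI (see Access the services)
  redis-stack:
    image: redis/redis-stack:latest
  redis-insight:
    image: redislabs/redisinsight:latest
    ports:
      - "5540:5540"       # Redis Insight web UI
```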
If the models are not pulled automatically, download them directly from the container:

```sh
docker compose exec ollama ollama pull llama3.1
docker compose exec ollama ollama pull mxbai-embed-large
```

- Access the services:
- Chatbot Web UI: http://localhost:3000
- Redis Insight: http://localhost:5540
The chatbot answers questions, provides summaries, and handles similar requests based on the data available in the Redis vector store.
- To ask a question: Simply type your question in the chatbot interface.
- To get a summary: Type a request for a summary in the chatbot interface.
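Under the hood, answering a question means embedding it and ranking the stored documents by embedding similarity before handing the best matches to the model. A minimal pure-Python sketch of that ranking step — the sample texts, the three-dimensional vectors, and the `retrieve` helper are illustrative stand-ins; in this project the embeddings come from mxbai-embed-large and the search runs inside Redis:

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Toy "vector store": (text, embedding) pairs. Real embeddings are
# high-dimensional; these tiny vectors are for illustration only.
store = [
    ("Trash is collected on Tuesdays.", [0.9, 0.1, 0.0]),
    ("City hall opens at 8 am.",        [0.1, 0.9, 0.1]),
    ("The pool closes in winter.",      [0.0, 0.2, 0.9]),
]

def retrieve(query_vec, k=2):
    # Rank stored documents by similarity to the query embedding,
    # most similar first, and return the top k texts.
    ranked = sorted(store, key=lambda d: cosine(query_vec, d[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

print(retrieve([0.8, 0.2, 0.1]))
```

The retrieved texts are then passed to the chat model as context, which is the pattern LangChain wires up between the embedding model and ChatOllama.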
To develop and test the chatbot locally, you can use the following commands:
- To stop the services:

  ```sh
  docker-compose down
  ```

- To view logs:

  ```sh
  docker-compose logs -f
  ```