An interactive question-answering bot for a pizza restaurant, built using LangChain, Ollama's LLMs, and ChromaDB. The bot answers user questions by leveraging realistic customer reviews stored in a local vector database.
- Uses LangChain and the Ollama LLM (`llama3`) to generate accurate, context-aware responses.
- Embeds customer reviews using the `mxbai-embed-large` model for efficient semantic search.
- Stores embedded reviews in ChromaDB, a local vector database.
- Retrieves the top 5 most relevant reviews per query to provide informed answers.
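The "top 5 most relevant reviews" step can be illustrated with a minimal, dependency-free sketch of what the vector store does under the hood: compare a query embedding against stored review embeddings and return the closest matches by cosine similarity. The tiny 3-dimensional vectors below are toy stand-ins for real `mxbai-embed-large` output, and the review texts are invented for the example.

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def top_k(query_vec, review_vecs, reviews, k=5):
    # Rank reviews by similarity to the query vector, highest first.
    scored = sorted(zip(reviews, review_vecs),
                    key=lambda rv: cosine(query_vec, rv[1]),
                    reverse=True)
    return [r for r, _ in scored[:k]]

# Toy "embeddings" in place of real model output.
reviews = ["Great crust", "Slow service", "Cold pizza",
           "Friendly staff", "Perfect cheese", "Too salty"]
vecs = [[0.9, 0.1, 0.0], [0.1, 0.8, 0.1], [0.2, 0.1, 0.7],
        [0.0, 0.9, 0.1], [0.8, 0.2, 0.1], [0.3, 0.3, 0.4]]
print(top_k([1.0, 0.0, 0.0], vecs, reviews))
```

In the real project, ChromaDB performs this nearest-neighbour search over the stored review embeddings and its retriever is configured to return 5 results.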
To run this project, you need the following Python packages:
- `langchain`
- `langchain-ollama`
- `langchain-chroma`
- `pandas`
Install all dependencies using the provided requirements.txt:
```shell
pip install -r requirements.txt
```
- Install Ollama from the official website: https://ollama.com/download
- Pull the required models:

  ```shell
  ollama pull llama3
  ollama pull mxbai-embed-large
  ```

- Ensure the Ollama service is running in the background before executing any scripts.
```
├── main.py                           # Main QA bot loop
├── vector.py                         # Embeds reviews into ChromaDB
├── realistic_restaurant_reviews.csv  # Customer review dataset
├── requirements.txt                  # Project dependencies
└── README.md                         # Project documentation
```
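Before embedding, `vector.py` has to turn each CSV row into a document. A hedged, stdlib-only sketch of that step is below; the column names (`Title`, `Review`, `Rating`, `Date`) and the sample rows are assumptions for illustration, so check them against the actual header of `realistic_restaurant_reviews.csv`.

```python
import csv
import io

# Stand-in for realistic_restaurant_reviews.csv; the real columns may differ.
SAMPLE_CSV = """Title,Review,Rating,Date
Great pizza,The crust was perfectly crispy.,5,2024-01-15
Slow night,Waited 40 minutes for a small pie.,2,2024-02-03
"""

def load_documents(csv_text):
    # Combine the title and review body into one text chunk per row,
    # keeping rating/date as metadata. This mirrors the shape of a
    # LangChain Document (page_content + metadata) without importing it.
    docs = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        docs.append({
            "page_content": f"{row['Title']} {row['Review']}",
            "metadata": {"rating": row["Rating"], "date": row["Date"]},
        })
    return docs

docs = load_documents(SAMPLE_CSV)
print(len(docs), docs[0]["metadata"]["rating"])
```

In the actual script these documents are embedded with `mxbai-embed-large` and written into a persistent ChromaDB collection, which is why `vector.py` only needs to run once.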
- Clone the repository:

  ```shell
  git clone https://github.com/kawish918/CritiqueChain.git
  cd CritiqueChain
  ```

- Install dependencies:

  ```shell
  pip install -r requirements.txt
  ```

- Create the vector database (only needed once):

  ```shell
  python vector.py
  ```

- Start the QA chatbot:

  ```shell
  python main.py
  ```
Ask questions about the restaurant! Type `q` to quit.
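The interactive loop in `main.py` likely follows the shape sketched below. The retriever and LLM are stubbed with plain functions here, since running the real chain needs a live Ollama service; the function names (`answer`, `main_loop`) and the stub behaviour are illustrative assumptions, not the project's actual code.

```python
def answer(question, retrieve, generate):
    # Retrieve relevant reviews, then hand them to the LLM as context.
    reviews = retrieve(question)
    return generate(question, reviews)

def main_loop(input_fn=input, retrieve=None, generate=None):
    # Mirrors the bot's prompt loop: keep asking until the user types "q".
    while True:
        q = input_fn("Ask a question (q to quit): ")
        if q.strip().lower() == "q":
            break
        print(answer(q, retrieve, generate))

# Stubs standing in for the ChromaDB retriever and the llama3 chain.
fake_retrieve = lambda q: ["The crust was perfectly crispy."]
fake_generate = lambda q, ctx: f"Based on {len(ctx)} review(s): yes!"
print(answer("Is the crust good?", fake_retrieve, fake_generate))
```

In the real script, `retrieve` would be the ChromaDB retriever's search and `generate` a LangChain prompt-plus-`llama3` invocation.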
- Ensure Ollama is running before starting any script.
- This project runs entirely locally with no cloud dependencies.