Ask questions and get answers directly from your PDFs. This Retrieval-Augmented Generation (RAG) system is powered by OpenAI, LangChain, and the Chroma vector database, with an intuitive Streamlit UI, and the whole app is containerized for easy deployment.
- Upload one or more PDF documents and extract knowledge.
- Semantic search using embeddings stored in ChromaDB.
- Query your PDFs naturally with OpenAI-powered RAG.
- Uses LangChain 1.0 to create the agent and orchestrate prompts.
- Clean Streamlit interface for interactive chatting.
- Fully containerized for easy deployment with Docker Compose.
- Manage documents: add, delete, or clear chat history.

Example: Upload PDFs, ask questions, and get answers.
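Under the hood, the question-answering flow boils down to: embed document chunks, store the vectors, and retrieve the chunk most similar to the question before handing it to the LLM as context. Here is a minimal, self-contained sketch of that retrieval step, using a toy bag-of-words embedding in place of OpenAI embeddings and a plain list in place of ChromaDB, so it runs without an API key:

```python
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    # Toy embedding: bag-of-words term counts (the app uses OpenAI embeddings).
    return Counter(re.findall(r"\w+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Toy "vector store": (chunk, vector) pairs, standing in for a Chroma collection.
chunks = [
    "Invoices are due within 30 days of receipt.",
    "The warranty covers manufacturing defects for two years.",
]
store = [(chunk, embed(chunk)) for chunk in chunks]

def retrieve(question: str) -> str:
    # Return the chunk whose vector is closest to the question's vector.
    qv = embed(question)
    return max(store, key=lambda pair: cosine(qv, pair[1]))[0]

print(retrieve("How long is the warranty?"))
# → The warranty covers manufacturing defects for two years.
```

In the real app this similarity search is performed by ChromaDB, and the retrieved chunks are injected into the prompt by the LangChain agent.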
- Docker & Docker Compose installed
- `.env` file with your OpenAI API key:

  ```
  OPENAI_API_KEY=your_api_key_here
  ```

- Clone the repository:

  ```
  git clone https://github.com/your-username/chat-with-documents.git
  cd chat-with-documents
  ```

- Copy the example environment file:

  ```
  cp .env.example .env
  ```

- Set your OpenAI API key:

  Open the `.env` file and replace `your_api_key_here` with your actual OpenAI API key:

  ```
  OPENAI_API_KEY=sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
  ```

- Build and start the containers:

  ```
  docker-compose up --build
  ```

- Access the app:

  Open http://localhost:8501 in your browser.
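For orientation, a compose file for a setup like this might look as follows. This is an illustrative sketch, not the repository's actual file: the service name, port, and volume path are assumptions; the `docker-compose.yml` in the repo is authoritative.

```yaml
services:
  app:
    build: .                      # built from the Dockerfile in the repo root
    ports:
      - "8501:8501"               # Streamlit's default port
    env_file:
      - .env                      # supplies OPENAI_API_KEY to the container
    volumes:
      - chroma_data:/app/chroma   # illustrative: persist the Chroma index

volumes:
  chroma_data:
```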
- Upload PDF files via the sidebar.
- Click Submit to process documents.
- Ask questions in the chat input at the bottom.
- Use sidebar buttons to Clear Chat or Delete Docs.
- Enjoy AI-powered answers directly from your PDFs!
```
.
├── app/                  # Streamlit app code
├── utils/                # Helper modules (agent, config, pdf processor, etc.)
├── docker-compose.yml    # Container orchestration
├── Dockerfile            # App container configuration
├── .env.example          # Environment variables template
├── pyproject.toml        # Project dependencies
└── README.md
```
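To give a sense of what the pdf-processor helper in `utils/` typically does, here is a hypothetical chunking function that splits extracted text into overlapping windows before embedding. The `chunk_text` name, chunk size, and overlap are illustrative assumptions, not values taken from this repository:

```python
def chunk_text(text: str, size: int = 500, overlap: int = 100) -> list[str]:
    # Slide a window of `size` characters across the text, stepping by
    # size - overlap so adjacent chunks share context at their boundaries.
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + size])
        start += size - overlap
    return chunks

pages = "A" * 1200  # stand-in for text extracted from a PDF
print(len(chunk_text(pages)))
# → 3
```

The overlap keeps sentences that straddle a chunk boundary retrievable from at least one chunk.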