A modern document Q&A system built with FastAPI, LangChain, and Next.js that allows users to upload documents and ask questions about their content using Retrieval-Augmented Generation (RAG).
- Document Upload: Support for PDF and TXT files
- Smart Search: Document-specific search with global fallback capabilities
- Real-time Streaming: Server-Sent Events for live query responses
- Contextual Compression: Automatic result sanitization and deduplication (see the sketch after this list)
- Modern UI: Beautiful, responsive interface built with Next.js and Tailwind CSS
- Vector Database: ChromaDB for efficient document storage and retrieval
- OpenAI Integration: Human-readable answer formatting
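As an illustration of the contextual compression feature, here is a generic LangChain pattern for filtering redundant and irrelevant chunks. This is a minimal sketch assuming the `langchain`, `langchain-community`, and `langchain-openai` packages; the example texts, threshold, and wiring are made up and may not match how this project implements it.

```python
from langchain.retrievers import ContextualCompressionRetriever
from langchain.retrievers.document_compressors import (
    DocumentCompressorPipeline,
    EmbeddingsFilter,
)
from langchain_community.document_transformers import EmbeddingsRedundantFilter
from langchain_community.vectorstores import Chroma
from langchain_openai import OpenAIEmbeddings

embeddings = OpenAIEmbeddings()

# Any retriever can serve as the base; an in-memory Chroma collection keeps this self-contained.
base_retriever = Chroma.from_texts(
    [
        "Paris is the capital of France.",
        "Paris is the capital of France.",  # deliberate duplicate
        "The Eiffel Tower is in Paris.",
    ],
    embedding=embeddings,
).as_retriever(search_kwargs={"k": 3})

compressor = DocumentCompressorPipeline(
    transformers=[
        EmbeddingsRedundantFilter(embeddings=embeddings),                    # drop near-duplicate chunks
        EmbeddingsFilter(embeddings=embeddings, similarity_threshold=0.75),  # keep only query-relevant chunks
    ]
)
retriever = ContextualCompressionRetriever(base_compressor=compressor, base_retriever=base_retriever)

print(retriever.invoke("Where is the Eiffel Tower?"))
```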
- Backend: FastAPI with LangChain for RAG implementation
- Frontend: Next.js 15 with TypeScript and Tailwind CSS
- Vector Database: ChromaDB for document embeddings
- AI Provider: OpenAI for text generation and embeddings
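Put together, the stack above amounts to a retrieval-augmented generation loop: embed the question, pull the most similar chunks out of ChromaDB, and have the OpenAI model answer from that context. A minimal sketch, assuming recent `langchain-openai`/`langchain-community` releases and a hypothetical `documents` collection (the collection name, model, and prompt are illustrative, not taken from this repo):

```python
import chromadb
from langchain_community.vectorstores import Chroma
from langchain_openai import ChatOpenAI, OpenAIEmbeddings

# Connect to the ChromaDB instance that docker-compose exposes on port 8001.
chroma_client = chromadb.HttpClient(host="localhost", port=8001)
vectorstore = Chroma(
    client=chroma_client,
    collection_name="documents",  # hypothetical collection name
    embedding_function=OpenAIEmbeddings(),
)

retriever = vectorstore.as_retriever(search_kwargs={"k": 4})
llm = ChatOpenAI(model="gpt-4o-mini")  # any OpenAI chat model works here

def answer(question: str) -> str:
    # Retrieve the most relevant chunks, then ask the model to answer from them only.
    docs = retriever.invoke(question)
    context = "\n\n".join(doc.page_content for doc in docs)
    prompt = (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    return llm.invoke(prompt).content
```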
- Python 3.8+
- Node.js 18+
- Docker and Docker Compose
- OpenAI API key
Clone the repository:

```bash
git clone <your-repo-url>
cd ask-docs
```

First, start the ChromaDB vector database using Docker:

```bash
cd server
docker-compose up -d
```

This will start ChromaDB on port 8001.
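If you want to confirm the database is reachable before going further, a quick check from Python (assuming the `chromadb` client from the server requirements) looks like this:

```python
# Optional connectivity check against the Dockerized ChromaDB instance.
import chromadb

client = chromadb.HttpClient(host="localhost", port=8001)
print(client.heartbeat())  # prints a timestamp if the server is up
```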
Next, install the backend dependencies:

```bash
cd server
pip install -r requirements.txt
```

Create a `.env` file in the server directory:

```
OPENAI_API_KEY=your_openai_api_key_here
```
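The backend reads this key at startup. As a rough sketch of how that typically works with python-dotenv (the actual loading code in this repo may differ):

```python
# Hypothetical snippet: load server/.env and verify the OpenAI key is present.
import os

from dotenv import load_dotenv

load_dotenv()  # picks up .env from the current working directory
assert os.getenv("OPENAI_API_KEY"), "OPENAI_API_KEY is missing - check your .env file"
```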
Install the frontend dependencies:

```bash
cd client
npm install
```

Then start the backend server:

```bash
cd server
python main.py
```

The FastAPI server will start on http://localhost:8000
In a separate terminal, start the frontend:

```bash
cd client
npm run dev
```

The Next.js application will start on http://localhost:3000
- `POST /api/documents/upload` - Upload a document
- `POST /api/documents/query` - Query documents
- `POST /api/documents/query/stream` - Stream query responses
- `GET /api/documents/stats` - Get document statistics
- `GET /api/documents/health` - Health check
- `GET /docs` - API documentation (Swagger UI)
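For reference, here is one way to exercise these endpoints from Python with the `requests` library. The routes come from the list above, but the field names (`file`, `question`) are assumptions; check the Swagger UI at `/docs` for the actual request schemas.

```python
import requests

BASE = "http://localhost:8000"

# Upload a document (the multipart field name "file" is an assumption)
with open("example.pdf", "rb") as f:
    print(requests.post(f"{BASE}/api/documents/upload", files={"file": f}).json())

# Ask a question about the uploaded documents (the payload key "question" is an assumption)
resp = requests.post(f"{BASE}/api/documents/query",
                     json={"question": "What is this document about?"})
print(resp.json())

# Consume the streaming endpoint (Server-Sent Events over plain HTTP)
with requests.post(f"{BASE}/api/documents/query/stream",
                   json={"question": "Summarize the document"}, stream=True) as stream:
    for line in stream.iter_lines(decode_unicode=True):
        if line:
            print(line)
```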
- Start ChromaDB: `docker-compose up -d`
- Start Server: `python main.py` (from the server directory)
- Start Client: `npm run dev` (from the client directory)
- Open Browser: Navigate to http://localhost:3000
- Upload Document: Click the upload button and select a PDF or TXT file
- Ask Questions: Start asking questions about your uploaded document!
The backend is built with FastAPI and includes:
- Document processing with LangChain
- Vector embeddings with OpenAI
- ChromaDB integration for document storage
- Streaming responses with Server-Sent Events
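To make the streaming piece concrete, here is a stripped-down sketch of Server-Sent Events in FastAPI. It is illustrative only and assumes nothing about this repo's actual route handlers; the real service would stream LLM tokens instead of the placeholder chunks.

```python
import asyncio

from fastapi import FastAPI
from fastapi.responses import StreamingResponse
from pydantic import BaseModel

app = FastAPI()

class Query(BaseModel):
    question: str

async def token_stream(question: str):
    # Stand-in for streaming LLM output chunk by chunk.
    for chunk in ["Retrieving context... ", "generating answer... ", "done."]:
        yield f"data: {chunk}\n\n"  # SSE frame: a "data:" line followed by a blank line
        await asyncio.sleep(0)

@app.post("/api/documents/query/stream")
async def stream_query(query: Query):
    return StreamingResponse(token_stream(query.question), media_type="text/event-stream")
```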
The frontend is built with Next.js 15 and includes:
- Modern React with TypeScript
- Tailwind CSS for styling
- Real-time chat interface
- File upload functionality
- Responsive design
```
ask-docs/
├── client/                # Next.js frontend
│   ├── src/app/           # App router components
│   ├── public/            # Static assets
│   └── package.json       # Frontend dependencies
├── server/                # FastAPI backend
│   ├── controller/        # API controllers
│   ├── service/           # Business logic
│   ├── config/            # Configuration files
│   ├── main.py            # FastAPI application
│   └── requirements.txt   # Python dependencies
├── assets/                # Media files
│   ├── photo.png          # Screenshot
│   └── recording.mov      # Demo video
└── README.md              # This file
```
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests if applicable
- Submit a pull request
This project was created for learning purposes and is open source. Feel free to use, modify, and learn from this code.
Note: This is a learning project demonstrating RAG (Retrieval-Augmented Generation) implementation with FastAPI, LangChain, and Next.js. The code is provided as-is for educational use.
