A modern, intelligent chatbot application built with FastAPI and LangChain, featuring a Streamlit web interface for interactive conversations. This project demonstrates an AI-powered conversational agent with support for multiple conversation threads and streaming responses.

Features:
- FastAPI Backend: High-performance REST API with automatic OpenAPI documentation
- LangChain Integration: Leverages LangChain for agent orchestration and conversation management
- Thread Management: Support for multiple concurrent conversation threads with persistent history
- Streaming Responses: Real-time streaming of chatbot responses for improved UX
- Streamlit Frontend: Interactive web interface for chatting with the bot
- Database Integration: MongoDB-based storage for conversation history and thread management
- State Management: Sophisticated state machine for managing chat agent behavior
Tech stack:

Backend:
- FastAPI 0.128.0
- LangChain (AI/LLM orchestration)
- Uvicorn (ASGI server)
- Python 3.13
Frontend:
- Streamlit
- Requests (HTTP client)
Database:
- MongoDB
Additional Libraries:
- python-dotenv (environment configuration)
- Various AI/ML dependencies
Project structure:

```
.
├── main.py                     # FastAPI application entry point
├── new_app.py                  # Enhanced Streamlit interface with threading
├── app.py                      # Alternative Streamlit interface (basic)
├── src/
│   ├── agents/
│   │   └── chat_agent/
│   │       ├── graph.py        # Chat agent graph builder
│   │       ├── nodes/          # Agent nodes/actions
│   │       ├── states/         # State definitions
│   │       └── tools/          # Agent tools
│   ├── handlers/
│   │   └── chat_handler.py     # Request handlers for chat operations
│   ├── routes/
│   │   └── chat_route.py       # API route definitions
│   ├── services/
│   │   └── database_service.py # Database operations
│   └── main.py
└── my_env/                     # Python virtual environment
```
Prerequisites:

- Python 3.13+
- MongoDB instance (local or cloud)
- pip or poetry for dependency management
Installation:

- Clone the repository:

  ```bash
  git clone https://github.com/krgaurav7/Chatbot.git
  cd Chatbot
  ```

- Create and activate a virtual environment:
  ```bash
  # Windows
  python -m venv my_env
  my_env\Scripts\activate

  # Linux/macOS
  python3 -m venv my_env
  source my_env/bin/activate
  ```

- Install dependencies:
  ```bash
  pip install -r requirements.txt
  ```

  Or, if using the virtual environment included in the repo:

  ```bash
  my_env\Scripts\activate
  ```

- Configure environment variables:
  Create a `.env` file in the root directory:

  ```env
  DB_URI=mongodb://localhost:27017/chatbot
  # Add other required environment variables
  ```

- Start the FastAPI backend:
  ```bash
  python main.py
  ```

  The API will be available at http://localhost:8000.
- API documentation: http://localhost:8000/docs
- In another terminal, start the Streamlit interface:

  ```bash
  # Enhanced version with threading support
  streamlit run new_app.py

  # Or basic version
  streamlit run app.py
  ```

  The interface will open at http://localhost:8501.
API endpoints:

- `POST /chat/{thread_id}` - Send a message and get a response
  - Parameters: `message` (query parameter)
  - Returns: Chat state with messages
- `POST /stream/{thread_id}` - Stream a chat response
  - Parameters: `message` (query parameter)
  - Returns: Streaming text response
- `GET /chat/threads` - Get all conversation threads
  - Returns: List of thread IDs
- `GET /chat/history/{thread_id}` - Get conversation history
  - Returns: Chat state with message history
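The endpoints above can also be exercised from Python with the `requests` library. This is a minimal sketch: the thread id is an arbitrary example, and the exact JSON shapes depend on the backend's chat state model.

```python
import requests

BASE_URL = "http://localhost:8000"
thread_id = "thread-abc123"  # arbitrary example id

# Send a message and get the full response back
reply = requests.post(f"{BASE_URL}/chat/{thread_id}", params={"message": "Hello there"})
print(reply.json())

# List all conversation threads
print(requests.get(f"{BASE_URL}/chat/threads").json())

# Fetch the stored history for one thread
print(requests.get(f"{BASE_URL}/chat/history/{thread_id}").json())
```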
The backend follows a modular architecture:
- FastAPI Application (`main.py`); a wiring sketch follows this list
  - Manages the application lifecycle
  - Database initialization on startup
  - Request routing
- Chat Agent (`src/agents/chat_agent/`)
  - Graph-based agent orchestration
  - State management for conversations
  - Tool integration for extended capabilities
- Route Handler (`src/routes/`)
  - REST endpoint definitions
  - Request/response mapping
- Business Logic (`src/handlers/`)
  - Chat processing logic
  - Streaming response handling
  - Thread and history management
- Database Service (`src/services/`)
  - MongoDB integration
  - Data persistence
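As a rough illustration of how these backend pieces could fit together, here is a hedged sketch of the wiring in `main.py`. The import paths follow the project tree, but the exported names (`router`, `init_db`, `close_db`) are assumptions made for this example, not the repository's actual identifiers.

```python
from contextlib import asynccontextmanager

from fastapi import FastAPI

# Assumed export names; the real modules may expose different functions.
from src.routes.chat_route import router as chat_router
from src.services.database_service import init_db, close_db


@asynccontextmanager
async def lifespan(app: FastAPI):
    await init_db()   # open the MongoDB connection on startup
    yield
    await close_db()  # release it on shutdown


app = FastAPI(title="Chatbot API", lifespan=lifespan)
app.include_router(chat_router)

if __name__ == "__main__":
    import uvicorn

    uvicorn.run(app, host="0.0.0.0", port=8000)
```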
- Streamlit Interface (`new_app.py`)
  - Multi-threaded conversation support
  - Real-time chat history
  - Thread management UI
  - Streaming response display
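A stripped-down sketch of what such an interface might look like, consuming the `/stream/{thread_id}` endpoint described earlier; this is illustrative only and is not the code in `new_app.py`:

```python
import requests
import streamlit as st

API_URL = "http://localhost:8000"

thread_id = st.sidebar.text_input("Thread ID", value="thread-abc123")
prompt = st.chat_input("Say something")

if prompt:
    with st.chat_message("user"):
        st.write(prompt)

    def token_stream():
        # Stream the response from the FastAPI backend chunk by chunk
        with requests.post(
            f"{API_URL}/stream/{thread_id}",
            params={"message": prompt},
            stream=True,
        ) as resp:
            resp.raise_for_status()
            for chunk in resp.iter_content(chunk_size=None, decode_unicode=True):
                if chunk:
                    yield chunk

    with st.chat_message("assistant"):
        st.write_stream(token_stream())
```

`st.write_stream` renders each chunk as it arrives, which is what produces the real-time effect in the UI.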
Using the web interface:

- Open the Streamlit app in your browser
- View existing threads or create a new conversation
- Type your message in the chat input
- View the bot's response in real-time
Using the API directly:

```bash
# Get all threads
curl http://localhost:8000/chat/threads

# Send a message
curl -X POST "http://localhost:8000/chat/thread-abc123?message=Hello%20there"

# Stream a response
curl -N "http://localhost:8000/stream/thread-abc123?message=Tell%20me%20a%20story"
```

Development setup:

- Activate the virtual environment
- Install development dependencies
- Set up MongoDB locally or use a cloud instance
- Configure the `.env` file with your settings
Key files for customization:

- `src/agents/chat_agent/graph.py` - Modify agent behavior
- `src/agents/chat_agent/nodes/` - Add or modify agent actions
- `src/agents/chat_agent/tools/` - Add new tools/capabilities
- `src/handlers/chat_handler.py` - Modify request handling logic
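For example, a new capability could be added as a LangChain tool under `src/agents/chat_agent/tools/`. The function below is purely illustrative and does not exist in the repository:

```python
from datetime import datetime, timezone

from langchain_core.tools import tool


@tool
def current_utc_time() -> str:
    """Return the current UTC time as an ISO 8601 string."""
    return datetime.now(timezone.utc).isoformat()
```

The new tool would then need to be registered wherever the graph builder in `src/agents/chat_agent/graph.py` collects the agent's tools.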
Configuration:

Create a `.env` file with the following:

```env
DB_URI=mongodb://localhost:27017/chatbot
# Add other configuration as needed
```
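For reference, python-dotenv (listed in the tech stack) reads this file into the process environment. A minimal sketch, assuming the backend looks the values up via `os.getenv`:

```python
import os

from dotenv import load_dotenv

load_dotenv()  # load key=value pairs from .env into the environment

DB_URI = os.getenv("DB_URI", "mongodb://localhost:27017/chatbot")
```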
Troubleshooting:

Database connection issues:
- Ensure MongoDB is running: `mongod`
- Check that `DB_URI` in `.env` matches your MongoDB setup
API issues:
- Verify the FastAPI server is running: `python main.py`
- Check http://localhost:8000/docs for the API documentation
Streamlit issues:
- Make sure the FastAPI backend is running before starting Streamlit
- Clear the Streamlit cache: `streamlit cache clear`
Contributions are welcome! Please feel free to submit a Pull Request.
- Fork the repository
- Create your feature branch (`git checkout -b feature/AmazingFeature`)
- Commit your changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
This project is open source and available under the MIT License.
Author: Gaurav Kumar
- GitHub: @krgaurav7
Acknowledgments:

- FastAPI - Modern web framework
- LangChain - LLM orchestration framework
- Streamlit - Rapid UI development
- MongoDB - NoSQL database
For support, please open an issue on the GitHub repository.
Happy Chatting!