This project implements a multi-turn, stateful chatbot using FastAPI and LangGraph with Gemini 2.5 Flash for all NLU, Guardrail, and Knowledge Base tasks. It fulfills all the requirements of the Chatbot Development Assignment, including fuzzy time parsing, mid-conversation topic switching, and ethical guardrails.
- Framework: FastAPI (for the single `/chat` endpoint)
- Conversational Logic: LangGraph (state machine / agentic flow)
- NLU/AI Model: Google Gemini 2.5 Flash
- Session State: In-memory checkpointer (`MemorySaver` from LangGraph)
- Deployment: Docker
- Python 3.11+
- Docker (highly recommended)
- Navigate to the project folder.
- Install dependencies: `pip install -r requirements.txt`
- Set API key: create a file named `.env` in the project root and add your key: `GOOGLE_API_KEY="YOUR_API_KEY_HERE"`
- Run the server: `uvicorn app.main:app --reload`
- Build the Docker image: `docker build -t chatbot-gemini .`
- Run the container, passing the API key as an environment variable: `docker run -d -p 8000:8000 --env GOOGLE_API_KEY="YOUR_API_KEY_HERE" chatbot-gemini`

The bot will be accessible at `http://localhost:8000`.
The bot maintains session state via the `user_id` field passed in the JSON payload.

Endpoint: `POST http://127.0.0.1:8000/chat`
Headers: `Content-Type: application/json`
Use a tool like curl to test the full conversational flow:
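A single turn can also be sent from Python's standard library. The sketch below builds the request for the endpoint above; the `user_id` value is just an example, and the response schema is not specified here, so the actual call is left commented out for when the server is running.

```python
import json
from urllib import request

# Build one turn of the conversation as a POST to /chat.
payload = {"user_id": "demo-user", "message": "Sunday, please."}
req = request.Request(
    "http://127.0.0.1:8000/chat",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
print(req.get_method(), req.full_url)

# With the server running, uncomment to see the bot's reply:
# with request.urlopen(req) as resp:
#     print(resp.read().decode("utf-8"))
```

Reusing the same `user_id` across requests is what lets the in-memory checkpointer resume the conversation between turns.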
| Step | User Input (`message` field) | Expected Bot Action |
|---|---|---|
| 1 (Fuzzy Time) | I want to book a table this weekend or maybe Monday morning. | Clarifies ambiguous dates. |
| 2 (Clarification) | Sunday, please. | Sets the date; asks for party size. |
| 3 (Party Size) | For four people. | Sets size; asks for final confirmation. |
| 4 (Topic Switch) | By the way, what's the capital of Australia? | Answers the factual question, then resumes the confirmation question (e.g., "Now, back to your reservation... Should I confirm that?"). |
| 5 (Guardrail) | You're an idiot. | Replies with: "Let's keep our conversation respectful, please." |
| 6 (Final Confirm) | Yes, please confirm the reservation. | Confirms the reservation details. |
This completes all phases of the development assignment. The codebase is structured, functional, and ready for review and deployment.
Here are the exact commands for running manually, building the Docker image, and running with Docker Compose.

- Run locally (module entrypoint; works even if the `uvicorn` console script is not on PATH): `python -m uvicorn main:app --reload`
- Build the Docker image (from the repository root): `docker build -t chatbot-memory-test .`
- Bring up the service with Docker Compose (detached): `docker compose up -d`

Notes:
- If you want `docker compose` to rebuild the image before starting, run `docker compose up -d --build`.
- If you prefer the `uvicorn` CLI command (instead of `python -m uvicorn`), ensure the Python scripts location (e.g. `~/.local/bin`) is on your PATH, or install into an activated virtualenv.