The Agentic HR Chatbot is a next-generation, GenAI-powered HR assistant designed to deliver precise and context-aware guidance across organizational policies. Leveraging advanced natural language understanding, it analyzes user queries, discerns intent, and classifies questions as either informational (fact-based) or situational (context-driven).
The chatbot intelligently maps inquiries to predefined HR policies:
- Benevolence Policy
- Leave Policy
- Probation Policy
- Referral Policy
- Relocation Policy
- Travel Policy
Beyond policy insights, it integrates seamlessly with the Leave Management System, enabling users to apply for leave and receive actionable responses in real-time. Built as an agentic system, it empowers employees with dynamic, personalized HR guidance while ensuring compliance and operational efficiency.
- Technical Architecture
- MongoDB Vector Database Setup using Atlas CLI (Dockerized)
- Automated Notification Service Setup
- Knowledge Source Initial Setup
- Setup Observability Layer
- HR Chatbot Application Usage
- Future Enhancements
This guide provides step-by-step instructions for setting up a MongoDB vector database using the Atlas CLI with Docker.
Note: Vector search is only available in MongoDB Atlas (the cloud-managed version), not in the local open-source MongoDB server. Therefore, we use the Dockerized version of Atlas and manage deployments locally via the Atlas CLI.
Before starting, ensure the following:
- Windows system
- Docker Desktop installed and running
- Internet connection to download images
- MongoDB Compass for GUI-based database management (optional)
Download and install the Atlas CLI binary for Windows from the official MongoDB documentation: Download Atlas CLI for Windows
Verify the installation:
```
atlas --version
```

The MongoDB Atlas CLI provides two deployment options:
- local – Local Docker-based MongoDB deployment
- atlas – Cloud-hosted MongoDB Atlas deployment
We will use a local deployment, so follow these steps:
- Run the Atlas deployment setup command:

```
atlas deployments setup
```

- Select `local` as the deployment type.
- Choose custom deployment configuration.
- Specify the following parameters:
  - Deployment name: e.g., `vector-mongodb-local`
  - MongoDB version: 8.0 or above (required for vector storage)
  - Port: e.g., `59719`
- Ensure Docker Desktop is running; the deployment will start automatically.
Note: Skip connection setup during initial deployment. You can connect later using MongoDB Compass or the mongosh shell.
Atlas CLI simplifies deployment management without requiring direct Docker commands. You can manage, start, stop, or delete deployments easily. Below are some frequently used commands.
- List all deployments:
```
atlas deployments list
```

- Start a deployment:

```
atlas deployments start <deployment_name>
```

- Stop a deployment:

```
atlas deployments stop <deployment_name>
```

Once your deployment is running, you have two options to connect:
- Download MongoDB Compass from MongoDB Compass Download.
- Retrieve the connection string from the Atlas CLI.
- Enter the connection string in Compass and connect to your running MongoDB instance.
Alternatively, you can connect to the MongoDB instance using mongosh commands in the Mongo Shell.
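For programmatic access, the connection string for the local deployment can be assembled as sketched below. This is a sketch under assumptions: the host is `localhost`, the port is whatever you chose during deployment setup (e.g. `59719`), and `directConnection=true` is typically required for single-node local deployments; adjust to match your actual deployment.

```python
def local_connection_uri(port: int, direct: bool = True) -> str:
    """Build a MongoDB connection string for a local Atlas deployment.

    Hypothetical helper for illustration; host, port, and the
    directConnection flag are assumptions based on the local setup above.
    """
    uri = f"mongodb://localhost:{port}/"
    if direct:
        # Single-node local deployments usually need a direct connection.
        uri += "?directConnection=true"
    return uri

print(local_connection_uri(59719))
# → mongodb://localhost:59719/?directConnection=true
```

The same string can be pasted into MongoDB Compass or passed to a driver such as PyMongo.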
This module enables automated email notifications from the HR Chatbot system. Whenever a user applies for leave, an email is automatically sent to the user's gte approver.
- Git repository cloned locally
- Gmail account with App Password generated for the sender email
- In the root directory of the repository, locate the file `encoding_pass.py`.
- Open the file and add your Gmail App Password.
- Run the encoding script to encode your password:

```
python encoding_pass.py
```

Note: This will generate an encoded version of your password for secure use in the application.
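The repository does not show which scheme `encoding_pass.py` uses; a minimal sketch assuming simple base64 encoding (an obfuscation step, not real encryption) might look like this. The function names are hypothetical and the actual script may differ.

```python
import base64

def encode_password(plain: str) -> str:
    # Hypothetical sketch: encoding_pass.py may use a different scheme;
    # base64 is shown here for illustration only.
    return base64.b64encode(plain.encode("utf-8")).decode("ascii")

def decode_password(encoded: str) -> str:
    # Inverse operation, as the application would apply before SMTP login.
    return base64.b64decode(encoded.encode("ascii")).decode("utf-8")

encoded = encode_password("my-app-password")
assert decode_password(encoded) == "my-app-password"
```

Because base64 is reversible, the encoded value should still be kept out of version control.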
- Navigate to the preprocessing script: `src/data_pipeline/preprocessing_script.py`
- Locate the `send_email()` function.
- Update the email credentials with the encoded password:

```
sender_email = "<your_email>@gmail.com"
password = "<encoded_password>"
```

Once the credentials are configured, you can proceed to the next setup step.
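As a rough illustration of how `send_email()` might assemble the approver notification: the function name, message layout, and SMTP details below are assumptions for the sketch, not the repository's actual code.

```python
import smtplib
from email.message import EmailMessage

def build_leave_notification(sender_email: str, approver_email: str,
                             employee: str, dates: str) -> EmailMessage:
    # Hypothetical message layout; the real send_email() in
    # preprocessing_script.py may format things differently.
    msg = EmailMessage()
    msg["From"] = sender_email
    msg["To"] = approver_email
    msg["Subject"] = f"Leave request from {employee}"
    msg.set_content(f"{employee} has applied for leave: {dates}.")
    return msg

# Sending would use Gmail SMTP with the decoded App Password, e.g.:
# with smtplib.SMTP_SSL("smtp.gmail.com", 465) as server:
#     server.login(sender_email, decoded_password)
#     server.send_message(msg)
```

Gmail requires an App Password (not the account password) for SMTP logins like this.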
This guide walks you through the initial setup of the knowledge source in MongoDB, including data ingestion, Dockerization, and vectorization.
- Git
- Docker installed
- MongoDB installed and running (ensure MongoDB service is up).
Clone the Git repository:

```
git clone <repository_url>
cd <repository_folder>   # if needed
```

Inside the `data` folder, you will find two subfolders:

- `upload_pii_data`
- `upload_static_policy_data`
These folders contain the sample data files.
Note: You can add new policy data using the same filenames. If you modify the content of these files, do not change their filenames, as the agentic tool is built to work with these specific names.
If no changes are needed, you can ingest the same sample data into MongoDB.
Build the Docker image using the following command:
```
docker build -t <docker_name> .
```

Note: Before building the Docker image, check that all credentials are properly set in the `creds.yaml` file. It requires Azure LLM, embedding, LLAMA Parser, and MongoDB credentials. Place all the keys in this file before building the Docker image.
Login credentials for the Streamlit app are available in `src/streamlit_pipeline/login_creds.yaml`. Two simulated usernames are provided: admin and non-admin. Make sure the user IDs of these usernames match the ingested PII data present in `data/upload_pii_data`.
In the `docker-compose.yml` file, make sure to reference the same Docker image name with which you built the image. Then run:

```
docker-compose up -d
```

This will start the application container.
For initial setup, SSH into the running container:
```
ssh root@localhost -p 2222
```

Note: Password: `rootpassword`
Make sure your MongoDB service is up and running. You can check using MongoDB Compass or the CLI.
Run the setup script to ingest and vectorize the data:
```
cd /app
python3 setup_src/setup_main.py
```

Once completed, you can verify the data in MongoDB using Compass. Two databases will be created:

- `hr_chatbot_pii_db` (containing upload_pii_data)
- `hr_chatbot_db` (containing upload_static_policy_data)
This setup enables observability for the HR Chatbot system by integrating Grafana, Loki, and Promtail to capture, visualize, and monitor application logs.
- Repository cloned locally
- Docker & Docker Compose installed
Inside the cloned repository, go to the observability folder:
```
cd grafana-loki-docker
```

Here you will find:

- `docker-compose.yml` – service definitions for Grafana, Loki, and Promtail
- `loki-config.yaml` – Loki configuration
- `promtail_config.yaml` – Promtail configuration
Note: If you need configuration changes, modify these files before proceeding. Otherwise, you can use the default setup.
The following logs will be collected by this observability layer (stored under the logs/ directory in the repository root):
- `fast_api_logs`
- `llm_orchestration_logs`
- `setup_logs`
- `streamlit_ui_logs`
Run the following command to start services in detached mode:
```
docker compose up -d
```

This will create and start Docker containers for Grafana, Loki, and Promtail.
Check if Grafana started successfully:
```
docker ps
```

If Grafana is not running, start it manually:

```
docker start <container_name_or_id>
```

You can access Grafana using the details below:
- URL: `http://localhost:3000`
- Username: `admin`
- Password: `ChangeMeNow!`
- Log in to Grafana
- Navigate to `Configuration > Data Sources`
- Add a new data source
- Select Loki
- Enter the URL: `http://loki:3100`
- Save & Test the connection
Now you can visualize logs, build dashboards, and create charts based on application logs.
You can check logs for each service separately using:
- Check Grafana logs:

```
docker logs <grafana_container_name_or_id>
```

- Check Loki logs:

```
docker logs <loki_container_name_or_id>
```

- Check Promtail logs:

```
docker logs <promtail_container_name_or_id>
```

Note:
- Make sure the logs/ directory exists in the repository root before starting the observability stack.
- Customize promtail_config.yaml if you want to add new log paths or change log labels.
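Once logs are flowing, Loki's HTTP API can also be queried directly instead of going through Grafana. The sketch below builds a `query_range` URL against Loki's standard API; the `job` label name is an assumption here and should match whatever labels your `promtail_config.yaml` actually assigns.

```python
from urllib.parse import urlencode

def loki_query_url(base: str, service: str, limit: int = 100) -> str:
    """Build a Loki query_range URL for one log stream.

    Assumes Promtail labels each stream with a `job` label named after
    the service (e.g. "setup_logs"); adjust to your promtail config.
    """
    params = urlencode({"query": f'{{job="{service}"}}', "limit": limit})
    return f"{base}/loki/api/v1/query_range?{params}"

print(loki_query_url("http://localhost:3100", "setup_logs"))
```

From the host, the base URL would be `http://localhost:3100`; from inside the Docker network it would be `http://loki:3100`, matching the Grafana data-source URL above.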
Now that the initial setup is complete and the application Docker container is up and running, you can leverage the HR Chatbot in two ways:
- Scalable Async API developed in FastAPI
- You can use this API to build a custom UI and integrate it directly.
- Streamlit Application
- Use the existing Streamlit UI to interact with the bot.
- MongoDB must be up and running with the initial data setup as described in the initial setup steps.
- Application Docker container must be up and running.
- Grafana-Loki-Promtail docker container must be up and running to capture logs for observability.
| Port | Purpose | How to Access / Notes |
|---|---|---|
| 8080 | Nginx reverse proxy | FastAPI is configured with Nginx for reverse proxy. Access endpoints via: http://localhost:8080/api/<endpoint> |
| 8000 | Direct FastAPI access | Direct FastAPI access: http://localhost:8000/<endpoint> |
| 2222 | SSH access | ssh root@localhost -p 2222 Password: rootpassword |
| 9001 | Supervisor Web UI | Access at: http://localhost:9001 Username: admin Password: admin |
| 8501 | Streamlit Web UI | Access at: http://localhost:8501 Admin Login: Username: anshbahuguna Password: admin Non-Admin Login: Username: vaishaliraheja Password: non-admin |
Note:
- FastAPI is accessible both directly on port `8000` and through the Nginx reverse proxy on port `8080`.
- Use SSH (port `2222`) to connect to the container for initial setup or debugging.
- The Supervisor UI (port `9001`) provides process management.
- Streamlit (port `8501`) offers the HR Chatbot’s interactive web interface with role-based login.
The following endpoints are available:
| Path | Method | Description |
|---|---|---|
| `/chatbot/ask` | POST | Chat with HR bot |
| `/check-health` | GET | Health check |
| `/list-endpoints` | GET | List all endpoints |
Example request body for `/chatbot/ask`:

```
{
  "user_id": "U001",
  "question": "Your query here"
}
```

NOTE: You can utilize these endpoints in any UI or test them using Postman.
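Besides Postman, the endpoint can be exercised from Python with only the standard library. The sketch below builds the POST request using the Nginx reverse-proxy base URL from the ports table; since the response schema is not documented here, the actual send step is left commented.

```python
import json
import urllib.request

def build_ask_request(user_id: str, question: str,
                      base: str = "http://localhost:8080/api") -> urllib.request.Request:
    # Base URL assumes the Nginx reverse proxy on port 8080; swap in
    # http://localhost:8000 for direct FastAPI access.
    body = json.dumps({"user_id": user_id, "question": question}).encode("utf-8")
    return urllib.request.Request(
        f"{base}/chatbot/ask",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_ask_request("U001", "How many casual leaves do I have left?")
# To actually send it (requires the application container to be running):
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```

The `user_id` must correspond to a user present in the ingested PII data.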
- Access the Streamlit app via port 8501.
- Log in using the credentials mentioned in `login_creds.yaml`.
- Explore and enjoy all available features of the HR Chatbot application.
- Integration with enterprise-grade SSO for secure authentication
- Enhanced analytics dashboards for HR query trends
- Multi-language policy support using GenAI models
- Role-based access control with fine-grained permissions
- And many more