HR Analytics Chatbot

A conversational HR analytics assistant with dual model support (Cloud & Local)


Table of Contents

  • About The Project
  • Key Features
  • Latest Update
  • Built With
  • Diagrams
  • Getting Started
  • Usage
  • Model Modes
  • License
  • Contact

About The Project

The HR Analytics Chatbot is a conversational assistant designed to help HR teams and decision-makers explore workforce data using natural language.

Instead of navigating dashboards or spreadsheets, users can ask questions directly and receive clear, contextual answers related to employee attrition, tenure, income, and departmental distribution.

(back to top)

Key Features

  • Natural language querying over HR datasets
  • Context-aware conversations with follow-up support
  • Dual model support (Cloud and Local)
  • Vector-based data retrieval
  • CSV-based HR data source
  • Designed for non-technical users and decision-makers
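
The "vector-based data retrieval" feature can be illustrated with a minimal sketch: each HR row is represented as a vector, and the question's vector is matched by cosine similarity. The rows and 3-dimensional vectors below are toy stand-ins, not the project's actual data or code — real sentence embeddings have hundreds of dimensions.

```python
# Minimal sketch of vector-based retrieval over HR rows.
# Toy 3-dimensional vectors stand in for real sentence embeddings.
import math

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Hypothetical embedded HR rows: (text, vector).
rows = [
    ("Sales department, 12% attrition", [0.9, 0.1, 0.0]),
    ("R&D department, 7 years average tenure", [0.1, 0.9, 0.2]),
    ("Median monthly income: 4919", [0.0, 0.2, 0.9]),
]

def retrieve(query_vec, k=1):
    """Return the k rows whose vectors are closest to the query."""
    ranked = sorted(rows, key=lambda r: cosine(query_vec, r[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

print(retrieve([0.8, 0.2, 0.1]))  # closest to the Sales row
```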

(back to top)

Latest Update

The chatbot now supports two execution modes:

  • Cloud-based language model
  • Fully local offline language model

This provides flexibility, privacy control, and cost optimization depending on deployment needs.

(back to top)


Tags

HR Analytics, Chatbot, AI, NLP, LLM, Flask, Python, Cloud Model, Local Model, RAG, ChromaDB

Built With

Backend

  • Python
  • Flask
  • Pandas

AI & NLP

  • OpenRouter API (Cloud LLM)
  • Local LLM (Qwen / Mistral)
  • Sentence Transformers
  • ChromaDB

Frontend

  • HTML
  • CSS
  • JavaScript


(back to top)


Diagrams

System Architecture

The system follows a modular and flexible architecture that supports both cloud-based and local model inference.

```mermaid
flowchart LR
    U[User]
    UI[Web Interface]
    F[Flask Backend]
    CM[Conversation Manager]
    R[Retriever Layer]
    DB[Vector Database]
    MR[Model Router]
    C[Cloud LLM]
    L[Local LLM]
    RG[Response Generator]

    U --> UI
    UI --> F
    F --> CM
    CM --> R
    R --> DB
    CM --> MR
    MR --> C
    MR --> L
    C --> RG
    L --> RG
    RG --> UI
```

Layered Architecture

```mermaid
flowchart TB
    subgraph Presentation_Layer
        UI[Web Interface]
    end

    subgraph Application_Layer
        API[Flask Backend]
        CM[Conversation Manager]
    end

    subgraph Data_Layer
        VS[Vector Store]
    end

    subgraph AI_Layer
        Router[Model Router]
        Cloud[Cloud LLM]
        Local[Local LLM]
    end

    UI --> API
    API --> CM
    CM --> VS
    CM --> Router
    Router --> Cloud
    Router --> Local
    Cloud --> API
    Local --> API
```

Request Flow

This diagram shows what happens when a user asks a question.

```mermaid
sequenceDiagram
    participant U as User
    participant UI as Web Interface
    participant B as Flask Backend
    participant C as Conversation Manager
    participant R as Retriever
    participant M as Model

    U->>UI: Ask Question
    UI->>B: Send Request
    B->>C: Load Context
    C->>R: Retrieve Relevant Data
    R-->>C: Return Matching Rows
    C->>M: Generate Response
    M-->>B: Return Answer
    B-->>UI: Display Answer
```

High-Level Flow

  1. The user submits a question through the web interface.
  2. The Flask backend receives the request.
  3. Conversation history is managed to preserve context.
  4. Relevant HR data is retrieved using vector search.
  5. A model routing layer selects the appropriate model (Cloud or Local).
  6. The selected model generates a natural language response.
  7. The response is returned to the user interface.
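
The steps above can be condensed into a single handler. This is a sketch only: the helper functions are stubs standing in for the real retriever, model router, and cloud/local backends, and none of the names are taken from the project's actual code.

```python
# Condensed sketch of the request pipeline described above.
# Helper functions are stubs; the real project would wire them to
# ChromaDB, the OpenRouter API, and the local model.

history = []  # conversation context (step 3)

def retrieve_context(question):           # step 4: vector search stub
    return ["Sales attrition is 12%"]

def select_model(mode):                   # step 5: model routing
    return cloud_answer if mode == "cloud" else local_answer

def cloud_answer(question, context):      # step 6: cloud stub
    return f"[cloud] {question} -> {context[0]}"

def local_answer(question, context):      # step 6: local stub
    return f"[local] {question} -> {context[0]}"

def handle_question(question, mode="local"):
    history.append(question)              # step 3: preserve context
    context = retrieve_context(question)  # step 4
    model = select_model(mode)            # step 5
    return model(question, context)       # steps 6-7

print(handle_question("What is attrition in Sales?"))
```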

Getting Started

Prerequisites

  • Python 3.10
  • pip
  • Virtual environment (recommended)

Optional:

GPU for local model acceleration

(back to top)

Installation

Clone the repository:

```shell
git clone https://github.com/LinaMohsen1234/HR_ChatBot_v1.git
```

Create and activate a virtual environment:

```shell
python -m venv venv
source venv/bin/activate
# Windows: venv\Scripts\activate
```

Install dependencies:

```shell
pip install -r requirements.txt
```

Set API key (Cloud mode only):

```shell
export OPENROUTER_API_KEY=your_api_key_here
# Windows PowerShell (current session):
# $env:OPENROUTER_API_KEY = "your_api_key_here"
```

Run the application:

```shell
python app.py
```

(back to top)

Usage

Open your browser and navigate to:

http://localhost:5000

Environment Variables

This project uses environment variables for configuration.
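
A minimal sketch of how the backend might read these variables at startup. The `MODEL_MODE` name and its default are illustrative assumptions, not taken from the project; check `.env.example` for the real variable names.

```python
# Sketch of reading configuration from the environment.
# MODEL_MODE is a hypothetical variable name, used here for illustration.
import os

OPENROUTER_API_KEY = os.environ.get("OPENROUTER_API_KEY")  # cloud mode only
MODEL_MODE = os.environ.get("MODEL_MODE", "local")         # assumed default

# Fail fast if cloud mode is requested without credentials.
if MODEL_MODE == "cloud" and not OPENROUTER_API_KEY:
    raise RuntimeError("Cloud mode requires OPENROUTER_API_KEY")

print(MODEL_MODE)
```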

  1. Copy the example file:

```shell
cp .env.example .env
# Windows: copy .env.example .env
```

(back to top)

Model Modes

Cloud Model

  • Uses a hosted large language model via OpenRouter

  • Requires internet connection

  • Higher language fluency and reasoning quality

Local Model

  • Fully offline execution

  • Suitable for privacy-sensitive environments

  • No external API cost

Model selection is configurable within the application.
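
One way the selection could be implemented is a small dispatch table that falls back to the local backend for unknown modes. All names here are illustrative assumptions, not the project's actual API.

```python
# Illustrative model router: dispatches to a cloud or local backend
# based on a mode flag. Backends are stubs for this sketch.

def call_cloud_llm(prompt):
    # would POST to the OpenRouter chat completions endpoint
    return f"cloud response to: {prompt}"

def call_local_llm(prompt):
    # would run inference with the local Qwen/Mistral model
    return f"local response to: {prompt}"

def route(prompt, mode="local"):
    """Dispatch to cloud or local; fall back to local on unknown modes."""
    backends = {"cloud": call_cloud_llm, "local": call_local_llm}
    return backends.get(mode, call_local_llm)(prompt)

print(route("Which department has the highest attrition?", mode="cloud"))
```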

(back to top)

License

Distributed under the -- License.

(back to top)

Contact

Lina Mohsen

LinkedIn: Lina Alnasi

(back to top)
