LLM-Project: Software Studio

Welcome to the LLM-Based Project of Software Studio! This project is a Django-based chatbot that integrates Ollama (to run deepseek-r1 locally). It is containerized with Docker for easy deployment and follows a microservices architecture, with a Service Registry for dynamic service discovery and communication.

Prerequisites

Before running the project, ensure you have the following installed:

  • Python 3.8.10
  • pip (Python package manager)
  • Docker (for containerized deployment)
  • Ollama (for the AI interaction)

Ensure:

  • Ollama is running on your machine (http://localhost:11434)
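
As a quick sanity check (assuming the default port above), a small Python helper can probe the Ollama endpoint before starting the chatbot; `ollama_running` is an illustrative name, not part of the project:

```python
import urllib.request
import urllib.error

def ollama_running(base_url="http://localhost:11434", timeout=2):
    """Return True if an Ollama server responds at base_url.

    A running Ollama server answers a plain GET on its root URL.
    """
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

if __name__ == "__main__":
    print("Ollama reachable:", ollama_running())
```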

Assignment 1: Running the Project Locally

1️⃣ Clone the Repository

git clone https://github.com/Hasan-Saju/LLM-Project-Studio
cd LLM-Project-Studio

2️⃣ Create a Virtual Environment

python -m venv venv
source venv/bin/activate   # For macOS/Linux
venv\Scripts\activate     # For Windows

3️⃣ Install Dependencies

pip install -r requirements.txt

4️⃣ Run Migrations

cd llm_project
python manage.py makemigrations
python manage.py migrate

5️⃣ Start the Django Server

python manage.py runserver

Running the Project with Docker

1️⃣ Build the Docker Image

docker build -t my-django-app .

2️⃣ Run the Container

docker run -p 8000:8000 my-django-app

Environment Variables

To configure the Ollama API URL, create a .env file in the llm_project directory with:

ENVIRONMENT=development  # or 'production'
# Ollama API URL
OLLAMA_API_DEV=http://localhost:11434
OLLAMA_API_PROD=http://host.docker.internal:11434  
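
For illustration, settings code might choose between these two URLs like the sketch below; `select_ollama_url` is a hypothetical helper, and the project's actual settings logic may differ:

```python
import os

def select_ollama_url(environment=None, env=None):
    """Pick the Ollama API URL based on ENVIRONMENT.

    Hypothetical helper mirroring the .env keys above: in production
    (inside Docker) the host's Ollama is reached via host.docker.internal,
    in development via localhost.
    """
    env = env if env is not None else os.environ
    environment = environment or env.get("ENVIRONMENT", "development")
    if environment == "production":
        return env.get("OLLAMA_API_PROD", "http://host.docker.internal:11434")
    return env.get("OLLAMA_API_DEV", "http://localhost:11434")
```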

Preview

*(screenshot)*

Assignment 2: Service Discovery

*(architecture diagram)*

Service Registry - Microservices Discovery and Communication

Overview

The Service Registry is a microservice that allows dynamic discovery, registration, and communication between different microservices in a distributed architecture. It enables microservices to register themselves, send heartbeats to indicate their availability, and communicate with each other.

Features

  • Service Registration: Microservices register themselves with the registry upon startup.
  • Heartbeat Monitoring: Services send a heartbeat every 2 minutes to indicate they are alive.
  • Service Discovery: Any service can request a list of available services.
  • Message Forwarding: Allows a service to send a message to another service.
  • Auto Deregistration: If a service fails to send a heartbeat within 5 minutes, it is removed from the registry.
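
The behaviour above can be sketched as a small in-memory model (illustrative only; the real `service_registrar.py` may structure its state differently):

```python
import time

class ServiceRegistry:
    """In-memory sketch of the registry's state: name -> address,
    plus a last-heartbeat timestamp per service."""

    def __init__(self, clock=time.time):
        self._clock = clock   # injectable clock, handy for testing
        self.services = {}    # name -> address
        self.last_seen = {}   # name -> timestamp of last heartbeat

    def register(self, name, address):
        self.services[name] = address
        self.last_seen[name] = self._clock()
        return f"Service {name} registered successfully"

    def heartbeat(self, name):
        if name not in self.services:
            return None  # unknown service
        self.last_seen[name] = self._clock()
        return f"Heartbeat received from {name}"

    def list_services(self):
        return dict(self.services)
```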

Technologies Used

  • Python (Flask for the Service Registry, Requests for HTTP communication)
  • Django (For the chatbot)
  • Threading (For background monitoring of inactive services)

Running the Service Registry

Run the service registry to allow microservices to register:

python service_registrar.py

API Endpoints

1. Register a Service

  • Endpoint: POST /register
  • Request Body:
    {
      "service_name": "grammar_service",
      "service_address": "http://localhost:5002/process"
    }
  • Response:
    {"message": "Service grammar_service registered successfully"}
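
A registering service might issue this request as in the following sketch, using only the standard library; `register_service` is an illustrative helper, and the registry is assumed to listen on `http://localhost:5001` as in the later examples:

```python
import json
import urllib.request

REGISTRY_URL = "http://localhost:5001"  # assumed registry address

def registration_payload(service_name, service_address):
    """Build the /register request body shown above."""
    return {"service_name": service_name, "service_address": service_address}

def register_service(service_name, service_address, registry_url=REGISTRY_URL):
    """POST the registration to the registry (requires it to be running)."""
    body = json.dumps(registration_payload(service_name, service_address)).encode()
    req = urllib.request.Request(
        f"{registry_url}/register",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

if __name__ == "__main__":
    print(register_service("grammar_service", "http://localhost:5002/process"))
```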

2. List All Services

  • Endpoint: GET /list
  • Response:
    {
      "grammar_service": "http://localhost:5002/process"
    }

3. Send Heartbeat

  • Endpoint: POST /heartbeat
  • Request Body:
    {"service_name": "grammar_service"}
  • Response:
    {"message": "Heartbeat received from grammar_service"}
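
A service could send these heartbeats from a background thread, for example as below (a sketch: the 120-second interval matches the 2-minute policy above, and the helper names are illustrative):

```python
import json
import threading
import urllib.request

def heartbeat_payload(service_name):
    """Body for POST /heartbeat, as shown above."""
    return {"service_name": service_name}

def start_heartbeat(service_name, registry_url="http://localhost:5001",
                    interval=120, stop=None):
    """Send a heartbeat every `interval` seconds from a daemon thread.

    Returns a threading.Event; call .set() on it to stop the heartbeats.
    """
    stop = stop or threading.Event()

    def loop():
        # Event.wait returns False on timeout, True once stop is set.
        while not stop.wait(interval):
            body = json.dumps(heartbeat_payload(service_name)).encode()
            req = urllib.request.Request(
                f"{registry_url}/heartbeat",
                data=body,
                headers={"Content-Type": "application/json"},
            )
            try:
                urllib.request.urlopen(req).close()
            except OSError:
                pass  # registry unreachable; retry on the next interval

    threading.Thread(target=loop, daemon=True).start()
    return stop
```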

4. Forward a Message to Another Service

  • Endpoint: POST /forward
  • Request Body:
    {
      "target_service": "grammar_service",
      "payload": {"message": "r u fine"}
    }
  • Response:
    {"fixed_message": "are you fine"}

Service Cleanup Process

  • A background thread runs every 60 seconds to check for inactive services.
  • If a service does not send a heartbeat for 5 minutes, it is automatically removed.
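
The expiry check reduces to a small pure function (illustrative name; the registry's background thread could call it every 60 seconds and deregister whatever it returns):

```python
def expired_services(last_seen, now, timeout=300):
    """Names of services whose last heartbeat is older than `timeout` seconds.

    `last_seen` maps service name -> timestamp of the last heartbeat;
    the 300-second default mirrors the 5-minute policy above.
    """
    return [name for name, ts in last_seen.items() if now - ts > timeout]
```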

Communication Between Chatbot and Service Registry

Chatbot Requesting Available Services

  • Endpoint: GET /microservices/list/
  • Chatbot Request:
    response = requests.get("http://localhost:5001/list")
    available_services = response.json()
  • Example Response:
    {
      "grammar_service": "http://localhost:5002/process"
    }

Chatbot Forwarding a Message to a Registered Service

  • Endpoint: POST /microservices/forward/
  • Chatbot Request:
    response = requests.post("http://localhost:5001/forward", json={
        "target_service": "grammar_service",
        "payload": {"message": "r u fine"}
    })
    fixed_message = response.json()["fixed_message"]
  • Example Response:
    {"fixed_message": "are you fine"}

Running Other Services

Running the Grammar Service

python grammar_service.py

Running the Django Chatbot

python manage.py runserver

Conclusion

The Service Registry ensures seamless communication between microservices by dynamically managing service discovery and request forwarding.
