
MyHealthAgent – Multi‐Agent Nutrition Assistant

Python · Flask · React · License: MIT


🚀 Project Overview

MyHealthAgent is a GenAI‐powered, multi‐agent nutrition assistant designed to help users log meals, extract medical conditions from lab‐report images, and receive real‐time, personalized dietary guidance.

  • Engineered a GenAI‐powered nutrition assistant that analyzes user chronic conditions and meal logs to deliver real‐time, personalized dietary recommendations in under 2 seconds.
  • Formulated a condition‐to‐nutrition pipeline that:
    1. Extracts user medical conditions from lab‐report images via OCR + a LLaMA‐based model.
    2. Computes 10+ macro‐ and micro‐nutrient thresholds tailored to those conditions.
  • Designed a Web App enabling users to chat with a Multi‐Agent System for:
    - Condition‐specific food recommendations
    - Dietary planning
    - Food image classification (~90% ingredient-recognition accuracy)
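The threshold step of the pipeline can be illustrated with a small sketch. The per-condition limits below are placeholder numbers (not medical guidance or the project's actual values), and the combination rule — keep the strictest limit — is an assumption about how the real `combine_thresholds` in nutrient_calculator.py behaves:

```python
# Illustrative sketch of the condition-to-threshold step. The numbers in
# DISEASE_THRESHOLDS are placeholders, not medical guidance or the
# project's actual values.

DISEASE_THRESHOLDS = {
    "Diabetes":     {"sugar_g": 25, "carbs_g": 200},
    "Hypertension": {"sodium_mg": 1500, "sugar_g": 36},
}

def combine_thresholds(diseases):
    """Merge per-condition daily limits, keeping the strictest (lowest) one."""
    combined = {}
    for disease in diseases:
        for nutrient, limit in DISEASE_THRESHOLDS.get(disease, {}).items():
            combined[nutrient] = min(limit, combined.get(nutrient, float("inf")))
    return combined
```

For `["Diabetes", "Hypertension"]` this yields the lower of the two sugar limits alongside the sodium and carbohydrate limits.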

📐 High‐Level Architecture

┌─────────────────────────────────────────────────────────────────┐
│                          MyHealthAgent                          │
│                                                                 │
│  ┌────────────────────────┐    ┌─────────────────────────────┐  │
│  │ Frontend (/frontend)   │    │ Backend (app.py, etc.)      │  │
│  │                        │    │                             │  │
│  │ React App (chat UI)    │    │ Flask API endpoints         │  │
│  │ • /src, /public,       │    │ • /api/chat                 │  │
│  │   /build,              │    │ • /api/get_thresholds       │  │
│  │   package.json         │    │ • /api/process_meal         │  │
│  └────────────────────────┘    │ • /api/extract_diseases     │  │
│                                │                             │  │
│                                │ LangGraph StateGraph        │  │
│                                │ • Tracks "current intent"   │  │
│                                │   per session               │  │
│                                │ • Routes to greeting,       │  │
│                                │   food_logging, planning,   │  │
│                                │   health_advice, or other   │  │
│                                │   agents (LLM calls)        │  │
│                                │                             │  │
│                                │ Food-classification model   │  │
│                                │ (ResNet-based, Food101)     │  │
│                                │                             │  │
│                                │ OCR + LLaMA model for       │  │
│                                │ disease extraction          │  │
│                                │                             │  │
│                                │ Nutrient calculator         │  │
│                                │ (combines thresholds, etc.) │  │
│                                │                             │  │
│                                │ SQLite / in-memory store    │  │
│                                │ (session state, logs)       │  │
│                                └─────────────────────────────┘  │
└─────────────────────────────────────────────────────────────────┘
  1. Frontend (folder: /frontend)

    • Built with React (v18+) and served by npm run start (development) or npm run build (production).
    • Provides a chat interface that calls the Flask backend via REST.
  2. Backend

    • Written in Flask (v2.0+). Entry point: app.py in the repo root.
    • Defines API endpoints:
      • POST /api/chat – receives a user message, classifies its intent, and routes it through the LangGraph-controlled multi-agent system.
      • POST /api/get_thresholds – sets user diseases, computes nutrient thresholds (via nutrient_calculator.py).
      • POST /api/process_meal – handles image‐ or text‐based meal logging (food classification + nutrient accumulation).
      • POST /api/extract_diseases – runs OCR on a medical image, uses a LLaMA‐based prompt to extract disease names.
  3. Models (folder: /models)

    • food101_model.pth – Pretrained PyTorch model (ResNet‐based) for food classification.
  4. Key Python Modules (in the root):

    • food_model.py – Loads models/food101_model.pth and defines load_model, predict_food, and the image-preprocessing pipeline.
    • nutrient_calculator.py – Contains get_food_nutrients(...) (lookup in a nutrient DB) and combine_thresholds(diseases) logic.
    • app.py – Main Flask application (described above).
    • requirements.txt – Pins all Python dependencies.
  5. Jupyter Notebook

    • Image_Classification.ipynb – Demo / prototype notebook for training / testing the food classifier.
  6. Static Assets

    • blood-sugar-levels-example.jpg, pizza.jpg – Example images used for testing OCR or classifier.
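The nutrient lookup in nutrient_calculator.py goes through the USDA FoodData Central API (see the .env setup below). A hedged, stdlib-only sketch of how such a search request could be built — the helper name and parameters here are illustrative, not the module's actual code:

```python
import urllib.parse

# USDA FoodData Central food-search endpoint (requires an API key).
USDA_SEARCH_URL = "https://api.nal.usda.gov/fdc/v1/foods/search"

def usda_search_url(food_name: str, api_key: str, page_size: int = 1) -> str:
    """Build a FoodData Central search URL for a food name (illustrative)."""
    params = urllib.parse.urlencode(
        {"query": food_name, "api_key": api_key, "pageSize": page_size}
    )
    return f"{USDA_SEARCH_URL}?{params}"
```

Each matched food in the JSON response carries a `foodNutrients` list, which a lookup helper like `get_food_nutrients` can map onto the per-condition thresholds.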

📁 Repository Structure

MyHealthAgent/
├── app.py
├── food_model.py
├── nutrient_calculator.py
├── requirements.txt
├── Image_Classification.ipynb
├── blood-sugar-levels-example.jpg
├── pizza.jpg
├── models/
│   └── food101_model.pth
├── frontend/
│   ├── build/                 # Auto‐generated by `npm run build` (production)
│   ├── node_modules/          # Auto‐generated by `npm install`
│   ├── public/
│   │   ├── index.html
│   │   └── …
│   ├── src/
│   │   ├── components/        # React components (ChatWindow, MessageBubble, etc.)
│   │   ├── App.jsx
│   │   └── index.jsx
│   ├── package.json
│   └── package-lock.json
└── README.md

⚙️ Installation & Local Setup

1. Clone the Repository

git clone https://github.com/Thejesh-M/MyHealthAgent_Multi-Agent-Assistant.git
cd MyHealthAgent_Multi-Agent-Assistant

2. Backend (Python + Flask)

  1. Create a Python virtual environment (Python 3.8+ recommended):

    python3 -m venv .venv
    source .venv/bin/activate    # macOS/Linux
    .venv\Scripts\activate       # Windows
  2. Install Python dependencies:

    pip install --upgrade pip
    pip install -r requirements.txt
  3. Environment variables
    Create a file named .env in the project root (or export the variables in your shell). The app uses the USDA FoodData Central API to look up food nutrients; sign up for a free API key on the USDA FoodData Central site and set it in .env:

    USDA_FOOD_API="your_USDA_API_KEY"
    PORT=5000
  4. Run the Flask server:

    export FLASK_APP=app.py   # macOS/Linux
    flask run --host=0.0.0.0 --port=5000

    Or simply:

    python app.py

    You should see something like:

    * Serving Flask app "app.py" (lazy loading)
    * Environment: development
    * Debug mode: on
    * Running on http://0.0.0.0:5000/ (Press CTRL+C to quit)
    
  5. Verify Backend Endpoints

    • GET http://localhost:5000/ → root route (if you added one).
    • POST http://localhost:5000/api/chat with { "message": "Hi there" } → should return JSON with assistant_response, intent, etc.
    • POST http://localhost:5000/api/get_thresholds with { "diseases": ["Diabetes", "Hypertension"] }
    • POST http://localhost:5000/api/extract_diseases (form data: medical_image = JPEG/PNG, entered_text = some text)
    • POST http://localhost:5000/api/process_meal (form data: food_image or entered_text)
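The checks above can be scripted. A minimal, stdlib-only smoke test for the chat endpoint — it assumes the Flask server from the previous step is listening on localhost:5000, and the field names follow this README:

```python
import json
import urllib.request

def build_request(url: str, payload: dict) -> urllib.request.Request:
    """Prepare a JSON POST request for one of the endpoints above."""
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

if __name__ == "__main__":
    # Requires the Flask server to be running locally.
    req = build_request("http://localhost:5000/api/chat", {"message": "Hi there"})
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read().decode("utf-8"))
    print(body.get("assistant_response"), body.get("intent"))
```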

3. Frontend (React)

  1. Navigate into the frontend folder:

    cd frontend
  2. Install Node packages (requires Node 16+ & npm 8+):

    npm install

    This will populate node_modules/ and create a lockfile (package-lock.json).

  3. Run in Development Mode:

    npm start

    This will launch a dev server at http://localhost:3000 (by default).
    Open your browser to http://localhost:3000 → you should see the chat UI.

  4. Interact at http://localhost:3000:

    • Upload an image of a lab report (e.g. blood-sugar-levels-example.jpg) and ask to extract diseases → /api/extract_diseases runs OCR + a LLaMA prompt.
    • Type “Hi” → the intent classifier should go to greeting_agent.
    • Upload a food photo (e.g. pizza.jpg) → /api/process_meal will classify and log.
    • Ask “What should I have for dinner?” → meal_planning_agent produces a bullet list.

🤖 Multi‐Agent (LangGraph) Flow

Inside app.py, we maintain a small StateGraph with a START state and five "intent nodes":

  1. START – initial state (no message yet).
  2. greeting – user is saying “hello” or small talk.
  3. food_logging – user is reporting what they have eaten (text or image).
  4. meal_planning – user is asking what they plan to eat in the future.
  5. health_advice – user is asking for general health/nutrition guidance.
  6. other – fallback for messages that don’t fit above.

At runtime:

current_node = state["current_node"]  # e.g. "START" or "meal_planning"

# 1. Classify the new user message:
detected_intent = classify_intent(user_message)

# 2. Traverse the graph: follow the outgoing edge from `current_node` whose
#    condition (detected_intent == dest) holds; that dest becomes next_node.
#    If no edge matches, fall back to "other":
next_node = detected_intent if detected_intent in _AGENT else "other"

# 3. Advance the session state:
state["current_node"] = next_node

# 4. Dispatch to the matching agent to produce the reply:
reply = _AGENT[next_node](state, user_message)

Each “agent” function (e.g. meal_logging_agent, planning_agent) takes the ChatState and the raw user text (or processed text+predicted food) and returns a string. We then sanitize and return it in JSON.
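This flow can be exercised without LangGraph or an LLM. In the dependency-free sketch below, the keyword-based `classify_intent` and the canned agent replies are stand-ins for the real LLM-backed implementations:

```python
# Dependency-free sketch of the routing loop described above. The keyword
# classifier and canned replies are stand-ins for the real LLM-backed agents.

def classify_intent(message: str) -> str:
    text = message.lower()
    if any(word in text for word in ("hi ", "hello", "hey")):
        return "greeting"
    if "i ate" in text or "i had" in text:
        return "food_logging"
    if "plan" in text or "dinner" in text:
        return "meal_planning"
    if "advice" in text or "healthy" in text:
        return "health_advice"
    return "other"

_AGENT = {
    "greeting": lambda state, msg: "Hello! How can I help with your meals today?",
    "food_logging": lambda state, msg: "Got it - meal logged.",
    "meal_planning": lambda state, msg: "Here is a dinner idea...",
    "health_advice": lambda state, msg: "Some general guidance...",
    "other": lambda state, msg: "Could you rephrase that?",
}

def handle_message(state: dict, message: str) -> str:
    """Classify the message, advance the state machine, dispatch to an agent."""
    next_node = classify_intent(message)
    if next_node not in _AGENT:
        next_node = "other"
    state["current_node"] = next_node
    return _AGENT[next_node](state, message)
```

A session is then a dict like `{"current_node": "START"}` threaded through successive `handle_message` calls, mirroring how the StateGraph tracks the current intent per session.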


🎖 Acknowledgments

  • Built with Flask and React.
  • OCR powered by pytesseract.
  • Food classification courtesy of a pretrained Food101 model (ResNet).
  • Intent‐classification & disease extraction via langchain_ollama + a LLaMA‐based prompt.
  • State management / multi‐agent routing via langgraph.

Made with ❤️ by Thejesh M
