py-ai

An AI-powered chat API with Model Context Protocol (MCP) integration, built with FastAPI, LangChain, and LangGraph.

Features

  • Backend: FastAPI with Python 3.12
  • Conversational AI: LangChain with LangGraph for stateful, multi-step agents
  • LLM: Google Gemini (configurable model)
  • Real-time Communication: Server-Sent Events (SSE) for streaming LLM responses
  • MCP Integration: Support for Model Context Protocol tools
  • Search Capabilities: Integrated Tavily search
  • Configuration: Pydantic-based settings management
  • Dependency Management: Poetry
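Streaming over SSE means each response chunk travels as a `data:`-prefixed text frame terminated by a blank line. As a minimal, framework-agnostic sketch of that wire format (the name `format_sse` is illustrative, not from this repo):

```python
from typing import Optional

def format_sse(data: str, event: Optional[str] = None) -> str:
    """Encode one Server-Sent Events frame as text."""
    lines = []
    if event is not None:
        lines.append(f"event: {event}")
    # A multi-line payload becomes several data: lines in the same frame.
    for chunk in data.splitlines() or [""]:
        lines.append(f"data: {chunk}")
    # A blank line terminates the event.
    return "\n".join(lines) + "\n\n"

print(format_sse("Hello"))            # data: Hello
print(format_sse("Hi", event="token"))
```

On the FastAPI side, frames like these are typically yielded from an async generator wrapped in a StreamingResponse with media_type="text/event-stream".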

Architecture

The project follows a clean architecture pattern with clear separation of concerns:

  • src/config/: Configuration management using Pydantic
  • src/ai/: AI agents and tools
  • src/mcp/: MCP server implementation
  • src/api/: FastAPI application and routes

Prerequisites

  • Python 3.12 or higher
  • Poetry for dependency management

Setup

  1. Install Poetry: If you don't have Poetry installed, follow the instructions on the official website.

  2. Clone the repository:

    git clone <repository-url>
    cd py-ai

  3. Install dependencies:

    poetry install

  4. Configure environment: Edit .env and add your API keys. If a key is not set in .env, its value is read from the system environment instead.

    Keys to be set up: GOOGLE_API_KEY, TAVILY_API_KEY, and WEATHER_API_KEY (see Required Environment Variables below).
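A hypothetical .env sketch (keys taken from the Required Environment Variables section; values are placeholders to replace with your own, and the MCP_WS_URL value must match wherever your MCP server actually listens):

```shell
GOOGLE_API_KEY=your-google-api-key
TAVILY_API_KEY=your-tavily-api-key
WEATHER_API_KEY=your-weather-api-key
MCP_WS_URL=<mcp-server-ws-url>
```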

Formatting the code base

A code formatter is included in the project. Run the following command to format the code base:

poetry run black .

To check for formatting issues without making changes, run:

poetry run black --check .

Running the Application

You need to run both the MCP server and the API server for full functionality.

Option 1: Using the main entry point

In separate terminal windows:

Terminal 1—Run the MCP server:

poetry run python main.py mcp

Terminal 2—Run the API server:

poetry run python main.py api

Option 2: Run directly with modules

Terminal 1—Run the MCP server:

poetry run python -m src.mcp.server

Terminal 2—Run the API server:

poetry run uvicorn src.api.app:app --reload --host localhost --port 8000

Terminal 3—Run the Streamlit UI app:

poetry run python main.py streamlit

Accessing the Application

Once both servers are running, the API is available at http://localhost:8000, with interactive documentation at http://localhost:8000/docs.

The assistant has access to:

  • Weather information (through MCP tools)
  • Web search capabilities (through Tavily)
  • General knowledge from the LLM

API Endpoints

  • POST /chat/stream - Stream chat responses (SSE)
  • GET /mcp?city={city} - Test MCP weather tool directly
  • GET /health - Health check endpoint
  • GET /docs - Swagger UI documentation
  • GET /redoc - ReDoc documentation
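A client consumes POST /chat/stream by reading the response line by line and collecting `data:` fields until a blank line ends the event. A stdlib-only parser sketch (`parse_sse` is an illustrative name; pair it with any streaming HTTP client, e.g. `httpx.stream` or `requests` with `stream=True`):

```python
from typing import Iterable, Iterator

def parse_sse(lines: Iterable[str]) -> Iterator[str]:
    """Yield the data payload of each SSE event from an iterable of lines."""
    data: list[str] = []
    for line in lines:
        line = line.rstrip("\n")
        if line == "":            # blank line terminates the current event
            if data:
                yield "\n".join(data)
                data = []
        elif line.startswith("data:"):
            data.append(line[5:].lstrip(" "))
        # other SSE fields (event:, id:, retry:) are ignored in this sketch

# Example: feed it a pre-recorded stream
stream = ["data: Hel\n", "\n", "data: lo\n", "\n"]
print(list(parse_sse(stream)))  # ['Hel', 'lo']
```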

Development

Code Quality Tools

Format code with Black:

poetry run black src tests

Sort imports with isort:

poetry run isort src tests

Lint with flake8:

poetry run flake8 src tests

Type check with mypy:

poetry run mypy src

Run all checks:

poetry run black src tests && poetry run isort src tests && poetry run flake8 src tests && poetry run mypy src

Running Tests

poetry run pytest

Run with coverage:

poetry run pytest --cov=src

Run specific test file:

poetry run pytest tests/test_api.py

Configuration

All configuration is managed through environment variables and the src/config/settings.py file. The key settings are listed below.

Required Environment Variables

  • GOOGLE_API_KEY: Google API key for Gemini LLM
  • TAVILY_API_KEY: Tavily API key for search functionality
  • WEATHER_API_KEY: Weather API key for weather data
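Missing keys are easiest to catch at startup. The project does this through Pydantic's settings validation; the following is a stdlib-only sketch of the same idea (`REQUIRED_KEYS` and `check_env` are illustrative names, not from this repo):

```python
import os
from typing import Mapping

REQUIRED_KEYS = ("GOOGLE_API_KEY", "TAVILY_API_KEY", "WEATHER_API_KEY")

def check_env(env: Mapping[str, str] = os.environ) -> None:
    """Raise RuntimeError listing every required key that is unset or blank."""
    missing = [k for k in REQUIRED_KEYS if not env.get(k, "").strip()]
    if missing:
        raise RuntimeError(
            f"Missing required environment variables: {', '.join(missing)}"
        )
```

Calling `check_env` before the app starts turns a confusing mid-request API-key error into a clear failure message.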

Troubleshooting

Common Issues

  1. Dependency conflicts during installation:

    poetry lock --no-update
    poetry install
  2. MCP server connection errors:

    • Ensure the MCP server is running before starting the API server
    • Check that the MCP_WS_URL in your .env matches the MCP server address
  3. API key errors:

    • Verify all required API keys are set in your .env file
    • Ensure there are no extra spaces or quotes around the API keys
  4. Port already in use:

    • Change the port in your .env file or use different ports:
    poetry run uvicorn src.api.app:app --port 8001

Docker

Build and run:

docker build -t py-ai .
docker run -p 8000:8000 -p 8001:8001 --env-file .env py-ai
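The two published ports map to the API and MCP processes. If you prefer one container per process, a hypothetical docker-compose.yml sketch could look like this (the service names, port assignments, and command values are assumptions, not from this repo):

```yaml
services:
  mcp:
    build: .
    command: poetry run python main.py mcp
    env_file: .env
    ports:
      - "8001:8001"
  api:
    build: .
    command: poetry run python main.py api
    env_file: .env
    ports:
      - "8000:8000"
    depends_on:
      - mcp
```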
