---
title: AI Code Explainer
emoji: 🧠
colorFrom: blue
colorTo: purple
sdk: streamlit
sdk_version: 1.40.0
app_file: src/streamlit_app.py
pinned: false
license: mit
tags:
  - code-analysis
  - ai
  - llm
  - groq
  - streamlit
  - education
short_description: AI-powered code analysis, explanation, and visualization
---

🧠 AI Code Explainer

Analyze, understand, and improve your code with AI

A production-ready tool that uses AI to explain code, analyze complexity, detect security issues, generate documentation, and visualize code flow.



✨ Features

📖 Code Explanation

  • High-Level Summary: Understand what code does at a glance
  • Line-by-Line Walkthrough: Step-by-step explanation of each line
  • ELI5 Mode: Simple explanations with real-world analogies
  • Audience Levels: Beginner, Intermediate, Expert

📊 Code Analysis

  • Complexity Analysis: Time & space complexity (Big O notation)
  • Security Scan: Detect common vulnerabilities (SQL injection, hardcoded keys, etc.)
  • Best Practices Review: PEP 8 compliance, naming conventions, code smells

📈 Visualization

  • Flowchart Generation: Visual representation of code logic
  • Dependency Graphs: See which functions call which
  • Mermaid Diagrams: Interactive, copy-ready diagrams

🔄 Code Improvement

  • Refactoring Suggestions: Improve readability, performance, or maintainability
  • Before/After Comparison: Side-by-side view of changes
  • Docstring Generation: Auto-generate documentation in multiple styles

💬 Interactive Chat

  • Ask Questions: "Why did you use a while loop here?"
  • Context-Aware: Remembers the conversation history
  • Deep Understanding: Get detailed answers about specific code sections

🚀 Quick Start

Prerequisites: a Groq API key, set as GROQ_API_KEY (see Configuration below).

Option 1: Run Locally with uv (Recommended)

# Clone the repository
git clone https://github.com/your-repo/ai-code-explainer.git
cd ai-code-explainer

# Create virtual environment with uv
uv venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate

# Install dependencies
uv pip install -e .

# Copy environment template and add your API key
cp .env.example .env
# Edit .env and add your GROQ_API_KEY

# Run the application
streamlit run src/streamlit_app.py

Option 2: Run with Docker

# Clone the repository
git clone https://github.com/your-repo/ai-code-explainer.git
cd ai-code-explainer

# Copy environment template
cp .env.example .env
# Edit .env and add your GROQ_API_KEY

# Run with Docker Compose (Direct Mode - Default)
docker-compose up --build

# Access at http://localhost:7860

Option 3: Deploy to Hugging Face Spaces

  1. Fork this repository
  2. Create a new Space on Hugging Face
  3. Select "Docker" as the SDK
  4. Add your GROQ_API_KEY as a secret in Space settings
  5. Push to your Space repository

πŸ—οΈ Architecture

This project follows a "Direct-First" Hybrid Architecture:

┌────────────────────────────────────────────────────────────┐
│                    AI Code Explainer                       │
├────────────────────────────────────────────────────────────┤
│                                                            │
│  ┌─────────────────┐         ┌─────────────────────────┐   │
│  │   Streamlit     │         │       Backend           │   │
│  │   Frontend      │         │                         │   │
│  │                 │  Direct │  ┌─────────────────┐    │   │
│  │  components.py  │─────────│─►│   services.py   │    │   │
│  │                 │  Import │  │ (Business Logic)│    │   │
│  │  streamlit_app  │         │  └─────────────────┘    │   │
│  │     .py         │         │           │             │   │
│  │                 │   OR    │           ▼             │   │
│  │                 │         │  ┌─────────────────┐    │   │
│  │                 │   HTTP  │  │    api.py       │    │   │
│  │                 │─────────│─►│   (FastAPI)     │    │   │
│  │                 │ (--mode │  └─────────────────┘    │   │
│  │                 │   api)  │                         │   │
│  └─────────────────┘         └─────────────────────────┘   │
│                                                            │
└────────────────────────────────────────────────────────────┘

Modes

| Mode | Description | Use Case |
|------|-------------|----------|
| Direct Mode (default) | Frontend imports backend services directly | HF Spaces, local dev |
| API Mode | Frontend calls FastAPI endpoints via HTTP | Microservices, scaling |
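A minimal sketch of how such a Direct-First switch can be wired. The flag names match the table above; the selection logic itself is hypothetical, not the app's actual startup code:

```python
import argparse

# Sketch of the Direct-First switch: default to in-process imports,
# fall back to HTTP only when --mode api is passed. Illustrative only.
def select_backend(argv=None):
    parser = argparse.ArgumentParser()
    parser.add_argument("--mode", choices=["direct", "api"], default="direct")
    parser.add_argument("--api-url", default="http://localhost:8000")
    args, _ = parser.parse_known_args(argv)
    if args.mode == "direct":
        return ("direct", None)    # would import backend.services in-process
    return ("api", args.api_url)   # would wrap HTTP calls to the FastAPI app
```

Streamlit forwards everything after `--` to the script itself, which is why the API-mode command later in this README ends with `-- --mode api --api-url http://localhost:8000`.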

Directory Structure

ai-code-explainer/
├── .env.example           # Environment template
├── Dockerfile             # Streamlit container (HF Spaces)
├── Dockerfile.api         # FastAPI container (API mode)
├── docker-compose.yml     # Multi-container orchestration
├── pyproject.toml         # uv/pip project configuration
├── requirements.txt       # pip-compatible dependencies
├── README.md              # This file
└── src/
    ├── __init__.py
    ├── streamlit_app.py   # Main entry point
    ├── frontend/
    │   ├── __init__.py
    │   └── components.py  # Reusable UI components
    └── backend/
        ├── __init__.py
        ├── config.py      # Centralized configuration
        ├── services.py    # Business logic (AI interactions)
        └── api.py         # FastAPI endpoints

⚙️ Configuration

Environment Variables

| Variable | Required | Default | Description |
|----------|----------|---------|-------------|
| GROQ_API_KEY | ✅ Yes | - | Your Groq API key |
| GROQ_BASE_URL | No | https://api.groq.com | API base URL (no /openai/v1) |
| GROQ_MODEL_NAME | No | llama-3.3-70b-versatile | AI model to use |
| MAX_CODE_LINES | No | 500 | Max lines of code to analyze |
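The variables above might be read like this. This is a sketch under the defaults documented in the table; the project's real src/backend/config.py may be organized differently:

```python
import os

# Illustrative config loader mirroring the documented variables.
# Only GROQ_API_KEY has no default and must be present.
def load_config() -> dict:
    api_key = os.environ.get("GROQ_API_KEY")
    if not api_key:
        raise RuntimeError("GROQ_API_KEY is required (see .env.example)")
    return {
        "api_key": api_key,
        "base_url": os.environ.get("GROQ_BASE_URL", "https://api.groq.com"),
        "model": os.environ.get("GROQ_MODEL_NAME", "llama-3.3-70b-versatile"),
        "max_code_lines": int(os.environ.get("MAX_CODE_LINES", "500")),
    }
```

Centralizing the reads in one function means the rest of the code never touches `os.environ` directly.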

Available Models

| Model | Speed | Quality | Best For |
|-------|-------|---------|----------|
| llama-3.3-70b-versatile | ⭐⭐⭐ | ⭐⭐⭐⭐⭐ | Complex code, detailed explanations |
| llama-3.1-8b-instant | ⭐⭐⭐⭐⭐ | ⭐⭐⭐ | Quick analysis, simple code |
| llama-3.1-70b-versatile | ⭐⭐⭐ | ⭐⭐⭐⭐ | Alternative to 3.3 |
| mixtral-8x7b-32768 | ⭐⭐⭐⭐ | ⭐⭐⭐⭐ | Large files (32K context) |
| gemma2-9b-it | ⭐⭐⭐⭐ | ⭐⭐⭐ | Efficient, balanced |

🎯 Use Cases

🎓 For Students

  • Understand complex algorithms with ELI5 explanations
  • Learn from flowchart visualizations
  • Get analogies that make concepts click

👨‍💻 For Developers

  • Document legacy code with auto-generated docstrings
  • Review code for best practices
  • Refactor for better readability

🎯 For Interview Prep

  • Analyze time/space complexity of solutions
  • Get optimization suggestions
  • Understand algorithmic patterns

🔒 For Security Review

  • Detect common vulnerabilities
  • Find hardcoded secrets
  • Review for injection risks

🔧 API Mode

For microservices architecture or when you need to scale the backend separately:

# Start both API and Streamlit in API mode
docker-compose --profile api-mode up --build

# Or manually:
# Terminal 1: Start FastAPI
uvicorn src.backend.api:app --host 0.0.0.0 --port 8000

# Terminal 2: Start Streamlit in API mode
streamlit run src/streamlit_app.py -- --mode api --api-url http://localhost:8000

API Endpoints

| Endpoint | Method | Description |
|----------|--------|-------------|
| /health | GET | Health check |
| /models | GET | List available models |
| /explain | POST | Generate code explanation |
| /analyze-complexity | POST | Analyze time/space complexity |
| /check-security | POST | Security vulnerability scan |
| /review-practices | POST | Best practices review |
| /generate-docstring | POST | Generate documentation |
| /generate-flowchart | POST | Generate Mermaid flowchart |
| /refactor | POST | Suggest refactoring |
| /chat | POST | Interactive Q&A |

API documentation available at /docs when running in API mode.
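For example, once the API container is up, /explain can be called with a plain POST. The JSON field names here ("code", "audience") are guesses for illustration; the authoritative request schema is defined in src/backend/api.py and browsable at /docs:

```python
import json
import urllib.request

# Build a POST to the /explain endpoint. Field names are assumed for
# illustration -- check the interactive /docs page for the real schema.
def explain_request(code: str, audience: str = "beginner",
                    api_url: str = "http://localhost:8000"):
    payload = json.dumps({"code": code, "audience": audience}).encode()
    return urllib.request.Request(
        f"{api_url}/explain",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# With the API running:
#   with urllib.request.urlopen(explain_request("print('hi')")) as resp:
#       print(json.load(resp))
```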


🛡️ Safety

  • ✅ No Code Execution: This tool only analyzes code, never runs it
  • ✅ No Storage: Code is sent to Groq's API but not stored permanently
  • ✅ Input Limits: Configurable limits on code size to prevent abuse
  • ✅ API Key Protection: Keys stored securely, never exposed in UI

🤝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/AmazingFeature)
  3. Commit your changes (git commit -m 'Add some AmazingFeature')
  4. Push to the branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.


🙏 Acknowledgments

Built with ❤️ using Streamlit and Groq AI

GitHub • Demo • Get API Key
