---
title: AI Code Explainer
emoji: 🧠
colorFrom: blue
colorTo: purple
sdk: streamlit
sdk_version: 1.40.0
app_file: src/streamlit_app.py
pinned: false
license: mit
short_description: AI-powered code analysis, explanation, and visualization
---

# 🧠 AI Code Explainer

Analyze, understand, and improve your code with AI.
A production-ready tool that uses AI to explain code, analyze complexity, detect security issues, generate documentation, and visualize code flow.
## Features

- **High-Level Summary**: Understand what code does at a glance
- **Line-by-Line Walkthrough**: Step-by-step explanation of each line
- **ELI5 Mode**: Simple explanations with real-world analogies
- **Audience Levels**: Beginner, Intermediate, Expert
- **Complexity Analysis**: Time & space complexity (Big O notation)
- **Security Scan**: Detect common vulnerabilities (SQL injection, hardcoded keys, etc.)
- **Best Practices Review**: PEP 8 compliance, naming conventions, code smells
- **Flowchart Generation**: Visual representation of code logic
- **Dependency Graphs**: See which functions call which
- **Mermaid Diagrams**: Interactive, copy-ready diagrams
- **Refactoring Suggestions**: Improve readability, performance, or maintainability
- **Before/After Comparison**: Side-by-side view of changes
- **Docstring Generation**: Auto-generate documentation in multiple styles
- **Ask Questions**: "Why did you use a while loop here?"
- **Context-Aware**: Remembers the conversation history
- **Deep Understanding**: Get detailed answers about specific code sections
## Installation

Prerequisites:

- Python 3.10+
- Groq API Key (free tier available)
```bash
# Clone the repository
git clone https://github.com/your-repo/ai-code-explainer.git
cd ai-code-explainer

# Create virtual environment with uv
uv venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate

# Install dependencies
uv pip install -e .

# Copy environment template and add your API key
cp .env.example .env
# Edit .env and add your GROQ_API_KEY

# Run the application
streamlit run src/streamlit_app.py
```

### Run with Docker

```bash
# Clone the repository
git clone https://github.com/your-repo/ai-code-explainer.git
cd ai-code-explainer

# Copy environment template
cp .env.example .env
# Edit .env and add your GROQ_API_KEY

# Run with Docker Compose (Direct Mode - default)
docker-compose up --build

# Access at http://localhost:7860
```

### Deploy to Hugging Face Spaces

- Fork this repository
- Create a new Space on Hugging Face
- Select "Docker" as the SDK
- Add your `GROQ_API_KEY` as a secret in the Space settings
- Push to your Space repository
## Architecture

This project follows a "Direct-First" hybrid architecture:
```
┌───────────────────────────────────────────────────────────────┐
│                       AI Code Explainer                       │
├───────────────────────────────────────────────────────────────┤
│                                                               │
│  ┌───────────────┐               ┌─────────────────────────┐  │
│  │   Streamlit   │               │         Backend         │  │
│  │   Frontend    │               │                         │  │
│  │               │    Direct     │   ┌─────────────────┐   │  │
│  │ components.py │──────────────►│   │   services.py   │   │  │
│  │               │    Import     │   │ (Business Logic)│   │  │
│  │ streamlit_app │               │   └────────┬────────┘   │  │
│  │ .py           │      OR       │            ▼            │  │
│  │               │               │   ┌─────────────────┐   │  │
│  │               │     HTTP      │   │     api.py      │   │  │
│  │               │──────────────►│   │    (FastAPI)    │   │  │
│  │               │   (--mode     │   └─────────────────┘   │  │
│  │               │     api)      │                         │  │
│  └───────────────┘               └─────────────────────────┘  │
│                                                               │
└───────────────────────────────────────────────────────────────┘
```
| Mode | Description | Use Case |
|---|---|---|
| Direct Mode (Default) | Frontend imports backend services directly | HF Spaces, local dev |
| API Mode | Frontend calls FastAPI endpoints via HTTP | Microservices, scaling |
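The mode switch above can be sketched roughly as follows. This is an illustrative sketch only: `get_backend`, `DirectBackend`, and `ApiBackend` are hypothetical names, not the project's actual classes.

```python
# Hypothetical sketch of the "Direct-First" dispatch. In Direct Mode the
# frontend imports the service layer; in API Mode it talks to FastAPI
# over HTTP. All names here are illustrative.
import argparse


class DirectBackend:
    """Calls backend services via plain Python imports (Direct Mode)."""

    def explain(self, code: str) -> str:
        # The real app would call functions from src.backend.services here.
        return f"direct: analyzed {len(code)} characters"


class ApiBackend:
    """Calls the FastAPI backend over HTTP (API Mode)."""

    def __init__(self, api_url: str):
        self.api_url = api_url

    def explain(self, code: str) -> str:
        # The real app would POST the code to f"{self.api_url}/explain".
        return f"api: would POST to {self.api_url}/explain"


def get_backend(argv: list[str]):
    """Pick a backend from CLI flags, defaulting to Direct Mode."""
    parser = argparse.ArgumentParser()
    parser.add_argument("--mode", choices=["direct", "api"], default="direct")
    parser.add_argument("--api-url", default="http://localhost:8000")
    args, _ = parser.parse_known_args(argv)
    if args.mode == "api":
        return ApiBackend(args.api_url)
    return DirectBackend()
```

With no flags, `get_backend([])` returns the direct backend; passing `["--mode", "api"]` switches to the HTTP client, mirroring `streamlit run src/streamlit_app.py -- --mode api`.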
## Project Structure

```
ai-code-explainer/
├── .env.example          # Environment template
├── Dockerfile            # Streamlit container (HF Spaces)
├── Dockerfile.api        # FastAPI container (API mode)
├── docker-compose.yml    # Multi-container orchestration
├── pyproject.toml        # uv/pip project configuration
├── requirements.txt      # pip-compatible dependencies
├── README.md             # This file
└── src/
    ├── __init__.py
    ├── streamlit_app.py  # Main entry point
    ├── frontend/
    │   ├── __init__.py
    │   └── components.py # Reusable UI components
    └── backend/
        ├── __init__.py
        ├── config.py     # Centralized configuration
        ├── services.py   # Business logic (AI interactions)
        └── api.py        # FastAPI endpoints
```
## Configuration

| Variable | Required | Default | Description |
|---|---|---|---|
| `GROQ_API_KEY` | ✅ Yes | - | Your Groq API key |
| `GROQ_BASE_URL` | No | `https://api.groq.com` | API base URL (no `/openai/v1`) |
| `GROQ_MODEL_NAME` | No | `llama-3.3-70b-versatile` | AI model to use |
| `MAX_CODE_LINES` | No | `500` | Max lines to analyze |
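A matching `.env` might look like this (the key value is a placeholder; only `GROQ_API_KEY` is required, the rest fall back to the defaults above):

```bash
# .env — only GROQ_API_KEY is required
GROQ_API_KEY=gsk_your_key_here  # placeholder, use your own key
GROQ_BASE_URL=https://api.groq.com
GROQ_MODEL_NAME=llama-3.3-70b-versatile
MAX_CODE_LINES=500
```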
## Supported Models

| Model | Speed | Quality | Best For |
|---|---|---|---|
| `llama-3.3-70b-versatile` | ⭐⭐⭐ | ⭐⭐⭐⭐⭐ | Complex code, detailed explanations |
| `llama-3.1-8b-instant` | ⭐⭐⭐⭐⭐ | ⭐⭐⭐ | Quick analysis, simple code |
| `llama-3.1-70b-versatile` | ⭐⭐⭐ | ⭐⭐⭐⭐ | Alternative to 3.3 |
| `mixtral-8x7b-32768` | ⭐⭐⭐⭐ | ⭐⭐⭐⭐ | Large files (32K context) |
| `gemma2-9b-it` | ⭐⭐⭐⭐ | ⭐⭐⭐ | Efficient, balanced |
## Use Cases

**Learning**

- Understand complex algorithms with ELI5 explanations
- Learn from flowchart visualizations
- Get analogies that make concepts click

**Code Maintenance**

- Document legacy code with auto-generated docstrings
- Review code for best practices
- Refactor for better readability

**Interview Prep**

- Analyze time/space complexity of solutions
- Get optimization suggestions
- Understand algorithmic patterns

**Security Audits**

- Detect common vulnerabilities
- Find hardcoded secrets
- Review for injection risks
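The "find hardcoded secrets" check can be approximated with a few regular expressions. This standalone sketch is not the project's actual scanner; the patterns (and the Groq-style `gsk_` prefix) are illustrative assumptions:

```python
import re

# Illustrative patterns only — a real scanner will be more thorough.
SECRET_PATTERNS = [
    # Assignments like API_KEY = "..." / password = '...'
    re.compile(r"""(?i)(api[_-]?key|secret|password|token)\s*=\s*["'][^"']+["']"""),
    # Groq-style key prefix (assumption about the key format)
    re.compile(r"gsk_[A-Za-z0-9]{20,}"),
]


def find_hardcoded_secrets(code: str) -> list[tuple[int, str]]:
    """Return (line_number, stripped_line) pairs that look like secrets."""
    hits = []
    for lineno, line in enumerate(code.splitlines(), start=1):
        if any(p.search(line) for p in SECRET_PATTERNS):
            hits.append((lineno, line.strip()))
    return hits
```

Running it on `'db = connect()\nAPI_KEY = "abc123"\n'` flags line 2 only; the app itself performs this kind of analysis through the LLM rather than fixed regexes.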
## API Mode

For a microservices architecture, or when you need to scale the backend separately:

```bash
# Start both API and Streamlit in API mode
docker-compose --profile api-mode up --build

# Or manually:
# Terminal 1: Start FastAPI
uvicorn src.backend.api:app --host 0.0.0.0 --port 8000

# Terminal 2: Start Streamlit in API mode
streamlit run src/streamlit_app.py -- --mode api --api-url http://localhost:8000
```

### API Endpoints

| Endpoint | Method | Description |
|---|---|---|
| `/health` | GET | Health check |
| `/models` | GET | List available models |
| `/explain` | POST | Generate code explanation |
| `/analyze-complexity` | POST | Analyze time/space complexity |
| `/check-security` | POST | Security vulnerability scan |
| `/review-practices` | POST | Best practices review |
| `/generate-docstring` | POST | Generate documentation |
| `/generate-flowchart` | POST | Generate Mermaid flowchart |
| `/refactor` | POST | Suggest refactoring |
| `/chat` | POST | Interactive Q&A |
Interactive API documentation is available at `/docs` when running in API mode.
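A request to the `/explain` endpoint might be built like this. The payload field names (`code`, `audience`) are assumptions for illustration; check the live schema at `/docs` before relying on them:

```python
# Hypothetical client for the /explain endpoint — field names are assumed.
import json
from urllib import request


def build_explain_request(api_url: str, code: str, audience: str = "beginner"):
    """Construct a POST request to the (assumed) /explain endpoint."""
    payload = {"code": code, "audience": audience}
    return request.Request(
        f"{api_url}/explain",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = build_explain_request("http://localhost:8000", "print('hi')")
# To actually send it (requires the API to be running):
# with request.urlopen(req) as resp:
#     print(json.load(resp))
```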
## Security & Privacy

- ✅ **No Code Execution**: This tool only analyzes code, never runs it
- ✅ **No Storage**: Code is sent to Groq's API but not stored permanently
- ✅ **Input Limits**: Configurable limits on code size to prevent abuse
- ✅ **API Key Protection**: Keys are stored securely and never exposed in the UI
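The input limit can be as simple as counting lines against `MAX_CODE_LINES` before anything is sent to the API. A minimal sketch, assuming a hypothetical `validate_code_size` helper (not the project's actual validator):

```python
import os

# Illustrative helper — mirrors the MAX_CODE_LINES setting documented above.
MAX_CODE_LINES = int(os.environ.get("MAX_CODE_LINES", "500"))


def validate_code_size(code: str, max_lines: int = MAX_CODE_LINES) -> str:
    """Reject oversized input before it is ever sent to the LLM API."""
    n = len(code.splitlines())
    if n > max_lines:
        raise ValueError(f"Code has {n} lines; the limit is {max_lines}.")
    return code
```

Rejecting the request up front keeps oversized payloads from consuming API quota or hitting the model's context limit.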
## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

- Fork the repository
- Create your feature branch (`git checkout -b feature/AmazingFeature`)
- Commit your changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
## License

This project is licensed under the MIT License - see the LICENSE file for details.
## Acknowledgments

- Groq for blazing-fast LLM inference
- Streamlit for the amazing web framework
- FastAPI for the robust API framework
- Hugging Face for hosting and deployment
Built with ❤️ using Streamlit and Groq AI
GitHub • Demo • Get API Key