*(Demo video: `repo-explainer-demo-1.mp4`)*
An AI app that explains GitHub repositories through agentic file exploration.
- AI-powered summaries -- Paste a GitHub repo URL and get an overview, architecture diagram, directory tree, and tech stack.
- Multiple AI providers -- Switch between Claude (Anthropic) and Gemini (Google) with a single env var (`AI_PROVIDER`). Easy to add more providers.
- Ask what you need -- Add instructions (e.g. "Focus on API design") and the explanation is tailored to your question.
- Smart file discovery -- The AI chooses which files to read from the repo tree; we fetch them in parallel for fast results.
- Live status updates -- The UI streams progress (validating, fetching tree, AI exploring files, fetching contents, generating explanation) so you see what’s happening at each step.
- Safe for large repos -- Context limits and clear errors (e.g. "repository too large", "rate limit") instead of cryptic failures.
- Polished UI -- Dark/light theme, example repos and prompts, compact layout.
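The smart-file-discovery step above fetches the AI-selected files in parallel. A minimal sketch of that pattern using Python's standard `concurrent.futures`; the `fetch_file` helper and the paths are hypothetical stand-ins, not the app's actual code:

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_file(path: str) -> tuple[str, str]:
    # Hypothetical stand-in: a real implementation would request the
    # file's contents from the GitHub API here.
    return path, f"contents of {path}"

# Paths the AI chose from the repo tree (illustrative values).
paths = ["README.md", "backend/main.py", "frontend/src/App.tsx"]

# Fetch all chosen files concurrently instead of one at a time.
with ThreadPoolExecutor(max_workers=8) as pool:
    files = dict(pool.map(fetch_file, paths))
```

Because the fetches are I/O-bound, a thread pool like this keeps total latency close to the slowest single request rather than the sum of all of them.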
Like this project? Star the repo on GitHub -- it helps others find it. Want to improve it? Contributions are welcome! See Contributing below.
Prerequisites: Python 3.11+
- Clone the repo and go to its root.
- Create and activate a virtual environment, then install the package.
Using pip:

```
python -m venv .venv
# Windows:
.venv\Scripts\activate
# Unix/macOS:
source .venv/bin/activate
pip install -e .
```

Using uv:

```
uv venv
# Windows:
.venv\Scripts\activate
# Unix/macOS:
source .venv/bin/activate
uv pip install -e .
```

- Copy env: create a `.env` at the project root (or in `backend/`) and set variables. See `backend/.env.example` and `ENV_SETUP.md` (if present) for `DATABASE_URL`, `AI_PROVIDER`, `ANTHROPIC_API_KEY`, `GEMINI_API_KEY`, `GITHUB_TOKEN`, `CORS_ORIGINS`, etc.
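A sketch of what such a `.env` might look like. The variable names come from the list above; every value is a placeholder to replace with your own:

```
# .env (placeholder values -- use your own keys)
DATABASE_URL=postgresql+psycopg2://user:pass@localhost:5432/repo_explainer
AI_PROVIDER=claude
ANTHROPIC_API_KEY=your-claude-key
# GEMINI_API_KEY=your-gemini-key   # if AI_PROVIDER=gemini
# GITHUB_TOKEN=your-github-token   # optional, raises GitHub rate limits
CORS_ORIGINS=http://localhost:5173
```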
Prerequisites: Node.js 18+ (npm or bun)
- From repo root, go to the frontend and install dependencies:
```
cd frontend
# npm:
npm install
# or bun:
bun install
```

- Set the backend URL for development: create `frontend/.env.development` with:

```
VITE_BACKEND_API_URL=http://127.0.0.1:8000
```

See `ENV_SETUP.md` for production and other options.
- Dev dependencies (optional): `pip install -e '.[dev]'` or `uv pip install -e '.[dev]'` (pytest, black, ruff, mypy, etc.).
- Run API: from repo root, `fastapi dev backend/main.py` (serves at http://127.0.0.1:8000).
- Run tests: `pytest backend/tests/` (or `pytest backend/tests/ -v`).
- Lint / format: `ruff check .` and `black .` (config in `pyproject.toml`).
- Local Postgres: `docker compose up -d`, then set `DATABASE_URL` in `.env`. See `ENV_SETUP.md`.
- Run dev server: from `frontend/`, `npm run dev` or `bun run dev` (Vite, usually http://localhost:5173).
- Build: `npm run build` or `bun run build`.
- Lint: `npm run lint` or `bun run lint`.
- Preview production build: `npm run preview` or `bun run preview`.
Run the backend API first so the frontend can talk to it; use the same `VITE_BACKEND_API_URL` as in `frontend/.env.development`.
- `GET /{owner}/{repo}/stream?instructions=...` -- SSE (Server-Sent Events) stream used by the frontend: sends status events (`validating`, `fetching_tree`, `exploring_files`, `fetching_files`, `generating_explanation`), then a single `result` or `error` event with the explanation or message.
- `GET /{owner}/{repo}?instructions=...` -- Optional one-shot JSON response (same payload as the final `result` event). Useful for scripts or clients that don’t need streaming.
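For scripts that consume the `/stream` endpoint without a browser `EventSource`, the SSE wire format is plain text: `event:` and `data:` lines separated by blank lines. A minimal, dependency-free parsing sketch; the sample payload and its JSON fields are illustrative assumptions, not the endpoint's exact schema:

```python
import json

def parse_sse(stream_text: str) -> list[tuple[str, str]]:
    """Parse an SSE payload into (event, data) pairs.

    Minimal sketch: assumes each event uses `event:` and `data:` lines
    terminated by a blank line, as the /stream endpoint above does.
    """
    events = []
    event_type, data_lines = "message", []
    for line in stream_text.splitlines():
        if line.startswith("event:"):
            event_type = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data_lines.append(line[len("data:"):].strip())
        elif line == "":  # blank line ends the current event
            if data_lines:
                events.append((event_type, "\n".join(data_lines)))
            event_type, data_lines = "message", []
    return events

# Sample payload shaped like the stream described above: status events,
# then a final `result` event. The JSON field names are assumptions.
sample = (
    "event: status\ndata: {\"step\": \"validating\"}\n\n"
    "event: status\ndata: {\"step\": \"fetching_tree\"}\n\n"
    "event: result\ndata: {\"explanation\": \"...\"}\n\n"
)
for kind, payload in parse_sse(sample):
    print(kind, json.loads(payload))
```

In practice you would feed this parser chunks read from the HTTP response instead of a fixed string, flushing an event each time a blank line arrives.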
Run backend, frontend, and Postgres with a single command (from repo root):
```
# Create .env at repo root (same folder as docker-compose.yml) with AI_PROVIDER=claude or gemini,
# the corresponding ANTHROPIC_API_KEY or GEMINI_API_KEY, and optionally GITHUB_TOKEN.
# Copy from backend/.env.example, then add your keys. Docker Compose needs this file to exist.
docker compose up --build
```

Then open http://localhost:5173 (frontend) and http://localhost:8000 (API docs). The backend runs migrations on startup. To run in the background: `docker compose up -d`.
The backend can run in a container for deployment on any cloud (Render, Fly.io, Cloud Run, ECS, etc.).
Build (from repo root):
Build (from repo root):

```
docker build -t repo-explainer .
```

Run locally (Claude):

```
docker run -p 8000:8000 \
  -e DATABASE_URL="postgresql+psycopg2://user:pass@host:5432/db" \
  -e AI_PROVIDER="claude" \
  -e ANTHROPIC_API_KEY="your-claude-key" \
  -e CORS_ORIGINS="http://localhost:5173" \
  repo-explainer
```

Run locally (Gemini):

```
docker run -p 8000:8000 \
  -e DATABASE_URL="postgresql+psycopg2://user:pass@host:5432/db" \
  -e AI_PROVIDER="gemini" \
  -e GEMINI_API_KEY="your-gemini-key" \
  -e MODEL="gemini-2.0-flash" \
  -e CORS_ORIGINS="http://localhost:5173" \
  repo-explainer
```

- The image uses `PORT` (default 8000); set `-e PORT=8000` or let your platform set it.
- `MODEL` overrides the default model for the selected provider (e.g. `claude-haiku-4-5-20251001`, `gemini-2.0-flash`, `gemini-3-pro-preview`).
- Run migrations before or after start (e.g. a separate job or init container): `alembic upgrade head` with the same `DATABASE_URL`.
- See `backend/.env.example` for all env vars.
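The `AI_PROVIDER`/`MODEL` pair can resolve to a concrete model as sketched below. The function name and the default table are illustrative (the defaults mirror the example model names above), not the backend's actual code:

```python
import os

# Illustrative defaults mirroring the example model names above.
DEFAULT_MODELS = {
    "claude": "claude-haiku-4-5-20251001",
    "gemini": "gemini-2.0-flash",
}

def pick_provider() -> tuple[str, str]:
    """Resolve (provider, model) from env vars; MODEL overrides the default."""
    provider = os.environ.get("AI_PROVIDER", "claude").lower()
    if provider not in DEFAULT_MODELS:
        raise ValueError(f"Unsupported AI_PROVIDER: {provider!r}")
    model = os.environ.get("MODEL") or DEFAULT_MODELS[provider]
    return provider, model
```

Keeping the provider-to-default mapping in one dict is what makes "easy to add more providers" cheap: a new provider is one new entry plus its client code.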
Migrations live in `backend/alembic/`. Run from repo root (where `alembic.ini` lives):

- `alembic upgrade head` -- apply all migrations
- `alembic downgrade -1` -- roll back one revision
- `alembic revision --autogenerate -m "description"` -- new migration from models
- `alembic current` -- show current revision
- `alembic history -v` -- show history
Set `DATABASE_URL` in `.env` before running. See `ENV_SETUP.md`.
We welcome contributions -- whether it’s a bug fix, a new feature, or better docs. Here’s how to get started.
- Star the repo -- If you find this useful, starring helps others discover it.
- Open an issue -- Report bugs or suggest ideas in GitHub Issues. Check existing issues first to avoid duplicates.
- Submit a pull request -- For code or doc changes:
  - Fork the repo and create a branch from `main` (e.g. `fix/typo-readme` or `feat/your-feature`).
  - Make your changes. Keep commits focused and messages clear.
  - Run tests and linting (see Development).
  - Open a PR against `main` with a short description of what changed and why. Link any related issue.
- Code style: Backend -- follow ruff and Black (config in `pyproject.toml`). Frontend -- use the existing ESLint config.
- Tests: Add or update tests for behavior changes; run `pytest backend/tests/` before submitting.
- Docs: Update the README or `ENV_SETUP.md` if you change setup, env vars, or usage.
- Scope: Keep PRs reasonably scoped; for large features, open an issue first to discuss.
Thank you for contributing!