Understand your Kubernetes manifests with AI-powered and rule-based explanations
Features • Quick Start • Usage • Configuration • API Docs
Kubernetes YAML Explainer is a self-contained, Dockerized web application that helps you understand Kubernetes and k3s YAML manifests. It validates manifests against known schemas and explains their structure and meaning in clear English.
The application works entirely offline using an internal rule-based engine, but also supports optional integration with remote LLM backends (OpenAI, Anthropic Claude, local Ollama, etc.) for richer natural-language explanations.
- **Parse & Validate** - Parse multi-document YAML and validate against Kubernetes schemas
- **Rule-Based Explanations** - 80+ built-in explanations for common Kubernetes fields
- **Optional AI Integration** - Connect to OpenAI, Claude, or local LLMs for enhanced explanations
- **Modern Glossy UI** - Glassmorphism design with light and dark themes
- **Secure & Private** - All API keys encrypted, works offline, no telemetry
- **Fully Dockerized** - One-command deployment with persistent storage
- **YAML Wizard** - Generate common resources (Deployment, Service, Ingress, ConfigMap)
- **YAML Upload & Parsing**
  - Upload `.yaml` or `.yml` files or paste content directly
  - Multi-document YAML support (separated by `---`; see the example after this list)
  - Identifies and structures each Kubernetes resource
- **Schema Validation**
  - Validates required fields and structure
  - Detects deprecated API versions
  - Provides actionable suggestions for fixes
  - Highlights errors, warnings, and informational issues
- **Rule-Based Explanation Engine (Offline)**
  - Maps 80+ field paths to human-readable explanations
  - Covers Deployments, Services, Pods, Ingresses, ConfigMaps, and more
  - Explains purpose, behavior, and cluster impact
  - Works completely offline without external dependencies
- **LLM Integration (Optional)**
  - Configure OpenAI, Anthropic Claude, or custom endpoints
  - API keys encrypted with AES-256 and persisted
  - Test connections before saving
  - Enhanced natural-language summaries
  - Multiple LLM configurations supported
- **YAML Wizard**
  - Guided creation for common resources
  - Step-by-step forms with validation
  - Generates production-ready YAML
  - Supports: Deployment, Service, Ingress, ConfigMap
- **Modern UI**
  - Glossy glassmorphism design
  - Light/dark theme with smooth transitions
  - Monaco Editor with syntax highlighting
  - Responsive layout for desktop and mobile
  - Keyboard shortcuts and accessibility
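As a quick illustration of multi-document parsing, the sketch below posts two `---`-separated resources to the parse endpoint. It assumes `/api/v1/parse` accepts the same `content` field used by the `/api/v1/explain` example further down; check the Swagger docs if the schema differs.

```bash
# Post a two-document manifest (ConfigMap + Pod) to the parse endpoint.
# Assumes the default port 8080 and a "content" request field, mirroring
# the /api/v1/explain example in the API section below.
curl -X POST http://localhost:8080/api/v1/parse \
  -H "Content-Type: application/json" \
  -d '{
    "content": "apiVersion: v1\nkind: ConfigMap\nmetadata:\n  name: demo-config\ndata:\n  greeting: hello\n---\napiVersion: v1\nkind: Pod\nmetadata:\n  name: demo\nspec:\n  containers:\n  - name: demo\n    image: nginx:1.25"
  }'
```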
- Docker (20.10+) and Docker Compose (2.0+)
- OR Node.js 18+ and Python 3.11+ for local development
- **Clone the repository**

  ```bash
  git clone <repository-url>
  cd kubernetes-yaml-explainer
  ```

- **Start the application**

  ```bash
  docker-compose up -d
  ```

- **Access the application**

  Open your browser to: http://localhost:8080

That's it! The app is now running with persistent storage.
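To confirm the stack came up cleanly, check the container status and hit the app; a quick sanity check, assuming the default port mapping from `docker-compose.yml`:

```bash
# List the containers defined in docker-compose.yml and their state
docker-compose ps

# The app should respond on port 8080; -I fetches headers only
curl -I http://localhost:8080
```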
```bash
cd backend
python -m venv venv
source venv/bin/activate   # On Windows: venv\Scripts\activate
pip install -r requirements.txt
python main.py
```

Backend runs at: http://localhost:8000

```bash
cd frontend
npm install
npm start
```

Frontend runs at: http://localhost:3000 (proxied to backend)
- Click Upload to select a `.yaml` file from your computer
- Or paste YAML content directly into the Monaco Editor
- Or use the New YAML wizard to generate manifests
- Click Validate to check for errors and warnings
- Review issues in the Validation tab
- Each issue includes the field path, message, and suggestion
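Validation is also available over the API. A minimal sketch, assuming `/api/v1/validate` takes the same `content` field as the `/api/v1/explain` example shown later:

```bash
# Validate a manifest; the response lists issues with field paths,
# messages, and suggestions, as in the Validation tab.
curl -X POST http://localhost:8080/api/v1/validate \
  -H "Content-Type: application/json" \
  -d '{
    "content": "apiVersion: apps/v1\nkind: Deployment\nmetadata:\n  name: web\nspec:\n  replicas: 2"
  }'
```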
- Click Explain for rule-based explanations (works offline)
- Click Explain with AI for LLM-enhanced summaries (requires LLM setup)
- View results in three tabs:
  - Summary - High-level overview of all resources
  - Field Details - Field-by-field explanations
  - Validation - Errors, warnings, and suggestions
- Click Export to download the YAML file
- Content modified in the editor is automatically included in the downloaded file
- Click the Settings icon (⚙️) in the top-right corner
- Navigate to LLM Configuration
- Click Add New LLM Connection
- Fill in the details:
  - Provider Name: e.g., "OpenAI", "Anthropic", "Local Ollama"
  - API Endpoint: Full URL including path
    - OpenAI: `https://api.openai.com/v1/chat/completions`
    - Anthropic: `https://api.anthropic.com/v1/messages`
    - Ollama: `http://localhost:11434/api/chat`
  - API Key: Your authentication key
  - Model Name: e.g., `gpt-3.5-turbo`, `claude-3-sonnet-20240229`
- Click Test Connection to verify
- Click Save to persist (encrypted)
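The test endpoint can also be exercised from the command line. The request fields below are illustrative assumptions, not the confirmed schema, so consult the Swagger docs for the exact shape:

```bash
# Hypothetical body: field names (provider_name, endpoint, api_key,
# model) are assumptions; adjust to match /docs before relying on this.
curl -X POST http://localhost:8080/api/v1/llm/test \
  -H "Content-Type: application/json" \
  -d '{
    "provider_name": "Local Ollama",
    "endpoint": "http://localhost:11434/api/chat",
    "api_key": "",
    "model": "llama3"
  }'
```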
Create a `.env` file based on `.env.example`:

```bash
# Server Port
PORT=8080

# Database (SQLite by default)
DATABASE_URL=sqlite+aiosqlite:///./data/app.db

# Encryption Key (CHANGE THIS!)
ENCRYPTION_KEY=your-secure-random-key-here

# Optional: PostgreSQL
# DATABASE_URL=postgresql+asyncpg://user:password@localhost:5432/k8s_explainer
```

Always change `ENCRYPTION_KEY` in production!
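Any high-entropy random string works as the master key, since the AES key is derived from it via PBKDF2. One way to generate one, assuming `openssl` is available:

```bash
# Generate a 32-byte random key, base64-encoded, and append it to .env
echo "ENCRYPTION_KEY=$(openssl rand -base64 32)" >> .env
```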
- Click the Sun/Moon icon to toggle between light and dark themes
- Your preference is saved automatically
```
┌─────────────────┐
│    React SPA    │  ← Glossy UI with Monaco Editor
│   (Frontend)    │
└────────┬────────┘
         │ HTTP/REST
┌────────▼────────┐
│     FastAPI     │  ← YAML parsing, validation, explanations
│    (Backend)    │
└────────┬────────┘
         │
    ┌────┴─────────┐
    │              │
┌───▼───┐  ┌───────▼───┐
│ SQLite│  │ Optional  │
│  DB   │  │  LLM API  │
└───────┘  └───────────┘
```
- Frontend: React with Monaco Editor, Lucide icons, custom CSS
- Backend: FastAPI with async SQLAlchemy, ruamel.yaml parser
- Database: SQLite (default) or PostgreSQL for settings and LLM configs
- Encryption: AES-256 via cryptography library
- LLM Support: OpenAI, Anthropic, Ollama, and OpenAI-compatible APIs
Once running, access interactive API docs:
- Swagger UI: http://localhost:8080/docs
- ReDoc: http://localhost:8080/redoc
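FastAPI also serves the raw OpenAPI schema, which is handy for generating API clients:

```bash
# Download the OpenAPI 3 schema that powers Swagger UI and ReDoc
curl http://localhost:8080/openapi.json -o openapi.json
```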
| Endpoint | Method | Description |
|---|---|---|
| `/api/v1/parse` | POST | Parse YAML into structured resources |
| `/api/v1/validate` | POST | Validate manifest against schemas |
| `/api/v1/explain` | POST | Generate explanations (rule-based + optional LLM) |
| `/api/v1/generate` | POST | Generate YAML via wizard |
| `/api/v1/settings` | GET/PUT | User settings (theme, preferences) |
| `/api/v1/llm/config` | GET/POST/DELETE | Manage LLM connections |
| `/api/v1/llm/test` | POST | Test LLM connection |
```bash
curl -X POST http://localhost:8080/api/v1/explain \
  -H "Content-Type: application/json" \
  -d '{
    "content": "apiVersion: v1\nkind: Pod\nmetadata:\n  name: nginx\nspec:\n  containers:\n  - name: nginx\n    image: nginx:latest",
    "use_llm": false
  }'
```

```
kubernetes-yaml-explainer/
├── backend/
│   ├── app/
│   │   ├── models.py              # SQLAlchemy models
│   │   ├── schemas.py             # Pydantic schemas
│   │   ├── database.py            # Database configuration
│   │   ├── routes/                # API endpoints
│   │   │   ├── yaml_routes.py
│   │   │   └── settings_routes.py
│   │   └── services/              # Business logic
│   │       ├── yaml_service.py    # YAML parsing & validation
│   │       ├── explainer.py       # Rule-based explanations
│   │       ├── llm_service.py     # LLM integration
│   │       └── crypto_service.py  # Encryption
│   ├── main.py                    # FastAPI app entry point
│   └── requirements.txt           # Python dependencies
├── frontend/
│   ├── public/
│   │   └── index.html
│   ├── src/
│   │   ├── components/            # React components
│   │   │   ├── SettingsModal.js
│   │   │   └── WizardModal.js
│   │   ├── services/
│   │   │   └── api.js             # API client
│   │   ├── styles/
│   │   │   └── App.css            # Glassmorphism styles
│   │   ├── App.js                 # Main app component
│   │   └── index.js               # React entry point
│   └── package.json               # Node dependencies
├── docker/
│   └── Dockerfile                 # Multi-stage Docker build
├── k8s/                           # Example YAML files
├── docs/                          # Additional documentation
├── docker-compose.yml             # Docker Compose config
├── .env.example                   # Environment template
└── README.md                      # This file
```
Try these example manifests located in the `k8s/` directory:

- `example-deployment.yaml` - Nginx deployment with resource limits and probes
- `example-service.yaml` - LoadBalancer service
- `example-ingress.yaml` - Ingress with TLS
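To run all of the bundled examples through the validator in one go (same `content`-field assumption as the API examples above; `jq` is used only to wrap each file into a JSON body):

```bash
# POST each example manifest to the validate endpoint
for f in k8s/example-*.yaml; do
  echo "== $f =="
  jq -Rs '{content: .}' "$f" |
    curl -s -X POST http://localhost:8080/api/v1/validate \
      -H "Content-Type: application/json" -d @-
  echo
done
```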
- API Key Encryption: All LLM API keys are encrypted using AES-256 with PBKDF2 key derivation
- Master Key: Set the `ENCRYPTION_KEY` environment variable (the default is insecure)
- CORS Protection: Configurable allowed origins
- Safe YAML Parsing: Uses the `ruamel.yaml` safe loader (no code execution)
- No Telemetry: Zero external requests unless an LLM is explicitly configured
- Rate Limiting: (Coming soon) Protect LLM endpoints from abuse
```bash
# Check Docker logs
docker-compose logs -f

# Rebuild containers
docker-compose up --build

# Remove and recreate the database
docker-compose down -v
docker-compose up
```

- Verify the endpoint URL is correct (include the full path)
- Check API key is valid
- Ensure network connectivity to LLM provider
- Review backend logs for detailed error messages
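For a local Ollama backend specifically, you can rule out connectivity problems by querying Ollama directly; its `/api/tags` endpoint lists the installed models:

```bash
# If this fails, the problem is Ollama or networking, not the explainer
curl http://localhost:11434/api/tags
```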
- Ensure the backend is running on port 8000 (dev) or 8080 (prod)
- Check the CORS configuration in `backend/main.py`
- Verify the proxy setting in `frontend/package.json`
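If the frontend uses Create React App's proxy convention (an assumption based on the "proxied to backend" note above), the setting is a single field in `package.json`:

```bash
# Should print something like:  "proxy": "http://localhost:8000"
grep '"proxy"' frontend/package.json
```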
- AI-assisted YAML generation ("create a Deployment for nginx with 3 replicas")
- Policy scanning integration (OPA, Kyverno)
- Diff/compare mode for multiple YAMLs
- Helm chart support
- Kustomize overlay explanations
- Export explanations as Markdown/PDF
- Offline Kubernetes API reference cache
- Multi-language support
- Browser extension
- VS Code extension
This project is provided as-is for educational and practical use.
- Kubernetes community for comprehensive API documentation
- Monaco Editor for the excellent code editor
- FastAPI for the elegant Python web framework
- React for the powerful UI library
Crafted with ⚡ by RoarinPenguin
Contributions are welcome! Please feel free to submit issues, feature requests, or pull requests.
- Fork the repository
- Create your feature branch (`git checkout -b feature/AmazingFeature`)
- Commit your changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
⭐ If you find this project useful, please consider giving it a star! ⭐