A modern, cross-platform GUI for Garak, NVIDIA's LLM vulnerability scanner. Aegis provides an intuitive interface for scanning language models for security vulnerabilities, including jailbreaks, prompt injection, toxicity, and more.
Project structure:

```
aegis/
├── backend/           # FastAPI backend service
│   ├── api/           # API routes
│   ├── models/        # Data models
│   ├── services/      # Business logic
│   ├── config/        # Configuration
│   └── main.py        # Entry point
├── frontend/          # Flutter desktop application
│   ├── lib/           # Dart source code
│   ├── assets/        # Images, fonts, etc.
│   └── pubspec.yaml   # Flutter dependencies
└── README.md          # This file
```
Features:

- Quick Scan: Fast vulnerability scanning with preset configurations
- Full Scan: Comprehensive testing with all available probes
- Browse Probes: Explore all available vulnerability probes
- Scan History: View and analyze past scan results
- Real-time Progress: WebSocket-based live scan monitoring
- Rich Results: Interactive HTML reports with detailed breakdowns
Prerequisites:

- Python 3.8+ (for the backend)
- Flutter 3.0+ (for frontend)
- Garak (LLM vulnerability scanner)
- Ollama/OpenAI/Anthropic (or other supported LLM providers)
Install Garak:

```bash
pip install garak
```

Set up the backend:

```bash
cd backend
python -m venv venv
source venv/bin/activate   # On Windows: venv\Scripts\activate
pip install -r requirements.txt
```

Install the frontend dependencies:

```bash
cd frontend
flutter pub get
```

Start the backend:

```bash
cd backend
python main.py
```

The backend will start on http://localhost:8888.

Run the frontend:

```bash
cd frontend
flutter run -d macos   # Or: -d windows, -d linux
```

Create a .env file in the backend directory:

```
# Server settings
HOST=0.0.0.0
PORT=8888
LOG_LEVEL=INFO
# Optional: Custom garak path
GARAK_PATH=/path/to/garak
# Optional: API keys for LLM providers
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
```
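For reference, the backend could read these variables roughly as follows. This is a minimal sketch assuming python-dotenv, not the actual backend/config module:

```python
# Minimal sketch of reading the .env settings above; the real backend/config
# module may differ. Requires: pip install python-dotenv
import os

from dotenv import load_dotenv

load_dotenv()  # load backend/.env into the process environment

HOST = os.getenv("HOST", "0.0.0.0")
PORT = int(os.getenv("PORT", "8888"))
LOG_LEVEL = os.getenv("LOG_LEVEL", "INFO")
GARAK_PATH = os.getenv("GARAK_PATH")              # optional custom garak path
OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")      # optional provider keys
ANTHROPIC_API_KEY = os.getenv("ANTHROPIC_API_KEY")
```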
The frontend automatically connects to http://localhost:8888. To change this, edit:

```dart
// frontend/lib/config/constants.dart
class ApiConstants {
  static const String baseUrl = 'http://localhost:8888';
}
```

Backend development:

```bash
cd backend
# Run with auto-reload
python main.py
# Run tests
pytest tests/
# Format code
black .
```

Frontend development:

```bash
cd frontend
# Hot reload is enabled by default
flutter run -d macos
# Generate code (if using code generation)
flutter pub run build_runner build --delete-conflicting-outputs
# Run tests
flutter test
```

Once the backend is running, visit:

- Swagger UI: http://localhost:8888/docs
- ReDoc: http://localhost:8888/redoc
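The API can also be driven directly from scripts. The sketch below is illustrative only: the /api/scans route and payload fields are assumptions, not the documented Aegis API; use the Swagger UI above for the actual routes and schemas.

```python
# Hypothetical example of starting a scan via the REST API.
# NOTE: the endpoint path and payload are placeholders; consult
# http://localhost:8888/docs for the real routes and schemas.
import requests

BASE_URL = "http://localhost:8888"

resp = requests.post(
    f"{BASE_URL}/api/scans",   # assumed route
    json={"model_type": "ollama", "model_name": "llama3", "probes": ["dan"]},
    timeout=30,
)
resp.raise_for_status()
print("Started scan:", resp.json())
```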
A typical scan workflow:

- Select Model: Choose your LLM provider and model
- Configure Probes: Select vulnerability tests to run
- Start Scan: Initiate the scan with real-time progress
- View Results: Analyze detailed HTML reports
- History: Access past scans anytime
Supported LLM providers:

- OpenAI (GPT-3.5, GPT-4, etc.)
- Anthropic (Claude)
- Ollama (Local models)
- HuggingFace
- Cohere
- Replicate
- LiteLLM
- NVIDIA NIM
Backend:

- RESTful API: Standard HTTP endpoints for all operations
- WebSocket: Real-time scan progress updates
- Async Processing: Non-blocking scan execution (sketched below)
- File System: Reads historical scans from garak's output directory
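As an illustration of the non-blocking execution model mentioned above, the snippet below shows one way a scan could be launched as an asyncio subprocess. It is a minimal sketch, not the actual backend/services code; the garak flags are standard, but the provider and probe names are just examples and may vary by garak version.

```python
# Illustrative sketch of non-blocking scan execution; not the actual
# backend/services implementation. Assumes garak is installed.
import asyncio


async def run_scan(model_type: str, model_name: str, probes: str) -> int:
    # Launch garak as a child process without blocking the event loop.
    proc = await asyncio.create_subprocess_exec(
        "python", "-m", "garak",
        "--model_type", model_type,
        "--model_name", model_name,
        "--probes", probes,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.STDOUT,
    )
    # Stream output line by line; a real service would forward these lines
    # to WebSocket subscribers as progress updates.
    while True:
        line = await proc.stdout.readline()
        if not line:
            break
        print(line.decode(errors="replace").rstrip())
    return await proc.wait()


if __name__ == "__main__":
    # Example: probe a local Ollama model with the DAN jailbreak probes.
    asyncio.run(run_scan("ollama", "llama3", "dan"))
```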
Frontend:

- Riverpod: State management
- Dio: HTTP client with interceptors
- WebSocket: Real-time progress monitoring (see the debugging sketch below)
- Material 3: Modern dark theme UI
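Independent of the Flutter client, the progress WebSocket can be exercised from any client for debugging. A minimal Python sketch using the websockets package follows; the endpoint path is an assumption, not the actual Aegis route:

```python
# Debugging sketch: watch live scan progress from outside the Flutter app.
# NOTE: the WebSocket path below is a hypothetical placeholder; check the
# backend routes for the real endpoint. Requires: pip install websockets
import asyncio

import websockets


async def watch(scan_id: str) -> None:
    uri = f"ws://localhost:8888/ws/scans/{scan_id}"  # assumed path
    async with websockets.connect(uri) as ws:
        async for message in ws:
            print("progress:", message)


if __name__ == "__main__":
    asyncio.run(watch("example-scan-id"))
```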
Troubleshooting:

Port already in use:

```bash
lsof -ti:8888 | xargs kill -9
```

Garak not found:

```bash
which garak
# Or set GARAK_PATH in .env
```

Build errors:

```bash
flutter clean
flutter pub get
flutter pub run build_runner build --delete-conflicting-outputs
```

The main dashboard provides quick access to all key features:
- Select your target model from multiple LLM providers with built-in presets
- Browse and select from hundreds of vulnerability probes organized by category
- Real-time progress tracking with detailed status information
- View all past scans with pass/fail metrics at a glance
- Comprehensive results with visualizations and metrics
Contributions are welcome! Please:
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests
- Submit a pull request
This project is a GUI wrapper for Garak. Please refer to Garak's license for the underlying scanner.
Thanks to:

- Garak by NVIDIA for the core vulnerability scanner
- Flutter and FastAPI communities for excellent frameworks





