# AIDE - AI Desktop Editor
Document Status: Living Specification
Stability Tier: Stable
Last Updated: 2026-01-27
Governing Document: SPECIFICATIONS.md
Your configurable AI bridge to local files
Before implementing, read these documents in order:
1. SPECIFICATIONS.md - Architecture, tech stack, MVP scope
2. PROVIDERS.md - AI provider configurations (models are fetched dynamically, NEVER hardcoded)
3. UI_UX_SPECIFICATION.md - UI components, design system
4. INTELLIGENCE.md - Advanced AI intelligence features (context awareness, learning, multi-model)

Key Rule: Models are fetched from provider APIs at runtime. `selectedModel` starts as `null` and there are NEVER any default models. Never hardcode model names.
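The rule above can be sketched as a tiny state shape. This is an illustrative sketch under assumed names (`ProviderModelState`, `initialModelState`, `withFetchedModels` are hypothetical, not AIDE's actual store API):

```typescript
// Illustrative sketch of the "no default models" rule: selectedModel starts
// as null and availableModels is empty until the provider API responds.
interface ProviderModelState {
  selectedModel: string | null;
  availableModels: string[];
}

function initialModelState(): ProviderModelState {
  // Never pre-populate a model name here.
  return { selectedModel: null, availableModels: [] };
}

function withFetchedModels(
  state: ProviderModelState,
  models: string[],
): ProviderModelState {
  // Models come from the provider API at runtime; the user still has to pick one.
  return { ...state, availableModels: models };
}
```

Note that fetching models does not select one: `selectedModel` stays `null` until the user chooses.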
This README is a Stable, user-facing charter governed by SPECIFICATIONS.md.
Rules:
- This document MUST NOT define architecture, command contracts, or provider behavior.
- Technical authority resides in:
  - SPECIFICATIONS.md (system law)
  - PROVIDERS.md (AI provider law)
  - UI_UX_SPECIFICATION.md (interface law)
- Any change that affects installation, security model, or distribution mechanics MUST be reflected in SPECIFICATIONS.md.
- The Last Updated field in the header MUST be maintained.
- AIDE does not modify files automatically or in the background.
- All file edits require user review and acceptance in the Diff view.
- AI actions originate from user input or user-initiated analysis sessions.
- Agents and internal orchestration are not user-visible concepts.
- Provider models are always selected by the user and fetched dynamically.
AIDE (AI Desktop Editor) is a secure, privacy-focused desktop application that allows you to chat with your choice of AI provider (OpenAI, Anthropic, Google Gemini, OpenRouter, local models, etc.) to directly read, analyze, and edit files within a controlled local workspace.
To install and use AIDE on Windows (no coding required):

1. Download the installer:
   - Go to Releases
   - Download `AIDE_<platform>_<arch>.msi` (5-15 MB)
2. Install:
   - Double-click the downloaded `.msi` file
   - Follow the installation wizard
   - Click "Install"
3. Run:
   - Find AIDE in your Start Menu
   - Double-click to launch
   - Configure your AI provider (OpenAI, Anthropic, etc.)
   - Start chatting with AI about your code!

System Requirements:
- Windows 10 or Windows 11 (64-bit)
- 100 MB free disk space
- Internet connection (for AI providers)
- No programming tools or Node.js required!
AI Development Note: This README, along with the SPECIFICATIONS.md, PROVIDERS.md, and UI_UX_SPECIFICATION.md files, serves as the complete blueprint for AI-assisted development. Use these documents as a unified set. Do not implement from partial excerpts, as architectural and security rules are defined across all governing files.
- 🤖 Multi-Provider AI Hub: Connect and configure 44 AI connections (31 HTTP providers: 29 cloud + 2 local, plus 13 CLI agents) in one interface
- 🧠 Advanced Intelligence: Context-aware AI that analyzes your project and adapts to your preferences during user-initiated sessions
- 🔒 Secure File Operations: Read and edit files with mandatory user confirmation
- 💾 Local-First & Private: Your API keys and file data stay on your machine
- 🛠️ Developer-Friendly: Works with your existing projects and workflows
- 🏝️ Workspace Sandboxing: Strict file access limited to user-selected directories
- 🎯 Guided Assistance: AI analyzes and suggests improvements when you request or start an analysis session
- 🧠 Advanced Task Decomposition: The system can break complex requests into structured steps internally
| Component | Technology | Why Chosen |
|---|---|---|
| Desktop Framework | Electron 25 (Chromium 119, Node 20) | Production-proven, stable, battle-tested |
| AI HTTP Client | Custom fetch/streaming implementation | Lightweight, provider-agnostic, supports streaming |
| Frontend UI | React 18.2 + TypeScript 5 + Tailwind CSS 3 | Modern, type-safe UI with utility-first styling |
| UI Components | shadcn/ui v2 | Accessible, customizable components |
| State Management | Zustand 4 + TanStack Query v5 | Lightweight state + robust server-state caching |
| Local Database | Better-SQLite3 + Drizzle ORM v0.36 | Fastest SQLite for Node.js with type-safe ORM |
| Code Editor | Monaco Editor | VS Code-grade editing for diff viewer |
| Dev Tools | Vite 5 + Biome 2.0 + Vitest 3 + Playwright 2 | Fast builds, linting/formatting, testing |
CLI agents: AIDE also supports 13 CLI-based AI tools (Aider, Copilot CLI, etc.). See the CLI Agent Integration section of PROVIDERS.md for details.
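As a rough illustration of what the custom streaming client in the table above has to handle: most HTTP providers stream Server-Sent-Events lines of the form `data: {...}`, terminated by a `data: [DONE]` sentinel. The function below is a hedged sketch of that one parsing step, not AIDE's actual implementation:

```typescript
// Extract JSON payload strings from one SSE chunk. Provider-agnostic only in
// the sense that it assumes the common `data: ...` / `data: [DONE]` framing.
function parseSseChunk(chunk: string): string[] {
  const payloads: string[] = [];
  for (const line of chunk.split("\n")) {
    const trimmed = line.trim();
    if (!trimmed.startsWith("data:")) continue; // skip comments and blank lines
    const data = trimmed.slice("data:".length).trim();
    if (data === "[DONE]") break; // end-of-stream sentinel
    payloads.push(data);
  }
  return payloads;
}
```

A real client would additionally buffer partial lines across network chunks and JSON-parse each payload.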
Note: This section is only for developers who want to modify AIDE's source code. End users can skip this entirely.
- Node.js 20+ (for frontend and Electron main process)
- pnpm (recommended package manager)
- AI provider API keys (OpenAI, Anthropic, etc.)
```sh
# Clone and setup
git clone https://github.com/yourusername/aide-desktop.git
cd aide-desktop

# Install dependencies (using pnpm)
pnpm install

# Initialize database
pnpm db:push

# Start development
pnpm electron:dev
```

```sh
pnpm dev              # Start Vite dev server
pnpm electron:dev     # Start Electron development
pnpm build            # Build for production
pnpm electron:build   # Build production executable (.exe/.msi)
pnpm test             # Run Vitest tests
pnpm test:e2e         # Run Playwright tests
pnpm lint             # Run Biome linter
pnpm format           # Format with Biome
pnpm db:push          # Push database schema
pnpm db:studio        # Open Drizzle Studio
```

```sh
# Build Windows installer and portable executable
pnpm electron:build

# Output location:
# dist/AIDE_<platform>_<arch>.msi        (Installer)
# dist/AIDE_<platform>_<arch>-setup.exe  (NSIS Installer)
# dist/AIDE.exe                          (Portable)
```

After building, you get multiple distribution options:
| File | Type | Size | Use Case |
|---|---|---|---|
| `AIDE_<platform>_<arch>.msi` | Windows Installer | 100-200 MB | Standard installation with shortcuts |
| `AIDE_<platform>_<arch>-setup.exe` | NSIS Installer | 100-200 MB | Custom branded installer |
| `AIDE.exe` | Portable Executable | 100-200 MB | Run without installation (USB, testing) |
For End Users (No Development Tools Required):

- MSI Installer (Recommended):
  - Download `AIDE_<platform>_<arch>.msi`
  - Double-click to run the installer
  - Follow the installation wizard
  - App appears in the Start Menu
- Portable Executable:
  - Download `AIDE.exe`
  - Double-click to run
  - No installation needed
- System Requirements:
  - Windows 10/11 (64-bit)
  - 100 MB disk space
  - No Node.js, npm, or development tools required
```sh
# Create GitHub Release with installers
gh release create <release-tag> \
  dist/*.msi \
  dist/*.exe \
  --title "AIDE Release" \
  --notes "Release notes"

# Or manually upload to:
# - GitHub Releases
# - Company file server
# - Download page
```

To avoid "Unknown Publisher" warnings, sign the build with a code-signing certificate:

```yaml
# electron-builder.yml
win:
  certificateFile: path/to/cert.p12
  certificatePassword: CERT_PASSWORD
```

AIDE uses Electron 25 (Chromium 119, Node 20) for its production-proven stability and extensive ecosystem:
| Feature | Electron 25 (Chromium 119, Node 20) | Notes |
|---|---|---|
| Platform Support | Windows, macOS, Linux | True cross-platform |
| Chromium Version | 119 | Latest web features, secure |
| Node.js Version | 20 | LTS with native module support |
| Mature Ecosystem | Extensive | Large community, battle-tested |
| Used By | VS Code, Slack, Discord, Claude Desktop | Industry standard |
| Auto-Updates | electron-updater | Built-in support |
Key Advantages:
- ✅ Production-Proven - Used by millions of applications worldwide
- ✅ Full Node.js Access - Direct filesystem, native modules
- ✅ Modern Web Stack - React + TypeScript + Tailwind CSS
- ✅ Mature Tooling - electron-builder, auto-updaters, debug tools
- ✅ Consistent Experience - Same runtime across all platforms
Scenario 1: Enterprise Internal Tool

```sh
# Build once on CI/CD
pnpm electron:build

# Deploy to company network share
\\company-server\apps\AIDE_<platform>_<arch>.msi

# Employees install via Group Policy or self-service
# No Node.js or dev tools needed on employee machines
```

Scenario 2: Public Distribution

```sh
# Build and release
pnpm electron:build
gh release create <release-tag> dist/*.msi

# Users download from GitHub/website and install
# Works like any commercial Windows application
```

Scenario 3: Portable USB Distribution

```sh
# Build portable executable
pnpm electron:build

# Copy to USB drive
cp dist/AIDE.exe E:\Tools\

# Run on any Windows PC without installation
# Perfect for contractors, demos, or restricted environments
```

- First Launch: App opens to Settings page
- Add an AI provider:
  - Select provider type (OpenRouter, Groq, Anthropic, etc.)
  - Enter API endpoint and key
  - Click "Test Connection" to verify
- Select a Model:
  - App fetches available models from the provider's API
  - Choose your preferred model from the dropdown
- Select Workspace: Choose a folder for file operations
- Start Chatting: Ask the AI to help with your files!
Note: Models are fetched dynamically from each provider's API. See PROVIDERS.md for details.
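As a sketch of that flow: the `/models` endpoint shape below is a common OpenAI-compatible convention, not a guarantee for every provider, and PROVIDERS.md remains authoritative for per-provider details:

```typescript
// Map a typical OpenAI-compatible model-list response into dropdown entries.
interface ModelListResponse {
  data: { id: string }[];
}

function toModelIds(response: ModelListResponse): string[] {
  return response.data.map((model) => model.id);
}

// Fetch models at runtime; nothing is hardcoded and nothing is pre-selected.
async function fetchModels(baseUrl: string, apiKey: string): Promise<string[]> {
  const res = await fetch(`${baseUrl}/models`, {
    headers: { Authorization: `Bearer ${apiKey}` },
  });
  if (!res.ok) throw new Error(`Model fetch failed: HTTP ${res.status}`);
  return toModelIds((await res.json()) as ModelListResponse);
}
```

The fetched list only populates the dropdown; the user still makes the final selection.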
See PROVIDERS.md for complete setup details for all supported AI services.
User: "Read the main.py file in my project and suggest improvements"
AIDE: [Reads file, analyzes with AI, provides suggestions]
User: "Add error handling to the calculate() function"
AIDE: [Shows diff of proposed changes → User Accepts/Rejects → Applies edits]
User: "Explain how this function works"
AIDE: [Analyzes code and provides detailed explanation]
User: "Create a new file called utils.js with helper functions"
AIDE: [Creates file proposal → User reviews → Accepts/Rejects]
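The review-gated flow in these examples reduces to a small state transition: a proposal is only ever applied after explicit acceptance. This is an illustrative sketch with hypothetical type names, not AIDE's internal API:

```typescript
type ProposalStatus = "pending" | "accepted" | "rejected";

interface EditProposal {
  filePath: string;
  proposedContent: string;
  status: ProposalStatus;
}

// The only way out of "pending" is an explicit user decision in the Diff view,
// and a resolved proposal can never be re-resolved.
function resolveProposal(proposal: EditProposal, accepted: boolean): EditProposal {
  if (proposal.status !== "pending") {
    throw new Error("Proposal has already been resolved");
  }
  return { ...proposal, status: accepted ? "accepted" : "rejected" };
}
```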
```
aide-desktop/
├── src/                      # Frontend (React/TypeScript)
│   ├── components/           # UI Components
│   │   ├── ui/               # shadcn/ui components (Button, Card, Badge)
│   │   ├── chat/             # Chat interface components
│   │   ├── diff/             # Diff viewer components (Monaco Editor)
│   │   ├── providers/        # Provider selector components
│   │   ├── intelligence/     # Intelligence features UI
│   │   ├── sidebar/          # File tree, activity log, memory browser
│   │   └── layout/           # Header, status bar, command palette
│   ├── lib/
│   │   ├── ai/               # AI HTTP client, model discovery, control plane
│   │   ├── intelligence/     # Advanced AI features (memory, task orchestration, etc.)
│   │   ├── cli/              # CLI agent execution and detection
│   │   ├── db/               # Drizzle ORM schema & queries
│   │   └── utils/            # Helper utilities (file, diff, vector)
│   ├── stores/               # Zustand state stores (provider, workspace, intelligence)
│   ├── hooks/                # Custom React hooks (AI, intelligence)
│   └── main.tsx              # App entry point
├── electron/                 # Electron main process
│   ├── main/
│   │   ├── index.js          # Main entry point
│   │   ├── ipc-handlers/     # IPC handlers (file-ops, keychain, cli-agents, intelligence)
│   │   └── config.js         # App configuration
│   └── preload/
│       └── index.js          # IPC bridge
├── tests/                    # Test files
│   ├── unit/                 # Vitest unit tests (stores, utils, intelligence)
│   └── e2e/                  # Playwright E2E tests (chat, file-ops, complex-task)
├── drizzle/                  # Database migrations (including intelligence tables)
├── SPECIFICATIONS.md         # Complete project blueprint
├── PROVIDERS.md              # AI provider configurations (44 total connections)
├── UI_UX_SPECIFICATION.md    # Design & interface specs
├── INTELLIGENCE.md           # Advanced AI intelligence features
└── README.md                 # This file
```
| Principle | Implementation |
|---|---|
| File Sandboxing | All operations confined to user-selected workspace only |
| Credential Safety | API keys encrypted in Windows Credential Manager |
| Explicit Consent | All file edits require manual approval via diff view - no auto-apply |
| No Telemetry | Zero data collection - app only communicates with user-configured AI providers |
| No Hardcoded Secrets | No API keys, endpoints, or model names hardcoded in source |
Exception: Required API Headers: OpenRouter requires `HTTP-Referer` and `X-Title` headers for API functionality and ranking. These contain only app identification data (`https://aide-app.com`, `AIDE Desktop Editor`), not user behavioral data or file content. This is the only allowed exception to the zero-data-collection policy.
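The file-sandboxing row above boils down to a path-containment check: resolve every requested path against the workspace root and reject anything that escapes it. A minimal sketch, using POSIX path semantics for determinism (function name is illustrative; a real implementation would also resolve symlinks):

```typescript
import * as path from "node:path";

// True only when `requested` resolves to a location inside `workspaceRoot`.
function isInsideWorkspace(workspaceRoot: string, requested: string): boolean {
  const root = path.posix.resolve(workspaceRoot);
  const target = path.posix.resolve(root, requested);
  const relative = path.posix.relative(root, target);
  // Escaping paths produce a relative path that starts with "..".
  return (
    relative === "" ||
    (!relative.startsWith("..") && !path.posix.isAbsolute(relative))
  );
}
```

Every file-operation IPC handler can gate on a check like this before touching the filesystem.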
- ✅ Multi-provider AI configuration (44 AI connections: 29 cloud HTTP + 2 local HTTP + 13 CLI agents)
- ✅ Dynamic model fetching from provider APIs
- ✅ Secure file operations with diff viewer
- ✅ Workspace sandboxing
- ✅ API keys stored in OS keychain
- ✅ CLI agent integration (Aider, Copilot CLI, etc.)
- ✅ Advanced intelligence features
- ✅ Context-aware AI assistance
| Document | Description |
|---|---|
| README.md | This file - quick start and overview |
| SPECIFICATIONS.md | Complete technical specification, architecture, feature set |
| PROVIDERS.md | AI provider configurations: 44 total (29 cloud HTTP + 2 local HTTP + 13 CLI agents) |
| UI_UX_SPECIFICATION.md | UI components, design system, user flows |
| INTELLIGENCE.md | Advanced AI intelligence features: context awareness, learning, multi-model routing |
For AI Agents: Always read SPECIFICATIONS.md first. Models are NEVER hardcoded - they are fetched from provider APIs at runtime.
1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing`)
3. Commit your changes (`git commit -m 'Add amazing feature'`)
4. Push to the branch (`git push origin feature/amazing`)
5. Open a Pull Request
Distributed under the MIT License. See LICENSE for more information.
- Documentation: Check SPECIFICATIONS.md for detailed specs
- Issues: GitHub Issues
- Discussions: GitHub Discussions
Built with ❤️ for developers who want AI assistance without compromising privacy.
```sh
# Optional environment variables for advanced users
AIDE_LOG_LEVEL=debug         # Enable debug logging
AIDE_CACHE_DIR=/custom/path  # Custom cache directory
AIDE_MAX_FILE_SIZE=10MB      # Maximum file size for processing
AIDE_CONCURRENT_REQUESTS=3   # Max simultaneous AI requests
AIDE_OFFLINE_MODE=true       # Force offline mode for testing
```

```sh
# User-level configuration
~/.aide/config.json       # Global settings
~/.aide/providers.json    # Provider configurations
~/.aide/keybindings.json  # Custom keyboard shortcuts

# Project-level configuration
.aide/workspace.json      # Workspace-specific settings
.aide/ignore              # Files to exclude from AI context
.aide/templates/          # Custom code templates
```

```sh
# CLI commands for power users
aide --provider openrouter --model "anthropic/claude-3-5-sonnet" --file src/main.ts
aide --batch-process src/                   # Process entire directory
aide --export-conversation conversation.md  # Export chat history
aide --import-settings settings.json        # Import configuration
aide --health-check                         # System diagnostics
```

| Component | Minimum | Recommended | Optimal |
|---|---|---|---|
| RAM | 4 GB | 8 GB | 16 GB+ |
| CPU | 2 cores | 4 cores | 8 cores+ |
| Storage | 1 GB | 5 GB | 10 GB+ |
| Network | 1 Mbps | 10 Mbps | 100 Mbps+ |
- Large Projects: Use `.aide/ignore` to exclude `node_modules` and build artifacts
- Slow Networks: Enable caching with `AIDE_CACHE_ENABLED=true`
- Memory Usage: Limit concurrent requests with `AIDE_CONCURRENT_REQUESTS=1`
- Battery Life: Use local providers (Ollama) when on battery power
- API keys stored in OS keychain (never in files)
- Workspace sandboxing enabled
- Network traffic encrypted (HTTPS only)
- No telemetry or data collection (except required API headers for provider functionality)
- Regular security updates enabled
- Local-first data processing
| Feature | Implementation | Benefit |
|---|---|---|
| Local Processing | Code analysis runs locally | Your code never leaves your machine |
| Encrypted Storage | OS keychain for API keys | OS-level credential protection |
| No Telemetry | Zero data collection (except required API headers for provider functionality) | Complete privacy |
| Workspace Isolation | Sandboxed file access | Protection from unauthorized access |
```typescript
// Available metrics for personal productivity tracking
interface PersonalMetrics {
  usage: {
    requestsPerDay: number;
    tokensConsumed: number;
    costEstimate: number;
  };
  performance: {
    averageResponseTime: number;
    successRate: number;
    errorRate: number;
  };
  productivity: {
    filesModified: number;
    linesGenerated: number;
    timesSaved: number;
    skillsLearned: string[];
  };
}
```

- Daily Coding Stats: Track your development velocity
- AI Effectiveness: See which suggestions help most
- Learning Progress: Monitor skill development
- Cost Tracking: Keep tabs on API usage costs
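For example, `costEstimate` can be derived from `tokensConsumed` and a per-million-token price. The price below is a made-up placeholder; real pricing comes from your configured provider:

```typescript
// Rough cost estimate: tokens consumed times a per-million-token price.
function estimateCost(tokensConsumed: number, pricePerMillionTokens: number): number {
  return (tokensConsumed / 1_000_000) * pricePerMillionTokens;
}

// e.g. 250,000 tokens at a hypothetical $3 per million tokens:
// estimateCost(250_000, 3) === 0.75
```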
AIDE is intentionally designed as a personal AI coding assistant. We have deliberately excluded the following features to maintain focus on individual developer productivity:
- Enterprise/Team Features: Shared workspaces, team management, organization controls
- Mobile Applications: Mobile companion apps, cross-platform mobile support
- Compliance Systems: SOC 2, HIPAA, enterprise audit trails, SSO integration
- Team Collaboration: Multi-user workspaces, shared provider pools, team analytics
- Enterprise Integrations: CI/CD pipelines, monitoring systems (Prometheus, Grafana), enterprise SSO
- Real-time Pair Programming: AI watching and suggesting as you type in real-time
- Plugin System: Third-party extensions, custom plugins, marketplace integrations
- Advanced Collaboration: Team features, shared workspaces, multi-user environments
- Visual Understanding (Vision): Image analysis, screenshot understanding, visual code inspection, image input capabilities
- Voice Interface: Speech-to-text, text-to-speech, voice commands, audio input/output
AIDE focuses on making individual developers incredibly productive with AI assistance while maintaining:
- Privacy: Your code stays on your machine
- Simplicity: No enterprise complexity or overhead
- Performance: Optimized for personal workflows
- Security: Personal-grade security without enterprise bureaucracy
"AIDE is built for developers who want the most advanced AI coding assistance without enterprise complexity."
If you need enterprise features, consider solutions like GitHub Copilot Enterprise, Cursor Pro, or Replit Teams. AIDE excels at being your personal AI coding companion.
Built for individual developers who want powerful, private AI assistance.