RiceCoder (rice) is a terminal-first, spec-driven coding assistant that understands your project before generating code. Unlike other AI coding tools, RiceCoder follows a research-first approach: it analyzes your codebase, understands your patterns, and generates code that fits your project's style.
- 🔬 Research-First - Analyzes your project context before generating code
- 📋 Spec-Driven - Systematic, repeatable development from specifications
- 💻 Terminal-Native - Beautiful CLI/TUI that works anywhere
- 🔒 Offline-First - Local models via Ollama for privacy and offline work
- 🤖 Multi-Agent - Specialized agents for different tasks
- 🎨 Multi-Provider - OpenAI, Anthropic, Ollama, and more
- 📊 Token Tracking - Real-time token usage monitoring with cost estimation
- 🚀 Project Bootstrap - Automatic project detection and configuration
- 🎯 Session Management - Persistent sessions with token-aware message handling
- 🔧 Dependency Injection - Modular architecture with service container for clean component wiring
- 🔍 RiceGrep Integration - AI-enhanced code search with embedded models and ripgrep compatibility
Choose your preferred installation method:
| Method | Command | Best For |
|---|---|---|
| Cargo | `cargo install ricecoder` | Rust developers, easy updates |
| Curl | `curl -fsSL https://raw.githubusercontent.com/moabualruz/ricecoder/main/scripts/install \| bash` | Quick setup, any platform |
| Docker | `docker pull moabualruz/ricecoder:latest` | Isolated environments |
| npm | `npm install -g ricecoder` | Node.js developers |
| Homebrew | `brew install ricecoder` | macOS users |
| Scoop | `scoop install ricecoder` | Windows users |
| Winget | `winget install RiceCoder.RiceCoder` | Windows users |
| From Source | `git clone ... && ./scripts/install.sh` | Development, customization |
Install from crates.io:
# Install
cargo install ricecoder
# Verify
rice --version
# Update
cargo install --force ricecoder
Requirements: Rust 1.75+ (Install Rust)
Platforms: Linux, macOS, Windows
Build and install from source with a single command:
# Linux/macOS - Standard installation
curl -fsSL https://raw.githubusercontent.com/moabualruz/ricecoder/main/scripts/install | bash
# With custom prefix
curl -fsSL https://raw.githubusercontent.com/moabualruz/ricecoder/main/scripts/install | bash -s -- --prefix /usr/local
# Debug build
curl -fsSL https://raw.githubusercontent.com/moabualruz/ricecoder/main/scripts/install | bash -s -- --debug
# Verify
rice --version
Features:
- Detects OS and architecture automatically
- Builds from source automatically
- Verifies prerequisites (Rust, Cargo, Git)
- Automatic Rust toolchain update
- Installs to custom prefix or default location
- Automatic installation verification
Platforms: Linux (x86_64, ARM64), macOS (Intel, Apple Silicon)
Troubleshooting: See Installation Troubleshooting
Run in a containerized environment:
# Pull image
docker pull moabualruz/ricecoder:latest
# Run
docker run --rm moabualruz/ricecoder:latest --version
# Run with workspace access
docker run -it -v $(pwd):/workspace moabualruz/ricecoder:latest chat
Features:
- Isolated environment
- No system dependencies
- Consistent across platforms
- Easy cleanup
Platforms: Any platform with Docker
Requirements: Docker (Install Docker)
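If you want your global configuration to persist between container runs, one option is to also mount the `~/.ricecoder` directory (used elsewhere in this guide for global config). This is a sketch, not an officially documented invocation; the in-container path is an assumption:

```bash
# Sketch: persist global config between runs.
# Assumes the container user is root and reads config from /root/.ricecoder;
# ~/.ricecoder is the host-side global config dir mentioned in the uninstall section.
docker run -it \
  -v $(pwd):/workspace \
  -v ~/.ricecoder:/root/.ricecoder \
  moabualruz/ricecoder:latest chat
```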
Install via npm registry:
# Install globally
npm install -g ricecoder
# Verify
rice --version
# Update
npm install -g ricecoder@latest
Features:
- Familiar npm workflow
- Easy version management
- Works with Node.js projects
Platforms: Linux, macOS, Windows
Requirements: Node.js 14+ and npm
Install via Homebrew:
# Install
brew install ricecoder
# Verify
rice --version
# Update
brew upgrade ricecoder
Features:
- Native macOS package manager
- Easy updates and uninstall
- Integrates with system
Platforms: macOS
Requirements: Homebrew (Install Homebrew)
Install via Scoop:
# Install
scoop install ricecoder
# Verify
rice --version
# Update
scoop update ricecoder
Features:
- Native Windows package manager
- Easy updates and uninstall
- Integrates with Windows
Platforms: Windows
Requirements: Scoop (Install Scoop)
Install via Windows Package Manager:
# Install
winget install RiceCoder.RiceCoder
# Verify
rice --version
# Update
winget upgrade RiceCoder.RiceCoder
Features:
- Official Microsoft package manager
- Enterprise-friendly
- Integrated with Windows
Platforms: Windows
Requirements: Windows Package Manager (winget) - included with Windows 10/11
Clone and build locally with automatic installation:
Using Installation Scripts (Recommended):
# Clone repository
git clone https://github.com/moabualruz/ricecoder.git
cd ricecoder
# Linux/macOS - Automatic installation
chmod +x scripts/install.sh
./scripts/install.sh
# Windows (PowerShell)
.\scripts\install.ps1
# Windows (CMD)
scripts\install.bat
Installation Script Options:
# Linux/macOS
./scripts/install.sh --prefix /usr/local # Custom prefix
./scripts/install.sh --debug # Debug build
./scripts/install.sh --verbose # Verbose output
# Windows (PowerShell)
.\scripts\install.ps1 -Prefix "C:\Program Files\ricecoder"
.\scripts\install.ps1 -Debug
Manual Build and Install:
# Clone repository
git clone https://github.com/moabualruz/ricecoder.git
cd ricecoder
# Build and install
cargo install --path projects/ricecoder
# Verify
rice --version
Features:
- Latest development version
- Customizable build (release/debug)
- Full source access
- Automatic prerequisite checking
- Automatic Rust toolchain update
- Automatic PATH configuration
Platforms: Linux, macOS, Windows
Requirements: Rust 1.75+, Git, C compiler
See Also: Installation Guide - Comprehensive build and installation documentation
| Platform | Arch | Cargo | Curl | Docker | npm | Homebrew | Scoop | Winget | Source |
|---|---|---|---|---|---|---|---|---|---|
| Linux | x86_64 | ✅ | ✅ | ✅ | ✅ | ✅ | ❌ | ❌ | ✅ |
| Linux | ARM64 | ✅ | ✅ | ✅ | ✅ | ✅ | ❌ | ❌ | ✅ |
| macOS | Intel | ✅ | ✅ | ✅ | ✅ | ✅ | ❌ | ❌ | ✅ |
| macOS | Apple Silicon | ✅ | ✅ | ✅ | ✅ | ✅ | ❌ | ❌ | ✅ |
| Windows | x86_64 | ✅ | ❌ | ✅ | ✅ | ❌ | ✅ | ✅ | ✅ |
| Windows | ARM64 | ✅ | ❌ | ✅ | ✅ | ❌ | ✅ | ✅ | ✅ |
Legend: ✅ Supported | ❌ Not available
Minimum:
- OS: Linux, macOS, or Windows
- RAM: 512 MB
- Disk: 100 MB
Recommended:
- OS: Linux (Ubuntu 18.04+), macOS (10.13+), Windows 10+
- RAM: 2 GB
- Disk: 500 MB
- Terminal: Modern terminal emulator (iTerm2, Windows Terminal, GNOME Terminal)
For Building from Source (setup commands shown below):
- Rust 1.75+ (Install Rust)
- Git
- C compiler (gcc, clang, or MSVC)
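If any of these are missing, the commands below install them on common platforms. The apt and Xcode lines mirror the build troubleshooting section later in this guide; the rustup one-liner is the standard upstream Rust installer, not something specific to RiceCoder:

```bash
# Ubuntu/Debian: C compiler, pkg-config, OpenSSL headers, Git
sudo apt-get install build-essential pkg-config libssl-dev git

# macOS: Xcode command-line tools (provides clang and git)
xcode-select --install

# Any platform: install or update the Rust toolchain (provides cargo)
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
rustup update
```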
After installation, verify it works:
# Check version
rice --version
# Show help
rice --help
# Initialize project
rice init
# Test connection (if configured)
rice chat --test
Solution:
- Restart your terminal
- Check PATH: `echo $PATH`
- Verify installation: `which rice` or `where rice` (Windows)
- Reinstall if needed
See Installation Setup Guide for detailed troubleshooting.
Solution:
- Check file permissions: `ls -la ~/.cargo/bin/rice`
- Fix permissions: `chmod +x ~/.cargo/bin/rice`
- Ensure the directory is in PATH
Solution:
- Re-run installation script
- Check network connection
- Try alternative installation method
Solution:
- Pull the image: `docker pull moabualruz/ricecoder:latest`
- Check that Docker is running: `docker ps`
- Verify internet connection
Remove RiceCoder:
# Cargo
cargo uninstall ricecoder
# npm
npm uninstall -g ricecoder
# Homebrew
brew uninstall ricecoder
# Docker
docker rmi moabualruz/ricecoder:latest
# From source
cd ricecoder && cargo uninstall ricecoder
Remove configuration:
# Remove global config
rm -rf ~/.ricecoder/
# Remove project config
rm -rf .agent/
RiceCoder is a terminal-first, spec-driven coding assistant that understands your project before generating code. Unlike traditional AI coding tools, RiceCoder analyzes your codebase, understands your patterns, and generates code that fits your project's style and architecture.
- Research-First: Analyzes your project context before generating code
- Spec-Driven: Systematic development from specifications
- Terminal-Native: Beautiful CLI/TUI that works anywhere
- Offline-First: Local models via Ollama for privacy
- Multi-Agent: Specialized agents for different tasks
- Enterprise-Ready: SOC 2 compliance, audit logging, RBAC
RiceCoder is licensed under CC BY-NC-SA 4.0, making it free for personal and non-commercial use. Commercial use requires a separate license.
RiceCoder supports all major programming languages including Rust, Python, JavaScript/TypeScript, Go, Java, C++, and more. It uses LSP (Language Server Protocol) for comprehensive language support.
- Cargo: Best for Rust developers, easy updates
- Curl: Quick setup for any platform
- Docker: Isolated environments
- Package Managers: Homebrew (macOS), Scoop (Windows), Winget (Windows), npm
- From Source: For development or customization
- Restart your terminal
- Check PATH: `echo $PATH`
- Verify installation: `which rice` (Unix) or `where rice` (Windows)
- Reinstall if needed
# Configure OpenAI
rice provider config openai --api-key sk-your-key
# Configure Anthropic
rice provider config anthropic --api-key sk-ant-your-key
# Test connection
rice provider test openai
# Set default provider
rice provider default openai
Yes! RiceCoder supports Ollama for offline-first development:
# Install Ollama
curl -fsSL https://ollama.ai/install.sh | sh
# Start server
ollama serve
# Pull a model
ollama pull llama2:13b
# Configure RiceCoder
rice provider config ollama --base-url http://localhost:11434
# Initialize project
rice init
# Start interactive chat
rice chat
# Generate code from spec
rice gen --spec my-feature.spec.md
# Review code
rice review src/main.rs
Specs are Markdown or YAML files that describe what you want to build. Example:
# User Authentication System
## Requirements
- User registration with email verification
- JWT-based authentication
- Password hashing with bcrypt
- Role-based access control
## API Endpoints
- POST /auth/register
- POST /auth/login
- GET /auth/me
- POST /auth/logout
# Create shareable session
rice chat --share
# Share with team
rice session share --team my-team
# Join shared session
rice session join https://ricecoder.app/s/session-id
MCP (Model Context Protocol) allows RiceCoder to connect to external tools. Basic setup:
# Install MCP servers
npm install -g @modelcontextprotocol/server-filesystem
# Configure in RiceCoder
rice mcp add filesystem --command npx --args "-y,@modelcontextprotocol/server-filesystem,/workspace"
# Start MCP servers
rice mcp start
# Use in chat
rice chat --mcp
Common causes:
- Large codebase analysis - use the `--focus` flag
- Slow provider - try a different model or provider
- Network issues - check connection
- Resource constraints - check memory/CPU usage
# Run performance validation
./scripts/run-performance-validation.sh
# Profile specific operations
rice profile chat --duration 30s
# Optimize provider settings
rice provider optimize openai --metric latency
# Monitor resources
rice monitor resources
Solutions:
- Reduce request frequency
- Switch to different provider
- Upgrade API plan
- Use local models (Ollama)
# Enable debug logging
export RUST_LOG=debug
rice chat
# Check system health
rice doctor
# View logs
rice logs --tail 100
# Validate configuration
rice config validate
Yes, RiceCoder includes SOC 2 Type II compliance features:
- Comprehensive audit logging
- Customer-managed encryption keys
- Access control and RBAC
- Security monitoring and threat detection
# Enable enterprise features
rice enterprise enable
# Configure audit logging
rice audit enable
# Set up RBAC
rice rbac configure
# Configure encryption
rice encryption setup
Yes, RiceCoder supports proxy configuration and can work behind corporate firewalls. Configure proxy settings in your provider configuration.
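For example, the same settings used in the networking troubleshooting section can be applied up front (the proxy host and port below are placeholders for your environment):

```bash
# Route provider traffic through a corporate proxy (placeholder host/port)
rice config set proxy http://proxy.company.com:8080
rice config set https_proxy http://proxy.company.com:8080

# Bypass the proxy for local addresses
rice config set no_proxy localhost,127.0.0.1
```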
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests
- Submit a pull request
See CONTRIBUTING.md for detailed guidelines.
Use GitHub Issues: https://github.com/moabualruz/ricecoder/issues
Include:
- RiceCoder version (`rice --version`)
- Operating system
- Steps to reproduce
- Expected vs actual behavior
- Logs (`rice logs`)
- Discord: https://discord.gg/BRsr7bDX
- GitHub Discussions: https://github.com/moabualruz/ricecoder/discussions
- Documentation: https://github.com/moabualruz/ricecoder/wiki
Unix/Linux/macOS:
# Fix permissions
chmod +x ~/.cargo/bin/rice
# Or reinstall with proper permissions
curl -fsSL https://raw.githubusercontent.com/moabualruz/ricecoder/main/scripts/install | sudo bash
Windows:
- Run terminal as Administrator
- Check User Account Control settings
- Use Scoop or Winget for easier installation
# Update Rust
rustup update
# Install specific version
rustup install 1.75
# Set default version
rustup default 1.75
# Clean build artifacts
cargo clean
# Update dependencies
cargo update
# Build with verbose output
cargo build --verbose
# Check for missing system dependencies
# Ubuntu/Debian
sudo apt-get install build-essential pkg-config libssl-dev
# macOS
xcode-select --install
# Windows
# Install Visual Studio Build Tools
# Validate configuration
rice config validate
# Test provider connection
rice provider test openai
# Check API key format
echo $OPENAI_API_KEY | head -c 10 # Should start with sk-
# Reset provider config
rice provider reset openai
# Check MCP server status
rice mcp status
# Restart MCP servers
rice mcp restart
# Debug MCP connections
rice mcp debug
# Check server logs
rice mcp logs filesystem
# Check session storage
rice session list
# Validate session files
rice session validate
# Reset session storage
rice session reset
# Check disk space
df -h
# Monitor memory usage
rice monitor memory
# Clear caches
rice cache clear
# Reduce concurrent sessions
rice config set max_concurrent_sessions 3
# Use smaller models
rice provider config ollama --model phi:2.7b
# Check network latency
ping api.openai.com
# Test provider performance
rice provider benchmark openai
# Switch to faster provider
rice provider default anthropic
# Enable caching
rice cache enable
# Enable crash reporting
rice config set crash_reporting enabled
# Run with debug symbols
RUST_BACKTRACE=1 rice chat
# Check system resources
rice doctor
# Update to latest version
rice update
# Set HTTP proxy
rice config set proxy http://proxy.company.com:8080
# Set HTTPS proxy
rice config set https_proxy http://proxy.company.com:8080
# Bypass proxy for local
rice config set no_proxy localhost,127.0.0.1
# Disable SSL verification (not recommended for production)
rice config set ssl_verify false
# Use custom CA certificates
rice config set ca_certs /path/to/ca-bundle.crt
# Check certificate validity
openssl s_client -connect api.openai.com:443
# Test connectivity
rice network test api.openai.com
# Check firewall rules
# Linux
sudo ufw status
# Windows
netsh advfirewall show allprofiles
# macOS
sudo pfctl -s rules
# Check API key
rice provider test openai
# Check rate limits
rice provider limits openai
# Switch to different model
rice provider config openai --model gpt-3.5-turbo
# Check account status
# Visit https://platform.openai.com/account
# Test connection
rice provider test anthropic
# Check API key format (should start with sk-ant-)
echo $ANTHROPIC_API_KEY | head -c 10
# Try different model
rice provider config anthropic --model claude-3-haiku-20240307
# Check Ollama status
ollama list
# Restart Ollama
ollama serve
# Pull model again
ollama pull llama2:13b
# Check system resources
# Ollama needs significant RAM for larger models
# Run performance benchmarks
rice benchmark run
# Profile specific command
rice profile "gen --spec my-spec.md" --output profile.json
# Monitor system resources
rice monitor resources --interval 5
# Generate performance report
rice report performance --output perf-report.md
# Check memory usage
rice monitor memory
# Clear all caches
rice cache clear all
# Reduce session history
rice config set max_session_history 100
# Use streaming responses
rice config set streaming enabled
# Check disk usage
rice monitor disk
# Clean old sessions
rice session cleanup --older-than 90d
# Compress session data
rice session compress
# Move cache to faster storage
rice config set cache_dir /fast/ssd/cache
# Enable detailed logging
export RUST_LOG=ricecoder=debug
# Log to file
rice chat 2>&1 | tee ricecoder-debug.log
# Filter specific components
export RUST_LOG=ricecoder::mcp=trace,ricecoder::providers=debug
# View recent logs
rice logs --since 1h
# Run full system check
rice doctor --full
# Check dependencies
rice doctor --deps
# Validate all configurations
rice doctor --config
# Generate diagnostic report
rice doctor --report diagnostic.md
# Backup current configuration
rice config backup --output config-backup.yaml
# Reset to defaults
rice config reset
# Restore from backup
rice config restore config-backup.yaml
# Emergency cleanup
rice emergency cleanup
If you can't resolve an issue:
- Check the documentation: https://github.com/moabualruz/ricecoder/wiki
- Search existing issues: https://github.com/moabualruz/ricecoder/issues
- Ask the community: https://discord.gg/BRsr7bDX
- File a bug report: Include version, OS, steps to reproduce, and logs
For urgent enterprise support, contact enterprise@ricecoder.com
After installation, initialize your first project:
# Initialize project
rice init
# Start interactive chat
rice chat
# Generate code from a spec
rice gen --spec my-feature
# Review code
rice review src/main.rs
For detailed setup instructions, see Installation Setup Guide.
- Explore interactive chat mode for free-form coding assistance
- Learn about spec-driven development for systematic coding
- Configure multiple AI providers for optimal performance
- Set up local models for offline privacy
RiceCoder is designed for high performance with strict performance targets:
- 🚀 Startup Time: < 3 seconds (cold start)
- ⚡ Response Time: < 500ms (typical operations)
- 🧠 Memory Usage: < 300MB (typical sessions)
- 🏗️ Large Projects: 500+ crates, 50K+ lines with incremental analysis
- 🔄 Concurrent Sessions: Up to 10+ parallel sessions
Run performance validation to ensure targets are met:
# Run performance validation
./scripts/run-performance-validation.sh
# Update performance baselines
./scripts/update-performance-baselines.sh
# Check for regressions
ricecoder-performance check-regression --binary ./target/release/ricecoder --baseline performance-baselines.json
Performance baselines are automatically tracked, and regression detection alerts when performance degrades beyond acceptable thresholds.
| Feature | RiceCoder | Others |
|---|---|---|
| Terminal-native | ✅ | ❌ IDE-focused |
| Spec-driven development | ✅ | ❌ Ad-hoc |
| Offline-first (local models) | ✅ | ❌ Cloud-only |
| Research before coding | ✅ | ❌ Generate immediately |
| Multi-agent framework | ✅ |
Join our community to discuss RiceCoder, ask questions, and share ideas:
- Discord Server - Real-time chat and community support
- GitHub Discussions - Async discussions and Q&A
- GitHub Issues - Bug reports and feature requests
Complete documentation is available in the RiceCoder Wiki:
- Quick Start Guide - Get started in 5 minutes
- CLI Commands Reference - All available commands
- Configuration Guide - Configure RiceCoder
- TUI Interface Guide - Terminal UI navigation and shortcuts
- AI Providers Guide - Set up OpenAI, Anthropic, Ollama, and more
- Local Models Setup - Use Ollama for offline-first development
- Spec-Driven Development - Systematic development with specs
- Code Generation - Generate code from specs with AI enhancement and validation
- Multi-Agent Framework - Specialized agents for code review, testing, documentation, and refactoring
- Workflows & Execution - Declarative workflows with state management, approval gates, and risk scoring
- Execution Plans - Risk scoring, approval gates, test integration, pause/resume, and rollback
- Sessions - Multi-session support with persistence, sharing, and background agent execution
- Modes - Code/Ask/Vibe modes with Think More extended reasoning
- Conversation Sharing - Share sessions with team members via shareable links with read-only access
- LSP Integration - Language Server Protocol for IDE integration with multi-language semantic analysis
- Code Completion - Context-aware code completion with intelligent ranking and ghost text
- Hooks System - Event-driven automation with hook chaining and configuration
- Enhanced Tools - Webfetch, Patch, Todo, Web Search with hybrid MCP provider architecture
- Webfetch Tool - Fetch web content with timeout and truncation
- Patch Tool - Apply unified diffs with conflict detection
- Todo Tool - Persistent task management
- Web Search Tool - Search with free APIs or local MCP servers
- Refactoring Engine - Safe refactoring with multi-language support
- Markdown Configuration - Markdown-based configuration system
- Keybind Customization - Custom keybind profiles and management
- Orchestration - Multi-project workspace management with cross-project operations
- Domain-Specific Agents - Specialized agents for frontend, backend, DevOps, data engineering, mobile, and cloud
- Learning System - User interaction tracking and personalized recommendations
- GitHub Integration - GitHub API integration, PR/Issue creation, repository analysis (✅ Complete)
- Conversation Sharing - Share sessions with shareable links and read-only access (✅ Complete)
- Team Collaboration - Team workspaces, shared knowledge base, permissions (✅ Complete)
- IDE Integration - VS Code, JetBrains, Neovim plugins with external LSP-first architecture (✅ Complete)
- Installation Methods - Curl, package managers, Docker, binaries for all platforms (✅ Complete)
- Theme System - Built-in and custom themes with hot-reload support (✅ Complete)
- Image Support - Drag-and-drop images with AI analysis, caching, and terminal display (✅ Complete)
- FAQ - Frequently asked questions
- Troubleshooting Guide - Common issues and solutions
- Architecture Overview - System design and architecture
Status: Phase 7 complete with all integration features validated and released. GitHub integration, team collaboration, IDE plugins, installation methods, theme system, and image support now available.
RiceCoder follows a phased release strategy with extended Alpha testing before production release:
- Alpha (v0.1.1) ✅ - Phase 1: Foundation features
- Alpha (v0.1.2) ✅ - Phase 2: Enhanced features
- Alpha (v0.1.3) ✅ - Phase 3: MVP features
- Alpha (v0.1.4) ✅ - Phase 4: Polished and hardened
- Alpha (v0.1.5) ✅ - Phase 5: Foundation features
- Alpha (v0.1.6) ✅ - Phase 6: Infrastructure features
- Alpha (v0.1.7) ✅ - Phase 7: Integration features (current)
- Alpha (v0.1.8) ✅ - Phase 8: Enterprise features complete
Why Extended Alpha? We're gathering user feedback, identifying edge cases, optimizing performance, and hardening security before the production release.
Status: 11/11 features complete, 500+ tests, 82% coverage, zero clippy warnings
- CLI Foundation - Commands, shell completion, beautiful UX
- AI Providers - OpenAI, Anthropic, Ollama, 75+ providers
- TUI Interface - Terminal UI with themes and syntax highlighting
- Spec System - YAML/Markdown specs with validation
- File Management - Safe writes, git integration, backups
- Templates & Boilerplates - Template engine with substitution
- Research System - Project analysis and context building
- Permissions System - Fine-grained tool access control
- Custom Commands - User-defined shell commands
- Local Models - Ollama integration for offline-first development
- Storage & Config - Multi-level configuration hierarchy
Status: 7/7 features complete, 900+ tests, 86% coverage, zero clippy warnings
- Code Generation - Spec-driven code generation with AI enhancement, validation, conflict detection, and rollback
- Multi-Agent Framework - Specialized agents for code review, testing, documentation, and refactoring
- Workflows - Declarative workflow execution with state management, approval gates, and risk scoring
- Execution Plans - Risk scoring, approval gates, test integration, pause/resume, and rollback
- Sessions - Multi-session persistence, sharing, and background agent execution
- Modes - Code/Ask/Vibe modes with Think More extended reasoning
- Conversation Sharing - Share sessions with team members via shareable links with read-only access and permission-based filtering
Timeline: Completed December 8, 2025
Status: 3/3 features complete, 544 tests, 86% coverage, zero clippy warnings
- LSP Integration - Language Server Protocol for IDE integration with multi-language semantic analysis
- Code Completion - Tab completion and ghost text suggestions with context awareness
- Hooks System - Event-driven automation with hook chaining and configuration
Timeline: Completed December 5, 2025
Status: 7/7 feature areas complete, 1000+ tests, 85%+ coverage, zero clippy warnings
- Performance Optimization - Profiling, caching, memory optimization
- Security Hardening - Security audit, best practices, hardening
- User Experience Polish - Error messages, onboarding, accessibility
- Documentation & Support - Comprehensive docs, guides, support resources
- External LSP Integration - Integration with external LSP servers (rust-analyzer, tsserver, pylsp)
- Final Validation - Comprehensive testing, validation, community feedback
- Alpha Release - v0.1.4 released and available
Timeline: Completed December 5, 2025
Status: 7/7 features complete, 1100+ tests, 85%+ coverage, zero clippy warnings
- Enhanced Tools - Webfetch, Patch, Todo, Web Search with hybrid MCP provider architecture
- Webfetch Tool - Fetch web content with timeout and truncation
- Patch Tool - Apply unified diffs with conflict detection
- Todo Tools - Persistent task management
- Web Search Tool - Search with free APIs or local MCP servers
- Refactoring System - Safe refactoring with multi-language support
- Markdown Configuration - Markdown-based configuration system
- Keybind Customization - Custom keybind profiles and management
Timeline: Completed December 5, 2025
Status: 3/3 features complete, 1200+ tests, 85%+ coverage, zero clippy warnings
- Orchestration - Multi-project workspace management with cross-project operations
- Domain-Specific Agents - Specialized agents for frontend, backend, DevOps, data engineering, mobile, and cloud
- Learning System - User interaction tracking and personalized recommendations
Timeline: Completed December 6, 2025
Status: 7/7 features complete, 1300+ tests, 88% coverage, zero clippy warnings
- GitHub Integration - GitHub API integration and PR/Issue creation
- Conversation Sharing - Share sessions with shareable links and read-only access
- Team Collaboration - Team workspaces and shared knowledge base
- IDE Integration - VS Code, JetBrains, Neovim plugins with external LSP-first architecture
- Installation Methods - Curl, package managers, Docker, binaries for all platforms
- Theme System - Built-in and custom themes with hot-reload support
- Image Support - Drag-and-drop images with AI analysis, caching, and terminal display
Timeline: Completed December 9, 2025
RiceCoder includes enterprise-grade capabilities for production deployment and team collaboration:
- 🔐 Security & Authentication - Enterprise security with credential management, audit logging, and access controls
- 📊 Monitoring & Telemetry - Comprehensive performance tracking, AI usage metrics, and health monitoring
- 🗄️ Local Database Storage - Persistent search history, user preferences, and configuration storage
- ⚙️ Background Process Management - Reliable background operations with lifecycle control and resource monitoring
- 🔗 MCP Ecosystem - Full Model Context Protocol support for AI assistant integration
- 🚀 Production Deployment - Containerization, orchestration, and automated deployment procedures
- 🧪 Integration Testing - Comprehensive compatibility validation for external tools and AI assistants
RiceCoder now provides drag-and-drop image support with AI analysis, intelligent caching, and terminal display:
Features:
- Drag-and-Drop: Simply drag images into the terminal to include them in your prompts
- Multi-Format Support: PNG, JPG, GIF, and WebP formats
- AI Analysis: Automatic image analysis via your configured AI provider (OpenAI, Anthropic, Ollama, etc.)
- Smart Caching: Cached analysis results with 24-hour TTL and LRU eviction (100 MB limit)
- Terminal Display: Beautiful image rendering in the terminal with ASCII fallback for unsupported terminals
- Automatic Optimization: Large images (>10 MB) are automatically optimized before analysis
- Session Integration: Images are stored in session history for persistence and sharing
Usage:
# Start interactive chat
rice chat
# Drag and drop an image into the terminal
# The image will be analyzed and included in your prompt
# Example: Ask about an image
# "What's in this screenshot?"
# "Analyze this diagram"
# "Review this design mockup"Configuration:
Image support is configured in projects/ricecoder/config/images.yaml:
images:
  # Supported formats
  formats:
    - png
    - jpg
    - jpeg
    - gif
    - webp
  # Display settings
  display:
    max_width: 80          # Max width for terminal display
    max_height: 30         # Max height for terminal display
    placeholder_char: "█"  # ASCII placeholder character
  # Cache settings
  cache:
    enabled: true
    ttl_seconds: 86400     # 24 hours
    max_size_mb: 100       # LRU limit
  # Analysis settings
  analysis:
    timeout_seconds: 10       # Provider timeout
    max_image_size_mb: 10     # Optimization threshold
    optimize_large_images: true
Performance:
- Drag-and-drop detection: < 100ms
- Format validation: < 500ms
- Image analysis: < 10 seconds
- Cache lookup: < 50ms
- Display rendering: < 200ms
Supported Providers:
- OpenAI (GPT-4 Vision)
- Anthropic (Claude 3 Vision)
- Google (Gemini Vision)
- Ollama (with vision models)
- Zen (with vision support)
See Image Support Guide for detailed documentation.
Status: 11 consolidated issues identified and being fixed
After publishing v0.1.7 on crates.io, 11 issues were discovered through manual testing and code analysis:
- TUI Event Loop - Event polling implementation
- Configuration System - First-run setup and config loading
- Provider Standards - Zen provider URL and model discovery
- Path Resolution - Unified path handling across commands
- Default to TUI - Make `rice` default to the TUI
- Non-Interactive Init - Support CI/CD initialization
- Provider Registry - Unified provider management
- Config Command - Actual config loading and saving
- Error Handling - Robust error recovery
- Graceful Shutdown - Terminal state restoration
- Code Reusability - Centralized utilities
Fixes: Configuration-driven architecture with maximum code reusability and feature configurability.
After Phase 8 completion - Production release with all issues fixed and features validated
See Development Roadmap for details.
We welcome contributions! See CONTRIBUTING.md for guidelines.
This project is licensed under CC BY-NC-SA 4.0.
- ✅ Free for personal and non-commercial use
- ✅ Fork, modify, and share
- ❌ Commercial use requires a separate license
Built with ❤️ using Rust.
Inspired by Aider, OpenCode, and Claude Code.
RiceCoder - Plan. Think. Code.
RiceCoder includes RiceGrep (ricegrep), an AI-enhanced, offline-first code search tool with heuristic-based AI processing that maintains full compatibility with traditional grep workflows while adding intelligent reranking and natural language query understanding.
- 🔍 AI-Enhanced Search - Natural language queries with intelligent result reranking (heuristic-based, no external APIs)
- ⚡ Ripgrep Compatible - Drop-in replacement with identical CLI and output formats
- 🔒 Offline-First - Full functionality without internet connectivity
- 🧠 Heuristic AI Processing - Advanced query understanding and result ranking without external models
- 🎯 Language Awareness - Programming language detection and context-aware ranking
- 📊 Enhanced Output - Detailed metadata, confidence scores, and AI reasoning
- 🔄 Watch Mode - Continuous monitoring with automatic index updates
- 🛠️ Safe Replace - Preview and execute find-replace operations safely
- ⚙️ Configuration System - Comprehensive customization and scripting support
- 🤖 MCP Server - AI assistant integration via Model Context Protocol (starts background watch by default)
- 📦 Plugin Ecosystem - Automated installation for Claude Code, OpenCode, Codex, Factory Droid
- 🗂️ Index Management - Build, update, status, and clear search indexes (auto-skips .git files)
# Install ricegrep (included with RiceCoder)
ricegrep --help
# Basic search (ripgrep compatible)
ricegrep "function.*error" src/
# AI-enhanced natural language search
ricegrep search --ai-enhanced "find error handling functions" src/
# Answer generation from search results
ricegrep search --answer "how does authentication work" .
# Watch mode with automatic index updates
ricegrep watch # Watch current directory
ricegrep watch src/ # Watch specific directory
ricegrep watch --timeout 300 # Watch with 5-minute timeout
# Index management (basic)
ricegrep index --build . # Build search index (framework)
ricegrep index --status # Check index status
# Plugin installation for AI assistants
ricegrep install claude-code
ricegrep install opencode
# MCP server for AI assistants
ricegrep mcp # Start MCP server with background watch
ricegrep mcp --no-watch # Start MCP server without background watch
# Safe replace operations
ricegrep replace "old_name" "new_name" --preview file.rs| Feature | RiceGrep | ripgrep | grep |
|---|---|---|---|
| Regex Performance | ✅ Native speed | ✅ Native speed | |
| AI Enhancement | ✅ Heuristic-based | ❌ | ❌ |
| Offline Operation | ✅ Full offline | ✅ | ✅ |
| Language Awareness | ✅ Context ranking | ❌ | ❌ |
| Watch Mode | ✅ Auto-reindexing + search updates | ❌ | ❌ |
| Index Management | ✅ File-by-file incremental | ❌ | ❌ |
| MCP Integration | ✅ AI assistants + background watch | ❌ | ❌ |
| Plugin Ecosystem | ✅ Claude/OpenCode/etc | ❌ | ❌ |
| Replace Operations | ✅ Safe preview |
See RiceGrep Documentation for comprehensive usage guides.