# MLX Code v1.3.0
Local LLM-powered coding assistant for macOS, built on Apple's MLX framework. A privacy-first alternative to GitHub Copilot: no cloud, no telemetry, and all inference runs entirely on Apple Silicon.
## Features
- Code completion and generation using local LLMs
- Chat-based coding assistant
- Multi-model support (MLX, Ollama, OpenAI-compatible)
- Code explanation and refactoring suggestions
- Privacy-first: all inference runs locally on Apple Silicon
- No internet connection required for local models
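"OpenAI-compatible" support means the app can talk to any server that accepts the OpenAI chat-completions request shape, including a local Ollama server. The sketch below shows what such a request body looks like; the model name, system prompt, and endpoint are illustrative assumptions, not MLX Code's documented configuration.

```python
import json

def build_chat_request(model: str, prompt: str) -> str:
    """Build a chat-completions JSON body for an OpenAI-compatible backend.

    Sketch only: endpoint details are assumptions. An Ollama server, for
    example, exposes this API locally at http://localhost:11434/v1.
    """
    payload = {
        "model": model,  # a local model tag, e.g. "llama3" (hypothetical)
        "messages": [
            {"role": "system", "content": "You are a coding assistant."},
            {"role": "user", "content": prompt},
        ],
        "temperature": 0.2,  # low temperature suits code generation
        "stream": False,
    }
    return json.dumps(payload)

# The JSON you would POST to <base_url>/chat/completions:
body = build_chat_request("llama3", "Explain this function.")
```

Because the request shape is shared, switching between MLX, Ollama, and remote OpenAI-compatible backends only changes the base URL and model name.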
## Installation
1. Download `MLX-Code-v1.3.0-Final.dmg` below
2. Open the DMG and drag MLX Code to Applications
3. Launch MLX Code
## Requirements
- macOS 14.0 (Sonoma) or later
- Apple Silicon Mac (M1/M2/M3/M4)
- MLX (`pip install mlx-lm`) or Ollama (`brew install ollama`)
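A quick way to check the local-model prerequisites from a Python session. This is an illustrative sketch, not part of MLX Code; the checks simply mirror the list above.

```python
import importlib.util
import platform
import shutil

def check_prereqs() -> dict:
    """Report which local-model prerequisites are present on this machine."""
    return {
        # Apple Silicon reports "arm64"; Intel Macs report "x86_64"
        "apple_silicon": platform.machine() == "arm64",
        # True if `pip install mlx-lm` has been run in this environment
        "mlx_lm": importlib.util.find_spec("mlx_lm") is not None,
        # True if `brew install ollama` put the ollama binary on PATH
        "ollama": shutil.which("ollama") is not None,
    }
```

Only one of `mlx_lm` or `ollama` needs to be true: either backend can serve local models.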
## What's New in v1.3.0
- Enhanced cloud AI integration
- Improved model management
- Performance optimizations
- Bug fixes
Created by Jordan Koch