MLX Code v1.3.0

@kochj23 kochj23 released this 18 Feb 06:29

Local LLM-powered coding assistant for macOS using Apple's MLX framework. Privacy-first alternative to GitHub Copilot — no cloud, no telemetry, runs entirely on Apple Silicon.

Features

  • Code completion and generation using local LLMs
  • Chat-based coding assistant
  • Multi-model support (MLX, Ollama, OpenAI-compatible)
  • Code explanation and refactoring suggestions
  • Privacy-first: all inference runs locally on Apple Silicon
  • No internet connection required for local models
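For the multi-model support listed above, Ollama exposes an OpenAI-compatible chat-completions endpoint at its default local port, so a backend-agnostic client can target MLX Code's supported providers with one request shape. A minimal sketch using only the Python standard library; the model name is a placeholder, and the endpoint assumes a default local Ollama install:

```python
import json
import urllib.request

def build_chat_request(prompt, model="qwen2.5-coder",
                       url="http://localhost:11434/v1/chat/completions"):
    """Build an OpenAI-compatible chat request (sketch; model name is
    illustrative -- substitute any model you have pulled locally)."""
    payload = {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a coding assistant."},
            {"role": "user", "content": prompt},
        ],
        "stream": False,
    }
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# With a local server running, you would send it like this:
# req = build_chat_request("Explain this function: ...")
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the request format is OpenAI-compatible, the same payload works against Ollama or any other OpenAI-style server by changing only the URL and model name.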

Installation

  1. Download `MLX-Code-v1.3.0-Final.dmg` below
  2. Open the DMG and drag MLX Code to Applications
  3. Launch MLX Code
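The same install can be scripted from the terminal. A rough sketch; the mounted volume name is an assumption and may differ from the actual DMG:

```shell
# Mount the DMG, copy the app bundle, then unmount.
# "/Volumes/MLX Code" is a guess at the volume name -- check with `ls /Volumes`.
hdiutil attach MLX-Code-v1.3.0-Final.dmg
cp -R "/Volumes/MLX Code/MLX Code.app" /Applications/
hdiutil detach "/Volumes/MLX Code"
```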

Requirements

  • macOS 14.0 (Sonoma) or later
  • Apple Silicon Mac (M1/M2/M3/M4)
  • MLX (`pip install mlx-lm`) or Ollama (`brew install ollama`)
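The platform requirements above can be checked at runtime from Python's standard library. A best-effort sketch; the function name is illustrative, and on non-macOS systems it simply returns `False`:

```python
import platform

def meets_requirements():
    """Best-effort check for macOS 14+ on Apple Silicon (sketch)."""
    is_macos = platform.system() == "Darwin"
    # Apple Silicon Macs report "arm64"; Intel Macs report "x86_64".
    is_apple_silicon = platform.machine() == "arm64"
    macos_major = 0
    if is_macos:
        version = platform.mac_ver()[0]  # e.g. "14.4.1"
        macos_major = int(version.split(".")[0]) if version else 0
    return is_macos and is_apple_silicon and macos_major >= 14

print(meets_requirements())
```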

What's New in v1.3.0

  • Enhanced cloud AI integration
  • Improved model management
  • Performance optimizations
  • Bug fixes

Created by Jordan Koch