Moltis compiles your entire AI gateway — web UI, LLM providers, tools, and all assets — into a single self-contained executable. Inspired by OpenClaw.
- Single binary — one `cargo build --release` produces everything you need. No Node.js, no `node_modules`.
- Multi-provider — OpenAI, Anthropic, GitHub Copilot, and more through a trait-based provider architecture.
- Streaming-first — responses start appearing the moment the first token arrives, including tool call arguments.
- Sandboxed execution — run LLM commands in Docker or native Apple Containers (macOS 15+).
- MCP support — connect to Model Context Protocol tool servers over stdio or HTTP/SSE.
- Memory & knowledge base — embeddings-powered long-term memory with auto-compaction.
- Web, Telegram, and more — built-in web UI with WebSocket streaming and a Telegram channel.
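
The trait-based provider architecture mentioned above can be pictured roughly like this. This is an illustrative sketch, not Moltis's actual API: the `ChatProvider` trait, the `OpenAi` type, and the `route` function are hypothetical names invented for this example.

```rust
// Hypothetical sketch of a trait-based provider abstraction.
// All names here are illustrative, not taken from the Moltis source.
trait ChatProvider {
    /// Provider identifier, e.g. "openai" or "anthropic".
    fn name(&self) -> &str;
    /// Produce a completion for a prompt (stubbed for the sketch).
    fn complete(&self, prompt: &str) -> String;
}

struct OpenAi;

impl ChatProvider for OpenAi {
    fn name(&self) -> &str {
        "openai"
    }
    fn complete(&self, prompt: &str) -> String {
        format!("[openai] reply to: {prompt}")
    }
}

// The gateway can route to any provider behind the trait object,
// which is how new providers slot in without touching call sites.
fn route(provider: &dyn ChatProvider, prompt: &str) -> String {
    provider.complete(prompt)
}

fn main() {
    let provider = OpenAi;
    println!("{}", route(&provider, "hello"));
}
```

Adding a new provider then means implementing the trait for a new type; the rest of the gateway keeps calling through `&dyn ChatProvider`.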
| Repository | Description |
|---|---|
| moltis | The gateway — Rust source, CLI, and web UI |
| moltis-website | Landing page at moltis.org |
| homebrew-tap | Homebrew formulae (`brew install moltis-org/tap/moltis`) |
```sh
curl -fsSL https://www.moltis.org/install.sh | sh
```

Or via Homebrew, cargo-binstall, or Docker — see the full installation guide.
All projects are released under the MIT License.