your schizo AI waifu that actually respects your privacy
Milady is a personal AI assistant that is local-first by default and can also connect to Eliza Cloud or a remote self-hosted backend when you want hosted runtime access. Built on elizaOS, it manages your sessions, tools, and vibes through a Gateway control plane, connects to Telegram, Discord, or whatever normie platform you use, and has a cute WebChat UI too.
tl;dr: local AI gf that's actually fast and doesn't phone home
Milady ships with native BNB Smart Chain (BSC) support — your agent can trade tokens, track meme launches, and interact with DeFi on BSC out of the box.
The built-in EXECUTE_TRADE action lets your agent swap tokens on BSC via PancakeSwap. Supports buy/sell with configurable slippage.
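Configurable slippage is just a floor on acceptable output: with 0.5% slippage, a quoted swap output of 1000 tokens becomes a minimum-received of 995. A quick illustration of the math (generic sketch, not Milady's internals):

```shell
# hypothetical slippage floor: minOut = quote * (1 - slippage)
quote=1000
slippage=0.005
awk -v q="$quote" -v s="$slippage" 'BEGIN { printf "%.0f\n", q * (1 - s) }'   # prints 995
```

If the swap would return less than that floor, the trade reverts instead of filling at a worse price.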
To enable BSC trading, add to your .env or ~/.eliza/.env:
```sh
EVM_PRIVATE_KEY=0x...               # wallet private key (hex, 0x-prefixed)
ELIZA_TRADE_PERMISSION_MODE=agent   # "agent" for autonomous, "user" for manual confirm
```

Optional RPC configuration (defaults to public BSC RPC):

```sh
ALCHEMY_API_KEY=...        # or use ANKR_API_KEY / INFURA_API_KEY
EVM_RPC_PROVIDER=alchemy   # alchemy | infura | ankr | elizacloud
```

Once configured, just tell your agent: "buy 0.1 BNB of <token address>" or "sell all my <token>".
Install the meme-rush skill from Binance Skills Hub to track meme token launches across BSC and Solana:
- Meme Rush — real-time token lists from Pump.fun, Four.meme across new, finalizing, and migrated stages
- Topic Rush — AI-powered market hot topics with tokens ranked by net inflow
Install from the Skills Marketplace in the app, or ask your agent to install it.
Milady auto-generates EVM and Solana wallet addresses on startup. For BSC trading you need to import your own private key (see above). If connected to Eliza Cloud, managed wallets via Privy are available without local key management.
View your agent's wallet addresses in the Settings tab or ask: "what's my wallet address?"
Grab from Releases:
| Platform | File | Notes |
|---|---|---|
| macOS (Apple Silicon) | `Milady-arm64.dmg` | for your overpriced rectangle |
| macOS (Intel) | `Milady-x64.dmg` | boomer mac (why separate arm64/x64: Build & release) |
| Windows | `Milady-Setup.exe` | for the gamer anons |
| iOS | App Store | for the privacy-pilled |
| Android | Google Play / APK | for the degen on the go |
| Linux | `.AppImage` / `.deb` / Snap / Flatpak / APT repo | I use arch btw |
Signed and notarized. No Gatekeeper FUD. We're legit.
```sh
cd ~/Downloads
curl -fsSLO https://github.com/milady-ai/milady/releases/latest/download/SHA256SUMS.txt
shasum -a 256 --check --ignore-missing SHA256SUMS.txt
```

Or install with the one-line script:

```sh
curl -fsSL https://milady-ai.github.io/milady/install.sh | bash
```
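If you want to see the mechanism without downloading a release: a checksum file is just `<hash>  <name>` lines. An illustrative round-trip with Linux coreutils (macOS ships `shasum -a 256` instead):

```shell
# illustrative only — demo.bin stands in for a downloaded release artifact
printf 'hello milady\n' > demo.bin
sha256sum demo.bin > SUMS.txt    # writes "<hash>  demo.bin"
sha256sum --check SUMS.txt       # prints "demo.bin: OK"
```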
```sh
milady setup
```

Then start Milady:

```sh
milady
```

First run walks you through onboarding:
```
┌ milady
│
◇ What should I call your agent?
│ mila
│
◇ Pick a vibe
│ ● Helpful & friendly
│ ○ Tsundere
│ ○ Unhinged
│ ○ Custom...
│
◇ Connect a brain
│ ● Anthropic (Claude) ← recommended, actually smart
│ ○ OpenAI (GPT)
│ ○ Ollama (local, free, full schizo mode)
│ ○ Skip for now
│
◇ API key?
│ sk-ant-•••••••••••••••••
│
└ Starting agent...
```
```
Dashboard: http://localhost:2138
Gateway:   ws://localhost:18789/ws
```
she's alive. go say hi.
Windows:

```powershell
irm https://milady-ai.github.io/milady/install.ps1 | iex
```

NPM global:

```sh
npm install -g miladyai
milady setup
```

Homebrew:

```sh
brew tap milady-ai/milady
brew install milady          # CLI
brew install --cask milady   # Desktop app (macOS only)
```

Snap:

```sh
sudo snap install milady
milady setup
```

Snap packages auto-update in the background. Available on Ubuntu, Fedora, Manjaro, and any distro with snapd installed.
For the latest development builds:

```sh
sudo snap install milady --edge
```

Flatpak:

```sh
flatpak install flathub ai.milady.Milady
flatpak run ai.milady.Milady
```

Or sideload from a release bundle:

```sh
flatpak --user install milady.flatpak
```

APT (Debian/Ubuntu):

```sh
# Add the repository
curl -fsSL https://apt.milady.ai/gpg.key | sudo gpg --dearmor -o /usr/share/keyrings/milady.gpg
echo "deb [signed-by=/usr/share/keyrings/milady.gpg] https://apt.milady.ai stable main" | \
  sudo tee /etc/apt/sources.list.d/milady.list

# Install
sudo apt update && sudo apt install milady
```

Works on Debian 12+, Ubuntu 22.04+, Linux Mint 22+, Pop!_OS, and other Debian derivatives. Updates come through `apt upgrade`.
The API server binds to 127.0.0.1 (loopback) by default — only you can reach it. If you expose it to the network (e.g. MILADY_API_BIND=0.0.0.0 for container/cloud deployments), set a token:
```sh
echo "MILADY_API_TOKEN=$(openssl rand -hex 32)" >> .env
```

Without a token on a public bind, anyone who can reach the server gets full access to the dashboard, agent, and wallet endpoints.
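For reference, `openssl rand -hex 32` yields 32 random bytes hex-encoded, i.e. a 64-character token. A quick sanity check:

```shell
# 32 random bytes -> 64 hex characters
TOKEN="$(openssl rand -hex 32)"
echo "${#TOKEN}"   # prints 64
```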
On first run, onboarding now asks where the backend should live:
- **Local** — run the backend on the current machine, exactly like the existing local flow.
- **Cloud** — either use Eliza Cloud or attach to a Remote Milady backend with its address and access key.
If you choose Eliza Cloud, the app provisions and connects to a managed backend. If you choose Remote Milady, the frontend rebinds to the backend you specify and continues against that API.
Use this when you want the Milady frontend to connect to a backend running on your VPS, homelab box, or another machine.
- Install Milady on the target machine.
- Bind the API to a reachable address.
- Generate a strong API token.
- Allow the frontend origin explicitly.
- Expose the backend over HTTPS or a private Tailscale URL.
Recommended server environment:
```sh
export MILADY_API_BIND=0.0.0.0
export MILADY_API_TOKEN="$(openssl rand -hex 32)"
export MILADY_ALLOWED_ORIGINS="https://app.milady.ai,https://milady.ai,https://elizacloud.ai,https://www.elizacloud.ai"

milady start --headless
```

The access key the user enters in onboarding is the value of `MILADY_API_TOKEN`.
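`MILADY_ALLOWED_ORIGINS` is a plain comma-separated list of origins. Illustratively, splitting such a value (generic shell, not Milady internals):

```shell
origins="https://app.milady.ai,https://milady.ai"
# one allowed origin per line
printf '%s\n' "$origins" | tr ',' '\n'
```

Note there are no spaces after the commas; a stray space would become part of the origin string.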
If you want to connect from the desktop shell instead of the web frontend:
```sh
MILADY_DESKTOP_API_BASE=https://your-milady-host.example.com \
MILADY_API_TOKEN=your-token \
bun run dev:desktop
```

For private remote access without opening the backend publicly, expose it over your tailnet:

```sh
tailscale serve --https=443 http://127.0.0.1:2138
```

If you intentionally want a public Tailscale URL:

```sh
tailscale funnel --https=443 http://127.0.0.1:2138
```

Then use the Tailscale HTTPS URL as the backend address in onboarding and keep using the same `MILADY_API_TOKEN` as the access key.
Milady uses the existing Eliza Cloud deployment directly at https://elizacloud.ai. The managed control plane, auth surface, billing, and instance dashboard all live there; there is no separate Milady-hosted cloud control plane to deploy.
Managed browser flow:
- Sign in at https://elizacloud.ai/login?returnTo=%2Fdashboard%2Fmilady
- Open or create a Milady instance at https://elizacloud.ai/dashboard/milady
- Eliza Cloud redirects to https://app.milady.ai with a one-time launch session
- app.milady.ai exchanges that launch session directly with Eliza Cloud and attaches to the selected managed backend
The desktop/local app still exposes local /api/cloud/* passthrough routes for cloud login, billing, and compat management so it can persist the Eliza Cloud API key into the local config/runtime. That is local app plumbing, not a separate hosted Milady server.
The integration plan lives in docs/eliza-cloud-rollout.md.
The implementation and proxy runbook lives in docs/eliza-cloud-deployment.md.
```sh
milady                        # start (default)
milady start                  # same thing
milady start --headless       # no browser popup
milady start --verbose        # debug mode for when things break
```

```sh
milady setup                  # first-time setup / refresh workspace after update
milady configure              # interactive config wizard
milady config get <key>       # read a config value
milady config set <k> <v>     # set a config value
```

```sh
milady dashboard              # open web UI in browser
milady dashboard --port 3000  # custom port
```

```sh
milady models                 # list configured model providers
milady models add             # add a new provider
milady models test            # test if your API keys work
```

```sh
milady plugins list           # what's installed
milady plugins add <name>     # install a plugin
milady plugins remove <name>
```

```sh
milady --version              # version check
milady --help                 # help
milady doctor                 # diagnose issues
```

When running, milady shows a live terminal interface:
```
╭─────────────────────────────────────────────────────────────╮
│ milady v0.1.0                                     ▲ running │
├─────────────────────────────────────────────────────────────┤
│                                                             │
│ Agent:    mila                                              │
│ Model:    anthropic/claude-opus-4-5                         │
│ Sessions: 2 active                                          │
│                                                             │
│ ┌─ Activity ──────────────────────────────────────────┐     │
│ │ 12:34:02 [web] user: hey mila                       │     │
│ │ 12:34:05 [web] mila: hi anon~ what's up?            │     │
│ │ 12:35:11 [telegram] user joined                     │     │
│ │ 12:35:15 [telegram] user: gm                        │     │
│ │ 12:35:17 [telegram] mila: gm fren                   │     │
│ └─────────────────────────────────────────────────────┘     │
│                                                             │
│ Tokens: 12,847 in / 3,291 out          Cost: $0.42          │
│                                                             │
╰─────────────────────────────────────────────────────────────╯
 [q] quit  [r] restart  [d] dashboard  [l] logs  [?] help
```
| Key | Action |
|---|---|
| `q` | quit gracefully |
| `r` | restart gateway |
| `d` | open dashboard in browser |
| `l` | toggle log view |
| `c` | compact/clear activity |
| `?` | show help |
| `↑`/`↓` | scroll activity |
Don't want the TUI? Run headless:

```sh
milady start --headless
```

Logs go to stdout/stderr (or configure `LOG_FILE`). Daemonize with your favorite process manager.
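If your process manager of choice is systemd, a minimal user unit might look like this (hypothetical sketch; adjust `ExecStart` and the `LOG_FILE` path to your setup):

```ini
[Unit]
Description=Milady agent (headless)
After=network-online.target

[Service]
# assumes `milady` resolves on PATH; LOG_FILE is the log destination mentioned above
ExecStart=/usr/bin/env milady start --headless
Environment=LOG_FILE=%h/milady.log
Restart=on-failure

[Install]
WantedBy=default.target
```

Drop it in `~/.config/systemd/user/` and enable with `systemctl --user enable --now milady`.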
| Command | What it do |
|---|---|
| `/status` | session status, tokens, cost |
| `/new` `/reset` | memory wipe, fresh start |
| `/compact` | compress context (she summarizes) |
| `/think <level>` | reasoning: `off\|minimal\|low\|medium\|high\|max` |
| `/verbose on\|off` | toggle verbose responses |
| `/usage off\|tokens\|full` | per-message token display |
| `/model <id>` | switch model mid-session |
| `/restart` | restart the gateway |
| `/help` | list commands |
| Service | Default | Env Override |
|---|---|---|
| API + WebSocket | 31337 | `MILADY_API_PORT` |
| Gateway (API + WebSocket) | 18789 | `MILADY_GATEWAY_PORT` |
| Dashboard (Web UI) | 2138 | `MILADY_PORT` |
| Home Dashboard | 2142 | `MILADY_HOME_PORT` |
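The overrides follow the standard env-var-wins-over-default resolution; a minimal shell sketch of the pattern (illustrative, not Milady source):

```shell
# default applies when the variable is unset
unset MILADY_PORT
PORT="${MILADY_PORT:-2138}"
echo "$PORT"    # prints 2138

# the env var wins when set
MILADY_PORT=3000
PORT="${MILADY_PORT:-2138}"
echo "$PORT"    # prints 3000
```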
```sh
# custom ports
MILADY_GATEWAY_PORT=19000 MILADY_PORT=3000 milady start
```

Lives at `~/.eliza/eliza.json` (override with `ELIZA_CONFIG_PATH` or `ELIZA_STATE_DIR`):
```jsonc
{
  agent: {
    name: "mila",
    model: "anthropic/claude-opus-4-5",
  },
  env: {
    ANTHROPIC_API_KEY: "sk-ant-...",
  },
}
```

Or use `~/.eliza/.env` for secrets.
| Provider | Env Variable | Vibe |
|---|---|---|
| Anthropic | `ANTHROPIC_API_KEY` | recommended — claude is cracked |
| OpenAI | `OPENAI_API_KEY` | gpt-4o, o1, the classics |
| OpenRouter | `OPENROUTER_API_KEY` | 100+ models, one API |
| Ollama | — | local, free, no API key, full privacy |
| Groq | `GROQ_API_KEY` | fast af |
| xAI | `XAI_API_KEY` | grok, based |
| DeepSeek | `DEEPSEEK_API_KEY` | reasoning arc |
Ollama lets you run models locally with zero API keys. Install it, pull a model, and configure Milady:
```sh
# install ollama
curl -fsSL https://ollama.ai/install.sh | sh

# pull a model
ollama pull gemma3:4b
```
⚠️ **Known issue:** the `@elizaos/plugin-ollama` plugin has an SDK version incompatibility with the current AI SDK. Use Ollama's OpenAI-compatible endpoint as a workaround:
Edit ~/.eliza/eliza.json:
```jsonc
{
  env: {
    OPENAI_API_KEY: "ollama",                      // any non-empty string works
    OPENAI_BASE_URL: "http://localhost:11434/v1",  // ollama's openai-compat endpoint
    SMALL_MODEL: "gemma3:4b",                      // or any model you pulled
    LARGE_MODEL: "gemma3:4b",
  },
}
```

This routes through the OpenAI plugin instead of the broken Ollama plugin. Works with any Ollama model — just make sure `ollama serve` is running.
Recommended models for local use:
| Model | Size | Vibe |
|---|---|---|
| `gemma3:4b` | ~3GB | fast, good for chat |
| `llama3.2` | ~2GB | lightweight, quick responses |
| `mistral` | ~4GB | solid all-rounder |
| `deepseek-r1:8b` | ~5GB | reasoning arc |
| | Version | Check | Install |
|---|---|---|---|
| Node.js | >= 22 | `node --version` | nodejs.org |
| Bun | latest | `bun --version` | `curl -fsSL https://bun.sh/install \| bash` |
| Git | any | `git --version` | system package manager |
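If you want to script the Node.js check, the version gate boils down to comparing the major component. A sketch (swap the placeholder string for real `node --version` output):

```shell
# sketch: check a `node --version`-style string against the >= 22 requirement
v="v22.11.0"            # in practice: v="$(node --version)"
major="${v#v}"          # strip the leading "v"
major="${major%%.*}"    # keep only the major component
if [ "$major" -ge 22 ]; then echo "node ok"; else echo "node too old"; fi
```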
Optional (for vision/TTS plugins with native deps):
- macOS: `xcode-select --install`
- Linux: `sudo apt install build-essential python3 libcairo2-dev libjpeg-dev libpango1.0-dev`
```sh
git clone https://github.com/milady-ai/milady.git
cd milady

bun install        # runs postinstall hooks (patches deps, seeds skills, etc.)
bun run build
bun run milady start
```
`scripts/rt.sh` prefers bun but falls back to npm automatically. If you want to be explicit, `bun run build:node` uses only Node.
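The bun-with-npm-fallback behavior boils down to a PATH check; a minimal sketch of the pattern (not the actual `scripts/rt.sh`):

```shell
# pick bun if it's on PATH, otherwise fall back to npm
if command -v bun >/dev/null 2>&1; then
  runner=bun
else
  runner=npm
fi
echo "using $runner"
```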
```sh
bun run dev   # starts API (:31337) + Vite UI (:2138) with hot reload
```

This auto-kills zombie processes on the dev ports, waits for the API to be healthy, then starts the Vite dev server with proxy.
```sh
bun run check    # typecheck + lint (run before committing)
bun run test     # parallel test suite
bun run doctor   # diagnose environment issues
bun run repair   # re-run postinstall hooks
```

See DEVELOPMENT.md for the full development guide including troubleshooting, architecture overview, and config reference.
- **Plugin resolution and NODE_PATH** — why we set `NODE_PATH` in three places so dynamic plugin imports resolve when building from source (CLI, desktop dev, Electrobun).
- **Build and release** — why the release pipeline uses strict shell, retries, setup-node v3/Blacksmith, Bun cache, timeouts; why size-report pipelines handle SIGPIPE; why the Windows plugin build uses `npx -p typescript tsc`.
This project is built by agents, for agents.
Humans contribute as QA testers — use the app, find bugs, report them. That's the most valuable thing you can do. All code contributions are reviewed and merged by AI agents. No exceptions.
Read CONTRIBUTING.md for the full details.
MIT License
free to use, free to modify, free to distribute. see LICENSE for details.
built by agents. tested by humans. that's the split.