10 changes: 5 additions & 5 deletions README.md
@@ -31,7 +31,7 @@ MimiClaw turns a tiny ESP32-S3 board into a personal AI assistant. Plug it into
 
 ![](assets/mimiclaw.png)
 
-You send a message on Telegram. The ESP32-S3 picks it up over WiFi, feeds it into an agent loop — the LLM thinks, calls tools, reads memory — and sends the reply back. Supports both **Anthropic (Claude)** and **OpenAI (GPT)** as providers, switchable at runtime. Everything runs on a single $5 chip with all your data stored locally on flash.
+You send a message on Telegram. The ESP32-S3 picks it up over WiFi, feeds it into an agent loop — the LLM thinks, calls tools, reads memory — and sends the reply back. Supports **Anthropic (Claude)**, **OpenAI (GPT)**, and **[Avian](https://avian.io)** as providers, switchable at runtime. Everything runs on a single $5 chip with all your data stored locally on flash.
 
 ## Quick Start
 
@@ -40,7 +40,7 @@ You send a message on Telegram. The ESP32-S3 picks it up over WiFi, feeds it int
 - An **ESP32-S3 dev board** with 16 MB flash and 8 MB PSRAM (e.g. Xiaozhi AI board, ~$10)
 - A **USB Type-C cable**
 - A **Telegram bot token** — talk to [@BotFather](https://t.me/BotFather) on Telegram to create one
-- An **Anthropic API key** — from [console.anthropic.com](https://console.anthropic.com), or an **OpenAI API key** — from [platform.openai.com](https://platform.openai.com)
+- An **Anthropic API key** — from [console.anthropic.com](https://console.anthropic.com), an **OpenAI API key** — from [platform.openai.com](https://platform.openai.com), or an **Avian API key** — from [avian.io](https://avian.io)
 
 ### Install
 
@@ -128,7 +128,7 @@ Edit `main/mimi_secrets.h`:
 #define MIMI_SECRET_WIFI_PASS "YourWiFiPassword"
 #define MIMI_SECRET_TG_TOKEN "123456:ABC-DEF1234ghIkl-zyx57W2v1u123ew11"
 #define MIMI_SECRET_API_KEY "sk-ant-api03-xxxxx"
-#define MIMI_SECRET_MODEL_PROVIDER "anthropic" // "anthropic" or "openai"
+#define MIMI_SECRET_MODEL_PROVIDER "anthropic" // "anthropic", "openai", or "avian"
 #define MIMI_SECRET_SEARCH_KEY "" // optional: Brave Search API key
 #define MIMI_SECRET_PROXY_HOST "" // optional: e.g. "10.0.0.1"
 #define MIMI_SECRET_PROXY_PORT "" // optional: e.g. "7897"
@@ -168,7 +168,7 @@ Connect via serial to configure or debug. **Config commands** let you change set
 mimi> wifi_set MySSID MyPassword # change WiFi network
 mimi> set_tg_token 123456:ABC... # change Telegram bot token
 mimi> set_api_key sk-ant-api03-... # change API key (Anthropic or OpenAI)
-mimi> set_model_provider openai # switch provider (anthropic|openai)
+mimi> set_model_provider openai # switch provider (anthropic|openai|avian)
 mimi> set_model gpt-4o # change LLM model
 mimi> set_proxy 127.0.0.1 7897 # set HTTP proxy
 mimi> clear_proxy # remove proxy
@@ -278,7 +278,7 @@ This turns MimiClaw into a proactive assistant — write tasks to `HEARTBEAT.md`
 - **OTA updates** — flash new firmware over WiFi, no USB needed
 - **Dual-core** — network I/O and AI processing run on separate CPU cores
 - **HTTP proxy** — CONNECT tunnel support for restricted networks
-- **Multi-provider** — supports both Anthropic (Claude) and OpenAI (GPT), switchable at runtime
+- **Multi-provider** — supports Anthropic (Claude), OpenAI (GPT), and Avian, switchable at runtime
 - **Cron scheduler** — the AI can schedule its own recurring and one-shot tasks, persisted across reboots
 - **Heartbeat** — periodically checks a task file and prompts the AI to act autonomously
 - **Tool use** — ReAct agent loop with tool calling for both providers
10 changes: 5 additions & 5 deletions README_CN.md
@@ -31,7 +31,7 @@ MimiClaw 把一块小小的 ESP32-S3 开发板变成你的私人 AI 助理。插
 
 ![](assets/mimiclaw.png)
 
-你在 Telegram 发一条消息,ESP32-S3 通过 WiFi 收到后送进 Agent 循环 — LLM 思考、调用工具、读取记忆 — 再把回复发回来。同时支持 **Anthropic (Claude)** 和 **OpenAI (GPT)** 两种提供商,运行时可切换。一切都跑在一颗 $5 的芯片上,所有数据存在本地 Flash。
+你在 Telegram 发一条消息,ESP32-S3 通过 WiFi 收到后送进 Agent 循环 — LLM 思考、调用工具、读取记忆 — 再把回复发回来。同时支持 **Anthropic (Claude)**、**OpenAI (GPT)** 和 **[Avian](https://avian.io)** 三种提供商,运行时可切换。一切都跑在一颗 $5 的芯片上,所有数据存储在本地 Flash。
 
 ## 快速开始
 
@@ -40,7 +40,7 @@ MimiClaw 把一块小小的 ESP32-S3 开发板变成你的私人 AI 助理。插
 - 一块 **ESP32-S3 开发板**,16MB Flash + 8MB PSRAM(如小智 AI 开发板,~¥30)
 - 一根 **USB Type-C 数据线**
 - 一个 **Telegram Bot Token** — 在 Telegram 找 [@BotFather](https://t.me/BotFather) 创建
-- 一个 **Anthropic API Key** — 从 [console.anthropic.com](https://console.anthropic.com) 获取,或一个 **OpenAI API Key** — 从 [platform.openai.com](https://platform.openai.com) 获取
+- 一个 **Anthropic API Key** — 从 [console.anthropic.com](https://console.anthropic.com) 获取,一个 **OpenAI API Key** — 从 [platform.openai.com](https://platform.openai.com) 获取,或一个 **Avian API Key** — 从 [avian.io](https://avian.io) 获取
 
 ### 安装
 
@@ -128,7 +128,7 @@ cp main/mimi_secrets.h.example main/mimi_secrets.h
 #define MIMI_SECRET_WIFI_PASS "你的WiFi密码"
 #define MIMI_SECRET_TG_TOKEN "123456:ABC-DEF1234ghIkl-zyx57W2v1u123ew11"
 #define MIMI_SECRET_API_KEY "sk-ant-api03-xxxxx"
-#define MIMI_SECRET_MODEL_PROVIDER "anthropic" // "anthropic" 或 "openai"
+#define MIMI_SECRET_MODEL_PROVIDER "anthropic" // "anthropic"、"openai" 或 "avian"
 #define MIMI_SECRET_SEARCH_KEY "" // 可选:Brave Search API key
 #define MIMI_SECRET_PROXY_HOST "10.0.0.1" // 可选:代理地址
 #define MIMI_SECRET_PROXY_PORT "7897" // 可选:代理端口
@@ -183,7 +183,7 @@ mimi> clear_proxy # 清除代理
 mimi> wifi_set MySSID MyPassword # 换 WiFi
 mimi> set_tg_token 123456:ABC... # 换 Telegram Bot Token
 mimi> set_api_key sk-ant-api03-... # 换 API Key(Anthropic 或 OpenAI)
-mimi> set_model_provider openai # 切换提供商(anthropic|openai)
+mimi> set_model_provider openai # 切换提供商(anthropic|openai|avian)
 mimi> set_model gpt-4o # 换模型
 mimi> set_proxy 192.168.1.83 7897 # 设置代理
 mimi> clear_proxy # 清除代理
@@ -293,7 +293,7 @@ MimiClaw 内置 cron 调度器,让 AI 可以自主安排任务。LLM 可以通
 - **OTA 更新** — WiFi 远程刷固件,无需 USB
 - **双核** — 网络 I/O 和 AI 处理分别跑在不同 CPU 核心
 - **HTTP 代理** — CONNECT 隧道,适配受限网络
-- **多提供商** — 同时支持 Anthropic (Claude) 和 OpenAI (GPT),运行时可切换
+- **多提供商** — 同时支持 Anthropic (Claude)、OpenAI (GPT) 和 Avian,运行时可切换
 - **定时任务** — AI 可自主创建周期性和一次性任务,重启后持久保存
 - **心跳服务** — 定期检查任务文件,驱动 AI 自主执行
 - **工具调用** — ReAct Agent 循环,两种提供商均支持工具调用
10 changes: 5 additions & 5 deletions README_JA.md
@@ -31,7 +31,7 @@ MimiClawは小さなESP32-S3ボードをパーソナルAIアシスタントに
 
 ![](assets/mimiclaw.png)
 
-Telegramでメッセージを送ると、ESP32-S3がWiFi経由で受信し、エージェントループに送ります — LLMが思考し、ツールを呼び出し、メモリを読み取り — 返答を送り返します。**Anthropic (Claude)** と **OpenAI (GPT)** の両方をサポートし、実行時に切り替え可能です。すべてが$5のチップ上で動作し、データはすべてローカルのFlashに保存されます。
+Telegramでメッセージを送ると、ESP32-S3がWiFi経由で受信し、エージェントループに送ります — LLMが思考し、ツールを呼び出し、メモリを読み取り — 返答を送り返します。**Anthropic (Claude)**、**OpenAI (GPT)**、**[Avian](https://avian.io)** をサポートし、実行時に切り替え可能です。すべてが$5のチップ上で動作し、データはすべてローカルのFlashに保存されます。
 
 ## クイックスタート
 
@@ -40,7 +40,7 @@ Telegramでメッセージを送ると、ESP32-S3がWiFi経由で受信し、エ
 - **ESP32-S3開発ボード**(16MB Flash + 8MB PSRAM搭載、例:小智AIボード、約$10)
 - **USB Type-Cケーブル**
 - **Telegram Botトークン** — Telegramで[@BotFather](https://t.me/BotFather)に話しかけて作成
-- **Anthropic APIキー** — [console.anthropic.com](https://console.anthropic.com)から取得、または **OpenAI APIキー** — [platform.openai.com](https://platform.openai.com)から取得
+- **Anthropic APIキー** — [console.anthropic.com](https://console.anthropic.com)から取得、**OpenAI APIキー** — [platform.openai.com](https://platform.openai.com)から取得、または **Avian APIキー** — [avian.io](https://avian.io)から取得
 
 ### インストール
 
@@ -128,7 +128,7 @@ cp main/mimi_secrets.h.example main/mimi_secrets.h
 #define MIMI_SECRET_WIFI_PASS "WiFiパスワード"
 #define MIMI_SECRET_TG_TOKEN "123456:ABC-DEF1234ghIkl-zyx57W2v1u123ew11"
 #define MIMI_SECRET_API_KEY "sk-ant-api03-xxxxx"
-#define MIMI_SECRET_MODEL_PROVIDER "anthropic" // "anthropic" または "openai"
+#define MIMI_SECRET_MODEL_PROVIDER "anthropic" // "anthropic"、"openai" または "avian"
 #define MIMI_SECRET_SEARCH_KEY "" // 任意:Brave Search APIキー
 #define MIMI_SECRET_PROXY_HOST "" // 任意:例 "10.0.0.1"
 #define MIMI_SECRET_PROXY_PORT "" // 任意:例 "7897"
@@ -168,7 +168,7 @@ idf.py -p PORT flash monitor
 mimi> wifi_set MySSID MyPassword # WiFiネットワークを変更
 mimi> set_tg_token 123456:ABC... # Telegram Botトークンを変更
 mimi> set_api_key sk-ant-api03-... # APIキーを変更(AnthropicまたはOpenAI)
-mimi> set_model_provider openai # プロバイダーを切替(anthropic|openai)
+mimi> set_model_provider openai # プロバイダーを切替(anthropic|openai|avian)
 mimi> set_model gpt-4o # LLMモデルを変更
 mimi> set_proxy 127.0.0.1 7897 # HTTPプロキシを設定
 mimi> clear_proxy # プロキシを削除
@@ -278,7 +278,7 @@ MimiClawにはcronスケジューラが内蔵されており、AIが自律的に
 - **OTAアップデート** — WiFi経由でファームウェア更新、USB不要
 - **デュアルコア** — ネットワークI/OとAI処理が別々のCPUコアで動作
 - **HTTPプロキシ** — CONNECTトンネル対応、制限付きネットワークに対応
-- **マルチプロバイダー** — Anthropic (Claude) と OpenAI (GPT) の両方をサポート、実行時に切り替え可能
+- **マルチプロバイダー** — Anthropic (Claude)、OpenAI (GPT)、Avian をサポート、実行時に切り替え可能
 - **Cronスケジューラ** — AIが定期・単発タスクを自律的にスケジュール、再起動後も永続化
 - **ハートビート** — タスクファイルを定期チェックし、AIを自律的に駆動
 - **ツール呼び出し** — ReActエージェントループ、両プロバイダーでツール呼び出し対応
2 changes: 1 addition & 1 deletion main/cli/serial_cli.c
@@ -708,7 +708,7 @@ esp_err_t serial_cli_init(void)
     esp_console_cmd_register(&model_cmd);
 
     /* set_model_provider */
-    provider_args.provider = arg_str1(NULL, NULL, "<provider>", "Model provider (anthropic|openai)");
+    provider_args.provider = arg_str1(NULL, NULL, "<provider>", "Model provider (anthropic|openai|avian)");
    provider_args.end = arg_end(1);
    esp_console_cmd_t provider_cmd = {
        .command = "set_model_provider",
29 changes: 19 additions & 10 deletions main/llm/llm_proxy.c
@@ -182,24 +182,29 @@ static esp_err_t http_event_handler(esp_http_client_event_t *evt)
 
 /* ── Provider helpers ──────────────────────────────────────────── */
 
-static bool provider_is_openai(void)
+static bool provider_is_openai_compat(void)
 {
-    return strcmp(s_provider, "openai") == 0;
+    return strcmp(s_provider, "openai") == 0
+        || strcmp(s_provider, "avian") == 0;
 }
 
 static const char *llm_api_url(void)
 {
-    return provider_is_openai() ? MIMI_OPENAI_API_URL : MIMI_LLM_API_URL;
+    if (strcmp(s_provider, "avian") == 0) return MIMI_AVIAN_API_URL;
+    if (strcmp(s_provider, "openai") == 0) return MIMI_OPENAI_API_URL;
+    return MIMI_LLM_API_URL;
 }
 
 static const char *llm_api_host(void)
 {
-    return provider_is_openai() ? "api.openai.com" : "api.anthropic.com";
+    if (strcmp(s_provider, "avian") == 0) return "api.avian.io";
+    if (strcmp(s_provider, "openai") == 0) return "api.openai.com";
+    return "api.anthropic.com";
 }
 
 static const char *llm_api_path(void)
 {
-    return provider_is_openai() ? "/v1/chat/completions" : "/v1/messages";
+    return provider_is_openai_compat() ? "/v1/chat/completions" : "/v1/messages";
 }
 
 /* ── Init ─────────────────────────────────────────────────────── */
@@ -265,7 +270,7 @@ static esp_err_t llm_http_direct(const char *post_data, resp_buf_t *rb, int *out
 
     esp_http_client_set_method(client, HTTP_METHOD_POST);
     esp_http_client_set_header(client, "Content-Type", "application/json");
-    if (provider_is_openai()) {
+    if (provider_is_openai_compat()) {
        if (s_api_key[0]) {
            char auth[LLM_API_KEY_MAX_LEN + 16];
            snprintf(auth, sizeof(auth), "Bearer %s", s_api_key);
@@ -293,7 +298,7 @@ static esp_err_t llm_http_via_proxy(const char *post_data, resp_buf_t *rb, int *
     int body_len = strlen(post_data);
     char header[1024];
     int hlen = 0;
-    if (provider_is_openai()) {
+    if (provider_is_openai_compat()) {
        hlen = snprintf(header, sizeof(header),
                        "POST %s HTTP/1.1\r\n"
                        "Host: %s\r\n"
@@ -559,13 +564,17 @@ esp_err_t llm_chat_tools(const char *system_prompt,
     /* Build request body (non-streaming) */
     cJSON *body = cJSON_CreateObject();
     cJSON_AddStringToObject(body, "model", s_model);
-    if (provider_is_openai()) {
+    /* Note: intentionally NOT using provider_is_openai_compat() here.
+     * Only real OpenAI supports max_completion_tokens; Avian's API
+     * (and Anthropic) expect max_tokens despite being OpenAI-compatible
+     * for routing/auth purposes. */
+    if (strcmp(s_provider, "openai") == 0) {
         cJSON_AddNumberToObject(body, "max_completion_tokens", MIMI_LLM_MAX_TOKENS);
     } else {
         cJSON_AddNumberToObject(body, "max_tokens", MIMI_LLM_MAX_TOKENS);
     }
 
-    if (provider_is_openai()) {
+    if (provider_is_openai_compat()) {
        cJSON *openai_msgs = convert_messages_openai(system_prompt, messages);
        cJSON_AddItemToObject(body, "messages", openai_msgs);
 
@@ -635,7 +644,7 @@ esp_err_t llm_chat_tools(const char *system_prompt,
        return ESP_FAIL;
     }
 
-    if (provider_is_openai()) {
+    if (provider_is_openai_compat()) {
        cJSON *choices = cJSON_GetObjectItem(root, "choices");
        cJSON *choice0 = choices && cJSON_IsArray(choices) ? cJSON_GetArrayItem(choices, 0) : NULL;
        if (choice0) {
1 change: 1 addition & 0 deletions main/mimi_config.h
@@ -85,6 +85,7 @@
 #define MIMI_LLM_MAX_TOKENS 4096
 #define MIMI_LLM_API_URL "https://api.anthropic.com/v1/messages"
 #define MIMI_OPENAI_API_URL "https://api.openai.com/v1/chat/completions"
+#define MIMI_AVIAN_API_URL "https://api.avian.io/v1/chat/completions"
 #define MIMI_LLM_API_VERSION "2023-06-01"
 #define MIMI_LLM_STREAM_BUF_SIZE (32 * 1024)
 #define MIMI_LLM_LOG_VERBOSE_PAYLOAD 0
4 changes: 2 additions & 2 deletions main/mimi_secrets.h.example
@@ -21,10 +21,10 @@
 #define MIMI_SECRET_FEISHU_APP_ID ""
 #define MIMI_SECRET_FEISHU_APP_SECRET ""
 
-/* Anthropic API */
+/* LLM API */
 #define MIMI_SECRET_API_KEY ""
 #define MIMI_SECRET_MODEL ""
-#define MIMI_SECRET_MODEL_PROVIDER "anthropic"
+#define MIMI_SECRET_MODEL_PROVIDER "anthropic" /* "anthropic", "openai", or "avian" */
 
 /* HTTP Proxy (leave empty or set both) */
 #define MIMI_SECRET_PROXY_HOST ""