13 changes: 7 additions & 6 deletions README.md
@@ -31,7 +31,7 @@ MimiClaw turns a tiny ESP32-S3 board into a personal AI assistant. Plug it into

![](assets/mimiclaw.png)

You send a message on Telegram. The ESP32-S3 picks it up over WiFi, feeds it into an agent loop — the LLM thinks, calls tools, reads memory — and sends the reply back. Supports both **Anthropic (Claude)** and **OpenAI (GPT)** as providers, switchable at runtime. Everything runs on a single $5 chip with all your data stored locally on flash.
You send a message on Telegram. The ESP32-S3 picks it up over WiFi, feeds it into an agent loop — the LLM thinks, calls tools, reads memory — and sends the reply back. Supports **Anthropic (Claude)**, **OpenAI (GPT)**, and **[MiniMax](https://platform.minimax.io)** as providers, switchable at runtime. Everything runs on a single $5 chip with all your data stored locally on flash.

## Quick Start

@@ -40,7 +40,7 @@ You send a message on Telegram. The ESP32-S3 picks it up over WiFi, feeds it int
- An **ESP32-S3 dev board** with 16 MB flash and 8 MB PSRAM (e.g. Xiaozhi AI board, ~$10)
- A **USB Type-C cable**
- A **Telegram bot token** — talk to [@BotFather](https://t.me/BotFather) on Telegram to create one
- An **Anthropic API key** — from [console.anthropic.com](https://console.anthropic.com), or an **OpenAI API key** — from [platform.openai.com](https://platform.openai.com)
- An **Anthropic API key** — from [console.anthropic.com](https://console.anthropic.com), an **OpenAI API key** — from [platform.openai.com](https://platform.openai.com), or a **MiniMax API key** — from [platform.minimax.io](https://platform.minimax.io)

### Install

@@ -128,7 +128,7 @@ Edit `main/mimi_secrets.h`:
#define MIMI_SECRET_WIFI_PASS "YourWiFiPassword"
#define MIMI_SECRET_TG_TOKEN "123456:ABC-DEF1234ghIkl-zyx57W2v1u123ew11"
#define MIMI_SECRET_API_KEY "sk-ant-api03-xxxxx"
#define MIMI_SECRET_MODEL_PROVIDER "anthropic" // "anthropic" or "openai"
#define MIMI_SECRET_MODEL_PROVIDER "anthropic" // "anthropic", "openai", or "minimax"
#define MIMI_SECRET_SEARCH_KEY "" // optional: Brave Search API key
#define MIMI_SECRET_TAVILY_KEY "" // optional: Tavily API key (preferred)
#define MIMI_SECRET_PROXY_HOST "" // optional: e.g. "10.0.0.1"
@@ -168,8 +168,9 @@ Connect via serial to configure or debug. **Config commands** let you change set
```
mimi> wifi_set MySSID MyPassword # change WiFi network
mimi> set_tg_token 123456:ABC... # change Telegram bot token
mimi> set_api_key sk-ant-api03-... # change API key (Anthropic or OpenAI)
mimi> set_model_provider openai # switch provider (anthropic|openai)
mimi> set_api_key sk-ant-api03-... # change API key (Anthropic, OpenAI, or MiniMax)
mimi> set_model_provider minimax # switch provider (anthropic|openai|minimax)
mimi> set_model MiniMax-M2.7 # use MiniMax model
mimi> set_model gpt-4o # change LLM model
mimi> set_proxy 127.0.0.1 7897 # set HTTP proxy
mimi> clear_proxy # remove proxy
@@ -280,7 +281,7 @@ This turns MimiClaw into a proactive assistant — write tasks to `HEARTBEAT.md`
- **OTA updates** — flash new firmware over WiFi, no USB needed
- **Dual-core** — network I/O and AI processing run on separate CPU cores
- **HTTP proxy** — CONNECT tunnel support for restricted networks
- **Multi-provider** — supports both Anthropic (Claude) and OpenAI (GPT), switchable at runtime
- **Multi-provider** — supports Anthropic (Claude), OpenAI (GPT), and MiniMax, switchable at runtime
- **Cron scheduler** — the AI can schedule its own recurring and one-shot tasks, persisted across reboots
- **Heartbeat** — periodically checks a task file and prompts the AI to act autonomously
- **Tool use** — ReAct agent loop with tool calling for all providers
14 changes: 7 additions & 7 deletions README_CN.md
@@ -31,7 +31,7 @@ MimiClaw 把一块小小的 ESP32-S3 开发板变成你的私人 AI 助理。插

![](assets/mimiclaw.png)

你在 Telegram 发一条消息,ESP32-S3 通过 WiFi 收到后送进 Agent 循环 — LLM 思考、调用工具、读取记忆 — 再把回复发回来。同时支持 **Anthropic (Claude)** 和 **OpenAI (GPT)** 两种提供商,运行时可切换。一切都跑在一颗 $5 的芯片上,所有数据存在本地 Flash。
你在 Telegram 发一条消息,ESP32-S3 通过 WiFi 收到后送进 Agent 循环 — LLM 思考、调用工具、读取记忆 — 再把回复发回来。同时支持 **Anthropic (Claude)**、**OpenAI (GPT)** 和 **[MiniMax](https://platform.minimax.io)** 三种提供商,运行时可切换。一切都跑在一颗 $5 的芯片上,所有数据存在本地 Flash。

## 快速开始

@@ -40,7 +40,7 @@ MimiClaw 把一块小小的 ESP32-S3 开发板变成你的私人 AI 助理。插
- 一块 **ESP32-S3 开发板**,16MB Flash + 8MB PSRAM(如小智 AI 开发板,~¥30)
- 一根 **USB Type-C 数据线**
- 一个 **Telegram Bot Token** — 在 Telegram 找 [@BotFather](https://t.me/BotFather) 创建
- 一个 **Anthropic API Key** — 从 [console.anthropic.com](https://console.anthropic.com) 获取,或一个 **OpenAI API Key** — 从 [platform.openai.com](https://platform.openai.com) 获取
- 一个 **Anthropic API Key** — 从 [console.anthropic.com](https://console.anthropic.com) 获取,一个 **OpenAI API Key** — 从 [platform.openai.com](https://platform.openai.com) 获取,或一个 **MiniMax API Key** — 从 [platform.minimax.io](https://platform.minimax.io) 获取

### 安装

@@ -128,7 +128,7 @@ cp main/mimi_secrets.h.example main/mimi_secrets.h
#define MIMI_SECRET_WIFI_PASS "你的WiFi密码"
#define MIMI_SECRET_TG_TOKEN "123456:ABC-DEF1234ghIkl-zyx57W2v1u123ew11"
#define MIMI_SECRET_API_KEY "sk-ant-api03-xxxxx"
#define MIMI_SECRET_MODEL_PROVIDER "anthropic" // "anthropic" 或 "openai"
#define MIMI_SECRET_MODEL_PROVIDER "anthropic" // "anthropic"、"openai" 或 "minimax"
#define MIMI_SECRET_SEARCH_KEY "" // 可选:Brave Search API key
#define MIMI_SECRET_TAVILY_KEY "" // 可选:Tavily API key(优先)
#define MIMI_SECRET_PROXY_HOST "10.0.0.1" // 可选:代理地址
@@ -183,9 +183,9 @@ mimi> clear_proxy # 清除代理
```
mimi> wifi_set MySSID MyPassword # 换 WiFi
mimi> set_tg_token 123456:ABC... # 换 Telegram Bot Token
mimi> set_api_key sk-ant-api03-... # 换 API Key(Anthropic 或 OpenAI)
mimi> set_model_provider openai # 切换提供商(anthropic|openai)
mimi> set_model gpt-4o # 换模型
mimi> set_api_key sk-ant-api03-... # 换 API Key(Anthropic、OpenAI 或 MiniMax)
mimi> set_model_provider minimax # 切换提供商(anthropic|openai|minimax)
mimi> set_model MiniMax-M2.7 # 使用 MiniMax 模型
mimi> set_proxy 192.168.1.83 7897 # 设置代理
mimi> clear_proxy # 清除代理
mimi> set_search_key BSA... # 设置 Brave Search API Key
@@ -295,7 +295,7 @@ MimiClaw 内置 cron 调度器,让 AI 可以自主安排任务。LLM 可以通
- **OTA 更新** — WiFi 远程刷固件,无需 USB
- **双核** — 网络 I/O 和 AI 处理分别跑在不同 CPU 核心
- **HTTP 代理** — CONNECT 隧道,适配受限网络
- **多提供商** — 同时支持 Anthropic (Claude) 和 OpenAI (GPT),运行时可切换
- **多提供商** — 同时支持 Anthropic (Claude)OpenAI (GPT) 和 MiniMax,运行时可切换
- **定时任务** — AI 可自主创建周期性和一次性任务,重启后持久保存
- **心跳服务** — 定期检查任务文件,驱动 AI 自主执行
- **工具调用** — ReAct Agent 循环,三种提供商均支持工具调用
> **Review comment (README_CN.md lines 298–301):** the "多提供商" bullet lists three providers, but the "工具调用" bullet said "两种提供商均支持工具调用" ("both providers support tool calling"), which misstates MiniMax's tool-calling status. The count should read "三种提供商均支持工具调用", or the README should explicitly note the limitation if MiniMax does not support tool calling.
14 changes: 7 additions & 7 deletions README_JA.md
@@ -31,7 +31,7 @@ MimiClawは小さなESP32-S3ボードをパーソナルAIアシスタントに

![](assets/mimiclaw.png)

Telegramでメッセージを送ると、ESP32-S3がWiFi経由で受信し、エージェントループに送ります — LLMが思考し、ツールを呼び出し、メモリを読み取り — 返答を送り返します。**Anthropic (Claude)** と **OpenAI (GPT)** の両方をサポートし、実行時に切り替え可能です。すべてが$5のチップ上で動作し、データはすべてローカルのFlashに保存されます。
Telegramでメッセージを送ると、ESP32-S3がWiFi経由で受信し、エージェントループに送ります — LLMが思考し、ツールを呼び出し、メモリを読み取り — 返答を送り返します。**Anthropic (Claude)**、**OpenAI (GPT)**、**[MiniMax](https://platform.minimax.io)** をサポートし、実行時に切り替え可能です。すべてが$5のチップ上で動作し、データはすべてローカルのFlashに保存されます。

## クイックスタート

@@ -40,7 +40,7 @@ Telegramでメッセージを送ると、ESP32-S3がWiFi経由で受信し、エ
- **ESP32-S3開発ボード**(16MB Flash + 8MB PSRAM搭載、例:小智AIボード、約$10)
- **USB Type-Cケーブル**
- **Telegram Botトークン** — Telegramで[@BotFather](https://t.me/BotFather)に話しかけて作成
- **Anthropic APIキー** — [console.anthropic.com](https://console.anthropic.com)から取得、または **OpenAI APIキー** — [platform.openai.com](https://platform.openai.com)から取得
- **Anthropic APIキー** — [console.anthropic.com](https://console.anthropic.com)から取得、**OpenAI APIキー** — [platform.openai.com](https://platform.openai.com)から取得、または **MiniMax APIキー** — [platform.minimax.io](https://platform.minimax.io)から取得

### インストール

@@ -128,7 +128,7 @@ cp main/mimi_secrets.h.example main/mimi_secrets.h
#define MIMI_SECRET_WIFI_PASS "WiFiパスワード"
#define MIMI_SECRET_TG_TOKEN "123456:ABC-DEF1234ghIkl-zyx57W2v1u123ew11"
#define MIMI_SECRET_API_KEY "sk-ant-api03-xxxxx"
#define MIMI_SECRET_MODEL_PROVIDER "anthropic" // "anthropic" または "openai"
#define MIMI_SECRET_MODEL_PROVIDER "anthropic" // "anthropic"、"openai"、または "minimax"
#define MIMI_SECRET_SEARCH_KEY "" // 任意:Brave Search APIキー
#define MIMI_SECRET_TAVILY_KEY "" // 任意:Tavily APIキー(優先)
#define MIMI_SECRET_PROXY_HOST "" // 任意:例 "10.0.0.1"
@@ -168,9 +168,9 @@ idf.py -p PORT flash monitor
```
mimi> wifi_set MySSID MyPassword # WiFiネットワークを変更
mimi> set_tg_token 123456:ABC... # Telegram Botトークンを変更
mimi> set_api_key sk-ant-api03-... # APIキーを変更(AnthropicまたはOpenAI)
mimi> set_model_provider openai # プロバイダーを切替(anthropic|openai)
mimi> set_model gpt-4o # LLMモデルを変更
mimi> set_api_key sk-ant-api03-... # APIキーを変更(Anthropic、OpenAI、またはMiniMax)
mimi> set_model_provider minimax # プロバイダーを切替(anthropic|openai|minimax)
mimi> set_model MiniMax-M2.7 # MiniMaxモデルを使用
mimi> set_proxy 127.0.0.1 7897 # HTTPプロキシを設定
mimi> clear_proxy # プロキシを削除
mimi> set_search_key BSA... # Brave Search APIキーを設定
@@ -280,7 +280,7 @@ MimiClawにはcronスケジューラが内蔵されており、AIが自律的に
- **OTAアップデート** — WiFi経由でファームウェア更新、USB不要
- **デュアルコア** — ネットワークI/OとAI処理が別々のCPUコアで動作
- **HTTPプロキシ** — CONNECTトンネル対応、制限付きネットワークに対応
- **マルチプロバイダー** — Anthropic (Claude) と OpenAI (GPT) の両方をサポート、実行時に切り替え可能
- **マルチプロバイダー** — Anthropic (Claude)、OpenAI (GPT)、MiniMax をサポート、実行時に切り替え可能
- **Cronスケジューラ** — AIが定期・単発タスクを自律的にスケジュール、再起動後も永続化
- **ハートビート** — タスクファイルを定期チェックし、AIを自律的に駆動
- **ツール呼び出し** — ReActエージェントループ、全プロバイダーでツール呼び出し対応
2 changes: 1 addition & 1 deletion main/cli/serial_cli.c
@@ -887,7 +887,7 @@ esp_err_t serial_cli_init(void)
esp_console_cmd_register(&model_cmd);

/* set_model_provider */
provider_args.provider = arg_str1(NULL, NULL, "<provider>", "Model provider (anthropic|openai)");
provider_args.provider = arg_str1(NULL, NULL, "<provider>", "Model provider (anthropic|openai|minimax)");
provider_args.end = arg_end(1);
esp_console_cmd_t provider_cmd = {
.command = "set_model_provider",
31 changes: 23 additions & 8 deletions main/llm/llm_proxy.c
@@ -187,19 +187,34 @@ static bool provider_is_openai(void)
return strcmp(s_provider, "openai") == 0;
}

static bool provider_is_minimax(void)
{
return strcmp(s_provider, "minimax") == 0;
}

/* MiniMax uses the OpenAI-compatible API format */
static bool provider_uses_openai_format(void)
{
return provider_is_openai() || provider_is_minimax();
}

static const char *llm_api_url(void)
{
return provider_is_openai() ? MIMI_OPENAI_API_URL : MIMI_LLM_API_URL;
if (provider_is_minimax()) return MIMI_MINIMAX_API_URL;
if (provider_is_openai()) return MIMI_OPENAI_API_URL;
return MIMI_LLM_API_URL;
}

static const char *llm_api_host(void)
{
return provider_is_openai() ? "api.openai.com" : "api.anthropic.com";
if (provider_is_minimax()) return "api.minimax.io";
if (provider_is_openai()) return "api.openai.com";
return "api.anthropic.com";
}

static const char *llm_api_path(void)
{
return provider_is_openai() ? "/v1/chat/completions" : "/v1/messages";
return provider_uses_openai_format() ? "/v1/chat/completions" : "/v1/messages";
}

/* ── Init ─────────────────────────────────────────────────────── */
@@ -265,7 +280,7 @@ static esp_err_t llm_http_direct(const char *post_data, resp_buf_t *rb, int *out

esp_http_client_set_method(client, HTTP_METHOD_POST);
esp_http_client_set_header(client, "Content-Type", "application/json");
if (provider_is_openai()) {
if (provider_uses_openai_format()) {
if (s_api_key[0]) {
char auth[LLM_API_KEY_MAX_LEN + 16];
snprintf(auth, sizeof(auth), "Bearer %s", s_api_key);
@@ -293,7 +308,7 @@ static esp_err_t llm_http_via_proxy(const char *post_data, resp_buf_t *rb, int *
int body_len = strlen(post_data);
char header[1024];
int hlen = 0;
if (provider_is_openai()) {
if (provider_uses_openai_format()) {
hlen = snprintf(header, sizeof(header),
"POST %s HTTP/1.1\r\n"
"Host: %s\r\n"
@@ -559,13 +574,13 @@ esp_err_t llm_chat_tools(const char *system_prompt,
/* Build request body (non-streaming) */
cJSON *body = cJSON_CreateObject();
cJSON_AddStringToObject(body, "model", s_model);
if (provider_is_openai()) {
if (provider_uses_openai_format()) {
cJSON_AddNumberToObject(body, "max_completion_tokens", MIMI_LLM_MAX_TOKENS);
} else {
cJSON_AddNumberToObject(body, "max_tokens", MIMI_LLM_MAX_TOKENS);
}

if (provider_is_openai()) {
if (provider_uses_openai_format()) {
cJSON *openai_msgs = convert_messages_openai(system_prompt, messages);
cJSON_AddItemToObject(body, "messages", openai_msgs);

@@ -635,7 +650,7 @@ esp_err_t llm_chat_tools(const char *system_prompt,
return ESP_FAIL;
}

if (provider_is_openai()) {
if (provider_uses_openai_format()) {
cJSON *choices = cJSON_GetObjectItem(root, "choices");
cJSON *choice0 = choices && cJSON_IsArray(choices) ? cJSON_GetArrayItem(choices, 0) : NULL;
if (choice0) {
1 change: 1 addition & 0 deletions main/mimi_config.h
@@ -88,6 +88,7 @@
#define MIMI_LLM_MAX_TOKENS 4096
#define MIMI_LLM_API_URL "https://api.anthropic.com/v1/messages"
#define MIMI_OPENAI_API_URL "https://api.openai.com/v1/chat/completions"
#define MIMI_MINIMAX_API_URL "https://api.minimax.io/v1/chat/completions"
#define MIMI_LLM_API_VERSION "2023-06-01"
#define MIMI_LLM_STREAM_BUF_SIZE (32 * 1024)
#define MIMI_LLM_LOG_VERBOSE_PAYLOAD 0
4 changes: 2 additions & 2 deletions main/mimi_secrets.h.example
@@ -21,10 +21,10 @@
#define MIMI_SECRET_FEISHU_APP_ID ""
#define MIMI_SECRET_FEISHU_APP_SECRET ""

/* Anthropic API */
/* LLM API (Anthropic / OpenAI / MiniMax) */
#define MIMI_SECRET_API_KEY ""
#define MIMI_SECRET_MODEL ""
#define MIMI_SECRET_MODEL_PROVIDER "anthropic"
#define MIMI_SECRET_MODEL_PROVIDER "anthropic" /* "anthropic", "openai", or "minimax" */

/* HTTP Proxy (leave empty or set both) */
#define MIMI_SECRET_PROXY_HOST ""