feat: add MiniMax provider support with M2.7 model #144
octo-patch wants to merge 3 commits into memovai:main from
Conversation
Add MiniMax (https://platform.minimax.io) as a third provider option alongside Anthropic and OpenAI. MiniMax uses the OpenAI-compatible API format, so the existing OpenAI code path is reused via a new provider_uses_openai_format() helper.

Changes:
- Add MIMI_MINIMAX_API_URL constant (https://api.minimax.io/v1)
- Add provider_is_minimax() and provider_uses_openai_format() helpers
- Route MiniMax requests to api.minimax.io with Bearer auth
- Update CLI help text to include minimax option
- Update mimi_secrets.h.example with MiniMax documentation
- Update README.md, README_CN.md, README_JA.md with MiniMax info

Supported models: MiniMax-M2.5, MiniMax-M2.5-highspeed

Usage:

```
mimi> set_model_provider minimax
mimi> set_api_key <MINIMAX_API_KEY>
mimi> set_model MiniMax-M2.5
```
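The provider helpers named in the commit message can be sketched roughly as string checks. The function names (`provider_is_minimax()`, `provider_uses_openai_format()`) come from this PR; the exact provider strings and internal representation (the firmware may well use an enum rather than strings) are assumptions for illustration:

```c
#include <stdbool.h>
#include <string.h>

/* Hypothetical provider-name strings; the real config storage may differ. */
bool provider_is_minimax(const char *provider)
{
    return provider != NULL && strcmp(provider, "minimax") == 0;
}

/* MiniMax speaks the OpenAI-compatible chat/completions format, so any
 * provider that is OpenAI or MiniMax can share the existing OpenAI path. */
bool provider_uses_openai_format(const char *provider)
{
    return provider != NULL &&
           (strcmp(provider, "openai") == 0 || provider_is_minimax(provider));
}
```

With a predicate like this, the request-building code branches once on format rather than per provider, so adding another OpenAI-compatible vendor later only touches the helper.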
📝 Walkthrough

Adds MiniMax as a third LLM provider alongside Anthropic and OpenAI. Updates documentation, build-time secret examples, CLI help text, and llm proxy logic to treat MiniMax as OpenAI-format (URL/host/path, headers, message formatting, and response parsing).
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~20 minutes
- Update MiniMax model reference from M2.5 to M2.7 in all README docs
- MiniMax-M2.7 is the latest flagship model with enhanced reasoning and coding
Actionable comments posted: 1
🧹 Nitpick comments (1)
README_CN.md (1)
34-34: Suggest polishing this Chinese sentence. On line 34, “所有数据存在本地 Flash” reads somewhat stiffly; consider changing it to “所有数据都保存在本地 Flash 中”.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@README_CN.md` at line 34, replace the sentence “所有数据存在本地 Flash” with a smoother phrasing such as “所有数据都保存在本地 Flash 中”; locate and edit the paragraph containing the original sentence (the line that includes “你在 Telegram 发一条消息,ESP32-S3 通过 WiFi 收到后送进 Agent 循环 — LLM 思考、调用工具、读取记忆 — 再把回复发回来。”) so the meaning is preserved and the register stays consistent.
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 910cde09-b3ab-4081-99e0-1c7a11ed023b
📒 Files selected for processing (4)
- README.md
- README_CN.md
- README_JA.md
- main/mimi_config.h
🚧 Files skipped from review as they are similar to previous changes (3)
- main/mimi_config.h
- README_JA.md
- README.md
```
- **多提供商** — 同时支持 Anthropic (Claude)、OpenAI (GPT) 和 MiniMax,运行时可切换
- **定时任务** — AI 可自主创建周期性和一次性任务,重启后持久保存
- **心跳服务** — 定期检查任务文件,驱动 AI 自主执行
- **工具调用** — ReAct Agent 循环,两种提供商均支持工具调用
```
The “多提供商” and “工具调用” descriptions disagree on the provider count.
Line 298 lists three providers, but line 301 still says “两种提供商均支持工具调用” (both providers support tool calling). This could mislead readers about MiniMax's tool-calling support. Suggest unifying the wording to “三种” (three), or explicitly stating MiniMax's limitation (if there is one).
✏️ Suggested revision

```diff
-- **工具调用** — ReAct Agent 循环,两种提供商均支持工具调用
+- **工具调用** — ReAct Agent 循环,三种提供商均支持工具调用
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@README_CN.md` around lines 298 - 301, README_CN.md has inconsistent wording:
the "多提供商" bullet lists three providers (Anthropic/Claude, OpenAI/GPT, MiniMax)
but the "工具调用" bullet incorrectly says "两种提供商". Update the "工具调用" line to either
say "三种提供商均支持工具调用" to match the three providers, or explicitly state MiniMax's
limitation (e.g., "Anthropic 和 OpenAI 支持工具调用,MiniMax 暂不支持") so the README
consistently reflects MiniMax's support status; adjust the bullets containing
"多提供商" and "工具调用" accordingly.
Summary
Add MiniMax as a new LLM provider option alongside Anthropic and OpenAI. MiniMax uses the OpenAI-compatible API format, so the existing OpenAI code path is cleanly reused.
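Reusing the OpenAI code path essentially amounts to swapping the base URL while keeping standard Bearer authentication. A minimal sketch of what the routing in `llm_proxy.c` might look like; `MIMI_MINIMAX_API_URL` is the constant this PR adds, while the function name, buffer handling, and header layout are illustrative assumptions:

```c
#include <stdio.h>
#include <string.h>

#define MIMI_MINIMAX_API_URL "https://api.minimax.io/v1"

/* Builds the chat-completions endpoint URL and the Authorization header
 * for the MiniMax provider. The real proxy code may assemble headers
 * differently; this only shows the URL/auth shape of the request. */
void build_minimax_request(const char *api_key,
                           char *url, size_t url_len,
                           char *auth_header, size_t auth_len)
{
    /* OpenAI-compatible path on the MiniMax host. */
    snprintf(url, url_len, "%s/chat/completions", MIMI_MINIMAX_API_URL);
    /* MiniMax authenticates with a standard Bearer token, like OpenAI. */
    snprintf(auth_header, auth_len, "Authorization: Bearer %s", api_key);
}
```

Because only the host and key differ, the request body (messages, tools, model name) can be produced by the same OpenAI-format serializer already in the codebase.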
Supported Models
- MiniMax-M2.7 — Latest flagship model with enhanced reasoning and coding capabilities
- MiniMax-M2.7-highspeed — High-speed version of M2.7 for low-latency scenarios
- MiniMax-M2.5 — Peak Performance, Ultimate Value (previous generation)
- MiniMax-M2.5-highspeed — Fast version of M2.5

Changes
- llm_proxy.c: route MiniMax requests through the OpenAI-format code path
- Add minimax as valid option for the set_model_provider CLI command
- mimi_secrets.h.example: document the MiniMax key

Usage
Testing
https://api.minimax.io/v1/chat/completions