
feat: add MiniMax provider support (M2.7, M2.5, M2.5-highspeed)#1062

Open
octo-patch wants to merge 2 commits into ItzCrazyKns:master from octo-patch:feature/add-minimax-provider

Conversation


@octo-patch octo-patch commented Mar 14, 2026

Summary

  • Add MiniMax as a new LLM provider with OpenAI-compatible API integration
  • Support MiniMax-M2.7 (default), MiniMax-M2.5, and MiniMax-M2.5-highspeed models (204K context)
  • Override generateObject/streamObject to use prompt-based JSON extraction (MiniMax does not support response_format)
  • Clamp temperature to (0.0, 1.0] range (MiniMax constraint)
  • Support configurable base URL via MINIMAX_BASE_URL env var (default: https://api.minimax.io/v1)
  • Add MINIMAX_API_KEY environment variable for authentication
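
The prompt-based JSON extraction mentioned above could be sketched roughly like this (a minimal illustration, not the PR's actual code; the `extractJson` helper name is hypothetical). The idea is to instruct the model to emit JSON in its reply and then recover the object from the raw text, since MiniMax does not honor `response_format`:

```typescript
// Hypothetical sketch: recover a JSON object from a model reply when the
// provider (like MiniMax) does not support response_format: json_object.
function extractJson<T = unknown>(text: string): T {
  // Prefer a fenced ```json block if the model emitted one.
  const fenced = text.match(/```(?:json)?\s*([\s\S]*?)```/);
  const candidate = fenced ? fenced[1] : text;

  // Otherwise fall back to the first {...} span in the raw text.
  const start = candidate.indexOf('{');
  const end = candidate.lastIndexOf('}');
  if (start === -1 || end === -1 || end < start) {
    throw new Error('No JSON object found in model output');
  }
  return JSON.parse(candidate.slice(start, end + 1)) as T;
}
```

A real implementation would pair this with a prompt suffix telling the model to answer only with JSON matching the target schema.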

Changes

| File | Description |
| --- | --- |
| src/lib/models/providers/minimax/index.ts | MiniMax provider with M2.7/M2.5/M2.5-highspeed models |
| src/lib/models/providers/minimax/miniMaxLLM.ts | LLM class extending OpenAILLM with temperature clamping |
| src/lib/models/providers/index.ts | Register MiniMax in provider registry |
| README.md | Add MiniMax to supported providers list |

Test plan

  • TypeScript compilation passes (tsc --noEmit)
  • Configure MiniMax API key and verify chat completion works
  • Verify temperature clamping with temperature=0 input
  • Test generateObject/streamObject with structured output
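
Configuring the provider for these steps might look like the following `.env` fragment (variable names from the summary; the key value is a placeholder):

```shell
# MiniMax provider configuration (key value is a placeholder)
MINIMAX_API_KEY=your-api-key-here
# Optional: override the default OpenAI-compatible endpoint
MINIMAX_BASE_URL=https://api.minimax.io/v1
```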

- Add MiniMax chat model provider with OpenAI-compatible API
- Support MiniMax-M2.5 and MiniMax-M2.5-highspeed models (204K context)
- Override generateObject/streamObject to use prompt-based JSON
  (MiniMax does not support response_format)
- Clamp temperature to (0.0, 1.0] range (MiniMax rejects 0)
- Support configurable base URL (default: api.minimax.io/v1)
- Add MINIMAX_API_KEY and MINIMAX_BASE_URL environment variables
- Update README to mention MiniMax provider

@cubic-dev-ai cubic-dev-ai bot left a comment


1 issue found across 4 files

Prompt for AI agents (unresolved issues)

Check if these issues are valid — if so, understand the root cause of each and fix them. If appropriate, use sub-agents to investigate and fix each issue separately.


<file name="src/lib/models/providers/minimax/miniMaxLLM.ts">

<violation number="1" location="src/lib/models/providers/minimax/miniMaxLLM.ts:123">
P2: Temperature normalization is incomplete for MiniMax: values above 1 are not clamped, and inherited text paths can still send invalid config-level temperatures.</violation>
</file>

Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review.

```ts
if (
  options.temperature !== undefined &&
  options.temperature !== null &&
  options.temperature <= 0
) {
  return { ...options, temperature: 0.01 };
}
```

@cubic-dev-ai cubic-dev-ai bot Mar 14, 2026


P2: Temperature normalization is incomplete for MiniMax: values above 1 are not clamped, and inherited text paths can still send invalid config-level temperatures.

Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At src/lib/models/providers/minimax/miniMaxLLM.ts, line 123:

<comment>Temperature normalization is incomplete for MiniMax: values above 1 are not clamped, and inherited text paths can still send invalid config-level temperatures.</comment>

<file context>
```diff
@@ -0,0 +1,131 @@
+    if (
+      options.temperature !== undefined &&
+      options.temperature !== null &&
+      options.temperature <= 0
+    ) {
+      return { ...options, temperature: 0.01 };
```
</file context>
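
One way the reviewer's point could be addressed (a sketch, not the PR's code; `clampTemperature` is a hypothetical helper) is to normalize at both ends of MiniMax's (0.0, 1.0] range before building the request options:

```typescript
// Hypothetical fix sketch: clamp into MiniMax's accepted (0, 1] range
// on both ends, not just the low end.
function clampTemperature(t: number | null | undefined): number | undefined {
  if (t === undefined || t === null) return undefined; // let the provider default apply
  if (t <= 0) return 0.01; // MiniMax rejects 0 and negatives
  if (t > 1) return 1;     // values above 1 are also out of range
  return t;
}
```

Applying this helper in every code path that builds request options (including inherited text-generation paths) would also cover the config-level temperatures the review mentions.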

Add MiniMax-M2.7 to the supported chat models list and set it as the
default model, ahead of M2.5 and M2.5-highspeed.
@octo-patch octo-patch changed the title feat: add MiniMax provider support feat: add MiniMax provider support (M2.7, M2.5, M2.5-highspeed) Mar 18, 2026
@angusthefuzz

M2.7 doesn't seem to work; I get an `invalid chat setting (2013)` error.
M2.5 works great.

