
feat: add MiniMax provider support #1350

Open

octo-patch wants to merge 1 commit into ruvnet:main from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

Add MiniMax as a first-class LLM provider in the multi-provider system, supporting MiniMax-M2.5 and MiniMax-M2.5-highspeed models via the OpenAI-compatible API.

Changes

  • New file: v3/@claude-flow/providers/src/minimax-provider.ts — Full MiniMax provider implementation extending BaseProvider
  • Modified: types.ts — Add 'minimax' to LLMProvider union type and MiniMax models to LLMModel
  • Modified: provider-manager.ts — Register MiniMax in the provider factory switch
  • Modified: index.ts — Export MiniMaxProvider with updated module docs
  • Modified: provider-integration.test.ts — Integration tests for completion, streaming, and model listing
  • Modified: README.md — Add MiniMax to provider lists, architecture diagram, and environment variables
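The provider-manager registration described above can be sketched roughly as follows. This is illustrative only: the real interface, class, and factory names come from the claude-flow codebase and may differ.

```typescript
// Hypothetical shapes standing in for the claude-flow provider types.
interface LLMProviderConfig {
  provider: "minimax" | "openai";
  apiKey: string;
  model?: string;
}

class MiniMaxProvider {
  // Base URL and default model taken from the PR description.
  readonly baseURL = "https://api.minimax.io/v1";
  constructor(readonly config: LLMProviderConfig) {}

  defaultModel(): string {
    return this.config.model ?? "MiniMax-M2.5";
  }
}

// Sketch of the factory switch that provider-manager.ts would extend.
function createProvider(config: LLMProviderConfig): MiniMaxProvider {
  switch (config.provider) {
    case "minimax":
      return new MiniMaxProvider(config);
    default:
      throw new Error(`Unsupported provider: ${config.provider}`);
  }
}
```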

MiniMax Provider Details

  • API: OpenAI-compatible (https://api.minimax.io/v1)
  • Models: MiniMax-M2.5 (default) and MiniMax-M2.5-highspeed
  • Context: 204,800 tokens, up to 192K output
  • Features: Chat completion, streaming, tool calling
  • Temperature: Clamped to (0.0, 1.0] range (MiniMax rejects zero)
  • Auth: MINIMAX_API_KEY environment variable
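The temperature constraint above can be sketched like this. The epsilon floor and the default value are assumptions for illustration; the PR only states that MiniMax rejects zero and that values are clamped into (0.0, 1.0].

```typescript
// Assumed floor for "greater than zero" — the actual value used by the
// provider implementation is not stated in the PR.
const MIN_TEMPERATURE = 0.01;
const DEFAULT_TEMPERATURE = 0.7; // assumed default

function clampTemperature(requested?: number): number {
  const t = requested ?? DEFAULT_TEMPERATURE;
  if (t <= 0) return MIN_TEMPERATURE; // MiniMax rejects temperature === 0
  return Math.min(t, 1.0);            // cap at the upper bound of (0.0, 1.0]
}
```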

Pricing (per 1M tokens)

Model                    Input   Output
MiniMax-M2.5             $0.30   $1.20
MiniMax-M2.5-highspeed   $0.60   $2.40
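Given the table above, per-request cost can be estimated as a sketch like this; in practice the token counts would come from the API response's usage data.

```typescript
// Pricing per 1M tokens, copied from the table above.
const PRICING: Record<string, { input: number; output: number }> = {
  "MiniMax-M2.5": { input: 0.3, output: 1.2 },
  "MiniMax-M2.5-highspeed": { input: 0.6, output: 2.4 },
};

function estimateCostUSD(model: string, inputTokens: number, outputTokens: number): number {
  const p = PRICING[model];
  if (!p) throw new Error(`Unknown model: ${model}`);
  return (inputTokens / 1_000_000) * p.input + (outputTokens / 1_000_000) * p.output;
}
```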

API Documentation

Testing

  • All 3 MiniMax integration tests pass (completion, streaming, model listing)
  • Build passes with no TypeScript errors
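For illustration, the streaming path exercised by the tests could consume OpenAI-compatible SSE chunks as sketched below. Field names follow the OpenAI chat-completion stream format the PR says MiniMax is compatible with; the helper name is hypothetical.

```typescript
// Extract text deltas from an OpenAI-compatible SSE response body.
// Each event line looks like: data: {"choices":[{"delta":{"content":"..."}}]}
// and the stream ends with the sentinel: data: [DONE]
function extractDeltas(sseBody: string): string[] {
  const deltas: string[] = [];
  for (const line of sseBody.split("\n")) {
    if (!line.startsWith("data: ")) continue;
    const payload = line.slice("data: ".length).trim();
    if (payload === "[DONE]") break; // end-of-stream sentinel
    const chunk = JSON.parse(payload);
    const text = chunk.choices?.[0]?.delta?.content;
    if (text) deltas.push(text);
  }
  return deltas;
}
```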

- Add MiniMaxProvider with OpenAI-compatible API integration
- Support MiniMax-M2.5 and MiniMax-M2.5-highspeed models (204K context)
- Handle temperature constraint (must be in (0.0, 1.0])
- Register provider in ProviderManager and export from index
- Add MiniMax models to LLMModel type union
- Add integration tests for completion, streaming, and model listing
- Update README with MiniMax in provider list and env variables
Owner

ruvnet commented Mar 17, 2026

Thanks for the MiniMax provider PR @octo-patch. This looks well-structured with proper tests.

We're currently focused on bug fixes and stability for v3.5.x. Provider additions will be reviewed in the next feature cycle. Keeping this open for future consideration.

Note: #1324 (duplicate MiniMax PR by @ximiximi423) has been closed in favor of this one.

@octo-patch
Author

Thanks @ruvnet! Totally understand the focus on stability for v3.5.x. Happy to keep this open and address any feedback when the feature cycle comes around.

And thanks for consolidating with #1324 — makes sense to keep things clean.

