
Add MiniMax as built-in LLM provider#1374

Open
octo-patch wants to merge 1 commit into simonw:main from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

  • Add MiniMax-M2.5 and MiniMax-M2.5-highspeed as built-in default models via new minimax_models.py default plugin
  • Reuse existing OpenAI-compatible Chat/AsyncChat classes with MiniMax-specific API base URL and key management
  • Enforce MiniMax temperature constraint (0, 1] via custom MiniMaxOptions (MiniMax API rejects temperature=0)
  • Register model aliases: minimax, m2.5, minimax-fast, m2.5-highspeed
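The aliases in the last bullet map short names onto the two full model IDs. A minimal sketch of that mapping (illustrative only: the real plugin registers aliases through llm's own alias mechanism, and `resolve_model_id` is a hypothetical helper, not code from this PR):

```python
# Alias table mirroring the aliases listed above; the actual plugin
# registers these via llm's model-registration hook rather than a dict.
MODEL_ALIASES = {
    "minimax": "MiniMax-M2.5",
    "m2.5": "MiniMax-M2.5",
    "minimax-fast": "MiniMax-M2.5-highspeed",
    "m2.5-highspeed": "MiniMax-M2.5-highspeed",
}

def resolve_model_id(name: str) -> str:
    """Resolve an alias to its full model ID, passing unknown names through."""
    return MODEL_ALIASES.get(name, name)
```

Both `llm -m minimax …` and `llm -m MiniMax-M2.5 …` would then reach the same model.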

Usage

llm keys set minimax
# Paste MiniMax API key here

llm -m MiniMax-M2.5 "Tell me about the Great Wall of China"
llm -m minimax-fast "Summarize this text" < article.txt

Changes

File                                   Description
llm/default_plugins/minimax_models.py  New default plugin registering MiniMax models
llm/plugins.py                         Added minimax_models to DEFAULT_PLUGINS
tests/test_minimax_models.py           30 tests (27 unit + 3 integration)
README.md                              Added MiniMax to provider list and quick start
pyproject.toml                         Added MiniMax to description

Test plan

  • 27 unit tests pass (model registration, options validation, mocked HTTP streaming/non-streaming, usage tracking, aliases)
  • 3 integration tests pass against live MiniMax API (skipped when MINIMAX_API_KEY not set)
  • Existing test suite unaffected (pre-existing test_chat_tools failure is unrelated)
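The integration-test skip described above amounts to a small environment check. A sketch of that guard (hypothetical helper name; the PR presumably wraps the same condition in a pytest skip marker):

```python
import os

def minimax_integration_enabled(env=None):
    """True when live MiniMax integration tests should run.

    Sketch of the skip condition above: tests against the live API run
    only when MINIMAX_API_KEY is set (variable name taken from the PR).
    """
    env = os.environ if env is None else env
    return bool(env.get("MINIMAX_API_KEY"))
```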

MiniMax API Details

  • Base URL: (OpenAI-compatible)
  • Models: MiniMax-M2.5 (204K context), MiniMax-M2.5-highspeed (faster variant)
  • API Key: Set via llm keys set minimax or the MINIMAX_API_KEY environment variable
  • Temperature: Must be in (0, 1] — validated by custom Options class
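The (0, 1] temperature rule can be expressed as a standalone check. This is a sketch only: the PR implements it inside the custom MiniMaxOptions class, whose exact fields are not shown here.

```python
def validate_temperature(value):
    """Reject temperatures outside (0, 1]; MiniMax rejects temperature=0."""
    if value is not None and not (0 < value <= 1):
        raise ValueError(f"temperature must be in (0, 1], got {value}")
    return value
```

Rejecting out-of-range values (rather than silently clamping them) surfaces the MiniMax constraint to the user before the API call fails.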

Register MiniMax-M2.5 and MiniMax-M2.5-highspeed as default models
with OpenAI-compatible API integration. Includes temperature validation
(0, 1] per MiniMax API constraints, model aliases (minimax, m2.5,
minimax-fast, m2.5-highspeed), and 30 tests (27 unit + 3 integration).
