
feat: add MiniMax LLM provider support #197

Open
octo-patch wants to merge 1 commit into vxcontrol:master from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

Add MiniMax as a new LLM provider, following the existing provider architecture pattern (similar to DeepSeek, Kimi, GLM, Qwen).

MiniMax offers an OpenAI-compatible API with high-performance models:

  • MiniMax-M2.5 — 204K context window, suitable for general dialogue, code generation, and complex reasoning
  • MiniMax-M2.5-highspeed — Optimized version with faster inference, same 204K context

Changes

New files

  • backend/pkg/providers/minimax/minimax.go — Provider implementation using langchaingo OpenAI client
  • backend/pkg/providers/minimax/config.yml — Model configurations for all option types (simple, primary_agent, assistant, generator, coder, etc.)
  • backend/pkg/providers/minimax/models.yml — Model definitions with descriptions and pricing

Modified files

  • backend/pkg/providers/provider/provider.go — Add ProviderMiniMax type constant and DefaultProviderNameMiniMax
  • backend/pkg/config/config.go — Add MINIMAX_API_KEY, MINIMAX_SERVER_URL, MINIMAX_PROVIDER environment variable config fields
  • backend/pkg/providers/providers.go — Register MiniMax in provider factory (import, default config, instantiation, NewProvider switch case)

Configuration

Set these environment variables to enable MiniMax:

MINIMAX_API_KEY=your_api_key_here
# Optional: override default server URL
MINIMAX_SERVER_URL=https://api.minimax.io/v1
# Optional: model prefix for Langfuse logging
MINIMAX_PROVIDER=

Test Plan

  • Verify Go build compiles without errors
  • Test provider instantiation with valid MiniMax API key
  • Verify model selection for different option types (simple, primary_agent, coder, etc.)
  • Test streaming and tool calling capabilities
  • Verify provider appears in the UI provider list when API key is configured

Add MiniMax as a new LLM provider following the existing provider pattern.
MiniMax offers an OpenAI-compatible API at https://api.minimax.io/v1
with models MiniMax-M2.5 (204K context) and MiniMax-M2.5-highspeed.

Changes:
- Add backend/pkg/providers/minimax/ package with provider implementation,
  config.yml (model configs per option type), and models.yml (model definitions)
- Add ProviderMiniMax type constant and DefaultProviderNameMiniMax
- Add MINIMAX_API_KEY, MINIMAX_SERVER_URL, MINIMAX_PROVIDER config fields
- Register MiniMax provider in factory (default config, instantiation, NewProvider switch)