feat: add MiniMax as fourth LLM provider #29

Open

octo-patch wants to merge 1 commit into SamurAIGPT:main from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

Add MiniMax AI as a fourth LLM provider alongside OpenAI, Groq, and Google Gemini for script generation and video search query extraction.

MiniMax offers an OpenAI-compatible API (https://api.minimax.io/v1), so it integrates seamlessly through the existing client.chat.completions.create() code path — no new SDK dependency required.
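What "OpenAI-compatible" means concretely can be sketched as the request shape the SDK ends up sending; the endpoint path and field names follow the standard chat-completions format, and `build_chat_request` is a hypothetical helper, not code from this PR:

```python
import json

# Illustrative: the JSON body that client.chat.completions.create() POSTs
# to the MiniMax endpoint. Field names follow the OpenAI chat-completions
# format; build_chat_request is a hypothetical helper for illustration.
MINIMAX_BASE_URL = "https://api.minimax.io/v1"

def build_chat_request(model: str, prompt: str) -> dict:
    return {
        "url": f"{MINIMAX_BASE_URL}/chat/completions",
        "body": json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        }),
    }
```

Because this shape matches what the OpenAI SDK already produces, only the `base_url` (and credentials) need to change per provider.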

Changes

  • utility/config.py — Add minimax to LLM_PROVIDER validation, MINIMAX_API_KEY/MINIMAX_MODEL env checks, and get_llm_client() factory (OpenAI SDK with custom base_url)
  • .env.example — Add MINIMAX_API_KEY and MINIMAX_MODEL=MiniMax-M2.7 settings
  • README.md — Add MiniMax to features list, API key table, and provider selection docs
  • tests/test_minimax_provider.py — 18 unit tests covering config validation, script generator routing, video search query routing, and .env.example verification
  • tests/test_minimax_integration.py — 3 integration tests verifying real MiniMax API responses (auto-skip when MINIMAX_API_KEY not set)
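The config.py routing described above might look roughly like the following sketch; `get_llm_client` and `LLM_PROVIDER` come from the PR description, but the helper name, the provider table, and the exact validation messages are assumptions:

```python
import os

# Sketch of the provider routing described above (an assumption about the
# real config.py, not its actual code). Only the OpenAI and MiniMax
# branches are shown.
OPENAI_COMPATIBLE_BASE_URLS = {
    "openai": None,  # OpenAI SDK default endpoint
    "minimax": "https://api.minimax.io/v1",
}

def resolve_llm_settings(provider: str) -> dict:
    """Validate the provider name and collect its API key and base_url."""
    provider = provider.lower()
    if provider not in OPENAI_COMPATIBLE_BASE_URLS:
        raise ValueError(f"Unsupported LLM_PROVIDER: {provider}")
    key_var = f"{provider.upper()}_API_KEY"
    api_key = os.environ.get(key_var)
    if not api_key:
        raise ValueError(f"{key_var} is not set")
    # get_llm_client() would pass these straight to the OpenAI SDK:
    #     OpenAI(api_key=api_key, base_url=base_url)
    return {"api_key": api_key, "base_url": OPENAI_COMPATIBLE_BASE_URLS[provider]}
```

With this shape, adding a provider is a one-line table entry plus an env-var check, which is why no new SDK dependency is needed.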

Usage

LLM_PROVIDER=minimax
MINIMAX_API_KEY=your_key_here
MINIMAX_MODEL=MiniMax-M2.7

Available models: MiniMax-M2.7 (default, 204K context), MiniMax-M2.7-highspeed

Test plan

  • All 18 unit tests pass (python -m pytest tests/test_minimax_provider.py)
  • All 3 integration tests pass with valid API key (python -m pytest tests/test_minimax_integration.py)
  • Existing OpenAI/Groq/Gemini providers unaffected (no changes to their code paths)
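The auto-skip behaviour of the integration tests can be sketched with stdlib unittest (the real suite uses pytest and real API calls; the class name and test body here are placeholders):

```python
import os
import unittest

# Sketch of the auto-skip described above: the whole integration class is
# skipped when MINIMAX_API_KEY is absent. The actual tests use pytest;
# this unittest version only illustrates the same pattern.
MINIMAX_KEY = os.environ.get("MINIMAX_API_KEY")

@unittest.skipUnless(MINIMAX_KEY, "MINIMAX_API_KEY not set; skipping integration tests")
class TestMiniMaxIntegration(unittest.TestCase):
    def test_api_key_present(self):
        # A real test would call the MiniMax API here.
        self.assertTrue(MINIMAX_KEY)
```

This keeps CI green for contributors without MiniMax credentials while still exercising the live API when a key is provided.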
