feat: add MiniMax provider support #99
Conversation
- Add `MiniMaxProvider` class implementing the `MemoryProvider` interface using the Anthropic-compatible API
- Add `MINIMAX_API_KEY` environment variable support with auto-detection
- Add `MINIMAX_MODEL` environment variable support (default: MiniMax-M2.7)
- Register `minimax` in the `ProviderType` union and provider factory
- Add `minimax` to `VALID_PROVIDERS` for fallback chain support
- Add unit tests for `MiniMaxProvider` (10 tests)
- Supported models: MiniMax-M2.7 and MiniMax-M2.7-highspeed
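For reference, enabling the provider described above takes two environment variables (values here are illustrative; per the description, `MINIMAX_MODEL` is optional and defaults to MiniMax-M2.7):

```shell
# Illustrative values — substitute a real API key.
export MINIMAX_API_KEY="sk-example-key"        # presence triggers auto-detection
export MINIMAX_MODEL="MiniMax-M2.7-highspeed"  # optional; defaults to MiniMax-M2.7
```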
No actionable comments were generated in the recent review. 🎉
📝 Walkthrough

This pull request adds support for the MiniMax AI provider, an Anthropic-compatible service. Changes include environment-based provider detection in configuration, a new provider implementation using the Anthropic SDK, type definitions for the new provider, and comprehensive test coverage.
Sequence Diagram

```mermaid
sequenceDiagram
    participant App as Application
    participant Config as Config Detection
    participant Factory as Provider Factory
    participant MiniMax as MiniMaxProvider
    participant Anthropic as Anthropic SDK

    App->>Config: Initialize with env vars
    Config->>Config: Check MINIMAX_API_KEY
    Config->>Factory: Create provider (type: "minimax")
    Factory->>MiniMax: Construct(apiKey, model, maxTokens)
    MiniMax->>MiniMax: Initialize Anthropic client
    App->>MiniMax: compress(systemPrompt, userPrompt)
    MiniMax->>Anthropic: messages.create(model, max_tokens, system, messages)
    Anthropic-->>MiniMax: Response with content blocks
    MiniMax->>MiniMax: Extract text block
    MiniMax-->>App: Return compressed text
```
Estimated Code Review Effort: 🎯 3 (Moderate) | ⏱️ ~20 minutes
Closing as superseded by #103, which includes the same MiniMax provider plus the Anthropic baseURL fix. Thank you for the contribution @octo-patch — your PR helped identify the need and informed the final implementation.
Summary
This PR adds MiniMax as a new LLM provider for agentmemory, enabling memory compression and summarization using MiniMax models via the Anthropic-compatible API.
Changes
- `src/providers/minimax.ts` — New `MiniMaxProvider` class implementing the `MemoryProvider` interface
- `src/types.ts` — Added `"minimax"` to the `ProviderType` union
- `src/providers/index.ts` — Registered MiniMax in the provider factory
- `src/config.ts` — Added `MINIMAX_API_KEY` auto-detection and `minimax` to `VALID_PROVIDERS`
- `test/minimax-provider.test.ts` — 10 unit tests covering instantiation, API calls, error handling, and model selection

Configuration
Set the following environment variables in `~/.agentmemory/.env`:

Supported Models
- MiniMax-M2.7
- MiniMax-M2.7-highspeed

API Reference
Implementation Notes
MiniMax uses the Anthropic-compatible API (https://api.minimax.io/anthropic), consistent with how the existing `AnthropicProvider` works. The implementation reuses the already-installed `@anthropic-ai/sdk` with a custom `baseURL`, requiring no new dependencies.

The provider is automatically selected when `MINIMAX_API_KEY` is set (priority: Anthropic → MiniMax → Gemini → OpenRouter → Agent SDK). It also participates in the fallback chain via `FALLBACK_PROVIDERS`.
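A minimal sketch of the detection order described above — the real logic lives in `src/config.ts`; the function name and the non-MiniMax env var names here are illustrative assumptions:

```typescript
type ProviderType = "anthropic" | "minimax" | "gemini" | "openrouter" | "agent-sdk";

// Check API keys in priority order; the first key present wins.
function detectProvider(env: Record<string, string | undefined>): ProviderType {
  if (env.ANTHROPIC_API_KEY) return "anthropic";
  if (env.MINIMAX_API_KEY) return "minimax";
  if (env.GEMINI_API_KEY) return "gemini";
  if (env.OPENROUTER_API_KEY) return "openrouter";
  return "agent-sdk"; // fallback when no API key is set
}
```

Because Anthropic is checked first, setting both `ANTHROPIC_API_KEY` and `MINIMAX_API_KEY` selects Anthropic; MiniMax then remains available through the fallback chain.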