feat: add MiniMax as LLM provider (M2.7 default)#1275
Open
octo-patch wants to merge 2 commits into agent0ai:main from
Conversation
Added 2 commits on March 17, 2026 at 17:53
- Add MiniMax to chat providers in model_providers.yaml (OpenAI-compatible, api_base: https://api.minimax.io/v1)
- Add temperature clamping in _adjust_call_args for MiniMax models (MiniMax requires temperature in (0.0, 1.0])
- Add 24 unit tests covering YAML config, ProviderManager, temperature clamping, and API key detection
- Add 6 integration tests verifying chat completion, streaming, system messages, and both M2.5 models
- Update integration tests to use MiniMax-M2.7 as the primary test model
- Add a MiniMax-M2.7-highspeed integration test
- Keep M2.5 and M2.5-highspeed tests for backward compatibility
- Update temperature clamping tests to cover M2.7 model names
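The temperature clamping described in the commits can be sketched as follows. This is an illustrative standalone version, not the PR's actual _adjust_call_args code, and the 0.01 floor is an assumed choice; the PR only states that MiniMax requires temperature in (0.0, 1.0].

```python
def clamp_minimax_temperature(call_args: dict) -> dict:
    """Clamp temperature into MiniMax's accepted range (0.0, 1.0].

    Illustrative sketch: MiniMax rejects temperature == 0.0 and values
    above 1.0, so out-of-range values are nudged to the nearest valid
    value. The 0.01 floor is an assumption, not taken from the PR.
    """
    temp = call_args.get("temperature")
    if temp is not None:
        call_args["temperature"] = min(max(temp, 0.01), 1.0)
    return call_args
```

A request with temperature 0.0 would thus be sent as 0.01, and one with temperature 2.0 as 1.0, while in-range values pass through unchanged.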
Summary
Add MiniMax as a first-class LLM provider for Agent Zero, with M2.7 as the recommended model.
Changes
- MiniMax added to conf/model_providers.yaml as an OpenAI-compatible chat provider
- Temperature clamping added in models.py for MiniMax API constraints

Why
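A provider entry along these lines would be added to conf/model_providers.yaml. The exact schema of Agent Zero's provider config is assumed here; only the provider name, the api_base URL, and the MINIMAX_API_KEY variable come from the PR description.

```yaml
# Hypothetical entry; field names are assumptions, api_base is from the PR.
MiniMax:
  type: openai-compatible
  api_base: https://api.minimax.io/v1
  api_key_env: MINIMAX_API_KEY
```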
MiniMax provides an OpenAI-compatible API with competitive models. M2.7 is the latest flagship model with enhanced reasoning and coding capabilities.
Configuration
Users select "MiniMax" as their chat provider and enter their model name (e.g., MiniMax-M2.7, MiniMax-M2.7-highspeed, MiniMax-M2.5, or MiniMax-M2.5-highspeed). The API key is auto-detected from the MINIMAX_API_KEY environment variable.

Testing