Add MiniMax as a first-class LLM provider#696

Open
octo-patch wants to merge 1 commit into crmne:main from octo-patch:feature/add-minimax-provider
Conversation

@octo-patch octo-patch commented Mar 22, 2026

Summary

Adds MiniMax as a built-in LLM provider, following the same OpenAI-compatible pattern used by DeepSeek and Perplexity.

MiniMax offers high-performance language models with up to 1M token context windows through an OpenAI-compatible API.

Changes

Provider implementation (5 new files)

  • providers/minimax.rb — Main provider class extending OpenAI, with a custom api_base and request headers
  • providers/minimax/chat.rb — Role formatting
  • providers/minimax/capabilities.rb — Model metadata, pricing, and feature flags
  • providers/minimax/models.rb — Static model listing (MiniMax has no /v1/models endpoint)
  • providers/minimax/temperature.rb — Clamps temperature to [0.0, 1.0] with debug logging
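
The temperature clamping mentioned above can be sketched as a small standalone module. The module and method names here are illustrative only, not the gem's exact API:

```ruby
# Illustrative sketch of MiniMax temperature handling: values outside
# [0.0, 1.0] are clamped; nil passes through so the API default applies.
module MiniMaxTemperature
  MIN = 0.0
  MAX = 1.0

  # Returns nil unchanged, otherwise a Float clamped into [MIN, MAX].
  def self.normalize(temperature)
    return nil if temperature.nil?

    temperature.to_f.clamp(MIN, MAX)
  end
end
```

In the actual provider this would also emit a debug log line when a value is clamped, per the file description above.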

Models supported

| Model | Context Window | Notes |
| --- | --- | --- |
| MiniMax-M2.7 | 1,000,000 | Latest flagship model |
| MiniMax-M2.7-highspeed | 1,000,000 | Cost-optimized variant |
| MiniMax-M2.5 | 204,000 | Previous generation |
| MiniMax-M2.5-highspeed | 204,000 | Cost-optimized variant |
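
Because MiniMax exposes no /v1/models endpoint, the model listing must be static. A rough sketch of such a listing, mirroring the table above (constant and method names are assumptions for illustration, not the gem's actual internals):

```ruby
# Static model metadata keyed by model id, as MiniMax cannot be queried
# for its model list at runtime.
MINIMAX_CONTEXT_WINDOWS = {
  'MiniMax-M2.7'           => 1_000_000,
  'MiniMax-M2.7-highspeed' => 1_000_000,
  'MiniMax-M2.5'           => 204_000,
  'MiniMax-M2.5-highspeed' => 204_000
}.freeze

# Look up a model's context window, failing loudly on unknown ids.
def context_window_for(model_id)
  MINIMAX_CONTEXT_WINDOWS.fetch(model_id) do
    raise ArgumentError, "unknown MiniMax model: #{model_id}"
  end
end
```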

Configuration

RubyLLM.configure do |config|
  config.minimax_api_key = ENV['MINIMAX_API_KEY']
  config.minimax_api_base = ENV['MINIMAX_API_BASE'] # optional
end

Tests (5 spec files, 48 examples, 0 failures)

  • minimax_spec.rb — Provider initialization, API base, headers, configuration, slug
  • minimax/capabilities_spec.rb — Context windows, model families, pricing, features
  • minimax/temperature_spec.rb — Clamping, nil handling, edge cases
  • minimax/models_spec.rb — Static model list, metadata, provider slug
  • minimax/chat_spec.rb — Role formatting

Other changes

  • Registered provider in lib/ruby_llm.rb with Zeitwerk inflection
  • Added to .env.example, README.md, configuration docs, test support files

Test plan

  • All 48 new unit tests pass
  • Existing provider_spec.rb tests pass (configuration schema validation)
  • CI should pass on all supported Ruby versions

Commit message

MiniMax offers an OpenAI-compatible API with high-performance language
models. This adds MiniMax as a built-in provider following the same
patterns as DeepSeek and Perplexity (extending OpenAI).

Provider implementation:
- MiniMax class extending OpenAI with custom api_base and headers
- Temperature module clamping values to [0.0, 1.0] range
- Capabilities module with model metadata and pricing
- Static Models module (MiniMax has no /v1/models endpoint)
- Models: MiniMax-M2.7, M2.7-highspeed, M2.5, M2.5-highspeed

Configuration:
- minimax_api_key (required)
- minimax_api_base (optional, defaults to https://api.minimax.io/v1)

Tests:
- Provider spec (api_base, headers, configuration, slug)
- Capabilities spec (context windows, pricing, model families)
- Temperature spec (clamping, nil handling, edge cases)
- Models spec (static model list, metadata)
- Chat spec (role formatting)