This PR introduces the MiniMax (Global/Intl) LLM provider to the Archestra platform. It enables MiniMax as a first-class citizen for both the LLM Proxy and the Chat interface, supporting tool invocation, streaming, and TOON compression.
This PR makes the following changes:

- Added `backend/src/routes/proxy/adapterV2/minimax.ts`, which implements the `LLMRequestAdapter`, `LLMResponseAdapter`, and `LLMStreamAdapter` interfaces following the OpenAI-compatible schema used by MiniMax.
- Registered the proxy route in `backend/src/routes/proxy/minimax.ts` using the generic `handleLLMProxy` logic.
- Added `fetchMiniMaxModels` to `backend/src/routes/chat-models.ts`, with a hardcoded fallback to ensure availability of MiniMax-M2.1 and associated models.
- Updated `backend/src/tokenizers/index.ts` to use `TiktokenTokenizer` for MiniMax.
- Updated `ChatApiKeyForm` to support MiniMax, including icons, placeholder formats, and links to the MiniMax international console.
- Updated the `ModelSelector` grouping.
- Updated the `ModelSelectorLogo` types and local logo mappings to display MiniMax branding correctly.
- Updated the `ProxyConnectionInstructions` component to help users configure their external clients.
- Updated `interaction.utils.ts` to route `minimax:chatCompletions` through the `OpenAiChatCompletionInteraction` class for consistent metrics and cost calculation.
- Added `backend/src/routes/proxy/adapterV2/minimax.test.ts` with 23 test cases.

Testing:

- Verified that the proxy routes (`/v1/proxy/minimax/...`) handle both streaming and non-streaming requests correctly.
- Run the adapter tests with `pnpm test minimax.test.ts`.

Fixes #1855 /claim #1855
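Because MiniMax exposes an OpenAI-compatible chat/completions schema, the request side of the adapter is largely a pass-through that pins the upstream URL and attaches the API key. The sketch below illustrates that shape; the interfaces, endpoint URL, and `buildMiniMaxRequest` name are illustrative assumptions for this description, not Archestra's actual `LLMRequestAdapter` code.

```typescript
// Hypothetical request shapes -- the real interfaces live in
// backend/src/routes/proxy/adapterV2 and may differ.
interface ChatMessage {
  role: "system" | "user" | "assistant" | "tool";
  content: string;
}

interface GenericChatRequest {
  model: string;
  messages: ChatMessage[];
  stream?: boolean;
  temperature?: number;
}

// Map a provider-agnostic chat request onto MiniMax's
// OpenAI-compatible endpoint (URL is an assumed intl endpoint).
function buildMiniMaxRequest(req: GenericChatRequest, apiKey: string) {
  return {
    url: "https://api.minimax.io/v1/chat/completions",
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${apiKey}`,
      },
      body: JSON.stringify({
        model: req.model,
        messages: req.messages,
        stream: req.stream ?? false,
        temperature: req.temperature,
      }),
    },
  };
}

const { url, init } = buildMiniMaxRequest(
  { model: "MiniMax-M2.1", messages: [{ role: "user", content: "hi" }] },
  "sk-example",
);
console.log(url, JSON.parse(init.body).model);
```

Keeping the adapter a thin mapping like this is what lets the generic `handleLLMProxy` logic and the `OpenAiChatCompletionInteraction` metrics path be reused unchanged.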
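For the streaming path, an OpenAI-compatible provider emits server-sent events framed as `data: {...}` lines terminated by `data: [DONE]`. The following self-contained sketch shows the kind of chunk parsing a stream adapter must do; `collectStreamedText` is a hypothetical helper for illustration, not code from this PR.

```typescript
// Standard OpenAI-style streaming chunk: each event carries a delta
// with an optional content fragment.
interface StreamChunk {
  choices: { delta: { content?: string } }[];
}

// Accumulate assistant text from raw SSE lines.
function collectStreamedText(sseLines: string[]): string {
  let text = "";
  for (const line of sseLines) {
    if (!line.startsWith("data: ")) continue;
    const payload = line.slice("data: ".length).trim();
    if (payload === "[DONE]") break; // end-of-stream sentinel
    const chunk = JSON.parse(payload) as StreamChunk;
    text += chunk.choices[0]?.delta?.content ?? "";
  }
  return text;
}

console.log(
  collectStreamedText([
    'data: {"choices":[{"delta":{"content":"Hel"}}]}',
    'data: {"choices":[{"delta":{"content":"lo"}}]}',
    "data: [DONE]",
  ]),
); // → "Hello"
```

The non-streaming case is simpler still: the full response body is a single JSON object with the same `choices` shape, which is why one adapter family can cover both modes.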