Here is the @FactoryAI custom config.json file you will need to run @MiniMax__AI M2 on @chutes_ai at 80+ tps 🙃 enjoy!

{
  "custom_models": [
    {
      "model_display_name": "MiniMax-M2",
      "model": "MiniMaxAI/MiniMax-M2",
      "base_url": "llm.chutes.ai/v1",
      "api_key": "YOUR_API_KEY_HERE",
      "provider": "generic-chat-completion-api",
      "max_tokens": 204800
    }
  ]
}
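As a sketch of how this config maps onto an actual request: a "generic-chat-completion-api" provider ultimately issues a POST to `{base_url}/chat/completions` with a standard OpenAI-style payload. The endpoint, model name, and `max_tokens` below come from the config in the tweet; the request shape is the generic chat-completions format, not anything Factory-specific, and the `https://` scheme is an assumption since the tweet's `base_url` omits it.

```python
import json

# The custom_models entry from the tweet (API key placeholder kept as-is).
config = json.loads("""
{
  "custom_models": [
    {
      "model_display_name": "MiniMax-M2",
      "model": "MiniMaxAI/MiniMax-M2",
      "base_url": "llm.chutes.ai/v1",
      "api_key": "YOUR_API_KEY_HERE",
      "provider": "generic-chat-completion-api",
      "max_tokens": 204800
    }
  ]
}
""")

entry = config["custom_models"][0]

# Assumed scheme: the config's base_url has no protocol, so https is prepended.
url = f"https://{entry['base_url']}/chat/completions"

# Standard OpenAI-compatible chat-completions payload built from the config.
payload = {
    "model": entry["model"],
    "max_tokens": entry["max_tokens"],
    "messages": [{"role": "user", "content": "Hello, MiniMax-M2!"}],
}
headers = {"Authorization": f"Bearer {entry['api_key']}"}

print(url)       # where the request would go
print(payload["model"])
```

Sending it with any HTTP client (or pointing the official OpenAI SDK's `base_url` at the same endpoint) should then stream completions from the Chutes-hosted model.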
The new #1 open-source model according to Artificial Analysis, now available for free on @chutes_ai. Link below

Oct 31, 2025 · 3:48 AM UTC

*max_tokens: 131072
True. But pretty hard to beat this
Interesting to see the custom config needed for MiniMax-M2.
Great! Does it support interleaved thinking?