MiniMax M2.7
MoE · Apache 2.0
MiniMax's self-evolving mixture-of-experts (MoE) model with a 1M-token context window, recently open-sourced under the Apache 2.0 license.
Provider
MiniMax
Parameters
Unknown (MoE)
Context
1M tokens
Released
2026-04-10
VRAM Requirements by Quantization
| Quantization | Disk Size | VRAM Required | Compatible GPUs |
|---|---|---|---|
| Q4_K_M | 45 GB | 50 GB | 0 GPUs |
| Q4_0 | 42 GB | 47 GB | 0 GPUs |
| Q2_K | 28 GB | 31 GB | 2 GPUs |
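The VRAM figures above track disk size plus a runtime margin for the KV cache and activations. As a rough rule of thumb (an assumption here, not an official formula), adding ~10% to the quantized file size approximates the table's values:

```python
# Heuristic VRAM estimate: quantized weight size on disk plus ~10%
# overhead for KV cache and activations. Actual usage depends on
# context length, batch size, and the inference runtime.
def estimate_vram_gb(disk_size_gb: float, overhead: float = 0.10) -> float:
    return round(disk_size_gb * (1 + overhead), 1)

# Disk sizes from the table above.
for quant, disk in [("Q4_K_M", 45), ("Q4_0", 42), ("Q2_K", 28)]:
    print(f"{quant}: ~{estimate_vram_gb(disk)} GB VRAM")
```

Treat this as a lower bound: long contexts (up to the model's 1M-token window) grow the KV cache well beyond a flat 10% margin.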
Install with Ollama
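A minimal sketch of the install flow. The model tag `minimax-m2.7` is an assumption; check the Ollama library for the actual tag and available quantizations before pulling.

```shell
# Pull the model (tag is hypothetical -- verify in the Ollama library).
ollama pull minimax-m2.7

# Start an interactive session to confirm the model loads.
ollama run minimax-m2.7
```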
Benchmark Scores
MMLU: 85.6%
HumanEval: 83.2%
Scores are approximate and may vary by quantization level.
Compatible GPUs (2)
HuggingFace
MiniMaxAI/MiniMax-M2.7