
MiniMax M2.7

MoE · Apache 2.0

MiniMax's self-evolving MoE model with a 1M-token context window. Recently open-sourced under the Apache 2.0 license.

Provider: MiniMax
Parameters: Unknown (MoE)
Context: 1,000K tokens
Released: 2026-04-10

VRAM Requirements by Quantization

Method    Disk Size    VRAM Required    Fits GPUs
Q4_K_M    45 GB        50 GB            0 GPUs
Q4_0      42 GB        47 GB            0 GPUs
Q2_K      28 GB        31 GB            2 GPUs
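The VRAM figures above track disk size plus roughly 10% of headroom for the KV cache and runtime buffers. A minimal sketch of that rule of thumb (the 10% overhead factor is an assumption for illustration, not the site's exact formula):

```python
# Rough VRAM estimate from a quant's disk size. Assumption: the weights
# load fully into VRAM, plus ~10% overhead for KV cache and runtime
# buffers at modest context lengths.
def estimate_vram_gb(disk_size_gb: float, overhead: float = 0.10) -> float:
    """Approximate VRAM (GB) needed to load a quant of a given disk size."""
    return round(disk_size_gb * (1 + overhead), 1)

# Rows from the table above (disk sizes in GB):
for method, disk in [("Q4_K_M", 45), ("Q4_0", 42), ("Q2_K", 28)]:
    print(f"{method}: ~{estimate_vram_gb(disk)} GB VRAM")
```

Running this reproduces the table's numbers to within about 1 GB; long contexts (this model advertises up to 1M tokens) inflate the KV cache well beyond this estimate.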

Install with Ollama

Run in terminal:

ollama pull minimax-m2.7

A minimum of 31 GB of VRAM is required (the Q2_K quant). Install Ollama from ollama.com
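Once pulled, the model can also be queried programmatically through Ollama's local REST API (it listens on port 11434 by default). A hedged sketch using only the Python standard library; the model tag `minimax-m2.7` is taken from the pull command above and may differ on your machine:

```python
import json
import urllib.request

def build_generate_request(prompt: str, model: str = "minimax-m2.7") -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return {
        "model": model,
        "prompt": prompt,
        "stream": False,  # return a single JSON object instead of a stream
    }

def generate(prompt: str, host: str = "http://localhost:11434") -> str:
    """Send a prompt to a locally running Ollama server and return the reply."""
    payload = json.dumps(build_generate_request(prompt)).encode()
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

With the Ollama daemon running and the model pulled, `generate("Hello")` returns the model's completion as a string; `ollama run minimax-m2.7` gives the same interactively.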

Benchmark Scores

MMLU: 85.6%
HumanEval: 83.2%

Scores are approximate and may vary by quantization level.

Compatible GPUs (2)

HuggingFace

MiniMaxAI/MiniMax-M2.7
