Mistral Small 3.2
Apache 2.0
Mistral's efficient 24B-parameter model with strong instruction following and multilingual support, released under the Apache 2.0 license.
Provider
Mistral AI
Parameters
24B
Context
128K
Released
2025-06
VRAM Requirements by Quantization
| Method | Disk Size | VRAM Required | Compatible GPUs |
|---|---|---|---|
| Q8_0 | 22.5 GB | 24.5 GB | 3 GPUs |
| Q4_K_M | 12.8 GB | 14 GB | 12 GPUs |
| Q4_0 | 12.2 GB | 13.5 GB | 12 GPUs |
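The VRAM figures above follow roughly from parameter count times bits per weight, plus some headroom for the KV cache and activations. A rough sketch of that estimate (the 1.2 GB overhead and the ~4.5 bits-per-weight average for Q4_K_M are illustrative assumptions, not official numbers):

```python
def estimate_vram_gb(params_b: float, bits_per_weight: float,
                     overhead_gb: float = 1.2) -> float:
    """Rough VRAM estimate for a quantized LLM.

    params_b: parameter count in billions (24 for this model).
    bits_per_weight: average bits per weight of the quantization
    (Q8_0 ~ 8.5, Q4_K_M ~ 4.5 -- approximate averages, since
    k-quants mix precisions across layers).
    overhead_gb: fixed allowance for KV cache and activations
    (the 1.2 GB default is an assumption for illustration).
    """
    weights_gb = params_b * bits_per_weight / 8
    return round(weights_gb + overhead_gb, 1)

# 24B at ~4.5 bits/weight: 13.5 GB of weights + overhead
print(estimate_vram_gb(24, 4.5))  # -> 14.7
```

The estimate lands near the Q4_K_M row above; real usage also grows with context length, so long-context runs need extra headroom beyond this sketch.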
Install with Ollama
Run in terminal:
ollama pull mistral-small
Minimum 13.5 GB VRAM required (Q4_0). Install Ollama from ollama.com.
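Once pulled, Ollama serves the model over a local REST API (default port 11434). A minimal sketch of a request body for its `/api/generate` endpoint — the model tag assumes the pull command above, and the prompt is only an example:

```python
import json

# Request body for Ollama's local /api/generate endpoint
# (POST http://localhost:11434/api/generate).
payload = {
    "model": "mistral-small",  # tag from `ollama pull mistral-small` above
    "prompt": "Summarize the Apache 2.0 license in one sentence.",
    "stream": False,                # return one JSON object, not a stream
    "options": {"num_ctx": 8192},   # context window; model supports up to 128K
}
print(json.dumps(payload, indent=2))
```

Sending this with any HTTP client (e.g. `curl -d @payload.json http://localhost:11434/api/generate`) returns a JSON object whose `response` field holds the generated text.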
Benchmark Scores
mmlu81.5%
humaneval77.8%
Scores are approximate and may vary by quantization level.
Compatible GPUs (12)
HuggingFace
mistralai/Mistral-Small-3.2-24B-Instruct-2506