
Phi-4

MIT

Microsoft's 14B-parameter model with exceptional reasoning for its size. Particularly strong on math, science, and other STEM tasks.

Provider

Microsoft

Parameters

14B

Context

16K (16,384 tokens)

Released

2024-12-12

VRAM Requirements by Quantization

Method   | Disk Size | VRAM Required | Fits GPUs
Q8_0     | 14.5 GB   | 16 GB         | 7 GPUs
Q4_K_M   | 8.1 GB    | 9.5 GB        | 14 GPUs
Q4_0     | 7.8 GB    | 9 GB          | 14 GPUs
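The VRAM figures above follow a general pattern: quantized weight size (parameter count times average bits per weight) plus a runtime margin for activations and the KV cache. The sketch below is a rule-of-thumb estimate, not runlocal.dev's exact formula — the bits-per-weight values and the flat overhead constant are illustrative assumptions, and the table remains authoritative.

```python
GIB = 1024 ** 3

# Approximate average bits per weight for common llama.cpp quantizations
# (assumed typical values; actual GGUF files vary slightly by tensor mix).
BITS_PER_WEIGHT = {"Q8_0": 8.5, "Q4_K_M": 4.85, "Q4_0": 4.55}

def estimate_vram_gb(params: float, method: str, overhead_gb: float = 1.5) -> float:
    """Weight size (params * bits / 8 bytes) plus a flat runtime overhead."""
    weights_gb = params * BITS_PER_WEIGHT[method] / 8 / GIB
    return round(weights_gb + overhead_gb, 1)

for method in BITS_PER_WEIGHT:
    print(method, estimate_vram_gb(14e9, method), "GB")
```

For Phi-4's 14B parameters this lands within about half a gigabyte of each table row, which is why 9 GB is quoted as the practical minimum for the Q4 variants.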

Install with Ollama

Run in terminal:

ollama pull phi4

Minimum 9 GB VRAM required. Install Ollama from ollama.com.
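Once pulled, the model can be queried through Ollama's local REST API (by default `POST /api/generate` on port 11434) as well as from the terminal. A minimal sketch using only the standard library — the prompt is just an example, and actually sending the request assumes an Ollama server is running locally:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_request(prompt: str, model: str = "phi4") -> urllib.request.Request:
    """Build a non-streaming generate request for the local Ollama server."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

req = build_request("Prove that the square root of 2 is irrational.")
print(req.full_url)  # http://localhost:11434/api/generate
# With Ollama running, urllib.request.urlopen(req) returns a JSON body
# whose "response" field holds the model's answer.
```

With `"stream": False` the server returns one JSON object instead of newline-delimited chunks, which keeps client code simple for one-shot prompts.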

Benchmark Scores

MMLU: 84.8%
HumanEval: 82.6%

Scores are approximate and may vary by quantization level.

Compatible GPUs (14)

HuggingFace

microsoft/phi-4
