HunYuan Hy3
MoE · HunYuan
Tencent's open-source preview MoE model, claiming a ~40% efficiency gain on reasoning and coding tasks over the prior generation. It activates 21B parameters per token from a 295B-parameter pool. Weights are available on HuggingFace; local deployment at this scale requires a multi-GPU setup.
| Spec | Value |
|---|---|
| Provider | Tencent |
| Parameters | 21B active / 295B total (MoE) |
| Context | 128K |
| Released | 2026-04-24 |
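The "21B active / 295B total" split comes from top-k expert routing: a router scores all experts for each token, but only the few highest-scoring experts actually run. The sketch below is a generic, minimal top-k MoE layer in plain Python with toy dimensions; the function names and sizes are illustrative, not Tencent's actual architecture.

```python
import math
import random

def moe_layer(x, router, experts, k=2):
    """Top-k MoE routing: score all experts, run only the k best for this token.

    x: token vector (list of floats); router: one weight vector per expert;
    experts: one weight matrix per expert. All names/sizes are toy values.
    """
    dot = lambda a, b: sum(ai * bi for ai, bi in zip(a, b))
    scores = [dot(w, x) for w in router]                    # router logits, one per expert
    top = sorted(range(len(scores)), key=scores.__getitem__)[-k:]
    gates = [math.exp(scores[i]) for i in top]
    total = sum(gates)
    gates = [g / total for g in gates]                      # softmax over the selected k only
    out = [0.0] * len(x)
    for g, i in zip(gates, top):
        y = [dot(row, x) for row in experts[i]]             # forward pass of one chosen expert
        out = [o + g * yi for o, yi in zip(out, y)]         # gate-weighted sum of expert outputs
    return out

random.seed(0)
d, n = 4, 8                                                 # toy hidden size and expert count
rand_vec = lambda m: [random.gauss(0, 1) for _ in range(m)]
x = rand_vec(d)
router = [rand_vec(d) for _ in range(n)]
experts = [[rand_vec(d) for _ in range(d)] for _ in range(n)]
out = moe_layer(x, router, experts)
assert len(out) == d
```

With k=2 of 8 experts here, only a quarter of the expert weights do any work per token, which is the same mechanism that lets a 295B-parameter pool run with 21B-parameter compute cost.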
VRAM Requirements by Quantization
| Method | Disk Size | VRAM Required | Consumer GPUs That Fit |
|---|---|---|---|
| BF16 (reference) | 540 GB | 590 GB | None (multi-GPU required) |
| Q4_K_M (est.) | 165 GB | 180 GB | None (multi-GPU required) |
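The figures in the table follow from simple arithmetic: weight footprint ≈ parameter count × bits per weight ÷ 8, with extra headroom on top for KV cache and activations. A minimal sketch, assuming ~4.5 bits/weight as a typical average for Q4_K_M-style quantization (the function name is ours, not from any library):

```python
def estimate_size_gb(params_billion: float, bits_per_weight: float) -> float:
    """Weight footprint in GB (1e9 bytes): params * bits / 8."""
    return params_billion * bits_per_weight / 8

# BF16 is 16 bits/weight; for 295B parameters:
assert round(estimate_size_gb(295, 16)) == 590    # matches the 590 GB VRAM row
# Q4_K_M averages roughly 4.5 bits/weight:
assert round(estimate_size_gb(295, 4.5)) == 166   # close to the 165 GB disk figure
```

The gap between disk size and VRAM required in the table is the runtime overhead (KV cache, activations, framework buffers) that sits on top of the raw weights.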
Benchmark Scores
| Benchmark | Score |
|---|---|
| MMLU | 86.5% |
| HumanEval | 87.5% |
Scores are approximate and may vary by quantization level.
HuggingFace: `tencent/Hy3-preview`