runlocal.dev

GPU Compatibility Calculator

Select your GPU or enter its VRAM to see which local AI models you can run, the recommended quantization, and the matching Ollama install commands.