Phi-4 14B scores 84.8% on MMLU, matching models five times its size, and fits on a 12GB GPU at Q4 quantization. This guide covers the full Phi lineup from 3.8B to 14B, with VRAM needs, benchmarks, and honest weaknesses.
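That 12GB figure follows from simple arithmetic: a 14B-parameter model at roughly 4.5 bits per weight (typical of Q4_K_M-style GGUF quants) takes around 8 GB for the weights, plus headroom for the KV cache and runtime buffers. Here is a minimal back-of-envelope sketch of that estimate; the bits-per-weight and overhead values are illustrative assumptions, not official numbers.

```python
# Rough VRAM estimate for running a quantized model.
# The bits-per-weight and overhead figures are assumptions for illustration.

def estimate_vram_gb(params_billion: float,
                     bits_per_weight: float = 4.5,   # Q4_K_M-style quants land near 4.5 bpw
                     overhead_gb: float = 2.0) -> float:
    """Approximate VRAM: quantized weights plus KV cache / runtime overhead."""
    weights_gb = params_billion * 1e9 * bits_per_weight / 8 / 1024**3
    return weights_gb + overhead_gb

for name, size_b in [("Phi-3-mini", 3.8), ("Phi-4", 14.0)]:
    print(f"{name} ({size_b}B) at ~4.5 bpw: ~{estimate_vram_gb(size_b):.1f} GB")
# Phi-4 14B comes out around 9-10 GB, which is why it fits a 12GB card at Q4.
```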