A practical guide to running local AI on 16GB VRAM. Covers which LLMs and image models work, the 13B-14B sweet spot, and whether 16GB is worth it vs 12GB or 24GB.
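The 13B-14B sweet spot falls out of simple arithmetic: weights plus KV cache plus runtime overhead must fit under 16GB. A minimal sketch of that estimate is below; the layer count, KV-head geometry, and overhead figure are illustrative assumptions (roughly Llama-2-13B-like), not measurements of any specific loader.

```python
# Back-of-envelope VRAM estimate: weights + KV cache + fixed overhead.
# All architecture numbers are illustrative assumptions, not exact loader behavior.

def estimate_vram_gb(params_b, bits_per_weight, context=4096,
                     n_layers=40, n_kv_heads=8, head_dim=128,
                     overhead_gb=1.0):
    """Rough VRAM (GB) for a decoder-only LLM at a given quantization."""
    weights_gb = params_b * bits_per_weight / 8  # billions of params * bytes/param
    # KV cache: 2 (K and V) * layers * kv_heads * head_dim * context tokens,
    # stored in fp16 (2 bytes per element)
    kv_gb = 2 * n_layers * n_kv_heads * head_dim * context * 2 / 1e9
    return weights_gb + kv_gb + overhead_gb

# 13B at ~4.5 bits/weight (Q4-class): about 9 GB, comfortable on 16GB
print(round(estimate_vram_gb(13, 4.5), 1))
# 13B at ~8.5 bits/weight (Q8-class): pushing right up against 16GB
print(round(estimate_vram_gb(13, 8.5), 1))
# 33B at ~4.5 bits/weight: over 20 GB, needs a 24GB card
print(round(estimate_vram_gb(33, 4.5), 1))
```

The same arithmetic explains the 12GB vs 24GB question: Q4-quantized 13B-14B models clear 12GB only with short contexts or offloading, while 24GB opens up 30B-class models that 16GB cannot hold.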