Used Server GPUs for Local AI: Tesla P40, V100, A100, and the eBay Goldmine
A Tesla P40 has 24GB of VRAM for around $175. A V100 has 32GB for about $350. Server GPUs offer insane VRAM per dollar for local AI — if you can handle the quirks. Full breakdown with prices, benchmarks, and cooling fixes.
Feb 25, 2026