The NVIDIA H200 SXM has 141 GB of VRAM and 4890 GB/s of memory bandwidth, making it one of the most capable data-center GPUs for running local AI models. This guide covers which LLMs fit, expected tok/s performance, and recommended settings.
See the full NVIDIA H200 SXM 141 GB specs page for detailed specifications, all compatible models, and speed estimates. Current price: $30,000.
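To estimate which models fit and what speed to expect, two back-of-envelope rules help: weight memory is roughly parameters × bits-per-weight ÷ 8, and single-stream decode throughput is capped by memory bandwidth ÷ weight size, since each generated token streams all weights once. The sketch below illustrates this; the function names, the 20% headroom for KV cache and activations, and treating decode as purely bandwidth-bound are simplifying assumptions, not measured results.

```python
H200_VRAM_GB = 141
H200_BW_GB_S = 4890

def model_vram_gb(params_billion: float, bits_per_weight: float,
                  overhead: float = 1.2) -> float:
    """Approximate VRAM needed: weights plus ~20% headroom
    for KV cache and activations (assumed overhead factor)."""
    weight_gb = params_billion * bits_per_weight / 8
    return weight_gb * overhead

def decode_tok_s_ceiling(params_billion: float, bits_per_weight: float,
                         bandwidth_gb_s: float = H200_BW_GB_S) -> float:
    """Memory-bound upper limit on single-stream decode speed:
    every token generated reads the full weight set once."""
    weight_gb = params_billion * bits_per_weight / 8
    return bandwidth_gb_s / weight_gb

# Example: a 70B model at 4.5 bits/weight (typical 4-bit quant)
needed = model_vram_gb(70, 4.5)            # ~47 GB, fits easily
ceiling = decode_tok_s_ceiling(70, 4.5)    # ~124 tok/s theoretical cap
print(f"VRAM: {needed:.1f} GB, fits: {needed <= H200_VRAM_GB}")
print(f"Decode ceiling: {ceiling:.0f} tok/s")
```

Real-world tok/s lands below this ceiling (kernel overhead, attention over the KV cache), so treat it as an upper bound when comparing quantizations, not a benchmark prediction.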