Running LLMs on Apple M2 Ultra (192GB): Complete Guide

The Apple M2 Ultra (192GB) pairs 192GB of unified memory (all of it addressable by the GPU) with 800 GB/s of memory bandwidth, making it one of the most capable consumer machines for running local AI models. This guide covers which LLMs fit in memory, the tok/s performance you can expect, and recommended settings.
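Two rules of thumb drive the estimates in this guide: a model fits if its quantized weights (plus headroom for the OS and KV cache) stay under the 192GB of unified memory, and decode speed is roughly memory-bandwidth-bound, so a ceiling is bandwidth divided by bytes read per token. The sketch below applies both. The bytes-per-parameter figures, the 75% usable-memory fraction, and the example model sizes are assumptions for illustration, not Apple-published numbers.

```python
MEMORY_GB = 192          # M2 Ultra unified memory
BANDWIDTH_GBS = 800      # M2 Ultra memory bandwidth
USABLE_FRACTION = 0.75   # headroom for OS, KV cache, activations (assumption)

# Approximate bytes per parameter for common quantizations (assumption)
QUANT_BYTES = {"fp16": 2.0, "q8_0": 1.0, "q4_k_m": 0.57}

def fits(params_b: float, quant: str) -> bool:
    """True if the quantized weights fit in the usable unified memory."""
    size_gb = params_b * QUANT_BYTES[quant]
    return size_gb <= MEMORY_GB * USABLE_FRACTION

def est_tok_s(params_b: float, quant: str) -> float:
    """Bandwidth-bound ceiling: assumes every weight is read once per token."""
    size_gb = params_b * QUANT_BYTES[quant]
    return BANDWIDTH_GBS / size_gb

# Hypothetical examples: parameter counts in billions
for model, params in [("70B dense", 70.0), ("141B dense", 141.0), ("8B dense", 8.0)]:
    for q in ("fp16", "q4_k_m"):
        print(f"{model} {q}: fits={fits(params, q)}, "
              f"ceiling ~{est_tok_s(params, q):.0f} tok/s")
```

Real throughput lands well below the ceiling (attention, KV-cache reads, and prompt processing all cost extra), but the ratio usefully ranks models: a 70B model at fp16 fits comfortably in 192GB, while the same model needs quantization on most other consumer hardware.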

See the full Apple M2 Ultra (192GB) specs page for detailed specifications, a list of compatible models, and speed estimates. Current price: $5499.