Running LLMs on Apple M1 Ultra (128GB): Complete Guide

The Apple M1 Ultra (128GB) offers 128 GB of unified memory, shared between CPU and GPU, with 800 GB/s of memory bandwidth. Because the GPU can address most of that pool directly, it is one of the most capable consumer machines for running large local AI models. This guide covers which LLMs fit in memory, expected tok/s performance, and recommended settings.
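As a rough back-of-the-envelope check, a quantized model's weight footprint is roughly parameter count times bits per weight, and decode speed on memory-bound hardware is bounded by how fast the weights can be streamed from memory each token. The sketch below illustrates this with assumed numbers (4.5 effective bits/weight for a typical 4-bit quantization, and the M1 Ultra's 800 GB/s bandwidth); real throughput will be lower once KV cache, activations, and compute overhead are included.

```python
# Rough, memory-bandwidth-based estimates for local LLM inference.
# Assumptions (not measured figures): 4.5 effective bits/weight for a
# 4-bit quant, and ideal streaming of all weights once per token.

def model_size_gb(params_billions: float, bits_per_weight: float) -> float:
    """Approximate weight footprint in GB (weights only, no KV cache)."""
    return params_billions * bits_per_weight / 8

def max_tokens_per_sec(bandwidth_gb_s: float, size_gb: float) -> float:
    """Upper bound on decode speed: bandwidth / bytes read per token."""
    return bandwidth_gb_s / size_gb

if __name__ == "__main__":
    size = model_size_gb(70, 4.5)            # ~39.4 GB for a 70B 4-bit model
    speed = max_tokens_per_sec(800, size)    # ~20 tok/s ceiling on M1 Ultra
    print(f"~{size:.1f} GB weights, <= ~{speed:.0f} tok/s decode ceiling")
```

By this estimate a 4-bit 70B model occupies around 40 GB and tops out near 20 tok/s on 800 GB/s hardware, which is why the 128 GB pool comfortably fits such models with room left for context; treat these as ceilings, not promises.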

See the full Apple M1 Ultra (128GB) specs page for complete specifications, the list of all compatible models, and per-model speed estimates. Current price: $4999.