▸ DEVICE UNDER TEST
NVIDIA H100 SXM5 96 GB (96 GB HBM3 VRAM).
▸ H100 SXM5 96 GB SPEC
- BRAND: NVIDIA
- VRAM: 96 GB HBM3
- BANDWIDTH: 3360 GB/s
- FP16 COMPUTE: 267.6 TFLOPS
- FP32 COMPUTE: 66.9 TFLOPS
- CUDA CORES: 16,896
- TENSOR CORES: 528
- TDP: 700 W
- ARCHITECTURE: Hopper
- MSRP: $25,000
▸ AI CAPABILITY
297 / 331 models @ Q4
With 96 GB of VRAM and 3360 GB/s of memory bandwidth, this GPU handles models up to roughly 132B parameters at Q4 quantization.
Decoding is memory-bound, so speed ≈ bandwidth / model_size_in_memory × efficiency. A 7B model at Q4 (~4.3 GB) runs at ~384 tok/s.
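The rule of thumb above can be sketched as a small estimator. This is a hedged sketch, not the site's actual methodology: the ~0.615 bytes/parameter figure for Q4 is inferred from the table below (e.g. 132B → ~81.2 GB), and the 0.5 bandwidth-efficiency factor is an assumed value chosen to land near the quoted ~384 tok/s.

```python
# Rough single-GPU fit/speed estimator for Q4-quantized models.
# Assumptions (not from the spec sheet): Q4 stores ~0.615 bytes per
# parameter, and decoding achieves ~50% of theoretical bandwidth.

VRAM_GB = 96.0               # H100 SXM5 96 GB
BANDWIDTH_GBS = 3360.0       # HBM3 memory bandwidth
Q4_BYTES_PER_PARAM = 0.615   # inferred from the VRAM Q4 column
EFFICIENCY = 0.5             # assumed fraction of peak bandwidth

def q4_vram_gb(params_b: float) -> float:
    """Approximate VRAM (GB) for a Q4 model of `params_b` billion params."""
    return params_b * Q4_BYTES_PER_PARAM

def fits(params_b: float) -> bool:
    """Does the Q4-quantized model fit in this GPU's VRAM?"""
    return q4_vram_gb(params_b) <= VRAM_GB

def decode_tok_s(params_b: float) -> float:
    """Speed ≈ bandwidth / model_size × efficiency (memory-bound decode)."""
    return BANDWIDTH_GBS / q4_vram_gb(params_b) * EFFICIENCY

print(fits(132))               # 132B at Q4 ≈ 81.2 GB, fits in 96 GB
print(round(decode_tok_s(7)))  # 7B at Q4 ≈ 4.3 GB
```

Note this only bounds steady-state generation speed; it ignores KV-cache growth with context length and any compute-bound prefill phase.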
§ 01 TOP MODELS FOR H100 SXM5 96 GB
297 FIT · SHOWING 20
| MODEL | SIZE | VRAM @ Q4 | TOK/S | AVG SCORE |
|---|---|---|---|---|
| DBRX 132B | 132B | 81.2 GB | 75 | 46.3 |
| Qwen3.5-122B-A10B | 125.1B | 77.0 GB | 21 | 45.5 |
| Pixtral Large 124B | 124B | 76.3 GB | 22 | 39.3 |
| Mistral-Large 123B | 123B | 75.7 GB | 22 | 33.5 |
| Devstral 2 123B | 123B | 75.7 GB | 22 | 38.1 |
| Qwen 3.5 122B A10B | 122B | 75.1 GB | 269 | 56.8 |
| Nemotron 3 Super 120B | 120B | 73.8 GB | 224 | 57.3 |
| Nemotron 3 Super 120B-A12B | 120B | 73.8 GB | 224 | 53.2 |
| Mistral Small 4 119B | 119B | 73.2 GB | 414 | 50.2 |
| GPT-OSS 120B | 117B | 72.0 GB | 527 | 54.1 |
| Command A 111B | 111B | 68.3 GB | 24 | 27.6 |
| GLM 4.5 Air | 110B | 67.7 GB | 224 | 51.0 |
| Qwen 1.5 110B | 110B | 67.7 GB | 24 | 33.4 |
| Llama 4 Scout 17B-16E | 109B | 67.1 GB | 158 | 33.9 |
| Sarvam 105B | 105B | 64.7 GB | 26 | 48.0 |
| Command-R+ 104B | 104B | 64.1 GB | 26 | 52.7 |
| Llama-3.2-90B-Vision-Instruct | 90B | 55.5 GB | 30 | 48.5 |
| Hunyuan A13B | 80B | 49.4 GB | 207 | 81.1 |
| Qwen3-Coder-Next | 80B | 49.4 GB | 896 | 43.0 |
| Qwen2.5-72B | 72.7B | 44.9 GB | 37 | 39.7 |