The A100 PCIe 80GB brings NVIDIA's Ampere architecture to a standard PCIe form factor with 80 GB of HBM2e memory, enabling enterprise deployment without specialized SXM baseboard infrastructure.
VRAM: 80 GB
Memory: HBM2e
Bandwidth: 1,935 GB/s
TDP: 300 W
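For single-stream generation, memory bandwidth is often the binding limit: each decoded token streams roughly the full set of quantized weights from HBM, so bandwidth divided by model size gives an upper bound on tokens per second. The sketch below is a back-of-envelope estimate using the 1,935 GB/s figure above, not a benchmark; real throughput also depends on batch size, KV cache traffic, and kernel efficiency.

```python
# Back-of-envelope, bandwidth-bound decode estimate for an A100 PCIe 80GB.
# Assumes each generated token reads the full weight set once from HBM;
# this is an upper bound, not a measured benchmark.

BANDWIDTH_GB_S = 1935  # A100 PCIe 80GB memory bandwidth


def max_tokens_per_second(params_billions: float, bytes_per_param: float) -> float:
    """Upper bound on single-stream decode speed, in tokens/s."""
    model_gb = params_billions * bytes_per_param  # weight bytes moved per token
    return BANDWIDTH_GB_S / model_gb


# Example: a 70B-parameter model quantized to INT8 (1 byte per parameter).
print(round(max_tokens_per_second(70, 1.0), 1))  # ~27.6 tokens/s upper bound
```

At FP16 (2 bytes per parameter) the same model halves that bound, which is one reason quantization matters on a single card.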
Large Language Models: Training and inference for models like GPT-4 and Llama 70B+
Distributed Training: Multi-node training with fast interconnects
Enterprise Deployment: Designed for 24/7 datacenter operations
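A quick way to see why large-model training is distributed is to count per-parameter state. A hedged sketch, using the common mixed-precision Adam accounting (FP16 weights and gradients plus FP32 master weights and two optimizer moments, about 16 bytes per parameter, before activations); the breakdown is an assumption for illustration, not a vendor figure.

```python
# Rough minimum GPU count to hold training state for a model, assuming
# the ~16 bytes/param mixed-precision Adam breakdown below and perfect
# sharding of states across 80 GB cards. Activations are not counted.
import math

VRAM_GB = 80
BYTES_PER_PARAM = 2 + 2 + 4 + 4 + 4  # fp16 weights, fp16 grads,
                                     # fp32 weights, fp32 Adam m and v


def min_gpus_for_states(params_billions: float) -> int:
    """Minimum number of 80 GB GPUs needed just for model/optimizer state."""
    total_gb = params_billions * BYTES_PER_PARAM
    return math.ceil(total_gb / VRAM_GB)


# Example: a 70B-parameter model needs ~1,120 GB of state.
print(min_gpus_for_states(70))  # 14 GPUs minimum
```

Even before activations, a 70B model's training state exceeds a dozen cards, which is why the multi-node use case above pairs the card with fast interconnects.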
Estimates based on INT8 quantization. Actual fit depends on framework and batch size.
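The fit estimate described in the note above can be sketched as a simple check: weights at the chosen precision plus an overhead fraction for activations, KV cache, and framework buffers. The 10% overhead default is an assumption for illustration, not a measured figure.

```python
# Rough VRAM-fit check for an 80 GB card, mirroring the INT8 note above.
# The overhead fraction (activations, KV cache, framework buffers) is an
# assumed placeholder; actual fit depends on framework and batch size.

VRAM_GB = 80


def fits_in_vram(params_billions: float,
                 bytes_per_param: float = 1.0,  # 1.0 = INT8 quantization
                 overhead: float = 0.1) -> bool:
    """True if estimated footprint (weights * (1 + overhead)) fits in 80 GB."""
    footprint_gb = params_billions * bytes_per_param * (1 + overhead)
    return footprint_gb <= VRAM_GB


print(fits_in_vram(70))                     # 70B at INT8: ~77 GB -> True
print(fits_in_vram(70, bytes_per_param=2))  # 70B at FP16: ~154 GB -> False
```

This is why the estimates on this page assume INT8: a 70B model fits on one card quantized, but not at FP16.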
Added Jan 25, 2026
Last updated: Jan 25, 2026