GPU Cloud Pricing

NVIDIA H100

GPU Specifications

VRAM: 80 GB HBM3
Architecture: Hopper
Memory Bandwidth: 3,350 GB/s
TDP: 700 W

Price per GPU/hr (cheapest region): GCP $1.18 · AWS $6.88

Cheapest On-Demand: $1.18 (GCP · us-west1)
Cheapest Spot: $0.20 (GCP · europe-north1)
Max Spot Savings: 84% vs on-demand
Instance Configs: 124 across 2 providers
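The savings figure above is simple arithmetic over the listed prices. A minimal sketch of the calculation, with prices hardcoded from this page rather than fetched live (the exact headline percentage depends on which region's on-demand price is used as the baseline):

```python
def spot_savings(on_demand: float, spot: float) -> float:
    """Return the spot discount vs on-demand as a percentage."""
    return (1 - spot / on_demand) * 100

# a3-highgpu-1g cheapest-region prices from the table below
print(f"{spot_savings(1.183, 0.202):.1f}%")  # → 82.9%
```

Comparing the cheapest spot and on-demand prices row by row gives roughly 83%; the 84% maximum reported above presumably pairs spot against the on-demand rate of its own (pricier) region.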

All Listings

Provider  Instance              GPU                  vCPU  RAM      On-Demand ($/hr)  Spot ($/hr)  Regions
GCP       a3-highgpu-1g         1× NVIDIA H100 80GB  26    234 GB   $1.183            $0.202       17
GCP       a3-highgpu-2g         2× NVIDIA H100 80GB  52    468 GB   $2.365            $0.404       17
GCP       a3-highgpu-4g         4× NVIDIA H100 80GB  104   936 GB   $4.730            $0.809       17
AWS       p5.4xlarge            1× H100              16    256 GB   $6.880            $5.042       4
GCP       a3-edgegpu-8g         8× NVIDIA H100 80GB  208   1872 GB  $9.460            $1.676       14
GCP       a3-highgpu-8g         8× NVIDIA H100 80GB  208   1872 GB  $9.460            $1.618       16
GCP       a3-edgegpu-8g-nolssd  8× NVIDIA H100 80GB  208   1872 GB  $9.460            $1.676       14
GCP       a3-megagpu-8g         8× NVIDIA H100 80GB  208   1872 GB  $9.460            $1.618       13
AWS       p5.48xlarge           8× H100              192   2048 GB  $55.040           $9.372       12
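Listed prices are per instance per hour, so multi-GPU instances must be normalized to a per-GPU rate before comparing providers. A small sketch using a few rows hardcoded from the table above (not a live price feed):

```python
# (provider, instance, GPU count, on-demand $/hr, spot $/hr) from the table above
listings = [
    ("GCP", "a3-highgpu-1g", 1, 1.183, 0.202),
    ("GCP", "a3-highgpu-8g", 8, 9.460, 1.618),
    ("AWS", "p5.4xlarge",    1, 6.880, 5.042),
    ("AWS", "p5.48xlarge",   8, 55.040, 9.372),
]

# Normalize to $/GPU/hr so single- and 8-GPU instances compare directly
per_gpu = {name: (od / n, spot / n) for _, name, n, od, spot in listings}

for name, (od, spot) in per_gpu.items():
    print(f"{name}: ${od:.2f}/GPU/hr on-demand, ${spot:.2f}/GPU/hr spot")
```

Per GPU, the 1-GPU and 8-GPU GCP instances both land at $1.18/hr on-demand, and p5.48xlarge at $6.88/hr, matching the summary cards above.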

Run a free assessment to identify overprovisioned workloads, idle capacity, and your potential savings in minutes.

Most clusters are overprovisioned.
Let's prove yours is.