NVIDIA H100 GPU Pricing
Flagship NVIDIA H100 Tensor Core GPU with 80 GB HBM3 for exascale HPC, large-language-model training and low-latency AI inference.
You can rent the NVIDIA H100 GPU for as little as $1.90/hr through Sesterce Cloud.
Compare NVIDIA H100 GPU rental prices across all available cloud providers
| Provider | GPU Count | Total VRAM | vCPUs | System RAM | Storage | Price/GPU/hr | Total/hr | Action |
|---|---|---|---|---|---|---|---|---|
| Sesterce Cloud | 8 | 640 GB | 224 | 1,536 GB | 7,200 GB boot (SSD) | $1.90 | $15.20 | Best Deal |
| Hyperstack | 1 | 80 GB | 28 | 180 GB | — | $1.90 | $1.90 | View |
| Sesterce Cloud | 8 | 640 GB | 128 | 1,536 GB | 15,200 GB boot (SSD) | $1.97 | $15.77 | View |
| Oblivus | 1 | 80 GB | 28 | 180 GB | 850 GB boot (NVMe) | $1.98 | $1.98 | View |
| Oblivus | 2 | 160 GB | 60 | 360 GB | 1,600 GB boot (NVMe) | $1.98 | $3.96 | View |
| Oblivus | 4 | 320 GB | 124 | 720 GB | 3,300 GB boot (NVMe) | $1.98 | $7.92 | View |
| Oblivus | 8 | 640 GB | 252 | 1,440 GB | 6,600 GB boot (NVMe) | $1.98 | $15.84 | View |
| Sesterce Cloud | 8 | 640 GB | 252 | 1,440 GB | 6,600 GB boot (SSD) | $2.15 | $17.16 | View |
| Sesterce Cloud | 8 | 640 GB | 104 | 1,024 GB | 18,000 GB boot (SSD) | $2.19 | $17.51 | View |
| DataCrunch.io | 4 | 320 GB | 176 | 740 GB | — | $2.19 | $8.76 | View |
| DataCrunch.io | 8 | 640 GB | 176 | 1,480 GB | — | $2.19 | $17.52 | View |
| DataCrunch.io | 1 | 80 GB | 30 | 120 GB | — | $2.19 | $2.19 | View |
| DataCrunch.io | 1 | 80 GB | 32 | 185 GB | — | $2.19 | $2.19 | View |
| DataCrunch.io | 2 | 160 GB | 80 | 370 GB | — | $2.19 | $4.38 | View |
| Sesterce Cloud | 8 | 640 GB | 126 | 1,800 GB | 10,000 GB boot (SSD) | $2.66 | $21.30 | View |
| Sesterce Cloud | 8 | 640 GB | 104 | 1,024 GB | 18,000 GB boot (SSD) | $2.74 | $21.91 | View |
| RunPod 💰 $500 Credit | 1 | 80 GB | 24 | 251 GB | — | $2.99 | $2.99 | View |
| RunPod 💰 $500 Credit | 2 | 160 GB | 24 | 251 GB | — | $2.99 | $5.98 | View |
| RunPod 💰 $500 Credit | 3 | 240 GB | 24 | 251 GB | — | $2.99 | $8.97 | View |
| RunPod 💰 $500 Credit | 4 | 320 GB | 24 | 251 GB | — | $2.99 | $11.96 | View |
| RunPod 💰 $500 Credit | 5 | 400 GB | 24 | 251 GB | — | $2.99 | $14.95 | View |
| RunPod 💰 $500 Credit | 6 | 480 GB | 24 | 251 GB | — | $2.99 | $17.94 | View |
| RunPod 💰 $500 Credit | 7 | 560 GB | 24 | 251 GB | — | $2.99 | $20.93 | View |
| RunPod 💰 $500 Credit | 8 | 640 GB | 24 | 251 GB | — | $2.99 | $23.92 | View |
| DigitalOcean 💰 $200 Credit | 8 | 640 GB | 160 | 1,920 GB | 2,046 GB boot (NVMe) + 40.96 TB scratch (NVMe) | $2.99 | $23.92 | View |
| Sesterce Cloud | 8 | 640 GB | 208 | 1,800 GB | 24,780 GB boot (SSD) | $3.29 | $26.31 | View |
| DigitalOcean 💰 $200 Credit | 1 | 80 GB | 20 | 240 GB | 720 GB boot (NVMe) + 5.12 TB scratch (NVMe) | $3.39 | $3.39 | View |
| DigitalOcean 💰 $200 Credit | 8 | 640 GB | 160 | 1,920 GB | 200 GB boot (NVMe) + 40.96 TB scratch (NVMe) | $5.95 | $47.60 | View |
| Sesterce Cloud | 8 | 640 GB | 128 | 1,760 GB | 2,000 GB boot (SSD) | $6.57 | $52.54 | View |
| DigitalOcean 💰 $200 Credit | 1 | 80 GB | 20 | 240 GB | 200 GB boot (NVMe) + 5.12 TB scratch (NVMe) | $6.74 | $6.74 | View |
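The per-GPU rate, the instance size, and your expected runtime are all you need to budget a job. Below is a minimal cost sketch using rates from the table above; the 24-hour job length is an arbitrary example, not a benchmark.

```python
# Rough cost sketch: total hourly and job cost from a per-GPU rate.
# Rates and node sizes come from the table above; the 24-hour job
# length is an arbitrary illustration, not a benchmark.

def job_cost(price_per_gpu_hr: float, gpu_count: int, hours: float) -> float:
    """Cost of renting gpu_count GPUs for hours at a per-GPU hourly rate."""
    return price_per_gpu_hr * gpu_count * hours

# 8x H100 node at $1.90/GPU/hr (cheapest 8-GPU listing above)
print(job_cost(1.90, 8, 1))    # ~15.20, matching the Total/hr column
print(job_cost(1.90, 8, 24))   # ~364.80 for a 24-hour run

# Same 24-hour run on an 8x H100 listing at $2.99/GPU/hr
print(job_cost(2.99, 8, 24))   # ~574.08, before any signup credit
```

At current rates the spread between the cheapest and most expensive 8-GPU listings is roughly 3.5x per hour, so it is worth re-checking the table before committing to a long run.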
About GPUs.io - Your Cloud GPU Price Comparison Platform
GPUs.io is a free cloud GPU price comparison website designed for AI, machine learning, and deep learning practitioners. We aggregate real-time pricing data from multiple cloud providers to help you easily compare and find cost-effective GPU options for your projects.
What You'll Find Here
- Real-time GPU pricing data from multiple cloud providers
- Detailed specifications for GPU models (NVIDIA A100, H100, RTX 4090, AMD MI300X, etc.)
- Side-by-side cost comparisons for AI/ML workloads
- Both spot and on-demand instance pricing information
- Regional availability and performance data
How It Works
- Browse pricing from several cloud providers in one place
- Pricing data is updated automatically every few hours
- Filter results by GPU type, memory, and budget constraints (see the sketch after this list)
- Compare options without visiting multiple provider websites
- Transparent pricing data sourced directly from providers
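To make the filtering step concrete, here is a minimal sketch of how offers like those in the table above can be filtered and sorted by price. The `Offer` record and the thresholds are made up for illustration; this is not the site's actual code or any provider's API.

```python
# Minimal sketch of filtering and sorting GPU offers by constraints.
# The Offer values below are copied from the pricing table for
# illustration only; real data would come from provider feeds.
from dataclasses import dataclass

@dataclass
class Offer:
    provider: str
    gpu_count: int
    vram_gb: int              # total VRAM across the instance
    price_per_gpu_hr: float   # USD per GPU per hour

offers = [
    Offer("Sesterce Cloud", 8, 640, 1.90),
    Offer("Hyperstack", 1, 80, 1.90),
    Offer("Oblivus", 4, 320, 1.98),
    Offer("DataCrunch.io", 2, 160, 2.19),
    Offer("RunPod", 8, 640, 2.99),
]

# Example constraint: at least 320 GB total VRAM, at most $2.50/GPU/hr.
matches = [o for o in offers if o.vram_gb >= 320 and o.price_per_gpu_hr <= 2.50]
for o in sorted(matches, key=lambda o: o.price_per_gpu_hr):
    print(f"{o.provider}: {o.gpu_count}x H100 at ${o.price_per_gpu_hr:.2f}/GPU/hr")
```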
For AI/ML Engineers and Researchers
Whether you're training large language models, running computer vision experiments, or deploying AI applications, GPUs.io provides a simple way to compare your options across major providers like AWS, Google Cloud, Azure, and specialized GPU cloud services. No more checking multiple websites to find the best rates.
Important Notice
- Prices shown are approximate and may vary based on region, demand, and promotional offers.
- Availability can change rapidly. Always check with the provider for real-time availability.
- Spot/preemptible instance pricing is subject to market conditions, and instances may be terminated with short notice.
- Signup credits may or may not be available at any given time or in a given geographical location. Each provider has its own terms and conditions.