NVIDIA L40S GPU Pricing
The NVIDIA L40S is a 48 GB Ada Lovelace GPU tuned for generative-AI inference and graphics workloads in datacenter environments.
You can rent the NVIDIA L40S for as little as $0.86/hr through RunPod.
Compare NVIDIA L40S GPU rental prices across all available cloud providers
Provider | GPUs | Total VRAM | vCPUs | System RAM | Storage | Price/GPU/hr | Total/hr
---|---|---|---|---|---|---|---
RunPod ($500 signup credit) | 1 | 48 GB | 28 | 125 GB | — | $0.86 | $0.86
DataCrunch.io | 1 | 48 GB | 20 | 60 GB | — | $0.91 | $0.91
DataCrunch.io | 8 | 384 GB | 160 | 480 GB | — | $0.91 | $7.31
DataCrunch.io | 2 | 96 GB | 40 | 120 GB | — | $0.92 | $1.83
DataCrunch.io | 4 | 192 GB | 80 | 240 GB | — | $0.92 | $3.66
Hyperstack | 1 | 48 GB | 28 | 58 GB | — | $1.00 | $1.00
Vultr | 1 | 48 GB | 16 | 180 GB | 1,200 GB NVMe boot disk | $1.56 | $1.56
Vultr | 2 | 96 GB | 32 | 375 GB | 2,200 GB NVMe boot disk | $1.56 | $3.12
Vultr | 4 | 192 GB | 64 | 750 GB | 2,600 GB NVMe boot disk | $1.56 | $6.24
Vultr | 8 | 384 GB | 128 | 1,500 GB | 3,400 GB NVMe boot disk | $1.56 | $12.48
DigitalOcean ($200 signup credit) | 1 | 48 GB | 8 | 64 GB | 500 GB NVMe boot disk | $1.57 | $1.57
Sesterce Cloud | 10 | 480 GB | 80 | 1,470 GB | 128 GB SSD boot disk | $1.59 | $15.95
Sesterce Cloud | 8 | 384 GB | 64 | 768 GB | 13,300 GB SSD boot disk | $1.70 | $13.63
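A quick way to sanity-check the totals above: the total hourly rate is the per-GPU price multiplied by the GPU count, and multiplying that by roughly 730 hours gives a continuous-use monthly estimate. Below is a minimal Python sketch; the $1.56 per GPU/hr and 8-GPU figures come from the Vultr row above, while the 730-hour month is an assumption for illustration.

```python
# Rough cost estimate from the table above: total hourly cost is
# price-per-GPU-hour times GPU count; a monthly figure assumes
# ~730 hours of continuous (24/7) use. Values are illustrative.

def total_per_hour(price_per_gpu_hr: float, gpu_count: int) -> float:
    """Total hourly cost for a multi-GPU instance."""
    return price_per_gpu_hr * gpu_count

def monthly_estimate(price_per_gpu_hr: float, gpu_count: int, hours: float = 730) -> float:
    """Approximate monthly cost assuming continuous usage."""
    return total_per_hour(price_per_gpu_hr, gpu_count) * hours

if __name__ == "__main__":
    # Example: Vultr 8x L40S at $1.56 per GPU/hr (from the table above)
    print(f"Hourly:  ${total_per_hour(1.56, 8):.2f}")      # $12.48
    print(f"Monthly: ${monthly_estimate(1.56, 8):,.2f}")   # ~$9,110.40
```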
Memory: 48 GB VRAM
About GPUs.io - Your Cloud GPU Price Comparison Platform
GPUs.io is a free cloud GPU price comparison website designed for AI, machine learning, and deep learning practitioners. We aggregate real-time pricing data from multiple cloud providers to help you easily compare and find cost-effective GPU options for your projects.
What You'll Find Here
- Real-time GPU pricing data from multiple cloud providers
- Detailed specifications for GPU models (NVIDIA A100, H100, RTX 4090, AMD MI300X, etc.)
- Side-by-side cost comparisons for AI/ML workloads
- Both spot and on-demand instance pricing information
- Regional availability and performance data
How It Works
- Browse pricing from several cloud providers in one place
- Pricing data is updated automatically every few hours
- Filter results by GPU type, memory, and budget constraints (a minimal sketch of this kind of filtering follows this list)
- Compare options without visiting multiple provider websites
- Transparent pricing data sourced directly from providers
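As a rough illustration of the filtering and comparison described above, here is a minimal Python sketch over a few of the L40S offers from the table. The `Offer` dataclass and the hard-coded values are simplifications for illustration only; GPUs.io does not expose this as an API, and the totals here are recomputed from the per-GPU price rather than taken from provider quotes.

```python
from dataclasses import dataclass

@dataclass
class Offer:
    provider: str
    gpus: int
    vram_gb: int            # total VRAM across all GPUs
    price_per_gpu_hr: float

    @property
    def total_per_hr(self) -> float:
        return self.gpus * self.price_per_gpu_hr

# A few L40S offers from the table above (simplified).
offers = [
    Offer("RunPod", 1, 48, 0.86),
    Offer("DataCrunch.io", 8, 384, 0.91),
    Offer("Vultr", 4, 192, 1.56),
    Offer("Sesterce Cloud", 10, 480, 1.59),
]

# Filter by total VRAM and hourly budget, then sort by per-GPU price.
min_vram, max_total_per_hr = 192, 10.00
matches = sorted(
    (o for o in offers if o.vram_gb >= min_vram and o.total_per_hr <= max_total_per_hr),
    key=lambda o: o.price_per_gpu_hr,
)
for o in matches:
    print(f"{o.provider}: {o.gpus}x L40S, ${o.price_per_gpu_hr:.2f}/GPU/hr, ${o.total_per_hr:.2f}/hr")
```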
For AI/ML Engineers and Researchers
Whether you're training large language models, running computer vision experiments, or deploying AI applications, GPUs.io provides a simple way to compare your options across major providers like AWS, Google Cloud, Azure, and specialized GPU cloud services. No more checking multiple websites to find the best rates.
Important Notice
- Prices shown are approximate and may vary based on region, demand, and promotional offers.
- Availability can change rapidly; always check with the provider for real-time availability.
- Spot/preemptible instance pricing is subject to market conditions, and instances may be terminated on short notice.
- Signup credits may not be available at all times or in all geographic locations; each provider has its own terms and conditions.