
H100 PCIE vs H200 NVL price comparison

Compare H100 PCIE vs H200 NVL price, VRAM, and provider coverage so you can see which GPU is cheaper to rent today and how the spread has moved over time.

H100 PCIE vs H200 NVL: how to compare cost in context

If you are choosing between H100 PCIE and H200 NVL, hourly price is only part of the trade-off. This page lines up specs, shared providers, and current market pricing so you can compare both cost and coverage.

Use the specs table to understand the memory difference, the historical chart to see how each market has moved, and the provider table to check whether one GPU consistently carries a premium on the same cloud.

Cheapest provider right now

H100 PCIE vs H200 NVL: cheapest market entry

H100 PCIE is currently cheaper to enter, starting at $1.25/hr on RunPod, while H200 NVL starts at $2.79/hr on Vast.ai. Only one provider lists both GPUs, and H100 PCIE is the cheaper of the two there.
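The comparison above reduces to two questions: the lowest listed rate for each GPU across all providers, and which GPU wins head-to-head on providers carrying both. A minimal sketch of that logic, using the two prices quoted on this page plus one hypothetical H100 PCIE listing on Vast.ai so the shared-provider set is non-empty:

```python
# The RunPod and first Vast.ai rates come from this page; the $2.40
# H100 PCIE rate on Vast.ai is an illustrative placeholder, not a live quote.
listings = [
    {"gpu": "H100 PCIE", "provider": "RunPod",  "usd_hr": 1.25},
    {"gpu": "H100 PCIE", "provider": "Vast.ai", "usd_hr": 2.40},  # hypothetical
    {"gpu": "H200 NVL",  "provider": "Vast.ai", "usd_hr": 2.79},
]

def cheapest_entry(listings, gpu):
    """Lowest hourly rate across all providers for one GPU model."""
    rows = [r for r in listings if r["gpu"] == gpu]
    return min(rows, key=lambda r: r["usd_hr"])

def shared_provider_wins(listings, gpu_a, gpu_b):
    """On providers carrying both GPUs, count where each one is cheaper.

    Ties (equal prices) are counted toward gpu_b here for simplicity.
    """
    price = {(r["gpu"], r["provider"]): r["usd_hr"] for r in listings}
    shared = ({p for g, p in price if g == gpu_a}
              & {p for g, p in price if g == gpu_b})
    wins_a = sum(1 for p in shared if price[(gpu_a, p)] < price[(gpu_b, p)])
    return len(shared), wins_a, len(shared) - wins_a

entry = cheapest_entry(listings, "H100 PCIE")
n_shared, a_wins, b_wins = shared_provider_wins(listings, "H100 PCIE", "H200 NVL")
```

With the data above, `entry` points at the $1.25/hr RunPod row and the shared-provider count is one, matching the summary on this page.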

Methodology and freshness

How the side-by-side comparison works

We compare the latest per-GPU hourly pricing we have for both models, prefer on-demand rows when available, and keep provider histories separate so you can see whether a gap is structural or just a short-lived market move.
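The selection rule described above (newest snapshot per GPU and provider, preferring on-demand rows when available) can be sketched as a single sort key. The row shapes and field names below are assumptions for illustration, not the actual collector schema:

```python
from datetime import date

# Hypothetical snapshot rows; assume the collectors store one row per
# (gpu, provider, pricing type) per run.
snapshots = [
    {"gpu": "H100 PCIE", "provider": "Vast.ai", "type": "spot",
     "usd_hr": 1.90, "seen": date(2026, 5, 1)},
    {"gpu": "H100 PCIE", "provider": "Vast.ai", "type": "on-demand",
     "usd_hr": 2.40, "seen": date(2026, 5, 1)},
    {"gpu": "H100 PCIE", "provider": "Vast.ai", "type": "on-demand",
     "usd_hr": 2.55, "seen": date(2026, 4, 30)},
]

def latest_row(snapshots, gpu, provider):
    """Newest snapshot for one GPU on one provider, preferring on-demand.

    The tuple key sorts by date first, so the most recent day always wins;
    within the same day, on-demand rows (True > False) beat other types.
    """
    rows = [r for r in snapshots if r["gpu"] == gpu and r["provider"] == provider]
    return max(rows, key=lambda r: (r["seen"], r["type"] == "on-demand"))
```

Keeping each provider's history separate (rather than pooling all rows) is what lets the chart distinguish a structural gap from a one-off price move on a single cloud.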

H100 PCIE vs H200 NVL pricing FAQ

Which is cheaper right now: H100 PCIE or H200 NVL?

H100 PCIE is currently cheaper to enter, starting at $1.25/hr on RunPod, while H200 NVL starts at $2.79/hr on Vast.ai. Only one provider lists both GPUs, and H100 PCIE is the cheaper of the two there.

How much VRAM do H100 PCIE and H200 NVL have?

H100 PCIE is tracked with 80GB of HBM3, while H200 NVL is tracked with 141GB of HBM3e.

Which providers currently carry both H100 PCIE and H200 NVL?

We currently see shared coverage on Vast.ai.

How fresh is the H100 PCIE vs H200 NVL pricing data?

The comparison uses the latest stored snapshot for each GPU and provider. The newest row visible on this page is from May 1, 2026, and collectors run daily.

Spec          H100 PCIE         H200 NVL
VRAM          80 GB             141 GB
Memory Type   HBM3              HBM3e
Generation    Hopper            Hopper
Tier          High Performance  Flagship
Best Price    $1.25/hr          $2.79/hr
Providers

Historical H100 PCIE vs H200 NVL price trend

Price by provider: H100 PCIE vs H200 NVL

Provider H100 PCIE H200 NVL Difference

More GPU comparisons