
H100 PCIE vs H200 price comparison

Compare H100 PCIE vs H200 price, VRAM, and provider coverage so you can see which GPU is cheaper to rent today and how the spread has moved over time.

H100 PCIE vs H200: how to compare cost in context

If you are choosing between H100 PCIE and H200, hourly price is only part of the trade-off. This page lines up specs, shared providers, and current market pricing so you can compare both cost and coverage.

Use the specs table to understand the memory difference, the historical chart to see how each market has moved, and the provider table to check whether one GPU consistently carries a premium on the same cloud.

Cheapest provider right now

H100 PCIE vs H200: cheapest market entry

H100 PCIE is currently cheaper to enter, starting at $1.25/hr on RunPod, while H200 starts at $2.29/hr on Lambda. Across the 3 providers that list both GPUs, H100 PCIE is cheaper on 2 providers and H200 is cheaper on 1 provider.
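As a quick sanity check on the gap, the entry-price spread quoted above can be computed directly. This is a minimal sketch using the floors shown on this page (a May 2026 snapshot); the numbers will drift as collectors update.

```python
# Entry-level hourly floors quoted on this page (May 2026 snapshot).
h100_pcie_floor = 1.25  # $/hr, cheapest H100 PCIE listing (RunPod)
h200_floor = 2.29       # $/hr, cheapest H200 listing (Lambda)

spread = h200_floor - h100_pcie_floor
premium = spread / h100_pcie_floor  # relative premium over the H100 PCIE floor

print(f"spread: ${spread:.2f}/hr, premium: {premium:.0%}")
# → spread: $1.04/hr, premium: 83%
```

At today's floors, the H200 entry point costs roughly 83% more per hour than the H100 PCIE entry point.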

Methodology and freshness

How the side-by-side comparison works

We compare the latest per-GPU hourly pricing we have for both models, prefer on-demand rows when available, and keep provider histories separate so you can see whether a gap is structural or just a short-lived market move.
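The selection logic described above can be sketched as follows. The snapshot rows and field names here are assumptions for illustration, not the site's actual schema: for each GPU and provider, prefer on-demand rows when any exist, then take the most recent one.

```python
from datetime import datetime

# Hypothetical stored snapshot rows (field names are assumptions).
rows = [
    {"gpu": "H100 PCIE", "provider": "RunPod", "type": "on-demand",
     "price": 1.25, "seen": datetime(2026, 5, 2)},
    {"gpu": "H100 PCIE", "provider": "RunPod", "type": "spot",
     "price": 0.99, "seen": datetime(2026, 5, 2)},
    {"gpu": "H100 PCIE", "provider": "RunPod", "type": "on-demand",
     "price": 1.40, "seen": datetime(2026, 4, 1)},
]

def latest_price(rows, gpu, provider):
    """Latest per-GPU hourly price, preferring on-demand rows when available."""
    candidates = [r for r in rows if r["gpu"] == gpu and r["provider"] == provider]
    on_demand = [r for r in candidates if r["type"] == "on-demand"]
    pool = on_demand or candidates  # fall back to any row type if no on-demand
    return max(pool, key=lambda r: r["seen"])["price"]

print(latest_price(rows, "H100 PCIE", "RunPod"))  # → 1.25
```

Keeping each provider's history separate, as the page does, means a gap you see on one cloud reflects that provider's own pricing over time rather than a mix of markets.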

H100 PCIE vs H200 pricing FAQ

Which is cheaper right now: H100 PCIE or H200?

H100 PCIE is currently cheaper to enter, starting at $1.25/hr on RunPod, while H200 starts at $2.29/hr on Lambda. Across the 3 providers that list both GPUs, H100 PCIE is cheaper on 2 providers and H200 is cheaper on 1 provider.

How much VRAM do H100 PCIE and H200 have?

H100 PCIE is tracked with 80GB of HBM3, while H200 is tracked with 141GB of HBM3e.
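The memory gap is worth quantifying, since it often matters more than the hourly rate for large models. A quick calculation from the figures above:

```python
# VRAM figures from the specs tracked on this page.
h100_pcie_vram = 80   # GB HBM3
h200_vram = 141       # GB HBM3e

ratio = h200_vram / h100_pcie_vram
print(f"H200 has {ratio:.2f}x the VRAM of H100 PCIE")  # → 1.76x
```

So the H200 offers roughly 76% more memory per GPU, which can mean fewer GPUs (and less inter-GPU communication) for memory-bound workloads.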

Which providers currently carry both H100 PCIE and H200?

We currently see shared coverage on Vast.ai, Lambda, and RunPod.

How fresh is the H100 PCIE vs H200 pricing data?

The comparison uses the latest stored snapshot for each GPU and provider. The newest row visible on this page is from May 2, 2026, and collectors run daily.

Spec          H100 PCIE          H200
VRAM          80 GB              141 GB
Memory Type   HBM3               HBM3e
Generation    Hopper             Hopper
Tier          High Performance   Flagship
Best Price
Providers

Historical H100 PCIE vs H200 price trend

Price by provider: H100 PCIE vs H200

Provider H100 PCIE H200 Difference

More GPU comparisons