NVIDIA Blackwell

NVIDIA B200

B200 ushers in NVIDIA's Blackwell generation with FP4/FP8 support aimed at trillion-parameter foundation models.

Launch year
2024
Memory
192 GB HBM3e (projected)
Memory bandwidth
>4.0 TB/s
Peak FP16 / FP32
1,000 TFLOPS · 120 TFLOPS

Market snapshot

From $1.56 /hr

Range $1.56 – $70.88 /hr

Catalog coverage

97 live offerings

Across 5 providers · 33 regions

  • Blackwell Transformer Engine
  • NVLink 5 fabric
  • Designed for exascale AI

Last refreshed Oct 20, 2025, 2:01 AM

Performance snapshot

Normalized against the NVIDIA A100 (= 1.0). Values are drawn from public reference benchmarks for training and inference workloads.

  • AI Training (FP8): ×2.50
  • AI Inference: ×2.20
  • Memory Bandwidth: ×2.00
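The multipliers above are plain ratios against the A100 baseline. A minimal sketch of the normalization (the benchmark scores below are illustrative placeholders, not the figures behind this table):

```python
# Normalize a raw benchmark score against an A100 baseline (A100 = 1.0).
def normalize(score: float, a100_score: float) -> float:
    return round(score / a100_score, 2)

# Hypothetical example: a B200 FP8 training run scoring 2500 units
# against an A100 run scoring 1000 units on the same workload.
print(normalize(2500, 1000))  # 2.5
```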

Provider availability

Price bands per provider pulled from the live catalog.

  • Datacrunch: 32 offers · $1.56 – $31.92 /hr
  • RunPod: 39 offers · $3.59 – $45.52 /hr
  • Lambda Labs: 17 offers · $39.92 /hr
  • Nebius: 1 offer · $44.00 /hr
  • Google Cloud: 8 offers · $51.55 – $70.88 /hr
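A price band like those above is just the per-provider offer count and min/max hourly price. A sketch of how it could be derived from a flat offer list (the rows here are hypothetical samples, not the live catalog):

```python
from collections import defaultdict

# Hypothetical (provider, price $/hr) rows standing in for live catalog data.
offers = [
    ("Datacrunch", 1.56), ("Datacrunch", 31.92),
    ("RunPod", 3.59), ("RunPod", 45.52),
    ("Nebius", 44.00),
]

# Group prices by provider, then report count and min/max band.
bands = defaultdict(list)
for provider, price in offers:
    bands[provider].append(price)

for provider, prices in bands.items():
    print(f"{provider}: {len(prices)} offers · "
          f"${min(prices):.2f} – ${max(prices):.2f} /hr")
```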

Popular regions

  • EU-RO-1: 16 offers
  • US-CA-2: 16 offers
  • FIN-01: 8 offers
  • FIN-02: 8 offers
  • FIN-03: 8 offers
  • ICE-01: 8 offers
  • IN: 7 offers
  • asia-northeast-1: 1 offer

Weighted average price $24.43 /hr · median $21.54 /hr
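Summary figures like these can be reproduced from offer-level data. A sketch assuming each offer carries a price and a weight (the weighting field, e.g. live-instance count, is an assumption, and the pairs below are hypothetical, so the output will not match the page's figures):

```python
import statistics

# Hypothetical (price $/hr, weight) pairs; the catalog's actual
# weighting scheme is not shown on the page.
offers = [(1.56, 10), (21.54, 5), (45.52, 3), (70.88, 1)]

# Weighted average: sum of price*weight over total weight.
weighted_avg = sum(p * w for p, w in offers) / sum(w for _, w in offers)

# Median of the (unweighted) offer prices.
median_price = statistics.median(p for p, _ in offers)

print(f"weighted average ${weighted_avg:.2f} /hr · median ${median_price:.2f} /hr")
```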