AMD CDNA3

AMD Instinct MI300X

The MI300X is AMD's flagship data-center accelerator, pairing 192 GB of HBM3 capacity with high memory bandwidth for LLM inference and training deployments.

Launch year
2023
Memory
192 GB HBM3
Memory bandwidth
5.3 TB/s
Peak FP16 / FP32
380 TFLOPS · 61 TFLOPS

Market snapshot

From $1.49 /hr

Range $1.49 – $18.32

Catalog coverage

16 live offerings

Across 1 provider · 1 region

  • 192 GB HBM3
  • Infinity Fabric interconnect
  • Strong FP16 throughput

Last refreshed Oct 20, 2025, 1:56 AM

Performance snapshot

Normalized versus NVIDIA A100 (=1.0). Values use public reference benchmarks for training and inference workloads.

  • AI Inference ×1.60
  • Memory Capacity ×2.00
  • AI Training ×1.40
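The normalization above divides each card's benchmark score by the A100 baseline, so the A100 always lands at 1.0. A minimal sketch of that calculation (the raw scores below are hypothetical placeholders, since the page publishes only the normalized ratios):

```python
# Normalize accelerator benchmark scores against an A100 baseline (= 1.0).
# Raw scores here are illustrative, not published benchmark figures.
def normalize(scores: dict[str, float], baseline: str = "A100") -> dict[str, float]:
    base = scores[baseline]
    return {name: round(score / base, 2) for name, score in scores.items()}

inference_scores = {"A100": 100.0, "MI300X": 160.0}  # hypothetical units
print(normalize(inference_scores))  # {'A100': 1.0, 'MI300X': 1.6}
```

The same routine applied to memory capacity (A100 80 GB vs. MI300X 192 GB) yields roughly the ×2.00 ratio shown above.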

Provider availability

Price bands per provider pulled from the live catalog.

  • RunPod · 16 offers
    $1.49 – $18.32 /hr

Popular regions

  • EU-RO-1 · 16 offers

Weighted average price $8.51 /hr · median $8.94