AMD CDNA 3
AMD Instinct MI300X
The MI300X is AMD's flagship data-center accelerator, pairing 192 GB of HBM3 with high memory bandwidth for large-scale LLM inference and training deployments.
- Launch year: 2023
- Memory: 192 GB HBM3
- Memory bandwidth: 5.3 TB/s
- Peak FP16 / FP32: 1,307.4 TFLOPS · 163.4 TFLOPS
Market snapshot
- From $1.49/hr
- Range $1.49 – $18.32/hr
Catalog coverage
- 16 live offerings
- Across 1 provider · 1 region
- 192 GB HBM3
- Infinity Fabric interconnect
- Strong FP16 throughput
Last refreshed Oct 20, 2025, 1:56 AM
Performance snapshot
Scores are normalized against the NVIDIA A100 (= 1.0) using public reference benchmarks for training and inference workloads; a sketch of the normalization arithmetic follows the list below.
- AI Inference: ×1.60
- Memory Capacity: ×2.00
- AI Training: ×1.40
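As a rough illustration of how such multipliers are derived (not the site's actual methodology), the sketch below divides a per-metric benchmark score by the A100 baseline. The raw scores are placeholders chosen only to reproduce the multipliers listed above; they are not published benchmark results.

```python
# Illustrative normalization against an A100 baseline (= 1.0 for every metric).
# The raw scores below are placeholders, not published benchmark results.

A100_BASELINE = {"ai_inference": 100.0, "ai_training": 100.0, "memory_capacity": 100.0}

def normalize(raw_scores: dict, baseline: dict = A100_BASELINE) -> dict:
    """Return x-multipliers relative to the baseline for each shared metric."""
    return {metric: round(raw_scores[metric] / baseline[metric], 2) for metric in baseline}

mi300x_raw = {"ai_inference": 160.0, "ai_training": 140.0, "memory_capacity": 200.0}
print(normalize(mi300x_raw))
# -> {'ai_inference': 1.6, 'ai_training': 1.4, 'memory_capacity': 2.0}
```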
Provider availability
Price bands per provider pulled from the live catalog.
- RunPod: 16 offers · $1.49 – $18.32/hr
Popular regions
- EU-RO-1: 16 offers
Weighted average price $8.51/hr · median $8.94/hr (reproduced in the sketch after the offer table below)
| GPU | Region | Provider | Price |
| --- | --- | --- | --- |
| MI300X | EU-RO-1 | RunPod | $1.49/hr |
| MI300X | EU-RO-1 | RunPod | $2.29/hr |
| MI300X | EU-RO-1 | RunPod | $2.98/hr |
| MI300X | EU-RO-1 | RunPod | $4.47/hr |
| MI300X | EU-RO-1 | RunPod | $4.58/hr |
| MI300X | EU-RO-1 | RunPod | $5.96/hr |
| MI300X | EU-RO-1 | RunPod | $6.87/hr |
| MI300X | EU-RO-1 | RunPod | $7.45/hr |
| MI300X | EU-RO-1 | RunPod | $8.94/hr |
| MI300X | EU-RO-1 | RunPod | $9.16/hr |
| MI300X | EU-RO-1 | RunPod | $10.43/hr |
| MI300X | EU-RO-1 | RunPod | $11.45/hr |
| MI300X | EU-RO-1 | RunPod | $11.92/hr |
| MI300X | EU-RO-1 | RunPod | $13.74/hr |
| MI300X | EU-RO-1 | RunPod | $16.03/hr |
| MI300X | EU-RO-1 | RunPod | $18.32/hr |
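For reference, the weighted-average and median figures quoted under Popular regions can be reproduced from the per-offer prices in the table above. This is a minimal sketch, assuming equal weight per offer (the catalog's weighting scheme is not published) and a median taken as the upper of the two middle values rather than interpolated:

```python
# Reproduce the catalog price statistics from the 16 MI300X offers listed above.
prices = [1.49, 2.29, 2.98, 4.47, 4.58, 5.96, 6.87, 7.45,
          8.94, 9.16, 10.43, 11.45, 11.92, 13.74, 16.03, 18.32]

mean_price = sum(prices) / len(prices)           # 8.505 -> listed as $8.51/hr
upper_median = sorted(prices)[len(prices) // 2]  # 9th of 16 offers -> 8.94

print(f"weighted average ≈ ${mean_price:.3f}/hr · median ${upper_median:.2f}/hr")
```

With all 16 offers coming from a single provider in a single region, any per-provider or per-region weighting collapses to the plain mean, which is presumably why the simple average matches the quoted figure.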