Overview
- The Instinct MI400X GPU boasts up to 10× the compute power of the MI300X and double the FP4 performance of the MI355X, featuring 432 GB of HBM4 and 19.6 TBps of memory bandwidth.
- Helios integrates 72 MI400-series GPUs linked by UALink, paired with EPYC “Venice” CPUs and Pensando Vulcano NICs, to reach 2.9 exaFLOPS of FP4 inference and 1.4 exaFLOPS of FP8 training performance.
- AMD claims the new rack offers 50 percent more memory capacity and bandwidth than Nvidia’s upcoming Vera Rubin NVL144 platform, at 31 TB of HBM4 and 1.4 PBps, respectively (a back-of-envelope check of the rack totals appears after this list).
- The company has moved to a yearly cadence for its AI hardware roadmap and plans to ship EPYC “Verano” CPUs, Instinct MI500-series GPUs, and a denser rack-scale system in 2027.
- AMD expects to leverage TSMC’s A16 process node for its 2027 chips, gaining backside power delivery to boost efficiency and performance density in future AI deployments.
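The Helios rack figures are consistent with straight linear scaling of the per-GPU specs across 72 MI400-series GPUs. Below is a minimal sketch of that arithmetic; note that the source quotes only rack-level FP4/FP8 totals, so the per-GPU compute rates derived here are illustrative back-calculations, not published specs.

```python
# Back-of-envelope check: scale the quoted per-GPU MI400X specs to a 72-GPU Helios rack.
GPUS_PER_RACK = 72
HBM4_PER_GPU_GB = 432        # GB of HBM4 per MI400X (from the bullet above)
BW_PER_GPU_TBPS = 19.6       # TB/s of memory bandwidth per MI400X

rack_memory_tb = GPUS_PER_RACK * HBM4_PER_GPU_GB / 1000        # ~31.1 TB
rack_bandwidth_pbps = GPUS_PER_RACK * BW_PER_GPU_TBPS / 1000   # ~1.41 PB/s

# Working backward from the quoted rack totals gives the implied per-GPU rates
# (assumption: totals are simple sums across the rack).
fp4_per_gpu_pflops = 2.9e3 / GPUS_PER_RACK   # ~40 PFLOPS FP4 per GPU
fp8_per_gpu_pflops = 1.4e3 / GPUS_PER_RACK   # ~19 PFLOPS FP8 per GPU

print(f"Rack memory:    {rack_memory_tb:.1f} TB HBM4")
print(f"Rack bandwidth: {rack_bandwidth_pbps:.2f} PB/s")
print(f"Implied per GPU: ~{fp4_per_gpu_pflops:.0f} PFLOPS FP4, ~{fp8_per_gpu_pflops:.0f} PFLOPS FP8")
```

The memory totals land almost exactly on AMD's 31 TB and 1.4 PBps claims, which suggests the rack numbers are straight multiples of the MI400X per-GPU specs rather than figures that account for interconnect or utilization overhead.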