AMD Instinct MI325X AI accelerator promises more performance than NVIDIA H200





AMD has unveiled its new accelerator for Artificial Intelligence tasks in specialized servers and data centers, the AMD Instinct MI325X.

This model is based on the AMD CDNA 3 architecture and packs a whopping 256 GB of next-generation HBM3E memory, delivering 6 TB/s of bandwidth. That capacity allows very large language models to be loaded directly into GPU memory without falling back on much slower non-volatile storage.
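As a rough back-of-the-envelope sketch of why that 256 GB matters, the snippet below estimates whether a model's weights alone fit on a single accelerator at a given precision. The figures and the helper are illustrative assumptions, not AMD's methodology, and the estimate ignores KV cache and activation memory.

```python
def weights_gib(params_billions: float, bytes_per_param: int) -> float:
    """Approximate weight footprint in GiB (weights only; ignores
    KV cache, activations, and framework overhead)."""
    return params_billions * 1e9 * bytes_per_param / 2**30

HBM_GIB = 256  # MI325X on-package HBM3E, per the article

for params, precision, nbytes in [(70, "FP16", 2), (70, "FP8", 1), (405, "FP16", 2)]:
    need = weights_gib(params, nbytes)
    verdict = "fits" if need <= HBM_GIB else "does not fit"
    print(f"{params}B @ {precision}: ~{need:.0f} GiB -> {verdict} in {HBM_GIB} GiB")
```

By this estimate, a 70B-parameter model at FP16 (~130 GiB) fits on one MI325X, which is the kind of workload AMD highlights with LLaMa 3.1 70B.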

In total, the GPU offers 304 compute units.


According to figures revealed by AMD, the new Instinct MI325X outperforms the Hopper-based NVIDIA H200 in its tests with LLMs such as Mistral 7B and LLaMa 3.1 70B. However, we must not forget that the H200's successor, the NVIDIA B200 with Blackwell architecture, is just around the corner.


The native platform for these AMD GPUs consists of eight MI325X accelerators working in parallel, with a total of 2 TB of HBM3E memory and 48 TB/s of aggregate bandwidth. Combined compute power reaches 20.9 PFLOPS in FP16.
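The platform totals follow directly from the per-GPU numbers quoted above; the short sketch below just makes that arithmetic explicit (values taken from the article, variable names are my own).

```python
GPUS = 8                  # MI325X accelerators per platform
HBM_PER_GPU_GB = 256      # HBM3E per accelerator, GB
BW_PER_GPU_TBS = 6.0      # memory bandwidth per accelerator, TB/s

total_memory_tb = GPUS * HBM_PER_GPU_GB / 1000   # 2.048, i.e. ~2 TB
total_bw_tbs = GPUS * BW_PER_GPU_TBS             # 48 TB/s aggregate

print(f"{total_memory_tb:.3f} TB HBM3E, {total_bw_tbs:.0f} TB/s bandwidth")
```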


The AMD Instinct MI325X will arrive during the first quarter of 2025, integrated into servers from Lenovo, Eviden, HP, Gigabyte, Dell and SuperMicro.


Article Editor: Antonio Delgado

Computer engineer by training, editor and hardware analyst at Geeknetic since 2011. I love to tear apart everything that passes through my hands, especially the latest hardware we receive for reviews. In my free time I tinker with 3D printers, drones and other gadgets. If you need anything, you can reach me here.
