New details on Maia 100, Microsoft’s first AI chip with 64GB of integrated HBM2e memory

Microsoft has revealed new details of its first chip for artificial intelligence, the Maia 100: an SoC specialized in AI processing for servers. Its existence was already known, but we can now get a clearer idea of what this chip, designed to power services such as Azure OpenAI, will be capable of.


The Maia 100 is manufactured on TSMC's 5-nanometer (N5) process and, at 820 mm², is one of the largest chips built on that node. It includes 64 GB of integrated HBM2E memory with 1.8 TB/s of bandwidth, and delivers a peak tensor throughput of 3 POPS at 6-bit precision, 1.5 POPS at 9-bit, and 0.8 POPS with BF16.

It features a PCI Express Gen 5 x8 host interface at 32 GB/s and 500 MB of L1/L2 cache, and supports 600 GB/s of network connectivity via 12x 400GbE links for communication between servers using the same chips. It has a provisioned TDP of 500 W, although the design supports up to 700 W.
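As a quick sanity check on those figures (this arithmetic is my own, not from the article), the 600 GB/s network number follows directly from aggregating twelve 400 Gb/s Ethernet links, and the 32 GB/s PCIe figure is the usual rounding of a Gen 5 x8 link's effective throughput:

```python
# Verify the quoted bandwidth figures from the raw link rates.

# Back-end network: 12 links x 400 Gb/s each, converted to GB/s (8 bits per byte)
network_gbps = 12 * 400           # 4800 Gb/s aggregate
network_GBps = network_gbps / 8   # 600.0 GB/s, matching the quoted figure

# PCIe Gen 5: 32 GT/s per lane with 128b/130b encoding, 8 lanes
pcie_GBps = 32 * (128 / 130) * 8 / 8   # ~31.5 GB/s, commonly rounded to 32 GB/s

print(network_GBps)         # 600.0
print(round(pcie_GBps, 1))  # 31.5
```

So both headline numbers are consistent with the underlying link specifications, with the PCIe figure rounded up slightly.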


The Microsoft Maia 100 will launch with a dedicated SDK to help harness its power through frameworks such as Triton and PyTorch, and it also supports the Maia API with a custom programming model for maximum performance and control.


Article Author: Antonio Delgado


A computer engineer by training, I have been an editor and hardware analyst at Geeknetic since 2011. I love dissecting everything that passes through my hands, especially the latest hardware we receive here for reviews. In my free time I tinker with 3D printers, drones and other gadgets. For anything you need, here I am.
