Gaming

Microsoft, Amazon and Oracle will be the first to receive the NVIDIA Blackwell in December





After suffering some production problems with the new Blackwell chips, NVIDIA already announced that it would increase the number of units manufactured to cover the demand that server GPUs based on this new architecture would be having after the initial delay.

It seems that the first GB200 servers, based on these B200 GPUs and Grace CPUs, will soon reach the company’s main clients: in December, companies like Microsoft, Amazon (AWS) and Oracle will already have them available in their various data centers.


According to this information, these chips were initially expected in November but were delayed to January. After ramping up production, NVIDIA has managed to move deliveries forward by a few weeks, so the first customers will receive them in the last month of the year.

Other companies like Meta will also receive servers based on what will be the most powerful GPU in the world for Artificial Intelligence tasks.

The NVIDIA Blackwell B200 promises four times the performance of the NVIDIA Hopper H100 when handling large models such as LLaMa 70B, and up to 27% more inference performance compared to the NVIDIA Hopper H200.

Also, let’s not forget that Blackwell is a unified architecture that will also make its way into gaming graphics cards. Barring a surprise, we will see it in the upcoming RTX 5000 series.

Tell us something in the Comments!

Article Editor: Antonio Delgado

Antonio Delgado

A Computer Engineer by training, and an editor and hardware analyst at Geeknetic since 2011. I love taking apart everything that passes through my hands, especially the latest hardware we receive here for reviews. In my free time I tinker with 3D printers, drones and other gadgets. For anything you need, here I am.
