Although 2023 was not predicted to be a good year for computer sales, the high-performance computing and AI sector seems to be a different story. AI is now used for almost everything, from applications to the automotive sector, for example, and this will generate great demand for servers capable of managing all this data more quickly. According to a TrendForce study, demand from AI and high-performance computing will drive HBM memory demand up by 60% in 2023.
This will mean a total of 290 million GB of HBM memory for this year, with a further 30% growth expected in 2024. This consumption is driven by growing demand for tasks such as 8K video, AR/VR, and supercomputing. TrendForce anticipates that between 145,600 and 233,700 NVIDIA A100 GPUs will be needed by 2025 to meet the demand from AI services like ChatGPT or Midjourney.
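Taken together, TrendForce's figures imply a simple projection. Below is a minimal Python sketch using only the numbers quoted above (290 million GB in 2023, roughly 30% further growth in 2024); the variable names are illustrative:

```python
# Projection of HBM demand based on the TrendForce figures quoted above.
hbm_2023_gb = 290e6   # ~290 million GB of HBM demand in 2023
growth_2024 = 0.30    # ~30% additional growth expected for 2024

hbm_2024_gb = hbm_2023_gb * (1 + growth_2024)
print(f"Projected 2024 HBM demand: {hbm_2024_gb / 1e6:.0f} million GB")
# -> Projected 2024 HBM demand: 377 million GB
```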
For these workloads, HBM memory is ideal: the bandwidth of HBM3 is up to 15 times that of DDR5, and it can be increased further by stacking more memory of this type. Part of the conventional SDRAM can also be replaced with HBM3 to achieve greater energy savings.
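As a rough illustration of where a figure like that 15x comes from, here is a back-of-the-envelope comparison. The pin speeds and bus widths below are typical published values for HBM3 (6.4 Gb/s per pin over a 1024-bit stack interface) and DDR5-6400 (64-bit channel), not figures from the TrendForce study, so the exact ratio depends on which speed grades are compared:

```python
# Peak bandwidth estimate: (pin speed in Gb/s * bus width in bits) / 8 -> GB/s
def peak_bandwidth_gbs(pin_speed_gbps: float, bus_width_bits: int) -> float:
    return pin_speed_gbps * bus_width_bits / 8

hbm3_stack = peak_bandwidth_gbs(6.4, 1024)  # one HBM3 stack, 1024-bit interface
ddr5_channel = peak_bandwidth_gbs(6.4, 64)  # one DDR5-6400 channel, 64-bit

print(f"HBM3 stack:   {hbm3_stack:.1f} GB/s")     # ~819.2 GB/s
print(f"DDR5 channel: {ddr5_channel:.1f} GB/s")   # ~51.2 GB/s
print(f"Ratio: {hbm3_stack / ddr5_channel:.0f}x") # ~16x, in line with the figure above
```

The gap comes almost entirely from the bus width: HBM stacks a very wide interface directly next to the processor, which is also why stacking more of it scales bandwidth so well.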
The study also notes that sales of servers for these tasks will increase in 2023, with 1.2 million servers expected to ship equipped with various accelerator solutions from NVIDIA and AMD, or with custom solutions from giants like Google and Amazon; that is a growth rate of 38% over the previous year, alongside a 50% increase in shipments of AI-related chips.
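For reference, that 38% growth rate lets us back out the implied prior-year baseline; a quick sketch using only the shipment figure quoted above:

```python
# Implied 2022 baseline from the 1.2 million projection and 38% growth quoted above.
servers_2023 = 1.2e6
growth = 0.38
servers_2022 = servers_2023 / (1 + growth)
print(f"Implied 2022 AI server shipments: ~{servers_2022 / 1e6:.2f} million")
# -> Implied 2022 AI server shipments: ~0.87 million
```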
Juan Antonio Soto
I am a Computer Engineer and my specialty is automation and robotics. My passion for hardware began at the age of 14 when I gutted my first computer: a 386 DX 40 with 4MB of RAM and a 210MB hard drive. I continue to give free rein to my passion in the technical articles that I write at Geeknetic. I spend most of my free time playing video games, contemporary and retro, on the 20+ consoles I own, in addition to the PC.