No less a figure than Kitae Kim, Vice President of SK Hynix, has highlighted the situation his company is experiencing, one that is spreading to every major memory manufacturer worldwide. The fact is that HBM memory, especially the most recent versions such as HBM3e and the fast-approaching HBM4, is in short supply in both cases for the whole of 2024. In just two months the stock has flown off the shelves, so how are more GPUs for AI and HPC going to be built?
Without a doubt, SK Hynix is the number one company when it comes to HBM memory. Neither Samsung nor Micron is really challenging it for the throne, as the Koreans are going all out to keep innovating and stay ahead of everyone else in this type of product. So when their vice president makes such forceful statements, it is essential to listen carefully.
Gina Raimondo, Jensen Huang, and now Kitae Kim: no HBM memory for 2024
There are factors of instability in a market that is moving as fast as a jet aircraft. It is a modernized Lockheed SR-71 Blackbird which, like that impressive reconnaissance plane, has its leaks. Kim states that demand for semiconductors and cutting-edge technology products is recovering thanks to the expansion of AI, and that forecasts for every type of memory are rising, particularly for HBM3e.
Following this line of argument, he commented:
“To maintain a sustained advantage in the market, technological competitiveness is fundamental, and from a sales perspective, shortening the TTM (Time to Market: the time it takes to conceive a product and bring it to market) is key.
The basis of semiconductor sales is proactively securing customer volumes and negotiating to sell quality products on better terms. We have a good product, so now it is a battle of speed. This year's HBM is already 'sold out.' Although 2024 is just beginning, we are already preparing for 2025 in order to dominate the market."
NVIDIA is said to have already sold all Blackwell GPUs for 2024
The world is moving at a truly surprising pace. Hardware and technology are being bought that are not yet on sale, have not even been officially presented and, in many cases, are pre-ordered while still unfinished.
This includes future ASML scanners, upcoming lithographic processes from Samsung, Intel, and TSMC, and the AI GPUs of NVIDIA and even Intel. The "greens" have already made it clear that demand exceeds supply, and that statement comes from none other than Colette Kress, NVIDIA's Chief Financial Officer.
To be even more specific and round out the picture with the surrounding rumors, it is said that the B100 SXM, the B100 PCIe, and the compatible DGX servers built on the new components will not actually be "available" after launch, mainly because everything is already reserved, which is to say, sold.
Gina Raimondo herself said just last night that "the demand for AI chips is mind-boggling," and knowing Raimondo a little, those words should not be taken lightly. Given the statements from SK Hynix, NVIDIA, and the US government, we can already picture an unprecedented race in the industry for HBM memory, which will remain depleted throughout 2024 while everyone is already thinking about 2025.
The article “We are two months into the year, and HBM memory is already sold out for the entire 2024 – how can GPUs for AI and HPC be created?” first appeared in El Chapuzas Informático.