Surging demand for AI, together with the growth of cloud computing, 5G networks, and the IoT, has put pressure on the supply of high-end memory chips, and analysts have been warning of potential shortages for some time. The boom in AI applications is a major contributor to the tight supply: as more industries adopt AI-driven solutions, demand for the memory chips used in data centers and AI processors continues to climb. The trend underscores the need for investment and innovation in semiconductor manufacturing to keep pace with the AI-driven economy.
According to analysts, high demand for AI applications is expected to keep high-performance memory chips in short supply throughout this year. SK Hynix and Micron, two major memory chip suppliers, have reportedly sold out their high-bandwidth memory stock for 2024, and supplies for 2025 are nearly exhausted as well. Kazunori Ito, director of equity research at Morningstar, predicts that overall memory supply will remain constrained throughout 2024. The surge in demand for AI chipsets has particularly benefited Samsung Electronics and SK Hynix, the world's leading memory chip makers. SK Hynix already supplies Nvidia, and Nvidia is reportedly considering adding Samsung as a second supplier.
These high-performance memory chips are vital for training large language models (LLMs) such as OpenAI's ChatGPT, whose popularity has accelerated the adoption of AI. High-bandwidth memory gives AI processors fast access to the enormous volumes of data involved in training and inference, including the context a model must retain from earlier parts of a conversation to generate humanlike responses. William Bailey, director at Nasdaq IR Intelligence, explains that the complexity of manufacturing these chips and the difficulty of scaling up production underpin the shortages expected throughout 2024 and possibly into 2025.
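A rough back-of-envelope calculation illustrates why memory bandwidth, not just compute, gates LLM performance. The 70-billion-parameter model size and the 3 TB/s aggregate bandwidth below are assumed ballpark figures for illustration, not numbers from the article or any vendor specification:

```python
# Illustrative sketch: why LLMs depend on high-bandwidth memory.
# Assumptions (not from the article): a 70B-parameter model stored in
# 16-bit precision, and ~3 TB/s of aggregate accelerator memory bandwidth.

def param_memory_gb(num_params: float, bytes_per_param: int = 2) -> float:
    """Memory needed just to hold the model weights (FP16 = 2 bytes/param)."""
    return num_params * bytes_per_param / 1e9

weights_gb = param_memory_gb(70e9)  # 70B params in FP16 -> 140 GB of weights

# Generating each output token requires streaming essentially all weights
# through the processor once, so memory bandwidth caps token throughput:
hbm_bandwidth_gb_s = 3000  # assumed aggregate bandwidth, GB/s
tokens_per_second_ceiling = hbm_bandwidth_gb_s / weights_gb

print(f"Weights alone: {weights_gb:.0f} GB")
print(f"Bandwidth-bound ceiling: ~{tokens_per_second_ceiling:.0f} tokens/s")
```

Even under these generous assumptions, the weights alone far exceed any single memory die, which is why accelerators pair their processors with stacks of HBM rather than conventional DRAM.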
Market intelligence firm TrendForce notes that the production cycle for High Bandwidth Memory (HBM) chips is 1.5 to 2 months longer than that of the DDR5 memory commonly used in personal computers and servers, further tightening supply.
To meet the surging demand, SK Hynix is set to expand its production capacity by investing in advanced packaging facilities in Indiana, U.S., as well as in the M15X fab in Cheongju and the Yongin semiconductor cluster in South Korea.
During its first-quarter earnings call in April, Samsung announced that its supply of HBM bits for 2024 had more than tripled compared to the previous year. The company has already finalized agreements with customers for this increased supply and plans to at least double it again year on year in 2025.
Intense competition among tech giants such as Microsoft, Amazon, and Google has led them to invest billions of dollars in training their own LLMs to stay competitive, driving up demand for AI chips, including HBM.
Chris Miller, author of "Chip War," notes that these major buyers of AI chips, such as Meta and Microsoft, have indicated their intent to continue investing heavily in AI infrastructure, ensuring sustained demand for AI chips, including HBM, at least through 2024.
Chipmakers are engaged in a fierce competition to produce the most advanced memory chips in the market to capitalize on the AI boom.
SK Hynix announced during a press conference earlier this month that it would begin mass production of its latest generation of HBM chips, the 12-layer HBM3E, in the third quarter. Samsung Electronics, meanwhile, plans to begin mass production within the second quarter and has already shipped samples of the chip, putting it ahead of rivals on this product. SK Kim, executive director and analyst at Daiwa Securities, suggests that if Samsung achieves qualification for the 12-layer HBM3E earlier than its competitors, it could secure a significant share of the market by the end of 2024 and into 2025.
Source: CNBC