
AI boom to keep supply of high-end memory chips tight this year, analysts warn

  • SK Hynix and Micron – two of the world's largest memory chip suppliers – are out of high-bandwidth memory chips for 2024, while the stock for 2025 is also nearly sold out.
  • "We expect the general memory supply to remain tight throughout 2024," Kazunori Ito, director of equity research at Morningstar said in a report last week.
  • Big Tech companies Microsoft, Amazon and Google are spending billions to train their own large language models to stay competitive, fueling demand for AI chips.

High-performance memory chips are likely to remain in tight supply this year as explosive AI demand drives a shortage of these chips, according to analysts.

SK Hynix and Micron – two of the world's largest memory chip suppliers – are out of high-bandwidth memory chips for 2024, while the stock for 2025 is also nearly sold out, according to the firms.

"We expect the general memory supply to remain tight throughout 2024," Kazunori Ito, director of equity research at Morningstar said in a report last week.

The demand for AI chipsets has boosted the high-end memory chip market, hugely benefiting firms such as Samsung Electronics and SK Hynix, the world's top two memory chipmakers. SK Hynix already supplies chips to Nvidia, which is reportedly considering Samsung as an additional supplier.

High-performance memory chips play a crucial role in the training of large language models (LLMs) such as OpenAI's ChatGPT, which has led AI adoption to skyrocket. LLMs need these chips to remember details and preferences from users' past conversations in order to generate human-like responses to queries.

"The manufacturing of these chips are more complex and ramping up production has been difficult. This likely sets up shortages through the rest of 2024 and through much of 2025," said William Bailey, director at Nasdaq IR Intelligence.

HBM's production cycle is 1.5 to 2 months longer than that of the DDR5 memory chips commonly found in personal computers and servers, market intelligence firm TrendForce said in March.

To meet soaring demand, SK Hynix plans to expand production capacity by investing in advanced packaging facilities in Indiana in the U.S., as well as in the M15X fab in Cheongju and the Yongin semiconductor cluster in South Korea.

Samsung said during its first-quarter earnings call in April that its HBM bit supply in 2024 "expanded by more than threefold versus last year." Bit supply refers to the amount of data, measured in bits, that a company's memory chips can store.

"And we have already completed discussions with our customers with that committed supply. In 2025, we will continue to expand supply by at least two times or more year on year, and we're already in smooth talks with our customers on that supply," Samsung said.

Micron didn't respond to CNBC's request for comment.

Intense competition

Big Tech companies Microsoft, Amazon and Google are spending billions to train their own LLMs to stay competitive, fueling demand for AI chips.

"The big buyers of AI chips – firms like Meta and Microsoft – have signaled they plan to keep pouring resources into building AI infrastructure. This means they will be buying large volumes of AI chips, including HBM, at least through 2024," said Chris Miller, author of "Chip War," a book on the semiconductor industry.

Chipmakers are in a fierce race to manufacture the most advanced memory chips in the market to capture the AI boom.

SK Hynix said at a press conference earlier this month that it would begin mass production of its latest generation of HBM chips, the 12-layer HBM3E, in the third quarter. Samsung Electronics, which was the first in the industry to ship samples of the chip, plans to do so within the second quarter.

"Currently Samsung is ahead in 12-layer HBM3E sampling process. If they can get qualification earlier than its peers, I assume it can get majority shares in end-2024 and 2025," said SK Kim, executive director and analyst at Daiwa Securities.
