
Nvidia supplier SK Hynix says its high-bandwidth memory chips used in AI processors almost sold out for 2025

SK Hynix announced on Thursday that its high-bandwidth memory (HBM) chips, integral to AI processors, are completely sold out for this year and nearly exhausted for 2025. This surge in demand comes as businesses aggressively expand their AI services. As a key supplier to Nvidia, the world’s second-largest memory chip maker is gearing up to dispatch samples of its latest innovation, the 12-layer HBM3E, starting in May, with mass production slated for the third quarter.

“The HBM market anticipates sustained growth alongside the expansion of data and AI model sizes,” stated SK Hynix CEO Kwak Noh-Jung during a news conference. “We foresee an annual demand growth of approximately 60 per cent in the mid- to long-term.”

Competing with US rival Micron Technology and domestic giant Samsung Electronics in the HBM sector, SK Hynix held exclusive rights to supply these chips to Nvidia until March, analysts revealed. However, major buyers of AI chips are now keen on diversifying their suppliers to bolster operational margins. With Nvidia currently dominating around 80 per cent of the global AI chip market, this strategic move is poised to reshape the industry landscape.

Micron has reported that its HBM chip inventory is fully depleted for 2024, with the majority of its 2025 supply already spoken for. The company intends to distribute samples of its 12-layer HBM3E chips to clients in March.

“As advancements in AI functionalities and performance outpace projections, the demand for ultra-high-performance chips like the 12-layer variants appears to be escalating faster compared to 8-layer HBM3Es,” remarked Jeff Kim, research head at KB Securities.

Samsung, set to commence production of its HBM3E 12-layer chips in the second quarter, disclosed this week a more than three-fold surge in HBM chip shipments for the current year. The company affirmed completion of supply negotiations with clients, without providing further details.

Last month, SK Hynix unveiled a $3.87 billion blueprint to construct a cutting-edge chip packaging facility in the US state of Indiana, featuring an HBM chip line. Additionally, it announced a 5.3 trillion won (approximately $3.9 billion) investment in a new DRAM chip plant domestically, with a specific emphasis on HBMs.

By 2028, chips dedicated to AI applications, including HBM and high-capacity DRAM modules, are projected to represent 61 per cent of total memory volume by value, up from just 5 per cent last year, according to Justin Kim, SK Hynix’s AI infrastructure lead.

During a post-earnings conference call last week, SK Hynix cautioned that there could potentially be a shortage of standard memory chips for smartphones, personal computers, and network servers by the end of the year if the demand for tech devices surpasses expectations.

Kerry Dean

Kerry is a Content Creator at www.systemtek.co.uk. She has spent many years working in IT support, and her main interests are computing, networking and AI.
