
HBM Prices Soaring Due to AI Chip Demand

High-bandwidth memory (HBM) prices are soaring in response to a surge in demand from artificial intelligence (AI) chipmakers such as Nvidia and AMD. According to a Feb. 8 report from market research firm Yole Group, the average selling prices of HBM chips this year have been five times those of conventional DRAM chips.

HBM chips are rising in popularity because they are essential components of Nvidia’s graphics processing units (GPUs), which power generative AI systems such as OpenAI’s ChatGPT. Samsung Electronics and SK Hynix, the South Korean chipmakers that together hold 90% of the HBM market, are expected to benefit from the global boom in generative AI.

Yole Group predicts HBM supply will grow at a compound annual rate of 45% from 2023 to 2028. Even with that expansion, HBM prices are expected to remain high for some time, given how difficult it is to scale up HBM production fast enough to keep pace with demand.
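For a rough sense of what that forecast implies, the following back-of-the-envelope sketch compounds the 45% annual rate cited above over the 2023-2028 window; the index base of 1.0 is an arbitrary illustration, not a real shipment figure.

# Back-of-the-envelope sketch of the cumulative growth implied by a 45% CAGR
# over 2023-2028 (the only inputs taken from the article). The starting value
# of 1.0 is a hypothetical index, not actual HBM supply data.
cagr = 0.45
years = 2028 - 2023  # five compounding periods

implied_multiple = 1.0 * (1 + cagr) ** years
print(f"Implied supply multiple after {years} years: {implied_multiple:.1f}x")
# Prints roughly 6.4x, i.e. supply in 2028 would be about six and a half
# times the 2023 level if the forecast holds.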

The market for HBM is expanding beyond its traditional client base centered around Nvidia. Intel and AMD plan to incorporate HBM in their next-generation central processing unit (CPU) products.

HBM offers far higher bandwidth and data capacity, along with lower power consumption, than conventional DRAM, making it ideal for AI applications requiring high performance and efficiency. HBM is projected to account for more than 18% of the DRAM market this year, up from 9% last year, according to market research firm Omnia. “Currently, no other memory chip can replace HBM in the realm of AI computing systems,” said an industry insider.

Samsung Electronics and SK Hynix are responding to growing demand by ramping up HBM production and allocating capital expenditures towards HBM-related equipment.

Samsung’s semiconductor division recorded annual losses of $9.6 billion (12.69 trillion won) last year. SK Hynix’s annual losses amounted to 7.73 trillion won. Both chipmakers are eagerly anticipating a rebound this year. The companies have high hopes for HBM sales, as HBM’s profit margins are five times higher than those of conventional DRAM chips.

HBM sales are fueled by large-scale investments in AI servers by U.S. tech companies. According to Hi Investment & Securities, the combined capital expenditure (CAPEX) of 14 tech companies in the U.S. and China is forecast to grow 18.4% this year.

SK Hynix recently announced that sales of its next-generation HBM product, HBM3, increased more than fivefold year-on-year.
