Samsung shakes off AI memory woes but race to innovate is far from over

Written by Nikkei Asia · 5 min read

Image of Samsung's HBM3E chip. Image courtesy of Samsung.
The tech giant is betting big on HBM chips after ceding ground to rival SK Hynix.

Samsung Electronics’ flagship store in the affluent district of Gangnam, southern Seoul, was packed last month with scores of journalists eager to try out the company’s latest device, the Galaxy XR extended reality headset. The demonstration included an immersive video of an up-and-coming K-pop girl group performing an energetic dance.

The upbeat event mirrored optimism within the company that it is regaining its footing after falling behind in the artificial intelligence boom.

South Korea’s iconic tech company, which produces everything from microchips to smartphones and refrigerators, has enjoyed several pieces of good news this year, including the resolution of chairman Lee Jae-yong’s years-long legal dispute over fraud charges.

On the business front, Samsung announced in July it secured a USD 16.5 billion deal to manufacture chips for Tesla at its new fab in Texas, easing worries that it was struggling to find customers for its contract chipmaking business.

Last month, Samsung signed a strategic partnership agreement with OpenAI to supply dynamic random access memory (DRAM) chips for the ChatGPT developer’s USD 500 billion Stargate project, which aims to build large-scale data centers in the US.

“This is definitely good news for Samsung,” Lee Seung-woo, head of research at Eugene Investment & Securities, said of the signing. “[It] worked as a catalyst to boost its stock price, as expansion of semiconductor infrastructure is inevitable to support AI.”

Indeed, Samsung’s shares have jumped over 87% year-to-date, lifted partly by the OpenAI agreement and partly by investors betting that Samsung is ready to benefit from the upward cycle in the semiconductor market.

It was not always certain that the company would.

For decades, Samsung was the unquestioned titan of DRAM and NAND memory chips. But in the generative AI era, cloud giants and makers of graphics processing units (GPUs) prize one particular memory product above all others: high-bandwidth memory, or HBM. And in this segment, Samsung found itself playing second fiddle to domestic rival SK Hynix.

HBM, in which DRAM dies are stacked on top of each other to deliver very fast data transfer speeds using less power, is an essential component for AI accelerators. SK Hynix moved early to mass produce HBM and has captured a commanding share of the market. Samsung, on the other hand, ran into quality issues producing the advanced memory chips.

Earlier this year, TrendForce estimated that SK Hynix was set to remain the dominant HBM player, commanding 52.3% of the global market this year. Samsung’s share, by contrast, was expected to drop to 28.7%, from 41% last year, followed by Micron with 19%.

For Samsung, the gap came as a shock. It responded not with incremental tweaks but with an organization-wide campaign: new engineering teams, targeted investments in wafer-to-package integration and accelerated roadmaps for getting its products qualified by customers.

The result is a dramatic sprint to reclaim ground in the memory sector.

By late summer and early autumn, the effort began to show concrete progress. Analysts indicated that Samsung’s HBM3E, the latest version of HBM, cleared critical qualification tests with Nvidia, the leading GPU maker, after a one-and-a-half-year delay.

Nvidia gets most of its HBM supplies from SK Hynix, and the two companies have spoken often of their cooperation. Samsung has not publicly commented on whether Nvidia has approved its HBM.

“Samsung appears to have achieved good results with HBM3e shipments to Nvidia recently, but the volume impact remains limited,” said MS Hwang, research director at Counterpoint, in a note last month. “To regain its previous market leadership, Samsung needs to carry this momentum into its next-generation product, HBM4.”

An industry source familiar with the company’s HBM strategy confirmed that the chips have cleared Nvidia’s tests. “The volume is not that much, but Samsung passed the test and is supplying HBM3E chips to Nvidia now,” said the source, who asked not to be named.

Samsung declined to comment when Nikkei Asia asked whether Nvidia had approved its HBM and how many of the components it supplies to the US company.

For Samsung, the next 12 months will be crucial. If it achieves reliable HBM3E supply and aligns HBM4 timing with Nvidia’s GPU roadmaps, it could regain parity in the market and blunt SK Hynix’s ascendancy. If it falters, the long-term competitive map for AI memory could tilt in ways that are hard to reverse.

Analysts expect Samsung to enter the HBM4 market next year using its 1c node, a 10-nanometer-class production technology, though they add that it may take time for the company to catch up with SK Hynix in both technology and market share.

“We expect HBM, which has been a burden on Samsung’s earnings improvement so far, to see its competitiveness gap narrowing and an expansion of market share,” said Nomura analysts in a recent report. “However, due to the high initial costs and low yields of 1C nm, we expect HBM profitability to be relatively low compared to peers in 2026, although we estimate the gap to narrow rapidly by 2027.”

Leaders in the semiconductor sector stress that the catch-up requires more than engineering.

SK chairman Chey Tae-won, who controls SK Hynix, complained last year about the pressure Nvidia was putting on his company to develop HBM chips quickly. He said the US chipmaker had demanded SK Hynix supply HBM4 chips six months ahead of the planned schedule.

Such pressure, however, has helped keep SK Hynix at the forefront of HBM. The company is working hard to stay ahead by accelerating the development of next-generation HBM. In September, it announced that it had become the first in the world to complete the development of HBM4 chips and was ready to supply them to its key customer, Nvidia.

“HBM4, a symbolic turning point beyond the AI infrastructure limitations, will be a core product for overcoming technological challenges,” Justin Kim, president and head of AI infrastructure at SK Hynix, said in a statement. “We will grow into a full-stack AI memory provider by supplying memory products with the best quality and diverse performance required for the AI era in a timely manner.”

SK Hynix also signed a tentative memory chip supply deal with OpenAI for its Stargate project this month, underscoring its continuing importance in the field.

Samsung’s revived fortunes in the HBM segment come as the company performs strongly in another key area, smartphones, buoyed by the success of its new foldable Z series.

“We’re expecting Samsung to see tailwinds on the mobile side with uplift from the strengthening mix coming from strong foldables performance, in particular, with the new Galaxy Z Fold7,” Counterpoint Research vice president Neil Shah said in a note last month.

But success in HBM would be particularly sweet for Samsung as it looks to reassert its chip dominance.

Chairman Lee has attributed the company’s HBM progress to two factors: pushing its executives to speed up development, and strengthening his personal relationship with Nvidia CEO Jensen Huang. That closeness was on display when the two executives hugged at a business roundtable in Washington in August. How close their companies remain could come down to whether Samsung keeps pace in the AI memory race.

This article first appeared on Nikkei Asia. It has been republished here as part of 36Kr’s ongoing partnership with Nikkei.
