NVIDIA in secret talks with Samsung to develop new AI memory chip

NVIDIA has cornered the market for AI accelerators. Those chips require a lot of high-bandwidth memory, the bulk of which is currently supplied by SK Hynix, while Samsung continues to seek NVIDIA’s approval for its HBM3E chips.

While that process is ongoing, both companies are reportedly in secret discussions to commercialize a new type of advanced AI memory chip known as SOCAMM, or system on chip advanced memory module. NVIDIA is said to be in similar talks with SK Hynix about this memory chip as well.

Prototypes are currently being exchanged with NVIDIA

The talks center on commercializing this new memory standard, which the industry views as an advancement over the existing high-bandwidth memory chips that form the backbone of AI accelerators today.

These chips are designed to significantly improve the performance of artificial intelligence supercomputers. SOCAMM offers a better cost-to-performance ratio, and its module design has a greater number of ports, which helps reduce the data bottlenecks that remain a challenge in AI computing.

SOCAMM is also detachable, letting data center operators swap and upgrade memory modules for ongoing performance improvements. Given its compact size, more SOCAMM modules can be installed within the same area, increasing high-performance computing density.

Industry insiders say that NVIDIA and memory makers like Samsung are currently working on SOCAMM prototypes and running performance tests ahead of mass production, which is expected to begin before the end of 2025.

This could provide Samsung with another opportunity to reclaim some of the ground that it has lost to SK Hynix in the high-bandwidth memory segment.

NVIDIA’s Blackwell issues might be just what Samsung needs

NVIDIA has become a darling of the tech industry in recent years as its AI chips have become crucial to the ambitions of giants like Meta, Microsoft, and Google. These companies are in an intense battle for AI dominance and are spending tens of billions of dollars to buy these chips from NVIDIA.

However, recent reports have highlighted delays in shipments of NVIDIA’s next-gen Blackwell GB200 accelerators, largely due to overheating concerns. This could work in Samsung’s favor as it hopes to secure a potentially lucrative order from NVIDIA for its high-bandwidth memory chips.

Samsung needs time on its side

High-bandwidth memory chips are a crucial component of AI accelerators; the latest GB200 platform incorporates up to 576 HBM chips. SK Hynix has emerged as the leading supplier of HBM3E chips to NVIDIA.

Even though Samsung’s HBM3E chips carry a “Jensen approved” autograph, they’ve yet to pass NVIDIA’s qualification checks, largely due to overheating issues of their own, though NVIDIA CEO Jensen Huang said as recently as last week that Samsung has a good shot at making it past the finish line.

Memory accounts for a major chunk of the thermal load in AI data centers. With HBM chips mounted in close proximity to the processor, an overheating memory stack next to a processor that already tends to run hot is not an ideal scenario for these accelerators.

NVIDIA has been sorting out the design issues in Blackwell before it begins shipping the next batch. Meta and Google have reportedly ordered $10 billion worth of GB200s, while Microsoft is said to have ordered 65,000 units for itself and OpenAI, at prices between $30,000 and $40,000 per unit. Some of NVIDIA’s clients have reportedly postponed their Blackwell orders or asked the company to supply previous-generation Hopper accelerators instead.

If NVIDIA needs more time to sort out Blackwell’s issues, that gives Samsung more time to bring its HBM3E chips up to NVIDIA’s criteria, which might ultimately help it secure a decent number of Blackwell orders.

Samsung already has its sights set on the subsequent generation of accelerators, renewing its focus on HBM4 to prevent SK Hynix from completely dominating this lucrative market.

NVIDIA CEO Jensen Huang keeps hyping up Samsung like a good buddy

Much ink has been spilled over Samsung’s challenges in the lucrative HBM3E market. The company still hasn’t won NVIDIA’s approval to supply these high-bandwidth memory modules for its AI accelerators. Meanwhile, cross-town rival SK Hynix has emerged as the main supplier of HBM3E to NVIDIA.

NVIDIA CEO Jensen Huang has continued to give Samsung hope that its memory chips will soon pass the qualification tests. He has also publicly expressed support for Samsung on several occasions, most recently at CES 2025, where he gave the keynote address.

Samsung supplied the first-ever HBM NVIDIA used

Samsung has been trying to become part of NVIDIA’s HBM3E supply chain for about a year now. Even though its HBM3E 12H memory chip got a “Jensen approved” autograph from Huang himself at GTC 2024 in March, actual approval has eluded Samsung, despite hopes being raised in June/July 2024 and as recently as November.

Huang was asked during a press interaction at CES when Samsung might begin supplying HBM3E to NVIDIA. “They are working on it,” he said, adding that “They’re going to succeed. No question.” He further hyped up Samsung in a way only your best buddy would, saying “I have confidence that Samsung will succeed with HBM. I have confidence like, tomorrow is Wednesday.”

The NVIDIA boss also pointed out that Samsung has a solid legacy in HBM technology: it was the first company to create HBM, and the first high-bandwidth memory NVIDIA ever used came from Samsung. He’s confident that Samsung will revive itself in this lucrative industry.

Samsung and NVIDIA have had a close relationship for many years; even NVIDIA’s latest 50-series GPUs, unveiled at CES 2025, use Samsung’s GDDR7 memory. The continued public praise from NVIDIA’s boss should boost the morale of Samsung’s HBM3E team as it pushes to clear the hurdles that have so far prevented the company from capitalizing on this market.

Read more at: https://www.sammobile.com/news/nvidia-in-secret-talks-with-samsung-to-develop-new-ai-memory-chip/