HeadlinesBriefing.com

Nvidia's Vera Rubin AI Systems to Use HBM4 from SK Hynix, Samsung

TechPowerUp

NVIDIA's upcoming Vera Rubin AI systems, slated for late summer, will rely on HBM4 memory. Surprisingly, Micron will not be supplying this crucial component. Instead, SK Hynix will provide approximately 70% of the HBM4, with Samsung taking the remaining 30%. This shift highlights the competitive dynamics within the high-bandwidth memory market.

This decision marks a change in NVIDIA's memory supplier strategy. While Micron loses out on HBM4, it will still supply LPDDR5X memory for the "Vera" CPUs, providing up to 1.5 TB of memory. System bandwidth has also risen substantially, from an initial target of 13 TB/s to 22 TB/s, underscoring the need for continued memory advancements.
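For scale, the jump from the initial 13 TB/s target to 22 TB/s works out to roughly a 69% increase, as a quick calculation shows (the two bandwidth figures are from the report; the arithmetic is just for illustration):

```python
# Reported VR200 NVL72 system bandwidth figures (TB/s).
initial_tb_s = 13  # initial target
final_tb_s = 22    # current figure

# Percentage increase over the initial target.
increase_pct = (final_tb_s - initial_tb_s) / initial_tb_s * 100
print(f"Bandwidth increase: {increase_pct:.0f}%")  # roughly a 69% jump
```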

The "Vera Rubin" systems, specifically the VR200 NVL72, are designed for next-generation AI model processing. The move to HBM4, and the exclusion of Micron, underscores how critical high-performance memory has become in the AI sector, as memory manufacturers compete to secure design wins with leading tech companies.

Ultimately, this news reflects the evolving memory landscape, with suppliers jockeying for position in the high-stakes AI hardware market. While Micron misses out on HBM4, it remains a key player through its LPDDR5X offerings, and the push for ever-greater bandwidth speaks to the growing demands of AI workloads.