NVIDIA’s Next-Gen Blackwell GPUs to Likely Use HBM3E Memory, Claims Micron

NVIDIA may adopt Micron's HBM3E memory to power its next-gen Blackwell GPUs. Slated to arrive in mid-to-late 2024, these Tensor Core GPUs (accelerators) will train the most complex neural networks. Micron plans to start volume shipments of its HBM3E memory in the first half of 2024, and it was the first vendor to unveil HBM3E, well ahead of rivals Samsung and SK Hynix.

Micron has announced that NVIDIA is currently qualifying its HBM3E memory for data center GPUs that will drive future HBM3E-powered AI solutions. For now, NVIDIA's only announced HBM3E product is the Grace Hopper CPU-GPU combo, which pairs the Arm-based Grace CPU with the GH200 "Hopper" GPU. This can only mean we're looking at an unreleased SKU, likely the GB100 or "B100" Blackwell Tensor Core GPU.

The introduction of our HBM3E product offering has been met with strong customer interest and enthusiasm. We have been working closely with our customers throughout the development process and are becoming a closely integrated partner in their AI roadmaps.
Micron HBM3E is currently in qualification for NVIDIA compute products, which will drive HBM3E-powered AI solutions. We expect to begin the production ramp of HBM3E in early calendar 2024 and to achieve meaningful revenues in fiscal 2024.

Sanjay Mehrotra, Micron CEO

Micron is primarily known for manufacturing DRAM (computer memory) and NAND-based SSDs. In the HBM market, it is a minor player with a share of just 10%. Its HBM3E, also branded HBM3 Gen 2, is meant to be its ticket into the lucrative AI memory segment currently dominated by SK Hynix and Samsung.

Micron's HBM3E comes in 24GB packages built from 8-Hi stacks of 24Gbit memory dies fabbed on its 1β (1-beta) process. The interface runs at up to 9.2GT/s per pin, allowing peak bandwidth of up to 1.2TB/s per 8-Hi stack. Existing HBM3 solutions top out just over 0.83TB/s, giving Micron's HBM3 Gen 2 memory a roughly 44% advantage. The company also plans to beef up its HBM portfolio with 12-Hi 36GB stacks in the latter half of 2024.
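The bandwidth figures above follow directly from the per-pin rate and the HBM interface width. A minimal sketch of the arithmetic, assuming the standard 1024-bit-wide HBM stack interface and a typical 6.4GT/s per-pin rate for today's HBM3 (the HBM3 rate is an assumption for comparison, not a figure from the article):

```python
# Back-of-the-envelope HBM stack bandwidth check.
# Assumes the standard 1024-bit HBM interface per stack.
HBM_BUS_WIDTH_BITS = 1024

def stack_bandwidth_tbps(pin_rate_gtps: float) -> float:
    """Peak bandwidth of one HBM stack in TB/s for a given per-pin rate.

    Gbit/s per pin * pins / 8 bits-per-byte / 1000 GB-per-TB.
    """
    return pin_rate_gtps * HBM_BUS_WIDTH_BITS / 8 / 1000

hbm3e = stack_bandwidth_tbps(9.2)  # Micron HBM3E at 9.2 GT/s per pin
hbm3 = stack_bandwidth_tbps(6.4)   # assumed HBM3 rate of 6.4 GT/s per pin

print(f"HBM3E: {hbm3e:.2f} TB/s")            # ~1.18 TB/s, i.e. the ~1.2TB/s quoted
print(f"HBM3:  {hbm3:.2f} TB/s")             # ~0.82 TB/s
print(f"Advantage: {hbm3e / hbm3 - 1:.0%}")  # ~44%
```

At 9.2GT/s the stack lands at about 1.18TB/s, which rounds to the quoted 1.2TB/s, and the ratio over HBM3 works out to the article's roughly 44% advantage.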
