US-based Micron has supplied samples of its 6th generation high bandwidth memory (HBM), HBM4, to NVIDIA. The move comes just three months after SK hynix became the first in the world to supply HBM4 samples. Micron, a latecomer to HBM, is rapidly stepping up its pursuit.
According to industry sources on the 11th, Micron announced on the 10th (local time) that it has begun supplying samples of its 12-layer, 36-gigabyte (GB) HBM4 to major customers. Micron explained that HBM4 is manufactured by stacking its 10nm-class 5th generation (1b) DRAM, and that its performance and power efficiency have improved by more than 60% and 20%, respectively, compared to the previous generation of HBM, HBM3E (5th generation).
HBM4 is expected to become a game changer in the AI memory semiconductor market starting next year. Market research firm Omdia said "the supply capability of HBM4 is expected to emerge as a key differentiating factor in future market competition," while TrendForce predicted that HBM4 will overtake HBM3E as the mainstream solution in the second half of next year.
With HBM having emerged in recent years as a key product driving both companies' earnings, Micron and SK hynix plan to accelerate mass production of HBM4 targeting the 'AI giant' NVIDIA.
Micron said, "We plan to significantly expand HBM4 mass production in 2026, in line with our customer's next-generation AI platform mass-production schedule." The AI platform Micron referred to is presumed to be 'Rubin,' which NVIDIA announced at its annual developer conference, GTC 2025, in March and plans to release in the second half of next year.