Micron's memory semiconductor fab./Courtesy of Micron

Until last year, U.S.-based Micron, the world's third-largest memory semiconductor company, had languished at the bottom of the high-bandwidth memory (HBM) market with a roughly 5% share. On the 25th (local time), Micron announced during its earnings report that it is supplying HBM, a key component of AI chips, in volume to four major clients, including AI chip giants Nvidia and AMD. It asserted that by the second half of this year it would lift its HBM market share to about 25%, on par with its overall DRAM market share. Micron, long trailing SK hynix and Samsung Electronics, has made clear its intent to break their duopoly and establish itself as a key supplier in the HBM market.

On this day, during a conference call for the third quarter (March to May) of the 2025 fiscal year, Micron emphasized that the competitiveness of its HBM drove the strong results, striking a confident tone throughout. Third-quarter revenue reached $9.3 billion (about 12.66 trillion won), up 37% from the same period last year, and operating profit soared 165% to $2.49 billion (about 3.39 trillion won), exceeding market expectations. HBM sales surged about 50% from the previous quarter, driving the performance, and revenue from the DRAM segment hit a record high of $7.07 billion (about 9.62 trillion won).

Sanjay Mehrotra, CEO of Micron, said, "In this transformative era where AI generates unprecedented demand for high-performance memory and storage, Micron is uniquely positioned to seize this opportunity." He stated, "Building on the proven success of HBM3E (5th generation HBM), we have gained the trust of major HBM clients and are providing the industry's lowest power consumption and highest performance HBM." He expressed confidence in winning major clients across the AI chip market by diversifying beyond dependence on any single customer, targeting both graphics processing units (GPUs) and application-specific integrated circuits (ASICs).

Micron has established itself as a key supplier to Nvidia, which holds about 80% of the AI accelerator market, demonstrating its technological prowess with HBM3E, which has emerged as the market-leading product this year. It became the second company in the industry to do so, after SK hynix, putting it a step ahead of Samsung Electronics, which has yet to supply HBM3E to Nvidia. Analysts estimate Micron's HBM sales for the third quarter at around $1.5 billion (about 2 trillion won), roughly one-third of SK hynix's estimated HBM sales of around 6 trillion won for the second quarter (April to June). Micron noted, "The yield and production ramp of the 12-layer HBM3E product are progressing very smoothly, and in the fourth quarter its share of shipments will exceed that of the 8-layer products."

Micron also voiced expectations for next-generation products. It stated, "HBM4 (6th generation HBM), samples of which have already been provided to multiple clients, delivers bandwidth exceeding 2.0 TB (terabytes) per second based on proven 1-beta (1β) DRAM technology, more than 60% higher performance than the previous generation." It added, "HBM4 has already cut power consumption by 20% compared with the industry-leading 12-layer HBM3E product, and mass production will begin next year in line with customer plans."

Micron's guidance for the fourth quarter (June to August) of the 2025 fiscal year also exceeded market expectations: revenue of $10.7 billion (about 14.56 trillion won), earnings per share of $2.50, and a gross margin of 42%. Micron also offered a positive outlook for the broader DRAM and NAND markets, forecasting bit demand growth in the high teens (percent) for DRAM and the low teens for NAND this year. CEO Mehrotra said, "For the full 2025 fiscal year, we will achieve record-high revenue, strong profitability, and robust free cash flow."