AMD's next-generation AI accelerator, MI350. /Courtesy of AMD

It has been confirmed that AMD's next-generation artificial intelligence (AI) accelerator, the 'MI350' series, has adopted Samsung Electronics' fifth-generation high-bandwidth memory product, the 12-layer HBM3E. After Samsung Electronics' repeated failures to qualify as an HBM supplier to NVIDIA, the deal with AMD goes some way toward dispelling rumors of defects in its HBM products.

Samsung Electronics is upbeat following confirmation that it will supply 12-layer HBM3E to AMD. Expectations are also rising that it will supply HBM4, the sixth-generation HBM, for the next-generation MI400 series due out next year. With HBM3E shipments to NVIDIA also possibly beginning as early as this month, forecasts are that Samsung Electronics' HBM business will gain momentum in the second half of the year.

On the 12th (local time), at its 'Advancing AI 2025' event at the San Jose Convention Center in California, AMD announced that its new MI350X and MI355X AI accelerators will carry 12-layer HBM3E from Samsung Electronics and Micron. Although Samsung Electronics had unofficially been known to supply HBM to AMD, this is the first time AMD has officially confirmed it.

The 12-layer HBM3E in the MI350 series is believed to be the 36-gigabyte (GB) 12-layer DRAM that Samsung Electronics finished developing last year. The product vertically stacks twelve 24 Gb (gigabit) DRAM dies using through-silicon via (TSV) technology, providing a total of 36 GB in a single package.
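For readers who want to check the capacity figure, a quick sketch using only the numbers quoted above (per-die density and layer count) confirms the arithmetic:

```python
# Sanity check: twelve 24 Gb DRAM dies per stack should yield 36 GB per package.
GBIT_PER_DIE = 24    # per-die density stated in the article, in gigabits
DIES_PER_STACK = 12  # 12-layer stack

total_gbit = GBIT_PER_DIE * DIES_PER_STACK  # 288 Gb per stack
total_gbyte = total_gbit / 8                # 8 bits per byte
print(total_gbyte)  # 36.0
```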

The 12-layer HBM3E reportedly improves performance and capacity by more than 50% over the previous 8-layer HBM3E. It offers a maximum bandwidth of 1,280 GB per second, with each of its 1,024 input/output (I/O) pins running at up to 10 Gb per second. That is roughly equivalent to transmitting about 40 UHD movies' worth of data every second.
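The bandwidth figure follows directly from the interface width and per-pin speed quoted above; a quick sketch (assuming, as the comparison implies, roughly 32 GB per UHD movie):

```python
# Sanity check: 1,024 I/O pins at 10 Gb/s each should give 1,280 GB/s.
IO_PINS = 1024     # HBM3E interface width from the article
GBPS_PER_PIN = 10  # per-pin data rate from the article, in Gb/s

bandwidth_gbit = IO_PINS * GBPS_PER_PIN  # 10,240 Gb/s
bandwidth_gbyte = bandwidth_gbit / 8     # 1,280 GB/s
print(bandwidth_gbyte)  # 1280.0

# The "40 UHD movies per second" comparison implies ~32 GB per movie:
print(bandwidth_gbyte / 40)  # 32.0
```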

Samsung Electronics applied its 'Advanced TC NCF (thermal compression non-conductive film)' technology to the 12-layer HBM3E process, keeping the package at the same height as the 8-layer product. It also shrank the gap between chips to 7 micrometers, improving vertical density by more than 20%. Another notable feature is the use of differently sized bumps at specific locations to separate signal paths from heat-dissipation paths.

There is a high likelihood that the HBM4 in the MI400 series, which AMD will launch next year, will also come from a collaboration with Samsung Electronics. During the event, AMD announced that the upcoming MI400 series will carry 432 GB of HBM4 per graphics processing unit (GPU). 'Helios', a server rack built from 72 MI400-series GPUs, will pack up to 31 TB of HBM4 in total and deliver ten times the AI computing power of a current-generation MI355X server rack.
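The rack-level memory total is simply per-GPU capacity times GPU count; a quick sketch (using decimal terabytes, as capacity figures in such announcements typically do):

```python
# Sanity check: 72 GPUs x 432 GB of HBM4 each ~ 31 TB per Helios rack.
GPUS_PER_RACK = 72     # MI400-series GPUs per Helios rack, per the article
HBM4_GB_PER_GPU = 432  # HBM4 capacity per GPU, per the article

total_gb = GPUS_PER_RACK * HBM4_GB_PER_GPU  # 31,104 GB
total_tb = total_gb / 1000                  # decimal TB, rounds to "up to 31 TB"
print(total_tb)  # 31.104
```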

HBM4 is regarded as a critical turning point that will determine the HBM market dominance among Samsung Electronics, SK hynix, and Micron. With the recent confirmation of JEDEC standards, HBM4 is entering the production phase. Both Samsung Electronics and SK hynix have set their sights on mass production of HBM4 by the end of this year. In particular, Samsung Electronics, which has fallen behind in the current generation HBM competition, is determined to turn the tide with HBM4.

An industry insider said, "Samsung Electronics is betting everything on HBM4, focused on reclaiming the ground it has lost to SK hynix and Micron." While SK hynix and Micron are manufacturing HBM4 on a fifth-generation 10-nanometer (1b) process, Samsung Electronics plans to produce it on the more advanced sixth-generation 10-nanometer (1c) process. If it succeeds in mass production, it will hold a favorable position relative to the two companies.