Kim Gwi-wook, head of HBM development at SK Hynix, said the industry's HBM technology has reached a new level and that demand has prompted SK Hynix to accelerate its development schedule. HBM4E memory could arrive as early as 2026, with memory bandwidth 1.4 times that of HBM4.
In addition to HBM4E, reports indicate that SK Hynix plans to launch its first HBM4 products, built on 12-layer DRAM stacks, in the second half of 2025, with 16-layer stacks following slightly later, around 2026.
Current mainstream HBM3E products use 24Gb DRAM dies, which gives an 8-layer HBM3E stack a capacity of 24GB; with 12-layer stacking, a single HBM3E stack reaches 36GB.
Once future HBM4E memory moves to 32Gb DRAM dies, the 12-layer version can reach 48GB per stack, and the 16-layer version as much as 64GB, opening up more possibilities for AI use cases.
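The capacity figures above follow directly from die density times layer count. A minimal sketch of that arithmetic (the helper function is illustrative, not an SK Hynix specification):

```python
def stack_capacity_gb(die_gbit: int, layers: int) -> int:
    """Per-stack capacity in GB: die density in Gb, divided by 8
    to convert bits to bytes, multiplied by the number of layers."""
    return die_gbit * layers // 8

# HBM3E generation, 24Gb dies
print(stack_capacity_gb(24, 8))   # 8-layer stack  -> 24 GB
print(stack_capacity_gb(24, 12))  # 12-layer stack -> 36 GB

# HBM4E generation, 32Gb dies
print(stack_capacity_gb(32, 12))  # 12-layer stack -> 48 GB
print(stack_capacity_gb(32, 16))  # 16-layer stack -> 64 GB
```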
Kim Gwi-wook predicts that, compared with HBM4, HBM4E will increase bandwidth by 40%, density by 30%, and energy efficiency by 30%.
The accelerated HBM4/HBM4E roadmap underscores the strong demand from AI giants for high-performance memory, as increasingly powerful AI processors require ever higher memory bandwidth.