Following Hanmi Semiconductor, another memory giant is accelerating its HBM plans to meet growing demand from AI servers, further fueling growth in the HBM market. Samsung Electronics is reportedly set to begin mass production of High Bandwidth Memory (HBM) chips in the second half of this year to serve the rapidly expanding artificial intelligence (AI) market.
According to reports, Samsung will mass-produce 16GB and 24GB HBM3 memory chips with a data rate of 6.4 Gbps, helping to speed up AI training and computation on servers.
Samsung Executive Vice President Kim Jae-joon said during an April earnings call that the company plans to launch its next-generation HBM3P product in the second half of this year to meet market demand for higher performance and capacity. Beyond HBM, Samsung continues to introduce new memory solutions, such as HBM-PIM (a high-bandwidth memory chip with integrated AI processing capability) and CXL DRAM, to overcome DRAM capacity limitations.
In terms of market structure, data from research firm TrendForce shows that in 2022 the three major HBM suppliers held market shares of roughly 50% for SK Hynix, 40% for Samsung, and 10% for Micron.
TrendForce further noted that the specifications of advanced AI GPUs for deep learning have spurred a generational shift in HBM products. With the rollout of NVIDIA H100 and AMD MI300 in the second half of 2023, all three manufacturers have planned mass production of HBM3 in the corresponding specifications. As more customers are expected to adopt HBM3 this year, SK Hynix, currently the only supplier mass-producing the new-generation HBM3, is projected to raise its overall HBM market share to 53%. Samsung and Micron, expected to begin mass production between the end of this year and early next year, are projected to hold HBM market shares of 38% and 9%, respectively.