Samsung Electronics has begun shipping its latest high-bandwidth memory chips, HBM4, to customers, closing the gap with competitors as demand surges from data centre operators deploying Nvidia's AI processors.
The South Korean chipmaker said on Thursday that the new chips are already being delivered, though it did not name customers.
The announcement comes as demand for high-bandwidth memory rises steeply, driven by the global build-out of data centres used to train and run advanced artificial intelligence systems.
HBM is a form of dynamic random-access memory designed to handle very large volumes of data at high speed. It has become an essential component in modern AI processors.
Samsung, the world’s largest memory chipmaker by revenue, has struggled in recent years to keep pace with competitors in earlier generations of the technology.
The company said its HBM4 chips provide a consistent data transfer speed of 11.7 gigabits per second, about 22% faster than the previous HBM3E generation. It added that the chips can reach a maximum of 13 gigabits per second, easing data bottlenecks as workloads grow heavier.
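The 22% comparison is easy to verify under one assumption not stated in Samsung's announcement: that the quoted figures are per-pin transfer rates and that the HBM3E baseline is 9.6 gigabits per second, the commonly cited top speed of that generation. A minimal sketch of the arithmetic:

    # Rough check of the quoted speed uplift.
    # Assumptions (not from Samsung's statement): per-pin rates, HBM3E baseline of 9.6 Gb/s.
    hbm4_pin_speed_gbps = 11.7
    hbm3e_pin_speed_gbps = 9.6   # assumed baseline
    uplift = hbm4_pin_speed_gbps / hbm3e_pin_speed_gbps - 1
    print(f"Speed uplift: {uplift:.0%}")   # ~22%, matching the stated figure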
Samsung also said it plans to provide samples of its next version, HBM4E, in the second half of the year.
Sources say Samsung began mass production and shipments of HBM4 chips in February 2026, with Nvidia graphics processors among the key targets.
Those chips are expected to power Nvidia’s upcoming Vera Rubin AI accelerator platform, due for launch in the second half of the year.
The HBM4 chips are built on a 4-nanometre logic process and offer capacities between 24 and 36 gigabytes, with plans to scale up to 48 gigabytes.
Samsung says the new generation delivers up to 3 terabytes per second of bandwidth per stack, roughly 2.4 times that of HBM3E, alongside a 40% improvement in power efficiency and better thermal control.
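The per-stack bandwidth figure is consistent with that pin speed if one assumes the interface widths defined in the JEDEC specifications, 2,048 bits per stack for HBM4 and 1,024 bits for HBM3E; those widths are not taken from Samsung's statement. A short check of the arithmetic:

    # Per-stack bandwidth = pin speed (Gb/s) * interface width (bits) / 8 bits per byte.
    # Interface widths are assumptions based on the JEDEC specs, not Samsung's statement.
    hbm4_gbps, hbm4_width = 11.7, 2048
    hbm3e_gbps, hbm3e_width = 9.6, 1024
    hbm4_bw = hbm4_gbps * hbm4_width / 8 / 1000     # ~3.0 TB/s per stack
    hbm3e_bw = hbm3e_gbps * hbm3e_width / 8 / 1000  # ~1.2 TB/s per stack
    print(f"HBM4: {hbm4_bw:.1f} TB/s, ratio vs HBM3E: {hbm4_bw / hbm3e_bw:.1f}x")  # ~2.4x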
Competition in the market is getting tighter. SK Hynix said in January that it aims to maintain its “overwhelming” market share in next-generation HBM4 chips, which it said were already in volume production.
The company added that it plans to achieve production yields for HBM4 in line with those of HBM3E.
Micron has also moved early. The company’s chief financial officer said it is in high-volume production of HBM4 and has begun shipping the chips to customers.
Samsung’s renewed push underlines its effort to catch up after falling behind in earlier HBM cycles. Investors welcomed the update, and Samsung shares ended the day up 6.4%.
Memory bandwidth has become as important as processing power in modern data centres, and demand shows no sign of slowing as companies expand AI capacity worldwide.




