Micron, Samsung, and SK Hynix preview new HBM4 memory for AI acceleration
Published on: 2025-06-04 17:24:00
Recap: The AI accelerator race is driving rapid innovation in high-bandwidth memory. At this year's GTC event, memory giants Samsung, SK Hynix, and Micron previewed the next-generation HBM4 and HBM4e solutions in their pipelines.
While data center GPUs are transitioning to HBM3e, the memory roadmaps revealed at Nvidia GTC make it clear that HBM4 will be the next big step. ComputerBase attended the event and noted that the new standard enables some serious density and bandwidth improvements over HBM3.
SK Hynix showcased its first 48GB HBM4 stack composed of 16 layers of 3GB chips running at 8Gbps. Likewise, Samsung and Micron had similar 16-high HBM4 demos, with Samsung claiming that speeds will ultimately reach 9.2Gbps within this generation. We should expect 12-high 36GB stacks to become more mainstream for HBM4 products launching in 2026. Micron says that its HBM4 solution will boost performance by over 50 percent compared to HBM3e.
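As a rough sanity check on those figures, here is a minimal sketch of the per-stack capacity and bandwidth math, assuming HBM4's 2048-bit interface per stack (double the 1024-bit width of HBM3e); the function names are illustrative, not from any vendor's tooling.

```python
# Back-of-the-envelope HBM stack math.
# Assumption: HBM4 uses a 2048-bit interface per stack (vs. 1024-bit for HBM3e).

def stack_capacity_gb(layers: int, die_gb: int) -> int:
    """Capacity of one stack: number of DRAM layers times per-die density (GB)."""
    return layers * die_gb

def stack_bandwidth_gbs(pin_speed_gbps: float, bus_width_bits: int = 2048) -> float:
    """Peak bandwidth of one stack in GB/s: per-pin data rate times bus width."""
    return pin_speed_gbps * bus_width_bits / 8

# SK Hynix demo: 16 layers of 3GB dies running at 8Gbps per pin
print(stack_capacity_gb(16, 3))    # 48 GB, matching the 48GB stack shown
print(stack_bandwidth_gbs(8.0))    # 2048 GB/s, roughly 2 TB/s per stack

# Samsung's projected 9.2Gbps end-of-generation speed
print(stack_bandwidth_gbs(9.2))    # ~2355 GB/s per stack
```

Under that assumed 2048-bit width, a single 8Gbps HBM4 stack works out to roughly 2 TB/s, versus around 1.2 TB/s for a fast HBM3e stack on a 1024-bit bus, which is broadly consistent with Micron's claim of a 50-plus percent uplift over HBM3e.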
However, memory makers are
... Read full article.