SK Hynix Starts Mass Production of HBM3E: 9.2 GT/s


SK Hynix said that it had started volume production of its HBM3E memory and would supply it to a customer in late March. The South Korean company is the second DRAM producer to announce mass production of HBM3E, so the market for ultra-high-performance memory will have some competition, which is good news for companies planning to use HBM3E.

According to specifications, SK Hynix’s HBM3E known good stack dies (KGSDs) feature data transfer rates up to 9.2 GT/s, a 1024-bit interface, and a bandwidth of 1.18 TB/s, which is massively higher than the 6.4 GT/s and 819 GB/s offered by HBM3. The company does not say whether it is mass producing 8-Hi 24 GB HBM3E memory modules or 12-Hi 36 GB HBM3E devices, but it will likely begin its HBM3E ramp with lower-capacity products as they are easier to make.
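For reference, the quoted per-stack bandwidth follows directly from the data transfer rate and the 1024-bit interface width. The short Python sketch below (the helper name is purely illustrative, not any vendor API) reproduces both the HBM3 and HBM3E figures cited above.

```python
# Minimal sketch: deriving per-stack HBM bandwidth from data rate and bus width.
# The numbers are the HBM3/HBM3E figures quoted in the article; the function
# name is illustrative only.

def stack_bandwidth_gbps(transfer_rate_gtps: float, interface_bits: int = 1024) -> float:
    """Peak bandwidth in GB/s for one HBM stack: rate (GT/s) x bus width (bytes)."""
    return transfer_rate_gtps * (interface_bits / 8)

if __name__ == "__main__":
    hbm3 = stack_bandwidth_gbps(6.4)   # ~819 GB/s, matching HBM3
    hbm3e = stack_bandwidth_gbps(9.2)  # ~1178 GB/s, i.e. ~1.18 TB/s for HBM3E
    print(f"HBM3:  {hbm3:.0f} GB/s per stack")
    print(f"HBM3E: {hbm3e:.0f} GB/s per stack")
```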

We already know that SK Hynix’s HBM3E stacks employ the company’s advanced Mass Reflow Molded Underfill (MR-MUF) technology, which promises to improve heat dissipation by 10%. This technology involves the use of an enhanced underfill between DRAM layers, which not only improves heat dissipation but also reduces the thickness of HBM stacks. As a result, 12-Hi HBM stacks can be constructed that are the same height as 8-Hi modules. However, this does not necessarily imply that the stacks currently in mass production are 12-Hi HBM3E stacks.

Although the memory maker does not officially confirm this, SK Hynix’s 24GB HBM3E stacks will arrive just in time to address NVIDIA’s Blackwell accelerator family for artificial intelligence and high-performance computing applications.

"With the success story of the HBM business and the strong partnership with customers that it has built for years, SK Hynix will cement its position as the total AI memory provider," said Sungsoo Ryu, Head of HBM Business at SK Hynix. As a result, NVIDIA will have access to HBM3E memory from multiple suppliers, with both Micron and SK Hynix now shipping it in volume.

Meanwhile, AMD recently confirmed that it was looking to expand its Instinct MI300-series lineup for AI and HPC applications with higher-performance memory configurations, so SK Hynix’s HBM3E memory could find a home there as well.
