Nvidia Reportedly Interested in Using SK Hynix HBM3E Memory
Nvidia is reportedly interested in evaluating SK Hynix's HBM3E samples, according to industry sources cited in a DigiTimes report. If the information is accurate, Nvidia's next-generation compute GPUs for artificial intelligence and high-performance computing applications could use HBM3E memory instead of HBM3.
According to industry sources cited by South Korea's Money Today and the Seoul Economic Daily, Nvidia has requested HBM3E samples from SK Hynix to evaluate their impact on GPU performance.
SK Hynix's upcoming HBM3E memory will increase the data transfer rate from the current 6.40 GT/s to 8.0 GT/s, raising bandwidth per stack from 819.2 GB/s to 1 TB/s. However, SK Hynix has not yet released detailed specifications, so it remains unclear whether HBM3E will be compatible with existing HBM3 controllers and interfaces. Either way, Nvidia and other developers of AI and HPC compute GPUs will need to evaluate the technology.
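The per-stack figures follow directly from the interface width: HBM3 uses a 1024-bit bus per stack, so bandwidth is simply transfer rate times bus width. A quick sanity check of the article's numbers (assuming HBM3E keeps the same 1024-bit interface, which SK Hynix has not confirmed):

```python
# Per-stack bandwidth (GB/s) = transfer rate (GT/s) * bus width (bits) / 8 bits-per-byte.
# HBM3 uses a 1024-bit interface per stack; we assume HBM3E keeps it (unconfirmed).
BUS_WIDTH_BITS = 1024

def stack_bandwidth_gbs(rate_gts: float, bus_width_bits: int = BUS_WIDTH_BITS) -> float:
    """Return per-stack bandwidth in GB/s for a given transfer rate in GT/s."""
    return rate_gts * bus_width_bits / 8

print(stack_bandwidth_gbs(6.40))  # HBM3:  819.2 GB/s
print(stack_bandwidth_gbs(8.00))  # HBM3E: 1024.0 GB/s, i.e. ~1 TB/s
```

This matches the 819.2 GB/s and 1 TB/s figures above, which suggests the quoted rates indeed refer to a full 1024-bit stack.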
SK Hynix reportedly plans to begin sampling HBM3E memory in late 2023 and to start mass production in late 2023 or 2024. The company intends to manufacture HBM3E on its 1b-nanometer process, its fifth-generation 10nm-class DRAM node. The same process is currently used to produce DDR5-6400 DRAM as well as LPDDR5T memory chips for high-performance, low-power applications.
It is not yet known which of Nvidia's compute GPUs will use HBM3E memory, but the company's next-generation processors due in 2024, whether a revamped Hopper GH100 compute GPU or something entirely new, will likely adopt the new type of memory.
SK Hynix currently controls over 50% of the HBM memory market and is the only company supplying HBM3. It will also, at least initially, be the exclusive manufacturer of HBM3E.
Market research firm Yole Development predicts that the market for HBM memory will grow significantly due to its superior bandwidth compared to other types of DRAM. The company expects the market, worth $705 million in 2023, to nearly double to $1.324 billion by 2027.
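As a rough check on that forecast, the implied compound annual growth rate can be computed from the two data points (the $705 million and $1,324 million figures are from the article; treating 2023 to 2027 as a four-year horizon is my assumption):

```python
# Implied CAGR from Yole's forecast: $705M (2023) -> $1,324M (2027), over 4 years.
start, end, years = 705.0, 1324.0, 4

cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # roughly 17% per year
```

An annual growth rate around 17% is consistent with the market "nearly doubling" over four years.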