SK Hynix has announced that at next month’s CES it plans to demonstrate its GDDR6-AiM memory with compute capabilities in action. GDDR6 accelerator-in-memory (AiM) technology is designed to accelerate artificial intelligence and big data processing by bringing basic computational functions into the memory chips themselves.
According to the company, the GDDR6-AiM chip can process data in memory at 16 Gbps, making certain calculations up to 16 times faster than conventional approaches. Such chips are aimed at machine learning, high-performance computing, and big data processing and storage. These workloads do not always demand heavyweight compute performance, but transferring data from memory to the processor takes time and consumes considerable power, so it makes sense to process the data in memory instead.
According to the memory maker, its GDDR6-AiM chips operate at 1.25V, and their use reduces power consumption by 80% compared to moving the data to CPUs and GPUs for processing. The chips are designed to be drop-in compatible with existing GDDR6 memory controllers, so in principle they could even be used with existing graphics cards to improve performance in AI, ML, big data, and HPC workloads.
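The power argument can be illustrated with a back-of-envelope calculation. The per-bit energy costs below are illustrative assumptions for this sketch, not SK Hynix figures; the only premise taken from the article is that moving data off-chip costs far more energy than computing on it near the memory.

```python
# Back-of-envelope comparison: energy spent moving data to a CPU/GPU and
# computing there, versus computing inside the memory device.
# Both per-bit costs are ASSUMED ballpark values for illustration only.
DRAM_TRANSFER_PJ_PER_BIT = 20.0   # assumed off-chip data-movement cost
NEAR_MEMORY_COMPUTE_PJ_PER_BIT = 2.0  # assumed in-memory compute cost

def energy_joules(num_bits: int, pj_per_bit: float) -> float:
    """Total energy in joules to handle num_bits at a given per-bit cost."""
    return num_bits * pj_per_bit * 1e-12

# Hypothetical workload: a 1 GB working set scanned once.
bits = 8 * 1024**3

move_then_compute = energy_joules(
    bits, DRAM_TRANSFER_PJ_PER_BIT + NEAR_MEMORY_COMPUTE_PJ_PER_BIT)
compute_in_memory = energy_joules(bits, NEAR_MEMORY_COMPUTE_PJ_PER_BIT)

saving = 1 - compute_in_memory / move_then_compute
print(f"move + compute: {move_then_compute:.3f} J")
print(f"in-memory:      {compute_in_memory:.3f} J")
print(f"energy saved:   {saving:.0%}")  # ~91% under these assumed costs
```

Under these assumed numbers the saving comes out near the 80% the company claims, which shows how dominant data movement is in the energy budget: even a modest per-bit transfer cost dwarfs the cost of the arithmetic itself.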
SK Hynix completed development of GDDR6-AiM in early 2022, but has so far demonstrated only a limited number of real-world applications. It will therefore be particularly interesting to see which devices SK Hynix shows at the trade show.
SK Hynix isn’t the only memory manufacturer experimenting with processing-in-memory (PIM) technology. Samsung has demonstrated its HBM2 and GDDR6 memory with embedded processing on various occasions over the past two years or so. Even so, PIM has yet to catch on, as most users still prefer traditional CPUs, GPUs, and FPGAs.
Alongside its GDDR6-AiM memory chip, SK Hynix plans to demonstrate a new HBM3 memory device “with the world’s highest specifications for high-performance computing.”