Samsung Outs World’s First GDDR7 Chip: 32 GT/s for Next-Gen GPUs
In an unexpected move, Samsung announced late on Thursday that it had completed development of the industry’s first GDDR7 memory chip. The new device features a data transfer rate of 32 GT/s, uses Pulse Amplitude Modulation (PAM3) signaling, and promises a 20% power efficiency improvement over GDDR6. To get there, Samsung had to adopt several new technologies.
Samsung’s first 16Gb GDDR7 device will feature a 32 GT/s data transfer rate, resulting in 128 GB/s of bandwidth per chip, a significant increase over the 89.6 GB/s offered by 22.4 GT/s GDDR6X. To put it into perspective, a 384-bit memory subsystem built from 32 GT/s GDDR7 chips offers a whopping 1.536 TB/s of bandwidth, well above the GeForce RTX 4090’s 1.008 TB/s.
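The bandwidth figures above follow directly from the data rate and the interface width. A minimal sketch of the arithmetic (assuming the standard 32-bit per-chip interface for GDDR memory):

```python
# Illustrative arithmetic only: peak bandwidth = data rate x bus width / 8.

def bandwidth_gbps(data_rate_gtps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s."""
    return data_rate_gtps * bus_width_bits / 8

# A single GDDR7 chip: 32-bit interface at 32 GT/s.
per_chip = bandwidth_gbps(32, 32)        # 128.0 GB/s
# A single GDDR6X chip at 22.4 GT/s.
gddr6x_chip = bandwidth_gbps(22.4, 32)   # 89.6 GB/s
# A 384-bit subsystem populated with 32 GT/s GDDR7.
subsystem = bandwidth_gbps(32, 384)      # 1536.0 GB/s, i.e. 1.536 TB/s

print(per_chip, gddr6x_chip, subsystem)
```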
To achieve unprecedented high data transfer speeds, GDDR7 uses PAM3 signaling, a form of pulse amplitude modulation featuring three different signal levels (-1, 0, and +1). This mechanism allows the transfer of 3 bits of data in 2 cycles and is more efficient than 2-level NRZ, the scheme used in GDDR6. However, it is important to note that PAM3 signals are more complex to generate and decode than NRZ signals (which means additional power consumption) and can be more susceptible to noise and interference. On the other hand, PAM3’s benefits seem to outweigh its challenges, so it will be adopted by both GDDR7 and USB4 v2.
In addition to higher performance, Samsung’s 32 GT/s GDDR7 chips are said to be 20% more power efficient than 24 GT/s GDDR6, though the company does not clarify how it measures power efficiency. Memory makers usually measure the power per bit transferred, which makes sense, and from this perspective GDDR7 promises to be more efficient than GDDR6.
On the other hand, this does not mean that GDDR7 memory chips and GDDR7 memory controllers will consume less power than today’s GDDR6 ICs and controllers: PAM3 encoding/decoding is more complex and therefore requires more power. In fact, Samsung says it used an epoxy molding compound (EMC) with high thermal conductivity and 70% lower thermal resistance for GDDR7 packaging in order to keep the active components (the IC itself) from overheating. This suggests that GDDR7 memory devices run hotter than GDDR6 devices, especially at high clocks.
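A quick back-of-the-envelope sketch shows why better per-bit efficiency need not mean lower total power. Assuming the 20% figure refers to energy per transferred bit (which, as noted, Samsung has not confirmed), the jump from 24 GT/s to 32 GT/s moves 33% more bits per second, which outweighs the per-bit saving:

```python
# Back-of-the-envelope sketch, assuming "20% more power efficient" means
# 20% less energy per transferred bit (Samsung has not confirmed its metric).

gddr6_rate = 24.0   # GT/s
gddr7_rate = 32.0   # GT/s
energy_ratio = 0.8  # GDDR7 energy per bit relative to GDDR6

# Interface power scales with (bits per second) x (energy per bit).
relative_power = (gddr7_rate / gddr6_rate) * energy_ratio
print(f"{relative_power:.3f}")  # ~1.067: roughly 7% more total power
```

Under these assumptions, a GDDR7 interface at full speed would draw on the order of 7% more power than its GDDR6 predecessor despite being 20% more efficient per bit, which is consistent with the need for better-conducting packaging.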
It’s also worth noting that Samsung’s GDDR7 components offer a lower operating voltage option for applications such as laptops, though the company has not said what kind of performance to expect from such devices.
To tell the truth, Samsung’s announcement is a bit short on details. The company has not disclosed when GDDR7 components will enter mass production or what process technology it plans to use. Given that AMD and Nvidia announce new GPU architectures roughly every two years, it is natural to expect next-generation graphics processors to hit the market in 2024, and they are likely candidates to adopt GDDR7.
Meanwhile, Samsung expects artificial intelligence, high-performance computing, and automotive applications to use GDDR7 as well, so some AI or HPC ASICs may adopt GDDR7 before GPUs do.
“Our GDDR7 DRAM will help improve the user experience in areas that require superior graphics performance, such as workstations, PCs, and game consoles, and is expected to expand into future applications such as AI, high-performance computing (HPC), and automotive,” said Young-cheol Bae, executive vice president of the memory product planning team at Samsung Electronics. “Next-generation graphics DRAM will be brought to market in line with industry demand, and we plan to continue our leadership in this space.”