The developers of CapFrameX, a popular capture and analysis tool, compared AMD's Radeon RX 6800 XT (RDNA 2), Intel's Arc A770 (Alchemist), and Nvidia's GeForce RTX 3090 (Ampere) and GeForce RTX 4090 (Ada Lovelace) at AV1 video playback in 8K resolution. The results they got are pretty surprising.
Top graphics cards like Nvidia's GeForce RTX 3090 and RTX 4090 deliver incredible performance in demanding high-resolution games, unlike Intel's Arc A770, which targets mainstream gamers. But when it comes to playing high-definition video, everything hinges on the dedicated video decoding unit, fixed-function hardware that doesn't depend on the GPU's overall compute capabilities.
To test the decoding capabilities of modern graphics processors, the CapFrameX developers downloaded an 8K 60 FPS video of Japan from YouTube and played it back in the Chrome browser at 4K and 8K resolution. Ideally, every GPU should sustain a constant 60 FPS, and the 0.2% and 1% low frame rates should not drop far enough to ruin the experience.
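The "0.2% low" and "1% low" figures summarize the worst moments of playback rather than the average. As a rough illustration only (CapFrameX's exact definition may differ), an x% low FPS value can be estimated by averaging the slowest x% of frame times:

```python
def percent_low_fps(frame_times_ms, fraction):
    """Average FPS over the slowest `fraction` of frames.

    A sketch of an 'x% low' metric; not CapFrameX's exact implementation.
    """
    worst = sorted(frame_times_ms, reverse=True)  # slowest frames first
    n = max(1, int(len(frame_times_ms) * fraction))
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# 99 smooth frames at 60 FPS (~16.7 ms each) plus one 50 ms hitch:
times = [1000 / 60] * 99 + [50.0]
print(round(percent_low_fps(times, 0.01), 1))  # 20.0 -- the single hitch dominates the 1% low
```

This is why a card can average near 60 FPS yet still feel stuttery: a handful of slow frames drags the low-percentile figure far below the average.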
At 8K, Intel's Arc A770 delivers smooth playback at an average of 60 FPS, dropping to 44 FPS in only 0.2% of cases. In contrast, Nvidia's GPUs averaged 56.8 to 57.6 FPS and dropped to 16.7 FPS at times, making the video uncomfortable to watch. The open question is whether the drivers and Google Chrome can take full advantage of Nvidia's latest NVDEC hardware (and whether Nvidia has actually updated the decoder compared to the GA102). On AMD's Radeon RX 6800 XT, 8K videos were "unwatchable" due to low frame rates and stuttering.
In general, Intel offers industry-leading video playback support, even at 8K, while Nvidia's software may have to catch up with Intel's when it comes to high-resolution AV1 playback. AMD's Navi 21 officially supports AV1 decoding, but 8K playback appears to be broken, at least with current software.