YouTube channel Benchmark Lab decided to test Nvidia's new GeForce RTX 4090 in its own way: by comparing it to two RTX 3090s in SLI.
That's right, SLI, Nvidia's long-deprecated multi-GPU technology, which Benchmark Lab managed to enable through some undisclosed trickery. The channel got SLI working adequately in several DX11, DX12, and Vulkan titles, but in the end the dual 3090s couldn't beat the RTX 4090, though they came close.
Remember, all of this should be taken with a grain (or a few) of salt before you jump on the results. SLI has not been supported for years at this point, and most modern titles, especially DX12 and Vulkan ones, require unusual workarounds to get SLI working at all. Benchmark Lab doesn't explain how SLI was enabled in most of the games tested (especially the DX12 titles), nor is there any way to rule out that the results were doctored or simply estimated. That said, the results are very interesting, if they're actually accurate.
Benchmark Lab tested nine games: Spider-Man Remastered, Cyberpunk 2077, Watch Dogs: Legion, Microsoft Flight Simulator 2020, Minecraft RTX, God of War, The Witcher 3, Horizon Zero Dawn, and Red Dead Redemption 2. The tests were run at 4K resolution with various quality settings and DLSS modes.
In Spider-Man Remastered at maximum settings (with ray tracing and DLSS balanced mode enabled), the RTX 3090s in SLI averaged 80-85 frames per second (fps), while the RTX 4090 managed a significantly higher average of 95 fps.
In Cyberpunk 2077 at maximum settings (with ray tracing and DLSS balanced mode enabled), the RTX 4090 averaged 70 fps, while the RTX 3090s in SLI managed 50 fps.
In Watch Dogs: Legion at maximum settings (with ray tracing and DLSS quality mode enabled), the gap closed a bit: the RTX 3090s in SLI averaged 73 fps, while the RTX 4090 averaged 80 fps.
The result was even closer in Microsoft Flight Simulator at maximum settings (with ray tracing and DLSS quality mode enabled): the RTX 3090s in SLI averaged 80 fps, while the RTX 4090 averaged 83-85 fps. (Note, however, that Microsoft Flight Simulator is known to be CPU-bound, especially at high frame rates.)
In Minecraft RTX at maximum settings (with ray tracing enabled), the RTX 3090s in SLI averaged 70 fps, while the RTX 4090 reached a slightly higher average of 75 fps. And in God of War at maximum settings (with DLSS quality mode enabled), the RTX 3090s in SLI hit an average of 103 fps, while the RTX 4090 averaged a healthy 120 fps.
Check out the video below for the rest of the titles.
RTX 3090 SLI is the only setup that comes close to the RTX 4090
Excepting the Cyberpunk 2077 result, the RTX 3090s in SLI land about 8-15% behind the RTX 4090, which is not bad at all for an SLI implementation and puts the setup within the RTX 4090's performance bracket. To put that in perspective, the next closest single GPU is the RTX 3090 Ti, which the RTX 4090 outperforms by over 50% based on our tests.
Unfortunately, the SLI setup didn't live up to its full potential, as the secondary 3090 consistently maxed out at around 45% utilization. This is one of the classic pitfalls of SLI, where poor optimization leads to low GPU utilization on the secondary card. Hypothetically, if you could get close to 100% utilization on both GPUs, the 3090s in SLI would likely pull well ahead of the 4090.
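To see why that 45% figure matters so much, here is a back-of-the-envelope sketch. It uses a toy model (an assumption of ours, not anything from the video) in which frame rate scales linearly with combined GPU utilization, which real SLI never quite achieves, so treat the projection as an upper bound rather than a prediction:

```python
# Toy model: estimate SLI scaling headroom from secondary-GPU utilization.
# Assumes perfectly linear scaling with utilization -- real SLI never
# achieves this, so the "ideal" number is an upper bound, not a prediction.

def sli_projection(measured_fps: float, secondary_util: float) -> dict:
    """Project ideal dual-GPU fps from a measured SLI result.

    measured_fps:   average fps observed with SLI enabled
    secondary_util: utilization of the second card (0.0 - 1.0)
    """
    effective_gpus = 1.0 + secondary_util          # primary at 100%, secondary partial
    single_gpu_fps = measured_fps / effective_gpus # implied single-card baseline
    ideal_fps = single_gpu_fps * 2.0               # both cards fully utilized
    return {
        "single_gpu_fps": round(single_gpu_fps, 1),
        "ideal_dual_fps": round(ideal_fps, 1),
    }

# Spider-Man Remastered numbers from the video: ~82 fps average with SLI,
# with the secondary 3090 capped at ~45% utilization.
print(sli_projection(82.0, 0.45))
# -> {'single_gpu_fps': 56.6, 'ideal_dual_fps': 113.1}
```

Under this (generous) model, fully utilized dual 3090s would clear 110 fps in that title, well past the 4090's 95 fps, which is the intuition behind the hypothetical above.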
However, that is highly unlikely, as Nvidia has officially ditched SLI (in the form of a physical bridge) on the RTX 40 series. SLI support has dwindled significantly over the past few years, and at this point it's basically only useful for synthetic benchmarks.
There are technical ways to (unofficially) enable SLI in games that don't support it, as was presumably done for the benchmarks in this video. However, these hacks are unpredictable at best, and usually result in system instability or severe micro-stuttering while gaming.
However, there is hope for multi-GPU rendering. Multi-GPU workloads are very common in the enterprise space, and Nvidia also offers multi-GPU technology that doesn't require NVLink or SLI bridges.
On the gaming side, modern APIs like DX12 and Vulkan can split frame rendering across two completely different GPUs working in tandem. As such, multi-GPU tech may eventually return to the gaming space (though whether developers want to support it in their games is another story).