FreeSync vs. G-Sync 2022: Which Variable Refresh Tech Is Best?
In the last few years, the best gaming monitors have been enjoying something of a renaissance thanks to Adaptive-Sync technologies: Nvidia G-Sync and AMD FreeSync. Back in the 1990s, the most performance-seeking gamers could hope for was a higher resolution or a refresh rate above 60 Hz. Technology has moved on. So in this era of gaming displays, which Adaptive-Sync technology reigns supreme in the battle between FreeSync and G-Sync?
There are also next-gen graphics cards on the horizon, such as Nvidia's GeForce RTX 4090 and other Ada Lovelace GPUs with DLSS 3 technology that could double frame rates even at 4K. AMD's RDNA 3-based Radeon RX 7000-series GPUs are also coming soon, which should boost performance and make higher-quality displays more worthwhile.
For starters, Adaptive-Sync means your monitor's refresh cycle is synchronized with the rate at which the connected PC's graphics card renders each frame of video, even if that rate changes. Games render frames in sequence, but the frame rate varies greatly depending on the complexity of the scene being rendered. A monitor with a fixed refresh rate refreshes the screen at a set interval: 60 times per second on a 60 Hz display, for example. What happens if a new frame is ready before the scheduled refresh?
You have a couple of options. One is to make the GPU wait to send the new frame to the display until the next refresh. This can increase system latency and make games feel less responsive. The other is to let the GPU send the new frame immediately, so the monitor starts drawing it mid-refresh. The result is called tearing, and you can see it in the image above.
G-Sync (for Nvidia GPUs) and FreeSync (for AMD and, potentially, Intel GPUs) aim to solve these issues, offering maximum performance with the lowest latency and no tearing. The GPU sends a "frame ready" signal to the G-Sync or FreeSync monitor, which draws the new frame and then waits for the next "frame ready" signal. This eliminates tearing artifacts.
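To make that latency tradeoff concrete, here's a minimal Python sketch comparing how long a finished frame waits before display on a fixed 60 Hz panel with V-Sync versus an Adaptive-Sync panel. The numbers and function names are illustrative only, not vendor specifications.

```python
import math

REFRESH_INTERVAL_MS = 1000 / 60  # a fixed 60 Hz panel scans out every ~16.7 ms

def wait_with_vsync(frame_ready_ms: float) -> float:
    """On a fixed-refresh display with V-Sync, a frame finished between two
    scan-outs sits in the buffer until the next scheduled refresh."""
    next_refresh = math.ceil(frame_ready_ms / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS
    return next_refresh - frame_ready_ms  # added display latency in ms

def wait_with_adaptive_sync(frame_ready_ms: float) -> float:
    """With G-Sync/FreeSync, the monitor holds its refresh until the GPU's
    frame-ready signal arrives, so a finished frame is drawn immediately."""
    return 0.0

# A frame that finishes 1 ms after a scheduled refresh waits almost a full cycle:
print(round(wait_with_vsync(17.7), 1))  # 15.6 (ms of extra latency)
print(wait_with_adaptive_sync(17.7))    # 0.0
```

The sketch ignores scan-out time and tearing, but it shows why a fixed refresh adds up to a full refresh interval of latency whenever frame timing and refresh timing drift apart.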
Today, you can find countless monitors (including non-gaming monitors) boasting some flavor of G-Sync, FreeSync or both. If you haven't committed to a graphics card brand yet, or have the option to use either, you may be wondering which is best for you when weighing FreeSync against G-Sync. And does one offer a gaming advantage over the other?
FreeSync vs G-Sync
| FreeSync | FreeSync Premium | FreeSync Premium Pro | G-Sync | G-Sync Ultimate | G-Sync Compatible |
|---|---|---|---|---|---|
| No price premium | No price premium | No price premium | HDR and extended color support | 144 Hz or higher refresh rate | Validated artifact-free performance |
| 60 Hz or higher refresh rate | 120 Hz or higher refresh rate | 120 Hz or higher refresh rate | Frame doubling below 30 Hz for guaranteed Adaptive-Sync at all frame rates | Factory-calibrated, accurate SDR (sRGB) and HDR (P3) color gamuts | G-Sync Compatible monitors also run FreeSync |
| Many FreeSync monitors can also run G-Sync | Low Framerate Compensation (LFC) | HDR and extended color support | Ultra Low Motion Blur (ULMB) | "Realistic" HDR support | |
| May support HDR | May support HDR | Low Framerate Compensation (LFC) | Variable LCD overdrive | No peak output specified, but most offer at least 600 nits | |
| | Many FreeSync Premium monitors can also run G-Sync in HDR | Many FreeSync Premium Pro monitors can also run G-Sync in HDR | | Optimized latency | |
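As a quick summary of the FreeSync side of the table above, here's a toy Python function that maps a monitor's specs to the highest tier those requirements would allow. The thresholds come from the table; the function and its parameter names are hypothetical, not any AMD API.

```python
def freesync_tier(max_refresh_hz: int, has_lfc: bool, has_hdr: bool) -> str:
    """Return the highest FreeSync tier the table's requirements would allow."""
    if max_refresh_hz >= 120 and has_lfc and has_hdr:
        return "FreeSync Premium Pro"  # 120 Hz+, LFC, plus HDR/extended color
    if max_refresh_hz >= 120 and has_lfc:
        return "FreeSync Premium"      # 120 Hz+ with Low Framerate Compensation
    if max_refresh_hz >= 60:
        return "FreeSync"              # baseline tier: 60 Hz or higher
    return "not certified"

print(freesync_tier(144, has_lfc=True, has_hdr=True))   # FreeSync Premium Pro
print(freesync_tier(75, has_lfc=False, has_hdr=False))  # FreeSync
```

Note that the real certifications involve more than these three checks; this only captures the headline requirements listed in the table.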
At their most basic, G-Sync and FreeSync do the same thing: both synchronize the monitor to the graphics card and let that component control the refresh rate on a continuously variable basis. Individual monitors may exceed the baseline requirements. For example, FreeSync monitors don't need to support HDR, but some do, and select FreeSync monitors reduce motion blur via proprietary partner technologies such as Asus ELMB Sync.
Can users tell the difference between the two? In our experience, there is no visual difference between FreeSync and G-Sync when the frame rates and monitor quality are the same. However, there is no guarantee such equivalence will hold between any two given monitors.
We ran a blind test in 2015 and found that, with all other parameters equal between FreeSync and G-Sync monitors, G-Sync performed slightly better than the then-still-new FreeSync. But a lot has happened since then. In our monitor reviews, we've highlighted several things that can add to or detract from your gaming experience and have little or nothing to do with refresh rates or Adaptive-Sync technology.
G-Sync Ultimate claims to deliver "realistic HDR," but HDR quality is subjective at this point. So it comes down to the feature sets of the competing technologies. What does that mean? Let's take a look.
G-Sync Features
G-Sync monitors typically include the extra hardware required to support Nvidia's version of adaptive refresh, which makes them more expensive. When G-Sync was new (Nvidia introduced it in 2013), the G-Sync version of a display cost about $200 more than an otherwise identical model. Today, that premium is closer to $100.
However, FreeSync monitors can also be certified as G-Sync Compatible. Certification can be granted retroactively, meaning a monitor can run G-Sync within Nvidia's parameters despite lacking Nvidia's proprietary scaler hardware. A visit to Nvidia's website will show you the list of monitors certified to run G-Sync. Technically, you can run G-Sync on monitors that aren't G-Sync Compatible certified, but the quality and experience aren't guaranteed. For more, see our article on how to run G-Sync on a FreeSync monitor and whether you should care if your monitor is G-Sync Compatible.
G-Sync monitors offer some guarantees that aren't always available on their FreeSync counterparts. One is blur reduction in the form of a backlight strobe; ULMB is Nvidia's name for this feature, and some FreeSync monitors offer it under a different name. It works as an alternative to Adaptive-Sync, and some people prefer it, believing it feels less laggy, though our testing has not been able to demonstrate this. In any case, when running at 100 frames per second (fps) or more, blur is generally not an issue and input lag is very low, so you might as well leave G-Sync on to keep things tight.
G-Sync also guarantees no frame tearing, even at the lowest refresh rates. Below 30 Hz, a G-Sync monitor doubles frame rendering (thereby doubling the refresh rate) so it can keep running within its adaptive refresh range.
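That frame-doubling idea can be sketched in a few lines of Python. This is only an illustration of the low-framerate-compensation principle just described, not Nvidia's actual module logic; the 30 Hz floor is taken from the article, and the function name is our own.

```python
def lfc_refresh_hz(fps: int, vrr_min: int = 30):
    """When the game's frame rate falls below the panel's minimum variable
    refresh rate, repeat each frame until the effective refresh rate is back
    inside the VRR range. Returns (refresh_hz, repeats_per_frame)."""
    repeats = 1
    while fps * repeats < vrr_min:
        repeats += 1  # show each rendered frame one more time
    return fps * repeats, repeats

print(lfc_refresh_hz(25))  # (50, 2): 25 fps shown with each frame doubled
print(lfc_refresh_hz(12))  # (36, 3): each frame shown three times
```

Because the panel still refreshes inside its supported range, the monitor never has to fall back to a fixed refresh rate, which is how tearing is avoided even at very low frame rates.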
FreeSync Features
FreeSync uses Adaptive-Sync, an open standard created by VESA and part of VESA's DisplayPort specification, which gives it a price advantage over G-Sync.
DisplayPort 1.2a and later can support adaptive refresh rates. A manufacturer may choose not to implement it, but since the hardware capability already exists, there is no additional production cost to add FreeSync. FreeSync also works with HDMI 2.0b and above. (See our DisplayPort vs. HDMI analysis to understand which is best for gaming.)
Due to its open nature, FreeSync implementation varies widely from monitor to monitor. Budget displays typically get FreeSync and a 60 Hz or higher refresh rate. The cheapest displays likely won't offer blur reduction, and the lower end of the Adaptive-Sync range may be just 48 Hz. However, according to AMD, there are FreeSync (and G-Sync) displays that operate down to 30 Hz or even lower.
In theory, FreeSync's Adaptive-Sync works much like G-Sync's. In practice, the cheapest FreeSync displays (especially older models) may not look great. More expensive FreeSync monitors add blur reduction and Low Framerate Compensation (LFC) to compete with their G-Sync counterparts.
You can also run G-Sync on a FreeSync monitor that isn't Nvidia-certified, but performance may suffer. These days, most monitors opt for FreeSync support since it's virtually free, and higher-quality displays are often also certified by Nvidia as G-Sync Compatible.
FreeSync vs G-Sync: Which is better for HDR?
To add even more choice to a potentially confusing market, AMD and Nvidia have upped the ante with new versions of their Adaptive-Sync technologies. This is justified, of course, by some significant additions to display technology: HDR and extended color.
On Nvidia's side, a monitor can support G-Sync with HDR and extended color without earning "Ultimate" certification. Nvidia reserves that moniker for monitors with feature sets it considers capable of "realistic HDR." The exact requirements are vague, but Nvidia told Tom's Hardware that G-Sync Ultimate monitors are factory calibrated for the P3 HDR color space and are supposed to provide a refresh rate of 144 Hz or higher, variable overdrive, "optimized latency" and "best-in-class" image quality and HDR support.
AMD, on the other hand, requires a monitor to support HDR and extended color, reach at least 120 Hz at 1080p resolution, and include LFC to carry FreeSync Premium Pro on its spec sheet. If you're wondering about FreeSync 2, AMD replaced it with FreeSync Premium Pro; they are functionally the same.
Another fact: if you have an HDR monitor (see our article on choosing the best HDR monitor for recommendations) that supports FreeSync with HDR, it will likely also support G-Sync with HDR, and both will work without HDR as well.
And what about FreeSync Premium Pro? It's in the same boat as G-Sync Ultimate in that it adds nothing new to the core Adaptive-Sync technology. FreeSync Premium Pro means AMD has certified the monitor to deliver a premium experience with at least a 120 Hz refresh rate, LFC and HDR.
Naturally, the quality components required for FreeSync Premium Pro cost more than basic ones. In other words, while FreeSync itself is technically free, FreeSync Premium Pro monitors cost more than entry-level FreeSync monitors.
If your FreeSync monitor supports HDR, it may work with G-Sync (Nvidia certified or not).
Conclusion
So which is better, G-Sync or FreeSync? There's no single answer, because the two are very similar in functionality. Both technologies produce similar results, so the contest is largely a wash at this point. However, there are some caveats.
When you buy a G-Sync monitor, it supports Adaptive-Sync only with GeForce graphics cards. If you want to get the most out of that monitor, you're effectively locked into buying Nvidia GPUs. FreeSync monitors, especially the newer, higher-quality models that meet FreeSync Premium Pro certification, often give you the flexibility to use either AMD or Nvidia graphics cards.
Anyone purchasing a PC monitor needs to decide which additional features matter most to them. How high should the refresh rate be? What resolution can your graphics card handle? Is high brightness important? Do you need HDR and extended color?
It’s the combination of these factors that impacts the gaming experience, not just which adaptive sync technology is used. Ultimately, the more money you spend, the better gaming monitor you get. These days, when it comes to displays, you get what you pay for. But you don’t have to pay thousands of dollars to get a good and smooth gaming experience.
More: Best Gaming Monitors
More: Best 4K Gaming Monitors
More: How We Test Monitors
More: All Monitor Content