In all its various forms, adaptive sync technology is intended to improve synchronization between the monitor’s refresh rate and the graphics card’s frame rate, thereby eliminating screen tearing and stuttering.
There are, however, different solutions to these issues, the main ones being AMD’s FreeSync and Nvidia’s G-Sync. On this page we will examine how they work and whether there is a reason to choose one over the other.
What’s Screen Tearing and Micro-Stuttering?
Screen tearing is the splitting of images on the screen, which disrupts the gaming experience. It happens when the graphics card’s (GPU’s) frame rate is out of sync with the monitor’s refresh rate. As a result, the monitor displays content from multiple GPU frames in a single draw or refresh cycle.
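This timing mismatch can be sketched with a toy simulation (purely illustrative — the function name and numbers are assumptions, not any real driver API): a refresh "tears" whenever a buffer swap lands strictly inside its scanout window.

```python
# Toy model of screen tearing: a fixed-rate monitor scans out while a GPU
# delivers frames at an unsynchronized rate. If a new frame completes
# mid-scanout, that refresh shows parts of two frames -- a "tear".

def count_torn_refreshes(refresh_hz: float, fps: float, seconds: float) -> int:
    refresh_t = 1.0 / refresh_hz
    frame_t = 1.0 / fps
    # completion times of each GPU frame over the simulated interval
    frame_times = [k * frame_t for k in range(1, int(seconds * fps) + 1)]
    torn = 0
    for i in range(int(seconds * refresh_hz)):
        start, end = i * refresh_t, (i + 1) * refresh_t
        # a buffer swap strictly inside the scanout window tears the image
        if any(start < ft < end for ft in frame_times):
            torn += 1
    return torn
```

With the frame rate perfectly matched to the refresh rate, every swap lands on a scanout boundary and nothing tears; with mismatched rates, most refreshes are torn.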
Micro-stuttering, on the other hand, occurs when frames are repeated, skipped, or frozen, which is particularly detrimental in fast-paced games like shooters. The usual cause is a timing mismatch between the GPU’s frame delivery and the display’s refresh.
From V-Sync to Adaptive Sync
V-Sync, short for Vertical Sync, was developed to combat screen tearing by synchronizing the game’s frame rate to the monitor’s refresh rate. While V-Sync does eliminate screen tearing, it is not without drawbacks. If the game’s frame rate (FPS) falls below the monitor’s refresh rate (usually 60Hz), V-Sync forces the GPU to wait for the next refresh, typically halving the effective frame rate (for example, to 30 FPS on a 60Hz display). This introduces new problems such as judder and input lag, causing micro-stuttering.
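That fallback behavior can be illustrated with a toy model of double-buffered V-Sync (a sketch under simplifying assumptions — constant render time, no triple buffering; the names are made up for illustration):

```python
# Toy model of double-buffered V-Sync on a fixed-rate display: a finished
# frame can only appear at the next vertical blank, so a frame that misses
# one vblank waits for the following one, halving the effective frame rate.

def effective_fps_with_vsync(refresh_hz: float, render_time_s: float,
                             frames: int = 600) -> float:
    vblank = 1.0 / refresh_hz
    t = 0.0          # current simulation time
    shown = []       # times at which frames actually appear on screen
    for _ in range(frames):
        t += render_time_s                    # GPU finishes rendering
        vblanks_passed = int(t / vblank)
        display_time = (vblanks_passed + 1) * vblank  # wait for next vblank
        shown.append(display_time)
        t = display_time                      # GPU blocks until the swap
    return (len(shown) - 1) / (shown[-1] - shown[0])
```

A GPU that renders in 10 ms keeps a 60Hz display at 60 FPS, but one that needs 20 ms (or even 17 ms) misses every other vblank and drops straight to 30 FPS — there is no in-between, which is what players perceive as judder.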
To address the shortcomings of V-Sync, manufacturers introduced adaptive sync. This technology was developed by the Video Electronics Standards Association (VESA) and adjusts the display’s refresh rate on the fly to match the GPU’s frame output. This prevents screen tearing and reduces the input lag that contributes to stuttering.
Unlike V-Sync, which caps the GPU’s frame rate to match the display’s refresh rate, VESA Adaptive-Sync dynamically adjusts the monitor’s refresh rate to match the game’s frame rate. This not only eliminates screen tearing but also addresses the judder that V-Sync causes when the FPS falls.
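In code terms, the difference is simple (a minimal sketch; the 48–144Hz variable-refresh window and the function name are assumed for illustration): instead of forcing the GPU to wait for a fixed vblank, the panel’s refresh rate follows the frame time, clamped to the range the hardware supports.

```python
# Toy model of VESA Adaptive-Sync: within the panel's supported range the
# monitor refreshes when the frame is ready, so the display interval tracks
# the render interval and no frame is torn or repeated.

def adaptive_sync_refresh_hz(render_time_s: float,
                             min_hz: float = 48.0,
                             max_hz: float = 144.0) -> float:
    """Refresh rate the panel would run at for a given frame time."""
    requested_hz = 1.0 / render_time_s
    # clamp to what the panel can physically do
    return max(min_hz, min(requested_hz, max_hz))
```

A 12 ms frame simply gets an ~83Hz refresh; only outside the supported window (here, below 48Hz or above 144Hz) does the panel fall back to its limits.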
AMD FreeSync vs. Nvidia G-Sync: A Closer Look
AMD’s FreeSync and Nvidia’s G-Sync are two popular Adaptive Sync technologies developed and maintained by AMD and Nvidia, respectively. While both aim to provide a seamless gaming experience, they have some differences.
FreeSync holds a price advantage over G-Sync, as it uses the royalty-free standard created by VESA, known as Adaptive-Sync.
- AMD FreeSync utilizes VESA’s royalty-free technology to sync the refresh rate to the FPS and works on most monitors, keeping prices down. This standard is incorporated into VESA’s DisplayPort specification, meaning any DisplayPort interface version 1.2a or higher can support adaptive refresh rates. However, AMD has largely left the supported refresh rate range in the hands of the manufacturers.
- NVIDIA G-Sync, on the other hand, relies on proprietary hardware that must be built into the display. Nvidia also enforces tighter quality control in some areas. As a result, G-Sync monitors generally come with a higher price tag. When G-Sync was first introduced (in 2013), a G-Sync monitor could command an additional $200, but this price gap has narrowed to roughly $100.
Note that it is possible – and quite common in the premium segment – for monitors to support both FreeSync and G-Sync, although just one of the technologies can be active at a time.
Feature Comparison
| Feature | FreeSync | FreeSync Premium | FreeSync Premium Pro | G-Sync | G-Sync Ultimate |
|---|---|---|---|---|---|
| GPU Compatibility | AMD GPUs (GCN2 or newer); Nvidia GPUs (10-series or newer) | AMD GPUs (GCN2 or newer); Nvidia GPUs (10-series or newer) | AMD GPUs (GCN2 or newer); Nvidia GPUs (10-series or newer) | Nvidia only (GTX 600-series or later) | Nvidia only (GTX 600-series or later) |
| Price Premium | No | No | No | Yes | Yes |
| Refresh Rates | 60Hz or higher | 120Hz or higher | 120Hz or higher | 75Hz or higher | 144Hz or higher |
| HDR/Extended Color Ranges | HDR support | HDR support | HDR + extended color range support | HDR + extended color range support | Factory-calibrated sRGB and P3 (HDR color) gamut support |
| Validated Artifact-Free | No | No | No | Yes | Yes |
| Low Framerate Compensation (LFC) | No | Yes | Yes | Yes | Yes |
Because the underlying standard is open, FreeSync implementations can vary significantly between monitors. The most affordable ones typically offer basic FreeSync and a 60Hz or higher refresh rate, while more expensive FreeSync monitors add features such as blur reduction and Low Framerate Compensation (LFC) to compete more effectively with G-Sync monitors.
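LFC’s frame-multiplication trick can be sketched as follows (a toy model; the 48–144Hz window is an assumed example range, and real drivers use more sophisticated heuristics): when the FPS drops below the panel’s minimum variable refresh rate, the driver shows each frame multiple times so the effective refresh stays inside the supported window.

```python
# Toy model of Low Framerate Compensation (LFC): below the panel's minimum
# variable refresh rate, repeat each frame enough times that the resulting
# refresh rate lands back inside the supported window.

def lfc_refresh_hz(fps: float, min_hz: float = 48.0, max_hz: float = 144.0) -> float:
    if fps >= min_hz:
        return min(fps, max_hz)        # normal VRR: refresh simply tracks FPS
    multiplier = 1
    # show each frame one more time while that still fits under the maximum
    # and the current rate is still below the minimum
    while fps * (multiplier + 1) <= max_hz and fps * multiplier < min_hz:
        multiplier += 1
    return fps * multiplier
```

For example, a game running at 30 FPS on a 48–144Hz panel would have each frame shown twice for an effective 60Hz refresh; LFC generally requires the panel’s maximum refresh rate to be at least double its minimum.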
On the Nvidia side, a monitor can support G-Sync with HDR and extended color without having to earn the “Ultimate” certification. By contrast, a monitor must support HDR and extended color and hit a minimum of 120Hz at 1080p resolution to list FreeSync Premium Pro on its spec sheet.
Making the Choice: Which Adaptive Sync Technology to Use?
Choosing a sync technology ultimately depends on your needs, preferences, and budget. For the investment to make sense, having a reasonably powerful graphics card is a good idea, as entry-level GPUs are usually not fast enough for high refresh rate gaming in demanding AAA titles.
Another important consideration is, of course, whether your GPU supports the technology: G-Sync is exclusive to Nvidia. As previously mentioned, high-end gaming monitors often support both FreeSync and G-Sync, making them a safe choice.
But with any high-quality gaming monitor, the end result is the same – a smoother and more enjoyable gaming experience.