FreeSync & G-SYNC

In the quest to produce the smoothest and most realistic gameplay, power users typically purchase the most powerful graphics card (or cards) their budgets will allow. More raw GPU horsepower typically translates to higher frame rates, measured in fps (frames per second), and the faster the frame rate, the smoother gameplay will appear. A high-powered GPU can also let you max out video settings to enhance effects and further immerse yourself in the game. Yet there are some graphics issues that even a pricey GPU can't completely resolve by itself.

Screen tearing, for example, occurs when a GPU's frame rate doesn't match up with the monitor's refresh rate. The result is that the display might show parts of multiple frames in a single draw, and where differences exist between those frames, noticeable artifacts will result. Classic symptoms of screen tearing are misaligned on-screen content and fragmented scenes.
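If you want to see the geometry of the problem, the short Python sketch below (our own illustration, not any vendor's code) models a 60Hz panel scanning out from top to bottom while an unsynchronized GPU swaps buffers mid-scan; each swap's position becomes a visible tear line. The frame rate and resolution are made-up values.

```python
# Rough model of screen tearing: with no syncing, a buffer swap that lands
# mid-scanout splits the visible image between two frames at that scanline.
REFRESH_HZ = 60
SCANOUT_S = 1.0 / REFRESH_HZ   # time to draw one refresh, top to bottom
LINES = 1080                   # vertical resolution of a hypothetical panel

def tear_lines(swap_times):
    """Return the approximate scanline where each buffer swap lands."""
    tears = []
    for t in swap_times:
        progress = (t % SCANOUT_S) / SCANOUT_S  # fraction of scanout elapsed
        tears.append(int(progress * LINES))
    return tears

# A GPU pushing ~90fps at a 60Hz panel swaps every ~11.1ms, so most swaps
# land partway down the scanout and leave a tear somewhere on screen.
swaps = [i * (1.0 / 90) for i in range(1, 7)]
print(tear_lines(swaps))
```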
AMD's FreeSync and NVIDIA's G-SYNC technologies solve screen tearing by synchronizing the monitor's refresh rate to the GPU's output on a frame-by-frame basis. Because AMD and NVIDIA differ in how they implement their respective variable refresh rate technologies, both your GPU and monitor must be compatible for refresh rates to sync up. As such, you must use a compatible AMD GPU with a FreeSync-enabled display, or a compatible NVIDIA GPU with a G-SYNC-enabled display.
Recently, Team Green and Team Red raised the bar with new standards that add support for HDR (high dynamic range) to their respective adaptive-sync technologies. Titled FreeSync 2 and G-SYNC HDR, these technologies are necessary because existing HDR transport methods tend to introduce lag when HDR is used with video games. Similar to the variable refresh rate techniques, FreeSync 2 and G-SYNC HDR allow a GPU to convey HDR metadata directly to a compatible display. The gaming evolution never stops.
For years, gamers have been using Vsync to combat the problem of screen tearing, and you don't need a specific GPU to take advantage of it. But some gamers opt to disable Vsync because it can cause choppy on-screen motion. Basically, Vsync forces the GPU to deliver completed frames at a fixed cadence, holding each one until the monitor signals that it's ready to draw a new frame. Stutter can occur when your GPU's frame rate falls below the Vsync cap, which is typically 60fps.

Vsync's effect on input lag is another concern, because the GPU might have to hold onto a rendered frame for too long. For example, the GPU might buffer up to three fully rendered frames until the monitor is ready to draw a new on-screen frame. The lag trade-off can make Vsync unsuitable for shooters and other games that require precise, quick action. By keeping the monitor's refresh rate perfectly in step with the GPU's output, FreeSync and G-SYNC do away with both screen tearing and Vsync's stutter and lag.
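The arithmetic behind that stutter and lag is simple enough to sketch. The Python below is a simplified model under two assumptions: the panel refreshes at a fixed 60Hz, and the driver queues up to three finished frames, as described above.

```python
import math

REFRESH_HZ = 60
REFRESH_S = 1.0 / REFRESH_HZ  # ~16.7ms between refreshes at 60Hz

def vsync_display_fps(render_ms):
    """With Vsync on, a frame can only appear on a refresh boundary. A frame
    that takes even slightly longer than one refresh waits for the next one,
    so the displayed rate drops in whole-divisor steps (60, 30, 20, ...)."""
    refreshes_needed = math.ceil((render_ms / 1000.0) / REFRESH_S)
    return REFRESH_HZ / refreshes_needed

def queue_lag_ms(buffered_frames):
    """Each completed frame waiting in the queue adds one refresh of lag."""
    return buffered_frames * REFRESH_S * 1000.0

print(vsync_display_fps(16.0))  # 60.0fps -- the GPU keeps up with the cap
print(vsync_display_fps(18.0))  # 30.0fps -- barely missing the cap halves it
print(queue_lag_ms(3))          # 50.0ms -- three queued frames of input lag
```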
FreeSync
FreeSync's dynamic refresh rate is supported by most Radeon GPUs released in 2014 or after, which includes the Radeon R9 290 and R7 260X. Older Radeon GPUs, such as the HD 7000 series and R9 280, support FreeSync for video playback and power-saving tasks, but these cards don't have a display controller that supports the transformative adaptive-sync capabilities. GPUs based on AMD's new Polaris and upcoming Vega architectures, of course, support FreeSync. Best of all, AMD says that any FreeSync-compatible Radeon GPU will also support FreeSync 2.
As of December 2016, 20 display makers had partnered with AMD to create FreeSync displays, and in total, there were 121 FreeSync-compatible monitors. AMD expects that most of the partners who built FreeSync monitors will create FreeSync 2-compatible options. But don't expect to see the market flood with as many FreeSync 2 monitors as you did with FreeSync monitors, because AMD will have strict standards for minimum brightness, contrast, and color space. To make the cut, a monitor must be able to meet these standards while also delivering extremely low latency.
FreeSync's variable refresh rate capability is based on Adaptive-Sync, a VESA (Video Electronics Standards Association) industry standard that was originally designed to support dynamic refresh rates over the DisplayPort interface. By using the existing DisplayPort protocol, AMD was able to reduce the complexity necessary to implement FreeSync in a display. NVIDIA's G-SYNC, by comparison, requires a chip inside the monitor that coordinates the refresh rate of the display. FreeSync's relative simplicity reduces cost and makes it easier for monitor manufacturers to support variable refresh rates.
Initially, FreeSync only worked over DisplayPort, but in 2016, AMD developed an extension of FreeSync to allow for variable refresh rates over HDMI. We've found that monitors typically support FreeSync over either DisplayPort or HDMI, but rarely both. And unfortunately, not all monitor manufacturers publish clear specifications about which display interface supports FreeSync, meaning you might have to dig through your display's manual a bit to discover which port supports variable refresh rates.

There are some other notable caveats with FreeSync technology. To start, FreeSync monitors feature both a minimum and maximum variable refresh rate, and the supported range varies widely by monitor. The variable refresh range might be as narrow as 40Hz to 60Hz, or as wide as 30Hz to 144Hz. The maximum of the variable refresh range typically matches the maximum refresh rate of the monitor, but the minimum could be as high as 48Hz.
When FreeSync was first released, the standard was criticized because it had no way to handle frame rates that dropped below a monitor's minimum variable refresh rate. In such cases, your system would revert back to Vsync, if you had it turned on, or introduce screen tearing, if Vsync was off. AMD's Crimson driver added LFC (low framerate compensation), an adaptive-sync algorithm that adjusts the GPU's output and the refresh rate to maintain smooth motion when the frame rate falls below the monitor's minimum. There's a catch, though: a FreeSync monitor must boast a maximum refresh rate that's 2.5 times (or more) the monitor's minimum refresh rate to support LFC.
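To make the 2.5x requirement and the compensation behavior concrete, here's a simplified Python model. The multiplier logic is our sketch of the general idea (LFC works by repeating frames so the effective refresh rate climbs back into the supported range); it isn't AMD's actual algorithm.

```python
def supports_lfc(min_hz, max_hz):
    """AMD's stated requirement: the max refresh rate must be at least
    2.5 times the minimum for a monitor to support LFC."""
    return max_hz >= 2.5 * min_hz

def lfc_refresh(fps, min_hz, max_hz):
    """Simplified model of low framerate compensation: when fps falls below
    the panel's minimum, repeat each frame enough times that the effective
    refresh rate lands back inside the variable refresh range."""
    if fps >= min_hz:
        return fps  # already in range; no compensation needed
    multiplier = 2
    while fps * multiplier < min_hz:
        multiplier += 1
    return min(fps * multiplier, max_hz)

print(supports_lfc(48, 60))      # False -- 60 < 2.5 * 48, no LFC
print(supports_lfc(30, 144))     # True
print(lfc_refresh(25, 30, 144))  # 50 -- each frame is drawn twice
```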

Again, the lack of strictly defined parameters requires you to research a monitor’s FreeSync capabilities before purchase. Fortunately, many of the FreeSync displays released in 2016 offer a much wider variable refresh range. If you often game at more than 75fps—or below 40fps—it’d be worth your time and effort to find a FreeSync monitor that supports the variable refresh rates you play at.
G-SYNC
NVIDIA was the first to develop adaptive-sync technology, in 2013. G-SYNC is supported by GeForce GTX 650 Ti Boost or greater GPUs, so it works even if your NVIDIA GPU is a few generations old. Even some of NVIDIA's mobile GPUs support G-SYNC, including the GeForce GTX 965M, 970M, 980M, and 10-Series notebook GPUs. With a dedicated GPU, you'll need to connect to a G-SYNC monitor via DisplayPort.

On the monitor front, supported panels feature a G-SYNC chip that's in charge of the variable refresh rate. The approach gives NVIDIA a bit more control over how adaptive sync behaves. NVIDIA indicates, for example, that G-SYNC displays don't have a minimum refresh rate limit, and the maximum refresh rate matches up with the panel's top refresh rate. The flip side is that G-SYNC monitors tend to be a bit more expensive than comparable FreeSync monitors, and there are fewer G-SYNC monitors on the market, due to the added complexity of the G-SYNC chip.
When G-SYNC was first released, NVIDIA received some complaints that gamers couldn't turn off G-SYNC for fast-paced shooters, such as CS:GO, that exceeded the maximum refresh rate of the monitor. In this situation, G-SYNC would automatically revert to a Vsync mode to prevent screen tearing, and gamers didn't want Vsync's additional input lag. In 2015, NVIDIA updated G-SYNC to let you disable the technology at frame rates above the monitor's maximum, though doing so will reintroduce the possibility of screen tearing.
In 2016, the engineers at NVIDIA introduced another input latency-reducing feature called Fast Sync that, although not technically part of G-SYNC, does complement the variable refresh rate technology. It's designed for the same group of people who wanted to turn off Vsync at high frame rates, yet still eliminates screen tearing. Fast Sync uses a triple-buffer system: the GPU renders frames as if Vsync is off, and the display grabs the most recently completed frame from the buffers. The resulting lag is only a little higher than with Vsync off, and there's no screen tearing.
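The core idea is easy to model. The sketch below is a simplified, hypothetical take on the Fast Sync behavior described above, not NVIDIA's implementation: the GPU renders without restraint, and each refresh simply displays the newest completed frame.

```python
# Simplified model of the Fast Sync idea (not NVIDIA's implementation):
# the GPU renders as fast as it can into spare buffers, and at each refresh
# the display takes whichever frame finished most recently. Excess frames
# are simply dropped, so there's no tearing and far less lag than classic
# Vsync's wait-in-queue behavior.
def frames_shown(frame_done_times, refresh_hz, duration_s):
    """Return the completion time of the frame shown at each refresh."""
    shown = []
    interval = 1.0 / refresh_hz
    refresh = interval
    while refresh <= duration_s:
        ready = [t for t in frame_done_times if t <= refresh]
        if ready:
            shown.append(max(ready))  # newest completed frame wins
        refresh += interval
    return shown

# A GPU at ~180fps feeding a 60Hz panel: roughly two of every three
# rendered frames never reach the screen, by design.
renders = [i / 180 for i in range(1, 19)]
print(frames_shown(renders, 60, 0.1))
```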

To further combat ghosting, some G-SYNC displays support NVIDIA's ULMB (Ultra Low Motion Blur) mode, which strobes the monitor's backlight to eliminate motion blur. ULMB and G-SYNC can't function at the same time, though, so you'll need to choose which one to activate. Assuming ULMB is supported by the monitor, you can use the NVIDIA Control Panel to switch between the ULMB and G-SYNC display modes for specific games.
As you can see, NVIDIA's latest GPUs and G-SYNC monitors provide you with several different ways to optimize your gaming experience. A quick perusal of the market shows that the majority of G-SYNC monitors support refresh rates all the way up to 144Hz, which is ideal for enthusiasts with premium graphics cards. There are some FreeSync monitors that match up with the high-end G-SYNC panels, but many of the early FreeSync options lack the amenities of G-SYNC monitors.

FreeSync 2 & G-SYNC HDR
HDR is not yet common on PC displays—CES 2017 marked the debut of HDR monitors—but HDR was the “it” feature for HDTVs in 2016. HDR improves brightness, contrast, and color gamut, all of which help to make on-screen visuals more lifelike. At CES, we were treated to several demos that displayed content both on an HDR monitor and a conventional monitor. Colors on the standard panel were flat and washed out in comparison to the HDR displays, where we could clearly see more vibrant hues and deeper contrast.
HDR metadata and tone mapping, which allow for the expanded color saturation and contrast, force the monitor to deal with a lot more data than it sees with standard dynamic range content. Existing HDR formats (HDR10 and Dolby Vision) work fine with Ultra HD Blu-ray movies and HDTVs, for example, because input lag isn't a concern there. NVIDIA estimates that current HDR TVs produce at least 21 to 45ms of latency, while AMD indicates input lag could be as high as 100ms. Such input lag would kill the HDR experience for gamers. Both FreeSync 2 and G-SYNC HDR are designed to minimize input lag while supporting the tear-free adaptive-sync experience.
With FreeSync 2, AMD reworks the transport system for HDR. AMD indicates that FreeSync 2 takes the tone-mapping duties away from the display and moves them over to the GPU. The game engine, in turn, maps content to the display's target brightness, contrast, and color values. AMD is providing a FreeSync 2 extension to game developers to make it easier to incorporate HDR rendering into new games.
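AMD hasn't published the FreeSync 2 API details, so as a generic illustration of the kind of GPU-side tone mapping being described, the sketch below uses the well-known extended Reinhard operator to compress scene luminance toward a panel's reported peak. The function and values are our own, purely for illustration.

```python
def tonemap_for_display(scene_nits, scene_max_nits, display_peak_nits):
    """Extended Reinhard tone mapping: compress scene luminance so the
    scene's brightest value lands exactly at the display's peak, while
    midtones keep most of their contrast. FreeSync 2's pitch is *where*
    this runs: on the GPU/game engine, which knows the panel's measured
    capabilities, rather than in the monitor, where the extra processing
    adds input lag."""
    l = scene_nits / display_peak_nits            # relative to panel peak
    l_white = scene_max_nits / display_peak_nits  # value that maps to peak
    mapped = l * (1.0 + l / (l_white ** 2)) / (1.0 + l)
    return mapped * display_peak_nits             # back to nits for output

# Hypothetical values: a 4,000-nit scene highlight squeezed onto a panel
# the driver reports as peaking at 1,000 nits.
print(tonemap_for_display(4000, 4000, 1000))  # 1000.0 -- peak maps to peak
print(tonemap_for_display(500, 4000, 1000))   # ~344 -- midtones compressed gently
```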
There are no concrete details about the exact brightness, contrast, and color gamut specifications a monitor must meet for the new FreeSync 2 and G-SYNC HDR standards. But we do know that any monitor claiming the respective standards must be approved and certified by AMD or NVIDIA. And because most current monitors are built to standard dynamic range specifications, it appears that monitor manufacturers will need to make some significant improvements to meet the respective HDR standards.
The ASUS ROG Swift PG27UQ is one of the first G-SYNC HDR displays, and it includes some features that most current premium displays can't match. For example, ASUS indicates that the PG27UQ supports a peak brightness of 1,000cd/m2; conventional PC monitors are designed to deliver between 200 and 300cd/m2. ASUS achieves this through the use of quantum-dot technology and a backlight that can be selectively controlled over 384 zones. According to ASUS, breaking the backlight down into individual zones allows the HDR transmission to selectively dim zones and maximize contrast, while the quantum-dot nanoparticles are able to produce a wider range of colors.
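ASUS hasn't detailed its controller logic, but the basic idea of zoned dimming is easy to sketch. The Python below assumes a hypothetical 24x16 grid (384 zones) and drives each zone only as bright as its brightest pixel requires; everything here is illustrative.

```python
# Minimal sketch of zoned backlight dimming (the real controller logic
# isn't public): split the frame's luminance map into a grid of zones and
# drive each zone's backlight only as bright as its brightest pixel needs,
# so dark regions get a dim backlight and deeper blacks.
ZONES_X, ZONES_Y = 24, 16  # 384 zones, arranged here as an assumed 24x16 grid

def zone_levels(luma, width, height):
    """luma: row-major list of per-pixel luminance values in [0.0, 1.0]."""
    zw, zh = width // ZONES_X, height // ZONES_Y
    levels = []
    for zy in range(ZONES_Y):
        for zx in range(ZONES_X):
            peak = 0.0
            for y in range(zy * zh, (zy + 1) * zh):
                for x in range(zx * zw, (zx + 1) * zw):
                    peak = max(peak, luma[y * width + x])
            levels.append(peak)
    return levels

# A mostly dark 48x32 test frame with one bright highlight: only that
# pixel's zone lights up fully, which is how zoned dimming raises contrast.
w, h = 48, 32
frame = [0.02] * (w * h)
frame[5 * w + 7] = 1.0
print(max(zone_levels(frame, w, h)), min(zone_levels(frame, w, h)))  # 1.0 0.02
```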
AMD hasn’t officially certified any displays for FreeSync 2, but the company’s Senior Manager of Global Technology Marketing, Antal Tungler, tells us “several displays are in the certification pipeline now.” Again, there’s no definitive timeline on when FreeSync 2 monitors will be available, but we expect to see some announcements this year. One thing we can say for certain is that all FreeSync 2 monitors will support LFC, and AMD expects the monitors to provide twice the perceivable brightness and color volume of standard sRGB panels.
FreeSync and G-SYNC displays are readily available to produce exceptionally smooth, fluid gaming visuals when paired with a compatible graphics card. FreeSync 2 and G-SYNC HDR are the next step toward perfect pixels, but it will likely be a while before the HDR standards are widely supported by monitors and games. Even so, it’s nice to see AMD and NVIDIA get ahead of the curve with HDR, and the backing of Team Red and Team Green should help to move the needle with monitor manufacturers.
