When it comes to modern gaming monitors, there is plenty of lingo that the average consumer may not understand. For instance, you may have encountered the term FreeSync. What does FreeSync mean and is FreeSync worth it? Keep reading to learn all about it.
In order to figure out if FreeSync is worth it, you will have to understand exactly what it is and what it does, especially when you’re learning to set up your new gaming monitor. FreeSync is an adaptive sync technology developed by industry giant AMD. It works to keep the monitor’s refresh rate in sync with the frame rate delivered by the computer’s GPU.
Nearly everyone can benefit from some form of adaptive sync technology. Why? Computer monitors are extraordinarily stable when it comes to their refresh rate. GPUs, on the other hand, tend to be rather erratic, as they react to the needs of the CPU and the demands of the game being played. The GPU will vary the rate at which it sends frames to the monitor, but the monitor is only equipped to deal with a single predetermined refresh rate. The end result? You will see screen tearing as you game, which is when the monitor displays parts of two different frames at once. At that point, you might wonder if a gaming monitor is worth it.
There are a few different ways in which adaptive-sync technologies work, but they all share certain commonalities. Generally speaking, adaptive sync lets the monitor’s refresh rate constantly adapt to the frame rate the GPU is actually producing, rather than holding to a single predetermined rate. That is exactly what G-Sync also does.
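To make the idea concrete, here is a minimal, hypothetical Python sketch (not real driver or monitor code; the frame times, tolerance, and refresh window are made-up numbers) comparing how a fixed 60 Hz monitor and an adaptive-sync monitor cope with the same run of uneven GPU frame times:

```python
def count_tears_fixed(frame_times_ms, refresh_ms=16.7, tol_ms=0.1):
    """Fixed-rate monitor, no sync: a frame that completes partway
    through a scan-out interval 'tears', because the screen briefly
    shows parts of two different frames at once."""
    torn, elapsed = 0, 0.0
    for ft in frame_times_ms:
        elapsed += ft
        phase = elapsed % refresh_ms
        # torn if the frame did not land near a refresh boundary
        if min(phase, refresh_ms - phase) > tol_ms:
            torn += 1
    return torn

def count_tears_adaptive(frame_times_ms, min_ms=6.9, max_ms=33.3):
    """Adaptive-sync monitor: scan-out starts when the frame is ready,
    so nothing tears as long as each frame time falls inside the
    panel's supported range (here a hypothetical 30-144 Hz window)."""
    return sum(1 for ft in frame_times_ms
               if not (min_ms <= ft <= max_ms))

# An erratic GPU: frame times bounce between ~12 ms and ~25 ms.
varying = [12.0, 19.5, 14.2, 25.0, 16.7, 21.3]
print(count_tears_fixed(varying))     # most frames miss a boundary
print(count_tears_adaptive(varying))  # all frames fall inside the window
```

The point of the sketch is the asymmetry: the fixed-rate monitor tears whenever a frame finishes mid-cycle, while the adaptive-sync monitor simply waits for each frame, so uneven frame times stop mattering as long as they stay within the panel's supported range.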
There is a reason AMD’s FreeSync is one of the two major players on the block, as it offers certain benefits.
FreeSync’s intended job is to reduce or eliminate screen tearing and stuttering. In other words, if you have a FreeSync-enabled gaming monitor, these two issues should become relics of the past. FreeSync monitors create a variable refresh rate that changes when needed to match the demands of the GPU and the CPU.
FreeSync monitors tend to be cheaper than NVIDIA G-Sync monitors. NVIDIA is extremely selective and hands-on when it comes to the implementation of its proprietary hardware. AMD is a bit more lenient, so there are many more FreeSync-enabled displays on the market than G-Sync displays. This leads to competition, which leads to lower prices.
AMD FreeSync vs. NVIDIA G-Sync: which is better?
There is no clear victor here. They each perform their job admirably. FreeSync monitors may not all boast the same standard of excellence, as AMD is more hands-off than NVIDIA. G-Sync monitors, though, tend to be more expensive.
Is 60Hz FreeSync worth it?
Yes. Adaptive sync is worth it at any refresh rate. It’s not the actual refresh rate that matters; it’s that the refresh rate is stabilized and kept in sync with the GPU.
Should I keep FreeSync on if I don’t use it?
There is no reason to turn FreeSync off when not in use, as it is not a significant power draw. You can go into the settings and turn it off if you want to, though.
STAT: There are FreeSync (as well as G-Sync) displays that operate at 30 Hz or, according to AMD, even lower.