
G-Sync vs. FreeSync: Which Display Tech Reigns Supreme?


G-Sync vs FreeSync. Nvidia vs. AMD. A battle that feels like it’s been waged since the beginning of time, but in reality has only been going on since AMD purchased ATI back in 2006. Before then, Nvidia was warring with ATI over which graphics card could squeeze out a few more frames per second than the other, but that was long before either company even knew what the problem of “screen tearing” was, let alone how to solve it.

Nowadays, people are asking themselves which is the better solution to this issue: Nvidia’s G-Sync or AMD’s FreeSync? Well, grab some popcorn and settle in as we break down the few differences, and the more common similarities, between the two competing technologies and what makes them tick. The Samsung UE590D, for example, features AMD FreeSync for minimal image tearing and ghosting.


The Plague of Screen Tearing


For those out of the loop on what screen tearing is and what causes it, first you have to know how a graphics card and a monitor communicate with one another.

When a graphics card tells a monitor what to display, it sends finished images (frames) to the display, which the display then draws to the screen at a predetermined “refresh rate”. The refresh rate is the number of times per second the monitor redraws the image, and therefore how often it asks your graphics card for a new frame; how quickly the card can actually supply those frames depends on how powerful it is.

For most regular monitors (and 4K monitors), this happens at 60Hz, or 60 refreshes per second, while high-performance gaming monitors can crank it all the way up to 144Hz, or 144 refreshes per second.
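To put those refresh rates in perspective, here is a quick back-of-the-envelope sketch in Python. It is purely illustrative arithmetic, with no assumptions about any particular monitor: it simply shows how much time the graphics card gets to finish each frame before the next refresh.

```python
# Frame-time budget: how long the GPU has to finish a frame before the
# monitor's next refresh. Purely illustrative arithmetic.

def frame_budget_ms(refresh_hz: float) -> float:
    """Time between refreshes, in milliseconds, at a given refresh rate."""
    return 1000.0 / refresh_hz

for hz in (60, 144, 240):
    print(f"{hz:>3}Hz -> roughly {frame_budget_ms(hz):.2f} ms per refresh")

# Prints:
#  60Hz -> roughly 16.67 ms per refresh
# 144Hz -> roughly 6.94 ms per refresh
# 240Hz -> roughly 4.17 ms per refresh
```

In other words, the faster the monitor, the less time the graphics card has to get each frame out the door.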

What is Screen Tearing?

Screen tearing happens when your graphics card pushes a new frame to the display partway through a refresh, so the screen briefly shows pieces of two different frames at once. Though the actual process is a bit more complex, that’s how it works when broken down to its most basic terms.

This problem had a software-side fix for years: a setting called “Vsync”. It’s an option you can enable in just about any game that locks the frame output to the monitor’s refresh rate, most often 60Hz. This is all well and good if you’ve got a graphics card that can comfortably and consistently output at least 60 frames per second, but if not, Vsync forces the frame rate to drop sharply: with traditional double-buffered Vsync it falls to divisors of the refresh rate (30FPS, 20FPS, 15FPS and so on), which translates into noticeable stutter and input lag on lower-tier systems.
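To see why a card that can’t hold 60FPS takes such a hit with Vsync enabled, here is a rough sketch in Python. It assumes classic double-buffered Vsync on a 60Hz display, which is a simplification for illustration rather than a model of any particular driver.

```python
import math

# Rough sketch of double-buffered Vsync on a 60Hz display: if a frame isn't
# finished in time, the GPU has to wait for the *next* refresh, so the
# effective frame rate falls to 60, 30, 20, 15... rather than, say, 52.

REFRESH_HZ = 60
REFRESH_INTERVAL_MS = 1000.0 / REFRESH_HZ  # ~16.67 ms between refreshes

def vsync_effective_fps(render_time_ms: float) -> float:
    """Effective frame rate when every frame must land on a refresh boundary."""
    refreshes_per_frame = math.ceil(render_time_ms / REFRESH_INTERVAL_MS)
    return REFRESH_HZ / refreshes_per_frame

for render_ms in (15.0, 18.0, 25.0, 35.0):
    raw_fps = 1000.0 / render_ms
    print(f"GPU renders a frame in {render_ms:.0f} ms ({raw_fps:.0f}FPS raw) "
          f"-> Vsync shows {vsync_effective_fps(render_ms):.0f}FPS")
```

Notice the cliff: a card rendering frames in 18 ms could manage roughly 56FPS on its own, but under this kind of Vsync it gets knocked all the way down to 30FPS.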

So what does this have to do with G-Sync vs. FreeSync? Read on to find out.

VESA Adaptive Sync


On the one hand, AMD’s FreeSync solution takes advantage of an existing open standard baked into the DisplayPort 1.2a specification, known as “Adaptive-Sync”.

Adaptive-Sync is the first hardware-based solution to screen tearing: a royalty-free, open standard that AMD uses to keep the frames coming off the graphics card and the display’s refreshes in sync with one another. Because Adaptive-Sync is an open standard, it’s ridiculously cheap to implement in monitors or graphics cards, making AMD the better option for budget gamers.

Nvidia, on the other hand, uses its own proprietary take on adaptive sync technology, known as “G-Sync”, implemented as a dedicated chip inside the monitor itself that’s designed to communicate directly with Nvidia graphics cards. Both solutions chase the same goal, pairing hardware in the monitor with the graphics card driving it so the display’s refresh stays in lockstep with the card’s frame output and tearing never appears, but which one is right for you?

G-Sync vs. FreeSync: Which is Better?


Although there aren’t a ton of obvious technical differences between G-Sync and FreeSync, the first and most important one that most gamers will point to is that, unlike FreeSync, G-Sync doesn’t have a minimum refresh rate floor it has to stay above in order to function correctly.

AMD’s FreeSync only works within a certain window of frames per second, generally 20FPS to 144FPS depending on the monitor, while G-Sync can go all the way down to 1FPS and up to 240FPS (once those monitors finally arrive sometime next year). That makes it the better choice for people with high-powered systems who know they’ll be able to drive 240Hz monitors once they hit store shelves. For now you’ll have to settle for these: a list of top gaming monitors and best computer monitors.
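To make that range difference concrete, here is a tiny illustrative sketch. The window numbers are simply the figures quoted above, not the spec sheet of any particular monitor, and real FreeSync ranges vary from display to display.

```python
# Illustrative only: whether a given frame rate falls inside a display's
# variable refresh window. The ranges below are the article's example figures.

def vrr_tracks(fps: float, vrr_min: float, vrr_max: float) -> bool:
    """True if the display's variable refresh window can follow this frame rate."""
    return vrr_min <= fps <= vrr_max

FREESYNC_RANGE = (20, 144)   # example FreeSync window quoted above
GSYNC_RANGE = (1, 240)       # effective window described above for G-Sync

for fps in (12, 30, 90, 200):
    free = "yes" if vrr_tracks(fps, *FREESYNC_RANGE) else "no"
    gs = "yes" if vrr_tracks(fps, *GSYNC_RANGE) else "no"
    print(f"{fps:>3}FPS -> FreeSync tracks it: {free:>3} | G-Sync tracks it: {gs}")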

Next, many reviewers and gamers alike report that FreeSync setups may fix screen tearing but suffer from another issue known as “ghosting”. Ghosting occurs when a moving object briefly leaves behind an artifact of the previous frame, giving characters a faint, ghost-like trail that can get worse as more action piles up on screen.

Lastly, for the time being Nvidia is the only company that offers adaptive sync technology (i.e. G-Sync) on the best gaming laptops. AMD hasn’t announced any plans to bring its FreeSync tech to the world of mobile gaming, so for now this is a market Nvidia has a firm grip on, and it doesn’t look to be letting up anytime soon.

Final Verdict

Whether it’s AMD vs. Intel, Nvidia vs. AMD, or G-Sync vs. FreeSync, as always it comes down to cost.

As is the case with nearly every AMD product, monitors and cards with FreeSync support are anywhere from $50 to $500 cheaper than their Nvidia counterparts when bought as a combination. That said, Nvidia’s G-Sync is unquestionably the superior technology, and it doesn’t suffer nearly as many of the ghosting or performance issues that AMD users report on their FreeSync setups.

If you can easily afford an Nvidia G-Sync setup and want the absolute peak of performance, that’s going to be the pick for you. If you’re shopping on a budget though and don’t mind a little bit of delay or slight image issues here and there, an AMD FreeSync setup should do the job just fine.

And of course, no matter which configuration you ultimately decide to go with, everyone can agree that both are clearly better than being stuck with plain old Vsync for your next high-octane gaming session.
