Nvidia And AMD's Nerdiest Fight Is The One That Matters Most

Photo: Matthew Reyes, Gizmodo

At CES earlier this month, Nvidia and AMD traded words, with Nvidia CEO Jensen Huang saying his GPUs would “crush” AMD’s newly announced Radeon VII. AMD CEO Lisa Su was more measured in her response, but she didn’t pass up the opportunity to make a dig at Nvidia, noting that when her company finally adopts ray tracing, people would actually know what it is.

But let’s set aside their back and forth on ray tracing and GPUs and home in on another tiff between the companies — this one is about monitors, and if you’re buying one for gaming in the next few months, you need to pay attention.

Both companies are having a minor slap fight over monitors that support variable refresh rate. What is VRR? The display on a monitor, be it one for gaming, productivity, or even just a TV, always refreshes. A 60Hz display refreshes the image 60 times a second, while a 144Hz display refreshes 144 times a second. When the display refreshes, it refreshes from top to bottom — something you can see if you slow the image waaaaaay down.
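
If it helps to put those refresh rates in time terms, here's some quick back-of-the-envelope arithmetic (mine, not the article's): each rate translates into a fixed window the panel gets for every top-to-bottom redraw.

```python
# Rough arithmetic: how long a fixed-refresh panel has for each full top-to-bottom redraw.
for hz in (60, 120, 144):
    interval_ms = 1000 / hz  # milliseconds per refresh
    print(f"{hz}Hz panel redraws every {interval_ms:.1f} ms")

# 60Hz  -> 16.7 ms per refresh
# 120Hz -> 8.3 ms per refresh
# 144Hz -> 6.9 ms per refresh
```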

Your computer isn’t beholden to the refresh rate of your monitor, though. It might spit out 120 frames per second when your monitor only handles 60. That mismatch leads to screen tearing: the display is partway through drawing one frame when the GPU hands it the next, so the top of the screen shows one frame and the bottom shows another, leaving what looks like weird tears across the image. Not the end of the world, but not attractive!

A monitor with a variable refresh rate is more nimble. It knows how quickly the GPU is delivering frames and refreshes at the matching rate to reduce tearing, making things smoother. The technology in practice is really pretty, but AMD and Nvidia employ it differently. Nvidia’s G-Sync requires proprietary tech (in this case, an Nvidia-created display scaler) built into the display to communicate quickly with the GPU, providing an experience that should theoretically be smoother.
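
To picture what that means in practice, here's a loose sketch of the behaviour (a simplified model, not how Nvidia or AMD actually implement their tech): the panel refreshes when the GPU finishes a frame, so long as that frame time fits inside the range the panel supports.

```python
def refresh_interval_ms(frame_time_ms: float, vrr_min_hz: float, vrr_max_hz: float) -> float:
    """Toy model of a VRR panel: refresh when the frame is ready,
    clamped to the range of intervals the panel can actually hit."""
    shortest = 1000 / vrr_max_hz  # can't refresh faster than the panel's max rate
    longest = 1000 / vrr_min_hz   # can't hold off longer than the panel's min rate allows
    return min(max(frame_time_ms, shortest), longest)

# A hypothetical 48-144Hz panel tracking a game that delivers a frame every 12 ms (~83fps):
print(refresh_interval_ms(12.0, 48, 144))   # 12.0 -> the panel refreshes in step with the GPU
# The same panel when the game slows to a frame every 25 ms (40fps, below the 48Hz floor):
print(refresh_interval_ms(25.0, 48, 144))   # ~20.8 -> the panel hits its floor and stops tracking
```

That bottom end of the range matters, which is where the spat over FreeSync picks up below.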

It comes in two versions: G-Sync, which will give you good peak brightness, a decent backlight system, and solid colours; and G-Sync Ultimate HDR, which requires the display to have a higher peak brightness (1,000 nits), a better backlight system (full array), and a wider range of colours (DCI-P3 colour gamut).

AMD’s FreeSync is based on VESA’s Adaptive Sync technology, which is a different kind of variable refresh rate tech that should work with more monitors and GPUs because no custom scaler is required. FreeSync is an offshoot of Adaptive Sync, much like Intel’s promised Adaptive Sync software will be. So all FreeSync monitors are Adaptive Sync monitors, but not all Adaptive Sync monitors support FreeSync. Because the standard is open and doesn’t require special hardware beyond a DisplayPort 1.2a port, FreeSync monitors are a lot easier to make, and cheaper, than G-Sync ones.

Which is why, despite 74.18 per cent of Steam users using Nvidia GPUs to game, the majority of monitors being made and sold continue to be FreeSync, not G-Sync. When I looked on Amazon US on January 24, 2019, the second bestselling monitor had FreeSync, while the bestselling G-Sync monitor was ranked down at 34, and NPD hardware analyst Stephen Baker told me that there were “3 to 4 times as many” FreeSync monitors selling versus G-Sync.

But according to Nvidia, those sales come at a significant cost to quality. Nvidia is so sure FreeSync sucks that it spent a good portion of its CES 2019 press conference ridiculing FreeSync. CEO Jensen Huang claimed only 12 out of 400 Adaptive Sync monitor models tested actually worked, and while he didn’t cite specifics, a big chunk of those Adaptive Sync monitors are FreeSync.

Later, in a smaller press conference I attended, he said, “most of the FreeSync monitors do not work.” He went on to claim, “They don’t even work with AMD’s graphics cards, because nobody tested it. And we think that is a terrible idea to let a customer buy something believing the promise of that product and have it not work.”

At a roundtable immediately following Nvidia’s press event, AMD CEO Lisa Su denied that claim. “I don’t believe we’ve seen that,” she said. And the next day AMD Director of Product Marketing Sasa Marinkovic went a step further, questioning whether Nvidia had really tested every monitor it claimed to have tested. “Prove it,” he told me in an interview.

It sort of feels like everyone is pointing fingers at one another. So who do you trust? It’s probably safe to question both companies’ boldest claims. Remember, big, for-profit corporations are not your friends. Huang, when asked about Nvidia’s decision to support Adaptive Sync, pointed to quality control issues with the current Adaptive Sync monitors on the market, while NPD’s Stephen Baker suggested the real reason is that Nvidia isn’t competitive enough in the monitor space with G-Sync alone.

Nvidia is getting creamed by AMD, whose tech is in far more of the monitors being made and sold and is supported by far more display makers. Nvidia needs to compete on its rival’s level. “[I]t’s really about cost,” Baker told me. “People may argue about technology or whatever else, but as long as there’s a significant incremental cost to G-Sync, it’s gonna be a tough sell.”

So Nvidia’s gotten on AMD’s level, and its new G-Sync Compatible monitors toss out the cool Nvidia scaler and take a note from AMD, supporting Adaptive Sync through software exclusively. Vijay Sharma, Product Manager at Nvidia and head of G-Sync for the company, told me to think of all the different standards as something almost like a family tree.

A rough illustration of the VRR world currently. (Graphic: Gizmodo)

At the top level is the concept of variable refresh rate. From there spring three types of technology that use VRR: G-Sync, with its custom scaler; Adaptive Sync, which relies on software and the DisplayPort 1.2a standard; and HDMI VRR, an HDMI version of variable refresh rate that’s slowly getting support in TVs from makers like Samsung.

These technologies then branch out. From G-Sync springs regular G-Sync and G-Sync Ultimate HDR. From Adaptive Sync springs FreeSync, FreeSync 2 (more on that in a moment), Intel’s future Adaptive Sync support, and the new G-Sync Compatible standard. From HDMI VRR springs, well, not a lot: just FreeSync on a select group of Samsung TVs that only works with the Xbox One X or Xbox One S.
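
If it's easier to read than to draw, here's the same family tree from the last two paragraphs written out as a plain nested structure (names only, nothing beyond what's described above):

```python
# The VRR family tree as described above, written out as a nested dict.
vrr_family = {
    "Variable refresh rate (VRR)": {
        "G-Sync (Nvidia's custom scaler)": [
            "G-Sync",
            "G-Sync Ultimate HDR",
        ],
        "Adaptive Sync (VESA, DisplayPort 1.2a)": [
            "FreeSync",
            "FreeSync 2",
            "Intel's future Adaptive Sync support",
            "G-Sync Compatible",
        ],
        "HDMI VRR": [
            "FreeSync on select Samsung TVs (Xbox One S/X only)",
        ],
    },
}
```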

Back to the claims of these companies—particularly Nvidia’s repeated insinuations that AMD is just slapping its name on monitors that don’t work. Is that true?

No. What AMD is doing is saying that if you get a monitor, and it says it works with FreeSync, then it will work with your AMD GPU and give you some kind of variable refresh rate. What kind can vary a lot. A monitor might only do FreeSync when it’s being asked to refresh between 60 and 120 frames a second. It might not work below 60 — which is when many gamers with cheaper cards would want VRR the most.
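
As a concrete version of that caveat, here's a tiny check built around the hypothetical 60-120Hz range above (the function and numbers are illustrative, not anything AMD publishes):

```python
def vrr_engages(fps: float, vrr_min_hz: float, vrr_max_hz: float) -> bool:
    """Does the monitor's variable refresh range actually cover this frame rate?
    Ignores low framerate compensation, which comes up further down."""
    return vrr_min_hz <= fps <= vrr_max_hz

# The hypothetical 60-120Hz FreeSync range described above:
print(vrr_engages(90, 60, 120))   # True  -> inside the range, VRR smooths things out
print(vrr_engages(45, 60, 120))   # False -> below the floor, right where a cheaper card needs it most
```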

The original open source FreeSync standard leaves a lot of wiggle room. Even worse, a lot of those FreeSync monitors ship with FreeSync turned off. Including the one on my desk! I didn’t even realise it had FreeSync disabled by default until I started working on this piece.

If trawling review sites and living in the spec sheets for monitors is unappealing to you, AMD points to the newer FreeSync 2 HDR, which gives a very specific set of guidelines that a monitor must meet before it can be labelled FreeSync 2. As with G-Sync Ultimate HDR, those guidelines include support for HDR and a wider colour gamut.

It also includes something called Low Framerate Compensation, or LFC. This is AMD’s way of guaranteeing gamers have a smooth gaming experience even when performance dips. Above I mentioned that not all VRR monitors are created equal, and in some, VRR only works for a limited range of refresh rates.

For a monitor to qualify for FreeSync 2 HDR, it needs a sufficiently wide refresh rate range. According to AMD, when you divide the top of a monitor’s VRR range by the bottom, the result must be 2.5 or higher. So a 144Hz display with a floor of 48Hz would pass (144 divided by 48 is 3). A 144Hz display with a floor of 60Hz would fail (144 divided by 60 is only 2.4). Nvidia’s new G-Sync Compatible displays have to clear a similar bar, a ratio of at least 2.4.
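
Put as arithmetic, using the two thresholds cited here (2.5 according to AMD for FreeSync 2 HDR, 2.4 according to Nvidia for G-Sync Compatible), the check looks something like this sketch:

```python
def vrr_ratio(max_hz: float, min_hz: float) -> float:
    """Top of the VRR range divided by the bottom: the figure both programs look at."""
    return max_hz / min_hz

for max_hz, min_hz in ((144, 48), (144, 60)):
    ratio = vrr_ratio(max_hz, min_hz)
    amd = "passes" if ratio >= 2.5 else "fails"      # FreeSync 2 HDR bar, per AMD
    nvidia = "passes" if ratio >= 2.4 else "fails"   # G-Sync Compatible bar, per Nvidia
    print(f"{min_hz}-{max_hz}Hz: ratio {ratio:.1f}, {amd} AMD's bar, {nvidia} Nvidia's")

# 48-144Hz: ratio 3.0, passes AMD's bar, passes Nvidia's
# 60-144Hz: ratio 2.4, fails AMD's bar, passes Nvidia's
```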

Unfortunately, display makers don’t just put the range of refresh rates that VRR runs at on the spec sheet. That would be too easy! So if you want to know what your monitor, or future monitor, is capable of, you’ll have to check out a chart of supported monitors. Nvidia has one handy for G-Sync Compatible displays here. AMD’s super searchable list of FreeSync monitors, including VRR range, can be found here. Using it, I learned there was a very good reason my 4K FreeSync monitor was so cheap: it only supports VRR from 40Hz to 60Hz.

If you’re planning to buy a monitor and don’t want to find yourself holding something that doesn’t do the cool promised thing well, then your safest bet is to look for monitors labelled as FreeSync 2 HDR or G-Sync Compatible. Otherwise, you should double check the monitor’s range in the links above. Remember, you want a monitor that supports VRR not just at the highest refresh rate, but at the lowest too.

And hopefully, Nvidia and AMD’s pissing match continues to benefit consumers. Right now it’s highlighted the flaws in a lot of the VRR displays being sold. More discussion could push display manufacturers to make the supported variable refresh rate range easy to find.

Or it could just make displays cheaper, something Lisa Su insinuated when asked about Nvidia’s adoption of Adaptive Sync. She wasn’t worried about the competition. “We think that just means that it’s better for gamers.”


Comments

    "And we think that is a terrible idea to let a customer buy something believing the promise of that product and have it not work."

    This is ironic, because I have a G-Sync monitor (Alienware AW3418DW) and an NVIDIA GPU, and my experience with G-Sync is that it never works properly - I still get a ton of tearing, and additionally games with high but highly variable framerates cause the brightness to flicker.

    Visually it's a great monitor and after having used an ultrawide for a bit I could not go back, but given how damn expensive it was - because of the G-Sync - I'd have hoped that it actually worked properly. It seems to kind of sort of work when I set it to run on everything rather than just fullscreen games but I'm still not convinced it's actually running even then because I still see obvious tearing, and I have terrible eyesight. :(

    Oh also, the fact that *every time my PC sleeps the display*, when it returns it pops a toast notification up telling me I plugged in a G-Sync-capable monitor is insanely annoying.

      There are a few things you can check to see if GSync is running. Please don't take it as suggesting you haven't troubleshooted, just want to be thorough.

      1. Make sure GSync is actually switched on in the monitor settings. Some monitors allow you to toggle the feature yourself. Some monitors change the LED colour to indicate if GSync is on or not.

      2. In the Nvidia Control Panel, if you select the 'Set up GSync' option, make sure 'Enable GSync' is ticked and enable for both windowed and full screen mode. Then go to the Display menu at the top and tick the GSync Indicator option. The next time you start the game, it should show a large and obtrusive indicator if GSync is active and running for the game. Definitely disable this when you're done testing.

      3. Make sure vsync is off in your game settings, and on globally in the Nvidia control panel under 'Manage 3D settings'. These are both important - vsync on in-game interferes with GSync's behaviour and produces tearing; and vsync off in the 3D settings disables GSync when the software frame rate exceeds the monitor refresh which also produces tearing. While you're in the 3D settings, also make sure the Monitor Technology setting is set to GSync.

      So I had this problem with my GSync monitor and it turns out you have to enable 'Fast VSync' in the Manage 3D Settings part of the NVidia control panel.

      This will cause the GPU to throw out partially-rendered frames instead of sending them to the screen. I can't 100% guarantee it will solve your issue, but it absolutely did for me.

    A 60Hz monitor receiving 120fps skips 1 frame, not "like 60".
