Don't Understand NVIDIA's G-SYNC Tech? Here It Is Explained In Simple Terms. Mostly

You might remember late last year that NVIDIA announced something called "G-SYNC". The basic idea is that a piece of additional hardware is built into your monitor, which will allow your NVIDIA graphics card to display games without visual stuttering (or at least, improve the situation immensely). If you're still a little vague on how it all works, this two-minute clip should clear things up.

RedditGames asked NVIDIA's Tom Petersen to explain the concept in a way that a five-year-old would understand, and while he does an admirable job, it's hard to describe how it works without using a few technical terms.

Perhaps the best part is when Petersen explains why games stutter in the first place, something many gamers might not understand beyond knowing that the v-sync option seems to make things better... sometimes.

Of course, stuttering can be caused by a number of issues -- insufficient grunt, poorly-timed garbage collection or, in the case of Dead Island, sending multiple keystrokes as input -- but it's good to know GPU vendors are working on addressing the most common suspect.

Explain Like I'm Five: G-SYNC [YouTube, via Reddit]


Comments

    The chart they used for monitor vs GPU was wrong in the context of what they were explaining. So much for being a simpler explanation.

    So it is just variable refresh rate on the display? I guess if the game is consistently slow at points that would get rid of the judder. If it is just a few frames that are slow I wonder if it would make a difference? (e.g. panning the camera over a world, with some complex geometry only visible for part of the pan).
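    To put rough numbers on that question, here's a minimal sketch in plain Python (the 60Hz panel and the frame times are illustrative assumptions, nothing NVIDIA-specific). The point is just that a fixed-rate panel rounds a slow frame up to the next whole refresh, while a variable refresh panel can show it the moment it's ready -- so a few 22ms frames in a pan become 33ms frames on a fixed panel, but stay 22ms frames on a variable one:

    ```python
    import math

    REFRESH_MS = 1000 / 60  # assumed fixed 60Hz panel: one refresh every ~16.7ms

    def fixed_refresh_display_ms(render_ms):
        # V-Sync on a fixed-rate panel: a finished frame waits for the next
        # refresh, so its on-screen time rounds UP to a multiple of ~16.7ms.
        return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

    def variable_refresh_display_ms(render_ms):
        # Variable refresh (the G-SYNC idea): the panel refreshes when the
        # frame is ready, so display time tracks render time (the panel
        # still can't refresh faster than its maximum rate).
        return max(render_ms, REFRESH_MS)

    # A camera pan where a couple of frames are slow (complex geometry in view):
    for render_ms in [14, 15, 22, 24, 15, 14]:
        print(f"render {render_ms}ms -> fixed panel: "
              f"{fixed_refresh_display_ms(render_ms):.1f}ms on screen, "
              f"variable: {variable_refresh_display_ms(render_ms):.1f}ms")
    ```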

    That wasn't a very good explanation.... :\

    Monitors don't display the whole image at once; they update line by line (e.g., from top to bottom, like the English language is written).

    When V-Sync is ON, the image buffer is only updated once the screen has finished drawing the last line of the current image.

    With V-Sync OFF, the image buffer is updated as soon as the GPU has finished calculating the image. The monitor might only be part way through showing the old image when the buffer updates with a new one; the monitor doesn't care, it just continues onto the next line, but now it's getting data from the newest image. Basically, you are seeing parts of two images at once: the new part at the bottom and the old part at the top! If something was moving, it won't line up between the two, so you get a 'tearing' effect.
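    To see that split in action, here's a toy scan-out in plain Python (the eight-line "screen" and the swap point are made-up assumptions purely for illustration):

    ```python
    LINES = 8                      # a tiny 8-line "screen", scanned top to bottom
    old_frame = ["old"] * LINES    # the image already in the buffer
    new_frame = ["new"] * LINES    # the image the GPU just finished

    def scan_out(swap_at_line):
        # Draw line by line; with V-Sync off the GPU swaps buffers mid-scan.
        # The monitor just reads the next line from whatever is in the buffer,
        # so the result is the top of the old image stitched to the bottom of
        # the new one -- a tear.
        buffer = old_frame
        shown = []
        for line in range(LINES):
            if line == swap_at_line:   # buffer swap arrives partway down
                buffer = new_frame
            shown.append(buffer[line])
        return shown

    print(scan_out(swap_at_line=5))
    # ['old', 'old', 'old', 'old', 'old', 'new', 'new', 'new']
    ```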

    The other issue with standard V-Sync is that it not only locks your frame rate at 60fps to match a 60Hz screen, but if your computer drops below 60fps, V-Sync will lock the output to 30 instead and double up the frames. This stops the tearing, but you lose the 'smoothness' of higher frame rates. Fairly recently, NVIDIA added a dynamic V-Sync (maybe AMD too?) that only locks your frame rate to 60 when your computer is rendering 60+ fps, meaning you get tearing under 60fps but don't lose the frame rate.
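    And a rough sketch of that drop-to-30 behaviour, again in plain Python with an assumed double-buffered 60Hz setup (the adaptive mode here is a simplification for illustration, not NVIDIA's actual implementation):

    ```python
    import math

    REFRESH_HZ = 60
    REFRESH_MS = 1000 / REFRESH_HZ

    def standard_vsync_fps(render_fps):
        # Standard V-Sync: every frame has to land on a refresh boundary, so
        # a frame that misses one interval waits for the next -- output snaps
        # down to 60, 30, 20, 15... fps.
        intervals = math.ceil((1000 / render_fps) / REFRESH_MS)
        return REFRESH_HZ / intervals

    def adaptive_vsync_fps(render_fps):
        # Dynamic/adaptive V-Sync (simplified): cap at 60fps when rendering
        # 60+; below that, V-Sync switches off, so you keep your frame rate
        # but tearing comes back.
        return min(render_fps, REFRESH_HZ)

    for fps in (75, 60, 55, 45, 25):
        print(f"rendering {fps}fps -> standard V-Sync: "
              f"{standard_vsync_fps(fps):.0f}fps, "
              f"adaptive: {adaptive_vsync_fps(fps):.0f}fps")
    ```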
