NVIDIA Is Providing Do-It-Yourself G-SYNC Upgrade Kits... For One Display

You may have heard of "G-SYNC", NVIDIA's hardware- and software-based solution to the phenomenon of screen tearing in PC games. While NVIDIA plans to partner with manufacturers to build this technology directly into displays, it is possible to grab a DIY kit from NVIDIA and install the gear yourself.

Unfortunately, the kit is only compatible with one monitor, ASUS' VG248QE, which severely limits the number of people who can take advantage of NVIDIA's G-SYNC tech. That said, the video is interesting from a purely technical point of view, as we get to see exactly how G-SYNC is implemented. I for one am surprised at the size of the G-SYNC board — it seems quite chunky for the functionality it provides.

Hopefully NVIDIA will provide display builders with the specifications they need to create their own upgrade kits, giving those of us who are happy with our current monitors the option of using G-SYNC without shelling out for new gear.

If you'd like the guide in document form, a PDF is available via NVIDIA.

G-SYNC Do-It-Yourself Kit [GeForce]


    Very smart move. First screen add-on I've seen, and it's heading in a good direction. I guess it won't be long until we can build our own monitors.

    I have that monitor. Sweeeet.

      The only question is.. when is it coming to Australia >w< (cause I have that monitor and am awaiting it) It's either I get the kit or spend prob about $800+ to get the first ROG monitor (Swift PG278Q)

        Looking at it I'm tempted to try it, cause I own Dead Island, which suffers from the problems G-Sync is meant to fix. It'd be a good little test


    So.... AMD has publicly stated that to achieve the same effect, all you have to do is have control of the monitor's VBLANK timing, which is something many monitors already support, so it just comes down to a software implementation.

    So, what is this extra hardware for then? The cynical side of me would suggest the hardware is there to lock the feature to nVidia hardware only. But the technical side of me cannot come up with another reason, going off the information provided elsewhere.

    I'm hardly an AMD fanboi, but as someone who is using a lot of the tech coming out of GPUs now, it seems that AMD is going down a path of open standards, while nVidia is pushing for proprietary solutions only. Let's ask 3dfx how that path turns out....

      They're not the same technology. GSync delays the VBLANK until a frame is ready from the GPU; FreeSync uses prediction to notify the monitor of the anticipated VBLANK interval. The latter is cheaper and easier to implement but is liable to tear frames when the framerate fluctuates. The extra hardware in GSync is to freeze automatic VBLANK commands altogether and make them entirely manual, based on a signal from the GPU when a complete frame is ready to be pushed.

        Yep, absolutely correct.

        Not saying that the AMD version is exactly the same, nor even as good. But I find it interesting that nVidia seems to be making everything proprietary and closed to their own systems.
        At first people may say "well, they are a business, of course they will keep their cards close to their chest", however a little history will reveal that nVidia and AMD/ATI owe their very existence to the fact that they pushed and adopted shared PC standard APIs [OpenGL/DirectX], while 3dfx was pushing its own closed API [Glide].
        It's all well and good to promote your own technology, but when you forcefully create market fragmentation, it ultimately just begins to frustrate developers and then consumers.

      nVidia's argument against that was that the upscaling (and downscaling) chips in most monitors don't allow for VBLANK to be controlled this way. The demonstration worked on laptops because they don't have upscaling chips (when you set the resolution lower than the native resolution, you end up with a boxed screen instead of a lower-resolution image stretched to fill it).

      Not sure how true that is. But I'd hold off on taking either side's word for now, until it's proven one way or another (show me it working on off-the-shelf monitors and video cards, not laptops).
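
The distinction debated above — a fixed refresh cycle that tears when frames land mid-scanout, versus a G-SYNC-style refresh that is held until the GPU signals a complete frame — can be illustrated with a toy simulation. This is purely a conceptual sketch; the timing model, thresholds, and function names are illustrative assumptions, not NVIDIA's or AMD's actual implementation:

```python
import random

random.seed(42)
REFRESH_MS = 1000 / 60  # fixed 60 Hz panel refresh interval, in milliseconds

def count_tears(frame_times_ms, variable_refresh):
    """Toy model: with a fixed refresh (and vsync off), a frame that finishes
    mid-scanout produces a tear; with a GPU-driven refresh, the monitor's
    VBLANK is held until the frame is complete, so no tear can occur."""
    tears = 0
    t = 0.0
    for ft in frame_times_ms:
        t += ft  # time at which the GPU finishes rendering this frame
        if not variable_refresh:
            # Where the completion time falls within the fixed scanout cycle.
            phase = t % REFRESH_MS
            # A frame landing mid-scanout splits old and new images: a tear.
            if 0.5 < phase < REFRESH_MS - 0.5:
                tears += 1
        # variable_refresh: scanout waits for the frame, so never a tear
    return tears

# Irregular frame times (12-25 ms), as in a real game with fluctuating load.
frames = [random.uniform(12.0, 25.0) for _ in range(1000)]
print(count_tears(frames, variable_refresh=False))  # fixed refresh: many tears
print(count_tears(frames, variable_refresh=True))   # GPU-driven refresh: 0
```

The point of the sketch is only the structural difference: under a fixed refresh, tearing is a function of where frame completion happens to fall in the scanout cycle, while a GPU-driven VBLANK removes that dependence entirely.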
