Look, sometimes you spend $US709 ($972) on a gaming monitor and wind up really happy with your purchase. I’m not saying it happens every time, but it happened this time.
After months of hemming, hawing, and talking myself out of it, last week I went ahead and dropped a bunch of bucks on a newfangled G-Sync monitor. This was a luxury purchase, and in many ways a ridiculous one: The monitor I bought cost twice as much as my PC’s graphics card, which itself cost as much as a PS4 or Xbox One. Whatever, though. Thirty minutes after I plugged the thing in, it was already clear: This shit was worth it.
As a single upgrade, this monitor has had a bigger and more immediate impact than upgrading to a 4GB graphics card; more than moving my Windows installation and games to a solid-state drive; certainly more than upgrading to Windows 10 or overclocking my GPU and CPU.
G-Sync is a proprietary monitor technology that the graphics-card manufacturer Nvidia introduced a couple of years ago. The idea sounded good: A chip that’s built into a monitor allows the monitor to talk directly with your PC’s graphics card and change its refresh rate on the fly, smoothly matching whatever is being output by the card.
If you’ve seen all the talk about frame-rate and 60fps over the last couple of years, that’s all tied to refresh rate, too. The higher (and more stable) the frame-rate on a game, the smoother it looks. The closer the game’s frame-rate is to the screen’s refresh rate, the less chance of tearing or hitching.
Historically, PC gamers have been stuck with two options for syncing frames with their monitor refresh-rates: They can use vertical sync (Vsync), which artificially locks a game’s frame-rate to a target number, or they can simply run the game with an unlocked frame-rate. Both options have downsides, and both options can leave you feeling like you’re not getting the most out of your expensive graphics card. Everything I’d heard about G-Sync suggested that this technology is a for-real, actual, bona-fide way to sync your PC and your screen, and that it makes games run noticeably more smoothly.
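To make that tradeoff concrete, here's a toy back-of-the-envelope model (my own illustration — the numbers are made up for the example, not measured, and this simplifies what real drivers do) of why Vsync hitches when a frame misses the refresh window while adaptive sync doesn't:

```python
import math

REFRESH_HZ = 60
REFRESH_INTERVAL = 1000.0 / REFRESH_HZ  # ~16.7 ms between refresh ticks


def vsync_display_time(render_ms):
    """With Vsync, a finished frame waits for the next refresh tick,
    so its on-screen time is rounded UP to a multiple of the refresh
    interval. A frame that takes even slightly too long is held for
    a whole extra refresh."""
    return math.ceil(render_ms / REFRESH_INTERVAL) * REFRESH_INTERVAL


def adaptive_display_time(render_ms):
    """With adaptive sync (G-Sync/FreeSync), the monitor refreshes when
    the frame is ready (within the panel's supported range), so the
    on-screen time roughly equals the render time."""
    return render_ms


# A game hovering around 70fps renders frames in ~14.3 ms,
# with the occasional slower frame:
render_times = [14.3, 14.3, 17.0, 14.3]

print([round(vsync_display_time(t), 1) for t in render_times])
# Vsync on a 60Hz panel: the 17 ms frame is held for two refreshes,
# producing a visible 33.3 ms hitch -> [16.7, 16.7, 33.3, 16.7]

print([round(adaptive_display_time(t), 1) for t in render_times])
# Adaptive sync: each frame displays for about as long as it took
# to render -> [14.3, 14.3, 17.0, 14.3]
```

In other words, Vsync quantises frame times to the panel's tick, so any frame that overruns doubles up; adaptive sync just shows frames as they arrive, which is why the 60-80fps range feels smooth instead of juddery.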
Last week I decided, fuck it, I’m going for it. Here’s the monitor I bought. It’s a 69cm, 2560×1440 Acer with a 144Hz refresh rate and built-in G-Sync support. There are plenty of other G-Sync monitors out there; this one had some good reviews, so I decided to go with it. I got mine from Amazon for a little below list price, but lots of stores carry them.
Thoughts informing my decision:
- It seems like a safe bet to get a 1440p monitor, given that it’s become a more reasonable resolution for stable PC gaming. 4K resolution just doesn’t seem practical or even necessary for a monitor-sized screen.
- 144Hz is more than double the 60Hz refresh rate of the other screens in my apartment, but I’ve seen enough PC gamers swear by higher frame-rates that I wanted to see what the deal was.
- Between the resolution and refresh rate, this monitor seems like it will be future-resistant, at least for a few years. For better or for worse, I’ve already committed to Nvidia’s whole deal by buying my latest graphics card from them, so I don’t really see myself switching to AMD anytime soon.
- It’s getting dark at like 4PM in Portland this time of year, and buying myself something cool will make me temporarily forget about that and feel happy.
Downsides:
- It only has a single DisplayPort input, so I won’t be able to have it double as an aux monitor for my game consoles without buying an expensive adaptor and manually swapping inputs. Apparently this is always true for this kind of G-Sync monitor, and it’s a bummer.
- It costs $US709 ($972), which is an insane amount of money to spend on a gaming peripheral, and enough to buy a whole lot of doughnuts and pastrami sandwiches.
- I don’t really care for some of the ways Nvidia does business. (More on that in a bit.)
The pros outweighed the cons, so I ordered the thing. A few days later, it arrived. I plugged it in, and yow. It is a damn good monitor.
It’s tricky to write about this kind of technology, because I can’t just show it to you. You’ll have to take my word for it. So: G-Sync works as advertised, and it’s noticeably changed how I experience PC games. I no longer sweat frame-rate fluctuations at all — I just turn on a game, turn off Vsync, and let it run. I have a GTX 970 graphics card, which can run most games at 1080p or higher and push them north of 60fps. Since my monitor now goes all the way up to 144Hz, it has plenty of headroom to let games exist in the 60-80fps range, and thanks to G-Sync, it runs all of those frame-rates smoothly, with no tearing.
(Some purists may need their games to run at 144fps — I’m not there yet. I can usually detect when a game drops below 60fps, but in the midst of gameplay, I can’t really tell the difference between, say, 78fps and 94fps. When I run a game at a locked 144fps I can detect that it’s unusually smooth, but anything north of 60 is fine by me.)
I wasn’t aware of just how thoroughly screen-sync issues had invaded my PC gaming consciousness until I no longer had to deal with them. Time was, I’d start playing a new PC game with one eye on the FPS counter in the corner of the screen. If I saw a hitch or a slow point, both eyes would dart to the edge of the screen, like I was trying to catch the performance dip in the act. “Oh, nicked down to 54fps that time,” I’d think. “That’s not good.” Eventually I would have to turn off the FPS counter just so I would stop fixating on it and enjoy the game.
Now, every game just runs. GTA V is capable of maintaining 70-90fps on near-ultra settings in 1440p, and you should see it. It looks perfect. Even in the rare event that the frame-rate dips below 60fps, I barely notice, because there’s no hitching or stutter. Other games look just as good: Shadow of Mordor, Mad Max, The Witcher 3, Black Ops III, and on, and on. Dying Light looks bananas. Fallout 4 and Just Cause 3 have some real problems running at a consistent frame-rate, but even those games’ dips aren’t a big deal with G-Sync running.
The monitor can be a little funky sometimes: Assassin’s Creed Syndicate, for instance, drops its FPS to zero every time I leave a menu, though it returns to 60+ a few moments later. Divinity: Original Sin will sometimes start freezing and unfreezing periodically; restarting the game fixes it, and I can’t say whether it’s G-Sync-related or not. Regardless, a few hiccups don’t do much to mar the overall experience for me. It’s PC gaming, after all. There’ll always be some funkiness.
There are a few other things I don’t like about the monitor, however, chiefly the fact that it’s required me to make a substantial financial commitment to Nvidia’s hardware ecosystem. Ugh. Just typing the phrase “hardware ecosystem” makes me feel compromised. It’s one thing to take the initial step of buying an Nvidia or AMD graphics card. That’s a first step, and you can always change your mind next time around and go the other way. Buying a second piece of hardware is a much more substantial investment; it effectively removes any chance that I’ll switch to AMD for the duration of this monitor’s lifespan.
The technology is so good that I wish all monitors had it and that it could work with any graphics card. So, it’s too bad that G-Sync is proprietary to Nvidia and requires such a financial commitment to get it. AMD has a competing technology called FreeSync, which sounds like it works similarly, in that it requires a monitor to be equipped with the technology before your AMD card can work with it. There are some technical differences in how the two operate, but a primary difference is that Nvidia has controlled who can and can’t have a G-Sync module for use in their monitors, while AMD has made FreeSync freely available to any company that might want to support it.
AMD’s more open approach is doubtless fuelled in part by the fact that they’re the less popular brand and need to cut into Nvidia’s lead, but the dichotomy between the two still reinforces my distaste for the way Nvidia operates. Nvidia is all about injecting their proprietary tech anywhere they can, meaning that most big-budget games ship with GameWorks features that can (and often do) hobble performance on non-Nvidia GPUs. I get how competitive the PC gaming market is, and I don’t entirely begrudge Nvidia their attempts to succeed and make money, but as we’ve seen over the last couple of years, that kind of cutthroat manoeuvring hurts games as often as it helps them and causes headaches for gamers who don’t get in line.
It feels like it will only be a matter of time before all gaming-oriented monitors and TVs can do something similar to G-Sync. I may dislike that I’m forced to pick a brand and stick with it for the foreseeable future, but for the time being, I’m ultimately fine throwing in with Nvidia. I trust that they’re not going anywhere and that their cards will generally do a good job of running the games I want to play. It will do for now.
I understand that this article might get some of you considering spending too much money on a piece of hardware that you really don’t need. I sympathize! Most PC gamers have that one piece of equipment that they don’t own, but that they’d like to own. Your PC can always be a little bit better, after all; it’s both the blessing and the curse of PC gaming. You could always have that slightly faster CPU or that slightly improved graphics card; that clackier keyboard, or that mouse with all the buttons.
All of us have that one thing — the next thing — that we’re considering getting. Sometimes that next thing is more trouble than it’s worth; sometimes, that next thing is a disappointment. But sometimes, you spend a bunch of money on a thing and it winds up being totally worth it. Hooray, a new piece of technology does exactly what it promised it would.
Comments
28 responses to “I Bought An Expensive-Arse PC Gaming Monitor And It’s Really Good”
ROBOT JOX.
I bought an Asus 27″ PLS 1440p monitor about two years back, cost me around $700, and it’s possibly the best PC peripheral I’ve bought in many years. 4K is still just too demanding to game on realistically with high-end games on high settings at the moment, and 1440p looks fantastic. Plus the colours really pop on a PLS panel.
Had to go back to a 1080p LED monitor at one point while I was using the 1440p at work, and it almost broke me. Just couldn’t go back.
Same exact situation, Asus 27″ monitor. Can’t even look back at the monitors at work anymore without a tinge of sadness.
Came for the Robot Jox. Stayed for the Robot Jox.
It looks cool and all but seriously, NVidia are assholes and I won’t give them money to further restrict what should be open technologies.
Nothing is open technology if you’re funding it privately. No private business has to work with any other business. I don’t even use Nvidia, I’m currently running a way old 6950.
They don’t have to. But they should. Industry standards are good for everyone. Pursuing things like putting proprietary plugs and chips into generic hardware is anti-consumer, anti-competition, and makes the industry worse as a whole for everyone except them. But especially worse for consumers.
I agree in principle, but when you’re running a million/billion-dollar business you need to protect your investments, and proprietary software/plugins do that. It sucks, but that’s what you sometimes have to do.
This isn’t like having a cool feature that your product has over others. If Microsoft suddenly locked all of their operating systems off from any hardware that isn’t specifically licensed for Windows, people would be so damn mad they’d vomit blood. Because that would be a shitty, underhanded cash grab that artificially limits consumer choice and market competition.
This is like when Apple first released iTunes, they encrypted and DRM locked every song so you could never play your music on anything other than an iPod. It’s a bullshit tactic that made people very angry and eventually they rolled it back because it was company A limiting how a customer uses products from company B, C, and D.
I’m not convinced that Nvidia’s intention is purely business-based; maybe it’s connected with efficiency and synergy of technology. Their tech is consistently more efficient across the board, maybe because they design everything to work better with their controlled in-house tech? Nvidia’s design synergy not only benefits their business, but the advancement of PC graphics, albeit down a very specific, insular, incestuous evolutionary path. 🙂
They use questionable business tactics to influence game performance during development and now with the monitor stuff they are doing almost exactly what AMD are doing, except they are locking shit down instead of opening it up to industry standards.
DirectX exists specifically to stop this bullshit. Video cards in the 90s were terrible because of competing standards. It took a third party (Microsoft) to force an industry standard so that the graphics card industry could stop being completely toxic to consumers and developers. This could be the beginning of the pre-DirectX days all over again. Nobody wants to see that. Well, nobody who isn’t trying to create a monopoly in the PC market wants to see that.
Just reading the article, but who uses cm to talk about monitor size?
People that want their monitor to sound bigger 😛
I use this rule for many things that i want to sound bigger.
Mark’s mentioned in the past that there are tools in place that automatically localise articles brought over from Kotaku US. They do things like change US spelling and measurements.
My best guess is that the use of cm is the result of that process.
More impact than moving to an SSD? I really find that hard to swallow. And you put this monitor on a 970??
Honestly, for that sort of money, I think it would make more sense to get a 980Ti and just lock it to 72Hz.
I find that while a high-refresh-rate monitor is useful, it’s more important to get the most fps from the computer itself, as it will then give you the most recent frame displayed at any time.
That’s really my thinking on it: spend the money on processing so that all your games can run at 72fps or 60fps without dropping under it, and then any monitor can lock to that rate. You get the benefit of a faster machine, consistently fast fps, and your choice of monitors.
People seem to think that getting a faster screen will fix ALL of their problems, but you will still have some issues, as the frames aren’t necessarily aligned, resulting in a slight delay.
As long as the gap of time between frames stays consistent your brain can predict the rest.
72Hz with one missing frame here and there is worse than 60Hz dependable.
FreeSync and G-Sync can remove tearing when this happens, but they can’t change how your visual cognition evolved.
DERP REPLY
That screen is incredible. Ticks every box with features you could want… given that 4K isn’t really achievable at frame rates that PC gamers are accustomed to. You see that screen pictured in so many ‘dream builds’ blogs/galleries where budget hasn’t been a limiting factor. Have fun amigo!
Very nice, but I prefer 21:9 monitors; ultra-widescreen is where it’s at. I got a GTX 980Ti and a 34″ 1440p monitor this year, best $2100 I spent all year (maybe). 60fps at 3440×1440 is a pixel wet dream.
for flight/driving/space sims… mmmmmm. 🙂
Good as it is, my money’s going towards a Pascal GPU when they come out next year. With any luck I’ll have some savings left over to go to a monitor like this too.
I’ve owned this monitor since August and gotta say it’s a pretty amazing piece of kit.
G-Sync really is awesome (when Nvidia don’t break it with every new driver update, seriously wtf).
The resolution is razor sharp, and 144Hz is smooth as butter even if you’re just browsing the web, etc.
But for the price, the build quality seems kinda average (crappy shiny plastic, screen wobbles if bumped, etc.)
Anyone thinking of buying this should check out Asus’s new 165Hz ROG IPS thingy, imo.
The AMD FreeSync tech is being used as a basis for the standardised DisplayPort adaptive sync feature by VESA.
So assuming the feature gets more widespread support (which it might, with Intel saying they will implement support for it), I wouldn’t be surprised to see Nvidia start supporting AMD’s version in future.
Can anybody tell me more about the Apple Thunderbolt Display? How is it as a gaming monitor? I heard the colour is very good for office work, not sure about gaming.
I got the 24″ 1080P GSYNC TN panel version, the Acer XB240h for $479 AUD. You’re right on all counts, Gsync makes games feel seamless and smooth and was worth the investment. At times I wished I got the IPS version you got, but I’m running a single GTX 970 and even now at 1080p the latest AAA games will dip down to 45-50fps when maxed out, and tank to around 35-45fps when using DSR to downscale from 1440p so I’m glad I stuck with 1080p. I will say though, I miss the colours on my old IPS panel. Particularly greys/blacks and the shitty dithering bothers me sometimes.