Why It’s Better That Your Monitors Don’t Match

One of the great joys of a PC setup is symmetry. Everything is synchronised. The space is clean. Organised.

That’s true for monitors too. Monitors must be the same size. Everything must be even. Ordered.

But there’s a good argument for throwing that logic out the window.

Perhaps the most aggravating part of PC gaming over the last few years has been watching HDR proliferate on consoles: more vibrant colours, brighter whites, deeper blacks.

But consoles were always going to get HDR first. The market for big-screen and budget TVs well outstrips the specialist PC gaming market; phones even more so.

And then there are all the other desirable requirements: an IPS screen (although the contrast ratios of OLED are also very good). 144Hz, or at least 100Hz. 4K, and if not 4K, at least 1440p and maybe even 21:9 support.

The cost of all of that in a single package is well over a thousand dollars. And if you want HDR on top of that?

$3500.

Most people don’t buy TVs that expensive. And that’s not even factoring G-Sync into the equation. FreeSync-enabled monitors are generally a fraction cheaper, but if you’ve got an Nvidia card — which most gaming PCs do — then you won’t get as much out of it.

But there’s a strong use case for having at least one large — I’m talking 32″ here — 4K screen on your desk. Most gamers do a ton of web browsing, and it’s pretty common for them to do some kind of office work that benefits from a good 4K IPS screen. Maybe you dabble with the Adobe suite in your spare time. Maybe you’re a web developer, someone who streams, or slices videos for YouTube.

The catch is that those same people also want to game, so having a separate, dedicated gaming monitor makes a lot of sense. People understand the value of higher refresh rates: action is smooth, recoil is easier to control, enemies are easier to spot in motion. That’s all well and good.

So instead of hoping for a pipe dream — one or two identical monitors that can do absolutely everything, at a price almost nobody can afford — it makes sense to have (at least) two screens: one specifically for gaming, with a high refresh rate and the lowest possible response time, and another big beast to handle the web browsing, colour reproduction, HDR, and so on.

That’s where something like the BenQ EW3270U comes in. For about $749, you get a 4K screen that supports HDR and can cover 95% of the DCI-P3 colour gamut. That gamut coverage is partly down to the panel: the EW3270U uses a VA panel rather than IPS, so you’re trading a little colour accuracy and the wider viewing angles of IPS for improved contrast.

VA-based panels — which are kind of a good middle ground between the gamer-focused TN offerings and the professional-focused IPS offerings — also tend to be a bit cheaper. If you want an IPS-based 4K HDR screen the same size as the EW3270U, you can expect to shell out $1149 or more. (There are some 31.5″ screens that are a fraction cheaper, like the ViewSonic VX3211, but that screen only supports 8-bit colour, compared to the BenQ’s 10-bit support.)

For a piece of tech you’re going to run for five or more years, that’s a decent starting price. Other bits people will appreciate: it’s not curved, which helps if you’re running multiple monitors, and it comes with FreeSync support, which is a massive boon for Xbox One X owners (but also Xbox One S and even stock Xbox One owners) playing games that don’t maintain a locked 60 frames per second, as Digital Foundry showed with Wolfenstein 2: The New Colossus and The Vanishing of Ethan Carter.

Some points worth noting here: the EW3270U only supports FreeSync 1, not AMD’s latest version of the technology. FreeSync 2 gets rid of one of the tone mapping phases in the HDR rendering process, which helps reduce latency while playing HDR games. On the plus side, the EW3270U’s FreeSync range extends from 24Hz to 76Hz, which is nice and wide.
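To put that range in concrete terms, here's a quick back-of-the-envelope calculation. The frame times are straight arithmetic from the quoted figures; the 2x threshold for low framerate compensation is a commonly cited rule of thumb about FreeSync in general, not a BenQ spec, so treat that part as an assumption:

```python
# Rough numbers for a 24Hz-76Hz FreeSync window.
MIN_HZ, MAX_HZ = 24, 76

# The frame times the monitor can sync to, in milliseconds.
slowest_ms = 1000 / MIN_HZ  # ~41.7ms per frame at 24fps
fastest_ms = 1000 / MAX_HZ  # ~13.2ms per frame at 76fps
print(f"Synced frame times: {fastest_ms:.1f}ms to {slowest_ms:.1f}ms")

# Low framerate compensation (doubling frames when a game dips below the
# floor) is usually said to need a max/min refresh ratio of at least 2.
# That's a general rule of thumb, not something BenQ quote for this monitor.
print(f"Range ratio: {MAX_HZ / MIN_HZ:.2f} (rule-of-thumb LFC threshold: 2.0)")
```

In other words, anything rendering between 24 and 76 frames per second gets synced, which covers the dips that matter most on console-grade hardware.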

Other plus points: there are only two major screws to pop in during installation. The base is nice and compact, and the bezels on the sides are relatively slim — but the monitor isn’t bezel-free, if that’s something that matters to you. Still, the whole unit is easy to put together.

DisplayPort 1.4, a 3.5mm jack, a USB-C port, two HDMI 2.0 ports and a quoted brightness of 300cd/m² are all nice to have, although the USB-C port doesn’t supply enough power to charge your laptop. There’s also a range of colour-calibrated modes, as well as other presets like Game, Movie, Picture, and so on.

BenQ are also quite proud of their special “Brightness Intelligence Plus” feature. It uses a sensor to gauge the lighting in your current environment, then adjusts the brightness and colour temperature to suit. You have to enable the Low Blue Light picture mode or use HDR within Windows 10, but be warned: the Windows 10 experience with HDR isn’t as nice as what you might expect coming over from a console. It’s improved over the last few months, but the effect of enabling HDR on non-HDR content is jarring enough that you’re better off running the EW3270U in SDR mode and letting the monitor switch when appropriate.

From a general usage perspective, the biggest benefit is being able to have two full-width browsers or programs side by side. That’s hugely useful whenever you need to refer back to something — maybe you want to watch a movie while doing some photo editing, or have a lecture playing on one side while you review notes or textbooks on the other.

But the key point in all of this: there’s one really good reason why you don’t want the EW3270U as your lone monitor. Or any super-sized screen, for that matter.

Games, by and large, should be run in fullscreen. Whether that’s actual exclusive fullscreen or fullscreen windowed/borderless fullscreen/fake fullscreen doesn’t matter: if you’re playing a game, you want it to occupy the entirety of your screen real estate.
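If you’re curious what those modes actually amount to, here’s a minimal sketch in Python using pygame (an arbitrary library choice for illustration, not something any particular game ships with). Borderless fullscreen is just a frameless window sized to the desktop; exclusive fullscreen hands the whole display over to the game:

```python
import pygame

pygame.init()
info = pygame.display.Info()  # the desktop's current resolution

# "Borderless fullscreen": an ordinary window with no frame, sized to the
# desktop. The OS compositor stays in charge, which is why alt-tabbing out
# (or glancing at a second monitor) is instant.
screen = pygame.display.set_mode((info.current_w, info.current_h), pygame.NOFRAME)

# Exclusive fullscreen, by contrast, takes over the display entirely.
# Swap the flag to see the difference:
# screen = pygame.display.set_mode((info.current_w, info.current_h), pygame.FULLSCREEN)
```

Either way, the game ends up covering the whole panel, which is the point.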

And that’s why I’ve never bought the argument for buying a single, gigantic screen — ultrawide or not — to replace a multi-monitor setup. It also screws you a little if you’re the kind of person who wants to share your gameplay: you can record a single fullscreen monitor with something like Nvidia’s Shadowplay/Share tools, but you lose the finer controls you’d get from something like OBS or XSplit (which would let you monitor levels on a second screen).

Borderless fullscreen also isn’t universally supported. It’s an expected feature in blockbuster AAA games, but I’ve lost count of the number of indies built on Unreal Engine 4 or Unity that offer windowed mode, exclusive fullscreen, and nothing else. In those instances you can still play at 4K — but what if you want to track something on a second screen, like a podcast, your emails or a Twitch channel?

And then there’s the pricklier point: how many people have a gaming PC that can actually run 4K games at 60fps anyway? Answer: very few. Even graphs released by Nvidia to promote their upcoming RTX cards note that the RTX 2070 and 2080 aren’t automatically over the 4K/60fps magic mark.
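The arithmetic behind that is brutal. Here’s the pixel count each resolution asks a GPU to shade sixty times over, every second (plain Python, nothing assumed beyond the standard resolutions):

```python
# Pixels per frame, and per second at 60fps, for the common resolutions.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = 1920 * 1080  # 1080p as the reference point
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} px/frame, {pixels * 60:,} px/s "
          f"({pixels / base:.2f}x the work of 1080p)")
```

4K is four times the pixels of 1080p and 2.25 times 1440p, every single frame, which is why a card that comfortably clears 1440p/60 can still fall short of 4K/60.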

So in all of these scenarios, it’s better to have a smaller gaming monitor — either 1080p or, ideally these days, 1440p — with a cheaper, larger second monitor exclusively for productivity. And if that second monitor has HDR10 support (like the EW3270U), then you’ve got a screen that doubles as a nice alternative for console gaming.

(There’s also an understated advantage to a smaller gaming monitor: most people will have spent the last 10 or so years playing at 1080p or lower resolutions, and if you’re playing on mouse and keyboard your muscle memory will be tuned to that. It doesn’t matter much if you’re playing an open-world RPG, but if you move up from a smaller monitor and play something like CS:GO or Fortnite, you’ll definitely notice the difference in your aim.)

But the final point in my argument for an ugly, asymmetrical monitor setup: the situation isn’t going to get better anytime soon. Panel manufacturers are already looking ahead to 5K and even 8K displays. When those start coming out, and content starts being built with those resolutions in mind, gamers and PC users are going to be faced with the same vexing problem we have now. If you want the higher-resolution content, you can have it, but you’ll pay a fortune for it, and you’ll go without the high refresh rates and low response times that make a huge difference in the middle of a game.

So it’s best to toss that ideal right out the window. Forget the dream of having multiple, equally sized monitors with no bezels that all have 144Hz G-Sync support with perfect colour reproduction. You can’t afford it. Even if you could, it’s absolutely not worth the money. Having monitors fit for purpose is a much smarter buy.

