Why It's Better That Your Monitors Don't Match

I couldn't quite fit all of it in the frame. Image: Alex Walker (Kotaku)

One of the great joys of a PC setup is symmetry. Everything is aligned. The space is clean. Organised.

That's true for monitors too. Monitors must be the same size. Everything must be even. Ordered.

But there's a good argument for throwing that logic out the window.

Perhaps the most aggravating element of PC gaming over the last few years has been watching the proliferation of HDR on consoles. More vibrant colours, brighter whites, deeper blacks.

But consoles were always going to get HDR first. The market for big-screen and budget TVs well outstrips the specialist PC gaming monitor market; phones even more so.

And then there are all the other desirable features: an IPS screen (although the contrast ratios of OLED are also very good). 144Hz, or at least 100Hz. 4K, and if not 4K, at least 1440p and maybe even 21:9 support.

The cost of all of that in a single package is well over a thousand dollars. And if you want HDR on top of that?

$3500.

I wish my TV was that expensive.

Most people don't buy TVs that expensive. And that's not even factoring G-Sync into the equation. FreeSync-enabled monitors are generally a fraction cheaper, but if you've got an Nvidia card (which most gaming PCs do) then you won't be able to take advantage of it.

But there's a strong use case for having at least one large 4K screen (I'm talking 32" here) on your desk. Most gamers do a ton of web browsing, and it's pretty common for gamers to do some kind of office work that benefits from a good 4K IPS screen. Maybe you dabble with the Adobe suite in your spare time. Maybe you're a web developer, a streamer, or someone who slices videos for YouTube.

The catch is that those kinds of people also want to game. So having a separate, dedicated gaming monitor makes a lot of sense. People understand the value of higher refresh rates: action is smoother, recoil is easier to control, enemies are easier to spot in motion. That's all well and good.

So instead of hoping for a pipe dream (one or two identical monitors that can do absolutely everything, at a price almost nobody can afford), it makes sense to have at least two screens: one specifically for gaming, with a high refresh rate and the lowest possible response time, and another big beast to handle the web browsing, colour reproduction, HDR and so on.

Image: BenQ

That's where something like the BenQ EW3270U comes in. For about $749, you get a 4K screen that supports HDR and can cover 95% of the DCI-P3 colour gamut. Part of that is down to the panel: the EW3270U uses a VA panel rather than IPS, so you're trading some colour accuracy and viewing angle for improved contrast.

VA-based panels, which are a good middle ground between the gamer-focused TN offerings and the professional-focused IPS offerings, also tend to be a bit cheaper. If you want an IPS-based 4K HDR screen the same size as the EW3270U, you can expect to shell out $1149 or more. (There are some 31.5" screens that are a fraction cheaper, like the ViewSonic VX3211, but that screen only supports 8-bit colour compared to the BenQ's 10-bit.)

For a piece of tech you're going to run for five or more years, that's a decent starting price. Other bits that people will appreciate: it's not curved, which helps if you're running multiple monitors, and it comes with FreeSync support, which is a massive boon for Xbox One X and Xbox One S owners playing games that don't maintain a locked 60 frames per second, as Digital Foundry showed with Wolfenstein 2: The New Colossus and The Vanishing of Ethan Carter.

Some points worth noting here: the EW3270U only supports FreeSync 1, not AMD's latest version of the technology. FreeSync 2 gets rid of one of the tone mapping phases in the HDR rendering process, which helps reduce latency while playing HDR games. On the plus side, the EW3270U's FreeSync range extends from 24Hz to 76Hz, which is nice and wide.
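For a rough sense of what that range means in practice, here's a quick sketch in Python. It's plain arithmetic from the quoted 24Hz-76Hz figures; the 2.5x threshold for Low Framerate Compensation is the figure commonly cited for AMD's LFC feature, not a BenQ spec.

```python
# Back-of-the-envelope look at the EW3270U's quoted 24Hz-76Hz FreeSync range.
FREESYNC_MIN_HZ = 24
FREESYNC_MAX_HZ = 76

# Frame times the monitor can sync to without tearing or stutter.
fastest_frame_ms = 1000 / FREESYNC_MAX_HZ  # ~13.2ms
slowest_frame_ms = 1000 / FREESYNC_MIN_HZ  # ~41.7ms
print(f"Synced frame times: {fastest_frame_ms:.1f}ms to {slowest_frame_ms:.1f}ms")

# Low Framerate Compensation (frame doubling below the minimum) is commonly
# said to need a max/min ratio of at least ~2.5, which this range clears.
ratio = FREESYNC_MAX_HZ / FREESYNC_MIN_HZ
print(f"Max/min ratio: {ratio:.2f} -> LFC {'plausible' if ratio >= 2.5 else 'unlikely'}")
```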

Other plus points: there are only two major screws to pop in for installation. The base is nice and small, and the bezels on the sides are relatively slim, although the monitor is not bezel-free, if that's something that matters to you. Still, the whole unit is easy to put together.

DisplayPort 1.4, a 3.5mm jack, a USB-C port, two HDMI 2.0 ports and a quoted brightness of 300cd/m² are all nice to have, although the USB-C port doesn't supply enough power to charge your laptop. There's also a range of colour-calibrated modes, as well as other presets like Game, Movie, Picture and so on.

BenQ are also quite proud of their special "Brightness Intelligence Plus" feature. It uses a sensor to gauge the lighting in your current environment and adjusts the brightness and colour temperature to suit. You have to enable the Low Blue Light picture mode or use HDR within Windows 10, but be warned: the Windows 10 experience with HDR isn't as nice as what you might expect coming over from a console. It's improved over the last few months, but the effect of enabling HDR on non-HDR content is jarring enough that you're better off just running the EW3270U in SDR mode and letting the monitor switch when appropriate.

From a general usage perspective, the biggest benefit is being able to have two full-width browsers or programs side by side; each half of a 4K screen is 1920 x 2160, more vertical pixels than an entire 1080p display. That's hugely useful whenever you need to refer back to something: maybe you want to watch a movie while doing some photo editing, or keep a lecture on one side while you review notes or textbooks on the other.

But the key point in all of this: there's one very good reason why you don't want the EW3270U as your lone monitor. Or any super-sized screen, for that matter.

Sorry not sorry for my desk.

Games, by and large, should be run in fullscreen. Whether that's exclusive fullscreen or windowed/borderless/fake fullscreen doesn't matter: if you're playing a game, you want it to occupy the entirety of your screen real estate.

And that's why I've never bought the argument for buying a single, gigantic screen (ultrawide or not) to replace a multi-monitor setup. It screws you a little if you're the kind of person who wants to share your gameplay: you can record a single fullscreen monitor with something like Nvidia's Shadowplay/Share tools, but you lose the finer controls you'd get from something like OBS or XSplit, which would let you monitor levels on a second screen.

Borderless fullscreen also isn't universally supported. It's an expected feature in blockbuster AAA games, but I've lost count of the number of indies built on Unreal Engine 4 or Unity that offer windowed mode, exclusive fullscreen and nothing else. In those instances you can play at 4K, but what if you want to track something on a second screen, like a podcast, emails or a Twitch channel?

And then there's the pricklier point: how many people have a gaming PC that can actually run 4K games at 60fps anyway? Answer: very few. Even graphs released by Nvidia to promote their upcoming RTX cards note that the RTX 2070 and 2080 aren't automatically over the 4K/60fps magic mark.
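To put some numbers on that, here's a quick back-of-the-envelope comparison. It's pure pixel arithmetic (GPU load doesn't scale perfectly linearly with pixel count, and the resolution/refresh pairings here are just illustrative), but it shows why the split makes sense: driving 1440p at high refresh asks for roughly the same raw pixel throughput as 4K/60, while 1080p asks for far less than either.

```python
# Raw pixel throughput for a few common resolution/refresh targets.
targets = {
    "1080p @ 60Hz":  (1920, 1080, 60),
    "1440p @ 144Hz": (2560, 1440, 144),
    "4K @ 60Hz":     (3840, 2160, 60),
}

for name, (width, height, refresh_hz) in targets.items():
    pixels_per_second = width * height * refresh_hz
    print(f"{name:>14}: {pixels_per_second / 1e6:,.0f} million pixels/second")
```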

So in all of these scenarios, it's better to have a smaller gaming monitor — either 1080p or, more ideally these days, 1440p — with a cheaper, larger second monitor exclusively for productivity. And if that second monitor has HDR10 support (like the EW3270U), then you've got a screen that serves as a nice alternate option for console gaming.

(There's also an understated advantage to having a smaller gaming monitor: most people will have spent the last 10 or so years playing at 1080p or lower resolutions, and if you're playing on a mouse and keyboard your muscle memory will be accustomed to that scale. It's not something that matters if you're playing an open-world RPG, but if you're moving up from a smaller monitor and playing something like CS:GO or Fortnite, you'll definitely notice the difference in your aim.)

But the final point in my argument for an ugly, asymmetrical monitor setup: the situation isn't going to get better anytime soon. Panel manufacturers are already looking ahead to 5K and even 8K displays. When those start coming out, and content starts being built with those resolutions in mind, gamers and PC users will face the same vexing problem they have now. If you want the higher resolution content, you can have it, but you'll pay a fortune, and you'll be going without the high refresh rates that make a huge difference in the middle of a game.

So it's best to toss that ideal right out the window. Forget the dream of having multiple, equally sized monitors with no bezels that all have 144Hz G-Sync support with perfect colour reproduction. You can't afford it. Even if you could, it's absolutely not worth the money. Having monitors fit for purpose is a much smarter buy.


Comments

    I always liked 2 different monitors back when I was doing graphic design, so I could run different colour tests. Even though I no longer do that, I still keep 2 different monitors.

      Came here to comment exactly this. The difference can be astounding, so having different colour profiles is a valuable tool for choosing less aggressive colours on high-contrast monitors, etc.

      That's a good point. I've run two monitors for ages and the colour and contrast on different monitors was vastly different. A couple (Asus and BenQ) seemed to be super-bright but washed out the colours as a result. I've been happiest with Samsung as a good all-round monitor through several different models.

      Back to the article: while FreeSync/G-Sync are definitely worthwhile additions, I'm not sold on the super high refresh/response monitors. I think you need to evaluate your requirements and visual acuity before worrying about that. If you're an average or casual gamer, I wouldn't worry about the super high speed monitors. That's like buying a Porsche to drive to the shops. If, on the other hand, you're into competitive or even professional gaming, then fair enough.

      Similarly, if you're an old fart it's probably not worth it since your vision is degrading to the point the added monitor speed is actually lost on you.

        Slight side point to this, but a few years back I was building a database here at work, and it was to be used by people who, for various reasons, refused to change from an 800 x 600 resolution. Mostly to do with other software we used that was hard to read at anything higher.

        Long story short, I used an 800 x 600 desktop picture (just a black screen with a red pixel along 2 edges. Still use it a decade later) so I knew the boundaries I had to work with when designing forms and such. What I'm getting at is that different people have different needs.

        Having 2 different monitors can represent the best and worst people might use, so from a development point of view it gives far more direct feedback on whether something visually works for everyone or not.

    There is another good reason to have different monitors: if there is a fault in that make and model, then only one monitor is affected and you still have the other to work with.

    I have two same-sized monitors, but one is 1440p for games and important stuff, the other is 1080p for secondary stuff and movies. Works well, and a lot cheaper than two super good monitors.

    There's an alternative I think that works better than what you posit.

    An ultrawide 34" with a 15" screen (Wacom Intuos or similar if you have the cash) down low.

    Gives you direct controls, you don't need to alt-tab out if running borderless to change something, and everything is in your direct line of sight.

    Back at my last job when I first started working there they still had a 4:3 monitor off to the side as the secondary. Worked great, most of the time the screen space is wasted since I don't have my browser full-screen anyway. Makes jumping from line to line so damn awkward when the text isn't forced into narrower columns (god damn do I hate when people share screenshots of such text).

      Wait not 4:3, 5:4! Wide screen is overrated, give me extra height.

        Which is why my monitor is in portrait mode.

          I tried doing that once with the monitors at college, but with the way the colours change more dramatically with vertical viewing angle rather than horizontal (while in landscape), it meant that I was basically getting a different picture for each eye and made it uncomfortable to look at.
