It took a while, but manufacturers eventually gave gamers monitors larger than 23″ with the high refresh rates they’ve come to know and love. But the bar didn’t stop there. People are looking towards a 4K future, and they’re not prepared to sacrifice their buttery smooth 144Hz for it.
The problem is that’s too much for the current generation of monitors. But it looks like a 4K, 144Hz future isn’t too far off.
The first manufacturer to show off a prototype is ASUS, who demonstrated the monitor to German website PC Games Hardware at their Republic of Gamers booth. You can see the details in the first minute of this video.
The trick here is that the monitor uses DisplayPort 1.3, which supports 5K screens at 60Hz and 4K resolutions running at 120Hz. It’s not the latest DisplayPort standard (that’s 1.4), but 1.4 hasn’t been around long enough for manufacturers to be able to support it.
It’s also massive overkill for the content currently available. The thought of having a monitor that can do 8K at 60Hz (which is what DP 1.4 supports) is nice, but there’s no 8K content to watch. My PC certainly can’t play games at 8K. Consoles sure as hell can’t. My internet doesn’t even like streaming videos in 4K; imagine how long it’d take to buffer something in 8K.
ASUS’s 4K 144Hz beauty, in the meantime, will run off an IPS panel produced by Taiwanese manufacturer AU Optronics. That means excellent viewing angles and colour reproduction (if you’re not familiar with PC monitor lingo), although it’ll undoubtedly have a price to match.
The spec sheet viewable in the video also reveals that you’ll be able to tilt, swivel, pivot and adjust the monitor’s height from the stand. I’d be surprised if something with all of these features was available for less than $1000 in the current market, especially since the ASUS ROG Swift 27″ monitors with G-Sync start from $932. Hell, you can even spend $2000 on a 34″ 100Hz IPS screen if you want.
Comments
37 responses to “ASUS Will Apparently Have The World’s First 144Hz 4K Monitor”
Why bother having a 4K screen at 144Hz when almost nothing can be played at 4K at 144fps? It makes no practical sense. I guess one could argue it’s future-proofing, but as of right now there is nothing that can support those frame rates at 4K without massively dropping the graphics quality.
Plenty of games can be played at 4K 144fps, just not necessarily at ultra quality. But that aside, the inverse argument to yours is “why bother having a graphics card that can output 4K 144fps when there aren’t any monitors that support it?” Someone has to make the first move.
Plenty of OLD games, you mean? There are no current titles that can be played in 4K @ 144fps on a single card (unless it’s a simple game that purposely requires little to no horsepower). Most reviews that I’ve seen of the GTX1080 put its capabilities around the 35fps – 55fps range for 4K depending on the title (and that’s just the averages, you still haven’t taken into account how far down the minimums can bottom out). Sure, get two GTX 1080s, and if you’re lucky and the game scales well then you will start to see numbers consistently over 60fps … but certainly almost nothing that could hit 144fps, let alone consistently (once again, unless it’s an OLD game). Don’t forget that new games are coming out all the time that constantly push graphical boundaries and cripple performance at 4K even further (i.e. look what Tomb Raider did to the GTX980Ti, one of the best cards you could get at the time … 4K brought it down to 25fps – 30fps). New cards aren’t always getting ahead of that trend; they are often barely keeping up with it.
Yeah, I get where people like you are coming from – ‘it all has to start somewhere’. I understand that’s how technology works, otherwise we would all still be using Apple II’s and Commodore 64’s. But a lot of people talk like 4K is the new standard, and say things like, “4K or GTFO”. I feel like people just want the latest and greatest because it’s simply that – the latest and greatest. I guess I’m practical to a fault, and seeing people think that the bleeding edge ‘is the new standard’ is just stupidity. Go to the Steam hardware survey results and tell me what the REAL standard is, because I can tell you that it isn’t 4K (less than 4% of users run 1440p or higher, and less than 2% game @ 4K). It will be a long time before 4K is the actual standard for PC gaming. The bar this monitor is setting is even higher again. That’s the point.
No, I don’t mean old games. GTA5 on low runs at 4K at around 100fps on a 970 and around 160fps on a 980ti. The 1080 takes that even higher. Metro: Last Light and Hitman: Absolution both have similar results. I don’t have Rise of the Tomb Raider to test with so I can’t comment on that specific title. The 1080 benchmarks you’ve seen putting it at 35-55fps at 4K are for games on max settings, often including AA (which you shouldn’t enable when you’re running 4K).
A lot of current games can easily get 4K@144fps on current GPUs, just not at ultra quality (as I said in my earlier post).
On low? Seriously? Who is going to buy a $1000+ card to play games on low settings, whatever the resolution? Even if you bump those settings to a mix of medium and high (not ultra), you still won’t hit anything near 144fps @ 4K. A quick search on YouTube finds me the following:
1) https://www.youtube.com/watch?v=77jd7Ku7tmA Medium settings on The Division @ 4K on a GTX980Ti – getting less than 60fps.
2) https://www.youtube.com/watch?v=S5sU575nHj4 Medium settings on The Ark @ 4K on a GTX980Ti (around the 1 min mark) – barely getting 60fps.
3) https://www.youtube.com/watch?v=LCOs-UJskUM Medium settings on Assassin’s Creed Syndicate @ 4K on a GTX980Ti (19:54 mark) – barely getting around 60fps.
4) https://www.youtube.com/watch?v=0E3Heb1HmIM A mix of medium and High settings on Far Cry Primal @ 4K on a GTX980Ti – an average of 48fps
5) https://www.youtube.com/watch?v=TyeSWceptY8 Quantum Break @ 1080p on both GTX980Ti and GTX1080 can’t even reach 60FPS … @ 1080P! I’ve read that you need to drop all settings quite significantly @ 4K to even come close to getting 60FPS, even on a GTX1080. Once again, what’s the point of having settings on LOW? Another example of a game that pushes the absolute limits of even the best cards on offer.
This was all just from a quick YouTube search for GTX980Ti performance (a GTX1080 might get you about 10fps more @ 4K). So even these few examples of modern games @ 4K on MEDIUM settings don’t come close to 100fps on good hardware, let alone 144fps. And no one is dropping to LOW settings just to play in 4K either … that kind of defeats the point.
But you can live in ‘you-land’ and pretend most titles hit 144fps @ 4K all you want. I’d personally rather play with higher settings, consistent frame rate, and lower resolution. It’s not as though everything below 4K looks horrible – it’s only the people who worship 4K who believe that.
I’m not sure why you’re getting so worked up. I said in the very first reply that it’s possible with reduced settings. I have a 1080, I’ve tested the titles I mentioned above myself last night with DSR, which adds additional overhead over standard 4K rendering. GTA5 in particular still looks pretty good at low/medium settings, it’s mostly just ground clutter that disappears. Several current titles can be played at that resolution and frame rate and still look decent.
I know plenty of people that play at reduced graphics settings on 4K monitors to ensure high frame rates, so saying “no one is playing @ LOW settings” on 4K just isn’t true. You might not, but you’re not everyone.
Playing any game on low graphics just to hit an fps count is silly, especially if you have a great card. I have a Titan X and I definitely did not get it to play on low graphics, and even this card struggles to hit 60fps on most modern games at 4K (look at reviews). I would much rather play at 1440p on max settings at 120Hz+ than play on low. That is an insult to your hardware.
Different people have different priorities. Some prioritise high resolution even if the frame rate suffers. Some prioritise high frame rate even if it means lower resolution. In competitive FPS it’s often a combination of resolution and frame rate, at minimum quality settings. Personally I prioritise frame rate > resolution > graphics settings. You evidently prioritise graphics settings first, and that’s fine. Suggesting people with different priorities to you are ‘insulting their hardware’ is just a dick attitude though.
A lot of people like to use their gaming monitors for actual Windows use at the same time, and moving windows around, or even the mouse cursor, is so much more pleasant and smooth at 144Hz than it is at 60Hz. I would never go back to a 60Hz monitor because it looks like a strobe light compared to what I’m used to. I’m old enough to have used CRT monitors, and I’m thankful that high refresh rates are finally back, because anything below 120Hz is painful.
Also, you can always scale down to 1440p or 1080p for high refresh rates at ultra on a 4K monitor if you really want the high fps, but then you still have the option of playing at 4K when you really want some eye candy. I’m waiting for a 32″ 4K 144Hz monitor, and boom, I will buy it!
Who would game 4K w/ a single 980TI? 4K gaming is primarily an SLI affair.
“There are no current titles” You mean no current AAA titles. Most games that come out aren’t as demanding as Rise of the Tomb Raider, The Witcher 3, The Division, etc.
From what we’re seeing in benchmarks, a couple of GTX 1080s in SLI will give it a red hot go in a few recent games. Content creators and people with workstation cards could also get something out of this monitor.
Definitely worth it for someone with deep pockets. Give it 12 months and we’ll see the technology filter through to cheaper screens and cheaper cards.
Lots of games can be played at those settings. The hard part is having the hardware to do it justice. At least 3 x 1080s I would guess.
For Pascal cards, Nvidia only enables 2-card SLI; that *might* change via NVLink with Big Pascal and Volta cards.
Maybe because there’s games that aren’t those bullshit demanding AAA titles that I want to play at 4K over 60 fps. I don’t want to buy a 4K 60hz monitor and have to replace it with 4K 120 or 144hz later on.
If you have a 4K 144Hz screen then you’ll probably never need to buy a new monitor ever again.
yogalD: Almost true. But for pictures and web browsing, 4K is very good but not excellent. I’ll always be looking for 8K.
144hz is useful right down to general use of the system, allowing very smooth appearance of window animations and mouse movement.
The benefits in gaming are just icing on the cake, and the 1080 and current Titan X are both capable of delivering them in some games. My 1080 nets me upwards of 100 FPS in Payday 2 with a few effects turned down a notch; it’s just a shame my 4K screen can’t display it.
You don’t need to have 144fps to see the effect of 144hz
Actually by definition you do.
0/10
So everyone perceiving a noticeable difference while gaming at 60 fps is delirious?
You don’t even get the privilege of a score; you’re not qualified to take the test. I suggest you hit the gym and never return.
The reason why 120 or 144Hz feels smoother and less choppy than 60Hz (even at 60fps) is because you have twice or more the “windows of opportunity” to display each image.
You can display 60 images per second, or 120. Smoothness is not only determined by IF you can display them, but also by WHEN. 120Hz displaying 40fps will ALWAYS be smoother than 60Hz, you basic tool.
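The timing argument in this comment can be sketched numerically. The snippet below is a simplified model (my assumption: plain vsync, where each frame appears at the first refresh tick after it is ready) showing why 40fps paces evenly on a 120Hz panel but judders on a 60Hz one:

```python
import math

def display_durations(fps, hz, frames=8):
    """With simple vsync, each frame appears at the first refresh tick
    at or after the moment it's rendered; return how long (in ms) each
    frame stays on screen."""
    refresh = 1000.0 / hz  # ms per refresh tick
    shown = [math.ceil(i * 1000.0 / fps / refresh) * refresh
             for i in range(frames + 1)]
    return [round(b - a, 1) for a, b in zip(shown, shown[1:])]

# 40fps on 60Hz: frames alternate between 1 and 2 refreshes -> judder
print(display_durations(40, 60))   # [33.3, 16.7, 33.3, 16.7, ...]
# 40fps on 120Hz: every frame held exactly 3 refreshes -> even pacing
print(display_durations(40, 120))  # [25.0, 25.0, 25.0, 25.0, ...]
```

The 120Hz panel holds every frame for an identical 25ms, while the 60Hz panel is forced to alternate 16.7ms and 33.3ms holds for the same content.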
There are older games that could handle it. I play games from only 4 years ago at 4K DSR at 120FPS on a GTX1080, so a GTX1080Ti could run 2015 games well above the 60Hz barrier.
I bet it will be a long time after 4K 144Hz hits the rest of the world before 4K above 60Hz is available in Australia, and there will be some serious OzTax(TM) on it too, that I can assure you. So don’t be surprised if it reaches Australia in 2021 with a $4,500 AUD price tag.
I am also concerned that in the future, tech companies will hide GPS region locks in their hardware, disguised as regular components, to stop Aussies from dodging the OzTax(TM) by ordering from overseas before products are commercially available in Australia. That’s my big fear for the future. I’d hate to order a great monitor from the US or UK, only to get a display that just says “This hardware is designed to not function in your region. If you purchased this from a local retailer, contact them immediately.”
How much extra load would 60Hz vs 144Hz actually put on a graphics card? I realize going from 1080 to 4k requires a fair bit more grunt.
I don’t have the exact statistics, but for what it’s worth DisplayPort 1.3 can carry up to 32.4Gbps but doesn’t even support 4K 144Hz. It tops out at 4K 120Hz.
I’m guessing that it has the hardware necessary to support DSC in the DP 1.4 standard so it can be updated to 1.4 compliance with a firmware update in future. 4K 144Hz requires about 36.5Gbps but DSC would reduce that by half to two thirds. Failing that, the only way it could support 4K 144Hz right now would be dual connectors.
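As a rough sanity check of the figures in this thread, here is a back-of-the-envelope calculation (my assumptions: 24 bits per pixel, blanking intervals ignored, so real link requirements are somewhat higher than these raw payload numbers):

```python
def payload_gbps(width, height, hz, bpp=24):
    """Raw pixel data rate in Gbps, ignoring blanking and protocol overhead."""
    return width * height * hz * bpp / 1e9

# DP 1.3's 32.4Gbps total link rate leaves ~25.92Gbps after 8b/10b coding.
DP13_EFFECTIVE = 25.92

for hz in (60, 120, 144):
    rate = payload_gbps(3840, 2160, hz)
    verdict = "fits" if rate <= DP13_EFFECTIVE else "exceeds DP 1.3"
    print(f"4K @ {hz}Hz: {rate:.1f} Gbps ({verdict})")
```

Even before blanking overhead, 4K 144Hz (~28.7Gbps of pixel data) overshoots DP 1.3’s usable bandwidth while 4K 120Hz (~23.9Gbps) squeezes in, which matches the article’s claim that DP 1.3 tops out at 4K 120Hz.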
Frame rate is a linear load. 144 frames per second would require 2.4 times the processing power of 60 frames per second.
Frame rate is a linear relationship, i.e. double the frames means double the load.
Resolution exists in 2 dimensions, so going from 1080p to 4K is 2 x 2 = 4 times the pixels…
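The scaling described in these replies can be combined into one rough estimate. This assumes GPU cost scales with pixels drawn per second, which real games only approximate, so treat these as ballpark ratios:

```python
def relative_load(w, h, fps, base=(1920, 1080, 60)):
    """Relative GPU load versus a 1080p/60fps baseline, assuming cost
    is proportional to pixels rendered per second."""
    bw, bh, bfps = base
    return (w * h * fps) / (bw * bh * bfps)

print(relative_load(1920, 1080, 144))  # 60 -> 144fps at 1080p: 2.4x
print(relative_load(3840, 2160, 60))   # 1080p -> 4K at 60fps: 4.0x
print(relative_load(3840, 2160, 144))  # both at once: 9.6x
```

So 4K at 144fps asks for roughly 9.6 times the fill-rate work of 1080p at 60fps under this simple model.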
Talk about future proofing, damn. Current cards can’t even hit 4K 60fps reliably on all new games on anything other than Low.
Obviously running 4x SSAA on a 4K monitor is silly, but nobody in their right mind would play at 4K on Low. You’d be better off running 1440p and higher settings.
Game dependent of course – I know a lot of games see little to no difference between Ultra and Very High, for example. But Low is pretty much potato quality across the board.
Depends on how well optimised the game is. GTA5 is a good example of a very well optimised engine, you can mix settings at 4K there and still scrape 150, and ground clutter aside it still looks quite decent. There’s a good chart I saw a while back that listed every setting and its frame rate impact in a big bar graph, it’s good for fishing out the ones you can up to high/max and the ones you have to turn down. Naturally you’d play with AA off at 4K.
It’s not universal that low settings in games are potato quality across the board. Some games are like that, but I remember Far Cry 4 in particular had a fairly narrow range between low and ultra, as I had a malfunctioning GPU at the time and was forced to play it on lowest settings. It comes down to how scalable the engine is, ultimately.
How dare you insult potatoes?
Is it 144Hz for a reason? 50/60 and 100/120 make sense from mains frequency, but 144?
As far as I know it’s left over from cinema content and hardware. 144Hz is double 72Hz, which is a very common cinema update rate as it refreshes each frame of a 24fps film exactly 3 times. 144Hz first appeared in cinema as a way of getting 3D imagery onto the screen while keeping the ‘3 updates per frame’ cycle for each image.
The fact it already existed as a standard is probably the only reason it was selected for computer monitors. If you want to go above 120Hz, you either make up a number yourself or use one that’s already out there.
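The cinema reasoning above boils down to divisibility: only exact multiples of 24 show every frame of a 24fps film an equal number of times. A quick check of the common rates:

```python
# Which common refresh rates divide 24fps film evenly? Rates that don't
# require uneven pulldown (e.g. 3:2 pulldown at 60Hz) to fit the content.
for hz in (60, 72, 120, 144):
    repeats, remainder = divmod(hz, 24)
    if remainder == 0:
        print(f"{hz}Hz: every film frame shown exactly {repeats} times")
    else:
        print(f"{hz}Hz: uneven pulldown needed")
```

72Hz, 120Hz and 144Hz all pass (3, 5 and 6 repeats per film frame respectively), while plain 60Hz does not.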
Fair enough. Does make sense.
Screw the cost – let’s have some 40″ (or larger) 144hz 4K monitors!
At that point, for now, a 55″ OLED 4K Dolby HDR TV from LG is your best option for that.
With GTX 1080s in SLI, Battlefield 4 @ 4K 144Hz will get you your ~144FPS 😀
Or a new Titan X?
If you can buy a monitor for $2000, you can buy GPUs for the same price.
Maybe I just want the 4K for sharp text editing and want to play games on it at 1440p 144Hz. I wouldn’t want to buy two monitors, one for each purpose.
I’m torn between a new monitor like this and a higher-res VR unit.
WHY NOT BOTH
I need a 4K 120Hz HDR FreeSync 2 pls
I’ve already spent enough money on a 4K 60Hz monitor 😛