So 360Hz Monitors Are A Thing Now

Image: Nvidia

Consoles with support for 120Hz and higher frame rates? Bugger that, we’re already moving onto 360Hz.

The insane figure was announced by Nvidia on Monday afternoon, with ASUS releasing the first monitor with the panel under its ROG Swift line. It'll be available sometime later this year, with demo units on display at CES in Las Vegas.

It’s worth noting that the 360Hz part is made possible by panels from AU Optronics, so while ASUS will be first cab off the rank, expect companies like Acer, Alienware, HP, Lenovo, MSI, Samsung and more to follow soon. The ASUS ROG Swift will be a G-Sync capable screen, but we’ll see 360Hz models supporting (the much cheaper) FreeSync before long.

The news is actually something that Nvidia foreshadowed in Australia last year, when they presented a paper at SIGGRAPH Asia in Brisbane. Funnily enough, the abstract of that paper argued that reduced latency mattered more than raw refresh rate, with higher refresh rates being valuable “largely because of the latency reduction it provides”.

The importance of high refresh rate visualisation cannot be understated. In essence, what our study shows is not that high refresh rate is unimportant, but that high refresh rate is important largely because of the latency reduction it provides. We present the first study to carefully examine these variables independently at higher modern refresh rates (>60 Hz). We show that a lower latency system can provide a higher level of player performance in FPS targeting tasks. We also show small but statistically significant improvement in performance of some tasks at higher refresh rates. We believe our study is a good step towards replacing conventional wisdom in esports with objective knowledge.

So in short, you’ll still need a beast PC capable of powering games at absurd frame rates, which you can’t even really do with an RTX 2080 Ti today. But who knows? There’s an awful lot of chatter about new video cards from AMD and Nvidia on the way, and it’s about time Nvidia’s cards had a process shrink.


  • I’m so curious to know if the human eye could even perceive this refresh rate? Sure there’s silly arguments over 100hz (which you can definitely tell) and 144hz etc but 360hz? I’m genuinely curious.

    • The human eye can perceive up to 1000fps. Myelinated nerves can fire between 300 to 1000 times per second in the human body and transmit information at 200 miles per hour.

      While there are diminishing returns in the amount of difference you can perceive as the fps increases, most people can still feel it.

      • Just to further add: there’s a difference between being able to say something was there, and being able to identify it.

        An experiment with Air Force pilots on identification managed 1/220th of a second. (Badly) equating 1fps = 1Hz, this would imply that anything over 220fps/220Hz is endgame.

    • If you were designing a game that was relying on adaptive sync to provide a variable frame rate, then this could reduce the latency for displaying a frame to the user.

      With a 120 Hz display, on average there will be 4.17 ms until the next frame is displayed. With a 360 Hz display, this is cut down to 1.39 ms. Whether those milliseconds matter will probably depend on what other latency exists on the input/output path.
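      The arithmetic behind those figures can be sketched in a few lines. This is a simplified model, not a measurement: it assumes a frame finishes rendering at a uniformly random point in the refresh cycle, so on a fixed-interval display it waits, on average, half a refresh period before being shown.

      ```python
      def avg_wait_ms(refresh_hz: float) -> float:
          """Mean time (ms) a finished frame waits for the next refresh,
          assuming it completes at a uniformly random point in the cycle."""
          period_ms = 1000.0 / refresh_hz   # full refresh interval
          return period_ms / 2.0            # expected wait is half of it

      for hz in (60, 120, 144, 240, 360):
          print(f"{hz:>3} Hz -> avg wait {avg_wait_ms(hz):.2f} ms")
      # 120 Hz -> 4.17 ms, 360 Hz -> 1.39 ms, matching the figures above
      ```

      With adaptive sync the panel can often present a frame as soon as it's ready, so the real gain depends on how the rest of the input/output chain behaves.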

      • I imagine if you’re trying to create the next big twitch-shooter game to take on the likes of Counter Strike (which still has a feverpitch crowd addicted to it) or Siege or whatnot, that would be incredibly important. All this information is fantastic to find out 🙂 Thank you for sharing it 🙂

  • I’ve been chasing 120+fps for a while. Some games lock to prevent it, and the horsepower required is pretty intense… but when it works, when you actually obtain it, it’s like watching something in fast-forward, only it’s real-time. My eyes didn’t quite know what they were seeing, the first time. Like… the character isn’t actually moving any faster, but it looks like it is. Everything is so smooth, it’s a night and day difference.

    I can’t believe that anyone could argue that they can’t see the difference. Going back to 60 just… hurts. It’s painfully, sluggishly slow, like reality just lost frames, like a highly-advanced impressive stop-motion. It’s like needing glasses, having them for a few days, then going back to your shitty eyesight.

    My goal for every upgrade I buy is to chase that ludicrous FPS. I can’t wait til we’re surpassing 100 as the base-line. I want to live in that future.

    • Funnily enough I recently got that feeling on an Xbox One X. It wasn’t that it achieved some crazy high, it’s that Fallen Order was running horribly until I capped it at 1080p. I’d been putting up with the bad frame rate for a few hours before I realised there were actual graphics options in the settings. My eyes had adjusted to it, so when I turned the graphics down to minimum and got a stable high frame rate it felt like every animation was playing at double speed.
      I wouldn’t say I’ve been dismissive about frame rates but I’m used to starting and ending a game with the same frame rate, I don’t really put much effort into optimisation, so frame rate tends to just merge into my overall experience (unless it’s spiking down or something). It was a huge reminder of how important the difference is. I’ll still take stable low over unstable high but I have to remember that it’s always worth going the extra mile for stable high.

    • I sort of can in most titles (3950x + 2080 Ti), but playing RDR2 recently has me down around 60-70fps, and it’s rough. Feels all floaty and gross.

    • yeah since getting a 144hz monitor, looking at 60fps is as bad as i used to perceive 30fps when 60fps was my standard. 85-90fps is now my new baseline 60fps equivalent, any lower and it becomes too noticeable.
      hate to think how bad 30fps looks now…

  • Although I agree more frames are better, this monitor is fairly redundant unless you have the tens of thousands of dollars to buy the hardware to push this amount of frames in most current titles.
