Turns Out Battlefield Runs Just Fine At 8K, 60 FPS

Image: YouTube (ThirtyIR)

I mean, you'll need almost $4000 worth of GPU power in your system to do it. But hey! One can dream.

YouTuber ThirtyIR got their hands on a pair of RTX 2080 Tis - which cost $1800 each at the time of writing - and decided to see how far they could push them.

So they fired up Battlefield 1. At 7680x4320. With all settings maxed, including HBAO ambient occlusion. There's some stutter in the frame times, but they mostly hover around 10ms, with the average frame rate sitting between 60 and 70fps for the most part.

The video itself is not in 8K, which would be cruel to anyone on Australian internet. But the whole gameplay was livestreamed, just to give everyone a case of envy.

"In 8K these cards really shine," ThirtyIR said. "The Titan XPs, although they played really well and I could do 4-way SLI, the overall smoothness and gameplay seems to be quite a bit better with the RTX 2080 Ti."

There is actually a video of Battlefield 5 gameplay at 8K, using a Titan XP SLI setup. The video itself is in 8K as well, although you have to use Microsoft Edge to watch it at such a stupid-high res.

Jesus wept.

Naturally, this isn't the only game ThirtyIR has played at stupid-high resolutions just because they could. Here's Max Payne 3 at 8K:


Comments

    Enough! I have to really try to tell the difference between 1080p and 4K on a large tv at very close range.

    As long as we're still looking at a flat screen contained in a fixed frame there's not much point going higher than that.

    Still kinda cool.

      The assets themselves make a difference too. Sooner or later a lot of these things will be rendered internally at a much higher res before being downsampled to 4K, rather than being upscaled to 4K or rendered natively at 4K, and there's a significant bump in quality there (but also substantially more load on consoles/PCs/memory etc).

        This is called supersampling, if anyone's curious and didn't know. It's already used as the highest-quality anti-aliasing technique, just mostly at lower output resolutions like 1080p. If monitor resolutions ever stop going up, supersampling is the next biggest frontier for GPU power allocation.

          That's probably an area where neural networks and deep learning come into play more too. Nvidia showed off some of that tech with RTX (it's part of how DLSS works) but AMD will surely jump on that train before long.

            Absolutely. DLSS gets crazy good AA for such a low performance hit.
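For anyone curious what the supersampling described above actually does, here's a minimal sketch of the downsampling step. The `render_fn` parameter is a hypothetical stand-in for a game's renderer; a real SSAA implementation lives on the GPU, but the maths is just this:

```python
import numpy as np

def supersample(render_fn, out_w, out_h, factor=2):
    """Render at `factor` times the target resolution, then box-filter down.

    `render_fn` is a stand-in for the renderer: it takes (width, height)
    and returns an (h, w, 3) float array of pixel colours.
    """
    hi = render_fn(out_w * factor, out_h * factor)
    # Average each factor-by-factor block of the high-res image into one
    # output pixel -- this averaging is what smooths out jagged edges.
    return hi.reshape(out_h, factor, out_w, factor, 3).mean(axis=(1, 3))
```

Rendering internally at 2x in each dimension means pushing 4x the pixels, which is why it costs so much GPU power relative to cheaper approximations like FXAA or DLSS.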

      It'll depend on how big a screen you're looking at. You might not see much difference between 4K and 1080p on a 50" tele for example, but it's much clearer on something like a 65" tele.

      Think of the wallpaper on your computer. If you have a 960x540 picture as your wallpaper with a 1920x1080 resolution, you might see pixelation on a 27" monitor, but not so much on a 21" monitor.

      As someone with a 4K computer monitor, I can quite easily tell the difference between a game running at 4K and at 1080p.
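The screen-size point above comes down to pixel density. A quick back-of-the-envelope helper (`ppi` is just an illustrative function, assuming a flat rectangular panel):

```python
import math

def ppi(h_pixels, v_pixels, diagonal_inches):
    """Pixels per inch of a panel, from its resolution and diagonal size.

    Density = length of the pixel-grid diagonal divided by the
    physical diagonal of the screen.
    """
    return math.hypot(h_pixels, v_pixels) / diagonal_inches

# The same 1920x1080 image is packed tighter on a smaller panel:
print(round(ppi(1920, 1080, 27)))  # ~82 ppi on a 27" monitor
print(round(ppi(1920, 1080, 21)))  # ~105 ppi on a 21" monitor
```

Lower density means each pixel is physically bigger, which is why the same low-res wallpaper looks blockier on the larger screen.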

    Wait, how does an RTX do 8K when a GTX 1080 Ti can barely do 4K at 50fps?

