Nvidia’s RTX 2080 Ti, RTX 2080 And GTX 1080 Ti Compared

Ray tracing is all well and good, but how many frames for your buck do you actually get from Nvidia’s new RTX 2080 Ti and RTX 2080 cards?

After the company’s biggest launch at Gamescom this year, I’ve been slowly working my way through tests with what most people would consider the three flagship gaming cards: the Founders Editions of the RTX 2080 Ti and RTX 2080, which cost $1899 and $1299 respectively. I’ve been running those against a GTX 1080 Ti, the previous gaming flagship, which you can grab locally for somewhere between $1150 and $1250 depending on brand.

[referenced url=”https://www.kotaku.com.au/2018/09/part-of-nvidias-pitch-games-can-get-better-looking-over-time/” thumb=”https://www.kotaku.com.au/wp-content/uploads/sites/3/2018/09/DSC01472-410×231.jpg” title=”Part Of Nvidia’s Pitch: Games Can Get Better Looking Over Time” excerpt=”About a hundred or so journalists, YouTubers and other tech media had just sat through about three hours of dense presentations. It was the middle of the Nvidia Editor’s Day, which was essentially a day where various Nvidia executives break down the architecture of their upcoming graphics cards in exhausting detail.”]

But when the Nvidia cards first launched, there were two parts missing: games that supported ray tracing, and an all-too-crucial Windows update enabling ray tracing features within DirectX 12. That Windows update turned out to be a horror show for Microsoft, with the company having to roll back the update not once, but twice, for system stability.

So when you take ray tracing out of the equation – which I’ll be doing for the purposes of this article – how do the RTX cards stack up against modern games? Nvidia released some preliminary graphs and slides showing that the RTX 2080 could handle games like HITMAN and Final Fantasy XV at or above 60fps at 4K with HDR.

[referenced url=”https://www.kotaku.com.au/2018/08/nvidia-rtx-2080-benchmarks-gamescom-2018-sort-of/” thumb=”https://www.kotaku.com.au/wp-content/uploads/sites/3/2018/08/upload1-2-410×231.jpg” title=”Some Benchmarks (Kind Of) From Nvidia’s RTX 2080″ excerpt=”When Nvidia launched their RTX 20 series cards prior to Gamescom kicking off, there was a notable element missing: benchmarks. Specifically video game benchmarks, a reliable go-to for people assessing the worthiness of a new GPU. Following a closed-door session with the press, the GPU maker released some more figures about how their cards perform in the real world. Sort of.”]

But in the real world, with real world drivers, released games and publicly available synthetic tests, how far does your $1300 or $1900 go?

Before we get into the tests, here’s the system used. It’s what most people would consider a good gaming rig, but it’s not the best. The weak point is the 7900X CPU, a 10-core Intel offering. It plays games fine, but it won’t get the same results as a six or eight-core offering that can run at a higher clock speed (like the recently released i9-9900K, or the popular i7-8700K, which has fewer physical cores but a higher turbo clock speed than the other chips). Keep that in mind as you digest the results below.

Benchmark System:

  • CPU: Intel i9-7900X (stock speeds)
  • RAM: 32GB DDR4 3200MHz G-Skill TridentZ RGB RAM
  • GPUs: GTX 1080 Ti Founders Edition / RTX 2080 / RTX 2080 Ti Founders Edition
  • Motherboard: Gigabyte AORUS Gaming 7
  • Monitors: Acer X27 4K HDR 144Hz / EIZO 23.5″ 240Hz VA monitors
  • PSU: EVGA Supernova G2 850W
  • GPU Drivers: 416.16 (October 4 2018)

Many thanks to Nvidia for also supplying the Acer X27 Predator screen for this testing.

For clarity: the 7900X is running on stock clock speeds on a Corsair H100i liquid cooler, while the RAM is running at 14-14-14-34-1.35V (confirmed with CPU-Z). G-SYNC was disabled for all tests, and the GPU was set to Maximum Performance in the Nvidia Control Panel.

The tests and games used were:

  • 3D Mark (Fire Strike, Fire Strike Ultra, Fire Strike Extreme)
  • Forza Horizon 4 (DX12)
  • Total War: Warhammer 2 (DX11)
  • Shadow of the Tomb Raider (DX12)
  • Middle-earth: Shadow of War

At the time of writing, the Final Fantasy XV DLSS benchmark was available privately but not publicly. It’s since been released publicly, but DLSS support has not been (and may never be) patched into the full game. I’ll also be running some 4K-specific tests with RTX-enabled games and other recent AAA titles, like Battlefield 5, on newer drivers at a later date.

As for the games chosen, I opted for this mix because it covers a variety of engines. There’s a spread of DX11 and DX12 usage across the board – some, like Warhammer 2 and Shadow of the Tomb Raider, support both – and each game is built on an in-house engine. Almost all of the games in this test are also rather well optimised, with the exception of Total Warhammer 2. Creative Assembly’s Warhammer RTS is more CPU-bound, but it’s also the type of game that attracts gamers who spend more on their PCs than most, and I’ve kept it in rotation for now.

Because of the time of year and my general daily workload, I wasn’t able to broaden testing to an Ubisoft title or an Unreal Engine-based game, and Battlefield 5 wasn’t available at the time I ran these tests. I’ll be looking to run more coverage featuring some of those games soon, though.

All games were tested across 1080p, 1440p and 4K using the three highest presets available in each game. 3D Mark has no such presets, but as the different tests run at different rendering resolutions, you get the same effect.

All tests were also run several times, with outlier results discarded. Some tests are more consistent than others – Shadow of War tends to return similar results whether you run it 17 times or 70 – but this was done to help avoid issues of variance. I also disabled automatic updates where possible on each of the games (which is easy to do for games running through Steam) to avoid inconsistencies with future performance improvements.
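For those curious about the mechanics, here’s a minimal sketch of that averaging process in Python. The trim count and FPS figures are illustrative only, not my actual numbers or tooling:

```python
# A minimal sketch of the approach described above: run the benchmark
# several times, discard the outliers, and average what's left.

def average_fps(runs: list, trim: int = 1) -> float:
    """Average benchmark runs after dropping the `trim` highest and
    lowest results from either end."""
    if len(runs) <= 2 * trim:
        raise ValueError("Not enough runs to trim outliers from")
    kept = sorted(runs)[trim:len(runs) - trim]
    return sum(kept) / len(kept)

# Five runs of a hypothetical benchmark, with one high outlier.
print(round(average_fps([97.2, 98.1, 97.8, 104.6, 97.5]), 1))  # 97.8
```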

This proved to be particularly beneficial with Shadow of the Tomb Raider: a future update actually caused stability issues for Nvidia owners resulting in the game complaining about memory errors, which I discovered after I patched the game post-testing. Fortunately, Square Enix allows people to roll back their game to older versions through Steam’s beta settings, a move that more developers should consider supporting.

One important factor: these tests were run without HDR. I’ll explain why after the results. Dynamic resolution was also manually disabled in games where that was an option, to ensure consistency.

Let’s begin with the synthetic figures. Click or press on the benchmark graphs below if you need to expand them for readability. All figures are reported in average frames per second.

3D Mark Fire Strike

3D Mark is the synthetic test du jour when it comes to gaming rigs. It’s split into multiple tests that stress different parts of the system and GPU, before the combined fight-scene test that you probably saw once in a shop window as a kid.

The RTX 2080 Ti is the king of the pack here, and it remains that way for the remainder of the tests. The advantages the GTX 1080 Ti has over the RTX 2080 – two cards that will be largely neck and neck for the rest of the results you’re about to see – are the extra 3GB of VRAM, slightly more memory bandwidth (484GB/s versus the RTX 2080’s 448GB/s) and a wider memory bus.
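If you’re wondering where those bandwidth figures come from, it’s simply the memory bus width multiplied by the effective memory speed. A quick sketch of the arithmetic, using the published specs (352-bit/11Gbps GDDR5X on the GTX 1080 Ti, 256-bit/14Gbps GDDR6 on the RTX 2080):

```python
# Memory bandwidth = bus width (bits) / 8 bits-per-byte * effective speed (Gbps per pin).

def memory_bandwidth_gbs(bus_width_bits: int, speed_gbps: float) -> float:
    return bus_width_bits / 8 * speed_gbps

print(memory_bandwidth_gbs(352, 11))  # GTX 1080 Ti: 484.0 GB/s
print(memory_bandwidth_gbs(256, 14))  # RTX 2080: 448.0 GB/s
```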

The RTX 2080 is higher clocked, though, and in the majority of instances it edges out ahead of the GTX 1080 Ti while having the bonus hardware for futureproofing. The Fire Strike tests, however, are one area where it fell just behind. The difference is fractional, though, and within the margin of error. It’s also common to see incremental performance jumps with future drivers, so keep that in mind as we move forward.

Shadow of the Tomb Raider

Lara’s latest adventure was scheduled to be one of the first games with ray traced shadows, and it looked a treat over at Gamescom. Shadow hadn’t been updated with ray traced shadows at the time of testing, but the in-game benchmark is improved over the one that shipped in Rise of the Tomb Raider, offering a more representative recreation of in-game performance as Lara traverses town centres and jungles.

When Nvidia proclaims that the “4K 60fps” dream has been realised, this is generally the kind of result they’re talking about. The frame rate dipped below the 60fps waterline on the Ultra preset, but it’s here that I’d remind everyone: the 7900X is not the best gaming CPU around. If these tests were run with an i7-8700K, one of the newer i9 CPUs, or the stellar Ryzen 7 2700X all-rounder, the 2080 Ti would have more headroom at Ultra settings.

As for the RTX 2080 and GTX 1080 Ti, I’d actually consider just staying at 1440p. Having that extra overhead is important for the most intensive scenes, something averaged benchmark figures can mask, and it matters given the unpredictable nature of gameplay. A solid 60fps when the sun is setting over the horizon is nice. A solid 60fps in the heat of battle is much, much better.

Middle-earth: Shadow of War

Monolith’s orc-slaying/dominating simulator can be quite the looker when all the textures are bumped up to their highest. It’s also a fun game in its own right, particularly now that the more egregious elements have been patched out and some solid expansions have been released.

Shadow of War has an in-built benchmark that runs through a single unbroken scene, flying through some vegetation before diving into a bloodied castle mid-battle and approaching an orc boss in full chain armour. The game also supports HDR, provided you’ve enabled the requisite setting in your Windows display settings.

A well optimised game, and one that all three cards should have no problem enjoying at 4K. The RTX 2080 Ti has far and away the most headroom, although it’s still a fantastic looking game at High settings, and that buttery smooth goal of 144fps (relevant for those with high refresh gaming monitors) is well within reach for all of the flagships here.

Chain armour does look real nice at 4K, though. I’m looking forward to replaying this later this year when I have a bit of time off.

Forza Horizon 4

It hasn’t gotten quite as much praise as it should have, but holy shit is Forza Horizon 4 well optimised. It’s almost at DOOM levels of performance for how well it runs across these three cards, and I would expect similarly great results for users with the RTX 2070, GTX 1070, and AMD cards too (given that Playground Games would have a lot of experience optimising for the AMD hardware in the Xbox One X).

Even better: Forza Horizon 4 has one of the best in-game benchmarks, replicating a short race with AI which isn’t too dissimilar from actual gameplay. And it’s a great showcase for just how well all three of Nvidia’s cards perform: all three are capable of maintaining well above 60fps at 4K, at any preset.

The gap between the GTX 1080 Ti and RTX 2080 narrows as Forza Horizon 4 eats up more VRAM, which is to be expected when the resolution starts picking up. It’s also a good reminder of the frame rate hit separating the highest possible presets from the second or third-best option.

A game running at 4K on High is going to look better than 1440p on the Ultra preset: you’re getting sharper textures, anti-aliasing algorithms don’t have to work as hard, and clarity improves because you’re playing at the screen’s native resolution (assuming you’re playing on a 4K screen).
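The raw arithmetic behind that claim: 4K pushes 2.25 times the pixels of 1440p, which is why the jump in sharpness is so noticeable. A quick sketch:

```python
# Pixel counts for the three tested resolutions, and the ratios between them.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(pixels["4K"] / pixels["1440p"])  # 2.25x the pixels of 1440p
print(pixels["4K"] / pixels["1080p"])  # 4.0x the pixels of 1080p
```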

And even then, I’d still recommend downsampling when the results are this good.

Total War: Warhammer 2

An old favourite, Total Warhammer is a CPU-heavy game that throws tons and tons of units onto the battlefield while all manner of explosions, effects and spells decimate the land. For these tests, I’ve used the heavier Skaven battle benchmark, rather than the original battle or campaign benchmarks.

Warhammer 2 supports DX11 with “beta” support for DX12, although in my testing Nvidia cards typically get better performance from the DX11 mode, so I’ve left it at that.

You can see the obvious limitation throughout the results: it’s the CPU, not the GPU. That helps explain why the Ultra preset resulted in basically no difference in performance between 1080p and 1440p for all three cards. Things change a little once Warhammer 2 starts eating up more VRAM at 4K, but in general the poorer performance here reflects a level of optimisation that’s just not as refined as the other titles.
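To illustrate the CPU bottleneck in the simplest possible terms, here’s a toy model where the frame rate is capped by whichever of the CPU or GPU takes longer per frame. The millisecond figures are invented for illustration, not measured from Warhammer 2:

```python
# Toy model: a frame can't be delivered faster than the slower of the two
# processors working on it. When the CPU is the bottleneck, raising the
# resolution (more GPU work) barely changes the frame rate.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000 / max(cpu_ms, gpu_ms)

print(round(fps(cpu_ms=16.0, gpu_ms=8.0)))   # 1080p: ~62fps, CPU-bound
print(round(fps(cpu_ms=16.0, gpu_ms=12.0)))  # 1440p: still ~62fps, still CPU-bound
print(round(fps(cpu_ms=16.0, gpu_ms=22.0)))  # 4K: ~45fps, GPU finally the bottleneck
```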

That’s to be expected: this is the oldest of the games in this lineup, and I’ll be keen to see what improvements Creative Assembly make with the next Total War game, particularly to their DirectX 12 implementation. There are a lot of multi-threading benefits within DX12 that would be a natural fit for Total War games, so we’ll have to sit tight until Total Warhammer 3 rolls around.


A Word on HDR

HDR gaming has been possible for a while with recent GPUs. Support was enabled for the GeForce 900 series, albeit only over HDMI, while every AMD card from the R9 380 and RX 460 onwards has supported HDR through DisplayPort and HDMI. It’s slightly trickier if you have a G-Sync monitor: only GTX 1050 series cards or higher are supported.

Support for HDR is becoming more standardised among AAA PC games, especially since many of those studios are already working on their preferred HDR implementation for consoles. Games like Destiny 2, Battlefield 1, the latest Assassin’s Creed games, ARK: Survival Evolved and HITMAN are just some of the titles with HDR support. In the tests above, Shadow of War, Forza Horizon 4 and Shadow of the Tomb Raider all support HDR, while Total War: Warhammer 2 does not.

So, you might ask: why not test everything in HDR?

The reasons are twofold. Firstly, the vast majority of PC gamers still don’t own a primary or secondary monitor that supports HDR. The preference is still very much for monitors with a high refresh rate, or a panel with better colour reproduction, rather than a monitor that can do HDR. Monitors that support all of these things – like the Acer X27 Predator which Nvidia supplied for testing – are extraordinarily expensive. The Acer X27, which supports G-Sync, 144Hz, HDR and 4K, will set you back $2800 at the time of writing, or $3500 if you want the ASUS ROG Swift 27″ screen.

If you want a 4K screen that does most of that without 144Hz support, you’re looking at around $770. But 144Hz is the pinnacle for a lot of PC gamers, and with good reason. Having owned high refresh monitors since the first models became available in Australia almost a decade ago, I’m not going to argue against owning one.

HDR panels have taken a while to proliferate in PC gaming, primarily because manufacturers have concentrated on the other ends of the market: smaller screens for phones, and larger displays for TVs. PC monitors are a smaller market with slimmer profit margins than either of those two extremes, and as a result many PC gamers are still making do without.

The other roadblock in the way of HDR is Windows. Support for HDR in Windows hasn’t been fantastic over the last 12 months, and while this year’s April update improved how Windows handles SDR content on HDR displays, it’s still pretty awful. SDR content still looks washed out, and then you have different HDR implementations to deal with: some games support Dolby Vision, others just support HDR10, and others have sliders to allow you to adjust the luminance so your eyes don’t bleed out.

But I did run a short batch of tests just to illustrate one thing: the lack of performance difference between HDR and non-HDR. The GTX 10 series has supported HDR, but it’s always come at a slight performance hit. That’s still a little noticeable in the reduced testing I ran, but for the most part if you want to run a game in HDR, and can get the visuals to a comfortable and pleasurable point, performance shouldn’t be a problem.

Before we get into the final nitty gritty, and dissect the prices of all these cards, there’s one other feature we need to talk about: AI.

Deep Learning Super Sampling (DLSS)

The range of AI-powered tech in the RTX cards, particularly the updates being made to Ansel, is rather cool. But out of all of it, it’s DLSS, Nvidia’s neural-network powered anti-aliasing technique, that will have the most performance impact for now.
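The pitch, in back-of-the-envelope terms: the GPU shades a lower-resolution frame, and the neural network reconstructs the detail for the target output. Assuming a 1440p internal render for a 4K output (my assumption for illustration, not a confirmed figure for any particular game), the shading workload looks like this:

```python
# Fraction of pixels actually shaded when rendering internally at 1440p
# and reconstructing to 4K. The internal resolution is an assumption.
target_pixels = 3840 * 2160
internal_pixels = 2560 * 1440

print(round(internal_pixels / target_pixels, 2))  # 0.44: under half the shading work
```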

At the time of writing, two synthetic tests were available, though they only work with Nvidia’s RTX cards. One of them is a 3D Mark-style test using Epic’s Infiltrator demo. You can view a video of that running from Guru3D on YouTube below, to give you an indication of what we’re talking about:

The second was a separate build of the Final Fantasy XV benchmark that supported DLSS. You can get the benchmark for yourself, with or without DLSS, through the FFXV site here.

At the time of writing, this is the closest we have to an approximation of the performance benefits of DLSS. That said, there are some strong arguments as to why it shouldn’t be considered in testing.

The FFXV benchmark shipped with severe culling and stuttering issues that were chronicled by Gamers Nexus earlier in the year. The general gist of the problems was that the benchmark was improperly rendering objects and models well beyond the scope of the player’s vision, and Square admitted on February 6 that the benchmark was beset with stuttering and level of detail problems that “will be addressed in the shipping game”.

For the most part, those issues were addressed in the final PC release. They just weren’t addressed in the benchmark, which kind of makes all of this moot.

So while the FFXV benchmark does showcase notable improvements in performance when DLSS is enabled, it’s a really, really flawed benchmark. It reports back an arbitrary score rather than standardised metrics that line up with other reports, and the aforementioned issues make it too unreliable for me to have any comfort using it as a gauge for real-world performance.

Having seen DLSS in action at Gamescom earlier this year, I’m still very hopeful that it’ll be a performance boon for RTX owners when it starts to roll out in games. I just don’t think the FFXV benchmark meets that standard, and with development on the PC version of FFXV having been cancelled, it seems unlikely that DLSS will ever be implemented into the full game. I think it’s still worth seeing how FFXV handles at 4K, particularly given that Nvidia helped out on the development of the PC version before release, but that’s for a future article.


Whichever of the three flagship GPUs you go for, you’re going to be spending at least $1150. The situation is different internationally, but in Australia local stockists are pricing the RTX 2080 at around the same levels as the GTX 1080 Ti, which neuters some of the value argument seen overseas where pricing on the GTX 1080 Ti has become rather competitive.

More importantly, stock of the RTX 2080 is more broadly available. I’ve even seen instances – albeit limited – of the RTX 2080 being priced under $1150, although you’ll have to buy through Newegg for that.

But for someone buying today, someone genuinely considering an investment in a card that will last them at least three years, consider this.

There’s a much stronger value proposition in a card that can run the blockbuster games of 2018 at the highest settings – with overhead to spare – than in spending over a grand on a card that will only mostly get you there. When you factor in the natural depreciation of technology, and as ray tracing in particular becomes more popular – Nvidia aren’t the only ones investing in this space – someone with an 8th-gen Intel gaming rig or a 2nd-gen Ryzen setup is going to get more mileage out of the RTX 2080 Ti, which should have no problems at 1440p and even 4K (with some drops in settings) a couple of years from now.

That’s the best way to think about these cards. How much are you looking to invest over the course of the next few years? It’s one thing to spend $600 or $700 on a video card now. But ultimately you have to think about how long the legs on that purchase will be, when you’re likely to upgrade again, and where to get the best mileage from the rest of the system.

If money was no object, or I already had a reasonable system limited by a GPU that was a generation or two old – people still on the 900 series GPUs, or perhaps making the most out of an AMD RX 480 or R9 390X – the RTX 2080 Ti offers a substantial upgrade in performance that will hang around for years.

If we’re talking a pure value proposition of what you can buy today, the RTX 2080 offers better value for Australians. That’s not the case overseas, where stock of the GTX 1080 Ti is more readily available and more competitively priced, but you can only play the cards you’re dealt. Besides, that’s a better situation for gamers: the more modern technology is on par with, if not slightly better than, the GTX 1080 Ti bar slight reductions in memory bandwidth and VRAM, and you get the benefit of upgrades to the NVENC encoder (which streamers will enjoy), dedicated RT and tensor cores for ray tracing and AI, and a card that’s more energy efficient.

But that’s entirely contingent on one thing: that you’re looking at the three cards by themselves. It doesn’t factor in, for instance, whether a $500 or $600 investment now (with a view to buying the second generation of RTX cards in two or three years) is better value. Or what impact AMD’s 7nm cards will have next year.

And it’s AMD’s looming presence that could ultimately end up strengthening the argument for the RTX cards, especially if AMD follows suit by supporting real-time ray tracing in a convincing fashion. Even if the performance doesn’t match up to Nvidia – and previous experience leads me to suspect that it won’t, at least initially – the support of both manufacturers backing the technology in some form will help increase developer support down the road.

And then there’s future downward pressure on prices to consider.

So I’ll leave it at this. If you’re in the fortunate financial position to consider purchasing any one of these cards, and the raw value proposition is less of a concern, then you might as well go all out. The RTX 2080 Ti is a fantastic card, with enough overhead across a range of games at full settings to please. If you’re loaded, you won’t be disappointed – at least not in the raw performance. Ray tracing is another matter entirely, although the ongoing nightmares of Windows don’t help there.

If you’re after an almost top-of-the-line upgrade, but aren’t sure if the GTX 1080 Ti is a better buy, the RTX 2080 is the better choice. It’s priced on par with the GTX 1080 Ti locally, more supply is available, and you’ll have some futureproofing for the next couple of years as more developers become accustomed to ray tracing and AI-powered tech in general. The only qualification I’d make is for people who do a lot of Adobe work or professional rendering – the extra VRAM and CUDA cores in the GTX 1080 Ti might be handier to have, and you’re not sacrificing much in gaming performance. But that’s a moot point if supply remains limited.

If you’re the kind of person for whom these cards are aspirational and you baulked at the price of the GTX 1080 and 1080 Ti when those first dropped: carry on as you were. They’re the best cards on the market, but hardly the most affordable.

As a gamer who grew up poor, and played with a lot of aging systems (courtesy of local banks who didn’t want them, or know what to do with them), I’ll always lean towards the best bang for buck. And that will only truly arrive next year, once Nvidia has more competition in the market and prices on AIB models start to fall below four digits. An RTX 2080 around the $800 or $900 mark isn’t a price to baulk at.

That said, there’s always going to be that gamer who has the money to splurge today. And for that person who buys the RTX 2080 Ti?

Just make sure you have a nice screen to go with it.

