Nvidia’s RTX 2080 Ti, RTX 2080 And GTX 1080 Ti Compared

Image: Alex Walker (Kotaku)

Ray tracing is all well and good, but how many frames for your buck do you actually get from Nvidia’s new RTX 2080 Ti and RTX 2080 cards?

After the company’s biggest launch at Gamescom this year, I’ve been slowly working my way through tests with what most people would consider the three flagship gaming cards: the founders editions of the RTX 2080 Ti and RTX 2080, which cost $1899 and $1299 respectively. I’ve been running those against a GTX 1080 Ti, the previous gaming flagship, which you can grab locally for somewhere between $1150 and $1250 depending on brand.


But when the Nvidia cards first launched, there were two parts missing: games that supported ray tracing, and an all-too-crucial Windows update enabling ray tracing features within DirectX 12. That Windows update turned out to be a horror show for Microsoft, with the company having to roll back the update not once, but twice, over system stability issues.

So when you take ray tracing out of the equation – which I’ll be doing for the purposes of this article – how do the RTX cards stack up against modern games? Nvidia released some preliminary graphs and slides showing that the RTX 2080 could handle games like HITMAN and Final Fantasy XV over (or at) 60fps at 4K and HDR.


But in the real world, with real world drivers, released games and publicly available synthetic tests, how far does your $1300 or $1900 go?

Before we get into the tests, here’s the system used. It’s what most people would consider a good gaming rig, but it’s not the best. The main weakness is the 7900X CPU, a 10-core Intel offering. It plays games fine, but it won’t match the results of a six or eight-core chip that can run at a higher clock speed (like the recently released i9-9900K, or the popular i7-8700K, which has fewer physical cores but a higher turbo clock than the other chips). Keep that in mind as you digest the results below.

Image: Alex Walker (Kotaku)

Benchmark System:

  • CPU: Intel Core i9-7900X (stock speeds)
  • RAM: 32GB DDR4 3200MHz G-Skill TridentZ RGB RAM
  • GPUs: GTX 1080 Ti Founders Edition / RTX 2080 / RTX 2080 Ti Founders Edition
  • Motherboard: Gigabyte AORUS Gaming 7
  • Monitors: Acer X27 4K HDR 144Hz / EIZO 23.5″ 240Hz VA monitors
  • PSU: EVGA Supernova G2 850W
  • GPU Drivers: 416.16 (October 4 2018)

Many thanks to Nvidia for also supplying the Acer X27 Predator screen for this testing.

For clarity: the 7900X is running on stock clock speeds on a Corsair H100i liquid cooler, while the RAM is running at 14-14-14-34-1.35V (confirmed with CPU-Z). G-SYNC was disabled for all tests, and the GPU was set to Maximum Performance in the Nvidia Control Panel.

The tests and games used were:

Image: Sentor

  • 3D Mark (Fire Strike, Fire Strike Ultra, Fire Strike Extreme)
  • Forza Horizon 4 (DX12)
  • Total War: Warhammer 2 (DX11)
  • Shadow of the Tomb Raider (DX12)
  • Middle-earth: Shadow of War

At the time of writing, the Final Fantasy XV DLSS benchmark was available privately but not publicly. It has since been released publicly, but DLSS support has not been (and may never be) patched into the full game. I’ll also be running some 4K-specific tests with RTX-enabled games and other recent AAA titles, like Battlefield 5, on newer drivers at a later date.

As for the games chosen, I opted for this mix because it runs across a variety of engines. There’s a mix of DX11 and DX12 usage across the board – some, like Warhammer 2 and Shadow of the Tomb Raider, support both – and each game is built on an in-house engine. Almost all of the games in this test are also rather well optimised, with the exception of Total Warhammer 2. Creative Assembly’s Warhammer strategy title is more CPU-bound, but it’s also the type of game that attracts gamers who spend more on their PCs than most, so I’ve kept it in the rotation for now.

Because of the time of year and my general daily workload, I wasn’t able to broaden testing to an Ubisoft title or an Unreal Engine-based game, and Battlefield 5 wasn’t available at the time I ran these tests. I’ll be looking to run more coverage featuring some of those games soon, though.

All games were tested across 1080p, 1440p and 4K using the three highest presets available in each game. 3D Mark has no such presets, but as the different tests run at different rendering resolutions, you get the same effect.

All tests were also run several times, with outlier results discarded. Some tests are more consistent than others – Shadow of War tends to return similar results whether you run it 17 times or 70 – but this was done to help avoid issues of variance. I also disabled automatic updates where possible on each of the games (which is easy to do for games running through Steam) to avoid inconsistencies with future performance improvements.
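For the curious, the basic idea of averaging repeated runs while throwing away outliers can be sketched in a few lines. The trimming rule below is hypothetical, purely for illustration – it isn’t the exact procedure I used:

```python
def average_fps(runs, trim=1):
    """Average per-run FPS results after discarding the `trim`
    lowest and `trim` highest runs (a hypothetical outlier rule)."""
    if len(runs) <= 2 * trim:
        raise ValueError("not enough runs to trim outliers")
    kept = sorted(runs)[trim:len(runs) - trim]
    return sum(kept) / len(kept)

# Five runs of the same benchmark: the one dip (background task,
# shader compilation, etc.) and the one spike get thrown away.
print(round(average_fps([88.2, 48.1, 87.5, 91.0, 88.9]), 1))  # 88.2
```

The point isn’t the maths so much as consistency: without discarding the odd bad run, a single background hiccup can drag a card’s average down by several frames.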

This proved to be particularly beneficial with Shadow of the Tomb Raider: a future update actually caused stability issues for Nvidia owners resulting in the game complaining about memory errors, which I discovered after I patched the game post-testing. Fortunately, Square Enix allows people to roll back their game to older versions through Steam’s beta settings, a move that more developers should consider supporting.

One important factor, and one I’ll explain after the results: these tests were run without HDR. I’ll get into that at the end, however. Dynamic resolution was also manually disabled in games where that was an option to ensure consistency.

Let’s begin with the synthetic figures. Click or press on the benchmark graphs below if you need to expand them for readability. All figures are reported in average frames per second.

3D Mark Fire Strike


3D Mark is the synthetic test du jour when it comes to gaming rigs. It’s split into multiple tests that stress different parts of the system and GPU, before finishing with the combined fighting test that you probably saw running in a shop window as a kid.


The RTX 2080 Ti is the king of the pack here, and it remains that way for the remainder of the tests. The advantage that the GTX 1080 Ti has over the RTX 2080 – which will be largely neck and neck for the rest of the results you’re about to see – is the extra 3GB of VRAM, slightly more memory bandwidth (484GB/s versus the RTX 2080’s 448GB/s) and a wider memory bus.

The RTX 2080 is higher clocked, though, and in the majority of instances it ekes out a lead over the GTX 1080 Ti while having the bonus hardware for futureproofing. The Fire Strike tests, however, are one area where it fell just behind. But the difference is fractional, and within the margin of error. It’s also common to see incremental performance jumps with future drivers, so keep that in mind as we move forward.

Shadow of the Tomb Raider


Lara’s latest adventure was scheduled to be one of the first games with ray-traced shadows, and it looked a treat over at Gamescom. Shadow hadn’t been updated with ray-traced shadows at the time of testing, but the in-game benchmark is improved over the one that shipped with Rise of the Tomb Raider, offering a more representative recreation of in-game performance as Lara traverses town centres and jungles.


When Nvidia proclaims that the “4K 60fps” dream has been realised, this is generally the kind of result they’re talking about. The frame rate dipped below the 60fps water line on the ultra preset, but it’s here that I’d remind everyone: the 7900X is not the best gaming CPU around. If these tests were run with an i7-8700K, one of the newer i9 CPUs, or the stellar Ryzen 7 2700X all-rounder, the 2080 Ti would have more headroom at Ultra settings.

One of the test runs with the RTX 2080, showing some of the settings and results.

As for the RTX 2080 and GTX 1080 Ti, I’d actually consider just staying at 1440p. Extra overhead matters in the most intensive scenes, something averaged benchmark figures – and the general ebb and flow of gameplay – can mask. A solid 60fps when the sun is setting over the horizon is nice. A solid 60fps in the heat of battle is much, much better.

Middle-earth: Shadow of War

Image: Shadow of War

Monolith’s orc-slaying/dominating simulator can be quite the looker when all the textures are bumped up to their highest. It’s also a fun game in its own right, particularly now that the more egregious elements have been patched out and some solid expansions have been released.

Shadow of War has an in-built benchmark that runs through a single unbroken scene, flying through some vegetation before diving into a bloodied castle mid-battle and approaching an orc boss in full chain armour. The game also supports HDR, provided you’ve enabled the requisite setting in your Windows display settings.


A well-optimised game, and one that all three cards should have no problem enjoying at 4K. The RTX 2080 Ti has far and away the most headroom, although it’s still a fantastic-looking game at High settings, and that buttery-smooth goal of 144fps (relevant for those with high refresh rate gaming monitors) is well within reach for all of the flagships here.

Chain armour does look real nice at 4K, though. I’m looking forward to replaying this later this year when I have a bit of time off.

Forza Horizon 4


It hasn’t gotten quite as much praise as it should have, but holy shit is Forza Horizon 4 well optimised. It’s almost at DOOM levels of performance for how well it runs across these three cards, and I would expect similarly great results for users with the RTX 2070, GTX 1070 and AMD cards too (given that developer Playground Games would have plenty of experience optimising for the AMD hardware in the Xbox One X).

Even better: Forza Horizon 4 has one of the best in-game benchmarks, replicating a short race with AI which isn’t too dissimilar from actual gameplay. And it’s a great showcase for just how well all three of Nvidia’s cards perform: all three are capable of maintaining well above 60fps at 4K, at any preset.


The gap between the GTX 1080 Ti and RTX 2080 narrows as Forza Horizon 4 eats up more VRAM, which is to be expected when the resolution starts picking up. It’s also a good reminder of the frame rate hit separating the highest possible presets from the second or third-best option.

A game running at 4K on High is going to look better than 1440p on the Ultra preset – you’re getting sharper textures, anti-aliasing algorithms don’t have to work as hard and the clarity will be nicer since you’re playing at that screen’s native resolution, assuming you’re playing on a 4K screen.

And even then, I’d still recommend downsampling when the results are this good.

Total War: Warhammer 2

This was one of the Warhammer units Kotaku reader @jacka knocked up for a Warhammer 2 competition last year. It was too funny not to use here.

An old favourite, Total Warhammer is a CPU-heavy game that throws tons and tons of units onto the battlefield while all manner of explosions, effects and spells decimate the land. For these tests, I’ve used the heavier Skaven battle benchmark, rather than the original battle or campaign benchmarks.

Warhammer 2 supports DX11 with “beta” support for DX12, although in my testing Nvidia cards typically get better performance from the DX11 mode, so I’ve left it at that.


You can see the obvious limitation throughout these results: it’s the CPU, not the GPU, which helps explain why the Ultra preset produced basically no difference in performance between 1080p and 1440p on all three cards. Things change a little once Warhammer 2 starts eating up more VRAM at 4K, but in general the poorer performance here comes down to optimisation that’s just not as refined as the other titles.

That’s to be expected: this is the oldest of the games in this lineup, and I’ll be keen to see what improvements Creative Assembly make with the next Total War game, particularly to their DirectX 12 implementation. There are a lot of multi-threading benefits within DX12 that would be a natural fit for Total War games, so we’ll have to sit tight until Total Warhammer 3 rolls around.

A Word on HDR


HDR gaming has been possible for a while with the latest GPUs. Support was enabled for the 900 series GeForce GPUs, albeit through HDMI, while every AMD card from the R9 380 and RX 460 onwards has supported HDR through DisplayPort and HDMI. It’s slightly trickier if you have a G-Sync monitor: only the GTX 1050 series or higher is supported.

Support for HDR among PC games is becoming more standardised amongst the AAA games, especially since many of those studios are already working on their preferred HDR implementation for consoles. Games like Destiny 2, Battlefield 1, the latest Assassin’s Creed games, ARK: Survival Evolved and HITMAN are just some of the titles with HDR support. In the tests above, Shadow of War, Forza Horizon 4 and Shadow of the Tomb Raider all support HDR, while Total War: Warhammer 2 does not.

So, you might ask: why not test everything in HDR?

The reasons are twofold. Firstly, the vast majority of PC gamers still don’t own a primary or secondary monitor that supports HDR. The preference is still very much for monitors with a high refresh rate, or a panel with better colour reproduction, over a monitor that can do HDR. Monitors that support all of these things – like the Acer X27 Predator which Nvidia supplied for testing – are extraordinarily expensive. The Acer X27, which supports G-Sync, 144Hz, HDR and 4K, will set you back $2800 at the time of writing, or $3500 if you want the ASUS ROG Swift 27″ screen.

If you want a 4K screen that does most of that without 144Hz support, you’re looking at around $770. But 144Hz is the target for a lot of PC gamers, and with good reason; having owned high refresh rate monitors since the first models became available in Australia almost a decade ago, I’m not going to argue against owning one.

HDR panels have taken a while to disseminate amongst PC gaming, primarily because the manufacturers have concentrated on the other ends of the market: smaller screens for phones, and larger displays for TVs. PC monitors are a smaller market with less profit margin than either of those two extremes, and as a result many PC gamers are still making do without.

The other roadblock in the way of HDR is Windows. Support for HDR in Windows hasn’t been fantastic over the last 12 months, and while this year’s April update improved how Windows handles SDR content, it’s still pretty awful. SDR content still looks washed out, and then you have different HDR implementations to deal with: some games support Dolby Vision, others just HDR10, and others have sliders to let you adjust the luminance so your eyes don’t bleed out.

But I did run a short batch of tests to illustrate one thing: how little performance difference there is between HDR and non-HDR. The GTX 10 series has supported HDR, but always with a slight performance hit. That’s still a little noticeable in the reduced testing I ran, but for the most part, if you want to run a game in HDR and can get the visuals to a comfortable and pleasing point, performance shouldn’t be a problem.


Before we get into the final nitty gritty, and dissect the prices of all these cards, there’s one other feature we need to talk about: AI.

Deep Learning Super Sampling (DLSS)


The range of AI-powered tech in the RTX cards, particularly the updates being made to Ansel, is rather cool. But out of all of it, it’s DLSS, Nvidia’s neural network-powered anti-aliasing technique, that will have the biggest performance impact for now.

At the time of writing two synthetic tests were available, but the tests only work with Nvidia’s RTX cards. One of them is a 3D Mark-style test, Epic’s Infiltrator demo. You can view a video of that running from Guru3D on YouTube below, to give you an indication of what we’re talking about:

The second was a separate build of the Final Fantasy XV benchmark that supported DLSS. You can get the benchmark for yourself, with or without DLSS, through the FFXV site here.

At the time of writing, this is the closest we have to approximating the performance benefits with DLSS. That said, there are some strong arguments why it shouldn’t be considered in testing.

The FFXV benchmark first shipped with severe culling and stuttering issues, chronicled by Gamers Nexus earlier in the year. The general gist of the problems was that the benchmark was improperly rendering objects and models well beyond the scope of the player’s vision, and Square admitted on February 6 that the benchmark was beset with stuttering and level-of-detail problems that “will be addressed in the shipping game”.

For the most part, those issues were addressed in the final PC release. They just weren’t addressed in the benchmark, which kind of makes all of this moot.

So while the FFXV benchmark does showcase notable improvements in performance when DLSS is enabled, it’s a really, really flawed benchmark. It still reports back an arbitrary score, rather than standardised metrics that fall in line with any other reports, and the aforementioned issues make it too unreliable for me to have any comfort in using it as a gauge for real-world performance.

Having seen DLSS in action at Gamescom earlier this year, I’m still very hopeful that it’ll be a performance boon for RTX owners when it starts to roll out in games. I just don’t think the FFXV benchmark meets that standard, and with development on the PC version of FFXV having been cancelled, it seems unlikely that DLSS will ever be implemented into the full game. I think it’s still worth seeing how FFXV handles at 4K, particularly given that Nvidia helped out on the development of the PC version before release, but that’s for a future article.


Whichever of the three flagship GPUs you go for, you’re going to be spending at least $1150. The situation is different internationally, but in Australia local stockists are pricing the RTX 2080 at around the same levels as the GTX 1080 Ti, which neuters some of the value argument seen overseas where pricing on the GTX 1080 Ti has become rather competitive.

More importantly, stock of the RTX 2080 is more broadly available. I’ve even seen instances – albeit limited – of the RTX 2080 being priced under $1150, although you’ll have to buy through Newegg for that.

But for someone buying today – someone genuinely considering an investment in a card that will last them at least three years – I would consider this.

There’s a much stronger value proposition in a card that can run the blockbuster games of 2018 at the highest settings, with overhead to spare, than in spending over a grand on a card that will mostly get you there. When you factor in the natural depreciation of technology, and ray tracing in particular becoming more popular – Nvidia isn’t the only company investing in this space – someone with an 8th-gen Intel gaming rig or a 2nd-gen Ryzen setup is going to get more mileage out of the RTX 2080 Ti, which should have no problems at 1440p, and even 4K (with some drops in settings), a couple of years from now.

That’s the best way to think about these cards. How much are you looking to invest over the course of the next few years? It’s one thing to spend $600 or $700 on a video card now. But ultimately you have to think about how long the legs on that purchase will be, when you’re likely to upgrade again, and where to get the best mileage from the rest of the system.
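One crude way to frame that question is dollars per average frame at your target resolution. The prices and frame rates below are made-up round numbers purely for illustration, not my benchmark results:

```python
# Hypothetical AUD prices and hypothetical 4K average frame rates,
# purely to illustrate the dollars-per-frame comparison.
cards = {
    "GTX 1080 Ti": (1200, 60),
    "RTX 2080":    (1299, 62),
    "RTX 2080 Ti": (1899, 80),
}

for name, (price, fps) in cards.items():
    print(f"{name}: ${price / fps:.2f} per average frame")
```

By that metric the halo card always looks worst; the real question is whether its extra headroom stretches its useful life far enough to offset the premium.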

If money were no object, or I already had a reasonable system limited by a GPU a generation or two old – people still on the 900 series GPUs, or perhaps making the most of an AMD RX 480 or R9 390X – the RTX 2080 Ti offers a substantial upgrade in performance that will hang around for years.

If we’re talking a pure value proposition of what you can buy today, the RTX 2080 offers better value for Australians. That’s not the case overseas, where stock of the GTX 1080 Ti is more readily available and more competitively priced, but you can only play the cards you’re dealt. Besides, that’s a better situation for gamers: the more modern technology is on par with, if not slightly better than, the GTX 1080 Ti – bar slight reductions in memory bandwidth and VRAM – and you get the benefit of upgrades to the NVENC encoder (which streamers will enjoy), dedicated RT and tensor cores for ray tracing and AI, and a card that’s more energy efficient.

But that’s entirely contingent on one thing: that you’re looking at the three cards by themselves. It doesn’t factor in, for instance, whether a $500 or $600 investment now (with a view to buying the second generation of RTX cards in two or three years) is better value. Or what impact AMD’s 7nm cards will have next year.

And it’s AMD’s looming presence that could ultimately end up strengthening the argument for the RTX cards, especially if AMD follows suit by supporting real-time ray tracing in a convincing fashion. Even if the performance doesn’t match up to Nvidia – and previous experience leads me to suspect that it won’t, at least initially – both manufacturers backing the technology in some form will help increase developer support down the road.

And then there’s future downward pressure on prices to consider.

So I’ll leave it at this. If you’re in the fortunate financial position to consider purchasing any one of these cards, and the raw value proposition is less of a concern, then you might as well go all out. The RTX 2080 Ti is a fantastic card, with enough overhead across a range of games at full settings to please. If you’re loaded, you won’t be disappointed – at least not in the raw performance. Ray tracing is another matter entirely, although the ongoing nightmares of Windows don’t help there.

If you’re after an almost top-of-the-line upgrade, but aren’t sure if the GTX 1080 Ti is a better buy, the RTX 2080 is the better choice. It’s evenly priced, more supply is available locally, and you’ll have some future proofing for the next couple of years once more developers become accustomed to ray tracing and AI-powered tech in general. The only qualification I’d make there is for people who do a lot of Adobe work or professional rendering – the extra VRAM and CUDA cores in the GTX 1080 Ti might be more handy to have, and you’re not sacrificing much in gaming performance. But that’s a moot point if supply remains limited.

If you’re the kind of person for whom these cards are aspirational and you baulked at the price of the GTX 1080 and 1080 Ti when those first dropped: carry on as you were. They’re the best cards on the market, but hardly the most affordable.

As a gamer who grew up poor, and played with a lot of aging systems (courtesy of local banks who didn’t want them, or know what to do with them), I’ll always lean towards the best bang for buck. And that will only truly arrive next year, once Nvidia has more competition in the market and prices on AIB models start to fall below four digits. An RTX 2080 around the $800 or $900 mark isn’t a price to baulk at.

That said, there’s always going to be that gamer who has the money to splurge today. And for that person who buys the RTX 2080 Ti?

Just make sure you have a nice screen to go with it.


    • With the 2080Ti being twice the price of a 1080Ti, you could buy a lot of other items, faster CPU, SSDs, better monitor, games etc. with the change.

      • 1080 Ti’s right now are still going for $1150-1200 and you can get the FE 2080 Ti for $1900, so it’s not quite twice the price. But that’s always definitely the question to ask: what’s the best quality of life upgrade you can make with the purchase.

        I think a monitor one year / GPU the other year / CPU-mobo-RAM-HDD other year is probably the right cadence given where prices and technology is at.

    • Are there seriously people who don’t skip a generation or two? They’re either rich or have no concept of value, or both.

  • For someone sitting on a GTX970, seems to be worth my time to upgrade to the 2080 over a 1080 Ti for the slightly increased frames.

    • That’s what I’d recommend. The extra VRAM isn’t that big a factor given that the 4K performance isn’t good enough that you’d be playing at 4K all the time: you’re more likely to be playing at 1440p for the better frame rate and effects. So considering the RTX 2080 is priced the same as the GTX 1080 Ti, the RTX card seems a better bet.

      That’s if you’re upgrading before Christmas, anyway. There shouldn’t be much (if any) change in pricing until the new year.

  • Saw a vid on YouTube yesterday putting Star Citizen 1080 v 2080 side by side, and it made for an interesting one to watch. Put a couple of other games in for a base comparison, but most was for SC.


    Result was straight up pointing out that the fps gain was broadly 10fps, and ultimately not worth the expense. Not yet. There were so many other things, like server load, that were also impacting the gaming, and the end benefit, while there, couldn’t justify the US$1200 cost of the card.

    Clearly it’s the next stage for gaming, but I don’t think other tech is getting the most out of it yet. And that tech goes beyond what’s in your rig. We’re back at the point where the GPU is ahead of the curve and we need server tech, or RAM, or CPUs to catch up somewhat.

    • It’ll be deep learning. There’s only so much tech that can be physically soldered onto these boards, so anything that can leverage a much higher amount of computing for various tasks – rendering, NPC calculations, procedural generation – will end up making the most difference tbh. There’s only so far left to go with shrinking die processes, current cooling solutions and the general size and amount people are prepared to pay for.

      • That makes sense. Which makes it server-side tech, something that hasn’t been explored as far as it could be. It’s been a race to make our home machines bigger and better, but if you switch that race to the server side, the physical limitations should start to fade away.

        I can see definite advances there, then pushing that output back to our machines. Who was doing that recently? Steam? Nvidia? Either way, it takes the pressure off the consumer needing to pay thousands for smaller and smaller gains. That’s what the GPU game has ended up being.

    • How do you figure? Shadow of War 1440p ultra gets +30% on the 1080Ti, +45% at 4K ultra. Shadow of the Tomb Raider is +32% and +40% respectively. Forza and TWWH are both CPU-bound which is why you get flatter results. The price point would be the better point of argument with the MSRP +70% for the +40% performance you get, but you can’t really argue that the card isn’t a top performer.

      • RTX is literally a load of shit, that is, snake oil. All the games they talk about are partial implementations, gimmicks that are designed to sell cards. In reality we still have years until we have a real sense of what RTX has to offer gaming; hence why I called it snake oil.

        • We’re not years away from RT-enabled games that are proper implementations. Battlefield V has a full implementation, and a dev update today announced a dynamic RT optimisation in the next patch that gets a 50%+ performance increase. In particular, fixing wasted ray use on foliage almost quadrupled the frame rate for that case.

          Even if it was still years away, RT is only one part of the 20x series. The render pipeline has been changed with dynamically applied pixel shading resolutions that improve frame rate for no visible quality loss, a new mesh shading model that significantly improves multi-object geometric shading, revamped caching system that uses a shared memory architecture, much improved floating point calculation rate. DLSS gets you the current best quality visual result of antialiasing with a fraction of the cost because of its clever offloading to CUDA.

          All of this stuff is either already present due to being an innate property of the card, already present in one or more games, or has a commitment to be added to existing games by developers – eg. DLSS has a commitment from at least 25 existing titles as of last month.

          I understand it not being a good value proposition, and I don’t recommend it to just everybody. But I don’t see any justification for calling it snake oil short of not knowing what the new architecture does.

          • BF5 isn’t a full implementation; so far we have only been shown reflections without shadows, while Tomb Raider has displayed the opposite.

            They are clearly trying to justify the pricepoint with RTX, but so far I am yet to see full real time implementation (especially outside the controlled environment of a tech demo) and that is likely because the cards aren’t powerful enough yet to do the complete package. So I stick by what I said, it is snake oil.

          • Sorry, poor phrasing. What I meant is it’s not a ‘demo’ or ‘trial’ implementation, it’s complete for what they want to use from DXR’s capabilities – they just want reflections. I do agree that for current AAA games and especially ones with huge open areas like Battlefield, using all of DXR’s capabilities simultaneously would tank the frame rate.

            I guess where I disagree with you is this is a compromise developers always make. Direct3D offers a ton of great features that if you try to use them all at once will tank the game, so there’s always a balancing act of picking which ones you want and which ones to skip over. You’d hardly say a GPU is snake oil because it can’t use every D3D feature without tanking, and DXR is just another DirectX subset the same as D3D.

            I think you might feel like I’m bugging you at this point, if so I apologise. I appreciate you talking through your rationale.

          • So I take it from your posts that you feel RT is worth it (and/or DLSS)? My dilemma is that I can get a used 1080ti, or a new 2080 for a little bit more (maybe 10%). Looking at benchmarks, and the extra VRAM of the 1080ti, the only reason to go with the 2080 is if you believe the RT and DLSS tech will mature enough in the near future to significantly add to your gaming experience.

          • Yeah, that’s a tough choice. Ignoring RT/DLSS, the 2080 is marginally better than the 1080 Ti. RT as it exists now isn’t mature enough to be optimised well, which means it’s pretty brutal on overall performance, and it’s hard to tell what the lower bound for optimised use will look like. We do know the Battlefield V devs have reported a 50%+ performance gain from some optimisations they’ve worked on, which is promising, but you’re still looking at the biggest performance drain of almost any card feature.

            So the most important question at the moment, in my opinion, is whether the 2080 has enough grunt to make decent use of optimised RT implementations in the medium term. This I don’t actually know; I haven’t looked at many RT benchmarks, and most of them focus on the Ti model. I do think RT on the 2070 is wasted – it just won’t have the power to handle it properly, at least as optimisation stands at the moment.

            DLSS is fantastic, and I do think we’ll see a decent number of titles supporting that before the next generation hits. Provided they don’t collapse like FF15 did.

            If I were in your position, I’d favour the 2080, if only because of the marginal performance improvement over the 1080 Ti and the fact that it’s new, so you get the full warranty. Worst case, RT is too demanding and you don’t use it, but you’ll still have a card that’s at least as good as the 1080 Ti in everything else. On paper the 2080 has better components in basically everything but one area – the 2080 has 8GB of GDDR6 memory while the 1080 Ti has 11GB of GDDR5X. This should only really factor in for you if you’re running 4K or high-resolution VR. For almost everything else the amount of memory shouldn’t make much difference.
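            For what it’s worth, the memory gap isn’t just capacity – the 1080 Ti’s wider bus actually gives it slightly more raw bandwidth too, even though GDDR6 is faster per pin. A quick back-of-the-envelope check (the card specs are Nvidia’s published figures; the helper function is just my own illustration):

```python
# Peak memory bandwidth = bus width (bits) x effective data rate
# (Gbps) / 8 bits per byte, giving GB/s. Specs are Nvidia's
# published figures; the function name is just for illustration.
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    return bus_width_bits * data_rate_gbps / 8

rtx_2080 = bandwidth_gb_s(256, 14)     # 8GB GDDR6 @ 14 Gbps
gtx_1080_ti = bandwidth_gb_s(352, 11)  # 11GB GDDR5X @ 11 Gbps

print(rtx_2080)     # 448.0 GB/s
print(gtx_1080_ti)  # 484.0 GB/s
```

            So on bandwidth the 2080’s deficit is real but small; capacity is the bigger question at 4K.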

          • Thanks for the advice. I decided to go with the 2080. As you mentioned, the worst-case scenario is that I’m getting a currently marginally better card that should improve over time as the drivers are perfected, plus a warranty, for a slight increase in price over a used 1080 Ti. And if RT and/or DLSS turn out to be effective, I’m way ahead. I’m also banking on Nvidia’s clout in the industry to push developers to implement the tech in their games. Since they have little if any competition, that seems reasonable. Plus, I doubt the 2080 Ti will ever be in my price range.

            The one thing you mentioned that gives me pause is the VRAM difference. My current card is a 980, which has performed great, except in Fallout 4 VR. The card was perfectly capable of running the game – up to a certain point it was very smooth – but then it would all of a sudden get super choppy, and would remain that way until I restarted the game. I figured this was due to the 4GB of VRAM. So I’m a little gun-shy about that, but hopefully 8GB will be enough for the foreseeable future. I do run a 4K monitor, but I’ve seen a ton of benchmarks for 4K on the 2080. You don’t think that will be an issue, do you?

          • 4K textures will come pretty close to filling 8GB, but in most cases it shouldn’t go over. This is a difficult one to properly test because a lot of games fill available VRAM even if they don’t necessarily need it, so a game that uses 7.5GB of VRAM at 1080p may still use 7.5GB of VRAM at 4K just because of how they’ve done texture handling. 4K benchmarks at ultra settings are a good indicator that it’ll work, so check as many of those as you can.

            Bluntly, it’s probably the weakest area of the 2080. Keep in mind, this is something you can control if you find a game that just blows it out, so you won’t get in a situation where you can’t play or where the game runs like shit and you’re just stuck with it – you can reduce your texture quality one step down (usually 4K down to 2K). You’ll lose some visual fidelity, but it’ll still run. This is only something you’d need to do if the 4K textures are exceeding VRAM, not all the time. Keep in mind the vast majority of gamers don’t have more than 8GB of VRAM, so games are definitely designed to give options for that.
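            To put a number on why stepping textures down one notch helps so much: an uncompressed RGBA texture costs width × height × 4 bytes, so halving each dimension quarters the VRAM footprint. Real games use block compression and mipmaps, so the absolute numbers are lower, but the 4:1 ratio holds – this sketch is just my own illustration:

```python
# Uncompressed RGBA8 texture footprint: width * height * 4 bytes,
# converted to MiB. Halving each dimension quarters the memory
# cost, which is why dropping from 4K to 2K textures frees so much
# VRAM.
def texture_mib(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / (1024 * 1024)

print(texture_mib(4096, 4096))  # 64.0 MiB per 4K texture
print(texture_mib(2048, 2048))  # 16.0 MiB per 2K texture
```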

            Sorry, I wish I could give a more concrete answer. It’s something you should definitely consider, but I’d probably still go with the 2080 personally.

  • I think AMD’s clinging on in the market will soon pay off, as we’re reaching the upper limits of GPU hardware.

    They might still be behind Nvidia by this time next year – but will anyone really care when they can still get 144fps in Battlefield V?

  • Just pass and wait for next year’s cards like all the other smart people. There’s a reason Nvidia’s RTX range got laughed at (followed immediately by the stock-price plunge), and now Nvidia is ending 1080 Ti production to try to force RTX sales.
