Part Of Nvidia's Pitch: Games Can Get Better Looking Over Time

A shot of Wolfenstein 2: The New Colossus using variable content shaders, a technique that concentrates shading work on the parts of the image most prominent to the viewer. Image: Alex Walker/Kotaku

A hundred or so journalists, YouTubers and other tech media had just sat through about three hours of dense presentations. It was the middle of Nvidia's Editor's Day, essentially a day in which various Nvidia executives broke down the architecture of their upcoming graphics cards in exhausting detail.

It was gruelling, particularly if you're not a polymath. But when the crowd broke up a little, and we wandered into an adjacent room to mess with some tech demos in person, a couple of Australians started chatting about some of the techniques that the general gaming populace would start to see in the coming months.

And one technique in particular could have an outsized impact.

There's been a ton of discussion about ray tracing, covering both its theoretical and practical implications once developers get access to capable hardware and become familiar with the techniques.

But what might impact gamers a lot more — and might get much more use than ray tracing off the bat — is NGX. It's the part of Nvidia's Turing platform that incorporates AI and deep learning for various uses. Examples given included cheat detection and facial and character animation, but the most illustrative use case was anti-aliasing.

The basic principle is that Nvidia generates a reference image — or ground truth, as they called it — and then uses their Turing-based supercomputers to train an AI model to reproduce that reference quality from lower-resolution frames.
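To make that more concrete, here's a rough sketch of the general idea: a tiny upscaling network trained to match supersampled reference frames. It's illustrative only, assuming PyTorch, and it uses random tensors in place of real game frames; Nvidia's actual network and training pipeline are far more sophisticated.

```python
# Rough sketch of DLSS-style training -- not Nvidia's actual pipeline.
# Random tensors stand in for pairs of (low-res aliased frame, supersampled
# "ground truth" frame); the network learns to reproduce the reference.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyUpscaler(nn.Module):
    """Toy convolutional network that doubles the resolution of a frame."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1),
        )

    def forward(self, low_res):
        # Upsample naively first, then let the network predict a correction.
        upsampled = F.interpolate(low_res, scale_factor=2, mode="bilinear",
                                  align_corners=False)
        return upsampled + self.body(upsampled)

model = TinyUpscaler()
optimiser = torch.optim.Adam(model.parameters(), lr=1e-4)

for step in range(10):
    low_res = torch.rand(4, 3, 128, 128)       # stand-in for rendered frames
    ground_truth = torch.rand(4, 3, 256, 256)  # stand-in for supersampled reference
    loss = F.mse_loss(model(low_res), ground_truth)
    optimiser.zero_grad()
    loss.backward()
    optimiser.step()
```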

There are multiple uses for this — another example given was inpainting, where deep learning is used to touch up images or remove unsightly parts from photos. It's much like the spot healing tool already in Photoshop, but substantially faster.
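As a loose illustration of that inpainting workflow, the sketch below uses OpenCV's classical inpainting to fill a masked region from its surroundings. Nvidia's NGX version replaces the classical algorithm with a trained neural network, which is where the speed and quality gains come from; the file name and mask coordinates here are placeholders.

```python
# Classical inpainting as a stand-in for the deep learning version:
# mark the unwanted pixels with a mask, then fill them from the surroundings.
# (Illustrative only; this is OpenCV, not Nvidia's NGX API.)
import numpy as np
import cv2

image = cv2.imread("photo.jpg")                  # placeholder file name
mask = np.zeros(image.shape[:2], dtype=np.uint8)
mask[100:150, 200:260] = 255                     # placeholder region to remove

# Fill the masked region using the Telea algorithm (3-pixel radius).
filled = cv2.inpaint(image, mask, 3, cv2.INPAINT_TELEA)
cv2.imwrite("photo_filled.jpg", filled)
```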

Where it gets cool from a gaming perspective is anti-aliasing (AA). Two of the more common higher-end AA techniques these days are multisample anti-aliasing (MSAA) and temporal anti-aliasing (TAA), both of which offer much better image quality than FXAA.

Both techniques come at a cost. They're more accurate than FXAA — an Nvidia-developed algorithm that's now one of the lower-end settings for resolving jaggies in games — but they tax the hardware in different ways, and they have their own problems: ghosting, flickering, and blurry-looking images in static scenes.

This slide is the perfect example of the kind of detail that current high-end AA techniques — in this case temporal anti-aliasing, one of the most hardware-intensive methods — don't quite nail.

Zoomed out, both images look pretty good. But when you zoom in on the finer detail, you'll notice all sorts of blurred edges, artifacting, and details that are just wrong. Anything displaying text in the distance is almost guaranteed to be buggered, but it's the kind of detail that deep learning is incredibly effective at rendering properly.

On the 1080 Ti, which is running off the older Pascal architecture, you can see the edges of the minigun are pretty flawed. They're not round, and in some cases it looks like part of the barrel has actually rotted away.

It happens because, as Nvidia's Tony Tamasi pointed out, temporal AA can get confused. The technique relies on data from previous frames to make the best estimate possible of what the image should look like going forward. But if you've got something moving quickly, or text that's far away in the background, you're going to run into issues.
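For the curious, here's a minimal sketch (plain Python and numpy, purely illustrative) of the temporal accumulation at the heart of TAA: the previous frame is reprojected using the renderer's motion vectors and blended with the current one. When the reprojected history doesn't line up, which is exactly what happens with fast motion or fine distant detail, that blend is what produces the ghosting and smearing.

```python
# Toy TAA resolve: blend the current frame with a reprojected history buffer.
# Real implementations resample with filtering and clamp history against the
# current frame's neighbourhood; this just shows the basic data flow.
import numpy as np

def reproject(history, motion_vectors):
    """Shift last frame's pixels to where they should appear this frame.
    Toy version using integer per-pixel offsets."""
    h, w, _ = history.shape
    ys, xs = np.mgrid[0:h, 0:w]
    src_y = np.clip(ys - motion_vectors[..., 1].astype(int), 0, h - 1)
    src_x = np.clip(xs - motion_vectors[..., 0].astype(int), 0, w - 1)
    return history[src_y, src_x]

def taa_resolve(current, history, motion_vectors, blend=0.1):
    """Exponential blend: mostly history, a little of the current frame."""
    reprojected = reproject(history, motion_vectors)
    return blend * current + (1.0 - blend) * reprojected

# Stand-in data: a 720p frame, a history buffer, and zero motion.
current = np.random.rand(720, 1280, 3).astype(np.float32)
history = np.random.rand(720, 1280, 3).astype(np.float32)
motion = np.zeros((720, 1280, 2), dtype=np.float32)
resolved = taa_resolve(current, history, motion)
```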

Having a neural network constantly check its output against an established reference image — particularly for fine, sharp details like lettering — helps to correct that problem. And the most important element: DLSS (deep learning super sampling) is no more resource-intensive than what's currently available.

Put simply: if you've got an RTX GPU, and a game comes or is patched with support for DLSS, you'd be silly not to use it. But therein lies the problem.

A live demo of Wolfenstein 2: The New Colossus using variable content shaders.

Only a handful of games will have support for DLSS by the end of the year, let alone by the time the full RTX range of cards becomes available (as of this week for the RTX 2080, next week for the RTX 2080 Ti). That list does include titles like Hellblade: Senua's Sacrifice, PUBG, Islands of Nyne, We Happy Few, Hitman 2 and Final Fantasy XV.

But for the most part, games will need to patch in DLSS support over the coming year. And some major titles using RTX tech aren't supporting DLSS initially: Battlefield 5 will ship with ray tracing support but not DLSS, and Metro Exodus is in the same boat.

All The Games That Will Use Nvidia's RTX Tech (So Far)

What developers can do with real-time ray tracing is pretty cool. The question is: what games can you play in the near future that will actually take advantage of it?


However, that's also where a neat little opportunity opens up.

For any of the NGX-based features to work, games will have to implement part of Nvidia's NGX API. Apart from users having to own RTX cards — because the neural network runs on the tensor core hardware that's only available on those cards — developers will have to work with Nvidia to patch in support at the driver level.

So — much like what happened with Nvidia's Ansel, although hopefully with a much greater adoption rate — what you'll end up seeing is new AA techniques popping up in games as support is patched in.

Most users will get this through GeForce Experience (GFE), although the middleware isn't necessary for NGX to function. How often the patches will be rolled out outside of GFE is another matter. Nvidia have spent the last couple of years incentivising sign-ups for GFE, and more frequent driver updates are a logical way of doing that.

Precisely how much work it'll take to implement NGX in existing games wasn't explained, although Nvidia is working with engine makers (Unity, Epic and so on) to incorporate the NGX API at an engine level.

Of course, there's a catch.

Nvidia explained in a follow-up Q&A that games that don't take advantage of ray tracing won't use the RT cores on the RTX GPUs. The RTX 20 series cards are still an upgrade on the 10 series hardware, but it's a chicken-and-egg scenario: without the hardware, studios have no incentive to develop for future-facing techniques like ray tracing. And without the games that support the fancy new features, why fork out the premium to be an early adopter?

Someone has to bite the bullet to move the industry forward, though. And it's worth remembering that paying $1200 for an RTX 2080 only seems absurd if you paid $1200 for a Founder's Edition GTX 1080 at launch, or a GTX 1080 Ti after it launched, or if you've upgraded your GPU in the last few years. That price looks a lot different to someone who's been keeping an R9 390X going, or a creaking SLI rig from the Maxwell era.

The benefit for someone upgrading from that far back is liable to be substantial enough with or without ray tracing and all the other benefits. It's those gamers, as well as the people with more money than sense, who are liable to be the early adopters.

And that crowd is going to be introduced to something neat: driver updates that, eventually, will make their games look better. There's still plenty of ifs and buts about this, of course: while Nvidia has seeded a range of developers with RTX hardware to get started, most of the games that take advantage of the new toys won't land until next year.

And then you've got the wealth of games already out, ones that RTX 2080 Ti and RTX 2080 owners will inevitably want to replay at 4K with the highest settings — when will those games take advantage of ray tracing, AI-powered anti-aliasing, or neural-network-enhanced upscaling?

But whether it's curiosity, an eye for business or a natural development decision, some developers will jump on board. And the gamers who go down that path will get an intriguing experience over the next year. I suspect the second generation of adopters will be the ones who truly get the generational leap that's being envisioned with Nvidia's new architecture, but as ever, new techniques and tools don't move forward by themselves.

Nvidia aren't alone, though. And despite all the incremental and monumental changes planned and proposed, the most interesting question of all will probably be this: what will AMD do?


Comments

    someone who's been keeping an R9 390x going, or a creaking SLI rig from the Maxwell era

    I guarantee you that to someone who's keeping old gear going like that, $1200 looks like a LOT of money.

    But I see your point. However I think it's more likely it's the enthusiasts who will float this new tech. The people who almost always buy a new card as soon as it's released. They don't mind burning cash on new tech as much as the rest of us.

    Me? I'm sitting with my 1080 for a lot longer. It's doing a really nice job so far and I can't justify that kind of outlay just for sweeter visuals. They're so sweet already!

      Yeah I got a 1080Ti a few months back and I'm feeling no desire to get a 20x0 card at all.

      Maybe I'll wait for the 30x0 series in a few years.

        I'm in the same boat - it looks like the 2080 has very similar levels of performance to the 1080 Ti if leaked benchmarks are to be believed. Considering the cost of upgrading to the 2080 Ti, especially at Founder's Edition prices, there's little to no incentive to move past what we have now.

      Still sitting on my GTX 970 at 1080p and not seeing many reasons to upgrade... except maybe Cyberpunk. Witcher 3 was the reason for my last upgrade

        Yeah, if you're still on 1080p you're at a really nice place where some money can get you great performance. How did it handle Witcher? That game definitely pushes my gear. It's one of the first ones that I'll reinstall in a year or so when I upgrade.

          Witcher 3 ran really well on high-very high. Just couldn't run hairworks and some of the higher AA options. Solid 50-60 fps. No other game has really pushed it since except maybe for Tomb Raider.

      I went from a GTX 970 to a GTX 1080 and it doubled my framerate, but even lowering game settings the 1080 struggles a lot at 1440p, as I want to get 144fps for my 144Hz screen but the best I can get in most new titles is around 80-90fps. It looks like the RTX 2080 Ti will be double the framerate of the 1080 judging by leaked benchmarks. I'll wait until after launch to find out though, as it could be just the upgrade I've been looking for.

        I can totally see why you'd want to lock in on 144fps. But you're going to need a really beastly card. Mine keeps most things over 60fps only.

        If you can possibly wait... prices should start shifting a lot after 6 months, and this time next year you should be able to get some great bargains. If you buy it right now you're going to pay the early adopter tax, and then you'll see the manufacturers' custom cards coming out in the months to come as well. I hope it's enough to get you to 144fps, but more games are coming out, which will be even more punishing on our gear.

        we need some real damn benchies!

    I'm using a GTX970 and am still debating whether the RTX cards are even worth it. I simply can't justify spending over $1000 for any video card these days, which leaves me looking solely at the RTX 2070, which has a frankly insulting price tag considering it's a third-tier card.

    With Umart now offering 1080s for <$700, a 2070 is becoming a bit of a hard sell ... especially if you consider that the RTX features on a 2070 are probably worthless unless you like ~35fps @ 1080p

    Maybe some actual reviews will surprise me and push me firmly towards the 2070... If not, I hope there are still some nicely-priced 1080s available by the time said reviews are out.

    The benefit for someone upgrading from that far back is liable to be substantial enough with or without ray tracing and all the other benefits. It's those gamers, as well as the people with more money than sense, who are liable to be the early adopters.

    The 'as well as the people with more money than sense' part definitely applies here, especially in regards to the 2080Ti which costs almost $2,000. I don't see how many people would be comfortable with this, especially when 7nm is just around the corner, hopefully giving way for AMD to put some much needed competition back into the market (and then of course NVIDIA's response to it). I don't see Turing having a lifespan that's as long as what Pascal managed to get away with. I give it a year before they release their next architecture.

      The architecture makes sense to hang around for a while, but the hardware might get substantially refreshed once Nvidia jumps to 7nm.

        You are most certainly correct, Alex. That was my bad, I didn't really mean to say architecture so much as I meant to say the generation after Turing. I still believe that Turing won't last anywhere near as long as what Pascal did; 12-18 months at the very most.

        That $1,900 2080Ti won't have anywhere near the kind of resale value that Pascal has at the moment (which isn't too bad considering how old it is) if the architecture is substantially revised and more performant on a 7nm process. The 2080Ti seems like a really bad proposition unless you *absolutely* have to game @ 4K/60FPS (despite the fact that there are already demanding games, and it won't take long for more to come out, that even the 2080Ti won't be able to handle at that resolution and frame rate), or you *really, really* want to use ray tracing @ 1080p and pray that you can manage to hit 60FPS on an almost $2,000 graphics card (which seems absurd to me despite what it's offering).

        No, the 1080 that I currently own will do me just fine until AMD comes back into the picture, or until NVIDIA gives us what they are currently offering (with even better performance for ray tracing, etc) in a package that you don't have to sell a kidney for. The technology that NVIDIA is showing might seem interesting in theory but it's still yet to prove itself. In that respect it certainly has its place, but I think it's more than just a little too rich for most.

      Be nice if AMD competed again; not so confident with Vega 7nm, never liked Vega.

      I have a 1080 Ti here which will soon have a waterblock installed; that'll keep me busy until Q1 2019, when I'll consider a new card. If AMD flop again then I will just have to save up for a 2080 Ti, which could take a long time, since we have seen at the top end that without competition, NVIDIA likes to keep their prices high for the entire lifespan of their products!

    I place myself firmly in the "someone upgrading from that far back" camp.
    I was on a Radeon HD5570 until it died a few months ago and I'm currently borrowing a GTX770. This is the first time in 7 yrs I've had cash for an upgrade and I don't know when the next one will be going forward.

    From my POV 1080 Tis are still around $1150, though there are a few $999 deals popping up, so even if the performance difference is only in the area of 10% I might as well pay the extra ~$50 to have the newer tech, even if it is only 1st gen. Going forward I might not get a new card for another 5yrs, but at least I'll have the entry-level version of the system games will be using at that time.

    And if Gen 2 RTX is a huge leap forward and games start being designed with RTX 3XXX in mind. Well. I was alive in the 90's. I'm not going to have a spasm if I have to play a game at 25-30fps.

    DLSS sounds interesting, but I can't help but feel that it's another one of NVIDIA's technologies that only ends up working in a couple of games and is forgotten forever, much like that VR feature they had with Pascal... yet no VR game uses it.

    If DLSS catches on and works as well as it looks/sounds then it might tempt me to get a 20 series card somehow (rob a bank). It'd be nice if AMD made top-end cards... it seems like they can't catch up, and if DLSS works it will be near impossible.

