Nvidia Plan To Define The Next 20 Years Of Graphics With Real-Time Ray Tracing

Image: Supplied

Nvidia have announced today that they are releasing technologies - the product of a decade's work - that will make real-time ray tracing a reality.

Now we could see truly cinematic graphics in video games as soon as this year - and not in the sense that's been thrown around as a meaningless buzzword for years.

Ray tracing is a rendering technique that was previously limited to films and television. Render farms could plug away at it in their own time to get the right results, but it was not something that could be achieved on home hardware. Not without things catching on fire, taking a very long time, or both.

It renders realistic graphics by simulating how rays of light interact with all of the objects in a scene - calculating reflections, refraction, shadows, the whole shemozzle - then turning those calculations into images that can be indistinguishable from those captured on camera. Basically, it's a way to accurately imitate light.
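At its core, the loop being described can be sketched in a few lines: fire a ray per pixel, find what it hits, and shade the hit point based on the light. The toy scene below (one sphere, one light, all names invented for illustration) is a minimal sketch of the idea, not Nvidia's RTX API:

```python
import math

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along the ray to the nearest sphere intersection, or None."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4.0 * c  # direction is assumed unit length, so a == 1
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-6 else None

def trace(origin, direction, center, radius, light):
    """One ray: hit test, then a simple diffuse shade toward the light."""
    t = ray_sphere_hit(origin, direction, center, radius)
    if t is None:
        return 0.0  # background: no light contribution
    hit = [o + t * d for o, d in zip(origin, direction)]
    normal = normalize([h - c for h, c in zip(hit, center)])
    to_light = normalize([l - h for l, h in zip(light, hit)])
    # Brightness falls off as the surface turns away from the light.
    return max(0.0, sum(n * l for n, l in zip(normal, to_light)))

# Camera at the origin looking down -z at a unit sphere, light up and to the side.
brightness = trace([0, 0, 0], [0, 0, -1], [0, 0, -3], 1.0, [5, 5, 0])
```

A real renderer repeats this millions of times per frame and spawns secondary rays for reflections and shadows, which is exactly why the technique has historically been confined to render farms.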

Doing this in real time wasn't possible. Now it is, and Nvidia think it will shape the next 20 years of computer graphics.

"GPUs are only now becoming powerful enough to deliver real-time ray tracing for gaming applications, and will usher in a new era of next-generation visuals," said Tony Tamasi, senior vice president of content and technology at Nvidia.

The new technology is called Nvidia RTX and it's a set of hardware and software solutions that allows for what Nvidia describe as "real-time cinematic renderings". This is the product of a decade of research and development into graphics hardware and software algorithms.

Let's get something out of the way. The only hardware capable of making full use of this technology is Nvidia's Volta GPUs. There's only one of those available right now - the Titan V. It costs $4700 and is targeted at developers. The hardware side of this equation is still out of reach for now.

Yet there's also the software side which is being used in demos shown at GDC today from Unity, Epic and Remedy. We might not see RTX's full potential for some years but the early seeds are starting to sprout.

Microsoft and Nvidia have worked together to create DirectX Raytracing. This is separate from RTX, but it could kickstart the use of real-time ray tracing by putting it into an API developers already rely on. More importantly, the widespread use of DirectX should allow for faster adoption of this new ray tracing technology. DirectX Raytracing is expected to work on a wide range of graphics cards.

"DirectX Raytracing is the latest example of Microsoft's commitment toward enabling developers to create incredible experiences using cutting-edge graphics innovation," says Max McMullen, Microsoft's development manager of Windows Graphics and AI.

Nvidia are also introducing ray tracing tools to their GameWorks software development kit. Right now it is in early access, allowing a small group of developers to experiment with ray traced shadows, reflections and ambient occlusion. Soon it will be made more widely available. Again, this is limited to Volta GPUs.

The focus is to get this technology into the hands of developers. Tools take time to learn. Games take time to make. The sooner those aspects get handled, the sooner RTX-featured games can make their way into our hands.

"We were surprised just how quickly we were able to prototype new lighting, reflection and ambient occlusion techniques, with significantly better visual fidelity than traditional rasterization techniques. We are really excited about what we can achieve in the future with the Nvidia RTX technology. Gamers are in for something special," says Mikko Orrenmaa, technology team manager at Remedy Entertainment, makers of the Northlight engine.

In fact, we could even see games that use elements of this technology later this year. It's ambitious but what about RTX isn't?


    I'd imagine we shouldn't expect to see this on consoles outside of specific and compromised scenarios.

      Clearly not on our current ones (as the article makes very clear).

      If consoles are still a thing in 10 years' time, then almost definitely, unless something better comes along in between.

        I dunno, games today are still small room, graphically cheating set pieces, I can't imagine it getting that much better years down the line.

    Yes, this is definitely a future tech, but it has some very promising potential. Ray tracing is a per-pixel operation, so, in theory, the complexity of the scene has relatively little bearing on performance - only the number of pixels that need rendering...

    So if we can get it up to the level where rendering a 4k screen is viable, then things should start to get very, very pretty...

    Anandtech has a good article that explains what's going on here: https://www.anandtech.com/show/12546/nvidia-unveils-rtx-technology-real-time-ray-tracing-acceleration-for-volta-gpus-and-later

    Basically it seems like this isn't proper Pixar-level raytracing (because that requires CPUs running at around 20-30GHz) but rather a hybrid approach that uses raytracing for better shadows and lighting off reflective surfaces, while using rasterisation for everything else.

    I'm still getting my head around it a bit (and according to Anandtech the announcement was a little light on technical detail) but that seems to be what's going on here.
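The hybrid approach this comment describes - rasterise first, then trace rays only for the hard effects - can be sketched roughly as below. All names and numbers here are hypothetical stand-ins for illustration, not the actual RTX or DXR pipeline:

```python
def shade_pixel(surface, lights, trace_shadow_ray, trace_reflection_ray):
    # Step 1: base colour comes from ordinary rasterised shading.
    colour = surface["albedo"]
    # Step 2: ray-traced shadows - darken if a shadow ray toward a light
    # is blocked by other geometry.
    for light in lights:
        if trace_shadow_ray(surface["position"], light):
            colour *= 0.4  # in shadow for this light (factor is arbitrary)
    # Step 3: ray-traced reflections, only on shiny surfaces.
    r = surface["reflectivity"]
    if r > 0:
        reflected = trace_reflection_ray(surface["position"], surface["normal"])
        colour = colour * (1 - r) + reflected * r
    return colour

# Hypothetical stand-ins for the expensive ray tracing step:
in_shadow = lambda point, light: True   # shadow ray hit some occluder
mirror = lambda point, normal: 0.2      # colour seen in the reflection

demo = shade_pixel(
    {"albedo": 1.0, "position": (0, 0, 0), "normal": (0, 1, 0), "reflectivity": 0.5},
    lights=[(5, 5, 0)],
    trace_shadow_ray=in_shadow,
    trace_reflection_ray=mirror,
)
```

The appeal of the split is that the cheap rasteriser still does the bulk of the work, and rays are spent only where rasterisation's approximations are most visible.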

      "(because that requires CPUs running at around 20-30GHz)"

      If that were true, Intel and AMD (and their shareholders) would be beating down their door.
      Pixar doesn't use faster processors - it just uses (many) more of them. Their animator workstations would be something like 16-core Xeons, and the real grunt work for final rendering would be done on a render farm with thousands of CPUs sharing the work between them.

        But can that farm run Crysis?


          Old joke, I know, but I just had to share - I cracked similar jokes with my PhD classmates back when Crysis was new.

          An old and simple form of cluster was called a Beowulf cluster, and I used to imply one needed such a thing just to open the main menu.


          Sure. That farm can run the prettiest, most lifelike version of Crysis you'll ever see. At about 0.000012 frames per second.

          If anyone is interested in the different requirements between GPU real-time raytracing and cinema quality raytracing/rendering, read the brilliant response to this question on Stack Overflow:


          (Disclaimer: I haven't watched the NVIDIA video yet, so not sure how relevant it is to this new announcement)

        Sorry, I wasn't fully clear on what I meant. For Pixar-style raytracing in *real-time* you need a CPU running at around 20-30GHz.

        CPUs haven't got any faster in around 10-15 years for reasons nobody has ever really explained to me (something to do with power requirements?) so it's basically an impossible dream until someone figures out how to do it on the GPU/in a massively multithreaded way.

        (Incidentally, the MAME guys once estimated that a 20GHz CPU was what would be needed for cycle-accurate Sega Saturn emulation.)

    Oh pretty..

    Yes I am leading this comment section with the most technical examination :)

