If you want the best 3D graphics has to offer, then raytracing is where it’s at. By simulating the path each photon takes from a light source to a surface, you can theoretically render lifelike, if motionless, scenes. Offline, you have all the time in the world, but in real-time… you have about 16 milliseconds. We’re not at the point where real-time raytracing is practical for games, but we’re inching closer, as this experiment with Quake 2 shows.
What you’re looking at is a GPU-based raytracing engine, made for Quake 2 by Edd Biddulph.
You’re probably wondering why the footage is so grainy. Don’t worry, it’s intentional… sort of. Raytracing works by, well, tracing rays through the scene — typically backwards, from the camera out into the world — and calculating the results as objects are hit. The more rays you trace per pixel, the better the quality and the less noisy the final image becomes.
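That “more rays means less grain” behaviour is just Monte Carlo averaging. Here’s a toy sketch (not Biddulph’s actual code — the scene and the 0.5 “true brightness” are made up) showing a pixel estimate converging as the sample count grows:

```python
import random

def shade_pixel(num_samples):
    """Estimate a pixel's brightness by averaging many randomly
    jittered rays. Hypothetical scene: the true brightness is 0.5,
    and each ray returns 0 or 1 depending on whether it happened
    to reach the light."""
    hits = sum(1 for _ in range(num_samples) if random.random() < 0.5)
    return hits / num_samples

random.seed(1)
print(shade_pixel(4))      # very noisy estimate
print(shade_pixel(10000))  # converges toward the true 0.5
```

With four rays the answer can land almost anywhere, which is exactly the per-pixel speckle you see in the video; with thousands of rays it settles down.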
And there’s more going on than just figuring out how bright pixels should be. There’s shadows (and other forms of occlusion) to think about and light doesn’t just “stop” when it hits something — it’ll almost certainly reflect elsewhere and those rays need to be computed as well to provide “bounce” light (also referred to as global illumination).
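The “bounce” part is usually handled recursively: at each hit point you add the direct light, then follow a reflected ray and add whatever it gathers, up to some depth limit. A minimal illustration (all the numbers are invented — a real tracer would intersect geometry and evaluate materials at every hit):

```python
MAX_DEPTH = 3  # how many bounces of indirect light to follow

def trace(depth=0):
    """Toy global-illumination step: direct light at the hit point,
    plus a fraction of whatever the next bounce collects."""
    if depth >= MAX_DEPTH:
        return 0.0                           # stop following the ray
    direct = 0.6                             # light arriving directly
    reflectance = 0.5                        # fraction the surface re-emits
    bounce = reflectance * trace(depth + 1)  # indirect ("bounce") light
    return direct + bounce

print(trace())  # direct light plus two diminishing bounces
```

Each extra bounce multiplies the ray count, which is a big part of why the technique is so expensive.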
An offline renderer in a 3D modelling program such as Maya or Blender can take as long as it wants to compute a scene, so firing out a bazillion rays and waiting a few hours isn’t a big deal. For real-time applications such as games, however, you need to render a scene not once but 30 or 60 times per second.
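The arithmetic is brutal. Assuming a (hypothetical) GPU that can trace a billion rays per second, a 60 fps frame budget at 1080p leaves only a handful of rays per pixel:

```python
# Rough per-pixel ray budget at 60 fps, 1080p. The throughput
# figure is an assumption for illustration, not a measured number.
frame_budget_s = 1 / 60            # ~16.6 ms per frame
pixels = 1920 * 1080
rays_per_second = 1e9              # assumed GPU ray throughput
rays_per_frame = rays_per_second * frame_budget_s
rays_per_pixel = rays_per_frame / pixels
print(round(rays_per_pixel, 1))    # ≈ 8 rays per pixel
```

Eight rays per pixel — before shadows, bounces, or anything else — is why the image is so grainy.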
If you want to learn more, I highly recommend this cute video from Disney Animation Studios.
Anyway, back to Quake 2. Sadly, the video compression makes everything look worse — codecs have a lot of trouble handling such high-frequency, noisy footage. Fortunately, an uncompressed video is available… all 1.2GB of it.
You don’t need the raw clip though. If you watch the YouTube video closely, it’s possible to see the increased fidelity of lighting and shadows.
Towards the end of the video, Biddulph increases the number of samples per pixel, which reduces the noise, but the framerate plummets (as expected).
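That trade-off is baked into the math: Monte Carlo noise falls off roughly with the square root of the sample count, so halving the grain costs about four times the rays. A quick demonstration (toy pixel model, not the engine’s code):

```python
import random
import statistics

def pixel_estimate(n, rng):
    """Average n noisy samples for one pixel (toy model)."""
    return sum(rng.random() for _ in range(n)) / n

rng = random.Random(0)
for n in (4, 64, 1024):
    # Spread of 200 repeated pixel estimates = visible noise level.
    estimates = [pixel_estimate(n, rng) for _ in range(200)]
    print(n, round(statistics.stdev(estimates), 3))
```

The spread shrinks by roughly 4× for every 16× more samples — so cleaning up the image a little costs a lot of framerate.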
Yes, real-time raytracing has a long way to go, but it’s great to see what could be possible in five or ten years.