We’re Getting Dangerously Close To Photo-Realistic Video Game Landscapes

What you’re seeing here is a process known as “scanning”, or “photogrammetry”. It’s not new: it has already been used to generate the landscapes in games like Star Wars: Battlefront. But as the technology involved (and the skill of the artists using it) advances, so too do the results.

Just look at this.

That’s not video, that’s a 3D landscape, running in real-time — at 4K resolution — in Unreal Engine 4. It’s the work of former DICE artist Rense de Boer.

While the results can look almost perfect, the science behind photogrammetry isn’t that exotic; it was simply waiting for hardware and software powerful enough to keep up with its processes.

Essentially, it involves taking a ton of photos of real objects and places. Those photos are then processed, aligned and stitched together into 3D meshes and textures. Artists then jump in to smooth out any rough edges, adding their own touches and effects for things such as weather.
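
For the curious, here’s roughly what that photo-to-mesh step can look like in practice. This is a minimal sketch that drives COLMAP (a free, open-source photogrammetry toolkit) from Python; it is not the pipeline Rense or DICE actually use, and the folder names and settings are just placeholders.

```python
import subprocess
from pathlib import Path

# Minimal photogrammetry sketch using COLMAP's command-line tools.
# Assumptions: COLMAP is installed and on PATH, a CUDA-capable GPU is
# available for the dense stereo step, and ./photos/ holds the source shots.
project = Path("scan_project")
images = Path("photos")
db = project / "database.db"
sparse = project / "sparse"
dense = project / "dense"
for d in (project, sparse, dense):
    d.mkdir(parents=True, exist_ok=True)

def run(*args):
    """Run one COLMAP step and stop if it fails."""
    print("running:", " ".join(args))
    subprocess.run(args, check=True)

# 1. Find features in every photo and match them across overlapping shots.
run("colmap", "feature_extractor", "--database_path", str(db),
    "--image_path", str(images))
run("colmap", "exhaustive_matcher", "--database_path", str(db))

# 2. Recover camera positions and a sparse point cloud (structure from motion).
run("colmap", "mapper", "--database_path", str(db),
    "--image_path", str(images), "--output_path", str(sparse))

# 3. Densify the reconstruction and fuse it into a single point cloud.
run("colmap", "image_undistorter", "--image_path", str(images),
    "--input_path", str(sparse / "0"), "--output_path", str(dense))
run("colmap", "patch_match_stereo", "--workspace_path", str(dense))
run("colmap", "stereo_fusion", "--workspace_path", str(dense),
    "--output_path", str(dense / "fused.ply"))

# 4. Turn the fused point cloud into a mesh artists can clean up by hand.
run("colmap", "poisson_mesher", "--input_path", str(dense / "fused.ply"),
    "--output_path", str(dense / "mesh.ply"))
```

From there, the cleaned-up mesh and its baked textures get imported into an engine like Unreal Engine 4, where the hand-authored materials, lighting and weather effects come in.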

EA actually wrote a whole blog post on Battlefront’s photogrammetry last year; it’s a good read if you want to get up to speed.

And if you want to take a closer look at some of the landscapes featured in Rense’s video, here are some stills:

[Four stills from Rense de Boer’s video]

Comments

    • If it were just texturing 3D models with photos, we’d have had highly realistic graphics for decades, because that’s standard procedure. This is a little more involved than that: it involves actually building the 3D model from the photos. It’d probably help with displacement and normal maps to some extent too, but specular and emission maps would still have to be done by hand (see the normal-map sketch after this thread). There’d be a ton of manual adjustment required as well, but the result tends to be a lot better quality than your typical handmade model with a photo texture applied.

  • I think it looks cool, but Euclideon’s engine’s version of using atoms is pretty awesome. Was there ever an article on it?

  • Ah, more of what these engines can apparently do, but not what they can actually do in a game.

    Is Unreal just a tech demo? I feel like I’ve been watching pretty demos for years now.

    • Why use any other engine when you can just fart any old half-assed garbage out of the Unity tools onto Steam Early Access? Hell, bolt on some low-rent VR support and you’ve got yourself an Oculus store smash hit for $20.

      • Not sure what you’re asking. UE is an engine; it’s been used in tons of titles, from the original Deus Ex to the BioShock series, Splinter Cell, Tribes: Vengeance, the Borderlands series, the Mass Effect series, etc.

      The reason you’re seeing more experimental projects with it lately is that, since UE4, it’s a free download; you only owe them money once you start making money from it yourself. If your game/app doesn’t make money, it’s completely free.

        • Yeah, but there have been so many videos of the amazing stuff the latest version can do, and yet it’s not translating into games. Just because it’s done in a game engine doesn’t mean it’s anywhere close to what we’ll soon be getting in actual games.

          • That’s right. There are a lot of interacting systems in an actual complete game of any complexity, and they don’t always play well together. Stuff like the shots above isn’t especially hard on the hardware by itself, but if a game with 30 systems running together started chugging, hyper-detail would probably be one of the first things to go. It just depends what you’re making.
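
To illustrate the point in the first comment about normal maps: below is a rough, generic sketch of deriving a tangent-space normal map from a height (displacement) map using NumPy and Pillow. The filenames are placeholders, and this isn’t any particular studio’s tooling; it’s just one common way part of that map-authoring work gets automated.

```python
import numpy as np
from PIL import Image

def height_to_normal_map(height_path, out_path, strength=2.0):
    """Derive a tangent-space normal map from a grayscale height map.

    Assumes an 8-bit grayscale input; 'strength' exaggerates the bumps.
    """
    h = np.asarray(Image.open(height_path).convert("L"), dtype=np.float32) / 255.0

    # Finite-difference gradients approximate the surface slope in x and y.
    dz_dx = np.gradient(h, axis=1) * strength
    dz_dy = np.gradient(h, axis=0) * strength

    # The per-pixel normal is (-dz/dx, -dz/dy, 1), normalised to unit length.
    normals = np.dstack((-dz_dx, -dz_dy, np.ones_like(h)))
    normals /= np.linalg.norm(normals, axis=2, keepdims=True)

    # Remap from [-1, 1] to [0, 255] for the usual blue-purple normal map encoding.
    rgb = ((normals * 0.5 + 0.5) * 255).astype(np.uint8)
    Image.fromarray(rgb).save(out_path)

# Placeholder filenames for illustration only.
height_to_normal_map("rock_height.png", "rock_normal.png")
```

Specular and emission maps, as that comment notes, still come down to an artist’s judgement.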
