If you were hoping Nvidia might talk about PC hardware during its presentation at CES 2022, perhaps lower your expectations. The RTX 3090 Ti did get a momentary look-in, but that was really about it. The show did involve games though, in a roundabout way. Nvidia is making a serious push into the nascent Metaverse space and aiming to become a force in both business and video game development.
Nvidia Omniverse is “a scalable, multi-GPU real-time reference development platform for 3D simulation and design collaboration, and based on Pixar’s Universal Scene Description and Nvidia RTX technology.” Essentially, it’s a digital suite in which users can create or model 3D environments, among a number of other features. The product itself is largely used for one of two purposes: Digital Twins and creative modelling.
Digital Twins are 1:1 digital replicas of real spaces. Users can inhabit that space without ever actually having to visit the real thing. This is useful for companies like BMW, which used Omniverse’s Digital Twin features to construct an end-to-end, CAD-certified replica of its automotive factories to work out how it could streamline workflows and improve construction efficiency.
Omniverse is also being built to power a number of real-world AI applications through Nvidia’s Isaac platform. These AIs will run through the Isaac AMR platform for Autonomous Mobile Robots. AMR bots are day-to-day robots controlled by a central AI, useful for inventory and warehouse management. Similar software in Omniverse will power auto-driving AI, an interactive concierge for your car, and even Project Tokkio, a little on-screen character that handles information and ordering at restaurants.
Stay with me. I know this is dry, but I can tie it back to video game coverage, I swear.
The other aspect of Nvidia Omniverse is its creative modelling suite. This allows artists using any number of design or modelling programs to create the detailed environments of their dreams, all within the scope of Nvidia’s proprietary RTX technology. Modelling. Physics. Pathfinding. AI. It can do it all, natively.
Here’s where things got a bit more interesting.
This ties into Nvidia Studio, which is getting a bit of a revamp for individual users. Studio is all about creating these Metaverse worlds and spaces, but it can also be used to create whole game worlds too. This all comes with access to the best of RTX’s bells and whistles, including ray tracing. The overall impression I got during the pre-briefing, where Nvidia walked us through its plans, was that the company is interested in becoming a new name in the game development world. Though Unreal certainly featured among the names Nvidia rattled off as compatible with its software, it was quite careful not to bring them up otherwise.
What Nvidia sees as Studio’s major benefit is that, according to them, its workflows and overall ease of use are significantly better than any other design platform out there. Fold in all these different design ecosystems — Unreal, Blender, Adobe CS, ArcGIS CityEngine, Houdini, and many more — and suddenly you have an all-in-one game design studio for the layman and the professional alike. The Metaverse applications will keep the corporate side happy, and it’s likely there that Nvidia will be expecting to drum up the bulk of its business. But the game design aspect is intriguing.
Like a lot of things at CES, this all remains high-flying back-of-the-serviette ideation until users can get their hands on it. But on paper, it certainly sounds impressive.