Take A Look At How Microsoft Claims The Cloud Will Change Gaming

Microsoft has talked about how its next-gen cloud infrastructure is going to change things for video games. Today, we can finally see what that looks like.

Back when Microsoft was revving up to launch the Xbox One last year, cloud computing access was one of the features it trumpeted as a big differentiator for its console. The basic idea was that game developers would be able to offload processing chores for things like AI behaviour to remote servers, letting the local hardware focus on other tasks like rendering an environment. Representatives from Microsoft showed it to the press at E3 last year and we described what it looked like. But now you can finally get a look at this important feature in action.
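To make that idea concrete, here's a rough sketch of the pattern (in Python, with a made-up remote_simulate function standing in for a call to something like Azure): kick off the expensive simulation work asynchronously and keep rendering with the most recent results that have come back, instead of stalling the frame while you wait.

    import concurrent.futures
    import time

    # Hypothetical stand-in for a cloud physics/AI service. In a real game
    # this would be a network request; here we just fake the round trip.
    def remote_simulate(world_state):
        time.sleep(0.05)  # pretend ~50 ms of round-trip latency
        return {"debris": world_state["debris"] * 2}  # pretend heavy result

    pool = concurrent.futures.ThreadPoolExecutor(max_workers=2)
    pending = None
    latest = {"debris": 0}  # last results we actually received

    for frame in range(120):  # simplified render loop
        # Only kick off new remote work once the previous request is done,
        # so a slow response never blocks a frame.
        if pending is None or pending.done():
            if pending is not None:
                latest = pending.result()  # merge the slightly stale answer
            pending = pool.submit(remote_simulate, {"debris": frame})

        # Render every frame with whatever results we have. The cloud data
        # is always a few frames old, which is fine for debris or AI.
        print(f"frame {frame:3d}: drawing {latest['debris']} debris chunks")
        time.sleep(1 / 60)  # hold roughly a 60 fps frame budget

The point is that the frame never waits on the network; the tradeoff is that anything computed remotely arrives a few frames late, which is why this gets pitched for things like AI and debris rather than anything the player touches directly.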

First, the caveats: what you're looking at isn't Xbox One footage. It's a custom demo from Microsoft's yearly Build developer conference that's running on PC hardware. That means that bandwidth, connectivity and other computing factors are likely to be optimised in ways that aren't all that common in everyday scenarios. And the level shown isn't anywhere near a final retail version of a game; it's probably a prototype made to do this and only this.
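For a sense of why everyday connections are the sticking point: at 60 frames per second a game has roughly 16.7 ms to produce each frame, and a typical internet round trip can eat a couple of frames' worth of that on its own. A quick back-of-the-envelope calculation (the 30 ms round trip is an illustrative figure, not a measurement):

    # Rough frame-budget arithmetic for offloaded computation.
    FRAME_BUDGET_MS = 1000 / 60  # ~16.7 ms per frame at 60 fps
    RTT_MS = 30                  # illustrative internet round trip

    # A hop to a local GPU is measured in microseconds; an internet round
    # trip in tens of milliseconds. Anything computed remotely is at least
    # this many frames stale by the time it can be drawn:
    print(f"{RTT_MS / FRAME_BUDGET_MS:.1f} frames stale")  # ~1.8

That staleness is survivable for slow-changing things like collapsing building debris, which is exactly the sort of work this demo offloads.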

That said, it illustrates the coolness of cloud computing as it'd apply to game development. What you'll see is a skyscraper being shot up in real time on two machines, one connected to Microsoft's Azure cloud backend and the other all by its lonesome. The unconnected machine gets bogged down with an all-too-familiar framerate drop as more stuff starts happening onscreen. But the framerate on the connected machine stays more or less consistent as the destruction increases.

You can watch this presentation in the archived video stream on the Build website. The demo starts at 3:22:01 and wraps up at 3:24:19. This cloud assistance is already at work in Titanfall. It will be interesting to watch other developers and games put it to use too.

[Thanks Arekks Gaming!]


Comments

    after years of using remote servers at work, I'm incredibly weary of 'remote processing'

    edit: I'll expand on that. We have a remote desktop app at work that performs averagely on a day-to-day basis, and when (not if) it crashes, it's disastrous for the business. Apply that logic to a video game and I'm simply not interested.

    Last edited 04/04/14 11:31 am

      weary or wary? Both work :-). I agree with your experience; bandwidth is more of a limit than processing power in my experience.

    Still not convinced. Did not see a single explosion in the second video as large-scale as the first, only smaller ones...

    I think it is total BS: the potential for lag, inconsistency, internet issues, developer headaches, etc. Plus, cloud computing just doesn't suit realtime events.

    Maybe make it so you could add your own computer, or network two XBONES together to get better results; then, maybe.

    It would be cool if the new PS4 VR headset allowed you to hook up a second PS4 and double the power, for example; consoles get cheaper as time goes on, and that could be a viable way to boost performance.

    But cloud for realtime gaming, more trouble than it is worth. Not interested.

      Yeah. Broad AI and non-time-sensitive processes, sure, that could possibly work, but even then I've been having a ton of trouble with Ground Zeroes leaderboards all week, so I hate to think how that would play out with vital functionality.
      I've just got to wonder where they're going with this. They've got access to Xbox Live stats, they know how all this stuff works, and they know they have to deliver something significant with this eventually, or else they'd stop playing it up so much and quietly let people forget about it. So what's their plan?

    " what you’re looking at isn’t Xbox One footage. It’s a custom demo from Microsoft’s yearly Build developer conference that’s running on PC hardware. That means that bandwidth connectivity and other computing factors are likely to be optimised in ways that aren’t all that common in everyday scenarios. And the level shown isn’t anywhere near a final retail version of a game; it’s probably a prototype made to do this and only this."

    Pretty much sums it up.

    I don't understand how you could implement this in the real world. If a really powerful PC gaming rig cannot do the calculations, then wouldn't you need a few servers to do it for you? Who is going to pay for the multiple servers that are rendering my game?

    I remember Mark Cerny saying in an interview about 5 months ago that using servers for hosting games, loading maps and so forth is fine, but rendering in-game graphics or any other graphics processing is currently impossible with today's internet speeds and technology.
    Yes, he was plugging the PS4 at the time, but he knows his stuff!

    Didn't the original PhysX cards have noticeable latency issues because of having to pass data around? I doubt any internet connection is going to be any better.

    This article is almost a year old: http://www.eurogamer.net/articles/digitalfoundry-in-theory-can-xbox-one-cloud-transform-gaming

    I wonder what (if anything) has changed.

    Lag is the biggest problem for gaming, cloud just adds to the aggravation.

    Considering Microsoft's servers couldn't even handle all those people logging into Titanfall on day one, I doubt they'll be able to handle magical processing-power expansion for the next Halo or whatever. It's a nice dream, but it's just not gonna happen anytime soon to this extent.

    I think it is just spin to try and give people a reason to invest in an XB1 over a PS4.

    You know, if Australia had actually decent internet then maybe I would be convinced. I can see this working fine for other countries, but not ours.

    - Remove quotas
    - Give us NBN

    I'm not convinced that my computer couldn't handle that simulation... Give us the program to test ourselves if you want to convince us.

    Anything staged, and also literally done on a stage, is hard to believe on its own.

    Edit: Sure enough, the 'almost 5 teraflops' of power listed in that Eurogamer article is less than my 290X, and I would have to pay for internet data usage and put up with latency... Forgive me for going all master race here, but this needs to be significantly better, not slightly worse, than a high-end graphics card to be worthwhile.

    Last edited 04/04/14 4:05 pm

    Remote gaming doesn't work because we don't live in a world where everyone is connected all of the time. You are in essence creating an entry requirement that some people just won't be able to meet. You only have to look at the stink that was kicked up when people were told they only needed to connect their XB1 to the internet every so often to see that the world is not ready for any sort of system with that prerequisite.

    It just seems like a way to try to turn games into a service, and a service into more... money! No tyvm.

    Makes complete sense.
    Surely everyone has more bandwidth and less latency in their internet connection than the links to their cpu and gpu... right?

    Angry mobs used to run snake-oil salesmen out of town for this level of BS. I'll believe all this 'power of the cloud' when I see it running in people's houses with fairly run-of-the-mill internet connections. Until then, this thing just reeks of smoke and mirrors. Microsoft could have saved themselves the time and effort of getting people to believe all this nonsense by just forking out the money for better hardware in the first place (and probably spared themselves the embarrassment when, at the end of the console's life cycle, this gets looked back on as a failed and over-hyped promise).
