When word came out last month that upcoming dog simulator Call of Duty: Ghosts runs at a higher resolution on PS4 than it does on Xbox One, the reactions were loud and angry. Was this a launch-day anomaly, or will all multi-platform games perform better on Sony’s console?
Today, the people who make Call of Duty have an official explanation, and it ain’t great news for Microsoft. Speaking to Eurogamer, Infinity Ward boss Mark Rubin put it quite candidly: on the Xbox One, they had to sacrifice resolution to get the game running at the 60 frames-per-second they wanted.
“It’s very possible we can get it to native 1080p [resolution]. I mean I’ve seen it working at 1080p native,” Rubin said. “It’s just we couldn’t get the frame rate in the neighbourhood we wanted it to be. And it wasn’t a lack of effort. It wasn’t that it was like last minute. We had the theoretical hardware for a long time. That’s the thing you get pretty quickly and that doesn’t change dramatically.
“It was more about resource allocation. The resource allocation is different on the consoles. That huge web of tangled resources, whether it’s threads-based or if it’s GPU threads or if it’s memory – whatever it is – optimisation is something that could go theoretically on forever.”
In conversations with Kotaku, game developers have made it clear that the PlayStation 4 is, on paper, the more powerful system. But some of our best sources are split on the long-term ramifications of that power difference. Are these optimisation challenges that third-party developers will overcome, or will the PS4’s faster memory be a long-term advantage that continues to allow games like Call of Duty to run at higher resolution on Sony’s new machine?
Rubin says he thinks — and hopes — that “both platforms will look way better” in the future.
“First launch, first time at bat at a new console is a challenging one,” Rubin said to Eurogamer. “That’s just the way it is. For people fearful one system is more powerful than the other or vice versa, it’s a long game.”
Comments
46 responses to “Call Of Duty Makers Explain Why Resolution Is Lower On Xbox One”
First-generation games are always the most difficult to program. It's the 2nd, 3rd and onwards where they really begin to shine. Think back to when we had games like 'Far Cry: Predator' and all those cross-platform versions of games coming out on both the original Xbox and the 360. This is really neither here nor there.
I actually really like Far Cry: Predator. I know it wasn’t as open as the PC version, but the levels were pretty massive for console at the time. I had a really good time playing it one summer.
Yeah, it wasn't a bad game at all. All I mean is that even back then we had different versions of games made across different platforms, with different resolutions for the same game, even between 360 and PS3 in some cases. It's nothing really new.
I would assume that the sort of people who care about things like ensuring it's 60fps at 1080p are more likely to buy the PC version though, right?
Not necessarily. The biggest difference between PC and console to me is the clarity of 1080p. If it was a game which I expect to have mod support such as a potential Fallout 4 or an Elder Scrolls, I’ll go for PC. On a run of the mill multiplat like Assassin’s Creed, CoD, or what-have-you I wouldn’t really care whether I have it on PC or console, as long as it’s in 1080p.
A game like CoD, I honestly couldn't care what res it's in… It's going to be the same for me no matter what, as long as there are players online and the game plays well. The new-school noobs wouldn't know, but most competitive FPS players run games at low res and low textures and crank out as many frames as possible to gain any advantage.
Now, Forza 5 is 1080p at 60 frames; that's all I need to know. It's the difference between a 10-hour game like CoD and 300 hours of Forza.
Well, my stance is that if resolution ever mattered for a genre, it's shooters. You need those extra pixels to see what you're shooting at.
What if it’s not real 1080p though? Both PS4 and XB1 versions of CoD are upscaled (from 900p and 720p respectively). And what if to get 1080p, you have to drop to 30 or 45 fps?
If it’s not 1080p I’m not really interested, I’d just buy it on PC, unless it’s an exclusive I’m interested in. And I’d usually be fine with the framerate loss, as long as it’s consistent. 30fps or higher is fine by me, as long as it’s constant 30fps. 60fps that keeps dropping to 30 is unacceptable IMHO.
Why settle for 1080p?
The jump to 1600p is very much worth it.
COD: Ghosts is 1080p native on PS4 and 720p native upscaled to 1080p on Xbox One.
I think you meant BF4 which is 900p native upscaled to 1080p on PS4 and 720p upscaled to 1080p on Xbox One.
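For a sense of scale behind those figures, here's a quick illustrative calculation of the raw pixel counts involved (editor's sketch, not from the thread):

```python
# Pixel counts for the render resolutions being discussed.
resolutions = {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

for name, count in pixels.items():
    share = count / pixels["1080p"]
    print(f"{name}: {count:,} pixels ({share:.0%} of native 1080p)")
```

A 720p frame has under half the pixels of a native 1080p one (roughly 44%), and 900p about 69%, which is why the upscaled versions look noticeably softer on a large TV.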
Sorry, my bad. It was one of them, they’re all the same to me =P
Yea totally, because you can play PC games on your PS4/Xbone.
I think his point was if 1080p@60 is important to you, you’re better off getting your games on PC and not buying a PS4 or XB1. From what we know at the moment, there’s a decent library of games on both platforms that are running below native and are upscaled, which produces visible aliasing.
Yes, I know, I was being a dick (for a change) because he was being a smug, smart ass.
Fair enough, carry on =)
If this new gen can't hit 60fps at 1080p, then the PC has won the games race. And the fact that a launch game from one of the biggest franchises in the world (read: money and resources) can't get that on one system IS a big deal.
It's about price vs experience. And for many people there just isn't that much more 'experience' to justify the cost of the hardware. For $500 you get a console that has guaranteed support and will run games in a playable fashion. PC… not so much.
I own a PC and consoles, and maintaining the PC isn't cheap. And before we start down the road of 'but you don't have to spend XXX amount on a PC to make it good', your argument is that 1080p 60fps is achievable on PC and not console. To play Crysis 3 on console you buy the game and it works. To play Crysis 3 on PC (at the highest settings, because you want that 1080p 60fps with a tangible visual difference) you need a PC with, say, an AMD R9 290. So the GPU alone is $500. (BTW I play Crysis 3 on my PC and yes, it's very pretty maxed out. Worth the extra $$$… well, that varies from person to person.)
So yes. You can get a better experience on PC for sure. But it’s not always worth it when you can still enjoy the game on console.
I don't know. It mattered more on PC because people sit closer to their monitor than their TV, but people have pretty massive TVs these days, so the difference between 720p and 1080p is pretty noticeable. I agree that FPS is more important, but ideally every game this generation would have been set to 60fps at 1080p.
Careful! The last time I said something like that, I had someone who will not be named ranting that there was no visible difference between 720 and 1080p on my tv and that it was all in my head.
He’s an electrical engineer don’t you know. So he knew what I was seeing better than I did!
Whereas me? I think 1080p looks much better. IT LOOKS BETTER!
And you're right, ideally that'll happen, but in practice it might not. If I were a consoler I'd still be really happy with 720p at 60fps with really strong anti-aliasing applied. That would look pretty great.
I agree that 720p won't look that bad; in fact my TV is a 32-inch 720p set. That's mostly because I don't actually use it that much, just for watching the odd movie. If I gamed on it more, I'd probably want something bigger and sharper (like most people) and a console that can realise the benefits of that. To be honest though, I'm one of those people who likes to use PC monitors even for consoles, to reduce latency.
Off topic, kind of.
Why is the PC version only 34GB and the console one 49GB?
I’ve wondered that too. I’m guessing compression?
You are correct – better compression schemes.
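As a rough illustration of how much a general-purpose compressor can save on redundant data (Python's zlib here; the packaging the game actually uses isn't public, so this is only a sketch of the principle):

```python
import zlib

# Game assets are often highly redundant, and a compressor exploits that.
# Consoles sometimes ship assets less compressed so they stream faster
# from optical disc or a slow HDD, at the cost of install size.
raw = b"texture block " * 100_000  # ~1.4 MB of very redundant bytes
packed = zlib.compress(raw, 9)    # level 9 = maximum compression

print(f"{len(raw):,} -> {len(packed):,} bytes")
# The packed size ends up well under a tenth of the original here.
```

Real textures and audio don't compress anywhere near this well, but the same trade-off (install size vs decompression/streaming cost) would explain a 34GB vs 49GB gap.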
On disc DLC.
Wait… This isn’t a Capcom game…
Also on the PC topic. There are no dedicated servers. It’s P2P.
There is also no ping or latency information, which is made extra frustrating by the fact that the P2P matchmaking seems to take no account of where the players are, making for a horrendously laggy experience when you're in Australia and being put into games with people in Europe.
I had read it was going to have dedicated servers >.< I played it a bit and was on 200ms the whole time, lagging like hell. It was very annoying.
http://www.gamespot.com/articles/call-of-duty-ghosts-will-get-dedicated-servers-on-pc-current-and-next-gen/1100-6415420/
Anyone know when we’ll be getting servers for PC?
How’d you find out your ping?
Resource Monitor, via Task Manager: click on the CoD process, which is some random characters with '64' in the name, and it tells you the ping.
I was hoping for a more CoD-like answer: 'cause it's gay, or 'cause I slept with your mum last night, or something like that. Instead it makes sense. lol
Good thing this is brought to light now, though I do hope that this does not happen with other games down the track.
What if it’s not real 1080p though? Both PS4 and XB1 versions of CoD are upscaled (from 900p and 720p respectively).
The PS4 is 1080p native. It was listed here on Kotaku a few days ago. Or were you referring to other games being upscaled on PS4?
The exact same argument was made between the PS3 and Xbox 360 when they were released.
Games released on both consoles would almost always look and perform better on the 360. The argument was always “The PS3 is hard to develop for. Wait until developers know how to make games for the PS3. Just wait, you’ll see.”
But the fact was, seven years later, the 360 still had the better looking and performing games.
It was the same for the PS2 and first Xbox. The Xbox had the better looking version.
So, mark my words, I'm calling it right now: for this generation, games released on both consoles will look and perform better on the PS4.
The PS4 is a games machine first and foremost.
The Xbone is bogged down with so much other multitasking, which pulls resources away from game performance.
I don't agree. Lately (albeit probably too late in this generation), games have generally been running better on the PS3. Just look at GTA 5.
You don’t agree because one game, after seven years, is slightly better on PS3 than 360?
The entire back catalogue of games isn’t representative?
Well if games are beginning to run better on PS3, then the prediction you quoted was somewhat accurate, although I think it is a moot point in a practical sense given that it’s only just starting to level out on a few titles at the very end of the generation.
The point is, whereas the 360 and PS3 had hugely different architectures, and thus it was much more difficult to cross-develop for PS3 and 360 (to my understanding), it seems Sony will be making the PS4 easier to develop for and MS will be keeping their console at about the same level, which leads me to believe this effect will be much more pronounced this generation.
I’m not trying to say “PFFT, you’re an idiot and you’re wrong”, just throwing my observations out there for consideration.
Do you mean “all games”, or specifically multiplats?
Multi platform releases.
Can't compare a game only released on one console.
It’s lowest common denominator development. The most popular console will get the lion’s share of attention and care when it comes to third party games. The 360 was leading the PS3 most of this generation so games generally looked better on Xbox. However… the exclusives on both consoles looked better than any cross-platform title, and the ones that came out at the tail end of the PS3 like Last of Us were the most beautiful of the whole generation.
This time around, I think the PS4 will continue to have the better-looking exclusives. I think it will sell far more consoles out of the gate, I think the shared x86 architecture will be less of a problem than Cell last time, and I think it will be easier for devs to leverage the PS4’s hardware superiority.
What gets me about it is that both companies were boasting about their hardware, and straight away it seems it's not enough. The PS3 got better as time went on, as devs learned to use the hardware and Sony's complicated tools. But this is common hardware, and it seems it's behind already.
I can’t figure out why they chose the graphics cards they did, to be honest. They’re a good generation or two behind the current series, certainly in performance. I guess they went with what they did to keep costs down but it does mean the systems are behind before they’re even out of the gate.
I’m confused, so what exactly does the “cloud” do again?
Well, that's a very diplomatic non-answer. I understand cutting resolution in favour of smoother frame rates, but they didn't need to make the same concessions on PS4. Just once I'd like a PR guy to acknowledge the only logical conclusion (the PS4 is more powerful) and answer the yes/no question of whether it will continue to pan out this way.
It’s not the only logical conclusion. As the previous generation showed, it can also be explained by how difficult it is to develop for the platform. He said above that resource allocation optimisation was an issue they faced, which, like the PS3, is a skill cap that gets overcome later in the console’s lifespan as developers become more familiar with how to work with the hardware.
Aren’t both of these consoles described as being 4k capable though?
By the sounds of this, that won’t apply when talking about games.
Even current-generation PCs take an SLI card setup to handle 4K resolution for gaming; there's no chance these consoles will handle it with the hardware they have.