There’s a rumour that the next Call of Duty runs at 1080p on PS4 and 720p on Xbox One. For those who care, we’re looking into it with Activision, and will get back to you if we hear back.
Comments
105 responses to “There’s A Rumour That The Next Call Of Duty Runs At 1080p On PS4 And 720p On Xbox One.”
O_o ominous…
Not satisfied until I get 4K res.
Better go watch some porn then…
http://www.gizmodo.com.au/2013/10/4k-porn-is-here-and-shows-you-every-single-detail/
The other half is that apparently the PS4 version looks ugly. We already know both consoles can handle 1080p@60 with no problems so this seems more of a reflection on the developer than the hardware.
Genuinely interested to see this proof.
GG have confirmed this in KZ multiplayer, but not SP.
Forza 5 and NBA 2K14 have been confirmed for 1080p and 60fps.
A racer and a basketball game. Unfortunately, that doesn’t count in my book. :\
Any talk of what games like Ryse and other full-world games run at? I really want the vast majority (90%+) to be 1080p30fps at least, but I have my doubts.
Well, I’m going to avoid this thread for the rest of the day.
Why? I have a valid point. Dirt3 was a very early DX11 game but ran better than DX10 games. This is because racers have nicely detailed car models and perhaps ground, but the rest is far off in the distance so doesn’t need the horsepower.
Basketball is similar, small stage with limited assets, and even more limited polygon count. The audience is the only thing that would be excessive due to them being so close together, but devs usually put a sub-par audience in there anyway.
All up, those two types of games run a lot better than open-world games due to the graphical complexity. My point still stands.
You asked for evidence that the console is capable of 1080p at 60fps. These games show that it is. The complexity of the scene graph is completely subjective on a game-by-game and scene-by-scene basis, there will always be scenes that are too complex for the XB1 to render on those settings, just as there will be the same for the PS4.
I think GTA5 on current generation consoles showed nicely that graphics capability is in large part a function of optimisation efforts by the developer. My original point was simply that if there’s merit to the rumour, it’s more likely to be a developer issue. That’s not unusual in the early days of a new console generation.
If you ask me, both consoles should have used at least current generation graphics hardware. Why they went for older cards that are well behind the bleeding edge is beyond me.
Stickman, you are a moron… racing games are some of the most demanding on PCs, especially DiRT, which uses heaps and heaps of processing power on particle effects from light, dirt and other weather elements.
As for this rumour: it’s freaking Call of Duty. The people who play it don’t give a damn about graphics and resolution; they care about prestige and K/D spread. Trust me, this thing will sell like crazy on Xbox and Xbox One regardless.
Indeed.
Also, a lot of the periphery in driving games moves past quickly, so you can get away with a lot that you can’t in other games that people move through at a slower pace. It’s also easier to polish individual tracks, or I suppose a basketball court, than other games.
Running a game at 1080p is a hardware capability, not a software feature, so the only reason a game would not run at 1080p would be a developer issue, not a hardware issue on the part of the PS4 and XBone. You were provided empirical proof that the hardware is capable of running at 1080p 60FPS and you ignorantly chose to disregard it.
No, he’s making perfect sense. Neither of the games listed carries any real weight, graphically. Forza is still using 2D spectators, for goodness’ sake. When you get some confirmation on AC4 or Ghosts, with highly detailed worlds, I’ll be happy.
But given Ryse is being nerfed to 900p as mentioned below, I really do have my doubts.
Ryse will run at 900p from memory, and then be upscaled to 1080p.
Better than 720p then upscaled, I guess. Native 1080p would be better. We can only dream!
Most xbone games that have been announced are running at 800-900p, upscaled. Killer Instinct is 720p. This is from one of the Microsoft devs tweeting.
It’s really looking poor for xbox in the early days…
It will not affect sales in the slightest. I think people who have common sense will purchase a console that actually has AAA games, not mud.
You said there was no proof they can do it. Clearly they can
They “can do it” on current gen too. Wipeout HD was in native 1080p. Be realistic, we don’t care if they “can do it”, we want to know if it’s realistic to expect triple-A games like AC, CoD, Watch Dogs, BF4 and Ryse to run in native 1080p. And as evidenced by this rumour (if it’s confirmed) and Ryse’s resolution, it is not. You must understand the implications outside of this technicality.
Racing games and sports games are not difficult to optimise. There is nowhere near the detail required compared to games like shooters, open-world games, RPGs etc. You can be much more selective as to where low-res textures and low-poly models are placed. You need a decently textured environment and stunning-looking cars, and you can cut back on everything else and it will still look exceptional. Games like Assassin’s Creed, where you can scrutinise the tiniest things and where there may be cutscenes with up-close shots of 3D assets and textures, require much better optimisation and power.
As mattm said, a few games have been confirmed at that output already.
Thief runs at 30fps, so plz tell me where the 60fps 1080p comes from. As a PC gamer, my laptop nearly beats the Xbox and PS4, and my desktop is on a different lvl XD.
I’d argue that it’s both.
The developer might not spend the resources to make Version 2 as “good” as Version 1, but if the HW underlying V2 is a pain to do that for, and costs more than V1 to optimise (by a fair margin, for the sake of argument), then it’s also a HW problem.
“Not my job to fix your HW mistakes”, and so forth.
Yep, it’s a combination I expect. On the information we have right now I’m leaning more towards the developer being unfamiliar with how to optimise for the hardware, but who really knows right now. I think both consoles have made mistakes with their graphics hardware choice, frankly.
Sounds like a rumour two opposing fanboys start to try and say who has the better console
Well I know which I’d rather have. Ugly 1080p please.
That seems like an odd statement at face value. The only benefit 1080p brings over 720p is an improvement in visual quality. ‘Ugly’ obviously suggests a deficit in visual quality. Are you saying you prefer high fidelity crap to low fidelity goodness? No problem if you do, I’m just curious. Most opinions I’ve heard around here seem to be the opposite, and there’s been a huge resurgence of retro games lately that mostly fall into the latter.
Zombie, you are so missing the point.
The point was that a game forced to run at 1080p is sacrificing horsepower for a slightly better resolution over something that runs at less.
In a lot of circumstances I’d wager this horsepower would be better utilised in other places, such as lighting, shadows or draw distances, amongst other things that, when combined, would give a greater detail enhancement and overall picture quality than a slightly better resolution.
While it does come down to individual games and how well optimised a system is, the fact is that in some situations having a lower resolution will make the game better looking.
I’m sorry, I don’t see how what you said relates to either what I said or what Bangers said, except for the number 1080. Could you clarify what you’re replying to?
No, because 720p wastes all the fidelity. Why even bother with shiny graphics and nice textures if your edges are jagged and sharp textures aren’t able to be seen in full detail?
And what good is fidelity if it’s only used to render crap? If your processing power is expended on resolution, you may not have enough left to run antialiasing passes or render high resolution textures. Hence why I asked the question, given limited processing power, would you prefer higher fidelity with lower quality, or higher quality with lower fidelity?
Since you downvoted it, I take it you don’t think that’s a fair question to ask? Could you explain why?
Then why render nice graphics if it’s viewed through a crappy pixellated filter? It’s all up to your own preference but personally I think crap being rendered in clarity is preferable. Why? There’s a multitude of reasons, but the main one is that textures aren’t going to be a “high enough” resolution for a long long time. No matter how high you crank the texture resolution, unless it gets to an absolutely obscenely high number, you’re still going to have parts of textures which look low res. So really, it’s a question of
Crappy graphics rendered at 1080p, considered by many to be as high a resolution as we need
or
Slightly less crappy graphics, rendered at pixellated blurry 720p
Textures aren’t going to get lower than 1024×1024 either way, and it’s the main driving factor here. I don’t see any compelling reason why the tradeoff would be better by having 720p.
Especially when you think that quality and fidelity are different things. You’ve either got low fidelity rendered in high fidelity, or you have high fidelity rendered at a low fidelity resolution which makes it look no better than low fidelity graphics in comparison. The 720p option is more or less objectively worse because it drags down the quality of everything else.
Quality and fidelity are different things. Fidelity is the clarity of a reproduction from its source material. You can have a high fidelity digital image scanned from a photo, but if the photo was aged and colour-bleached, you’ve got an image that is high fidelity, low quality.
The same comparison could be made to video. Would you prefer to watch a 1080p camera rip of a movie where the colours are a bit washed out and the contrast is pretty bad, or a 720p Bluray rip? You’re saying the 1080p version is superior every time, but most people would disagree.
But that’s not the same at all. That isn’t even a parallel here. You’re talking about colours and contrast, which is absolutely unrelated on a graphical note. Graphics has nothing to do with contrast or colour or anything like that. The only things they’re really going to alter if they upgrade to 1080p are the resolution of textures, and perhaps some post-processing options like bloom or HDR or SSAO or what have you. It’s not as though the image itself is being lowered in quality; it’s the assets: 3D models, polycount, textures, alpha textures, shadow resolution (which is probably the biggest factor, now that I think of it). It’s not like there’s going to be a drastic difference. You’re going to see small changes in all of these aspects, not an overall change to the quality of the picture. The colours, contrast and saturation are all going to be as high quality as they can be at both resolutions. The only differences will be in mesh polygon count, texture resolution, shadow resolution, aliasing, and whatnot.
Having said that, could you give a compelling reason as to why 720p with normal assets is preferable to 1080p native with slightly lower resolution textures and slightly lower texture resolution? Especially when you consider the texture and shadow resolutions are effectively downsized because the render resolution is turned down. It’s essentially loss of pixels either way, difference is with 1080p you’re only losing pixels in textures and shadow resolution and postfx. With 720p you’re losing pixels in all of those due to the loss of pixels in render resolution.
@toasty_fresh
The comparison is absolutely relevant. The question I asked originally and have continued to hold through the entire conversation here is which is more important, fidelity or quality, in situations where you cannot maximise both. You seemed to have trouble with the difference between the two, so I explained them and gave you an example from other media to help illustrate that difference.
Graphics processing power is not unlimited. Every generation of every platform has had to contend with the question of what to do when the hardware limit is reached, and there are three main things that can be done.
– Reduce scene complexity. This is done by removing objects from the scene graph, and implementing techniques like occlusion culling.
– Reduce rendering quality. This is done by reducing or removing effects from the rendering pipeline, such as antialiasing, anisotropic filtering, ambient occlusion, soft shadows, reduced texture quality, etc.
– Reduce resolution. This works as described.
All three of those things will reduce the load on the GPU and increase available resources for other things. Developers will make a decision on which one (or more) areas they can cut back on so that other areas can flourish. Games like Forza cut back on the scene complexity so they can focus on render quality (primarily) and resolution. Games like Skyrim cut back on render quality so they can focus on scene complexity.
What we’re looking at right now is that some developers are choosing to cut back on resolution so the freed processing power can be given to scene complexity, rendering quality, or both. Games like CoD would focus on both of those things. It’s not simply choosing to run at 720p because they feel like it, they’re choosing to run at 720p because to run at 1080p would sacrifice quality or complexity that they want to retain.
That is the essence of the question, and what I’ve been getting at for the entirety of this thread. If you believe that resolution is the be-all end-all, then you must necessarily accept that in order to achieve maximum resolution on limited hardware, quality must be sacrificed in other areas. You may have worse textures, baked lighting, no antialiasing, low-poly models.
Many people prefer quality over resolution. They’re happy to watch a 720p movie instead of a 1080p one as long as the quality is better. They’ll run PC games at slightly reduced resolution in order to turn the quality settings up to full and enjoy the visuals. Contemporary retro games exemplify the low-fidelity high-quality approach and many are praised for their beautiful visuals. The benefits of reducing resolution are clear: 720p is only 33% shorter and narrower than 1080p, but it uses 55% less processing power to render. That processing power can go towards advanced rendering techniques that make the scene look qualitatively better than it would look at 1080p.
If this were a PC game, you’d have a choice. You could run your game at 720p with far view distance, maximum AA, AF and AO and get a stunning game in a 1280×720 screen area, or you could run it at 1080p, with the AA and AF turned down, the AO off, simple raycast shadows instead of volumetric.
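For what it’s worth, the pixel arithmetic behind that trade-off is easy to check. A quick sketch in plain Python (the “freed budget” framing is a simplification, since fill rate is only one component of GPU load):

```python
# Pixel counts for the two resolutions discussed above.
px_720p = 1280 * 720     # 921,600 pixels
px_1080p = 1920 * 1080   # 2,073,600 pixels

# 720p is 2/3 the width and 2/3 the height of 1080p...
print(720 / 1080)        # 0.666... -> "33% shorter and narrower"

# ...but only ~44% of the pixels, i.e. ~56% fewer to shade per frame.
print(1 - px_720p / px_1080p)
```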
I defend my question as valid because there’s no obvious answer between the two. Different people prefer different things. As far as I can tell, you seem to have been arguing all this time that the question is flawed (presumably, since you downvoted it) and that there is an unambiguous answer: resolution is king. I disagree, and I’ve said from the start it’s fine to disagree, so I’m not sure what exactly about the question you’re getting worked up about.
This is where I dispute you, because a lower resolution completely removes the point of having a stunning game in the first place, because it no longer looks stunning. Do you not agree that resolution has a huge impact on the graphical quality? Have you played Halo 4? The graphics in the game are impressive, but it’s wasted because it’s rendered in upscaled 720p.
Do you play console games on a TV or something? On TVs upscaled 720p is filtered to look substantially sharper than if it was on a PC monitor. If you play on a PC monitor like I and many others do, it looks horrendous.
Do this for me: get a big 1080p TV, plug your PC in, find YouTube, go 720p then 1080p, then come back and tell me what you think. The difference is legit just the pixels; that’s why 4K looks boss. You can’t go backwards.
To clarify, yes I’m saying I’d rather have a 1080p 60fps game that had less effects/shaders/AA/etc than a game with more effects and such running at 720p.
Cool. That’s all I was curious about =)
I’d prefer high refresh at a decent size, but if one had to be paramount, it’d be refresh.
Oh yeah definitely, but as long as both are in 60fps, I’ll take the less detailed 1080p option.
Zombie has missed one thing that you pointed out without knowing it: we are talking about people who play on nearly 10-year-old hardware. I’m sorry, but my PC looks amazing and my PS3 looks, well, average. People who have only played consoles their whole lives will have their jaws drop when they see 1080p. And trust me on this, Zombie: the PS4 is more powerful, so maybe they’ll have even graphics, and the PS4 also has 1080p. It’s a bad day to be an Xbox fanboy.
Heh, it’s hard to get your head around the concept of “less detailed” and “more pixels”, isn’t it? Both add a level of improvement; it’s just a matter of preference.
I’d love high refresh and crisp/clear images with less happening than fuzzy images with loads happening on screen, personally.
I played some Quake 3 Arena the other night in 1080p. It looks pretty good for a game released 14 years ago. I think the default “high” graphics settings set the resolution to 800×600, it’s quite surprising the difference pumping up the resolution makes. Those bezier curves are as sharp as ever.
https://twitter.com/BenKuchera/status/392992808161660928
That’s interesting as well, more or less confirmation. But, I disagree with his assertion.
Further:
Related rumours state that they managed to get a 3-week port job running at 1080p/90FPS on the PS4, but took three months to get a 1080p/15FPS port going on the XBO.
Apparently, it’s due to issues with the RAM set-up on the Xbox, and the ease-of-development push Sony’s made with the PS4.
While this one game won’t change anything, if it sets a trend for development then, unless MS splash money to enforce no visible distinction between multi-platform releases, it’ll slowly snowball into something important.
Yeah, the extra ‘RAM’ issue is something that’s being billed as a possible reason for this, over on neogaf. Reading with interest.
That one’s doubtful. The RAM configuration is almost completely irrelevant to the developer, they’re both handled by the platform API. And as I’ve mentioned before, PS4’s RAM choice is questionable because high latency RAM should not be used for general purpose memory.
Do you have a source for this rumour?
Gaf – I’ll have to dig it out later, but it’s been going around through their “trusted informants” for a while that the eSRAM adds a fair bit of complexity to coding – it’s just another hurdle to learn, and *apparently* they have to manually write a flush for it; it’s not automatic. At least, last I checked, that was the story, post-E3. DF picked something up about the latency on the eSRAM as well, I think – might be foggy there.
Usually I ignore that sort of thing, but I remember the informants being both from MS as well as “Big Developers”, so it added a small ring of truth at least.
It’s also the sort of thing that can be overcome with experience, just not when you’re rushing to release at launch.
Sorry if that came across like I was doubting your post, was just curious =)
Nope, all good!
Perhaps it’s a matter of bandwidth? If the bus is idle and you can issue the command immediately then the lower latency system is going to win. However, if the workload exceeds the bandwidth of a system, you’ve also got to factor in the time spent queuing to start the operation.
That’s why separate graphics and system RAM is usually the preferred architecture, rather than the combined RAM Sony opted for. If you’re pushing the graphics to its limits, your game logic doesn’t have to wait in line.
If anything, the industry seems to be moving in the other direction. In the mobile space, pretty much everything uses a SoC that combines CPU, GPU and other components together on a single chip. In the PC space, you’ve got both Intel and AMD producing chips containing both CPU cores and GPUs too.
In these systems, the components share the same memory controller, so they could be seen as competing for resources. However, it also means that data doesn’t necessarily have to pass through main memory to communicate between CPU and GPU.
And if we look only at the next gen consoles, both designs use this single chip design with a shared memory controller, so it isn’t clear that having two types of memory is going to be a win.
Mobile is a unique market where size constraints benefit from unified hardware. Nothing in development for the PC space right now is comparable to discrete GPU processing power, but they do make for appealing low- to mid-range substitutes for a discrete card – think onboard video, but more capable.
From a bandwidth perspective, whether you use a single memory controller or multiple doesn’t matter. There’s only one path out of the CPU, and it goes straight to the northbridge which feeds data directly to the RAM and the PCIe bus. The FSB is the fastest bus in the whole system, and the RAM and PCIe buses are the next fastest. Compared to the latency of the RAM itself, the bus speeds are basically irrelevant. So whether you’re sending to one controller or multiple, the buses and controllers don’t get choked, the RAM itself does. Which is why separating system RAM and GPU RAM is the standard architecture, including on the XB1: because they’re two different choke points, and one being choked won’t detrimentally affect the other. The PS4 is the only non-mobile device I’ve seen so far that uses a unified design.
Data doesn’t have to go through system memory to get to GPU memory, as an aside. The northbridge handles both.
The memory controller hasn’t resided in the northbridge chip for a long time: AMD started producing chips with on-die memory controllers 10 years ago, and Intel has followed suit. Other devices that need to access memory do so through the CPU.
So if you’re pursuing a unified memory architecture, a CPU/GPU combo chip puts the GPU closer to the memory. In addition to this there are even more benefits when the GPU shares a cache with the CPU (what I was alluding to in the previous post), since writes from the CPU don’t need to hit RAM before they can be read by the GPU and vice versa. This is why Intel has been pursuing this architecture since Sandy Bridge, and AMD doing the same with its A-series APUs.
Both the new Xbox and PlayStation are built on this architecture, so I don’t know why you consider the PS4 to be particularly different.
@jamesh Whether it’s on the northbridge or not is inconsequential; the fact is it’s not a bottleneck in any arrangement, onboard or otherwise. I considered putting in a clarification that when I say ‘CPU’ I mean the actual processing unit and not the auxiliary functions that are squeezed into the same die, which are still functionally separate from the CPU, but I didn’t think it was necessary. I guess I was wrong =) Reiterating my point: the choke point is the RAM itself, not the controller (regardless of where it’s located), and not the bus.
I addressed your comment on RAM caching in my previous post at the end: data doesn’t need to go via system RAM before heading to the graphics card. The northbridge routes that information directly, and buffered instructions are stored on the graphics card in the L2 cache.
The XB1 does not have unified RAM. Again, I covered this above. XB1 RAM is divided into three separate banks that handle different functions. One is for the operating system and apps (ie. all the social stuff, voice chat, XBL), one acts as the equivalent of VRAM, and the other acts as the equivalent of system RAM specifically available to the game running at the time. I think I’ve explained sufficiently well what the benefits of having different sets of RAM for system and graphics processing are in my previous post. You keep bringing up the controller even though I’ve addressed that point as not being a significant factor.
Should read the GAF thread. The eSRAM breaks easy shit because it’s too complex.
I dunno, I’ve got to say most generations this stuff only seems to matter to people looking to justify why their console of choice is the superior one. Side-by-side comparisons don’t really matter when the choice of which version you get was made the day you bought your console. If people really cared, the PC would win every console generation by a landslide.
Except that is nonsense. A lot of people own both consoles and, outside of exclusives, decide which version to get based on a number of things, most prominently which system looks better (noticeably better, not the hogwash minute difference no one can spot outside of a side-by-side).
I would also state the reason no one gives two hoots about PC is the minimum-cost entry barrier. Sure, you can get most games looking AS good, and a select few even better, than their console brethren, but the $$$ cost of having a PC powerful enough to “grunt” the game to that point far exceeds the cost of a console (usually by multiples in the range of 3-5x+).
So your assumption about what people who cared about graphics would do is utterly bullshit. 9/10 people will get a better-looking experience on their console simply because they can’t afford a good enough PC, and of those that can, most probably think it’s a colossal waste of money when you can just get a console and have it looking just as good.
Absurd. For equal power, a PC costs 3-5x that of a console? A ‘select few’ games look better on PC than on console? Stick to what you know, mate, because you clearly don’t know anything about hardware capabilities and costs.
A high-end PC from scratch will cost you: $200 case, $200 PSU, $500 GPU, $200+ for Windows, $200 CPU, $200 for 16GB RAM, $120 for a 120GB SSD, $100 for a 2TB HDD, $200 motherboard, $150-700 for a monitor, and then I doubt you’d cheap out on keyboard and mouse, so probably another $200 easy.
That right there is a high end pc. It will run everything better than a console if the game allows you to. (some are so poorly optimised even that machine won’t look much better).
That’s a solid $2000+ machine, upwards of 3-4k if it’s not custom built, which makes it at minimum 3x the most expensive console. Even the expensive XB1 is only $600, and in games like Titanfall you will hardly notice a difference between the machines, especially if the GPU you happen to have is shit for the game and runs like crap, which is highly likely.
Even if you were to go moderate: get a $200 POS GPU, downgrade the CPU to an i5, halve the RAM, get rid of the SSD, a slightly less awesome motherboard and PSU, no monitor, pirate Windows and build it yourself.
It’s STILL more expensive (around $800 easy) than the XB1, only your PC is less powerful and will now need an ENTIRE overhaul in two years when it can’t run games at even moderate settings. Because you cheaped out on a motherboard it has no room for an upgrade. Because you got a cheaper, smaller case you can’t fit that giant-ass GPU along with more RAM or a larger MB.
THERE is not a single argument you can make; you are utterly and entirely incorrect. Even a SHIT computer is more expensive than, and easily outclassed by, the next-gen tech. Hell, you would be lucky for that $800 machine to look as good as a finely tuned current-gen game.
That just further proves that you don’t know anything about hardware and costs. Your ‘high end PC’ argument is a strawman, you don’t need to build a high end PC to compete with consoles. Here’s your statement:
Your claim is absurd. Here is a PC with comparable hardware and performance (based on known benchmark metrics of announced hardware) to the PS4.
– CPU: Intel i5 3570 ($210) – A quad core 3.4GHz is more useful for gaming than PS4’s 8-core 1.6GHz. Even if you were to add the processing speeds together, the 3570 still comes out ahead.
– GPU: Radeon HD7790 2GB ($150) – This GPU is equivalent to the one in the PS4 in both specifications and power (1.8 TFLOPS).
– Motherboard: MSI B75MA-P45 ($70) – And this is being generous. The Gigabyte GA-H61M-S1 would also suffice for these purposes at only $45.
– RAM: Corsair Vengeance PC12800 2x4GB ($75) – Note that this is in addition to the RAM already on the HD7790, bringing the RAM total higher than the PS4.*
– HDD: Any brand 500GB 5400rpm 2.5″ ($45) – These are the exact specs of the PS4 HDD. Pick any brand, they’re all about the same price.
– PSU: Corsair CX430 ($39) – Easily handles the power requirements of the hardware with room to spare, and has an 80 PLUS certification to boot.
– Case: Rosewill FBM-01 ($25) – The case makes no difference to performance, no need to waste money on a higher end version.
– Controller: PS4 Wireless Controller ($60) – Premium paid for the next gen controller. Or you could settle for a PS3 controller ($40) or an X360 controller ($20).
(* Unfortunately, due to a string of particularly bad luck for factories in Japan, PC RAM prices are quite a bit higher than usual at the moment. With any luck this should return to normal in the next 6 months and you should see 8GB prices closer to $60.)
These things don’t belong in the comparison:
– Windows: Not needed, you can run most games perfectly fine on Linux. See the Steam Box for an example. If you absolutely must have Windows, worst case you can pick Windows 8 up for about $80.
– Keyboard and mouse: Not needed. The console doesn’t have one, there’s no reason the PC should have one. Both work just fine with the controller.
– Monitor: Not included. You don’t get a monitor when you buy a console, you’re expected to provide your own. Both can output to the television you already own.
– SSD or extra hard drives: Not needed. The PS4 doesn’t use them either, so there’s no need to include them when building a comparable machine.
This brings the totals to the following:
– Gaming PC: $609 for equivalent performance, $674 if you want the most expensive components listed
– PS4: $549
– The difference: $60-$125
For your edification, that is 10-20% higher in cost than the PS4. Not 200%, 300%, 400% or more, as you claimed above, just 10-20% for hardware that will perform to an equivalent level as the more powerful of the two upcoming major consoles, and exceeds them both in computing power and total RAM.
Add to this the price of games. Worldwide, PC games are on average 30% cheaper than console games. This is because of a few things, but the biggest one is called a platform license – console developers are required to give a percentage of their sales to the platform owner (Sony or Microsoft in this case) for the right to publish the game on that platform. Developers naturally pass that cost on to you, the consumer.
PS4 titles in Australia range from $100 up to $120 for games like Thief, Dying Light, Final Fantasy XV, Dragon Age: Inquisition, etc. In comparison, PC games range from $60 to $80, which means a 33% difference on average, or about $40 per title. At those prices, it only takes 2-3 game purchases to completely negate the increased cost of the PC, and after that every $40 saved is $40 that could go towards upgrading your PC whenever you feel like it. In fact, after only 15 games purchased, you’ve already saved enough to pay for the entire PC in the first place.
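The break-even arithmetic above can be checked with a quick sketch (plain Python; the dollar figures are the comment’s own estimates, not verified prices):

```python
# Figures taken from the comparison above; treat them as rough estimates.
pc_cost = 609          # "equivalent performance" build total
ps4_cost = 549
saving_per_game = 40   # claimed average PC-vs-console price gap per title

gap = pc_cost - ps4_cost
print(gap)                        # 60 -> the PC costs ~11% more
print(gap / saving_per_game)      # 1.5 -> break even within 2 game purchases

# How many game purchases until the savings cover the entire PC:
print(pc_cost / saving_per_game)  # ~15.2 -> roughly the "15 games" claim
```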
You’re quite right, there is probably not a single argument I can make that will convince you, but I suspect that has very little to do with the mountains of evidence or the persuasiveness of my argument, and much more to do with the fact you don’t have the faintest idea what you’re talking about, yet are committed to defending your misinformed viewpoint come hell or high water.
You’re welcome to choose willful ignorance if you like, it’s no skin off my nose. Reality will be here for you as always, if you ever care to return.
O.O You can’t put PS4 specs in a PC and say “same power” hur dur. That’s not how it works, and the very fact you think it does is the problem. It’s called optimisation.
It’s why GTA V looks incredible despite my phone having many, many times the sheer grunt of my Xbox 360 in every single regard, yet my phone would die a horrible death trying to run that game.
That “machine” you so nicely added up is rubbish; it couldn’t even beat GTA V visually and is 3x the cost of a 360. That case would also kill your PC: being that cheap and that small, it would have no ventilation and would likely cook your components if you tried to run near optimal output in the 30°C+ typical Aussie heat.
Also, the VAST majority of games DO NOT RUN ON LINUX AT ALL, no matter how you toot your “Steam OS” (something that doesn’t exist currently and still needs Windows to play games), so you WILL need a copy of Windows unless you want to exclude a LARGE portion of AAA titles.
You will also be needing a mouse and keyboard, unless of course you want your computer to be near useless outside of gaming.
While I went to the extreme end, my number of $800 for a shit computer is about right. This computer “might” have better specs on paper than an XB1 or PS4, but it won’t outperform them. My PC right now has better specs than both next-gen consoles and yet only just outperforms my 360 and PS3.
Hell, when I loaded up FF14 on both console and PC, I couldn’t even tell a difference outside of draw distance, and my PC sounded like a jet engine compared to my noiseless PS3. So, as I said, you are wrong in every respect.
Yeah, what would I know, I’m only a developer with game industry experience. Give it up, mate. Nobody’s buying it, you’re only lying to yourself. Not wasting any more time on your delusions.
A bit off topic, but is Call of Duty: Ghosts really only $44.96 on Steam??
Yeah, and BF4 is 20% off on GOG.
I might add that you can get BF4 + Premium for $72 from Origin Mexico using a proxy. Worked for me and many others 🙂
http://www.ozbargain.com.au/node/120388
huh? I don’t see any BF4 on GOG
You might find that price is for the upgrade from standard to deluxe, not for the game itself.
Yeah, it’s the upgrade price, but they’ve put the $45 price on the store’s front page so it looks like the full game…until you navigate to the page, of course.
Why NeoGAF is ever used as a source for anything is beyond me -_-
Lots of big news events and stories have broken on NeoGAF long before any other site
So yeah, there is a good reason: they have a solid track record. Whether you choose to believe it is up to you.
I’m constantly annoyed that devs often post more on NeoGAF than on their own forums. It’s infuriating to be using the official forum but have to get information (including critical stuff like bug info) from a third-party source… it’s happened to me more than once.
GAF is usually right.
Because for the 99.9% of them that post .GIFs, there’s the 0.1% that have actual insider information.
Haha, NeoGAF has no f-ing credibility.
I’ll admit I’m concerned; I just preordered an Xbox One yesterday o.O I hope this turns out to be just a rumour and nothing more. Surely the days of 720p are behind us; it comes with a 4K HDMI cable, for goodness sake!
Neither is good enough for next gen. 4K TVs are about to hit hard, and with the 8-year life-cycle of a console, it won’t be long before 1080p just won’t cut it.
What are you going to use to power this 4K TV, exactly? A PC? Do you have a 4K budget to be able to drive it? That is a whole lotta pixels you need to be pushing to even get a decent frame rate at native res. Even Crysis 3 running on a tri-SLI Titan rig could only pull about 30fps or so at 4K.
4K will be an ultra-enthusiast market until the PS5 or the next Xbox equivalent are able to do it on the cheap.
EDIT: 4K is also a joke until you have the content to use the device. Current consumer media isn’t up to the task yet so expect a couple of years at least before 4K movies are purchasable on a disc of some variety.
Except 4K porn is already coming out – see the giz article from last week.
Yeah, if the move from SD to HD was expensive for devs in terms of time requirements, I don’t think HD to 4K will be any different.
4K will be for dual- or quad-density BDs, or better still, streaming. Both PS4 and XB1 can do that.
Both PS4 and XB1 only have HDMI 1.4 output though, which supports just 10.2Gbps of bandwidth. That’s below what full UHD 4K requires: it’s enough for 2160p at 30fps, but not 2160p at 60fps.
So both the PS4/XB1 should be able to read and decode UHD 4K content, but they will not be able to output it at the full 4K60.
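The bandwidth claim roughly checks out with back-of-envelope arithmetic. The sketch below ignores blanking intervals and assumes 8-bit RGB with HDMI’s 8b/10b TMDS encoding, so real-world requirements run a bit higher:

```python
# Approximate TMDS bandwidth needed for uncompressed UHD video.
# Ignores blanking intervals, so actual requirements are somewhat higher.
W, H = 3840, 2160        # UHD "4K" frame
BITS_PER_PIXEL = 24      # 8-bit RGB
TMDS_OVERHEAD = 10 / 8   # HDMI 8b/10b encoding

def tmds_gbps(fps):
    return W * H * BITS_PER_PIXEL * fps * TMDS_OVERHEAD / 1e9

print(round(tmds_gbps(30), 1))  # 7.5 Gbps: fits within HDMI 1.4's 10.2 Gbps
print(round(tmds_gbps(60), 1))  # 14.9 Gbps: needs HDMI 2.0's 18 Gbps
```

2160p30 squeezes under HDMI 1.4’s ceiling; 2160p60 does not, which is why it needs HDMI 2.0.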
Agreed, it’ll remain an enthusiast market and the processing power required is huge, but current graphics hardware from both companies can handle it now with just dual SLI, albeit not on ultra settings. It’s the equivalent of powering four 1080p monitors, so it’s only a little more demanding than a triple-monitor setup with Eyefinity or Nvidia Surround.
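The “four 1080p monitors” figure is exact pixel counting:

```python
uhd = 3840 * 2160   # 8,294,400 pixels
fhd = 1920 * 1080   # 2,073,600 pixels
print(uhd // fhd)   # 4: one 4K frame is exactly four 1080p frames' worth of pixels
```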
Looks like the Wii U version is gonna be the best.
LOL! Yeah, with 2 gamepads, that’s probably a resolution of 4000×480
COD is a big deal; MS had better get it right…
WTF are Microsoft going to do? It’s up to the developers to get it right.
The PS3 was supposed to be more powerful than the 360, and I’m sure Sony were fuming when Bethesda butchered Skyrim. These things happen when developers don’t get the hardware right.
You’ll definitely see games looking far better than COD running at 1080p on the Xbone before the end of the gen, it’s just a case of whether the developer can get it right this early.
True, but I’m sure there may be better support or something they can do to make sure a big title like COD is looking its best?? Maybe not… well, I hope this rumour is wrong, just like the BF4 one was with the PS4.
Meh. Getting pc most likely.
So, if there’s a genuine difference between what the two can display (for a reasonable amount of effort on the devs part), does that mean we’re likely to see multi-platform games that are optimised for 720p as the lowest common denominator? Is that a thing? Genuinely don’t know here, just hoping we get rid of all the jaggy edges the PS3 had.
1080p?
My PC can do that pretty easily.
One main reason I sold GTAV this week. My retinas were burnt to a crisp playing it in 720p. Please make the pain go away…
I’ll buy it on PC when it gets released later in glorious full HD, powered by heavy-duty hardware (twin video cards FTW)
It’ll be awesome on PC, I can’t wait. But you have to admit, they did an excellent job with the X360 hardware.
It looks OK on the Xbox and PS3, with all the late pop-in and stuff, but I have to agree on this one: with such old hardware, they should be happy with themselves. I mean, old Xbox 360s could barely run this, so it was pushing the hardware to the maximum. But I think the PS3 could have had a v2 with 8GB of RAM and an increase in clock speed, and it would have been set for this gen. I mean, Blu-ray: check. Wireless internet: check (funny, I believed the Xbox had it), plus a few more features, like it can already play some games in 1080p. It was too far ahead of its time. Blu-rays were only just coming out, and all those things we now see as standard weren’t at the time. The PC version comes out looking OK, with less pop-in and a few more things, but mods, mods, mods make the game run at like 30fps and make my PC chug, though it looks so good.
I don’t know about PS3 but I had almost no pop in at all on the X360. They said not to install the play disc because of a bug so that might have been why.
Yes, it was superb considering the ancient hardware. I have an original fat 40GB PS3, and the loading and gameplay were surprisingly good. My only gripe is that I have a fully phat gaming PC upstairs that can pop out 120fps+ at 1080p…
What about anti-aliasing? How many samples do they use? 1080p with 16xAA looks much better than just straight 1080p. There is much more to graphical quality than just line resolution.
I’ve been trying to argue that point above, glad someone agrees =) I doubt either of them are going to push AA and AF to their upper levels. I’m somewhat cynical about console hardware – after the current generation, and the fact they chose outdated hardware for the next one, it feels like they’re intending to settle for ‘looks great compared to the old one’ instead of ‘looks as good as PC’.
The new consoles are certainly good compared to the current generation, but still last year’s news compared to top-shelf PC hardware. Think about the “quality” gap between the PC and console platforms at the end of the next console generation (5-7 yrs). It’s going to be a massive difference in terms of raw rendering capability.
Sounds like something a troll made up to start an argument, lol. Even so, a game running at 1080p on PS4 and 720p on X1 doesn’t automatically mean the PS4 version will look better; as long as it runs at 60fps on both, I’m sure most people won’t care.
With that said, everyone obviously has their own preferences and opinions when it comes to games, whether people like to admit that or not. To me, consoles are a gateway to exclusive experiences that you can’t get elsewhere, be that an exclusive IP or a community in an MP game. So a game could be 720p @ 30fps and I wouldn’t really care; but for the fanboy wars’ sake, I hope this doesn’t end up being true…… lol
We’ll know which console has better multiplats in 3-5 weeks. If the PS4 does end up having slightly better multiplats AND an online service that isn’t balls, then it’s all over, red rover. It doesn’t even have to be better, necessarily; it could be just as good as XBL is now without three-quarters of its clunky, cluttered dashboard throwing ads at you.
The online has really been the only thing Xbox has had over Playstation.
Well, and the exclusives that sell really well. Nothing really comes close to Halo.
Silly me. I clicked into the article thinking, there may be more info, or at least a source. Then I looked at the author…
Yeah, I shoulda known better.