Why So Many People Care About All This '1080p' Stuff

1080p. Ten-eighty-pee. If you have been paying any attention to video game news over the past year or two, you've probably seen the term pop up quite a bit, in everything from technical breakdowns to E3 press conferences. It's Today's Big Buzzword.

1080p, which is shorthand for 1920x1080, is commonly referred to as Full HD. It's pretty much the highest native resolution you can get from today's console video games, which is one of the reasons E3 2014 was full of marketers and presenters shouting about it. "Our game will be 1080p!" they'd screech. To some it sounded like robot speak, but to many people, it was a meaningful boast, and it represented what "next-gen" really meant. "1080p" means more pixels. More details. Better-looking games.

See, last year's expensive new Xbox One and PS4 consoles were delivered with lofty promises -- "next-gen" games would be bigger! More interesting! Better looking! And although some have argued, quite fairly, that a pedantic focus on graphical fidelity over artistic value has limited what video games can do, it's undeniable that there's a significant visual difference between a game that runs at 1080p and, say, a game that runs at 720p.

So, many game developers have made it a priority to ship games that run at a native resolution of 1920x1080. For people who care about graphics -- or people who want to feel like their expensive console purchases were worth every penny -- that's a big deal.

With all that in mind, consider what happened this week. Let's call it the Parity Problem.

In an article published on Monday, Ubisoft Montreal producer Vincent Pontbriand set off a firestorm by saying his team had capped November's Assassin's Creed Unity at 900p resolution for both Xbox One and PS4 in order to dodge console wars. "We decided to lock them at the same specs to avoid all the debates and stuff," he said. In other words, they wanted parity.

In a statement to Kotaku and then again with a follow-up blog post yesterday, Ubisoft waved off Pontbriand's comments, explaining why the game is capped at 900p:

As of now, Assassin's Creed Unity is locked at 900p. But why "stop" there? "We know a lot of gamers consider 1080p with 60 frames per second to be the gold standard, especially on the new generation of consoles," Pontbriand says. "We realise we had also pushed for 1080p in some of our previous games, including AC4. But we made the right decision to focus our resources on delivering the best gameplay experience, and resolution is just one factor. There is a real cost to all those NPCs, to all the details in the city, to all the systems working together, and to the seamless co-op gameplay. We wanted to be absolutely uncompromising when it comes to the overall gameplay experience. Those additional pixels could only come at a cost to the gameplay."

Seems reasonable, right? Nope. People were furious.

See, history has shown us that the PS4 and Xbox One are not technically equivalent. While both Sony and Microsoft have built impressive, powerful consoles -- and the Xbox One's multitasking features are getting better and better -- the new PlayStation is slightly more powerful. It's got faster RAM and a better GPU. From a technical perspective, it's the superior console right now.

Consequently, a lot of observers are reading between the lines of Ubisoft's comments and concluding that the publisher could have made Unity run at a higher resolution on the PS4, but didn't. Because of parity.

Meanwhile, here comes BioWare, touting in a tweet just this afternoon that their own big November game, Dragon Age: Inquisition, squeezes as much as possible out of both consoles. The timing sure makes this seem like a subtle dig at Ubisoft.

Confirmed: #DAI resolution is 1080p on PS4, and 900p on Xbox One. We maximized the current potential of each platform.

-- Dragon Age (@dragonage) October 10, 2014

Now, just to reiterate:

Assassin's Creed Unity: 900p on PS4; 900p on Xbox One

Dragon Age Inquisition: 1080p on PS4; 900p on Xbox One
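Those "p" labels map directly onto pixel counts, which is what makes the gap feel meaningful. A quick back-of-the-envelope sketch (assuming the standard 16:9 widths for each resolution):

```python
# Pixel counts for the resolutions in question ("p" refers to vertical
# lines; widths assume the standard 16:9 aspect ratio).
resolutions = {
    "720p":  (1280, 720),
    "900p":  (1600, 900),
    "1080p": (1920, 1080),
}

pixels = {name: width * height for name, (width, height) in resolutions.items()}

for name, count in pixels.items():
    print(f"{name:>5}: {count:,} pixels")

# 1080p draws 2.25x the pixels of 720p, and 44% more than 900p.
print(f"1080p vs 720p: {pixels['1080p'] / pixels['720p']:.2f}x")
print(f"1080p vs 900p: {pixels['1080p'] / pixels['900p']:.2f}x")
```

In other words, the jump from 900p to 1080p is roughly 630,000 extra pixels per frame, which is the "cost to the gameplay" Ubisoft is talking about.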

A quick look back at other multiplatform releases over the past year reveals that for the most part, the PS4 versions tend to run at higher native resolutions. Call of Duty, Battlefield, and even Ubisoft's own Assassin's Creed IV all performed better on the PlayStation 4 than they did on the Xbox One.

Of course -- and this is important -- that doesn't mean the Xbox One versions weren't gorgeous. The PS4, the Xbox One, and even the way-technically-inferior Wii U have all seen some incredible-looking games over the past year.

But! Many PS4 owners bought their systems in hopes that big game developers would max out the capabilities of that shiny $US400 box. And to those PS4 owners, it doesn't make much sense that the folks at Ubisoft couldn't eke better specs out of Unity for the PlayStation 4 version.

I've seen some pundits and journalists theorise that this is all about console wars -- puerile battles over slavish company loyalty -- but I really don't think that's true. I think it's fairly reasonable for PS4 owners to get mad when a company appears to be limiting a game's capabilities. If the PS4 is capable of delivering a better technical performance -- and history has proven that it is -- why shouldn't PS4 owners hope for better-looking games? BioWare's tweet just adds fuel to the 1080p fire.

Meanwhile, sadly, the Wii U versions of both Dragon Age: Inquisition and Assassin's Creed Unity are running at a subpar zero-p.


Comments

    This is the major advantage of PC. You get to choose how well your computer performs. Regardless of the PC vs. Console argument, on console you are a slave to what the developer and the console maker supply you. On PC you are free.

      Square Enix disagrees.

        If you are talking about PC FF13, I'm pretty sure there is already a fan patch for 1080 res.

          ^ that's my point, you're not a slave to bad ports.

            So we should just accept that we will get poor ports on PC because "modders will fix it"?

            As nice as mods are they should NEVER be a requirement to get basic functionality in a game that we paid money for, if it is left down to modders to give us functioning games then there is something very very wrong in the PC space.

            I honestly cannot believe the amount of acceptance a number of recent terrible ports have gotten with the general consensus being "at least its on PC" or similar.

            To those people I say, kindly go back to consoles please. We don't need publishers believing we are fine with this or it will only get worse.

              No, all he said was that on PC you get to choose, and if for some reason you can't, someone can fix (mod) it, so add another bonus for PC: not being a slave to developers.
              There is little to no acceptance of bad ports in the PC community. If anything (unlike what you insinuated) PC gamers actually scream even louder for blood at bad ports than Xbox owners scream about less than 1080p.

                I get that and I'm not trying to single him out, I was more generalising what I have seen/read over the last few months in certain PC communities. This year in particular, PC seems to be flooded with new users who are just rolling over and accepting whatever the publishers feed them.

                I am glad those that aren't are a bloody vocal bunch, keep fighting the good fight.

                Last edited 11/10/14 10:44 pm

                  Yeah, reasonable enough. We just have to engage them I guess, though if I was a guessing man I'd say they will get more and more frustrated (eventually jaded like us) as they get more accustomed to PC gaming. Like anyone, they will learn its strengths and get frustrated at the publisher-imposed setbacks.

            By that logic, once the DRM on the Xbox or PS4 is broken, it will be okay that games don't use the full performance of the console because modders will be able to fix it.

            And what do you do with multiplayer/online PC games whose anti-cheat features block your mods?

          Fans shouldn't have to fix and patch in standard features to make up for sloppy development work.

            No, they shouldn't, but we don't live in an ideal world and having the option to fix it yourself is better than not having the option at all.

              Absolutely, but it does not serve to let the developers know that, otherwise they will acknowledge that and decide "let the fans do it". So for all intents and purposes, you ought to say that it's just unacceptable instead of saying "it's unacceptable but at least we can fix it", lest a developer see your post or something similar to it and decide they really don't need to go to the extra effort.

        I'm so confused. Square Enix were making some solid PC ports recently, games like Hitman: Absolution and the Tomb Raider reboot.

          Yeah they are solid ports... developed by Eidos Interactive (ok fine Square Enix Europe).

          Their main development studios in Japan however are pretty bad at PC ports (FFXIVARR excluded), unless oddly enough its on a 3rd party engine, like The Last Remnant was.

          Sleeping Dogs was also pretty damn solid, too. But still works with what @drunkaus said about it being another Eidos title.

    I will cry if they continue this with pc. First Watch dogs pc issues now this?

      I agree that Watch_dogs was a complete mess on PC (well, it would seem it was more of an issue for higher-powered rigs, go figure) -- but that was also a different development team within Ubisoft. AC usually isn't too bad on PC -- well, it's never been perfect, but nothing like Watch_dogs. I guess we'll see, as Ubisoft have touted that they want to make it up to PC gamers and also that Unity is 'truly' next gen, and shouldn't be a dodgy port... but nothing that has been ported so far from any company has ever really lived up to any PR promotion.

        Didn't have problems with AC4, but couldn't run it with as many bells and whistles as Shadow of Mordor. Felt a bit under-optimised to me. That said, AC4 does look pretty great though, and I got more replay value out of it than SoM.

          It is an optimisation problem. It is why Shadow of Mordor is so much better looking on PS4 than a maxed out AC4 looks on a high end PC despite the same frame rate.
          So yes I just wrote a long winded agreement with you I guess lol.

            On the plus side, while I can't speak to Watch Dogs or AC4 on PC, I can say that Mordor is very, very pretty on PC.

    Meanwhile on PC, the discussion is whether 1440p or 4K at 90-95hz will be achievable for the first generation of Oculus Rift when the consumer version is out. The console world is really stuck in the past it seems.

    My main gripe with all this resolution talk is that frame rate seems like an afterthought by comparison. Developers brag about a game being 1080p, but if it is locked at 30fps that is a huge disappointment in my opinion. Games like GTA and Sleeping Dogs are getting DEFINITIVE EDITIONS on PS4 and Xbone but are only slightly higher resolution; there isn't a huge improvement in performance.

      I completely agree. I feel there's a significant difference between 30fps vs 60fps (it can be very jarring, made worse when the frame rate dips in some games) whereas I'm not OVERLY fussed with 720p vs 1080p. At the end of the day I want my gameplay fun and framerate smooth.

        It's jarring on PC (if a PC game's frame rate isn't smooth it really bugs me for some reason) but on console on the TV I'm okay with 30fps

          Yeah I struggle to imagine 30fps, I straight up have no idea how console gamers put up with it. Even 60fps seems like a sloth running through molasses now after grabbing a few Swifts and gaming at 144fps.

            I believe the main issue with 30 vs 60 between TVs and PC monitors is that TVs don't have as fast a grey-to-grey response as gaming monitors. For example, playing Destiny on the 40" Samsung TV in the lounge (even with its smooth-motion feature) looks better due to the inherent slight blur between frames, whereas on my Asus VG248QE and its 144Hz 1ms display, each of the 30 frames per second flicks between each other without any blur, which is why it's so jarring (despite the monitor actually running at 60Hz via the console).
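To put the 30fps vs 60fps (vs 144fps) debate in this thread in concrete terms, each target frame rate is really a fixed time budget per frame. A minimal sketch:

```python
# Each target frame rate gives the renderer a fixed time budget per frame.
def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to render a single frame at a given frame rate."""
    return 1000.0 / fps

for fps in (30, 60, 144):
    print(f"{fps:>3} fps -> {frame_budget_ms(fps):5.1f} ms per frame")
```

At 30fps a single dropped frame costs a very visible 33ms hitch, which is part of why dips feel so much more jarring than the steady-state numbers suggest.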

    Or to put it in sadly far more realistic terms since a large percentage of people don't even know what they're arguing about: "My number's bigger than your number, nyah!"

      You're correct in saying that a large percentage of people don't know what they're arguing about. As far as resolution is concerned, more pixels equals more detail, and when 1080p has been the *bare minimum standard* on PC for almost a decade, it's pretty embarrassing that the consoles (we're all looking at you, XBone) can't even manage that.

      Last edited 12/10/14 7:29 pm

    Consoles used to be value for money compared to a PC. Now they're at parity, and if you look up how to build a gaming PC, you're generally paying more than $500. Hopefully Steambox catches on and people will get the performance they pay for without also being forced to play crappy PC ports.

    Last edited 11/10/14 3:35 pm

      Consoles are better value, when you consider that they generally produce results equivalent to current pc 'high' settings at 1080p.

      No way you could build a box that can do that for less than $800 including windows license plus mouse/keyboard or controller.

        Depending on game, current gen consoles are probably more like Medium/Medium-High settings. Many of them don't run at 1080p, and many are also locked to 30FPS. Framerate makes a huge difference.

        E.g. Shadow of Mordor does indeed look quite pretty on current gen consoles. But a good PC can run it looking prettier, and with double the frame rate or more. I get 80FPS+ with everything maxed out, for example.

        Don't get me wrong. I own both a PS4 and a PC, and I adore my PS4. I can't stand PC elitism. However, 1080P/60FPS is really the "old" standard. PCs have been able to do that for years.

        I've got hardware more than capable of playing at 2560x1440 currently. What happens when the hardware catches up to 4K? Where will the consoles be then, when 4K TVs and the like become normal and affordable?

        Both consoles and PCs have advantages and disadvantages, but I have trouble seeing the PS4/Xbone generation lasting as long as the previous gen did.

          80FPS, but how much did you spend on your PC? I suspect you hit 4 figures, and that's quite a bit more than the $600 (XB1+Kinect; I know you can get a PS4 for $400ish) you spend on a console.
          You can buy a graphics card that costs more than a PS4 or XB1.

          It's an unfair fight. A Console is a sporty little coupe, Fast and fun, just jump in and drive. A Gaming PC is a Super Car, I'd bet on a Ferrari over an MX5.

            True, I did spend considerably more on my PC than a current gen console.

            However, for not *that* much more than a console (say $800-1000), you could get a PC with pretty comparable power to a PS4/Xbone. Then factor in the drastically lower cost of games and it doesn't take long for the PC to be good value. Plus, once you make that initial investment on a complete system, you can replace just a part or two every couple of years and remain fairly current.

            Like I said, I love my PS4. Consoles have some real strengths over PCs in other areas. I guess my concern is simply that the consoles struggle with the current/old performance standard, so what does that mean for the future in 4K and beyond?

            EDIT: Even completely removing PCs from the equation, my concern would remain the same. In a couple years time, 4K TVs will be the new normal, in much the same way HD and Blu Ray became average over time. Will the current gen consoles be able to play games at the native resolution of Average Joe's home TV?

            Last edited 12/10/14 3:27 am

              Personally I think 4K is further away than that. If, before the next generation of consoles, 4K TVs are under $1000, then I think that supporting them would be good. At the moment owning a 4K TV is like owning a train: it's impressive, but where are the tracks/4K media?

              Perhaps the future of gaming isn't on a TV, what if you just plugged into an Oculus Rift style of headset. With maybe a TV display for spectators.

                This is true, with tech like Oculus Rift, the next generation of gaming could be quite different to what we have now. I also wouldn't be surprised to see a rise in cloud-based gaming (Australian internet notwithstanding), which would reduce the reliance on hardware.

        If consoles can produce high settings, then the PC versions must be pretty gimped

      Consoles used to be value for money compared to a pc. Now their on parity

      What are you talking about? If you're building a PC to run games in reliable full HD, you're spending at least $500 on the CPU + GPU alone. That's to say nothing of the case, mobo, RAM, OS license, PSU, HDD and other bits and pieces for the thing to even run. Yes you can do more on a PC, but you're doubling a console's sticker price if you want to play games.

        There have been a lot of articles and comments (I did research on this myself when the consoles first came out) that put a gaming PC with at least equivalent performance to the PS4 in the $600 price range, and the price difference was made up quickly on the fact PC games are generally cheaper than console games. I say 'at least' because in some cases it's extremely cheap to outperform console hardware, like CPU.

        This generation isn't like the last. PC prices have come down a lot and the consoles are giving somewhat lacklustre performance for their price point this time around.

          gaming PC with at least equivalent performance to the PS4 in the $600 price range

          You can't compare specs between PCs and consoles, it's pointless. It reminds me of people who tried to match the Xbox's clockspeed and memory and wondered why they could barely run Windows. It's because a PC adds all sorts of interfaces such as a burdensome OS, drivers, BIOS, etc. all of which add more friction between the hardware and game. If you think you can make a $600 or so PC with input peripherals, etc. and it'll run Shadow of Mordor at the PS4's default, you're cracked.

          I own consoles and build PCs and while PCs are generally cheaper in the long run through things like Steam Sales, it's not unconditional. The value demands that you know how to build, or someone who does. It assumes that you're happy to wait, you're fine with the typical PC 'upkeep,' have a passing familiarity with computing, okay with less-than-stellar warranty. Consoles are a cheaper upfront cost, require little upkeep, and if you're buying games day 1, it's only a hair more expensive.

          Last edited 12/10/14 2:52 pm

            I didn't compare specs, I said equivalent performance.

              You won't get anything close to a PS4's gaming stability or performance without going into $800+ territory and that's assuming peripherals like KB/M or gamepad are free and you have a spare Windows key lying around. Look up minimum and recommended specs on a current AAA PC title like Shadow of Mordor. Now try building a rig on PC Case Gear or Newegg that approaches that without blowing $600. It's impossible, I don't know who said it was but I'd like to look at their component list. That's also excluding poorly-optimised PC ports which require even more headroom and thus more expensive components.

                Too easy. Exceeds minimum requirements:

                CPU: AMD FX-4130 3.8GHz: $130 (Amazon)
                MB: MSI 760GM-P21: $30 (Amazon) or ACS A960M-MV: $33 (Newegg)
                RAM: 4GB, any manufacturer, around $30
                GPU: GTX 660: $172 (Amazon)
                HDD: WD Green 500GB: $30 (Newegg)
                PSU: Diablotek 400W: $16 (Newegg)
                Case: Any cheap case, around $40 (Newegg)
                OS: Windows 8 OEM: $92 (Amazon)
                Keyboard/mouse: Rosewill: $6 and $5 respectively (Newegg)

                Total: $551

                Exceeds recommended requirements, upgrade the following:
                CPU: AMD FX-8350 Black Edition: $180 (Newegg) (+$50)
                RAM: Kingston 2x4GB: $60 (Amazon) (+$30)
                GPU: GTX 670 Superclock: $210 (Amazon) (+$38)

                Total: $669

                Note that the recommended requirements will run the game at 1080p on very high settings and high texture quality (which is what the PS4 runs) at 65fps, considerably higher than the PS4's 30fps cap.

                Last edited 13/10/14 7:08 am
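The component totals in the comment above are easy to sanity-check by summing the quoted prices (a quick sketch; the dollar figures are the comment's own, not independently verified):

```python
# The comment's quoted minimum-spec component prices, in USD.
minimum_build_usd = {
    "CPU (AMD FX-4130)": 130,
    "Motherboard": 30,
    "RAM (4GB)": 30,
    "GPU (GTX 660)": 172,
    "HDD (500GB)": 30,
    "PSU (400W)": 16,
    "Case": 40,
    "Windows 8 OEM": 92,
    "Keyboard": 6,
    "Mouse": 5,
}

minimum_total = sum(minimum_build_usd.values())
print(f"Minimum-spec total: ${minimum_total}")

# The recommended-spec build swaps in upgrades costing +$50, +$30 and +$38.
recommended_total = minimum_total + 50 + 30 + 38
print(f"Recommended-spec total: ${recommended_total}")
```

Both sums check out against the comment's stated totals of $551 and $669, though (as the reply below notes) they exclude shipping and any currency conversion.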

                  Now try building a rig on PC Case Gear or Newegg that approaches that without blowing $600.

                  So many corners were cut to reach that. 400W PSU? Mobo that doesn't support USB 3.0 or WiFi? very cheap input peripherals, no gamepad (if you really want to make the console comparison). You're also quoting American online store prices without accounting for USD/AUD exchange rates, which will jack the barebones option to ~$630, that's not accounting for postage costs of multiple components from (multiple) retailers and shipping restrictions. The latter option will almost certainly exceed $800, to say nothing of the cost-cutting in some of your components and murky international warranty.

                  My point stands. Just by punching in the '$551' figure into a currency converter, you've broken the $600 budget, it's also a minimum spec build, without accounting for all the other disclaimers. Realistically, you're still going to be looking at $800+, especially when most people will be picking small upgrades like a mobo with 3.0 headers, a case which supports it, a beefier PSU, etc.

                  @strand0410

                  I think you're grasping at straws. If you want Australian retailers then let's use Australian retailers:

                  CPU: AMD FX-4130 3.8GHz: $105 (Computer Alliance)
                  MB: ASRock 980DE3-U3S3: $63 (GREENBOXiT)
                  RAM: Micron 4GB: $30 (Streetwise)
                  GPU: GTX 660: $168 (MSY)
                  HDD: Buffalo 500GB: $30 (Myer)
                  Case: Shaw GT-DF3: $29 (MSY)
                  PSU: Corsair VS 450: $55 (MSY)
                  OS: Windows 8.1 OEM: $107 (MSY)
                  Keyboard: Logitech K120: $8 (GREENBOXiT)
                  Mouse: Gigabyte GM5050: $5 (MSY)
                  Gamepad: Thrustmaster Dual Analog: $14 (mwave)

                  Total: $615 AUD

                  I think this fairly fits my statement "in the $600 price range" for a machine that exceeds the minimum requirements for Shadow of Mordor, a modern triple-A title that you chose. It comes with peripherals and capabilities that the PS4 does not.

                  To exceed the recommended requirements:

                  CPU: AMD FX-8350 Black Edition: $198 (GREENBOXiT) (+$94)
                  RAM: Mach Xtreme 2x4GB: $75 (ITSdirect) (+$45)

                  Total: $754 AUD

                  Under $800, higher cost parts at your insistence, more than double the performance of the PS4 in Shadow of Mordor for only $200 more than the PS4 costs right now, or roughly 3 games worth of price.

                  I haven't even mentioned the fact that PC games are on average $20 cheaper than console games, and the increased cost of the hardware is offset after buying only 10 games. I don't know about you, but I buy a lot more games than that over the lifespan of a console.

                  Last edited 13/10/14 8:40 pm

      You think the PC gets crappy ports now! How do you think they're going to go porting to Linux!? Somehow I don't think Steambox is the answer

      But you make the difference back on game price. PC games on release are consistently 10-20 dollars cheaper than their console counterparts, so if you buy 5-10 games a year your savings from that cover the difference between the console and PC.

    While in general the PS4 should be able to produce a higher graphical standard, the reason Unity was capped is due to CPU performance issues. Now, your article is happy to mention a small quote regarding this, but completely disregards the overall intent of Ubisoft's comments.

    Is there bias in this piece? Yes.

    Feel free to include all the facts when talking about a situation.

    I find there's too much emphasis on visuals (especially when it comes to resolutions etc). It's nice to have great graphics, but i'd much rather smooth/optimised gameplay with 60fps than otherwise pretty decent graphics but locked at 30fps.

    While i'm not suggesting that graphics are "nothing", in this day and age, I don't think it's overly difficult to have at least reasonable graphics while still prioritising performance.

    Last edited 11/10/14 4:21 pm

      Good on you - that's what settings are for.

    I REALLY don't care anymore what resolution a game is in. I just want it to be fun. In fact, I was hoping that with all the extra RAM consoles and PCs have today, along with the massive leaps in CPU technology, that maybe all that extra computing power would be put towards Artificial Intelligence. I want to see enemies that behave realistically, not CPU controlled team-mates who run into my line of fire, or defenders who team up in groups of 3 to mark the man with the ball while Messi runs unmarked into an open box.

      I totally agree with you man.
      All this (very wrong) talk about this gen looking barely better is crap. It looks stacks better, but we seem to not be getting fantastic AI and even effects, just a resolution bump, more grass and better (much better) dynamic lighting.
      As a PC gamer though I have had that for years and want my freaking AI and better optimisation.

      Didn't Ubisoft say they wanted better AI, which is why the game isn't 1080p/60FPS? In the other thread about this, the non-fanboy consensus seemed to come to the conclusion that the PS4 didn't have enough CPU and the XB1 didn't have enough GPU, so they found a middle ground between the two.

      Finally! !!! Someone with a brain! Thank you for being awesome.

      And yes, Ubisoft did say the resolution has been downscaled because it has so much AI going on. AND YES! For all you PS4 fanboys out there, it's not only because of the Xbox One that this game has been scaled back in resolution. It's more so because of the PS4 that this game is running at 900p.

      This article is just full of assumptions.

      Last edited 12/10/14 1:11 pm

    (Sticking my head into the world of games) This was published today? 1080p is still an issue? With TVs it used to be a luxury, in 2005. With any kind of content, I always go for 1080p minimum. It just plain looks a lot better. In fact, it's humdrum. Why are games so far behind? I smell a swindle in the games industry.

    Seems reasonable, right? Nope. People were furious.

    What sort of people were furious?

    Most of my console-only friends (Ones who have never played on PC) don't even know what resolutions are. They unpack the console, attach the HDMI cable, and play with what default factory-settings were given to them.

    It's only my PC friends (Including myself) that whine and bitch about console resolutions, lack of anisotropic filtering, average texture quality, 30 frames per second, average draw distances, pop-in textures, loading times, online multiplayer payments, lack of steam sales, and sometimes lack of PhysX.

    PC people shouldn't game on consoles. It's like dropping an arrogant, ignorant, unforgiving wealthy douche into a third world country.

      You and your friends are a very small part of the gaming demographic.

        You're right, but that's how I see it from where I'm standing.

        The console-only players I've met are just there to play. They couldn't care less about resolution. The PC players I've met all have this pseudo-intellectual, PC master race mentality where they are all very wary of what consoles bring to the table.

          Generalising based on what you experience is a sure fire way to be flat-out wrong.

          I play both. I built my PC from the ground up and own every console released in Australia and fiddle with PC performance regularly and I don't care about resolution on consoles. The reason is that I don't KNOW about consoles and neither does anyone else really. Noting specs and even analysing the architecture will get you almost nowhere in reality but seems to fill these people's heads somehow with undeniable knowledge. (or so they believe)

          I have also never met anyone like you have described, console or otherwise. Maybe my friends don't differentiate based on video games or something ridiculous like that.

          Last edited 12/10/14 9:56 pm

            I don't care about resolution on consoles. The reason is that I don't KNOW about consoles and neither does anyone else really

            You can find the native resolution of console games using emulators and soft-modded consoles. I'm sure someone, somewhere has made a list of native console resolutions on the internet.

            Maybe my friends don't differentiate based on video games or something ridiculous like that.

            You should differentiate games, it prevents you from being ripped off. A great example is Minecraft.

            Regardless, Ubisoft shouldn't be holding the PS4 back. If it's capable of more, then churn it out. Or at least cap the game at 120FPS so people with 120hz screens can drool over it. Or introduce a HD texture pack or something.

      I've met a console user who knows absolutely nothing, and still raves on about how great the PS4 is because it runs 4K resolution on every game and the XB1 runs 720. Did I mention he is playing his PS3 on a 32cm CRT TV? He keeps asking me how to upgrade to HDMI to fix the picture, and doesn't believe me when I say buy a new TV.

        Hahah this is great. Another example of someone who can't think for themselves. PS4 power!!

    I'm not worried... they've been out less than a year.

    Developers will figure it out eventually, remember how good games looked at the end of the last generation?

    People just need to take a deep breath and calm themselves the hell down.

      That was because pretty much each new generation has used different hardware/architecture to what came before. As you said it took time for devs to learn the ins and outs of the system and how to most efficiently use the hardware.

      That really isn't the case with this generation, as the new systems are x86 and more or less (obviously so in the XBO's case) off-the-shelf PCs, which we have been using for decades now; there are no new tricks to be learnt, they've been known for years.

      Things will of course improve as the devs learn to write more efficient code but I doubt we are going to see any massive strides in fidelity as the devs 'unlock the secrets of each console and get to know the hardware' like we did last generation.

      Last edited 11/10/14 7:37 pm

        I not only 100% agree with you here but I have been saying the same thing since they got announced. There is no IBM custom this or Cell architecture that to master. Just two-year-old APUs with a transistor count a bit higher than mid- to high-level GPUs had two years ago.
        I'd rather have paid a bit more for my PS4 and got something a bit more ballsy, but it is what it is.

    With good quality video I can tell the difference between 720p and 1080p, but with video game graphics, I can't tell the difference at all, because no one knows how to take advantage of it even on PC, where you have the power to make highly detailed models and run at 1080p.

    I think it's mainly Kotaku that perpetuates articles about resolution and frame rates. I see SO many of them on this site.

      Actually it's Eurogamer (specifically Digital Foundry). When the only difference between consoles now is resolution, they make a big deal out of it.

    Ok, PS4 owners paid Sony for a console they thought was superior. Are they going to pay Ubisoft more for a superior version of the game? Did Sony have anything to do with it? Clearly Sony aren't paying Ubisoft anything this time around like they did for ACIV exclusive content.

    You might "expect" Ubisoft to offer you more, but why should they spend more development resources to fragment their game further for you to pay the same money? Who's to say they're not telling the truth about issues with processing left over for little details, NPCs, etc.? Would you really rather have a version of the game with crappy AI but 1080p, if that was the option?

    1080p is so 10 years ago; I've had a 1600p monitor for ages. I'm not bragging, it's just like, wtf. And really, if you are just learning about 1080p now then you are such a noob.

    What a shock, an article about CONSOLES and their graphics is about 90% "PC Master Race" comments.

      There wouldn't be the current anxiety about 1080p if PCs had not been delivering that standard for years.

    PFFFFT whatever, you consolers stick with that... meanwhile I'll be sitting back enjoying Evolve and Star Citizen in 4K instead of 720p upscaled...

    PC master race

    For those old enough to remember the SNES vs Megadrive/Genesis wars: if a company had tried to pull this shit by nerfing the SNES version, there would have been a massive backlash, even without the internet to vent our disapproval!

    The main sticking point with AFL fans is that 7 refuses to shoot in HD (not even 720p); as a Hawks fan I am livid that our period of success will never make it to Blu-ray.

      Well... it did kind of happen all the time in the old days.
      This thread has some memory-joggers for you:
      http://www.sega-16.com/forum/showthread.php?19078-Genesis-vs-SNES-Game-by-game-comparisons-repost-of-lost-thread

      Even when the games weren't completely different entities (like Aladdin or Spider-man, or Shadowrun), when they were kinda similar, there were still often cases of massive performance/graphics differences on each.

      I don't recall people getting quite as angry about it, though.

    If it's a console, I don't care. If you do, it's because you've been conditioned to. It's nothing worth caring or worrying about and most likely isn't even in your realm of understanding despite what you think.

    If it's PC, I care. Lots.

    We know a lot of gamers consider 1080p with 60 frames per second to be the gold standard.

    With a new generation of consoles, shouldn't it be the minimum end goal, not the just-out-of-reach dream?

    I play on both consoles and PC. At one point my PC had to play games at 800x600 with settings all on low. Guess what? I had a marvelous time with those games despite my computer's inability to play them at peak. I can now, but I still play on medium because I just. don't. care.

    I wish more gamers had this mindset, focusing on the fun. But that's too much to ask and petty arguments about numbers just get stupidly large and all the attention.

    It is getting pretty old, really. The games still look pretty good to me. Who cares if PC is 1080p standard? If it means so much to you, go play something on PC. Just play the damn games on whichever console you choose.

    Shadow of Mordor is a really good example of what happens when a studio is not limited by the hardware of the current generation of consoles; specifically, the draw distance.

    SoM has great draw distance so it doesn't need fog or blurring effects to hide the limitations that the last generation of consoles needed. As a result, the game looks spectacular and I reckon PC gamers are really happy to see a multi-platform AAA that looks superb.

    That said, it looks like the XB1 is already at its maximum potential and we're only 12 months into its lifecycle! What happens in 18-24 months, when the newest batch of CPUs/GPUs are out for PC and we're back at the huge gap in graphical power between PC and consoles?
    The answer is that studios will once again be limiting their engine and post-processing designs to the limitations of the console platforms =(

    Hahaha PC Master race swoops in....

    Fair enough, you guys can boast about having the "gold standard" 1080p, and sometimes even pushing it further. But hey, I've got every single console of current gen, last gen and a PC that'll run anything under the sun. But I still choose my PS4, you know why? For every 4 or 5 or so games that I've purchased on PC, I've had to spend hours searching through forums and calling up customer support just to get the bloody thing working.

    And you can say "oh my PC can run 4k gaemz PC MASTER RACE FTW", but seriously, how much did you spend on your PC? When I was looking to build a PC it cost roughly $1000 to build anything near the specs of any of the next-gen consoles, and mind you, to exceed them would've cost much more.

    Hey guys, hows GTA V treating you? Did you guys enjoy Red Dead Redemption? How good was The Last of Us?

    ....Oh.
