Nvidia: PC Graphics Will Always Beat Consoles

“It's no longer possible for a console to be a better or more capable graphics platform than the PC.” Those fighting words come courtesy of Tony Tamasi, Nvidia’s Senior Vice President of Content and Technology, in a new interview with Australian PC PowerPlay.

Maybe Nvidia is still a little sore that the Xbox One and PS4 went with AMD technology. Or, maybe the graphics giant is right. Tamasi tells PC PowerPlay that “Nvidia spends 1.5 billion US dollars per year on research and development in graphics… Sony and Microsoft simply can’t afford to spend that kind of money. They just don’t have the investment capacity to match the PC guys.”

Tamasi does concede that consoles punch above their weight thanks to leaner operating systems and APIs, but goes on to explain the technical pitfalls of their power limitations. His main point: at launch, consoles are at best equal to today's best PCs, and within a year or two PCs pull ahead again. [PC PowerPlay]

What’s your take?


    They're just jelly that they didn't get their hardware in consoles!

      I was reminded of this difference this week after playing GTA V. After coming back I'm thinking 'Wow, 360 graphics sure look nasty compared to my PC'.

      That's not to say that consoles aren't better value though. I bought a second-hand 360 slim for just over the price of my graphics card upgrade last year.

    This has been true forever, nothing new at all.

      It actually hasn't been true, and isn't true yet. Read the source article.

        You clearly read a different article to me. They even have a graph showing PC is better! :D

          The graph shows gigaflops from the time of the PS3 launch. It does not show PS2, PSX, N64 years.

          Not only that, just measuring gigaflops and saying "BETTER GRAPHICS!" is wrong. It also relies on the operating environment, fill rates, and how far removed from the bone the access was.

          Consoles won on all fronts well before then.

            "Console wins" or "PC wins". The sound of plebs on the internet

        Show me a real-life example where a console game has EVER looked better than on PC. Never happened. There's been double/triple/quad SLI'ing for years now as well, as if the tables weren't already tipped in that department.

        Don't get me wrong, it irks me when some PC elitist scoffs at a console game because they've just blown $1000 on a new graphics card (because it's stupid and a waste), but the fact remains that if you're a graphics junkie and have deep pockets then PC has always been the go-to choice.

          N64, PS1, PS2.

          Pick a game.

            I just mentioned this in my comment below, but I'll say it here. It's hard because games weren't as cross-platform as they are now, but I remember Podracer on PC was 10x better than on N64.

            How about you defend the argument and show me when a console has looked better, it won't happen.

              Tamasi: In the past, certainly with the first PlayStation and PS2, in that era there weren’t really good graphics on the PC. Around the time of the PS2 is when 3D really started coming to the PC, but before that time 3D was the domain of Silicon Graphics and other 3D workstations. Sony, Sega or Nintendo could invest in bringing 3D graphics to a consumer platform. In fact, the PS2 was faster than a PC.

              There you go, he said it himself.

              The fill rate on the PS2 hardware was phenomenal. Graphics were objectively better, and 3D modelling on the console far, far outstripped anything available on the PC.

              Of course, you're now trying to shift the goalpost from "PCs have ALWAYS had better graphics!" to "Show me a big multi-plat game that objectively looked better on a console, despite most games not being multiplat at all". Next, it'll be "Multiplat games *this generation only*".

              Consoles performed much better and produced better graphics for a long period of time. You also need to look at more data than gigaflops, considering how close to the bone console APIs are, and the massive difference that makes.

              Still stands, graphics on those systems were ahead of PC. PC was not "always better than consoles" in terms of graphics.

                I think you're being disingenuous about PC hardware capabilities. In 2000, the year the PS2 launched, the PC had Hitman: Codename 47, Deus Ex, Star Trek Voyager: Elite Force, Mechwarrior 4, Shogun: Total War, C&C Red Alert 2, American McGee's Alice and plenty of others.

                Take your pick of equivalent PS2 games from 2000, list those you think look better than the PC games listed above.

                Back then PC gaming was not a thing. It was always about console and there was no need for much graphical research and development.

                To be honest, PC gaming was all about gameplay and not graphics. It somehow shifted recently so that graphics became the main point of a game.

                Not saying you are wrong, but just to let you know: ever since PC graphics development ramped up, it has never lost to consoles.

                  I agree that gameplay should be paramount - they're games, right?

                  I know improved graphics can lead to newer experiences (3D platformers, for example) or more immersive games, so graphics are important from that point of view, but I always find gameplay the kicker. If the game isn't enjoyable, it won't matter how pretty it is.

                  It's a shame that such a big part of things these days is how many FPS you can achieve for how many seconds.

                  I think we could argue the last bit though. Even nVidia graphs show that for the last two, three, four generations, every time a console was released, it leapfrogged PC graphics card performance. The period in which it maintained that raw power lead was shorter each gen, but PC was eclipsed. Also, that's compared to "very best nVidia card available", not "most popular card used by PC gamers" only.

                  The more detailed discussion comes down to the improvements in graphics achieved by the previously mentioned API benefits of consoles, and the ability for devs to code for a static system that is predictable over a 48-month dev cycle. Also, not all devs who released a game two years into the 360 era were pushing the brand new GeForce cards to their limits - so the majority of "graphics!" would have come in under that line.

                  PC gaming was not a thing back then? Huh?

                  Tell that to me and my friends and the hundreds of other people I played games with over dial-up, LAN parties and so on from the mid 90s to mid 2000s.

                  Add-in cards built on the Voodoo chipset, like the Diamond Monster 3D (and experiments like GLQuake), were exploding around the same time as the N64. The N64 was just the first fully mass-consumer device to bring in hardware acceleration, but it was not vastly superior at all.


                Haha you're jumping to conclusions. I meant cross platforms are easy to benchmark. I took a look at Unreal Tournament on PC vs PS2.... yeah, there's no arguing that one.

                Games like Tekken Tag Tournament just weren't out on PC, so I meant that you can't compare the two. But I guess the fact that it wasn't out on PC may speak to its lack of being able to run it well?
                People can argue that games like No One Lives Forever are graphically better than Tekken Tag; it can be subjective.

                That whole PS2 phase was definitely a turning point, graphics shifting up a notch, so it's a weird zone to compare apples and oranges. If you look well before that, to the NES/SNES, then PC trumps. PCs have been trumping for 10 years (Crysis etc). I see that zone as a bit of a transition phase where, yeah, maybe there were several exclusive titles on PS2 that looked better than some PC games, but overall PC has always been master race.

                  PC cards have definitely been releasing quicker, and had higher raw grunt since just after the launch of the last gen, for sure.

                  But then it gets complicated.

                  The latest released PC card is certainly theoretically capable of more (although for this gen it took over a year for the top-of-the-range to eclipse the 360), and a lot of devs are lazy and just throw grunt at the problem until it's resolved.

                  On the console platforms, because everything is static and everything is closer to the bone, the devs can spend more time being more efficient and milking more out of what is - on paper - a vastly inferior system until the next massive jump occurs.

                  I suppose a good-but-bad analogy would be Word 2000 and Word 2007. 2007 was technically the latest, greatest, most powerful release - but it was a steep learning curve. People who had been working with Word 2000 or 2003 could produce more, better, quicker than those who stepped up to 2007 - until people really got the handle on 2007.

                  2007 wasn't immediately better - it was just more powerful.

                Are you kidding? I was playing 3D games on my PC before the N64/PS1 was even released! And the PS2 was faster than WHAT PC? Any PC? The Gamecube beat the crap out of the PS2 as far as graphics go and I wouldn't say they were any better than those of a good gaming PC of the time, not by a long shot. Far Cry was in that era, you know!

                  Mate, half of those statements came from the big guy at nVidia in the article linked to by this one. I'd also recommend nVidia's very own graphs of console performance vs their top-of-the-range graphics cards.

                  It's essentially been: Console gen released, massively leapfrogs PC. After a period of time, PC catches up, exceeds, and is then leapfrogged again.

                  PS2 released in 2000. Far Cry hit in 2004. PS3 launched in 2006.

                  So on that basis, assuming that we use your measure of Far Cry massively outstripping a console, it took four years of that generation for it to do so. It was then exceeded shortly after. And, again, keep in mind that this is only if we count the very best card that was released, not the most widely used ones.

          Wolf3D for the Jaguar looked heaps better than on PC.

        What are you talking about? It's been true for the entire current generation, as the graph in the article shows. At the time the last generation of consoles came out, console graphics hardware was equal to PC hardware, which immediately outgrew it. PC hardware today is leagues ahead of next generation console graphics hardware. Nvidia is saying that the best a console can get now is equal performance with PC hardware at launch, which they'll get by using PC hardware. They don't have the budget and R&D to beat PC hardware performance with their own hardware.

        Nvidia is essentially correct. I won't go as far as to say 'impossible', but for nearly the last decade, it has been extremely unlikely that console graphics hardware will exceed PC graphics hardware in performance.


          Is staying up to date with a PC the better option? That's an entirely different argument. I think I'll shy away from PC gaming because I have more important things in life to spend money on... but it's just obvious that graphics-wise a PC has always trumped consoles (sometimes it's the same, sometimes they floor them, but it's always been better on an able-bodied PC).

        Zap - you're a bit delusional. It most definitely IS true now, and has been true for the strong majority of modern gaming. I'll grant you that around release a few of the PS2 games looked better than anything that had been seen before in some aspects - poly count in DoA2, say. Apart from that, you haven't got much. Also, Soldier of Fortune came out on PC in early 2000, so it's debatable whether the PS2 could even match that. Games that were cross-platform or similar either a) ran better on a powerful PC or Mac, or b) looked better as a result of the increased power - and you could play at much greater resolutions if you wanted.

        "Around the time of the PS2 is when 3D really started coming to the PC"... what? Dude, 3D games were common on my ancient Amiga, never mind the Pentium era, when PCs (and even Macs) were defined by their awesome 3D games (such as Wolfenstein 3D 1992, Doom 1993, Marathon 2 1995, Quake 1996, Jedi Knight 1997, X-Wing vs. TIE Fighter 1997).

        "Every time a console was released, it leapfrogged PC graphics card performance". No... it didn't. I think your best chance at that one is probably the PS2, and even then by that stage the GeForce 256 was well established, and possibly even the GeForce 2 was out.

          PSX, PS2, PS3/360 all leapfrogged.

          It actually took PC cards over two years to reach parity with the PS2.

          Some of those 3D games you listed aren't quite 3D as such. Doom and Wolfenstein used 2D sprites for characters and objects, not 3D rendered objects.

      It certainly wasn't true when the PS2 was released.

        I'm certain that Unreal Tournament looked better on PC than it did on console. It's harder to compare games tit for tat back then because they weren't as cross-platform as now. I've been into PCs and consoles since the NES (am 30 now). I can't honestly think of a time where a console has looked better than a PC in graphics.

        Consoles are amazing, fun, generally the better choice, but talking pure graphics... the PC always has won and always will.

          That's a game that was developed for PC and ported to PS2, you would expect it to look better on whatever system it was originally developed for.

          A good example is Deus Ex: Invisible War - originally designed for consoles, but ported to PC - and it was a less than decent conversion.
          OK, it could be that the team was just lazy and did not properly play to the PC's strengths, but at a time when Lock On, GTA: Vice City and NFS: Underground were out, there was no real reason for the dumbed-down graphics, the control system and the numerous other concessions that had been made for the console version.

          I'm a PC guy myself. However, with SteamOS you can easily install it on any system (computer). I could get two Radeon R9 290Xs, a 4670K and 16GB of RAM, stick it in my lounge room, and if I'm correct Steam is calling it a console, so in theory it could be on par with or better than some PCs. Just thought I'd say that, as it is interesting.

            Sure, but it's still only 'on par'. It'll never be 'better than'.

              Well, it could. A quite unrealistic example would be four-way SLI GTX 690s; it could run Star Citizen on full.

                How could it? Any hardware you can put in your Steam Box can also be put in a standard PC. It's PC hardware you're using, after all. Hence it can never be 'better than' a PC.

                  I can see your point but, with the PS3, you used to be able to run Linux, meaning it could be used as a PC and a console.


                  There's probably a whole separate debate about what constitutes a PC vs a console, but it's a bit tangential to the topic. The point Nvidia is making is that hardware available to PCs is (and has for a long time been) more powerful than hardware used in consoles, or at worst equal to it.

                  Compared to the amount of money in the PC graphics industry, console makers can't afford to spend as much on research and development to make a new graphics card for their device that is more powerful than those already available in PCs. It's one of the biggest reasons why the current and next generation of consoles use graphics hardware made by PC hardware companies.

                  Yes it can. Closer API access is a big thing. Up to NINE TIMES the efficiency of a Windows PC.

                  But that's assuming they use the new AMD/ATI open source API that allows that low level of h/w access.

                  @Zap There's minimal performance difference between console APIs over the last few generations. The difference is API complexity, i.e. how difficult it is to get the same level of performance. You're obviously not a programmer and haven't worked with console kits before, but I am, so let me debunk whatever junk source you used: there is not a nine-fold difference in API performance between any console made to date and a PC released in the same year. There hasn't even been a two-fold difference. AMD's figure for its Mantle API is for one specific task out of the few hundred that a graphics API is responsible for, and it's compared against hardware-agnostic APIs without vendor extensions. You might as well compare the utility of a fully functional iPhone with an Android device (but only if the Play Store is disabled).

                  I notice you also haven't responded above. You asked for games that look better than consoles of the day, I gave you seven that look better than the PS2. This is your chance to back up your arguments instead of just pulling bogus figures out of your arse.


            I don't think anyone would actually consider anything Steam-based a console, definitely PC turf there :)

    Considering that the PS4 / Xbone are already using midrange equivalent components, I'd say he's right.

    Consoles are still cheaper than a high-end graphics card that will run the same game anyway.

      You can get a graphics card that can run current games in 1080p at 60+fps on ultra settings with AA for less than the cost of a console. I'm not sure where you're getting your info from, but it's outdated.

        Well, it would depend on the game. If it's the Source engine, it would be hard to imagine it running at 1080p, 60FPS and on ultra.

          Huh? The source engine isn't massively demanding. That kind of performance is easily achievable with a cheap card.

    It's mostly sour grapes due to AMD winning both console contracts.

      I don't think it is. You really think that PS4/Xbone will have better graphics than currently available PCs?

        Bang for buck, yeah. I'd like to see direct comparisons of equiv cards.

        Against a $600 to $800 card? Probably not, but I'm not spending $600 on one component of a console.

        What will be interesting next year isn't nVidia being sour, it'll be the implementation of AMD's new API. 9x improvements on older cards could see price reductions to make PC gaming more accessible.

          Console hardware is subsidised by increased game unit pricing. Have you ever heard of the term 'loss leader'? They sell the console at a loss to get it into your home, so you'll feel compelled to buy games for that platform, which they then get the rest of their money from. You're still paying $1000 for your console, you're just doing it $500 up front and about $20 a game for every game you buy.

          More 'nine times improvement' garbage. The AMD announcement about draw calls is A) about one specific function of the overall API, and B) compared against hardware-agnostic APIs. It's a pointless comparison for consoles because they don't use hardware-agnostic APIs, and it's a weak comparison for PCs because the major rendering APIs today (DirectX and OpenGL) offer seamless vendor-specific extensions, which all major studios make use of. Comparing against the stock extensionless API is disingenuous, it's just there for marketing buzz.


            I see you're happy to assume I'm not a developer or don't have my arms shoulder deep in this.

            Have you actually used the new AMD stuff? Have you personally seen a minimum 5x improvement on comparative code? Have you physically seen the improvement achieved by combining this new tech with a Rift?

            I'm assuming not, given that if you haven't managed to see the existing performance boost from current gen kits vs equivalent cards on Windows PCs, even two, three, four releases of those card lines later, you haven't truly developed for a console.

            When all the above applies to you, come back and lecture me. Until then, calm your pants and your posts.

              To answer your questions:

              1. No, I haven't worked directly with the Mantle API as I'm not on any next gen game projects. I do have several former colleagues who are working on cross-platform projects for both upcoming consoles, and I've taken both their opinions and benchmarks into account when coming to my conclusion.

              2. I haven't personally seen a five-fold improvement in code that is explained by API improvements and not simply hardware improvements, no. Neither have you, and neither has anyone I know who has worked with next gen console APIs. Mantle is still firmly in lockdown, so there are no developers that have seen what Mantle does...except that it's been confirmed that Mantle is basically an API copy of the XB1 API, which means its low level performance is likely to be on par with XB1 API performance.

              3. Rift is simply an output system. It doesn't provide rendering quality improvements or API improvements in and of its own right, you use existing rendering libraries like DirectX and feed the output to the Rift API. The Rift SDK is right here (https://developer.oculusvr.com/?action=dl), feel free to download it, familiarise yourself with it, and then send me a demo showing the difference in performance you claim exists with the Mantle API, which hasn't been released yet. I'll wait.

              I've seen the performance difference on one next gen kit (the XB1, haven't had a chance to see the PS4 kit in play yet) and it's significantly weaker than current generation PC hardware. In the past I've worked on engines for the Xbox, PS2, PS3 and X360. I've been in software development since 1997, my first game development job was in 1999. Mostly game systems (AI, physics, game logic, that kind of thing), but you get exposed to engine code as a matter of course. Unfortunately I'm missing out on being there out of the gate next generation as I'm working in business software development at the moment. The game industry, particularly in Australia, is somewhat volatile, sad to say, and job security is non-existent.

              What about you? You've carefully danced around whether or not you actually have any development experience, so why not come out and say it. Do you have development experience? What language, platform and APIs have you worked with? Have you worked with any console kits before? How well do you think you understand low-level APIs and the way they interact with hardware?

          Bang for buck? That's not what the article linked to is about though. The article is about graphics technology being at a point where PCs out-perform consoles before the consoles are even released, which didn't used to be the case.

            I'd be happy to examine that with you, but first, if you take a look back at my first comment, it was in response to a claim that PCs have always been superior. It seems pointing out that that claim has never been accurate (as you just stated as well) is enough to fire up the fanboy downvotes.

            Holding a mature discussion seems impossible here if I end up stuck in moderation. Perhaps through TAY?

    Would love to join the mustard race, but I have shitty internet where I live, so I don't really see too much point, seeing as getting games off Steam would take me way too long.

      You can buy retail copies of Steam games and avoid the initial download altogether. Then you only have to worry about patches, which are generally a good idea regardless.

    -shrug-. Since with the PC you have the potential to keep up to date at all times, and consoles tend to sacrifice even present trends in order to keep a reasonable price tag, this is hardly news.

    At least for consoles, games are optimised to fit the same hardware across the board - which means you're far less likely to encounter specific, finicky issues with the software. Console games are pretty enough for one to be content, whilst those who want to can push the graphical side to the benchmark on the PC side of things.

    And the award for the "No shit Sherlock" comment of the year goes to!

    First point: no shit, Sherlock. But this is my view of the way things are. Yes, PCs have better graphics because they can be upgraded continuously, but they also have some pitfalls, i.e. drivers, background apps etc. Consoles are used (now anyway) as a level step, a baseline if you will, as a way to control piracy and gain as much money as possible for developers. So they both have a role in gaming. The new consoles will be a boon for PC gamers, raising the bar and breadth of games, and in the end isn't that what we all want, not this tit-for-tat crap that keeps coming up?

    I'm not too fussed about it. I'm happy to sacrifice graphics for ease of use that comes with consoles. I just haven't got the time or the money to build a decent gaming PC or to continuously update its components and drivers.

      I'm going to put this to bed right now:

      I own a decent gaming machine. 6 cores, crossfired HD 7950s etc.

      With the exception of two hard drives and two graphics cards I put in it this year, I haven't upgraded since 2010.

      I didn't need the graphics cards, I wanted them. I had enough graphics power with the single HD6970 to keep going (brother uses it now).

      We don't live in the '90s anymore. This fear of persistent upgrades isn't warranted, because there aren't tech advancements every second week.

        Except for CPU sockets. And you must have got in early for a 6970.

    I've been gaming for 15 years, and the only console I remember thinking 'that looks better than my PC' about is the Xbox. And then, within the year, my PC left it for dead.

    Never since then have I thought a console game held a candle to what my PC could do.

    Of course, I had to pay for the privilege though. You'd want to hope that the PC was better, seeing how you can buy an entire console for less than a gfx card.

    Nvidia released a statement:

    "We all know PC gaming is king. Those console peasants have never gotten close to our level of sweet sweet visuals. And if you would like to keep it that way, here is our new range of graphics cards! Don't be left behind like those unfortunate console players, buy a new card today!"

    “It’s no longer possible for a console to be a better or more capable graphics platform than the PC.”

    Ok. So what's your point then?

    @Zap Consoles have NEVER leaped beyond a PC's capabilities. Ever. Not in the entire history of consoles and computers has there EVER been such an event. Simple fact is, consoles use current PC technologies incorporated into a tiny little box with absolutely NO upgrade options (with the exception of hard drive space). And even then, because of the severe limitations console manufacturers impose on themselves (CPU choice, RAM, video cards most definitely), they will ALWAYS be a step behind same-day PCs. No ifs, no buts. I've been gaming since the days of the original Ataris and Commodore 64s; pretty sure I have more of an idea of what I am talking about than you do, mate. I've been around long enough to see all the consoles brought onto the market, compared graphics and playability versus a PC, and then decided to stick with PC gaming because consoles just can't keep up. The PS3 was junk within a year of it being released. All this talk of full HD gaming, and barely anything was released above 720p, and when it was, you'd have framerate issues. Time to come down off your high horse and accept that just because you're a fanboy trying to justify your preference does not mean it's in any way accurate or true.



      Back in the day, the PS3 was bang for its buck in processing power.

    Thank you for confirming my point. You've shot yourself in the foot.

    On a side note, Steam rates the Intel HD 3000 as its most used graphics card. Compare that to the highest-selling console, the PS2.
