Why AMD And Nvidia Are Fighting

AMD and Nvidia are at it again. The two reigning champs in the market for video game graphics have been fighting since late last month when some performance issues on the PC version of Watch Dogs kicked up a fresh controversy. And given that AMD is still talking about the issue publicly, it doesn't look like things are going to settle down anytime soon.

Are you one of the people perplexed by all the sound and fury emanating from PC gaming forums? Don't worry: I am too. To help us all get up to speed, I prepared a handy guide to the main talking points here.

Uh, what are AMD and Nvidia exactly?

Great place to start! AMD stands for "Advanced Micro Devices." Nvidia stands for Nvidia. Both companies are best known in the computer and video game industries for making graphics cards (also known for our purposes here as Graphics Processing Units, or GPUs) and other bits of hardware that go inside a wide range of gaming devices — anything from smartphones and consoles to the highest of high-end gaming PCs. AMD secured contracts with Nintendo, Sony, and Microsoft to supply graphics hardware for all three of the current-gen systems (the Wii U, the PlayStation 4, and the Xbox One respectively). That hasn't stopped Nvidia from edging ahead of its close competitor in other areas, though.

Have they ever been at peace with one another?

Not really, no. They're sort of like the Coke vs. Pepsi of video games. That comparison is all the more relevant considering that some of their other competitors, like Intel, have captured a much larger portion of the overall graphics market by appealing to PC users who don't play demanding games and thus don't care as much about spending upwards of $US300 on the best graphics card imaginable. Something similar happened when Pepsi and Coke locked horns so intensely that they didn't notice other, smaller competitors had started making little things called energy drinks.

Is there a substantial difference between their cards?

It depends on who you ask. Last year when we polled our readers, the Kotaku community seemed to overwhelmingly favour Nvidia cards. That doesn't say anything about performance, mind you — just people's preferences. But market share could be a significant issue here, since Nvidia has been beating out its closest competitor specifically in the PC realm in recent years. Here's a quick description of Nvidia's current, enviable position from the financial site The Motley Fool:

NVIDIA has benefited from the growing PC gaming market, with revenue from its GeForce gaming GPUs rising by 15% in fiscal 2014. This growth came during a continuing decline in the PC market as a whole, with NVIDIA specializing in one of the few areas that have remained immune to the PC sales slump. NVIDIA's share of the discrete GPU market has also been on the rise, with the company now commanding around 65% of the market. NVIDIA was nearly even with rival AMD back in 2010 in terms of market share, but the gap has been widening each year.

What does that have to do with anything?

Well, each company's influence in the PC gaming market rises and falls with the weight individual game developers give it. So if a company like, say, Ubisoft believes that lots of PC gamers use Nvidia cards over AMD tech, its executives would probably feel more inclined to form a special partnership with Nvidia — particularly if they were convinced that keeping Nvidia happy would guarantee them the rapt attention of 65 per cent of PC gamers.

Whenever AMD and Nvidia butt heads, vocal critics begin to use rhetoric about innovation, corporate bullying, even monopolization. Nvidia's relative success doesn't necessarily mean that AMD users should suffer anything just because they're not enjoying the fruits of some special partnership, however.

Ok, so then what happened recently to kick the hornet's nest?

So speaking of Ubisoft, the company finally released its new open-world game Watch Dogs last month. Once it came out, it was met with an outcry from many irate gamers experiencing problems with the game's PC version. Many of these were online connectivity issues, but breakdowns of relevant versions of the game also suggested that it wasn't always running nearly as well as it could — particularly on PCs that were equipped with AMD hardware rather than Nvidia cards.

Late last month, AMD's Robert Hallock took Nvidia to task for this, blaming his rival's GameWorks program for deliberately undermining AMD products and thus effectively disenfranchising gamers using its graphics cards. Even for a corporate rivalry, the language here was incredibly strong; Hallock called GameWorks "a clear and present threat to gamers." From the original Forbes story:

"Gameworks represents a clear and present threat to gamers by deliberately crippling performance on AMD products (40% of the market) to widen the margin in favour of NVIDIA products," Hallock told me in an email conversation over the weekend. But wait, it stands to reason that AMD would be miffed over a competitor having the edge when it comes to graphical fidelity and features, right? Hallock explains that the core problem is deeper: "Participation in the Gameworks program often precludes the developer from accepting AMD suggestions that would improve performance directly in the game code — the most desirable form of optimization."

Holy crap, I'm a gamer. Am I being threatened?

I don't know, maybe! I'm a gamer, and since I created a Kinja account people have been saying some pretty scary stuff to me. I'm ok though, thanks for asking.

Oh wait, you're asking about AMD! Well, maybe, in some long-term sense: if GameWorks really does end up stifling innovation and impeding performance the way AMD is warning us it will, the entire industry will stagnate and we'll all be cast back to the stone age of chunky pixels and bloops.

But no, I think you're fine for now. The thing we do need to keep track of, however, is how many games that come out for the PC exhibit radically different performance levels on AMD or Nvidia cards. GameWorks, the program at the centre of the current spat, is less than a year old. We don't have enough material to sink our teeth into yet, let alone start making dramatic predictions. And seeing as Watch Dogs took five rocky years to make it to market and doesn't look as good as many expected it to, it's not clear if its problems were an anomaly or symptoms of something truly systemic.

Ok, back up a sec. What is GameWorks exactly?

It's a new-ish program Nvidia came up with to work more closely with PC game developers than it had been previously in an effort to spiff up the performance of big games on PC rigs using the company's tech. Nvidia announced it last October around the same time that Batman: Arkham Origins came out — one of the first games the company highlighted as benefitting from the new program. As I said in a recent story on a deal Nvidia just inked with Ubisoft, GameWorks "was designed to put the company closer to the entire development process of a given game — giving a company like Ubisoft access to a more robust set of tools so they can 'bring an enhanced gameplay experience to [...] PC players,' as Ubisoft VP Tony Key put it in today's press release."

So basically: Nvidia gives a game developer an enhanced set of tools to help a game look better. In exchange, Nvidia gets to join in on the marketing for a highly anticipated game like Watch Dogs and thus spread its "The Way It's Meant To Be Played" mantra.

Just to be clear here in case the name confuses you, GameWorks is a developer-side initiative. That's part of what's been making PC gamers so upset: many feel that their experience as average customers is being negatively affected by a lot of high-level corporate deal-making in which they have no say. So if you're a PC gamer who just happened to buy a high-end AMD card a few months before Watch Dogs dropped, you're out of luck.

Ok, so AMD is upset that Nvidia is getting some kind of special treatment from top-tier game developers like Ubisoft that it's not receiving?

Well, AMD is framing it as a clash of two different ideals: an open-source ethic that it embodies as opposed to a ruthless competitive spirit that's driving Nvidia to make unfair backroom deals with no regard for the state of the industry as a whole. Here's how AMD's Robert Hallock described Mantle, the company's software that's designed to make it easier for game developers to process visual effects for games (particularly when played on PCs with AMD graphics cards), in an email to me this week:

Operating in the spirit of sharing and transparency positively impacts the industry by creating an environment of mutual learning. If game developers are open and honest with us about their challenges, we can design hardware/software solutions that directly address development issues. Mantle is a great example of this. And, if we are open and honest about the form and functions of those solutions, they may be analysed and investigated by developers to learn more about the nature of our hardware or to discover a previously unthought-of technique. In contrast, if one party is not playing with all cards on the table, then problems can certainly be solved, but the nature of the solution is indecipherable and nothing is learned — the industry has not moved a step forward.

That's the idealistic part. As for the Nvidia-is-terrible portion of its current messaging, AMD's Richard Huddy doubled down on the company's previous charges in a recent interview with Maximum PC. You can watch the whole interview here:

The relevant discussion comes around the 30-minute mark, when Huddy starts talking about how GameWorks forces ISVs (independent software vendors) into contracts in which they must use code provided by Nvidia that hampers the performance of AMD hardware.

Now: Mantle isn't necessarily optimised for Nvidia cards either. Huddy's critique here is that Nvidia is prompting game developers to use software that not only runs less gracefully on AMD cards, but is harder to work around because the company black-boxes relevant parts of its code. Meanwhile, Huddy insists — like Hallock did above — that at the very least Mantle is an open source technology in comparison.

The key quote, to me, is when Huddy says: "We are running code in a benchmark which is harming us and this code has been written by Nvidia, and their contract is stopping the ISVs from changing it. This is not equitable."

Oh, snap! So what does Nvidia have to say for itself?

Starting with the original Forbes story, a number of Nvidia spokespeople have repeatedly denied AMD's assertions, calling them "mysterious" at best. When I reached out to an Nvidia spokesperson this week, he responded by saying: "Hi...boy, if AMD spent as much time working on their drivers and actually making investments in gaming than they did talking about us, then maybe their customers would not be stuck with sub-par gaming experiences in today's cutting-edge titles."

These are some serious fighting words. So who's in the right here?

It's hard to say, honestly. Tech-heavy analyses by writers at places like ExtremeTech, Forbes (which ran the story that started the recent kerfuffle), and Digital Foundry have broken down the issue to show that Watch Dogs' performance really is subpar with AMD cards. But that's a different statement than concluding Nvidia is to blame for AMD's problems. Digital Foundry, for instance, simply suggested: "a particular rethink is required in the way that AMD graphics hardware handles this game."

Wait, has AMD ever entered into exclusive contracts with big game companies?

It has. You know that Mantle program I was just talking about? Well, the company has formed plenty of partnerships with top-tier game developers to optimise their work for AMD's tech. When I asked an AMD representative for a full picture, she told me that "the list is hundreds long" but offered up some of the most recognisable examples:

  1. Battlefield Hardline
  2. Dragon Age: Inquisition
  3. Plants vs. Zombies: Garden Warfare
  4. Murdered: Soul Suspect
  5. The Banner Saga
  6. Dyad
  7. Guacamelee!
  8. Tales from Space: Mutant Blobs Attack
  9. Civilization: Beyond Earth (Mantle support)
  10. Sniper Elite III
  11. Lichdom
  12. Star Citizen
  13. Thief
  14. Deus Ex: Human Revolution
  15. Dirt 3
  16. Dirt Showdown
  17. Far Cry 3
  18. Far Cry 3 Blood Dragon
  19. Sleeping Dogs
  20. DMC Devil May Cry
  21. Hitman Absolution
  22. Tomb Raider
  23. Battlefield 4
  24. Crysis 3
  25. BioShock Infinite

That's a lot of games.

It sure is! And that's not even all of them. Plus, when I asked Nvidia for a similar list of all GameWorks-powered games, a representative said that it's already such a far-reaching program that "it would be impossible for us to maintain a current list."

The funny part about this is that while many people seem to enjoy playing up the rivalry between the two companies, once you start to reflect on the two you'll notice plenty of similarities. I mean, just look at Nvidia's recent demo showing off a wolf's hair in The Witcher 3. Then compare that to AMD's "TressFX Hair" tech that it promoted with Tomb Raider to show off the best, most next-gen version of Lara's lovely locks. They're going after similar things here, often in similar ways.

Couldn't you argue that AMD is just doing the same thing as Nvidia then?

It's sort of a "he said, she said" problem, isn't it? AMD insists that its current initiatives are more ethical and beneficial to the game industry as a whole. In the statements Nvidia has fired back at AMD in recent weeks, company representatives keep suggesting that AMD is playing the same game that it is — only not as well.

So are these companies constantly at war with one another?

Yes. The ongoing rivalry is pretty much the final battle in Godzilla, in that it looks gigantic and expensive but moves at a bewilderingly tedious pace.

But I'd actually argue that in spite of all the strong rhetoric being thrown about here, the two companies have a great deal invested in playing nice with one another — as long as both keep putting out popular graphics cards that PC gamers want to buy. Deliberately hampering the performance of one another's products would bifurcate the market in a way that could be disastrous for both companies.

Early on in that Huddy interview I referenced, he talks about how the entire industry has centralised around a small cluster of companies like AMD and Nvidia, leaving smaller and less successful competitors to slouch into obsolescence. The same thing happened in the console market, leaving gamers with three main choices (the Xbox, PlayStation, and Wii U currently) after companies like Sega and Atari bowed out of the hardware game.

This is one of the things that makes the game industry so much fun to watch: the entire ecosystem is still so young that it's not clear if this kind of centralisation is the natural order of things or a historical anomaly.

But I'm talking about market forces as if they're mad gods we must bow down to. The important thing to remember here is that this is a consumer-facing entertainment industry. We, the average gamers, have the power to tell companies what we want. Why else would we be buying and installing these graphics cards in the first place? Because we're just waiting for something better than Nvidia and AMD to come along?

Fuck that. That's not the thinking that brought us Mario, or Princess Peach, or Nazi murder simulators, or the insane mirror image of myself I've made in Tomodachi Life and feel strangely drawn to. If you don't like something that's happening in games (on your PC or otherwise), you have a voice now more than ever. So keep your eyes peeled for more games with problems like Watch Dogs has had, and let us know about them.

Ok, but I play my games on consoles. Does this matter to me?

For the time being, it sounds like everything is copacetic on the console front. An AMD representative affirmed in an emailed statement that Huddy's comments were only about the GameWorks program, which, again, is a PC-centric game development program.

As I mentioned earlier, AMD tech is in all three of the major current-gen systems. Major developers understandably optimise their work for the PlayStation and Xbox consoles. As for the Wii U? Well, that's a whole other story.

Well, that was intense.

I know! I think I need a cigarette right about now.

That's a nasty habit.

Thanks mum! It means so much when you make it all the way to the bottom of my articles like this.


Comments

    AMD are a bunch of whiny cry-babies
    They pulled this "oh woe is me" crap against Intel back in the day accusing them of unfair practices
    Just face it AMD, if you can't keep up in the game, you can AND WILL get left behind
    I would hate to see computer technology slow down just because one company fails to innovate as fast and goes crying to mum that the big bad competition is being tough on them

      I get the feeling that you either didn't completely read, or just skimmed the article [or just wanted to comment after reading the title].....

      There is nothing here about tech capabilities and innovation. On the GPU front, AMD and nVidia are effectively neck and neck. Each graphics card release just out-doing the opposition like clockwork.
      So, no, this isn't an issue of AMD not being able to keep up and innovate in the GPU world.

      The issue here is the ways in which GPU makers *both* have programs in place to work with game developers to optimise their games for their own hardware [I'm sure there are some financial dealings involved here, too]. They do this in the development process, and in application-specific driver optimisations.

      Quite frankly, they *both* do this, and quite frankly I think they are both liable for negative consequences, and they both need to rein things in a bit. The more this practice goes on, the more it may lead to a fragmented PC market. Not only will we have xbox vs PS, we will have AMD vs nVidia, each brand getting its own list of optimised titles.

      Frankly, both companies' success has been due to the open nature of the PC market, using open standards [openGL, and, while not necessarily 'open' but definitely a standard, directX]. Optimising for single brands is just a bad idea for everyone, and both nVidia and AMD need to stop the bullshit.

      Either way, the point of this is that they are SO close in tech capability that they are having to resort to questionable deals like this to try and get an edge over the competition.

        I completely agree with you that both companies are neck-and-neck technology-wise, but the business playing field encompasses more than simple technological innovation.

        I have used both companies' products on many occasions and have found the nVidia experience to be the superior one on more occasions than not, and this was mainly due to greater driver support and updates.

        While I will concede that my initial comment was simplistic, very much so, I don't comment here just for the sake of seeing my own name appear on a web site. The point I was trying to make still stands. If AMD can't, or won't, compete and keep up in all aspects of the graphics "business", they will get left behind. All that they are currently achieving with their complaints is to slow down their own growth. If they don't want to compete in this way, come up with a better idea and change the rules of the game. That is innovation.


          The problem is if people think it's 'good business' for Nvidia to continue making these deals (giving their cards better support on certain games but screwing over AMD), why wouldn't AMD start doing the same thing? In the end we'll be left with PCs as segregated as consoles are now..

          Want to play Watch Dogs? Tough shit AMD guys. Want to play Star Wars Battlefront? Tough shit Nvidia guys.
          Do you really want that?

        I don't think you read it correctly. You may think they're neck and neck, but there are a number of assertions in that article that AMD are underachieving in key areas of business and technology. It's not to say that this isn't an issue that still involves nVidia, but you're flat-out misrepresenting the information.


          The only assertion I could find to that was the comment from the nVidia representative saying that AMD need to spend more time working on their drivers.... I'd hardly call that an objective statement ;-)
          Likewise, AMD have stated that nVidia have partnerships with game developers, in which they must use certain components of code in their products, which will cause performance degradation when run on AMD hardware.
          Now, sure, that is a statement, but it isn't an opinion. It is currently hearsay, but it *can* be verified.

          I'm not an AMD fanboi, I've got all sorts of systems which meet whatever needs I have. I'm always looking for the best experience on whatever PC I'm using [as a consumer and producer of software]. I honestly don't care about nVidia vs AMD, but I do want to continue to see an *open* and innovative industry.

          With this in mind, AMD, for quite some time now, have been pushing for open standards and specifications across the industry, whereas nVidia is continuing with proprietary, in-house practices only.
          Examples?
          GPGPU implementations: AMD has helped develop and push openCL, whereas nVidia uses CUDA.
          Graphics APIs: AMD is developing Mantle, which is an open standard. Not as a means to segregate themselves from nVidia, but as a means to remove their reliance on directX, which puts them in the hands of Microsoft!

          If you want to talk about innovation and trying to push the boundaries, AMD is doing just that, whereas nVidia is relying on trade secrets.

          It's not even a question of framerates and performance at this point in time for me. It's a matter of asking which technology future do I want to live in? And at the moment, I want to live in the one AMD is heading down.

          Too often this argument is dominated by bros with their 1337 gaming-rig allegiances.

          I'm saying this is bigger than just pushing pretty frames onto a screen. It is about the direction of computing technology. nVidia know that they don't have a business model which is sustainable into the *long* term future. GPU functions will eventually migrate to the CPU as the technology develops, in the same way that we all just take 'onboard' sound for granted on motherboards. This isn't just for gaming, but for the GPU functions which are being leveraged by openCL and the HSA paradigm. Why have a separate card when it can be on the CPU itself?
          Every other manufacturer has the tech portfolio to proceed down this path; Intel and AMD both have CPUs with currently-not-awesome-but-getting-rapidly-better GPU implementations.
          In the x86 PC market, nVidia only has a graphics arm, so it seems they are doing whatever they can to kill the competition and prevent this future from transpiring.

          Yeah, it may seem a little tin-foil hattish, but I'm looking at the long game here, and this is how I see things going down. I'll buy you a coke if this doesn't end up being the case :p

            My tinfoil hat dream is the opposite of your 'no GPU, only CPU' dream. Because GPUs are so much more advanced than CPUs (in some ways, depending on how you look at it), I would love to see a system with no CPU, only a GPU, and I suppose GPUCompute 'could' be paving the way to that.

            You could have low-cost, high-performance 'all in one' systems built off of JUST the GPU. Imagine that: system memory etc, all from the GPU shared memory. The GPU being like a motherboard: the one card/board you connect everything to, like your hdd etc, that can be replaced and upgraded like your traditional GPU. Imagine: upgrading almost your whole system (CPU, GPU and RAM) just by getting a new GPU. And you wouldn't even have to re-install your OS.

            A radical pipe dream? Admittedly, yes. Impossible? No.


            Nvidia = Apple
            AMD = Android
            Lol.

            Anyways, I do believe Nvidia is screwing up PC gaming by doing something like this. My Nvidia GTX670 died after 9 months and I don't blame Nvidia, I blame Gigabyte, but when it came to buying a replacement I went AMD 280X because it was well spec'd, decently priced, and Mantle was coming out as well as Battlefield 4. The only issue I had with getting an AMD card was that it was not compatible with my 3D monitor. A loss I'm happy to live with.

            Anyways, when it comes to Battlefield 4, AMD didn't restrict the playability of Nvidia cards. They had Mantle to improve their own cards. To me, this article sounds like Nvidia is trying to block AMD cards from performing just as well in a stock standard situation. You put an AMD and an Nvidia card equally spec'd against each other on BF4, and they will come out similar. Whereas on Watch Dogs they won't. That's the issue.

            By the way things are now, Nvidia or AMD with Watch Dogs, you're still stuffed. Lol.

        I take turns going from Nvidia to ATI every time I upgrade because of their different benefits, and so yes I agree.

        I've had buggy drivers and iffy cards from both teams over the years. Nvidia can be more on the ball for drivers BUT AMD cards are cheaper in terms of price to performance.

        At current I have an AMD card and all has been great in terms of drivers. The card performs how you would expect it to: in most games it is neck and neck with the more expensive Nvidia card in its performance tier.

        Over the years the only difference I have found is that, while both teams have similar numbers of games 'optimised' for their cards over the other, if a game is optimised for AMD the nVidia card is close on its heels. If, however, a game is optimised for nVidia then the AMD card runs like a turd.

        What does this mean to me? My next card will be nVidia... Why? Because:

        1) When I upgrade from what I have now I want to splurge, so price to performance won't matter.
        2) It's just nVidia's turn for me.

        Picking sides is always stupid because you stand a chance of missing out on what is best for you at the time. On a side note, I have found that nVidia do 'appear' to cock-block AMD at times, but in the end that is business and the 'feud' only benefits all of us.

      Kind of agree; it looks like they are both assholes, and AMD is just having a cry because they're not winning. If the tables were turned it would be the exact same argument.

      Correct me if I'm wrong, but I thought the PS4 and XBone both had AMD CPUs and GPUs inside them? So...... wtf are they carrying on about? Seems like a lot of greed to me.

      Don't take what I am going to say as the 'be all end all' comment, but you are forgetting AMD's price to performance.

      The R9 295x2 is a great example of what AMD CAN do if they want to. It is fast, cool, quiet, smaller (PCI slot-wise) than the competing Titan Z, water cooled and cheaper too.

      They both have their market, and I find BLINDLY slamming one team is counter productive.


    That's it, I'm going back to my voodoo!

    AMD's assertions about Gameworks forcing exclusive AMD-harming contracts are bunk. While I can't speak for Ubisoft specifically, I have former colleagues in other studios using Gameworks who back Nvidia's statement that there is nothing anti-competitive in the program at all; it's just a way for Nvidia to assist with optimisations earlier in the development process. AMD is and always has been free to engage in the same sort of thing with the same companies that use Gameworks.

    The ethics argument is also mostly hot air. It's a marketing ploy to try to paint AMD as 'good guy hardware maker' in the same way Apple tried to paint itself as the underdog long after it was no longer true. I don't want to see AMD leave the discrete graphics market, but there's a degree of rising panic in the company in light of its long-diminishing market share, and trying to portray the market leader as greedy and anti-consumer is an age-old tactic to try to claw back market share. Nvidia's response is much closer to the reality - they're both playing the same game, using the same tactics, it's just working better for Nvidia than it is for AMD.

    Both players are needed for the market to be consumer-friendly, and it needs both of those players to be actively competing with each other to drive innovation and technological advancement. Open standards seem like a holy grail for consumers but they actually slow advancement in the long run by disincentivising new technologies and pushing the sole competitive focus onto performance. The way the market works at the moment is ideal, we just need AMD to step up its game in the areas it's weak (like drivers) so they'll keep Nvidia on their toes - and the same in reverse.


    I personally love these historical rivalries such as AMD/nVidia, Coke/Pepsi, Blue/Red lol
    I see it as motivation for the other guy to push even harder and throw out even better stuff than the other guy, motivation to be the best of the best!

      I see it as motivation for the other guy to push even harder and throw out even better stuff than the other guy

      Or just complain...

    This fight has been going on since ATi days before AMD took over. Like in every industry, each competitor tries to outshine the other by coming up with their own proprietary innovations. I wish they could just work together and keep things consistent.

    Like how Nokia, Blackberry, Samsung, etc now all use common micro USB chargers. Small thing, but it sure is nice. I've even noticed manufacturers like Remington get on to the same standard.

    Also to add to the article: AMD cards have always been traditionally cheaper for the same level of performance. People generally complain about driver issues though (of which I've never had any).


    Forget AMD versus Nvidia. Let's talk about the real scourge of PC gaming - Ubisoft!

    I went full AMD recently (about a year ago) when it was realised that AMD would be powering the next consoles. It's a sad world when you realise that PC does not take priority for devs anymore. The console version gets made, then they make the PC version. My logic was that if consoles were AMD, games would run better on AMD hardware :) Guess I'm just silly lol.

      I have made a similar assumption.

      Apparently logic is not welcome here.

    Only skimmed the article, but as a game consumer I don't care if games are optimised for one card or another. The bulk of what I play games for is gameplay and enjoyment; yes, graphics are good to have, but I don't care if my version of BF runs 10 frames per second faster on one card than the other (given that it's running between 50-60fps).

    I have had both NVidia and AMD, and for me the most important factor these days is where I get the most bang for my buck (so for the past 2 PCs I have had AMD cards in them). Never had any issue with them.

    I'm just happy that we have two companies whose rivalry gives us better and faster innovation.

    “Hi…boy, if AMD spent as much time working on their drivers and actually making investments in gaming than they did talking about us, then maybe their customers would not be stuck with sub-par gaming experiences in today’s cutting-edge titles.”

    #rekt

    Sort of a related question, but does AMD get somewhat of a performance advantage in many games (not just the 'Made for AMD' ones) due to their GPUs being used in the PS4 and Xbox One? Would this make console ports easier for developers to do since all the 'AMD optimisations' are already there, or does it not really work that way?

      No, they don't, unfortunately. Yes, AMD make the hardware for the consoles, but it is not their software that drives them, and that's key to performance. Your drivers are what convert what the software says into what the GPU hears. So the big issue here is really about drivers rather than hardware: when they say Watch_Dogs doesn't perform as well for AMD on PC, you can think of it as not performing well with AMD's drivers, rather than on their hardware.

        Ahh okay that makes sense. Just was something I always wondered.

    Apropos of not much, Coke and Pepsi *do* make energy drinks...

      They only started making them once they were getting trounced by V and Red Bull, though.

    I have a 780 Ti and Watch Dogs still runs like shit, so whatever deal Nvidia has with Ubisoft didn't work.

      I think it might be more that however Ubisoft is developing its games for PCs isn't working.

    I have an AMD 7970 and run Watch_dogs on Ultra/high at 60FPS. I have not had a single problem with Watch_dogs at all.

    So is the gist that Mantle is an option to make games run better if you can support it but doesn't otherwise hamper the process for everyone else, but that the Gameworks process does end up hurting performance on non-nvidia cards? Because that's how this debate reads every time it comes up from either side, and it makes the two things apples and oranges that aren't remotely comparable.

      That's my understanding.

      Separately, there is an issue of one or the other of the graphics card companies getting game devs to spend time optimising for their hardware when they could spend that time optimising for all hardware.

    The best outcome would be for both companies to make it easy for games to be optimised for both sets of hardware. Which shouldn't be hard, but apparently is.

    Also have game companies ever done a survey about how much gamers hate seeing random unrelated corporate logos at the start of a game? Because that is probably the main impact of these arrangements for most people.

    Just a point on the micro USB chargers:

    The European Union enforced that all mobile chargers be the same.
    The only reason Apple still has the Lightning connector is that in the EU, when you purchase an iPhone or iPad, they also give you a micro-USB to Lightning adapter so you can use micro USB chargers.

    I have used both cards over the years, and apart from the 9800, AMD cards have always been subpar, and it's really the drivers that do them an injustice. Many times I have had weird issues with AMD cards, yet I never have issues with Nvidia cards; they just work.

    I don't like what nVidia are doing, but AMD must think I'm an utter moron if they think that they've convinced me that they wouldn't do EXACTLY the same thing to nVidia that nVidia is doing to them. nVidia aren't good guys - they're a business and their main motive is profit, just like with any reasonable business in a capitalistic society (which I enjoy living in! Don't get me wrong - while I know capitalism is flawed, I sure as heck don't want to live in Cuba!).

    But AMD is also a capitalistic business primarily driven by profit. This "hurting the industry" bullcrap is just.... bullcrap. If AMD were the top dogs, they would do the same thing and to heck with "hurting the industry".

    Again - I'm not defending what nVidia are doing, I'm saying that AMD look silly when they cry out that nVidia are being "immoral". Oh really? A big business putting profits over morals and the long term health of its industry? COLOUR ME SHOCKED SHERLOCK!

    AMD are just as money-hungry, profit-obsessed and corporate as nVidia, and vice versa, and anyone who doesn't realise it must also think that McDonald's really values them as a customer, that Nike honestly cares about their health and that their insurance company really is looking out for their best interests! In other words, if you love a company you don't work for or own stock in, you're a moron. Companies are useful, don't get me wrong, they can be extremely useful. But never love a company. Never trust a company. Don't be stupid. The CEOs of nVidia and AMD would run you over with a forklift to get the money in your wallet if they thought they could get away with it.

      Well, AMD are doing the same thing, but they say they are doing it differently, if you read the article. AMD have a similar program, but they claim theirs helps all consumers by not letting their "help" limit competitors' hardware, which is what they accuse nVidia of doing.
