The Rainbow Six: Siege NVIDIA Trailer Shouldn’t Upset Just AMD Users

Ubisoft’s latest trailer showing off Rainbow Six: Siege looks nice. Very nice, in fact, which shouldn’t be a surprise considering the camera angles, panning and various cinematic shots they’re deploying for what is supposed to be an intense first-person competitive shooter. But there’s a problem: it’s got NVIDIA all over the shop. And while that’s handy if you own a flashy GeForce GTX graphics card, it’s not something gamers should be thrilled about.

It’s not that GameWorks in principle is terrible, or that these aren’t the kind of efforts we should expect GPU manufacturers to pursue. And it’s not as if the game is in dire need of it: NVIDIA’s own blog notes that the GPU in Siege’s minimum system requirements, a GTX 460, will be able to run the game at 720p and 60fps on low settings, while a GTX 960 is recommended for buttery smooth 1080p fun.

That’s a particularly achievable benchmark for the majority of gaming PCs; according to the latest hardware stats from Steam’s monthly surveys, the GTX 970, GTX 760 and GTX 750 Ti are the most common discrete GPUs in use inside gaming PCs. (The most common AMD card is an HD 7900 series chip, used by 2.21 per cent of respondents.)

So most of those should be able to take advantage of NVIDIA’s special ambient occlusion and anti-aliasing features — which Ubisoft was only too happy to show off in the trailer below.

But is it genuinely good for PC gamers to have major releases tied in so heavily to one graphics card manufacturer?

It’s caused problems before. AMD users had an immense amount of trouble with the launch of The Witcher 3 and Project CARS earlier this year. It’s not simply that AMD cards can’t run NVIDIA’s proprietary tricks like HairWorks or HBAO+ (the latter of which is advertised in the new trailer), but that the performance is so unoptimised that it’s simply not worth the effort.

AMD has blamed NVIDIA in the past for not sharing the source code to its graphics middleware, although that’s not an argument that plays out well in the real world. This is a business, after all, and there are more performance gains to be made by AMD working more closely with developers to improve the baseline performance of games. That’s especially the case with more and more users targeting 2K and 4K resolutions, where the higher pixel density improves overall fidelity so much that the need for special anti-aliasing techniques is lessened.

Developers can be partly to blame too — they don’t have to license proprietary technologies such as HairWorks, to use an example — but at the end of the day the real damage is done to competition.

When games proudly advertise their affiliation with one GPU manufacturer over another, it highlights the very antithesis of PC gaming. It’s meant to be an open, accessible platform, a platform that has always relied upon competition. That is how users get the best performance; that is how users get the best value.

This doesn’t mean, of course, that Rainbow Six: Siege’s performance will crumble the millisecond the game detects the presence of AMD’s Catalyst drivers. Considering the game’s low minimum requirements, it’s highly likely you could get a satisfactory frame rate on a potato (to use the parlance). And it’s not unusual for Ubisoft to do this either: the publisher has had a long affiliation with NVIDIA, which debuted its HBAO+ technology in Splinter Cell: Blacklist.
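For the curious, that kind of driver or vendor detection is trivial to implement. The following is a hypothetical sketch, not taken from any real title, of how an engine might gate optional effects on the reported GPU vendor string (a real renderer would query this via something like OpenGL's glGetString(GL_VENDOR)); the effect names and the gating policy here are invented purely for illustration:

```python
# Hypothetical sketch of vendor-based effect gating. A real engine would
# obtain the vendor string from the graphics API (e.g. OpenGL's
# glGetString(GL_VENDOR)); here it is simply passed in as an argument.

def enabled_effects(vendor_string):
    """Return the set of optional effects enabled for a given GPU vendor."""
    # Invented effect names, loosely modelled on proprietary features
    # mentioned in the article (HBAO+, HairWorks, TXAA).
    effects = {"hbao_plus", "hairworks", "txaa"}
    vendor = vendor_string.lower()
    if "nvidia" not in vendor:
        # Non-NVIDIA hardware loses the vendor-exclusive extras under
        # this (entirely hypothetical) policy.
        effects -= {"hairworks", "txaa"}
    return effects

print(sorted(enabled_effects("NVIDIA Corporation")))
print(sorted(enabled_effects("ATI Technologies Inc.")))
```

The point of the sketch is how little code the lockout takes: a single substring check on a driver-reported string decides which features an entire class of hardware ever sees.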

But gamers shouldn’t be proud of it. They shouldn’t welcome the sight of trailers like the one above. In an age where DirectX 12 is promising multi-GPU setups with cards from rival manufacturers, and all the performance gains that brings, the gaming public should be demanding more competition.

The above trailer doesn’t advertise that. The platform needs the two major GPU manufacturers to be fighting tooth and nail. Gamers need the two to be dropping prices left and right. Implementing brand-specific technologies doesn’t help that one iota, and PC gamers should be angry.



    I AM angry haha.

    I just wrote a huge rant on Gizmodo’s Nvidia vs AMD article (Kotaku gonna publish that?); this proprietary tech stuff is BS.

    I still reckon Nvidia offer their resources (i.e. people) to companies like Bethesda and Ubisoft for free, and those companies agree to it because they then don’t have to spend money adding VFX stuff to their engine themselves. And if not free, more than likely for less than what it would cost to do it themselves.

    BONUS Tin-Foil Hat bit: If Nvidia DO do it for free, that also explains why they’re OK charging so much more than AMD for their GPUs.

    Open Industry Standard or GTFO.

  • I have to agree. Between Steam and Nvidia GameWorks, the PC (as a gaming platform) is slowly becoming more closed as the years roll on.

    But I guess, money…

  • Played the beta on my 290X at 1440p, and had no issues. Ran perfectly. *shrug* I don’t condone optimising any title for specific cards/manufacturers/GPUs, but sometimes I think it’s a bit of a storm in a teacup. We had compatibility issues long before brand-exclusive optimisation, so not sure how this changes anything.

  • As someone who is not a PC gamer, why would someone pick AMD over NVIDIA? I constantly see NVIDIA-exclusive features for these games, but never AMD ones. Is there something AMD do much better than NVIDIA or something? Why wouldn’t you pick the brand that consistently delivers fewer issues with games, plus exclusive features?

    • There’s been a number of AMD optimised/branded titles as well, both camps do this regularly. As for which to choose, and why AMD or Nvidia, it’s usually (or should be) a case of which card is best for you in terms of price and performance. But that being said, fandom is rife in both camps, and a lot of people buy their chosen brand because things.

      Me, I was a Radeon boy back in the AGP days, and was a Nvidia boy from my 7300GT all the way through to my SLI Titans. The Titans were the catalyst (cwutididthar?) that made me switch back to AMD. Mostly because at the time, AMDs (most notably the 290x) were the best bang for buck cards hands down. Also, it drives my 1440p display perfectly without any issues. Got it for a really great price too.

      At the end of the day, both camps have pros and cons and your choice should be based on what suits you best for all the reasons.

      • Generally I’m not worried about the AMD branded titles, as the result tends to be that the game has been closely optimised for current industry standards, which benefits everyone regardless of hardware vendor (they were big proponents of DX11 adoption a few years back, for instance).

        Developers, from what I’ve seen over the years, love the level of support and engineering assistance they receive from Nvidia (John Carmack has gushed about it on several occasions). The problem is that in a number of cases this leads to the implementation of some fairly green-team-weighted solutions and a bunch of proprietary bells and whistles.

        • My favorite thing was how revolutionary AMD’s Mantle tech was, which they ended up terminating because DX12 was pushed forward, incorporating similar technology that AMD pioneered. AMD realised it’s pointless to fight DirectX and just let it go. Essentially, Mantle forced DX12 to move forward faster, because it made DirectX look incredibly dated.

          There’s been a few examples of AMD being trailblazers in various departments, and pretty much being ignored by the masses for it. Hell, they even wrote the 64-bit instruction set that Intel adopted and adapted.

          • There’s an interesting development story behind x86-64. The AMD-developed instruction set was built largely on Intel’s existing 16- and 32-bit x86 instruction sets, extended to 64 bits while remaining compatible with the older instructions (unlike Intel’s own IA-64, which abandoned that compatibility). AMD didn’t actually innovate much in the development of this standard, but they did have the business acumen to understand that a 64-bit architecture would never take off if it didn’t maintain backward compatibility. It’s definitely something to be thankful for and a valuable contribution to computing, it just wasn’t a trailblazing thing per se.

            Mantle is another interesting story. It was never going to succeed in its original form. I wrote some extensive posts on why back when it was announced, but the gist was that its focus on performance instead of abstraction made for some (in my opinion) very poor design decisions that made it a lot more complex for developers to implement. But as you point out, it did give DirectX a kick in the pants to include some performance improvements while still keeping the strong level of abstraction that made it the most popular rendering platform around.

        • and it promotes incompetence. When game developers can’t make their own games anymore, we get Batman and Fallout. It doesn’t help that every time Nvidia is involved there are massive FPS costs, screwing up the experience of the average gamer.

          The actual technology provided by nvidia is likely not the main reason. Nvidia is likely paying up in most of the cases in advertising deals etc. which include the need for gameworks to be integrated.

    • The key reason is that generally speaking, AMD cards offer comparable performance at much lower prices than nVidia. You miss out on a few fancy features and some games will run badly on your hardware because it was built specifically around nVidia, but most things will run as well on a high end AMD card as on a high end nVidia card and for a couple hundred bucks less.

      …That said, I still stuck with nVidia personally.

      • AMD’s other strength is its OpenCL performance, which is far ahead of Nvidia’s. On the other hand, AMD cards run hot and burn more electricity than their Nvidia equivalents. AMD cards tend to pack faster memory, but Nvidia uses excellent compression techniques to avoid the need for high memory bandwidth. In that respect Nvidia arguably has the advantage in the long term and in future generations, though currently there’s little practical difference.

          • In respect to what, Nvidia’s memory bandwidth? That’s where I believe Nvidia has the advantage in the long term. Their next-generation Pascal cards will feature HBM, so memory performance will presumably be on par with AMD’s latest generation from a hardware perspective, but Nvidia retains its excellent compression techniques, so it should theoretically have a strong advantage in throughput over AMD.

            HBM does have a downside in its current form though: it doesn’t respond well to overclocking. There’s been a lot of debate back and forth on whether HBM can be overclocked, and the conclusion seems to be that yes it can, but not by very much, and it reports clock settings much higher than the hardware actually uses.

      • AMD’s prices are not that much better now, if at all. The main reason, I think, is simply that they provide great performance now and later. Currently they are faster in every category but the highest and possibly the lowest. Given this, and the fact that their hardware stays relevant for more years, it’s a no-brainer buying AMD. Even in spite of all the rubbish Nvidia pushes into games (turn those features off; odds are you won’t notice the degradation in quality, but you will notice the increase in smoothness).

    • Cost is one factor. AMD cards are generally cheaper which can often mean almost jumping to the next tier of gaming performance for the same price.

  • I’m not surprised, considering Nvidia’s last shareholder meeting showed that Nvidia has around 82 per cent of the market when it comes to discrete GPUs. 82 per cent! Why wouldn’t developers jump on board and use GameWorks? It’s less work they have to do, and practically everyone can utilise it. I personally hate any of the proprietary stuff even though I own Nvidia GPUs. It should be open so AMD can optimise for their hardware as well, but sadly that’s business.

    • Actually, that is not logical. Those percentage numbers are quarterly sales; the actual installed base is something more like 60-40. GPUs do not disappear after each quarter, and clearly going with Nvidia brings performance issues for a vast majority of gamers, even on Nvidia’s side, as well as the ill will that results from those negative experiences.

      WHY they would do it is that Nvidia is paying them. Which they do. They can also form other, non-monetary arrangements to get developers to use GameWorks, such as free advertising, development assistance, free hardware, etc.

  • I was angry… but then for my next upgrade, I bought an NVIDIA card because more developers seemed to be designing for it.

  • I’ve made the decision that my next GPU will be AMD. I admit that physX was once a selling point. But I can count the number of games with it on my fingers. While the same can’t be said for their CPUs. AMD’s GPUs are competitive and have been trading blows with nvidia for years. What Nvidia are charging these days isn’t justified by the performance gap.

    • Agreed, and I am sitting on two 980 Tis so I can actually back that up.
      Don’t worry though. AMD have DX12 utilisation in the bag over nVidia (they just jumped too soon). AMD are set for a comeback.

    • PhysX was almost a clincher for me upgrading my HD 5850, but I’ve decided to hold off until next year to see what happens with DX12 and HBM and AMD’s and NVIDIA’s new ranges of cards.
      It’s a friggin painful wait though.
      I couldn’t even try the Star Wars Battlefront beta coz my card only has 1GB RAM…
      Patience is the key. Ha ha.

  • It’s Ubisoft.
    Has everyone forgotten Watch Dogs and that cringe-filled Division trailer?
    If you fall for this shit and preorder, you need your head kicked in by a donkey.

    • Really looking forward to The Division, but yes, the more time passes, the more I realise they will likely screw it up and possibly double down on GameWorks.

  • Anyone else seeing comparisons between the Xbox and PlayStation camps? God help us if they ever release exclusive titles for one of these brands. May as well just play console like a peasant ;).

    I’m on my first Radeon and it’s been great, especially for the price.

  • GameWorks definitely hurts AMD and the industry as a whole. The performance is weak at best even on NVIDIA cards (including my GTX 970), only serving to promote the “prowess” of their very top-tier models. To top it all off, we have tech reviewers like Linus endlessly preaching the merits of NVIDIA to all the pre-pubescent kids on the internet who think they know all about PCs.

    • That’s what is funny: GameWorks rarely works well even on Nvidia cards. PhysX still causes problems; developers jump on board and incorporate effects that DO look good, but usually aren’t coded properly, and end up causing all sorts of issues.

      I have been on the green side for a few cards now, but overall I think it’s the same on both sides. AMD have signed up quite a few games too, and aren’t devoid of little backend tricks to make certain games run badly on Nvidia systems – like code that specifically reads what type of card you’re running and disables some features if an AMD card isn’t present (not talking AMD-specific effects either) – so AMD aren’t innocent.

      All I want is for PC to become the main platform for developing games, so we don’t get shit ports. So you could look at it like this: when Nvidia or AMD (mainly NVIDIA) throw out cool-looking videos with PC-only graphics for big games, I can only hope that brings more people to PC.

      • I’ve actually never owned an AMD card. I’ve been on the green side since I started with PCs (just a matter of circumstance and pricing at the time). Like you, however, I am fed up with all of the proprietary technologies created by Nvidia that so many consumers think are terrific.

        I appreciate that AMD have partnered with companies like EA, but I don’t feel your claim about AMD “backend code” is easily quantifiable (heck, maybe they’re just more subtle at this business than Nvidia!). From my observations, AMD Gaming Evolved titles like BF4 and Thief have worked exceptionally well on Nvidia cards as well. The only reason in my eyes that these games had exceptionally better frames on AMD cards was the addition of the Mantle API, which AMD actually offered to Nvidia as an open project; Nvidia rejected the offer. Mantle has now gone on to inspire and foster the development of Vulkan and DirectX 12, which will indeed benefit all PC gamers. So in this way, much like AMD’s other graphics initiatives (i.e. CrossFire, FreeSync, TressFX), I feel AMD do more to benefit the PC gaming industry as a whole, without deliberately undermining their competitor and creating awful proprietary tech in the process.

        That being said, I think we’re on the same page. I really hope that the PC gaming community can overcome this nonsense and have a bright future filled with great games and well functioning ports.

  • As a PC gamer, I see no problem with this. At least the game will still run on AMD cards, unlike the silly bullshit that is console-exclusive games.

  • Yes and you still don’t get it. What happens if AMD dies? How well will your Nvidia hardware be optimised when Nvidia has nothing to gain?

    Idiots like you are killing gaming.
