Ubisoft's Newest Deal Could Be Bad News For Some PC Gamers

Despite lingering tensions between AMD and Nvidia over the latter's close relationship with major game developers, Ubisoft is extending its partnership with Nvidia to cover its biggest upcoming PC titles. Ubisoft announced the partnership in a press release today, saying that it is working closely with Nvidia's GameWorks program to develop four of its most highly anticipated games: Assassin's Creed: Unity, The Crew, Far Cry 4 and The Division.

The announcement in and of itself is nothing out of the ordinary. As today's press release noted, Ubisoft entered into a similar partnership with Nvidia for its popular games Assassin's Creed IV: Black Flag, Splinter Cell: Blacklist, and the newly released Watch Dogs.

But it comes just a week after AMD threw some hefty stones at its longstanding rival over the influence GameWorks has on the game industry, and on PC gaming in particular.

The company's accusations came in response to some spec-heavy analysis showing that Watch Dogs' performance was subpar on PCs using AMD cards compared to ones running Nvidia tech. Combined with the online connectivity issues that have been keeping some players out of Watch Dogs entirely, PC gamers haven't been a happy bunch as of late.

And the Watch Dogs performance gap is just the latest example of AMD users getting a raw deal for games developed under the GameWorks program. Other titles like Batman: Arkham Origins have raised the same issues for PC gamers.

Speaking to Forbes in late May, AMD's Robert Hallock said that GameWorks "represents a clear and present threat to gamers by deliberately crippling performance on AMD products (40% of the market) to widen the margin in favour of NVIDIA products."

Nvidia quickly fired back with a flat denial, but that hasn't done much to assuage the fears of PC gamers with fresh memories of Watch Dogs and Arkham Origins not running so well on their AMD-powered rigs.

GameWorks is a recent initiative, spun out of Nvidia's existing products and services, designed to involve the company more closely in the entire development process of a given game, giving a company like Ubisoft access to a more robust set of tools so it can "bring an enhanced gameplay experience to [...] PC players," as Ubisoft VP Tony Key put it in today's press release.

Slick visuals and "enhanced gameplay" aren't usually a bad thing. But AMD's charge highlighted a concern many PC gamers share about GameWorks: that it cements an arbitrary relationship between a graphics card maker and game developers, one that leads to unnecessary performance-hampering effects when the games are played on a competitor's technology.

In the short term, then, AMD's accusation implies that PC gamers who prefer the company's cards to Nvidia's will get a lesser version of a game like The Division or Assassin's Creed: Unity for no better reason than a high-level corporate deal made without their knowledge or approval.

And so the great Nike-versus-Adidas rivalry of video games continues.


    I don't get it. with the Wii U, Xbone & PS4 all having AMD based graphics cards, why would publishers go out of their way to give benefits to NVidia cards which are only gonna be a (relatively) small percentage of the market?

    Edit : & by 'small percentage' I mean the 60% of PCs (as stated in the article) that have NVidia based cards, as opposed to the remaining 40% of PCs and 100% of the new console generation.

    Edit #2 : of course, I know the answer is "NVidia have paid a shit-tonne of cash to certain publishers for this special treatment to help them remain competitive," but seriously.... its gotta make any publisher who goes down that road unpopular with any developer who has to work for them.

    Edit #3 : also, i'll be the first to admit I'm no expert when it comes to PC vs console vs AMD vs NVidia market share. if anyone cares to enlighten me, feel free.


      Doesn't Nvidia have the largest market share for video cards (not including intel)?

        maybe at the moment. but all new consoles are AMD-based. so if NVidia have the majority, it probably wont be for long.

      AMD products (40% of the market)

      So NVIDIA aren't a relatively small percentage, in fact they're the majority? (Though that figure could just as likely be "in the PC space")

        yeah, see the edits.
        AMD have their hardware in every Wii U, Xbone & PS4 out there, so I'm guessing that the 40% in the article is probably PC market only.

          AMD have their hardware in every Wii U, Xbone & PS4 out there, so I'm guessing that the 40% in the article is probably PC market only.

          Definitely in the PC market as this article is about PC market.

          maybe at the moment. but all new consoles are AMD-based. so if NVidia have the majority, it probably wont be for long.

          Even with AMD dominating the consoles, that doesn't mean Nvidia will lose its majority on PC. The AMD chips in the consoles are custom designs for each machine. The Xbox One's APU may have a close market equivalent, and I'm not sure about the Wii U's, but the PS4's APU is designed to work only in the PS4.

          In the PC scene, Nvidia has the edge even though its cards cost more, because its driver support is far better than AMD's and it offers a few exclusive features, like PhysX, that let developers make nifty stuff happen on PC. They're just features that PC developers choose to use for a better-looking product.

          For the extra bucks I get much better stability and driver support. I'm never going to ATI unless I win a card in a competition. I can't be bothered installing a new hotfix for every game release, then downgrading to play older games because the hotfix broke them.

        How about making games that run well for 100% of the market? Seems pretty legit.

      The reason this isn't an issue for consoles is that although they use AMD hardware, they run their own driver software. That basically makes the consoles a whole third party in the sense of this story; the advantage this deal gives Nvidia is more in the drivers than in the hardware.

      Also, it's not a two-way split: Intel takes a share of that market with its onboard graphics, so it may really be an AMD majority, as I remember Intel had a fair chunk itself. Unless the percentage is based on gamers only, which would be a weird statistic to use, since it would be very hard to obtain.


        From a financial statement perspective, AMD's share of the desktop market has been slipping for years, with Nvidia showing commensurate growth. Q1 2014 discrete card market share was 35% for AMD, 65% for Nvidia.

        From a hardware relevance perspective, developers tend to look only at gamer market statistics, and those aren't as difficult to get as you suggested. A good indicator is the monthly Steam hardware survey, particularly the DX11 hardware chart, which has Intel at 10.46%, Nvidia at 48.85%, AMD at 27.38% and Other at 14.16% (about 0.8% of rounding error is present in these totals). AMD cards have similarly been on the decline over the last 6 months (the extent of the survey range available on the website), with Nvidia and Intel both on the rise.


      The problem is, you're pretty much suggesting they should focus on AMD as they're on consoles. Which is the same thing you're complaining about them doing with Nvidia.

      Probably because the consoles and their hardware are locked in: they're already focusing a lot on AMD anyway, and there's not much else they can do there. They want to push themselves and do better, and Nvidia is the way.

        oh, I'm not complaining either way. I'm a GPU pragmatist: whatever brand best fits my needs when I'm in the market gets my money.

        just from a business standpoint I'm wondering if it wouldn't be more efficient to put more developmental focus on the hardware that's going to have the most market share in the long run.

          Maybe they don't have much faith in current gen consoles long term.

            maybe. but unless something insanely unprecedented happens, I think the current console setup is gonna be the norm for at least the next 5-10 years.

          I wouldn't be surprised if these sorts of deals only affect the bleeding edge, anyway. Which renders console market share pretty irrelevant.

    It's a double edged sword, really. With 2 "main" standards we're never going to push the limits of one over the other given that it needs to be compatible with both, but then the alternative (a single "standard") would lessen competition and be just as bad in the long term.

    I think that the nVidia cards are better overall than the AMD cards, but with AMD in every current gen. console, there could be trouble ahead. And even though I have nVidia cards in my rig, Watch_Dogs (the super-duper nVidia game) still runs like I'm playing it on a vacuum cleaner rather than a PC.

    I just hope their next games run properly, and a lot smoother out-the-box than Watch_Dogs. Far Cry 3 ran superbly from day one, beautiful game!

    I would like to see at least one console running nVidia, but I know it won't happen on this generation of consoles. Just want more variety to produce more competition, resulting in more innovation.

      the other side of the 'variety = competition = better tech for customers' argument would be that an AMD standard for all 3 current consoles = easier for devs to perfect working with the tech = better running & better looking games for customers.

      both arguments have their strong points.

        There is no standard between the consoles though, despite running AMD hardware. They each use unique APUs from different core families. In fact, this was one of the down sides of AMD being so desperate to take control of the whole console generation, all of the manufacturers wanted to secure unique (and ideally superior) hardware for their respective platforms, requiring AMD to develop different feature sets for each.

    It doesn't really affect me since i've been using only nVidia for years now after consistent problems with AMD/ati cards dating back to around 2001/2002 when I bought my first video card and it was an ati Radeon 9800. I remember having this ridiculous artifacting problem on loads of games and went through three cards over the course of a few years, with the problem just being the Radeon line. After going nVidia for a while and getting the hang of computers, I would go back to ati and buy another card, again experiencing not just compatibility problems but overheating problems. I ended up buying another nVidia card again shortly after.

    I've just been soured on AMD/ati through personal experience. It's probably bad for those that use AMD cards and even a little damn unfair.

    I prefer AMD myself and this kind of turns me off of purchasing Ubisoft games.

    Ubisoft's PC ports are bad news full stop.

      Agree. Many nVidia users are also having issues with running Watch Dogs, including myself. I have an OC'd 780 & 4770k, and the game brain farts every time I enter a car, even with settings turned down.

        I got burnt with Ghost Recon Future Soldier, and haven't touched a Ubi PC port since. On release, GR:FS wouldn't recognise most mice, ran at like 640x480, and was (all drama aside) literally unplayable. Gave them benefit of the doubt and waited weeks - they fixed some issues, but it was still rubbishy as hell.

        Then heard of people getting the same experience on stuff like AC3 and Watch_Dogs. There are other examples I can't recall off the top of my head that other people have mentioned. But yeah, this optimisation stuff is iffy, but Ubi's cut-and-paste ports are iffier again.

        Same exact situation here.

        High end rig with Nvidia card that demolishes every other game on ultra. Hop in a car while playing Watch Dogs and the game starts shitting all over itself, even on low.

        I feel your pain bro.

    Watch_Dogs is an unoptimised POS. Check out the AMD results for the game here:

    As you can see, the whole 'nvidia program' thing didn't help nvidia cards by much (if at all). The game also doesn't feature any nvidia-specific tech, so using that game as 'proof' is invalid.

      It's a fairly even distribution between Nvidia and AMD cards in those benchmarks; it's certainly hard to believe there's an Nvidia bias based on the results.

      I thought I'd point out that Watch Dogs does support one Nvidia-specific technology, TXAA. It's a very high quality antialiasing technique designed to improve motion aliasing, but it's also quite demanding on cards before the 700 series. In fact, Nvidia themselves don't recommend using TXAA outside of Titan or high end SLI setups.

    Kind of funny seeing people hammer Ubisoft for a bad PC release when a) 70% of the users bagging it haven't updated their drivers, and b) with so many different PC configurations it's difficult to get it right on everyone's machine.

    Want a tip? If you want things to work, buy a console and stop being a hipster dufus PC lover.

      Just so you have a bit of a better understanding on how these things work...

      In modern gaming there are things called graphics APIs; think of them as the middleman between what the game renders and what you see on your screen. Examples would be DirectX or OpenGL. The reason developers are able to make games that run on numerous setups (including these consoles you're suggesting we all switch to) is that they program to the API, not to the hardware itself.

      Graphics cards are basically just API translators, and how fast a card (or cards) you have determines how much it can translate at any given time. Hence the reason you also have detail settings.

      So to wrap this up, it's not hard for any developer to make code work on everyone's machines; they just need to write good code so that the API can talk to the GPU and get its point across.

      Besides, these new consoles are lame anyway.

    Not sure about the specifics of this Nvidia GameWorks thing (it sounds more concrete than a simple optimisation), but AMD has had games 'optimised' as well, with loading screens saying 'Optimised for/Works best on AMD' and so on. How is that any different from, or better than, what Nvidia does?

    Again, being unfamiliar with what Nvidia GameWorks is exactly, it's an honest question.

    EDIT: AMD make some fantastic hardware, but like others I've been burnt more than once back in the ATI days, both with shoddy hardware and garbage drivers. I started out as an AMD/ATI crusader, but when ATI cards kept letting me down and AMD CPUs started sucking hard, I went Intel/Nvidia and haven't looked back.


      It's certainly the case that both companies optimise their drivers for particular games, and contribute their own proprietary tech to games (Nvidia's various proprietary AA modes, for example), even though in general terms AMD is more open with its tech than Nvidia is. But the difference in the case of Gameworks is that as part of the licensing agreement with developers, Nvidia is preventing them from cooperating with AMD to make optimisations to their games, or from sharing code with AMD to allow AMD to optimise its own drivers. This is a new and different practice, and one that's potentially extremely harmful.

        Nvidia has labelled Rob Hallock's accusations as bunk. Cem Cebenoyan stated fairly unambiguously that there has never been and will never be a restriction in GameWorks contracts preventing developers from working with both Nvidia and AMD (or any competitor) simultaneously, and that Nvidia has never attempted to stop a developer adding AMD-specific optimisations.

        It's true that Nvidia has more proprietary technology than AMD does, but they're all optional and I've never seen a game that doesn't provide alternatives. TXAA is the best antialiasing technique out there in terms of effect vs performance, but that doesn't mean AMD cards don't have powerful antialiasing techniques available to them. A similar thing goes for PhysX.

        It's easy to get upset at Nvidia for using proprietary tech in their cards, but you have to keep in mind this is a competitive environment in which Nvidia has had the technological lead for a few years now. It's natural for AMD to complain that Nvidia's technology should be made available to everyone so that AMD can benefit from it, and it's natural for Nvidia to want to keep it to its own products so they can retain their advantage.

        Open standards are great for consumers in the short term, but in the longer term they tend to stifle innovation and slow advances because companies become less willing to invest money in research and development of new features that, if opened up, would basically be free gifts to competitors who haven't made the same investment. If all graphics card interaction was standards-driven, performance would be the only investment manufacturers would bother to compete on, and advances like updated shader models, new antialiasing and anisotropic filtering techniques and such would be much slower to trickle through.

        As it stands, I believe the environment we have now, with the majority of card features being common across all manufacturers while retaining the ability for each to develop their own proprietary new features, is the most conducive to advancement of technology in the industry.

    As a PC user, I really only want to have to concentrate on the best specs. These sorts of company deals smell like console exclusive deals that I thought I left behind.

    I’d also like to say that Ubisoft is an odd company. You can see that the developers there are fantastic and generally get a lot of support to do interesting things (e.g. Blood Dragon). But the corporate side seems so out of sync with the rest of it: Uplay, DRM, stunts like this, etc. Sure, this happens at other gaming companies, but the contrast is so stark here. I never know whether to love them or hate them.

    Crap like this is exactly why I got out of PC gaming. They may not have all the bells and whistles, but at least on an Xbone or PS4 I know the game is the best it can be for that platform; I don't need to get that extra stick of RAM, tweak these drivers or upgrade that video card every 6 months.


    I'm just going to go on record as being very happy with this. I've been fortunate enough to never have had an issue with games not running, or running poorly, due to my card setup.
    I have had issues with Nvidia drivers, but that wasn't games-related; it was on the desktop triple-screen side of things. Found myself some stable drivers and haven't had an issue since.
    I'm possibly just lucky.

    This screams to me that NVIDIA decided to pursue the PC market 100% as they are no longer part of the console market.

    AMD spent lots of resource on the consoles.

    It makes perfect sense that a company with its eggs in one basket will bet the house on it.

    However, it still sucks. The whole point of things like DirectX was to let the specs of the cards be the deciding factor, not deals with the developers.

    There have been card-specific features for ages; tessellation was on AMD cards for years but has only relatively recently been used, and that was once it became a mostly standard feature on GPUs.

    I'm looking to get back into building a PC, probably mid next year, so that I have a machine drastically more powerful than the consoles. I'm traditionally an AMD guy because they've always had really good connectivity features, i.e. they beat Nvidia for audio over HDMI.

    I really don't want to have to pick Nvidia just because games are being under-optimised for AMD.

      Audio over HDMI was only a problem on the first generation of HDMI-capable Nvidia cards, the two have been equivalent basically ever since, both supporting the HDMI maximum of 24 bit depth, 192kHz rate. HDMI audio is an optional part of the spec which is why early Nvidia cards didn't support it, but that was some time ago. HDMI audio is important for me too, and I use Nvidia cards at the moment.

      A small side note, tessellation was actually driven by Direct3D 11 (and its corresponding 'me-too' update in OpenGL) rather than by either of the manufacturers specifically. Both manufacturers supported tessellation on their first DX11 cards.

      Eyefinity is currently better than Nvidia Surround, no question. If that's a particularly important feature for you, AMD scores a lot of points there.

      The main thing I wanted to point out is that Nvidia still has a heavy presence in mobile, much larger than AMD the last time I looked at mobile device stats. It might seem counterintuitive to say, but the console market is actually a pretty shitty money-maker for the GPU makers. The chips all required their own R&D because they're each unique variants, and the margins are rather thin due to the devices themselves being loss leaders.

    Ubisoft is extending its partnership with Nvidia to cover its biggest upcoming PC titles

    Read this part and didn't need to read anymore, I'm all set with my 780Ti :)
