Interactive smoke and fog, dollar bills that flutter across the floor, enhanced rain that beads off Batman’s cape — that’s right, it’s NVIDIA’s obligatory “look-how-good-Batman-looks-on-our-hardware” video for Arkham Knight!
While I’ve never caught myself playing an Arkham game on a console or AMD-powered PC and lamenting the lack of fluttering debris, I am a big fan of smoke that moves like smoke and rain that acts like rain. These are things that NVIDIA’s GameWorks technology can do for you. The people watching your “Let’s Play” video will appreciate the extra attention to detail, probably.
For comparison’s sake, here’s Batman: Arkham Knight running without GameWorks:
I can’t wait to turn these features off so the game runs smoother on my computer when it drops on 23 June.
Comments
46 responses to “NVIDIA Makes Batman: Arkham Knight Look All Fancy”
And this is why I buy Nvidia, because they invest the time and money into helping devs implement these features into their games and make PC gaming better. It’s unfortunate that AMD doesn’t do the same.
They’re too busy looking for things to blame nvidia for.
Pretty much…
It would be very difficult for AMD to get developers on board to implement exclusive features when Nvidia has way more market share than AMD. I just hope that AMD can turn things around, because Nvidia needs some serious competition to make sure it continues to innovate and to keep prices in check.
It has become this big cycle which only makes the issue worse for AMD as time goes on…
Devs favour Nvidia, more people buy Nvidia because of it, Nvidia’s market share gets larger, more devs favour Nvidia. Rinse and repeat.
I buy Nvidia, and not out of some sort of brand hatred for AMD either. It simply makes more sense to buy whatever is likely to have the fewest problems going in; for the most part, Nvidia users just seem to have far fewer issues these days than AMD users when new games roll out.
Witcher 3 is a great example. The majority of complaints I saw the first few weeks were from AMD users, and they were general performance/optimisation issues that had nothing to do with the Nvidia Hairworks stuff.
It’s not always the case, but yes, the main reason is that Nvidia smashes AMD when it comes to driver support. Nvidia always gets a day-one driver out for big games; the only exceptions in the last few years have been games with AMD deals, like Tomb Raider on PC, which had the TressFX implementation (an AMD version of Nvidia HairWorks), and Nvidia said they didn’t get preview code for the game to make a driver for it.
The difference being that AMD aren’t dicks, and provide their TressFX code for devs and Nvidia to optimise. Nvidia do the opposite, and stop devs from optimising Nvidia specific content for their competition. Even Mantle could work with nvidia gear if they wanted.
So sadly yes, because Nvidia are a horrible company, in general you’ll get the best experience from them. Because when they choose to pay money to a dev to gimp the code for their competition, they do a great job at it, and AMD doesn’t do that.
I’m sure AMD made TressFX available to nVidia in the same way nVidia made GameWorks available to AMD: through the release of the technology in games, not through the direct supply of source code.
If that is the case, and I may be mistaken, it’s not really nVidia’s fault for AMD’s lack of optimisation, and seeing AMD’s past and current reputation regarding lacklustre optimisation, even in their own graphics drivers (Eurogamer have even run some interesting benchmarking articles about it), that doesn’t seem so unlikely to me.
As I say, I may be wrong, and I’m also not saying that nVidia aren’t dicks, especially with how much tessellation they force in Hairworks, very possibly to bog down AMD’s performance. They would have known. My point is that AMD aren’t on the best foot in regards to optimisation as it stands, regardless of whether gameworks is intentionally kept from them or not.
As a guy with a 7870 paired with an i5, I would love to see AMD better optimise their own drivers for lower CPU overhead and higher draw calls before crying about not being able to optimise for nVidia’s in house technologies.
Nvidia Gameworks doesn’t involve payments. This is a myth that is constantly repeated because it makes for a nice sounding conspiracy theory, but it’s completely bunk. No money changes hands at all.
Both Hairworks and TressFX SDKs are openly available on their respective company websites.
You’re an idiot who doesn’t know what he is talking about and is looking for a conspiracy where there is none. The fact of the matter is this: for years AMD has made lacklustre products, and for years they have paired that with horrible driver/software support. Because of this, and because PC gamers normally aren’t stupid, they have lost most of the market to their competition. Because of their low market share they can’t afford to do what NVIDIA is doing, and so don’t, which further compounds the issue. They allow rumours and fanboys like yourself to drum up conspiracies to try and hurt NVIDIA’s bottom line while still delivering substandard products. E.g. NVIDIA spent money and time to create G-Sync and basically fixed all tearing and stuttering problems for PC monitors; AMD said NVIDIA was lying about needing a module and claimed they could do the same thing but make it free. Many, many months later we have FreeSync, and it’s horrible: all FreeSync panels have atrocious ghosting, small variable refresh rate windows and no support on the low end. But they continue to lie about FreeSync’s specs compared to G-Sync in their marketing material, case in point:
http://cdn.wccftech.com/wp-content/uploads/2015/03/AMD-FreeSync-Slide14.jpg
The licensing fee is a lie, the “compatible with standard monitor features” claim is mostly a lie (the only thing G-Sync doesn’t support at the moment is additional inputs, but that is by choice), and the published range is highly inaccurate. G-Sync works from 1 frame all the way up to the monitor’s cap, while FreeSync’s range doesn’t work below approximately 35 fps due to physical limitations in pixel design, and there are no panels that support 240Hz, so that’s a load of shit as well. And the performance penalty is a half-truth/lie: only Kepler-based NVIDIA cards take a hit, that hit is less than 1%, and the new Maxwell cards don’t have a performance impact.
And this is the advertising AMD releases. They are full of shit and have been for years. As someone who grew up PC gaming during the years AMD/ATI were the best for gaming, NVIDIA was always the “evil corporation”; now that the tables have turned, AMD fanboys think that is still the case. GUESS WHAT! All corporations are evil… You just need to spend your money on the ones that deliver the goods, and in this case, for MANY MANY years, that has been NVIDIA…
And yet every console is packing AMD GPUs (even the Wii u!).
They really have no excuse for the crappy driver issues they’ve had since the Rage cards. How can that STILL be a prominent issue?!
I don’t think Nvidia believed they needed the money from the console market. Those console GPUs aren’t very technically advanced and are cheap to make; they are also sold to the vendors (Microsoft, Sony, Nintendo) at dirt-cheap bulk prices, so I don’t think there is really a lot of market in it.
The AMD driver issue comes down to AMD having operated with a skeleton crew for the past 10 years; they don’t have the resources (in this case, qualified staff) to spend on driver updates.
Normally I would completely agree about the competition thing, but the current state of affairs is that NVIDIA has 70% market share (I think, don’t quote me on that; that’s what it was when I last looked, and that is for the PC GPU market), they were the ones that basically made variable refresh rate a thing (innovation), and they are currently price-cutting very aggressively into AMD, forcing AMD to go cheaper than they wanted to.
Yes, I thought it was around 70 percent also. I guess my question is: what happens if AMD is completely forced out of the market? Will the innovation and the aggressive pricing continue? Probably not.
I think that belief is somewhat flawed; they will always need to give you a reason to upgrade components, and they will need to cater to different price points. If AMD were to vanish and Nvidia were the only GPU manufacturer, I don’t believe they would immediately go “AHHHHH HAHAHAAH! EVERY GPU IS NOW $1,000,000! LOLOLOLOLOL!”. All that would do is force people to consoles. You need to factor in that Nvidia isn’t really competing with AMD; they haven’t needed to “compete” with them for years. I mean, heck, as we said, they have 70% market share. They have to compete with consoles, and they are losing that market by a huge margin. I think if AMD disappeared tomorrow, nothing would change.
Actually AMD has an equivalent and they are going to make it open source soon so anyone can use it.
This tech doesn’t necessarily make it into games because it’s any better than AMD’s tech, but because Nvidia abuses its market power to put it there.
To take just one example, AMD developed TressFX for hair simulation more than two years ago, made it available open source, and helped it get integrated into, e.g., Tomb Raider. Now Nvidia has Hairworks, as part of Gameworks, which is closed source, which they pay developers to exclusively include in their games, and the contracts for which prohibit game devs from collaborating with AMD on game optimisations. So naturally, when the implementations of these features tank performance on AMD cards (as with hair in The Witcher 3, for example), AMD have to start from scratch on patches.
You are wrong there. TressFX was AMD-specific tech; Nvidia had to do the work to make it run on their cards, and even then the tech was better optimised for AMD:
http://www.gamespot.com/forums/system-wars-314159282/tressfx-patched-to-work-on-nvidia-cards-amd-exclus-29363863/
HairWorks is the same deal, except it DOES work on AMD from the get-go; it just doesn’t perform very well on AMD cards, because AMD cards aren’t optimised for tessellation like Nvidia cards are, and tessellation is the technology that HairWorks and TressFX are based on. The issue with HairWorks on AMD isn’t that it is closed source or made by Nvidia to run badly on AMD cards; it’s just that it takes advantage of technology Nvidia made sure would work well on their cards, and AMD haven’t done the same.
No, it’s not AMD specific. It has worked on Nvidia cards from the beginning, and the code is available to all, available for download here:
http://developer.amd.com/tools-and-sdks/graphics-development/amd-radeon-sdk/
By contrast not even the developers who add Gameworks to their games have access to Nvidia’s code.
Having the code available so it works in software, while proprietary hardware is required for it to work effectively, often goes hand in hand with AMD’s “open” standards. Mantle, anyone?
I don’t think you understand any of this.
TressFX is implemented via DirectCompute shaders, DirectCompute being the compute component of DirectX. There is no proprietary hardware involved, it’s all just normal parts of the DirectX spec that everyone in the PC space implements in their cards. Nvidia consumer cards are simply bad at compute, ever since the first generation of Kepler cards. Nvidia’s preferred solution is to pay developers to include their own proprietary compute solution (CUDA) that runs poorly, if at all, on competitors’ cards, rather than actually make their cards better at GPU compute.
Hairworks doesn’t use CUDA, it uses DirectCompute, and the shader source is available in the SDK download.
I was an Nvidia user for a long time; after my 570 broke I got myself a 7970, and I was very impressed with its performance. Overall, for the two years or so I had it, I had very few driver issues, and the few problems I did have were addressed rather quickly. But I still went with a 970 this upgrade.
AMD continually make excuses and I could not continue to support their product. Whenever something remotely cool comes along for Nvidia, AMD are almost always up in arms. The Witcher 3 was the last straw. Yet again they blamed GameWorks for poor performance, even though (1) you can turn it off, (2) they had more than enough opportunity to approach CDPR to apply their own tech, and (3) they could have optimised the GameWorks features. They cried that they didn’t have the source code, but most game drivers are made without the source code, working only with the release build, which GameWorks had available.
I am so sick of their attitude; if you can’t stand the heat, get out of the kitchen, I say. If/when AMD go broke, it won’t be the consumers’ fault for not supporting AMD, but AMD’s fault for not supporting the community.
(This comment glitched and replied to the wrong person.)
This is going to be a big one!
Honestly, I’m finding the arguments about how one manufacturer’s cards run the other manufacturer’s features “sub-par” to be an odd thing to be complaining about.
Side note:
Especially when AMD’s base drivers still leave a lot to be desired in terms of DirectX workloads, let alone their capability of handling nVidia’s features. It’s been a talking point for years now; @baraqyal can downvote me for calling a spade a spade if he likes. I’m manufacturer-agnostic: I buy the best bang for buck whenever I upgrade. I just went from a 7870 to a 970, not because I hate AMD, but because the 970 offered me a good value proposition for my needs. My next upgrade may well be back to the Red team. Who knows. Digital Foundry have done an article about the inability of AMD’s DX11 drivers to properly leverage multi-core CPU performance at a driver level, leaving their GPUs under-delivering on DirectX draw calls in comparison to nVidia’s cards and their own Mantle performance (http://www.eurogamer.net/articles/digitalfoundry-2015-why-directx-12-is-a-gamechanger). Is that not a more important thing to be concerned about than the sub-par performance of their cards while running their competitors’ features?

Back on topic: Not taking that into account, it would seem to me that if nVidia had just made Hairworks not available on AMD cards at all, people wouldn’t have even complained about it. But because they have it, even though nVidia (or AMD with their tech) technically have no “obligation” to supply their tech to their competitors, we are complaining about sub-par AMD performance with the nVidia tech. Just like nVidia users complained about sub-par TressFX performance in Tomb Raider.
I may be old school in saying this, but a few years ago these features would have only worked on nVidia or AMD cards. The market has opened up in that regard, which only benefits the consumer. We have these features which have every right to be exclusive and here we all sit bitching about it instead of being happy that, as an AMD user, my brother can still enjoy Hairworks and myself, an nVidia user, can enjoy TressFX. Even if it’s not the most ‘optimal’ way to enjoy those features.
I’m not saying we shouldn’t want better. I’m as bitter and cynical about that as the next guy, but do we need to get so worked up about stuff that, at the end of the day, we are all able to enjoy, even though historically exclusive features have been entirely exclusive… It’s just strange.
Then again, would Xbox owners complain if they got Uncharted 4 (or PS4 owners if they got Halo or something) and it ran worse? Maybe?
“Then again, would Xbox owners complain if they got Uncharted 4 (or PS4 owners if they got Halo or something) and it ran worse? Maybe?”
Well the Xbone doesn’t have equal hardware to the PS4 so yes, it would run worse. Though if they did have equal hardware and it ran poorly compared to the PS4 version then yes, Xbone owners would have every right to complain as paying customers and vice versa with Halo on the PS4.
“Hairworks not available on AMD cards at all, people wouldn’t have even complained about it.”
Yes they would because proprietary vendor specific graphics is exactly the reason we have graphics libraries that run on all hardware, to avoid situations where your gaming experience is defined by the hardware manufacturer you use.
Having “non-optimal” hardware for running GameWorks is just another step back towards the above scenario. Hell, GameWorks sabotages performance on older Nvidia cards that outperform the lower-end new cards, so Nvidia are even screwing over their own customers.
” Just like nVidia users complained about sub-par TresFX performance in Tomb Raider.”
Which AFAIK was later corrected. I don’t see such an improvement happening with GameWorks on AMD hardware any time soon due to a lack of source code.
You’re really focused on seeing the hole and not the doughnut, aren’t you?
“Well the Xbone doesn’t have equal hardware to the PS4 so yes, it would run worse.”
AMD hardware is not equal to nVidia hardware either. Looking at the high end, even on paper, their current single-GPU high-end cards do not match nVidia’s at present. Even if they were equal, most AMD cards don’t utilise their full on-paper potential because of AMD’s poor DX11 drivers. I’ll say it again: running nVidia’s technology poorly (which is actually debatable, I’ll get to that later) is a moot point when you look at AMD’s OWN drivers. (Or maybe you’d say AMD’s poorer DirectX 11 driver-level optimisation is nVidia’s fault too? Or perhaps Microsoft’s?)
“proprietary vendor specific graphics is exactly the reason we have graphics libraries that run on all hardware”
Yes, but this one in particular is nVidia’s library. They put money into developing it. They chose to make it available to everyone running the game. It’s available to everyone! Furthermore, you don’t even need to run Hairworks to run The Witcher 3. Running the game without Hairworks, a feature that cripples performance even on high-end 900-series nVidia cards, doesn’t sabotage AMD’s performance. Choosing to complain about poor performance in a bonus feature that runs poorly on everything is an odd thing to complain about.
“Having “non optimal” hardware for running GameWorks is just another step back towards the above scenario.”
Or perhaps it’s actually a step closer to what you’re looking for? Oh right, nvm, it is, but only when AMD do it with TressFX etc.
Furthermore, driver optimisation shouldn’t require source code. You are, however, right when you say “I don’t see such an improvement happening with GameWorks on AMD hardware any time soon”.
This is because AMD have fewer driver updates in general. I’m talking about updates that improve GAME optimisation, let alone optimising for nVidia-specific technologies. Also, have you ever stopped to think about this:
If Hairworks’ obnoxiously high tessellation factor is what is hampering AMD’s performance, who’s to say Hairworks doesn’t already run as well as it can on AMD cards? You have no proof saying otherwise, other than AMD’s blame-passing accusations.
Furthermore, AMD lets users lower the tessellation factors of applications from the Catalyst Control Center, an amazing feature that I WISH nVidia had, but one that admittedly came about because of AMD’s poor tessellation performance with ANY application, not just nVidia’s tech. When the tessellation factor is lowered to 8-16x instead of 64x, Hairworks performs as well if not better on AMD cards than on nVidia’s, and looks very similar. Tessellation seems to be the biggest contributor to AMD’s poor performance in Hairworks. AND YOU COULD FIX THAT FROM DAY ONE.
Why complain about something you can already fix by yourself, getting similar if not better results than people running nVidia cards?! Even TressFX wasn’t that easy to fix; nVidia users had to WAIT for an update to help performance.
But yes, AMD’s lacklustre optimisation offerings, be it for their drivers or other vendors’ technologies, or their inadequate HARDWARE-LEVEL TESSELLATION PERFORMANCE, are all nVidia’s fault. Just because AMD’s excuses say so. Clearly.
Most of the same team-green effects were in the previous two games; enhanced rain appears to be the only new one.
And the rain feature is really not an exciting addition at all, in my opinion.
I’ll likely turn most of it off just so the frame rate doesn’t flip out all because someone left their hot cup of tea sitting around and the steam interacted with Batman’s cape.
If it’s anything like Hairworks, though, you will need a 980 Ti to play at max settings. Looks amazing though, makes me excited for this one.
Yeah, I agree, Kasterix. I don’t have anything against AMD (in fact I’m very glad to have them around to keep nvidia in check for cost competition), and I’ve mostly bought nvidia hardware, but these gameworks features don’t impress me at all from a performance standpoint. They might look good enough, but I don’t care for some realistic hair/fog/rain just to see the frame rate get cut by anywhere up to (and in some scenarios exceeding) 25% – 30%.
Unless you have SLI top tier cards, then gameworks is a severe frame rate killer. But hey, some people prefer to have a game running @ a struggling 30fps with all those features on @ 4K instead of a game running smoothly at 60fps(+), so at least the option is there 😛
A lot of it is really down to future proofing as well, as people update their PCs they can go back to older games and crank everything up to the max without issue.
The whole Nvidia exclusive features got me thinking…
The Xbox One, Playstation 4, and Wii U all have AMD Radeon graphics architecture in them…
If AMD wanted to be a massive dick, they could’ve implemented their own exclusive AMD features. Not only would it appear on their PC platforms, but consoles too, and anyone rocking an Nvidia card would be left out.
That’d work right up until Nvidia patched in compatibility like they did with AMD’s ‘exclusive’ TressFX feature.
A lot of Nvidia’s stuff could probably work on AMD cards IF AMD actually supported it properly with drivers and made sure new cards could handle certain technical features.
It’s simply not Nvidia’s fault that AMD don’t do this.
Yeah, true. But I’m talking about a hypothetical scenario where AMD are absolute dicks and purposely create an exclusive closed-source feature, which additionally disables itself automatically when an Nvidia card is detected. THEN they further market the feature, stating in certain media that Nvidia just isn’t capable of running it, even though they are.
You know, dick stuff…
Uhhh what? Maybe your memory is a little foggy because Nvidia cards were having crazy stability issues at the launch of Tomb Raider, regardless of TressFX. It was an issue that both Crystal Dynamics and Nvidia acknowledged.
Maybe your reading skills are a little foggy… At no point did I say Nvidia cards were fine at launch of Tomb Raider.
All I said was that Nvidia had compatibility for TressFX patched in, and they did.
I played Tomb Raider with TressFX enabled on my Nvidia card, wouldn’t have guessed it was an AMD ‘exclusive’ feature.
TressFX was not exclusive, it uses Directcompute.
But with that logic, Hairworks isn’t exclusive either, because it does in fact work on AMD cards doesn’t it?
Correct, neither uses any proprietary APIs. If Hairworks used CUDA for its motion simulation then it would be exclusive. As it is, it uses DirectCompute just like TressFX, with a ludicrous amount of tessellation, which has always bogged down hardware needlessly (AMD more than Nvidia, but both suffer needlessly).
Definitely. I’ve said it myself as well that they used, IMO, too much tessellation with Hairworks. (Which I think they would have known would hit AMD’s performance in particular.)
My point was that I’m not sure if AMD can really ‘blame’ nVidia for their perceived ‘inability’ to optimise. AMD often fall behind with optimisation, although there was a point when it seemed that they were improving. (But that’s not to say nVidia didn’t intentionally/ needlessly inflate the tessellation factors for Hariworks.)
It’s pretty ridiculous actually. My brother sees less of a performance hit on his 7870 when he runs hairworks than when I do on my 970. Obviously he’s not at the same GFX settings as I’m running, but the cost difference of vanilla hairworks (with MSAA lowered) vs less tessellated hairworks is MASSIVE.
It’s just because AMD allow you to set the maximum tessellation level of applications, which he set to about 8-16x without losing the visual appeal of Hairworks. (Not that either of us feels it’s worthwhile to run Hairworks anyway.)
That’s pretty telling ain’t it.
The level of tessellation set as default in the Hairworks SDK is excessive, as you mentioned. It defaults to 64x; you can get very similar results with 16x at twice the framerate.
I’m not sure if @korwin is saying tessellation in general bogs down hardware unnecessarily or if he’s just referring to the amount set in Hairworks, but hopefully the latter. Tessellation is an excellent technology that is essential for a lot of high-end graphics effects that simply weren’t possible before because of processing requirements. Especially as scenes become more complex, tessellation is becoming increasingly important to render that complexity.
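The cost difference is easy to see with a rough back-of-envelope calculation. This is a simplified sketch that assumes triangle count scales roughly with the square of the tessellation factor; real tessellators use separate edge/inside factors, and hair strands tessellate as isolines, so actual numbers differ:

```python
def approx_triangles(tess_factor: int, base_patches: int = 1) -> int:
    """Rough triangle count for a tessellated patch.

    Simplification: output geometry grows roughly with the square
    of the tessellation factor (hypothetical model, not the exact
    Direct3D 11 tessellator output).
    """
    return base_patches * tess_factor ** 2

# Hairworks' default 64x versus the user-overridden 16x:
work_64x = approx_triangles(64)   # 4096 triangles per patch
work_16x = approx_triangles(16)   # 256 triangles per patch
print(work_64x // work_16x)       # -> 16, i.e. ~16x less geometry work
```

Under that model, dropping the factor from 64x to 16x cuts the geometry workload by roughly a factor of 16, which is consistent with the large framerate gains people report from the driver-level tessellation override.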
I was referring to the level in hairworks.
I buy NVIDIA because it actually works on Linux and the GPUs perform correctly. It helps that NVIDIA use the same driver code on Windows/Linux/OSX/FreeBSD and Solaris.
They’re cool effects, but I’d prefer they spent their time developing drivers that don’t crash while simply browsing with Firefox or Chrome. The last few releases have been abysmal.
NVIDIA needs to make a game with pimp water again. Just Cause 2 is the last one I remember with PhysX water, and omfg was it amazing.
“I can’t wait to turn these features off so the game runs smoother on my computer when it drops on 23 June.”
You call yourself a PC gamer? If you’re that bad off, just buy a console. Got to give props to Nvidia; the immersive feeling in GameWorks titles is nothing like I’ve ever experienced on an AMD card. Who needs a Fury’s extra 5% fps when it can’t compete in GameWorks titles like this? Although I can play at 1024×768, I choose not to because I like the prettier things. Although I can afford an AMD Fury, I choose not to because, well… I like my games to have all the bells and whistles turned on.