The Witcher 3 Benchmarked: The New Crysis

Around this time four years ago, The Witcher 2: Assassins of Kings impressed critics with its opulent and demanding PC graphics, rich environments and storytelling, along with innovative combat mechanics. Selling nearly two million copies in its first year, the game was a great success for CD Projekt Red, so it came as no surprise when a follow-up was announced.

After much anticipation and a few delays, The Witcher 3: Wild Hunt launched this week to a similar degree of critical acclaim, with praise along the lines of “one of the best role-playing games ever crafted” (GameSpot), “a game that often feels like a stunningly confident, competent shot across the bow of the open world genre” (Polygon), and “The Witcher 3: Wild Hunt is shaping up to be one of the best RPGs of the year” (PCMag).

Less than a week in, however, there’s some controversy in the PC world involving the game’s reveal versus launch graphics. In 2013, CD Projekt Red (CDPR) teased a world that was bound to bring PC gaming graphics to a new level. That changed with the arrival of the Xbox One and PlayStation 4. To optimise Wild Hunt for consoles, CDPR slightly downgraded some of the visuals that were previously expected in the PC build. Quick to react, CDPR has already released a new patch in hopes of addressing some of these issues, improving graphics and graphics settings on the PC.

Adding to the squabble, The Witcher 3 includes NVIDIA’s HairWorks technology, essentially NVIDIA’s version of AMD’s TressFX. But unlike TressFX, NVIDIA won’t share the HairWorks code, so AMD can’t optimise its drivers for the technology.

With the NVIDIA-influenced Radeon performance issues in Project CARS, it seems the gaming community has had enough, and it’s not just AMD users crying out over NVIDIA’s underhanded tactics. Those using Kepler-based GeForce 700 series GPUs and older have started to notice poor performance in new GameWorks titles.

Project CARS was a perfect example as the $US200 Maxwell-based GTX 960 was found to be uncharacteristically fast, matching the two-year-old, $US1000 GTX Titan. It will be interesting to see how Kepler-based GPUs compare to Maxwell parts in The Witcher 3.

“Many of you have asked us if AMD Radeon GPUs would be able to run NVIDIA’s HairWorks technology — the answer is yes! However, unsatisfactory performance may be experienced as the code of this feature cannot be optimised for AMD products. Radeon users are encouraged to disable NVIDIA HairWorks if the performance is below expectations.” — CD Projekt Red spokesperson Marcin Momot

We’ve been preparing this article for days and just as we were ready to go live, CD Projekt Red released a new patch that optimises and improves graphics settings on the PC platform. We’ll elaborate on those later.

We are expecting stunning visuals as CDPR recommends an Intel Core i7 and Radeon R9 290. Using the latest AMD and NVIDIA drivers, we tested twenty-two DirectX 11 graphics cards covering most price ranges. Our test rig was outfitted with an Intel Core i7-5960X to remove CPU bottlenecks that could influence high-end GPU scores.

Testing Methodology

Using FRAPS we recorded 120 seconds of gameplay starting from the first time Geralt mounts Roach (his trusty steed) and rides toward a Griffin attacking a villager. The test ends when the Griffin flies away with the villager’s horse.

The Witcher 3: Wild Hunt was tested at three resolutions: 1920×1080, 2560×1440 and 3840×2160 using medium, high and ultra quality presets. The medium setting disables NVIDIA’s HairWorks, high only applies it to Geralt and ultra uses HairWorks on everything with hair. The post-processing effects were left on high for all three quality tests.
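For reference, the minimum and average figures reported in the charts below can be derived from a FRAPS-style frametimes log (one render time per frame, in milliseconds). This is a minimal sketch of that arithmetic, not TechSpot's actual tooling:

```python
def fps_stats(frametimes_ms):
    """Return (average fps, minimum fps) for a list of per-frame times in ms.

    Average fps is total frames over total elapsed time. Minimum fps is the
    lowest frame count over any whole one-second bucket of the run, which is
    closer to how benchmark 'minimum' figures are usually reported than the
    single slowest frame would be.
    """
    total_s = sum(frametimes_ms) / 1000.0
    avg = len(frametimes_ms) / total_s

    # Bucket frames by the whole second in which each frame started.
    buckets = {}
    elapsed_ms = 0.0
    for ft in frametimes_ms:
        sec = int(elapsed_ms // 1000)  # whole second this frame started in
        buckets[sec] = buckets.get(sec, 0) + 1
        elapsed_ms += ft
    return avg, min(buckets.values())


# Two seconds of simulated gameplay: one rough second (10fps), one smooth (20fps).
avg, minimum = fps_stats([100.0] * 10 + [50.0] * 20)
print(round(avg), minimum)  # average 15fps, minimum 10fps
```

Reporting the minimum over one-second windows rather than the single worst frame is why a card can post a respectable average yet a much lower minimum during the Griffin encounter.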

Test System Specs

  • Intel Core i7-5960X (3.00GHz)
  • 4 x 4GB Kingston Predator DDR4-2400 (CAS 12-13-13-24)
  • Asrock X99 Extreme6 (Intel X99)
  • Silverstone Strider Series (700W)
  • Crucial MX200 1TB (SATA 6Gb/s)
  • Gigabyte Radeon R9 290X (4096MB)
  • Gigabyte Radeon R9 290 (4096MB)
  • Gigabyte Radeon R9 285 (2048MB)
  • Gigabyte Radeon R9 280X (3072MB)
  • HIS Radeon R9 270X (2048MB)
  • HIS Radeon R9 270 (2048MB)
  • HIS Radeon R7 265 (2048MB)
  • HIS Radeon HD 7970 GHz (3072MB)
  • HIS Radeon HD 7970 (3072MB)
  • HIS Radeon HD 7950 (3072MB)
  • HIS Radeon HD 7850 (2048MB)
  • Gigabyte GeForce GTX 980 (4096MB)
  • Gigabyte GeForce GTX 970 (3584+512MB)
  • Gigabyte GeForce GTX 960 (2048MB)
  • Nvidia GeForce GTX Titan (6144MB)
  • Gigabyte GeForce GTX 780 Ti (3072MB)
  • Gigabyte GeForce GTX 780 (3072MB)
  • Gigabyte GeForce GTX 770 (2048MB)
  • Palit GeForce GTX 760 (2048MB)
  • Gainward GeForce GTX 680 (2048MB)
  • Gainward GeForce GTX 660 Ti (2048MB)
  • Gainward GeForce GTX 660 (2048MB)
  • Microsoft Windows 8.1 Pro 64-bit
  • Nvidia GeForce 352.86 WHQL
  • AMD Catalyst 15.4 Beta

Benchmarks: Medium Quality

At 1080p using the medium quality settings we already get a sense for just how demanding The Witcher 3: Wild Hunt really is. The performance above isn’t that much faster than what we saw when benchmarking GTA V using the maximum quality settings with normal textures.

That said, for a minimum of 30fps gamers need only a Radeon HD 7850 or GeForce GTX 660 Ti. Those shooting for a minimum of 60fps are going to need some serious firepower — think GTX Titan or R9 290.

Now at 1440p we find that not even the GTX 980 can deliver a minimum of 60fps, though it did provide an average of 62fps. The R9 290X was the next fastest GPU, though it averaged only 53fps — slightly faster than the GTX 970.

For a minimum of 30fps, the HD 7950 Boost or ideally the R9 285 will be required, while NVIDIA users will get away with the GTX 960 or GTX 780.

4K gamers won’t find The Witcher 3 particularly playable on anything less than a GTX 980 when using just the medium quality settings. The GTX 980 averaged just 30fps with a minimum of 26fps.

Benchmarks: High Quality

Enabling the high quality graphics preset had a huge impact on frame rates, dropping the GTX 980 from 88/98fps to just 30/65fps. What is most noticeable here is the huge reduction in minimum frame rates, which in the case of the GTX 980 fell to roughly a third of its medium-quality figure. This means at 1080p using the high quality visuals the GTX 980 was the only GPU able to deliver a minimum of 30fps.

When looking at averages, gamers will require either the GTX 680, GTX 760 or R9 285 for 30fps+.

The 1440p high quality performance is rough, requiring a GTX Titan or R9 290 just to average 30fps.

Multi-GPU technology is a must for gaming at 4K resolutions in The Witcher 3. Here we see a single GTX 980 is good for just 24fps on average with a minimum of 12fps.

Benchmarks: Ultra Quality

Ultra quality graphics reduce the GTX 980 to just 56fps at 1080p, down from 65fps when using the high quality settings. For an average of 30fps at 1080p gamers will require a GTX 770, GTX 960 or R9 280X.

Playing at 1440p dropped the GTX 980 to just 40fps with a minimum of 20fps, while the R9 290X was good for just 28fps with a minimum of 12fps.

Things looked bleak during the last round of punishment for our GPUs. Playing on ultra at 4K, the GTX 980 averaged just 22fps, so even with two cards in SLI the best gamers can hope for is a little over 40fps with a minimum of around 20fps.

Republished with permission from TechSpot.

Steven Walton is a writer at TechSpot. TechSpot is a computer technology publication serving PC enthusiasts, gamers and IT pros since 1998.


  • I think the criticism for Witcher 3’s graphics is a bit undeserved and overblown. We aren’t talking about a 5-10 hour game you smash out over one weekend, including all collectables and quests.

    The game looks great and all the small details come together to create an amazing world that feels so alive.
    Sure, better looking graphics would be great, but there is clearly a practical balance going on with this game.

    • Yup. I have to agree, the size of the game makes up for it IMO. But I also think that the climate of graphics ‘downgrades’ has rightfully (again IMO) made people less forgiving towards downgrades of any kind.

  • Wax models running through a rubber forest. I’m sure the game is fine but to compare the graphics to how Crysis looked on release suggests you weren’t playing games when Crysis was released (or, even better, Far Cry 1 which still hasn’t been beaten in my eyes for best graphics of its time).

      • I miss lanning Crysis so much, mostly for laying cloaked with a gauss rifle and the community map scene. We had heaps of fun with it.

        • I was mad about the SP. Loved it. The only thing that’s dated is the torchlight effects. I must’ve played it through at least 4 times over the years. Disappointing the direction they went with 2 & 3

          • I would have started it a hundred times, probably only finished it two or three though. I always loved just running about and being amazed by the graphics. Can’t believe it’s about to turn seven!

            I never bothered finishing number 2 and didn’t even get 3.

          • Oh faaaaahk you! Now I have to reinstall the first!

            I got most of the way through Crysis 2 before losing interest. Nice graphics and great controls but it didn’t have the open-ness that I loved about 1. And 3? Same story. Got it for free with a gfx card. Amazing gfx but just so on rails that you never feel like you can do what you like. Just what the game will let you. Shame they had to remove almost everything that felt amazing about the first. Especially because the controls for 2 & 3 are definitely better.

  • one thing I would like to have seen is benchmarks of ultra but with hairworks turned off, as it really drops fps, especially on AMD; some people were getting an extra 10fps by turning off just this one setting (in a similar way to how witcher 2 had ubersampling).

    EDIT: I found that techspot even mentioned the hairworks issue and had some benchmarks with hairworks turned off, and it made a bigger impact than I realised: the R9 290 with hairworks averaged 36fps, but with hairworks turned off reached an average of 63fps, which is a huge difference.

    • You’re not very imaginative then. The comparison is pretty clear, Crysis was best known in benchmarking as a game you could test your GPU’s performance on for several generations after it was released because it couldn’t run on Ultra on anything available at the time. It should be pretty obvious that the comment is about benchmarking since the context is in the title.

      • Benchmarking is exactly what I’m referring to. When Crysis launched the best GPU on the market was the 9800GTX, and that was just a rebadged and overclocked 8800GTS 512 (in fact the 8800 GTX and Ultra outperformed it in some instances due to the larger memory bus). Those cards couldn’t come close to maintaining a playable framerate at 1920×1200 at Ultra.

        On the other hand my GTX980 has no problems running Witcher 3 at 2560×1440 with Hairworks disabled. The performance results are night and day when compared historically.

        • “My GPU has no problem running the game on max settings when some of the settings aren’t on max settings” doesn’t really work as a concept.

          What minimum/average framerate do you get at that resolution on your current settings? If the minimum is below 60, you have considerable room for improvement.

          • The settings are ultra, the AO is set to use HBAO+. Hairworks isn’t running because of its ludicrous performance requirements. Framerate sits around 40-60 quite comfortably.

          • I guess I was a bit vague based on outatime’s reply below. I wasn’t suggesting Witcher 3 is the same as Crysis, but rather that there’s a very short list of games that get sub-par framerates at max settings on top end hardware. Based on the results above, I expect Witcher 3 will be used as a benchmark staple for the next two years.

            Enabling Hairworks may not be a good idea from a “playing” perspective, but it makes for a pretty good (and long-lasting) hurdle from a benchmarking perspective. With that enabled, we’re unlikely to see acceptable performance for any of the current generation, possibly not even until the high range of the next.

            I’m disappointed that the Titan X wasn’t included in the benchmarks here, I’m curious how it would have performed on the same testbed.

          • Well, by this reasoning, Tomb Raider 2013 is as good a benchmark as my 970 maxes it out, with TressFX, at about the same FPS I get in the Witcher 3 with Hairworks on.

            Hairworks, at the time of posting, seems to be ‘unoptimised’ in that it uses wasteful amounts of tessellation and MSAA. Both of those things severely hamper performance on AMD cards (Which have always been sub-par at tessellation) and nvidia cards below the 900 series.

            In general, benchmarks don’t include the use of features that hamper performance, especially when they do so on one brand more than another. It’s why we don’t normally see PhysX turned on for benchmarks.

            It doesn’t give an accurate representation of power, making it a POOR choice for objective benchmarking.

          • I don’t agree. Benchmarks test everything. Hairworks as an API may be Nvidia technology but the underlying implementation still shows a deficit in tessellation performance on AMD cards. That deficit is in the card, not a flaw of the benchmark to be corrected by disabling the test. An accurate representation of power is determined by testing all aspects of the card’s performance, and tessellation is a very significant DX11 feature. It is a perfectly reasonable choice for objective benchmarking. It’s also perfectly reasonable to test AMD’s TressFX’s OpenCompute implementation on Nvidia cards, which are typically weak in that area.

            This is precisely the point of benchmarks, to show the strengths and weaknesses of each tested card.

          • I would normally agree 100%, but we are talking about ‘nvidia hairworks’. It’s been designed to run best on nVidia cards and not AMD, to the point of stifling AMD’s performance. (Whether by intention or not). It’s intended as a selling point to get people to swap cards for a better experience after all.

            Yes you ‘can’ run it on AMD hardware, but it’s a clear disadvantage and it’s very clear that it hasn’t been optimised for it at all, just like running PhysX on a CPU. It’s not something like MSAA or a technology designed to work ‘optimally’/ stress/ leverage all platforms.

            It’s hence not a true representation of performance, hence a waste of time to benchmark, unless you are looking at benchmarking the performance of Hairworks on nVidia vs. AMD specifically.

            Like I say, it’s the same reason people don’t normally benchmark with PhysX enabled if they are comparing nVidia and AMD cards. It’s not a true representation/ comparison of power.

            Edit: TressFX is naturally an exception as it performs well on both teams. Should hairworks/ PhysX become optimised for all platforms I’m sure we will see it cropping up in more benchmarks in time. Until then, as I say, most people/ comparisons won’t see it as an accurate hardware benchmark (outside of interest pieces).

          • Let me re-phrase myself to avoid some of this confusion.

            As an interest piece/ a benchmark to compare performance of Hairworks on nVidia and AMD, or as a side note to the Witcher 3’s performance on AMD/nVidia, I think it’s vital to include Hairworks comparisons. It’s also very interesting to use it to speculate in regards to AMD’s tessellation performance, but in the grand scheme of things this sort of speculation makes for a poor ‘clinical’ hardware benchmark.

            Hence when benchmarking graphics card power in general and not in a specific case (ie. GPU performance with Hairworks), it doesn’t result in an accurate display of power.

            Hence why you don’t see Hairworks implemented in most GPU benchmarks which take the Witcher 3 into account as one of the many tested games. It’s not an overarching/ accurate benchmark.

            It simply depends ‘what’ you are benchmarking.

      • I agree with the OP. Crysis 1 was a beast and couldn’t run on low/medium settings @1024×768 at 30FPS on most of my friends’ (and my own) PCs for years. We all had mid-range PCs of the time.

        Crysis devoured hardware, with most people struggling to hit 30FPS, let alone 60FPS, for years (At decent settings and resolutions).

        The Witcher 3 performs just fine on mid range and high end hardware. I can max out the Witcher 3 at 1080p 30FPS on my old/mid range 7870 and 60FPS on my new 970 (Without Hairworks).

        Considering that the Witcher also doesn’t stress my 3.1GHz i5 past 30% on all four cores for the most part, I’d say GTA V is a more rounded modern benchmark. (And, IMO, better looking, but that part is just my opinion.)

        • I’m not suggesting Witcher 3 is the same as Crysis, just that it’s very rare to get new release games that top end hardware can’t handle on max settings (that is all settings max, not selectively disabling some because they’re particularly taxing). The only other one that comes to mind since Crysis was Oblivion. Console hardware limitations have a lot to answer for when it comes to pushing against current caps.

          To be honest, I have a hard time believing that you get a stable 30fps on a 7870 with max settings at 1080p. The 7870 was a worse performer than the GTX680, which seems to only rate a 26fps average in the chart above with drops as low as 13. I’m not suggesting you’re lying and I apologise if you infer that, it just doesn’t seem to match the surrounding data.

          CPU is rarely the bottleneck when it comes to graphics features, that work is pretty much all deferred to the GPU these days. Scene complexity can be a factor but even then proper use of DX11 tessellation should push most of that onto the GPU.

          GTA5 has a remarkable engine, no question. What it was capable of displaying on the X360/PS3 consoles was amazing considering the severe hardware limitations. On the other hand, as a benchmark tool it’s really only good at testing mid-range cards where pipeline limitations are less likely to interfere with GPU load. Project CARS is probably a better one for testing high end cards.

          • I was also surprised at how well the 7870 performed TBH, considering that it normally competes with the PS4, but runs this particular game looking better and running smoother. I could probably record some gameplay if you like, but it would be a real hassle and would probably degrade performance anyway. I should also say again that I run it without hairworks.

            With hairworks on I can run the game maxed out at 30-47fps on the 970. IMO Hairworks tessellates ridiculously and has 8xMSAA enabled by default, which bogs down the performance even further. If I edit the MSAA to 2x or 4x the performance is much better. On the AMD card I could use the AMD CCC to edit the amount of tessellation for performance gains as well, but I honestly don’t think hairworks is worth the performance trade-off when the standard hair simulation is so much better than average already.

            I also have to agree that Project CARS is a very good GPU benchmark, but I still prefer to compare GTA V (or even Metro LL) to Crysis because it can push the CPU, as well as the GPU, pretty hard, like Crysis did back in the day.

            My point was that I agree with the OP that the Witcher 3 isn’t as good a benchmark, when compared to other modern games, as Crysis 1 was, when compared to games of its time.

          • Sorry to keep bothering you, but I thought this was super interesting:

            I tried the 7870 at Ultra settings with Hairworks enabled. I edited the config to only have 2xMSAA on hairworks at first. It helped, but framerates were still unacceptable, dropping from 25-34FPS (PS4 frame rates, but at far higher settings) to well below 30FPS all the time.

            I then used the AMD CCC to lower the tessellation from 64x to 8x and now Hairworks only has a 5FPS(!!!) hit on performance when you max it out, when compared to the game running without Hairworks and without the tessellation tweaks.

            Just thought that was interesting.
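The two tweaks described in the comment above (lowering HairWorks’ MSAA via a config edit, then capping tessellation driver-side) were widely circulated at the time. As a hedged sketch, the config half amounted to a one-line change; the file path, section and key name here come from community tweak guides of the era, not official documentation, so verify them against your own install:

```ini
; <game folder>\bin\config\base\rendering.ini
; Section and key names per community tweak guides; they may differ
; between game versions. Lower values = less MSAA on HairWorks geometry.
[Rendering]
HairWorksAALevel=2   ; shipped default was 8
```

The tessellation half was done outside the game: AMD’s Catalyst Control Center can override the application’s tessellation level (capping it at 8x instead of the 64x HairWorks requests), which is what produced the much smaller HairWorks performance hit reported above.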

  • I’ve got a 780Ti HydroCopper (watercooled basically) and an i5 4870K (I think it’s that or the 4770, can’t recall).

    It runs well on ultra/high settings unless the hairworks is on (which looks weird to me anyway).

    Turned off hairworks and lowered the grass (makes it damned hard to find things when the grass is all the way up) and it’s running very nicely.

    Main gripe I have is with the controls. Everything is in 90 degree increments so I can’t slowly turn. It makes everything incredibly jerky, wish they’d change that.

  • I’ve always been a “I need 60fps minimum to enjoy a game” type guy. My GTX780 is only getting probably 30fps average at 3440×1440 on medium, but I must say it still looks great and is playable at that rate. I’m hanging for the GTX980TI to drop. Fingers crossed this week some time.

    • I’m having trouble deciding on 30FPS or 60FPS for the Witcher tbh.

      I’m playing on a 970 and I can max the game out, without hairworks, at 60FPS. But even then there is micro-stutter. Even though my frame counters say it’s a fixed 60FPS update (locking the FPS to 60 in the options).

      Which means it’s a frame pacing issue and since this game won’t let me use triple buffering in the geforce control panel, there’s little I can do about it. So my other option is to run the game maxed out with hairworks and cap the FPS to 30FPS, which lessens frame pacing issues but has more input latency.

      Thankfully the witcher’s controls are already clunky and there’s no frame rate on earth that’ll fix that, so it’s been a daily decision for me: Stutter or.. well.. a different kind of stutter.
