Tomb Raider Performance Test: Graphics And CPUs

Although this year’s Tomb Raider reboot made our latest list of most anticipated PC games, I must admit that it was one of the games I was least looking forward to from a performance perspective. Previous titles in the franchise have received mixed to positive reviews, but gameplay aside, their visuals weren’t exactly mind-blowing, so we’ve never bothered doing a performance review on one — until now, anyway.

As with the last few entries, Crystal Dynamics developed the new Tomb Raider using the Crystal Engine — albeit a heavily modified version. Being a multiplatform release, we were naturally worried about the game being geared toward consoles with PC being an afterthought, which has become increasingly common (Dead Space 3 comes to mind as a recent example) and generally results in lackluster graphics.

Those concerns were at least partially alleviated when we learned that the PC port was being worked on by Nixxes Software BV, the same folks who handled the PC versions of Hitman: Absolution and Deus Ex: Human Revolution, both of which were great examples of what we expect from decent ports in terms of graphical quality and customisation. Hitman in particular really stressed our higher-end hardware.


We were also relieved to learn that Tomb Raider supports DirectX 11, which brings access to rendering technologies such as depth of field, high definition ambient occlusion, hardware tessellation, super-sample anti-aliasing and contact-hardening shadows. Additionally, compared to the diluted console versions, the PC build offers better textures as well as AMD’s TressFX real-time hair physics system.

The result should be a spectacular looking game that pushes the limits of today’s enthusiast hardware — key word being “should,” of course — so let’s move on and see what the Tomb Raider reboot is made of.

Testing Methodology

We’ll be testing 27 DirectX 11 graphics card configurations from AMD and Nvidia covering a wide range of prices from the affordable to the ultra-expensive. The latest drivers will be used, and every card will be paired with an Intel Core i7-3960X to remove CPU bottlenecks that could influence high-end GPU scores.

We’re using Fraps to measure frame rates during 90 seconds of gameplay footage from Tomb Raider’s first level, starting at the checkpoint called “Stun.” The test begins with Lara running to escape from a cave system.


Our Fraps test ends just before Lara exits the cave, which is ironically where the built-in benchmark begins. We decided to test a custom section of the game rather than the stock benchmark because this is how we will test Tomb Raider in the future when reviewing new graphics cards. Using Fraps also allows us to record frame latency performance, though we didn’t include that data in this particular article.

Frame timings weren’t included for two reasons: it’s not easy to display all that data when testing 27 different GPUs, and we feel Nvidia needs more time to improve their drivers. We’ll include frame time performance for Tomb Raider in our next GPU review.
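For readers curious what frame-time analysis actually involves, here is a minimal sketch — not TechSpot’s actual tooling — of how the cumulative timestamps in a Fraps frametimes log can be turned into per-frame times, an average fps, a worst-frame minimum fps, and a 99th-percentile frame time. The function name and the in-memory timestamp list are illustrative assumptions; a real script would first read the values from the CSV Fraps writes.

```python
# Sketch of frame-time analysis from Fraps-style cumulative timestamps.
# Fraps' frametimes log records the elapsed time (ms) at which each frame
# finished rendering; per-frame times are the successive differences.

def analyze(timestamps_ms):
    """Return (avg_fps, min_fps, p99_frametime_ms) for one benchmark run."""
    # Per-frame render times in milliseconds.
    frame_times = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    total_s = (timestamps_ms[-1] - timestamps_ms[0]) / 1000.0
    avg_fps = len(frame_times) / total_s
    min_fps = 1000.0 / max(frame_times)  # slowest single frame
    # 99th-percentile frame time: the spikes averages hide.
    ordered = sorted(frame_times)
    p99 = ordered[min(len(ordered) - 1, int(0.99 * len(ordered)))]
    return avg_fps, min_fps, p99

# Example: four frames rendered a steady 20 ms apart is exactly 50 fps.
print(analyze([0.0, 20.0, 40.0, 60.0]))  # (50.0, 50.0, 20.0)
```

The point of the percentile figure is exactly the issue raised above: two cards with identical averages can feel very different if one delivers its frames unevenly.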

We’ll test Tomb Raider at three common desktop display resolutions: 1680×1050, 1920×1200 and 2560×1600, all using DX11. We are also testing with the game’s three highest quality presets: Ultimate, Ultra and High. No changes will be made to the presets.

  • HIS Radeon HD 7970 GHz (3072MB)
  • HIS Radeon HD 7970 (3072MB)
  • HIS Radeon HD 7950 Boost (3072MB)
  • HIS Radeon HD 7950 (3072MB)
  • HIS Radeon HD 7870 (2048MB)
  • HIS Radeon HD 7850 (2048MB)
  • HIS Radeon HD 7770 (1024MB)
  • HIS Radeon HD 7750 (1024MB)
  • HIS Radeon HD 6970 (2048MB)
  • HIS Radeon HD 6870 (1024MB)
  • HIS Radeon HD 6850 (1024MB)
  • HIS Radeon HD 6790 (1024MB)
  • HIS Radeon HD 6770 (1024MB)
  • HIS Radeon HD 6750 (1024MB)
  • HIS Radeon HD 5870 (2048MB)
  • Gigabyte GeForce GTX Titan (6144MB)
  • Gigabyte GeForce GTX 680 (2048MB)
  • Gigabyte GeForce GTX 670 (2048MB)
  • Gainward GeForce GTX 660 Ti (2048MB)
  • Gigabyte GeForce GTX 660 (2048MB)
  • Gigabyte GeForce GTX 650 Ti (2048MB)
  • Gigabyte GeForce GTX 580 (1536MB)
  • Gigabyte GeForce GTX 560 Ti (1024MB)
  • Gigabyte GeForce GTX 560 (1024MB)
  • Gigabyte GeForce GTX 550 Ti (1024MB)
  • Gigabyte GeForce GTX 480 (1536MB)
  • Gigabyte GeForce GTX 460 (1024MB)
  • Intel Core i7-3960X Extreme Edition (3.30GHz)
  • 4 x 4GB G.Skill DDR3-1600 (CAS 8-8-8-20)
  • Gigabyte G1.Assassin2 (Intel X79)
  • OCZ ZX Series 1250W
  • Crucial m4 512GB (SATA 6Gb/s)
  • Microsoft Windows 7 SP1 64-bit
  • Nvidia Forceware 314.14
  • AMD Catalyst 13.2 (Beta 7)

Ultra Quality Performance


Moving from high to ultra has a huge impact on performance, so we dropped several cards from testing that couldn’t handle this quality level.

For an average of 60fps, you’ll want the HD 7870 or GTX 680. We’re not sure we’ve ever seen those two cards sitting next to each other, so something seems to be really hurting Nvidia’s cards here (note that TressFX is off).

In another first, the HD 7970 GHz Edition tangoed with the GTX Titan, slipping behind a few frames in the average results but doing much better on the minimum fps.

The minimum frame rates of the Nvidia cards were quite low, and we see this most notably with the GeForce GTX 670, which averaged 61fps but had a minimum of just 28fps.


This time the GTX Titan is just barely able to outclass the HD 7970 GHz Edition, while the HD 7950 Boost was the first card to break 60fps, though you could probably get by just as comfortably with the standard model.


You won’t be playing on ultra quality at 2560×1600 without a respectable graphics configuration, as the GTX 680 was reduced to a mere 33fps while the HD 7970 fared better with 45fps.

Ultimate Quality Performance


Ultimate quality unsurprisingly calls for an ultimate GPU — probably more than one. Even at 1680×1050, it took an HD 7970 or GTX Titan to render an average of more than 60fps, while the minimum frame rate of the 7970 was around 30fps. Nvidia’s cards continue to struggle, and their minimum frame rates are far too low, as the GTX Titan dropped to just 19fps in spots.


For now, those wanting to play Tomb Raider are far better off with an AMD solution as the HD 7970 GHz Edition was able to deliver more consistent performance than the GTX Titan and it offered substantially better results than the GTX 680, which ranked lower than the HD 7870.


Playing at extreme resolutions such as 2560×1600 or beyond will likely require more than one GPU, with the fastest card tested (the HD 7970 GHz Edition) averaging only 34fps with a minimum frame rate of 21fps.


TechSpot is a computer technology publication serving PC enthusiasts, gamers and IT pros since 1998. Republished with permission.


    • They have beta drivers out with proper support for it (they say improvements of up to 60%), but they haven’t provided much information past that.

    • The performance gap isn’t because of anything wrong with Nvidia cards or drivers, and the poor performance of the Titan goes toward proving that. The problem is the way Crystal Dynamics optimised the game. Many games are designed with one brand or the other in mind (which is why you see ‘plays best with ATI’ or ‘designed for Nvidia’ logos in the intro sequence) but in this case, it appears there might have been something shadier going on. Nvidia didn’t even get a copy of the final game code until a few days before release, and some commentators have suggested CD may have intentionally hampered performance on Nvidia cards as part of the game’s sponsorship from AMD. That or they were just utterly incompetent at developing their game engine.

      Nvidia will put out driver updates that improve performance as they always do, but this is one game that needs proper patching by the developer to fix graphics engine problems. Whether they’ll actually do that, I don’t know.

  • Some of these graphs are pretty screwed. Multiple columns have the same value but go to different lengths. And then sometimes the values are larger than the last marker line they passed along the axis?

    • Eh? Are you sure you’re reading them properly? They look fine to me? They’re organised by resolution.

      *Edit* My bad. I see exactly what you mean. I think the problem is in the yellow area; they haven’t added the two together and instead have just given a separate amount. If you add them together it lines up properly.

  • “We’ll test Tomb Raider at three common desktop display resolutions: 1680×1050, 1920×1200 and 2560×1600”

    These are common resolutions?
    I would have thought 1920×1080 would be most common *shrug*
    Not doing 1920×1080 makes this all kinds of useless…

    • 1920×1200 and 1920×1080 are so close it barely makes a difference, add 2 frames and you are good. 8:5 (or 16:10 if you hate simplifying fractions) is by far the most common aspect ratio for PC screens.

      • 1920×1200 used to be popular. If you look at the market now, it’s actually hard to find a 16:10 monitor. 16:9 is the general standard across the board now.

  • I know this really isn’t what this thread is about.. But those colours look pretty seriously washed out to me..

  • For Nvidia cards I am guessing it’s the old 314.14 drivers?

    Just FYI to anyone with an Nvidia card, the new 314.21 drivers give a decent boost in FPS. 🙂

  • I didn’t update my drivers to play it, and it still looks pretty damn beautiful without any noticeable lack of frame rate. Occasionally it appears that it lags out a bit, but that seems to happen when the game is loading more data. And it’s over in a second.
