Benchmarking Fun With The GTX Titan, Nvidia's Most Advanced Video Card

Earlier this week I told you about Nvidia's GeForce GTX Titan, the $US1,000 graphics card with supercomputer power, tons of tweakability and an astounding acoustic footprint that keeps it whisper-quiet under the heaviest of loads. What I couldn't tell you about were the benchmarks — those all-important numbers.

Nvidia claimed this was the card to kick Crysis 3's arse. Today I can tell you Crysis 3 isn't going to be sitting comfortably anytime soon.

It's a little odd, announcing a brand-new piece of advanced graphics hardware, putting it in the hands of the critics and then making them wait two days, but it gave me plenty of time to reflect on the technology I had at my disposal. My GTX Titan came inside the latest version of Digital Storm's Bolt, a mere slip of a computer at only 3.6 inches wide. This is the sort of card Digital Storm's engineers designed their case to hold: powerful enough to handle the most taxing games, yet efficient and acoustically sound enough to do so without making the tiny case roar like a lion the minute 3D rendering starts.

The GTX Titan never got louder than a low hiss during my testing period, and that's incredibly important in this sort of system. Keep that in mind while looking at the numbers to come.

The Titan comes with third-party tools to tweak the hell out of performance, should the user so choose.

I mention this because I did not choose to do so. I am a bit of a coward when it comes to fiddling with settings. I've been burned before, quite literally, so I leave the tweaking to the professionals. These benchmarks reflect the default settings.

I did, however, fool with the card's innovative display overclocking, which lets the GPU push past the monitor's stock refresh rate, allowing for higher frame rates with VSync enabled. Results differ for every monitor, but my 23-inch AOC i2367fh jumped up to 75Hz without a problem — things got janky any higher than that. It was a small difference, but an incredibly noticeable one.
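To put that 60Hz-to-75Hz jump in perspective, here's a quick back-of-the-envelope calculation. This is a generic Python sketch, nothing specific to the Titan or Nvidia's tools; it just shows how much per-frame time budget the extra refresh headroom buys with VSync on:

    # How much frame-time budget does a vsync'd refresh-rate bump buy?
    def frame_time_ms(refresh_hz: float) -> float:
        """Per-frame time budget, in milliseconds, at a given refresh rate."""
        return 1000.0 / refresh_hz

    for hz in (60, 75):
        print(f"{hz}Hz -> {frame_time_ms(hz):.1f}ms per frame")

    # 60Hz -> 16.7ms per frame
    # 75Hz -> 13.3ms per frame

With VSync on, that works out to 15 extra frames drawn every second and roughly 3.3ms shaved off every frame's deadline, which is why the bump is subtle but noticeable.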

I began my benchmark odyssey with Unigine, my new favourite method for judging graphics performance. First I went into Heaven and turned everything up, with DirectX 11 enabled.

Solid scores, as expected, but the Heaven benchmark is child's play compared to Unigine's latest, Valley.

Valley integrates all the latest bells and whistles — dynamic sky, volumetric clouds, sun shafts, depth-of-field, ambient occlusion — giving today's advanced graphics cards a serious workout. Performance dipped a little during the dynamic weather portion of testing, but that's to be expected. That the Titan managed to stay above 20 frames per second during that nightmare is impressive.

After spending an hour wandering through the Valley benchmark (it's really addictive, and much easier than real walking), I went to my library of benchmarking games, making sure to mainly pick games with the number three in them. Batman: Arkham City slipped past me while I wasn't looking. It's sneaky that way.

Remember back when Battlefield 3 was a system-taxing monster? Mind you, I performed the benchmark during the game's opening story sequence on the train, but I've seen plenty of video cards stumble through that bit. Most cards I've tested the sequence with hover between 30 and 50 FPS. The Titan rendered it while doing the crossword and catching up on A Game of Thrones.
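For the curious, figures like that 30-to-50 FPS range come from logging when each frame finishes rendering and summarising the gaps between them. Here's a minimal Python sketch, assuming a hypothetical frametimes.csv containing one cumulative millisecond timestamp per line (the sort of log FRAPS-style tools produce):

    # Turn a log of cumulative frame timestamps into min/avg/max FPS.
    def fps_stats(timestamps_ms):
        """Summarise cumulative frame timestamps into FPS figures."""
        # Per-frame durations are the deltas between consecutive timestamps.
        deltas = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
        fps = [1000.0 / d for d in deltas if d > 0]
        return min(fps), sum(fps) / len(fps), max(fps)

    with open("frametimes.csv") as f:  # hypothetical log file
        stamps = [float(line) for line in f if line.strip()]

    lo, avg, hi = fps_stats(stamps)
    print(f"min {lo:.0f} / avg {avg:.0f} / max {hi:.0f} FPS")

The minimum figure matters more than the average; a card that averages 50 FPS but dips to 20 feels far worse than one that holds a steady 40.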

The Arkham City numbers might not seem as impressive, but for this particular test I cranked up the PhysX as high as it would go. I'd say 78 FPS with ice particles bouncing all over the place is nothing to scoff at.

Far Cry 3 ended up on the low end of the frame rate scale, which I attribute to my mad dash through the jungle on a four-wheeler, jumping it off a cliff into the ocean. I cannot sit still in Far Cry 3, sorry.

Thanks to an issue with my cloud save, my Assassin's Creed III test was spent playing Connor's arsehole dad in the early bits of his trip to Boston. I gave the jerk quite a workout, running him through crowds, over rooftops and ultimately into a group of redcoats. The benchmark ended in the middle of the fight, so I let him die. That was pretty satisfying.

And then we have Crysis 3. The supposed system-strangler was no match for the Titan's prowess. My benchmark was performed in the middle of a vast open area, in which a dozen enemies were attempting to end my life with extreme prejudice while I sprinted in and out of them like a madman. After the benchmark completed I spent a good 10 minutes bragging to Kirk Hamilton about how much better the game was running for me than it did for him. I guess this counts as bragging as well. Oops.

I would have spent more time benchmarking threequels, but I figure the five I've run plus the Unigine tests present a pretty solid picture of the sort of performance a cowardly non-tweaker can expect from Nvidia's GeForce GTX Titan. I'd imagine that someone with the courage to adjust settings could get those numbers even higher. I salute the brave men and women who will be spending the next few weeks potentially jeopardising $US1,000 video cards. They are true heroes.


Comments

    Maybe I'll pick one of them up in 6 months when they're under 200 bucks.

Yeah, $700-800 maybe, which is pushing it, but there's no way any GPU drops that much in such a short time. Enjoy waiting a lifetime.

You should really put these benchmarks alongside benchmarks for the 690. It costs about the same, and there's a sizable difference in performance. $1000 is really way too much for this card.


In Jim Sterling's review of Crysis 3 he mentions that he has three of these things in SLI, and the frame rate still 'dipped' to 40 fps at times on the highest settings. Obviously the game is still very playable, but it gives you an idea of just how hard it pushes.

That, or the game needs patching due to poorly optimised use of what's available to it.

This has been happening since the first Crysis. No matter what your hardware is, the game will hoover it up but not return a matching performance level.

        I also think there is some sloppy programming at play.

I never had any issues back in '07 running Crysis mostly maxed out with my E6600, 2GB DDR2 and my 8800 GTS. But then again, it was only ever developed for PC; that could be a contributing factor. I don't know. But it was depressing that I was able to run Crysis 2 better on that rig than I could Crysis.

Now I just scrape by with my i5, 8GB DDR3 and GTX 560 =(

            Can't be worse than mine:
            * 8 GB DDR2
            * Core 2 Quad (Q6600)
            * GTS 450

            But if Crysis 2 was running better, then that meant Crytek was improving its coding. How can it be a bad thing?

Well, the thing is, that rig of mine died in 2011, and it was still running new releases on high with minimal frame rate loss. I just put it down to developers being lazy about pushing anything to the limits. It was the overclocked edition, but still!

Yeah, fair point. Still, if it runs the game, then it's better than nothing!

              I thought it was because Crysis 1 had way more effects and was DirectX 10 and Crysis 2 was smaller environments on DirectX 9.

It wasn't improved coding so much as dropping some effects altogether. It was also massively scaled down, even if the textures were more detailed - I remember one map in Crysis had probably 250 NPCs working in squads. Just incredible stuff.

    So I might still need 2 of these to drive my 2560x1600 (with PLP) setup? Awesome tech, but still a potential pass. (Not that I'll be playing C3 anytime soon, regardless)

    These babies will be $1600 when they first hit Aussie shores due to initial gouging. I could probably buy the next Xbox and PS4 for that amount. A few months ago I got 2 x GTX580s for $200 each brand new for SLI goodness. It pays to wait a while.

PLE has the Gigabyte one listed at $1399 and the ASUS one listed at $1499. None in stock though. I was close. :)

Where the hell did you get 2x GTX 580s for that price?! What brand were they though?

        Gigabyte 1.5GB models from Storm Computers. I nearly fell off my chair when I saw them. Thank you staticice.

Not sure what the point of reviewing a GPU is if you don't compare it to other cards. TL;DR: it's not much better than other cards, and in some cases is worse.

    http://www.anandtech.com/show/6774/nvidias-geforce-gtx-titan-part-2-titans-performance-unveiled

This card isn't as bad as everyone is making it out to be. Apparently Nvidia held back on releasing it for a while, and it was going to be the 670 or 680, but it could very well have put ATi out of the market.
The main reason for picking this card over the current 680 is that it has amazing overclocking potential; this card will stomp all the current cards with a few tweaks, as it's not running anywhere near its full potential. Though you'll be hard pressed to find something that requires this much power, and when later drivers come out nothing will be holding this bad boy back.
I'm keeping my 650 Ti 'cause it looks so damn cute in my M-ITX build.

Yeah, the Gigabyte one is $1,299.00 at Mwave, in stock 27th Feb.
A shitload for one card. Do people actually spend this much?
