What's Most Impressive About Nvidia's New Graphics Card? How It Sounds

On Thursday Nvidia's latest "most powerful graphics card in the world" hits the market — the Kepler-based GeForce GTX Titan. Packed with a ridiculous number of CUDA cores, all the teraflops you can eat and innovative features like display overclocking, it's sure to impress the hell out of people who watch performance monitors and keep their frame rate displayed at all times.

What impresses me the most about the GeForce GTX Titan is that I have one that's been running Far Cry 3 at max settings not two feet from my head for more than eight hours, and I can barely hear it.

The GeForce GTX Titan was built to power the world's first gaming supercomputers. In fact, it utilises the same GK110 graphics processing unit used in each of the 18,688 Tesla K20X GPU accelerators inside Oak Ridge National Laboratory's Titan — the world's fastest supercomputer. With 2688 CUDA cores and 7.1 billion transistors, the GK110 is pushing the limits of what the 28 nanometre scale can hold. There are many, many angels on this pinhead.

The card features 6GB of onboard memory and a 384-bit memory interface, making it perfect for players looking to go super high definition or string together multiple monitors. It's capable of three-way SLI, a configuration Nvidia says is the only way to enjoy Crysis 3 maxed out across three monitors (5760x1080 resolution) at a playable frame rate. No doubt that claim will be vigorously tested by enthusiasts who pick up three of the cards when they officially release on Thursday.
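To put that memory interface in perspective, here's a quick back-of-the-envelope bandwidth estimate. The 6Gbps effective GDDR5 data rate per pin is the commonly quoted figure for the card; it isn't stated above, so treat it as an assumption on my part.

    # Back-of-the-envelope peak memory bandwidth for a 384-bit GDDR5 interface.
    # The 6Gbps effective data rate per pin is an assumed, commonly quoted figure,
    # not something stated in the article.
    bus_width_bits = 384       # memory interface width
    data_rate_gbps = 6.0       # effective GDDR5 data rate per pin (assumed)

    bandwidth_gb_s = bus_width_bits / 8 * data_rate_gbps
    print(f"Peak memory bandwidth: ~{bandwidth_gb_s:.0f} GB/s")   # ~288 GB/s

That sort of headroom is a big part of why triple-monitor resolutions like 5760x1080 are on the table at all.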

What I've got here on my desk is the latest iteration of Digital Storm's Bolt. The super-slim PC has come quite a long way since I first tested it back in October. Back then it was noisy and novel, powerful enough to run games relatively well, but within minutes the system and graphics card fans would be working so hard you couldn't hear yourself think.

Since then Digital Storm has revised the case design for the Bolt, increasing ventilation, maximizing air flow and integrating a better power supply, significantly muffling the sound of the world's thinnest gaming PC. Thanks to the GTX Titan's GPU Boost 2.0 technology, the Bolt Titan Edition is ready to handle the toughest PC games while remaining whisper-quiet.

First introduced with the GTX 680, Nvidia's GPU Boost technology was designed to dynamically adjust the GPU's clock speed according to a predefined power target. It provided a modest boost in performance, but had a tendency to limit clock speed at low temperatures, where there was still room to improve.

GPU Boost 2.0 controls the power of Titan based on a temperature target, rather than a power target. The default temperature target is 80 degrees Celsius (users can tweak it manually as well). The Titan will automatically boost the GPU clock frequency to the highest it can go while still remaining at or below that target.
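If you like to think in code, here's a rough sketch of what a temperature target means in practice. This is not Nvidia's actual algorithm; the clock ceiling and step size are made-up, illustrative numbers, but the behaviour matches the description above: climb while there's thermal headroom, back off once the card reaches 80 degrees.

    # Illustrative sketch only; not Nvidia's actual GPU Boost 2.0 algorithm.
    # The clock ceiling and the 13MHz step are assumed values for the example.

    TEMP_TARGET_C = 80.0      # default GPU Boost 2.0 temperature target
    BASE_CLOCK_MHZ = 837      # GTX Titan base clock
    MAX_BOOST_MHZ = 992       # assumed ceiling for this sketch
    STEP_MHZ = 13             # assumed size of one boost step

    def next_clock(current_mhz: int, gpu_temp_c: float) -> int:
        """Climb while the GPU is under the temperature target; back off above it."""
        if gpu_temp_c < TEMP_TARGET_C:
            return min(current_mhz + STEP_MHZ, MAX_BOOST_MHZ)
        if gpu_temp_c > TEMP_TARGET_C:
            return max(current_mhz - STEP_MHZ, BASE_CLOCK_MHZ)
        return current_mhz

    # Plenty of thermal headroom: the clock steps up.
    print(next_clock(837, 65.0))   # -> 850
    # Over the 80C target: the clock backs off rather than the fan spinning up.
    print(next_clock(992, 83.0))   # -> 979

The upshot is that the card chases a temperature rather than a wattage, which is exactly why it can stay so quiet.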

This high degree of temperature control means that the Titan never has to over-exert itself, which keeps the card's acoustic profile incredibly low. It's wonderfully silent. Coupled with a design that blows air from back to front, it's the perfect graphics card for small form factor PCs.

The GeForce GTX Titan is as versatile as it is powerful, giving users total control over the balance of power and performance. Nvidia has unlocked the ability to push voltages beyond their normal operating limits, as long as the tweaker accepts the possibility of their technical hubris causing damage.

The card even has the potential to increase the refresh rate of your monitor using display overclocking, making vertical sync (VSync) screen-tearing reduction prettier than ever. VSync is usually capped at a monitor's refresh rate — generally 60 Hz. GPU Boost 2.0 has the potential to adjust the pixel clock of your display, allowing it to hit higher refresh rates. It won't work with all monitors, but when it does work it will be glorious.
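The arithmetic behind that pixel-clock tweak is simple: the clock has to deliver every pixel of every frame, blanking intervals included, so a higher refresh rate needs a proportionally faster clock. The frame totals below are illustrative assumptions, not the timings of any particular monitor.

    # Rough arithmetic behind display overclocking. The blanking totals below
    # are illustrative assumptions, not any specific monitor's timings.

    def pixel_clock_mhz(h_total: int, v_total: int, refresh_hz: float) -> float:
        """Pixel clock = total pixels per frame (active + blanking) x refresh rate."""
        return h_total * v_total * refresh_hz / 1e6

    # Assumed reduced-blanking-style totals for a 1920x1080 panel.
    H_TOTAL, V_TOTAL = 2080, 1111

    print(f"{pixel_clock_mhz(H_TOTAL, V_TOTAL, 60):.0f} MHz at 60 Hz")   # ~139 MHz
    print(f"{pixel_clock_mhz(H_TOTAL, V_TOTAL, 80):.0f} MHz at 80 Hz")   # ~185 MHz

Whether a given panel will actually accept the faster clock is another matter, which is why it won't work with every monitor.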

You'll be hearing a lot about the Nvidia GeForce GTX Titan in the coming week. Benchmarks should start hitting on Thursday morning. Digital Storm's Titan Edition Bolt will be arriving as well (look for a full review on Thursday). Maingear has several different form factor systems ready to go. Origin PC lays claim to the world's first liquid-cooled Titan (coupled with GPU Boost 2.0's temperature sensing, it should be pretty amazing).

But the most amazing thing about the GeForce GTX Titan is what you won't hear.


Comments

    This thing is badass. I wonder how it compares to the 680 and if it would lower the prices of current graphics cards. It would also be interesting to see if AMD has a response to this.

    Last edited 20/02/13 7:50 am

      Based on a bit of very rough flops calculation, and some stuff I read a few weeks ago when this was rumoured, you're looking at a smidge over double the calculating power of a 680 (Nvidia says it has 4.5 TFLOPS), with considerably more RAM and bandwidth. So I can't imagine performance would be considerably better than what you can already get with a 690, unless you're dealing with an absolutely ridiculous resolution.

      Edit: In terms of price, in recent years there hasn't been that significant a drop in price immediately after a new piece drops, and once the earlier models stop being produced they can start to become more expensive. Since the Titan doesn't replace the line-up, though, it's hard to say if it will affect prices, given it won't affect availability (which isn't very helpful, I know).

      Last edited 20/02/13 7:54 am

    This is just an uber-expensive show pony like the GTX 690 that only makes sense if you must have top performance whatever the price.
    Just like how the 690 is over twice the price of two 680's but provides less power, this will have limited availability, sell in low numbers and only make sense for people who want two or more and have budgets over five grand.

      Except the GTX 690 isn't double the price? I know from personal experience. Also, the reason I bought the 690 over two 680's is that it runs about 20 degrees cooler in the case, and far more quietly. It also has only slightly higher power consumption than two 680s and doesn't require all the computer trickery that goes into setting up SLI.

        Look up the MSY price list; maybe the 680's prices have dropped since you bought your 690. Also, you do realise that the 690 still uses SLI, with all the driver problems and micro-stuttering that go with it?

      I think you made a mistake there. It's not "twice the price of two 680's but provides less power"; that would be a ridiculous expenditure that wouldn't make any sense. It's actually about the price of buying two 680's, for a slight loss in power, and that loss is practically as minute as the price difference versus just buying two 680's and running them in SLI.

      Which, coincidentally, also makes your second claim a little bit hard to believe, in that only consumers who want two of the card will buy one. I know people who bought the 690 just for the convenience of having it all in one card. I wouldn't mind that myself.

        2GB GTX 680: MSI OC $520 / Gigabyte N680SO $555 / Asus $548
        4GB GTX 690: Gigabyte $1149 / Asus $1249
        That's from the current MSY price list, as there seems to be some confusion.
        The 690 really is "a ridiculous expenditure that wouldn't make any sense" unless you have a five grand budget and use three monitors or something like that.

          You do realise the top price is for -one- 680?
          The best I can find is around $1000 for the 690 if you actually bargain hunt, and with the 680 being around the $500 mark, it takes TWO of said card to top the 690... No matter where you buy, you're looking at a very similar price for a very minor difference in grunt between two 680's and one 690.

          Anyway, my point was that you stated the 690 was "more than twice the price of two 680's", which is plainly false. The 690 is only ever slightly more expensive than paying for two 680's.

    My main question is: why is there a glass of wine next to one of them?

      Maybe to give a reference point of size? Though it's not particularly useful, because wine glasses can range significantly in size. A more useful tool would be something like an iPhone or a specific model of mouse... but then of course you open the door to more problems.

        Hmmm, I would have thought using a ruler would be the best method of comparing size, but then again maybe that's why I am not in marketing.

      More importantly, why is there still wine in the cup?! Outrageous.

    Looking at the card, the fan is a conventional centrifugal blower fan. So how can the card blow air from "back to front", when it will be installed in a case blowing air from front to back? Is there something technically different with this card where it now draws air from outside the case, and dumps the hot air onto other components?

    The article starts with a video card, then ends with mini PCs...

    $999 US. So if I add the Australia tax it should be a bazillion dollars here. My wallet will sue Nvidia for rape if this price doesn't go down.

      Not rape, you asked for it in the first place. Try the New Zealand tax instead. That's just snuff.

    I remember the problems I had with the graphical capabilities of my first computer. It didn't support SVGA graphics so eventually I was left behind :( Whoever thought you would need more than 16 colours?

      I'd be surprised if I could name more than 16 colours.
