The GeForce GTX 680 was Nvidia’s first 28nm part, featuring 1536 CUDA cores, 128 texture units and 32 ROP units. Since release it has remained Nvidia’s fastest single-GPU graphics card of the series, second only to the dual-GPU GTX 690, which features a pair of GK104 GPUs.
And so for the last 12+ months the GTX 680 and the Radeon HD 7970 have been battling over the performance crown, forcing numerous price cuts and even a little overclocking from AMD to produce the 7970 GHz Edition. In the end AMD was able to undercut Nvidia on price, producing what we believe to be the better solution.
Most recently, however, Nvidia showed what it could really do with the Kepler architecture by beefing it up with more CUDA cores, texture units and ROPs, creating the GK110. The GeForce GTX Titan is a monster that belongs to an entirely different league, crushing the GeForce GTX 680 as well as the Radeon HD 7970 GHz Edition in every way possible. Real-world gaming tests saw the GTX Titan outpace the GTX 680 by a 42 per cent margin and the Radeon HD 7970 GHz Edition by 30 per cent. In the past we’ve seen performance jumps of 20 to 25 per cent from one generation to the next, so these numbers are indeed something special.
But of course, with a $1000 price tag it’s an apples-to-oranges comparison. If anything, the Titan showed how much more complex and powerful Nvidia could make a current-generation 28nm GPU without putting the TDP rating through the roof. It also meant Nvidia could move to its next generation of mainstream GPUs without completely redesigning the architecture for the GeForce 700 series, and that is exactly what it has done.
The new GeForce GTX 780 is based on a slightly cut-down version of the Titan’s GPU, keeping many of the features that make the $1000 graphics card great, such as the 384-bit memory bus.
GeForce GTX 780 in Detail
The GeForce GTX 780 reference board measures 10.5 inches (26.7cm) in length. Display outputs include two dual-link DVIs, one HDMI and one DisplayPort connector. With 2304 CUDA cores at its disposal, the GeForce GTX 780 has 50 per cent more CUDA cores than the GeForce GTX 680. The GTX 780 also gets a 3GB frame buffer as standard, 50 per cent more than the GTX 680’s 2GB.
Helping to take advantage of the extra memory are six 64-bit memory controllers, forming a 384-bit wide memory bus. Paired with a 6008MHz (effective) GDDR5 memory clock, this provides up to 288.4GB/sec of peak memory bandwidth to the GPU.
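That bandwidth figure follows directly from the bus width and memory clock; a quick sanity check of the arithmetic (plain Python, purely illustrative):

```python
# Peak memory bandwidth = bus width in bytes * effective transfer rate
bus_width_bits = 384
effective_clock_hz = 6008e6  # 6008MHz effective GDDR5 data rate

bytes_per_transfer = bus_width_bits / 8            # 48 bytes moved per transfer
bandwidth_gb_s = bytes_per_transfer * effective_clock_hz / 1e9

print(f"{bandwidth_gb_s:.1f} GB/s")  # 288.4 GB/s
```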
Those specs mean those still rocking a GeForce GTX 580 should be looking at around a 70 per cent performance improvement when upgrading to the GTX 780. And it’s not just GTX 580 owners who can expect a decent upgrade, as owners of last year’s GTX 680 should still receive around 30 to 40 per cent more performance, at least on paper.
The 12 SMX units providing 2304 CUDA cores are clocked at 863MHz though using Boost 2.0 they can be clocked up to 900MHz in certain scenarios. The second generation GPU Boost technology works in the background, dynamically adjusting the GPU’s graphics clock speed based on operating conditions.
Originally, GPU Boost was designed to push the GPU to the highest possible clock speed while remaining within a predefined power envelope. However, Nvidia’s engineers found that GPU temperature usually limits performance first, so with Boost 2.0 they have changed the way the technology works, boosting clock speeds according to GPU temperature rather than a power target. The target in question for the GTX 780 is 80C.
In other words, the GTX 780 will automatically boost to the highest clock frequency it can achieve as long as the GPU temperature remains at or below 80C. Boost 2.0 constantly monitors GPU temperature, adjusting the GPU’s clock and voltage on the fly to maintain this temperature.
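The behaviour described above amounts to a simple closed-loop controller. Nvidia hasn’t published the actual algorithm, so the sketch below is purely illustrative — the boost bin size, upper clock limit and function name are all assumptions, not Nvidia’s implementation:

```python
BASE_CLOCK_MHZ = 863   # GTX 780 base clock
MAX_BOOST_MHZ = 1006   # hypothetical upper boost bin, for illustration only
TEMP_TARGET_C = 80     # Boost 2.0 temperature target
STEP_MHZ = 13          # hypothetical boost bin size

def boost_step(current_clock_mhz, gpu_temp_c):
    """One iteration of an illustrative temperature-target boost loop."""
    if gpu_temp_c < TEMP_TARGET_C and current_clock_mhz + STEP_MHZ <= MAX_BOOST_MHZ:
        return current_clock_mhz + STEP_MHZ   # thermal headroom: raise the clock
    if gpu_temp_c > TEMP_TARGET_C and current_clock_mhz - STEP_MHZ >= BASE_CLOCK_MHZ:
        return current_clock_mhz - STEP_MHZ   # over target: back the clock off
    return current_clock_mhz                  # at target (or clamped): hold
```

Run repeatedly against live telemetry, a loop like this settles the clock at whatever frequency keeps the GPU hovering around the temperature target, which matches the behaviour Nvidia describes.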
Nvidia has borrowed the GTX Titan’s design for the GTX 780, which is great news as the Titan not only looked imposing but was also whisper quiet. Like many recent high-end GPUs, the GTX 780 makes use of vapour chamber cooling: a copper vapour chamber extracts heat from the processor using an evaporation process similar to a heatpipe, but more powerful. Helping to improve efficiency is a new thermal material from a company called Shin-Etsu, which is said to provide twice the performance of the grease used on the GTX 680.
Additionally, Nvidia has included an extra heatsink behind the 80mm blower-style fan that increases the cooling area. There is also an aluminium baseplate, which provides additional cooling for the PCB and board components. The guts of the cooling operation are covered in a case that encloses the top of the card.
Given the high-end nature of this board, Nvidia engineers decided to use an aluminium casing for the cover. At its centre is a clear polycarbonate window, allowing you to see the vapour chamber and dual-slot heatsink used to cool the GPU.
Another nice touch in our opinion: the side of the card features a large GeForce GTX logo that glows green when the system is turned on. We think this looks cool, but if it’s not for you, the LED intensity can be adjusted in software.
Beside the logo, towards the end of the card, is a pair of PCI Express power connectors. The configuration is the same as the GTX Titan’s, meaning you will find a single 8-pin connector alongside a 6-pin connector. The GTX 780 has been given a TDP rating of 250 watts, 28 per cent greater than the GTX 680’s, so Nvidia recommends using a 600W power supply. The board features a 6+2 power phase design that Nvidia says delivers ample power even when overclocking: six phases are dedicated to the GPU while two feed the GDDR5 memory.
Testing Methodology
Reporting average fps (frames per second) using Fraps is how things have been done for… well, forever. It’s a fantastic metric in the sense that it’s easy to record and easy to understand. But it doesn’t tell the whole story, as The Tech Report and others have shown.
To get a fuller picture, it’s increasingly apparent that you need to factor in a card’s frame latency, which looks at how quickly each frame is delivered. Regardless of how many frames a graphics card produces on average in 60 seconds, if it can’t deliver them all at roughly the same speed, you might see more brief jittery points with one GPU over another — something we’ve witnessed but didn’t fully understand.
Assuming two cards deliver equal average frame rates, the one with the lower, more stable frame latency is going to offer the smoothest picture, and that’s a pretty important detail to consider if you’re about to drop a wad of cash. As such, we’ll be including this information from now on by measuring how long in milliseconds each card takes to render each individual frame and then graphing that in a digestible way.
We’ll be using the latency-focused 99th percentile metric, which reports the time in milliseconds within which 99 per cent of all recorded frames were rendered; the lower that number, the faster and smoother the overall performance. By excluding the 1 per cent of most extreme results, it’s possible to filter out anomalies that might have been caused by other components. Again, kudos to The Tech Report and other sites like PCPer for shining a light on this issue.
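For the curious, computing the metric from a Fraps-style frame-time log is straightforward. A minimal sketch follows; the nearest-rank percentile convention used here is one common choice, not necessarily the exact method any particular site uses:

```python
import math

def percentile_99(frame_times_ms):
    """Nearest-rank 99th percentile of per-frame render times (milliseconds)."""
    ordered = sorted(frame_times_ms)
    rank = math.ceil(0.99 * len(ordered))   # 1-based nearest-rank position
    return ordered[rank - 1]

# Example: 99 smooth ~60fps frames plus one 50ms hitch.
times = [16.7] * 99 + [50.0]
print(percentile_99(times))  # 16.7 — the single spike is excluded
```

Note how a lone stutter barely moves the 99th percentile figure, while a card that hitches frequently would see the number climb even if its average fps looked healthy.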
Test System Specs
- Intel Core i7-3960X Extreme Edition (3.30GHz)
- 4 x 2GB G.Skill DDR3-1600 (CAS 8-8-8-20)
- Asrock X79 Extreme11 (Intel X79)
- OCZ ZX Series (1250W)
- Crucial m4 512GB (SATA 6Gb/s)
- Radeon HD 7990 (6144MB)
- Radeon HD 7970 GHz (3072MB) Crossfire
- Radeon HD 7970 GHz (3072MB)
- Radeon HD 7970 (3072MB)
- Radeon HD 7950 Boost (3072MB) Crossfire
- Radeon HD 7950 Boost (3072MB)
- Radeon HD 7950 (3072MB)
- Radeon HD 7870 (2048MB) Crossfire
- Radeon HD 7870 (2048MB)
- GeForce GTX Titan (6144MB)
- GeForce GTX 780 (3072MB)
- GeForce GTX 690 (4096MB)
- GeForce GTX 680 (2048MB)
- GeForce GTX 670 (2048MB)
- GeForce GTX 660 Ti (2048MB) SLI
- GeForce GTX 660 Ti (2048MB)
- Microsoft Windows 7 Ultimate SP1 64-bit
- Nvidia Forceware 320.14
- AMD Catalyst 13.5 (Beta 2)
Benchmarks: Battlefield 3, Crysis 3
The GeForce GTX 780 averaged 62.4fps in Battlefield 3 at 2560×1600, making it 20 per cent faster than the Radeon HD 7970 GHz Edition and 29 per cent faster than the GTX 680. Compared to the dual-GPU GTX 690 the GTX 780 was 32 per cent slower, while it trailed the GTX Titan by just 8 per cent.
The GeForce GTX 780 slipped further behind the GTX Titan when measuring frame time performance — 14 per cent slower to be precise. Still, the GTX 780 was 17 per cent faster than the Radeon HD 7970 GHz Edition and 22 per cent faster than the GTX 680.
The GeForce GTX 780 spat out 29.1fps in Crysis 3 at 2560×1600, just 2fps slower than the GTX Titan but also just 1fps faster than the GTX 680. Nevertheless, it was 31 per cent faster than the Radeon HD 7970 GHz Edition.
When measuring frame time performance in Crysis 3, the GeForce GTX 780 was 15 per cent slower than the GTX Titan but 12 per cent faster than the GTX 680. It was also 17 per cent faster than the Radeon HD 7970 GHz Edition card.
TechSpot is a computer technology publication serving PC enthusiasts, gamers and IT pros since 1998. Republished with permission. [clear]
Comments
23 responses to “Nvidia GeForce GTX 780 Review: The Titan Descendant”
One of these or a 770 will be replacing my 570, waiting til next week when the 770 releases to decide.
I’ll be buying the 780 to replace my 580
Will be looking at the midrange 7xx cards to be released soon to replace my 7750. This will be too pricey for me, although if it brings down the price of the high end ATI cards as well, I won’t complain about having extra options
Well PC Case gear have the Overclocked EVGA model for a tad over 700 bucks. Which is cheaper than what the 680 debuted at which was 800-900 dollars. So in that respect it’s a good price, but I agree it is a lot to be shelling out for a graphics card.
If you think of it in terms of being a baby Titan it makes a little more sense. Titan has 6GB of vram as opposed to 3GB, and just shy of 400 more CUDA cores. Then again, waiting for 680’s to drop in price might be a better deal. The 780’s are 15-20% better (going purely off the benchies above, and I use the term ‘only’ very loosely) whereas picking up a 4GB 680 FTW/Classified, etc might be awesome value.
Not sure if it’s a viable solution without doing the research, but worth considering. Also, the 7970 Ghz/OC versions are giving crazy bang for buck at the moment.
Also gotta remember with the titan it’s still over a grand, 1200 is the cheapest anyone sells it for according to Staticice.
I THOUGHT I was hanging out for this, but that card still doesn’t put up 60FPS+ on every game at normal resolutions. Looks to me like this is really the time to be looking for price drops on 690s or the Titans.
Even though I currently have a 7970, I will be selling it and going nvidia. Never had an AMD before. With their drivers the way they are, I’ll never be going AMD again.
I heard the AMD drivers are absolute turd, but the hardware is sound. What kind of issues have you had? Interested to know as a generally Nvidia man.
My old ATI card required me to update drivers every week or two. Bloody annoying.
The nvidia drivers are way better: stable and they mostly just work.
The AMD drivers are crap to work with and constantly drop profiles and become unstable.
I’ve never experienced any of these issues with the many nvidia cards I’ve owned.
right there with you Rowan, have been on the 7970 since its release as I was getting back into PC gaming after many many years out. Will not be going AMD again, NVIDIA all the way from here on out, most of the games I play seem to have the optimised for NVIDIA tag on them, and even with out that, the issues with drivers has been very annoying, I have had driver releases screw my multiscreen setup too many times to count, while also making my system entirely unstable at points as well.
Chris Wisby • a day ago
I AM SORRY FOR TYPING IN CAPS BUT I GOTTA CORRECT ALL THIS DISINFORMATION ABOUT THE GTX 780 K CHECK IT I JUST RECEIVED MY MSI GTX 780 YESTERDAY FROM NEWEGG AND LET ME TELL YOU NOW THIS AINT NO SMALL BUMP UP FROM A GTX 680 K MORE LIKE A SMALL BUMP UP FROM THE TITAN K I AM SCORING X5617 IN 3DMARK11 EXTREME WHILE THE TITAN SCORES ABOUT X5058 K THATS ALMOST 600 POINTS HIGHER FOR MY GTX 780 AND REASON BEING CLOCK SPEEDS THIS THING IS A FUCKIN BEAST I AM RUNNIN +38MV OVERVOLTAGE +214 CORE +501 MEMORY AND THIS CARD BOOST TO 1241MHZ UNDER LOAD AND WITH A CUSTOM FAN PROFILE IT STAYS AT A COOL THROTTLE FREE 68 DEGREES AT THOSE CLOCKS AND 1.2 VOLTS SO FOR THOSE ACTING LIKE THIS CARD AINT A 680 AND A TITAN KILLER YOU SHOULDNT TALK TILL YOU GET THE CARD THEN SEE FOR YOURSELF THIS IS THE MOST AMAZING GPU EVER MADE AND AT 649.00 ITS A KILLER CARD AND ALSO FOR THOSE THINKIN ABOUT GETTIN AFTERMARKET LIKE THAT CHEAP PLASTIC ACX EVGA CRAP LMAO YOUR TRADING DOWN IN QUALITY AS THE REFERENCE CARD IS OF THE HIGHEST BUILD QUALITY OF ANY CARD EVER MADE AND THIS AFTERMARKET CRAP AND SUPERCLOCKED NONSENSE IS JUST WAYS OF EVGA AND OTHER COMPANIES CHARGING MORE MONEY AND CUTTING THEIR COSTS BY USING CHEAP ASS PLASTIC FANS AND TRYING TO ACT AS IF THEIR ACTUALLY VALUABLE WHEN THE TRUTH IS THE GTX 780 REFERENCE MODEL IS PURE PLATINUM QUALITY AIGHT I HOPE I HELPED THOSE OUT WHO GOT A GTX 680 LIKE ME WHO WAS ON THE FANCE WIT THIS CARD IM TELLIN YA DONT BE K IF YOU AINT GOT A TITAN THEN BELIEVE ME THE UPGRADE IS WARRANTED
3GB????
It’s the future!
I’ll hang onto my 670 for a little longer, though it’s pretty tempting to upgrade XD
I currently own a GTX680 and I have a 3D 120Hz Samsung SA950D. I cannot get 3DVision to work because of nVidia’s proprietary BS. I have to go with one of 4 monitors they chose for me, which are expensive for what they do.
I used to have a HD5870, ran beautifully until very recently when I sold it. No problem with 3D support with any 3D monitor or TV.
By the way you have to buy an additional driver called 3DTV Play [$35.00] if you want to get your 3D TV to work with your nVidia card.
I bought it primarily because of all this hype about PhysX and it doesn’t even make that much of a difference, still struggles to run the game I love [The Witcher 2, etc…] at 60FPS ultra quality.
I will never buy nVidia again until they get rid of this proprietary nonsense.
Waiting for the next gen Radeon, the driver issues are almost a thing of the past and nVidia has their own driver issues.
Things will only get better for Radeon, because all major consoles will have Radeon architecture in them.
nVidia is far more expensive for, what little or no benefit they offer.
I made a big mistake by spending $730.00 on this GTX680, and they release GTX690, Titan and a GTX780 within a few months of each other and I also have to spend more money to get it working with my 3D TV, have to choose from a few selected monitors to enjoy 3D.
Thank You nVidia. I used to be a fan.
It’s hard to have any sympathy for you when you are wasting your money on dumbfuck 3D tech.
I have the money to experiment and that’s my business, sir. I don’t want sympathy, I was making a point.
Radeon HD 7990 is what you’re looking for and should be coming to our shores pretty soon 🙂
I’ve got an Asus HD 7870 DirectCU, so I might get another one very soon to run crossfire, only about $250 and the benchmarks running this setup don’t seem to be far off at all from the Titan!
Honestly, Nvidia lost me after the 550Ti.
Yes. I will be there.
OMGZ 700 series!! Damn I’m so excited and have to start saving to upgrade my 580. You seriously made my day with this 😀
I just hope it isn’t too loud and hot
A brew, I own two Titan cards in my rig, they are in SLI, since the 780 uses the same reference design and cooler, I can tell you that they are almost silent even when rocking games. Mass effect 3 for example running at 60fps with all the effects maxed at 1920×1080 utilises 13% of my card and around 3% of the CPU. The 780 will be very quiet indeed.
Ambrew sorry about getting the name wrong, I hate ipad autocorrect.
If i’m going to spend $750 on a graphics card I might as well up it to $1100 and get real titanium majesty. I could buy two 7970s for about the price of a 780 and they beat the shit out of it on those fancy charts. Fuck upgradability! I want to play games. This is about games right? I mean, if you’ve got money, a Ferrari 458 would be your titan.
Impressive, but what shocked me is how well the 660 Ti in SLI scored in your benchmarks!