If you play games on PC, there’s a good chance you use a graphics processing unit (GPU) built around technology from one of two companies: Nvidia or AMD. The question is: Which?
I’ve used both brands off and on for a while now — both Nvidia’s GTX line and AMD’s (formerly ATI’s) Radeon cards have their own cadre of optimised games, and each is plenty powerful enough to deliver a good PC gaming experience. But I’ve been curious which cards more of our readers use, so I thought I’d run a poll.
A few notes on the poll options. BOTH is intended for those who own a GPU of each brand and either (a) regularly swap them out depending on which game they’re playing, or (b) have two PCs, one AMD and one Nvidia. OTHER is intended for those who use some other graphical solution to play PC games. I DON’T PLAY PC GAMES should be self-explanatory.
OK! Let’s see which GPU is more popular among Kotaku readers:
63 responses to “Which Graphics Card Do You Use, Nvidia Or AMD?”
I use a Sapphire Radeon HD 7970 Dual-X OC w/ Boost and it is amazing! Running it with an i7 4770K (overclocked to 4.5GHz) and getting about 45fps on the highest settings in Crysis 3… love it! 😀
All that for 45 FPS? What is wrong with that freaking game?
I swear you have to rent out space on Amazon EC2 to get that thing to run at 60+ FPS.
I have an i7 3770 @ 4.3GHz, 16GB 1600MHz CL8 RAM and 2x 660 Tis in SLI, and get a constant 60fps with V-Sync on.
2x IceQ Turbo 3GB Radeon 7950.
Does the job. =)
Nice! Do you run into many problems with drivers for certain games, or does it actually run pretty smoothly?
If I keep up to date with Catalyst there are usually no problems.
BF4 is putting it through its paces though. Pulling a wicked 20 FPS on Ultra.
Damn! I’m still trying to get past the loading screen with no luck… Anyway, I might think about getting another 7970 and adding them to my LC loop. It’ll be interesting though, considering my Sapphire card is non-reference and you can’t fit a full-cover waterblock onto the PCB (need to go a bit more custom!)… sucks 😐
Man, I miss my IceQ Turbo 4870! Every time I’m even slightly reminded of it I get all lovey. Rocking an MSI 6950 Twin Frozr III right now, which I’ve unlocked all the shaders on and overclocked, so it’s essentially a 6970.
I use a Radeon 6950 (w/ unlocked 6970 shaders). Looking at upgrading to an R9 290X in a few months’ time, once launch shortages and the cost spike settle down a bit. 😀
A few years ago I was living on my own and things were tight. I bought a semi-decent second-hand PC from a friend for a good price, but it had no video card. I scraped what little money I had together and, borrowing a few bucks from my girlfriend, got myself a Radeon HD 5770 (1050 monitor), and man was I impressed. The performance for the price blew me away; I was so grateful to be able to game properly at last, and it renewed my love for PC games. Now I have a much more powerful machine with a GTX 570 (with a second one on the way) and I love it; this card’s performance exceeded my expectations.
Anyway, I believe that both companies make awesome cards, and strong competition is never a bad thing for consumers.
I’ve always used Nvidia and have had a screen tearing problem for 5 or 6 years now. I think it’s time to switch to a Radeon.
Tearing isn’t caused by the brand. Try Adaptive V-Sync (make sure V-Sync is disabled in the game settings), or get a 120Hz monitor.
Using an MSI GTX 780 Lightning; had a Radeon 6950 before that but switched to Nvidia for the PhysX.
I have no particular allegiance – when it’s time to upgrade I will buy whatever looks the best deal in my price bracket. Nvidia SLI currently.
I think this is the smart way to go, I do the same. Brand loyalty is a waste of money.
Honestly, this is the only intelligent answer. Pragmatism and rationality get the best results every time.
Have always used Nvidia; no need to even look at Radeon, because I’ve never had anything but great experiences with Nvidia. Currently using 2x GTX 670 SC 4GB with 3x IPS monitors at 7680×1440. Unfortunately I am going to have to upgrade next year when the 800-series Maxwell cards are released, because I’m barely getting 30fps maxed out at that resolution these days.
Just built mine a few weeks ago. Nvidia GeForce GTX 780 – I was going to CrossFire two AMD 7970s, but you should always start with one card so you can upgrade to a second later on.
AMD when i want to play games, Nvidia when i want to fry an egg
You’re a bit out of date there. For the last 18 months (last two generations), Nvidia cards have had much better thermal profiles than AMD, mainly because they use about 20% less power than their respective AMD competitor cards for the same performance.
I’ve been looking at going AMD for my new CPU, but I’m pretty new to PC building and just don’t know what’s good. I’ve got a GTX 660 Ti; what would be a good CPU from either brand for about $250 to play BF4/Titanfall?
Here are two decent options for about that price range:
http://www.pccasegear.com/index.php?main_page=product_info&cPath=187_1512&products_id=21806&zenid=f98aa70f259d0ce052ff803b2f6fb17e – absolute beast if you’re just gaming.
http://www.pccasegear.com/index.php?main_page=product_info&cPath=187_1490&products_id=23494&zenid=f98aa70f259d0ce052ff803b2f6fb17e – a pretty good quad core for gaming, a little pricier as it is Intel (as per usual).
Personally, if I were only gaming (or mostly), I would go for the AMD FX-8350, but I’m no fanboy (as mentioned above, I have an Intel i7 4770K and it’s absolutely great for what I do). Both Intel and AMD have some great choices, although you’ll notice Intel seems to be that little bit pricier across the board but still great (kind of like comparing AMD and Nvidia GPUs).
It’s true that Intel CPUs are pricier, but they also run cooler and get better bang for buck than AMD CPUs. It’s unfortunate because I wish the two were closer to put more pressure on competitive advancement, but there’s a reason Intel dominates PC gaming statistics.
I think AMD may make an interesting comeback with their upcoming 2014 lineup of FM2+ Steamroller ‘Kaveri’ APUs… and with the new R7 and R9 GPUs along with the Mantle API, I reckon Intel is gonna get a run for their money in 2014, especially if the Kaveri APUs end up being great overclockers (which is hit-and-miss with Intel’s Haswell lineup).
Always Nvidia.
AMD Mantle and TrueAudio made me change over. Plus, anyone can make expensive chips to outperform their competitor; AMD excels in architecture and therefore offers a better cost-to-value proposition.
Neither Mantle nor TrueAudio are out yet, and the latter is largely pointless. Gamers typically use one of three systems: discrete sound cards which have excellent quality DSP and DAC already, digital output that doesn’t require DSP in the first place, or onboard audio.
AMD’s DSP would only be useful in the last case, and even then it’s not really bringing anything to the table. PC processing power has advanced enough in the last 10 years that technologies like EAX and other hardware-driven DSP aren’t actually used by games any more, because there’s no need for them. It’s all done in software, with minimal CPU overhead compared with other game systems.
As a game developer, the idea is to make your game as compatible with everyone’s PCs as possible. When it’s within reason, things should be done in software on the CPU, offloading to dedicated hardware only when necessary. When the dedicated hardware isn’t necessary, you bring the work back to the CPU, reducing dependencies on tech not everyone has, levelling the playing field and letting development effort go into improving audio quality for everyone.
AMD are a capable company that produce decent cards, but I wouldn’t put that much stock in TrueAudio. It’s a gimmick, and I’d be surprised if it’s picked up by any serious percentage of games over the next 5 years. Mantle is a different story, but its effectiveness remains to be seen.
I see your point. But AMD’s strategy has been to include these technologies (GCN, TrueAudio) into the console platforms at the same time.
The problem with PC dedicated sound chips is the variety, as you pointed out. The same goes for GCN: it’s been included on AMD cards for quite some time, but hasn’t been utilised. I’m sure they are using these features in the upcoming PS4 exclusives and for PlayStation VR.
Things like occlusion culling and HRTF will make much better use of dedicated hardware, where response times and transition times come into play. But they can’t do this on PC, because of the variety.
AMD was the first to introduce a unified shader architecture; Nvidia adopted it further down the line. But the PS3 had to do without it throughout its lifespan. I just think it’s a shame when that happens.
I usually just get whatever is the best quality/value at the time I’m buying. At the moment I’ve got a Radeon 6970 (I think), but it’s around 1.5–2 years old. If I were to buy one today I’d probably go with Nvidia; their recent cards seem to be pretty awesome.
AMD is great on a budget, but now that I can afford it I’m Intel/Nvidia all the way.
I run dual HD6990. Yes…8GB of graphical goodness 🙂 gets me from A to B.
Not good enough, it doesn’t get you to C! 😛
A minor nitpick, but SLI/CrossFire cards don’t actually pool VRAM; it’s mirrored. I’ve had a few friends buy a second card thinking it would up their VRAM pool, but it doesn’t, sadly.
Meh. Either way it’s TWICE THE AWESOME.
Given the choice, I usually pick Nvidia because I like their Control Panel better than AMD’s Catalyst and I like the stupid little extra stuff like PhysX.
At the moment I have an ASUS 660 Ti which has been serving me very well since I bought it. I think the main reason I chose it over something similar from AMD was that the power consumption was really good for what you got… Have yet to regret that decision.
Once upon a time I used to use ATI/AMD GPUs, but they never seemed that great and they didn’t last that long, so I switched to Nvidia. Never looked back after that, and I have never had a single issue with their cards.
This exactly: you pay for quality when you buy Nvidia. I’ve personally never had any problems with Nvidia, but the one time I bought AMD all I had was constant errors and hardware failures.
Yeah, I used AMD back then as well and was never really fussed with them, even when they were meant to be the gaming CPU. My old Core 2 Duo E6600 lasted as long as my 8800 GTS did; both died around the same time in 2011, oddly enough. Yet they still ran 2011 games on high to ultra-high fine.
You couldn’t buy either brand and expect quality previously, they have both had more than their fair share of screw ups over the years.
Nvidia’s 8xxx series through to the 4xx series ran very hot compared to AMD parts, and before the 5870 AMD’s parts were pretty rubbish themselves.
These days they are both pretty good; however, the upcoming AMD parts have a massive number of interesting features that will pull me towards them again from my GTX 680.
I’m glad that AMD exists to provide competition, but I’m a Linux user, and Nvidia has always had much better Linux support. Driver performance is on par with, or sometimes better than, the Windows version.
Yeah, true, Nvidia are real asses about releasing programming details for their chipsets. Still, they make the best-performing Linux graphics drivers. If you’re serious about gaming under Linux, you don’t really have much choice.
I’ve used my old 550 Ti in Ubuntu before and it wasn’t too bad actually; my 7870 also ran pretty well, so I’ll have to test out my 7970 once I reinstall Ubuntu 🙂
Using an AMD HD 7970, I don’t mind either though.
My current card is AMD, which is how I’ve voted since that’s how the poll was worded, but if I get a new PC / new graphics card it will have an Nvidia in it.
The new PC has always had issues with video playback and the only performance bottleneck I haven’t eliminated is the video card.
That said, having first gamed in 3D around 1990, I’ll generally put up with frame rates and graphics quality that many of the 4xHD-60FPS crowd would blanch at.
I had ATI up to and including the legendary 9700 Pro from the Half-Life 2 days, which overclocked supremely. But after it died (got three years out of it), the next was an X1950, which I wasn’t all that happy with, so I’ve been using Nvidia for the last two cards since then.
Of course, before the 9700 Pro it was pretty much Voodoo add-on cards, lol. I remember them from the HL1 days.
Currently running GTX 670, which I’ve been very happy with.
I have owned three graphics cards in my time actually owning the computer (so not the family computer or anything): an SiS card (I think that stands for Silicon something Systems), which I remember was barely able to run GTA3; then an Nvidia GTX 280; and now I own an Nvidia GTX 680.
AMD. Not because of some weird loyalty to a brand, but because Radeon cards tend to hit the sweet spot in the price range I’m usually looking at (~$150).
I’ve never had issues with my last few AMD cards; they have run very cool and very smooth. Nvidia just doesn’t have the performance-to-cost value.
I have always used Nvidia, but with all three of the new consoles using AMD graphics chips, my next upgrade will probably be AMD, as I think console ports (of which there will be many) will run better on them.
Nvidia for me, although I’ve used AMD in the past. Currently using a GTX 760 that manages most games at around ultra in 1080p. Will possibly buy a second card to run SLI when the next-gen has truly started, or if I decide to get a high res monitor to replace the big screen TV.
I just recently switched from a GTX 570 to an HD 7970 GHz Edition after tossing up between that and a GTX 770. In the end, free games made my choice.
There’s nothing wrong with the 7970, but I wish I’d gone the Nvidia option. Nvidia’s control panel is way better and they are always releasing driver updates. I couldn’t tell you the last time AMD released new drivers.
Try RadeonPro 😀 Way, way more options than Catalyst.
I used to always go for Nvidia, but several big issues have been happening lately that forced me to upgrade the PCs in the house to ATI. First was the whole TF2 / hl2.exe Nvidia driver crash, forcing me to restart the game. Second, the crashes to desktop in Borderlands for my partner. Tried many a driver update to fix the issues, but both were fixed immediately on getting the new cards.
Both are pretty much the same; anyone who says one brand is absolutely better is a fool. Not only are individual cards exceptional on either side, both excel at different things. I haven’t shopped for a long time, but I’ve got a 6970 that was way too competitively priced to pass up. I just went with my hip pocket.
Uh, they have released new drivers almost every month this year. In fact, there was a new beta and a normal release just two weeks ago. You must not be paying attention.
Nvidia. And I know the AMD/ATI cards are good. I’ve just had too many issues historically, and it has left a bad taste in my mouth. Remember the days of ATI when which version of Catalyst you needed depended on which game you played, because version X would crash the game, Y would be jaggy and Z would work? How about the days when game makers had a specific troubleshooting section in their help/FAQ for “ATI Owners”?
Yeah. That’s all fixed now. I know it is. But it’s just my brain can’t accept it. Maybe one day I’ll try again.
The driver issues, which caused hundreds of hours of frustration over a couple of years, drove me to buy a few consoles. I have only recently bought a new desktop PC in the hope of some improvement (I wanna rock on Battlefield 4).
I’ve always been with ATI/AMD until I got my GTX 580 DirectCU II. I never had any major issues with ATI besides the fan noise from my last card. The 580 is amazing, but damn, it’s not a good idea to play any graphically intense games during summer unless I wanna roast.
I’m mostly a console gamer (PC DRM put me off), but when it came to graphics cards I have always used Nvidia, mostly because I rarely ever had a problem with them. The only time was with a 6600GT, but I later found (after returning it) that the real culprit was the cheap PSU I had at the time (Usicase, AAAAGH!).
Brand-wise though, I get Gigabyte Nvidia cards. They may be the budget variety, but they still hold up well.
I have 2x 7970 OC Editions running an 1100MHz core clock. They run really well on BF4; FPS is between 60 and 180, which I’m pretty happy with. Go AMD, bring on the 290X!
I have always had AMD cards, except my first PC with an MX400.
If I remember correctly, one of the deciding moments was Half-Life 2 being bundled with ATI cards. That was when I got my first “real” graphics card, and I’ve stayed with ATI/AMD ever since.
Team Green (Nvidia) vs Team Red (ATI/AMD)…I’ve always been with Team Red for many years….
I’m running a 6950 Dirt 3 Edition with the 6970 shader unlock and all is well… planning on upgrading to either a 7950 or whatever AMD has on the horizon…
I’m still using a HIS HD 5850 alongside my AMD 965 Black Edition, and playing current-gen games on high settings. Even Crysis 2 (not really sure how intensive it is; I’ve never been a Crysis person).
I keep wondering about Nvidia cards for my next build, but I find AMD does the job I need at a better price point. I’ve never been disappointed with my HD 5850, and it’s already three years old now.
Used to run AMD CPUs and Radeon graphics back in the day, because they were great for building performance on a budget. Nowadays Radeons aren’t cheaper and their drivers are woeful. Also, AMD CPUs suck; oh, how the mighty have fallen.
These days it’s Intel CPUs and Nvidia GPUs for me.
Running a pair of reference Gigabyte GTX Titans with XSPC Razor Waterblocks and backplates.
I use Nvidia SLI for gaming, and AMD for cryptocurrency mining.
I have been AMD for the past 10 years (before that Nvidia, and before that 3dfx), and I am moving back to Nvidia for one reason and one reason alone…
I bought a 27-inch ASUS 3D monitor (built-in 3D and glasses), but I discovered that my AMD CrossFire setup will not work in 3D with this monitor. That REALLY GRATES MY TEETH.
Unless the new AMD R9 cards will do 3D on it, I will be buying Nvidia real soon; but it’s hard to find the information, and my patience is wearing thin. At least my PS3 runs 3D perfectly on the monitor…