With the game finally out for all and sundry, people are binging Fallout 4 left, right and centre. Bethesda’s Pete Hines even tweeted out a permission slip this morning for people taking sickies off work (props to anyone ballsy enough to use it).
Naturally, the PC is the lead platform and it’s where people will experience the fewest performance issues. That’s provided, however, you’re using an NVIDIA card.
A set of benchmarks has appeared on the German site PC Games Hardware, and if you’re an AMD fan then the figures aren’t good. In fact, they’re actually pretty awful.
NVIDIA doesn’t just come out ahead in the figures: they’re so far in front that the Sapphire R9 Fury and the HIS R9 390X are actually losing to a GTX 970 and a GTX 780 Ti at 1080p. Ouch.
The story doesn’t get better at higher resolutions either: the Fury and 390X continue to play second fiddle, though not by much, to the 780 Ti and the GTX 970 at 1440p, although the Fury X regains a measure of respectability at 4K by achieving a fraction over 30 FPS on average. (The GTX 980 was on par, the 980 Ti was far superior and the 780 Ti and GTX 970 were several frames behind at 4K.)
You can check out the rest of the graphs and the remainder of the specs used at PCGH. Most of the text is in German, but all the specs and the key figures you need are in English, so you shouldn’t have any trouble parsing it all. NVIDIA also has another major driver release coming soon, and if their track record this year is anything to go by, that gap could widen.
Comments
91 responses to “If You’re Playing Fallout 4 On PC, You’d Better Have A NVIDIA Card”
It’s stuff like this that has me, previously on the fence about a 980 Ti vs a Fury X, saying “AMD, this is why I haven’t bought your cards in years!”
It’s stuff like this that makes me happy I’m playing on an “inferior” PS4. No tweaking, no driver shenanigans and no glitches (so far). Best solid 12 hours of gaming I’ve had in years (hurrah for RDOs!).
So happy I defected from the “master race”. I just don’t have time for that nonsense, all for marginally more shiny graphics. Each to their own.
No platform is superior, they all have advantages and disadvantages. Just a matter of which one is right for each specific person and circumstance.
Also amusing and related: https://twitter.com/Grimecraft/status/663791445438959616
Mods are why I play it on the PC. Performance? I don’t care really. I’ve been playing vidya since the 80s and even as I type I’ve got an ASCII roguelike running in the background. Whether I play on max settings, min settings or somewhere in-between is neither here nor there to me.
I’m all about the gameplay. Bethesda games come with a ton of bugs and their balance is weighted too much towards casual play for my liking. Mods are what makes those problems go away. Sure, Bethesda release patches but so far they’ve never caught all the issues – that’s left to the Unofficial patches.
You might find that most of the people playing on PC do so for the love of tweaking and modding. It’s strangely addictive.
Definitely can be fun but not for everyone.
Personally I used to love all that but don’t really have the time anymore. I just want to play the game, so I’ve purchased it on PS4. Maybe one day down the track, when mods have had some time to settle in, I might purchase it on Steam and play around with it again, who knows.
Either way, I don’t care much about what’s better; everyone has their own opinion on that.
Yeah that Sub 30 fps PS4 gameplay, I hear it drops to sub 10 frames later on;)
So, nothing’s changed since Fallout 3 on PS3 then. Wonder if it’s still the buggiest platform.
I’m playing on a PC and don’t bother with the tweaking and driver shenanigans, and find it perfectly satisfactory. (I do occasionally upgrade my drivers; it takes about three clicks to do so, never fails, and occasionally boosts my system performance transparently and painlessly.)
Messing around with hardware is not actually NECESSARY for playing on PC. Skipping it costs you an edge in performance, but since the hardware baseline (for a “gamer PC”) is better anyway, upgrading is not needed often.
Of course, finding a PC with decent performance for $350 is pretty difficult. On the other hand, I wouldn’t recommend word processing on a PS4 or XBox One.
As you say, each to their own.
You only occasionally update drivers? Are you nuts? So you play on PC for the better performance but don’t update regularly? 95% of the time it’s free performance waiting to be downloaded, along with a lot of bug fixes of the graphical kind; they are very, very beneficial. Some drivers can offer massive FPS gains for new games, which NVIDIA (and I believe AMD now) release when a new game is out. The GTA 5 drivers, for instance, offered substantial improvements. In some games it can be as low as a 5% average FPS improvement; in others it can be as high as 20%, with fixes to noticeable problems like stuttering.
You might not tweak settings here and there, and that’s fine, neither do I unless it’s something you have to fix. But not updating drivers for new games is just bonkers.
I have GeForce Experience running in the background, so when there’s a new driver it prompts me. Like I said, three clicks. When I was running an AMD GPU, much the same thing would happen.
Yes, it’s free performance. No I don’t miss out on it, because upgrading drivers is no longer a hard thing to do.
Actually, it’s prompting me now.
Not just marginally better graphics (sometimes not marginal at all) but VASTLY superior performance and frame rate. Batman is an entirely different game series graphically with PhysX on. And clicking three or four times to download a driver once a month, if that, is soooo hard?

Technically, PC glitches get fixed faster if a game has modding, and generally the consoles have the same ones, and vice versa for the most part. Then there’s the vastly extended life of games with mods. And sometimes (and this will only grow as the years go by, since the current consoles are already tapped out performance-wise) the PC versions aren’t just a little better but vastly better, with resolution, frame rate and anti-aliasing being the most important and predominant differences.

What people sometimes forget is that you don’t HAVE to download every driver that comes out, you don’t have to twiddle with graphics settings (every game on the planet has an auto-detect), and you have the choice of keyboard/mouse or controller. Most importantly, PC has mods. Until the new Deus Ex (the original, redone by fans) came out, I was playing an essentially brand-new Deus Ex original for free, with new graphics and systems.

It’s really not as complicated as it sounds, and like someone said below, comparing systems, tweaking, seeing what you can get out of a rig performance-wise, upgrading it and so on is actually strangely addictive. I highly doubt you were much a part of the “master race” if you defected, because the benefits are immense. I’ve never understood “I don’t have 5 minutes before playing a 200-hour game to tweak it or download a driver.”
well, from what I’ve experienced in ps4, I have to say gameplay really sucks. Can’t even aim properly. I will prob sell the ps4 game.
Man, that sucks. Sorry to hear that. Maybe they should have really tuned the graphics for the console crowd so the thing would run at least decently. Nobody cares about graphics if you get 15 FPS. I get a solid 60 FPS at 1080p, but it’s on a computer I built to run 4K, which it will do perfectly for most other games.
It’s a self fulfilling prophecy:
1) nVidia writes Gameworks to favor their hardware
2) Lazy developers use their crappy, biased APIs
3) Performance in those games is, predictably, crap
4) nVidia users point to the performance results and use it to justify their purchase choice
There are enough games out there now running Mantle and DX12 that it’s clear that nVidia and AMD are effectively the same, when you compare price/performance. That sort of makes sense: they are competing against each other, and that competition keeps them very close to each other. Each has strong areas and weak areas, but over-all, objective analysis suggests that the margin between them is in single digit percentages.
Battlefront is pretty obvious: GTX 980 Ti 43/81/114 FPS — Fury X 45/81/109 FPS
Seeing as the GTX 980 Ti used in the benchmark I quoted was $45 more than the Fury X, you’re actually paying $45 extra for that 4% peak performance boost, while copping some extra stutter at the low end, at the same 4% margin.
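The percentages in this thread check out, give or take rounding. A quick sanity check using the Battlefront min/avg/max figures quoted above (the $45 price gap is the commenter’s figure, not an official price):

```python
# Min/avg/max FPS from the quoted Battlefront benchmark.
gtx_980ti = {"min": 43, "avg": 81, "max": 114}
fury_x = {"min": 45, "avg": 81, "max": 109}

def pct_gain(a, b):
    """Percentage by which a exceeds b."""
    return (a - b) / b * 100

peak_gain = pct_gain(gtx_980ti["max"], fury_x["max"])  # 980 Ti ahead at peak
min_gain = pct_gain(fury_x["min"], gtx_980ti["min"])   # Fury X ahead at the low end

print(f"980 Ti peak advantage: {peak_gain:.1f}%")       # ~4.6%
print(f"Fury X minimum-FPS advantage: {min_gain:.1f}%")  # ~4.7%
```

Both margins land in the same ballpark, which is what makes the "$45 for 4%" framing hold up.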
If everyone took your attitude there would only be nVidia, and then, very quickly, nVidia would just stop innovating altogether. Why would they need to if they didn’t have someone else in the market to force them to do better? Their cards have a pretty fixed life, and they know you can only come back to them for a replacement. They aren’t a charity, and the only thing keeping them honest is competition.
I, as an AMD owner, am thankful for nVidia. I used to love their cards and I am glad they are around to keep AMD from getting lazy. As I play UHD these days, AMD just pushes higher res better, so I made the switch. It’s one of those strengths I talked about earlier.
Bethesda are lazy. They prefer to focus on the ‘art’ aspects and very much see technology as a secondary concern. This is evident in Fallout 4: CrossFire AND SLI are totally broken right now, and Bethesda’s current responses on the subject sound very Arkham Knight-ish… Basically, they’re saying UHD and 4K gaming are too niche to fit into the $100 million they raked in from online sales alone in the first week…
Am I surprised that Fallout 4 is better on nVidia? Not at all: Gameworks is incredibly biased and Bethesda are lazy, especially in the areas of QA and optimization. Is that the hardware or drivers fault? Not at all. I am sure I could find a dozen games that are coded crappily and that penalize nVidia, and that would still be just as false a comparison.
I’ve only got an HD 5870, but it’s done me proud before; I don’t see why it won’t again. Also, I find those figures hard to believe. They seem doctored, to be honest, and I’m not an AMD fanboy. In fact I’m considering a 970 for my next card (if the HD 5870 can’t handle Fallout), although it might be out of my price range.
hey mate! I have an AMD HD 7970 and it runs 40+ fps with ULTRA settings. I’d expect you may need to drop texture quality (only 1GB GPU RAM) and maybe some other settings, but it should run ok.. if a little slow. benchmarks put the 7970 roughly 2x as fast as the 5870, so might be time for an upgrade (my 7970 cost me around $130 a year ago).
You have no idea how happy you just made me.
Is that the 3GB OC version?
nope.. the old 2GB stock clock version – specifically 1100MHz clock and 1200MHz memory model.
I can advocate for the 970. I have one and whilst it will never blow you away (i.e. ultra setting at 4k) it will certainly run every game for the foreseeable future at a good frame rate at 1080p.
Yeah, but I reckon it’s too pricey for me sadly
No card will blow you away at ultra 4K for the majority of games. The 980 is their latest GPU; a 970 is the exact same chip but with a few things disabled. 980s require a premium level of manufacturing, and some of the chips that don’t make the cut get turned into 970s… TVs are the same: a manufacturer will pump out panels for, say, LG, where the top-of-the-line ones require an A+ rating (as an example), and the ones that score lower, say an A-, still need to pass a quality check but are often used in cheaper products.
So what does that mean for GPUs and gaming? While there are very tangible differences between cards, it’s never such a large gap that one card blows the other completely out of the water. To play the latest games at 1080p 60 FPS on high to ultra, a 970 is plenty; the cost rises exponentially after that. A 980 costs about 30% more but only offers 20% more performance (both cards not overclocked). So to run a game at 4K comfortably, you’re either going to have to fork out thousands for an SLI setup and pray to god the game you want to play is (1) configured and (2) optimized for SLI/CrossFire, or wait until a single card can do the job. The only thing that bucks this trend is the Titan cards, which tend to be about a generation ahead in terms of speed, but come in at a whopping $1,500+.
So when buying a card, remember nothing is future-proof. Saving for a 980 over a 970 won’t magically give you an extra couple of years of not having to upgrade. I personally believe you’re better off buying, say, a 960 every 2 years, if that’s what you can afford, than a 980 every 4+.
Did you get a chance to try Fallout 4 with that graphics card yet? I’m literally in the process of getting the game and HAVE that graphics card, and I’m quite concerned it’s not gonna be too fun since it’s way below these statistics and many others… BUT, MAYBE.
This is why I play on consoles… You put the game in. Then you play it!
At below 20 frames per second in a lot of places, in Fallout 4’s case. The PS4 and Xbox versions are unplayable in spots, going by the current editorial buzz.
Yeah because games journalism is so relevant to actual gaming.
If it’s unplayable on console, how am I ten hours in?
The performance problems are so bad in heavy combat that Giant Bomb knocked it down from 4/5 to 3/5 for the console releases leaving only the PC version with the original 4/5.
Oh, god. Grow the hell up, you’re embarrassing yourself. It’s so weird to read someone ignorant enough to believe their reading of a review trumps the personal experiences of many, or vice versa. Accept it: this dude didn’t encounter problems. Stop embarrassing yourself. You don’t need to try to legitimise your point with an irrelevant straw-man mentality; be secure enough to accept differences in experience.
wow, someone needs a break
One person not encountering a problem does not mean there are no problems. Korwin is referring to a demonstration showing 0 frames per second on the Xbone. Surely you can be “secure enough” to accept it’s not flawless?
Pitfalls of the Gen Z gamers: they can’t experience the games themselves and have the positive experiences; they feed off the negativity of websites and YouTubers who will say anything negative about a big game for the clicks.
For the rest of us, we just play and make up our own minds. 7 hours into the PS4 version and so far no ‘unplayable’ elements – oh, except for the one lock in the Overseer’s room I couldn’t break because I didn’t have the right perk for it.
Gen Z gamer? Ewww.
Is there a Gen Z yet? I’m losing track.
Millennials
I’m simply making the point that it isn’t all roses for everyone, regardless of platform. Jaded started from the tired assertion that the game just works because “consoles”; this is demonstrably not true. Remember the shambles that was the PS3 version of Skyrim?
If he can get by with single digit frame rates when things get going then more power to him, but there are going to be great swaths of the audience who aren’t going to find that acceptable.
It’s all roses for me! I’m not playing it!
What’s heavy? I had my base invaded and it was chaos, with all the friendly and enemy NPCs fighting it out with me and Dogmeat.
The only time the action slowed was during V.A.T.S, for obvious reasons.
Had one crash, and the only bugs I’ve had are rare awkward pauses during some conversations and a strange little red light I found floating in the water plant.
Similar situations from what I’ve heard; however, the environment the chaos is happening in can count for a lot. Eurogamer had the PS4 version scraping 20 FPS in some fights in the city environment.
I’m out the second I see stutter and frame drops. I want the best graphics, completely smooth, or I can’t concentrate on the game.
Correction: you put in a disc and have to wait hours for the game to download, whereas on PC I can up the bandwidth on my downloads (which makes it go faster). Plug and play my ass.
With day 1 patches becoming so prevalent on consoles that argument unfortunately hasn’t had much merit for a while now.
I’m trying to think of the last time I had to check a list to make sure my console’s graphic card could run a game.
And, software gets patched on all platforms. Your point?
My point is that consoles are no longer a case of putting the game in and just playing, particularly with big titles. Console gamers get forced to jump through as many shitty hoops as PC gamers these days.
Let’s all revel in our equally shitty UX!
So true. I miss when developers had to make sure a game was as close to perfect as possible, before they could rely on the internet for patching things later.
Day 1 patches are included in the digital versions. Just start the game. So you’re right about disc versions, just not about the digital console market.
I haven’t bothered firing up Fallout 4 on my PS4 yet – my old PC runs it fine, and apparently there are long loading times on consoles. My PC’s SSD loads save games in about 10 seconds.
Not getting any long load times on the PS4.
To be fair, going from the warehouse in (Covales?? – where all the Raiders are, in what looks like a car manufacturing warehouse) back to the Commonwealth had a load in excess of 40 seconds.
On a side note – how awesome is the Pip-Boy app??
Very awesome. You can rotate the camera to wherever you want to change gear and see how it looks, and you can turn the radio off after dialogue starts.
OK, the thing is GODRAYS slow things down – specifically, it’s the way NVIDIA handles volumetric lighting. So turn the GODRAYS option down from Ultra to Medium and you’ll notice a good 10%+ framerate difference. I have an AMD HD 7970 (equivalent to an R7 270 I think), and with all other settings at ULTRA (except depth of field, which I have set to LOW, as I hate that effect) I get 40+ frames per second with an old GPU, an Intel i5-2500K CPU and old AMD drivers, version 14.1!
I’m pretty sure there was a workaround for reducing the tessellation in The Witcher 3 when using HairWorks, and the result was better performance on AMD hardware than NVIDIA. Maybe it can also be applied in this case? I run NVIDIA hardware, but I’m not a fan of GameWorks due to how it segments the PC player base. I was hoping AMD would claw back some market share, but sadly this didn’t end up happening.
The performance problems stem from the NVIDIA GameWorks advanced godray tech, which uses ludicrously high levels of tessellation to generate the effect. It has a significantly lower impact on performance on NVIDIA’s hardware vs AMD’s; however, it’s still a needlessly heavy hit on both. Team green is perfectly happy to slightly hobble their own performance provided the effect on the competition is more dramatic.
Yeah, no joke. On the site linked in the article you can see the difference with godrays turned on and off. Turning them off nets a >20 FPS increase at 1080 and 1440, and around 10-15 FPS at 4K. It looks good in the pictures, but I sure as hell won’t miss it.
That’s an especially negative way of stating the fact that Nvidia cards are much better at tessellation than AMD cards, and have been for generations now. AMD cards likewise are much better at compute tasks than Nvidia cards. It’s not a grand conspiracy to cut out the competition, it’s writing features in their library that work well with their hardware.
The better response to that isn’t “Nvidia are douches for using high tessellation in their library” or “AMD are douches for using high compute in their library”, the better response is “AMD should improve their tessellation support” and “Nvidia should improve their compute support”.
That’s the thing, though. If it’s godrays via GameWorks, then that’s an NVIDIA proprietary function, meaning whatever it’s doing is designed specifically for NVIDIA cards. In other words, no matter how good or bad AMD cards are at tessellation, they’re practically guaranteed to perform poorly with these APIs.
The Gameworks source code is proprietary but the way it interacts with hardware is through standard DirectX API calls. It does feature detection to pick up what shader model and sub-features are supported by the hardware but it’s all above board through standard DirectX calls. Once the DX layer is hit, everything is device-agnostic and as long as the DX hardware features are supported and/or the supported shader model is the same, the actual calls that go from DX to the GPU are the same whether you’re using Nvidia or AMD hardware.
It’s the opposite of what you just described: the performance is entirely down to how good the hardware is at responding to the calls being sent from the DX layer. Naturally Nvidia is going to request calls that play to its strengths (like heavy tessellation use) but there’s nothing stopping AMD from making a card that responds as well, if not better. And in the case of DirectCompute, that’s exactly what happens.
Edit: Just wanted to add, the GameWorks source code is available from Nvidia through a standardised licence and has been for the last 18 months or so. It isn’t free, but it is complete, and any company can buy a licence. If there were code segments designed to intentionally cripple AMD performance, they would almost certainly have been leaked by now.
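The vendor-neutral dispatch described above can be sketched in a few lines. This is a hedged illustration, not GameWorks code, and every name in it is hypothetical: the library inspects advertised capabilities rather than vendor identity, and two cards reporting the same features take the same code path. The performance difference then comes entirely from how each GPU executes those identical calls.

```python
# Hypothetical sketch of capability-based dispatch (all names invented):
# choose a render path from advertised features, never from vendor ID.
def pick_godray_path(caps):
    """Return a render path based on reported capabilities."""
    if caps.get("shader_model", 0) >= 5 and caps.get("tessellation"):
        return "tessellated_godrays"   # heavy path, favours strong tessellators
    return "screen_space_godrays"      # lighter fallback path

# Two hypothetical cards advertising the same features get the same path:
nvidia_like = {"shader_model": 5, "tessellation": True}
amd_like = {"shader_model": 5, "tessellation": True}
print(pick_godray_path(nvidia_like) == pick_godray_path(amd_like))  # True
```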
That’s very interesting, and I didn’t know these things. The results are the only thing I’m able to go by. I don’t know as much about GameWorks; I’ve mostly encountered this aberration between AMD and Nvidia with PhysX support, where there’s definitely a case for proprietary stonewalling.
This bit is what sticks out though:
I know you can’t answer this, but why the hell haven’t AMD made such a solution if it’s that easy? I mean, the specifics of these articles, and the resulting AMD/Nvidia shitfights both here and elsewhere, are 100% because of these systems resulting in benchmarks that wholly favour an entire brand rather than individual cards. It would be a no-brainer to develop products that smash the opposition on their own turf.
Processor design isn’t something I’m that well versed in (software’s my side of the coin) but from what I understand it’s always a balancing act of priorities in the space available. It’s expensive (or at times impossible) to build something that’s good at everything, so compromises and trade-offs are made to focus power on some areas at the cost of others.
Obviously the actual designs and priorities each manufacturer deals with are trade secrets and advances in some tech areas mean some cards are better at some things than others. For example, AMD’s new memory system seems promising (but also buggy) and Nvidia has always excelled at compression. Those are things the other side can’t necessarily just design to beat, not easily at any rate and usually not without sacrificing something else in the process.
Educate yourself:
https://np.reddit.com/r/pcmasterrace/comments/3s5r4d/is_nvidia_sabotaging_performance_for_no_visual/cwukpuc
(that writeup has sources and other comments too). The TL;DR is Nvidia bribes studios to use GameWorks, then skews its hardware to perform better for almost no benefit to the user. Articles like this just help their cause.
It might help if you read the comments you linked to. They discuss the rivalry between Intel and AMD in the CPU market, not Nvidia and AMD in the GPU market.
The ‘Nvidia bribes studios’ claim comes up a lot and has yet to be proven. It’s been rejected by Nvidia and rejected by game developers that work with both Nvidia and AMD. As best as I can find, not even AMD made that claim, its origins seem to be firmly in fan conspiracy territory.
What AMD did accuse Nvidia of was preventing game developers who use Gameworks from being able to liaise with AMD to do optimisations. Nvidia rejected the claim, and a number of game developers who used Gameworks came out publicly and said they were never prevented from working with AMD, but when they tried to reach out to AMD for support they got very little in response, and what they did get was half-hearted. They described Nvidia’s help, on the other hand, as prompt and useful, right from the early stages of development.
They are without a doubt better when it comes to tessellation performance, that’s not in dispute. The problem is the gameworks feature leans unnecessarily hard into tessellation to the point where large portions of performance are burned for no real world gain.
There was a very similar problem when The Witcher 3 released, if you recall, with Nvidia’s HairWorks and AMD calling them out for ‘rigging’ the game (or something to that effect). I can’t recall specifically when, but AMD did step up their game and released some updates to Catalyst to help with the issue at the time.
It will most likely happen again in this situation.
The original HairWorks implementation in that game had essentially the same problem. AMD had a workaround in which, at the driver level, you could artificially limit the tessellation factor to bring performance back to a healthy level.
Eventually CDPR put out a patch where some of the hair stuff could be tweaked in game. Not that it really mattered at any rate since the hair didn’t really add much value even when optimized, I just elected to play with it turned off to get a smoother experience (and that’s on a 980 Ti).
Hairworks is a pretty nice library, I’ve played with it myself. The problem with Witcher 3 was it was left at 64x tessellation which is intended for demos, not production releases. It should have been defaulted to 8x, which from memory shaved off something like 85% of the performance hit.
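The 64x-versus-8x point is worth making concrete. As a rough model (not HairWorks’ actual numbers), the geometry generated by a tessellated patch grows roughly with the square of the tessellation factor, so a demo-level factor produces vastly more triangles than a shipping-level one for little visible gain:

```python
# Rough illustration only: assume each base triangle subdivides into
# ~factor^2 triangles under uniform tessellation.
def approx_triangles(base_tris, factor):
    """Very rough model of tessellated triangle count."""
    return base_tris * factor ** 2

demo = approx_triangles(1000, 64)      # demo-grade factor
shipping = approx_triangles(1000, 8)   # shipping-grade factor
print(f"64x produces ~{demo // shipping}x the geometry of 8x")
```

Under this simple quadratic assumption, 64x generates about 64 times the geometry of 8x, which is consistent with the claim that backing the factor off removes the vast majority of the performance hit.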
There’s no question that Hairworks is a harder hitter than TressFX, but there’s also good reason. TressFX is basically just a compute shader, which plays to AMD’s strengths, but the implementation is fairly lightweight and doesn’t handle physics interactions all that well. Hairworks uses tessellation instead, which has better scaling and gives more reliable results but costs more processing power.
I dispute the “unnecessarily hard” part. Tessellation is a critical feature, I’d argue that tessellation was the single most important advancement in DirectX in the last three major versions. Nvidia absolutely should be pushing it where possible and AMD should be making every effort to not lag behind. Hopefully AMD’s DX12 cards will bring them up to speed in that respect.
I’m not saying it shouldn’t be used; I’m a huge proponent of it (it does wonders for smoothing out rounded surfaces, stops character models from having pixie points in their ears, etc). There is, however, a point where excessively over-tessellating something nets no real-world benefit from a visual perspective but comes at an excessively high performance cost, which is what is happening here. Nvidia set the factor so high in a lot of instances that they end up generating sub-pixel-level geometry, which is just straight-up crazy.
When you have a horsepower budget, it doesn’t make sense to blow half of it on something that’s invisible.
Tessellation is a scalable feature the same as shadow quality or anisotropic filtering passes. It’s not like Gameworks hard-codes the tessellation rates either, they’re all configurable. The ideal solution in my opinion is to allow tessellation rates to be controlled by the user with an in-game slider, which is possible now if developers would just include it. Failing that it should be controllable through the drivers, which AMD allows and Nvidia doesn’t.
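The in-game slider suggested above amounts to a simple clamp. A minimal sketch, with entirely hypothetical preset values: the user picks a preset, and whatever factor the effects library requests gets capped at that level, much like the driver-level override AMD exposes:

```python
# Hypothetical preset-to-cap mapping; real values would be tuned per game.
TESS_PRESETS = {"low": 4, "medium": 8, "high": 16, "ultra": 64}

def clamp_tess_factor(requested, preset):
    """Hold a library-requested tessellation factor to the user's preset cap."""
    return min(requested, TESS_PRESETS[preset])

# A library asking for a demo-grade 64x gets held to the user's choice:
print(clamp_tess_factor(64, "medium"))  # 8
```

The appeal of doing this in-game rather than in the driver is that the developer can pick sensible caps per effect instead of one global override.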
I’ve been using an i5 3570K and an R9 280X and have been running everything maxed at 1080p, and I’ve been very happy with performance (mind you, I’m happy as long as I get 30 FPS).
Wow that’s pretty much my exact same set up. I’m surprised my 280X is doing so well
Let’s all remember that this is in no way AMD’s fault. It’s Nvidia’s scummy anti-competitive business practices that are to blame.
I’m using an R9 280X and it’s doing surprisingly well. I did dip a bit to the 20 FPS mark at one point outside Diamond City, but at all other times I haven’t noticed anything (the game auto-detected ultra settings and I haven’t really looked into it since).
Gotta get dem max fps’s so the game runs faster and you can finish it quicker
Screw you, Nvidia! I know it’s just conspiracy-theorist crap, but it’s been way too common a trend to see AMD failing on specifics where ‘the way it’s meant to be played’ content has been pressed into games. Features that clearly don’t require such special processing, that wouldn’t tax ANY video card more than expected, and yet cripple AMD cards so consistently.
I’ve had 3 AMD/Radeon cards in my years of PC gaming. All three of those cards needed extra heatsinks on them to run games they were ‘optimized’ for. One of those cards had water cooling. All 3 of those cards fried themselves.
2 nVidia cards so far (a GeForce 760 Ti and a 970) and they’ve been perfect. nVidia *does* have the step up over AMD, and they *have* had it for the last 4 years at the least.
I have had similar experiences. The only cards I’ve had that broke down were AMD and for a long time I was a Nvidia die-hard. My greatest purchase ever was a 8800GT – a fantastic quality card that I still have running in one of my PCs, still surprising me with its performance on games I never thought it would run and was the best value for money ever.
I kept hearing reports like this about how X game component ran poorly on AMD, or that their cards ran hot, or that their drivers were poor (still are, I suppose), so I stuck with Nvidia.
Until I got a hand-me-down AMD card that was, at the time, more powerful than my Nvidia unit. Then I started to take real note of the instances where X game ran worse, and everyone’s limp excuse was that ‘it’s AMD’. Why is this acceptable? Why is it across the board? There’s just no excuse for how specific game components in different games, completely unrelated to any limitations in the card or its drivers, would consistently fail on ALL AMD cards regardless of the hardware differences between them.
Of course, I was happy to accept the loss years ago. But now? Look at the current line-up. Just shit. Neither manufacturer is making good-value cards; both have drawbacks that shouldn’t exist (the 970’s memory). Their idea of a ‘refresh’ was to make slightly better versions of the high-end cards: the 980 Ti! The Ti series is meant to be power for value, which has historically been in the mid-range $350 price bracket.
So my hand’s been forced. Nvidia has NOT had the step up over AMD for the last 4 years; just look at the benchmarks, where the two have been trading blows pretty evenly. Nvidia’s advantage comes from practices and company attitudes I can’t condone: proprietary, locked-down game components. PhysX, GameWorks etc. are not only exclusive bullshit for their cards, but it’s pretty clear from benchmarks that attempting to use them with the competing cards is stonewalled.
TBH, it’s the fault of the developer. Back in 2009, when we were playing Oblivion, the same-gen graphics cards (Nvidia 200 series) should have run that game. They didn’t. It was badly coded.
You seem confused.
You are not telling me the engine needs that much overhead; 250 > 970 is a lot of power.
Since when has this been true? Games have been mainly designed for consoles for years now, barring some rare PC exclusives or a handful of companies that actually do it right.
Because potatoes will always be outpowered by PCs.
Still no SLI support, although I’m getting mid-50s at 4K with a single overclocked 980 Ti, and with G-Sync it looks spectacular! Can’t wait for the modding community to make it even better!
PC is the lead platform? Surely you’re havin’ a laff. The game is built from the ground up to run no faster than 60 FPS (i.e. the max you’ll get on a console). Higher than 60 FPS and all sorts of things start going wrong; it’s where most of the glitches are coming from. I was using G-Sync with it originally on my 980 Ti and hitting a solid 144 FPS, and I couldn’t even get out of the starting vault because every time I used a terminal my game bugged out. Other people are reporting that at 110+ FPS you can’t even get into power armour anymore.
After some googling last night and finding other people with the same problem, it turns out the 'solution' is manually locking your frames to a lower value, and then the game behaves properly. Playing in borderless windowed mode also fixes it, but that's just because in borderless windowed mode G-Sync was no longer active for me and it was being capped at 60fps anyway. The thing I discovered when changing from 144fps down to 60fps, though, wasn't just that it fixed the terminal glitch: it completely changed the fundamental mechanics of the game. At 144fps I ran faster, mouse sensitivity was higher and twitchier, my jump animation played quicker (so my jumps were shorter and not as high), and attack animations played quicker. I'd imagine it would also play havoc with any sort of in-game timers, buff timers, etc. too.
Made this video to compare the difference between 60fps and 144fps just in the starter vault bit. Seriously Bethesda, what the hell.
https://youtu.be/87Ucmfhh5WE
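For anyone wanting to try the cap themselves, the commonly reported way to limit Fallout 4's frame rate is the `iPresentInterval` setting. The file location and key name below are as reported by the community rather than anything official, so verify them against your own install:

```ini
; Fallout4Prefs.ini (Documents\My Games\Fallout4)
; Key name as commonly reported for Fallout 4's engine; check your own file.
[Display]
iPresentInterval=1   ; 1 = v-sync on (caps FPS at your refresh rate), 0 = uncapped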
I only JUST fixed this issue last night, after about 12 hours of gameplay. So freaking frustrating. Agreed that it's not the lead platform. It's easily the most powerful of the three, but the casual, easier-to-obtain consoles make up the bulk of the market for Beth.
Though that doesn't excuse the fact that they should have been checking this out. I changed back to 144fps after I got past the terminal I was freezing on. It is a HUGE difference, and I like it.
I wonder if FO4SE (Script Extender) would fix this?
You know that Fallout 4 and most games run at 30fps and dip to the low 20s on current-gen consoles, don't you? Same with The Witcher 3 and many other games. Why do you need 144fps? I run this game on ultra at 50-60fps (mostly 60) in 1440p on an AMD... yes, an AMD 390X, people! You don't really need much more than that, or else I'd get a 144Hz monitor, but it's pointless imo. btw, this post is BS.
Can’t play any more until next patch, as trying to load any of my saves crashes the game.
Bit of a kick in the guts, but it’s Bethesda.. Maybe it’s my fault for being an Xboner.
I don't know why PC gamers get all defensive about having to manually update stuff and tweak settings. Console gamers explain that's why they love their console (put the disc in and play), and then PC gamers feel they need to justify the PC platform by trying to beat consoles in that area, claiming they hardly ever update, never have driver issues, or never need to tweak anything.
I'm sorry, but you do have to update your drivers for almost every major release there is; it's free performance. And things do go wrong from time to time. That's to be expected when the variations in hardware are astronomical, unlike consoles, which are all built the same. You also have to tweak settings, because sometimes mouse acceleration is forced and requires a config edit, or the FPS is locked at 30 with Vsync on and you have to disable it in-game and force it through the control panel. So what?
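As a concrete example of the kind of config edit being described, disabling Fallout 4's forced mouse acceleration is commonly reported as an ini tweak. The section and key names below come from community reports, not official documentation, so check your own file:

```ini
; Fallout4.ini (Documents\My Games\Fallout4)
; Section/key as commonly reported for Fallout 4; back up the file first.
[Controls]
bMouseAcceleration=0   ; 0 = disable forced mouse acceleration
```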
Great. We both love our platforms and love the game. Console works right out of the box with lower performance and PC runs with better graphic fidelity and higher frames but often requires legwork with tweaking a setting or downloading a new driver before you play.
Look at that, two points of view that are both right and the world didn’t explode.
Well said. Updates and maintenance are just a fact of life with gaming these days; hell, I even have to update and restart my 3DS. Yes, those of us playing on PC have to go through driver updates and whatnot, and our return for that effort is a measurable increase in graphical fidelity and performance versus consoles, assuming the devs did their job. Having said that, one of the reasons I've chosen Nvidia cards over AMD is that maintaining their hardware is so much less of a bother: you know updates will be rolled out in a timely fashion and you know they're going to work, whereas I can't say the same for AMD, at least in my experience. So yeah, that's partially why this discussion even exists.
Going to have to call bullshit on those benchmarks. My old 9770 is getting 60 frames at 1080p with the high preset and all the distance sliders at max. No drops, no slowdowns.
Well, they did test it on ultra.
Yeah fair enough, but the prevailing narrative everywhere is that it runs terribly on AMD. My pc has been brilliant for it. The only issues I’ve had have been the awful FOV and the ludicrous keyboard bindings.
Not surprised. I have it running on high settings with my GTX 560 1GB & haven’t had any stuttering, I did when I tried ultra though heh.
Well, you should be surprised, because it is BS. I find it hard to believe that card runs it on high, though. I run it on ultra at 1440p and on high at 4K on an AMD 390X, so they obviously have an anti-AMD agenda with this post.
I play it on my work laptop (they were nice enough to get me a Gigabyte gaming laptop because, despite its great specs, it was thousands less than an equivalent HP that we usually use), which has a GeForce 860M with 4GB VRAM, on a Core i7-4810MQ @ 2.8GHz with 16GB RAM, and it runs beautifully on Ultra at 1920×1080, connected by HDMI to a 24″ screen.
May not have the highest numbers for framerate nuts, no idea what actual framerate I get, but more than smooth enough to play well and it looks great.
*sigh* weak journalism.
Instead of touting how Nvidia have "done it again!", how about investigating the claims that Nvidia wiggle their way into a game's development cycle and then apply their proprietary VFX APIs, which effectively hobble the game for anyone who isn't using their tech?
That’s the real story here.
People often say Nvidia pay their way in, but it might be just a simple matter of resource management for the developers and publisher.
If a company says “we can do all this stuff for you so you don’t have to, and we’ll do it for free”, why WOULDN’T you let them do it? It frees up your resources to work on other stuff, thus costing you less money for a better product. Meanwhile, consumers keep paying the inflated prices on that first company’s product, so that company can continue to disservice them…
290s are fairly cheap now, $230-250, though you'll have to buy used since, of course, they don't manufacture them anymore. I see no point in buying a 390 (besides DX12, which the 290 supports at the same level, I believe) for the same price the 290 was when it came out 2-3 years ago. I have two 290s and they still work like a charm. When Crossfire doesn't work at release, or has issues, one card generally suffices at 1080p/60fps with the most demanding settings in any game turned down one notch.
I prefer my 8GB of video RAM, thank you lol. The 390X is the only card that can run games like Shadow of Mordor on ultra, as it needs 6GB of VRAM. When 390Xs get cheaper I will buy another, but I can even pair it with a 290X anyway 🙂
Instead of reporting what nvidia is doing to sabotage PC gamers’ experiences, the site chooses to just tell people to buy nvidia more. Further messing up the PC gaming situation.
Turn down one or two features and AMD performance is fine. Don't go buying an Nvidia GPU only to end up on the losing end in better-made games (e.g. Battlefront, Tomb Raider, Deus Ex etc.).
btw, the console version is also locked at 30fps max, which it struggles to hold at 1080p.
Well, this is bull. I am playing Fallout 4 on an MSI R9 390X and it runs great in 1080p, 1440p and even 4K (with settings reduced a bit in 4K), and I'm getting more than 30fps in 4K lol. btw, the console dips to 23fps in most places on low graphics settings lol.
Fury X user, here.
Nice shilling you've got there, Kotaku. You forgot to mention the abuse of over-tessellation and the usual GameWorks shenanigans used to hurt AMD performance (GameWorks hurts everyone; it's cancer), the absolutely abysmal optimisation all around, CPU bottlenecking with even the best i7s, and, perhaps MOST importantly, that AMD hasn't released game-ready drivers for FO4 yet (presumably because they have so much to un-fuck, and possibly because they're waiting on the Crimson software release).
Also, for much better performance: set God Rays to low (I literally cannot find a difference), override the tessellation level with Catalyst (TSAA is garbage), and run in borderless windowed mode (for some reason regular fullscreen hurts performance).
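For reference, the God Rays tweak mentioned here is commonly done via the in-game console rather than the launcher. The command below is as reported by the community, so treat it as unverified before relying on it:

```
~                ; open the in-game console
gr quality 0     ; god ray quality: 0 = low ... 3 = ultra (community-reported command)
```

The tessellation override itself lives in AMD's Catalyst Control Center 3D settings ("Override application settings" for tessellation mode), not in any game config file.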
Fury X getting perfect 60 FPS @ 2k (1440p) resolution with Vsync on. Very rarely drops.
I have a Radeon 7870, and have absolutely no problem running the game on High settings.
+1. I'm running the game on an MSI Radeon 7870 Hawk Edition with no issues, and the game is running very well on high settings. However…
However, I've followed up on some tips. The game has frame-rate issues on all graphics cards (including high-end ones), Nvidia and AMD alike, and there are a few settings with a huge impact on stable, comfortable gameplay and frame rate: mouse settings, draw distance, the shadow setting and God Rays have the most impact, and the game will run much better after tweaking those.
If we all played on low settings like the PS4 is limited to, no one on PC would have any problems. If you can stand crappy graphics, limited framerates, limited resolution and the inability to mod it, and you want to pay twice as much for it, go ahead. As someone who owns a PC and a PS4, I can't help but laugh at the idiocy of statements like this. Playing on PS4 looks horrible now; it's that outdated already. My 4-year-old laptop is capable of better graphics than that money dumpster. Oh, and playing online is free, the games are cheaper, you can mod your games, and there are literally a million more games on PC than on PS4. And did I mention that when playing online (for free) you also get matched with intelligent people, unlike the 5-year-olds and otherwise complete idiots on PS4? Which, to be honest, explains why they play on one over a PC. I haven't used my PS4 since BF4 came out; there's literally no benefit to doing so.
So yeah, keep thinking you're so much better off when you look at Fallout Nexus in a couple of months and see the stuff you're missing out on (which is also free), even though you paid twice as much for the game, all because you're playing on something inferior. How hard is it to realise that all PC gamers have to do to beat consoles these days is put the graphics on equal settings: low? It's a joke how deluded console fanboys are that they can't accept reality. You can literally equal a console on graphics with an £80 GPU; spend double that and you blow them out of the water. Now look at the cards above. These are only having trouble because there are no dedicated drivers out yet and they're running ultra settings, something the PS4 isn't even capable of. Look again at the numbers above: the PS4 is stuck at around 20fps on its shitty settings while standing in an empty corridor (see http://www.eurogamer.net/articles/digitalfoundry-2015-fallout-4-face-off).
So why are you glad you play on PS4, when your console can't even match the lowest card on this list despite also running inferior graphics?
Great game, runs very well on my MSI Hawk Radeon HD 7870 2GB !!!
Problems sorted for AMD cards, check this out!
http://venturebeat.com/2015/11/17/fallout-4-gets-performance-boost-on-amd-cards-thanks-to-new-drivers/