Everyone wants to play all games at the highest settings possible, and every time a game or a piece of hardware is tested, that’s what’s used: Very High, Ultra, Extreme or Crazy. But should that really be what everyone aims for?
2kliksphilip has a neat argument that, at least in 2017, Ultra settings no longer justify the performance hit the way they used to. Back in the days of the original Crysis, there was a substantial visual difference from High to Medium to Low.
But these days that’s not so true, with the difference between higher graphics presets often being an increase in post-processing filters (which come with a huge hit to performance) and not a great deal of improvement to textures, shadows, and lighting. And if you can play a game at a higher resolution with a decent frame rate, and you’re not sacrificing that much visual quality, it makes sense to drop Ultra or Very High for something easier on your system.
There’s a good comparison in the video with Battlefield 1, a game that scales incredibly well from the Low preset all the way up.
It’s interesting to consider, especially if you start looking at CPUs and graphics cards through a lens of what can be accomplished at slightly lower settings. Do you really need to spend $500 on a GPU if you can spend $300 and get acceptable frame rates at 1440p or 4K, compromises permitting? And should benchmarks focus more on the difference between presets, identifying which settings have the greatest impact on frame rate?
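If benchmarks did go that way, the analysis itself is simple enough. Here’s a rough sketch of what per-setting impact ranking could look like — every setting name and FPS figure below is invented purely for illustration:

```python
# Hypothetical per-setting benchmark analysis: given average FPS measured
# with each setting dropped one notch from Ultra (all names and numbers
# below are made up), rank settings by how much frame rate they cost.
baseline_ultra_fps = 52.0
fps_with_setting_reduced = {
    "shadows": 61.0,            # Ultra -> High
    "ambient_occlusion": 58.5,
    "post_processing": 57.0,
    "textures": 52.5,           # barely matters if VRAM suffices
}

def rank_by_impact(baseline, reduced):
    """Return (setting, % FPS gained by lowering it), biggest gain first."""
    gains = {
        name: round((fps - baseline) / baseline * 100, 1)
        for name, fps in reduced.items()
    }
    return sorted(gains.items(), key=lambda kv: kv[1], reverse=True)

for name, gain in rank_by_impact(baseline_ultra_fps, fps_with_setting_reduced):
    print(f"{name}: +{gain}% FPS from dropping one notch")
```

The interesting output isn’t the absolute frame rate but the ranking — it tells you which single notch to drop first for the biggest win.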
Comments
35 responses to “An Argument Against Ultra Settings In Games”
Absolutely.
And this point sort of touches on why everyone strives to play on ultra/max settings. Because having to go through the settings yourself and finding the best way to maximise your performance based on every different iteration is often a painstakingly slow process which takes away from just playing the game.
If you can put everything on max and carry on with playing the game, it allows you to put more time into just enjoying the game itself.
I don’t know about everyone else, but having the little debate in my head over compromises over graphics settings is a huge distraction for me.
This is where stuff like GeForce Experience is good for a lot of people. It looks at your hardware and optimizes the game settings to get the most out of your card. Saves you the hassle of having to play around with settings to find the sweet spot. Often you can enable or turn up one or two of the settings and still get decent performance, but sometimes it nails it bang on.
I found it very helpful when I tried to play The Witcher 2. I started playing it on my old system, set everything to the highest, and it ran at like 10fps. I used GeForce Experience to optimize it, and the only setting it changed was turning off SSAO — the game ran at over 60fps!
Amen to that, I’m actually so glad for Nvidia’s inbuilt solutions. It’s great even just for giving you a ballpark setting, which you can tweak if you want. Everyone complains about the mandatory Nvidia sign-ups and extra software, but it’s bloody great.
Totally agree. I don’t know what half these settings do (always appreciate a game that offers a description for each setting when you highlight it – shoutout to Elite Dangerous), so have little idea what I should be fiddling with or looking for when I change it. I just want the damn thing to run as fast and smoothly as possible without turning everything down or off.
The worst for me is different AA types.
I’m fairly well travelled as far as PC gaming goes, but I still get confused with all the types of AA. On top of that, different companies have their own types of AA, or even their own names for the same types of AA.
It’s at the point where my general rule is: MSAA if it runs well, otherwise no AA. Most non-MSAA solutions produce blurry messes anyway.
Ambient occlusion is another one with a lot of variants: SSAO, HBAO, etc. It’s usually the first one to go for me, as it’s particularly resource hungry for not a huge amount of extra bling.
Yeah, those things are all gobbledegook to me. Something describing what kind of performance hit each one takes would be nice.
That’s actually one of the few things I think the PC port of Arkham Knight did really well. Each setting had a description of what it did and explained the expected impact it would have, as well as having a visual indicator of how hard your GPU was being hit.
NVIDIA’s 3D settings explain what each feature does if you click on it. What I don’t understand is that GeForce Experience puts The Witcher 3 at mostly low settings for me, but I can run it fine completely maxed out. I don’t really notice a difference in performance, but I do notice a huge difference in quality.

It really depends on the game and how fast paced it is / how much detail is in the game. For a game like Black Ops 4 that’s competitive and where performance matters, I will adjust settings in favor of better performance over better eye candy, especially since you’re going to be moving so fast that you’ll hardly notice the difference. A game like The Witcher, on the other hand, where I like to take my time and really soak in the beauty of the game, I’ll set to whatever gives me the best graphics possible. Same thing for Destiny 2 — even though it’s a fast-paced game, it’s not as detail intensive as a game like The Witcher, so I can get away with bumping up graphics on that too without seeing any performance issues.

NVIDIA’s control panel allows you to set different settings for different games, which is super convenient instead of just adjusting your global settings and leaving it like that for everything. This also allows you to save on power and resources when you’re not gaming or when you’re playing a game that doesn’t need MAXIMUM EFFORT. <— See what I did there? Anywho, it can be fun tweaking the settings and seeing the differences in game. Some nights I’ll actually spend a lot of time tweaking settings to see what gives me the most bang for my computer’s buck.
Nice necro. For me that’s the opposite of fun, it’s tedious timewasting keeping me from having *actual* fun – ie playing the damn game.
Isn’t that the point of the Nvidia experience? You click optimize and off you go with the best settings for your card. The rest is left to personal preference (like some prefer to take a hit on terrain detail over AA).
Even though I have a 1080ti, and I can play stuff like Unreal Tournament and Doom at 4K with Ultra settings at over 100fps, I actually don’t like all the Ultra settings in most games. I tend to dislike stuff like motion blur, and some of the higher AA settings. Some games seem to do DoF in an ugly way too, so depending on the game, I often turn those kind of settings off. AA isn’t really all that noticeable at 4K for the most part anyway, but things like motion blur, I find takes away from the visual beauty of most games, especially if they have multiple settings (low, high etc.) For me, setting them at the higher end just makes shit look ugly. I like crispy clear graphics, and turning on motion blur often just makes it look ugly/worse. It does depend on the game though. I tend to tackle it game by game, but I rarely find I like the look of all settings being on ultra.
Me too! And there I was thinking everyone had moved on to UT2004 and Doom 2 😀
I’m only half joking… my GPU is pretty much an actual potato.
Lol… with a couple of wires stuck in? And a face drawn on? 😉
You had to pay extra for the face, mine is just the stock model.
Hahaha I’m a PS4/Mac owner. At least you can name, and identify, said potato.
Motion blur and depth of field shouldn’t even be in the ultra preset, they’re both techniques for increasing performance at the cost of visual quality. I don’t count them when I look at what would constitute ‘ultra quality’, I always switch them off if they’re present. Also film grain.
heh yeah, same! It’s weird how some games categorize certain effects. I re-installed The Witcher 2 yesterday to play through it at last, and it’s got some odd settings. Even setting the preset to Ultra didn’t actually enable all the options, and strangely set a lot of things to High despite there actually being an Ultra option available! I honestly have no idea why people would want effects like film grain, though. In a cutscene in a game like LA Noire I could understand it, but why anyone would want to muddy up their picture with useless crap like that during gameplay is beyond me!
I don’t really see options like motion blur as being something for visual quality. DoF I kind of get, but again, in limited scenarios. For the most part, (especially motion blur) they just make stuff look messy/muddy to me 🙂
Haven’t played PC games in a while, but the better reviews always did offer a range of tests, such as graphs for Medium/High/Ultra at a few resolutions — 720p, 1080p and I guess now 4K. If they don’t use graphs in an explicit benchmark, I find most reviews worth their salt will mention trying lower graphics settings if the max ones didn’t perform well, and mention how the game looks and performs at those settings.
My argument about ultra settings is that the majority of games are such badly optimised ports of console games that, simply put, there is very marginal gain for such a performance hit.
Does anyone remember No Man’s Sky? I was running a Maxwell Titan X in my previous machine and it still looked and ran like a dog turd — 45fps-ish. But look at Overwatch, a game that was built with PC as the primary focus and then ported to console: it ran beautifully on max settings at 160-ish fps.
Nowadays I strive for max settings only on games I know will be a visual treat. On competitive games or ports I just aim for max performance, as a lot of the post-processing effects in competitive games detract from the gameplay.
Also, having a 240Hz screen for competitive games, the advantage of the extra frames far outweighs the game looking a little prettier.
The biggest issue for me are the presets and the auto-detect features in some games.
With presets, some games in Ultra don’t give you Ultra settings. An example of this is Fallout 4, where God Rays are set to High and not Ultra. Another example is Forza Horizon 3, where Ultra gives you 4x MSAA, but the highest possible setting for that option is in fact 8x.
Then there are the games that auto-detect graphics for you, which for some reason always set things so low that I get a smooth 200+ FPS and the game looks like crap.
heh, I find this a lot too. Even now with my 1080 Ti: I loaded up a game the other day, it auto-detected settings during the initial load and set everything to High, even though there were higher settings available in almost all categories!
I don’t see the fact that there is little noticeable difference between, say, “high” and “ultra” as a bad thing. Nor do I see a problem with sites testing an “ultra” setting in benchmarks. Yes, there are good points about tweaking settings slightly to get more performance at the expense of barely noticeable visual degradation, but why not shoot for the highest settings?
I love the fact that thanks to hardware improvements some of the games from a couple years ago are now playable at absolute max settings, in fact some of them could go with HQ packs to improve them even further. And in benchmarking terms it’s good to have a title that lasts a long time in benchmarks so you can see how iterations of hardware have improved performance over time.
Personally, I always switch to Ultra settings but drop anti-aliasing immediately, since I run at 4K and I don’t find it noticeable anymore. After that I’d rather play the game and see if it’s smooth before turning anything else down.
Sometimes the highest settings are kinda overkill though, especially when it comes to something like shadow resolution (High is 2048 and Ultra is 16,384) or tessellation (High is 16x and Ultra is 256x).
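To put rough numbers on that shadow example — a back-of-envelope sketch assuming a single square shadow map with a 32-bit depth format (real engines use cascades and varying formats, so this is illustrative only):

```python
# Back-of-envelope memory cost of the shadow resolutions mentioned above.
# Assumption: one square shadow map, 32-bit (4-byte) depth texels.
BYTES_PER_TEXEL = 4

def shadow_map_mib(resolution):
    """Memory in MiB for one square shadow map at the given resolution."""
    return resolution * resolution * BYTES_PER_TEXEL / (1024 ** 2)

high = shadow_map_mib(2048)     # 16 MiB
ultra = shadow_map_mib(16384)   # 1024 MiB -- 64x the texels of High
print(high, ultra, ultra / high)
```

Ultra buys 64 times the texels — around a gigabyte for a single map under these assumptions — mostly for slightly crisper shadow edges.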
Maybe I should have clarified, I don’t mean play at horrible framerates. I mean go for the highest quality possible and see if it works, then if necessary dial it back.
It depends a bit on the game*, but I’d rather play with the highest possible visible quality as long as it maintains a decent framerate. It’s been kinda cool playing stuff like Far Cry, Stalker and Metro when they first came out and then running them again now.
* FPS/twitch type games are an exception because it’s actually better gameplay wise to reduce shadows, ground clutter etc so you can see the enemy more easily.
Jeez, feelin’ bad here as the guy who always plays on ‘Low’ just to get smooth, uninterrupted gameplay.
Feels bad needing to cut my resolution below native just to run some of the new games :’v
I rarely, if ever, bother with the preset settings… First thing I do with any new game is go through all the settings and tweak them to my taste. If something’s new and I don’t know what it is I’ll google it before deciding how to set it.
Sure I always gravitate to the highest settings possible anyway, but there are some things I know are a waste of GPU power for little to no visible difference (like excessively high AA), or settings I simply don’t like which just consume resources better used elsewhere (I’m looking at you DoF and Motion Blur). After that I’ll go back and tweak things if needed until I strike a nice balance of performance/quality.
I both completely agree and completely disagree at the same time. Half of me agrees that Ultra often doesn’t give enough benefit to warrant the performance hit, but whenever I play a game the other half of me is like “no, not High settings, put everything to Ultra, it will be wonderful”.
Also, I wish there was something like this (http://www.geforce.com/whats-new/guides/grand-theft-auto-v-pc-graphics-and-performance-guide) for every game. I don’t mean from Nvidia (I have an AMD card anyway), but a detailed analysis of which settings affect performance and how different they actually look when high or low.
You’re not the only one! I’ve seen quite a few people asking for this kind of stuff to come back in the nvidia forums. They used to have a bunch of games that they’d done this for, but doesn’t look like they wanted to maintain it. Even though I’m pretty familiar with what all the settings do thanks to my graphics/game dev history, I always liked reading through this kind of stuff. Most engines perform very differently with various settings, so it’s always nice to know which give the biggest performance hits.
It would be very handy if it came back — it was tons of help for me with GTA V, because there were so many settings and at the time I was new to PC gaming. Even though I now generally know what a setting will do, it’s exactly like you’re saying: you can never guess how each game will handle the same setting.
And if no one can be bothered doing a guide like this anymore, I would love it if all game developers took some notes from how Ghost Recon Wildlands does its settings, where it shows you live what the difference will be. It doesn’t show you the performance hit, but it still makes it a lot easier to prioritise which settings to turn up. (I’m sure other games do it too, but that’s a recent example and the only game I can remember playing that does it.)
I don’t mind fiddling with the knobs to achieve a quality/frame rate compromise. It’s a necessary skill for a PC gamer. I’ll usually see what the game suggests, see what it looks like, and then tweak away happily.
What annoys me though is when a game needs to restart to apply settings…
Kindred spirits. I feel the same way — I love to tweak and find just the right balance between graphics and performance. Performance actually takes priority for me, and I often run MSI Afterburner in the top corner to get an idea of how things are going. But testing different settings and finding just the right combination is part of what I enjoy about gaming on PC (as weird as that sounds).

I agree about games that require restarts for setting changes — it makes the process of tweaking a little cumbersome, as you often have to sit through all those stupid splash screens etc. before you can get back into the game. I also love it when a game shows you in real time what the changes in graphics settings look like (e.g. it will clear up the menu enough to show you the game where you paused it).

I’ve never really understood people who are always compelled to have all the highest settings; sometimes it’s just a setting or two that WILDLY affects performance but has no appreciable impact on how the game looks, and dropping down a single notch makes the game run so much better without really affecting the overall presentation.
I have no issue with ultra graphics, but I do have an issue with useless cinematic effects that do nothing but eat more FPS, or that actively spoil the graphics.
Things that need to be removed:
+Motion Blur (when you turn your head quickly in real life there is no blurring at all)
+Colour Filters (examples: Fallout 3, NV, 4, Battlefield 3, etc.)
+Film Grain (unless you’re purposely making a noir-style game)
+Chromatic Aberration (it’s called an aberration because it ruins photos)
+Lens Flares (unless the player character is wearing glasses/goggles or looking through a clear window, lens flares are unneeded)
Learn the settings… they’re only going to get more complex in a few years.
When you can change the settings freely, you can get up to 200% performance out of your GPU, especially with a little tweak to shadows, god rays, post-processing effects, shaders and particles.
I still wonder how some people cope when there’s an explosion in game and the particles go off and the GPU has to work 5x harder to render the textures and particles combined… particles are a huge hit to performance.
Sometimes you can get better settings than what the developer sets it at, and definitely better than GeForce Experience — remember, its profiles usually aren’t tuned to your exact hardware.
A few games detect your hardware and set appropriate settings, but not most of the time.
There’s a fine line between pleasure and pain.
Funny, considering Kotaku reviews maybe a handful of PC releases each year, with very limited graphs/info.