Everyone wants to play games at the highest settings possible, and whenever a game or piece of hardware is tested, that's what's used: Very High, Ultra, Extreme or Crazy. But should that really be what everyone aims for?
2kliksphilip makes a neat argument that, at least in 2017, Ultra settings aren't worth the performance hit the way they used to be. Back in the days of the original Crysis, there was a substantial visual difference between High, Medium, and Low.
That's less true these days: the difference between higher graphics presets is often an increase in post-processing filters (which come with a huge performance cost) rather than any great improvement to textures, shadows, or lighting. If you can play a game at a higher resolution with a decent frame rate while sacrificing little visual quality, it makes sense to swap Ultra or Very High for something easier on your system.
There's a good comparison in the video with Battlefield 1, a game that scales incredibly well from the Low preset all the way up.
It's interesting to consider, especially if you start evaluating CPUs and graphics cards through the lens of what they can accomplish at slightly lower settings. Do you really need to spend $500 on a GPU if you can spend $300 and get acceptable frame rates at 1440p or 4K, compromises permitting? And should benchmarks focus more on the differences between presets, identifying which settings have the greatest impact on frame rate?