Something I’ve noticed more of over the last couple of years: an increased anxiety regarding video game performance. PC gamers have always discussed this, that’s part and parcel of the experience: getting the best out of your gear and whatnot.
But with this generation of consoles we seem to be discussing it more and more. Does it matter to you?
I kind of empathise with it. Both the PS4 and the Xbox One are so similar in terms of their functionality and their performance, and most major releases are multi-platform. What else do we have to compare and contrast? Particularly if you’re trying to decide which console to buy or want confirmation that you made the right choice.
So it’s weird. These sorts of discussions become simultaneously relevant but totally irrelevant. Sometimes I’ll watch comparison videos. Sometimes I’ll think about those minor details. Sometimes I’ll act like they matter, even when I know they don’t.
Very strange indeed.
What’s your take on it?
Comments
53 responses to “Tell Us Dammit: Does Frame Rate And Resolution Matter To You?”
To be honest I’d say frame rate is the only thing that’s important to me. Nothing worse than shit going down and the frame rate dying in the arse.
A stable framerate is hugely important to me. Random hitches and stuttering can kill a game. A consistent 60fps is preferable but a stable 30fps > an unstable 60fps.
I’m spoilt as a PC gamer because I can toy with settings to get the performance I want.
Resolution and overall shininess is neat but won’t sell me on a game.
Totally agree, there is just no excuse for a non-locked frame rate in my opinion. Stable 60 is amazing, but if you can’t hit that make sure it’s a stable 30 (see Destiny and DRIVECLUB as perfect examples on the PS4).
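To put rough numbers on why a locked 30 can feel better than a wobbly 60, here is a minimal sketch; the frame-time lists are invented for illustration, not measurements from any of the games mentioned:

```python
# Frame times in milliseconds. A locked 30fps delivers every frame
# in ~33.3ms; an "unstable 60" can average well above 30fps while
# alternating fast and slow frames, which reads as judder.
stable_30   = [33.3] * 6
unstable_60 = [16.7, 33.3, 16.7, 33.3, 16.7, 33.3]

def avg_fps(frame_times_ms):
    """Average frame rate implied by a list of frame times."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def jitter_ms(frame_times_ms):
    """Max deviation from the mean frame time: a crude pacing metric."""
    mean = sum(frame_times_ms) / len(frame_times_ms)
    return max(abs(t - mean) for t in frame_times_ms)
```

Here the unstable run averages about 40fps but swings roughly 8.3ms either side of its mean frame time, while the locked 30 has essentially zero jitter; it’s the swing, not the average, that the eye picks up.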
100% agree, and playing The Last of Us on PS4 sparked an idea that really should be used more in games. While there would be limitations, of course, I would love to see options for maybe two presets: one for top-notch graphics, and another whose main point is to ensure a stable frame rate. An excellent example is Bloodborne: having the option to tone down a few things to make sure gameplay is always smooth would be great!
Frame rate matters to me in the sense that it needs to be steady. I don’t care if it’s 60fps or 30fps as long as it can maintain that on a consistent basis.
Resolution does matter to me as part of overall image quality… I’d certainly much rather have 1080p than 720p. Whether I’d notice the difference between, say, 900p and 1080p I’m not so sure. But yeah, all other things being equal I’d much rather have 1080p, as long as the frame rate is fairly consistent.
I think both are important, but I’m struggling to pick one over the other. Do I get a 144Hz 1440p monitor or a 60Hz 4K monitor?
I have been trying to make this decision for the last few months and failing.
Anything less than 60fps is unacceptable in this day and age however.
I just went through the same decision; man, it was hard. In the end I got the 144Hz 1440p monitor and I think I made the right call. With my setup that gives me the ability to crank the settings and still get a solid frame rate (it looks friggin’ spectacular in The Witcher 3, Project CARS etc). I was a bit worried about having to lower settings to get a decent frame rate if I went 4K and 60Hz (even with a Titan X).
I did a heap of research, and I think 4K will be the target for my next build (or upgrading this one); I don’t think the tech is quite there yet. Can’t see the point in building a new rig that can’t max settings (I’m a graphics whore… so sue me!!)
That’s my 2 cents for what it’s worth. Good luck with the decision; I agonized over it for a looong time.
I got my very first gaming PC a week ago. I went for a 144Hz 1440p monitor, the Acer Predator with G-Sync, and I’m literally blown away! It’s amazing. And the control via a mouse over a joypad (new to me) is just brilliant!
Some games I like at 60fps, some games I like the additional viewing range of 1080p.
BUT whatever the choice made by the developers, screen tearing is NOT ok! The last gen was littered with that disgusting scourge, and it’s one area where the Wii U stood out.
Screen tearing, ew.
Resolution matters for about 2 seconds as my TV automatically changes from 1080p to 720p or whatever and a slight whiff of disappointment overcomes me, but by the time the game hits the first splash screen it simply doesn’t matter anymore. Framerate is more important but only in the sense that it needs to be stable and playable. I honestly couldn’t care less what the number is.
I want 60FPS. That is all, the game can look like ass for all I care, I just want 60FPS.
ARK, for example: I don’t care at all how pretty it CAN look, I just want 60fps. I suppose, having never had a top-of-the-line PC with 20,000 Titans or whatever, I never really cared how games looked.
Agree. 60fps minimum. Then add graphics effects until you hit less than 60.
Framerate matters. Playing action games that become turn based due to framerate drops or have excessive screen tearing is more or less a deal breaker. This of course can be solved by running the game with a 640*480 resolution (all the pros do it!) so maybe resolution matters too.
In terms of a raw comparison, the fact that the PS4 could possibly deliver a better framerate or resolution over the Xbone definitely played a part in my purchase of a PS4, though it was a fairly mild factor.
In terms of the actual numbers, I’m not particularly fussed. I like things to run at 60 fps (if I can run them at 120 on my PC it’s even better, but nearly everything that isn’t Source Engine seems to bottleneck before there), but if something’s locked at 30 it’s fine. If something drops badly (Blighttown springs to mind) then it’s annoying, but I’m hardly going to go on a crusade to change it, just whinge about it briefly online.
Resolution’s a weird one, because I really like good image quality, but it’s not as simple as a pixel count. To my eyes, a 900p image running SMAA with a temporal filter (or some equivalent) tends to look better than an unfiltered image, or one with a standard FXAA implementation.
But I do read/watch comparisons. I love to compare the differences between versions, but I do so for fun, not some base desire to show that x or y is worse, or that developer z is garbage for not running at 1440P.
You forgot to mention that you’re all about the bokeh.
It’s true. Bokeh is the only thing I care about in games. I’m happy with it running at one frame every two seconds if I can get some of that sweet Bokeh action.
They matter and they don’t.
If I get a solid 30 FPS, I’m happy. But if it starts to dip consistently it becomes annoying. While I don’t expect a 4k resolution it needs to be decent. But smooth is more important than high resolution.
I’m firmly in the Not Bothered camp. Sure, I don’t want to play a game that’s jittering all over the place. Who does? But I couldn’t tell the difference, unless maybe it was in a comparison video. And I think that goes for the majority of people who play games.
As long as it looks good and runs well, I don’t care about the number attached to it.
As long as a game works. As above, framerate is key to that beyond anything else in terms of graphics. Resolution is important I guess too, but these days most games run at a minimum of 720p anyway so it’s not so important anymore.
I agree that steadiness is more important than a high figure. I do notice 60fps versus 30fps, but I’m not even sure I notice when a game is in 1080…
As primarily a PC gamer I’d prefer a higher number, but as long as frame rate is consistently above 30 I’m fine. When something is unnecessarily locked to 30 it’s mostly annoying and confusing…
With res, if I can’t run at native (currently 1080p) it’s a bigger annoyance than it probably should be, but it is what it is.
If I’m given the choice, I’ll always drop the res to get a better frame rate, because to me dems grafiks are far less important than the gameplay.
Tbh though, as a majority-console gamer with only a single 770 4GB/4790 rig, I can’t push the thing that high anyway! I’ll always choose frame rate over res though.
I have a 2560×1440 monitor at 144Hz powered by 2 x GTX 980s.
Resolution and frame rate matter.
Frame rate yes, not looking for a high number, just consistency so the game experience isn’t ruined.
If I’m playing a PC game (rare but it does happen) I notice it and I’ll tinker with settings until I get a smooth frame rate
Consoles though, I’ve never noticed. Take Watch Dogs (reportedly 720p on Xbone): I have a 60″ TV and it looks sharp as a tack!
Not really.
Stable frame rates are best, but I don’t mind if they are 30 or 60 fps. I can run anything under 4k resolution so not really no.
As long as the game play is good and consistency is maintained.
After playing on consoles for such a long time, 30fps doesn’t seem that bad anymore at 1080p. 60fps is nice, but to me not really essential unless you are playing an online FPS.
At 4K, 30fps is the minimum I can bear; it is extremely obvious when the fps dips lower than that. Preferably I would love it to be at 60fps, but once again this isn’t a grave concern.
The only time I need 75fps is when I am using the Oculus Rift. You need the high fps or you will definitely vomit from motion sickness. Try using an Oculus at 30fps or even 60fps and you can kiss the rest of your day goodbye.
Yes, absolutely.
I posted this over at PCMR just two days ago. Read the truth.
https://www.reddit.com/r/pcmasterrace/comments/3fqhum/the_peasantry_truth_of_why_30fps_is_superior/
In order:
– Stable 60 fps
– Stable 30 fps
– 1080p and stable frame rate
– 720p and stable frame rate
I don’t give a stuff about resolution. I just want a consistent frame rate (higher equals better, but not at the expense of stability).
This is why the WiiU rocks. It is such a smooooooth console (Splatoon peeps will know what I mean!)
It does matter to me, but how much it matters depends on what we’re talking about specifically. When it comes to the minor difference between two roughly equal console ports it doesn’t matter at all. When it comes to 480p vs 4K then obviously it’s a world of difference.
I’ve never really been bothered by resolution. In an age when people are embracing pixel games (and other non-realistic art styles), I find it bizarre that on one hand, people obviously realise that resolution doesn’t mean fun, but then they still cry when something is “only” 900p.
When I got my Xbox 360, I didn’t even have an HDTV, but the games were just as fun. The TV I have now only does 720p (and 1080i, but screw that). I still have a blast with games, and I seriously doubt my enjoyment would increase if I was playing them in 1080p.
I’ll even put up with a poor framerate if the game is fun (EDF!).
I guess the bottom line is that I like playing games, not counting pixels.
To me frame rate and resolution come way down on the list behind graphics, gameplay and the dreaded f-word: FUN.
Take Bloodborne for example. Wonderful game which entertained me fully for 70 odd hours. Did I know what the resolution was? No. Was the frame rate a bit choppy in some situations? Yes. Did it bother me? Not in the slightest.
To me, if it was fps vs resolution, I would choose 720p at 60fps over 1080p at 30fps. I find frame skipping annoying.
It very much depends on the game
If the frame rate drops from its usual stability, causing screen lag, that can be really jarring
If the resolution and frame rate detract in any way from the immersion of the game, then yes, it is important
More important to me is a stable framerate in general. A lot of games that I’ve played lately seem to not be able to stay at 30fps constantly, and that is just sad and inexcusable.
As for the whole 60fps thing: I love 60+fps, but I do realise not all games require it. My main thought though is that if a game is striving for a realistic appearance, it really should be aiming for that 60fps benchmark.
As with others, neither really matters to me as long as the frame rate is stable. And as @cubits mentioned, no screen tearing.
I’ve always had a mid-to-low-end PC, so straight away anti-aliasing goes off and resolution goes down so the game doesn’t jitter all over the place. Oh, and vsync goes on to prevent tearing; I don’t care if I’m only getting 16 FPS.
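For anyone curious why vsync and a struggling GPU interact the way they do: with classic double-buffered vsync a finished frame can only be shown on a refresh boundary, so the effective rate snaps down to an integer divisor of the refresh rate (triple buffering relaxes this). A rough sketch, assuming a 60Hz display:

```python
import math

def vsynced_fps(refresh_hz: float, render_fps: float) -> float:
    """Effective frame rate under double-buffered vsync: each frame
    waits for the next refresh, so the rate snaps down to an integer
    divisor of the refresh rate."""
    refreshes_per_frame = math.ceil(refresh_hz / render_fps)
    return refresh_hz / refreshes_per_frame

vsynced_fps(60, 60)  # 60.0: exactly one refresh per frame
vsynced_fps(60, 50)  # 30.0: just missing 60 halves you to 30
vsynced_fps(60, 16)  # 15.0: a raw 16fps snaps down to 60/4
```

So a rig rendering at a raw 16fps with vsync on is actually displaying 15; that quantisation is why dropping settings or resolution to clear the next divisor pays off so much.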
As long as it’s not noticeably chugging, I don’t really care. Sure 60fps is going to look smoother but as long as it looks smooth moment to moment I’m not going to get hung up on it.
Resolution is more important but again, as long as it looks generally good I’m going to pardon a few jaggies.
Stability of framerate is more important to me than a high framerate, if I had to choose I’d pick stability any day.
Resolution is another matter; it varies depending on the game and how much you need fine detail. The average third-person game doesn’t really need super high resolution. I imagine I’d enjoy The Witcher 3 in 1080 as much as I do in 1440, since it’d look more or less the same, and super fine detail isn’t that relevant to gameplay in a game of melee combat. By contrast, having a 2560×1440 monitor in a game like ARMA almost feels like cheating, since you get so much more fine detail to spot enemies at long range. With ARMA my average engagement range is in the region of 400-600m, which is bigger than most Call of Duty maps. At ranges like that, a 1080 resolution is going to render an enemy’s head poking out behind a wall as little more than a few pixels, but the additional resolution afforded by a 1440 monitor gives you dramatically more detail, though at that range it’s more “Yes, that is indeed a human head” rather than “Hey, that blonde guy has a handlebar moustache”.
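The long-range spotting point can be put in numbers with a bit of trigonometry. In the sketch below, the 25cm head, 400m range, and 30-degree zoomed-in vertical field of view are illustrative assumptions, not figures from the comment:

```python
import math

def pixels_subtended(size_m: float, distance_m: float,
                     vfov_deg: float, vres: int) -> float:
    """Vertical pixels an object covers on screen, given the display's
    vertical resolution and vertical field of view in degrees."""
    angle_deg = math.degrees(2 * math.atan(size_m / (2 * distance_m)))
    return angle_deg / vfov_deg * vres

# A ~25cm head at 400m through a ~30 degree zoomed view:
head_1080 = pixels_subtended(0.25, 400, 30, 1080)  # ~1.3 px
head_1440 = pixels_subtended(0.25, 400, 30, 1440)  # ~1.7 px
```

For the same scene, 1440 vertical lines always give exactly 4/3 the pixels of 1080, which is the “few more pixels on a distant head” difference described above.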
High resolution can also be fun & weird, I’m running FTL on my 1440 monitor, I’ve never seen blockiness so well defined, lucky I don’t have a 4k monitor or those sharp edges could have my eye out!
As long as I don’t have to squint to read text or sit through endless loads mid-walk, I’m fine.
Frame rate, as long as it is above 30fps, isn’t a big deal. Resolution as long as it is over 720p is fine. The only time frame rate matters to me is when it fluctuates widely and causes screen tearing (can be alleviated in some cases with v-sync) and stuttering or some other annoyance. Admittedly, I only have a 21″ widescreen monitor, so resolution isn’t something I care about, while if I had a 27″ widescreen or a multi-monitor setup, I might care more.
I don’t care so much. There’s only one or two games that I wished were in 60fps (Ratchet and Clank: Nexus comes to mind) but otherwise, so long as the game plays well, I couldn’t give two shits about the graphical fidelity.
Although – and this might be a bit of a weird opinion – there are one or two games that were in 30fps on console and almost made me throw up seeing them in 60fps on PC. The only one I can think of at the moment is Dirt 3; I loved it on console, but not only did I feel a little bleugh playing it on computer, but it felt a little less… exciting? I dunno, it’s hard to explain, and I’m weird, so yeah.
Strategy gamer. Not twitch gamer. I can play at 5fps and not have my gameplay affected. Resolution is handy for getting more information on the screen at once in a readable format.
I’m gonna enjoy walking up to you and meleeing you to death while you enjoy your 5fps and can’t see me coming when I slowly walk in front of you.
There’s this weird assumption that I play FPS games. I don’t. Couldn’t care less about them. If I get shanked because of fps, I’m playing the wrong game. Turn-based FTW.
Both. I want native resolution output with a smooth frame rate. Console games that are 720/900p upscaled to 1080p look great on a TV, but on PC, anything lower than your native screen resolution looks completely horrible.
Never below 60fps on PC tho. I’d rather play 40-60 fps than 30 on PC.
30 actually gives me headaches after a while in some game types.
My first test of settings in Witcher 3 got me a stable 30fps but it felt off the whole time, like there was this constant lag between input and reaction. At first I thought it might have been the game, but when I dropped some settings and hit 60 stable, the lag vanished.
Yep. Your input lag will be cut by 50% running the same game at 60fps compared to 30fps. That is why, when people say there is no difference between 30fps and 60fps, I literally facepalm, since the difference is so huge.
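The arithmetic behind that latency claim: frame time is the reciprocal of frame rate, so each frame at 30fps stays on screen twice as long as at 60fps, putting a ~33ms (rather than ~17ms) floor under how quickly an input can show up. A minimal sketch:

```python
def frame_time_ms(fps: float) -> float:
    """Duration of one frame in milliseconds at the given frame rate."""
    return 1000.0 / fps

t30 = frame_time_ms(30)   # ~33.3 ms per frame
t60 = frame_time_ms(60)   # ~16.7 ms per frame

# Doubling the frame rate halves the per-frame latency floor:
# a 50% reduction, which is where the "lag by 50%" figure comes from.
reduction = (t30 - t60) / t30   # 0.5
```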
They should be a standard. They shouldn’t have to “matter” because messing them up should be terrible negligence instead of a preference.
But then how could they foist subpar console hardware on you? Funniest thing I saw recently was a group of console owners actually trying to justify to themselves that 30fps looks better than 60fps and they’re glad a lot of console games run at 30.
Yeah, I just cannot stand that.
I’m ok with games that can only manage a rock solid 30 (not racers, not fighters), but never ever pretend that it’s optimal and never ever cap it on PC.
They both matter a lot to me, though high AND consistent frame rates are probably more important.
30FPS is sluggish as hell, even without dips, after spending the last few years on my PCs getting stable 120+ FPS. It’s expensive to achieve, requiring regular upgrades, but there’s no going back for me now.
Until I played The Last of Us Remastered, I didn’t get what the deal was with more than 30fps. Now it puts a smile on my face whenever I’m playing something that’s hitting 60FPS – aesthetically it just looks so much better, more fluid, I can’t quite describe it. It’s really made me push more into PC gaming, just because if anything is gonna be able to hit 60 most of the time, it’s PC. Consoles are kinda limited in what they’re able to achieve.
Ehhhh… It’s important but not worth all the angst and cost sometimes.
Although I’m still rocking a 19″ 1440 x 900 display on my desktop machine and I’ve got an R9 270X sitting in it.
I’ve said it before and I’ll say it again: graphics do matter, but they are way down the list of things that make a game fun, or pleasing, to play. While I don’t really mind games at 30fps, there is a huge difference when it’s at 60fps. Games are so much better at 60fps because they play smoothly.
Not unless frames matter as a measurable unit of time, as in a fighting game. The vast, vast majority of games don’t require that level of attention, and I can’t typically tell the difference.
In general though I don’t like that AAA developers are killing themselves trying to make their games prettier and prettier just to fuel the hype engine.
Yeah, but within reason, of course. I understand the limitations of cross-platform development and the time constraints imposed on open-world or AAA games, so I can handle a glitch or two without unreasonably assuming something should be “fixed” when the severity of a reasonable and understandable glitch is dubious. I think people should be vigilant, however; “reasonable” only goes so far. I’m definitely not someone who cares about minor graphics disparity between development and release. It’s an understandable perspective, and if you actually understand the development process of any creative endeavor, it’s pretty easy to empathise with.
Actually it’s kinda simple really: for the most part it doesn’t matter until AFTER you have it. I was fine gaming at 720p for a while; even 1080p (in my brief encounters) was nice, but I made the full jump to 1440p, and sure, I could go back, but why would I? Same thing for frame rate. I started on consoles at 30fps, then got a taste of 60 and even the upper 120 on my computer. Now could I go back to 30fps? Sure I could, but why would I? It’s like having a taste of the finer things. Sure, you could go back to normal, but if you have the means… why would you? It doesn’t matter until you have it, then it doesn’t matter until you don’t.