So here’s a question: does a video game’s frame rate matter to you? With the release of new consoles it’s only natural that we argue over specs, the varying quality of cross-platform games, and so on — but at the end of the day, do you care if a game runs at 60 frames per second or 30?
Personally? I care. I notice and feel the difference if a game is smoother. Playing a smoother game makes for a better experience. It feels more responsive. It feels more rewarding. That’s just me. I know there are people out there that simply don’t care.
What about frame rate drops? Is this a game breaker for you? Vote and let us know your reasoning in the comments below.
Comments
60 responses to “The Big Question: Does Frame Rate Matter To You?”
For me frame rate > resolution.
Good point. I can’t tell while I’m playing that a PS3 game is upscaled from 720p or whatever rubbish. I can tell the difference between sections that are “60” and “30” fps.
Yeah, out of all the Xbox One and PS4 things to be buzzing about, my mates and I were stoked with how smoothly the games ran compared to their predecessors.
Personally I don’t think it’s so black and white. Frame rate is more important than resolution to some degree. It also depends on the type of game. ACIV on PS4/XB1 runs at 30fps. For the style of game I think it’s fine. On PS4 it’s 1080p and on XB1 it’s 900p. If they scaled back to 720p and got 60fps I think that would be worse. ACIV is a beautiful game, and 1080p on PS4 really helps it stand out graphically. I would take 1080p/30fps over 720p/60fps easily. However, if it was a racing game or a first-person shooter like CoD/BF, I would take 720p/60fps over the resolution.
The biggest thing for me is whether the frame rate is consistent. A constant 60fps in any game is fantastic. Buttery smooth, even. But if the frame rate constantly fluctuates between, say, 35 and 60, then you end up with on-screen judder, variable controller input latency, and possibly screen tearing if v-sync is not engaged. That to me is the worst possible outcome, and I’d prefer a locked, consistent 30fps.
Digital Foundry did a brief look at Tomb Raider: Definitive Edition on PS4 and XB1, and their findings showed that during gameplay the average fps was around 51 on PS4 and around 29.8 on XB1. At first glance the PS4 seems the obvious choice. But DF states the frame rate fluctuates from 32–60 and in parts causes noticeable judder on screen. The XB1 also drops frames at times but is more consistent overall. I’ll wait for DF to do their complete analysis of the game, but I personally would prefer a consistent frame rate. (http://www.eurogamer.net/articles/digitalfoundry-2014-tomb-raider-definitive-performance-analysis)
I’m not trying to be biased towards any console. I have both an XB1 and PS4. I just don’t think everything is as simple as it seems sometimes.
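The judder point in the comment above can be made concrete with a rough sketch (Python, with made-up fps numbers, purely illustrative): what the eye notices is the spread in frame *times*, not the average fps, so a feed swinging between 32 and 60fps has far more frame-to-frame variation than a locked 30.

```python
# Illustrative sketch: frame-time consistency matters more than average fps.
# The fps samples below are invented for demonstration, not measured data.

def frame_times_ms(fps_samples):
    """Convert per-frame fps readings into frame times in milliseconds."""
    return [1000.0 / fps for fps in fps_samples]

def judder_range_ms(fps_samples):
    """Spread between the slowest and fastest frame: a crude judder measure."""
    times = frame_times_ms(fps_samples)
    return max(times) - min(times)

locked_30 = [30] * 8                             # a locked, consistent 30fps
fluctuating = [60, 35, 58, 40, 60, 32, 55, 45]   # swings between 32 and 60fps

print(round(judder_range_ms(locked_30), 1))      # 0.0 — every frame takes 33.3ms
print(round(judder_range_ms(fluctuating), 1))    # 14.6 — visible pacing swings
```

On these numbers the locked-30 stream has zero spread, while the fluctuating one swings by roughly 15ms from frame to frame despite its higher average fps.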
Even then, though, I think it’s only an issue if it fluctuates a lot. If the PS4 version is constantly swinging back and forth between 32 and 60 and is just all over the place, then yeah, that’s a problem and would probably put me off the game a bit. But if it only dropped like that once or twice during the testing (I can’t actually watch those videos because YouTube is blocked here at work) then it’s really not an issue and I’d take the higher slightly variable frame rate over a consistent 30fps. I believe (haven’t played it myself) that there was a similar issue with COD Ghosts on PS4 where the frame rate dropped occasionally, but from what I read that was just when it was autosaving at checkpoints, so if the frame rate drops for a second there it’s not really going to bother me compared to having it drop during the important bits.
True. This is one of the major reasons I’ve almost entirely stopped playing games on consoles. At the start of a console generation the machines are capable of putting out nice graphics at a solid frame rate. Everything looks pretty and runs smoothly.
As time drags on though, developers are constantly pushing for better “graphics”, because that’s what customers expect. You need to look as nice as everyone else does in those still images for publicity’s sake, so the typical frame rate seems to keep dragging lower and lower until the new acceptable standard is “a solid 30 FPS”.
These days if I can, I’d rather just get games on PC, that way I know I can keep the frame rate where I want it.
I can. It’s not rubbish, you just aren’t as observant about something that doesn’t appeal to you.
Only if it drops below 30fps, I guess, would it be noticeable. If it’s zooming along but fluctuating between 30 and 60fps, I can deal with that in most cases, but not when it starts to look like a slideshow.
I would much rather have next-gen games that run smoothly over ones with high-fidelity visuals. “As good as real life” means nothing when you chug down to 10fps during the action.
only when it drops low enough to be noticeable.
while I appreciate the smoothness of 60 FPS, anything higher than 30 is fine with me.
Yes, it matters. The whole argument over graphics between the two next-gen consoles is a complete waste of time, and it has little bearing on the overall quality of the gameplay for a multi-platform game. But frame rate does affect the experience you get while playing, so, all other factors being equal, if a game was 60fps on one platform and 30 on another, you’d generally want to go for the 60fps option.
I wouldn’t refuse to play a game because it’s only 30fps or something though.
Basically, this. The only problem I’ve had with fps recently was DayZ running stupidly slow on my pretty great PC (Arma worked fine, the mod didn’t) and the console versions of Assassin’s Creed 3. If it doesn’t affect the way I play the game then it matters less to me; it’ll still skew my decision, though. I would appreciate it, however, if the media could actually tell the truth instead of pandering on the subject. When the CoD: Ghosts info came out (One at a lower res, PS4 at a higher one), I thought it was pretty self-explanatory: PS4 had a lead in terms of power. So what? One system always does. But then articles started coming out trying to give “fair” comparisons to both, blatantly lying about how much the PS4 version slowed down to try and balance the story somewhat. After playing and owning Ghosts on PS4, I’ve found that it’s untrue. There’s almost no slowdown, and the only time it hitches is clearly when the level is being loaded, a fact everyone failed to mention. The perception that the numbers really matter all that much, when we’ve been fed a healthy diet of low-res “HD” console games running at 25fps on TVs that basically eliminate the effects of low fps, is just fuel for arguments. It doesn’t really matter; consoles have more to offer than raw power.
I don’t care as long as it stays above 30fps for the vast majority of games.
I was on here arguing around the Xbone/PS4 launch that I’d rather have my games look better (higher res, more characters on screen, bigger everything) at the expense of frames 31 and above. Developers know this which is why every new generation for the past 15 years has increased graphics at the expense of the extra frames.
I couldn’t work out what the big deal was with people complaining that some games ran at <60fps like it was suddenly “the point of the next gen”. I didn’t see the point of the PS2 or N64 being to run PS1 and SNES games at 60fps!
It’s nice, and is important for some game types (racing games mostly), but it’s not the be-all-and-end-all.
Because the graphics are on par with or below mid-range PCs. There are no ‘increased graphics’ _at this point_, which is why people just want a stable framerate so they can at least enjoy the game.
Consoles have always been on par with mid-range PCs at launch. This is exactly what I don’t f*cking get.
Maybe you can explain to me why the comparison is suddenly between consoles and much more expensive PCs?
A high-end PC cost $2000 when the N64 came out. Were you completely horrified when the $400 N64 wasn’t significantly more powerful than a ‘mid-range’ PC?
Even accounting for the fact that they’re new hardware (put Perfect Dark Zero next to Crysis 3 to have a look how much games improve over a generation), there’s absolutely no disputing that there is a leap in graphical quality between the Xbone/ PS4 launch games and the 360/PS3.
That’s the graphical improvement, of course it’s not going to look a million times better than more expensive PC hardware.
Both next gen consoles run about how you’d expect them to run if you put the same hardware in a PC. Funny that.
Actually, the comparison wasn’t meant to be with PCs per se. What I’m saying is that ‘next generation’ can’t even handle 1080p in some situations, let alone 60fps. That, to me, is a failing – especially when 1080p has become par for the course, particularly for those used to PC gaming in the last couple of years. If you can’t hit the default resolution of most HDTVs, I consider you’re either a) creating an unoptimised game, or b) caring more about graphical fidelity in the sense of polys, FX etc. than the overall experience.
I would semi-agree with you: there are a few launch titles on each that look ‘next-gen’, but several that don’t look significantly different to their predecessors. With regards to your last comment, it’s interesting you say that, because there are several high-profile multiplats that look and run better on PCs with comparable hardware to the Xbone/PS4. I’m not against consoles (I own a PS4, PS3 and 360), but I’m also not kidding myself about the superior _platform_.
I agree that it is a bit sh*t that the Xbone isn’t hitting 1080p for everything. It would be nice.
PCs are always going to be the superior (and more expensive) platform, but don’t forget that this is the first time out for all developers using the new consoles’ operating systems and hardware idiosyncrasies. It shouldn’t be a complete shock that some comparable PC setups will run games better, and some will run worse.
One thing I can guarantee, in 3 years time the Xbone and PS4 will be running smoother, better looking versions of Elder Scrolls VI or whatever than the $600 2013 PC will be.
You can’t guarantee that last statement at all. I’m running a 3 year old PC (it cost me $900) that still runs multiplats at a much higher graphic fidelity than Xbone/PS4. Consoles will always be playing the catchup game thanks to cheaper hardware, overclocking, and Moore’s law. It isn’t a big deal, but it’s just something that console players should accept.
As long as it’s consistent and smooth I don’t really care. I don’t think anyone can really tell the difference between 30fps and 60fps anyway. Sure, 60fps probably does look a bit smoother when you put them side by side, but it’s like looking at paint colours and being mystified by why both bone and eggshell need to exist. Sure, there’s a distinct grade when you put the samples next to each other, but at the end of the day only one is going on your wall and it’s gonna look white.
People can tell. I can easily tell. That’s like saying, no one can really tell the difference between 24fps and 48fps in a movie.
People can tell, but I think his point is more that it shouldn’t matter, since frame rate is less noticeable on a TV than on a monitor.
That’s ridiculous. They’re vastly different and anybody who can understand the concept of frames per second can see the difference clear as day.
I think plenty of people can tell the difference, but (in my opinion, anyway) it really doesn’t matter as much as some people claim.
The Uncharted series, as an example, was not ruined by being 30fps, and it would not have been dramatically improved by being 60fps. It was rock-solid and designed to work well within its technology’s restrictions.
Stability is most important… I don’t care if it’s 30, or 28, or 60, or 45. So long as it stays consistent.
Preferably at a rate that divides evenly into the refresh rate, though; otherwise you’ll get tearing even if the framerate is steady.
And doesn’t have any screen tearing!
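The refresh-rate point above can be sketched quickly (Python, assuming a 60Hz display for illustration): a steady framerate only paces cleanly when it divides the refresh rate evenly, which is why a locked 30 looks smooth on a 60Hz set while a locked 45 does not.

```python
# Sketch: on a fixed-refresh display, a steady fps only paces evenly when it
# divides the refresh rate; otherwise frames are shown for uneven numbers of
# refreshes (judder), or tear if presented mid-refresh without v-sync.

REFRESH_HZ = 60  # assumed display refresh rate for this example

def paces_evenly(fps):
    """True if every rendered frame lasts a whole number of display refreshes."""
    return REFRESH_HZ % fps == 0

for fps in (60, 45, 30, 20):
    print(fps, paces_evenly(fps))
# 60 True, 45 False, 30 True, 20 True
```

A locked 45fps on a 60Hz panel means frames alternate between one and two refreshes on screen, which is exactly the uneven pacing the comment warns about.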
Never really cared about frame rates – until I played an unpatched Arkham Origins. Goddamn, that was horrifying.
Yes, I care about frame rate, because if you’ve ever played at a frame rate lower than 30 you’ll get really frustrated as the game keeps pausing. So yes, frame rate means a lot to me; I want my game to run smoothly.
Depends.
It’s something you notice when it goes wrong, but notice less for good framerates.
Example: Link Between Worlds looked gorgeous in 60fps, but I totally forgot about it while playing. It could just as easily have been 30fps and I would have enjoyed it just as much.
If the same game is available at two different frame rates (such as Tomb Raider: Definitive Edition), I’d naturally account for it in my decision between versions, but I’d also weigh things such as the controller, where friends are for multiplayer, possible differences in resolution, etc.
I love the 75fps I get from overclocking my monitor at home; no vsync necessary.
As long as it doesn’t drop, it’s fine. I’m ok with 30, though 60 does look nice.
Found it funny that the Wii U seems to have so many games that run at 60fps in 1080p while the more expensive ones seem to have so few 😛
No Wii U game runs in native 1080p. They are upscaled to 1080p. Yes they run at 60fps and they still look great, but they are usually rendered at 720p and upscaled to 1080p. This is like many games last gen on PS3/360. In fact, lots of games (see CoD) run at resolutions much lower than 720p and then are upscaled from there. Only on PC and now on *some* XB1/PS4 games are we seeing games that are actually rendered at 1080p.
Really depends on the game, but generally hell yes.
I’d like it to run smoothly. If it drops noticeably at any point, fix that shit up immediately. But at the end of the day, as long as it looks smooth, I’d like a visually stunning game. At a certain point, higher frame rate is redundant, visual fidelity is always welcome.
That being said, 50fps minimum please. I guess 30 isn’t always noticeable on console, but still, keep up with my TV’s refresh rate.
Absolutely. Beyond simply being pleasing to watch, input response is directly tied to frame rate. The higher the frame rate, the quicker your actions are represented on screen (16.6ms per frame at 60fps vs 33.3ms at 30fps). It’s why any serious fighting game, shooter or racing sim always targets 60fps.
Gameplay matters above all, and a high, stable frame rate ties directly into that; it’s what gives Call of Duty its secret sauce.
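The frame-time arithmetic in that comment is easy to verify: a frame at 60fps lasts 1000/60 ≈ 16.7ms and at 30fps it lasts 1000/30 ≈ 33.3ms, which bounds how quickly an input can be reflected on screen. A minimal check:

```python
# Minimal check of the frame-time arithmetic: one frame's duration in ms.

def frame_time_ms(fps):
    """Duration of a single frame, in milliseconds, at the given frame rate."""
    return 1000.0 / fps

for fps in (30, 60, 120):
    print(f"{fps}fps -> {frame_time_ms(fps):.1f}ms per frame")
# 30fps -> 33.3ms, 60fps -> 16.7ms, 120fps -> 8.3ms
```

Halving the frame time from 33.3ms to 16.7ms is why 60fps feels noticeably more responsive, independent of how it looks.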
Sure, a faster frame rate isn’t vital, but having a game run smoother can never be a bad thing, can it?
It matters to me. To me it’s like moving from 4:3 to 16:9. The PS4 and XB1 are basically supercomputers compared to the PS3 and 360. They have the hardware to pump out 60fps. Why not use it? I guess with the speed that games are developed these days, frame rates don’t really come into question. I would like to see more 60fps games come out on both systems.
As long as it’s above 30fps I have trouble noticing any difference. Drop below 30 however and I start caring big time. So I just do whatever I can to make sure my settings keep things at a steady 30 and everybody is happy.
Dwarf fortress sucks once it drops below 30fps.
Stability of framerate matters more to me than the actual number, but I can tell the difference between 25, 30 and 60, and it’s definitely a case of the more the better. But then I’ve been known to hear the refresh rates of CRT monitors, so I’m something of a freak.
Need an option for “solid frame rate”… as that is what counts for me. If I can’t get a solid 60 I’d rather have a solid 30.
I care about frame rate in the sense that I don’t want to be seeing stuff like screen tearing, frame skips, or just having a plain old bad time because I can’t move anywhere due to frame lag. That being said, frame skips are ok if they don’t happen often and don’t make the game look janky.
Like most have already stated, I believe that stability is more important than anything… Sure, 60 > 30, but I don’t really care what the game runs at as long as it is stable.
Constant anything is far better than something that jumps around, and drops between 10-50% of frames because something awesome happens on screen.
Not to the point of frothing anger that some seem to work themselves into.
I don’t get all uppity if a game ships at 30 fps instead of 60, but it should be able to hold that frame rate and not suffer screen tearing or stuttering … then it gets a bit annoying
120 frames? Nope. Don’t care. 30 frames? Yes. I care. It makes everything weird and twitchy.
As long as it doesn’t dip below, say 45 I’m happy.
60–80 with vsync on… best.
OR
30–60 for action quality.
I tune settings to get ’em like that.
yes it does matter
I would rather a stable framerate.
I’m still getting used to PP; I love a tiny bit of AA/AF @ 1200p.
For me I can tell when a game drops under 60fps and my eyes hurt when it drops under 45
Around 30fps is fine for me.
Years ago I played WoW at around 15fps for 8 months and had to stay away from capital cities, otherwise my computer would explode. 30fps was golden when I upgraded.
To be perfectly honest, I don’t notice such things unless it’s really bad. Average or really good, whatever, I won’t notice. Really bad and I will notice. Otherwise, I don’t mind.
boallen.com/fps-compare.html
of course it matters
FPS in MP is more important.
The bigger issue for me is inconsistency. If I am playing a PC game around 40, it’s fine. But if I’m playing at 60 and it’s like BF4 where you get massive drops, it’s distracting as hell and ruins it.
I don’t care as long as the game is playable and looks good. While tweaking Far Cry 3 I had people saying it was unplayable at 40fps, which is of course total and utter crap. Even at 30fps the game is playable; the picture isn’t stuttering or anything like that.
If a game isn’t pushing out 120fps for my 120Hz monitor, it’s a problem. The difference between 120 and 60 is astounding, it hurts my eyes
Articles that fit on my mobile phone matter to me without the text wrapping off the screen.
As long as its at least 30fps and is stable…..
I prefer better frame rate ANY day over resolution or better graphics, but it’s nice to have all of those things at once. There seems to be a lot of tension these days between people who play consoles and people who play on PC. You always hear the term ‘PC elitists’ getting thrown around, and the console guys can’t stand them because they think we are trying to s**t all over their parade.
Truth of the matter is that I own an Xbox360 and a PS3, and before that I owned an Xbox, a Gamecube, and a PS2 … and really liked some of the games that were ONLY available for some of those specific platforms (which in truth was the only reason I got them). But often, when playing an exclusive on a console, I find myself thinking that I wished it was available for PC.
For example, I am currently playing The Last of Us for the first time, and even though I am enjoying it, I can see where the hardware is struggling to render the graphics smoothly, and I get annoyed when missing easy shots on an enemy as the input lag from the controller is pretty bad (because of the aforementioned problem). Same kind of thing applies to GTA V … I did enjoy that game, but I lost count with how many times I thought to myself that I can’t wait for the PC release (when I first started playing the game I was literally shocked at how laggy the game felt).
But as time has gone on, I’ve become less tolerant of games on consoles. It seems to me that graphics are pushed too far for what the machines can handle comfortably. On the last generation of consoles (PS3 and Xbox 360), I kept seeing this trend where not even 30fps was guaranteed, nor stable. Frame rates on some games would often dip into the low 20s or teens (I watch a lot of Digital Foundry’s stuff). I could swallow the situation if 30fps actually meant a SOLID 30fps… but it’s rare for games on those platforms to even hold that number.
Now we are in the 8th generation of consoles and the situation doesn’t look good to me. I thought that 1080p and 60fps for this generation would’ve been a standard, but it’s not … it’s far from it. Tomb Raider was touted as running at 1080p and 60fps on PS4, but then I saw Digital Foundry’s analysis and I could see that the frame rate continually jumped from 60fps all the way down to the low 40’s and even 30’s. I know that kind of fluctuation causes judder, and it takes me right out of the experience. This generation is not even a year old and already we are seeing games at 720p and 30fps (and even that’s a struggle for games like Dead Rising 3). How in the world are these consoles going to stack up in 3 – 4 years time, let alone 8 – 10 years time?
The only reason why I would ever get a console from this generation is to see what Naughty Dog can do with the PS4. And even though The Last of Us is a good game, I think that if they re-released that game on PS4, running at a constant 60fps, then I’m sure many people would go back out and buy it again. And that’s exactly the point I’m making about the PC. When you have the option and the means to get something better, it’s hard to go back to something which in comparison looks drastically pale.
One last thing for the road … anyone who has ever said to me that they can’t tell the difference between 30fps and 60fps has been a console gamer. Most people who I have known that play PC games primarily can EASILY tell the difference between 30fps and 60fps. It’s like night and day.
Yeah. I own the major last-gen consoles and PC, but the PC is the one I keep current and powerful, so whenever you play a console exclusive it’s always a case of, “Gee, I wish I could play this game on hardware which would let it run at its true capacity instead of being handicapped by console specs.”
Or, occasionally I’ll think, “GRRRRRRAAAH WHAT THE FUCK GUY WHO PUTS SHOOTERS ON CONSOLES WHERE YOU HAVE TO USE A GAMEPAD, WHAT THE HELL IS WRONG WITH YOU HEATHENS, KEYBOARD AND MOUSE IS THE ONE TRUE WAY, THIS IS INTOLERABLE.”
But y’know. We all have our thing.
Pokemon X/Y are a perfect example of why Frame rates matter to me… The way the game skips frames kills my life.
Can I answer with neither?
Gameplay to me is the most important part of a game. This “it’s better on XB1/PS4/PC” argument cheapens games overall.
I have Mass Effect 1/2/3 and Skyrim on PS3, and I know how horrible the textures/frame rate are; in the end it just doesn’t matter. They are still great games to play.
I’ve said this about Nintendo games all along: when my cousins see me playing Zelda/Mario on Wii, they go “oh god, I can’t play that, look at the horrible graphics”… I just turn and say, “So?” Games are games; graphics/resolution/framerates overall don’t add much to the story, and visuals shouldn’t be what you play a game for.
Framerate matters when it’s not at least 30fps. Past that, it’s playable so I don’t care. Stability is also important – sure, 60fps is nice, but when it’s 60fps for 3 out of 10 seconds of gameplay then it’s not that great.
(But if my game is dropping to below 30fps or can’t stick to around the same framerate for a decent length of time I’m going to want to know why and then how to fix it.)