The Big Question: Does Frame Rate Matter To You?

So here's a question: does a video game's frame rate matter to you? With the release of new consoles it's only natural that we argue specs, the varying quality of cross-platform games and so on, but at the end of the day, do you care if a game is 60 frames per second or 30?

Personally? I care. I notice and feel the difference if a game is smoother. Playing a smoother game makes for a better experience. It feels more responsive. It feels more rewarding. That's just me. I know there are people out there that simply don't care.

What about frame rate drops? Is this a game breaker for you? Vote and let us know your reasoning in the comments below.


Comments

    For me frame rate > resolution.

      Good point. I can't tell while I'm playing that a PS3 game is upscaled from 720p or whatever rubbish. I can tell the difference between sections that are "60" and "30" fps.

        Yeah, out of all the Xbox One and PS4 things to be buzzing about, my mates and I were stoked with how smoothly the games ran compared to their predecessors.

          Personally I don't think it's so black and white. Frame rate is more important than resolution to some degree. It also depends on the type of game. ACIV on PS4/XB1 runs at 30 fps. For the style of game I think it's fine. On PS4 it's 1080p and on XB1 it's 900p. If they scaled back to 720p and got 60fps I think that would be worse. ACIV is a beautiful game and 1080p on PS4 really helps the game stand out graphically. I would take 1080p/30fps over 720p/60fps easily. However, if it was a racing game or a first person shooter like CoD/BF I would take 720p/60fps over the resolution.

          The biggest thing for me is if the frame rate is consistent. A constant 60fps in any game is fantastic. Buttery smooth, even. But if you get the frame rate constantly fluctuating between, say, 35 and 60, then you will end up with on-screen judder, variable latency in controller input and possibly screen tearing if v-sync is not engaged. This to me is the worst possible outcome and I'd prefer a locked, consistent 30fps.

          Digital Foundry did a brief look at Tomb Raider: Definitive Edition on PS4 and XB1 and their findings showed that on PS4 during gameplay the average fps was around 51 and on XB1 around 29.8. At first glance it seems clear that the PS4 is the obvious choice. But DF states the frame rate fluctuates from 32-60 and in parts causes noticeable judder on screen. XB1 also drops frames at times but is overall more consistent. I'll wait for DF to do their complete analysis of the game, but I personally would prefer a consistent frame rate. (http://www.eurogamer.net/articles/digitalfoundry-2014-tomb-raider-definitive-performance-analysis)

          I'm not trying to be biased towards any console. I have both an XB1 and PS4. I just don't think everything is as simple as it seems sometimes.

            Even then, though, I think it's only an issue if it fluctuates a lot. If the PS4 version is constantly swinging back and forth between 32 and 60 and is just all over the place, then yeah, that's a problem and would probably put me off the game a bit. But if it only dropped like that once or twice during the testing (I can't actually watch those videos because YouTube is blocked here at work) then it's really not an issue and I'd take the higher slightly variable frame rate over a consistent 30fps. I believe (haven't played it myself) that there was a similar issue with COD Ghosts on PS4 where the frame rate dropped occasionally, but from what I read that was just when it was autosaving at checkpoints, so if the frame rate drops for a second there it's not really going to bother me compared to having it drop during the important bits.

          True. This is one of the major reasons I've almost entirely stopped playing games on consoles. At the start of a console generation the machines are capable of putting out nice graphics at a solid frame rate. Everything looks pretty and runs smoothly.

          As time drags on though, developers are constantly pushing for better "graphics", because that's what customers expect. You need to look as nice as everyone else does in those still images for publicity's sake, so the typical frame rate seems to keep dragging lower and lower until the new acceptable standard is "a solid 30 FPS".

          These days if I can, I'd rather just get games on PC, that way I know I can keep the frame rate where I want it.

        I can. It's not rubbish, you just aren't as observant about something that doesn't appeal to you.

    Only if it drops below 30fps, I guess, would it be noticeable. If it's zooming along but fluctuating between 30 and 60fps, I can deal with that in most cases, but not when it starts to look like a slideshow.

    I would much rather next-gen games that run smoothly over ones that have high-fidelity visuals. Looking as good as real life means nothing when you chug down to 10FPS during the action.

    Only when it drops low enough to be noticeable.
    While I appreciate the smoothness of 60 FPS, anything higher than 30 is fine with me.

    Yes, it matters. The whole argument over graphics between the two next-gen consoles is a complete waste of time and it has little bearing on the overall quality of the gameplay for a multi-platform game, but it does affect the experience you get while playing and therefore, all other factors being equal, if it was 60 fps on one platform and 30 on another you'd generally want to go for the 60 fps option.

    I wouldn't refuse to play a game because it's only 30fps or something though.

      Basically, this. The only problem I've had with fps recently was DayZ running stupidly slow on my pretty great PC (Arma worked fine, the mod didn't) and the console versions of Assassin's Creed 3. If it doesn't affect the way I play the game then it matters less to me; it'll still skew my decision though. I would appreciate it, however, if the media could actually tell the truth instead of dancing around the subject. When the CoD: Ghosts info came out (One at a lower res, PS4 at a higher one), I thought it was pretty self-explanatory. PS4 had a lead in terms of power. So what? One system always does. But then articles started coming out trying to give "fair" comparisons to both, blatantly lying about how much the PS4 version slowed down to try and balance the story somewhat. After playing and owning Ghosts on PS4, I've found that it's untrue. There's almost no slowdown, and the only time it hitches is clearly when the level is being loaded, a fact everyone failed to mention. The perception that the numbers really matter all that much, when we've been fed a healthy diet of low-res "HD" console games running at 25fps on TVs that basically eliminate the effects of low fps, is just fuel for arguments. It doesn't really matter; consoles have more to offer than raw power.

    I don’t care as long as it stays above 30fps for the vast majority of games.

    I was on here arguing around the Xbone/PS4 launch that I’d rather have my games look better (higher res, more characters on screen, bigger everything) at the expense of frames 31 and above. Developers know this which is why every new generation for the past 15 years has increased graphics at the expense of the extra frames.

    I couldn’t work out what the big deal was with people complaining that some games ran at <60fps like it was suddenly “the point of the next gen”. I didn’t see the point of the PS2 or N64 being to run PS1 and SNES games at 60fps!

    It’s nice, and is important for some game types (racing games mostly), but it’s not the be-all-and-end-all.

      Because the graphics are on par with or below mid-range PCs. There is no 'increased graphics' _at this point_, which is why people just want a stable framerate to at least enjoy the game with.

        Consoles have always been on par with mid-range PCs at launch. This is exactly what I don't f*cking get.
        Maybe you can explain to me why the comparison is suddenly between consoles and much more expensive PCs?
        A high-end PC cost $2000 when the N64 came out. Were you completely horrified when the $400 N64 wasn't significantly more powerful than a 'mid-range' PC?

        Even accounting for the fact that they’re new hardware (put Perfect Dark Zero next to Crysis 3 to have a look how much games improve over a generation), there’s absolutely no disputing that there is a leap in graphical quality between the Xbone/ PS4 launch games and the 360/PS3.
        That’s the graphical improvement, of course it’s not going to look a million times better than more expensive PC hardware.

        Both next gen consoles run about how you’d expect them to run if you put the same hardware in a PC. Funny that.

          Actually, the comparison wasn't meant to be with PCs per se; what I'm saying is 'next generation' can't even handle 1080p in some situations, let alone 60fps. That, to me, is a failing, especially when 1080p has become par for the course for those used to PC gaming in the last couple of years. If you can't hit the native resolution of most HDTVs, I consider that you're either a) shipping an unoptimised game, or b) caring more about graphical fidelity in the sense of polys, FX etc than the overall experience.

          I would semi-agree with you, there are a few launch titles on each that look 'next-gen', but several that don't look significantly different to their predecessors. With regards to your last comment, it's interesting you say that, because there are several high profile multiplats that look and run better on PCs with comparable hardware to the Xbone/PS4. I'm not against consoles (i own a PS4, PS3 and 360), but I'm also not kidding myself about the superior _platform_.

            I agree that it is a big sh*t that the Xbone isn't hitting 1080p for everything. It would be nice.

            PCs are always going to be the superior (and more expensive) platform, but don't forget that this is the first time out for all developers using the new consoles' operating systems and hardware idiosyncrasies. It shouldn't be a complete shock that some comparable PC setups will run games better, and some will run worse.

            One thing I can guarantee, in 3 years time the Xbone and PS4 will be running smoother, better looking versions of Elder Scrolls VI or whatever than the $600 2013 PC will be.

              You can't guarantee that last statement at all. I'm running a 3 year old PC (it cost me $900) that still runs multiplats at a much higher graphic fidelity than Xbone/PS4. Consoles will always be playing the catchup game thanks to cheaper hardware, overclocking, and Moore's law. It isn't a big deal, but it's just something that console players should accept.

    As long as it's consistent and smooth I don't really care. I don't think anyone can really tell the difference between 30fps and 60fps anyway. Sure, 60fps probably does look a bit smoother when you put them side by side, but it's like looking at paint colours and being mystified by why both bone and eggshell need to exist. Sure, there's a distinct grade when you put the samples next to each other, but at the end of the day only one is going on your wall and it's gonna look white.

      People can tell. I can easily tell. That's like saying, no one can really tell the difference between 24fps and 48fps in a movie.

        People can tell, but I think his point is more that it shouldn't matter, since frame rate is less noticeable on a TV than a monitor.

      That's ridiculous. They're vastly different and anybody who can understand the concept of frames per second can see the difference clear as day.

      I think plenty of people can tell the difference, but (in my opinion, anyway) it really doesn't matter as much as some people claim.

      The Uncharted series, as an example, was not ruined by being 30fps, and it would not have been dramatically improved by being 60fps. It was rock-solid and designed to work well within its technology's restrictions.

    Stability is most important... I don't care if it's 30, or 28, or 60, or 45. So long as it stays consistent.

      Preferably at a multiple of the refresh rate though, otherwise you'll get tearing even if the framerate is steady.
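
To make the point about refresh-rate multiples concrete, here is a minimal illustrative sketch (plain Python; the function names are my own, not from any real engine or display API) showing why a frame rate paces cleanly only when each frame spans a whole number of display refreshes:

```python
# Illustrative sketch: on a fixed-refresh display, a frame rate paces
# evenly only if each frame is held for a whole number of refreshes.

def refreshes_per_frame(refresh_hz: float, fps: float) -> float:
    """How many display refreshes each rendered frame spans."""
    return refresh_hz / fps

def paces_evenly(refresh_hz: float, fps: float) -> bool:
    """True if every frame is held for a whole number of refreshes."""
    return refreshes_per_frame(refresh_hz, fps) % 1.0 == 0.0

# On a 60 Hz display, 30fps holds each frame for exactly 2 refreshes.
# A steady 45fps spans 1.33 refreshes per frame: with v-sync the hold
# time alternates between 1 and 2 refreshes (judder); without v-sync
# the new frame lands mid-refresh (tearing).
```

So even a perfectly steady 45fps on a 60 Hz panel judders or tears, which is the point the comment above is making.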

    Never really cared about frame rates - until I played an unpatched Arkham Origins. Goddamn, that was horrifying.

    Yes, I care about frame rate, because if you've ever played at a frame rate lower than 30 you'll get really frustrated as the game keeps pausing. So yes, frame rate means a lot to me; I want my game to run smoothly.

    Depends.
    It's something you notice when it goes wrong, but notice less for good framerates.

    Example: Link Between Worlds looked gorgeous in 60fps, but I totally forgot about it while playing. It could just as easily have been 30fps and I would have enjoyed it just as much.

    If the same game is available at two different frame rates (such as Tomb Raider: Definitive Edition), I'd naturally account for it in my decision between versions, but I'd also weigh things such as the controller, where friends are at for multiplayer, possible differences in resolution, etc.

    I love 75 FPS overclocking my monitor at home, no vsync necessary

    As long as it doesn't drop, it's fine. I'm ok with 30, though 60 does look nice.

    Found it funny that the Wii U seems to have so many games that run at 60fps in 1080p while the more expensive ones seem to have so few :P

      No Wii U game runs in native 1080p. They are upscaled to 1080p. Yes they run at 60fps and they still look great, but they are usually rendered at 720p and upscaled to 1080p. This is like many games last gen on PS3/360. In fact, lots of games (see CoD) run at resolutions much lower than 720p and then are upscaled from there. Only on PC and now on *some* XB1/PS4 games are we seeing games that are actually rendered at 1080p.

    I'd like it to run smoothly. If it drops noticeably at any point, fix that shit up immediately. But at the end of the day, as long as it looks smooth, I'd like a visually stunning game. At a certain point, higher frame rate is redundant, visual fidelity is always welcome.

    That being said, 50fps minimum please. I guess 30 isn't always noticeable on console, but still, keep up with my TV's refresh rate.

    Absolutely. Beyond the simply pleasing presentation, input response is directly tied to frame rate. The higher the frame rate, the quicker your actions are represented on screen (~16.7ms @ 60 vs ~33.3ms @ 30). It's why any serious fighting game, shooter or racing sim always targets 60fps.

    Gameplay matters above all, and a high, stable frame rate ties directly into that; it's what gives Call of Duty its secret sauce.
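
The latency figures in the comment above come straight from dividing one second by the frame rate; a minimal sketch of the arithmetic (plain Python, purely illustrative):

```python
# Time budget per frame: 1000 ms divided by the target frame rate.
# This is also the smallest granularity at which a button press can
# be reflected on screen.

def frame_time_ms(fps: float) -> float:
    """Milliseconds between consecutive frames at a given frame rate."""
    return 1000.0 / fps

# 60 fps -> ~16.7 ms per frame; 30 fps -> ~33.3 ms per frame, so at
# 30 fps your input can trail the display by roughly twice as long.
```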

    Sure, a faster frame rate isn't vital, but having a game run smoother can never be a bad thing, can it?

    It matters to me. To me it's like moving from 4:3 to 16:9. The PS4 and XB1 are basically supercomputers compared to the PS and Xbox. They have the hardware to pump out 60fps. Why not use it? I guess with the speed that games are developed these days, frame rate doesn't really come into question. I would like to see more 60fps games come out on both systems.

    As long as it's above 30fps I have trouble noticing any difference. Drop below 30 however and I start caring big time. So I just do whatever I can to make sure my settings keep things at a steady 30 and everybody is happy.

    Dwarf fortress sucks once it drops below 30fps.

    Stability of framerate matters more to me than the actual number, but I can tell the difference between 25, 30 and 60, and it's definitely a case of the more the better. But then I've been known to hear refresh rates in CRT monitors, so I'm something of a freak.

    need an option for "solid frame rate"... as that is what counts for me... if i can't get a solid 60 i'd rather have a solid 30...

    I care about frame rate in the sense that I don't want to be seeing stuff like screen tearing, frame skips, or just having a plain old bad time because I can't move anywhere due to frame lag. That being said, frame skips are ok if they don't happen often and don't make the game look janky.

    Like most have already stated, I believe that stability is more important than anything... Sure, that 60 > 30, but I don't really care what the game runs at as long as it is stable.

    Constant anything is far better than something that jumps around and drops 10-50% of frames because something awesome happens on screen.

    Not to the point of frothing anger that some seem to work themselves into.
