At the start of this new console generation, the debate over the importance of framerate and resolution still rages on. The gold standard is a solid 60 fps at 1080p. If those numbers aren’t on a bullet-pointed press release somewhere, eyebrows are raised and questions asked. But what if hitting 30 fps is intentional and defensible?
Dana Jan, game director on The Order: 1886, is committed to 30 fps — at least for his game. I sat down to speak with him about two weeks ago after playing a brief demo of chapter III of The Order.
You can check out a sample of what I played in the video above. (Note that it might not be running at 30 fps: the footage was provided to me by Sony and uploaded to YouTube after our editing process, so I can’t be sure what happened to it along the way.)
Remembering the debate some of our readers had over whether 30 fps is ever a legitimate target, I decided to pick his brain and see if he had a response to everyone’s hesitations.
“60 fps is really responsive and really cool. I enjoy playing games in 60 fps,” Jan told me. “But one thing that really changes is the aesthetic of the game in 60 fps. We’re going for this filmic look, so one thing that we knew immediately was films run at 24 fps. We’re gonna run at 30 because 24 fps does not feel good to play. So there’s one concession in terms of making it aesthetically pleasing, because it just has to feel good to play.
“If you push that to 60, and you have it look the way we do, it actually would end up looking like something on the Discovery Channel, like an HDTV kind of segment or a sci-fi original movie maybe. Which doesn’t quite have the kind of look and texture that we want from a movie. The escapism you get from a cinematic film image is just totally different than what you get from television framing, so that was something we took into consideration.
“Then, on top of it, I don’t know of any other games that are gonna look like our game in real-time with no pre-rendered movies, with all the stuff that’s going on lighting-wise, and run at 60. I think that’s probably the thing that most people underestimate is [that] to make a game look like this — the way that they’re lit, the number of directional lights that we have… We don’t have a game where you’re just outside in sunlight, so there’s one light. We have candles flickering, fires, then characters have lights on them. So [to make] all those lights [work] with this fidelity means, I think, until the end of this system most people won’t have any clue how to make that run 60 and look like this.
“That was something where we kind of said, ‘What was important to us?’ We’re visual creatures. When we see things, that’s kind of our first sense. I think immediately we look at this game, one of the things that’s exciting to me, it feels next-gen. It’s one of the first things where I go, OK, I think this is helping define what next-gen really means. Getting a new system, actually booting it up, and saying something like, ‘I’m blown away by what I’m seeing.’ There’s almost nothing that you can take away from that.”
Jan: “Until the end of this system [the PS4], most people won’t have any clue how to make that run 60 and look like this.”
About five years ago, Mike Acton of Insomniac Games proposed something that probably seemed wild to some people. He said that “Ratchet and Clank Future: A Crack in Time will probably be Insomniac’s last 60fps game.” His reasoning at that time aligns fairly closely with what Jan said to me two weeks ago.
Here’s what Acton found in researching the importance of framerate:
However, during development, there are hard choices to be made between higher quality graphics and framerate. And we want to make the right choices that reflect our commitment to providing you with the best looking games out there. To that end, our community team did some research into the question of framerate. The results perhaps confirmed what I’ve known for a long time, but found it difficult to accept without evidence. They found that:
- A higher framerate does not significantly affect sales of a game.
- A higher framerate does not significantly affect the reviews of a game.
And in particular they found that there was a clear correlation between graphics scores in reviews (where they are provided) and the final scores. And they found no such correlation between framerate and the graphics scores nor the final scores. As an interesting side-note, our team also found no direct correlation between gameplay scores and final scores, however it does appear that gameplay scores are also influenced by graphics scores. i.e. Better looking games appear to be more “fun” to reviewers, in general.
So, what do you say? Can you be content with 30 fps?
Comments
64 responses to “A Developer’s Defence Of 30 Frames Per Second”
HAHAHAHAHA, can’t wait to go on IGN and watch the PlayStation fanboys cry after dissing XB1 framerates. Personally I think the game looks great regardless. (Edit: just realized this is old news, will crawl back into my cave.)
They will just scream about res instead.
Just annoys me when gamers have the ideology that graphics define your enjoyment of a game. So many indie developers put so much heart and passion into their titles, and just because theirs isn’t the best-looking game, they’re ignored by many.
I’ve just upgraded my GPU, but that’s so I could play certain new games. I play stacks of old games that look like crap by today’s standards, but the thing is: they were fun when they were released. How can time detract from the level of fun you experience? I can’t say it does.
Well I play GTA Online every day, so yeah. I’m fine with it. I’d like more, but meh.
Wow even developers are jumping on the “It feels more cinematic” bandwagon now.
The games industry (particularly western devs) is increasingly forgetting that it should make games with great gameplay, not shit, slightly interactive movies with horrendous stories.
I have no problem with Fps or how many “p’s” there are in the resolution, as long as it looks and plays well. I really don’t understand the whole fan boy thing. You clearly like games, so get both if you can afford it. Meanwhile the PC is there if you truly care about visuals.
Having said that, I currently have a dusty ps4 that is only used for Netflix at the moment and am waiting for destiny and a game like 1886 so I can play games on it again. Again, 30fps is fine, but from the video I have seen, 1886 is noticeably dipping below this, enough to affect gameplay. I really hope they take the extra time to optimise, as the game has a great look with somewhat generic shooter mechanics, but if it is constantly juddering along then I think it will detract from the whole experience.
Back to the PC master race until good games come out for the consoles..
Unfortunately nothing can justify 30fps. He said it himself, unintentionally: most people will not know how to push a game like this to 60fps with what they have. I guess they’re among the people who can’t do it, and they’re trying to justify it by saying they chose not to.
Yeah it all sounds like excuses. Clearly they would have it in 60fps if they could but they’ve made compromises.
And this guy says how nothing else will look like this game fidelity-wise, maybe he should check out PC gaming some time.
As a filmmaker who loves games, frame rate is ABSOLUTELY an aesthetic choice, at least in part. Seeing Dark Souls 2 running on a PC at 60fps was a shock to my system, and not necessarily for the best. Granted there’s no excuse for a shitty frame rate (hello DS1 Blighttown), but Demon’s Souls and Dark Souls were both around 30fps. This gave the games a certain look and feel, which is changed drastically by running at 60. Though it’s arguable that 60 is better suited to the high-stakes gameplay of the Souls games, I think the 30fps “look” is a big part of the visual identity of those games too. 60fps just looks less “classically cinematic” to a lot of people’s eyes.
I agree with the point that they’re keeping it at 30fps for the cinematic visual they’re trying to achieve, but it ticks me off when they say no one can do 60fps with their game. He makes it sound like, “Yeah, I could do it, but no one else is doing it and it won’t impact my sales, so I’ll save time and not optimise it.”
They could at least do something like Infamous Second Son and Killzone and allow players to lock the fps if they want to. I guess they just don’t want to spend more resources on building it. But I have to admit I like the game’s direction.
I think you misread his comment. I think he was trying to say that if anyone made this game at 60fps, it wouldn’t look right. So even if the system were capable of it, this particular game would look different (in his eyes, wrong) at 60fps vs 30fps.
I thought he was trying to say that the drop in frame rate allowed for them to put more power into other aspects so it’s about balance rather than just maxing out the fps for the sake of hitting somewhat arbitrary goals. You wouldn’t make a 1000fps game that couldn’t handle basic shadows, alternatively you wouldn’t make a 0.0001fps game that rendered the entire world with realistic lighting.
With the same time and effort, a 60fps version of his game would come out looking worse due to the compromises he’d have to make to get a stable framerate. There’s no optimise button you can keep hitting so to achieve the desired results you’ve always got to be making sacrifices.
So it’s not a statement about 30fps vs 60fps, but an argument in favour of looking deeper than the dumbed-down checkbox stats. It’s a pretty valid criticism of the way we evaluate games, especially PC gamers, who have a long history of ignoring the game itself and focusing on the required specs. I remember as a kid hanging around with my friends who were PC gamers, and they’d go on and on about how well the bland, uninspired games they were playing performed. 60fps Colonial Marines is still Colonial Marines. =P
I can get behind graphical minimum performance to a degree, I can’t play non-HD 3D games without feeling like I’m wearing really strong prescription glasses, but fanboys at the start of a new console generation always get ridiculous about it.
That is one way to interpret it. Personally, I feel offended when he says this:
I don’t know if it’s just me, but he sounds so cocky about it. I accept that it was for the experience they’re after, which is similar to movie quality, but that line ticked me off.
EDIT:
Sorry, I missed a point. The way he explains things, his reasons for not doing 60fps ultimately were:
1. Movie-quality decision
2. No profit difference
3. Most people won’t be able to do such a quality game at 60fps
It’s fine that it can’t be done; we know how much detailed graphics impact performance. But the “no profit difference” point is uncalled for, and saying that most people can’t do it is a little overboard.
With time people may be able to achieve these results at 60fps. Technology improves as developers get more and more familiar with the hardware that they’re working on and spend hours profiling and tweaking the hell out of their code. That doesn’t necessarily mean that they can do it right now with this particular title.
As for “nothing can justify 30fps”, that’s really not true. Increasing the frame rate requires making compromises like simpler shading, fewer polygons, etc. If developers have designed with a particular look and feel in mind that can’t be achieved at 60fps then they’re not wrong, they just have different priorities than you do.
Games have been perfectly fine at 30fps and below for as long as they’ve existed. There are unquestionably some games that feel better at higher frame rates, but if your design doesn’t require it then it’s really not the game-destroying issue that some fans claim it is.
I suppose the framerate debate in games is similar to the debate in film. Peter Jackson copped some flak for shooting the Hobbit movies at (I think it was) 48 fps, and people said it looked like a daytime soap opera. Personally I felt that things were a lot more fluid, but it did highlight where corners had been cut in production, e.g. some of the props looked particularly fake.
I like to think that the people who stand by lower framerates for film, TV, and games will eventually come to accept that technology has changed to allow for better quality in all aspects of production. If something doesn’t look or feel cinematic at a higher fps, then perhaps the core content of the scene isn’t particularly cinematic and you’re trying to blur things by making it run slower.
I disagree a little; high framerates in slower dramatic scenes in films do feel ‘soap opera-ey’ to me. But in an action scene where lots of things are happening? Bring it on! There’s nothing to say they can’t use a variable framerate.
I like higher fps in modern movies because the current trend of crappy, shaky, camera work and idiotic fast-cut editing is reaching such levels that everything becomes a blur at 24fps. That’s just bad film-makers failing at their jobs, though.
Maybe Sony fans will realise that fps doesn’t mean shit, and that it’s a consistent frame rate without large fluctuations that really matters. There’s a reason some of the Sony titles touted as 60fps were patched with options to drop back to 30: sane people like myself were annoyed at the frame rate fluctuations. Give me a 30fps game that only drops a small amount over a 60fps game that frequently drops 20-30fps during gameplay any day.
What’s that got to do with Sony fans? You don’t have to go all fanboy to make a logical argument 🙂
I guess it is just a matter of choice. Quite a few people prefer to play at a locked frame rate, while fluctuations don’t bother others at all. Infamous and Killzone did a good job of maintaining at least 30fps. Killzone doesn’t really go up much; usually the frame rate hovers around 30-50, rarely 60. Infamous is quite similar, with a minimum of 30fps that goes up to 60fps in certain places, especially when you’re high up on a building. It feels so fun to dash around in 60fps in the air lol.
Titanfall is better in that regard but still suffers from dips. The game is usually 60fps but can drop to 30 or lower when smoke and Titans clog up an area. Forza was good: a constant 60fps that rarely dips. Dead Rising 3’s constant 30 fps is great; I used the TV’s motion flow and it became butter smooth because of the constant fps. Sadly I can’t do that with PS4 games, as they make my Sony TV go into “game mode”, which leaves the fps locked by the PS4.
Either way, both consoles do much the same thing, with the PS4 giving you the choice while the X1 locks it down for you. FPS is just a matter of preference for now: previous-gen consoles all ran at 30fps, and not many people know or care about 60fps, since the majority of gamers are console gamers rather than PC gamers.
Even my badass PC will fail to impress me with constant 60fps sometimes and I ditch it to play on my console.
TL;DR:
Yes constant framerate is always preferred but I prefer to have the choice to enjoy the fluctuation or to lock the frame.
People who don’t embrace new technology and make the most of it will generally fall by the wayside. Obviously in games this applies more to AAA developers rather than indies.
Personally, from what I’ve seen, The Order: 1886 looks like it should’ve been a launch game: nothing in the gameplay I’ve seen is new since last gen, and the graphics aren’t looking better than inFamous or Wolfenstein as of yet, although if that’s down to YouTube then I can let that slide.
As it is, I was initially excited for this game when it was announced last year. Then, on hearing it was a third-person game, I feared it would turn out to be largely generic, and now it seems like Sony developers are already making excuses for their games. I’ll wait for other developers in a year or two who’re interested in pushing these consoles to their max potential to say that 60fps, 1080p with massively improved graphics isn’t possible before I believe it. This is a developer who’s previously done largely PSP games; I’m not going to take what they say about a console’s graphical potential as gospel.
Having grown up in an age where 10fps in non-textured polygons was regarded as quite good, anything north of about 15fps keeps me pretty happy. (Insert story about gaming uphill both ways, through the snow.)
That said, for similar reasons graphic fidelity can be less important than the question of whether the graphics support the gameplay. This is partly a matter of framerate, but more a matter of whether attention is properly directed at the elements that are most important in actually playing the game. Developers are getting much better at this sort of thing.
I don’t know about uphill both ways, but playing games off a tape involved watching a loading screen for about 30mins just to find out that something, somewhere screwed up and you need to start again.
10fps with non-textured polygons? Luxury.
30 minutes? Luxury!
Medieval Combat on the Atari 800 took 42 minutes to load. And it took my brother and me about 3 weeks to type it all in from Creative Computing magazine.
Agreed. Virtua Racing on the Mega Drive blew my fragile little mind when I was a kid, and that ran at about 20fps. Doom was a revelation, as I was used to games like Steel Talons (Atari Lynx, 3-5fps) and Corporation on the MD. Anything north of 30fps is a luxury for me. I can definitely tell the difference between 30 and 60, but I don’t get hung up on it. As long as it’s consistent, and I can see what I’m doing, then I don’t really care. Some devs go for more detail, less fps. Others choose less detail and higher fps. It’s up to them, and that’s fine by me.
If they want to make all their decisions by a metric, to see how well it’s correlated with sales, then that’s fine. But I’ve been gaming for a long time and I KNOW that I enjoy games that are responsive with tuned controls.
I also make movies. And the movies I make are in 24fps. So I know they are NOT equatable. One is a purely visual experience, sitting there, soaking it in. Frame rate isn’t important in this situation. As long as there’s no obvious flickering, the audience doesn’t care. The other is IN the experience, controlling it, affecting it. And 60FPS+ is much better. 120FPS is better again.
I personally think these kind of snippets are more about the developers trying to convince us of their views, than actually giving us the truth. Anyone who’s played any FPS competitively can tell you that 60FPS is the minimum for actually having some kind of control.
Bam, you nailed it on the head.
Thanks man 🙂
It makes you think: why not have your in-game cutscenes at 30 and everything else at 60? Or why not lock at 45? I run BF4 on Ultra so I only get 45, but it feels better than 30.
Totally. But they also mentioned in the article that they don’t know how. I think that explains a lot. I wonder if they could, would they? Or are they really that dedicated to the film like look?
I run BF4 with everything on ultra except for the particle effects etc so I can see a bit more.
I’m lucky though that I have a decent rig, so I get around 60FPS v-sync’d. Bonus of having a PC that you use for video editing, grunty GPU 🙂
If you’re playing competitive FPS with a controller then that’s going to be a FAR bigger impost on your ability to control the game than the difference between 30-60fps are going to be.
My point is for all the whinging people do about a framerate of “just” solid 30fps there’s much larger factors in play which people don’t whinge about because they’re not so easily quantifiable.
Any kind of online gaming is going to result in lag which affects your ability to play precisely far more than the amount that a solid framerate of 45fps is going to.
Playing with a controller is the same, if we play Quake 3 and you’re using a controller at 60fps I’m going to back myself to kick your ass with a mouse and keyboard at 20fps.
I think this argument is very console-centric. Games will run at whatever you’re willing to pay to get them to run on PC.
If you’re a competitive PC gamer then hell yeah I understand that it’s important that your PC is smooth and fast, but for console games the majority of the argument about 30 or 60fps is just fanboy bullshit. Arguing over a figure because it’s easy to determine.
I know I’ve blamed a lot of things when I’ve died in games, but it’s never been a solid 30fps framerate.
It is, and yeah… I see your point. I think if you’re playing at 20fps though, a skilled gamepadder might have a chance 😛
I don’t think it’s fanboy bullshit though. I don’t have either of the consoles yet, but I have a PC and I’ve dabbled with being a competitive player a few times. If I know one of them can run the same game at higher framerate, then that’s the one I’m going to buy. And I’m as far removed from being a fanboy as possible.
The power of the machine should always be a consideration if one machine is going to consistently outperform the other. Whether is draw distance, texture pop-in, load times, resolution or frame rate, they’re all factors which should be considered for sure.
I’m just against the people who are getting hysterically mad about games this console generation not performing at upwards of 30fps like it’s some kind of right or a massive failure of a game developer. Very, very few console games would be significantly impacted by a frame-rate of “just” 30fps. Of all those factors listed above, I’d put frames 30-60 as the lowest priority if it led to an increase in any of the others.
The metrics I assume you’re referring to at the bottom were said by someone else, not the interviewee. I assume they were added as a second voice to give another side to the argument.
As a film maker myself I think you’re selling frame rate short. Changing the frame rate actually has quite a strong subconscious effect that can greatly alter what you’re shooting, especially if you reinforce it in the grade.
Dropping the frame rate into the teens can be really useful for creating as part of an old silent film look and having a too high frame rate is one of the main reasons that these sometimes look off.
Similarly, shooting at 30fps gives a great TV/tape feel, due to its almost exclusive use on NTSC (read: US) TV. Punch up your colours and saturation a little and you’ve got a great little telenovela.
Last season of VGHS used 24fps for most of the series but 48fps for the action scenes set inside videogames to further differentiate the two worlds with a subtle effect. Which also gives them more freedom in their grades since they don’t need to use some other visual language to unify the game scenes.
However, I’m not a programmer, so I’m not too sure how games “capture” the player’s view as it’s rendered. This whole cinematic look actually relies on the virtual camera imitating a real camera’s shutter speed and creating motion blur that would match it in that situation.
That level of motion blur is the thing that could potentially cause more issues in very fast twitch reaction style games though, much more so than the frame rate itself.
Anyone more familiar with programming willing to jump in with info on the virtual cameras?
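For anyone curious, here’s a rough sketch of one common answer: the virtual camera fakes shutter speed with an accumulation buffer, rendering several sub-frames inside each output frame and averaging them. Everything here is illustrative; `render_scene` is a made-up stand-in, not a real engine API.

```python
# Sketch of accumulation-buffer motion blur (illustrative, not a real engine).
# A "virtual camera" blurs motion by averaging several renders taken while
# the simulated shutter is open.

def render_scene(t):
    # Hypothetical renderer: a 5-"pixel" strip with one bright spot
    # that moves over time. Real engines render actual frames here.
    return [1.0 if i == int(t * 200) % 5 else 0.0 for i in range(5)]

def motion_blurred_frame(t, frame_dt=1 / 30, shutter=0.5, samples=4):
    """Average `samples` sub-renders spread over the open-shutter interval.

    shutter=0.5 mimics a 180-degree film shutter: the blur covers half
    the frame time, which is the classic "filmic" look.
    """
    open_time = frame_dt * shutter
    acc = [0.0] * 5
    for s in range(samples):
        sub = render_scene(t + open_time * s / samples)
        acc = [a + b for a, b in zip(acc, sub)]
    return [a / samples for a in acc]

frame = motion_blurred_frame(0.0)
print(frame)  # the spot's energy is smeared across the pixels it crossed
```

Because the blur is tied to the frame time, a 30fps image with a matched virtual shutter looks “filmic”, while the same scene at 60fps with a shorter shutter looks sharper and more video-like.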
Gotta disagree.
Tuning is what’s important to gameplay. There are enough examples of shooters in past generations that didn’t need to go to 60.
I don’t think the debate is whether 30 or 60fps is better. I always thought the debate was whether games that claim to be 60fps let their fps range wildly between 40 and 60 rather than remaining constant.
I agree that the actual fps rate is a design decision, but the game should be constant in its fps.
From a technical standpoint, you have no idea how hard it is to do ‘constant 60fps’ with no visual glitches. The problem is that the consoles aren’t powerful enough to run a constant 60fps, so it’ll never happen. Dips will always happen.
Slower fps is still a valid excuse for film, as people simply don’t like it yet. It is no excuse for a game, as higher fluidity in user-controlled actions actually increases the immersion (feeling that you’re inside this game world). The dude’s lying out his ass.
Good for Dana, I agree with him 100%.
Unless you’re playing a racing game or something like that frames north of 30fps are almost totally unnecessary.
I’m not 100% certain which console was the first technically able to push out a framerate north of 30, but it’s been a LONG time. Developers have never put frames above 30 ahead of increased graphical fidelity, and I don’t expect them to start now.
Frames above 30fps don’t show up in pretty screenshots, they don’t translate well to Youtube trailers and they don’t sell games.
At the risk of drawing the ire of a handful of very uppity posters to a games-orientated website, there’s a damn good reason that sales aren’t impacted by framerates between 30 and 60fps and that’s because 99% of the population gives absolutely zero fucks.
For the gamer with a particular lack of interesting things going on in their lives, frames above 30 become a “fun” arguing point for fanboy wars and a solid benchmark for an expensive PC. It’s not the end of the world at all if a title, and particularly a shooter, runs at 30fps.
For a racing game sure, if there’s two different console versions and one runs at a higher framerate then that’s a good selling point for that version.
Most people just want their games to look great and run smoothly, solid 30fps is “smoothly”. I get much more uppity about screen tearing, poor writing, bad controls and a million other factors that can’t as easily be labelled with a number.
Fun thing to consider, the VAST majority of people who buy COD games buy them to play online. Every single person playing that game has their control impaired by some level of obvious network lag and they deal with it….. yet the same people will melt the internet with rage if told that the game occasionally drops to 45fps- something that the vast majority of them would never be able to notice themselves.
FPS is just an easily quantifiable source of rage for gamers, as long as it sits at a solid 30fps, I’m not going to complain.
+1. Totally agree.
Frame rate is absolutely a choice. Wherever there are limits, there will be sacrifices, and what you sacrifice is a choice.
As these are new consoles, they need to impress the masses and it is much easier to wow them with how a game looks than how it feels.
Hence in this case they chose graphics over gameplay. Could they have made this game run at 60fps? Yes, but they would have had to downgrade the visuals.
They could even downgrade the visuals and get it to 1080p 60fps.
It’s all a choice and personally 30fps is good enough, as most things are on a console.
I’ve always kind of thought, and Pat from the Two Best Friends confirmed it, that it’s only a problem once you learn the difference between 30fps and 60fps.
I never learned the difference and I’ll be damned if I’m gonna go out of my way just so that I can get pissed off a videogames even more.
Nah, you learn that it is the cause of your rage. You died when you felt you’d actually emptied a clip into a dude? Welp, the cause is actually a combination of your reflexes, 30fps, input lag on the TV, and input lag on the controller.
It just makes you rage more because you know what the problem is, but you can’t do anything about it because devs treat you like idiots.
…whaaa? No, I’ve never felt that. Unless we’re talking about multiplayer, then to me it’s clearly host lag.
No, just no.
Seriously, this comment is just totally BS on so many levels before you even get to the blatantly false accusation aimed at the framerate.
None of what you list as causes explains your example. Since you saw it happen, you had already overcome your reflexes as well as all forms of input lag; since you saw it, the game had processed it happening.
As @neo_kaiser already retorted the only reason for that to happen is host lag. Your internet connection is slow, so the host has overwritten what it saw happen so that everything matches up.
The difference between displaying 30 and 60fps is that a new frame arrives every 0.017 seconds instead of every 0.033, so the most advantage you could possibly have is spotting movement about 0.017 seconds earlier when someone enters your field of vision.
Once they’re in, that’s not enough time to do anything that would give you an advantage. You can’t move or change direction to throw off a sniper shot in that window, for example.
The only time a higher frame rate could make an absolutely tiny difference would be in a twitch shooter where the two players are playing at different framerates. So if a game is locked at 30fps everything is even.
So now you can stop raging over a nearly meaningless number 🙂
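For concreteness, the frame-interval arithmetic behind that one-frame advantage (pure numbers, nothing game-specific):

```python
# At 60fps a new frame arrives every ~16.7ms; at 30fps, every ~33.3ms.
def frame_interval_ms(fps):
    return 1000.0 / fps

t60 = frame_interval_ms(60)  # ~16.67 ms between frames
t30 = frame_interval_ms(30)  # ~33.33 ms between frames

# Worst case for the 30fps player: an event lands just after a shared
# frame, so the 60fps player sees it one 60fps interval sooner.
max_advantage_ms = t30 - t60  # ~16.67 ms

# Typical human visual reaction time is around 200-250ms, so the edge
# is a small fraction of the total reaction loop.
print(round(t60, 2), round(t30, 2), round(max_advantage_ms, 2))
```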
Nailed it.
@stickman ‘s comment is all kinds of ridiculous.
If you’re blaming death in a console shooter on a framerate that’s above 30 then you’ve gone full retard.
Controller lag, TV lag, NETWORK LAG are all much larger factors than missing frames between 30 and 60 are ever going to be…. And even then (apart from network lag) those lag times are so small they’re irrelevant compared to the actual reaction times a human is capable of.
If you’re on PC then I’ll concede that a silky smooth framerate (in conjunction with the finesse of a mouse and keyboard) is probably some advantage over a player with a worse one, but on a console everyone’s the same and there’s far more impactful mitigating factors for lag than any number of missing frames above 30fps.
Wait, you concede that on PC a better framerate can probably give some advantage? And even though I obviously list controller+input_lag+framerate issues (hinting at these all being a problem for consoles), you still imply that I’m talking about the difference between 30 and 60fps on a console? Okay.
I’ve had people who have tried to show me the difference and I still can’t really tell. The arguments about them are a constant source of bemusement.
Pat said you can see it if you put your eyeball up to the monitor.
I once took heaps of meth before a Forza 5 session and I could tell that every time the lap counter reset the frame rate dropped to 55fps.
I got up between races and quickly sent a 10,000 word email to Turn 10 explaining the issue and demanding free DLC and an in-game Ferrari. They never replied though and by the time I woke up the next day someone must have emptied my sent items because there was no trace of the email.
http://boallen.com/fps-compare.html
this will clear things up nicely
Can’t say it does since it only shows one side of the equation. Perhaps a demo from the latest build of the crytek engine showing various scenery at 15, 30, and 60 fps would lend more to the cause.
Anyhow, I find the whole 30 vs 60 fps argument a bit pointless. We all have our preferences but at the end of the day, a developer has a vision and sometimes in order to realise it, they have to accept limitations of technology.
They get to make the choice and not us. Feel free to go make your own game if you don’t like it.
What’s truly sad is how we’re prepared to write a game off over such trivial things. Whatever happened to just enjoying the game?
I sure as hell don’t walk in to a shop with a technical checklist before I buy a game!
I actually kinda dislike some games running at 60fps. The cinematic vs. HDTV doco analogy was well put. Personal preference plays a large part but I am sick of hearing people cry “No 1080p 60fps? Not good enough!”. I think of playing something like SotC at 60 frames and it makes me feel strange.
I’m fine with 30fps. I definitely get that “digital handcam feel” with 60fps games but have no issue with them. The big issue is consistent frame rate. So many “60fps” games dip below that constantly which pulls you out of the game.
This whole console vs console vs PC hardware/resolution/fps war is crazy. The industry should be focusing on making games that are fun/entertaining/scary etc.
Gameplay seems to be the last thing most developers focus on sadly.
Gameplay can’t be defined by a number though, which makes it much less fun if you’re going to be an overly entitled fanboy who likes to whinge/ gloat.
If you’re a PC gamer then this shouldn’t be an issue. Turn the graphics down or buy a new PC if you can’t deal with sub 60fps.
If you’re playing on a console then your controller is a far bigger impost on precision control in any shooter than a framerate above 30 but less than 60 will ever be. Plus everyone else is in the exact same situation as you are so it’s fair.
If you’re playing online and your framerate is 30fps, it doesn’t matter how good your connection is- lag is a bigger factor than the framerate by a large margin.
Most developers DO focus on gameplay/ graphics over 60fps. Consoles have been able to do 60fps for the last few generations at least, but games have forgone fps above 30 because like Dana said, your average gamer doesn’t care and would rather have a bigger, more interesting, more detailed game world.
If websites didn’t publish framerate data I’d guess that 90% or more of gamers would never ever notice as long as the rate stays above 30.
TLDR: a lot of bullshit reasons why we are using 30fps, when the real reason is we would have to sacrifice visual quality to run it in 60
There is a lot of merit to what he’s saying, you just have to not be an ignorant asshole to understand it.
240fps >> 120fps >> 60fps >> 30fps >> 15fps so on and so forth.
Quite simply, if you want more detail in a scene, or a constant look and feel, lock your FPS. The lower you lock it, the more detail you can cram in. Why is this so hard to understand? Add in motion blur and the brain can mentally fill in detail as well.
I’ll take a rock-solid 30fps that is optimised and timed perfectly over a wobbly mess that bounces between 30 and 60fps. I would prefer a solid 60fps or greater, but I actually understand what the developer wants to achieve and deliver, so I’m not being a whiny bitch about it.
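That trade-off reads naturally as a per-frame time budget. A toy sketch (the cost numbers are invented purely for illustration):

```python
# Everything a game does each frame (simulation, lighting, post effects)
# has to fit inside one frame interval, or the framerate dips.
def frame_budget_ms(target_fps):
    return 1000.0 / target_fps

budget_30 = frame_budget_ms(30)  # ~33.3 ms of work allowed per frame
budget_60 = frame_budget_ms(60)  # ~16.7 ms - half the room for detail

# Hypothetical per-frame costs in milliseconds (made up for illustration):
costs = {"game_logic": 4.0, "geometry": 6.0, "lighting": 12.0, "post_fx": 5.0}
total = sum(costs.values())  # 27.0 ms

print(total <= budget_30)  # True - fits comfortably at 30fps
print(total <= budget_60)  # False - over budget, so something must be cut
```

Halving the frame time halves the budget, which is why a 60fps target forces simpler lighting, fewer effects, or lower resolution for the same scene.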
At first his argument was fine, he was literally saying “If we made the frames higher, the game wouldn’t look as nice graphically.” which makes sense, good on him for admitting that and not using “30 FPS is more cinematic” as an excuse…. oh wait… he did…
in my personal opinion i’d rather have 60fps than superior graphics.
It reminds me of when Nintendo was bragging about the N64: they boasted that even though N64 games often had a worse framerate than PS1 games, they looked better.
Putting aesthetics over framerate is the same as putting graphics over gameplay. A better framerate is an inherently more playable and more responsive game. The fact that developers have this attitude disgusts me.
40-45FPS is where it’s at. (Personally) smooth enough for gameplay but not so smooth it looks cheap like soap opera.
I hate playing a game when it feels like watching an old movie or show on a new TV.
720p60 > 1080p30
End of.