Lately, the internet has mustered a plethora of opinions on whether anyone should care about the finer technical details of a game. How much does screen resolution really matter? Is frame rate the be-all and end-all?
Some of us at Kotaku have already made our opinions on the matter known — but I thought I’d ask you what you think. How much do you care about resolution and frame rate?
First up: resolution. Resolution refers to the number of pixels a game fits on screen. These days, most discussion of resolution centres on whether a game runs at the full 1920×1080 (1080p) that most HDTVs are capable of, or somewhere below that, usually 900p or 720p. Some PC gamers push resolutions much higher, though, if they have monitors or 4K TVs that can support it.
Second up: frame rate. A game’s frame rate is a measure of how quickly the image on screen is refreshed. Lately, the discussion of frame rate has centred on the difference between games that run at 60 frames per second (very smooth, generally considered a target for PC games and some console games like Call of Duty) and 30fps (where most last-gen console games run, closer to the 24fps frame rate of most movies and TV shows).
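For readers who want the raw numbers, here is a quick back-of-the-envelope sketch. The figures follow directly from the resolutions and frame rates above; nothing here comes from any particular game.

```python
# Back-of-the-envelope numbers behind the resolution/frame-rate debate.

def pixel_count(width, height):
    """Total pixels the GPU must fill every frame at a given resolution."""
    return width * height

def frame_budget_ms(fps):
    """Time available to render each frame, in milliseconds."""
    return 1000.0 / fps

for name, (w, h) in {"1080p": (1920, 1080),
                     "900p": (1600, 900),
                     "720p": (1280, 720)}.items():
    print(f"{name}: {pixel_count(w, h):,} pixels per frame")
# 1080p: 2,073,600 pixels; 720p is well under half that (921,600)

for fps in (30, 60):
    print(f"{fps}fps: {frame_budget_ms(fps):.1f} ms to render each frame")
# 30fps: 33.3 ms; 60fps: 16.7 ms (half the time to do the same work)
```

So 1080p at 60fps asks the hardware for roughly 4.5 times as many pixel updates per second as 720p at 30fps, which is why the two numbers so often trade off against each other.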
Comments
41 responses to “Do You Care About Resolution And Frame Rate In Games?”
Looks like NeoGAF got wind of this little poll. Why does resolution and/or frame rate matter so much? Since when was it such a huge issue?
Cool projection, but it matters a lot to me. When I get drops way below 30, spiking every 30 seconds, it diminishes the fun. Also, I don’t feel the need to excuse console developers that can’t even target the standard resolution of HDTVs in this day and age. But cool guest post.
This. If any game I play drops below 30 it’s not playable to me, and it’s honestly inexcusable. (Obviously not talking about once every couple of hours on rare occasions, but real frequent drops.)
However, given the sheer power of current consoles, every game should really be trying to hit 60. It makes a world of difference, and anyone who claims otherwise is probably the kind of person who says graphics are irrelevant and doesn’t notice (or physically can’t see; see the bottom of the post) the monumental difference between having shadows on vs off, or real-time lighting effects.
* no real evidence, cue nerd rage.*
They probably have bad TVs/monitors that are improperly tuned, sit so far back it’s impossible to notice the improvement (this is a real thing), and/or have bad eyesight (or are lying/in denial). Similar to my parents, who can’t even appreciate the difference between low-res broadcast TV on a wretched screen and an actual Blu-ray on a top-notch Sony (beyond the fact that there is a difference).
Unrelated to my cheeky yet truthful point, there is also the fact that people are different. Some people are easily able to detect the difference between 30 and 60fps; others literally cannot, and see no change past 30 or even 40. Some even notice changes well past 60. But EVERYONE notices a drop below 30, because it looks awful and can ruin the game. So to the people who can’t tell the difference, I’m sure fps doesn’t matter. To those of us who can, please believe us: the difference between 30 and 60 is HUGE.
Frame rate is extremely important to me, but not in the 30 vs 60 sense; I’m one of those people who doesn’t see much difference as long as it’s stable.
But a frame rate drop (even from 60 to 30) is a terrible thing, and if I can’t run a game at a stable 30+ I won’t play it. It’s honestly that bad (for me at least).
Want to hear something funny? The GTX 780 I am using occasionally runs WoT at only 30fps.
And I care about resolution because my monitor’s native res is 1080p; if the game doesn’t do that, or my machine can’t, it’s going to look poor.
Honestly, I couldn’t care less about the frame rate of games. Sure, anything under 20fps is going to be quite annoying to play, but I know so many people who have an issue with frame rates of 30, and honestly I think that they’re just trying to find something to complain about. Until the last couple of years, I had never heard anyone complain about 30fps, so why is it such a big issue all of a sudden?
I’m fine with 30, like you said, I’m just not a fan of 30 with drops to 15 or 20 every 10 seconds.
If 30 was a minimum frame rate we could deal with it, but it’s not.
It’s probably a bad example, but I really enjoy when the frame rate drops when I’m playing Dynasty Warriors, hahaha! I can understand frame rate drops in high action scenarios, so I don’t really care to be honest.
Let’s be blunt though. We all know this is about PS4 and Xbox One, so when one system has better framerate and resolution while also being cheaper than the other it’s pretty obvious which is technically better.
I’m going PS4 over Xbox One purely due to Infamous rather than those minuscule issues. And yeah, I remember playing the Dynasty Warriors Next demo and I got to the point where I was totally surrounded, so I smashed special moves. Oh, the tremendous amount of joy I felt as the enemies fell at a glorious 10fps. It’s kind of cool because you get the feeling that you’re so powerful you’re breaking the game. I can’t wait to get it on PS Plus next month!
I love when you’re going crazy in huge crowds as the frame rate plummets, haha! It’s one of my favourite things about Dynasty Warriors :3 I need to get a Vita also so I can try DW Next! Too bad I’ll miss out on it from PS+ 🙁
I heard that you can download PS Vita games on your PS3 and then transfer them to a PS Vita when you get one! My PS Vita just plays PS Plus games, so I’m kind of annoyed that I didn’t start claiming them earlier for my inevitable purchase!
Really?! Thanks man, I’ll need to look into that! Damn, I’ve missed so many games already then!! D:
Best option is to use the sony entertainment store via browser. This way you can ‘buy’ games for all consoles (yes, PS4 too).
I didn’t really think it mattered much but here I am buying games for PC that I’ve already played on console. I guess it does matter after all.
Play Arma 3 at 1366×768, 1920×1080 & 2560×1440 & then try to tell me resolution doesn’t matter. It’s the difference between spotting an enemy at 200m, 500m or 1000m
I barely ever play first-person shooters, but it really bugs me how people can get so competitive that all of a sudden the specs of their system are a huge issue. I honestly don’t care if someone is a split-second faster because of their fancy monitor and graphics card, because that’s just unnecessary $$$ spent on something that shouldn’t even matter in the first place.
It’s really not a competitive thing in terms of hardware bragging, it’s literally an information advantage conveyed by sheer number of pixels on screen. The more you have the more information you get and the more likely you are to see things earlier.
I’ve seen videos of people playing Arma 2/3 & DayZ spotting enemies at remarkably long ranges and making better plans purely because they have a monitor that puts more pixels on the screen. A person at 1000m would be so small on a 1920×1080 monitor that they’d fall between two pixels and just not be rendered, but at 2560×1440 they’re big enough to see, if only just.
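The pixels-on-target claim can be sanity-checked with simple projection geometry. A rough sketch follows; the 90° horizontal field of view, 16:9 aspect ratio and 1.8m target height are my own assumptions for illustration, not figures from the comment or from Arma itself.

```python
import math

def pixels_on_target(target_m, distance_m, screen_h_px, hfov_deg=90.0, aspect=16 / 9):
    """Approximate vertical pixels a target covers on screen, assuming a
    rectilinear projection and a small target near the screen centre."""
    # Vertical FOV derived from the horizontal FOV and the aspect ratio.
    vfov = 2 * math.atan(math.tan(math.radians(hfov_deg) / 2) / aspect)
    # Fraction of the vertical FOV the target subtends, times screen height.
    return screen_h_px * (target_m / distance_m) / (2 * math.tan(vfov / 2))

for h in (768, 1080, 1440):
    px = pixels_on_target(1.8, 1000, h)  # a ~1.8m-tall soldier at 1000m
    print(f"{h}p: {px:.2f} px tall")
```

Under those assumptions a distant soldier covers roughly 1.7 pixels of height at 1080p and about 2.3 at 1440p. Whether a renderer actually draws something that small depends on the engine, but the direction of the resolution advantage checks out.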
These arguments about resolution mattering are only ever about consoles anyway, and as long as everyone there has the same hardware, no one is at a disadvantage, so it doesn’t matter.
I was talking about when people say “I killed you because I have a higher frame rate” or “you killed me because I was only on 30fps”. Honestly, reaction times in games are never critical to the point where a sixtieth of a second counts, so why not talk about the skill of the player rather than their graphics card? Same deal with screen resolution; you’d have to be really nitpicky to care so much about it that you think it actually alters gameplay.
To some people, pumping money into something that decreases outside factors is beneficial. In the end, only the player remains.
Can I say “it depends”?
For a first-person shooter, sure, they are very important, but for a strategy game? Not so much. Giving it three seconds’ thought, it seems to me that if reflexes are very important in the game, then frame rate is a big deal. And if draw distance is another important factor, then resolution is very important.
This.
Being a PC gamer myself, I can’t play games unless they run at 1080p and 70fps at minimum. Anything lower and it’s horrible.
“My game is running at 65fps… it’s horrible :(“
I was really down on my choice of XB1 when all the resolution and frame rate issues first started getting press, but now I’m not so bothered. I’ve been playing the much-discussed AC4 over the last couple of weeks, and sitting 2m+ from my 47-inch TV, I’m not sure I’d really be able to pick out the difference between PS4 and XB1. A lot of commenters seem to be PC players, and I can see their point; I reckon if I was sitting 1-2 feet from a 27-inch monitor, resolution would be everything, but on the couch, I dunno.
I think the question is wrong. The game can look like classic Half-Life for all I care, as long as it’s in 1080 at 120fps (speaking of which, classic Half-Life can run in 1080 at 120fps). Compared to smooth movement and a sharp image, nothing else really matters; have all the muddy textures on low-poly models you want, as long as the edges are sharp and it moves smoothly.
This.
I’d seem like an fps/res snob to those who don’t care, but it’s like you say. Not every game I play must have bleeding-edge Crysis 3 graphics QUALITY, but those that don’t had damn well better be a minimum of 60fps and at least full HD, because there’s no excuse for not having it when it’s the difference between that cartoon style looking good and that cartoon style looking like some stretched-out ’90s retro mess.
Hell, I still regularly buy, own and play many old games, from Game Boy Advance to original Xbox, because the games are still good and run well via emulation or on original hardware.
I care a little about resolution, but not beyond 1080p at the moment. Not unless I were using a much bigger display.
Frame rate doesn’t bother me at all, as long as it’s solid and playable. I’ll always turn on v sync because tearing annoys me way more than a reduced frame rate.
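For what it’s worth, this is also why v-sync can turn a small performance dip into a big one: with plain double buffering on a 60Hz display, a frame that misses one refresh has to wait for the next, so the effective rate snaps between divisors of 60 rather than degrading smoothly. A minimal sketch of that behaviour (it assumes double buffering, with no triple buffering or adaptive sync):

```python
import math

REFRESH_HZ = 60
REFRESH_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms per scan-out

def vsynced_fps(render_ms):
    """Effective frame rate with v-sync and double buffering on a 60Hz display.
    A frame that misses a refresh waits for the next one, so the rate snaps
    to 60, 30, 20, 15, ... rather than degrading smoothly."""
    intervals = math.ceil(render_ms / REFRESH_MS)
    return REFRESH_HZ / intervals

for ms in (10, 17, 25, 40):
    print(f"{ms} ms render time -> {vsynced_fps(ms):.0f} fps")
# 10 ms -> 60 fps; 17 ms -> 30 fps; 25 ms -> 30 fps; 40 ms -> 20 fps
```

So a game hovering around 17ms a frame can feel like it “drops to 30” under v-sync, even though an uncapped run of the same scene would report somewhere near 58fps.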
I would always sacrifice resolution for frame rate with all the effects, not the other way round. That being said, I spent more on a GPU than most people spend on a whole computer, so it’s rare that I need to compromise. 😉
Man, I love Super Troopers. Thanks, now I know what to watch this long weekend!
I pretty much came in here hoping for a comment like this.
ENHANCE
I remember the first time I booted Quake. It ran shockingly in software rendering, but being the age I was, it didn’t really matter to me at the time.
I also remember the first time I booted Quake with OpenGL rendering and played at a smooth 60fps.
Those memories do each other such a great deal of justice, and those experiences forever set me on the path towards the love of 60fps and above.
Frame rate entropy is the main cause of my Dwarf Fortress fort abandonment nowadays. It’s no fun being able to see the rounds from my steam-powered, magma-filled minecart gun slowly flying across the screen.
Hells yes it matters!!! To me anyway. Unless a game is “coded” not to run in any other res and I have no choice, I play at 1920×1080. I notice the difference too much.
The same goes for frame rate. I keep hearing “oh, the human eye can’t register more than 30fps” or something to that effect. Total rubbish; I notice the difference between 30fps and 50-60fps, without Fraps even. But that’s just me, just one of those PCmasterrace types.
Yeah, there is a clear difference between 30, 60 and 120fps. I can also tell the difference between proper 1080p and all that 720p upscale shit; it’s all in the aliasing.
I have a 32″ Sony Bravia LCD that runs at 1366×768. It’s great for FPS, as it’s huge. I have a card that can put out 100+ fps, but I cap it at 66fps to prevent desync, as I’m running at around 60Hz; any higher frame rate and it tears.
Both times that frame rate/resolution have bothered me have been with TF2; once on the PC version I was running on a very underpowered machine and found if I didn’t turn the textures and effects right down I was chugging along with an unplayable frame rate.
On the 360 version of TF2 I once swapped out my HDMI cord for RCA cables to record some footage and found the blurry image made the game unplayable to me!! It was confronting to find my game affected by something I’d only ever regarded as cosmetic up to that point.
As a PC gamer, I don’t think I should be participating in this poll.
I am used to 1920×1080. I am used to 60 frames per second.
However, consoles are a different market to PC, as they’re paired with hyper-expensive televisions rather than cheaper monitors.
My monitor runs at a higher resolution than my television. That is probably true for a lot of people. However, I find it funny that people are still playing on the same resolution that Soul Calibur 2 for the original Xbox ran at.
However, 30fps is unacceptable in this day and age. I believe that Mass Effect 3 ran at 20fps on the Playstation 3. That is utterly horrid.
It won’t be long before 120fps becomes the standard for PC. I give it 5 years. The console users will still be running at 30fps. And people wonder why PC gamers have a superiority complex?
Frame rate affects quality of gameplay, resolution does not.