RESOLUTION AND FPS IN VIDEO GAMES
The latest generation of consoles arrived with promises of 1080p resolution and 60 frames per second (fps) as the norm for their games. Evidently, those promises were not kept. While many of the games released do run at 1080p, their frame rate is locked to 30 fps. Many developers have justified this by saying that they wanted to increase the visual fidelity of their games and therefore chose a lower frame rate rather than compromise on graphics.
But why does this debate about fps and resolution matter at all? Aren't the graphics good? Isn't the game perfectly playable? If so, why are so many people adamant about 60 fps and 1080p in gaming? Before we answer that question, we need to look at a few technical terms and what they mean.
Frames per second: The number of consecutive images (frames) displayed each second to create the illusion of motion. At low frame rates, motion is hard to follow and objects appear to stutter; at higher frame rates, motion becomes noticeably smoother.
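Another way to think about frame rate is frame time: how long each frame stays on screen, and how long the hardware has to render the next one. A minimal sketch of that conversion (the function name is ours, for illustration):

```python
# Frame time: the number of milliseconds each frame occupies at a given FPS.
# At 30 fps a frame lingers twice as long on screen as at 60 fps, which is
# why the jump from 30 to 60 feels so much smoother.
def frame_time_ms(fps: float) -> float:
    """Milliseconds available to render (and display) a single frame."""
    return 1000.0 / fps

print(f"30 fps -> {frame_time_ms(30):.1f} ms per frame")  # 33.3 ms
print(f"60 fps -> {frame_time_ms(60):.1f} ms per frame")  # 16.7 ms
```

This also shows the developer's side of the trade-off: at 60 fps the engine has only about 16.7 ms to finish every frame, half the budget it gets at 30 fps.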
Resolution: The number of distinct pixels that can be displayed in each dimension. On bigger screens, higher resolution matters more. A game running at 1080p will look noticeably crisper and sharper, with fewer jagged edges, than the same game running at 720p.
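The gap between 720p and 1080p is bigger than the names suggest, because pixel count grows with both dimensions. A quick sketch of the arithmetic:

```python
# Total pixels rendered per frame at common resolutions.
def pixel_count(width: int, height: int) -> int:
    """Number of pixels the GPU must shade for one frame."""
    return width * height

p720 = pixel_count(1280, 720)     # 921,600 pixels
p1080 = pixel_count(1920, 1080)   # 2,073,600 pixels
print(p1080 / p720)               # 2.25
```

In other words, 1080p means shading 2.25 times as many pixels per frame as 720p, which is why resolution is one of the first things developers scale back when chasing a frame-rate target.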

Refresh Rate: The number of times per second your monitor redraws the image on screen, measured in Hertz (Hz). If you run a game at 120 fps on a monitor with a 60 Hz refresh rate, you will see "screen tearing," because the monitor is not updating its output fast enough to keep up with the frames the game is producing.
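The tearing case above can be sketched as a simple ratio: how many new frames the game delivers during a single monitor refresh. Anything other than one frame per refresh means the buffer can change mid-draw unless vsync holds frames back. (The function name here is ours, for illustration.)

```python
# Frames delivered per monitor refresh. Without vsync, more than one
# frame per refresh means the buffer is swapped mid-draw, so the top and
# bottom of the screen can show different frames: a visible "tear".
def frames_per_refresh(fps: float, refresh_hz: float) -> float:
    """How many new frames arrive during one refresh cycle."""
    return fps / refresh_hz

print(frames_per_refresh(120, 60))  # 2.0 -> two frames per refresh, tearing likely
print(frames_per_refresh(60, 60))   # 1.0 -> one frame per refresh
```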
So now that we know what these terms mean, we can infer that a higher resolution and a higher frame rate are better. Developers should therefore always strive for 60 fps and 1080p, right? The answer is not that simple.
Ultimately, resolution and frame rate are restricted by the hardware, be it the display's refresh rate or the console's components. If developers aim for 60 fps in all their games, they must sacrifice visual fidelity, which in turn makes games look "less pretty." Some players will accept that trade-off for a smoother frame rate; others will not be comfortable knowing the game could have looked much better. For the latter group, 30 fps simply does not bother them: it is well above the threshold at which we perceive smooth motion (film has traditionally run at 24 fps), and it has no detrimental effect on gameplay.
However, 60 fps is much smoother, and higher frame rates do have a significant effect on the overall gameplay experience. This is why professional gamers opt for higher frame rates when they play online, even if they have to turn off most of the eye-candy features. For single-player games, though, while 60 fps is a great thing to have, 30 fps is no slouch either.
So what do you think? Should developers strive to offer 60 fps and 1080p resolution in all their games, regardless of the graphical downgrades? Or should they opt to provide the best experience possible in terms of eye candy and whatnot?
For an in-depth look at FPS, check out Abhik Hasnain's article published earlier: http://bitly.com/ShoutDSFPS