
Originally Posted by Bolivar
Framerate is important to the extent that very low numbers lead to intolerably choppy animation. Generally, the higher the frame rate, the better: it does reduce input latency and is easier on the eyes. 60fps has become the "gold standard" because many people mistakenly believe it's the point at which further increases stop being appreciable. The real reason is that 60Hz is the maximum refresh rate of most High Definition displays, and since high-end PC hardware is aimed at maxing out spec, resolution, and framerate, it's the standard to which gaming consoles are inappropriately held.
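Just to put rough numbers on the latency point, here's a quick back-of-the-envelope sketch in Python. It assumes input is sampled once per frame and shows up on the next refresh, which is a big simplification of real render pipelines, and the fps values are just examples:

```python
# Back-of-the-envelope only: assumes input is sampled once per frame and the result
# shows up on the next refresh, which is a big simplification of real render pipelines.
for fps in (30, 60, 144):
    frame_time_ms = 1000.0 / fps       # time budget per rendered frame
    worst_case_ms = 2 * frame_time_ms  # input lands just after a sample: wait one frame, display on the next
    print(f"{fps:>3} fps: {frame_time_ms:5.1f} ms/frame, ~{worst_case_ms:5.1f} ms worst-case input-to-display")
```

So going from 30 to 60 roughly halves that delay, but the absolute difference shrinks the higher you go.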
Unlike PCs, consoles have limited power, and framerate and image quality exist on a sliding scale. Most game developers will tell you they would rather take a solid 30fps and spend the extra horsepower on more robust graphics, physics, and game world - it's not just a graphical tradeoff. That's why The Last of Us Remastered could hit 60fps on PS4 while GTA V could not, despite TLoU being the more graphically sophisticated game. Moreover, developers can use motion blur techniques to make 30fps look just as continuous as, and often more stylized than, a 60fps presentation. 60fps really is just a buzzword. Battlefield: Bad Company 2 dispelled the myth last generation that it's necessary for online multiplayer games, much like Shadow of Mordor did this generation for responsive action-adventure games.
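On the motion blur point, here's a toy Python sketch of accumulation-style blur, just to illustrate the idea: blend several sub-frame samples of a moving object under a virtual shutter so a 30fps frame carries a smear instead of a hard jump. The constants are made up for illustration and this isn't any particular engine's implementation:

```python
# Toy sketch of accumulation-style motion blur; not any engine's actual implementation.
# The idea: instead of one crisp snapshot per 30 fps frame, blend several sub-frame
# samples of a moving object so the frame carries a smear that reads as continuous motion.
FPS = 30
SHUTTER = 0.5      # virtual shutter open for half the frame interval (the classic "180-degree" look)
SUBSAMPLES = 8     # sub-frame positions blended into one displayed frame
SPEED = 600.0      # made-up object speed in pixels per second

def position(t):
    """Horizontal position of the object at time t (seconds)."""
    return SPEED * t

def blur_samples(frame_index):
    """Sub-frame positions that get averaged into one displayed frame."""
    frame_start = frame_index / FPS
    open_time = SHUTTER / FPS
    return [position(frame_start + open_time * i / (SUBSAMPLES - 1))
            for i in range(SUBSAMPLES)]

# At 30 fps this object moves 20 px per frame; blending the samples below spreads
# the 10 px of shutter-open travel into a streak instead of a hard 20 px jump.
print(blur_samples(0))
print(blur_samples(1))
```

That's the basic reason a well-blurred 30fps game can read as smooth in motion even though it updates half as often.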
Personally, my R9 290 runs PC games at up to 144fps on my high refresh rate monitor. The Order is still the most technologically impressive game I've ever seen.