I Can't Notice the Difference
This is one of the most common points of contention.
There are some self-test methods out there, but I don't know of any formal, published scientific study on the highest frame rate a random selection of human beings can see before they stop noticing a difference. Personally, I can see the difference between 30 and 60 fps, and with some playtime I can usually identify which one a game is running at. A few recent high-end sets run at 120Hz, but most recent televisions and portable displays only run at 60Hz, which caps the number of frames they can display at 60 and means media is rarely made to run higher than 60 fps. I don't know that I've ever seen something running at 120 fps, so I couldn't tell you whether I could see the difference. The PS3 and PlayStation Eye can output at 120 fps, though there are downsides to both: the camera runs at half resolution, for example, and the PS3... well, we'll get to a general discussion about that in a second.
Higher is Always Better
That's generally true, and if you can get a better frame rate, why not take it? I'd say a small portion of those who can tell the difference between 30 and 60 fps will swear by 60 and turn up their nose at anything less. Some people report that games running below 30 fps make them feel nauseous after a while, which isn't too bizarre. It likely has more to do with a reaction to the more noticeable changes in light, like the rapid flashes that trigger seizures in some people, though nausea is obviously a less severe reaction. But in all media, there is a trade-off for a higher frame rate.
The Detail of Rendered Geometry vs. Frames Per Second
Back to the downside I mentioned of rendering more fps. On any computer or console, rendering the scene more often uses more computing power. With fixed hardware, producing more fps means reducing geometry detail, effect detail, and in some cases even screen resolution. Some of the more famous examples of games programmed to take a hit in resolution are the Halo games on the Xbox 360; they run at 30 fps and at a resolution slightly lower than 720p. Why would Bungie do this? To produce better geometry and effects; that is to say, there is more detail per frame than there would be at 720p and 60 fps. A famous example of a game that runs below 30 fps is Shadow of the Colossus. Not only did this let Team ICO (the game's developers) put more detail into the game, but some will say that a frame rate below 30 per second, when chosen deliberately, creates a very cinematic, film-like quality. And I have to agree. Have you ever watched a soap opera and thought, "this just looks different from a movie"? Many of them are recorded at a higher frame rate than film's 24 per second. Detail is the general trade-off for all real-time computer-generated imagery on fixed hardware; for pre-recorded media, more frames means a larger file size.
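To put rough numbers on that trade-off, here's a back-of-the-envelope sketch of raw pixel throughput. The 1152x640 figure is the widely reported render resolution for Halo 3, used here purely as an illustrative data point, not an official spec:

```python
# Hypothetical fill-rate comparison: total pixels the renderer must
# produce each second at full 720p/60 fps versus a slightly reduced
# resolution at 30 fps.

def pixels_per_second(width, height, fps):
    """Total pixels rendered per second at a given resolution and rate."""
    return width * height * fps

full_720p_60 = pixels_per_second(1280, 720, 60)   # 55,296,000 pixels/s
halo_style_30 = pixels_per_second(1152, 640, 30)  # 22,118,400 pixels/s

# The 30 fps, sub-720p target needs well under half the raw pixel
# throughput, leaving budget for richer geometry and effects per frame.
print(full_720p_60 / halo_style_30)  # => 2.5
```

In other words, the lower target frees up roughly 60% of the raw pixel budget, which can instead be spent on per-frame detail.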
A programmer has three options when dealing with the final frame rate: lock the frame rate at a certain number per second, let the frame rate change on the fly as the game scene changes, or lock the frame rate and let the action "slow down" to compensate for the higher strain on the computer. I believe most people will tell you they prefer a locked fps, and that a fluctuating fps is the worst way to handle it. Some argue against slowdown because it alters the "perceived physics" of a game: you have to react to the game as it slows down and resumes normal speed. I prefer a locked fps myself, and I don't have a problem adapting to slowdown in genres where it is common, like side-scrolling shoot 'em ups. And speaking of game genres...
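The three options above can be sketched as minimal game-loop variants. This is a simplified illustration, not any particular engine's code; `update` and `render` are placeholder stubs standing in for the real simulation and draw steps:

```python
import time

def update(dt):  # placeholder: advance the simulation by dt seconds
    pass

def render():    # placeholder: draw the current scene
    pass

def locked_frame(target_fps=30):
    """Option 1: locked fps. Sleep away leftover time so every frame
    takes (about) the same wall-clock duration."""
    frame_time = 1.0 / target_fps
    start = time.perf_counter()
    update(frame_time)           # simulation always steps a fixed amount
    render()
    elapsed = time.perf_counter() - start
    if elapsed < frame_time:
        time.sleep(frame_time - elapsed)

def variable_frame(last_dt):
    """Option 2: floating fps. The simulation is scaled by real elapsed
    time, so motion speed stays constant while the frame rate varies."""
    start = time.perf_counter()
    update(last_dt)              # dt changes from frame to frame
    render()
    return time.perf_counter() - start  # feed back in as the next dt

def slowdown_frame(target_fps=30):
    """Option 3: fixed step with no compensation. When a frame runs
    long, the game visibly slows down under load."""
    update(1.0 / target_fps)     # fixed step, regardless of real time
    render()
```

The "perceived physics" complaint about option 3 falls out of the code: the simulation always advances by the same fixed step, so when frames take longer than budgeted, in-game time drifts behind wall-clock time.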
Some Genres are Unplayable at Lower Frame Rates
Another pretty common argument concerning fps is how it affects your ability to play a game. Most commonly this gets pinned on fast-paced games where the scene changes constantly: the first-person shooter. For PC first-person shooters, where the user has some control over the fps being produced, I agree that those who choose a higher fps have a slight but tangible advantage over those using a lower fps. Otherwise, I think people who say or imply that there is some loss of enjoyment directly tied to a frame rate below 60 (but at or above 30) are just being ridiculous. If everyone has the same frame rate, then no one has an advantage. Speaking of advantages, some people claim to have an advantage with this next visual change.
Stereoscopic 3D vs. Frames Per Second
Stereoscopic 3D. Whether it's done with red-blue glasses, polarized 3D specs (the kind you get at movies now), crossing your eyes, "looking past the picture," Virtual Boy-style goggles, or parallax barrier technology, it creates the effect that the image is three-dimensional. And this is where the 3DS comes into the picture. Perhaps you've heard the reports from publishers and developers of 3DS games that their games run at 30 fps with the 3D turned on and 60 fps with it turned off. And if you've followed PS3 games where stereoscopic 3D was added after the game was finalized or released, many of them run at half their usual fps. This makes sense: to display the 3D effect, the game has to render the scene from two perspectives at once, one for your left eye and one for your right. Effectively, a game running stereoscopic 3D at 30 fps is rendering 60 images of the scene per second, roughly comparable to running at 60 fps. A game running stereoscopic 3D at 60 fps is rendering 120 images per second, like running at 120 fps. The argument here is whether the 3D effect is worth the drop in frame rate. I'm not bothered by 30 fps, but I also don't know how valuable the 3D effect is to a gaming experience.
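The arithmetic is simple enough to write down. This tiny sketch just tallies full scene renders per second, one per eye per frame, to show why stereoscopic modes so often halve the frame rate:

```python
# One scene render per eye, per frame: stereoscopic mode doubles the
# number of times the scene must be drawn each second.

def scene_renders_per_second(fps, stereo=False):
    eyes = 2 if stereo else 1
    return fps * eyes

print(scene_renders_per_second(60, stereo=False))  # => 60  (2D at 60 fps)
print(scene_renders_per_second(30, stereo=True))   # => 60  (3D at 30 fps: same render load)
print(scene_renders_per_second(60, stereo=True))   # => 120 (3D at 60 fps: the load of 120 fps)
```

So a console that can comfortably draw the scene 60 times per second has a choice: 2D at 60 fps, or 3D at 30 fps, for roughly the same rendering cost.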
Perhaps some people's eyes really can't see the difference between 30 and 60 fps, or perhaps they don't know what difference to look for, but I think it will be very interesting to see what people start saying about frame rates once they have a very common device where they can instantly see the difference between 30 and 60 fps.
Feel free to weigh in and leave a comment.