Originally posted by ZZII
The human eye sees at under 30fps so I wouldn't worry about it.
That's not really true.
Interestingly enough, I have discussed this subject at length with Dr. Norberto Grzywacz, senior scientist at the Smith-Kettlewell Eye Research Institute.
There's a lot of misinformation about how fast the human eye sees. You only need about 20 FPS to trick someone into thinking they're seeing smooth motion, but that requires motion blurring. Movies run at 24 FPS, and even there you can definitely see choppiness during long pans in outdoor scenes shot on faster film, because the shorter exposure leaves almost no blurring. Video games do not blur one frame into the next, so you need more like 60 FPS before things look completely smooth.
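If it helps to see why the blurring matters, here's a quick toy sketch (entirely my own, all the numbers are made up, nothing to do with any real engine or camera): a dot sliding across a 1-D "screen", rendered once per frame the way a game does it, versus averaged over the whole exposure the way a film camera captures it.

```python
import numpy as np

WIDTH = 100       # pixels in a made-up 1-D "screen"
FPS = 24          # frame rate
SPEED = 240.0     # dot speed in pixels per second (illustrative numbers only)
SUBSAMPLES = 16   # how finely we integrate over the exposure time

def render_instant(t):
    """One instant in time -- roughly what a game renders each frame."""
    frame = np.zeros(WIDTH)
    frame[int(SPEED * t) % WIDTH] = 1.0
    return frame

def render_blurred(t, exposure=1.0 / FPS):
    """Average many instants across the exposure -- roughly what film captures."""
    samples = [render_instant(t + i * exposure / SUBSAMPLES)
               for i in range(SUBSAMPLES)]
    return np.mean(samples, axis=0)

# Between consecutive frames the dot jumps SPEED / FPS = 10 pixels.
print(np.count_nonzero(render_instant(0.0)))   # 1  -> a hard 10-pixel jump next frame, looks choppy
print(np.count_nonzero(render_blurred(0.0)))   # 10 -> light smeared along the path, reads as motion
```

Same 24 FPS in both cases, but the blurred version spreads the dot over every pixel it crossed during the frame, which is why film gets away with a frame rate that would look terrible in a game.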
The rods in your eyes react very quickly and can easily perceive flicker at very fast rates. That's why modern video cards don't support refresh rates below 60 Hz: a screen flickering that slowly will give you a splitting headache. The recommended refresh rate for modern monitors is 75-85 Hz; going above 85 Hz degrades image quality a little, while going below 75 Hz will give most people eye strain after an hour.
PAL TVs are 50 Hz because European electricity is 50 Hz. If you're not used to it, you'll notice immediately on a visit to Europe that the street lights flicker, because they're running off 50 Hz mains. Some people can detect flicker up to 100 Hz.
So, it all depends how you look at it. With proper motion blurring, 24 FPS is fairly acceptable, but games don't have this, which is why it's desirable for any game to maintain a framerate over 60 FPS.
TVs are more forgiving than your average monitor, though. The persistence of the phosphors blurs things together a bit, making motion look smoother.