The perception of frame rate depends on a lot of things, as has been explained already. But, to reiterate, it depends on the lighting conditions (versus the screen brightness), the actual images being displayed (e.g. blurred or otherwise), the display in question and, of course, the particular individual observer.
GT5 does suffer from a bit of slowdown. I get it in the first few corners of Suzuka at the start of a race. Probably elsewhere, too - the first corner of London is a bad one.
Anyway, the problem here is that the game's rendering slows down briefly, so the PS3's buffer swaps fall out of sync with the TV's refresh "clock" (if we're assuming it's a modern flat-panel jobbie). The TV updates its frames as usual, but the PS3 is a bit behind and updates the framebuffer halfway through the TV's refresh, so the TV displays half of one frame and half of another. This is page / screen tearing.
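To make the tearing mechanism concrete, here's a toy sketch (nothing PS3-specific, all numbers made up): the display scans out lines top to bottom, and if the buffer swap lands mid-scan, the lines below the swap point come from the newer frame.

```python
# Toy model of screen tearing: the display reads the front buffer
# line by line, and an unsynchronised swap changes the buffer's
# contents partway through the scan-out.

SCANLINES = 10  # pretend the screen is 10 lines tall

def scan_out(frame_at_line):
    """frame_at_line(line) -> which frame the front buffer holds
    when the display reaches that scanline."""
    return [frame_at_line(line) for line in range(SCANLINES)]

# Swap from frame "A" to frame "B" happens while the scan is at line 6.
swap_line = 6
displayed = scan_out(lambda line: "A" if line < swap_line else "B")

print(displayed)
# Top of the screen shows frame A, bottom shows frame B: a tear at line 6.
```

With v-sync the swap would be held back until the scan finished, so every displayed frame would be all-A or all-B.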
A similar thing happens with off-screen footage. The camera records at a defined frame rate (e.g. 15, 24, 30 or 60 fps) while the game and TV are running at around 60 fps (assuming the TV can do 60 Hz). The camera is almost certainly not going to be in sync with the TV, so it picks up part of one frame from the TV, then the TV refreshes while the camera's CCD is being read out, so the rest of the frame (on the camera) is actually the next frame (on the TV)...
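You can see why this is essentially unavoidable with a quick sketch. Assume (purely for illustration) a 24 fps camera whose exposure covers its whole frame interval, pointed at a 60 Hz TV: every single camera frame straddles at least one TV refresh.

```python
from fractions import Fraction

TV_HZ = 60     # TV refreshes 60 times per second
CAM_FPS = 24   # hypothetical camera frame rate

tv_period = Fraction(1, TV_HZ)
cam_period = Fraction(1, CAM_FPS)
epsilon = Fraction(1, 10**9)  # to stay just inside the exposure window

def tv_frame_at(t):
    """Index of the TV frame being shown at time t (seconds)."""
    return t // tv_period  # floor division of Fractions gives an int

mixed = 0
for n in range(CAM_FPS):  # one second's worth of camera frames
    start = n * cam_period
    end = start + cam_period  # exposure assumed to span the whole interval
    if tv_frame_at(start) != tv_frame_at(end - epsilon):
        mixed += 1  # this camera frame captured parts of 2+ TV frames

print(mixed, "of", CAM_FPS, "camera frames straddle a TV refresh")
```

Each 1/24 s exposure covers 2.5 TV refresh periods, so all 24 frames mix content from multiple TV frames; only a genlocked capture (camera synced to the display) avoids it.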
Capture cards potentially suffer from the same problem, including in that test some company did with GT5. You must be sure to get sync. The problem with "v-sync" is that if a frame is "late" (slow-down), the buffer swap is delayed until the next refresh, so the frame rate drops to the next lowest integer divisor of the refresh rate (30, 20, 15 etc. fps on a 60 Hz display). A bit of tearing is much better than such a step / jerk / stutter, especially given the frame rate can be locked at 60 fps (i.e. no higher) on the PS3 and can stay there most of the time thanks to heavy optimisation on what is a fixed hardware spec. On PCs, the frame rate is far more variable (both ways), and the resulting tearing is so annoying that v-sync is often the better option.
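The v-sync "step" behaviour can be sketched in a few lines: a frame that misses the refresh deadline, even by a millisecond, has to wait a whole extra refresh, so the effective rate snaps to 60, 30, 20, 15... fps rather than degrading gradually.

```python
import math

REFRESH_HZ = 60
refresh_ms = 1000 / REFRESH_HZ  # ~16.67 ms per refresh

def vsynced_fps(frame_time_ms):
    """With v-sync on, the buffer swap waits for the next refresh,
    so the effective rate is refresh_rate / n for some integer n."""
    refreshes_waited = math.ceil(frame_time_ms / refresh_ms)
    return REFRESH_HZ / refreshes_waited

print(vsynced_fps(16.0))  # frame on time   -> 60.0 fps
print(vsynced_fps(17.0))  # barely late     -> 30.0 fps, not ~58
print(vsynced_fps(34.0))  # two refreshes late -> 20.0 fps
```

That cliff from 60 to 30 fps for a frame that's only a fraction of a millisecond late is exactly the jerk described above; tearing spreads the cost over part of one frame instead. (Triple buffering and, later, variable-refresh displays are the usual ways around the trade-off.)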
As for what frame rate is best, it depends. I can see differences up to 100 fps on a good flat-panel. On a CRT, that tops out around 30 for me, due to the persistence ("lag") of the phosphors. Some people claim they can see differences in excess of 120-150 fps on certain displays. I remember it being discussed to death on the RSC fora some years ago.