arwin
It makes sense that the clock display is a part of it, leaving all else out - it would be nearly impossible to make the clock display exactly the same time with different framerates being used. So if the PAL version draws 50 times per second and the NTSC version 60 times per second, the displayed times are going to differ slightly. You can't display each 1000th of a second if your screen only refreshes about 50 times per second. In theory that means times would always show as 0.020, 0.040, etc. on PAL, and 0.0167, 0.0333, etc. on NTSC. In practice there's going to be a tiny bit of variation depending on what instructions are being run at any given moment, which determines at which point the time is 'snapshotted' and displayed on screen. This can then be masked by additional tricks like random additions or whatever to give the illusion of 0.001 precision. I think these tricks weren't in place yet in Prologue, where you could see in the online db at db.gtrp.de that the times actually differed by a minimum of about 0.01666.
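A minimal sketch of this quantisation argument (the function name and the example lap time are mine, not taken from the game): if the time can only be 'snapshotted' once per rendered frame, the smallest visible step equals the frame period.

```python
# Assumption: a lap time can only be sampled once per rendered frame,
# so the displayed value is the true time rounded down to the last
# completed frame.
PAL_FPS = 50
NTSC_FPS = 60

def visible_time(true_time_s, fps):
    """Round a true lap time down to the last completed frame."""
    frames = int(true_time_s * fps)
    return frames / fps

# The same true lap of 83.4567 s lands on a different grid per standard:
pal = visible_time(83.4567, PAL_FPS)    # a multiple of 0.020 s
ntsc = visible_time(83.4567, NTSC_FPS)  # a multiple of ~0.0167 s
```

On this model two identical real laps would still show as slightly different numbers on PAL and NTSC, which is the 0.01666 minimum gap seen in the Prologue database.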
you are correct arwin. there is no way for the PlayStation to graphically display precision to 1/1000th of a second accuracy. in fact, it can't even display precision to 1/100th of a second! a closer look at the lap time and total lap time counters will partly prove my theory.
i did a test in b-spec mode with a number of different cars, and what immediately struck me was the fact that the clock was displaying different figures between the total time and lap time on the first lap! whilst the total time was advancing 'seemingly' at 1/100th of a second, the 1/1000th of a second counter was stuck on zero. the lap counter was also 'seemingly' advancing at 1/100th of a second, whilst its 1/1000th of a second counter was stuck displaying a random number every time the car crossed the finish line!
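The stuck thousandths digit is exactly what you'd expect if the running counter only advances once per PAL frame - a quick sketch (the loop and digit extraction are my own illustration, not game code):

```python
# Assumption: the on-screen running clock advances once per PAL frame,
# i.e. in steps of 0.020 s. In that case the thousandths digit can
# never show anything but zero.
frame_period = 1 / 50  # PAL frame period in seconds

digits = set()
for frame in range(500):  # 10 seconds of simulated frames
    t = frame * frame_period
    thousandths = int(round(t * 1000)) % 10  # last displayed digit
    digits.add(thousandths)

print(digits)  # {0} - the thousandths counter appears "stuck"
```

At 60 fps the step is ~0.0167 s, so the same digit would cycle through other values instead of sitting on zero.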
so how does GT4 manage to display lap times to a precision of 1/1000th of a second? some of the theories i have managed to come up with suggest that the timings in GT4 may not be accurate or real-time, but calculated. my best guess though, is that there is a clock running in the background, separate from the displayed clock, and that this background clock is the one lap times are calculated from. the game then displays that information on screen as the car crosses the start/finish line. it is a lot easier to display one frame with the correct time than it would be to try and replicate 1/1000th of a second precision continuously.
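One way such a background clock could work is by interpolating: the simulation knows the car's position on two consecutive updates, and can work out the exact instant it passed the line even though that instant falls between frames. A hedged sketch of the idea (function name, positions and times are hypothetical, not from GT4):

```python
# Assumption: the simulation samples car position at discrete steps and
# linearly interpolates the instant the car crossed the finish line.
def crossing_time(t_prev, t_next, x_prev, x_next, line_x=0.0):
    """Interpolate the instant the car passed line_x between two steps.

    x_prev/x_next are distances to the line (negative = before it)."""
    frac = (line_x - x_prev) / (x_next - x_prev)
    return t_prev + frac * (t_next - t_prev)

# Example: at t=83.44 the car is 0.5 m before the line, at t=83.46 it
# is 1.5 m past it, so it crossed a quarter of the way into the step.
t = crossing_time(83.44, 83.46, -0.5, 1.5)
```

The result (83.445 here) has finer resolution than any single frame, and only needs to be shown once, on the frame after the crossing - which matches the "display one frame with the correct time" idea.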
so what does that mean for the fastest lap times we set? compared to the real world, GT4 might seem to be inaccurate, but you must remember that after all, it's only a simulation. whether the game is simulating car physics, distance travelled, or time taken to travel a certain distance, it is still a simulation, and simulations are open to interpretation. there is bound to be a margin of error in the time calculation regardless of the frame rate - all timing systems have a % error or % accuracy, in game or in the real world.
what does this mean for NTSC vs PAL lap times? comparing the times against each other is pointless, as they are running separate time simulations. PAL lap times may be faster, but equally, they could also be slower. this is partly because of the differing frame rates. if the times database doesn't have a column for NTSC, PAL (and maybe SECAM), then maybe now would be a good time to include one.
i would be interested to find out how the time is displayed under SECAM (the french television standard) - though since SECAM is also a 50 Hz standard, i would expect it to behave the same as PAL.