OK, but I don't think any displays really need it these days. With HD LCD/plasma panels there's no constraint fixing a true 50 Hz refresh rate as in the old CRT days, and thinking about it, screen tearing isn't an artefact of any basic 60-to-50 Hz conversion system I know of anyway; they just drop frames. It's the same principle (I believe) as NTSC video on PAL TVs, where the artefact was micro-stuttering due to dropped frames. Well, technically some early PAL TVs that accepted NTSC video 'blindly' used to produce horrific rolling, tearing and black screens; I'm talking about the later CRTs that at least buffered the input digitally for conversion.
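To make the micro-stutter point concrete, here's a minimal sketch of that naive "just drop frames" approach to 60-to-50 Hz conversion (a toy model in Python, purely illustrative, not a description of any actual TV or converter): at each 50 Hz refresh you show the newest 60 Hz source frame available, and one source frame in every six never gets shown, so motion hitches at a regular 10 Hz cadence.

# Toy model of naive 60 fps -> 50 Hz conversion by dropping frames.
# Purely illustrative; not how any particular TV or converter is built.

SOURCE_FPS = 60   # incoming frames per second
OUTPUT_HZ = 50    # display refreshes per second

def shown_frames(num_refreshes=25):
    """For each output refresh, pick the newest source frame whose
    timestamp (k / SOURCE_FPS) is at or before the refresh time."""
    return [n * SOURCE_FPS // OUTPUT_HZ for n in range(num_refreshes)]

frames = shown_frames()
print("displayed source frames:", frames)

# Which source frames never make it to the screen?
dropped = sorted(set(range(max(frames) + 1)) - set(frames))
print("dropped source frames:  ", dropped)
# Every sixth source frame (5, 11, 17, ...) is skipped, i.e. a hitch
# ten times a second -- the micro-stutter you notice in practice.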
With regard to the discussion of tearing: Digital Foundry at Eurogamer use the right equipment to analyse this, and it's there. It has been quantified, it exists, it's far from perfect, and also far from the mess that some modern games are.
Oh sure, dropping frames is a sensible way to do it, and not so drastic for TV and film (although I bet it's still annoying), but for an interactive medium like games, especially those requiring fast reactions, like racing games, any kind of intermittent slow-down is a no-no. Of course, the blind NTSC interpretation suffers from a lack of accurate "frame" recognition due to the different line count, so the whole image scrolls, which is much worse than the tearing you see in digital raster output. Like I said, I'd take tearing over stutter any day, for a reflex- / anticipation-based interactive game at least.
Now, I don't have an encyclopaedic knowledge of old TVs, so I have no idea how many people still game exclusively on 50 Hz displays, but there must be at least some. Thinking about it more, the consoles will still output in the legacy modes according to their region, like 50 Hz PAL, for guaranteed "SD" compatibility anyway. Some games may allow 50/60 Hz switching, although I haven't really seen this myself on the new consoles.
The issue, then, is whether the game renders at 50 Hz, or the consoles just down-convert the frame rate somehow before sending it out (i.e. digitally, so tearing is still possible if the frame buffer is simply filled at 60 Hz but "polled" at 50 Hz; the tear lines would travel upwards in this case, not down as with in-game slowdowns). Then, if the console is outputting at a true 50 Hz, is the display adding more latency, and more potential aliasing and image degradation, through its own processing? Not a trivial question to answer, I'm sure; equally, are you even going to notice screen tearing at SD resolutions? I would agree that dropping frames may be the more "elegant" solution even here, but it's not necessarily good for games, as I've already stated.
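As a quick sanity check on the "tear lines travel upwards" intuition, here's another toy model (again Python, purely illustrative, and an assumption on my part about the buffer/scanout behaviour rather than anything a console is documented to do): if new frames land in the buffer every 1/60 s while scanout sweeps the screen top-to-bottom every 1/50 s, the line where each swap cuts into the sweep creeps up the screen from refresh to refresh; flip the relationship (a game rendering slower than the refresh, i.e. in-game slowdown) and it creeps down.

# Toy model: where does a tear line land if the frame buffer is replaced
# at one rate while the display scans it out at another?
# Purely illustrative -- not modelling any specific console or TV.

def tear_positions(update_hz, scan_hz, screen_lines=576, num_swaps=8):
    """Buffer swaps happen every 1/update_hz s; scanout sweeps the screen
    top-to-bottom every 1/scan_hz s.  A swap landing mid-sweep puts a
    visible tear at whatever line scanout has reached at that moment."""
    positions = []
    for k in range(1, num_swaps + 1):
        t = k / update_hz             # time of the k-th buffer swap
        phase = (t * scan_hz) % 1.0   # fraction of the current sweep completed
        positions.append(round(phase * screen_lines))
    return positions

# Buffer filled at 60 Hz, scanned out at 50 Hz (the hypothetical case above):
print("60 Hz updates, 50 Hz scanout:", tear_positions(60, 50))
# -> tear line at lines 480, 384, 288, ... : it marches up the screen.

# A nominally 50 Hz game dipping to 40 fps (in-game slowdown):
print("40 fps updates, 50 Hz scanout:", tear_positions(40, 50))
# -> tear line at lines 144, 288, 432, ... : it marches down instead.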
Basically, I'm trying to figure out why some people claim the tearing is horrendous, whilst others (like me) only see it occasionally, as verified in the Digital Foundry analyses.