Sony Uses Gran Turismo Sport to Showcase Its 8K TV at CES 2019

Yep.
But it's related to refresh rates, as the panel can only refresh quickly if the pixels respond fast.

At least this is what I remember from researching when buying my TV 12 months ago.
A TV panel expert I am not. :P

As long as you can game on the TV without issues like display lag, no worries :)
 
Human eyes see in 2k 24fps

If you can't see a difference between 24 fps and 60 fps, you should visit your doctor.
The human eye can very well register the difference between 24 fps and more; you will see it as much more fluid.
The human eye doesn't "work" in frames. That statement is just not correct.
 
Input delay is more than refresh rate / frame time.

Given an instantly responsive PC, and zero transmission time from the controller, at worst you're going to have 16.6ms of input lag between you doing something and WHAT YOU SEE. This isn't the same as WHAT IS SIMULATED.

Simply put, if you do something a nanosecond after the previous frame is shown, you've got to wait a minimum of 16.6 milliseconds before the next frame shows something.

But let's think about all the things that have to happen:

You move the joystick > the controller registers that move. Might be 1/8,000th of a second (not sure about the DS4's polling rate).

Then the controller transmits that wirelessly to the PS4, so add encoding, transmission, and verification+decoding of the signal.

Then GT Sport takes that info and simulates based on it. Physics simulation runs hundreds of times a second, so each of those frames at 60Hz might contain dozens of physics steps. BUT, it won't show any of that until it's done enough physics and started rendering, at which point time is 'frozen'. Not actually frozen, but the renderer takes that point in time. So there's an offset there, if that makes sense.

Now to output - through the wire to the TV. The TV then does whatever processing it does, and this is where the high response rating of monitors shines. Because they're built for interactive use, this delay is smaller than on a TV (which is rarely tested for this).

I'm assuming this is what he's alluding to in that post, the TV response time part.

I'd love to get some numbers on all of these bits. Do some actual objective math on it per-system.
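To put some illustrative numbers on it, a latency budget can be sketched in Python. Every figure below is an assumption for the sake of the exercise, not a measurement of any real system:

```python
# Hypothetical end-to-end input-lag budget for a 60 Hz console game.
# Every number here is an illustrative assumption, not a measurement.

budget_ms = {
    "controller_poll":    1.0,   # assumed ~1 kHz effective polling
    "wireless_transmit":  2.0,   # assumed Bluetooth encode+send+decode
    "simulation_offset":  8.0,   # assumed: input lands midway through a sim window
    "render_frame":       16.7,  # one frame at 60 Hz
    "double_buffer_wait": 16.7,  # finished frame queued until the next vsync
    "display_processing": 10.0,  # assumed TV game-mode processing
}

total = sum(budget_ms.values())
print(f"Estimated total: {total:.1f} ms ({total / 16.7:.1f} frames at 60 Hz)")
```

Swapping in measured values per system would turn this into the "actual objective math" per platform.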

It's quite a lot actually

https://www.eurogamer.net/articles/digitalfoundry-lag-factor-article
  • The lowest latency a video game can have is 50ms (three frames) - the PS3 XMB runs at this rate, but few games reach it.
  • Most 60FPS games have a 66.67ms latency - Ridge Racer 7, for example.
It's old, but still valid. It's not only 16.6 ms of render time: games are at least double buffered since render time is variable, transmission time to the TV is not instant, and a frame cannot be displayed until the next refresh cycle.

This more recent piece compares last gen to current gen, and one game did break the 50 ms barrier (a remaster: CoD Modern Warfare Remastered, at 40 ms).
https://www.eurogamer.net/articles/digitalfoundry-2017-console-fps-input-lag-tested

GTS has a lot going on and is definitely double buffered at least. 60 ms input delay before factoring in display lag would be a good guess.
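As a quick sanity check on those Digital Foundry figures, latency quoted in milliseconds converts to frames at a given refresh rate like so:

```python
# Convert a measured latency into frames at a given refresh rate,
# mirroring how Digital Foundry quotes lag in "frames".

def latency_in_frames(latency_ms: float, refresh_hz: float = 60.0) -> float:
    frame_time_ms = 1000.0 / refresh_hz
    return latency_ms / frame_time_ms

print(latency_in_frames(50.0))   # ~3 frames (the old PS3 XMB floor)
print(latency_in_frames(66.67))  # ~4 frames (a typical 60fps game)
print(latency_in_frames(40.0))   # ~2.4 frames (CoD MW Remastered)
```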
 
Not natively. The PS4 Pro doesn't even do 4K natively in most cases. We're talking 16x the resolution of a 1080p screen, or 33 million pixels.

AFAIK, there are very few companies producing 8K content right now. The Russo brothers filmed both Avengers 3 and 4 with IMAX/ARRI cameras that don't hit true 8K. Film companies are driving demand because an 8K camera can still produce better 4K content, but honestly, unless you're running a setup as big as this 98-inch Sony screen, you're not really going to notice a difference between 4K and 8K. 4K is still a hard sell for most folks — there's not a lot of content out there that's genuinely made for it — so good luck to any company trying to push for 8K to be remotely commonplace in the next decade.
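For the curious, the resolution arithmetic checks out:

```python
# Sanity-check the resolution claims: 8K really is 16x the pixels of 1080p.
res_1080p = 1920 * 1080   # 2,073,600
res_4k    = 3840 * 2160   # 8,294,400
res_8k    = 7680 * 4320   # 33,177,600

print(res_8k)               # ~33 million pixels
print(res_8k // res_1080p)  # 16
```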

...That said, I'm sure it's stunning to witness. :drool:
The Tokyo Olympic games will be transmitted in 8K. They've been planning it since they won the right to host the games. Domestically in Japan, 8K in 2020; the West in 2021. Maybe.

The PS4 was not the usual 40 times more powerful than the previous console, therefore you can't use it as a yardstick.
 
Unpopular opinion time: the "fad" that is super-high-resolution gaming is only ensuring that the actual fidelity of these games improves slowly, or even gets worse, while power demands for pointless pixels increase. Somewhat murdering visual enhancement.

Even GTS's 1080p mode is just a supersampled 4K image. So I still don't get any real benefit except for some nice, way-over-the-top and extremely power-hungry AA. Such a waste.
 
If you can't see a difference between 24 fps and 60 fps, you should visit your doctor.
The human eye can very well register the difference between 24 fps and more; you will see it as much more fluid.
The human eye doesn't "work" in frames. That statement is just not correct.
In fact, 24 fps is roughly the minimum framerate at which most people perceive the frames as continuous motion. Somehow people took an absolute minimum and started saying it's the absolute maximum perceivable.
 
lmao at "the human eye is only 2k". The human eye is far more complex and sophisticated than any tech. It's not discrete, but if you were trying to compare, research from a NASA scientist estimates it at the equivalent of 576 MP. 4K is about 8.5MP.

We won't even go into the colour range, movement capture and adaptability of the human eye.
 
lmao at "the human eye is only 2k". The human eye is far more complex and sophisticated than any tech. It's not discrete, but if you were trying to compare, research from a NASA scientist estimates it at the equivalent of 576 MP. 4K is about 8.5MP.

We won't even go into the colour range, movement capture and adaptability of the human eye.

The human eye's resolution perception drops off above 80 pixels per degree, with a maximum of 200 pixels per degree, turning comparisons into absolute guesswork.
[image: TLcd.jpg]


That's the optimal resolution which you only get in the center of the fovea.
[image: Vis_Fig5.jpg]

Beyond 10 degrees your eye only perceives 1/5th of your maximum resolution.

So the absolute maximum a screen needs is 200 pixels per degree; at a 40 degree viewing angle that's 8K. That's with a seating distance of 1.2x the diagonal of the screen: 6.5 ft from a 65" TV, 4 ft from a 40" TV, or 2 ft from a 20" monitor.
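The 1.2x-diagonal rule follows from plain trigonometry on a 16:9 screen; a quick sketch to verify it:

```python
import math

# Check the "1.2 x diagonal" seating-distance rule for a 40-degree
# viewing angle on a 16:9 screen. Nothing assumed beyond geometry.

def seating_distance_multiple(viewing_angle_deg: float, aspect=(16, 9)) -> float:
    w, h = aspect
    width_per_diagonal = w / math.hypot(w, h)   # ~0.872 for 16:9
    half_angle = math.radians(viewing_angle_deg / 2)
    # distance at which the screen width spans the full viewing angle
    return (width_per_diagonal / 2) / math.tan(half_angle)

print(seating_distance_multiple(40))  # ~1.2 (times the diagonal)
```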

For VR you need, at the very most, 30K per eye at 150 degrees fov (the max per eye). That's a waste though, as you only need a 2 degree cone at 200 pixels per degree to fool your eyes. Beyond 5 degrees from the fovea, 100 pixels per degree (15K) is enough. Beyond 10 degrees, 40 pixels per degree (6K) is enough for the remainder of the screen. Beyond 30 degrees, 20 pixels per degree (3K) is plenty. Beyond 60 degrees, 10 pixels per degree is plenty.

With foveated rendering you would need 10*200 + 10*100 + 40*40 + 60*20 + 180*10 = 7.6K per eye to simulate a perfect 30K per eye.
That's taking the absolute guesswork maximum into account; using half of that, 100 pixels per degree, as the starting point, you only need to render 3.8K per eye to get a near-perfect 150 degree fov in VR. One catch: you still need a 15K screen per eye to put it on.
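Taking the band widths above at face value (they are the post's own estimates, not an established spec), the arithmetic does add up to 7.6K:

```python
# Pixels needed per eye with foveated rendering, using the post's bands of
# (degrees covered, pixels per degree). The band widths themselves are the
# post's assumption, not an established spec.
bands = [
    (10, 200),  # foveal cone and immediate surround
    (10, 100),  # beyond ~5 degrees
    (40, 40),   # beyond ~10 degrees
    (60, 20),   # beyond ~30 degrees
    (180, 10),  # far periphery
]

horizontal_pixels = sum(deg * ppd for deg, ppd in bands)
print(horizontal_pixels)  # 7600, i.e. "7.6K" per eye
```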

Currently PSVR only averages 10 pixels per degree. It already has a higher density in the middle due to the warping of the lenses. Foveated rendering has no use yet, as at the edges it's already down to close to 5 pixels per degree.


Color, brightness and contrast play a role as well, of course. Lower contrast and lower brightness reduce resolution perception.
Color perception peaks at about 15 degrees; you actually don't see much color in the fovea.
[image: humanvisionfigure4.jpg]


Adapting resolution based on the brightness and contrast in a scene isn't that practical though. Neither is using different color resolution while rendering. Both are very useful when it comes to compression for wireless VR, however. Chroma subsampling has been used since the start of digital video, cutting color resolution to 25% before any further compression: Blu-ray only uses 960x540 color, and 4K Blu-ray 1920x1080. Specialized compression algorithms for VR based on eye tracking can reduce the required bandwidth a lot.

Adapting resolution based on eye tracking is very promising. A VR headset made to the specifications of the human eye can be very efficient.

Btw, if you'd like to know what pixels per degree you are used to, use this formula:
1080p (16:9) -> distance x 38.45 / diagonal
4K (16:9) -> distance x 76.8 / diagonal
(of course use the same units for distance and diagonal)
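Here's a small sketch of that calculation. The exact version uses the real subtended angle; the shortcut above (distance x 38.45 / diagonal for 1080p) is a flat-screen approximation that comes out slightly lower:

```python
import math

# Pixels-per-degree for a 16:9 display at a given viewing distance.
# Uses the exact subtended angle; the thread's shortcut formula is a
# flat-screen approximation.

def pixels_per_degree(h_pixels: int, diagonal: float, distance: float,
                      aspect=(16, 9)) -> float:
    w, h = aspect
    width = diagonal * w / math.hypot(w, h)
    angle_deg = math.degrees(2 * math.atan(width / (2 * distance)))
    return h_pixels / angle_deg

# 40" 1080p TV viewed from 4 ft (48"):
print(pixels_per_degree(1920, diagonal=40, distance=48))  # ~48 ppd exact
print(48 * 38.45 / 40)                                    # ~46 ppd, the shortcut
```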
 
What happened to people puking and fainting from watching 8K demos? I remember that being a nice showcase fad a couple of years ago.
 
8K screens look like they will start to filter into high end ranges quicker than I expected. But these are designed predominantly for upscaled 4K content. 4K has barely started to filter into homes, with only a handful of movie and sports channels. Likewise 4K Blu-Ray is struggling for traction and shows diminishing returns.

On the gaming front it's easier for developers to produce native 8K content. Trouble is, there's barely any hardware on the market that can run the latest titles at such a resolution. Certainly not next-gen consoles.
 
And VR. 8K spread over 150 degrees is comparable to 864p over 30 degrees, the recommended SMPTE viewing angle for 1080p TV. Of course in VR you can use foveated rendering with eye tracking; no need to render full 8K, as your eyes only see sharp detail in a 2 degree cone.

VR is the real deal and I think it should get attention next gen too; VR resolution and IQ should also get better next gen, along with high fps. IMHO even 4K is unnecessary. I think most PC gamers prefer 1440p at high or max settings with 60 fps rather than 4K while sacrificing fps and IQ.
 
VR is the real deal and I think it should get attention next gen too; VR resolution and IQ should also get better next gen, along with high fps. IMHO even 4K is unnecessary. I think most PC gamers prefer 1440p at high or max settings with 60 fps rather than 4K while sacrificing fps and IQ.

The thing with fps is, you need more of it the higher the resolution goes. The human eye doesn't really see fps as such; it tracks objects, sort of scanning them. To see smooth motion, objects should not skip across the screen, ideally shifting only one pixel per frame. Thus the higher the resolution, the more steps you need to stop an object from skipping across the screen. Currently games use motion blur to trick you into believing you're watching smooth motion. This only works with rotating objects though, as anything else you can follow with your eyes, yet all you get is a blurry image. (Up to a point, a variable point: looking at the road in front of your car, it's hard to tell where it starts to get blurry.)

Rendering objects at different frame rates has already been done. Background objects don't need as high an fps as foreground objects. GTS could probably render the cars at 30fps and only shift them into place for the next frame, and you wouldn't notice the difference. (Until they start jumping around from lag...)

Your eyes also perceive smooth motion sooner with darker images. The easiest way to get a smoother picture is to eliminate ambient light and turn the brightness down. I noticed that every time when watching 24p Blu-ray on my projector vs the TV. The projector screen is much bigger (larger steps), yet without ambient light and a max brightness of 15 fL (50 nits) it looked smoother than on the TV (50 fL = 170 nits). Of course, now that we have HDR with peak brightness of over 1000 nits, we need more fps to keep things smooth!
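To illustrate the "skipping pixels" point: assume an object crosses the full screen width in 2 seconds (an arbitrary illustrative figure). The per-frame step grows linearly with resolution, which is why higher resolutions want higher frame rates:

```python
# How many pixels an object "skips" per frame when crossing the full
# screen width in a fixed time. The 2-second crossing time is an
# arbitrary illustrative assumption.

def pixels_per_frame(h_res: int, crossing_seconds: float, fps: float) -> float:
    return h_res / (crossing_seconds * fps)

for h_res in (1920, 3840, 7680):
    step = pixels_per_frame(h_res, crossing_seconds=2.0, fps=60)
    print(f"{h_res} wide @ 60fps: {step:.0f} px per frame")
```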
 
Beyond 10 degrees your eye only perceives 1/5th of your maximum resolution.

We're not talking about peripheral vision. Peripheral vision is meaningless when talking about displays as eyes look all around displays not just at the centre.

So the absolute maximum a screen needs to be at 200 pixels per degree with 40 degree viewing angle is 8K. That's at 1.2 x the diagonal of the screen as seating distance. 6.5 ft from a 65" tv, 4 ft from a 40" tv or 2ft from a 20" monitor.

Again, that's a wrong conclusion, and when watching a display you don't just focus on the centre.

Also it was the claim that eyes see in 2K that I showed was wrong. Perception is different to actual biological functions.
 
We're not talking about peripheral vision. Peripheral vision is meaningless when talking about displays as eyes look all around displays not just at the centre.

I was talking about VR with eye tracking. Once the device knows where you are looking, it won't have to render the entire display at the highest resolution. Foveated rendering would also work for flat displays with eye tracking. Most of what is rendered is never seen by your eyes.

Again, that's a wrong conclusion, and when watching a display you don't just focus on the centre. Also it was the claim that eyes see in 2K that I showed was wrong. Perception is different to actual biological functions.

That's not just focusing on the center. A 40 degree viewing angle means the TV takes up 40 degrees of your field of view. The highest resolution you can ever perceive is 200 pixels per degree, in the fovea. 40*200 is 8K. (Although technically an 8K TV is only 7680 pixels.)

Vision is mostly recognition anyway. It's always interesting watching someone play a game for the first time, not knowing the visual language of that game yet. They see exactly the same thing right in front of them, yet at first it's all very overwhelming and hard to tell the difference between static background and interactive items. Same reason universal symbols are used everywhere.

The human eye sees very little; the mind makes up the rest. However it can perceive a lot more color than current displays. Current displays aim for 100% DCI-P3 coverage, which is only 54% of vision. Rec.2020 is next, covering 76% of vision. Of course it varies per person, and if you want to enjoy more color you should have been born female: women have a 2% to 3% chance at superior color vision (tetrachromacy), with an extra cone between green and red, 4 distinct cones instead of the regular 3.

And of course humans have 2 eyes. VR is still far from the real thing as it doesn't allow for variable focus. Eye tracking can help with that as well. Prototypes have already been built to simulate correct depth of field based on what you are looking at, as well as adjusting the focal plane based on your pupils. The alternative is holographic displays, which are very costly to render.

Human vision is quite fascinating :)
 
There's no point playing a racing game on a TV because of input delay...I wonder what this thing's input delay is?

That is completely false. There are TVs with as low as 10ms input lag while the best monitors are around 5ms (that's input lag, NOT response time; there's a difference), and that 10ms-lag TV happens to be LG's lowest-range TV. Many cheap TVs have low input lag, and expensive TVs are usually 15-20 ms, which is hardly pointless to use.

Obviously people should do their research though. There are still TVs that have above 40ms lag even with game mode enabled, but that's only one of the many reasons you should always do your research before buying a TV. I go to RTINGS dot com before buying any TV because they go to insane depths reviewing TVs and will tell you basically everything you need to know about a TV in detail, including display lag, which is often hard to find info on.

I was talking about VR with eye tracking. Once the device knows where you are looking, it won't have to render the entire display at the highest resolution. Foveated rendering would also work for flat displays with eye tracking. Most of what is rendered is never seen by your eyes.



That's not just focusing on the center. A 40 degree viewing angle means the TV takes up 40 degrees of your field of view. The highest resolution you can ever perceive is 200 pixels per degree, in the fovea. 40*200 is 8K. (Although technically an 8K TV is only 7680 pixels.)

Vision is mostly recognition anyway. It's always interesting watching someone play a game for the first time, not knowing the visual language of that game yet. They see exactly the same thing right in front of them, yet at first it's all very overwhelming and hard to tell the difference between static background and interactive items. Same reason universal symbols are used everywhere.

The human eye sees very little; the mind makes up the rest. However it can perceive a lot more color than current displays. Current displays aim for 100% DCI-P3 coverage, which is only 54% of vision. Rec.2020 is next, covering 76% of vision. Of course it varies per person, and if you want to enjoy more color you should have been born female: women have a 2% to 3% chance at superior color vision (tetrachromacy), with an extra cone between green and red, 4 distinct cones instead of the regular 3.

And of course humans have 2 eyes. VR is still far from the real thing as it doesn't allow for variable focus. Eye tracking can help with that as well. Prototypes have already been built to simulate correct depth of field based on what you are looking at, as well as adjusting the focal plane based on your pupils. The alternative is holographic displays, which are very costly to render.

Human vision is quite fascinating :)

Foveated resolution looks like TRASH though. Robinson the Journey and Driveclub VR are perfect examples. The PS4 Pro removed foveated resolution in both games and it looks immensely improved over the PS4 version. Maybe there are better ways to implement it, but you can't just brush it off like we don't need the resolution to be uniform, because those games prove that you do. They seriously look so much better without it, to the point you would think the native resolution was doubled.
 
Foveated resolution looks like TRASH though. Robinson the Journey and Driveclub VR are perfect examples. The PS4 Pro removed foveated resolution in both games and it looks immensely improved over the PS4 version. Maybe there are better ways to implement it, but you can't just brush it off like we don't need the resolution to be uniform, because those games prove that you do. They seriously look so much better without it, to the point you would think the native resolution was doubled.

Of course it looks like trash: PSVR does not have eye tracking, so it doesn't change the center point of the high-resolution area when you don't look straight ahead. Besides that, PSVR is already below 10 pixels per degree outside the center. It's at the minimum for peripheral vision.

Btw, Robinson the Journey and Driveclub VR did not use foveated rendering. Robinson the Journey renders at a higher resolution on PS4 Pro; I did the comparison myself. The lower resolution contributes to more distortion at the edges.
[image: fZcd.jpg]

What you see is the effect of de-warping the image. The nature of the lenses puts more pixels in the center of the image than at the outside. It's sort of like foveation, yet only if you look straight ahead. The Pro supersamples the image, which means a sharper image in the center, where it gets zoomed in while converting the image for the headset optics.

Driveclub VR renders at the same low resolution on both the base console and the Pro. The only things added on Pro are a few minor lighting effects.

Resident Evil 7 did use a crude form of foveated rendering on the base PS4. Very crude: it had an outer rectangle where the game renders at half resolution, which was very visible when you looked at the edges in the headset. On Pro it kept the resolution uniform and switched to a higher resolution when not moving too fast.

Foveated rendering will become effective once the base resolution goes up. PSVR is at less than 10% of what the human eye is capable of seeing, comparable to about 320x360 in the center part, viewed at a 30 degree viewing angle.
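Taking the commonly quoted PSVR specs at face value (a 1920x1080 panel split between two eyes, and roughly 100 degrees fov per eye, the latter being an assumption here), the ~10 pixels per degree figure falls out directly:

```python
# Rough average pixels-per-degree for PSVR: 960x1080 per eye (half of
# the 1920x1080 panel) spread over an approximately 100-degree fov.
# The 100-degree figure is the commonly quoted spec, assumed here.
panel_w, panel_h = 1920, 1080
per_eye_w = panel_w // 2   # 960 horizontal pixels per eye
fov_deg = 100

avg_ppd = per_eye_w / fov_deg
print(f"~{avg_ppd:.1f} pixels per degree on average")
```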
 

"Robinson and Driveclub did not use foveated rendering"

WRONG. I just googled it and you're just wrong about that. Robinson does have reduced foveated resolution on PS4 Pro, and they both have it in general. Conversation over since you're gonna lie like that.
 
Monitors are better than TVs.

If you want proper gaming get yourself a monitor.

30-plus inches along with 4K and HDR is the best thing to purchase.
 
"Robinson and Driveclub did not use foveated rendering"

WRONG. I just googled it and you're just wrong about that. Robinson does have reduced foveated resolution on PS4 Pro, and they both have it in general. Conversation over since you're gonna lie like that.

https://www.eurogamer.net/articles/digitalfoundry-2016-how-does-ps4-pro-improve-the-psvr-experience
I see I was wrong about Robinson the Journey; Driveclub, however, has no foveated rendering. I only noticed it in RE7 though. I guess they did a better job of hiding it in Robinson the Journey. (The whole game was rather blurry on the base PS4, much better on Pro.)

However, the term is used incorrectly here, as the headset has no way of tracking the fovea. It's multi-res rendering, and the illusion is broken as soon as you look away from the center.

Here's a good video on the subject as well as variable focus

It's coming.
 
Monitors are better than TVs.

If you want proper gaming get yourself a monitor.

30-plus inches along with 4K and HDR is the best thing to purchase.

30 inches is not enough to take advantage of 4K; the pixel density is way too high. Plus, not many people want to sit that close to a screen. Monitors are hardly better besides: there's a minimal difference in input lag as long as you don't get the wrong TV. Most TVs are only 5-15ms slower than even the best monitors.
 
If you can't see a difference between 24 fps and 60 fps, you should visit your doctor.
The human eye can very well register the difference between 24 fps and more; you will see it as much more fluid.
The human eye doesn't "work" in frames. That statement is just not correct.
I work on flight simulators for a major US airline. We have to have at least 90 fps on the visual systems or the pilots will get nauseous and throw up. In the past we have had way worse visuals than Microsoft FS, because pushing that amount of data on full-size visual systems was very hard to do at 90 fps.

Of course, newer simulator visual systems are gaining more and more capability and getting much better all the time. We are currently upgrading about a dozen of our visual systems, and it's high time. Plus we have a new 777 coming in that will have the good visuals already installed. These are waaaaay above my GTS stuff obviously; I know they spent north of 34 million for this one simulator.
 
can we all just appreciate how amazing that looks?

But how can you tell? Did you get to see it in person?
I mean, we are all seeing it on less than an 8K display, so what we see is nowhere even close to how it would look in person...

It's the same with people using a 1080p display and saying how awesome 4K videos look. How would you know, when you're not watching them on a 4K display?

It is the same here: how do you know how awesome that 8K display looks when you're viewing it on a 1080p or even a 4K screen? That doesn't do it justice.
 
But how can you tell? Did you get to see it in person?
I mean, we are all seeing it on less than an 8K display, so what we see is nowhere even close to how it would look in person...

It's the same with people using a 1080p display and saying how awesome 4K videos look. How would you know, when you're not watching them on a 4K display?

It is the same here: how do you know how awesome that 8K display looks when you're viewing it on a 1080p or even a 4K screen? That doesn't do it justice.

4K video does look better than 1080p video on a 1080p display.
- Less compression. Your average 1080p video runs at 7 Mbps vs 18 Mbps for 4K.
- Full 1080p color resolution. (4K video is chroma subsampled and only has 1080p color resolution; 1080p video only has 540p color resolution.)

4K video also downsamples nicely to 1080p, the same way supersampling improves how PS4 Pro games look on a 1080p display.
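Those color-resolution numbers come straight from 4:2:0 chroma subsampling, which stores chroma at half resolution in each dimension:

```python
# Effective chroma (color) resolution under 4:2:0 subsampling, which
# Blu-ray and most consumer 4K video use: chroma is stored at half
# resolution in each dimension, i.e. a quarter of the luma samples.

def chroma_resolution_420(width: int, height: int) -> tuple:
    return width // 2, height // 2

print(chroma_resolution_420(1920, 1080))  # (960, 540)   - 1080p video
print(chroma_resolution_420(3840, 2160))  # (1920, 1080) - 4K video
```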

For the TV itself, who knows how it actually looks. It's big, that's one thing.

Anyway, resolution is kinda pointless without actual content. My cable provider is still stuck on 720p/1080i 7 Mbps MPEG-2. Analog TV looked better in busy scenes.
 