Polyphony Digital Has Been Working on GT Sport's HDR Feature for Three Years

Sounds like vacuum cleaners? You mean, just like real cars do? ;)

Check this video at 1:22 and compare it to the vacuum cleaner below:




You, Sir, are a genius. You're absolutely right.
 
The advantages are future-proofing for tech that will deliver on the promise of greater colour range, greater artistic flexibility with regard to the compression schemes used (they are not limited to a narrow render gamut and can change how much of it they use as and when), and no more ugly dithering. 👍

Agreed, it just seems to me that they could have restricted the output to more realistic/practical values. Taking the theoretical maximum out of the HDR10 format in both colour and luminance range could end up being counterproductive, as any value outside a given display's capabilities will inevitably be remapped somehow, with the real risk of a new breed of visual artifacts.
 
Agreed, it just seems to me that they could have restricted the output to more realistic/practical values. Taking the theoretical maximum out of the HDR10 format in both colour and luminance range could end up being counterproductive, as any value outside a given display's capabilities will inevitably be remapped somehow, with the real risk of a new breed of visual artifacts.
That's exactly what HDR rendering already does, and is what PD have been doing since GT5P (the dithering is in-game). The game will still need to work on normal TVs etc. so that function has to exist anyway.

To state it explicitly: many games already have a wider internal (effective) gamut than the displays they are intended to be played on, and this has been the case for some time.

So 10000 nit (over 13 bits effective) HDR rendering being clamped down to work on pseudo-10-bit panels is not any different from current HDR implementations in games clamping their internal gamuts to work on 8-bit (and pseudo-8-bit) displays.

The difference is that the new displays still have a higher dynamic range than standard, so there will still be a benefit. PD are positioning themselves to be able to offer a benefit once true 10-bit (or even 12-bit) displays reach the mainstream as well (see: Sony).

Note that 10000 nits is still only 40 dB, which is a fair way shy of the 90 dB needed to match the full range of our vision. I doubt we'll be seeing 30-bit displays any time soon, though...
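
Just to show the working behind those figures - a rough sketch in Python, treating dynamic range as a simple contrast ratio against a 1-nit floor (my own arithmetic, nothing from PD):

import math

peak_nits = 10_000

# 10 * log10 of the contrast ratio gives the range in decibels
db = 10 * math.log10(peak_nits)            # = 40 dB, as above

# log2 of the same ratio gives the "effective bits" on a linear scale
bits = math.log2(peak_nits)                # ~13.3 bits, hence "over 13 bits effective"

# ~90 dB of vision corresponds to a contrast ratio of 10^9...
vision_bits = math.log2(10 ** (90 / 10))   # ~29.9 bits, hence "30-bit displays"

print(f"{db:.0f} dB, {bits:.1f} bits, vision ~ {vision_bits:.1f} bits")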
 
The difference is that the new displays still have a higher dynamic range than standard, so there will still be a benefit. PD are positioning themselves to be able to offer a benefit once true 10-bit (or even 12-bit) displays reach the mainstream as well (see: Sony).

I'm not questioning the benefits of HDR over SDR, that's a given. However, a full BT.2020-gamut, 10K-nit output simply outperforms every single SUHD display available out there. BT.2020 coverage is currently barely reached by laser-based prototypes. Higher luminance ranges pose industrial concerns too (power drain/cooling/...). When it was unveiled, Sony was touting 4000 nits for its Backlight Master Drive tech, currently implemented in the ZD9 top-of-the-line range. The commercially available displays effectively max out at around 2000 nits peak brightness and can't even track the P3 gamut correctly. With PD reportedly releasing unclamped/full-throttle HDR10, every code value outside that effective range (no matter the bit depth used) will ineluctably be remapped in one way or another.
 
HDR is pretty wonky when you're choosing between HDR TVs, as some TVs handle HDR better than others even though they might carry the HDR10 certification. Some TVs can even do HDR but don't carry the certification, as Sony's do. IMO, to get the best out of HDR you need a TV that checks all the boxes for HDR10 delivery, and a lot of TVs don't actually check every box.
Sounds like you have some insight into the subject. It would be great to have a guide on the subject. Do you have any links you can point us to?
 
I'm not questioning the benefits of HDR over SDR, that's a given. However, a full BT.2020-gamut, 10K-nit output simply outperforms every single SUHD display available out there. BT.2020 coverage is currently barely reached by laser-based prototypes. Higher luminance ranges pose industrial concerns too (power drain/cooling/...). When it was unveiled, Sony was touting 4000 nits for its Backlight Master Drive tech, currently implemented in the ZD9 top-of-the-line range. The commercially available displays effectively max out at around 2000 nits peak brightness and can't even track the P3 gamut correctly. With PD reportedly releasing unclamped/full-throttle HDR10, every code value outside that effective range (no matter the bit depth used) will ineluctably be remapped in one way or another.
Sure, and as usual I'm less interested in the hardware in practice than in the underlying systems and theory, if you like. So forgive me the gaps in my knowledge on that front!

But I really don't think that PD offering more gamut in-engine is any worse for "HDR" displays than it has been for "SDR" displays so far. If anything it will be better because I'm pretty sure that most game engines don't render as wide as even the slackest of nominally HDR hardware implementations, so developers will have some work to do to make full use of them.
Except PD, who are set to 2020 and beyond... :dopey:

I also expect that PD will have served as useful first-party guinea pigs for other first-party devs.
 
HDR won't be something that's bolted on. They'll have made the decision to go the HDR route three years ago and developed and built the rendering pipeline and assets for HDR over that time, while also making sure everything is still acceptable on non-HDR displays and that, for example, the bloom and other effects that will be required (or will differ) for non-HDR displays are in place. Add to that the two different hardware specs and Sony's requirement for consistency between setups, and you get a huge headache.
 
Sure, and as usual I'm less interested in the hardware in practice than in the underlying systems and theory, if you like. So forgive me the gaps in my knowledge on that front!

But I really don't think that PD offering more gamut in-engine is any worse for "HDR" displays than it has been for "SDR" displays so far. If anything it will be better because I'm pretty sure that most game engines don't render as wide as even the slackest of nominally HDR hardware implementations, so developers will have some work to do to make full use of them.
Except PD, who are set to 2020 and beyond... :dopey:

I also expect that PD will have served as useful first-party guinea pigs for other first-party devs.

Sorry if it sounds overly pragmatic. I can enjoy things on a conceptual level too. In this particular case I'm skeptical; pushing a tech demo to the point that it's not even demo-able sure raises some concerns. I'm genuinely interested in HDR and the workflow PD used for it. I can envision how some tools may have allowed more accurate capturing and computing/rendering of spectral data, but with the end product being ultimately about visuals, it leaves me wondering how final grading decisions were/could be taken. With no practical tool at hand you'd have to rely on either abstract models or guesstimates.

As for other developers, if nVidia's white papers on HDR are anything to go by (selling that many GPUs should at least make their voice heard, I imagine), then more conservative takes on HDR may well become popular, as they seemingly suggest sticking to sRGB/Rec.709 primaries and extending the luminance range to a peak of about one thousand nits. That's potentially more in line with the current generation of SUHD televisions' capabilities, and probably easier to trim-pass for the majority of existing HD displays out there.
 
Sorry if it sounds overly pragmatic. I can enjoy things on a conceptual level too. In this particular case I'm skeptical; pushing a tech demo to the point that it's not even demo-able sure raises some concerns. I'm genuinely interested in HDR and the workflow PD used for it. I can envision how some tools may have allowed more accurate capturing and computing/rendering of spectral data, but with the end product being ultimately about visuals, it leaves me wondering how final grading decisions were/could be taken. With no practical tool at hand you'd have to rely on either abstract models or guesstimates.

As for other developers, if nVidia's white papers on HDR are anything to go by (selling that many GPUs should at least make their voice heard, I imagine), then more conservative takes on HDR may well become popular, as they seemingly suggest sticking to sRGB/Rec.709 primaries and extending the luminance range to a peak of about one thousand nits. That's potentially more in line with the current generation of SUHD televisions' capabilities, and probably easier to trim-pass for the majority of existing HD displays out there.
Pragmatism is always welcome, but I wonder what cost the extra colour gamut actually entails in respect of the game itself.

From the Ars Technica interview it seems that Kaz was primarily targeting photo mode with the 10 000 nit range, so in essence it is purely for the purposes of expression - always welcome. In that sense, this is really no different from the HDR implementation seen in GT's photomode to date, back when HDR consumer displays were not even on the horizon.


I highly doubt that nVidia still thinks 1000 nits is enough now (and bear in mind the nVidia white papers are often "guest-authored" - it's just a platform). That figure would be the compromise I alluded to regarding the loss of contrast and loss of range during the simulated "accommodation" that HDR renderers typically use for "SDR" displays - that compromise will not be the same for the "average" HDR display in the near future.

In fact, and as I suspected, the current standard (1000 nits), assuming it is such, is nowhere near enough to take advantage of HDR displays going forwards and would, as you say, be "un-demo-able"! The features of HDR rendering, as we have come to know them, will be absent on most such displays, as the displays' output range would exceed the renderer's internal provision (assuming the "standard" is accurate).

In other words, these recommendations are out of date, people have work to do to synergise the software and hardware, and PD would surely serve as an example once again. (Whether that's had an impact on the game is another matter altogether, but their example is good for the technology outright).


The noise PD (Sony) are making about this is clearly intended to invigorate interest in the concept, and the "excess" range will bring benefits to the photomode at the very least and won't look too shabby in-game, either - win / win. We've come a long way since the Lost Coast demo :)



In short: Rendering at 1000 nits (range) to display at 100 nits (the current "standard") is no less "excessive" than rendering at 10 000 nits to display at 1000 - 4000 nits.

Since we can expect the hardware standard (thanks to video formats) to be closer to 4000 nits, the current SDR situation is technically even more excessive than what PD have done for GTS, and has been for some time. Because HDR rendering to-date has brought more interesting and immersive visuals into wider appreciation, and HDR displays only increase that potential, I think we can conclude that the "excess" is in fact sufficient ;)
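
To illustrate what that compression actually looks like, here's a minimal tone-mapping sketch in Python - an extended-Reinhard-style curve of my own, not PD's actual (unpublished) tone mapper - squeezing a 10 000-nit render onto displays with different peaks:

def tone_map(scene_nits, display_peak_nits, render_white_nits=10_000):
    """Extended Reinhard: compress scene luminance so that render_white_nits
    lands exactly on the display's peak, while dim values pass through
    nearly unchanged."""
    l = scene_nits / display_peak_nits                # scene luminance in units of the display peak
    l_white = render_white_nits / display_peak_nits   # where the renderer's "white" sits on that scale
    mapped = l * (1.0 + l / (l_white ** 2)) / (1.0 + l)
    return min(mapped, 1.0) * display_peak_nits

# The same 5000-nit highlight on three different displays:
for peak in (100, 1000, 4000):
    print(peak, round(tone_map(5000, peak), 1))       # ~98, 875 and ~2667 nits respectively

Nothing breaks when the display can't show 10 000 nits - the highlights simply roll off sooner, exactly as they already do on SDR sets.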
 
That's quite different to what GT's site indicates.

You don't make yourself very clear. I can't see mention of any motivation beyond the implicit suggestion that the wider colour gamut is closer to what our eyes see.

Interesting, thanks.


A few choice quotes:

However, the frame buffer is now in full floating point. The ability to encode negative numbers allows us to encode colors outside the gamut of sRGB. In fact, it allows the representation of all of BT 2020 and much more.
https://developer.nvidia.com/displaying-hdr-nuts-and-bolts


Make sure your rendering pipeline supports [Physically Based Rendering] and floating point surface formats right until the end of the frame.
https://developer.nvidia.com/implementing-hdr-rise-tomb-raider


The first concern is obviously whether your game is rendering with HDR internally today. Since most PC games are, we’ll consider this a pretty safe bet.

When I’m talking about having really good dynamic range, I’m talking about having some highlights that have values approaching or exceeding 184.0 in the frame buffer after adjusting for exposure. This value is 10 stops above the photographic middle gray of 0.18. The good news is that with tech like physically-based rendering getting data like this isn’t really a problem. The real world produces scenes like this, and rendering algorithms that attempt to mimic the real world do as well.

Applying the same tone mapper to screens with a maximum luminance of 200 nits and 1000 nits does not result in a pleasing image on both. It is the same as just turning up the brightness. You really want colors and luminance levels represented well today to remain the same.
https://developer.nvidia.com/rendering-game-hdr-display


The core addition to UE4 to support HDR displays is a pipeline to get the high-precision data to the framebuffer. First, this means changing the allocation of the swap chain to being [full floating point].
https://developer.nvidia.com/hdr-ue4



Based on that, I think it's safe to say that 1000 nits is not the target for renderers, but for the tone-mapped output. The extra headroom in the renderer means it should be relatively easy to expand that output for better displays as and when they become available.

Furthermore, PD have not gone overboard at all, since everyone else is doing the same by default.

EDIT: I hadn't realised that physically based rendering was so widespread already, so that means (at least large) developers are probably well positioned to take advantage of HDR displays just the same as PD are. This is good news.
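
As a small aside, the "184.0" figure in the nVidia quote above does check out as ten stops above middle grey - quick arithmetic:

middle_grey = 0.18
stops = 10
print(middle_grey * 2 ** stops)   # 184.32 - each stop doubles the value (0.18 * 1024)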
 
Based on that, I think it's safe to say that 1000 nits is not the target for renderers, but for the tone-mapped output. The extra headroom in the renderer means it should be relatively easy to expand that output for better displays as and when they become available.

Well, I was convinced that output values and display capabilities were specifically what we were talking about! :confused:

As for PD being determined to output these extreme (impractical) values:

[Image: GT-Sport-BT2020-Color-Spec-2-1.jpg]
 
Well, I was convinced that output values and display capabilities were specifically what we were talking about! :confused:

As for PD being determined to output these extreme (impractical) values:

[Image: GT-Sport-BT2020-Color-Spec-2-1.jpg]
I don't think you understand exactly what PD are "determined to output". That seems to be the core issue here.

You are not separating the internal workings of a true high dynamic range renderer from the tone-mapped output intended for a low(er) dynamic range display; current "HDR" displays included.

That has been my point all along: rendering at a higher dynamic range and compressing it to fit a lower one has been the norm for a decade now, and the sources you cited state that future HDR rendering will be "full BT.2020 or more".
 
What does that red line supposedly represent, then? In-game engine rendering capabilities? Why compare it to the range of luminance currently used for grading HDR (Blu-ray) movies? Doesn't the real world offer more than 1000 nits of luminance to play with? Or is it because movies aren't CGI-intensive? And if those values aren't referring to HDR10 code levels (thus the output at the HDMI connector of the console), why even bother comparing them to an actual 8-bit Rec.709 signal?
 
What does that red line supposedly represent, then? In-game engine rendering capabilities? Why compare it to the range of luminance currently used for grading HDR (Blu-ray) movies? Doesn't the real world offer more than 1000 nits of luminance to play with? Or is it because movies aren't CGI-intensive? And if those values aren't referring to HDR10 code levels (thus the output at the HDMI connector of the console), why even bother comparing them to an actual 8-bit Rec.709 signal?
In-engine, yes.

Floating point offers over 65 000 nits total range (linear precision; logarithmic colour spaces have even more range, exploitable due to the cube-root relation of intensity and sensitivity), but clearly the bigger issue is colour space translation, which is probably why it's limited to 10 000 nits / BT.2020. In fact, PD state that their capture and measurement processes are all BT.2020, so they can't actually exceed that at this time without resorting to synthetic content.
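
(For reference, that "over 65 000" matches the largest finite value of a half-precision FP16 float, which I'm assuming is the frame buffer format in question:)

import numpy as np

# Largest finite value representable in IEEE 754 half precision (FP16),
# the format commonly used for HDR frame buffers.
print(np.finfo(np.float16).max)   # 65504.0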


The output can be whatever you want it to be, because the game will be able to poll the display to find out its capabilities. Example "code" for that (through driver-level stuff abstracting the HDMI interface) is available at the source you linked to previously. You do still have to do the tricky two-step of tone mapping and colour space conversion, though, which is where I suspect we will see the difference between "good" and "excellent" HDR visuals in games.

Failing that, a simple menu option would suffice - it's worked so far!
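
Just to make the colour space conversion half of that "two-step" concrete: in linear light it boils down to a 3x3 matrix. A rough sketch using the commonly quoted BT.2020-to-BT.709 matrix (values rounded; a real pipeline would gamut-map rather than simply clip the out-of-range results):

import numpy as np

# Commonly quoted linear-light BT.2020 -> BT.709 conversion matrix (rounded).
BT2020_TO_BT709 = np.array([
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
])

def bt2020_to_bt709(rgb_linear):
    """Convert linear BT.2020 RGB to linear BT.709 RGB, clipping anything the
    narrower gamut can't represent (a real pipeline would gamut-map instead)."""
    out = BT2020_TO_BT709 @ np.asarray(rgb_linear, dtype=float)
    return np.clip(out, 0.0, 1.0)

# A fully saturated BT.2020 green falls outside BT.709 entirely:
print(BT2020_TO_BT709 @ np.array([0.0, 1.0, 0.0]))   # ~[-0.59, 1.13, -0.10] before clipping
print(bt2020_to_bt709([0.0, 1.0, 0.0]))              # [0. 1. 0.] after clipping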


I expect the comparison is merely to demonstrate the level of progress of the various media, both individually and as an inter-connected whole. One thing is clear: games today will be able to offer imagery with far more depth and detail than HDR movies (outside of the cinema / theatre) will for some time. Assuming you have the hardware...
 
…which is probably why it's limited to 10 000 nits / BT.2020...

10,000 nits / BT.2020 is the actual limit of the HDR10 format that the UHD Alliance agreed upon earlier this year.

BT.2020 currently works as a container for various (more limited) colour spaces and is indeed an ultimate target for the format's future. The 10000-nit ceiling comes from the Perceptual Quantizer (PQ curve/EOTF) originally developed by Dolby Laboratories, standardised as SMPTE ST-2084 and ultimately retained for HDR10.
 
10,000 nits / BT.2020 is the actual limit of the HDR10 format that the UHD Alliance agreed upon earlier this year.

BT.2020 currently works as a container for various (more limited) colour spaces and is indeed an ultimate target for the format's future. The 10000-nit ceiling comes from the Perceptual Quantizer (PQ curve/EOTF) originally developed by Dolby Laboratories, standardised as SMPTE ST-2084 and ultimately retained for HDR10.
So there we are, PD have been able to achieve the maximum that the standard asks for. And why not.

Incidentally, on a linear scale 10 bits gives only 1024 values, so getting a range of 10 000 out of ten bits implies a non-linear scale, which is obviously what the "Perceptual Quantizer" provides as the "Electro-Optical Transfer Function".

I don't envy all the graphics programmers tasked with getting all of this to run efficiently!!
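
For the curious, the PQ curve itself is simple enough to write down. A sketch of the ST-2084 transfer function in Python, using the published constants (absolute nits in, 0-1 signal out, and back again):

# SMPTE ST-2084 (PQ) constants
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_encode(nits):
    """Inverse EOTF: absolute luminance (0-10 000 nits) -> non-linear signal (0-1)."""
    y = max(min(nits / 10_000.0, 1.0), 0.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

def pq_decode(signal):
    """EOTF: non-linear signal (0-1) -> absolute luminance in nits."""
    p = signal ** (1 / M2)
    return 10_000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

# Roughly half of the signal range is spent below ~100 nits, which is how
# 10 bits can cover 0-10 000 nits without obvious banding.
for nits in (0.1, 1, 100, 1000, 10_000):
    print(nits, round(pq_encode(nits), 3))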
 
Well, the standard didn't really ask for this (the recommendations being clearly more sensible); the format was originally intended to be somewhat future-proof (up until broadcasters requested Hybrid Log-Gamma and dynamic metadata :ouch:), with enough headroom for upcoming breakthroughs in the display field. We may still be a few years shy of those full specs being commercially achievable. By that time I imagine PD could have its focus on 8K already and decide to rebuild everything again. As you say, why not... :cheers:
 
I have one, but I don't have a PS4 Pro, just a standard one, so it's even fewer people.

Polyphony and Kaz need to **** off now tbh; it's beyond a joke that they target useless little things instead of attacking the huge core deficiencies in the games.

HDR is available on both the Pro and the vanilla PS4, though...
 
I find no inconvenience at all and only ever play GT6 in 3D. Passive glasses don't need charging and have a wider viewing angle than active ones, and they're lighter, so I hardly notice them on. There's also no flickering from them, which is something I'm sensitive to; I feel ill under certain lighting.

(sorry for the "off-topic")
EXACTLY THAT.
I'm a big fan of 3D. 3D games are amazing if they're well done, just like anything else. If you use the tech properly it's outstanding.
GT6 in 3D is still an amazing thing. Sometimes you have to tweak the settings a little depending on who is playing, but it really is something very good. It gave me a lot of immersion.
 
I have one, but I don't have a PS4 Pro, just a standard one, so it's even fewer people.

Polyphony and Kaz need to **** off now tbh; it's beyond a joke that they target useless little things instead of attacking the huge core deficiencies in the games.
HDR is available on the standard PS4.
Also, technology is moving forward. HDR will become the norm within this decade, with many big brands already competing with this tech. The game industry would be dumb not to utilize HDR in the very near future.

Apparently I have to reiterate this:

You are acting like 100% of PD's development went purely into HDR and no other assets whatsoever, which clearly isn't the case here. :indiff:

Just because the game is still unreleased and the article presents one of the various things PD is working on doesn't mean it was all they were working on!

It always seems like anti-GT whining posts get liked the most on an article about GT on a site dedicated to GT.
 
...I know you meant it in jest, but hey, I'm fairly sure not all 200 of PDI's staff worked on HDR exclusively for the last three years.

I mean, you can't ask the AI guys to work on graphics now, can you? That'd be like.... like asking a pro wrestler to, I don't know, ice skate in the Olympics or something. They could do it, but they'd suck at it.

Well, look at the crash damage. Obviously very little to no work has been done on it over the past five or so years. PD working on AI for three years isn't a given, considering there is supposedly just one guy working on it, based on the end credits in GT6. If they made no progress on damage (it actually looks like they've regressed in terms of damage, based on how it's completely lacking in GTS), how can we be sure they made progress on AI?

Kaz has made it abundantly clear that if he doesn't care for a feature, like damage, he will ignore it. With his focus on e-sports and online play in GTS, to me it's clearer than ever that he doesn't really care about his team producing quality AI. Kaz has a very narrow vision and anything outside it gets ignored.

The advantages are future-proofing for tech that will deliver on the promise of greater colour range, greater artistic flexibility with regard to the compression schemes used (they are not limited to a narrow render gamut and can change how much of it they use as and when), and no more ugly dithering. 👍

Just like how the PS3 premium car models were supposed to be future-proof?

By the time the tech that can take advantage of everything Kaz's team is adding goes mainstream, there will be newer, better tech, and Kaz will again gleefully throw away all that his team has worked on to build a graphics engine that people won't be able to fully appreciate for years, and the cycle will continue.

The only thing this news says to me is that Kaz is extremely wasteful with the resources he has available.

He's another Ken Kutaragi, too ambitious for his own good. And apparently nobody is doing anything to rein him in.
 