Jedi2016
They must be seeing HDR differently than I do, then.. hehe.
To me, HDR is simply a removal of the previous limitation in terms of lighting and reflections.
Basically, in a current-gen console, the total illumination for a scene is limited to 100%. This includes everything from lights to reflections to specular highlights. At no point can any of those values go higher than 100%.. the machine simply isn't designed to cope with it. This is why the lighting and shading in many current-gen games looks very "flat".
Next-gen consoles, on the other hand, don't have this limitation. If they want to make the sun 250% intensity, they can. That alone is going to add a HUGE amount of realism to games.. far more than people are expecting, I think. And it doesn't really require any extra oomph.. I simply don't see why they'd have to drastically change the way scenes are rendered just to accommodate a larger range of displayable colors. HDR isn't nearly as complicated as a lot of people seem to think. And Cell in particular is uniquely suited for doing this. So's the RSX.. that's what all that floating-point power will be good for.
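Here's a rough sketch in Python of what I mean. The intensity numbers are made up for illustration, and the "tone map" step is just one simple way (the Reinhard operator) of squeezing the result back into what a display can show:

```python
# Sketch of clamped (old-style) vs. floating-point (HDR) lighting.
# All intensity values here are made-up examples.

def ldr_shade(light_intensity, surface_reflectance):
    """Old pipeline: every result is clamped to 100% (1.0)."""
    return min(light_intensity * surface_reflectance, 1.0)

def hdr_shade(light_intensity, surface_reflectance):
    """HDR pipeline: intermediate values can go past 1.0 freely."""
    return light_intensity * surface_reflectance

def tone_map(hdr_value):
    """Compress an HDR value back into the 0-1 range the display needs
    (simple Reinhard operator: x / (1 + x))."""
    return hdr_value / (1.0 + hdr_value)

sun = 2.5          # the "250% intensity" sun -- impossible when clamped
white_wall = 0.9

print(ldr_shade(sun, white_wall))            # 1.0: detail lost to the clamp
print(tone_map(hdr_shade(sun, white_wall)))  # still bright, but not flat
```

The point being: the extra range survives all the way through the lighting math, and only gets compressed at the very end.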
AA, I can understand, needs a frame buffer. But it depends on how it's done. There are different ways of doing AA.. if they think outside the box of the standard FSAA that's been around for years, they might find ways to get all this stuff to work. They could use multi-pass AA with adaptive sampling, or simply render the scene larger and shrink it down. Even Hollywood does that on occasion.
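The "render larger and shrink it down" approach is just supersampling. A toy sketch in Python, with a hypothetical 4x4 "render" standing in for a real framebuffer:

```python
# Toy sketch of supersampling AA: render at 2x resolution, then
# average each 2x2 block down to one output pixel (a box filter).

def downsample_2x(image):
    """Average each 2x2 block of the high-res image into one pixel."""
    h, w = len(image), len(image[0])
    return [[(image[y][x] + image[y][x + 1] +
              image[y + 1][x] + image[y + 1][x + 1]) / 4.0
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

# A hard black/white edge rendered at 2x resolution...
hi_res = [
    [0.0, 0.0, 1.0, 1.0],
    [0.0, 0.0, 1.0, 1.0],
    [0.0, 1.0, 1.0, 1.0],
    [0.0, 1.0, 1.0, 1.0],
]

print(downsample_2x(hi_res))  # [[0.0, 1.0], [0.5, 1.0]] -- the edge softens
```

The jagged stair-step turns into an in-between gray, which is all AA really is. The cost, of course, is that the intermediate render needs 4x the framebuffer memory, which is exactly the tradeoff being argued about here.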
If they stop treating these games as mere games, and start treating them like real-time CGI, they'll be able to do a hell of a lot more with it.
HDR is not simply the control of lighting through a measure of brightness.
What HDR consists of is the range of visible detail across drastically different lighting situations.
If you're into photography, it is much easier to understand, but think of it like this.
If you take a picture of a sunset, you can only get one of two possible results from a single capture. You can have the detail of the sun and a few colors from the sky, but then the landscape and clouds will often be "blacked out", because the aperture is too small to capture enough light to get the detail for everything else.
If you take the picture with the aperture at a larger setting, you will get more detail in the landscape and clouds; however, the sun will be "blown out" (a common term in photography for when there's little detail left and it's just a huge bright/white mass). Those are the only two possible results.
What HDR does is "merge" the two results, creating detail in both the light and dark areas, without harsh shadows or washed-out brights. For this to be possible, multiple renders (passes) are needed. The images are combined and usually kept in the frame buffer afterwards, then sent to your TV.. of course, it all happens very quickly.
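A rough sketch of that merging idea in Python. The weighting scheme here is my own simplification for illustration, not a real camera or engine algorithm: each pixel is trusted more the further it is from pure black or pure white, so each exposure contributes where it actually has detail:

```python
# Sketch of merging a short exposure (sun detailed, landscape black)
# with a long exposure (landscape detailed, sun blown out).
# The weighting function is a simplified, hypothetical choice.

def well_exposed(v):
    """Trust mid-tones most; pixels near 0.0 or 1.0 carry little detail."""
    return max(1e-3, 1.0 - abs(v - 0.5) * 2.0)

def merge_exposures(short_exp, long_exp):
    """Per-pixel weighted average of two exposures of the same scene."""
    out = []
    for s, l in zip(short_exp, long_exp):
        ws, wl = well_exposed(s), well_exposed(l)
        out.append((s * ws + l * wl) / (ws + wl))
    return out

short_exp = [0.45, 0.02, 0.01]   # sun kept, landscape "blacked out"
long_exp  = [1.00, 0.60, 0.55]   # landscape kept, sun "blown out"
print(merge_exposures(short_exp, long_exp))
```

The blown-out 1.00 pixel gets almost no weight, so the merged sun keeps the short exposure's detail, while the blacked-out landscape pixels inherit the long exposure's values.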
The problem occurs when you need to do BOTH HDR and AA. You see, the combined data is so big that the frame buffer can't handle both tasks at the same time. As of right now, to do both effectively in hardware, you'd need a frame buffer of roughly 256MB, with total available memory peaking at around 1GB... if the PS3 had a price tag to match specs like those, it would be unaffordable as a console to many people, lol.
So, yes, there are ways to do it in software. However, once you hit 720p or 1080i, you're really negating the need for "true" AA, and you can cut a few corners without harsh consequences.