Digital Foundry v GranTurismo 6

  • Thread starter: phil_75
  • 123 comments
  • 13,033 views
Research all the bugs and glitches still present in GT6. They had 5 years to develop GT5; GT6 is a continuation of the foundation they built with GT5 (another 2 years later) and it still has tons of issues.

For example, Naughty Dog created Uncharted 1, 2, 3 AND The Last Of Us in the same amount of time, they look just as good if not better IMO, and I never encountered ONE game-breaking, let alone annoying, bug in any of those 4 titles.

PD has one of the biggest budgets in the video game industry with a virtually non-existent timetable. They aren't making smart decisions with their development, pouring resources into super-high-detail graphics and car models that use up WAY too much of the hardware, which doesn't leave much room to get everything else right. It doesn't take rocket science.
Well, I've read about the bugs, but haven't really fallen prey to any except the update shut-down issue.

Uncharted had a fair few bugs in its time, mostly in 3 but in 2 as well, largely online if my memory serves me correctly. Anyway, comparing an adventure game with a racer is a bit odd, as the two have a different focus.

As for your point about PD's decisions and development choices - everyone knew it would be tricky to achieve any major improvements on the PS3 - I think that they have though, despite that. GT6 has been a big surprise for me, especially as I wasn't expecting much more than GT5 with new tracks and cars.

And as for the 'rocket science...' dig - where was I rude to you? I merely asked a question...
 
As I thought, GT6 is technically better than GT5

PoDi took risks which unfortunately impacted the framerate - the 10 fps drop is correct. This doesn't mean they are incompetent; on the contrary, it shows that they are ambitious. I can only congratulate them.

And finally, I was right about the sounds. PoDi are held back by the RAM.
 
While it's a very different style of racing game, NFS Rivals shows that while the car models from GT6 don't need to change much, they still have a fair bit to do in other areas for the move to GT7 on the PS4.

Yet, PS4 NFS Rivals runs @ 30 FPS



...I am one of those who had hoped that for GT6 PD would go the way DF suggested, which was to reduce the res to 720p to lock the frame-rate... I still wish they had gone down that route (along with 30fps for mirrors, etc.).

Isn't that what the 720P option (almost) offers?



@phil_75 Thank you for posting the DF article 👍
 
I really hope PD can achieve their absolution with the PS4; they have all the right cards in hand:

An easy-to-develop-for console (PC-like architecture), tons more RAM and power than the old console, and future-proofing practices already in play by the PS3 era. The future is looking bright.
It's never bad to be an optimist, but the PS3 was far stronger than the PS2 and so on, yet for the entire current-gen life cycle they have chosen to fill up all available RAM with the wrong priorities, leaving no space to improve aspects that have desperately needed improvement since the days of the PS2.

What's to say they won't do the same with the PS4?
 
I watched videos on the framerate.

PoDi guys were very smart.

At 1080p in extreme conditions - rain, snow and dirt - they almost locked it at 50 fps, a drop of 10 fps, because that's much less noticeable than a drop of 20 fps. In my opinion, this is the best compromise they could find.

On the other hand, in town for example, they were able to do better than GT5, which included many fewer spectators, no cameras, etc.
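To put rough numbers on that compromise (a back-of-envelope sketch of my own, not figures taken from the DF capture):

```python
# Frame time at each rate, and the extra latency versus a locked 60 fps.
for fps in (60, 50, 40):
    frame_ms = 1000 / fps
    print(f"{fps} fps -> {frame_ms:.1f} ms per frame (+{frame_ms - 1000 / 60:.1f} ms)")

# 60 fps -> 16.7 ms (+0.0 ms)
# 50 fps -> 20.0 ms (+3.3 ms)
# 40 fps -> 25.0 ms (+8.3 ms)
```

A dip to 40 fps adds roughly two and a half times the per-frame delay of a dip to 50 fps, which is why the smaller drop reads as far less jarring.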
 
It's never bad to be an optimist, but the PS3 was far stronger than the PS2 and so on, yet for the entire current-gen life cycle they have chosen to fill up all available RAM with the wrong priorities, leaving no space to improve aspects that have desperately needed improvement since the days of the PS2.

What's to say they won't do the same with the PS4?
The thing is, the one bottleneck that plagued the PS3 the most was actually bandwidth, which in most cases was nowhere near the PS2, thus creating the increased framebuffer issues that we saw in comparison.

The OTHER issues, however, are entirely on PD, and in the end, those priorities are still their decision, whether anyone considers them wrong or not.
 
The thing is, the one bottleneck that plagued the PS3 the most was actually bandwidth, which in most cases was nowhere near the PS2, thus creating the increased framebuffer issues that we saw in comparison.

The OTHER issues, however, are entirely on PD, and in the end, those priorities are still their decision, whether anyone considers them wrong or not.

It's true that, for a media streaming engine, the PS3's Cell was a little light on bandwidth (this would be between storage, main memory and the processing units, overall), especially when you factor in the meagre quantity of RAM it has. But there are bandwidths everywhere there is data transfer, and the one that Kaz was talking about on the PS2 was pixel fill rate, which is on the "GPU" at the very end of the render cycle, schematically speaking. When you get an imbalance in the bandwidth of data connections internally to a chip, that's how you get bottlenecks - e.g. the main memory data exchange not keeping up with the SPU throughput for pure streaming (a game is some balance of streaming and constant re-use of stuff already in memory).

What kills the PS3 is not being able to have enough "detail" (textures, models etc.) stored in memory, and not being able to get at it fast enough to decode any kind of compressed format either (e.g. Rage) - then there's the I/O bandwidth, i.e. the Blu-ray, which is the biggest problem, and why hard drive installations for games that rely on streaming have become almost necessary. The way to get around that is to somehow generate detail in the SPUs from information describing patterns or trends, i.e. procedural techniques (like the new sounds appear to me to be), but they're tricky to control.
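As a toy illustration of that idea (purely my own sketch in Python, nothing to do with PD's actual systems): a few bytes of parametric description can be expanded on the fly into far more detail than it would cost to store outright.

```python
import math
import random

def expand_detail(seed, samples=1 << 16, octaves=4):
    """Expand a tiny description (a seed plus a few parameters) into
    `samples` values of pseudo-random detail, computed rather than stored."""
    rng = random.Random(seed)
    phases = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(octaves)]
    detail = []
    for i in range(samples):
        x = i / samples
        # Sum of sines: rising frequency, falling amplitude per octave.
        detail.append(sum(math.sin((2 ** o) * 2.0 * math.pi * x + phases[o]) / (2 ** o)
                          for o in range(octaves)))
    return detail

description_bytes = 8            # a seed plus a couple of parameters
expanded_bytes = (1 << 16) * 4   # the float32 buffer it stands in for
print(f"{description_bytes} B of description -> ~{expanded_bytes // 1024} KiB of detail")
```

The catch, as noted, is control: you trade memory and I/O bandwidth for SPU time and a lot of parameter tuning.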

Despite that, I'd wager the PS3 has been an excellent learning tool (if hard-earned) and a useful stepping stone (not to mention catalyst) to a more parallel, streaming-oriented architecture for game system programmers. This new console generation will be a boon for the widespread learning / adoption of general purpose "heterogeneous programming" techniques, so maybe the next jump in hardware won't be too far away either (it's always carefully kept not more than one step ahead of mainstream programming "ability").
 
@Griffith500 any idea how much a 512MB + 512MB RAM implementation would have opened up the bottleneck and eased the job for game developers? And what would the added cost to the console have been?
 
Quote from the article:

"We also see the elimination of 2x quincunx anti-aliasing (QAA) in favour of a morphological (MLAA) solution...This frees up memory and resources on the RSX, allowing for a higher resolution, but the end results are mixed. While the increased clarity and lack of blur associated with multi-sampling definitely help produce a sharper overall image, the aliasing along finer edges such as trees and fences is increased, resulting in more noticeable shimmering...engaging the 720p mode manages to solve most of the more severe frame-rate problems..Image quality definitely takes a hit - especially in terms of the MLAA, which has fewer pixels to work with..."



For me, this is the biggest eyesore in GT6. I just cannot stand all the shimmering/flickering/aliasing that comes with having to run the game in 720P mode. They mention trees & fences in the article, but it's also bad on some kerbs, like at Laguna Seca/Motegi etc., & on white lines on tracks, especially grid slots. GT5 was much better looking in this regard, & now we find out why: they were running a different kind of anti-aliasing. Hopefully the situation will improve as they optimize the code & can allocate more power to sorting out these issues.
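A rough sketch of the memory side of that trade-off, using the resolutions DF has reported for the two games (1280x1080 with 2xQAA for GT5, 1440x1080 with MLAA for GT6, if I have those right) and assuming plain 32-bit colour and depth with no extra render targets:

```python
def framebuffer_mib(width, height, samples=1, bytes_per_px=4 + 4):
    """Colour + depth storage for one frame, in MiB."""
    return width * height * samples * bytes_per_px / 2**20

gt5 = framebuffer_mib(1280, 1080, samples=2)   # 2xQAA stores 2 samples per pixel
gt6 = framebuffer_mib(1440, 1080, samples=1)   # MLAA is a post-process, 1 sample
print(f"GT5-style buffers: ~{gt5:.1f} MiB, GT6-style buffers: ~{gt6:.1f} MiB")
# ~21.1 MiB vs ~11.9 MiB despite the higher resolution.
```

So the higher-res image needs roughly half the sample storage and fill of the old setup, but MLAA then has to hunt edges on a single sample per pixel, which is exactly where the extra shimmering comes from.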


@phil_75 - Great article, thanks for sharing. 👍
 
Shimmering of repetitive, regular structures has been there since the PS1 and is still present. I hope the PS4 will finally cure it.
 
Shimmering of repetitive, regular structures has been there since the PS1 and is still present. I hope the PS4 will finally cure it.

Yes, but it was noticeably less in GT5. Going from GT5 to GT6 feels like I'm being poked in the eyes! :crazy:
 
GT in 3 Generations:

[screenshot comparison across the three console generations]
 
For all its problems, GT6 still manages to hold its voodoo over me for some reason.

Honestly I don't think GT7 will be that much of an upgrade in visuals besides being smoother in frames and detail on the trackside.

Good thing there's room for A LOT of improvement in those 2 areas. Another thing I think you're overlooking is the possibility that they will be transporting a lot of PS3-quality content straight over to the PS4. The car models of GT5 had 100x more detail than those in GT4. If the car models receive little or no upgrades for next gen, then that frees up A LOT of resources. If all next-gen GT turns out to be is what we have now but with lifelike soft shadows and lighting, 3D trees, photomode-quality cars, and a solid 60fps, it will be amazing.
 
@Griffith500 any idea how much a 512MB + 512MB RAM implementation would have opened up the bottleneck and eased the job for game developers? And what would the added cost to the console have been?

It's difficult to say, because it's a balancing act. We know where the balance is in general for the PS3 as-is, because people who make games have told us what they have to work against. That balance differs depending on what kind of game you make, and how you make it.

Put simply, adding extra memory creates more space to fill, which increases loading times. And if they touch nothing else, you can still process stuff faster than you can pull it out of memory, never mind from storage, meaning "compression" of any kind ("expanding" it to full detail in the SPUs) is still preferable to get the most out of the system. On the other hand, more things can be kept in RAM, which is much faster to access, instead of being left to storage.
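A crude illustration of the "more space to fill" point, with ballpark throughput figures of my own rather than official specs:

```python
# Time just to fill a memory pool from each source (sequential best case).
pool_mb = 512                              # the hypothetical doubled pool discussed above
sources_mb_per_s = {"Blu-ray (~2x)": 9, "HDD": 25}   # rough, assumed rates

for name, rate in sources_mb_per_s.items():
    print(f"Filling {pool_mb} MB from {name}: ~{pool_mb / rate:.0f} s")
# Blu-ray: ~57 s, HDD: ~20 s - more RAM only really pays off if what's in it is
# compressed (and expanded on the SPUs) or reused across loads.
```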

The split-memory itself is also detrimental because they operate on different data transfer paths with different performances, and allocation of game resources to each "bucket" is probably hard-coded per game for simplicity. So even if you've got memory free in one, you can't necessarily use that free space if all you've got is stuff you have to put in the other bucket (these situations change dynamically as you play a game).
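As a trivial model of that bucket problem (hypothetical sizes and names, just to show the shape of it):

```python
class SplitMemory:
    """Two fixed pools; each asset is pinned to one of them."""
    def __init__(self, xdr_mib=200, gddr_mib=200):      # whatever is left after the OS etc.
        self.free = {"xdr": xdr_mib, "gddr": gddr_mib}

    def alloc(self, pool, size_mib):
        if self.free[pool] < size_mib:
            total_free = sum(self.free.values())
            raise MemoryError(f"{pool} pool full ({total_free} MiB free overall)")
        self.free[pool] -= size_mib

mem = SplitMemory()
mem.alloc("gddr", 180)      # textures, render targets, etc.
mem.alloc("xdr", 60)        # game code, audio, physics state
mem.alloc("gddr", 40)       # MemoryError: gddr pool full (160 MiB free overall)
```

With a unified pool that last allocation would have succeeded; with the split design it fails even though there is plenty of memory free in total.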

id Software's problems with Rage I think best highlight where the main issue is for the PS3 (e.g. here, 2nd page 5th paragraph) - Carmack's Quakecon keynotes are great for general insight to these sorts of issues, because he's generally so candid.

But games would have looked better with twice the memory, for sure - even with 512 MiB of unified memory.

EDIT: the split memory was possibly because of the use of expensive, but fast, XDR memory for half of it. They may have been better off with GDDR memory, or just "ordinary" DDR for all of it. But the parallel / streaming bias of the Cell requires fast memory, so it was always going to be expensive and / or compromised. For me, it's a victory in terms of herding the software community in the direction they need to eventually be heading, if not necessarily in terms of elegant hardware design (the Cell has been used to great effect in properly balanced systems by IBM).
 
@Griffith500 any idea how much a 512MB + 512MB RAM implementation would have opened up the bottleneck and eased the job for game developers? And what would the added cost to the console have been?

Well, the system was originally designed for 512+256. The cutting down to 256+256 happened relatively late in the development (I remember it throwing at least one developer, I think EA, for a loop), and was probably for cost concerns (as Griffith500 pointed out, the PS3's system ram was expensive compared to more conventional types at the time) considering how much they were losing on it already.

512 on the RSX, though, probably wouldn't have made much of a difference. The RSX was not too far removed from an off the shelf cut down version of nVidia's upper mid-range card of the time, so it likely would have had much more difficulty utilizing the extra ram except in certain situations (much like a lot of mid range cards with over a gig of ram do today). That would have probably meant higher clock rates to try to take advantage of the increased RAM better (which is what nVidia did with the 7800 when they released a version with 512 MB of RAM), which means increased concerns with cooling the system. And while it's obviously not something Sony knew about at the time when they selected it, this is the same GPU design which ultimately had serious reliability problems (to the extent of a payout by nVidia, if I remember correctly) in the higher spec laptop versions regardless of the cooling solution; so we're probably lucky that Sony didn't chase after more performance from it than they did.
 
Thank you, will dig into the posted link for sure.👍

I find it interesting (suspicious?) that some developer kits were equipped with 512MB of system RAM (DECR-1400). And yes, the Blu-ray format-establishing Trojan horse is a no-brainer, and I understand that with a substantial loss per $599 unit shipped Sony had to cut costs wherever feasible. But I can't help but think that the PS3 was, at some critical decision point, somehow compromised as a gaming platform despite its (over)ambitious initial architecture...


@Tornado maybe fewer cost savings on thermal paste could have been beneficial :scared:
 
BWX
It is interesting that DF did not do ANY cockpit view FPS/ tearing tests. Was it too bad to show?

Wait, what?? I haven't read the DF analysis yet, but they seriously didn't post any numbers from the cockpit view?? Why the heck would they not look at the situation under which the game performs at its worst?? If the thing is dropping to 40fps from the classic GT view, you better believe it's below 30 at times with the cockpit view.
 
Well, the system was originally designed for 512+256. The cutting down to 256+256 happened relatively late in the development (I remember it throwing at least one developer, I think EA, for a loop), and was probably for cost concerns (as Griffith500 pointed out, the PS3's system ram was expensive compared to more conventional types at the time) considering how much they were losing on it already.

512 on the RSX, though, probably wouldn't have made much of a difference. The RSX was not too far removed from an off the shelf cut down version of nVidia's upper mid-range card of the time, so it likely would have had much more difficulty utilizing the extra ram except in certain situations (much like a lot of mid range cards with over a gig of ram do today). That would have probably meant higher clock rates to try to take advantage of the increased RAM better (which is what nVidia did with the 7800 when they released a version with 512 MB of RAM), which means increased concerns with cooling the system. And while it's obviously not something Sony knew about at the time when they selected it, this is the same GPU design which ultimately had serious reliability problems (to the extent of a payout by nVidia, if I remember correctly) in the higher spec laptop versions regardless of the cooling solution; so we're probably lucky that Sony didn't chase after more performance from it than they did.

I wasn't aware of that, and it makes a lot of sense. For comparison's sake, the last GPU I had with 256 MiB of VRAM was an ATI Radeon 9600 XT, from around 2002, although that was double what the reference version had. Framebuffers were still typically smaller than "standard" 1080p, in terms of pixel count, for most people, and games looked roughly no better than this. VRAM is hovering around 2-3 GiB today, for discrete GPUs, but that's targeting 4k(+!) resolutions in some cases, and probably also the impending GPGPU wars.

So I think 256 MiB VRAM for the PS3 was perhaps just too little; indeed, the general texture quality achieved would reinforce that idea. In practice, then, it would seem the XBox 360 uses more than 256 MiB for graphics, effectively (probably helped by its OS / overlay having a smaller footprint, as well as the unified memory itself, and possibly texture formats).

You have to wonder how that "last-minute" re-spec affected PD's plans, too.
 
To be honest, I've read all the posts so far and they seem overly positive.

These are the closing thoughts of the article. Don't just read and quote the last two paragraphs, but all four:
Gran Turismo 6: the Digital Foundry verdict
If Gran Turismo 6 is the final last-gen title from Polyphony Digital, it could be suggested that the developer still hasn't met its objective set forth almost a decade ago: to deliver a 60 frames-per-second racing simulation at 1080p on PlayStation 3. In fact, generally speaking, performance remains smoother in both PS2 and PSP versions of the game. We still have elements in the game remaining from the PS2 generation and the gold standard 60fps update clearly hasn't been met with the consistency we would have liked. While the resolution boost is welcome, the move to MLAA from multi-sampling feels like a retrograde step (especially in 720p mode) that doesn't particularly suit the style of the game.

When GT4 was released on PlayStation 2 it felt like something close to a complete, finished product from end to end: there was the sense that everything the team set out to accomplish - bar online racing - had been achieved. With Gran Turismo 6, we simply don't get that same feeling - the engine created for Gran Turismo on PlayStation 3 has never quite fully delivered and Polyphony's ambitions were seemingly too high to be delivered on last-gen tech. The frame-rate dips here are just too jarring at times, impacting the interface between player and game, introducing too much inconsistency into the way the cars handle from one race to the next.

Gran Turismo 6 certainly suggests that Polyphony Digital has learned some valuable lessons in transitioning across generations this time out - creating assets at a quality higher than could be fully appreciated on PlayStation 3 certainly makes a strong case for this. The inclusion of features such as adaptive tessellation, which could become genuinely practical on PS4, only serves to further cement this position. By further refining the lighting, shadows, and track detail while increasing resolution and performance levels, we could be looking at a proper PlayStation 4 sequel. Other enhancements spring readily to mind: real-time lighting as standard on all tracks, improved weather conditions, a revamped damage model, a genuine revolution in the racing AI model and fewer compromises on background scenery.

Gran Turismo 6 still delivers the best GT experience to date with more content, features, and details than any instalment before it. There are still weak spots, including somewhat spotty AI and less than remarkable engine sound reproduction (we can't help but think that the RAM requirement alone rules out the mooted audio patch, but let's hope that this is coming) - but the driving model is refined to near-perfection, held back only by the inconsistent performance. Despite falling short of the original goals, the end result is still remarkable - this is the most complete Gran Turismo package released to date.
 
"Gran Turismo 6 still delivers the best GT experience to date with more content, features, and details than any instalment before it"

What features are there more of in GT6? The one big piece of content I loved in GT5 (the course creator) is nowhere to be seen in 6. GT6 is great, but I wish they wouldn't gloss over the missing features just because there is some awesome promised update coming at some unknown time in the future, maybe.. :grumpy:
 
If you wanted to test, you'd use 3 PS3s from 3 generations (in good condition - cleaned inside for the older ones).

I think there will be variances across all 3 PS3 generations, even if it might only be 1 to 2 fps on average, or it could be 5 or more :)
Cleaning an old PS3 doesn't make it "new" and the comparison would be flawed from the start.
Having said that, I did it with the very first generation and the very last and there was no noticeable difference. The hardware is the same, the only differences would be noticed over prolonged use and would depend on condition of the consoles themselves.
 
Cleaning an old PS3 doesn't make it "new" and the comparison would be flawed from the start.
Having said that, I did it with the very first generation and the very last and there was no noticeable difference. The hardware is the same, the only differences would be noticed over prolonged use and would depend on condition of the consoles themselves.

Have you done such a frame rate analysis test to reach that conclusion? If only we had access to a brand new fat, a new slim and a super slim :( I do have a 6-month-old last-gen slim model, and it performs very well - no freezes or any issues at all, and even the frame drop at Bathurst is minimal. I often do arcade races in the rain.
 
I wasn't aware of that, and it makes a lot of sense. For comparison's sake, the last GPU I had with 256 MiB of VRAM was an ATI Radeon 9600 XT, from around 2002, although that was double what the reference version had. Framebuffers were still typically smaller than "standard" 1080p, in terms of pixel count, for most people, and games looked roughly no better than this. VRAM is hovering around 2-3 GiB today, for discrete GPUs, but that's targeting 4k(+!) resolutions in some cases, and probably also the impending GPGPU wars.

So I think 256 MiB VRAM for the PS3 was perhaps just too little; indeed, the general texture quality achieved would reinforce that idea. In practice, then, it would seem the XBox 360 uses more than 256 MiB for graphics, effectively (probably helped by its OS / overlay having a smaller footprint, as well as the unified memory itself, and possibly texture formats).

You have to wonder how that "last-minute" re-spec affected PD's plans, too.

X360 also has an additional 10MB (IIRC) of super-fast, high-bandwidth eDRAM that can provide 2xMSAA and help out with the alpha buffers.
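For a sense of how far that 10MB goes, a simple colour+depth tally (my own numbers, 32-bit colour and depth assumed):

```python
def buffer_mib(width, height, samples):
    # 32-bit colour + 32-bit depth per sample
    return width * height * samples * 8 / 2**20

print(f"720p, no MSAA: {buffer_mib(1280, 720, 1):.1f} MiB")   # ~7.0 MiB - fits in 10MB
print(f"720p, 2xMSAA:  {buffer_mib(1280, 720, 2):.1f} MiB")   # ~14.1 MiB - does not
```

So 2xMSAA at full 720p doesn't fit in one pass and has to be tiled (or the resolution dropped), but the eDRAM at least keeps all that sample traffic off main memory.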
 
Have you done such a frame rate analysis test to reach that conclusion? If only we had access to a brand new fat, a new slim and a super slim :( I do have a 6-month-old last-gen slim model, and it performs very well - no freezes or any issues at all, and even the frame drop at Bathurst is minimal. I often do arcade races in the rain.
I don't have access to equipment that would let me do that, no.
I considered all the known facts and observations and came to the statement in my previous reply.
 
Well, I've read about the bugs, but haven't really fallen prey to any except the update shut-down issue.

Uncharted had a fair few bugs in its time, mostly in 3 but in 2 as well, largely online if my memory serves me correctly. Anyway, comparing an adventure game with a racer is a bit odd, as the two have a different focus.

As for your point about PD's decisions and development choices - everyone knew it would be tricky to achieve any major improvements on the PS3 - I think that they have though, despite that. GT6 has been a big surprise for me, especially as I wasn't expecting much more than GT5 with new tracks and cars.

And as for the 'rocket science...' dig - where was I rude to you? I merely asked a question...

Sorry about the dig, I apologize. I just don't get why people continue to defend PD when they clearly are not meeting their potential and are releasing titles that are in beta form at best.

AAA developers should not rely on updates to finish a title. They should be solid from the get-go, bar a few minor issues. To expect a 100% bug-free game is not realistic. But there's no excuse for PD's use of updates to finish a game. Updates should be for minor patches and DLC, not for fixing game-breaking bugs and glitches and finishing a title to where it should have been. It's the same story as GT5, but IMO worse.
 
PD showed a video of such "accumulation" effects before GT5 came out, in relatively high quality (it looks texture based: RAM); I think it's part of the long game thing. They seem to have been prototyping systems (presumably on beastly hardware), ironing out the kinks, optimising and then bringing it into the game when they can shoehorn it into the overall hardware budget. No doubt that particular system has had more work and is being prepared for the PS4.

Not to pick holes, since I agree in total, just something I've been noticing over the years. Kinda makes the "GT7 this year" claim more plausible (if not quite believable yet). I agree with your other points, too.

If actual gameplay on next Gen consoles comes close to the smoothness of that trailer, I am happy.
 
Sorry about the dig, I apologize. I just don't get why people continue to defend PD when they clearly are not meeting their potential and are releasing titles that are in beta form at best.

AAA developers should not rely on updates to finish a title. They should be solid from the get-go, bar a few minor issues. To expect a 100% bug-free game is not realistic. But there's no excuse for PD's use of updates to finish a game. Updates should be for minor patches and DLC, not for fixing game-breaking bugs and glitches and finishing a title to where it should have been. It's the same story as GT5, but IMO worse.
I think delivering updates via a patchable version of what we are used to is quite brave, alright - though I also don't think it's so much to patch bugs or whatnot, more to add elements of the game once they're ready (plus any bug fixes that are needed along the way).

I appreciate that this is the source of your frustration, though people getting the game now, rather than eight months to a year further along when everything is ready, does seem to be more popular than not. Hope that helps explain where I'm coming from more thoroughly.

Appreciate the apology - thanks.
 