So PS3's Cell is more powerful than PS4's CPU after all... explanation?

  • Thread starter Alex p.
  • 21 comments
  • 34,957 views
As you might already have seen in the news section, Kazunori talked in a recent interview about both the PS3's and the PS4's strengths and weaknesses, and it's not the first time he has hinted that the PS3's Cell is actually stronger than the CPU in the most powerful console, the PS4, which is about 6-7 years ahead in technology overall. Here is what Kaz said in the aforementioned interview: "The PlayStation 4's large memory size is a unique trait, but in terms of CPU performance, the PS3's Cell chip, while difficult to handle, is really capable. They both have different advantages."

Also, I read on NeoGAF that the Cell's real-world peak performance is about 180 GFLOPS, while the PS4's CPU is at 100.
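
From what I can tell, those figures roughly match the theoretical peak math, something like this (the clocks and FLOPs-per-cycle here are the commonly cited figures, not official specs, so take it as a rough sketch):

```python
# Rough theoretical single-precision peaks (commonly cited figures, not official specs).

# PS3 Cell: 3.2 GHz; each SPE does a 4-wide fused multiply-add = 8 SP FLOPs per cycle.
# Games see 6 SPEs (1 disabled for yield, 1 reserved for the OS); 7 is often quoted.
cell_spe_gflops = 3.2 * 8                          # 25.6 GFLOPS per SPE
print("Cell, 7 SPEs:", 7 * cell_spe_gflops)        # ~179 GFLOPS -> the "180" figure
print("Cell, 6 SPEs:", 6 * cell_spe_gflops)        # ~154 GFLOPS actually available to games

# PS4 Jaguar: 8 cores at 1.6 GHz, 128-bit SIMD with separate add and mul pipes
# = ~8 SP FLOPs per cycle per core.
jaguar_core_gflops = 1.6 * 8                       # 12.8 GFLOPS per core
print("Jaguar, 8 cores:", 8 * jaguar_core_gflops)  # ~102 GFLOPS -> the "100" figure
```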

So my questions are:
1. Is that about right?
2. And if so, how can it be that a console launching seven (!) years later, with a focus on being somewhat high-end, has a CPU with almost half the power of the CPU from the previous generation of consoles?

Can someone elaborate on this with a certain confidence?
 
I am not at liberty to discuss this, but judging by his quotes, he only said the Cell is capable, not better.
 
I think he was clearly indicating the Cell is more powerful. Think about it: "They both have different advantages".
 
You're reading far too much between non-existent lines.

The architectures have their advantages. CBE, in and of itself, leans too heavily on absolute throughput, on a platform where it was bottlenecked from day one, whereas Jaguar has no such complication.
 
OK, but I have been hearing/reading this numerous times already, on other sites as well. So I got suspicious.
 
It struck a few people as odd that Sony would invest so much into the platform only to move to an entirely new, yet simpler, platform.

Understandably so, but I'm of the opinion that the PS4's architecture, especially its memory pool, would have played very nicely with a second-generation CBE platform. Mark has his reasons for not opting to go that route, and most of them are based entirely on what I've said above and on it being a pain to optimize.
 
I don't care about power. The simple PS4 architecture is apparently going to shave up to a year off of development time, according to Mark Cerny. Give me that over the CELL any day.

[Image: PS4 "time to triangle" chart]
 
Sony did plan to use two Cell chips in the PS3, as GPU and CPU; both can do render work, but they opted for an Nvidia GPU at the last minute. You can't really compare it, as the parts back then were selected for certain jobs. It had to be more capable.

The PS3's GPU had 300-400 GFLOPS, and now the PS4 is PC-like, with 1.8 TFLOPS on the GPU.

I will say the One/PS4 CPU is pretty much on the nail in terms of what they could get away with. Any lower would be bad.

The CPU performance may show some problems in high-player-count multiplayer, however. We'll just have to see; it's good enough for most things, for sure.

I would've liked to see a slightly higher-IPC CPU chosen to cover every issue, but they want low-power, multi-tasking eight cores, as it's not just a gaming console anymore, and AMD has this solution.
 
Posting paragraphs in bold just makes you look attention-seeking.

Thank you for your informative and constructive input.

Sony did plan to use two Cell chips in the PS3, as GPU and CPU; both can do render work, but they opted for an Nvidia GPU at the last minute. You can't really compare it, as the parts back then were selected for certain jobs. It had to be more capable.

That's one further point, which led me to the conclusion that the Cell is probably more capable than the new PS4 CPU. On the other hand, I looked up the power of the best desktop CPU for PCs and it had only around 170-180 GFLOPS; can that actually be true? So little? I actually thought we were not far from the first TFLOP CPUs...
 
Don't compare CBE to the average desktop CPU, numbers or otherwise; CBE would fall flat on its face in that environment, as would a desktop counterpart in an environment where the Cell is most beneficial.

Scalability is of far more importance than brute bandwidth that only a handful of developers will be able to tap into.
 
Let's get a few things straight: There are more versions of the CELL than the one found in the PS3. The CELL in there doesn't even have all of its SPUs fully operational. The CELL itself (if my memory serves) was benched at, I believe, ~200 GFLOPS or something like that, but I'm quite sure that that's not quite the same configuration that's running in the PS3, as it's measured on a system running all eight SPUs - and it's based on single precision matrix multiplication. Change the benchmark to a Linpack 1000x1000 double precision run, and the CELL achieves (according to IBM) 9.67 GFLOPS with all 8 SPUs.

And that's why you can't trust some theoretical peak GFLOPS numbers to actually indicate the performance of the CPU. Depending on the tasks it'll have to perform, the actual FLOPS output might be drastically lower than the peak (which is basically what T-12 just said). In the case of the CELL BE, the difference would be tremendous, according to some articles I've read in the past. Which, then, is why AMD's Jaguar is still a better solution than the CELL.
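
To put rough numbers on that gap, using the figures above (the peak is a theoretical single-precision maximum, while the Linpack number is IBM's measured double-precision result, so this is only an illustration of the order of magnitude):

```python
# Peak vs. measured: why one headline GFLOPS number can mislead.
cell_peak_sp = 8 * 3.2 * 8      # 8 SPEs * 3.2 GHz * 8 SP FLOPs/cycle = 204.8 GFLOPS (theoretical)
cell_linpack_dp = 9.67          # IBM's Linpack 1000x1000 double-precision result quoted above

print(f"Theoretical SP peak:  {cell_peak_sp:.1f} GFLOPS")
print(f"Measured DP Linpack:  {cell_linpack_dp} GFLOPS")
print(f"Gap: ~{cell_peak_sp / cell_linpack_dp:.0f}x")
```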

On the other hand, I looked up the power of the best desktop CPU for PCs and it had only around 170-180 GFLOPS; can that actually be true? So little? I actually thought we were not far from the first TFLOP CPUs...
From what I know, desktop CPUs are generally benched with double-precision benchmarks. That's like comparing one guy running in sports clothes and one guy running with 70 pounds of military equipment.

edit: Thought of a better analogy.

It's more like comparing two cars: one has its weight listed as 2,500 lbs, the other weighs in at 2,000 kg. You can't claim that the 2,000 kg car is lighter just because the number is lower. The same goes for simple single-precision benchmarks vs. complicated double-precision benchmarks.
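
Put into numbers (a trivial sketch of the unit mix-up the analogy is getting at):

```python
# The smaller number doesn't mean the lighter car when the units differ.
LBS_PER_KG = 2.20462

car_a_lbs = 2500                   # listed in pounds
car_b_lbs = 2000 * LBS_PER_KG      # listed in kilograms -> ~4409 lbs

print(car_a_lbs, "lbs vs", round(car_b_lbs), "lbs")
```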
 
Thank you for your informative and constructive input.



That's one further point, which led me to the conclusion that the Cell is probably more capable than the new PS4 CPU. On the other hand, I looked up the power of the best desktop CPU for PCs and it had only around 170-180 GFLOPS; can that actually be true? So little? I actually thought we were not far from the first TFLOP CPUs...

No, we're a long way off, aside from custom supercomputer 50+ core CPUs.

Like Luminis said, desktop CPUs are benched in double precision. How is the PS4 tested, maybe single precision?

There's a 4-core desktop version of the PS4's CPU out; if you run it in a GFLOPS bench, it's considerably weaker than a typical CPU, something like 20-30 GFLOPS vs. 70-150 on an i7. Even if you add the extra cores in, it's still weak. No surprise, though, as it draws very low power and is used in laptops. CBE GFLOPS don't mean much if you can't use them in the real world.
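
A rough sanity check on those ballpark numbers, assuming double-precision peaks (the clocks and core counts here are illustrative, not the exact chips being compared):

```python
# Rough DP peak estimates behind those ballpark figures (illustrative clocks/core counts).
# Jaguar: 128-bit SIMD, separate add and mul pipes -> ~4 DP FLOPs per cycle per core.
kabini_4core_gflops = 4 * 2.0 * 4   # ~32 GFLOPS peak -> a measured 20-30 is plausible
# A 256-bit AVX i7 (pre-FMA): ~8 DP FLOPs per cycle per core.
i7_4core_gflops = 4 * 3.4 * 8       # ~109 GFLOPS peak -> the 70-150 range fits
print(kabini_4core_gflops, i7_4core_gflops)
```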

I would only look at GPU GFLOPS/TFLOPS for a sense of power; these have gone up with GPU performance you can measure and see easily, from 400 GFLOPS years ago to 1-3 TFLOPS today.
 
Let's get a few things straight: There are more versions of the CELL than the one found in the PS3. The CELL in there doesn't even have all of its SPUs fully operational. The CELL itself (if my memory serves) was benched at, I believe, ~200 GFLOPS or something like that, but I'm quite sure that that's not quite the same configuration that's running in the PS3, as it's measured on a system running all eight SPUs - and it's based on single precision matrix multiplication. Change the benchmark to a Linpack 1000x1000 double precision run, and the CELL achieves (according to IBM) 9.67 GFLOPS with all 8 SPUs.
The PS3 usually doesn't run 64-bit.

No, we're a long way off, aside from custom supercomputer 50+ core CPUs.
The closest you can get is using an Intel Xeon Phi, but that doesn't really count. Although workstations would miss the point even more...
Let's see what happens when Intel brings out the standalone next-gen Phi platform they announced. I think they called it "Knights Landing".
 
Probably just a slightly defensive/"reassuring" statement from Kaz about the decision to release GT6 on the PS3.

I believe the Cell is great at number crunching, but Sony are aiming for GPGPU computation for the next generation, where the PS4 is going to massively outperform anything the Cell is capable of.

The game Resogun, for example, is almost entirely based on GPGPU computation. They are using it to give half a million voxels their own lighting, shadows, collision detection and physics, so that an entire level (and the enemies) can be made out of these and be fully destructible, all while running natively at 1080p/60 FPS. I doubt the Cell would even get close to matching that task.
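
A quick budget sketch of why the GPU route makes sense for that kind of workload (my own back-of-the-envelope numbers, not the developer's):

```python
# Per-voxel compute budget per frame, a rough illustration.
voxels = 500_000
fps = 60
ps4_gpu_flops = 1.84e12      # PS4 GPU theoretical peak
cell_flops = 0.18e12         # Cell SP peak, the figure quoted earlier in the thread

per_voxel_gpu = ps4_gpu_flops / (voxels * fps)   # ~61,000 FLOPs per voxel per frame
per_voxel_cell = cell_flops / (voxels * fps)     # ~6,000 FLOPs per voxel per frame
print(round(per_voxel_gpu), round(per_voxel_cell))
```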

This is being done by a small indie dev team; I'm sure PD will use similar tactics to calculate most of their physics and graphical effects. GT7 should look and feel glorious, and it will probably leave us foaming at the mouth. We're just going to have to wait.
 
Wasn't it confirmed that GT5's release was held up by the PS3 and its online features?
Nothing is preventing PD from making a GT game for 3-5 years now. It won't come out in 2014, we all know that; 2015 or 2016, I would say. The PS4 is just a very simple console, like the PS1. Expect FFXV to come soon as well; most games will be developed for the PS4 and ported to the X1, the opposite of what happened this gen.
 
I am curious what a PS3 Cell CPU could do with the same GPU and unified memory pool as the PS4 (GDDR5). Imagine the combined power of the Cell and a powerful GPU with 8 GB of GDDR5 :D or even better, a PS3 Cell with 4 GB of XDR RAM and 4 GB of GDDR5 for the GPU :drool:
 
It's surely marketing: simultaneously promoting the system's "strength" and the developer's ability to be one of the few to unleash its potential. An attempt to make sure that GT6 appears valid as the new consoles near.

Oh, and yes, capable does not equal superior.
 
The PS3 usually doesn't run 64-bit.
Precisely. And that's why you don't use its CPU for a system that does. This is also why this "CELL > Jaguar" topic in and of itself is rather stupid. The CELL simply isn't suited for what Sony wants to do with the PS4. As such, its inflated, out-of-context single-precision GFLOPS mean squat.
I am curious what a PS3 Cell CPU could do with the same GPU and unified memory pool as the PS4 (GDDR5). Imagine the combined power of the Cell and a powerful GPU with 8 GB of GDDR5 :D or even better, a PS3 Cell with 4 GB of XDR RAM and 4 GB of GDDR5 for the GPU :drool:
I don't know, I'd rather have a CPU that's suited for that kind of setup. 32-bit computing is soooo Windows XP.
 
I am curious what a PS3 Cell CPU could do with the same GPU and unified memory pool as the PS4 (GDDR5). Imagine the combined power of the Cell and a powerful GPU with 8 GB of GDDR5 :D or even better, a PS3 Cell with 4 GB of XDR RAM and 4 GB of GDDR5 for the GPU :drool:
Very bad combination. A successor to the PowerXCell 8i (not the chip in the PS3) would've been the only halfway reasonable option, due to the at least 5-8x double-precision performance the 8i already has, although that one uses DDR2 and not XDR. A 32i (name?) was planned, but IBM halted their Cell project, so don't expect too much.
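
For reference, the commonly cited double-precision peaks behind that 5-8x figure (rounded numbers as I remember them from IBM's material, so treat them as approximate):

```python
# Approximate DP peaks (GFLOPS), rounded.
cell_dp = 14.6            # original Cell BE: DP not fully pipelined on the SPEs
powerxcell_8i_dp = 102.4  # PowerXCell 8i: fully pipelined DP on the SPEs
print(f"~{powerxcell_8i_dp / cell_dp:.1f}x")   # roughly 7x
```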

By the way, a merged Jaguar/"7860" APU probably doesn't cost a third of what a next-gen Cell plus a dedicated "7860" would've cost, assuming they hit the yields (I just call it a 7860 because it sits between a 7850 and a 7870).
 