PlayStation 5 Rumors Latest: Console Reveal Within the Next Six Weeks

10 TFLOPS, ow. Ah well, it will be easier to wait and see for a Pro version with 15 TF+ and more memory.
It's worth noting that Cerny pointed out that one should not rely on TFLOPS alone as a metric, and indeed each PS5 GPU TFLOPS is doing more than each PS4's. Just looking at the equivalence, it seems like the PS5 GPU is doing the work of a PS4-spec GPU running at over 16TFLOPs.
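As a quick bit of arithmetic on that equivalence (using only the 10.28 TFLOPS spec and the ~16 TFLOPS figure above, so the scale factor is just derived from those two numbers, not an official figure):

ps5_tflops = 10.28
ps4_equivalent_tflops = 16.0                                # the equivalence suggested above
work_per_ps5_tflop = ps4_equivalent_tflops / ps5_tflops     # ~1.56 "PS4 TFLOPS" per PS5 TFLOP
uplift_over_base_ps4 = ps4_equivalent_tflops / 1.84         # ~8.7x the 1.84 TFLOPS base PS4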
 
So the PS5 uses some form of Precision Boost, which makes sense because it's Zen 2, but it's really weird that they decided on an obtuse 825GB SSD. :odd:

It seems like the SSD could prioritize TLC and QLC, but it's up in the air at the moment. Interesting presentation, but they really couldn't take it upon themselves to show the damn thing off?

It'll be a 1TB SSD, but after over-provisioning, system reserves, and the other overhead that's needed, the 825GB is what'll be available to the end user - or at least that's my guess.
 
It's worth noting that Cerny pointed out that one should not rely on TFLOPS alone as a metric, and indeed each PS5 GPU TFLOPS is doing more than each PS4's. Just looking at the equivalence, it seems like the PS5 GPU is doing the work of a PS4-spec GPU running at over 16TFLOPs.
Like he said though, you can't really compare PS4 TFLOPS to PS5 TFLOPS, but this will get compared to the Xbox's TFLOPS, and who knows how well those numbers map to actual performance! :confused:
 
This was announced as a hardware architecture deep dive, and that's exactly what it was.

Also everyone, don't immediately discount it based on TFLOPS vs Xbox. All the optimizations in the surrounding hardware, plus all the coprocessors, may actually make it faster in practice than the Xbox.

That's not exactly how that would work. You can't make up non-existent horsepower if said horsepower isn't available to you in the first place.

Now, what the PS5 could do is have a more efficient pipeline that leads to better performance in the sense of stability. But in terms of head-to-head power, the PS5 is lesser than the Series X because both GPUs are RDNA 2, meaning they are 1:1 for comparison's sake. Is Cerny correct in saying one shouldn't place too much importance on TFLOP measurements? Yes, but that's a general rule of thumb anyway, as it's peak theoretical performance, which we'll never see.
 
That's not exactly how that would work. You can't make up non-existent horsepower if said horsepower isn't available to you in the first place.

Now, what the PS5 could do is have a more efficient pipeline that leads to better performance in the sense of stability. But in terms of head-to-head power, the PS5 is lesser than the Series X because both GPUs are RDNA 2, meaning they are 1:1 for comparison's sake. Is Cerny correct in saying one shouldn't place too much importance on TFLOP measurements? Yes, but that's a general rule of thumb anyway, as it's peak theoretical performance, which we'll never see.
Sure, but the point is that systems usually aren't actually capable of running at their top speed all the time, as they are bottlenecked by the surrounding hardware. Sony seems to have really focused on removing those bottlenecks, so they may be able to run at peak performance for a higher percentage of the time.
 
Also everyone, don't immediately discount it based on TFLOPS vs Xbox. All the optimizations in the surrounding hardware, plus all the coprocessors, may actually make it faster in practice than the Xbox.
Was the Xbone ever actually faster in practice than the PS4?
 
The variable processing speed to balance heat production makes me nervous though. If a game leans on the maximum speed available too much, it might overheat and shut off, making that peak speed impractical in reality. But games always try to use as much as they can.
 
Well Eurogamer got impatient, here are the leaked specs:
[image: leaked PS5 spec sheet]

That's all I really needed & wanted to know. I got more out of that than the talk itself. Nice guy, but boring presentation...
 
The variable processing speed to balance heat production makes me nervous though. If a game leans on the maximum speed available too much, it might overheat and shut off, making that peak speed impractical in reality. But games always try to use as much as they can.

That's a non-issue. The CPU would throttle (rapidly decrease its clock speeds to lower the temperature) long before getting hot enough that the system would have to shut itself off. By the sound of it, both the GPU and CPU use Precision Boost (just boost frequency in the case of the GPU), which specifically takes into account temperature, workload, power consumption, current draw, etc.

Basically, you've nothing to worry about.
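For anyone curious what that kind of boost behaviour looks like in principle, here's a minimal sketch of a power/thermal-aware clock governor. Everything in it (the clock floor, temperature and power limits, step sizes) is made up for illustration; it's not Sony's or AMD's actual algorithm.

# Illustrative boost governor: back the clock off smoothly as temperature or
# power approach their limits, long before any emergency-shutdown threshold.
def pick_gpu_clock_mhz(temp_c, power_w, load):
    MAX_CLOCK = 2230          # advertised peak clock
    BASE_CLOCK = 1800         # hypothetical sustained floor
    if load < 0.5:            # light workloads don't need full boost at all
        return BASE_CLOCK
    clock = MAX_CLOCK
    if temp_c > 85:           # shed ~20 MHz per degree over the soft limit
        clock -= (temp_c - 85) * 20
    if power_w > 180:         # stay inside the (made-up) power budget
        clock -= (power_w - 180) * 5
    return max(BASE_CLOCK, min(MAX_CLOCK, clock))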
 
Sure, but the point is that systems usually aren't actually capable of running at their top speed all the time, as they are bottlenecked by the surrounding hardware.
Since when? Sony introducing variable clock speeds to offset what sounds like cooling concerns (from the higher clocked but lower-specced GPU, perhaps?) isn't a competitive advantage for the PS5 if the XBox SeX doesn't have problems with overheating.



Really, it's okay if the PS5 is somewhat less powerful. It truly is. The ten percent raw performance difference or whatever probably won't make that much of a difference in practice beyond edge cases; certainly much less so than the bad old days of PS3 versions of games made on the 360. There's no reason to claim the PS5 is somehow super optimized in its design while the nearly identical hardware of the Xbox isn't, and that this will somehow make them perform the same.
 
Have to say the I/O and storage speeds are an amazing increase from last gen though.
And more efficient.
@Terronium-12
That's a non-issue. The CPU would throttle (rapidly decrease its clock speeds to lower the temperature) long before getting hot enough that the system would have to shut itself off. By the sound of it, both the GPU and CPU use Precision Boost (just boost frequency in the case of the GPU), which specifically takes into account temperature, workload, power consumption, current draw, etc.

Yeah, but that could mean slowdowns, no?

Sorry, I missed quoting you.
 
Since when? Sony introducing variable clock speeds to offset what sounds like cooling concerns (from the higher clocked but lower-specced GPU, perhaps?) isn't a competitive advantage for the PS5 if the XBox SeX doesn't have problems with overheating.



Really, it's okay if the PS5 is somewhat less powerful. It truly is. The ten percent raw performance difference or whatever probably won't make that much of a difference in practice beyond edge cases; certainly much less so than the bad old days of PS3 versions of games made on the 360. There's no reason to claim the PS5 is somehow super optimized in its design while the nearly identical hardware of the Xbox isn't, and that this will somehow make them perform the same.
That's not what I'm saying, I'm just saying that we don't know enough about the Xbox optimizations and coprocessors and things like that to conclusively say that it is the better machine. I'm also not saying that the Xbox is slower. I'm just saying that TFlops don't tell the full story, and a lower TFlops machine can in certain real-world situations outperform a higher TFlops machine.

My point is: people shouldn't be worrying because the PS5 TFlops is slightly lower than the Xbox TFlops.

And even if it is a bit slower, it will still be an incredible machine.
 
TFlops tell quite a bit more of a story when they are TFlops numbers on the same architecture.

Yeah, but that could mean slowdowns, no?

Sorry, I missed quoting you.
It could, but unless it's a serious and sudden drop in clocks it probably wouldn't be that perceptible in gaming contexts. You're talking single-digit frame differences, and there are plenty of games that have bigger swings than that naturally.
 
That's not what I'm saying, I'm just saying that we don't know enough about the Xbox optimizations and coprocessors and things like that to conclusively say that it is the better machine. I'm also not saying that the Xbox is slower. I'm just saying that TFlops don't tell the full story, and a lower TFlops machine can in certain real-world situations outperform a higher TFlops machine.
There's a pretty detailed article on the Series X specs and design from Eurogamer's Digital Foundry.
They of course did try to put a lot of optimization and customization in too, in different ways.
I need to reread it.
 
TFlops tell quite a bit more of a story when they are TFlops numbers on the same architecture.
Coprocessors are also part of the architecture, and they will differ between Xbox and PS. So they don't use the exact same architecture. They do use the same CPU/GPU platform, but there is more to it than that.
 
I'm getting original Xbox vs PS2 vibes from this. I bet the graphical difference will be something along those lines, despite the spec differences being much more clear-cut back then.
 
Well, the whole system design and every part is important, not just GPU/CPU power.
But I still want to be sure to get everything it can give, and at good performance.

So I'll wait and see how it goes, and whether there will be a Pro later on.
It is a little underwhelming just from this low-key presentation, but let's wait and see the results.
Seems you can plug in external SSDs, but they will need to match the Sony one.

About ray tracing, it seems it's still really early to know what devs can do.
He says it can do global illumination; shadows, although those are very taxing; reflections, of which he says he saw a really good implementation with very low cost; and 3D audio.

That's why I want a Pro: to be more sure of getting all the shadow detail looking great and smooth, GI etc. looking smooth, clean and fast, max physics, audio, etc.
And a bigger SSD hopefully, better than spending on external ones after.
External drive is the USB3 HDD from your PS4, not an SSD. There is an internal slot for an extra SSD.
 
Just a thought..
Are those 10.28 TFLOPS calculated from the boost frequency?
If so, in some cases we might get less than that?
 
djs
Just a thought..
Are those 10.28 TFLOPS calculated from the boost frequency?
If so, in some cases we might get less than that?

That's kinda what I am thinking.

On a side note, based on what is here:
Fewer CUs, but a higher frequency.
Xbox has more CUs but a lower frequency.

I wonder if, under max load, both consoles would end up about equal.
The Xbox would have to lower its TF due to heat, and the PS5 could raise its clocks because it has more thermal headroom?
 
https://blog.eu.playstation.com/202...ls-of-playstation-5-hardware-technical-specs/


CPU

  • x86-64-AMD Ryzen™ “Zen 2”
  • 8 Cores / 16 Threads
  • Variable frequency, up to 3.5 GHz
GPU

  • AMD Radeon™ RDNA 2-based graphics engine
  • Ray Tracing Acceleration
  • Variable frequency, up to 2.23 GHz (10.3 TFLOPS)
System Memory

  • GDDR6 16GB
  • 448GB/s Bandwidth
SSD

  • 825GB
  • 5.5GB/s Read Bandwidth (Raw)
PS5 Game Disc

  • Ultra HD Blu-ray™, up to 100GB/disc
Video Out

  • Support of 4K 120Hz TVs, 8K TVs, VRR (specified by HDMI ver.2.1)
Audio

  • “Tempest” 3D AudioTech
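Out of curiosity, here's how the 10.3 TFLOPS figure in that table falls out of the 2.23 GHz clock, assuming the 36 CU count from the Road to PS5 talk (the CU count isn't listed in the table above):

cus = 36
shaders_per_cu = 64
flops_per_shader_per_clock = 2    # one fused multiply-add counts as 2 FLOPs
clock_ghz = 2.23
tflops = cus * shaders_per_cu * flops_per_shader_per_clock * clock_ghz / 1000
# 36 * 64 * 2 * 2.23 / 1000 = 10.28, i.e. the 10.3 TFLOPS in the spec sheet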
 
I guess the thermal throttling could allow the console to stretch its legs in more favorable local environments, not having to always protect itself from the lowest common denominator of space, ambient temperature, or cleanliness. So long as the thing only runs hot in less-than-ideal conditions...
 
djs
Just a thought..
Are those 10.28 TFLOPS calculated from the boost frequency?
If so, in some cases we might get less than that?

Yes, but not every game is going to need the full boost frequency.

Let's take ACC vs AC. The former puts my RTX 2080 Ti into boost clocks even just hotlapping. AC, on the other hand, barely gets the graphics card moving, even in VR with a full grid.

F1 2019 doesn't even get the fans moving despite my very aggressive fan curve, and neither does Wreckfest.

It's not at all necessary to get 450FPS when you've only got a 60 or 144Hz monitor, so it makes no sense to run the system to its limit in those cases.
 
It's worth noting that Cerny pointed out that one should not rely on TFLOPS alone as a metric, and indeed each PS5 GPU TFLOPS is doing more than each PS4's. Just looking at the equivalence, it seems like the PS5 GPU is doing the work of a PS4-spec GPU running at over 16TFLOPs.

With the tests that Digital Foundry ran, RDNA 1 was about 40% faster Tflop per Tflop compared to Polaris.

Let's take the absolute worst-case scenario with the following assumptions.

Polaris TFlop (PS4 Pro) = Pitcairn TFlop (PS4)
RDNA TFlop = RDNA2 TFlop
Realistic GPU clock for PS5 is actually 10% lower = 9.25 Tflop
Normalizing PS5 Tflop to PS4 Tflop = 9.25*1.4 = ~13 Tflop
Times PS5 is more powerful than PS4 = 13/1.84 = ~7 Times

With the increase in resolution to dynamic 4K (1600p - 2160p), let's assume it takes 3 times more power than PS4, so 1.84 * 3 = ~5.5

Power available to improve image quality at a higher resolution = 13 / 5.5 = ~2.36 Times PS4

This kind of power bump is in line with the increased power that is available to Forza Horizon developers vs Motorsport developers (30fps vs 60fps). So a decent bump in visual quality can be expected but not as significant as PS3 > PS4 (This is the worst case scenario).

The only thing that could potentially be an issue is the memory bandwidth. 448 GB/s might not be enough when it's being shared with a powerful CPU like Zen 2 and running at 4K.

On the positive side, there is hardware RT, which might free up some additional GPU power as some tasks are offloaded to the dedicated hardware (but is it truly dedicated like the XBSX's?).
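If it helps, here's that worst-case arithmetic written out step by step, using exactly the assumptions listed above (the 1.4x factor is the Digital Foundry RDNA-vs-Polaris result mentioned earlier; everything else is the post's own assumptions):

ps4_tflops = 1.84
ps5_tflops = 10.28
realistic_ps5 = ps5_tflops * 0.9                       # assume clocks sit ~10% below peak -> ~9.25
rdna_vs_polaris = 1.4                                  # ~40% more work per TFLOP (DF testing)
ps5_in_ps4_terms = realistic_ps5 * rdna_vs_polaris     # ~12.95 "PS4 TFLOPS", rounded to ~13
raw_uplift = ps5_in_ps4_terms / ps4_tflops             # ~7x the base PS4
resolution_cost = ps4_tflops * 3                       # assume dynamic 4K costs 3x the PS4 workload -> ~5.5
quality_headroom = ps5_in_ps4_terms / resolution_cost  # ~2.36x PS4 left for image quality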
 
External drive is the USB3 HDD from your PS4, not an SSD. There is an internal slot for an extra SSD.
Yeah thanks, I mean extra; there's an expansion slot for an SSD, plus you can plug in standard external hard drives.

The sound processing unit seems pretty capable. It's said to be the equivalent of the 8 Jaguar CPU cores in the PS4, but on a specialized AMD GPU CU.
It can process complex 3D audio with hundreds of sound sources, and it can be enjoyed on standard stereo headphones.
 
Coprocessors are also part of the architecture, and they will differ between Xbox and PS.
What "coprocessors," and what real world situation do you think will occur that will cause them to eliminate performance differences caused by a more powerful GPU and CPU? What "bottlenecks" is the PS5 going to overcome because of these "coprocessors" that the XBox will not be able to, assuming the XBox has the bottlenecks to begin with? Because the biggest bottleneck for PC hardware that has become much more prevalent since the PS4/Xbone came out is boost frequency and the subsequent performance losses from heat, and that is one that Sony seems more concerned about than Microsoft is.


The only "coprocessors" I've seen mentioned for the PS5 so far are integrated into the SSD's I/O controller, which does explain the higher performance of it compared to the Xbox SeX's presumably more "off the shelf" integration; but if I built a game console with a 3800x and a 2080 hooked up to a SATA-6 SSD it is going to perform better than a 3700/2070 Super with an M.2. The differences aren't going to be big, but the the SSD speed isn't going to do much of anything to offset them regardless.

So they don't use the exact same architecture. They do use the same CPU/GPU platform, but there is more to it than that.
Not terribly much more in an industry that has been using x86 boxes for 7 years now; so much so that you can generally even make a pretty good guess at what design AMD started with when they were designing their custom solutions for the console makers (Cerny has even said there's a chance there will be a PC equivalent to the PS5 GPU on the market when the PS5 launches). Last console generation, the most "exotic" thing either of them did was Microsoft sticking some eSRAM on the GPU for the XBone, but the PS4 still walked it because it was still basically the same hardware with better specs.



On a side note, based on what is here:
Fewer CUs, but a higher frequency.
Xbox has more CUs but a lower frequency.

I wonder if, under max load, both consoles would end up about equal.
The Xbox would have to lower its TF due to heat, and the PS5 could raise its clocks because it has more thermal headroom?
Traditionally, it has been the other way around. That is, the higher clocked GPU of the PS5 would be more sensitive to heat than the slower clocked but "higher" specced one in the XBox. AMD GPUs have been space heaters for a while now, so hopefully Big Navi has solved some of that and it won't be that big of a deal.



Keep in mind that just because Microsoft didn't say the system has variable clock rates like Sony did, that doesn't necessarily mean it will run full clocks all the time even when there's no load. I'm speculating here, but I'd guess what Microsoft is doing (since they didn't outright say that the clocks would vary) is specifying the clock rates that both will go up to; whereas Sony is relying on boost frequency for the performance and specs that they gave. Basically the XBox will run hotter but hold onto the clocks on both the CPU and GPU (and lower them when there's no load), whereas the PS5 will boost to a certain speed but pare back more as temperatures increase. So long as the PS5's cooling solution is adequate for the design, there's not really a definite advantage to either solution. For Zen 2 in particular, there's been a lot of argument since it debuted about whether using PBO provides more performance than a traditional manual overclock.
 
djs
Just a thought..
Are those 10.28 TFLOPS calculated from the boost frequency?
If so, in some cases we might get less than that?

Basically, what @Slapped already said, but I'll add onto it with this: AMD already has a solution to that in what they call "Gaming Frequency", which is basically a sustained clock when running games. And with Cerny mentioning that the PS5 supplies "generous" power to both the CPU and GPU so the clocks can fluctuate accordingly, they're both TDP and TGP limited <--- all that means is they'll be power limited depending on whatever the profile is.

The total boost frequency is 2230 MHz, and let's say the gaming frequency is...2055MHz. That's still 9.47 TFLOPs, which is above and beyond the PS4 Pro.
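Plugging that hypothetical 2055 MHz into the same TFLOPS formula (36 CUs x 64 shaders x 2 FLOPs per clock, with the 36 CU count from the Road to PS5 talk):

gaming_clock_ghz = 2.055                         # hypothetical sustained "gaming frequency"
tflops = 36 * 64 * 2 * gaming_clock_ghz / 1000   # = 9.47, vs the PS4 Pro's 4.2 TFLOPS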
 