PlayStation 4 General Discussion

  • Thread starter: Sier_Pinski
  • 9,445 comments
  • 633,354 views
Regardless, all five PS3s in my house (one Slim, one Fat, three Super Slims) have never had a single issue. But when it comes down to the Xbox, I've been through four since 2007.

And it sounds like an idiotic idea to put a PS3 in an oven. Wouldn't that dry/crack the thermal paste even more?
 
BWX
Actually they aren't idiots, it is a well-known way to fix a certain problem that makes a lot of PS3s experience YLOD, due to a design flaw, or them using the wrong type of thermal compound or something. I forget, I never looked into it much because I never had to do it, but a lot of people did. Google it, you will see.

I don't think it's a design flaw.
I don't think it's them using the wrong type of thermal compound.

Most YLODs are because the lead-free solder balls are lead-free solder balls (due to environmental groups clamping down and such).

I don't think Sony had control over which solder balls to use. I could be wrong though.

Also, sticking an electronic board in an oven is a stupid idea.
 
BWX
Actually they aren't idiots, it is a well-known way to fix a certain problem that makes a lot of PS3s experience YLOD, due to a design flaw, or them using the wrong type of thermal compound or something. I forget, I never looked into it much because I never had to do it, but a lot of people did. Google it, you will see.

It was to heat up the solder under the BGAs to re-establish a good electrical connection. Basically, the thermal paste wasn't always applied properly and the heat sinks weren't clamped to the board sufficiently, so the BGAs would heat up so much that the solder on the pads would melt and then solidify as dry or broken joints. Normally you'd use an IR board heater (basically a big hot plate) and a heat gun, but the average Joe doesn't have those lying around, so putting the board in the oven is the best way. The 360 had a similar fix for the RROD because that was the same issue, but it's actually possible to turn a 360 on without any of the cooling gear connected so the chips would bake themselves, so they were actually easier to fix; no oven required.

I say "for a while" because the two PS3s I fixed only lasted about half an hour each (long enough to back up data), but the 360 I fixed has since played a few hundred hours of Forza 3, 4 and Horizon.


And no, Sony had no choice in the matter; RoHS regulations state that all solder used in mass-produced electronics must be lead-free (typically a silver-bearing alloy), but that wasn't the issue. Lead-free silver solder has a higher melting point than tin/lead solder (hence why hobbyists still favour tin/lead; cheaper soldering irons cope with it fine) and all computers use the same silver-bearing solder in their BGAs too.

Oh and you're supposed to clean the thermal paste off because when you remove the heat sink after the PS3 has died, it's a useless baked-on mess. That's not unique to the PS3 either, whenever you remove a heat sink from anything with thermal paste on it, it has to be replaced.
 
Eks


And it sounds like an idiotic idea to put a PS3 in an oven. Wouldn't that dry/crack the thermal paste even more?

In theory, I think so, assuming it doesn't harm other components in the process. You wouldn't get it up to melting temperature anyway (if you were smart).
 
And no, Sony had no choice in the matter; RoHS regulations state that all solder used in mass-produced electronics must be lead-free (typically a silver-bearing alloy), but that wasn't the issue. Lead-free silver solder has a higher melting point than tin/lead solder (hence why hobbyists still favour tin/lead; cheaper soldering irons cope with it fine) and all computers use the same silver-bearing solder in their BGAs too.

I figured as much. Thanks for the clarification! 👍


As far as the oven idea goes, I'm pretty sure many local PC shops actually do reflows, with one or two even able to replace the solder balls. At least in my boondocksy area there are. :lol:
 
I don't think it's a design flaw.
I don't think it's them using the wrong type of thermal compound.

Most YLODs are because the lead-free solder balls are lead-free solder balls (due to environmental groups clamping down and such).

I don't think Sony had control over which solder balls to use. I could be wrong though.

Also, sticking an electronic board in an oven is a stupid idea.

No, it isn't a stupid idea just because you say it is.

Anyway, that's a little off topic I suppose, but in a way not really, because Sony should have figured out a way for that not to happen. RoHS or not.
 
BWX
No, it isn't a stupid idea just because you say it is.

Anyway, that's a little off topic I suppose,

So why did you bring it up in the first place? And if it isn't a stupid idea, why did you say this?

BWX
I mean the entire thing with taking it apart and baking it in an oven to "fix it for a while"?? Seriously? That doesn't concern you one little tiny bit? LOL.

It seems that you changed your opinion on the matter fairly quickly. The point is, people have been using irregular methods of fixing electronics for decades. Some safer than others. It's hardly a recommended fix.
 
So why did you bring it up in the first place? And if it isn't a stupid idea, why did you say this?



It seems that you changed your opinion on the matter fairly quickly. The point is, people have been using irregular methods of fixing electronics for decades. Some safer than others. It's hardly a recommended fix.

What are you talking about?

He said it was a stupid idea; I said it wasn't if it fixes a broken machine. Maybe it's off-topic to argue about it here.

I'm the one that brought it up in regards to PS3 reliability in the first place. Read the thread. I didn't change my opinion on anything. PS3s are still unreliable, and so are Xboxes, because they are both cheaply built.
 
In the post you originally replied to I mentioned that the PS3 isn't 100% reliable but has proved to be far more reliable than the 360. You then brought up the oven trick and said "that doesn't concern you at all? Lol". At what point in my original post did I condone said oven trick? I didn't. People have been going to extreme measures to try and fix broken electronics for decades. If they want to do it, fine by me, but don't bring it up and ask me to justify it.

Then you went on to say it wasn't a stupid idea. That's your opinion and you're entitled to it, but it seems that you did a complete 180 on the subject.

So I have to ask; does it concern you? LOL :rolleyes:
 
In theory, I think so, assuming it doesn't harm other components in the process. You wouldn't get it up to melting temperature anyway (if you were smart).

If you were smart, you wouldn't be putting a PS3 in an oven. :sly:
 
Is it true that games for PS3 won't run on PS4?

From what info is out there so far, it seems the discs won't, due to the Cell; however, they may run via Gaikai.

We don't know when and how that will affect your current PS3 library, so you may have to rebuy any PS3 games you want to play on PS4 digitally, or there may be a way of using the discs to re-download them for free.

Make sense?
 
I've still got my launch Fatty PS3 and will keep it until the PS4. I also kept my original launch PS2 and PS1 until upgrading to the next console. If you look after your possessions there is no reason they shouldn't last you a long time.

I would never stick anything like a console in an oven; that's just asking for a culinary disaster.
 
Well, it depends on whether AMD is manufacturing everything in the console, or just supplying the APU while the console itself is being made in a different plant, which I'm considering to be more likely.
I don't think that will be a factor in who knows what about each other's console.
Well, I'd still assume that significant changes in the hardware would require a lot of time, money and resources to develop and produce.
I don't think that would be the case even for a last-minute change, regarding development costs. Simple modifications like adding more CUs and increasing clock speeds should be easy for AMD to do, as they already have the technology for it. Prototyping a console case and cooling solutions is hardly going to cost a lot. What would cost a lot is if they had already started production and then decided to change things. Also, missing out on crucial time for getting it released would be costly. I do think, though, that if there are any significant changes, they have most likely already been planned beforehand. The only chance of big, unplanned last-minute changes is if they realised something was fundamentally wrong.
I'm not too sure of that. Sony did have some issues with manufacturing the PS3, after all.
That was regarding the Blu-ray diode shortage, IIRC. I don't think they will be struggling with something like a good cooling solution.
Being in development for three years doesn't mean the tech is as old.
I know, but the way you make it sound, if it is not already made or developed to its final form in the years before, it can't be changed towards the end.
I'm actually with you on this; I wish they would show something more, but, well, they don't need to. They'd be paying more than they'd need to without getting a significant additional amount of money out of it.
I don't think they would need to increase costs by much to give a noticeable improvement. I just feel the current system is a bit borderline.
Meh, let me put it this way: The deal didn't get AMD out of its desperate financial situation, so it can't have been that good (which is why I think the loss for NVIDIA isn't as big as some might think).
It is most likely profitable for AMD. It will help them in the bigger picture too, as games will now be designed for their GPUs and CPUs. Something like this would never turn around a company's financial prospects straight away. For NVIDIA it would potentially have been a loss, going by what they are saying. Most likely Sony and Microsoft wanted a far lower price than a GPU-only deal would have cost, as they would then have had to get a CPU from, say, Intel or AMD as well, which adds more cost.
We're not yet sure whether the PS4 will do 1080p on all its games, and you're expecting 4K? :indiff:
Well, the 720p 60FPS Battlefield 4 rumour was, AFAIK, acknowledged as fake quite quickly. I think 1080p will be the main target for these consoles, as expected. I think there will be more 4K games than there will be 720p-only games.
I wish the rumours surrounding the next Xbox were at least as stable as the ones surrounding the PS4...
They seem quite stable; similar to PS4. All depends on how much change is actually happening.
Which a) wasn't that far ahead of the 7800 and b) still isn't the equivalent of today's GTX 680. So what?
My point was that the 7950 series had a single-GPU version in it.
You know, playing Pong in 4K doesn't cut it. A German PC magazine tested Crysis 1 in 4K, on a system based around an HD 7970 Toxic overclocked to 1,200 MHz and matched with an i7. Crysis 1's benchmark ran at 14 FPS. That should give you an idea of just how taxing 4K is.
4K is not as taxing as quite a lot of people think; you can see GPUs running 4K games quite well on YouTube. I do think, though, that given the performance of the PS4, you will see it on, say, games that are probably low-budget, as strange as that sounds, and also potentially in future in sports games like FIFA. Shooters like the new Killzone and Battlefield games, I don't think we will see. GT might possibly have some 4K support, as PDI seem to be into that kind of thing. Maybe 4K re-releases will be something that gets pushed in the upcoming generation.
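To put the resolution jump in perspective, here is a quick bit of pixel arithmetic (a rough sketch of the shading workload only, taking "4K" to mean the 3840x2160 UHD format; nothing PS4-specific is assumed):

    # Simple pixel-count comparison, assuming "4K" means 3840x2160 UHD.
    pixels_1080p = 1920 * 1080        # 2,073,600 pixels per frame
    pixels_4k    = 3840 * 2160        # 8,294,400 pixels per frame
    print(pixels_4k / pixels_1080p)   # 4.0 -> four times the pixels to shade every frame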

It certainly would be. Though since you've ignored every reason why it won't every time you brought the topic up in the GT5 forum, and you've ignored all of the evidence that the PS4 is probably going to be shooting for 1080p@60fps for the best case based on the hardware it has, and you've ignored Sony's own statements that they've made multiple times for how the system won't support 4k for games, it still wouldn't be a feather in your cap so much as you just talking until it turns out you're right.
I don't really bring the topic up, IIRC; from what I remember I join in the discussion, and I counter as to why it would be possible. I do also think most games will be aiming for 1080p; I don't think I have said anything to the contrary. I have not ignored Sony's own statements either; maybe you should look at what they have said. Understandably, I know what you are trying to say, and I could say the same thing about you: "you've ignored every reason…" And in your case it might actually be possible to prove that correct by looking at post history, but I am not going to spend time looking at it and making such a claim against you. You just levelled an accusation I think you will find hard to back up using my post history.
I assume you're referring to the PS3 Super Slim news, where you kept repeating yourself over and over and over again and refused to clarify your statements why you felt it was being made, then when Sony actually announced it you acted as if you were right all along even though you might as well have been making it up as you went along for all the validity your original points had.
I did clarify my statements; you seemed to only want your own idea to be accepted. I noticed you deleted the word haughty from your reply, and I can understand why...
Good job that, within a day or so of discussing it with you, actual proof came out regarding a slimmer PS3. I don't think any amount of discussing it with you would convince you when your mind is so set on one thing. I also expect at least one more PS3 redesign, as long as they get over the hurdle of shrinking Cell to 22nm. There should be at least some revisions, though. Can you explain why they did release a revision then, if it was not for any of my reasons?

I'm going to go out on a limb here and say that in my opinion, 4K in a 'proper' game (AAA release like GT or Battlefield) just isn't going to happen. Resolution doesn't count for anything when you're playing at 24fps, which is the current maximum you can get through HDMI 1.4. 30fps is bad enough, but 24...? No thanks. It probably will do 4K films though, I see no reason why it wouldn't be able to do that. But if the hardware can't physically do any better than 4K at 24fps... What's the point? I'd rather have a lower resolution that runs smoother.

As far as I know, HDMI 2.0 (which is intended to support 4K at 60fps) doesn't exist yet. If the PS4 has HDMI 2.0 when it's released, I'll take this post back!
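For anyone curious, a back-of-the-envelope bandwidth check shows roughly why HDMI 1.4 tops out around 4K at 24/30Hz (this counts active pixels only at 24 bits per pixel and ignores blanking and audio, so treat it as a sketch, not a spec calculation):

    # Rough video bandwidth estimate: active pixels only, blanking/audio ignored.
    def video_gbps(width, height, bits_per_pixel, fps):
        return width * height * bits_per_pixel * fps / 1e9

    # HDMI 1.4 carries 10.2 Gbps of TMDS bandwidth; 8b/10b encoding leaves ~8.16 Gbps for payload.
    hdmi_14_payload_gbps = 10.2 * 8 / 10

    print(video_gbps(3840, 2160, 24, 24))   # ~4.8 Gbps  -> fits within ~8.16 Gbps
    print(video_gbps(3840, 2160, 24, 60))   # ~11.9 Gbps -> well over it, hence the need for HDMI 2.0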
GT will probably be possible; I don’t think Battlefield will be.
I do hope they get HDMI 2 in the new PS4 somehow.
Wait, are we seriously going to have the 4k discussion AGAIN? Even though Sony has outright stated PS4 won't be able to do 4k games and everything we know about the hardware suggests the same thing?
Someone from Sony also said this: Link
And the recent interview with Mark, even though it's a translation, seems to point in this direction as well. They are now in the initial stages of supporting 4K games. Given the power, I expect 1080p games to be the focus; it would make no sense to push 4K as the goal for all developers even if the PS4 had twice the power. You would end up with games not fully utilising the PS4 at 1080p, which will be the most common resolution their target audience has for quite some time.
 
BWX
but in a way not really, because Sony should have figured out a way for that not to happen. RoHS or not.

RoHS regulations had nothing to do with it; it was Sony's poor quality control when it came to the application of thermal paste, and the poor design of their heat sink mounts resulting in insufficient thermal transfer, so the heat was dumped via the solder connections. I think lead solder would've actually been worse because of its lower melting point. But yes, off topic. Sorry.
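For rough context, the commonly quoted melting points back that up (approximate values, and they vary by exact alloy):

    # Commonly quoted melting points; approximate, alloy-dependent.
    sn63_pb37_c    = 183   # deg C, traditional eutectic tin/lead solder
    sac_leadfree_c = 217   # deg C, typical SAC-type lead-free alloy used in RoHS-era BGAs
    print(sac_leadfree_c - sn63_pb37_c)   # ~34 deg C more headroom before the joints go liquid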
 
How does this apply to the PS4? These are PC tests.

It certainly doesn't show it's coming to the PS4.
If you look at it in the context of what I posted, it would probably make more sense to you.

Now I will use something more relevant to prove a simple point: performance-wise, a 5870 is worse than a 7850, and the PS4 is most likely more powerful than a 7850. Games on PC, AFAIK, are not optimised for these higher resolutions. Now look at what a 5870 can do at resolutions higher than 4K:

[Image: "Six Monitors" benchmark chart]


Now I would think most can gather from this that if a 5870 can run higher than 4K at a good frame rate, then there is a high chance we will see 4K gaming on the PS4 when it becomes more relevant, much like 3D did this generation.
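For context, here is the pixel maths, assuming that chart uses the usual 3x2 Eyefinity grid of 1080p panels (that layout is my assumption, since the original image isn't reproduced here):

    # Assumed layout: 3x2 grid of 1920x1080 panels, i.e. 5760x2160.
    eyefinity_6 = (3 * 1920) * (2 * 1080)   # 12,441,600 pixels
    uhd_4k      = 3840 * 2160               #  8,294,400 pixels
    print(eyefinity_6 / uhd_4k)             # 1.5 -> roughly 1.5x the pixels of 4K UHD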
 
A Core i7 at 3.33GHz paired with a GTX 680, DX9. Not a PS4.

Also:

That's not to say that our first test was a complete success, however. The stark, pristine nature of the visuals looked a little "boxey", while low-polygon objects and basic textures could look pretty rough

The same can't be said for Crytek's visual masterpiece, Crysis 2. In DirectX 11 mode, we lost our consistent 30FPS simply by moving from the high settings to very high (that's from low to medium by any other game's definition) while extreme and ultra were completely off the table, frame-rate collapsing whenever effects work dominated the scene.

Despite some artwork upgrades over the console versions, close-up camera viewpoints on Batman and his supporting cast and environmental detail reveal textures that are simply too low-res to look effective when rendered in ultra-HD.


--

Clearly a game intended for 4k output is going to need much better textures, pushing performance even further down.

EDIT: The same can be said for your above example: those older games were not designed for 4K output, so all you're doing is upscaling poor textures. That isn't going to be good enough for games designed to output at 4K.
 
A Core i7 at 3.33GHz paired with a GTX 680, DX9. Not a PS4.

Also:

That's not to say that our first test was a complete success, however. The stark, pristine nature of the visuals looked a little "boxey", while low-polygon objects and basic textures could look pretty rough

The same can't be said for Crytek's visual masterpiece, Crysis 2. In DirectX 11 mode, we lost our consistent 30FPS simply by moving from the high settings to very high (that's from low to medium by any other game's definition) while extreme and ultra were completely off the table, frame-rate collapsing whenever effects work dominated the scene.

Despite some artwork upgrades over the console versions, close-up camera viewpoints on Batman and his supporting cast and environmental detail reveal textures that are simply too low-res to look effective when rendered in ultra-HD.


--

Clearly a game intended for 4k output is going to need much better textures, pushing performance even further down.

EDIT: The same can be said for your above example: those older games were not designed for 4K output, so all you're doing is upscaling poor textures. That isn't going to be good enough for games designed to output at 4K.
That system can do DX11 at 4K as well, and I like the way you posted everything negative from that article. It does, however, show what I was saying: games currently out are not optimised for these high resolutions. Something more positive from the article below:

The overall impression was even more striking than Battlefield 3: this game took on a whole new level at 4K, with an astonishing wealth of detail in the visuals and effects work, along with motion blur doing an excellent job of mitigating the traditional 30FPS judder.

"Take your pick: Crysis at high/DX11 or very high/DX9 both offer a stunning, smooth 4K experience on our Core i7/GTX 680 gaming PC."

On top of that it was really rare for us to find much in the way of sub-par texture work and in terms of overall detailing, only foliage and trees felt like a bit of a let-down - and that was mostly down to a lack of geometry on these elements, which really stood out at this extreme resolution. Indeed, overall detailing was so fine that in some places we actually had difficulty capturing footage in real time - by using mathematically lossless compression we could acquire video and stream it to a Samsung 840 SSD at around 180MB/s. Crysis 2's detail level was a compression nightmare in certain areas, with bandwidth spikes well in excess of the drive's maximum 260MB/s sequential write speed.

And finally, about texture quality: the PS4 having 8GB of GDDR5 will help in that regard. This is the frame-rate hit on Crysis 2 from higher-resolution textures:

[Image: Crysis 2 DX11 high-resolution textures frame-rate chart]
 
"On our Core i7/GTX 680 gaming PC". Exactly what the PS4 isn't. TFLOPS figure from Sony puts it between 7850 and 7870, still far off the GTX 680 in those tests.
 
Just curious: why post these 4K tests when they are on systems superior to what the PS4 will be? What does it prove?
 
I don't think that will be a factor in who knows what about each other's console.
Simply put, I do think a lot of insider information is available for the companies in question. Even more so thanks to the link that is AMD.
I don't think that would be the case even for a last-minute change, regarding development costs. Simple modifications like adding more CUs and increasing clock speeds should be easy for AMD to do, as they already have the technology for it. Prototyping a console case and cooling solutions is hardly going to cost a lot. What would cost a lot is if they had already started production and then decided to change things. Also, missing out on crucial time for getting it released would be costly. I do think, though, that if there are any significant changes, they have most likely already been planned beforehand. The only chance of big, unplanned last-minute changes is if they realised something was fundamentally wrong.
First off, if the hardware wasn't at least very similar to what's going to be released, we wouldn't have coherent leaks and dev kits wouldn't be out. What we've seen recently are last-minute changes. It doesn't get much later than this if Sony wants to get the console out in time.

That was regarding the Blu-ray diode shortage, IIRC. I don't think they will be struggling with something like a good cooling solution.
Not for the cooling, but for the R&D on energy consumption, reliability, heat development, practicability, and also on business cases for the profitability of such measures. Easier decisions take months at my company, that's for sure.

I know, but the way you make it sound, if it is not already made or developed to its final form in the years before, it can't be changed towards the end.
You have to stop changing stuff somewhere and that's usually when you announce your product. Once you think you're finished, you announce it. Sony has announced it, so they think they're done. Only morons would announce a product when significant changes are to be expected.

I don't think they would need to increase costs by much to give a noticeable improvement. I just feel the current system is a bit borderline.
It is borderline, but they won't care.

It is most likely profitable for AMD. It will help them in the bigger picture too, as games will now be designed for their GPUs and CPUs. Something like this would never turn around a company's financial prospects straight away. For NVIDIA it would potentially have been a loss, going by what they are saying. Most likely Sony and Microsoft wanted a far lower price than a GPU-only deal would have cost, as they would then have had to get a CPU from, say, Intel or AMD as well, which adds more cost.
Games will be optimized for a low-end APU that's mostly prevalent in laptops. It won't make too much of a change in terms of PC gaming.

Well, the 720p 60FPS Battlefield 4 rumour was, AFAIK, acknowledged as fake quite quickly. I think 1080p will be the main target for these consoles, as expected. I think there will be more 4K games than there will be 720p-only games.
I'm convinced that there will be zero 4k games, as Sony said themselves.
That system can do DX11 at 4K as well, and I like the way you posted everything negative from that article. It does, however, show what I was saying: games currently out are not optimised for these high resolutions. Something more positive from the article below.



And finally, about texture quality: the PS4 having 8GB of GDDR5 will help in that regard. This is the frame-rate hit on Crysis 2 from higher-resolution textures:
You do realize the game is running on a system that utilises a GPU that's about 1.5 times as powerful as the PS4 as a whole, right? And even then, it's struggling with a now-outdated game on medium settings.
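For what it's worth, the commonly quoted peak compute figures roughly bear that out (shader counts and clocks below are the published specs; treat the results as approximate, paper numbers only):

    # Peak single-precision throughput ~= shader count x 2 ops per clock (FMA) x clock speed.
    def tflops(shaders, clock_ghz):
        return shaders * 2 * clock_ghz / 1000.0

    print(tflops(1152, 0.800))   # PS4 GPU (18 CUs @ 800 MHz): ~1.84 TFLOPS
    print(tflops(1024, 0.860))   # HD 7850 (16 CUs @ 860 MHz): ~1.76 TFLOPS
    print(tflops(1280, 1.000))   # HD 7870 (20 CUs @ 1 GHz):   ~2.56 TFLOPS
    print(tflops(1536, 1.006))   # GTX 680 (base clock):       ~3.09 TFLOPS, roughly 1.7x the PS4 GPU on paper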
 
Luminis
And you don't think the same issue could creep up with 720p vs. 1080p? I most certainly do, especially since the hardware will have to last for a good bit of time.

It shouldn't be an issue. A GTX 460 can handle 1080p PC games. There's no reason why the PS4 shouldn't be able to from day one and beyond.

Luminis
Wow, it serves the needs of a 2011 game well. Add the PS4's operating system and you've got little headroom for the next six or so years it'll be in service. Also, the RAM and VRAM aren't the only things that matter, in case you've forgotten.

Everyone except NVIDIA has spoken positively of the PS4's performance capabilities. If they're happy with it, I'm happy.

Luminis
It'd be in between the 680 and 690. NVIDIA's line-up isn't as diverse as it was back in the day. Nevertheless, the 680 is the 7800's equivalent, so I don't quite get what you're trying to do here. Find a single-GPU card that outperformed the last-generation consoles by a factor of 1.5?

From everything I've read and heard, the PS4 GPU is a "supercharged" 7850, while the PS3's GPU was a "dumbed down" 7800.

As for the "supercharged" parts (in the PS4), or the parts that SCE extended, he said, "There are many, but four of them are representative." They are (1) a structure that realizes high-speed data transmission between the CPU and GPU, (2) a structure that reduces the number of times that data is written back from the cache memory in the GPU, (3) a structure that enables to set priorities in multiple layers in regard to arithmetic and graphics processing and (4) a function to make the CPU take over the preprocessing to be conducted by the GPU.

Link

(PlayStation 3 RSX) Based on the G71 chip, in turn based on the 7800, but with cut-down features like lower memory bandwidth and only as many ROPs as the lower-end 7600.

Link

So the comparisons aren't that simple.
 
So the comparisons aren't that simple.

I said that a day or two ago, but some people just don't understand the differences, or in some cases I think it just does not suit their agenda. MANY processes that are currently software-rendered on the GPU because of potential bottlenecks in the way a PC does things can be pushed to the CPU quite easily on the PS4, thanks to the speed at which the CPU and GPU can communicate with each other, freeing up (otherwise wasted) processing power on the GPU for more GPU-related tasks.

So to say the PS4 GPU is comparable to a same-spec PC GPU is nonsense in this case. If the architecture were the same as a PC's then yes, that would be the case, but because of the very high bandwidth on the PS4 the two aren't directly comparable.
 