
Mubrik_

Member
Dec 7, 2017
2,725
Can someone more informed than me weigh in on this topic?

Because my understanding from watching years of Digital Foundry videos is that consoles use less power when playing less demanding games. So no console is running at "full speed" when playing a non-demanding game. This is why my PS4 Pro is quiet while playing indie games.

Liabe Brave made a very good post explaining this in this thread, I think.
Can't find it atm
 

UraMallas

Member
Nov 1, 2017
18,930
United States
Is the XSX more powerful than the ps5? Absolutely.

Does that invalidate his design approach? Absolutely not.

The way he approached the design is different from the usual; it's not completely proven yet, but from what he has detailed, the design is genius. The developers are not casting doubt on it; hell, some have even publicly posted how excited they are about it. And if some of the posters countering the design stopped to actually understand it before posting, maybe we'd be having more fruitful discussions about what the thermal solution might be (as Cerny was quite impressed with it) and how it's implemented, or how this design can be implemented going forward: if they have to add more CUs, how will cooling be done, etc.

E: shit, I thought I was still in the ps design thread.
No need to direct this at me. Argue with the person who I was interpreting. I could have been wrong about what they were saying but that was what I got out of it. I quoted someone who seemed to be really off base on what the other poster was saying, is all.
The SSD itself isn't super custom. And I believe the whole point of the power-limiting hijinks is so that the cooling and power system can be cheaper.
I think you're probably right. I'm only going off of the rumored BOM being close. The rumor had them within $50ish and def not up towards $100. I also don't know if controllers are included in the BOM? I thought I was reading something Albert Penello said where he said they weren't. I could be wrong on that. However, if they aren't included, it also seems to me like the PS5 controller will be more expensive to produce, what with all the new stuff they are putting in, plus the trackpad, plus an internal battery that (hopefully) doesn't suck. So. Who knows. The XSX controller added a share button and textured bumpers, after all.
 

RingRang

Alt account banned
Banned
Oct 2, 2019
2,442
Locked clocks don't mean a game will max out the CPU and GPU all the time, but it might need more power to keep running stable at those clocks. Like when you overclock a PC CPU and it needs a certain voltage to run stable.

PS5's SmartShift tech redirects power from CPU to GPU as the load requires, making the system efficient with its power usage and clock speeds. If a game isn't needing a lot of CPU power at a certain time, it can feed more power to the GPU, or vice versa. That's the idea anyway.
Thanks
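
A toy sketch of that power-shifting idea, assuming a single fixed budget shared between the CPU and GPU; all names and numbers here are illustrative, not AMD's actual SmartShift interface:

```python
# Toy sketch of budget-shifting between CPU and GPU under one fixed cap.
# Illustrative only: the budget and demand figures are made-up numbers.
TOTAL_BUDGET_W = 200.0  # hypothetical combined CPU+GPU power budget

def split_budget(cpu_demand_w: float, gpu_demand_w: float) -> tuple[float, float]:
    """Grant each unit its demand if the total fits; otherwise share pro rata."""
    total = cpu_demand_w + gpu_demand_w
    if total <= TOTAL_BUDGET_W:
        return cpu_demand_w, gpu_demand_w
    scale = TOTAL_BUDGET_W / total
    return cpu_demand_w * scale, gpu_demand_w * scale

# CPU-light, GPU-heavy frame: the GPU soaks up power the CPU isn't using.
print(split_budget(40.0, 160.0))  # -> (40.0, 160.0)
# Both sides over budget: each is scaled back proportionally.
print(split_budget(90.0, 160.0))  # -> (72.0, 128.0)
```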
 

Lady Gaia

Member
Oct 27, 2017
2,479
Seattle
The X has locked clocks and can power through tasks, even when the system isn't being pushed to the max. It easily gets the job done but is less efficient with its power usage, running full speed all the time, even when not needed, while PS5 is more efficient by changing CPU / GPU speeds as needed to get the task done.

This is a pretty fundamental misunderstanding of the differences. Many mobile devices do indeed keep clock speeds low with minimal activity in order to reduce battery consumption, but that's not the case with the PS5 nor the Series X as I understand them. It's a necessary compromise on a phone that comes with the downside that you never know for sure how long the next task is going to take, so typically the clock speed ramps up only after it has been busy for a brief period. This means that simple tasks are efficient but can take longer - a reasonable trade-off in a battery powered device.

On a game console, however, with a race to wrap up all the work necessary to produce a frame, it's essential that the CPU and GPU hit the ground running each and every time. Small tasks have to be done fast even if they're just handing off work to some other piece of hardware. Everything is a race to the finish line.

The difference with the PS5 is that it starts off every CPU and GPU task at such a high clock that some of the most demanding kinds of work can't be sustained on both without exceeding the cooling capacity of the device. Instead of ramping up, it ramps down, and according to its architect this should be rare and minimal (and I can completely understand why, as some of the most demanding legal series of instructions would involve doing a ton of vector math in registers without ever touching RAM, and this is an absurdly rare workload even if it is valid code.) The clever bit is that it does so in a completely deterministic fashion, so it runs the same workloads at the same clock speeds every time on every PS5. Frankly, no CPU or GPU has ever run at full capacity for any length of time anyway, as cache misses result in brief pauses all the time waiting for data to make its comparatively leisurely journey from RAM. There are enough complex variables involved that armchair architects should really cool their jets and wait to see how the two systems work in practice with workloads that matter: games.

Is it likely that the Series X will have a modest performance advantage? Sure, but ~16-20% seems like the ballpark difference which, as has been pointed out over and over, is much smaller than the gaps we observed in the last generation. Is an advantage guaranteed at that level? Not at all, as these are complex systems with a lot of interactions among details we aren't yet privy to. Are there other parts of the system design that might make a bigger difference in our gaming experience? Absolutely. The PS5's SSD advantage will be a legitimate topic of debate, but I'm betting on it being a Really Big Deal. Developers have been I/O starved for two generations and it's going to be fascinating to see what creativity this unleashes.

Even more intriguingly, there are other aspects of the console designs that impact our gaming experience. The latest talk didn't say a peep about the controllers, and we haven't actually seen one yet so it'll be interesting to get a more detailed overview and hands-on experience from a range of people. PSVR2 is also widely expected and we know effectively nothing at all about it. Interesting times, indeed.
 
Last edited:
Oct 25, 2017
11,716
United Kingdom
This is a pretty fundamental misunderstanding of the differences. Many mobile devices do indeed keep clock speeds low with minimal activity in order to reduce battery consumption, but that's not the case with the PS5 nor the Series X as I understand them. It's a necessary compromise on a phone that comes with the downside that you never know for sure how long the next task is going to take, so typically the clock speed ramps up only after it has been busy for a brief period. This means that simple tasks are efficient but can take longer - a reasonable trade-off in a battery powered device.

On a game console, however, with a race to wrap up all the work necessary to produce a frame, it's essential that the CPU and GPU hit the ground running each and every time. Small tasks have to be done fast even if they're just handing off work to some other piece of hardware. Everything is a race to the finish line.

The difference with the PS5 is that it starts off every CPU and GPU task at such a high clock that some of the most demanding kinds of work can't be sustained on both without exceeding the cooling capacity of the device. Instead of ramping up, it ramps down, and according to its architect this should be rare and minimal. The clever bit is that it does so in a completely deterministic fashion, so it runs the same workloads at the same clock speeds every time on every PS5. Frankly, no CPU or GPU has ever run at full capacity for any length of time anyway, as cache misses result in brief pauses all the time waiting for data to make its comparatively leisurely journey from RAM. There are enough complex variables involved that armchair architects should really cool their jets and wait to see how the two systems work in practice with workloads that matter: games.

Is it likely that the Series X will have a modest performance advantage? Sure, but ~16-20% seems like the ballpark difference which, as has been pointed out over and over, is much smaller than the gaps we observed in the last generation. Is an advantage guaranteed at that level? Not at all, as these are complex systems with a lot of interactions among details we aren't yet privy to. Are there other parts of the system design that might make a bigger difference in our gaming experience? Absolutely. The PS5's SSD advantage will be a legitimate topic of debate, but I'm betting on it being a Really Big Deal. Developers have been I/O starved for two generations and it's going to be fascinating to see what creativity this unleashes.

Even more intriguingly, there are other aspects of the console designs that impact our gaming experience. The latest talk didn't say a peep about the controllers, and we haven't actually seen one yet so it'll be interesting to get a more detailed overview and hands-on experience from a range of people. PSVR2 is also widely expected and we know effectively nothing at all about it. Interesting times, indeed.

Yeah it does seem like a lot of doom and gloom just because PS5 isn't quite as powerful. I'm sure the games will speak for themselves though and be great.

The I/O and SSD are incredibly exciting though, and devs seem to be very happy, so that's a good sign.

The DS5 controller and PSVR 2 are definitely far more interesting than all the Tflop talk. The controller sounds like it could be pretty awesome from the small bits of info we have about the haptic feedback, and I'm dying to know more about PSVR 2.
 

Phil me in

Member
Nov 22, 2018
1,292

The poster was replying to you listing a checklist of Series X features that brought nothing to the conversation.

What's the deal with accounts that have been active for years with only a small amount of posts, mainly only console warring lol.

I can't see Series X running at its full clock speed all the time, even in the OS menu? Seems like a terrible waste of power.
Did PS4 and Xbone run at full clocks all the time?
 

Pheonix

Banned
Dec 14, 2018
5,990
St Kitts
Oof. Maybe you should educate yourself about the XSX.
Excuse me? How is it not the same thing that's been done with consoles or PC CPUs or GPUs?

Let's even leave it at just game consoles...

Make a custom APU, set a clock frequency, build a cooling solution to keep it at that clock for as long as possible without overheating.

That's been the same way consoles have been made since they started needing a cooling solution.

I am talking about the design principle here. XSX is pretty much a beefier XB1X, made with a 2020 CPU and GPU and with a splash of the usual customizations that will always be in any semi-custom SOC.

So pls educate me, what am I missing? Which other device or system has been built the way Sony is building the PS5? Not saying one or the other is better... simply saying that what MS has done is conventional and what Sony has done isn't.
 

M3rcy

Member
Oct 27, 2017
702
So pls educate me, what am I missing? Which other device or system has been built the way Sony is building the PS5? Not saying one or the other is better... simply saying that what MS has done is conventional and what Sony has done isn't.

MS didn't have to do anything unconventional to clock the CPU higher and have more compute capability, though. So what did being unconventional actually achieve?
 

revben

Banned
Nov 21, 2017
57
The poster was replying to you listing a checklist of Series X features that brought nothing to the conversation.

What's the deal with accounts that have been active for years with only a small amount of posts, mainly only console warring lol.

I can't see Series X running at its full clock speed all the time, even in the OS menu? Seems like a terrible waste of power.
Did PS4 and Xbone run at full clocks all the time?
What does my post history have to do with the XSX not being brute force?
 

zombiejames

Member
Oct 25, 2017
11,934
MS didn't have to do anything unconventional to clock the CPU higher and have more compute capability, though. So what did being unconventional actually achieve?
A 36 CU chip will be smaller and cheaper to manufacture than a 52 CU chip. There might be other manufacturing savings (size, materials, etc) depending on what cooling solution Sony's come up with. We'll have to wait and see to get the full picture, but just based on the APU size alone Sony's saving money.
 

M3rcy

Member
Oct 27, 2017
702
A 36 CU chip will be smaller and cheaper to manufacture than a 52 CU chip. There might be other manufacturing savings (size, materials, etc) depending on what cooling solution Sony's come up with. We'll have to wait and see to get the full picture, but just based on the APU size alone Sony's saving money.

Maybe. I'm also expecting a smaller form factor, FWIW.
 

Lady Gaia

Member
Oct 27, 2017
2,479
Seattle
I can't see Series X running at its full clock speed all the time, even in the OS menu? Seems like a terrible waste of power.

Whether a CPU core idles when there's no work to do or whether it runs at a lower clock speed are generally two different considerations. There's a specific halt instruction in the architecture that stops the CPU core dead in its tracks and waits for an interrupt to resume normal execution.
 

marecki

Member
Aug 2, 2018
251
What are the respective power draws of each system?



People are denying that. People are debating with me which approach is better. It's almost comical.
It seems you just cannot see past that TF figure. Think of it as a thought experiment instead:

You are given a set number of GPU compute units, say 40, and you now have two options for how to build the system around them:
Option 1 (traditional, and what Xbox is using): unlock the power draw, try out different clocks until you narrow it down to a clock which 99.9% of the time will not overheat your console even if the GPU is running an extremely power-hungry workload. The frequency is then locked and the power fluctuates based on workload. You are still left with a variable power draw and hence variable thermals.

Option 2 (what PS5 is doing): lock your power to a maximum your cooling can handle, then aim and test for a maximum clock which 95% of the time comes in under your power cap, and for the remaining 5% of workload scenarios design the system so that the clock can be lowered to accommodate them; the reduction in clock doesn't need to be significant.

Now the main difference is that in option 1 your clock speed needs to accommodate 99.9% if not 100% of cases for how your workloads affect power draw and thermals. As the power draw is basically unlocked, your thermals are only predictable to a degree, so you need to be quite conservative with your clocks.

In option 2, power and thermals are capped and frequency is unlocked, so system utilisation becomes more deterministic and you can be much more aggressive with your clocks.
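
To make the contrast concrete, here's a purely illustrative simulation using a crude cubic power model; the cooling limit, base power, and workload intensities are invented, not figures from Sony or Microsoft:

```python
# Illustrative comparison of option 1 (fixed clock, variable power) and
# option 2 (fixed power cap, variable clock). All numbers are made up.

COOLING_LIMIT_W = 300.0  # hypothetical sustained cooling capacity
F_MAX = 2.23             # GHz, upper clock limit in the capped-power scheme

def power_draw(clock_ghz: float, intensity: float) -> float:
    """Crude model: 40 W base plus a workload-scaled cubic clock term."""
    return 40.0 + 40.0 * intensity * clock_ghz ** 3

# Option 1: one fixed clock, chosen so the worst-case workload
# (intensity 1.0) still stays under the cooling limit.
FIXED_CLOCK = 1.8
assert power_draw(FIXED_CLOCK, 1.0) <= COOLING_LIMIT_W

# Option 2: cap power instead, and run the highest clock that fits the
# cap for whatever workload is currently executing.
def capped_clock(intensity: float) -> float:
    f = ((COOLING_LIMIT_W - 40.0) / (40.0 * intensity)) ** (1 / 3)
    return min(f, F_MAX)

for intensity in (0.5, 0.9, 1.0):
    print(f"intensity {intensity}: fixed {FIXED_CLOCK:.2f} GHz "
          f"vs capped-power {capped_clock(intensity):.2f} GHz")
```

In this toy model the capped-power scheme matches or beats the fixed clock at every intensity, because the fixed clock has to leave headroom for the rare worst case.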
 

ShapeGSX

Member
Nov 13, 2017
5,228
Excuse me? How is it not the same thing that's been done with consoles or PC CPUs or GPUs?

Let's even leave it at just game consoles...

Make a custom APU, set a clock frequency, build a cooling solution to keep it at that clock for as long as possible without overheating.

That's been the same way consoles have been made since they started needing a cooling solution.

I am talking about the design principle here. XSX is pretty much a beefier XB1X, made with a 2020 CPU and GPU and with a splash of the usual customizations that will always be in any semi-custom SOC.

So pls educate me, what am I missing? Which other device or system has been built the way Sony is building the PS5? Not saying one or the other is better... simply saying that what MS has done is conventional and what Sony has done isn't.

If you hand wave away enough things, you can make either of them sound "conventional."
 

T0kenAussie

Member
Jan 15, 2020
5,101
Excuse me? How is it not the same thing that's been done with consoles or PC CPUs or GPUs?

Let's even leave it at just game consoles...

Make a custom APU, set a clock frequency, build a cooling solution to keep it at that clock for as long as possible without overheating.

That's been the same way consoles have been made since they started needing a cooling solution.

I am talking about the design principle here. XSX is pretty much a beefier XB1X, made with a 2020 CPU and GPU and with a splash of the usual customizations that will always be in any semi-custom SOC.

So pls educate me, what am I missing? Which other device or system has been built the way Sony is building the PS5? Not saying one or the other is better... simply saying that what MS has done is conventional and what Sony has done isn't.
You need to look at the software suite MS is making for XSX and not just the hardware configuration. ML upscaling, velocity architecture, VM blades, 3D audio configurations.

I agree MS went the stable route by not using a highly customised part set (other than the SSD footprint and external SSD), but the supporting software will provide a lot of gains to all games. BC games will receive auto HDR, better visuals and frame rates. It's an exciting time to be a console gamer ✌️
 

Mubrik_

Member
Dec 7, 2017
2,725
This is a pretty fundamental misunderstanding of the differences. Many mobile devices do indeed keep clock speeds low with minimal activity in order to reduce battery consumption, but that's not the case with the PS5 nor the Series X as I understand them. It's a necessary compromise on a phone that comes with the downside that you never know for sure how long the next task is going to take, so typically the clock speed ramps up only after it has been busy for a brief period. This means that simple tasks are efficient but can take longer - a reasonable trade-off in a battery powered device.

On a game console, however, with a race to wrap up all the work necessary to produce a frame, it's essential that the CPU and GPU hit the ground running each and every time. Small tasks have to be done fast even if they're just handing off work to some other piece of hardware. Everything is a race to the finish line.

The difference with the PS5 is that it starts off every CPU and GPU task at such a high clock that some of the most demanding kinds of work can't be sustained on both without exceeding the cooling capacity of the device. Instead of ramping up, it ramps down, and according to its architect this should be rare and minimal (and I can completely understand why, as some of the most demanding legal series of instructions would involve doing a ton of vector math in registers without ever touching RAM, and this is an absurdly rare workload even if it is valid code.) The clever bit is that it does so in a completely deterministic fashion, so it runs the same workloads at the same clock speeds every time on every PS5. Frankly, no CPU or GPU has ever run at full capacity for any length of time anyway, as cache misses result in brief pauses all the time waiting for data to make its comparatively leisurely journey from RAM. There are enough complex variables involved that armchair architects should really cool their jets and wait to see how the two systems work in practice with workloads that matter: games.

Is it likely that the Series X will have a modest performance advantage? Sure, but ~16-20% seems like the ballpark difference which, as has been pointed out over and over, is much smaller than the gaps we observed in the last generation. Is an advantage guaranteed at that level? Not at all, as these are complex systems with a lot of interactions among details we aren't yet privy to. Are there other parts of the system design that might make a bigger difference in our gaming experience? Absolutely. The PS5's SSD advantage will be a legitimate topic of debate, but I'm betting on it being a Really Big Deal. Developers have been I/O starved for two generations and it's going to be fascinating to see what creativity this unleashes.

Even more intriguingly, there are other aspects of the console designs that impact our gaming experience. The latest talk didn't say a peep about the controllers, and we haven't actually seen one yet so it'll be interesting to get a more detailed overview and hands-on experience from a range of people. PSVR2 is also widely expected and we know effectively nothing at all about it. Interesting times, indeed.

Good informative post.

Hopefully people read through before making counter arguments and rolling the thread around.
 

Deleted member 24021

User requested account closure
Banned
Oct 29, 2017
4,772
The poster was replying to you listing a checklist of Series X features that brought nothing to the conversation.

What's the deal with accounts that have been active for years with only a small amount of posts, mainly only console warring lol.

I can't see Series X running at its full clock speed all the time, even in the OS menu? Seems like a terrible waste of power.
Did PS4 and Xbone run at full clocks all the time?
JFC, you are not thinking critically at all. They pushed the GPU to that clock and designed their cooling solution to be able to sustain it. But there is no need to be running at that clock constantly, so they let whatever workload the GPU is doing determine what clock it runs at. They didn't just arbitrarily decide "we are going to lock the clock that high at 2.23GHz" and slap whatever cooling solution on it. RDNA clocks really high, up to 2GHz, and (TBP) power draw is roughly 250W. RDNA 2 has improved upon that by about 50% according to AMD, along with a more mature 7nm fabrication process. That is literally the simplest explanation.
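
Quick arithmetic on those perf-per-watt figures, taking the ~250W baseline from the quote and AMD's stated ~50% improvement at face value:

```python
# Rough iso-performance / iso-power arithmetic from the numbers above.
rdna_power_w = 250.0      # ~TBP for RDNA around 2GHz, per the quote
perf_per_watt_gain = 1.5  # AMD's stated ~50% RDNA 2 improvement

# Same performance as the RDNA part, delivered by RDNA 2:
print(f"iso-performance power: ~{rdna_power_w / perf_per_watt_gain:.0f} W")     # ~167 W
# Same 250 W budget, spent on RDNA 2:
print(f"iso-power performance: ~{(perf_per_watt_gain - 1) * 100:.0f}% higher")  # ~50%
```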

This. It's hard for people to understand that the "variable frequency" on the PS5 basically means that it won't be running the 2.23GHz clock on the main menu or map screens. The PS5 GPU will be at 2.23GHz when it demands that frequency.
 

Pheonix

Banned
Dec 14, 2018
5,990
St Kitts
If you hand wave away enough things, you can make either of them sound "conventional."
No handwaving here... I talked to a specific point, and that point stands. If the XB1X was made in 2020, with Zen 2 and RDNA 2, what you would end up with is the XSX.

I don't know why my saying that seems like it's a bad thing. It isn't. We could predict what kind of chip and even what clock the XSX was going to have, just by looking at its TF number and applying conventional wisdom. That's a fact, 'cause most of us made that prediction the second MS said 12TF. Now if Sony had come out and said 10.2TF, no one would have guessed that it was still being done with a 36CU GPU, because no one would have thought it even remotely possible for a GPU in a console to be clocked at 2.2GHz.

That by definition is unconventional. And how they went about cooling their chip: again, unconventional. MS is cooling theirs exactly how every console since the PS2 has cooled its chip to date.

None of this is a bad thing. I am not saying that MS did something wrong here, I am just saying that when it comes to the actual engineering of their hardware, they followed convention.
You need to look at the software suite MS is making for XSX and not just the hardware configuration. ML upscaling, velocity architecture, VM blades, 3D audio configurations.

I agree MS went the stable route by not using a highly customised part set (other than the SSD footprint and external SSD), but the supporting software will provide a lot of gains to all games. BC games will receive auto HDR, better visuals and frame rates. It's an exciting time to be a console gamer ✌️
I said nothing about the software suite, or even the customizations inside the APU that every company pretty much does if they are making a custom APU. Everything I was saying has to do with the actual engineered hardware, not the software stack backing it.

And even the SSD is conventional (well, as conventional as having an SSD in a console can be); it's a 2230 SSD, like the ones used in laptops and some Surface tablets. It's just not a consumer form factor, so you would only see it inside another product. The SSD solution Sony used literally doesn't exist on the market.

I don't even know why this is a topic of contention. People seem to forget that Sony is first and foremost a hardware company. If anyone would ever be doing anything crazy with hardware, it would be them.
 

ThatNerdGUI

Prophet of Truth
Member
Mar 19, 2020
4,551
You need to look at the software suite MS is making for XSX and not just the hardware configuration. ML upscaling, velocity architecture, VM blades, 3D audio configurations.

I agree MS went the stable route by not using a highly customised part set (other than the SSD footprint and external SSD), but the supporting software will provide a lot of gains to all games. BC games will receive auto HDR, better visuals and frame rates. It's an exciting time to be a console gamer ✌️

Are you suggesting MS is using "off the shelf" components?
 

icecold1983

Banned
Nov 3, 2017
4,243
Then by all means, educate me. : )

It's been explained in this thread and by Cerny himself in the presentation. The GPU will clock up to a max of 2.23GHz when thermal and power conditions allow it. When those become an issue it will clock down as low as needed. We don't know how low it will clock, but Cerny did state that they couldn't achieve a locked 2GHz, which is just over 10% less. By far the most reasonable deduction to be made from the information we have is that under heavy load the GPU will likely need to drop below 2GHz.

Clocks dropping during low-demand areas like menu screens etc. have nothing to do with a variable boost clock. That's a power-saving measure and has been in every GPU and CPU for ages now.
 

Dave.

Member
Oct 27, 2017
6,154
It's been explained in this thread and by Cerny himself in the presentation. The GPU will clock up to a max of 2.23GHz when thermal and power conditions allow it. When those become an issue it will clock down as low as needed. We don't know how low it will clock, but Cerny did state that they couldn't achieve a locked 2GHz, which is just over 10% less. By far the most reasonable deduction to be made from the information we have is that under heavy load the GPU will likely need to drop below 2GHz.

Clocks dropping during low-demand areas like menu screens etc. have nothing to do with a variable boost clock. That's a power-saving measure and has been in every GPU and CPU for ages now.
This has to be posting in bad faith at this point, good lord.

There is no, none, zero "thermal conditions" component of the PS5 downclocking algorithm. There is no "as needed".

Locked 2.0GHz was unachievable in the traditional design method (and handily demonstrated by the XSX, with its wind tunnel design and vapor chamber cooling, failing to even come close to 2.0GHz). It is EASILY achievable using this new method, as Cerny stated in the presentation.

You clearly haven't encountered a PS4 Pro if you think map and menu screens are "low demand areas".
 

Deleted member 24021

User requested account closure
Banned
Oct 29, 2017
4,772
It's been explained in this thread and by Cerny himself in the presentation. The GPU will clock up to a max of 2.23GHz when thermal and power conditions allow it. When those become an issue it will clock down as low as needed. We don't know how low it will clock, but Cerny did state that they couldn't achieve a locked 2GHz, which is just over 10% less. By far the most reasonable deduction to be made from the information we have is that under heavy load the GPU will likely need to drop below 2GHz.

Clocks dropping during low-demand areas like menu screens etc. have nothing to do with a variable boost clock. That's a power-saving measure and has been in every GPU and CPU for ages now.

Not how I understood it, but sure. Not sure what presentation you watched, because that's not what I heard Cerny say.
 

ShapeGSX

Member
Nov 13, 2017
5,228
This has to be posting in bad faith at this point, good lord.

There is no, none, zero "thermal conditions" component of the PS5 downclocking algorithm. There is no "as needed".

Locked 2.0GHz was unachievable in the traditional design method (and handily demonstrated by the XSX, with its wind tunnel design and vapor chamber cooling, failing to even come close to 2.0GHz). It is EASILY achievable using this new method, as Cerny stated in the presentation.

You clearly haven't encountered a PS4 Pro if you think map and menu screens are "low demand areas".

Clocking a chip higher is not always easily achievable. And it is not just a function of the clocking and voltage algorithm. The chip's critical timing paths need to be tuned for this to have a prayer of working. It's a long and laborious task. I suspect that AMD's engineers probably spent a while working on this for them. It may even be the reason behind the rather high E0 stepping mentioned in the GitHub leak.

Also, a lot of maps and menus end up running at ridiculously high frame rates if they are unlocked. That's often the reason for fans cranking up during those times.
 

Betelgeuse

Member
Nov 2, 2017
2,941
Clocking a chip higher is not always easily achievable. And it is not just a function of the clocking and voltage algorithm. The chip's critical timing paths need to be tuned for this to have a prayer of working. It's a long and laborious task. I suspect that AMD's engineers probably spent a while working on this for them.

Suffice it to say jacking the clocks this high wasn't a reaction to XSX's 12 TF. Sounds like a fairly herculean engineering effort.
 

Dave.

Member
Oct 27, 2017
6,154
Also, a lot of maps and menus end up running at ridiculously high frame rates if they are unlocked. That's the often reason for fans cranking up during those times.
This is beside the point - GoW in its 1080p performance mode has an uncapped frame rate the majority of the time. It doesn't come close to the howling fan of Warfare's menu (which may or may not have an uncapped frame rate, who knows). I feel we have had a few games issue patches to correct this issue in their menus at some point; was Rocket League one? But, like I said - beside the point. It doesn't matter if the extreme load comes from doing X at a very high frame rate or doing Y at a low frame rate - the point is that it is these troublesome sequences which are spotted ahead of time and accounted for, so very high clocks can be maintained for the remaining well-behaved code.
 

ShapeGSX

Member
Nov 13, 2017
5,228
Suffice it to say jacking the clocks this high wasn't a reaction to XSX's 12 TF. Sounds like a fairly herculean engineering effort.

No, there's more they could have done in reaction. Basically it means that they have to reject lower bins; their yield drops and the price goes up because they're only taking the faster chips. It's a trade-off that can be made late in the game.
 

Aladan

Member
Dec 23, 2019
496
Good video with interesting theories.

I think the consoles will both have the same price, but as always this is just speculation. In Germany/Europe this will make no real difference, because people here are traditionally more Sony/Nintendo, and even some of my friends that are only PC gamers will get a PS5 this time thanks to BC and the next first-party games.

I'm looking forward to seeing threads for game announcements and controller/other hardware.
 

Jekked

Alt Account
Banned
Mar 24, 2020
20
It's been explained in this thread and by Cerny himself in the presentation. The GPU will clock up to a max of 2.23GHz when thermal and power conditions allow it. When those become an issue it will clock down as low as needed. We don't know how low it will clock, but Cerny did state that they couldn't achieve a locked 2GHz, which is just over 10% less. By far the most reasonable deduction to be made from the information we have is that under heavy load the GPU will likely need to drop below 2GHz.

Clocks dropping during low-demand areas like menu screens etc. have nothing to do with a variable boost clock. That's a power-saving measure and has been in every GPU and CPU for ages now.

How are developers dealing with this, though? Under what circumstances are the clocks going down? And by how much?
Is it even deterministic? How can devs optimize around that?
Won't performance suffer from that? Or will developers just optimize their games around the lowest possible clock, so that it's always stable, even in the worst-case scenario?
 

icecold1983

Banned
Nov 3, 2017
4,243
This has to be posting in bad faith at this point, good lord.

There is no, none, zero "thermal conditions" component of the PS5 downclocking algorithm. There is no "as needed".

Locked 2.0GHz was unachievable in the traditional design method (and handily demonstrated by the XSX, with its wind tunnel design and vapor chamber cooling, failing to even come close to 2.0GHz). It is EASILY achievable using this new method, as Cerny stated in the presentation.

You clearly haven't encountered a PS4 Pro if you think map and menu screens are "low demand areas".

I said power and thermal.

If it's so easily achieved, why is the clock rate variable to begin with? I think you should go back and watch the presentation again; you seem confused.

It has nothing to do with the PS4 Pro and everything to do with how a game approaches rendering its menu.

How are developers dealing with this, though? Under what circumstances are the clocks going down? And by how much?
Is it even deterministic? How can devs optimize around that?
Won't performance suffer from that? Or will developers just optimize their games around the lowest possible clock, so that it's always stable, even in the worst-case scenario?

Yes, according to Cerny it's completely deterministic. We will have to wait for more details on the specifics of how clock speed is regulated, but it could be tied to some GPU usage markers. Just a guess at a possibility.

Also, I'd assume Sony knew roughly what power level Microsoft was bringing a long time before we did. Variable clocks don't have to be some last-minute reaction.
 

terawatt

Member
Oct 27, 2017
336
People here drawing conclusions on thermals/power/clock speed without any facts or data - it's amazing to read.
 

endlessflood

Banned
Oct 28, 2017
8,693
Australia (GMT+10)
It's been explained in this thread and by Cerny himself in the presentation. The GPU will clock up to a max of 2.23GHz when thermal and power conditions allow it. When those become an issue it will clock down as low as needed. We don't know how low it will clock, but Cerny did state that they couldn't achieve a locked 2GHz, which is just over 10% less. By far the most reasonable deduction to be made from the information we have is that under heavy load the GPU will likely need to drop below 2GHz.
This is all of the information we have about the 'continuous boost' setup:
  • The GPU is expected to spend most of its time at or close to 2.23 GHz and 10.3TF.
  • The CPU spends most of its time at 3.5 GHz.
  • A couple of percent downclocking results in a 10% reduction in power, so any downclocking is expected to be pretty minor.
The comment Mark Cerny made about not being able to hit 2GHz was if they had been using a cooling solution designed for fixed clocks. They're not using a cooling solution designed for fixed clocks.

Clocks dropping during low-demand areas like menu screens etc. have nothing to do with a variable boost clock. That's a power-saving measure and has been in every GPU and CPU for ages now.
The PS5 has a fixed power budget. While clock speeds are a factor in power usage, they're not the only factor, and that is the key point. It's what allows both the CPU and GPU to run at that maximum boost rate most of the time.

The other thing with a big bearing on power usage is the type of workload, or activity. Cerny specifically mentioned menus in his presentation.

He pointed out that, somewhat counterintuitively, processing simple geometry uses more power than processing dense geometry, which is why the fans go crazy in menus and map screens. Those are the sorts of times when the GPU will be clocked lower to stay within the power budget.
 

icecold1983

Banned
Nov 3, 2017
4,243
This is all of the information we have about the 'continuous boost' setup:
  • The GPU is expected to spend most of its time at or close to 2.23 GHz and 10.3TF.
  • The CPU spends most of its time at 3.5 GHz.
  • A couple of percent downclocking results in a 10% reduction in power, so any downclocking is expected to be pretty minor.
The comment Mark Cerny made about not being able to hit 2GHz was if they had been using a cooling solution designed for fixed clocks. They're not using a cooling solution designed for fixed clocks.


The PS5 has a fixed power budget. While clock speeds are a factor in power usage, they're not the only factor, and that is the key point. It's what allows both the CPU and GPU to run at that maximum boost rate most of the time.

The other thing with a big bearing on power usage is the type of workload, or activity. Cerny specifically mentioned menus in his presentation.

He pointed out that, somewhat counterintuitively, processing simple geometry uses more power than processing dense geometry, which is why the fans go crazy in menus and map screens. Those are the sorts of times when the GPU will be clocked lower to stay within the power budget.

If the cooling solution was the only barrier to higher clocks, it still wouldn't be variable.

You need to look at why simple geometry uses more power. More complex geometry means smaller triangles. Smaller triangles mean more developer effort required to fully saturate the GPU. Not every developer has the technical know-how to achieve utilization rates as high as the best devs.
 

KeRaSh

I left my heart on Atropos
Member
Oct 26, 2017
10,262
If it's so easily achieved, why is the clock rate variable to begin with? I think you should go back and watch the presentation again; you seem confused.

Because they had a set power draw in mind. Cerny said they could have easily gone even higher, implying that they just didn't want to slap a big ol' PC PSU in there.
 

Alexandros

Member
Oct 26, 2017
17,815
My personal objection to calling Sony's approach efficient is that Cerny himself said that a large amount of power is needed to push those clocks. "A couple percent of downclocking saves 10% power" means that you have to spend 10% of your power budget to get those clocks 2% higher, right? I wouldn't characterize this as efficient design.
 

Lady Gaia

Member
Oct 27, 2017
2,479
Seattle
How are developers dealing with this, though?

The same way they have dealt with the complex realities of running code on CPUs for ages: by testing code under real-world conditions. I expect Sony will have provided tools that show when the clock rate is reduced, just like they have tools to analyze branch mispredictions, cache hit ratios, hot code paths, and everything else a developer with a focus on performance needs in their toolkit. After all, this isn't the first or only reason code runs at different rates. We haven't had trivial cycle counts for instructions for a very, very long time.

Under what circumstances are the clocks going down? And by how much?

Generally speaking, when lots of logic is engaged by a series of instructions in rapid succession. SIMD instructions are a common culprit. Executing SIMD code that's all in cache, with no branch mispredictions, using operands that are already in registers or in L1 cache is probably close to your worst case. The only reference point we have is Mark Cerny describing a very modest reduction of a couple percent to save 10% of the power budget. That shouldn't require more than ~3% reduction which would take us from 2.23GHz to 2.16GHz, and only under rare circumstances.

Game code often spends time idle waiting for the GPU. It's not uncommon to want to get work done fast and then have the luxury of "resting" in an idle state. All actual computation would likely take place at full clock speeds in the kind of utilization traces we see in Horizon Zero Dawn, for instance.

Is it even deterministic? How can devs optimize around that?

It's entirely deterministic according to the description we have. It's based on workload and not ambient conditions. Developers have been optimizing around complex architectural interactions for a very long time now. This won't be significantly different.
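
For what it's worth, here's a minimal sketch of what deterministic, workload-driven clocking could look like, based purely on my reading of the public description; the power model, budget, and activity metric are all invented, not Sony's actual algorithm:

```python
# Sketch: clock as a pure function of the workload's modeled power draw.
# No temperature input anywhere, so the same code picks the same clock
# on every console. All constants are illustrative.

F_MAX = 2.23            # GHz, peak GPU clock (from the thread)
POWER_BUDGET_W = 180.0  # hypothetical fixed GPU power budget

def modeled_power_at_fmax(activity: float) -> float:
    """Power the workload would draw at F_MAX, from activity counters in [0, 1]."""
    return 60.0 + 140.0 * activity

def gpu_clock(activity: float) -> float:
    p = modeled_power_at_fmax(activity)
    if p <= POWER_BUDGET_W:
        return F_MAX
    # Scale down until modeled power fits the budget, assuming power
    # varies roughly with the cube of the clock.
    return F_MAX * (POWER_BUDGET_W / p) ** (1 / 3)

print(f"{gpu_clock(0.6):.2f} GHz")  # typical frame: full 2.23 GHz
print(f"{gpu_clock(1.0):.2f} GHz")  # pathological SIMD burst: ~2.15 GHz
```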

Won't performance suffer from that? Or will developers just optimize their games around the lowest possible clock, so that it's always stable, even in the worst-case scenario?

Outside of hard real-time systems where you're controlling something life-and-death critical like avionics, nobody takes the time to even understand the worst case scenario. You optimize code until it runs well in a wide range of test scenarios and call it a day. Performance is, on average, going to be much higher than you can get with fixed clocks ... and that's the whole point of designing something like this in the first place.
 

GhostTrick

Member
Oct 25, 2017
11,316
My personal objection to calling Sony's approach efficient is that Cerny himself said that a large amount of power is needed to push those clocks. "A couple percent of downclocking saves 10% power" means that you have to spend 10% of your power budget to get those clocks 2% higher, right? I wouldn't characterize this as efficient design.



People would rather read it the opposite way:
"Holy shit! Cerny is so good that he can reduce power consumption by 10% with just a few percent downclock."
 

Lady Gaia

Member
Oct 27, 2017
2,479
Seattle
My personal objection to calling Sony's approach efficient is that Cerny himself said that a large amount of power is needed to push those clocks. "A couple percent of downclocking saves 10% power" means that you have to spend 10% of your power budget to get those clocks 2% higher, right? I wouldn't characterize this as efficient design.

It's able to more closely track a power budget than a system with a fixed clock, so in that sense it's an efficient use of die space and the available parts budget. Yes, you could run the whole system at half the speed for a small fraction of the power. Of course, then you'd have a system running at half the speed and you're still paying just as much for it. So just about everyone runs silicon as fast as they can get away with reliably, unless cost is no object or power efficiency is incredibly important, as when considering battery life. It doesn't matter if it's an Intel CPU, AMD APU, or Nvidia GPU: around the clock speeds devices ship at, power scales at roughly the cube of the clock. This chip isn't an exception; it's just following the same curve as everyone else.
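
As a quick back-of-the-envelope check of that cube law, assuming P ∝ f³ holds near the shipped operating point:

```python
# Cubic power scaling near the peak clock (P ∝ f³ assumption).
F_MAX = 2.23  # GHz, PS5 GPU peak clock
for pct in (1, 2, 3):
    f = F_MAX * (1 - pct / 100)
    saved = (1 - (f / F_MAX) ** 3) * 100
    print(f"{pct}% downclock ({f:.2f} GHz) -> ~{saved:.0f}% less power")
```

That lands at roughly 3%, 6%, and 9% power saved, which lines up with the "a couple percent of downclocking saves 10% power" remark.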