2Blackcats

Member
Oct 26, 2017
16,053
This please.

I think this was one thing I did not understand in the presentation. He mentioned this after talking about high clocks, but I'm not sure how it worked in relation to them.

Can anyone expand on it?

I guess the speed at which the GPU accesses the memory is defined by the memory speed, so it's unaffected by GPU clock boosts?
 

Pheonix

Banned
Dec 14, 2018
5,990
St Kitts
This please.

I think this was one thing I did not understand in the presentation. He mentioned this after talking about high clocks, but I'm not sure how it worked in relation to them.

Can anyone expand on it?
You have GPU speed and you have memory access. For simplicity's sake, I will use basic numbers.

Let's say the typical GPU speed is 10, the memory access speed is 1, and the GPU accesses memory every 2 cycles. If you increase the GPU speed to 20, then instead of only 2 GPU cycles passing for every memory access, 4 cycles pass; relative to the work the GPU can get through, the memory is now effectively twice as far away.
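A tiny sketch of that ratio, if it helps. The numbers are the made-up ones from this post, not real hardware figures, and the fixed access interval is my own assumption for the toy model:

```python
# Toy model: memory serves data at a fixed wall-clock rate, so raising only
# the GPU clock widens the gap, measured in GPU cycles, between accesses.
# All numbers are the illustrative ones from the post, not real specs.

MEM_ACCESS_INTERVAL = 0.2  # wall-clock time between memory accesses (arbitrary units)

def gpu_cycles_between_accesses(gpu_clock):
    """GPU cycles that elapse while waiting for the next memory access."""
    return gpu_clock * MEM_ACCESS_INTERVAL

print(gpu_cycles_between_accesses(10))  # 2.0 cycles at the "typical" speed of 10
print(gpu_cycles_between_accesses(20))  # 4.0 cycles after doubling the GPU speed
```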
 

Pheonix

Banned
Dec 14, 2018
5,990
St Kitts
I have a feeling PS5 is going to be a beast.
True, it will... they both will. I am deeply concerned about memory bandwidth though. It seems like such an obvious potential bottleneck, and it's also such an easy (albeit more expensive) fix.

I just don't see how 448GB/s would be enough for the CPU plus a GPU that does so much more.
 

PJV3

Member
Oct 25, 2017
25,676
London
True, it will... they both will. I am deeply concerned about memory bandwidth though. It seems like such an obvious potential bottleneck, and it's also such an easy (albeit more expensive) fix.

I just don't see how 448GB/s would be enough for the CPU plus a GPU that does so much more.

I assume their own devs will be telling them if it's a serious problem.
 

cooldawn

Member
Oct 28, 2017
2,445
Is PS5 going to have noticeably worse RT though if it's generally running games at a slightly lower dynamic resolution than the same game on Xbox? RT scales pretty much linearly with resolution, so if PS5 is pushing a lower resolution then it's also pushing fewer rays for equivalent RT visuals. I cannot honestly see 3rd party PS5 RT games looking massively different from XSX RT games apart from the resolution.
I'm just going by what I read here, but yeah, if RT scales nicely with resolution then sure, there's no reason why 3rd party PlayStation 5 games couldn't at least match. Dictator thinks bandwidth is key rather than clock frequency, though.

I'm not really concerned with 3rd party games though. Sony's WWS will push their own titles past 3rd party anyway, which is why...

Do you believe that first-party PS5 games could have better looking RT than PS5 multi-platform games?
...I'd gladly bet Polyphony Digital will be at the very forefront of this technology to a point that is noticeably better than any 3rd party implementation and comparable to, regardless of bandwidth, any Microsoft title.

Essentially, in the best case scenario, it'll take the eyes of Digital Foundry/NXGamer to make the distinctions, because I think the majority of gamers will see similar 'OMG, this is beautiful' results from Sony and Microsoft.
 

Pheonix

Banned
Dec 14, 2018
5,990
St Kitts
I assume their own devs will be telling them if it's a serious problem.
This, unfortunately, is something where it wouldn't matter whether the devs like it or not. Having finalized their APU and designed their SSD, the RAM (along with the APU clocks) is the only component they can still change right now. Everything else is set in stone. All they have to do is go from 14Gbps RAM chips to 16Gbps/18Gbps chips. That's the difference between 448GB/s and 512GB/s/576GB/s respectively.

They wouldn't have to tinker with anything else; it's as simple as a chip swap.

However, for Sony, it's a trade-off: an easier 30% fps boost vs spending an additional $1B.

Yup, sorry devs... deal with it.
 

TheRealTalker

Member
Oct 25, 2017
21,451
We are getting another DF video in the future, right? Something about the interview they had with Cerny.
--------
III-V your avi is hilarious by the way
 
Aug 9, 2018
666
Nothing I typed changes anything about what was presented at the talk, which is just as valid as it was said there. One component uses more power when the other one is not using it, same as what was presented there. Both cannot max the power budget, as was also said there.
But the only example given there was that when the CPU is not being fully utilized, it can send unused power to the GPU. The inverse might not be true, since Cerny never mentioned it; or is that not possible? Also, couldn't both stay near the max power budget in games that use the CPU and GPU equally hard?
 

Elios83

Member
Oct 28, 2017
976
This, unfortunately, is something where it wouldn't matter whether the devs like it or not. Having finalized their APU and designed their SSD, the RAM (along with the APU clocks) is the only component they can still change right now. Everything else is set in stone. All they have to do is go from 14Gbps RAM chips to 16Gbps/18Gbps chips. That's the difference between 448GB/s and 512GB/s/576GB/s respectively.

They wouldn't have to tinker with anything else; it's as simple as a chip swap.

However, for Sony, it's a trade-off: an easier 30% fps boost vs spending an additional $1B.

Yup, sorry devs... deal with it.

For them it's pretty clear that price matters and that it had an influence on design.
Cerny stated they had to be price conscious about what's inside when he explained why the SSD is 825GB and not 1TB or more.
Same for faster DRAM. So it's always a trade-off between performance and price.
It will be interesting to see if they can indeed swallow the loss, thanks to services like Plus and Now, to launch at their now-usual $399 price point. Some design decisions (smaller APU with 36CUs, 14Gbps RAM chips and the 825GB SSD) are definitely cost-saving measures on the most expensive parts of the design. Yes, they're paying for the high frequency with a custom cooling solution, but according to the Bloomberg BOM estimate that still amounts to just a few dollars of cost vs less than a dollar for a conventional cooling solution.

We are getting another DF video in the future, right? Something about the interview they had with Cerny.
--------
III-V your avi is hilarious by the way
They said as much; Cerny talked with them about other details and features not discussed in the GDC presentation. I don't know how many weeks we'll need to wait to learn about that stuff.
 

makanuihoho

Member
Jun 4, 2019
5
For them it's pretty clear that price matters and that it had an influence on design.
Cerny stated they had to be price conscious about what's inside when he explained why the SSD is 825GB and not 1TB or more.
Same for faster DRAM. So it's always a trade-off between performance and price.
It will be interesting to see if they can indeed swallow the loss, thanks to services like Plus and Now, to launch at their now-usual $399 price point. Some design decisions (smaller APU with 36CUs, 14Gbps RAM chips and the 825GB SSD) are definitely cost-saving measures on the most expensive parts of the design. Yes, they're paying for the high frequency with a custom cooling solution, but according to the Bloomberg BOM estimate that still amounts to just a few dollars of cost vs less than a dollar for a conventional cooling solution.


They said as much; Cerny talked with them about other details and features not discussed in the GDC presentation. I don't know how many weeks we'll need to wait to learn about that stuff.
I agree, I think Sony's approach will make much more sense when we see the price. I fully believe Sony's main design criteria were $399 and a fast SSD. I also think the form factor will be quite a bit smaller than the XSX. Things become much more interesting if the PS5 and XSX are the same price.
 

Elios83

Member
Oct 28, 2017
976
But the only example given there was that when the CPU is not being fully utilized, it can send unused power to the GPU. The inverse might not be true, since Cerny never mentioned it; or is that not possible? Also, couldn't both stay near the max power budget in games that use the CPU and GPU equally hard?

This stuff has now been sufficiently well explained.
There is a max power budget, which is a fixed top quantity, and the cooling solution is designed around that.
Both the CPU and GPU can work at their top frequencies together as long as the total workload does not exceed the power cap.
Power consumption is a function of the workload and of the specific operations/instructions you're using, not just of the working frequency, which is why both can run simultaneously at their top frequencies.
The power budget is basically spent by devs and games based on what they want to do, and can be shifted to the GPU or CPU based on where it's needed at any given moment.
It's unlikely that a game will use both the CPU and GPU, constantly and in every moment, at a combined workload that requires a downclock of either one, and if that happens Cerny has stated the downclock will be minor, because lowering clocks by just 2% is sufficient to lower power consumption by 10%.
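For intuition on that last claim, here's a back-of-envelope sketch. The CMOS dynamic-power relation P ≈ C·V²·f is standard, but the voltage-vs-frequency exponent is my own assumption to make the numbers work; none of this is from the talk itself:

```python
# Sketch of why "lower clocks by 2%" can mean "lower power by 10%".
# Dynamic power: P ~ C * V^2 * f. Assume (my assumption) that near the top
# of the curve the required voltage scales as V ~ f^k, so P ~ f^(2k + 1).

def power_after_downclock(freq_scale, k):
    return freq_scale ** (2 * k + 1)

# k = 1 (voltage proportional to frequency) gives plain cubic scaling:
print(1 - power_after_downclock(0.98, k=1))  # ~0.06 -> only ~6% power saved
# A steeper voltage curve near the cap (k = 2) reproduces Cerny's figure:
print(1 - power_after_downclock(0.98, k=2))  # ~0.10 -> ~10% power saved
```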
 

amstradcpc

Member
Oct 27, 2017
1,768
This stuff has now been sufficiently well explained.
There is a max power budget, which is a fixed top quantity, and the cooling solution is designed around that.
Both the CPU and GPU can work at their top frequencies together as long as the total workload does not exceed the power cap.
Power consumption is a function of the workload and of the specific operations/instructions you're using, not just of the working frequency, which is why both can run simultaneously at their top frequencies.
The power budget is basically spent by devs and games based on what they want to do, and can be shifted to the GPU or CPU based on where it's needed at any given moment.
It's unlikely that a game will use both the CPU and GPU, constantly and in every moment, at a combined workload that requires a downclock of either one, and if that happens Cerny has stated the downclock will be minor, because lowering clocks by just 2% is sufficient to lower power consumption by 10%.
Great explanation!
 

zombiejames

Member
Oct 25, 2017
11,918
This stuff has now been sufficiently well explained.
There is a max power budget, which is a fixed top quantity, and the cooling solution is designed around that.
Both the CPU and GPU can work at their top frequencies together as long as the total workload does not exceed the power cap.
Power consumption is a function of the workload and of the specific operations/instructions you're using, not just of the working frequency, which is why both can run simultaneously at their top frequencies.
The power budget is basically spent by devs and games based on what they want to do, and can be shifted to the GPU or CPU based on where it's needed at any given moment.
It's unlikely that a game will use both the CPU and GPU, constantly and in every moment, at a combined workload that requires a downclock of either one, and if that happens Cerny has stated the downclock will be minor, because lowering clocks by just 2% is sufficient to lower power consumption by 10%.
This clears it up for me 👍
 
Aug 9, 2018
666
This stuff has now been sufficiently well explained.
There is a max power budget, which is a fixed top quantity, and the cooling solution is designed around that.
Both the CPU and GPU can work at their top frequencies together as long as the total workload does not exceed the power cap.
Power consumption is a function of the workload and of the specific operations/instructions you're using, not just of the working frequency, which is why both can run simultaneously at their top frequencies.
The power budget is basically spent by devs and games based on what they want to do, and can be shifted to the GPU or CPU based on where it's needed at any given moment.
It's unlikely that a game will use both the CPU and GPU, constantly and in every moment, at a combined workload that requires a downclock of either one, and if that happens Cerny has stated the downclock will be minor, because lowering clocks by just 2% is sufficient to lower power consumption by 10%.
I know; that is what I got from the presentation as well, but Dictator has said that won't be the case and that only one of the two can reach its top frequency. That is why I asked whether it's only the CPU that can send unused power to the GPU and not the other way around, meaning both could still reach their top frequencies at the same time because the GPU, even when not fully utilized, wouldn't send its unused power to the CPU.

Edit: In addition, IIRC the slide only illustrates the CPU sending unused power to the GPU, not the other way around.
 

Elios83

Member
Oct 28, 2017
976
I know; that is what I got from the presentation as well, but Dictator has said that won't be the case and that only one of the two can reach its top frequency. That is why I asked whether it's only the CPU that can send unused power to the GPU and not the other way around, meaning both could still reach their top frequencies at the same time because the GPU, even when not fully utilized, wouldn't send its unused power to the CPU.

Cerny stated clearly that they expect both the CPU and GPU to spend most of their time at their top frequencies.
If that weren't possible and, for example, the GPU ran most of its time at its top frequency, then the CPU would need to run most of its time downclocked. That is simply not what he stated: both are expected to run at their top frequencies, with minor downclocks in worst-case scenarios.
I'd take what Cerny has stated as fact atm. The power cap is the limit; running at top frequency is just one part of the power-budget equation. It also depends on what you're asking the CPU or GPU to do at that frequency, that is, the instructions you're using in your code.
If further details emerge and things are better explained, I'm open to it.
 

Decarb

Member
Oct 27, 2017
8,641
Something tells me the PSU is going to be external this time. Just a feeling, but I could be wrong.
 

tzare

Banned
Oct 27, 2017
4,145
Catalunya
Cerny stated clearly that they expect both the CPU and GPU to spend most of their time at their top frequencies.
If that weren't possible and, for example, the GPU ran most of its time at its top frequency, then the CPU would need to run most of its time downclocked. That is simply not what he stated.
I'd take what Cerny has stated as fact atm. The power cap is the limit; running at top frequency is just one part of the power-budget equation. It also depends on what you're asking the CPU or GPU to do at that frequency, that is, the instructions you're using in your code.
If further details emerge and things are better explained, I'm open to it.
Exactly. He even explained that load is not necessarily tied to frequency, and it's the workload that can make a downclock necessary even at max frequency.
So the scenario would be: max frequency on both CPU & GPU, AND a workload that exceeds the power budget, which he stated is not what usually happens.
 
Aug 9, 2018
666
Cerny stated clearly that they expect both the CPU and GPU to spend most of their time at their top frequencies.
If that weren't possible and, for example, the GPU ran most of its time at its top frequency, then the CPU would need to run most of its time downclocked. That is simply not what he stated: both are expected to run at their top frequencies, with minor downclocks in worst-case scenarios.
I'd take what Cerny has stated as fact atm. The power cap is the limit; running at top frequency is just one part of the power-budget equation. It also depends on what you're asking the CPU or GPU to do at that frequency, that is, the instructions you're using in your code.
If further details emerge and things are better explained, I'm open to it.
Yeah, I just wanted to check that I understood it right. As I said, that was what I got from the presentation.
I expect it to only work in that direction since the CPU is general purpose (unlike a GPU).
Got it, thanks.
 

Elios83

Member
Oct 28, 2017
976
So the scenario would be: max frequency on both CPU & GPU, AND a workload that exceeds the power budget, which he stated is not what usually happens.

Yes, I think the difficult part for people to grasp is that max frequency does not equal max power.
I could make the CPU execute NOP instructions (no-operation: instructions that are fetched, decoded and do nothing) at 3.5GHz, and the CPU's power consumption would be minimal even though instructions are being fetched at the top frequency the CPU is capable of. In that case there is nothing to prevent the GPU from running at its top frequency and claiming the bulk of the power budget for itself, because the power budget is the cap.
That is also why they capped the frequencies: if power is the cap for the whole system and one component is idle, then to keep power constant the clock control system might be tempted to raise the frequency of the other, active component to crazy levels. That is not possible because there is a cap: 2.23GHz for the GPU and 3.5GHz for the CPU.
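A minimal sketch of that arbitration idea as I understand it. This is my own illustrative model with made-up wattages and a made-up linear activity model, not Sony's actual algorithm:

```python
# Toy clock arbiter: fixed total power budget, hard frequency caps, and the
# "~2% frequency for ~10% power" rule from the talk for the rare over-budget
# case. All wattages are invented for illustration.

GPU_CAP_GHZ, CPU_CAP_GHZ = 2.23, 3.5
POWER_BUDGET_W = 200.0

def arbitrate(gpu_activity, cpu_activity):
    """Activity factors in [0, 1] describe how power-hungry the current
    instruction mix is (a stream of NOPs ~ 0.0), independent of frequency."""
    demand = 150.0 * gpu_activity + 65.0 * cpu_activity  # watts drawn at the caps
    if demand <= POWER_BUDGET_W:
        # Under budget: both run at their caps. Idle headroom is never
        # converted into clocks above the caps.
        return GPU_CAP_GHZ, CPU_CAP_GHZ
    # Over budget (the worst case): shave ~2% of frequency per 10% of excess power.
    excess = demand / POWER_BUDGET_W - 1
    scale = 1 - 0.02 * (excess / 0.10)
    return GPU_CAP_GHZ * scale, CPU_CAP_GHZ * scale

print(arbitrate(0.7, 0.9))  # typical mixed load: (2.23, 3.5), both at cap
print(arbitrate(1.0, 1.0))  # pathological load: both shaved by only ~1.5%
```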
 
Last edited:

JaseC64

Enlightened
Banned
Oct 25, 2017
11,008
Strong Island NY
Didn't that guy from NX Gamer say he knew something about the cooling solution but couldn't talk about it until Sony does a reveal/breakdown? He made it sound like it's something "neat" or "special".

So I'm confident Sony has a good case design with a decent cooling solution this time. I think Cerny made the point that noise levels get more pronounced as CPU/GPU usage increases, and that it's something they have tackled as well.
 

ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
Didn't that guy from NX Gamer say he knew something about the cooling solution but couldn't talk about it until Sony does a reveal/breakdown? He made it sound like it's something "neat" or "special".

So I'm confident Sony has a good case design with a decent cooling solution this time. I think Cerny made the point that noise levels get more pronounced as CPU/GPU usage increases, and that it's something they have tackled as well.
that was Gamers Nexus
 

Elios83

Member
Oct 28, 2017
976
Didn't that guy from NX Gamer say he knew something about the cooling solution but couldn't talk about it until Sony does a reveal/breakdown? He made it sound like it's something "neat" or "special".

So I'm confident Sony has a good case design with a decent cooling solution this time. I think Cerny made the point that noise levels get more pronounced as CPU/GPU usage increases, and that it's something they have tackled as well.

Cerny said as much himself. They're proud of the cooling solution they came up with, and we'll find out more in the teardown of the system.
 

DavidDesu

Banned
Oct 29, 2017
5,718
Glasgow, Scotland
...I'd gladly bet Polyphony Digital will be at the very forefront of this technology to a point that is noticeably better than any 3rd party implementation and comparable to, regardless of bandwidth, any Microsoft title.

Essentially, in the best case scenario, it'll take the eyes of Digital Foundry/NXGamer to make the distinctions, because I think the majority of gamers will see similar 'OMG, this is beautiful' results from Sony and Microsoft.
I'm salivating at the prospect of an RT GT game. I'd be happy with GT Sport plus a massive visual improvements patch; there are plenty of cars and tracks, just give me the visual flair. Racing games already push visuals at high frame rates, and RT lighting and reflections will absolutely push them into true photorealism territory, GT and Forza alike. Going to be insanely cool.

And for GT, at least, in VR too! Given that the original PSVR headset is getting supported and only has a 1080p screen, I would expect the full visual suite and RT in PSVR in all game modes on PS5. Ooft.
 

Pheonix

Banned
Dec 14, 2018
5,990
St Kitts
For them it's pretty clear that price matters and that it had an influence on design.
Cerny stated they had to be price conscious about what's inside when he explained why the SSD is 825GB and not 1TB or more.
Same for faster DRAM. So it's always a trade-off between performance and price.
It will be interesting to see if they can indeed swallow the loss, thanks to services like Plus and Now, to launch at their now-usual $399 price point. Some design decisions (smaller APU with 36CUs, 14Gbps RAM chips and the 825GB SSD) are definitely cost-saving measures on the most expensive parts of the design. Yes, they're paying for the high frequency with a custom cooling solution, but according to the Bloomberg BOM estimate that still amounts to just a few dollars of cost vs less than a dollar for a conventional cooling solution.
I agree, I think Sony's approach will make much more sense when we see the price. I fully believe Sony's main design criteria were $399 and a fast SSD. I also think the form factor will be quite a bit smaller than the XSX. Things become much more interesting if the PS5 and XSX are the same price.
Yes, obviously price matters... and it always matters, especially in relation to whatever price target they're trying to hit.

All I am saying is that there seems to be an obvious potential bottleneck in their system. And one that is really easy (albeit expensive) to fix.

It's also obvious to me that it was a decision made to save money, because the options to make it better are readily available.

They can price it at $299 or $599... that doesn't change the fact that 448GB/s of bandwidth may still not be enough. I get it: if they have a current BOM of, say, $450 against a retail price of $399, that is already putting them something like $90 in the hole per console sold; making that change to their RAM chips could take the BOM up to around $490 and make them lose something like $130 per console sold at $399. So it's obviously not what they would want to do.
 

Hulkamania78

Member
Oct 29, 2017
167
Manchester UK
I think the power difference will be negligible, but I hope the PS5 doesn't become a PS3 in terms of strange design choices. I don't see it, but I am concerned about the higher clock speed; it feels like a knee-jerk reaction to the XSX. Then again, I can also see MS throwing out a 'we have tweaked some more power out of the machine' and increasing to 2GHz across the board.

Either way I can't wait, and once the world gets back on track we can enjoy these new consoles.
 

Elios83

Member
Oct 28, 2017
976
I think the power difference will be negligible, but I hope the PS5 doesn't become a PS3 in terms of strange design choices. I don't see it, but I am concerned about the higher clock speed; it feels like a knee-jerk reaction to the XSX. Then again, I can also see MS throwing out a 'we have tweaked some more power out of the machine' and increasing to 2GHz across the board.

Either way I can't wait, and once the world gets back on track we can enjoy these new consoles.

There is nothing strange or developer-unfriendly (like Cell and the split RAM pools in the PS3) in the PS5. Actually it's the opposite: they built the system by asking developers what they wanted, trying to optimize and remove all the possible bottlenecks while being price conscious.
This is why so many developers are happy and excited.
36 CUs working at a high frequency was the target from the beginning, probably because they targeted a $399 price and knew that, with RDNA2 efficiencies and a really strong I/O system removing bottlenecks, the gap compared to the PS4 GPU would be absolutely huge.
The APU has lots of logic to monitor workload and decide how to clock the CPU and GPU, and it works on different principles than the schemes traditionally found in PC components, where performance is temperature dependent and each user gets different performance depending on the ambient temperature where they use the system. This cannot be a last-minute measure to close a gap; it was planned. At most, they may have decided last year to up the clock from 2GHz to 2.23GHz using a better cooling solution.
As for Microsoft, the system is already done: they cannot change components, and they already have a stronger GPU, so they have no incentive to push their clocks up. But if they know their system is $499, fear that Sony will price at $399, and Sony was able to do much better at that price than the 8-9TF they expected, they might do something if possible.
 

Hulkamania78

Member
Oct 29, 2017
167
Manchester UK
There is nothing strange or developer-unfriendly (like Cell and the split RAM pools in the PS3) in the PS5. Actually it's the opposite: they built the system by asking developers what they wanted, trying to optimize and remove all the possible bottlenecks while being price conscious.
This is why so many developers are happy and excited.
36 CUs working at a high frequency was the target from the beginning, probably because they targeted a $399 price and knew that, with RDNA2 efficiencies and a really strong I/O system removing bottlenecks, the gap compared to the PS4 GPU would be absolutely huge.
The APU has lots of logic to monitor workload and decide how to clock the CPU and GPU, and it works on different principles than the schemes traditionally found in PC components, where performance is temperature dependent and each user gets different performance depending on the ambient temperature where they use the system. This cannot be a last-minute measure to close a gap; it was planned. At most, they may have decided last year to up the clock from 2GHz to 2.23GHz using a better cooling solution.
As for Microsoft, the system is already done: they cannot change components, and they already have a stronger GPU, so they have no incentive to push their clocks up. But if they know their system is $499, fear that Sony will price at $399, and Sony was able to do much better at that price than the 8-9TF they expected, they might do something if possible.
Great response, thanks. Can't argue with that, mate.
 

VanWinkle

Member
Oct 25, 2017
16,089
Cerny stated clearly that they expect both the CPU and GPU to spend most of their time at their top frequencies.
If that weren't possible and, for example, the GPU ran most of its time at its top frequency, then the CPU would need to run most of its time downclocked. That is simply not what he stated: both are expected to run at their top frequencies, with minor downclocks in worst-case scenarios.
I'd take what Cerny has stated as fact atm. The power cap is the limit; running at top frequency is just one part of the power-budget equation. It also depends on what you're asking the CPU or GPU to do at that frequency, that is, the instructions you're using in your code.
If further details emerge and things are better explained, I'm open to it.

So either Cerny is wrong or Dictator is. He says that if one is running at max frequency, the other will not.
 

TitanicFall

Member
Nov 12, 2017
8,262
36 CUs working at a high frequency was the target from the beginning, probably because they targeted a $399 price and knew that, with RDNA2 efficiencies and a really strong I/O system removing bottlenecks, the gap compared to the PS4 GPU would be absolutely huge.

Well, hopefully that's the case, because if it ends up at $499 they messed up.
 

unapersson

Member
Oct 27, 2017
661
So either Cerny is wrong or Dictator is. He says that if one is running at max frequency, the other will not.

I doubt Dictator is really saying that; there's a difference between frequency and load.

Frequency is just how many cycles the chip is running; load is how much work it's actually doing. So both can run at high frequency the majority of the time, with throttling only occurring when the simultaneous load on both approaches the max.

The GPU frequency is also being capped, which suggests it's not really maxed out, just set to the max that will work happily in the APU setup they have.
 

VanWinkle

Member
Oct 25, 2017
16,089
I doubt Dictator is really saying that; there's a difference between frequency and load.

Frequency is just how many cycles the chip is running; load is how much work it's actually doing. So both can run at high frequency the majority of the time, with throttling only occurring when the simultaneous load on both approaches the max.

The GPU frequency is also being capped, which suggests it's not really maxed out, just set to the max that will work happily in the APU setup they have.
He literally said, "if the GPU is at 10.2TF (2.23GHz), the CPU is not at 3.5GHz."

Don't know how much clearer he can be.

And, I have a hard time listening to his opinions on this specific subject, to be honest, as he has not said a single positive thing about PS5, nor a single negative thing about Series X. People are allowed to have biases (and that doesn't mean the outlet as a whole, i.e. Digital Foundry, presents things from a biased point of view; I believe their videos are pretty objective), and I just believe the guy has a strong bias.
 

Elios83

Member
Oct 28, 2017
976
So either Cerny is wrong or Dictator is. He says that if one is running at max frequency, the other will not.

Cerny is the system architect of the PS5 and designed the system; I think he might know something more ;) Unless he used misleading expressions and we all have comprehension issues.


Well, hopefully that's the case, because if it ends up at $499 they messed up.

It would not be the ideal situation, true.
It's clear they designed the system for $399.
Pick the three most expensive components in the system (APU, RAM, SSD) and in each of them you can see the obvious price compromise (36CUs, 14Gbps GDDR6 modules and the odd 825GB size instead of 1TB).
They won't be able to sell this system at $399 without a loss, since it's pretty advanced and powerful, but the loss might be sufficiently under control, and covered by sales of subscriptions and games, for $399 to remain the target.
I wouldn't price-match the XSX at $499; if they can't price at $399, $449 might still be an option, although less effective than $399.
 

modiz

Member
Oct 8, 2018
17,831
I doubt Dictator is really saying that; there's a difference between frequency and load.

Frequency is just how many cycles the chip is running; load is how much work it's actually doing. So both can run at high frequency the majority of the time, with throttling only occurring when the simultaneous load on both approaches the max.

The GPU frequency is also being capped, which suggests it's not really maxed out, just set to the max that will work happily in the APU setup they have.
That is how I understood it works too. The relationship between the GPU and CPU clocks won't be inherently inverse; it depends on the load the GPU/CPU have to take. That's why different games consume different amounts of power on the same console this gen: the consoles run at the same clocks, but some games demand more work, so more power gets consumed. Next gen, with power consumption held constant, it's instead the clock that gets decided by the CPU/GPU load.
 

nib95

Contains No Misinformation on Philly Cheesesteaks
Banned
Oct 28, 2017
18,498
He literally said, "if the GPU is at 10.2TF (2.23GHz), the CPU is not at 3.5GHz."

Don't know how much clearer he can be.

And, I have a hard time listening to his opinions on this specific subject, to be honest, as he has not said a single positive thing about PS5, nor a single negative thing about Series X. People are allowed to have biases (and that doesn't mean the outlet as a whole, i.e. Digital Foundry, presents things from a biased point of view; I believe their videos are pretty objective), and I just believe the guy has a strong bias.

The thing is, based on just what was actually said, the implication is that Dictator is wrong about this, though it could still be a case of Cerny messing up the delivery of his messaging.

The actual transcript reads as follows.

"Running a GPU at 2Ghz was looking like an unreachable target with the old fixed frequency strategy. With this new paradigm we're able to run way over that, infact we have to cap the GPU frequency at 2.23Ghz so that we can guarantee the on chip logic operates properly.

36 CUs at 2.23Ghz is 10.3 Tflops and we expect the GPU to spend most of its time at our close to that frequency and performance.

Similarly running the CPU at 3 GHz was causing headaches with the old strategy but now we can run it as high as 3.5 GHz, infact it spends most of its time at that frequency"

So based on the above, we're told that both the CPU and GPU will spend most of their time at their respective max clocks. That's the first implication that they can both run at max frequencies simultaneously.

Then he says the following.

"That doesn't mean all games will be running at 2.23 GHz and 3.5 GHz. When that worst case arrives, it will run at a lower clockspeed, but not too much lower. To reduce power by 10%, it only takes a couple of percent reduction in frequency, so I'd expect any downclocking to be pretty minor"

This next part, once again, implies it can indeed run both the GPU and CPU at max frequencies, since saying he doesn't expect all games to implies there will be games that do. Add to that, using the term "worst case" again implies it'll be a rarer occurrence for them not to both run at their potential max clocks.

Now of course, perhaps Cerny minced his words and Dictator is right, but I'm not so sure based on the language and everything else Cerny said. My guess is that hitting max frequencies on both the CPU and GPU simultaneously is not enough on its own to hit the set power limit, and it's more the types of tasks or instructions being carried out (some are more power hungry than others) that is the likely limiting factor. But I could be wrong.
 
Last edited:

Elios83

Member
Oct 28, 2017
976
Since we actually don't know everything about the PS5 outside of interpreting specs and the goals Cerny laid out, I'd say we all have some comprehension issues.

I was being rhetorical :P
Cerny was clear on the point. The issue is not the peak frequency; it's the total workload that determines the total power consumption.
I could keep both the CPU and GPU at their peak frequencies doing next to nothing, or running an indie game with 2D graphics, and the power budget would be far from reached; the clock arbiter would keep the working frequencies at their respective caps (3.5GHz and 2.23GHz).
 

Pheonix

Banned
Dec 14, 2018
5,990
St Kitts
I think the power difference will be negligible, but I hope the PS5 doesn't become a PS3 in terms of strange design choices. I don't see it, but I am concerned about the higher clock speed; it feels like a knee-jerk reaction to the XSX. Then again, I can also see MS throwing out a 'we have tweaked some more power out of the machine' and increasing to 2GHz across the board.

Either way I can't wait, and once the world gets back on track we can enjoy these new consoles.
The PS5 has taken everything about the PS4 and improved on it exponentially. Even its most complex feature (the SSD) is made transparent to devs, simplified and automated. If anything, the XSX is the one with a somewhat odd technical quirk (pools of RAM with different bandwidths), but that is not likely to be too much of an issue.

And outside pure hardware-based limitations, the PS5 would be the easier machine to build for.

Oh, and by the way, you can't knee-jerk higher clocks to the tune of around a 500MHz clock speed bump. It's clear they always intended to go narrow and fast; I mean, just look at that dev kit prototype.

And MS can't just upclock to 2GHz... this whole clock-raising business is actually a lot more complicated than most think, especially when it's not been planned for, designed for, and tested. Just know that the XSX's current clock is something that has been in testing for months by this point, as is the resultant cooling solution.
 

ManOfWar

Member
Jan 6, 2020
2,465
Brazil
I'd like to know what people more technically minded than I make of the PS5 Pro, and eventually the PS6, in regard to backwards compatibility with the PS5. It seems to me Sony is aiming squarely at its own feet with this design when the time comes to make it compatible with newer hardware.
 

Wereroku

Member
Oct 27, 2017
6,201
Isn't there a decent chance that Cerny is giving devs the option to lock to set clocks as well as use the new dynamic system? He said developer feedback was something they took very seriously. If some said they would rather the specs be toned down and locked for ease of development, it should be easy to do that, right?
 

Decarb

Member
Oct 27, 2017
8,641
I'd like to know what people more technically minded than I make of the PS5 Pro, and eventually the PS6, in regard to backwards compatibility with the PS5. It seems to me Sony is aiming squarely at its own feet with this design when the time comes to make it compatible with newer hardware.
I feel like going forward they'll have to stick with this fixed power/variable frequency philosophy if they want proper BC. If it can solve overheating and fan noise, I'm all for it.
 

Liabe Brave

Professionally Enhanced
Member
Oct 27, 2017
1,672
It's clear the Xbox Series X has the power for outstanding RT on consoles, but I'm still really bloody excited about how Polyphony Digital's implementation will end up looking. I'm sure Kaz's team, as a bleeding-edge visual team, will have a treat lined up for PlayStation 5 owners.
And racing games are typically not very demanding to render, so there's more room to spend budget on RT without impacting the rest of the presentation.

Thanks, didn't know that. It's interesting if Lockhart comes onto the scene with very limited RT capabilities. Then devs would need two very different versions of the game, if I understood correctly, or a very limited RT mode.
Lockhart would also be rendering at a much lower resolution, which reduces the calculations needed. Even if it's only 4TF as rumored, the number of rays should be about 1/3 of XSX's. If the resolution is only 1/4 (1080p vs. 2160p), there shouldn't be much of a problem. This rough math does seem to indicate lowered RT quality if Lockhart renders at 1440p, or if XSX renders below full 4K.
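A quick sanity check of that arithmetic, for anyone curious. The 4TF Lockhart figure is a rumor, and ray budget scaling linearly with compute (and ray demand with pixel count) is the assumption from upthread:

```python
# Sanity check of the Lockhart RT argument: ray budget assumed to scale with
# compute, ray demand with pixel count. The 4TF Lockhart figure is a rumor.

xsx_tf, lockhart_tf = 12.15, 4.0
ray_budget = lockhart_tf / xsx_tf               # ~0.33: about 1/3 of XSX's rays

pixels = lambda w, h: w * h
print(pixels(1920, 1080) / pixels(3840, 2160))  # 0.25 -> 1080p fits the ray budget
print(pixels(2560, 1440) / pixels(3840, 2160))  # ~0.44 -> 1440p would need RT cuts
```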

Does RT scale linearly with clock speed then? Haven't seen that confirmed myself.
It's confirmed. The way Microsoft promoted their RT performance was by saying the XSX can do 380 billion intersections per second. This number is the number of TMUs times the clock speed. (TMU is Texture Mapping Unit, of which there are 4 per CU.) That gives an exact figure of 379.6; doing the same calculation for PS5 gives 321.1. That's 15% lower, the exact same gap as general compute... which makes sense, since it depends on the same two things: the number of CUs and the clock.

In other words, if a PS5 game is 15% lower resolution, then it should have the same quality of RT. Or, XSX could have 15% better RT, but then the two games would run at the same resolution. (In general, logical terms only, of course; real results will differ slightly from game to game.)
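The figures above are easy to reproduce from the public specs (RDNA2's 4 TMUs per CU, plus both consoles' CU counts and clocks):

```python
# Reproducing the intersection-rate figures: CUs * TMUs-per-CU * clock.

def gigaintersections_per_sec(cus, clock_ghz, tmus_per_cu=4):
    return cus * tmus_per_cu * clock_ghz  # billions of intersections per second

xsx = gigaintersections_per_sec(52, 1.825)  # 379.6 -> Microsoft's "380 billion"
ps5 = gigaintersections_per_sec(36, 2.23)   # 321.1
print(xsx, ps5, 1 - ps5 / xsx)              # gap ~0.15, matching general compute
```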

I think this was one thing I did not understand in the presentation. He mentioned this after talking about high clocks, but I'm not sure how it worked in relation to them.

Can anyone expand on it?
The GPU operates on data stored in its local caches. When those calculations are done, the results are sent back to RAM and new source data is loaded from there. When the GPU clock speed is high, calculations finish faster, so you need to refill the caches from RAM more often. But the GPU clock speed doesn't apply to RAM, which has its own invariant bandwidth. That means the windows where RAM will accept requests are farther apart, relative to the amount of math you're doing.

This is why people with technical know-how think the RAM bandwidth may be a limiting factor for both PS5 and XSX performance. The way both platforms might not be bandwidth-starved is if they can keep as much data in the local caches as possible, reducing trips to RAM. But also, AMD has worked to reduce bandwidth needs for the same amount of work by changing the architecture.

[Image: AMD cache hierarchy diagram]


This may also be ameliorated by the fact that PS5 is likely to have more L2 cache per CU than XSX.
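To put the bandwidth worry in rough numbers, a back-of-envelope ratio of my own using the public figures (for XSX I use the fast 10GB pool's 560GB/s):

```python
# Bytes of RAM bandwidth available per FLOP of compute, both consoles.

ps5_bytes_per_flop = 448e9 / 10.28e12   # ~0.044
xsx_bytes_per_flop = 560e9 / 12.15e12   # ~0.046

print(ps5_bytes_per_flop, xsx_bytes_per_flop)
# Both sit in the same tight ballpark, which is why cache hit rates and the
# architectural changes pictured above matter for either machine.
```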

And, I have a hard time listening to his opinions on this specific subject, to be honest, as he has not said a single positive thing about PS5, nor a single negative thing about Series X.
Please stop this. Mr. Battaglia has been part of our community for years prior to his employment at DF, as a level-headed and reasonable contributor. Of course he may well have personal biases, but the public statements he makes are grounded in expertise and knowledge, with an obvious intent to be informative, not divisive. You've heard him mostly talking about XSX because Microsoft has simply given more detail about the tech inside their machine. (And perhaps also because through DirectX more of it will apply to PC, which has always been his platform of choice.)