
Pat002

Banned
Dec 4, 2019
856
Guys, can someone do a TL;DR for me? I'll give them 10 internet points xD
For real tho, if someone has time to do so I'll be very thankful

Of course only about insightful things that we maybe didn't know before.
 

III-V

Member
Oct 25, 2017
18,827
My understanding, from what Cerny was saying, is that the level of power consumed by the chip isn't merely a function of clockspeed, but also dependent on what is actually being processed. That certain instructions, certain types of work, consume more power, independent of the clockspeed it is running at.

Hence, you could have times where both the CPU and GPU are, effectively, running at their max clocks, and other times where it has to vary.

Usually clocks are varied with thermals, or cooling is varied with thermals. The latter is what often happens in consoles - even though the PS4's clockspeeds were fixed, its power consumption varied with workload, and thus so did its thermals - which is why the fan kicks up more in some games, or at some points within a game, than in others.

PS5 is looking at the workload it's currently processing, what it will mean for power consumption, and if the workloads' power consumption at max clocks comes in under the PS5's power and thermal dissipation budget, then max clocks will be maintained. Cerny, it seems, expects that 'most' workloads will fit that criteria.

Their alternative was to try to guess the worst case workloads, and then always have the chip running at a lower fixed clockspeed to accommodate that. That's what they did with their consoles previously. Why do that, why fix around the assumed worst case, if the workload and its power demands will actually change, and in some, or perhaps many cases you could get away with a higher clock? That was the question they appear to have started with.
This is my understanding as well. Plan to create a transcript tomorrow.
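The variable-frequency scheme described above can be sketched as a simple governor: estimate what the current workload would draw at max clocks, and shave frequency only when the fixed budget is exceeded. A hypothetical Python sketch - every constant here is an invented round number, not a real PS5 figure:

```python
# Hypothetical sketch of a fixed-power-budget clock governor.
# All constants are invented round numbers for illustration.

MAX_CPU_GHZ = 3.5
MAX_GPU_GHZ = 2.23
POWER_BUDGET_W = 160.0  # assumed fixed SoC power budget

def estimated_power(cpu_activity, gpu_activity, cpu_ghz, gpu_ghz):
    """Power depends on the clocks AND on what is being executed:
    the activity factors (0..1) model the instruction mix, and the
    cubic term approximates voltage scaling with frequency."""
    return 2.0 * cpu_activity * cpu_ghz**3 + 8.0 * gpu_activity * gpu_ghz**3

def pick_clocks(cpu_activity, gpu_activity):
    cpu, gpu = MAX_CPU_GHZ, MAX_GPU_GHZ
    while estimated_power(cpu_activity, gpu_activity, cpu, gpu) > POWER_BUDGET_W:
        cpu *= 0.99  # shave 1% at a time; a few percent of frequency
        gpu *= 0.99  # buys a large power reduction under a cubic model
    return cpu, gpu

print(pick_clocks(0.5, 0.7))  # typical workload: stays at max clocks
print(pick_clocks(1.0, 1.0))  # pathological worst case: drops a few percent
```

Under this toy model, most workloads never trigger the loop at all, which is the "max clocks most of the time" behaviour described in the talk.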
 

mullah88

Member
Oct 28, 2017
951
People are intentionally being dense... I'm probably the least technical person in here and I understood what Cerny meant.
 

VanWinkle

Member
Oct 25, 2017
16,089
Nothing I typed changes anything about what was presented at the talk - which is just as valid as it was said there. One component uses more power when the other one is not using it, the same thing that was presented there. Both cannot max the power budget, as was also said there.
So, just to be clear, the power supply of the PS5 cannot handle 3.5GHz CPU clock and 2.23GHz GPU clock concurrently? Ever? That being the case, I'm trying to figure out why he would say that clock speeds decrease specifically in "worst case games."
 

zombiejames

Member
Oct 25, 2017
11,918
I'll have to rewatch that section of the presentation but I'm pretty sure it can maintain those speeds. If someone can get there before I get the opportunity and quote what was said, that'd be great.

So I went back and re-watched that section. Here's what Cerny said:

"That doesn't mean all games will be running at 2.23GHz and 3.5GHz. When that worst-case game arrives, it will run at a lower clock speed but not too much lower. To reduce power by 10%, it only takes a couple of percent reduction in frequency. So I'd expect any down-clocking to be pretty minor."

Right before this quote, he said both CPU and GPU will run at those max clocks most of the time. I'm taking this to mean that both CPU and GPU can run at those max frequencies when needed, but if they're needed simultaneously then the frequencies for one or both would drop by a few percent. There's some debate about this in the other thread.
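Cerny's "couple of percent of frequency buys ~10% power" line is consistent with the usual dynamic-power approximation P = C·f·V², with voltage scaling roughly with frequency, so P ∝ f³. The cubic model is an assumption on my part, not something from the talk, but the arithmetic checks out:

```python
# If dynamic power scales roughly as f^3 (P = C * f * V^2, with
# voltage V roughly proportional to frequency f), a small frequency
# cut buys a disproportionate power saving.

def power_saving(freq_cut, exponent=3.0):
    """Fractional power saved for a fractional frequency cut."""
    return 1.0 - (1.0 - freq_cut) ** exponent

print(f"{power_saving(0.03):.1%} power saved for a 3% frequency cut")
```

A ~3% downclock saving close to 10% of power matches the "pretty minor" down-clocking Cerny describes.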
 

Crazy_KiD_169

Member
Jun 21, 2018
293
So, just to be clear, the power supply of the PS5 cannot handle 3.5GHz CPU clock and 2.23GHz GPU clock concurrently? Ever? That being the case, I'm trying to figure out why he would say that clock speeds decrease specifically in "worst case games."
So all this seems to depend on what's happening in the game at any given time. But if the game needs max power from both the GPU and CPU (that's the worst case), clocks drop a couple of percent. That's how I understand it.
 

mordecaii83

Avenger
Oct 28, 2017
6,855
My understanding, from what Cerny was saying, is that the level of power consumed by the chip isn't merely a function of clockspeed, but also dependent on what is actually being processed. That certain instructions, certain types of work, consume more power, independent of the clockspeed it is running at.

Hence, you could have times where both the CPU and GPU are, effectively, running at their max clocks, and other times where it has to vary.

Usually clocks are varied with thermals, or cooling is varied with thermals. The latter is what often happens in consoles - even though the PS4's clockspeeds were fixed, its power consumption varied with workload, and thus so did its thermals - which is why the fan kicks up more in some games, or at some points within a game, than in others.

PS5 is looking at the workload it's currently processing, what it will mean for power consumption, and if the workloads' power consumption at max clocks comes in under the PS5's power and thermal dissipation budget, then max clocks will be maintained. Cerny, it seems, expects that 'most' workloads will fit that criteria.

Their alternative was to try to guess the worst case workloads, and then always have the chip running at a lower fixed clockspeed to accommodate that. That's what they did with their consoles previously. Why do that, why fix around the assumed worst case, if the workload and its power demands will actually change, and in some, or perhaps many cases you could get away with a higher clock? That was the question they appear to have started with.
That's an excellent explanation of what's actually happening inside the PS5, and why Cerny/Sony went the way they did. There's really no point lowering clock speeds and having them fixed instead of letting them run higher most of the time and only lowering them when required.


So I went back and re-watched that section. Here's what Cerny said:

"That doesn't mean all games will be running at 2.23GHz and 3.5GHz. When that worst-case game arrives, it will run at a lower clock speed but not too much lower. To reduce power by 10%, it only takes a couple of percent reduction in frequency. So I'd expect any down-clocking to be pretty minor."

Right before this quote, he said both CPU and GPU will run at those max clocks most of the time. I'm taking this to mean that both CPU and GPU can run at those max frequencies when needed, but if they're needed simultaneously then the frequencies for one or both would drop by a few percent. There's some debate about this in the other thread.

I'm not sure why you're spreading the same incorrect information in two threads? He says both chips will spend the majority of their time at max clock speed, which is impossible if only one can be at max speed at a time.
 

nib95

Contains No Misinformation on Philly Cheesesteaks
Banned
Oct 28, 2017
18,498
Seems the PS5's principal software engineer endorses your video NXGamer

 

Sklaary

Member
Mar 21, 2020
546
How does the SSD affect RT? I'm really into this next-gen feature and would like to know which platform can, in theory, produce better RT.
 

Raide

Banned
Oct 31, 2017
16,596
How does the SSD affect RT? I'm really into this next-gen feature and would like to know which platform can, in theory, produce better RT.
RT is done on the RT/CU cores; all the calculation happens there. The SSD is just data transfer etc. It's why many suggest the Series X will be better at RT overall, but without solid examples on both sides, it's still up in the air as to which one will be better. Both are taking a different approach.
 

Sklaary

Member
Mar 21, 2020
546
RT is done on the RT/CU cores; all the calculation happens there. The SSD is just data transfer etc. It's why many suggest the Series X will be better at RT overall, but without solid examples on both sides, it's still up in the air as to which one will be better. Both are taking a different approach.
So the SSD isn't really a factor in how good RT will look. Since the PS5 has fewer CUs, does that mean a dev can make more effective use of them and reach the potential of the PS5's RT capabilities more quickly?
 

Raide

Banned
Oct 31, 2017
16,596
So the SSD isn't really a factor in how good RT will look. Since the PS5 has fewer CUs, does that mean a dev can make more effective use of them and reach the potential of the PS5's RT capabilities more quickly?
It's more complicated than that, since there are so many factors that will come into play, like RAM speed, bandwidth, CPU usage etc. On paper the Series X should be better, but without RT samples on both sides, it's hard to say. Fewer CUs would generally mean less horsepower to generate RT, which would lead to a lesser implementation of it. RT, even on PCs, really needs lots of grunt to push it well with smaller performance hits.

Again, we have to wait to see what Sony does with RT. I am sure they will have examples at some point.
 

Deleted member 20297

User requested account closure
Banned
Oct 28, 2017
6,943
It's more complicated than that, since there are so many factors that will come into play, like RAM speed, bandwidth, CPU usage etc. On paper the Series X should be better, but without RT samples on both sides, it's hard to say. Fewer CUs would generally mean less horsepower to generate RT, which would lead to a lesser implementation of it. RT, even on PCs, really needs lots of grunt to push it well with smaller performance hits.

Again, we have to wait to see what Sony does with RT. I am sure they will have examples at some point.
Forgive me if I'm wrong, but to my understanding RT is a computationally hard problem. Not hard in the CS sense of time complexity or NP-hardness, but you have to do a lot of calculations for RT. Like really, really a lot. Yes, compute units in modern computers are often limited by IO wait, no doubt about that, but both consoles do a lot to mitigate IO wait problems. In the end, though, you calculate a lot for RT, and if I'm not completely wrong, especially a lot in parallel.
 

Raide

Banned
Oct 31, 2017
16,596
Forgive me if I'm wrong, but to my understanding RT is a computationally hard problem. Not hard in the CS sense of time complexity or NP-hardness, but you have to do a lot of calculations for RT. Like really, really a lot. Yes, compute units in modern computers are often limited by IO wait, no doubt about that, but both consoles do a lot to mitigate IO wait problems. In the end, though, you calculate a lot for RT, and if I'm not completely wrong, especially a lot in parallel.
Agreed. It's not a simple thing to do, but that is why the PC space has RT cores. How that works for Series X and PS5 with their different approaches is yet to be seen. Minecraft has been the main gaming RT/path-tracing showcase we have seen from MS. Nothing on the Sony front yet, but I'm sure it's there. Obviously nobody wants RT if it has a major hit on performance, or if the RT is of lesser quality.
 

Dictator

Digital Foundry
Verified
Oct 26, 2017
4,930
Berlin, 'SCHLAND
RT is done on the RT/CU cores; all the calculation happens there. The SSD is just data transfer etc. It's why many suggest the Series X will be better at RT overall, but without solid examples on both sides, it's still up in the air as to which one will be better. Both are taking a different approach.
Series X will be better at RT. A better GPU with more shading power, more intersection tests, and more bandwidth = better RT performance.

This is common sense stuff.

They are both using the AMD approach, and unless AMD's hardware RT scales negatively, the XSX will have better RT performance.
 

Marble

Banned
Nov 27, 2017
3,819
Dictator Curious: are you expecting fully ray-traced console games this coming generation? With ray-traced audio and levels maxed out with reflections and realistic bouncing light and whatnot? And what would be the difference between the two in that case? Will the SX have more detailed reflections or something?

I may be in the minority here, but I don't really find today's ray tracing demos that impressive. Sure, I see what it does, but it's typically something you totally forget after 5 minutes of playing and don't even notice in the heat of the game. HDR does more for me tbh.
 
Dec 8, 2018
1,911
Dictator Curious: are you expecting fully ray-traced console games this coming generation? With ray-traced audio and levels maxed out with reflections and realistic bouncing light and whatnot? And what would be the difference between the two in that case? Will the SX have more detailed reflections or something?

I may be in the minority here, but I don't really find today's ray tracing demos that impressive. Sure, I see what it does, but it's typically something you totally forget after 5 minutes of playing and don't even notice in the heat of the game. HDR does more for me tbh.

I mean, Xbox showed off Minecraft, and even that only runs at 1080p and below 60fps on the Series X with full ray tracing applied. Sure, it was a "fast" job, but it shows how much power is actually needed. Expecting AAA graphics and full ray tracing sounds improbable unless some MAJOR breakthrough comes along. And with Lockhart seemingly being real, the idea of games built around only ray tracing is more or less dead for the rest of this gen, unless they create two completely different versions of the same game (or the Lockhart version runs <720p) when it comes to lighting.
 

tzare

Banned
Oct 27, 2017
4,145
Catalunya
I wonder if, next gen, we will have a performance mode for high-framerate enthusiasts and an RT mode at lowered resolutions (720p/1080p) for 'better visuals' lovers.
 

Wollan

Mostly Positive
Member
Oct 25, 2017
8,809
Norway but living in France
Global illumination as seen in Metro Exodus is the most impactful and practical use case for hybrid ray tracing that I've seen. Ironically, it's one of the least power-hungry methods of utilising RT hardware.

I said it months ago, and based on the XsX demos and such it remains true in my mind: that the launch consoles have RT is good for technology adoption, but it won't really shine before the 2021/22 PC GPUs arrive and the mid-gen consoles. We're still in premature-technology land.
 

ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
I wonder if, next gen, we will have a performance mode for high-framerate enthusiasts and an RT mode at lowered resolutions (720p/1080p) for 'better visuals' lovers.
If you design for RT from the start, you can't just turn it off, because then you'd have to do lighting design for a full raster pipeline too. There might be a lower-precision RT toggle, as we've seen on PC, but any dev who completely jumps in the pool (like Metro's 4A Games) ain't getting out.

Besides there will be things like upscaling and dynamic shading rates to claw back performance
 

Rabalder.

Member
Dec 8, 2018
1,481
I mean, Xbox showed off Minecraft, and even that only runs at 1080p and below 60fps on the Series X with full ray tracing applied. Sure, it was a "fast" job, but it shows how much power is actually needed. Expecting AAA graphics and full ray tracing sounds improbable unless some MAJOR breakthrough comes along. And with Lockhart seemingly being real, the idea of games built around only ray tracing is more or less dead for the rest of this gen, unless they create two completely different versions of the same game (or the Lockhart version runs <720p) when it comes to lighting.
The minecraft demo is fully path traced. It's more of an experiment, really. There are a whole host of other ray tracing effects that can, and will, be implemented next gen.
 

Roarschach

Member
Dec 18, 2018
889
I mean, Xbox showed off Minecraft, and even that only runs at 1080p and below 60fps on the Series X with full ray tracing applied. Sure, it was a "fast" job, but it shows how much power is actually needed. Expecting AAA graphics and full ray tracing sounds improbable unless some MAJOR breakthrough comes along. And with Lockhart seemingly being real, the idea of games built around only ray tracing is more or less dead for the rest of this gen, unless they create two completely different versions of the same game (or the Lockhart version runs <720p) when it comes to lighting.
I believe they said it was 60fps. Maybe not the capture itself, but when they showcased the demo, it ran at 60fps.
 

cooldawn

Member
Oct 28, 2017
2,445
It's clear the Xbox Series X has the power for outstanding RT on consoles, but I'm still really bloody excited about how Polyphony Digital's implementation will end up looking. I'm sure Kaz's team, as a bleeding-edge visual team, will have a treat lined up for PlayStation 5 owners.
 

tzare

Banned
Oct 27, 2017
4,145
Catalunya
If you design for RT from the start, you can't just turn it off, because then you'd have to do lighting design for a full raster pipeline too. There might be a lower-precision RT toggle, as we've seen on PC, but any dev who completely jumps in the pool (like Metro's 4A Games) ain't getting out.

Besides there will be things like upscaling and dynamic shading rates to claw back performance
Thanks, didn't know that. Interesting if Lockhart enters the scene with very limited RT capabilities. Then devs would need two very different versions of the game, if I understood correctly, or a very limited RT mode.
 

Sprat

Member
Oct 27, 2017
4,684
England
So, just to be clear, the power supply of the PS5 cannot handle 3.5GHz CPU clock and 2.23GHz GPU clock concurrently? Ever? That being the case, I'm trying to figure out why he would say that clock speeds decrease specifically in "worst case games."
It can.

What he means is that if they're both running at max and within the power usage threshold, everything will be fine, but if, say, a game goes super physics-heavy and pushes the CPU beyond its allowed power threshold, it will downclock slightly.

Worst case scenario would most likely be if something wasn't optimised fully and going wild.

Probably very unlikely to see until late in the generation (or a Bethesda game)
 

DavidDesu

Banned
Oct 29, 2017
5,718
Glasgow, Scotland
It's clear the Xbox Series X has the power for outstanding RT on consoles, but I'm still really bloody excited about how Polyphony Digital's implementation will end up looking. I'm sure Kaz's team, as a bleeding-edge visual team, will have a treat lined up for PlayStation 5 owners.
Is PS5 going to have noticeably worse RT though if generally it's running games at a slightly lower dynamic resolution than the same game on Xbox? RT scales linearly with resolution pretty much, so if PS5 is pushing less resolution then it's also pushing less rays needed for equivalent RT visuals. I cannot honestly see 3rd party PS5 RT games looking massively different from XSX RT games apart from the resolution.
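The "RT cost scales with resolution" point is easy to sanity-check: at a fixed rays-per-pixel budget, total rays per frame is just pixel count times rays per pixel. The resolutions and ray counts below are illustrative, not from either console:

```python
# At a fixed rays-per-pixel budget, total rays per frame scales
# linearly with pixel count, so a lower dynamic resolution needs
# proportionally fewer rays for the same per-pixel quality.

def rays_per_frame(width, height, rays_per_pixel):
    return width * height * rays_per_pixel

native = rays_per_frame(3840, 2160, 2)   # native 4K
scaled = rays_per_frame(3584, 2016, 2)   # ~93% scale per axis
print(f"scaled frame needs {scaled / native:.0%} of the rays")
```

So a modest dynamic-resolution drop directly cuts the ray budget needed for equivalent per-pixel RT quality.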
 

2Blackcats

Member
Oct 26, 2017
16,051
Is PS5 going to have noticeably worse RT though if generally it's running games at a slightly lower dynamic resolution than the same game on Xbox? RT scales linearly with resolution pretty much, so if PS5 is pushing less resolution then it's also pushing less rays needed for equivalent RT visuals. I cannot honestly see 3rd party PS5 RT games looking massively different from XSX RT games apart from the resolution.

Does RT scale linearly with clock speed then? Haven't seen that confirmed myself.
 

space_nut

Member
Oct 28, 2017
3,304
NJ
Series X will be better at RT. A better GPU with more shading power, more intersection tests, and more bandwidth = better RT performance.

This is common sense stuff.

They are both using the AMD approach, and unless AMD's hardware RT scales negatively, the XSX will have better RT performance.

Can't wait to see MS first party games taking advantage! Halo Infinite confirmed to be doing RT reflections :) Let me see that Forza
 

Duderino

Member
Nov 2, 2017
305
Series X will be better at RT. A better GPU with more shading power, more intersection tests, and more bandwidth = better RT performance.

This is common sense stuff.

They are both using the AMD approach, and unless AMD's hardware RT scales negatively, the XSX will have better RT performance.

Not that I disagree with you (I don't), but there is one other factor that could come into play looking beyond Minecraft, available memory for BVH trees. And in the case of offline constructed BVH trees, SSD bandwidth.
 

Orioto

Member
Oct 26, 2017
4,716
Paris
RT and 4K... this generation is weird in a way, because it's starting with the fancy stuff everyone wants, since it's been talked about so much, but that stuff is way too shiny and costly to reach (at least both at the same time) with the type of graphics we want from this gen. It's starting out with a bigger mouth than it can afford, and people will be continuously disappointed.
 

mentallyinept

One Winged Slayer
Member
Oct 25, 2017
3,403
So, just to be clear, the power supply of the PS5 cannot handle 3.5GHz CPU clock and 2.23GHz GPU clock concurrently? Ever? That being the case, I'm trying to figure out why he would say that clock speeds decrease specifically in "worst case games."
It can, if the instructions/workload being run through both are not power (electricity) hungry.

Cerny can't say both the CPU and GPU run at max clock "most of the time" if that can't happen simultaneously; it would have to be under 50%, which wouldn't be "most of the time".

I think Dictator is using some shorthand to say:

"You can use the GPU and CPU at max power budget at the same time"

That's not the same as saying:

"You can't run the GPU and CPU at full clocks simultaneously"

Correct me if I'm wrong here Dictator
 

Dictator

Digital Foundry
Verified
Oct 26, 2017
4,930
Berlin, 'SCHLAND
Not that I disagree with you (I don't), but there is one other factor that could come into play looking beyond Minecraft, available memory for BVH trees. And in the case of offline constructed BVH trees, SSD bandwidth.
What do you mean? RT is so horribly bandwidth intensive, and will probably be even more so on these new consoles. Bandwidth for real-time things. Being bandwidth limited and going from hundreds of GB/s in memory to the pitifully slow SSD would be a disaster.

What may save memory for RT will be mesh shaders doing some great new culling and, for bandwidth, per-instance RT LOD.
 

Pat002

Banned
Dec 4, 2019
856
What do you mean? RT is so horribly bandwidth intensive, and will probably be even more so on these new consoles. Bandwidth for real-time things. Being bandwidth limited and going from hundreds of GB/s in memory to the pitifully slow SSD would be a disaster.

What may save memory for RT will be mesh shaders doing some great new culling and, for bandwidth, per-instance RT LOD.
Unrelated to this, but when can we expect the more in-depth talk about PS5 with the stuff that Cerny shared with you? Soon-ish, or are we talking months?
 

2Blackcats

Member
Oct 26, 2017
16,051
To my knowledge every part of the GPU improves with an increase in clock, which should mean that RT scaling should end up pretty similar to the scaling of overall performance?

I don't think that's true. Didn't Cerny even mention an example of something that didn't improve? Can't remember the feature now.

But my question was does RT improve linearly with the clock speed? I imagine it does, I was just looking for confirmation.
 

gofreak

Member
Oct 26, 2017
7,734
Is PS5 going to have noticeably worse RT though if generally it's running games at a slightly lower dynamic resolution than the same game on Xbox? RT scales linearly with resolution pretty much, so if PS5 is pushing less resolution then it's also pushing less rays needed for equivalent RT visuals. I cannot honestly see 3rd party PS5 RT games looking massively different from XSX RT games apart from the resolution.

Per pixel, I wouldn't expect a difference. Some games, though, might fix their RT-ed buffers at the same resolution used on PS5, and then spend the extra power on higher per-pixel quality. But I'm not sure that would be worth it from a quality point of view. The power difference wouldn't be enough to e.g. double your samples per pixel for shadows or AO. Meanwhile, if they fix their RT buffer at the same lower resolution as on PS5, but run the main buffer at a higher resolution, the per-pixel quality of the data would be lower than on PS5, where the main buffer resolution is also lower - and I don't know if that might produce odd artifacts, or just end up as a wash. Maybe it could buy you back framerate, though?

Quality wise, though, I think scaling the RT buffer with output resolution will be the easiest way to approach it.
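gofreak's point that the console power gap wouldn't cover doubling samples per pixel checks out arithmetically: doubling samples roughly doubles ray cost, while the compute gap between the two GPUs is on the order of 18%. The TFLOPS figures below are the publicly quoted peak numbers, used here only to form a rough ratio:

```python
# Doubling samples per pixel ~doubles ray-tracing cost, which a
# ~18% compute advantage cannot cover. TFLOPS figures as publicly
# quoted; used purely for a rough ratio.

xsx_tflops = 12.15   # 52 CUs @ 1.825 GHz
ps5_tflops = 10.28   # 36 CUs @ 2.23 GHz

advantage = xsx_tflops / ps5_tflops
print(f"compute advantage: {advantage:.2f}x")
print(f"covers 2x samples per pixel: {advantage >= 2.0}")  # False
```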
 

Duderino

Member
Nov 2, 2017
305
What do you mean? RT is so horribly bandwidth intensive, and will probably be even more so on these new consoles. Bandwidth for real-time things. Being bandwidth limited and going from hundreds of GB/s in memory to the pitifully slow SSD would be a disaster.

Correct me if I'm wrong here, but moving BVH construction offline (obviously for static objects) is one of the optimizations Microsoft have talked about in the latest Series X Tech DF article, correct?

Like you mentioned, streaming directly off the SSD is not practical, but if a dev team takes the offline (or hybrid) approach, getting BVH data from storage will have a time cost. Not a raw RT performance cost, mind you, but still a potential bottleneck for large or geometry-rich worlds.
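The "time cost, not raw RT cost" framing can be put in rough numbers: fetching a prebuilt BVH from the SSD is orders of magnitude slower than having it resident in GDDR6. The BVH size below is an assumed round figure; the bandwidths are the commonly quoted PS5 numbers:

```python
# Rough ratio: fetching a prebuilt (offline) BVH from the SSD vs.
# having it resident in GDDR6. Size is an assumed round figure;
# bandwidths are the commonly quoted PS5 numbers.

bvh_bytes     = 512 * 1024**2   # assume a 512 MiB static BVH
ssd_bandwidth = 5.5e9           # ~5.5 GB/s raw SSD
ram_bandwidth = 448e9           # ~448 GB/s GDDR6

ssd_ms = bvh_bytes / ssd_bandwidth * 1000
ram_ms = bvh_bytes / ram_bandwidth * 1000
print(f"SSD: {ssd_ms:.0f} ms vs RAM: {ram_ms:.1f} ms "
      f"({ssd_ms / ram_ms:.0f}x slower)")
```

Tens of milliseconds per fetch is fine for streaming ahead of time, but nowhere near fast enough to feed ray traversal directly.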
 

Deleted member 27315

User requested account closure
Banned
Oct 30, 2017
1,795
Series X will be better at RT. A better GPU with more shading power, more intersection tests, and more bandwidth = better RT performance.

This is common sense stuff.

They are both using the AMD approach, and unless AMD's hardware RT scales negatively, the XSX will have better RT performance.
How do you comment on this from Cerny:
I am starting to get quite bullish. I've already seen a PS5 title that is successfully using ray-tracing-based reflections in complex animated scenes with only modest costs.

Do you believe that first-party PS5 games could have better-looking RT than PS5 multi-platform titles?
 

III-V

Member
Oct 25, 2017
18,827
What do you mean? RT is so horribly bandwidth intensive, and will probably be even more so on these new consoles. Bandwidth for real-time things. Being bandwidth limited and going from hundreds of GB/s in memory to the pitifully slow SSD would be a disaster.

What may save memory for RT will be mesh shaders doing some great new culling and, for bandwidth, per-instance RT LOD.
I have a feeling that RAM is the real Achilles heel of both of these consoles this gen.
 

lightchris

Member
Oct 27, 2017
678
Germany
But my question was does RT improve linearly with the clock speed? I imagine it does, I was just looking for confirmation.

Absolutely. Unless the clock the RT units are running on is decoupled from the main GPU clock (which is unlikely), the RT performance is linearly tied to the GPU clock.
Or in other words: The RT performance difference between XSX and PS5 will be the same as the general GPU performance difference.
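The "linear with clock" claim above is straightforward to express: if the RT units share the main GPU clock, intersection-test throughput is just CU count × clock × tests per CU per clock. The 4 box tests per CU per clock figure matches common RDNA 2 descriptions, but treat it as an assumption used only for illustration:

```python
# If the RT units run on the main GPU clock, intersection-test
# throughput scales 1:1 with frequency. The tests-per-CU-per-clock
# figure is an assumption for illustration.

def box_tests_per_sec(cus, ghz, tests_per_cu_per_clock=4):
    return cus * ghz * 1e9 * tests_per_cu_per_clock

ps5 = box_tests_per_sec(36, 2.23)
xsx = box_tests_per_sec(52, 1.825)
print(f"XSX / PS5 throughput ratio: {xsx / ps5:.2f}")
```

The per-clock constant cancels out of the ratio, which is why the RT gap ends up the same as the general compute gap.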
 

Newbong

Member
Oct 27, 2017
180
What do you mean? RT is so horribly bandwidth intensive, and will probably be even more so on these new consoles. Bandwidth for real-time things. Being bandwidth limited and going from hundreds of GB/s in memory to the pitifully slow SSD would be a disaster.

What may save memory for RT will be mesh shaders doing some great new culling and, for bandwidth, per-instance RT LOD.

when are you guys dropping the mark cerny interview?
 

Hellshy

Member
Nov 5, 2017
1,170
I don't think that's true. Didn't Cerny even mention an example of something that didn't improve? Can't remember the feature now.

But my question was does RT improve linearly with the clock speed? I imagine it does, I was just looking for confirmation.

I think Cerny said you take a hit on system memory in relation to the GPU. Something like the memory being 33% further away in terms of cycles. Someone else would have to explain what that means, but Cerny seems to think the hit is worth it.
 

Mubrik_

Member
Dec 7, 2017
2,723
I think Cerny said you take a hit on system memory in relation to the GPU. Something like the memory being 33% further away in terms of cycles. Someone else would have to explain what that means, but Cerny seems to think the hit is worth it.

This please.

I think this was the one thing I didn't understand in the presentation. He mentioned it after talking about high clocks, but I'm not sure how it relates.

Can anyone expand on it?
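One plausible reading of the "33% further away" remark (this interpretation and all the numbers below are my own assumptions, not from the talk): memory latency is fixed in wall-clock time, so raising the GPU clock means more clock cycles tick by while a memory request is in flight.

```python
# Fixed wall-clock memory latency costs more GPU cycles as the
# clock rises. Latency and clocks are assumed round numbers.

LATENCY_NS = 100.0  # assumed fixed round-trip memory latency

def cycles_waiting(clock_ghz):
    return LATENCY_NS * clock_ghz  # ns * cycles-per-ns

low  = cycles_waiting(1.67)  # a hypothetical lower clock
high = cycles_waiting(2.23)  # PS5's peak GPU clock
print(f"{high / low - 1:.0%} more cycles spent waiting at the higher clock")
```

In other words, the GPU stalls for roughly a third more cycles per memory access at the higher clock, which would be the "hit" Cerny considers worth taking.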