NX Gamer: PS5 Full Spec Analysis | A new generation is Born

cyenz

Member
Oct 27, 2017
588
I guess this is the new narrative now.

At this point, there's no need to even address comments like these.


It's not that no one would use the extra power. It's about how that extra power would present itself and what it translates to.

It's a simple case of either higher native rez or a higher framerate. Not both. And the logical thing for any dev to do is to lock the framerate and dynamically scale the rez. What that translates to is that the PS5 would, on average, run at a scaled-down rez relative to the XSX about 15-20% of the time. While everything else remains exactly the same.

And while I and many others have repeated this, that means that the XSX could be at 2160p@(Y)fps (where Y is whatever fps lock the devs set out for) 100% of the time and the PS5 would be at 2160p@(Y)fps around 80%-85% of the time. All effects and assets would be the exact same thing, with the only difference being that when that part of the game comes along that is taxing on the hardware, the XSX would likely be able to hold its rez @2160p while the PS5 would scale its rez down to around 1800p/2052p to maintain the locked framerate.

That's what this nonsense is about. 2160p vs 1800p/2052p - 2160p.

What makes all this even crazier is that with those kinda rez targets/ranges (1800p - 2160p), 95% of the people out there wouldn't be able to tell them apart. It's actually easier to see a difference between 1080p and 900p (around a 40%+ rez difference) than it is to see one between 1440p and 2160p, even though the latter is over a 100% resolution difference. And that's simply because when pixel count is already that high, diminishing returns kick in and it starts getting really hard to tell resolutions apart. And yet such a fuss is being made about 1800p/2052p vs 2160p. Even crazier when you consider that those drops in rez on the PS5 would only really happen about 15-20% of the time.

Crazy times.
It can also be used for effects; it's a matter of developer choice how to use the extra TF. It's not mandatory to use it just for res or fps increases.
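The resolution comparisons in the quoted post can be sanity-checked with quick pixel-count arithmetic (a rough sketch assuming standard 16:9 frames; the function names are mine, not from the thread):

```python
def pixels(height):
    # Pixel count of a 16:9 frame at a given vertical resolution.
    return int(height * 16 / 9) * height

def pct_more(hi, lo):
    # How many percent more pixels the higher resolution pushes.
    return (pixels(hi) / pixels(lo) - 1) * 100

print(round(pct_more(1080, 900)))   # 1080p vs 900p:  ~44% more pixels
print(round(pct_more(2160, 1440)))  # 2160p vs 1440p: ~125% more pixels
print(round(pct_more(2160, 1800)))  # 2160p vs 1800p: ~44% more pixels
```

Notably, 2160p vs 1800p is the same ~44% pixel gap as 1080p vs 900p, which is exactly the comparison the post leans on.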
 
Apr 4, 2018
462
I guess this is the new narrative now.

At this point, there's no need to even address comments like these.


It's not that no one would use the extra power. It's about how that extra power would present itself and what it translates to.

It's a simple case of either higher native rez or a higher framerate. Not both. And the logical thing for any dev to do is to lock the framerate and dynamically scale the rez. What that translates to is that the PS5 would, on average, run at a scaled-down rez relative to the XSX about 15-20% of the time. While everything else remains exactly the same.

And while I and many others have repeated this, that means that the XSX could be at 2160p@(Y)fps (where Y is whatever fps lock the devs set out for) 100% of the time and the PS5 would be at 2160p@(Y)fps around 80%-85% of the time. All effects and assets would be the exact same thing, with the only difference being that when that part of the game comes along that is taxing on the hardware, the XSX would likely be able to hold its rez @2160p while the PS5 would scale its rez down to around 1800p/2052p to maintain the locked framerate.

That's what this nonsense is about. 2160p vs 1800p/2052p - 2160p.

What makes all this even crazier is that with those kinda rez targets/ranges (1800p - 2160p), 95% of the people out there wouldn't be able to tell them apart. It's actually easier to see a difference between 1080p and 900p (around a 40%+ rez difference) than it is to see one between 1440p and 2160p, even though the latter is over a 100% resolution difference. And that's simply because when pixel count is already that high, diminishing returns kick in and it starts getting really hard to tell resolutions apart. And yet such a fuss is being made about 1800p/2052p vs 2160p. Even crazier when you consider that those drops in rez on the PS5 would only really happen about 15-20% of the time.

Crazy times.
I love posts like this that try to hand-wave away the large GPU power difference between the consoles as though it will be unnoticeable.

If you scroll back to my previous posts, you'll notice that I made a fat list of graphical features that can easily be tweaked by 3rd party developers, especially multi-plat devs supporting PC, to make stark visual improvements in the XSX versions (such as post-processing, anti-aliasing, lighting, and shadows; even features that have minor memory footprints, such as tessellation, can look better).

This doesn't even include what I've heard about Ray-tracing apparently heavily relying on greater CU counts. Lighting could end up being the greatest visual differentiator, along with significantly better Ray-tracing support.

Devs could surely spend the extra GPU power in poorly chosen ways; I'm sure some will. But just as they always have in the past, many developers will also choose to be smart and utilize the extra GPU features and efficiencies to make the game look as good as it can.

Also remember that the AAA space is incredibly competitive. Developers need to show their games off in the best light possible; that's likely why Microsoft got so many AAA game reveals after the release of the X1X. That alone is reason enough for devs to push visuals as far as they can on XSX.

I have a feeling this gen will be less about resolution than last gen.
 
Last edited:

Rukumouru

Member
Nov 12, 2017
1,596
That's literally 8GB of GDDR5 all over again here.

How did PC games do the last generation with PS4's unified pool of memory?
Multiplats needed to accommodate the OG Xbox One's DDR3 memory. That is not going to be the case here.

PCs generally have more cache in their CPUs. It’s not an issue.
While CPU cache is insanely fast, it's also much, much smaller than the unified GDDR6 memory pool. Seems to me like each would allow for different approaches, meaning techniques optimized for the console architecture may not be able to translate to the PC's advantages.

I hope I'm wrong, I enjoy both the console and PC experience and I buy the same games on both often so I can enjoy them depending on my mood. So I'd like the PC to continue receiving the very strong multiplats support we've seen this gen.
 
Last edited:

Pheonix

Member
Dec 14, 2018
2,586
St Kitts
It can be also used for effects, its a matter o developers choice on how to use the extra TF. It's not mandatory to use it just for res or fps increase.
Yes and no. Yes, it can be used for "effects", but no, that is most likely not going to be the case.

The power difference is simply not that much. It's the smallest power difference between a PS and an Xbox console ever. To use it for "extra effects" would mean that devs are trying to keep the rez and framerate identical on both platforms; in so doing they'd have ~2TF worth of leeway on the XSX to invest in effects. Why do that when you can just drop the rez of the PS5 down by around 15% - 20% and have complete feature/effect parity at the cost of just running at a slightly lower rez?

Especially when the rez drop would be inconsequential to the overall presentation of the game?
I love posts like this that try to hand-wave away the large GPU power difference between the consoles as though it will be unnoticeable.

If you scroll back to my previous posts, you'll notice that I made a fat list of graphical features that can easily be tweaked by 3rd party developers, especially multi-plat devs supporting PC, to make stark visual improvements in the XSX versions (such as post-processing, anti-aliasing, lighting, and shadows; even features that have minor memory footprints, such as tessellation, can look better).

This doesn't even include what I've heard about Ray-tracing apparently heavily relying on greater CU counts. Lighting could end up being the greatest visual differentiator, along with significantly better Ray-tracing support.

Devs could surely spend the extra GPU power in poorly chosen ways; I'm sure some will. But just as they always have in the past, many developers will also choose to be smart and utilize the extra GPU features and efficiencies to make the game look as good as it can.

Also remember that the AAA space is incredibly competitive. Developers need to show their games off in the best light possible; that's likely why Microsoft got so many AAA game reveals after the release of the X1X. That alone is reason enough for devs to push visuals as far as they can on XSX.

I have a feeling this gen will be less about resolution than last gen.
No hand waving here. Everything you have mentioned still amounts to a sub-20% difference in overall performance. Call it shader, RT... whatever. It's really a simple thing: whatever the XSX can do at 12TF, the PS5 can do with 10TF, but at a lower rez than what the XSX is doing it at.

I don't know what past you are referring to, but I am using the last 7 years as a reference. Look at the PS4/XB1 and the PS4 Pro/XB1X. Both of those pairings had significantly bigger differences between them than these next-gen machines do. And where and how the extra power was applied was predictable and consistent.

Now you are looking at two platforms with even less of a difference between them than any of those aforementioned platforms. And you somehow expect devs to suddenly break precedent when now more than ever they have even less of a reason to do so?

You don't have to take my word for it (and I know you won't) but just wait and see. Remember I said this and you are free to call me out if I end up being wrong. I'll say it again and bold it and enlarge it so you can use it against me later.

With regards to multiplatform games, the only difference between the two consoles would be their average sustained Resolution. It would be on average higher on the XSX than the PS5. That would be the only IQ related difference. There would be asset, feature and framerate parity.
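As a rough sanity check on the "same work, lower rez" claim: if per-frame GPU cost scales linearly with pixel count (a simplification that ignores fixed per-frame costs), the vertical resolution at which a ~10.3TF GPU matches a ~12.1TF GPU's 2160p workload can be estimated. This is an illustrative sketch; the helper name is mine:

```python
import math

def equivalent_height(base_height, tf_small, tf_big):
    # Height at which the smaller GPU pushes the same per-frame work as
    # the bigger GPU at base_height, assuming cost scales with pixel
    # count (so with height squared at a fixed aspect ratio).
    return base_height * math.sqrt(tf_small / tf_big)

# Rated figures: XSX ~12.15 TF, PS5 ~10.28 TF
print(round(equivalent_height(2160, 10.28, 12.15)))  # ~1987
```

That lands inside the 1800p-2052p band discussed in the thread.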
 
Last edited:

GhostTrick

Member
Oct 25, 2017
8,195
Multiplats needed to accommodate the OG Xbox One's DDR3 memory. That is not going to be the case here.

PCs generally have more cache in their CPUs. It’s not an issue.
While CPU cache is insanely fast, it's also much, much smaller than the unified GDDR6 memory pool. Seems to me like each would allow for different approaches, meaning techniques optimized for the console architecture may not be able to translate to the PC's advantages.

I hope I'm wrong, I enjoy both the console and PC experience and I buy the same games on both often so I can enjoy them depending on my mood. So I'd like the PC to continue receiving the very strong multiplats support we've seen this gen.

And PCs already have a lot of VRAM dedicated to the GPU, plus system RAM. This isn't a real problem.
 

BitterFig

Member
Oct 30, 2017
254
With regards to multiplatform games, the only difference between the two consoles would be their average sustained Resolution. It would be on average higher on the XSX than the PS5. That would be the only IQ related difference. There would be asset, feature and framerate parity.
Yeah, well, that would be the most common thing, but then you have oddities like the RE3 demo running better on PS4 Pro, or, earlier in the gen, AC: Unity running better on XB1 because of its slightly better CPU. I think we can expect every single difference between PS5 and XSX to show up one way or another, including 15% better resolution, or resolution parity but better effects/shadows or whatever on XSX, and any advantage the SSD can bring to the table for PS5. DF is going to have the time of their life, and we should too. Parity is boring.
 

Pheonix

Member
Dec 14, 2018
2,586
St Kitts
Yeah, well, that would be the most common thing, but then you have oddities like the RE3 demo running better on PS4 Pro, or, earlier in the gen, AC: Unity running better on XB1 because of its slightly better CPU. I think we can expect every single difference between PS5 and XSX to show up one way or another, including 15% better resolution, or resolution parity but better effects/shadows or whatever on XSX, and any advantage the SSD can bring to the table for PS5. DF is going to have the time of their life, and we should too. Parity is boring.
Yes. Exactly.

You see, the thing is that none of us are wrong. People just seem to be trying to make more of something that has been the norm for as long as differently powered hardware has existed. Or pushing this idea that devs would somehow choose to do what doesn't make sense.

To me I see it like this:

You have two systems. One is 12TF and one is 10TF. Everything else about these two systems is pretty much identical, or at the very least correlates with the difference in GPU power, eg... the one with the higher TF has higher memory bandwidth or a slightly higher CPU clock... etc.

Now as a dev you have options. Let's say the game is running at 4K@60fps with Ultra settings on the XSX. What are the options for the PS5 version?
  • Run it at 4K@60fps with High settings on the PS5 (drop overall asset quality from Ultra to High: textures, RT, AA, shadows... everything)
  • Run it at 4K@30fps with Ultra settings on the PS5 (keep everything the same but drop framerate)
  • Run it at 1800p@60fps with Ultra settings on the PS5 (keep everything the same but drop resolution)
The third option is the one that would have the least presentation and performance impact (by that I mean it's better than having a badly performing game with choppy framerates, because not everyone has a VRR-capable TV). And it also happens to be the easiest thing to do.

I find it shocking that this isn't obvious to some people. It's the path of least resistance, the easiest thing to do.
 
Last edited:

Kiekura

Member
Mar 23, 2018
2,127
Yes. Exactly.

You see, the thing is that none of us are wrong. People just seem to be trying to make more of something that has been the norm for as long as differently powered hardware has existed. Or pushing this idea that devs would somehow choose to do what doesn't make sense.

To me I see it like this:

You have two systems. One is 12TF and one is 10TF. Everything else about these two systems is pretty much identical, or at the very least correlates with the difference in GPU power, eg... the one with the higher TF has higher memory bandwidth or a slightly higher CPU clock... etc.

Now as a dev you have options. Let's say the game is running at 4K@60fps with Ultra settings on the XSX. What are the options for the PS5 version?
  • Run it at 4K@60fps with High settings on the PS5 (drop overall asset quality from Ultra to High: textures, RT, AA, shadows... everything)
  • Run it at 4K@30fps with Ultra settings on the PS5 (keep everything the same but drop framerate)
  • Run it at 1800p@60fps with Ultra settings on the PS5 (keep everything the same but drop resolution)
The third option is the one that would have the least presentation and performance impact (by that I mean it's better than having a badly performing game with choppy framerates, because not everyone has a VRR-capable TV). And it also happens to be the easiest thing to do.

I find it shocking that this isn't obvious to some people. It's the path of least resistance, the easiest thing to do.
This is how I also see it.
 

BitterFig

Member
Oct 30, 2017
254
Yes. Exactly.

You see, the thing is that none of us are wrong. People just seem to be trying to make more of something that has been the norm for as long as differently powered hardware has existed. Or pushing this idea that devs would somehow choose to do what doesn't make sense.

To me I see it like this:

You have two systems. One is 12TF and one is 10TF. Everything else about these two systems is pretty much identical, or at the very least correlates with the difference in GPU power, eg... the one with the higher TF has higher memory bandwidth or a slightly higher CPU clock... etc.

Now as a dev you have options. Let's say the game is running at 4K@60fps with Ultra settings on the XSX. What are the options for the PS5 version?
  • Run it at 4K@60fps with High settings on the PS5 (drop overall asset quality from Ultra to High: textures, RT, AA, shadows... everything)
  • Run it at 4K@30fps with Ultra settings on the PS5 (keep everything the same but drop framerate)
  • Run it at 1800p@60fps with Ultra settings on the PS5 (keep everything the same but drop resolution)
The third option is the one that would have the least presentation and performance impact (by that I mean it's better than having a badly performing game with choppy framerates, because not everyone has a VRR-capable TV). And it also happens to be the easiest thing to do.

I find it shocking that this isn't obvious to some people. It's the path of least resistance, the easiest thing to do.
I hope dynamic resolution will be the most common developer choice. Gears 5 at dynamic 4K is a thing of beauty and is by far the best thing I've seen my PC spit out. Probably, in the case of dynamic resolution, the average difference between XSX and PS5 will be much less than 15%-20% in resolution, unless the developer tunes the game so well that the XSX is rarely at 4K.

I wonder why dynamic res is not more common. Does it require some sacrifices? I wonder how it works. Is it some anytime algorithm that just stops working on the current frame and returns whatever it has at that point, or does it estimate how heavy a scene is beforehand and choose a resolution a priori?
 

Pryme

Member
Aug 23, 2018
3,197
Yes. Exactly.

You see, the thing is that none of us are wrong. People just seem to be trying to make more of something that has been the norm for as long as differently powered hardware has existed. Or pushing this idea that devs would somehow choose to do what doesn't make sense.

To me I see it like this:

You have two systems. One is 12TF and one is 10TF. Everything else about these two systems is pretty much identical, or at the very least correlates with the difference in GPU power, eg... the one with the higher TF has higher memory bandwidth or a slightly higher CPU clock... etc.

Now as a dev you have options. Let's say the game is running at 4K@60fps with Ultra settings on the XSX. What are the options for the PS5 version?
  • Run it at 4K@60fps with High settings on the PS5 (drop overall asset quality from Ultra to High: textures, RT, AA, shadows... everything)
  • Run it at 4K@30fps with Ultra settings on the PS5 (keep everything the same but drop framerate)
  • Run it at 1800p@60fps with Ultra settings on the PS5 (keep everything the same but drop resolution)
The third option is the one that would have the least presentation and performance impact (by that I mean it's better than having a badly performing game with choppy framerates, because not everyone has a VRR-capable TV). And it also happens to be the easiest thing to do.

I find it shocking that this isn't obvious to some people. It's the path of least resistance, the easiest thing to do.
Actually, that was the path taken this current gen, as seen in PS4 Pro vs Xbox One X comparisons.
This new gen, however, ‘native 4K’ is a powerful marketing tool. Keeping the resolution at 4K and dialing down effects is probably just as easy as implementing a lower resolution, and it allows the dev and marketers to brag about ‘4K 60fps’.


It’s not a straightforward decision.
 

JasoNsider

Developer at Breakfall
Verified
Oct 25, 2017
335
Canada
I doubt even a resolution drop will be necessary. These are incredibly close machines. Multiplatform titles will most likely be nearly identical. Maybe PS5 games will get some very subtle variable resolution later in the generation, stepping down a notch or two in resolution. And 99% of players won't even be able to tell in that case.

As always, I could be off, but they are close. I expect PS5 games to impress big time.
 

Jade1962

Member
Oct 28, 2017
1,605
Actually, that was the path taken this current gen, as seen in PS4 Pro vs Xbox One X comparisons.
This new gen, however, ‘native 4K’ is a powerful marketing tool. Keeping the resolution at 4K and dialing down effects is probably just as easy as implementing a lower resolution, and it allows the dev and marketers to brag about ‘4K 60fps’.


It’s not a straightforward decision.
Isn't native 4K already used in X1X marketing? Both the Pro and the 1X have 4K branding. The XSX and PS5 will also have the same branding. I don't think adding 'native' is going to mean much to the mass market.
 

Fafalada

Member
Oct 27, 2017
1,584
The PS2 was substantially more expensive than the DC, moderately more expensive than the GameCube (if I'm remembering correctly) despite being a much weaker system, and cheaper than the OG Xbox. None of those systems ever got within a mile of it.
Not exactly. At least in the US (and the EU, IIRC), the Xbox maintained price parity with the PS2 for the entire time it was on the market (it dropped its launch price faster than the PS3 or even the NDS to keep up with PS2 pricing), even though it shipped with a built-in HDD and network adapter and had by far the biggest hardware gap between two consoles after PS2's own launch.
Of course, if we really want to argue sales = specs, one could look at the fact that in 2000-2004 the PS1 outsold the XB, GC and DC.


To look at it from another perspective: the XB was 2-3x faster in every meaningful metric compared to its nearest market competitor, the GC.
And we still have people to this day arguing about the two like they were basically 'on par' in terms of specs.
Meanwhile, this month, a 15% GPU gap is being blown up into all but a generational gap. If there's any lesson I take from this, it's that specs don't matter to people at all - perceptions are 99% of the game.
 

ShapeGSX

Member
Nov 13, 2017
1,485
Multiplats needed to accommodate the OG Xbox One's DDR3 memory. That is not going to be the case here.


While CPU cache is insanely fast, it's also much, much smaller than the unified GDDR6 memory pool. Seems to me like each would allow for different approaches, meaning techniques optimized for the console architecture may not be able to translate to the PC's advantages.

I hope I'm wrong, I enjoy both the console and PC experience and I buy the same games on both often so I can enjoy them depending on my mood. So I'd like the PC to continue receiving the very strong multiplats support we've seen this gen.
CPU workloads in games don't generally need to stream huge amounts of data from RAM. They'll be fine.
 

Pryme

Member
Aug 23, 2018
3,197
Isn't native 4k already used in X1X marketing? Both the Pro and 1X have 4K branding. XSX and PS5 will also have the same branding. I don't think adding native is going to mean much to the mass market.
I’m talking about publisher marketing, not console maker.

Thanks to TVs, 4K is getting to be a mainstream term these days.

But I guess time will tell.
 
Last edited:

Jade1962

Member
Oct 28, 2017
1,605
I’m talking about publisher marketing, not console maker.

Thanks to TVs, 4K is getting to be a mainstream term these days.
And implementing variable resolution accomplishes that. My question is: why are devs going to change how they develop, given what we've seen with the 1X and Pro, and previously with the PS4 and 1S? I don't see a vast majority of games having major differences in graphical effects. The majority of differences will be resolution and/or framerates. I can imagine, as the gen progresses, though, a divergence in RT as developers become more adept at utilizing it.
 

Pheonix

Member
Dec 14, 2018
2,586
St Kitts
I hope dynamic resolution will be the most common developer choice. Gears 5 at dynamic 4K is a thing of beauty and is by far the best thing I've seen my PC spit out. Probably, in the case of dynamic resolution, the average difference between XSX and PS5 will be much less than 15%-20% in resolution, unless the developer tunes the game so well that the XSX is rarely at 4K.

I wonder why dynamic res is not more common. Does it require some sacrifices? I wonder how it works. Is it some anytime algorithm that just stops working on the current frame and returns whatever it has at that point, or does it estimate how heavy a scene is beforehand and choose a resolution a priori?
Dynamic resolution is already the norm in the industry. What tends to happen is that the game runs with a target framerate; when it drops frames, resolution is, in turn, dropped to compensate and get the framerate back up to snuff. Both consoles have things in their chips that make this process more streamlined. They just call it different things.
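The feedback loop described above can be sketched in a few lines. This is a toy illustration, not any engine's actual implementation: the frame-time budget, the resolution band, and the square-root step are all assumptions; real engines use GPU timing queries, smoothing windows, and per-axis scaling.

```python
# Toy dynamic-resolution controller: adjust render height against a
# frame-time budget. Illustrative only.

TARGET_MS = 16.7           # 60fps frame-time budget
MAX_H, MIN_H = 2160, 1800  # allowed vertical-resolution band

def next_height(current_h, last_frame_ms):
    # Assume GPU cost scales ~linearly with pixel count, i.e. with
    # height squared at a fixed aspect ratio, so adjust height by the
    # square root of the time ratio, then clamp to the allowed band.
    scale = (TARGET_MS / last_frame_ms) ** 0.5
    return max(MIN_H, min(MAX_H, int(current_h * scale)))

print(next_height(2160, 16.7))  # on budget -> stays at 2160
print(next_height(2160, 20.0))  # heavy scene -> drops toward 1800
print(next_height(1800, 14.0))  # light scene -> climbs back up
```

This answers the "anytime algorithm vs a priori estimate" question in spirit: the common approach is reactive, choosing the next frame's resolution from the previous frame's measured cost.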
Actually, that was the path taken this current gen, as seen in PS4 Pro vs Xbox One X comparisons.
This new gen, however, ‘native 4K’ is a powerful marketing tool. Keeping the resolution at 4K and dialing down effects is probably just as easy as implementing a lower resolution, and it allows the dev and marketers to brag about ‘4K 60fps’.


It’s not a straightforward decision.
Nope. That's just irrelevant. Both consoles will tout native 4K, and both will be native 4K. The difference is just going to be that the PS5 drops to 1800p/2052p more often than the XSX would.

And it would be unbelievably stupid... flat out stupid, for any dev to keep the resolution locked and then choose to dial down effects. Like, it would shock me if any dev takes that route.

It's better to have your best-looking assets and effects running at a lower overall resolution than to be running at peak resolution with lower-quality assets. The former is far less perceivable (nearly impossible to detect at the resolutions we are talking about) than the latter. And it's significantly easier to implement.

If they do it by reducing asset quality, they would have to go through every asset type and tweak each one until they meet their targets. That's dropping shadow quality, then dropping the AA level, then reducing AO, then reducing texture quality, lighting, then cutting out geometry... etc. Why do all that, and see how many cuts you have to make, when you can just drop the res from 2160p to 1900p in 30 minutes and call it a day?
And again, the difference in rez between 2160p and 1800p/2052p is near imperceptible.
 

Kiekura

Member
Mar 23, 2018
2,127
I doubt even a resolution drop will be necessary. These are incredibly close machines. Multiplatform titles will most likely be nearly identical. Maybe PS5 games will get some very subtle variable resolution later in the generation, stepping down a notch or two in resolution. And 99% of players won't even be able to tell in that case.

As always, I could be off, but they are close. I expect PS5 games to impress big time.
Most likely this is the case, but hey, remember:

War, war never changes!
 

Ricky Ricardo

Member
Oct 26, 2017
959
Atlanta, GA
User Warned: Console Warring
Yes. Exactly.

You see, the thing is that none of us are wrong. People just seem to be trying to make more of something that has been the norm for as long as differently powered hardware has existed. Or pushing this idea that devs would somehow choose to do what doesn't make sense.

To me I see it like this:

You have two systems. One is 12TF and one is 10TF. Everything else about these two systems is pretty much identical, or at the very least correlates with the difference in GPU power, eg... the one with the higher TF has higher memory bandwidth or a slightly higher CPU clock... etc.

Now as a dev you have options. Let's say the game is running at 4K@60fps with Ultra settings on the XSX. What are the options for the PS5 version?
  • Run it at 4K@60fps with High settings on the PS5 (drop overall asset quality from Ultra to High: textures, RT, AA, shadows... everything)
  • Run it at 4K@30fps with Ultra settings on the PS5 (keep everything the same but drop framerate)
  • Run it at 1800p@60fps with Ultra settings on the PS5 (keep everything the same but drop resolution)
The third option is the one that would have the least presentation and performance impact (by that I mean it's better than having a badly performing game with choppy framerates, because not everyone has a VRR-capable TV). And it also happens to be the easiest thing to do.

I find it shocking that this isn't obvious to some people. It's the path of least resistance, the easiest thing to do.
An actual falsehood. You think Sony clocked the PS5 GPU up so high for kicks and giggles?
36 CUs vs 52 CUs is a large, noticeable gap. It's not like the XSX GPU's clock of 1.8GHz is low.
At 1.8GHz, PS5 comes out to 8.4TF
At 2.0GHz, PS5 comes out to 9.2TF
Clearly, Sony felt the need to up clocks to raise teraflops so they would compare more favorably, because both of the previous configurations are a major perception blow in the face of the XSX.
No amount of denial from you lot will change this. Sony had no choice but to be aggressive with clocks because the machine is comparatively lacking in grunt.
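For reference, the teraflop figures being traded here follow from the standard peak-FP32 formula for these GPUs (CUs × 64 shaders × 2 ops per clock × clock speed). A quick sanity check, using the announced clocks:

```python
def tflops(cus, clock_ghz):
    # Peak FP32: CUs x 64 shaders x 2 ops per clock x clock (GHz) / 1000
    return cus * 64 * 2 * clock_ghz / 1000

print(round(tflops(52, 1.825), 2))  # XSX (52 CUs @ 1.825 GHz): ~12.15 TF
print(round(tflops(36, 2.0), 2))    # 36 CUs @ 2.0 GHz:         ~9.22 TF
print(round(tflops(36, 2.23), 2))   # PS5 boost (36 @ 2.23 GHz): ~10.28 TF
```

This makes the point above concrete: with only 36 CUs, clock speed is the lever that moves the PS5's headline number from ~9.2 toward ~10.3 TF.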
 
Last edited:

D BATCH

Member
Nov 15, 2017
137
There is no large GPU power difference between the two consoles. It's the smallest there has ever been, and the resolution difference will absolutely be the only tangible delta you are going to (most probably not) notice.
FPS too, if Sony does not implement some version of VRS. Also, ray-traced games will show a difference. Sony will also have lesser effects, such as a mix of high settings, fewer particles, etc. If a GPU on PC that is 2-3 TFLOPS weaker shows differences, so will the consoles. A 2070 does not perform like a 2080, nor does a 2080 perform like a 2080 Ti. PS5 games will look amazing, but let's stop with this talk that they will look nearly identical; they won't.
 
Last edited:

BitterFig

Member
Oct 30, 2017
254
I doubt even a resolution drop will be necessary. These are incredibly close machines. Multiplatform titles will most likely be nearly identical. Maybe PS5 games will get some very subtle variable resolution later in the generation, stepping down a notch or two in resolution. And 99% of players won't even be able to tell in that case.

As always, I could be off, but they are close. I expect PS5 games to impress big time.
Exactly. It could be that on XSX the game is optimized such that most of the time it is at native 4K and it only drops resolution in heavily loaded scenes, in which case the average resolution difference would be less than the max ~20% difference between XSX and PS5. Of course, it could also be that on XSX the dynamic resolution is rarely at native 4K, in which case the average would be much closer to the max.
Dynamic resolution is already the norm in the industry.
It doesn't seem so standard for Sony's 1st party, but OK, the question is more about 3rd parties.
What tends to happen is that the game runs with a target framerate; when it drops frames, resolution is, in turn, dropped to compensate and get the framerate back up to snuff.
Hmm, interesting, so the framerate is not exactly stable but stays within a small margin of the target.
 

-Le Monde-

Avenger
Dec 8, 2017
3,903
I doubt even a resolution drop will be necessary. These are incredibly close machines. Multiplatform titles will most likely be nearly identical. Maybe PS5 games will get some very subtle variable resolution later in the generation, stepping down a notch or two in resolution. And 99% of players won't even be able to tell in that case.

As always, I could be off, but they are close. I expect PS5 games to impress big time.
Probably, we’ll end up with comparisons that zoom in 20x or 50x to show a small difference. That won’t stop people from going “this is pretty much unplayable”. 😋
 

nelsonroyale

Member
Oct 28, 2017
4,069
An actual falsehood. You think Sony clocked the PS5 GPU up so high for kicks and giggles?
36 CUs vs 52 CUs is a large, noticeable gap. It's not like the XSX GPU's clock of 1.8GHz is low.
At 1.8GHz, PS5 comes out to 8.4TF
At 2.0GHz, PS5 comes out to 9.2TF
Clearly, Sony felt the need to up the clocks to raise the teraflops so they would compare more favorably, because both of the previous configurations are a major blow to perception next to the XSX.
No amount of denial from you lot will change this. Sony had no choice but to be aggressive with clocks, because the machine is comparatively lacking in grunt.
There is no evidence that they were ever going for 1.8 / 36, and ample evidence that they were always aiming for 2.0 / 36. With their implementation, they have effectively achieved beyond that.
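The teraflop figures being thrown around here follow directly from CU count and clock. A quick sanity check, assuming the standard RDNA layout of 64 FP32 ALUs per CU doing 2 ops per cycle (note the quoted 8.4TF figure corresponds to a clock nearer 1.825GHz than a flat 1.8GHz):

```python
def gpu_tflops(cus: int, clock_ghz: float) -> float:
    """Peak FP32 TFLOPS = CUs x 64 ALUs x 2 ops/cycle x clock (GHz) / 1000."""
    return cus * 64 * 2 * clock_ghz / 1000

print(round(gpu_tflops(36, 1.825), 1))  # 36 CUs at ~1.825 GHz -> 8.4
print(round(gpu_tflops(36, 2.0), 1))    # 36 CUs at 2.0 GHz    -> 9.2
print(round(gpu_tflops(36, 2.23), 1))   # PS5 as announced     -> 10.3
print(round(gpu_tflops(52, 1.825), 1))  # XSX as announced     -> 12.1
```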
 

gundamkyoukai

Member
Oct 25, 2017
8,441
FPS too, if Sony does not implement some version of VRS. Ray-traced games will also show a difference. The PS5 will also see lesser effects, such as a mix of high settings, fewer particles, etc. If a GPU on PC that is 2-3 TFLOPS weaker shows differences, so will the consoles. A 2070 does not perform like a 2080, nor does a 2080 perform like a 2080 Ti. PS5 games will look amazing, but let's stop with this talk that they will look nearly identical; they won't.
You can't have all those different things better at the same time with only a 15% difference.
I mean, the XSX GPU is not even 15% better overall, only in certain aspects, which will make things even closer for other parts of the graphics pipeline.
 
Last edited:

cyenz

Member
Oct 27, 2017
588
The denial in this thread is something I expected as soon as the PS5 specs were announced.

In the end, everyone will have a great experience next gen, and that is great for everyone.
 

BreakAtmo

Member
Nov 12, 2017
5,246
I guess this is the new narrative now.

At this point, there's no need to even address comments like these.


It's not that no one would use the extra power. It's about how that extra power would present itself and what it translates to.

It's a simple case of either higher native res or a higher framerate, not both. And the logical thing for any dev to do is to lock the framerate and dynamically scale the res. What that translates to is that the PS5 would, on average, run at a scaled-down res from the XSX about 15-20% of the time, while everything else remains exactly the same.

And while I and many others have repeated this, that means that the XSX could be at 2160p@(Y)fps (where Y is whatever fps lock the devs set) 100% of the time, and the PS5 would be at 2160p@(Y)fps around 80-85% of the time. All effects and assets would be exactly the same, with the only difference being that when a part of the game comes along that is taxing on the hardware, the XSX would likely be able to hold its res at 2160p while the PS5 would scale its res down to around 1800p-2052p to maintain the locked framerate.

That's what this nonsense is about: 2160p vs. 1800p-2160p.

What makes all this even crazier is that with those kinds of res targets/ranges (1800p-2160p), 95% of people out there wouldn't be able to tell them apart. It's actually easier to see a difference between 1080p and 900p (around a 44% difference in pixel count) than between 1440p and 2160p, even though the latter is over a 100% difference in pixel count. That's simply because when pixel count is already that high, diminishing returns kick in and it starts getting really hard to tell resolutions apart. And yet such a fuss is being made about 1800p/2052p vs. 2160p. Even crazier when you consider that those drops in res on the PS5 would only really happen about 15-20% of the time.
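For what it's worth, the pixel-count ratios in that comparison check out; a quick check, assuming 16:9 frames:

```python
def pixels(height: int) -> int:
    """Pixel count of a 16:9 frame with the given vertical resolution."""
    return (height * 16 // 9) * height

print(round(pixels(1080) / pixels(900) - 1, 2))   # 1080p vs 900p  -> 0.44 (~44% more pixels)
print(round(pixels(2160) / pixels(1440) - 1, 2))  # 2160p vs 1440p -> 1.25 (~125% more pixels)
print(round(pixels(2160) / pixels(1800) - 1, 2))  # 2160p vs 1800p -> 0.44 (~44% more pixels)
```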

Crazy times.
Yeah, and there are even more ways the pixel counts could come out. We could also see games using dynamic temporal injection and the like, where PS5 gives around the 1440p-1620p area while XSX is more like 1620p-1800p.
 

CanisMajoris

The Fallen
Oct 27, 2017
536
FPS too, if Sony does not implement some version of VRS. Ray-traced games will also show a difference. The PS5 will also see lesser effects, such as a mix of high settings, fewer particles, etc. If a GPU on PC that is 2-3 TFLOPS weaker shows differences, so will the consoles. A 2070 does not perform like a 2080, nor does a 2080 perform like a 2080 Ti. PS5 games will look amazing, but let's stop with this talk that they will look nearly identical; they won't.

This is not a Pro vs. 1X situation; the relative difference is much smaller, and that's just the GPU part. The 1X had a substantial advantage in RAM, allowing games to have higher-resolution textures, which further extended the perceptible gap in IQ and sharpness.

This time it's going to be mostly a resolution difference, and it's a bit early to talk about graphical features and settings. RT will scale with the resolution difference, so I'm not sure where the "advantage" will come from; it's also just 18-20% better. VRS and Mesh Shaders are RDNA2 features (just like a lot of the other features MS talked about).

Also, there is a reason why Sony decided to dedicate so much silicon space to custom storage and I/O hardware instead of going with a larger GPU. We will have to wait and see the underlying benefits there.
 

Ricky Ricardo

Member
Oct 26, 2017
959
Atlanta, GA
User Banned (3 Days): Console wars
There is no evidence that they were ever going for 1.8 / 36, and ample evidence that they were always aiming for 2.0 / 36. With their implementation, they have effectively achieved beyond that.
There is no doubt in my mind they tested @ 1.8GHz. They would have tested a variety of configurations; the 2GHz clock we surmised from the GitHub leaks was a part of that testing. And yes, they went above it, because they had no choice. Imagine where the conversation would currently be if Sony had stayed at 2GHz/9.2TF. Again, 2.2GHz is not a move you make for your health. You make it because you're boxed in and need to eke out every shred of performance you can get. It's like overclocking a 1660 Ti to its max or even beyond and then trying to say that it is practically identical to an RTX 2060 because of its newly adjusted core clock/teraflops. No, it is not.
 

BreakAtmo

Member
Nov 12, 2017
5,246
Yeah, well, that would be the most common outcome, but then you have oddities like the RE3 demo running better on the PS4 Pro.
This is only a hunch, but if we assume the demo build isn't just super unfinished, I think this is an example of what happens when the One X's extra bandwidth isn't especially needed. There are examples of games like RDR2 where the One X pushes more than twice the pixels of the Pro because the game also really wants a lot of bandwidth, and the Pro is constrained in that area. My guess is that in the case of RE3, the Pro's bandwidth is enough, and the result is exactly what you would expect to see when a 42% stronger GPU tries to push nearly double the pixels.

I might be completely off, though - I'm interested to see how the final game performs.
 

nelsonroyale

Member
Oct 28, 2017
4,069
There is no doubt in my mind they tested @ 1.8GHz. They would have tested a variety of configurations; the 2GHz clock we surmised from the GitHub leaks was a part of that testing. And yes, they went above it, because they had no choice. Imagine where the conversation would currently be if Sony had stayed at 2GHz/9.2TF. Again, 2.2GHz is not a move you make for your health. You make it because you're boxed in and need to eke out every shred of performance you can get. It's like overclocking a 1660 Ti to its max or even beyond and then trying to say that it is practically identical to an RTX 2060 because of its newly adjusted core clock/teraflops. No, it is not.
I think we need to wait and see the relative prices of these things. You need to look at why Sony went with 36 CUs in the first place; I expect the RRP they were aiming for has a lot to do with it. I suspect the PS5 will launch cheaper than the XSX for a number of reasons. That scenario would be pretty favourable to Sony, I would say, especially if Lockhart is substantially weaker. I reckon the XSX will be the best value for money on the Xbox side, though.
 

Pheonix

Member
Dec 14, 2018
2,586
St Kitts
Doesn't seem so standard for Sony's first-party studios, but OK; the question is more about third parties.

Hmm, interesting. So the framerate is not exactly stable, but stays within a small margin of the target.
First parties are a different thing; they will always push the machine the hardest. Third parties, however, will always just follow the path of least resistance.

The framerate, while locked, is never actually stable. This is most obvious when looking at games with an uncapped framerate. If a dev is aiming for a locked 60fps, they design an engine that can sustain something like 65-70fps on average, so that even in taxing scenes it doesn't drop below 60fps. And with frame pacing, their engine would basically just spit out a frame every 16.7ms.

In the event that the game drops below that framerate, rather than just letting it do whatever it has to do, which in those cases results in judder, the devs design the engine so that it scales down the resolution to get back to the target locked framerate.
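The scheme described above can be sketched in a few lines of Python; the thresholds, step sizes, and function names here are purely illustrative, not taken from any real engine:

```python
TARGET_FRAME_MS = 1000 / 60  # ~16.7 ms budget for a locked 60fps

def adjust_resolution_scale(scale: float, last_frame_ms: float) -> float:
    """Drop render resolution when over budget; creep back up when there is headroom."""
    if last_frame_ms > TARGET_FRAME_MS:        # missed the frame budget
        return max(0.7, scale - 0.05)          # shed pixels quickly (floor at 70%)
    if last_frame_ms < TARGET_FRAME_MS * 0.9:  # comfortably under budget
        return min(1.0, scale + 0.02)          # recover toward native slowly
    return scale

# Simulate a taxing scene followed by a calm one
scale = 1.0
for frame_ms in [15.2, 18.2, 18.9, 16.1, 13.5, 13.2]:
    scale = adjust_resolution_scale(scale, frame_ms)
print(round(scale, 2))  # ends below 1.0: resolution dropped, then partially recovered
```

Dropping faster than recovering is a common design choice: a missed frame is immediately visible as judder, while a slightly soft image for a few frames is not.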


The denial in this thread is something I expected as soon as the PS5 specs were announced.

In the end, everyone will have a great experience next gen, and that is great for everyone.
There isn't denial here.

No one is saying the PS5 would outperform the XSX, or even that the PS5 would match it. We are saying that the difference is nowhere near as big as it has ever been between consoles. That's not denial... that's a fact.

And furthermore, there is debate about how that difference would show itself in games. This is not going to be a 2160p vs. 1620p type of difference like the one between the XB1X and the PS4 Pro, as many here would like people to believe. It's going to be significantly smaller than that.

There is no doubt in my mind they tested @ 1.8GHz. They would have tested a variety of configurations; the 2GHz clock we surmised from the GitHub leaks was a part of that testing. And yes, they went above it, because they had no choice. Imagine where the conversation would currently be if Sony had stayed at 2GHz/9.2TF. Again, 2.2GHz is not a move you make for your health. You make it because you're boxed in and need to eke out every shred of performance you can get. It's like overclocking a 1660 Ti to its max or even beyond and then trying to say that it is practically identical to an RTX 2060 because of its newly adjusted core clock/teraflops. No, it is not.
So basically, you saw evidence for 1GHz, 1.8GHz, and 2GHz, and because you didn't see any for 2.2GHz, it means this was some last-minute thing?

One would think that all the previous testing clearly indicated that with each revision they were able to push things just a little bit more. Has it occurred to you that the reason nothing leaked about tests at 2.2GHz clocks has a lot to do with the fact that, after the 2GHz leaks, things may have gotten clamped down?

Do you realize how you actually sound right now?
 
Apr 4, 2018
462
Yes and no. Yes, it can be used for "effects", but no, that is most likely not going to be the case.

The power difference is simply not that much. It's the smallest power difference there has ever been between a PlayStation and an Xbox. Using it for "extra effects" would mean that devs are trying to keep the res and framerate identical on both platforms; in doing so, they'd have ~2TF worth of leeway on the XSX to invest in effects. Why do that when you can just drop the res of the PS5 down by around 15-20% and have complete feature/effect parity at the cost of running at a slightly lower res?

Especially when the res drop would be inconsequential to the overall presentation of the game?


No hand-waving here. Everything you have mentioned still amounts to a sub-20% difference in overall performance. Call it shaders, RT... whatever. It's really a simple thing: whatever the XSX can do at 12TF, the PS5 can do with 10TF, just at a lower res than the XSX is doing it at.

I don't know what past you are referring to, but I am using the last 7 years as a reference. Look at the PS4/XB1 and the PS4 Pro/XB1X. Both of those pairs of machines had significantly bigger differences between them than these next-gen machines do. And where and how the extra power was applied was predictable and consistent.

Now you are looking at two platforms with even less of a difference between them than any of those aforementioned pairs, and you somehow expect devs to suddenly break precedent when, now more than ever, they have even less reason to do so?

You don't have to take my word for it (and I know you won't), but just wait and see. Remember I said this, and you are free to call me out if I end up being wrong. I'll say it again, bolded and enlarged, so you can use it against me later.

With regard to multiplatform games, the only difference between the two consoles will be their average sustained resolution, which will be higher on average on the XSX than on the PS5. That will be the only IQ-related difference. There will be asset, feature, and framerate parity.
Hey man, just so you know, I'm not trying to start any fights here, just to enlighten. I have a lot of experience doing GPU performance profiling, and I have experienced the power difference between essentially every GPU on the market. I can tell you straight up that while most casual users probably won't be able to tell the difference (and likely wouldn't have been able to tell the difference between their PS4 Pro and X1X), even the 20% difference between these two cards will be very noticeable in many titles.

To give you an example of what I mean, we can use PC GPU performance profiling. On the PassMark website, a sub-20% difference essentially gives us (and correct me if my math is wrong) the difference between an RTX 2080 and a 2070.


RTX 2080 - 19371
RTX 2070 - 16615

We can look at any number of game benchmarks between these two, and immediately see performance differences (feel free to pick any PC game on the market and try this out).

Take Gears 5 as an example:

At 1080p (fps)

RTX 2080
Avg - 125
Min - 98

RTX 2070
Avg - 98
Min - 76


At 1440p (fps)

RTX 2080
Avg - 89
Min - 72

RTX 2070
Avg - 71
Min - 57

At 4k (fps)

RTX 2080
Avg - 51
Min - 43

RTX 2070
Avg - 40
Min - 33


If you feel the difference shown above won't matter to you, then fair enough, and more power to you. In this coming generation, though, I'm assuming we are going to see developers target all three of these scenarios, sometimes on a regular basis.

As you can see, the difference appears greater as the resolution goes down. For games that target 120fps at 1080p or 1440p (this could be a popular choice when using resolution scaling), the PS5 will need to make up for a significant performance difference by adjusting graphical settings, tweaking resolution, doing whatever developers can do.

Another way you could measure this is by comparing graphics presets between the RTX 2070 and 2080 in games that expose them well (Gears 5, once again, does this incredibly well), to see how developers tend to prioritize visual settings to make the most out of a GPU.
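As a side note, the gaps in those numbers are easy to quantify: the PassMark delta is roughly 17%, while the Gears 5 average-fps deltas run closer to 25-28%, i.e. the in-game gap is larger than the synthetic one. A quick check:

```python
def pct_gain(a: float, b: float) -> float:
    """Percentage advantage of score a over score b."""
    return (a / b - 1) * 100

# PassMark scores quoted above
print(round(pct_gain(19371, 16615), 1))  # -> 16.6

# Gears 5 average fps, RTX 2080 vs RTX 2070, from the post above
for res, fps_2080, fps_2070 in [("1080p", 125, 98), ("1440p", 89, 71), ("4K", 51, 40)]:
    print(res, round(pct_gain(fps_2080, fps_2070), 1))  # 27.6 / 25.4 / 27.5
```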
 
Last edited:

cyenz

Member
Oct 27, 2017
588
First parties are a different thing; they will always push the machine the hardest. Third parties, however, will always just follow the path of least resistance.

The framerate, while locked, is never actually stable. This is most obvious when looking at games with an uncapped framerate. If a dev is aiming for a locked 60fps, they design an engine that can sustain something like 65-70fps on average, so that even in taxing scenes it doesn't drop below 60fps. And with frame pacing, their engine would basically just spit out a frame every 16.7ms.

In the event that the game drops below that framerate, rather than just letting it do whatever it has to do, which in those cases results in judder, the devs design the engine so that it scales down the resolution to get back to the target locked framerate.



There isn't denial here.

No one is saying the PS5 would outperform the XSX, or even that the PS5 would match it. We are saying that the difference is nowhere near as big as it has ever been between consoles. That's not denial... that's a fact.

And furthermore, there is debate about how that difference would show itself in games. This is not going to be a 2160p vs. 1620p type of difference like the one between the XB1X and the PS4 Pro, as many here would like people to believe. It's going to be significantly smaller than that.


So basically, you saw evidence for 1GHz, 1.8GHz, and 2GHz, and because you didn't see any for 2.2GHz, it means this was some last-minute thing?

One would think that all the previous testing clearly indicated that with each revision they were able to push things just a little bit more. Has it occurred to you that the reason nothing leaked about tests at 2.2GHz clocks has a lot to do with the fact that, after the 2GHz leaks, things may have gotten clamped down?

Do you realize how you actually sound right now?
Can you please stop quoting every comment I make? You don't need to force your point of view on everyone.
 

etta

Member
Oct 27, 2017
1,970
User Banned (1 Week): Console wars, history of similar behavior
An actual falsehood. You think Sony clocked the PS5 GPU up so high for kicks and giggles?
36 CUs vs 52 CUs is a large, noticeable gap. It's not like the XSX GPU's clock of 1.8GHz is low.
At 1.8GHz, PS5 comes out to 8.4TF
At 2.0GHz, PS5 comes out to 9.2TF
Clearly, Sony felt the need to up the clocks to raise the teraflops so they would compare more favorably, because both of the previous configurations are a major blow to perception next to the XSX.
No amount of denial from you lot will change this. Sony had no choice but to be aggressive with clocks, because the machine is comparatively lacking in grunt.
And there's still an entire PS4's worth of difference between the two machines in the GPU department, but of course it's more convenient to say that it's only a 15% difference.
 

BradleyLove

Member
Oct 29, 2017
587
There is no doubt in my mind they tested @ 1.8GHz. They would have tested a variety of configurations; the 2GHz clock we surmised from the GitHub leaks was a part of that testing. And yes, they went above it, because they had no choice. Imagine where the conversation would currently be if Sony had stayed at 2GHz/9.2TF. Again, 2.2GHz is not a move you make for your health. You make it because you're boxed in and need to eke out every shred of performance you can get. It's like overclocking a 1660 Ti to its max or even beyond and then trying to say that it is practically identical to an RTX 2060 because of its newly adjusted core clock/teraflops. No, it is not.
Do you have any receipts for this? Otherwise it’s tales from your ass.
 

catashtrophe

Member
Oct 27, 2017
385
UK
Both machines will have GDDR6 RAM, but I'm assuming GDDR5 is cheaper, so would it not have been better to have that and more of it, e.g. 20 or 24GB of RAM?
 

etta

Member
Oct 27, 2017
1,970
To see you in a PS5 thread downplaying PS is as boring as it is predictable.
You're grasping at straws.
I find the PS5 more exciting and boundary-pushing; the only negative I see is that Sony hasn't yet proved competent at cooling their machines, and those high clocks will test that competence to the limit. I don't want another jet engine like both the base PS4 and the Pro were.
 

nelsonroyale

Member
Oct 28, 2017
4,069
Hey man, just so you know, I'm not trying to start any fights here, just to enlighten. I have a lot of experience doing GPU performance profiling, and I have experienced the power difference between essentially every GPU on the market. I can tell you straight up that while most casual users probably won't be able to tell the difference (and likely wouldn't have been able to tell the difference between their PS4 Pro and X1X), even the 20% difference between these two cards will be very noticeable in many titles.

To give you an example of what I mean, we can use PC GPU performance profiling. On the PassMark website, a sub-20% difference essentially gives us (and correct me if my math is wrong) the difference between an RTX 2080 and a 2070.


RTX 2080 - 19371
RTX 2070 - 16615

We can look at any number of game benchmarks between these two, and immediately see performance differences (feel free to pick any PC game on the market and try this out).

Take Gears 5 as an example:

At 1080p (fps)

RTX 2080
Avg - 125
Min - 98

RTX 2070
Avg - 98
Min - 76


At 1440p (fps)

RTX 2080
Avg - 89
Min - 72

RTX 2070
Avg - 71
Min - 57

At 4k (fps)

RTX 2080
Avg - 51
Min - 43

RTX 2070
Avg - 40
Min - 33


If you feel the difference shown above won't matter to you, then fair enough, and more power to you. In this coming generation, though, I'm assuming we are going to see developers target all three of these scenarios, sometimes on a regular basis.

As you can see, the difference appears greater as the resolution goes down. For games that target 120fps at 1080p or 1440p (this could be a popular choice when using resolution scaling), the PS5 will need to make up for a significant performance difference by adjusting graphical settings, tweaking resolution, doing whatever developers can do.

Another way you could measure this is by comparing graphics presets between the RTX 2070 and 2080 in games that expose them well (Gears 5, once again, does this incredibly well), to see how developers tend to prioritize visual settings to make the most out of a GPU.
Can you give the equivalent for resolutions between 1440p and 4K? These systems are probably not going to go below the latter. Again, you mentioned that the difference obviously shrinks with resolution, and I would guess most people were talking about the difference at 4K or thereabouts. Comparing at 1080p is... not that relevant for these consoles. There will be a downsampling option, but it is very doubtful there will be a 1080p native res in any of the big games.
 
Oct 25, 2017
3,557
You're grasping at straws.
I find the PS5 more exciting and boundary-pushing; the only negative I see is that Sony hasn't yet proved competent at cooling their machines, and those high clocks will test that competence to the limit. I don't want another jet engine like both the base PS4 and the Pro were.
Cerny called out their failure to cool the PS4 quietly and said people would be happy with the PS5's cooling.
 
Apr 4, 2018
462
Can you give the equivalent for res between 1440p and 4k...because these systems are not going to probably go below the latter. Again you mentioned that the difference obviously reduces with res, which I would guess most people were talking about the difference at 4k or thereabouts. Comparing 1080p is...not that relevant for these consoles. There will be a downsampling option, but very doubtful there will be 1080p native res in any of the big games.
My post has both 1440p and 4K performance numbers; I assume that is what you are asking for?

Also, since people generally have either 1080p or 4K TVs, I think it's possible, especially on PS5, that developers targeting 120fps could reduce resolution that far (hopefully with some sort of temporal reconstruction to bring it back up towards 4K or 1440p).

These are just raw performance numbers, though; I'm certain we will see developers use every trick in the book to get the most out of both GPUs.

Edit:
- In fact, the RX 5700 vs. 5700 XT comparison is likely a better indicator of how the PS5's GPU will scale when overclocked.
 
Last edited:

Ricky Ricardo

Member
Oct 26, 2017
959
Atlanta, GA
So basically, you saw evidence for 1GHz, 1.8GHz, and 2GHz, and because you didn't see any for 2.2GHz, it means this was some last-minute thing?

One would think that all the previous testing clearly indicated that with each revision they were able to push things just a little bit more. Has it occurred to you that the reason nothing leaked about tests at 2.2GHz clocks has a lot to do with the fact that, after the 2GHz leaks, things may have gotten clamped down?

Do you realize how you actually sound right now?
I never said it was a last-minute thing; re-read the comment. I said it was a necessity because of where 2GHz/9.2TF would have left them in the next-gen conversation. It's about perception; Marketing 101 teaches this.

Maybe you haven't spent any time overclocking in the PC realm, but 2.2GHz is obscene for GPU clocks. Heat is the enemy of component life, and the higher your clocks, the more heat you produce. I do not believe Cerny opted for 2.2GHz from the get-go. In the PC space, you're hard-pressed to overclock that high on discrete GPUs, let alone consoles. We'll see how cooling works out for the PS5, but I have my doubts.