
Straffaren666

Member
Mar 13, 2018
84
but gears 5 is outperforming the 2080 ti on series x. it's running at native 4k 60 fps locked on ultra settings with several additional graphics effects that aren't even in the pc version.



a 10.3 tflops ps5 gpu should be around 2080 level at the very least. maybe even 2080 super level if we go by gears 5, and tbh we should, because that's our only form of comparison so far with the exception of the minecraft ray tracing demo, and we don't know what its exact framerate was.

it's crazy to me that dictator continues to make these absolute statements as if he's seen how rdna 2.0 cards perform. as if he's seen ps5 games in action. the only point of comparison we have shows an amazing result, matching if not exceeding the 2080 ti, which is almost 35% more powerful than the rtx 2080. an 18% gap in tflops should not get us all the way down to 2070 levels.

I might remember incorrectly, but I believe Richard said in the DF video that the performance of the XSX version of Gears 5 was comparable to a 2080 on ultra settings. I might have misheard him, though, and he actually said a 2080 Ti?

Yes, IMO, a 5700XT scaled up to 10.3TF by increasing the clock frequency should be close to 2080 level of performance, so I also find it a little bit odd that he's so adamant that the PS5 won't reach that performance level. Of course there are a lot of uncertainties, so the PS5 might not reach a 2080 in the end, but based on available information I think there are far more reasons to believe it will than it won't.
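If anyone wants to sanity-check the "scaled up to 10.3TF" framing, the napkin math is below. It's a rough sketch: the ALU counts and clocks are the public figures, and "performance scales with TFLOPS" is an assumption, not a guarantee.

Code:
# Back-of-envelope TFLOPS math, assuming performance scales with ALU count x clock.
# 2560 ALUs is the 40-CU RX 5700 XT; the PS5 GPU is actually 36 CUs (2304 ALUs) at up to 2230MHz.
def tflops(alus, clock_mhz):
    return alus * 2 * clock_mhz * 1e6 / 1e12   # 2 FLOPs per ALU per cycle (FMA)

print(tflops(2560, 1755))            # ~8.99 TF: reference 5700 XT at its rated game clock
print(tflops(2304, 2230))            # ~10.28 TF: the PS5's quoted figure
print(10.3e12 / (2560 * 2) / 1e6)    # ~2012 MHz: clock a 40-CU card needs to hit 10.3 TF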
 

Vimto

Member
Oct 29, 2017
3,714
I cannot believe I have to come out of hiding to expressly yell how damn wrong you are. The Gears 5 bench ran at Ultra settings, not at higher than Ultra settings. Watch the damn video or read the damn article where we say that. There, at exactly the same settings as Ultra, it ran like a 2080. The exact same settings as Ultra. How often do I have to correct purposefully misconstrued information?

Read
The
Damn
Articles
Watch
The
Damn
Videos
lmao
 
Oct 27, 2017
7,670
i think it will be utilized because even the xbox series x hardware has dedicated hardware to get those unique assets into memory at a lightning fast 6 GB/s, compared to the 8-9 GB/s speeds of the ps5 hardware.



it will be the norm on both consoles.
6 GB/s is XSX's max burst rate. PS5's is 22 GB/s.

Think about that difference for a second.

Typical compressed throughput is 4.8 GB/s for XSX and 8 to 9 GB/s for PS5.
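Those figures are just the raw read speed multiplied by whatever ratio the hardware decompressor achieves on the data. A quick sketch; the compression ratios here are illustrative assumptions picked to reproduce the numbers above, not measured values:

Code:
def effective_gb_per_s(raw_gb_per_s, compression_ratio):
    # effective streaming rate = raw SSD read speed x decompression ratio
    return raw_gb_per_s * compression_ratio

print(effective_gb_per_s(2.4, 2.0))   # XSX typical: ~4.8 GB/s
print(effective_gb_per_s(2.4, 2.5))   # XSX quoted peak: ~6 GB/s
print(effective_gb_per_s(5.5, 1.6))   # PS5 typical: ~8.8 GB/s
print(effective_gb_per_s(5.5, 4.0))   # PS5 quoted peak: ~22 GB/s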
 

Straffaren666

Member
Mar 13, 2018
84
Not really.
www.techpowerup.com: XFX Radeon RX 5700 XT THICC III Ultra Review

15% difference at 4k, and that's with a model that averages around 2GHz of clock speed.

And stop saying clockspeeds are uncertain.
I kept providing you links with in-game clockspeed tests, with averages, peaks and such.

Where do they say the average game clock is 2GHz during the benchmark? From the performance delta there is a 3-4% performance improvement between the stock 5700XT and the overclocked 5700XT. That in itself implies the actual average game clock of the overclocked card is only 3-4% higher than the stock 5700XT's. The average game clock of the 5700XTs in the benchmark I'm using to determine the relative performance between the 5700XT and the 2080 is of course unknown. It depends on the distribution of the various 5700XTs used in the benchmark and how taxing the benchmark actually is on the 5700XT. It's reasonable to believe it's between 1755-1795MHz, since that is probably the game clock of the majority of the 5700XT cards sold.
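To spell the inference out, here's a minimal sketch. It assumes FPS scales roughly linearly with clock, which if anything flatters the overclocked card:

Code:
stock_game_clock_mhz = 1755            # AMD's rated game clock for the reference 5700XT
for fps_gain in (1.03, 1.04):          # the 3-4% delta between the stock and OC models
    implied_avg_clock = stock_game_clock_mhz * fps_gain
    print(round(implied_avg_clock))    # ~1808 / ~1825 MHz effective average, not ~2000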
 
Oct 27, 2017
7,670
It's a much higher number and I'm sure first party will take great advantage of it, but won't third party just target the common denominator?
Potentially. But we may still see differences if the engine used is able to leverage scaling. Regardless, that's why I'm happy both machines are vastly improving over last gen's paltry I/O throughput.
 
Feb 23, 2019
1,426
It's a much higher number and I'm sure first party will take great advantage of it, but won't third party just target the common denominator?

I think third party will take advantage of it

Assets are built/designed at much higher quality than what is reflected in games. With the PS5 memory streaming system being so much better, I expect that a decent portion of third party games will have higher quality textures/assets for PS5 versions.

XSX will have the 17% advantage in resolution, so you'll have a situation where it's rendering higher resolution visuals with lower quality assets. At the end of the day, what are you going to notice more? It's going to be the asset quality over the minor difference in resolution.
 

Iwao

Member
Oct 25, 2017
11,781
XSX will have the 17% advantage in resolution, so you'll have a situation where it's rendering higher resolution visuals with lower quality assets. At the end of the day, what are you going to notice more? It's going to be the asset quality over the minor difference in resolution.
We can only wait and see the comparisons, but this sounds almost precisely like what I expect.
 

GhostTrick

Member
Oct 25, 2017
11,305
Where do they say the average game clock is 2GHz during the benchmark? From the performance delta there is a 3-4% performance improvement between the stock 5700XT and the overclocked 5700XT. That in itself implies the actual average game clock of the overclocked card is only 3-4% higher than the stock 5700XT's. The average game clock of the 5700XTs in the benchmark I'm using to determine the relative performance between the 5700XT and the 2080 is of course unknown. It depends on the distribution of the various 5700XTs used in the benchmark and how taxing the benchmark actually is on the 5700XT. It's reasonable to believe it's between 1755-1795MHz, since that is probably the game clock of the majority of the 5700XT cards sold.

www.techpowerup.com: XFX Radeon RX 5700 XT THICC III Ultra Review

In the clockspeed section. And no, it's not reasonable to believe that since we have data about that too:
www.techpowerup.com: AMD Radeon RX 5700 XT Review

The average is around 1880MHz.

I don't know why you keep relying on that gpu userbenchmark site.
 

Aladan

Member
Dec 23, 2019
496
Of course there will be third-party developers that will take every opportunity to optimize their engine and game as much as possible for the hardware, whether for graphics, sound or streaming.
 

Hermii

Member
Oct 27, 2017
4,685
I think third party will take advantage of it

Assets are built/designed at much higher quality than what is reflected in games. With the PS5 memory streaming system being so much better, I expect that a decent portion of third party games will have higher quality textures/assets for PS5 versions.

XSX will have the 17% advantage in resolution, so you'll have a situation where it's rendering higher resolution visuals with lower quality assets. At the end of the day, what are you going to notice more? It's going to be the asset quality over the minor difference in resolution.
That 17% doesn't take into account fixed vs variable clock speeds, but I see your point.

Anyway, I think diminishing returns are going to be strong, and both machines will perform amazingly.
 
Last edited:

PeterLegend

Alt Account
Banned
Oct 29, 2017
180
I'm just excited that, for the first time in a long time, it seems game developers don't have gimped hardware, whether it's a Jaguar CPU or the Nvidia RSX graphics card. The RAM being 16 GB concerns me slightly, but that should be eased by the fast SSDs in some way...
 

PeterLegend

Alt Account
Banned
Oct 29, 2017
180
Alex, I'm a big fan of your work and enjoy your videos immensely. Don't be bothered by all these out of bounds comments.
 

Straffaren666

Member
Mar 13, 2018
84
www.techpowerup.com: XFX Radeon RX 5700 XT THICC III Ultra Review

In the clockspeed section. And no, it's not reasonable to believe that since we have data about that too:
www.techpowerup.com: AMD Radeon RX 5700 XT Review

The average is around 1880MHz.

I don't know why you keep relying on that gpu userbenchmark site.

That average is for some synthetic game load test, not the actual average clock frequency during the benchmarks.

I used the gpu userbenchmark since it's based on a very large data-set and shouldn't be as susceptible to the silicon lottery as a single card.
 

ShapeGSX

Member
Nov 13, 2017
5,210
I think third party will take advantage of it

Assets are built/designed at much higher quality than what is reflected in games. With the PS5 memory streaming system being so much better, I expect that a decent portion of third party games will have higher quality textures/assets for PS5 versions.

XSX will have the 17% advantage in resolution, so you'll have a situation where it's rendering higher resolution visuals with lower quality assets. At the end of the day, what are you going to notice more? It's going to be the asset quality over the minor difference in resolution.

In order for games to have higher quality textures and other assets on PS5, games would have to saturate the SSD bus at (conservatively) 2.4GB/s. That's roughly 25 to 50 times faster than current gen games stream assets, and current gen games look fantastic. Which means that games would have 20 to 50 times larger assets (on an 825GB SSD). And those assets would still have to fit into RAM that is only slightly larger than current gen's. I think you guys have unreasonable expectations for next gen games. Compute and RAM are still going to be the limiters on games, not the SSD.
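To put rough numbers on that, here's a quick sketch. The usable-RAM figure and the current-gen HDD rate are my own assumptions for illustration, not official specs:

Code:
usable_ram_gb = 13.5                      # assuming ~13.5 of the 16 GB is available to games

def seconds_to_fill_ram(stream_gb_per_s):
    return usable_ram_gb / stream_gb_per_s

print(seconds_to_fill_ram(0.1))   # ~135 s at ~100 MB/s, an optimistic current-gen HDD rate
print(seconds_to_fill_ram(2.4))   # ~5.6 s at XSX's raw 2.4 GB/s
print(seconds_to_fill_ram(5.5))   # ~2.5 s at PS5's raw 5.5 GB/s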
 

AegonSnake

Banned
Oct 25, 2017
9,566
I cannot believe I have to come out of hiding to expressly yell how damn wrong you are. The Gears 5 bench ran at Ultra settings, not at higher than Ultra settings. Watch the damn video or read the damn article where we say that. There, at exactly the same settings as Ultra, it ran like a 2080. The exact same settings as Ultra. How often do I have to correct purposefully misconstrued information?

Read
The
Damn
Articles
Watch
The
Damn
Videos
i just went back and watched that video. you are right, it seems there are two demos. probably even three. it's hard to tell. one that runs the game at xbox one x settings, one that runs with all the new visual features below, and a last benchmark that runs at ultra but without the features listed below, which they compared to a pc with 64 gb of ram, a 16 core cpu and an rtx 2080 running the game at ultra.

but that is completely different from what's being reported by pretty much all the outlets.
According to the post, Rayner and his team showed Microsoft Gears 5 running on the Xbox Series X using the same Ultra settings usually reserved for the PC version. This version of the game also includes higher texture resolutions, higher-resolution volumetric fog, and 50% higher particle count than the PC Ultra specs. The team from The Coalition also showed an improved version of the game's opening cutscene, which ran at 60 frames per second in 4K — twice the frame rate of the Xbox One X version.

so yeah, there is definitely some misinformation going on here. reading the quotes above, it's crazy to me how confusing this all is. you guys spent most of the time in that video talking about the new videos and only at the very end discussed the three demos/benchmarks. i did miss the spot where you guys put the 2080 performance bit up on the screen, so i guess that's my fault, but reading the quote above it is not that hard to assume that MS is promising ultra level performance with ray tracing, high particle counts, volumetric fog and higher res textures, all at native 4k 60 fps.

after all, why would they add all that stuff in and ship the game at 45 fps like the rtx 2080? maybe even less, because you guys said that the series x benchmark did not have those new features and offered the same rasterization performance as the rtx 2080. a simple "oh btw, it ran at 45 fps" would've cleared up a lot of this confusion. adding those new features will definitely drop the framerate below 45 fps as well. and yet MS made it seem like this new gears demo was running at native 4k 60 fps with higher than pc ultra settings.

 

Hermii

Member
Oct 27, 2017
4,685
In order for games to have higher quality textures and other assets on PS5, games would have to saturate the SSD bus at (conservatively) 2.4GB/s. That's roughly 25 to 50 times faster than current gen games stream assets, and current gen games look fantastic. Which means that games would have 20 to 50 times larger assets (on an 825GB SSD). And those assets would still have to fit into RAM that is only slightly larger than current gen's. I think you guys have unreasonable expectations for next gen games. Compute and RAM are still going to be the limiters on games, not the SSD.
Yeah, but the SSD will allow dramatically faster loading in and out of RAM, so that you don't have to keep things in memory that you don't need. At least that's how I understand it.
 

AegonSnake

Banned
Oct 25, 2017
9,566
I think third party will take advantage of it

Assets are built/designed at much higher quality than what is reflected in games. With the PS5 memory streaming system being so much better, I expect that a decent portion of third party games will have higher quality textures/assets for PS5 versions.

XSX will have the 17% advantage in resolution, so you'll have a situation where it's rendering higher resolution visuals with lower quality assets. At the end of the day, what are you going to notice more? It's going to be the asset quality over the minor difference in resolution.
based on what i have heard and read, it seems the ps5 will have better looking character models, more detailed areas and even more unique assets. i have always wanted cutscene quality character models in game, and if this magical ssd can give me that, i will be a happy man.

for example.



in-game model LOD gets extremely low as you move further away from the camera.



i always thought that LOD drops like that were due to the gpu not being able to handle higher quality models during gameplay, but from what devs have been saying on twitter, the ps5 should be able to bypass this limitation thanks to its ssd.
 

GhostTrick

Member
Oct 25, 2017
11,305
based on what i have heard and read, it seems the ps5 will have better looking character models, more detailed areas and even more unique assets. i have always wanted cutscene quality character models in game, and if this magical ssd can give me that, i will be a happy man.

for example.



in-game model LOD gets extremely low as you move further away from the camera.



i always thought that LOD drops like that were due to the gpu not being able to handle higher quality models during gameplay, but from what devs have been saying on twitter, the ps5 should be able to bypass this limitation thanks to its ssd.


They are. The reason you get lower LoD is that maintaining that detail and visual fidelity has a cost.
 

McScroggz

The Fallen
Jan 11, 2018
5,971
Errr...... great post, but I think you got what I was saying totally wrong.

I am not speaking for, but speaking against, those that say things like these consoles can't compare to PCs, or that they are low-range, mid-range... etc.

The example I gave was based on using the raw TF numbers of the consoles and how the PC audience interprets them. By the time these consoles are released, there will be new GPUs from AMD and Nvidia. Based on their product stacks, I expect these consoles will fall into the performance bracket of what the PC guys would call mid-range. I am saying that even if they do, it doesn't matter.

Trust me, you don't want to get me started on how hypocritical I think most of the PC gamers are, because if you were to just read what they say you would think every single PC is running a 2080 Ti with a 16-core CPU at 4K 144fps or some shit like that.

What exactly constitutes a high-end PC? Is it the price to build reaching a certain threshold? Like, $2000 and above?
 

Alexandros

Member
Oct 26, 2017
17,800
This was an interesting experiment for me. Now that we have the specs, the suggestion that these consoles could be around a "mid-range" PC stands out as such an absurd idea I had to take a look, so I reviewed the steam hardware survey.

GPU:
At the moment the best guess/evidence is that the new consoles' GPUs will be around RTX 2080 level. I personally have no doubt they'll soon be performing better than that, just like every console in the past over time performs better than the PC GPUs they initially seemed most similar to. But anyway:

Percent of users with a 2080 or better: 2.04%

That means these consoles are in the top 2 percent. Better than 98% of PCs out there. That's not mid-range. That's not even high-end. That's top-end.

But perhaps you meant the positioning of the GPU in the range of current available parts, not actual usage. Well, there is also exactly one graphics card above the 2080: the 2080 Ti. Also, there's the little detail that the 2080 is $600+. That's most likely more than either of these consoles will cost in their entirety. Again, that's not high-end, that's extremely high-end.

As for what to consider "mid-range", the most popular card is the GTX 1060 at 12.68%. Needless to say this is not even close to the power bracket the 2080 is in.

CPU:
This was pretty interesting to me. 72.51% are 4 cores or less. 93.68% are 6 or less. Only 6.32% are 8 core or more, and not all of those have hyperthreading. There is not as detailed of a breakdown of CPUs as there are for GPUs, meaning even this percentage of 8 cores is including things like the AMD FX-8350, which, well... lol.

These consoles are in the top 6 percent, better than 94% of PCs (just in core count, much higher when you consider architecture).

For clock speeds again it's tough to get detailed info, but 73.33% are 3.29 GHz or lower, 26.67% are 3.3 or higher.

The consoles are somewhere in the middle of the top 26% for clock speed, again with the latest, most efficient architecture.

They are similar to the 3700x (probably a little slower). That's a $300 CPU, with the only higher-end parts being the 3800x (which is similar, just clocked a little higher), the 3900x, and the 3950x.

Those Ryzen 9s are $500+. That's not high-end, that's EXTREMELY high end, enthusiast/professional level. So again, the consoles are basically top-end for a gaming PC.

=============

Now as silly as it seemed to claim next-gen consoles would be "mid-range", this was even better:



A low-end PC is something like the Intel Pentium G5400 or the AMD Ryzen 3 2200G, and... well:

[iGPU benchmark chart]


These aren't even close to last-gen consoles, not by a long shot. You seriously think that in a couple years, low-end integrated graphics will be 2080 level? No integrated graphics will be even close to next-gen console level at the end of that generation, let alone a couple years in.

Anyway, this was fun, thanks for giving me the idea. I knew these consoles were powerful, but even I didn't realize how extraordinarily high-end they really were. This isn't even counting the bespoke customizations for I/O where especially the PS5 will be far beyond anything possible on the PC, and likely for a good long while. Even when PC SSDs get to the base speed of the PS5, that doesn't include all the customizations that remove the bottlenecks and slowdowns PCs will still have. Remember Cerny said the decompressor was worth what, 9 Zen 2 cores? The very highest-end PC CPUs and GPUs will surely be faster, but they will also have to be used for these other functions that the consoles have dedicated hardware for.

I would like to ask two questions based on your post if that's ok.

1) Would your conclusion change if by the end of the year we have new hardware that offers console-like performance at mid-range prices?
2) Do you believe that Steam's hardware survey will show similar results a year from now?
 

Doctor Avatar

Member
Jan 10, 2019
2,593
XSX will have the 17% advantage in resolution, so you'll have a situation where it's rendering higher resolution visuals with lower quality assets. At the end of the day, what are you going to notice more? It's going to be the asset quality over the minor difference in resolution.

Problem is a 17% advantage in resolution isn't going to be enough to actually go up a resolution step.

1800p to 2160p (native 4k) is a 44% increase in number of pixels. 17% more grunt isn't going to come close to covering that.

XSX simply isn't powerful enough to push that bump if PS5 is going all out at 1800p.

If XSX tries to push higher resolutions than PS5 (eg: 4k vs 1800p) then we will have a situation where the XSX games will run worse than the PS5 ones, it will be like the current RE3 state. XSX will have worse frame rates, or possibly even cut effects.

So I very much suspect both will be running the same resolution, PS5 will probably drop more frames and drop down its dynamic resolution more often, maybe be lacking in some minor effects, but the baseline resolution for both will almost certainly be identical because (no matter what the Xbox console warriors will try and tell you) the XSX simply doesn't have the kind of advantage that would be required to power a resolution jump. For that you need a 40-50% power delta, just like PS4 had over One and what X had over Pro. XSX doesn't even have half that over PS5.

This is not going to be a Pro vs X situation. Not even close. Outside of side by side DF comparisons the games will be pretty hard to differentiate.

The GPU differences are a known quantity. We know from previous experience how much benefit a 17% advantage in GPU will get you.

The unknown quantity is how important SSD speed will turn out to be when it comes to performance. PS5 has a pretty significant advantage on paper, over 2x the raw speed of the XSX drive, but how that will translate into performance and game design is yet to be seen.
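For what it's worth, the raw arithmetic behind those two percentages is below. These are the paper TFLOPS figures, and treating required compute as proportional to pixel count is a simplification:

Code:
pixels_1800p = 3200 * 1800               # 5,760,000
pixels_2160p = 3840 * 2160               # 8,294,400
print(pixels_2160p / pixels_1800p)       # 1.44 -> native 4K is ~44% more pixels than 1800p
print(12.15 / 10.28)                     # ~1.18 -> XSX vs PS5 paper TFLOPS, the ~17-18% gap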
 
Last edited:

GhostTrick

Member
Oct 25, 2017
11,305
ex-technical art director at naughty dog is saying something different.








They are saying the same thing:
"So the ability to load in the highest resolution version of any asset just in front of you and drop it immediately as you turn around means that every tree can have 3d bark and moss and ants marching on it just when needed, without blowing up the budget. It's going to be great <3 "
Loading all those assets at their highest fidelity has a cost. Otherwise, there wouldn't be a need, even with an SSD, to drop to a lower quality one.
 

AegonSnake

Banned
Oct 25, 2017
9,566
Problem is a 17% advantage in resolution isn't going to be enough to actually go up a resolution step.

1800p to 2160p (native 4k) is a 44% increase in number of pixels. 17% more grunt isn't going to come close to covering that.

XSX simply isn't powerful enough to push that bump if PS5 is going all out at 1800p.

If XSX tries to push higher resolutions than PS5 (eg: 4k vs 1800p) then we will have a situation where the XSX games will run worse than the PS5 ones, it will be like the current RE3 state. XSX will have worse frame rates, or possibly even cut effects.

So I very much suspect both will be running the same resolution, PS5 will probably drop more frames and drop down its dynamic resolution more often, maybe be lacking in some minor effects, but the baseline resolution for both will almost certainly be identical because (no matter what the Xbox console warriors will try and tell you) the XSX simply doesn't have the kind of advantage that would be required to power a resolution jump. For that you need a 40-50% power delta, just like PS4 had over One and what X had over Pro. XSX doesn't even have half that over PS5.

This is not going to be a Pro vs X situation. Not even close. Outside of side by side DF comparisons the games will be pretty hard to differentiate.

The GPU differences are a known quantity. We know from previous experience how much benefit a 17% advantage in GPU will get you.

The unknown quantity is how important SSD speed will turn out to be when it comes to performance. PS5 has a pretty significant advantage on paper, over 2x the raw speed of the XSX drive, but how that will translate into performance and game design is yet to be seen.
this is why i'm hoping against all hope that phil doesn't mandate or push for native 4k like he did for the x1x. i bought the x1x and was blown away by rdr2 at native 4k but soon realized that i'd rather have 60 fps versions of these games. or at least stable 30 fps versions. i remember playing anthem and wondering why the framerate couldn't have been locked to 30 fps in favor of native 4k or whatever high resolution they were running at.

sony and several other third party studios were able to drop down resolution to 1080p and run 30 fps games at 60 fps, and yet MS continues to push for higher res for some bizarre reason. re3 should not be a native 4k game.

a better use for that 17% advantage would be better ray tracing or smoother framerates. i am pretty sure we will get 30 fps games next gen that will struggle to run at 30 fps. it would be great to get a performance mode at ps5 res or new ray tracing effects that the ps5 is missing.
 

AegonSnake

Banned
Oct 25, 2017
9,566
They are saying the same thing:
"So the ability to load in the highest resolution version of any asset just in front of you and drop it immediately as you turn around means that every tree can have 3d bark and moss and ants marching on it just when needed, without blowing up the budget. It's going to be great <3 "
Loading all those assets at their highest fidelity has a cost. Otherwise, there wouldn't be a need, even with an SSD, to drop to a lower quality one.
well, the cost he's talking about was limited by ram.

But we can't store all of the super detailed high res versions for all objs in memory at once

he's saying the ssd allows them to bypass this limitation.
 

2Blackcats

Member
Oct 26, 2017
16,053
this is why i'm hoping against all hope that phil doesn't mandate or push for native 4k like he did for the x1x. i bought the x1x and was blown away by rdr2 at native 4k but soon realized that i'd rather have 60 fps versions of these games. or at least stable 30 fps versions. i remember playing anthem and wondering why the framerate couldn't have been locked to 30 fps in favor of native 4k or whatever high resolution they were running at.

sony and several other third party studios were able to drop down resolution to 1080p and run 30 fps games at 60 fps, and yet MS continues to push for higher res for some bizarre reason. re3 should not be a native 4k game.

a better use for that 17% advantage would be better ray tracing or smoother framerates. i am pretty sure we will get 30 fps games next gen that will struggle to run at 30 fps. it would be great to get a performance mode at ps5 res or new ray tracing effects that the ps5 is missing.

I don't think you have to worry. The reason native 4K is so prevalent is that the game still has to run on the regular One. Thanks to the massive gap between the two, it's the most sensible way to deploy the resources.

Once the cross-gen period is over and we're past the initial "marketing, blah, blah, true 4K" phase, in a few years we'll see devs use the specs in a smarter way.
 

Doctor Avatar

Member
Jan 10, 2019
2,593
this is why i'm hoping against all hope that phil doesn't mandate or push for native 4k like he did for the x1x. i bought the x1x and was blown away by rdr2 at native 4k but soon realized that i'd rather have 60 fps versions of these games. or at least stable 30 fps versions. i remember playing anthem and wondering why the framerate couldn't have been locked to 30 fps in favor of native 4k or whatever high resolution they were running at.

sony and several other third party studios were able to drop down resolution to 1080p and run 30 fps games at 60 fps, and yet MS continues to push for higher res for some bizarre reason. re3 should not be a native 4k game.

a better use for that 17% advantage would be better ray tracing or smoother framerates. i am pretty sure we will get 30 fps games next gen that will struggle to run at 30 fps. it would be great to get a performance mode at ps5 res or new ray tracing effects that the ps5 is missing.

Those things are much harder to push when it comes to marketing. Which is all native 4K is. It doesn't offer a substantially improved picture compared to reconstructed when you take into account the cost required for it. It's one of the worst bang-for-buck uses of the GPU there is.

Problem being, the marketing around X was so 4K focused they can't backtrack now. The other thing to consider is that if they are supporting Xbox One and Xbox One X, increases in resolution might be the most developer-efficient way to both keep experiences the same on all three platforms and still use some of the power of the XSX. Having higher resolution is the "cheapest" way to use performance from a developer standpoint, even if it is the worst way when it comes to actually making games look better. MS also needs to support PCs with HDDs and integrated graphics for their games as well.

Xbox One/low spec PC - 900p30
Xbox One X/mid spec PC - 4K30
Xbox Series X/high spec PC - 4K60

This is likely what we will get if they're making the same game to run across all three platforms.
 
Last edited:
Feb 23, 2019
1,426
Problem is a 17% advantage in resolution isn't going to be enough to actually go up a resolution step.

1800p to 2160p (native 4k) is a 44% increase in number of pixels. 17% more grunt isn't going to come close to covering that.

XSX simply isn't powerful enough to push that bump if PS5 is going all out at 1800p.

If XSX tries to push higher resolutions than PS5 (eg: 4k vs 1800p) then we will have a situation where the XSX games will run worse than the PS5 ones, it will be like the current RE3 state. XSX will have worse frame rates, or possibly even cut effects.

So I very much suspect both will be running the same resolution, PS5 will probably drop more frames and drop down its dynamic resolution more often, maybe be lacking in some minor effects, but the baseline resolution for both will almost certainly be identical because (no matter what the Xbox console warriors will try and tell you) the XSX simply doesn't have the kind of advantage that would be required to power a resolution jump. For that you need a 40-50% power delta, just like PS4 had over One and what X had over Pro. XSX doesn't even have half that over PS5.

This is not going to be a Pro vs X situation. Not even close. Outside of side by side DF comparisons the games will be pretty hard to differentiate.

The GPU differences are a known quantity. We know from previous experience how much benefit a 17% advantage in GPU will get you.

The unknown quantity is how important SSD speed will turn out to be when it comes to performance. PS5 has a pretty significant advantage on paper, over 2x the raw speed of the XSX drive, but how that will translate into performance and game design is yet to be seen.

Yes I totally agree with this

Third party devs will target 4k for both (or the same base resolution for both), and PS5 will have a lower effective resolution due to dynamic scaling or other techniques to account for the 17% gap

At the end of the day it really is a trivial difference, and most people are simply focusing on it because it's been a bigger difference in previous gens.

It's why everyone is rightfully discussing the SSD and IO difference because it is truly the only meaningful one at 120%
 

Hey Please

Avenger
Oct 31, 2017
22,824
Not America
They are saying the same thing:
"So the ability to load in the highest resolution version of any asset just in front of you and drop it immediately as you turn around means that every tree can have 3d bark and moss and ants marching on it just when needed, without blowing up the budget. It's going to be great <3 "
Loading all those assets at their highest fidelity has a cost. Otherwise, there wouldn't be a need, even with an SSD, to drop to a lower quality one.

One thing to note, iirc, ND used the same model for in-engine cinematics as well as real-time gameplay in Uncharted, so geometric detail would stay the same but graphical effects (such as better SSR, additional lighting seen during cinematics, higher res shadow maps, etc.) would be missing. Feels like TLoU2 may do the same.

he's saying the ssd allows them to bypass this limitation.

The LoD system will still exist. The fall-off for objects depending on their distance from the camera ought to see a notable shift, however.
 

Doctor Avatar

Member
Jan 10, 2019
2,593
At the end of the day it really is a trivial difference, and most people are simply focusing on it because it's been a bigger difference in previous gens.

It's why everyone is rightfully discussing the SSD and IO difference because it is truly the only meaningful one at 120%

The SSD is a wild card from our perspective. It's very hard to gauge how much of a graphical difference it will make when it comes to LOD and asset quality.

It may not be significant, and simply allow PS5 to do stuff like faster traversal and scene switching than XSX. But on the other hand, if it can really be used directly to improve graphics by allowing higher quality assets, then the PS5 games may well look better than the XSX ones, since it will be able to stream in high quality assets much faster.

Hard to know, will be interesting to see. Sony clearly gambled a bit of GPU power on the importance of the SSD speed. We shall see if it pays off.
 

Straffaren666

Member
Mar 13, 2018
84
Yeah, so during actual gaming.

I doubt that. The game clock varies between different games. If it was recorded for all games it would look different.

Since I don't have a life ;-), I took the performance data from Anandtech's review (https://www.anandtech.com/show/14618/the-amd-radeon-rx-5700-xt-rx-5700-review/12), since they write out the measured game clock and recomputed the benchmark FPS if the 5700XT was scaled up to 10.3TF. Then it looks like this,

Game                 Game clock   FPS: 5700XT / scaled to 10.3TF / 2070 Super   Ratio (scaled vs 2070 Super)
Tomb Raider          1780MHz      39.8 / 44.98 / 42.6                           1.0559
F1 2019              1800MHz      54.8 / 61.25 / 59.9                           1.0225
Assassin's Creed     1900MHz      38.3 / 40.55 / 46.2                           0.8777
Metro Exodus         1780MHz      34.7 / 39.2 / 35.0                            1.12
Strange Brigade      1780MHz      69.6 / 78.66 / 75.0                           1.0488
Total War: TK        1830MHz      26.2 / 28.8 / 30.0                            0.96
The Division 2       1760MHz      34.7 / 39.66 / 40.2                           0.9866
Grand Theft Auto V   1910MHz      41.0 / 43.1 / 47.5                            0.9073
Forza Horizon 4      1870MHz      60.1 / 64.65 / 58.0                           1.1147
avg: 1.0104

Unfortunately, the review doesn't include the 2080, so I used the 2070 super instead. For the 9 benchmarks a 10.3TF 5700XT would be about 1.0104 times faster than a 2070 super.

Then I used the relative performance from www.techpowerup.com to compare the 2070 super with the 2080. It looks like,

RTX 2070 Super: 114%
RTX 2080: 123%

Hence a 2080 would have about 1.23 / (1.14 * 1.0104) = 1.0678 higher performance than a 10.3TF 5700XT and a 10.3TF 5700XT would have about 1.14 * 1.0104 = 1.152 higher performance than a 5700XT.

We still don't take the RDNA 2 architectural improvements into account, nor the BW deficit caused by the CPU/SSD/Audio. I still believe there are good reasons to expect the PS5 GPU to perform on a comparable level to the 2080 and about ~15% above the 5700XT.
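If anyone wants to reproduce the scaling, this is all it is. A sketch under the same assumptions as above (pure clock/TFLOPS scaling on a 40-CU part, memory bandwidth ignored), shown here for three of the nine games:

Code:
ALUS = 2560                                      # 40-CU RX 5700 XT
target_clock_mhz = 10.3e12 / (ALUS * 2) / 1e6    # ~2012 MHz for 10.3 TF

benchmarks = {   # game: (measured game clock in MHz, 5700XT fps, 2070 Super fps)
    "Tomb Raider":     (1780, 39.8, 42.6),
    "Metro Exodus":    (1780, 34.7, 35.0),
    "Forza Horizon 4": (1870, 60.1, 58.0),
}

for game, (clock, fps_5700xt, fps_2070s) in benchmarks.items():
    scaled_fps = fps_5700xt * target_clock_mhz / clock   # scale FPS by the clock ratio
    print(game, round(scaled_fps, 2), round(scaled_fps / fps_2070s, 4))
    # -> 44.98 / 1.0559, 39.22 / 1.1205, 64.65 / 1.1147, matching the table above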
 

AegonSnake

Banned
Oct 25, 2017
9,566
I don't understand what exactly you're trying to get at.
this dev on era offers a more detailed explanation.

What John is saying sounds pretty right to me! I don't want to down play GPU power, but I promise everybody that you will be absolutely blown away by visuals on both consoles. However, the SSDs are the big difference when coming into this gen. We're not talking about "load times" in the classic sense. That's an antiquated way of thinking about data coming from your hard drive. For the last 10+ years we've been streaming worlds on the fly. The problem is that our assets are absolutely huge now, as are our draw distances, and our hard drives can't keep up. It means that as you move through the world we're trying to detect and even predict what assets need loading. Tons of constraints get put into place due to this streaming speed.

An ultra fast drive like the one in PS5 means you could load in the highest level LOD asset for your models way further than you could before and make worlds any way you want without worry of it streaming in fast enough. The PS5 drive is so fast I imagine you could load up entire neighborhoods in a city with all of their maps at super high resolution in a blink of an eye. It's exciting. People don't realize that this will also affect visuals in a big way. If we can stream in bigger worlds and stream in the highest detail texture maps available, it will just look so much better.

I think the Xbox drive is also good! The PS5 drive is just "dream level" architecture though.
the naughty dog artist who posted that image of detail falling off as you went further away from screen is essentially saying the detail level will be identical regardless of the distance from the player/screen. so the line on the graph would look like a straight line.
 

GhostTrick

Member
Oct 25, 2017
11,305
I doubt that. The game clock varies between different games. If it was recorded for all games it would look different.

Since I don't have a life ;-), I took the performance data from Anandtech's review (https://www.anandtech.com/show/14618/the-amd-radeon-rx-5700-xt-rx-5700-review/12), since they write out the measured game clock and recomputed the benchmark FPS if the 5700XT was scaled up to 10.3TF. Then it looks like this,

Game                 Game clock   FPS: 5700XT / scaled to 10.3TF / 2070 Super   Ratio (scaled vs 2070 Super)
Tomb Raider          1780MHz      39.8 / 44.98 / 42.6                           1.0559
F1 2019              1800MHz      54.8 / 61.25 / 59.9                           1.0225
Assassin's Creed     1900MHz      38.3 / 40.55 / 46.2                           0.8777
Metro Exodus         1780MHz      34.7 / 39.2 / 35.0                            1.12
Strange Brigade      1780MHz      69.6 / 78.66 / 75.0                           1.0488
Total War: TK        1830MHz      26.2 / 28.8 / 30.0                            0.96
The Division 2       1760MHz      34.7 / 39.66 / 40.2                           0.9866
Grand Theft Auto V   1910MHz      41.0 / 43.1 / 47.5                            0.9073
Forza Horizon 4      1870MHz      60.1 / 64.65 / 58.0                           1.1147
avg: 1.0104

Unfortunately, the review doesn't include the 2080, so I used the 2070 super instead. For the 9 benchmarks a 10.3TF 5700XT would be about 1.0104 times faster than a 2070 super.

Then I used the relative performance from www.techpowerup.com to compare the 2070 super with the 2080. It looks like,

RTX 2070 Super: 114%
RTX 2080: 123%

Hence a 2080 would have about 1.23 / (1.14 * 1.0104) = 1.0678 higher performance than a 10.3TF 5700XT and a 10.3TF 5700XT would have about 1.14 * 1.0104 = 1.152 higher performance than a 5700XT.

We still don't take the RDNA 2 architectural improvements into account, nor the BW deficit caused by the CPU/SSD/Audio. I still believe there are good reasons to expect the PS5 GPU to perform on a comparable level to the 2080 and about ~15% above the 5700XT.


Yes, and as you noted: all of those clocks are always higher than the game clock. Plus, you wasted your time on that comparison, based on a weird calculation that makes no sense. All you had to do was take a higher clocked RX 5700XT directly:
www.techpowerup.com: ASUS Radeon RX 5700 XT STRIX OC Review

A model which maintains over 2GHz and usually around 2030MHz:
[clock speeds and temperatures chart]


[relative performance at 3840x2160 chart]


An 18% difference. And it's a 10.3 TFLOPS RX 5700 XT.
 

TuMekeNZ

Member
Oct 27, 2017
1,278
Auckland, New Zealand
Man, it's gotten to the point in all of the tech threads where I don't know if I should listen to anybody. It seems like every time somebody types out a comment or posts a video that seems to be a good breakdown, an onslaught of people break down how they are wrong. It's frustrating because I'm excited for the next generation of consoles and I'm interested in understanding what new and cool things these consoles are doing, but there is such a sharp divide on pretty much everything that I can't enjoy the conversation. It sucks.

About the only thing I'm confident in is both consoles are going to be dope. I just wish I knew enough to be able to spot most of the BS because I've probably already gotten misinformed about things without even knowing it.

But who knows. I Just hate how antagonistic it is right now. I don't get it tbh.
Yep, very annoying that it always has to turn into a pissing contest. I'm no tech expert but I do love hearing what these systems are capable of. But I hate how every thread ends up being overtaken by a handful of people going back and forth trying to prove each other wrong over specs we really don't know that much about.
I just hope both console launch this holiday period with some fantastic games for fans of both to enjoy.
 

amstradcpc

Member
Oct 27, 2017
1,768
I doubt that. The game clock varies between different games. If it was recorded for all games it would look different.

Since I don't have a life ;-), I took the performance data from Anandtech's review (https://www.anandtech.com/show/14618/the-amd-radeon-rx-5700-xt-rx-5700-review/12), since they write out the measured game clock and recomputed the benchmark FPS if the 5700XT was scaled up to 10.3TF. Then it looks like this,

Game                 Game clock   FPS: 5700XT / scaled to 10.3TF / 2070 Super   Ratio (scaled vs 2070 Super)
Tomb Raider          1780MHz      39.8 / 44.98 / 42.6                           1.0559
F1 2019              1800MHz      54.8 / 61.25 / 59.9                           1.0225
Assassin's Creed     1900MHz      38.3 / 40.55 / 46.2                           0.8777
Metro Exodus         1780MHz      34.7 / 39.2 / 35.0                            1.12
Strange Brigade      1780MHz      69.6 / 78.66 / 75.0                           1.0488
Total War: TK        1830MHz      26.2 / 28.8 / 30.0                            0.96
The Division 2       1760MHz      34.7 / 39.66 / 40.2                           0.9866
Grand Theft Auto V   1910MHz      41.0 / 43.1 / 47.5                            0.9073
Forza Horizon 4      1870MHz      60.1 / 64.65 / 58.0                           1.1147
avg: 1.0104

Unfortunately, the review doesn't include the 2080, so I used the 2070 super instead. For the 9 benchmarks a 10.3TF 5700XT would be about 1.0104 times faster than a 2070 super.

Then I used the relative performance from www.techpowerup.com to compare the 2070 super with the 2080. It looks like,

RTX 2070 Super: 114%
RTX 2080: 123%

Hence a 2080 would have about 1.23 / (1.14 * 1.0104) = 1.0678 higher performance than a 10.3TF 5700XT and a 10.3TF 5700XT would have about 1.14 * 1.0104 = 1.152 higher performance than a 5700XT.

We still don't take the RDNA 2 architectural improvements into account, nor the BW deficit caused by the CPU/SSD/Audio. I still believe there are good reasons to expect the PS5 GPU to perform on a comparable level to the 2080 and about ~15% above the 5700XT.
You aren't including the 15% IPC performance increase that RDNA2 offers.
 

Straffaren666

Member
Mar 13, 2018
84
Yes, and as you noted: all of those clocks are always higher than the game clock. Plus, you wasted your time on that comparison, based on a weird calculation that makes no sense. All you had to do was take a higher clocked RX 5700XT directly:
www.techpowerup.com: ASUS Radeon RX 5700 XT STRIX OC Review

A model which maintains over 2GHz and usually around 2030MHz:
[clock speeds and temperatures chart]


[relative performance at 3840x2160 chart]


An 18% difference. And it's a 10.3 TFLOPS RX 5700 XT.

It's very obvious that the clock frequency shown isn't the actual average game clock from the benchmark. If it were, then it would vary. Look at the results from Anandtech:

Tomb Raider 1780MHz
F1 2019 1800MHz
Assassin's Creed 1900MHz
Metro Exodus 1780MHz
Strange Brigade 1780MHz
Total War: TK 1830MHz
The Division 2 1760MHz
Grand Theft Auto V 1910MHz
Forza Horizon 4 1870MHz

That's the average. The actual game clock will fluctuate even more, but if we just look at the averages, there is an 8% fluctuation. That's more than 150MHz at 2GHz. For the average! If plotted on a graph, the actual fluctuations shown would probably exceed 200MHz. There is no way that's the actual game clock.
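For the record, the "8%" is just the spread between the lowest and highest of those averages:

Code:
anandtech_game_clocks_mhz = [1780, 1800, 1900, 1780, 1780, 1830, 1760, 1910, 1870]
spread = max(anandtech_game_clocks_mhz) - min(anandtech_game_clocks_mhz)
print(spread)                                    # 150 MHz between the lowest and highest average
print(spread / min(anandtech_game_clocks_mhz))   # ~0.085, i.e. roughly an 8-9% swing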
 

Hey Please

Avenger
Oct 31, 2017
22,824
Not America
this dev on era offers a more detailed explanation.


the naughty dog artist who posted that image of detail falling off as you went further away from screen is essentially saying the detail level will be identical regardless of the distance from the player/screen. so the line on the graph would look like a straight line.

If I am understanding you correctly, then you're stating ("straight line") that LoDs will cease to exist. That assumption is erroneous.

GPUs are limited in their capacity to process geometric and texture detail in a given scene. For example: Rendering a visible structure 200m away from the camera with the same texture and geometric detail as if it were less than a meter away is computationally very expensive and performance-wise very wasteful. Every frame generally has a budget to ensure it reaches the performance target.

What should happen with the next gen:
  • LoD quality overall will see a notable increase in detail due to the conventional increase in GPU power and RAM capacity
  • Mesh/Primitive shaders, according to Ser Cerny, would mean far smoother transitions between LoD states compared to what we have today
  • SSDs ensure that the highest LoD appropriate for the distance of the object from the camera, and which does not exceed the performance budget, is loaded. This is markedly different from current gen, where speculative loading of additional data into RAM to accommodate the player's locomotion within a radius of approx. 30 seconds means lower quality LoDs than would have been possible if PS4 were built around an SSD like PS5 is (see the sketch below).
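A minimal sketch of what distance-based LoD selection looks like, to make that last point concrete. The thresholds and the number of LoD levels are made-up illustrative values, not anything from either console's tooling:

Code:
LOD_CUTOFFS_M = [(10.0, 0), (40.0, 1), (150.0, 2)]   # (max camera distance in metres, LoD index)

def pick_lod(distance_m):
    # choose the most detailed LoD whose distance band covers the object
    for max_dist, lod in LOD_CUTOFFS_M:
        if distance_m <= max_dist:
            return lod
    return 3                                          # lowest-detail fallback beyond 150 m

for d in (2.0, 25.0, 90.0, 400.0):
    print(d, "-> LoD", pick_lod(d))

# Faster storage mainly changes which of these LoDs can actually be resident in RAM
# by the time the camera gets close, not the existence of the distance bands themselves.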
 
Mar 20, 2020
143
this is why i'm hoping against all hope that phil doesn't mandate or push for native 4k like he did for the x1x. i bought the x1x and was blown away by rdr2 at native 4k but soon realized that i'd rather have 60 fps versions of these games. or at least stable 30 fps versions. i remember playing anthem and wondering why the framerate couldn't have been locked to 30 fps in favor of native 4k or whatever high resolution they were running at.

sony and several other third party studios were able to drop down resolution to 1080p and run 30 fps games at 60 fps, and yet MS continues to push for higher res for some bizarre reason. re3 should not be a native 4k game.

a better use for that 17% advantage would be better ray tracing or smoother framerates. i am pretty sure we will get 30 fps games next gen that will struggle to run at 30 fps. it would be great to get a performance mode at ps5 res or new ray tracing effects that the ps5 is missing.
I don't believe Phil Spencer ever mandated native 4K; he said it's up to the developers, but the console is capable of native 4K.

As far as I can recall, both RDR2 and Metro Exodus running in native 4K had stable framerates, in this case 30fps. Gears 5 is dynamic 4K; because of this, it holds 60fps across all game modes.

As previously stated, MS is not pushing developers to aim for higher resolutions; those are typically the result of the performance advantage the Xbox One X has over the PS4 Pro. Developers who wish to up the resolution still have to optimize to ensure framerates are not impacted as a result.

RE3 is not native 4K. As far as I can recall, it's 1800p. It's also a demo, most likely running on an older build. We will have to wait and see how the final retail version performs, and where it sits in relation to the PS4 Pro.

Note: The PS4/Pro has been the lead platform this generation, which tells me that platform gets the most attention during game development.

This 17% differential is not the complete picture, of course; many folks here have argued for and against what it might mean in real terms. I'm inclined to agree with Doctor Avatar.
 

modiz

Member
Oct 8, 2018
17,831
You aren't including the 15% IPC performance increase that RDNA2 offers.
that 15% figure was a misunderstanding; we still don't have an expected IPC improvement. although 15% would work rather well with the estimated clock speeds that could be hit on a 36CU desktop RDNA 2 card, going by the PS5's GPU clock.
 

darthkarki

Banned
Feb 28, 2019
129
I would like to ask two questions based on your post if that's ok.

1) Would your conclusion change if by the end of the year we have new hardware that offers console-like performance at mid-range prices?
2) Do you believe that Steam's hardware survey will show similar results a year from now?

Absolutely, I love questions :)
  1. I'm going to say a "mid-range" graphics card is something in the $250-300 range, as whatever is released there seems to end up the most popular. So, if a 2080-level card is released for $300, and there are more powerful cards above that, would that mean console GPUs are "mid-range"? Yes. I'd be surprised at that though. I think that level will stay at $400-500.
  2. Absolutely, and I think this is actually the more relevant metric than the "current product range." The 1060 came out in 2016, and is still the most popular card. Using the wayback machine to review previous years, the percentage hasn't even changed much. The 970 was the most popular before that, and it was released in 2014. With the economy the way it is, I think it's going to take even longer than usual for the average person to upgrade if they've got a functional computer at the moment. I'd bet the 1060 will still be the most popular card a year from now, though it will go down in percentage a bit.
So yes, it's possible the 2080 will be a "mid-range"-level card at the end of this year, just unlikely given the history of GPU power-bracket prices going up over time. If the 3060 is equal to the 2080, but at $400, that's not mid-range. But even when the 2080 eventually is mid-range, which it will be, it'll take a long time for the average gaming computer to actually be at that level.
 

GhostTrick

Member
Oct 25, 2017
11,305
It's very obvious that the clock frequency shown isn't the actual average game clock from the benchmark. If it were, then it would vary. Look at the results from Anandtech:

Tomb Raider 1780MHz
F1 2019 1800MHz
Assassin's Creed 1900MHz
Metro Exodus 1780MHz
Strange Brigade 1780MHz
Total War: TK 1830MHz
The Division 2 1760MHz
Grand Theft Auto V 1910MHz
Forza Horizon 4 1870MHz

That's the average. The actual game clock will fluctuate even more, but if we just look at the averages, there is an 8% fluctuation. That's more than 150MHz at 2GHz. For the average! If plotted on a graph, the actual fluctuations shown would probably exceed 200MHz. There is no way that's the actual game clock.


How is that very obvious?
The results from Anandtech are for the reference design. These are for the ASUS STRIX OC. So yes, it is maintaining over 2GHz. It is the actual game clock.