Status
Not open for further replies.

Deleted member 46804

User requested account closure
Banned
Aug 17, 2018
4,129
Spidey is 4K@60fps.
I truly believe it will be the most impressive game come launch.
The new pic made me even more certain with the jump.
Correction it has a 4K 60 FPS mode. We don't know what that will look like visually compared to the visuals mode and we don't know if it will run at native 4K at all. 4K hasn't really meant 4K this gen so color me skeptical. Having said that what they've shown thus far does look good regardless of frame rate or resolution.
 

Deleted member 43

Account closed at user request
Banned
Oct 24, 2017
9,271
I'm really not console warrioring here at all; even in this thread, if you look at my posts, that obviously isn't the case. I just was not aware of PS5 variable clocks being debunked somehow. Yes, variable clocks in general are not a performance disadvantage, because you can scale clock with demand without undesirable results, but for very taxing titles that isn't necessarily true; you may need to hit max load for a long time. Has someone proven definitively how long an unreleased system can sustain 2.2GHz clocks if needed? I really have no idea. I can say my 5700 XT will sometimes need to sit at max load for hours if I don't hit the pause button or go into menus in demanding titles. Is the PS5 definitively different from that somehow?
There is no limit to how long it can run at max clocks.
 
Oct 27, 2017
5,136
Point me to where it's been debunked?
It's an ~10 TF machine.
Follow the above quotes to see the posts he was replying to.
There is also the "Road to PS5" video where Mark Cerny explained how the system worked and accompanying articles from Digital Foundry going into a bit more detail.
Plus there's this massive thread dedicated to the PS5; there's a post somewhere in there that explains how the variable frequency technology works, with images for better understanding.
www.resetera.com

PlayStation 5 Pre-Release Technical Discussion |OT|

CPU: 8x Zen 2 Cores at 3.5GHz (variable frequency) GPU: 10.28 TFLOPs, 36 CUs at 2.23GHz (variable frequency) GPU Architecture: Custom RDNA 2 Memory/Interface: 16GB GDDR6/256-bit Memory Bandwidth: 448GB/s Internal Storage: Custom 825GB SSD IO Throughput: 5.5GB/s (Raw), Typical 8-9GB/s...
 

VinFTW

Member
Oct 25, 2017
6,470
There is no limit to how long it can run at max clocks.
Sorry if this is a stupid question, I'm just jumping into the conversation now and also not very technically knowledgeable on this stuff.

This is not just directed at you Matt but anyone who feels like answering, if there's no limit to running the PS5 at max clocks/settings, why call it variable at all to begin with?
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
Digital Foundry spoke to devs who confirmed they are running the GPU at max clocks, locked.



www.eurogamer.net

PlayStation 5 uncovered: the Mark Cerny tech deep dive

On March 18th, Sony finally broke cover with in-depth information on the technical make-up of PlayStation 5. Expanding …

Ok, thanks, but this comes with the caveat of throttling the CPU; because Jaguar is so weak, it's not a big deal to do that. So this still sounds to me like the PS5 isn't necessarily going to lock those GPU clocks, depending on the title, how much it needs more CPU power, and how much eye candy gets added throughout the gen. It reads to me like this will definitely affect things down the line, especially when trying to hit higher frame rates, for which the CPU is critical, and especially if you aren't entirely GPU bound. That seems likely, since this isn't a grunty enough GPU for native 4K, which is pretty much the only GPU-bound resolution, and that's right now, not a few years from now (if it isn't already, as we're seeing reports of).
 

Automagical

Member
Jun 27, 2020
329
PS5 being 15% less powerful than SeX is just a fact until proof of secret sauce is provided.
That being said, 15% less powerful is a joke and generally won't be noticed with modern checkerboarding and reconstruction/upscaling tech.

gimme dat 1440p/60
 

MrBob

Member
Oct 25, 2017
6,670
Sorry if this is a stupid question, I'm just jumping into the conversation now and also not very technically knowledgeable on this stuff.

If there's no limit to running the PS5 at max clocks/settings, why call it variable at all to begin with?

Not every scene will be so demanding it needs full power. Videocards do this already in what is called a boost mode where the video card clocks higher for more demanding scenarios. Cpus boost too on Pc.
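A rough sketch of that idea in Python (purely illustrative; the power-budget model and the numbers are my assumptions, not Sony's actual SmartShift algorithm):

```python
def gpu_clock_mhz(power_demand: float, max_clock: float = 2230.0,
                  floor: float = 1800.0) -> float:
    """Toy model of a variable-frequency GPU.

    power_demand: fraction of the power budget the current scene would
    draw at max clock. At or under budget (<= 1.0) the GPU simply sits
    at its maximum clock; only an over-budget scene sheds frequency.
    """
    if power_demand <= 1.0:
        return max_clock                         # typical case: full boost
    # over budget: scale the clock down, but never below some floor
    return max(floor, max_clock / power_demand)
```

In this toy model there is no time limit: a scene that stays within the power budget holds max clock indefinitely, and variability only kicks in for outlier workloads, which would be one answer to "why call it variable at all."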
 

HeWhoWalks

Member
Jan 17, 2018
2,522
Teraflops definitely matter quite a bit.

For example, you could look at the teraflop numbers of the Xb1-ps4 and the xb1x-ps4pro and get a really good idea of what the resolution differences would be.
The teraflop number isn't what dictated that. One machine has 12GB of RAM, a higher-specced GPU, and better memory clocks. That matters. Not a theoretical number.
 

gundamkyoukai

Member
Oct 25, 2017
21,156
Correction it has a 4K 60 FPS mode. We don't know what that will look like visually compared to the visuals mode and we don't know if it will run at native 4K at all. 4K hasn't really meant 4K this gen so color me skeptical. Having said that what they've shown thus far does look good regardless of frame rate or resolution.

Sony's big first-party games look native 4K, like R&C, GT7, HFW, DSR, and Spidey MM.
Which is the opposite of what I wanted, to tell the truth.
I was in the reconstruction camp even before we saw DLSS or got the specs.
 

VeePs

Prophet of Truth
Member
Oct 25, 2017
17,371
I just took a look at Dusk's twitter feed and... god damn lol.

Guys, you realize RE8 is going to look and play great right? Right?
 
Apr 4, 2018
4,518
Vancouver, BC
If power doesn't matter for Lockhart because of resolution scaling, then it matters even less for PS5.

As for the 20% gpu difference, that's 12fps more or less on a 60fps game, the difference between having clouds on ultra vs high like someone mentioned earlier in the thread. The difference is handy but not really game changing.

You aren't entirely wrong about saying that the power difference won't matter for PS5 if it doesn't matter for Lockhart, PS5 will be plenty powerful. The big difference though, is that Lockhart likely isn't targeting 4k, and is likely going to be significantly cheaper than anything else on the market.

I think where the power difference will matter for PS5 is when people who are on the fence start comparing the price of a PS5 to the price of an Xbox Series X. As you show above, with an 18% gap in TF (and I would argue this is likely the closest the gap will be for well-optimized games on both consoles, and it could be greater in many cases), we are talking about 12+ FPS. That's a big performance gap to make up, and could result in significant visual and framerate advantages for multiplatform games on Series X if both consoles target the same resolution and framerates.

On the other hand, you could argue that PS5 could use reconstruction and resolution scaling to bridge the gap, but that goes both ways. Devs could also technically use the same techniques to significantly dial up visual settings. If MS also develops its own DLSS-type resolution reconstruction method using DirectML, that could also have a big impact.

Honestly, I think players are going to be spoiled for choice either way.
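The framerate arithmetic above is just proportional scaling of the quoted gap, assuming (naively) that frame rate scales linearly with GPU throughput:

```python
# If frame rate scaled linearly with GPU throughput (a big "if"),
# a ~20% compute gap on a 60fps target works out to about 12 frames.
base_fps = 60
gpu_gap = 0.20                     # assumed ~20% throughput difference
extra_fps = base_fps * gpu_gap     # ~12 extra frames at equal settings
```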
 

HeWhoWalks

Member
Jan 17, 2018
2,522
Have you been living under a rock this entire generation?

900p->1080p matches up pretty well with the teraflop difference between the Xb1/S and the PS4.
I always love ridiculous comments like the underlined. :D

Coming back to Earth, explain, mathematically, how a 0.64 teraflop difference "matches" res output differences to those consoles. How do theoretical numbers do that and how do those specific numbers [1.84 teraflops vs 1.25 teraflops] do that?
 

Rats

Member
Oct 25, 2017
8,113
It's a 12.155tf console vs a 10~TF console so it's around a 20% difference.

Using the PS5 as the point of reference. Let me walk you through this very carefully.

12.155 - 10.28 = 1.875

1.875 is 15.4% of 12.155

1.875 is 18.3% of 10.28

I had my math wrong in my last post, you're actually exaggerating the number way more than the other guy is.
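That walk-through can be reproduced in a few lines of Python, using the quoted spec figures (note the second percentage actually rounds to 18.2, not 18.3):

```python
xsx_tf, ps5_tf = 12.155, 10.28
diff = xsx_tf - ps5_tf                 # 1.875 TF gap

pct_of_xsx = diff / xsx_tf * 100       # PS5 relative to XSX: ~15.4%
pct_of_ps5 = diff / ps5_tf * 100       # XSX relative to PS5: ~18.2%
```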
 
Last edited:

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
I have a ~10tf 5700xt that boosts and holds 2.1ghz right now. It isn't a 4k card for current titles. I guess maybe a 4k30 card, but that's right now. In a couple years it won't be because it's literally outclassed by an XSX and is about to be massively outclassed by new cards. It is also already CPU limited with a 5.0ghz 8700k at 1440p.

I'm not actually concerned about PS5 performance in general because it really is sort of apples and oranges. The settings/eye candy aren't the same, more optimization in the console space, etc.

But if devs are already saying they are throttling the CPU to lock the GPU clocks, well, yes, I don't think it's crazy to think that will change over time, and we can expect to see more checkerboarding and a fair amount of 30fps titles, unless they adjust their objectives for 60fps, which means higher CPU clocks and lower GPU clocks/eye candy. And if anything, to the people saying this has been "debunked" 5000x: no, it really hasn't. Dropping CPU clocks to hold GPU boosts literally tells you the PS5 isn't cut out for the maximum performance of the console. It's right there, an admitted, open perf compromise in order to hold GPU clocks.

I don't think any of this goes against this rumor. And in the console space I fully expect XSX to also be largely 30fps within a few years.

I'm not sure exactly what's controversial about 30fps console gaming. It's literally always how it goes for the most part especially later into the gen unless we see another big hardware refresh, which of course, requires buying new hardware because the launch specs are starting to lag.
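The tradeoff being described can be sketched as a shared power budget where the CPU gives way first (entirely illustrative numbers and policy; not Sony's actual SmartShift implementation):

```python
BUDGET = 100.0                      # arbitrary shared power units

def split_budget(cpu_demand: float, gpu_demand: float):
    """If combined demand exceeds the budget, shave the CPU share first,
    mimicking the 'throttle CPU to hold GPU clocks' strategy described."""
    if cpu_demand + gpu_demand <= BUDGET:
        return cpu_demand, gpu_demand       # both run unconstrained
    # over budget: GPU keeps its share, CPU absorbs the shortfall
    cpu = max(0.0, BUDGET - gpu_demand)
    return cpu, min(gpu_demand, BUDGET)
```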
 

Bradbatross

Member
Mar 17, 2018
14,230
It's 10.28 and that makes the PS5 ~16% less powerful than the XSX, or the XSX ~19% more powerful than the PS5.
Using the PS5 as the point of reference. Let me walk you through this very carefully.

12.155 - 10.28 = 1.875

1.875 is 15.4% of 12.155

1.875 is 18.3% of 10.28

I had my math wrong in my last post, you're actually exaggerating the number way more than the other guy is.
It's a ~10 TF console according to Matt, so that makes for a ~20% difference.
 

AegonSnake

Banned
Oct 25, 2017
9,566
I always love ridiculous comments like the underlined. :D

Coming back to Earth, explain, mathematically, how a 0.64 teraflop difference "matches" res output differences to those consoles.
It's not 0.64 TFLOPs. It's 1.31 TFLOPs vs 1.84 TFLOPs, i.e. 0.53 TFLOPs.

The percentages are what you should be looking at; the raw numbers don't tell the whole story.

In terms of percentages, 1.84/1.31 = a ~40% difference in TFLOPs.

2.07 million pixels (1080p) / 1.44 million pixels (900p) = ~44%.
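The point sketched numerically, using standard 1080p/900p pixel counts and the quoted TFLOP figures (treating performance as proportional to FLOPs is the simplifying assumption here):

```python
ps4_tf, xb1_tf = 1.84, 1.31
pixels_1080p = 1920 * 1080                 # 2,073,600 pixels
pixels_900p = 1600 * 900                   # 1,440,000 pixels

tf_ratio = ps4_tf / xb1_tf                 # ~1.40: ~40% more compute
pixel_ratio = pixels_1080p / pixels_900p   # 1.44: ~44% more pixels
```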
 

jroc74

Member
Oct 27, 2017
29,004
You would have a case if the PS5 was RDNA 1.0. It's not. The 5700 XT doesn't even have hardware RT. It doesn't have the 50% perf/watt enhancements of RDNA 2.0, which is how they are able to get 2.23GHz in a closed console box when TechSpot couldn't even get there with nitrogen cooling. So clearly the architecture is different.

Your benchmarks are just as outdated as Vega benchmarks. They make zero sense in the context of next-gen talk. Wait for RDNA 2.0 benchmarks.

Maybe some folks still think the PS5 is RDNA 1....that would at least make sense for some of these posts.
The teraflop number isn't what dictated that. One machine has 12GB of RAM, a higher specked GPU, and better memory clocks. That matters. Not a theoretical number.
Naw, teraflops are all that matter. 3GB more ram, higher clocks? pfffft. Please.

Like....how do some ppl cite this gen and the power difference with the base consoles...and leave out eSRAM and the RAM differences in general? Don't they know MS did away with eSRAM with the One X, and that could also be why there is still a big performance hit for games running on the One S?

But teraflops tho.

I guess it's always easier to focus on just one thing as the reason why something didn't go the way you wanted it to. Like some ppl still think power is the reason why this gen went the way it did. lol
 

Deleted member 224

Oct 25, 2017
5,629
I always love ridiculous comments like these
Coming back to Earth, explain, mathematically, how a 0.64 teraflop difference "matches" res output differences to these consoles.
Gladly :). I just assumed you didn't need this explained.

-900p is 1,440,000 pixels.
-1080p is 2,073,600 pixels.

That's a difference of 633,600 pixels. So, roughly a 30% difference.

Now, keep following.

-The ps4 gpu is 1.84 teraflops.
-The base xb1 gpu is 1.31 teraflops.

That's a difference of 0.53 teraflops. So, roughly a 30% difference.
 

HeWhoWalks

Member
Jan 17, 2018
2,522
Maybe some folks still think the PS5 is RDNA 1....that would at least make sense for some of these posts.

Naw, teraflops are all that matter. 3GB more ram, higher clocks? pfffft. Please.

Like....how do some ppl cite this gen and the power difference with the base consoles...and leave out eSRAM and the RAM differences in general? Don't they know MS did away with eSRAM with the One X, and that could also be why there is still a big performance hit for games running on the One S?

But teraflops tho.

I guess it's always easier to focus on just one thing as the reason why something didn't go the way you wanted it to. Like some ppl still think power is the reason why this gen went the way it did. lol
Indeed. Thankfully, someone recognizes the numbers that do matter.
Gladly :). I just assumed you didn't need this explained.

-900p is 1,440,000 pixels.
-1080p is 2,073,600 pixels.

That's a difference of 633,600 pixels. So, roughly a 30% difference.

Now, keep following.

-The ps4 gpu is 1.84 teraflops.
-The base xb1 gpu is 1.31 teraflops.

That's a difference of 0.53 teraflops. So, roughly a 30% difference.
Ironically, your percentages don't match those of another person who explained it (albeit far less condescendingly than you did, and they certainly didn't kill the idea that teraflops are at the back end of what matters in a computer system). Thanks for playing, though. :)
 

Deleted member 224

Oct 25, 2017
5,629
Indeed. Thankfully, someone recognizes the numbers that do matter.

Ironically, your percentages don't match those of another person who explained it (albeit far less condescendingly than you did, and they certainly didn't kill the idea that teraflops are at the back end of what matters in a computer system). Thanks for playing, though. :)
My percentages are rough, and are off by a slight bit. But it's definitely close enough and you can do the math yourself.

I guess it also depends on how you use the numbers. You can inflate/deflate the difference depending on how you calculate it (the ps4 is 'X' percent stronger than the xb1, but the xb1 is 'Y' percent weaker than the ps4).

Edit: That being said, you certainly aren't arguing in good faith. Two posters have now done the math to show you that the resolution differences between third party ps4 and xb1 titles matched up extremely well with the teraflop numbers of the systems. We could do the same for the mid-gen console refreshes. If you have some sort of explanation for that, be my guest.
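The inflate/deflate point can be made concrete: the same absolute gap produces two different-sounding percentages depending on which console is the baseline. A quick illustrative sketch:

```python
def pct_stronger(a: float, b: float) -> float:
    """How much stronger a is than b, as a percentage of b
    (negative means a is weaker than b)."""
    return (a - b) / b * 100

ps4_tf, xb1_tf = 1.84, 1.31
# Same 0.53 TF gap, two different headline numbers:
stronger = pct_stronger(ps4_tf, xb1_tf)   # PS4 vs XB1 baseline: ~+40.5%
weaker = pct_stronger(xb1_tf, ps4_tf)     # XB1 vs PS4 baseline: ~-28.8%
```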
 

Crayon

Member
Oct 26, 2017
15,580
It's not out of the realm of possibility that the average difference in pixel count ends up being north of 18%. Computers are pretty complicated; some confluence of differing design decisions could add up to that. We just don't know how. You have to consider that the exact same thing could happen in the other direction, though. It could be less than 18%. "What if".
 

Jtrizzy

Member
Nov 13, 2017
621
To my eye, 1440p looks great on a 4K screen, even without checkerboarding or DLSS. I have a C9, and in order to run games over 60fps I can't exceed 1440p, so that's what I target in all of my games. The only thing I notice consistently is some aliasing on hair. 4K looks better, until you move the camera lol.

It seems infinitely better than sub 1080p on a 1080p screen.
 

Ploid 6.0

Member
Oct 25, 2017
12,440
Not every scene will be so demanding it needs full power. Videocards do this already in what is called a boost mode where the video card clocks higher for more demanding scenarios. Cpus boost too on Pc.
Yes, this. I monitor my card all the time; I keep RivaTuner running for everything because I love overclocking my stuff, so if anything is going wrong I'd notice it. When I'm playing not-so-demanding games or scenes, the GPU and CPU reduce their clocks because the speed isn't needed. This is especially true when I cap the framerate. If it's not capped, it will run as hard as it's allowed to, even if I'm staring at the sky, though that still might not push things hard enough.
 

HeWhoWalks

Member
Jan 17, 2018
2,522
https://www.techradar.com/news/what-are-teraflops#:~:text=So, do teraflops matter?,machine would have you think.&text=While flops are a more,6-tflop Xbox One X.
My percentages are rough, and are off by a slight bit. But it's definitely close enough and you can do the math yourself.

I guess it also depends on how you use the numbers. You can inflate/deflate the difference depending on how you calculate it (the ps4 is 'X' percent stronger than the xb1, but the xb1 is 'Y' percent weaker than the ps4).
I live by the standard set in this article for discussion. Simply put: do they matter 0%? Not quite. Do they matter more than the things I mentioned? I wouldn't say so, and neither would they.

"Yes, the 12 tflops on the Xbox Series X sound impressive, but for now it's just theoretical, and actual gaming performance will depend on other aspects of the GPU architecture, and how well the console's software, drivers and APIs take advantage of that power."

In this regard, I'm satisfied calling it a stalemate. For me, the SSD (type/speeds), clocks (CPU/GPU), and RAM (type/speed) will always matter far more for any PC I build or console I buy.
 

Kyoufu

Member
Oct 26, 2017
16,582
I don't know what you guys are even arguing for. Games are going to look great on both consoles. 🤷‍♂️
 

AegonSnake

Banned
Oct 25, 2017
9,566
Maybe some folks still think the PS5 is RDNA 1....that would at least make sense for some of these posts.
Yes, and I am not dismissing what Dusk has been hearing from devs, but that is the only way this makes sense: if the PS5 is RDNA 1.0, which, as we know from the 5700 XT, is not a very capable 4K 60fps architecture. RDNA 1.0 cards also cannot go beyond 2.1GHz no matter how much you try to overclock them, and they stop providing linear boosts to performance past 1.8-1.9GHz. No VRS, no real hardware RT, no RDNA 2.0 IPC gains. This theory relies on Cerny and Lisa Su lying about this being an RDNA 2.0 GPU, which is funny because once the console and games come out, the jig is up. Sony would have a PR disaster on their hands and they wouldn't be able to hide it anymore. So all this secrecy for what?

Wouldn't be surprised if these same people also think Sony is buying exclusives to keep the bad comparisons from leaking, as if Sony could buy every single third-party multiplatform game lol
 