Status
Not open for further replies.

MakgSnake

Member
Dec 18, 2019
608
Canada
To be very honest,

as long as next gen gives me 60 frames per second (STANDARD), I will be fine with whatever is inside.

I don't care about resolution.

I would have been fine with 720p this gen, as long as it was NOT 30 frames but 60!

But we got 1080p at 30 frames for the first 4 years,

and then... 4K at 30 frames, but performance mode gave 60 frames at 1080p. I always played games at 60 frames in 1080p.

So please just give me 60 frames... that is all I ask. It should be standard by now; it is 2020, for crying out loud.
 
Oct 26, 2017
6,151
United Kingdom
So why does the leaked Arden for the XSX have 56 CUs, while there are no hints of another big console APU, only Sparkman, which is smaller and probably for Lockhart?

Because they are two independent projects for two different companies that shouldn't really be related to one another. As well as the fact that some stuff hasn't leaked.

Is it so outrageous to consider that a data dump on GitHub by some intern in AMD's testing department isn't going to be complete and comprehensive on all the details of the projects mentioned therein? I don't think so.
 

Evodelu

Alt Account
Banned
Dec 19, 2019
558
To be very honest,

as long as next gen gives me 60 frames per second (STANDARD), I will be fine with whatever is inside.

I don't care about resolution.

I would have been fine with 720p this gen, as long as it was NOT 30 frames but 60!

But we got 1080p at 30 frames for the first 4 years,

and then... 4K at 30 frames, but performance mode gave 60 frames at 1080p. I always played games at 60 frames in 1080p.

So please just give me 60 frames... that is all I ask. It should be standard by now; it is 2020, for crying out loud.
You're going to be disappointed.

There will be more 60 fps games, but it won't be a standard for AAA games.
 

sncvsrtoip

Banned
Apr 18, 2019
2,773
Because they are two independent projects for two different companies that shouldn't really be related to one another. As well as the fact that some stuff hasn't leaked.

Is it so outrageous to consider that a data dump on GitHub by some intern in AMD's testing department isn't going to be complete and comprehensive on all the details of the projects mentioned therein? I don't think so.
The Gonzalo/Oberon leaks are the best we've got. If not for Matt suggesting we shouldn't look too much into the GitHub leaks, and Klee's source, I would be sure 9.2 TF is the max for the PS5, but of course hope dies last ;)
 

Klaw

Member
Nov 16, 2017
384
France
To be very honest,

as long as next gen gives me 60 frames per second (STANDARD), I will be fine with whatever is inside.

I don't care about resolution.

I would have been fine with 720p this gen, as long as it was NOT 30 frames but 60!

But we got 1080p at 30 frames for the first 4 years,

and then... 4K at 30 frames, but performance mode gave 60 frames at 1080p. I always played games at 60 frames in 1080p.

So please just give me 60 frames... that is all I ask. It should be standard by now; it is 2020, for crying out loud.

With Ray Tracing on the line, I don't believe it will be the case... unfortunately.
 

Kage Maru

Member
Oct 27, 2017
3,804
To be very honest,

as long as next gen gives me 60 frames per second (STANDARD), I will be fine with whatever is inside.

I don't care about resolution.

I would have been fine with 720p this gen, as long as it was NOT 30 frames but 60!

But we got 1080p at 30 frames for the first 4 years,

and then... 4K at 30 frames, but performance mode gave 60 frames at 1080p. I always played games at 60 frames in 1080p.

So please just give me 60 frames... that is all I ask. It should be standard by now; it is 2020, for crying out loud.

Prepare to be disappointed.
 

Md Ray

Member
Oct 29, 2017
750
Chennai, India
To be very honest,

as long as next gen gives me 60 frames per second (STANDARD), I will be fine with whatever is inside.

I don't care about resolution.

I would have been fine with 720p this gen, as long as it was NOT 30 frames but 60!

But we got 1080p at 30 frames for the first 4 years,

and then... 4K at 30 frames, but performance mode gave 60 frames at 1080p. I always played games at 60 frames in 1080p.

So please just give me 60 frames... that is all I ask. It should be standard by now; it is 2020, for crying out loud.
PC is the way to go if you want 60fps standard, sorry.
 

MakgSnake

Member
Dec 18, 2019
608
Canada
You're going to be disappointed.

There will be more 60 fps games, but it won't be a standard for AAA games.
I know... it is sad and it sucks.

But as long as I get AAA titles at 60 frames, I should be OK.

I know for a fact we will still be in the same boat: 30 frames at 4K and 60 frames at 1080p.

I wanted Cyberpunk 2077 at 60 frames... not happening. :(
 

BreakAtmo

Member
Nov 12, 2017
12,805
Australia
To be very honest,

as long as next gen gives me 60 frames per second (STANDARD), I will be fine with whatever is inside.

I don't care about resolution.

I would have been fine with 720p this gen, as long as it was NOT 30 frames but 60!

But we got 1080p at 30 frames for the first 4 years,

and then... 4K at 30 frames, but performance mode gave 60 frames at 1080p. I always played games at 60 frames in 1080p.

So please just give me 60 frames... that is all I ask. It should be standard by now; it is 2020, for crying out loud.

I would also like this, but it's not going to happen, because framerate has almost always come down to developer choice. Unless Sony expands the 60fps-minimum policy they have for VR to everything else (and we can be almost completely sure they haven't, since the PS5 game Klee saw was 30fps), some devs will always focus on graphics instead. VRR will help, though, and I could see the proliferation of 60fps YouTube and Facebook video pushing more devs to focus on framerate.
 

III-V

Member
Oct 25, 2017
18,827
Why are they giving dimensions for Oberon and calling it an APU when it's an iGPU and there is no mention of Flute?
 

Kyoufu

Member
Oct 26, 2017
16,582
If DukeBlueBall's speculation/analysis and the Chinese forum die sizes are correct, the difference between a 360mm² and a 316mm² die isn't going to be a "significant difference" in cost.

In fact, if the smaller 36CU die is clocked well past the inflection point (knee) on the clock-versus-power curve, the impact on overall console cost could make it the more expensive option...

...which is why it doesn't quite make sense.

I believe in the video Richard was predicting a much bigger die size than that, although I'd need to watch it again. I know for sure he thinks the power draw will be over 300W for the SX.
 
Oct 26, 2017
6,151
United Kingdom
Gonzalo/oberon leaks are best we got. If not Matt sugesting we shouldn't looks so much into github leaks and Klee source I would be sure 9.2tf is max for ps5 but ofcourse hope die last ;)

9.2 TF is still a very strong console, so there's no reason to lose hope.

On the other hand, I don't see why we shouldn't question the glaring inconsistencies in an info dump of test results that lack any meaningful explanation of what is actually being tested.

For example, if the Arden project is referencing Oberon test results in the GitHub data leak as a placeholder, what's to say the actual Oberon test data itself isn't placeholder, i.e. taken and renamed from some other existing RDNA GPU (e.g. a 5700, which it appears identical to)?

If there are obvious errors/inconsistencies, why should we assume the data is telling us anything meaningful? Because it's all we have?

At one point, the cavalcade of BS pastebin rumours was all we had, but we never failed to question those.
 

Deleted member 43

Account closed at user request
Banned
Oct 24, 2017
9,271
The point of the premium SKU is the Halo effect. The point is to drive people to the platform and get them hooked in to Gamepass. The best possible box means your marketing footage looks amazing, streamers buy in, enthusiasts buy in etc. Then they can go win on affordability elsewhere. I don't think Sony or MS are looking to take heavy losses on any of these machines but any margins will be thin even on the premium box.
Yes, things can have more than one "point."
 

AegonSnake

Banned
Oct 25, 2017
9,566
I believe in the video Richard was predicting a much bigger die size than that, although I'd need to watch it again. I know for sure he thinks the power draw will be over 300W for the SX.
If the power draw for a 12 TF Navi is 300W, then what's the TDP of a 16 TF GPU part going to be? 400? 500?

He's basically taking the TDP of the 5700 XT and extrapolating it to a 12 TF GPU without taking into account any power savings you get from going wide and slow.
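For reference, the TF figures being thrown around in this debate come from a simple formula: 64 shaders per CU, times 2 FLOPs per shader per clock (one fused multiply-add). A quick sketch using the rumoured CU counts and clocks from this thread; none of these are confirmed specs:

```python
# FP32 TFLOPS = CUs * 64 shaders * 2 FLOPs (FMA) * clock (GHz) / 1000.
# The CU counts and clocks below are this thread's rumours, not confirmed specs.

def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

print(round(tflops(36, 2.0), 2))    # 9.22  (rumoured Oberon: 36 CUs @ 2.0 GHz)
print(round(tflops(56, 1.675), 2))  # 12.01 (rumoured Arden: 56 CUs @ ~1.675 GHz)
print(round(tflops(40, 1.905), 2))  # 9.75  (RX 5700 XT at its ~1.905 GHz boost clock)
```

The same ~12 TF can be reached wide and slow (more CUs, lower clock) or narrow and fast, which is exactly what the power argument above is about.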
 

Deleted member 56752

Attempted to circumvent ban with alt account
Banned
May 15, 2019
8,699
Do you guys think that if I had enough room for two and a half Xbox One Xs stacked horizontally, I'd have enough space for an Xbox Series X horizontally?
 

Kyoufu

Member
Oct 26, 2017
16,582
If the power draw for a 12 TF Navi is 300W, then what's the TDP of a 16 TF GPU part going to be? 400? 500?

He's basically taking the TDP of the 5700 XT and extrapolating it to a 12 TF GPU without taking into account any power savings you get from going wide and slow.

I think he meant the power draw for the overall system, so RAM and other bits and bobs included. Mind you, he was only speculating.
 

Lirion

Member
Oct 25, 2017
1,774
This thread moves crazy fast. So this GitHub leak: is there any indication it actually has anything to do with next-gen consoles, or are we just connecting the dots we prefer?
 

III-V

Member
Oct 25, 2017
18,827
9.2 TF is still a very strong console, so there's no reason to lose hope.

On the other hand, I don't see why we shouldn't question the glaring inconsistencies in an info dump of test results that lack any meaningful explanation of what is actually being tested.

For example, if the Arden project is referencing Oberon test results in the GitHub data leak as a placeholder, what's to say the actual Oberon test data itself isn't placeholder, i.e. taken and renamed from some other existing RDNA GPU (e.g. a 5700, which it appears identical to)?
This is because we have test results for 3 modes: Native (2 GHz), BC1 (0.91 GHz) and BC2 (0.8 GHz), which corresponds exactly to what we would expect. Don't throw the baby out with the bathwater. There is still much that can be gleaned from the leak, mistakes and all.
 
Oct 25, 2017
4,152
9.2 TF is still a very strong console, so there's no reason to lose hope.

On the other hand, I don't see why we shouldn't question the glaring inconsistencies in an info dump of test results that lack any meaningful explanation of what is actually being tested.

For example, if the Arden project is referencing Oberon test results in the GitHub data leak as a placeholder, what's to say the actual Oberon test data itself isn't placeholder, i.e. taken and renamed from some other existing RDNA GPU (e.g. a 5700, which it appears identical to)?

If there are obvious errors/inconsistencies, why should we assume the data is telling us anything meaningful? Because it's all we have?

At one point, the cavalcade of BS pastebin rumours was all we had, but we never failed to question those.

I agree with the skepticism, but I wouldn't compare this leak, which is actual AMD documents, with the earlier pastebin junk floating around.

FWIW, I would be totally on board with this latest info if it didn't use 2 GHz, but I can't outright dismiss it like the pastebin crap.
 
Jun 23, 2019
6,446
9.2 TF is still a very strong console, so there's no reason to lose hope.

On the other hand, I don't see why we shouldn't question the glaring inconsistencies in an info dump of test results that lack any meaningful explanation of what is actually being tested.

For example, if the Arden project is referencing Oberon test results in the GitHub data leak as a placeholder, what's to say the actual Oberon test data itself isn't placeholder, i.e. taken and renamed from some other existing RDNA GPU (e.g. a 5700, which it appears identical to)?

If there are obvious errors/inconsistencies, why should we assume the data is telling us anything meaningful? Because it's all we have?

At one point, the cavalcade of BS pastebin rumours was all we had, but we never failed to question those.

Lol, because once the GitHub info (in some people's minds) definitively placed the XSX ahead of the PS5, people didn't want to question it any more. The moment Matt came in and told people not to take GitHub as confirmation, people started ignoring him and even started questioning his validity as a dev/insider again. It's pretty transparent at this point.
 

III-V

Member
Oct 25, 2017
18,827
Both companies can shrink their APU to any node available.

If RDNA 1 is tied to 7nm, how can they use a future 5nm or even 3nm node to shrink?
It's an issue with the litho process: the masks are not compatible, and they are very expensive to create. It is not tied to the arch in the same way. N7P does not require any rework of the masks.
 

nelsonroyale

Member
Oct 28, 2017
12,122
If the power draw for a 12 TF Navi is 300W, then what's the TDP of a 16 TF GPU part going to be? 400? 500?

He's basically taking the TDP of the 5700 XT and extrapolating it to a 12 TF GPU without taking into account any power savings you get from going wide and slow.

But it won't necessarily be clocked that slow... possibly up to 1.7 GHz to get to 12 TF.
 

giblet

Banned
Oct 28, 2017
179
12 TF Console for Cyberpunk will be my choice. If PS5 isn't up to par on that game, I'll have to go Xbox this gen.
 

Kage Maru

Member
Oct 27, 2017
3,804
This is true, but I feel like frame rates have generally been better this gen than last (360/PS3). I'm expecting even more improvements next gen due to the better CPU. I'm really looking forward to it.

No doubt, I certainly agree. I think we'll see more 60fps titles as well, especially if more games support graphical options. I just don't see it becoming a standard next gen. We'll still get sub-60fps games next gen, just like we'll see sub-4K games as well.
 
Oct 26, 2017
6,151
United Kingdom
Both companies can shrink their APU to any node available.

If RDNA 1 is tied to 7nm, how can they use a future 5nm or even 3nm node to shrink?

Yes, but later on down the line.

I feel the design libraries are different between 7nm and 7nm+; starting development on the former and moving to the latter midway through would lead to significant rework of the engineering design, delays and a very real impact on the overall project schedule, i.e. the launch date.

For any console to be on 7nm+, it would need to have been designed for it from the outset, which is something I and many others here consider unlikely. Not impossible, but a slim likelihood.
 
Oct 27, 2017
20,745
I'm catching up on the whole 9.2TF PS5 thing, but wonder:
  • Do we know for sure the Xbox SX is 12TF? Xbox didn't give a specific count. I know they said "double" the 1X but also 4x the One S (4x One S is 5TF, 2x One X is 12TF).
  • Couldn't the PS5 have been 9.2TF at one point, and maybe more now?
  • If Lockhart (Xbox Series S) is 4TF, imo wouldn't any impact from a 3TF gap between Series X and PS5 be sort of mitigated? All Series X games would have to be designed around Lockhart anyway, and all multiplats in general, so would the consumer really see a difference?
 

Kyoufu

Member
Oct 26, 2017
16,582
I'm catching up on the whole 9.2TF PS5 thing, but wonder:
  • Do we know for sure the Xbox SX is 12TF? Xbox didn't give a specific count. I know they said "double" the 1X but also 4x the One S (4x One S is 8TF, 2x One X is 12TF).
  • Couldn't the PS5 have been 9.2TF at one point, and maybe more now?
  • If Lockhart (Xbox Series S) is 4TF, imo wouldn't any impact from a 3TF gap between Series X and PS5 be sort of mitigated? All Series X games would have to be designed around Lockhart anyway, and all multiplats in general, so would the consumer really see a difference?

We don't know anything. No official specs have been released by either Sony or MS.
 

Kibbles

Member
Oct 25, 2017
3,417
I'm catching up on the whole 9.2TF PS5 thing, but wonder:
  • Do we know for sure the Xbox SX is 12TF? Xbox didn't give a specific count. I know they said "double" the 1X but also 4x the One S (4x One S is 8TF, 2x One X is 12TF).
  • Couldn't the PS5 have been 9.2TF at one point, and maybe more now?
  • If Lockhart (Xbox Series S) is 4TF, imo wouldn't any impact from a 3TF gap between Series X and PS5 be sort of mitigated? All Series X games would have to be designed around Lockhart anyway, and all multiplats in general, so would the consumer really see a difference?
Klee said it's actually slightly over 12TF. Also, Microsoft said over 8x the One S, not 4x.
 

AegonSnake

Banned
Oct 25, 2017
9,566
But it won't necessarily be clocked that slow...Possibly up to 1.7 ghz to get to 12 tf.
The chart I posted above shows 40 CUs at 1.7 GHz giving us 110W for the GPU. So yes, that seems to be the ideal, most efficient clock for 7nm Navi. We will see what happens with 56 CUs, but it should be a linear increase in TDP.

2.0 GHz consumes nearly 2x the power at 210W. Why would Sony go narrow and fast only to end up with a 30% less powerful console that consumes almost 50% more power?
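Taking the quoted chart figures at face value (110 W for 40 CUs at 1.7 GHz, 210 W for 40 CUs at 2.0 GHz) and the poster's linear-in-CUs assumption, the extrapolation works out like this; these are rough back-of-the-envelope numbers, not measurements:

```python
# Extrapolating GPU power from the chart figures quoted above.
# 40 CUs @ 1.7 GHz -> 110 W, 40 CUs @ 2.0 GHz -> 210 W (as quoted).
# Linear scaling with CU count is the poster's assumption, not a measurement.

wide_power = 110 * 56 / 40    # 56 CUs @ 1.7 GHz ("wide and slow") -> 154 W
narrow_power = 210 * 36 / 40  # 36 CUs @ 2.0 GHz ("narrow and fast") -> 189 W

print(wide_power, narrow_power)  # 154.0 189.0
```

On those assumptions the wider chip delivers more TF for less power, which is the crux of the poster's objection to a narrow-and-fast design.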
 
Oct 30, 2017
269
I'm catching up on the whole 9.2TF PS5 thing, but wonder:
  • Do we know for sure the Xbox SX is 12TF? Xbox didn't give a specific count. I know they said "double" the 1X but also 4x the One S (4x One S is 5TF, 2x One X is 12TF).
  • Couldn't the PS5 have been 9.2TF at one point, and maybe more now?
  • If Lockhart (Xbox Series S) is 4TF, imo wouldn't any impact from a 3TF gap between Series X and PS5 be sort of mitigated? All Series X games would have to be designed around Lockhart anyway, and all multiplats in general, so would the consumer really see a difference?
  1. MS said "over" 8x the Xbox One, 2x the One X.
  2. We don't know for sure, but it's possible... maybe.
  3. If the main differentiator between Series X and Lockhart is the GPU, with the same or similar CPU, RAM and HDD, then no, it should not be a problem or hold back anything for next gen. It will just run the same games at 1080p / 1440p.
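A back-of-the-envelope check on point 3: 1080p is exactly a quarter of 4K's pixel count, so a ~4 TF Lockhart targeting 1080p would have proportionally more compute per pixel than a ~12 TF Series X targeting 4K. This ignores everything that doesn't scale with resolution, which is a big simplification, and the TF figures are the thread's rumours:

```python
# Pixel counts vs rumoured TF: does "same game, lower resolution" add up?
pixels_4k = 3840 * 2160     # 8,294,400
pixels_1080p = 1920 * 1080  # 2,073,600

tf_ratio = 4 / 12                       # rumoured Lockhart vs Series X: ~0.33
pixel_ratio = pixels_1080p / pixels_4k  # 0.25

# The TF ratio exceeds the pixel ratio, so per-pixel budget is actually
# a bit higher on the smaller box under this (very rough) model.
print(round(tf_ratio, 2), pixel_ratio)  # 0.33 0.25
```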
 

III-V

Member
Oct 25, 2017
18,827
The chart I posted above shows 40 CUs at 1.7 GHz giving us 110W for the GPU. So yes, that seems to be the ideal, most efficient clock for 7nm Navi. We will see what happens with 56 CUs, but it should be a linear increase in TDP.

2.0 GHz consumes nearly 2x the power at 210W. Why would Sony go narrow and fast only to end up with a 30% less powerful console that consumes almost 50% more power?
Go get 'em tiger
 
Oct 26, 2017
6,151
United Kingdom
This is because we have test results for 3 modes: Native (2 GHz), BC1 (0.91 GHz) and BC2 (0.8 GHz), which corresponds exactly to what we would expect. Don't throw the baby out with the bathwater. There is still much that can be gleaned from the leak, mistakes and all.

I'm not really throwing anything out.

The three test modes tell us how the part in question is being tested. They still don't fundamentally tell us anything about the point I raised: what is actually being tested.

At most it proves a relationship between the Oberon codename and PlayStation. But that's about all it gives us.

I agree with the skepticism, but I wouldn't compare this leak, which is actual AMD documents, with the earlier pastebin junk floating around.

FWIW, I would be totally on board with this latest info if it didn't use 2 GHz, but I can't outright dismiss it like the pastebin crap.

I never made this comparison.

I'm merely stating that just as we treated pastebin posts with obvious red flags and inconsistencies with a level of scrutiny, we should employ a similar level of scrutiny here.

Lol, because once the GitHub info (in some people's minds) definitively placed the XSX ahead of the PS5, people didn't want to question it any more. The moment Matt came in and told people not to take GitHub as confirmation, people started ignoring him and even started questioning his validity as a dev/insider again. It's pretty transparent at this point.

I think some are doing this. But I think others are just innocently placing too much stock in the GitHub leak being legit internal AMD testing data, while failing to ask the important questions about what the info is and what it means, and to acknowledge that we just don't have answers to those questions yet.

I repeat it in many posts ;)

Then on this point we agree, good sir.
 

gozu

Member
Oct 27, 2017
10,296
America
Let's contemplate the GPU performance jump between current and next gen given the latest developments. If anybody mentions the word CPU I will cut them! ;)

We'll compare the PS3/360 as an amalgam because their nominal TF difference is only 3% or so. I'm also rounding up and down by up to 3% for more digestible numbers. We'll consider the base PS4 this gen's benchmark, since it crushed the X1 in sales, and disregard the PS4 Pro/1X.

PS360: ~0.25 TF

PS4: 1.8 TF: 7x jump nominal, ??x real

PS5 @ 9.2 TF RDNA = ~12.3 TF GCN: 5x jump nominal, ~7x real

SeX @ 12 TF RDNA = ~16 TF GCN: 6.7x jump nominal, ~9x real


So the Series X has kept pace with last gen while PlayStation has lagged behind. The SeX is 33% more powerful than the PS5 in raw TFLOPs*. Why?

Maybe it's because 36 CUs is double the base PS4's 18 CUs, and the same CU count as the PS4 Pro, which was chosen because it is very convenient for backward compatibility with the PS4.

Either way, Microsoft's console is basically the high end of what people were hoping for, which is praiseworthy. If it releases at $500, it will be a fantastic bargain that will steal market share from Sony, guaranteed. At $600, it is much less compelling, so I expect Microsoft to stick to $500 and eat the loss.

Assumptions:

We know the GH leaks are legit; they have been confirmed multiple times. We know that RDNA has a ~33% perf advantage per flop vs last gen. Assuming no major bombshells, the PS5 is 9.2 TF RDNA = 12.3 TF GCN, and the SeX is 12 TF RDNA = 16 TF GCN.



* Does anyone know the real (estimated) perf jump from last gen's TFLOP to current gen's TFLOP?

* In practice, the PS5's higher clocks scale performance better than CU count does, so the actual advantage is likely around 25-28%. Still significant, but only around half of current gen's advantage.
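The GCN-equivalent numbers in the post follow from the assumed ~33% performance-per-flop advantage of RDNA over GCN (a figure AMD cited for RDNA 1; how well it holds in shipping games is exactly what the footnotes hedge on). Reproducing the arithmetic:

```python
# Reproducing the post's generational-jump arithmetic.
# The 1.33 perf-per-flop factor is the post's assumption, not a constant.
RDNA_VS_GCN = 1.33

ps360 = 0.25  # TF (PS3/360 amalgam, as in the post)
ps4 = 1.8     # TF, GCN

ps5_gcn_equiv = 9.2 * RDNA_VS_GCN  # ~12.2 "GCN TF" (the post rounds to ~12.3)
xsx_gcn_equiv = 12 * RDNA_VS_GCN   # ~16.0 "GCN TF"

print(round(ps4 / ps360, 1))                               # 7.2  (PS360 -> PS4)
print(round(9.2 / ps4, 1), round(ps5_gcn_equiv / ps4, 1))  # 5.1 6.8  (nominal, "real")
print(round(12 / ps4, 1), round(xsx_gcn_equiv / ps4, 1))   # 6.7 8.9  (nominal, "real")
```

These match the ~7x, ~5x/~7x and ~6.7x/~9x jumps listed above, within the post's stated rounding.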
 

EBomb

Member
Oct 25, 2017
464
The chart I posted above shows 40 CUs at 1.7 GHz giving us 110W for the GPU. So yes, that seems to be the ideal, most efficient clock for 7nm Navi. We will see what happens with 56 CUs, but it should be a linear increase in TDP.

2.0 GHz consumes nearly 2x the power at 210W. Why would Sony go narrow and fast only to end up with a 30% less powerful console that consumes almost 50% more power?

Wouldn't the 7-10% power gains from N7P take you pretty close to 2.0 GHz from the 1.7 GHz sweet spot of the N7 5700 XT?
 