From the 46th minute on, they talk about how they put their simulations onto test chips. A lot of what he said reminded me of what I've seen from the Github leak.
Interesting, I think he just explained what the Github tests are.
There is no way Microsoft has an exclusive on the term RDNA2... I mean, it's all over AMD's graphics cards and roadmaps now.
The curious thing to me is that MS co-developed RDNA 2.0 with AMD. Does Sony get to use RDNA 2.0? It's funny how we thought Navi was made for Sony, but it seems it was MS who was working on the tech that's going into the next-gen consoles, or at least one of them.
It will be curious to see what makes Sony's custom SoC different in case MS had some kind of exclusivity on RDNA 2.0. Are we looking at the same perf/watt ratio? Is RT the same? WTF were the Github leaks about if those 9.2 TFLOPS are actually supposed to mimic 9.2 RDNA 2.0 TFLOPS? Is Sony at like 6 RDNA 2.0 TFLOPS?
This is not correct. The 50% gain is in performance per watt; that doesn't change the TF measurement. There may be IPC gains, but we have no idea what those are yet, and they will be the same for both Microsoft and Sony since they are both using RDNA 2.
9.2 RDNA 2.0 TFLOPS are 6 RDNA 1.0 TFLOPS.
do you think the ps5 will be 6 tflops?
If Sony does end up having a console that is 9.2TF or less, then the reasons may be
Quoting myself, because my math was off; I updated the original post.
These should be the correct numbers:
So now, all of a sudden, that 2.0 GHz part that was taking 161W should only take about 106W. Zen 2 should only be 20W, maybe even less. GDDR6 and the rest should be 30-50W, and you can have a nice 170W console.
If MS is at 56 CUs at 1.7 GHz, they are only around 97W for 12 TFLOPS.
Now the question is why Sony would go with a 106W 9 TFLOPS GPU when they can go with a 97W 12 TFLOPS GPU. Is 15% more die space really that expensive?
56 CUs at 2.0 GHz should be around 148W. That's a 200-220W console. I really don't see why they can't do this.
Edit: this also proves you wouldn't need a super expensive, fancy cooling solution to cool that GPU, unless it's really 56 CUs running at 2.0 GHz.
14 TFLOPS might be improbable, but it's the only explanation: both Sony and MS went wide and slow, then Sony got some fancy cooling solution and was able to clock a bit higher than MS. 13-14 TFLOPS.
If anything, this proves that MS has a lot of headroom to increase clocks. They don't have to settle for 12 TFLOPS; they should be around 13 TFLOPS by the time the consoles launch.
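A quick sketch of the math behind these TFLOPS and wattage figures. The 128 FP32 FLOPs per CU per clock is the standard RDNA rate; the 161W baseline and the 50% perf/watt scaling are the assumptions from the post above, not confirmed numbers:

```python
# Napkin math: RDNA TFLOPS and the claimed RDNA 2 power savings.
# 128 FP32 FLOPs per CU per clock (64 ALUs x 2 for FMA) is the standard rate.

def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 128 * clock_ghz / 1000

def rdna2_watts(rdna1_watts: float, perf_per_watt_gain: float = 0.5) -> float:
    # Same work at 1.5x perf/watt means power drops to 1/1.5 of the RDNA 1 figure.
    return rdna1_watts / (1 + perf_per_watt_gain)

print(round(tflops(36, 2.0), 1))  # 9.2 TF for 36 CUs at 2.0 GHz
print(round(tflops(56, 1.7), 1))  # 12.2 TF for 56 CUs at 1.7 GHz
print(round(tflops(56, 2.0), 1))  # 14.3 TF for 56 CUs at 2.0 GHz
print(round(rdna2_watts(161)))    # ~107 W for the part that drew 161 W on RDNA 1
```

The perf/watt scaling only holds at identical clocks and CU counts; clocking higher eats into the 50% gain, which is why the wide-and-slow configs come out ahead here.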
RDNA 2 is heavily implied to be used on both consoles as the "common architecture" of next gen consoles
RDNA 2 has an IPC increase over RDNA 1
RDNA 2 has 50% performance per Watt increase.
that's pretty much it.
This is correct. Though there may be some IPC improvements, they will not change any of the clocks, shader engines, CUs, etc.
I don't think architecture (RDNA 1 vs RDNA 2) has any bearing on GPU compute performance / teraflops.
That's determined by the number of CUs and clock speed. While RDNA 2 probably allows potentially higher clocks than RDNA 1, RDNA 1 vs RDNA 2 does not dictate the number of shader engines, compute units, shader ALUs, etc.
Am I wrong?
This is exactly what I hoped to hear.
There are IPC increases, and the massive improvement to performance per watt means they can go for higher performance targets.
They are referring to the rumors that MS will announce FM8 and RDNA 2 exclusivity, which turned out to be fake, as expected.
What is miles away from rumors? The recent rumor said RDNA 2 has a 50% IPC increase over RDNA 1, and people doubted it just a few days ago. It is confirmed now with an exact number.
In fact, there is even more than just that:
PS5 Speculation |OT12| - Aw hell, Transistor's running this again? OT
Oh really? Then that would make me rate 1 equally likely as 2. (www.resetera.com)
Microsoft co-developed the Xbox chips with AMD. This is the same thing Sony does when they make semi-custom chips for their console.
Get this shit out of here:
/v/ - PS5 BC solution from AMD - Video Games - 4chan (boards.4channel.org)
What? The Xbox One S is only 55W during gameplay for the entire console. Here the GPU alone is 97W, and the APU will likely be 130-150W. That's a 200W console, 4x the power consumption of the Xbox One S playing Sunset Overdrive.
Your result should tell you that you've made an incorrect assumption somewhere. You're describing an XSX that has the power envelope of an Xbox One S, but for some reason MS built it into an enormous tower.
Same point. If you were right, there'd be no reason to settle at 12TF for the kind of all-out, premium device XSX is supposed to represent in contrast to Lockhart. They'd just be leaving hundreds of MHz on the table. The fact that your calculations don't at all match the market strategy shown so far should give you pause.
Off topic, but just wanted to say, I'm going through Yakuza 0 now and just got past that part in your avatar two nights back. Such an amazing introduction for Majima, haha
But... modiz, where does this leave the Oberon CU count? From what I gathered, the tests didn't confirm whether those were the full CU count or not, so those saying 13TF are assuming 36-40 CUs at 2.0GHz with RDNA 2?
13.8TF would be 54 CUs.
Does the possibility still exist that Oberon did not have the full CU count, and it could be higher? Or do we not know? Or can't we know until Sony tells us the CUs/clocks, etc.?
I can't believe a developmental APU unearthed over a year ago isn't representative of final hardware to release 2 years later. Just flabbergasted.
I gave up on trying to find logic in all the rumors; either Jason/DF/whoever leaks it all, or let Sony do it. The fact that they were testing at 2000MHz doesn't mean Oberon B0 also had the final TDP target.
About tree fiddy
What? Xbox one s is only 55w during gameplay for the entire console. Here the gpu alone is 97w and the apu will likely by 130-150w. That's a 200w console. 4x more power consumption than the Xbox one s while playing sunset overdrive.
Xbox One S Power Consumption (nerdburglars.net)
And I literally said they won't be settling for 12 tflops.
It doesn't matter if it performs better; when it comes to marketing, 9.2 is less than 10.7.
Numbers are the easiest thing for the mass market to understand.
This is giving me flashbacks to when people thought MS or Jason was talking about something other than hard numbers.
No, but they hear from their misinformed friends, who read misinformed articles on clickbaity sites.
Wait, you think the general public actually looks at teraflops to decide which console to buy?
To be fair, didn't the 1X use quite a fancy cooling system itself? Maybe that was quite expensive, so they decided the more economical way to go would be to just make it a tower?
I thought you were talking rated watts, not power at the wall, but the point stands. An Xbox One X can draw 170W+ at the wall. So you think MS has gone all out to build a massive tower that will draw 150W GPU + CPU, maybe a bit more than the XB1X overall? That doesn't seem to justify the increased overall volume and, presumably, cooling capacity.
The Xbox One X Review: Putting A Spotlight On Gaming (www.anandtech.com)
Edit, misread your post somewhat.
So why does RDNA2 disprove the GitHub stuff?
I'm 99% sure the Github data is just showing a Navi 1 chip Sony is using to simulate BC, and not the real Navi 2 chip.
Since Sony's BC solution is more hardware-dependent, they had to have started testing and developing it way before MS, and way before AMD had RDNA 2 ready. Today's AMD investor conference confirmed that RDNA 2 is not different from RDNA 1 in terms of architecture, which makes the case for Sony using an RDNA 1 GPU for BC testing even more plausible: they don't need the RDNA 2 feature set for testing BC, and can just pump out some hardware for their engineers to work on.
I'm now more convinced than ever that the Github data is not the full PS5 chip.
This is literally the first thing I speculated in these threads, and I was immediately told how unlikely that would be.
As the doctor has said, probably an old chip Sony used to test/develop BC. Klee and Matt been knew.
I'm not educated enough to follow this thread; can someone tell me if native 4K is gonna be the norm for PS5 games?
No.
There is no actual test data for Arden in the Github leak, AFAIK, only the theoretical comparison values.
Why should I give that up?
The chip tested in Github was Oberon.
Oberon is the chip that is seen as the PS5 chip.
That chip was tested for BC with modes at 800 MHz, 911 MHz and 2000 MHz.
Those tests were annotated with a 36 CU full count in native mode.
The data comes from AMD directly, by the mistake of a clueless intern.
And what architecture that chip was is not part of the data, nor does it have relevance for the testing.
Show me a chip codename that is PS5 other than Oberon. You can't.
For the moment, I only see a desperate attempt to bury Github for undisclosed reasons.
Let's change roles now:
Please show me a scenario that makes sense with the given information about PS5 and its BC modes.
Is it a 3 SE 54/60 CU GPU at 2000 MHz?
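The BC clock modes listed above line up with PS4-family compute targets, which is easy to check. This assumes the standard 128 FP32 FLOPs per CU per clock; the 36 CU figure is what the leaked test comments state, and the base PS4's 18 CU at 800 MHz config is public knowledge, not part of the leak:

```python
# Rough TFLOPS check on the leaked Oberon BC clock modes.
# Assumes 128 FP32 FLOPs per CU per clock (64 ALUs x 2 for FMA).

def tflops(cus: int, clock_mhz: float) -> float:
    return cus * 128 * clock_mhz / 1e6

print(round(tflops(36, 911), 2))   # ~4.2 TF, matching PS4 Pro (36 CU @ 911 MHz)
print(round(tflops(18, 800), 2))   # ~1.84 TF, matching the base PS4 (18 CU @ 800 MHz)
print(round(tflops(36, 2000), 2))  # ~9.2 TF in the 2000 MHz native mode
```

The 911 MHz mode landing exactly on the PS4 Pro's 4.2 TF is why the leak reads as BC testing rather than arbitrary clock sweeps.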
These past few pages have been a hoot. So let me get this straight: for 6 months we've been hearing that Github doesn't make any sense because the chip is clocked too high at 2GHz. A thousand posts about "1.7GHz is the sweet spot" or "this chip will be hotter than XSX" and all that kind of stuff. And now, when we actually get confirmation that 2GHz should make a lot of sense with the new RDNA 2 efficiency gains, suddenly "Github is dead"? It's the other way around, my friends: Github just got another data point that adds up.
Suddenly "we" are saying Oberon A0 and B0 are RDNA 2? Think again, I've been saying that ever since XSX was announced as RDNA 2. Just one example from three days ago:
So yes, nothing changes, except Github makes even more sense now because we know why Sony went for 2GHz. As I've said 100 times, and will say 100 times more, Oberon A0 and the final PS5 iGPU are probably the same iGPU. Oberon A0 was already RDNA 2; it already had RT and VRS, and steppings are made for fixing bugs, improving power consumption, improving testability, improving clocks, etc.
That was very low.
How do we know it's 13.8TF...? The one guy who said that, BGs, later said it was a joke.
I see, thanks. Seeing guys throw out 13TF as a possible spec... very exciting times. I will be happy with 12... or more.
Personally, I still think it'll be over 10TF, especially now that RDNA 2 has an additional IPC increase over RDNA 1 (I saw anex saying 15% earlier, but I am not sure where that's from; 15% would be an amazing improvement. Add that to the 30% improvement from Polaris to RDNA 1, and you get pretty much 50% better performance per flop compared to Polaris, amazing!). It means the console would be very capable.
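The compounding in that parenthetical checks out; both percentages are the rumored figures cited in the thread, not confirmed numbers:

```python
polaris_to_rdna1 = 1.30  # cited ~30% perf-per-clock gain, Polaris -> RDNA 1
rdna1_to_rdna2 = 1.15    # rumored ~15% IPC gain, RDNA 1 -> RDNA 2

combined = polaris_to_rdna1 * rdna1_to_rdna2
print(round((combined - 1) * 100, 1))  # ~49.5, i.e. roughly 50% better perf per flop vs Polaris
```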
I don't think anyone doubted the 2GHz clock speed after a while; the doubt comes from 36 CUs. With PS5 seemingly being RDNA 2 (no 40 CU limit), having RT, allegedly supporting VRS, and Oberon being associated with Navi 10 Lite by both _rogame and Komachi, that is where Github falls apart. The only reason we think Oberon/Ariel/Flute is PS5 in the first place is because Komachi guessed it was PS5 (not being sure themselves), the same way Komachi guessed that Oberon/Ariel/Flute are Navi 10 Lite.
Thanks for this, thanks for breaking down the actual evidence we have right now.
This is what I have been seeing a lot of, too. I've never even looked at Github, but reading these last few pages, I'm seeing a lot of evidence to back up Github, not kill it... yet there is a lot of "Github is dead."
I was in the "no way it's 2000MHz" camp, but after today I'll hold up my hands and say I can now see how that's possible.
I'm going to make a prediction. If and that's a big IF, PS5 does indeed turn out to be more powerful than XSX, I can see some already latching onto "Sony must be using RDNA1 figures while XSX is RDNA2, so Sony is misleading consumers!". Then we get into the whole architectural efficiency arguments again. Hoo, boy, that oughta be fun
Because the Github GPU was RDNA 1.
Just woke up, what's the latest?
RDNA 2 for PS5 more or less confirmed?
9TF very unlikely? Why?