Thx, but it's a different interview. The one I quoted was for DualShockers.
With costs of production increasing exponentially as we go to smaller nodes, I wouldn't be surprised if Sony went with a smaller chip to save some money, though I don't know how much money they would be saving with a 320mm2 GPU compared to a 380mm2 GPU.
All I know is that's very risky. Sony would be giving up precious CUs in favor of a hail mary on ridiculously high clock speeds that might not have transpired if everything didn't go according to plan.
Expect game environments to look photorealistic, thanks to megatextures like those used in the Rebirth video, at least in first-party games, and character models to look very CG. Not everything can be photorealistic just yet, so expect stylized graphics like in the Heretic demo, but I suspect some open-world games set in barren lands will come very close to the Rebirth video.
I thought the Sony leaks showed that HBM2 would either stack on top of the APU or sit right around it, and a massive cooler would sit on top, cooling everything from the APU to the HBM2 stacks.
Fake Edit: Found it.
This post says it would go under the APU. Like how?
And again, he said first-generation titles... Currently the whole graphics pipeline is not raytracing friendly. For example, all current shader systems, with their tons of different shaders, need to be redone to keep only a few global shaders, because they kill the instruction cache, simply the worst kind of cache miss there is... After that there are improvements they could make, like LOD management for the BVH, better data structures, or ray reordering, none of which are inside the first DXR implementation...
That doesn't mean that after some optimization and improvement you won't have some 1440p titles...
My analysis: no studio will go 8K because it is too costly. That's 16x the pixels of 1080p on PS4...
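For reference, the "16x" comes straight from the pixel counts:

```python
# Pixel counts behind the "16x" claim. Resolutions are the standard ones;
# treating render cost as linear in pixel count is a simplification.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

base = resolutions["1080p"][0] * resolutions["1080p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels, {pixels / base:.1f}x 1080p")
```

8K works out to exactly 16x the pixels of 1080p, and 4x the pixels of 4K.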
Even mentioning 8K as a possible resolution target is just plain absurd...
As an Xbox-primary guy, my expectations are tempered by their constant messaging around frame rates and loading times. Obviously you need much more GPU to push 4K60 than 4K30, but none of the current messaging boasts about elevating visual fidelity. I'm expecting 10 TF Navi and not falling for the 12-14 TF trap some are boasting about.
So I am guessing 21 is the heatsink. There are holes in the PCB through which metal posts or thermal paste create a thermal path to the heatsink on the other side.
I shouldn't respond because you don't deserve it, but it's your lucky day. Nope, I expect a big improvement in graphics quality but not heavy RT (I presume next-gen consoles will be less capable in this field than an RTX 2070).
So it seems you took to heart what David Cage said :)
I don't think anyone expects the RT solution in consoles to be better than the RTX cards, but it's already impressive that it seems like we'll be getting RT games next gen. I expect to be given a choice of 4K at 30 or 60 fps without RT, and 1080p/30 with RT, in next-gen games.
31/32 is a thermal interface to the mechanical package or heatsink assemblies. So I am guessing 21 is the heatsink.
32 is the APU and 31 is the RAM module. 11 connects the APU to the GPU. And in the final console build, you essentially flip this diagram upside down so the heatsink (21) goes on top, with the APU in the middle and the RAM underneath it? Do I have this right?
The question is why? MS has RAM modules off to the side; they don't need to be cooled, right? Or at least the system fan will dissipate any heat those modules generate, since they are so far apart. So why would Sony go with this? Is HBM2 really so power efficient that it can be cooled with whatever is left over from the APU? It would help them with a smaller mobo and more space for SSD chips soldered onto it, I guess.
I hope some devs will just patch their dynamic-res games to remove the upper limit of the resolution scaler.
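Such a patch would mostly mean raising one clamp. A minimal sketch of how a dynamic-resolution heuristic typically works (hypothetical function and parameter names, not any engine's real API):

```python
# Sketch of a dynamic-resolution heuristic (hypothetical names, not any
# engine's actual API). The scaler adjusts render height to hit a
# frame-time budget; 'max_height' is the cap a patch could raise.
def pick_render_height(last_frame_ms, budget_ms, current_height,
                       min_height=720, max_height=1080):
    if last_frame_ms > budget_ms:           # over budget: drop resolution
        current_height = int(current_height * 0.9)
    elif last_frame_ms < budget_ms * 0.85:  # lots of headroom: raise it
        current_height = int(current_height * 1.1)
    return max(min_height, min(max_height, current_height))

# With max_height=1080 the scaler tops out there even with huge headroom;
# raising the cap (e.g. to 2160 on stronger hardware) is the "patch".
print(pick_render_height(10.0, 16.6, 1080))                   # 1080
print(pick_render_height(10.0, 16.6, 1080, max_height=2160))  # 1188
```

On stronger hardware the same game logic would then scale up past its old ceiling instead of idling at it.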
It's funny because I was under the impression that wide and slow was the best option for consoles, because you got the best performance per watt. I know a bigger die is more expensive, but still.
They already came to that conclusion several years ago when designing the Xbox One X.
I expect similar design principles on Scarlett to those seen on Scorpio, though perhaps some of their priorities changed given the hint of the die size. Prior to the Gonzalo and now Oberon leaks, I would've guessed Scarlett featured the higher clocks, but now I just can't see how they could push 1800-2000 MHz and have an SoC around 380mm^2 in a console at a reasonable price - so something has to give, and my guess is they'll be clocked lower than the PS5.
Wide and slow is best when considering TDP. If you can figure out a way to cool the system, though, then smaller and faster is better.
Nope, I didn't expect RT at all in next-gen consoles and I'm not complaining.
So what exactly is it that you're complaining about, lol? From your posts it seemed like you were expecting next-gen consoles to have some secret NASA RT tech that lets them output next-gen games at a higher pixel count than current-gen games with RT...
No hail marys here. It's a very sound approach. Assume power consumption is equal; then you have two options. You either go wide but clock lower to stay within the target power draw, and you don't have to worry too much about TDP. Or you go narrow and clock high (at the same power consumption) but generate more heat, which means you have to deal with TDP better. No hail marys there and nothing is being given up, just two different ways to arrive at the same solution. Then there is the added benefit that a smaller chip will cost less to make, though the counterargument is that you then spend more on cooling.
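The trade-off can be put in numbers with the usual peak-FLOPS formula (FLOPS = CUs x 64 lanes x 2 ops per clock x clock). The CU counts and clocks below are illustrative, not leaked specs:

```python
# Two routes to the same peak throughput.
# Peak FLOPS = CUs * 64 lanes * 2 ops/clock * clock (GHz).
def tflops(cus, ghz):
    return cus * 64 * 2 * ghz / 1000.0

wide_and_slow   = tflops(48, 1.5)  # bigger die, lower clocks, easier TDP
narrow_and_fast = tflops(36, 2.0)  # smaller die, higher clocks, hotter
print(f"48 CU @ 1.5 GHz: {wide_and_slow:.2f} TF")
print(f"36 CU @ 2.0 GHz: {narrow_and_fast:.2f} TF")
```

Both land on about 9.2 TF, which is exactly the point: the same peak figure can come from either end of the wide/narrow spectrum.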
Stuff like that seems to be a common misconception around here. There is not a chance in hell that Sony goes with a 36 CU GPU while, as far as they are concerned, pushing the envelope of console design, and then MS, releasing in the same year, using the same architecture and even the same general design layout, comes along and manages to put 48 CUs in a similarly priced and sized box. That is just flat-out impossible.
So if PS5 is going with Navi 10 Lite, what will MS use? Will they go with 48 CUs or higher?
Why do we expect $399 if it's 9.2 TF? That's the performance territory of the 5700 XT, which costs $399. The PS4 that cost $399 had a GPU at the 7850's performance level, and the launch price of the 7850 was $249. I still assume $499 for the PS5.
I think the 8K support was more about video and a confirmation that this is HDMI 2.1 hardware.
8K support will be there for video and maybe the odd indie title (maybe). Even 4K is going to be pushing it depending on the game. If you look at the history of this gen and how the 1080p/60 dream eroded over the years, expect the same next gen with 4K.
As an Xbox primary guy that loves cool graphics and 4K on my Xbox One X, the consistent words from Phil Spencer about frame rates and smooth gameplay are music to my ears. Gameplay is king for me, and I am excited to hear that be a design goal for them.
Yes, Sony's patent basically suggests an APU cooling sandwich, if taking it literally. As it stands, chips are cooled on one side of the PCB; Sony is suggesting that cooling happen on both sides. This could improve cooling efficiency by as much as 40-80% if cooling on the alternate side is half as good as cooling on the primary side of the PCB.
1.84 TF is very close to 1.76 TF and far from 2.56 TF.
The 7870 was $350; the PS4 GPU was more powerful than a 7850 but less than a 7870.
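Those figures fall straight out of the GCN peak-FLOPS formula, using the cards' reference clocks:

```python
# GCN peak FLOPS: CUs * 64 lanes * 2 ops per clock * clock (MHz).
def gcn_tflops(cus, mhz):
    return cus * 64 * 2 * mhz / 1e6

print(f"HD 7850 (16 CU @ 860 MHz):  {gcn_tflops(16, 860):.2f} TF")   # 1.76
print(f"PS4     (18 CU @ 800 MHz):  {gcn_tflops(18, 800):.2f} TF")   # 1.84
print(f"HD 7870 (20 CU @ 1000 MHz): {gcn_tflops(20, 1000):.2f} TF")  # 2.56
```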
One thing to keep in mind is that die size isn't just CUs. Pro CUs were slightly bigger and did more than the X's. Another thing to keep in mind is that we don't know how much power the Pro APU was wasting, on average, pushing unnecessary voltage to guarantee yields - we know that the X wasn't using any.
Green Team resigning themselves to a narrower, higher clocked design it seems.
So is it going to be 60 or 66 active CUs for Scarlett?
I'm leaning towards 60 active but ymmv.
But they used the chip from the 7870, disabled two CUs, and lowered the clocks by 200 MHz. The 7850 only had 16 CUs.
Fair point, but as you see, $399 > $349 plus a more expensive cooling system for those very high clocks on the PS5.
Had they gone with a better cooling solution, they could've clocked it at 1000 MHz and hit 2.3 TF with the same chip. If MS hadn't gone with a shit 1.2 TF GPU, Sony would at least have thought about clocking it higher to get the advantage.
That's exactly what they seem to be doing now. Both MS and Sony seem to want the power advantage, and the only winner is going to be us, the consumers.
Cerny: "Challenge accepted."
HBM + a 2 GHz GPU + 8 Zen 2 cores all chugging away in the same spot is going to be really challenging from a thermal/cooling perspective.
It's going to be 36-40 on both, I think.
Which is why I think they went with a wide and slow approach initially and have only now begun to push clocks, because they can. The first Gonzalo leak was 1 GHz. Do we really believe that was a 36 CU chip?
Can we please drop 1080p once and for all (apart from games using RT, of course) and push developers next gen (okay, maybe not force them) to use native 4K at 30 or 60 fps? Just like we did with 720p this gen, when those games were laughed at on release. Only a few titles at the beginning of this gen used 720p, from what I recall.
You will not get a 2 GHz console GPU with more than 36 CUs. 1 GHz for the first engineering Gonzalo sample seems quite reasonable.
I do think an expensive cooling system was always part of their design. If they knew in 2015 they wanted an SSD, they would've known that a $399 console was out of the picture. With an extra $100 to play with, a bigger APU and a more expensive cooling system to hit those high clocks are definitely on the cards.
Let's compare the PS4 Pro with the Xbox One X:
Same GPU uarch family (broadly):
Pro Clocks 911 MHz
X1X Clock 1172 MHz
X1X is 29% higher clocks.
PS4 Pro peak power consumption is around 175W.
X1X peak averages between 180W and 200W.
29% higher power consumption than 175W is 225W and afaict the X1X doesn't go that high.
Now recall, X1X also has more GDDR5 chips AND roughly 10% more CUs... so expected peak power should have been greater than 225W for the X1X.... but it isn't.
So I guess there's more to it than just die size and clocks. You have to consider the entire system design and at least with the Pro and X1X MS was certainly doing something that Sony wasn't.
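A quick arithmetic check on the comparison above (the wattages are the rough figures quoted in the post, not independent measurements):

```python
# Naive linear scaling of power with clock (an oversimplification, since
# voltage also changes) predicts the X1X drawing well above what it
# actually peaks at, which is the point being made.
pro_clock, x1x_clock = 911, 1172   # MHz
pro_peak_w = 175                   # quoted PS4 Pro peak

clock_ratio = x1x_clock / pro_clock        # ~1.29
naive_x1x_w = pro_peak_w * clock_ratio     # ~225 W
print(f"Clock ratio: {clock_ratio:.2f}x")
print(f"Naive X1X peak from Pro's 175W: {naive_x1x_w:.0f} W")
print("Quoted X1X peak: roughly 180-200 W")
```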
Yes, MS was using a vapor chamber cooling solution.
And yes, the entire system needs to be considered. But when talking about GPU power, it literally boils down to clocks and TDP. Clock higher than you are "supposed to" and you will run into TDP issues, for which you will need better cooling. Go bigger than you are supposed to and you will run into power consumption issues, for which you will need to clock lower.
What we can't know is what is big enough or what is fast enough, cause as long as you are working within those limits you are fine.
Now, looking at the 5700 XT, it's clear where those limits are. So if Sony wants to clock higher, there are only two ways that is possible, or maybe a combination of both: using a better or more efficient manufacturing process (7nm+), or using significantly more advanced cooling. If MS is opting to go wider, that would mean being set on working within the thermal limits of the current arch, which, again thanks to the 5700 XT, we also know. So they can gain performance without having to push much harder than the stock 5700 XT, which, remember, also uses a vapor chamber cooler.
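The reason high clocks hit the TDP wall faster than extra CUs do is the rule of thumb that dynamic power scales roughly with frequency times voltage squared, and voltage has to rise near the top of the frequency range. A toy model with an assumed (not measured) voltage-frequency curve:

```python
# Rule-of-thumb dynamic power model: P ~ C * f * V^2. The linear V-f
# curve below is an illustrative assumption, not silicon data.
def relative_power(f_ghz, f_base=1.5, v_base=1.0, v_slope=0.25):
    v = v_base + v_slope * (f_ghz - f_base)   # assumed voltage at f
    return (f_ghz / f_base) * (v / v_base) ** 2

for f in (1.5, 1.8, 2.0):
    print(f"{f} GHz -> {relative_power(f):.2f}x power")
```

Under these assumptions a 33% clock bump costs roughly 69% more power, which is why the narrow-and-fast route leans so hard on cooling.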
PS5 could also mark the return of tiered console launches. But I think the lower-priced one will be $399 with a 1 TB SSD and the higher-priced one $499 with a 2.5 TB SSD.
Neither will be user replaceable.
People also said you'd never get a 1.8 GHz clock, let alone 2 GHz. Yet here we are.
This might be Sony's checkerboard rendering for ray tracing.
Komachi mentioned RayTracing only for Arden.
That could mean ray tracing hardware being only on Scarlett and not PS5. Or Scarlett could have more hardware dedicated to RT, while PS5 only has some minor stuff, like the AMD TMU (texture unit) patent.
Because it's sub-10 TF, and only 16GB of memory. But I think they'll lose more money on each console sold compared to the PS4.
But that's a voxel-based global illumination method; the name itself implies that. Sure, you could argue that it's tracing rays to a degree, but that's through a voxelized representation of the scene, not actually casting and tracing rays for each pixel in a scene made of geometry.
Yes, and that's always been a raytracer, capable of reflections and all.
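The distinction being drawn, in miniature: voxel GI steps a ray through a coarse occupancy grid rather than intersecting actual triangles. A toy sketch (fixed-step marching for brevity; real voxel GI uses DDA traversal and cone-weighted mip sampling, and real raytracers intersect geometry via a BVH):

```python
# Stepping a ray through a voxel grid: we only ever ask "is this cell
# filled?", never "which triangle does this ray hit?".
def march_voxels(origin, direction, occupied, step=0.5, max_t=20.0):
    t = 0.0
    while t < max_t:
        x, y, z = (int(origin[i] + direction[i] * t) for i in range(3))
        if (x, y, z) in occupied:   # hit a filled voxel, not a triangle
            return (x, y, z)
        t += step
    return None

occupied = {(5, 0, 0)}
print(march_voxels((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), occupied))  # (5, 0, 0)
```

A hit here tells you a coarse cell is occupied, which is plenty for blurry GI and rough reflections but not for the per-pixel, per-triangle accuracy DXR-style raytracing gives.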