
Albert Penello

Verified
Nov 2, 2017
320
Redmond, WA
I agree with Albert Penello on this one: with the slowdown of Moore's law becoming more and more problematic, I think mid-gen consoles aren't a sure thing at all.

It may simply be less necessary as well. 4K was becoming a mainstream resolution for PCs and TVs, and the base consoles were designed around driving 1080p (or less) output. When you have a set that requires 4x the performance *just* to drive 4x the pixels, then you eat up all the performance just driving resolution. I think it's unlikely we'll see 8K TVs go mainstream in the same way we saw 4K go mainstream - we're more likely to see improvements in nits (to drive better HDR) or better framerates to support greater than 60fps on TVs. CPUs and GPUs in the next gen should easily support higher frame rates and wider colors.

So the mid-gen upgrades are not only less financially and technically viable, but also likely less necessary to keep up with display technologies.
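The "4x the pixels" arithmetic is easy to sanity-check; here's a minimal Python sketch using the standard 1080p/4K/8K consumer resolutions (plain arithmetic, nothing specific to any console):

```python
# Pixel counts for common TV resolutions, relative to 1080p.
resolutions = {
    "1080p": (1920, 1080),
    "4K": (3840, 2160),
    "8K": (7680, 4320),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.0f}x 1080p)")

# Output:
# 1080p: 2,073,600 pixels (1x 1080p)
# 4K: 8,294,400 pixels (4x 1080p)
# 8K: 33,177,600 pixels (16x 1080p)
```

Going from 4K to 8K is another 4x jump in raw pixel count, which is the point about resolution swallowing any performance gain.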
 

bitcloudrzr

Member
May 31, 2018
13,855
It may simply be less necessary as well. 4K was becoming a mainstream resolution for PCs and TVs, and the base consoles were designed around driving 1080p (or less) output. When you have a set that requires 4x the performance *just* to drive 4x the pixels, then you eat up all the performance just driving resolution. I think it's unlikely we'll see 8K TVs go mainstream in the same way we saw 4K go mainstream - we're more likely to see improvements in nits (to drive better HDR) or better framerates to support greater than 60fps on TVs. CPUs and GPUs in the next gen should easily support higher frame rates and wider colors.

So the mid-gen upgrades are not only less financially and technically viable, but also likely less necessary to keep up with display technologies.
For the market they are targeting, they can follow the Pro and double the performance three years in, for example. Being able to continue to play at ultra and higher ray tracing settings, and possibly a higher base resolution, would be enough.

At the end of the day, the market you are selling to is willing to upgrade for better visuals even without a major gimmick like 4K. Additionally, you will have the customer that just buys the "best" version without being an enthusiast.
 

Hermii

Member
Oct 27, 2017
4,685
It may simply be less necessary as well. 4K was becoming a mainstream resolution for PCs and TVs, and the base consoles were designed around driving 1080p (or less) output. When you have a set that requires 4x the performance *just* to drive 4x the pixels, then you eat up all the performance just driving resolution. I think it's unlikely we'll see 8K TVs go mainstream in the same way we saw 4K go mainstream - we're more likely to see improvements in nits (to drive better HDR) or better framerates to support greater than 60fps on TVs. CPUs and GPUs in the next gen should easily support higher frame rates and wider colors.

So the mid-gen upgrades are not only less financially and technically viable, but also likely less necessary to keep up with display technologies.
I agree with all of this.

There is always the enthusiast market that wants the latest and greatest tech, but I don't think that market is big enough to justify a mid gen refresh, without the argument of keeping up with display resolution.

Depends how noticeable the gap between the SSDs will be in practice when all is said and done, but if it's significant, I could see Xbox upgrading to match or exceed the PS5 down the line.
 

Albert Penello

Verified
Nov 2, 2017
320
Redmond, WA
For the market they are targeting, they can follow the Pro and double the performance three years in, for example. Being able to continue to play at ultra and higher ray tracing settings, and possibly a higher base resolution, would be enough.

At the end of the day, the market you are selling to is willing to upgrade for better visuals even without a major gimmick like 4K. Additionally, you will have the customer that just buys the "best" version without being an enthusiast.

I don't see a 20/24 TFLOP machine being affordable in a console form factor even in 3 years. The node change from 7nm to 5nm or 3nm is going to be cost prohibitive, and just mathematically, unless they hit 3nm you're only going to see a 30% reduction in size while you're doubling the TFLOPs, so the chip has to grow.

Additionally, you can't really double the GPU without growing the CPU and memory, or you run into other bottlenecks, which further adds cost.

There may be other silicon advancements I'm not privy to, but it's pretty widely known this is a real challenge right now. So looking through today's lens, I think it's unlikely you're going to see a mid-gen console this cycle.
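That die-size argument can be put into a rough toy model. The linear area-per-TFLOP scaling and the 30%/50% shrink figures below are illustrative assumptions, not process facts:

```python
# Toy model: GPU area scales roughly linearly with CU count (and so with
# TFLOPs at a fixed clock), and a node shrink cuts area by some fraction.
def relative_die_area(tflops_scale: float, area_shrink: float) -> float:
    """Area of the new chip relative to the old one under the toy model."""
    return tflops_scale * (1.0 - area_shrink)

print(f"{relative_die_area(2.0, 0.30):.2f}x")  # double TFLOPs, 30% shrink -> 1.40x the area
print(f"{relative_die_area(2.0, 0.50):.2f}x")  # double TFLOPs, 50% shrink -> 1.00x (break even)
```

Under those assumptions, roughly a 50% area reduction is needed just to double TFLOPs without growing the die, which is the gist of the "chip has to grow" point.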
 

BreakAtmo

Member
Nov 12, 2017
12,805
Australia
It may simply be less necessary as well. 4K was becoming a mainstream resolution for PCs and TVs, and the base consoles were designed around driving 1080p (or less) output. When you have a set that requires 4x the performance *just* to drive 4x the pixels, then you eat up all the performance just driving resolution. I think it's unlikely we'll see 8K TVs go mainstream in the same way we saw 4K go mainstream - we're more likely to see improvements in nits (to drive better HDR) or better framerates to support greater than 60fps on TVs. CPUs and GPUs in the next gen should easily support higher frame rates and wider colors.

So the mid-gen upgrades are not only less financially and technically viable, but also likely less necessary to keep up with display technologies.

Based on a quote from Andrew House, the point of the Pro wasn't to keep up with display tech, it was to offer a more powerful PS4 to convince the hardcore big-spender types not to make a jump to PC after a few years. The need for that will absolutely still be around in 4 years.

I don't see a 20/24 TFLOP machine being affordable in a console form factor even in 3 years. The node change from 7nm to 5nm or 3nm is going to be cost prohibitive, and just mathematically, unless they hit 3nm you're only going to see a 30% reduction in size while you're doubling the TFLOPs, so the chip has to grow.

Additionally, you can't really double the GPU without growing the CPU and memory, or you run into other bottlenecks, which further adds cost.

There may be other silicon advancements I'm not privy to, but it's pretty widely known this is a real challenge right now. So looking through today's lens, I think it's unlikely you're going to see a mid-gen console this cycle.

The hope is that by 2024, they wouldn't just have 3nm but also chiplet technology that would make the project much more affordable.
 

Jedi2016

Member
Oct 27, 2017
15,598
Based on a quote from Andrew House, the point of the Pro wasn't to keep up with display tech, it was to offer a more powerful PS4 to convince the hardcore big-spender types not to make a jump to PC after a few years.
[GIF: Jennifer Lawrence thumbs-up]
 

disco_potato

Member
Nov 16, 2017
3,145
Sony says they need 8TF for 4K games. Releases a 4TF console. Clearly chasing that 4K dream and failing.
Some of y'all are hilarious.


Any news updates?
 

Kyoufu

Member
Oct 26, 2017
16,582
Based on a quote from Andrew House, the point of the Pro wasn't to keep up with display tech, it was to offer a more powerful PS4 to convince the hardcore big-spender types not to make a jump to PC after a few years. The need for that will absolutely still be around in 4 years.

Right. PS5 Pro is pretty much a lock IMO. Display technology has never been the main driver of any "Pro" line of hardware option for any company.
 

gundamkyoukai

Member
Oct 25, 2017
21,077
Based on a quote from Andrew House, the point of the Pro wasn't to keep up with display tech, it was to offer a more powerful PS4 to convince the hardcore big-spender types not to make a jump to PC after a few years. The need for that will absolutely still be around in 4 years.

Yep, if you have a market willing to pay, why not make it, as long as you're not selling at a loss.
I would easily buy a Pro version in the next few years.
 

BreakAtmo

Member
Nov 12, 2017
12,805
Australia
Yep, if you have a market willing to pay, why not make it, as long as you're not selling at a loss.
I would easily buy a Pro version in the next few years.

Well, it doesn't necessarily prove a PS5 Pro will happen. I just think it will depend on whether or not the PS4 Pro reduced the percentage of big spenders leaving the ecosystem in the second half of this gen versus the second half of last gen. For all we know the current Pro didn't manage good numbers on that front and Sony will decide the experiment didn't work (or they'll decide it wasn't technically impressive enough to draw in those open to PC gaming and go stronger with the PS5 Pro).
 

Kyoufu

Member
Oct 26, 2017
16,582
With AMD making rapid progress with each Zen/RDNA iteration, I can't think of any real downside to bringing out a PS5 Pro and an Xbox Series X Plus using whatever the latest CPU/GPU architectures AMD has at the time. More performance is no bad thing, and the Pro/X proved there are millions of consumers willing to buy a more premium product.
 

Jedi2016

Member
Oct 27, 2017
15,598
www.theguardian.com

PlayStation boss on PS4 Pro: our approach isn't reactive this time around

Andrew House also discussed relevance of physical media in gaming, following launch of Sony’s Ultra 4k streaming service
I wasn't doubting that Andrew House said it, I was doubting Andrew House. Because regardless of what they say they meant with the release, they (both of them) sure as shit pushed the hell out of the "4K Console" idea in their marketing and presentations.
 

BreakAtmo

Member
Nov 12, 2017
12,805
Australia
I wasn't doubting that Andrew House said it, I was doubting Andrew House. Because regardless of what they say they meant with the release, they (both of them) sure as shit pushed the hell out of the "4K Console" idea in their marketing and presentations.

Designing and planning a console around targeting the hardcore while also trying to get more casual sales through simplistic "4K console for your 4K TV!" marketing are not mutually exclusive by any means - if anything, they're complementary, since the hardcore will usually seek out information on these sorts of products themselves.

I also have no clue why you would think House was lying about this because why the hell would he do that? What benefit would there be, rather than just saying they made it for 4K TVs?
 

MrKlaw

Member
Oct 25, 2017
33,029
I don't see a 20/24 TFLOP machine being affordable in a console form factor even in 3 years. The node change from 7nm to 5nm or 3nm is going to be cost prohibitive, and just mathematically, unless they hit 3nm you're only going to see a 30% reduction in size while you're doubling the TFLOPs, so the chip has to grow.

Additionally, you can't really double the GPU without growing the CPU and memory, or you run into other bottlenecks, which further adds cost.

There may be other silicon advancements I'm not privy to, but it's pretty widely known this is a real challenge right now. So looking through today's lens, I think it's unlikely you're going to see a mid-gen console this cycle.

And you're then also painting yourself into a corner because you need some headroom for your next 'proper' console
 

褲蓋Calo

Alt-Account
Banned
Jan 1, 2020
781
Shenzhen, China
I wasn't doubting that Andrew House said it, I was doubting Andrew House. Because regardless of what they say they meant with the release, they (both of them) sure as shit pushed the hell out of the "4K Console" idea in their marketing and presentations.
I agree - just look at the enhanced games: some provided a high-performance/60 FPS mode, some were more playable than on the base console, but beyond that, resolution is the single most significant visual improvement. If you look at effects fidelity, there's nothing that rivalled their PC counterparts.
 

dgrdsv

Member
Oct 25, 2017
11,817
I don't see a 20/24 TFLOP machine being affordable in a console form factor even in 3 years.
This will depend solely on production-line advancements in cost per transistor.
Technically, you can build a 20-24 TFLOPS GPU right now on 7nm, but it would be too expensive for the console market.
Either a move to 5nm or a pricing decrease of 7nm over time may make such a chip suitable for the console space in 3 years. Or it may not, depending on how production costs fall during this time.

The node change from 7nm to 5nm or 3nm is going to be cost prohibitive, and just mathematically, unless they hit 3nm you're only going to see a 30% reduction in size while you're doubling the TFLOPs, so the chip has to grow.
Not sure where you've heard 30%, as N7->N5 alone should be about 85%. But this is completely dependent on the design of the chip, and it's impossible to predict the scaling some specific chip may get - too many variables here. It should still be considerably more than just a 30% size reduction between 7 and 3nm.

Additionally, you can't really double the GPU without growing the CPU and memory, or you run into other bottlenecks, which further adds cost.
The CPU is not an issue anymore (the issue is single-threaded performance now, but that isn't a matter of chip size), and there are obvious ways of expanding memory bandwidth - which hopefully will come down in price enough to be suitable for next-next-gen consoles.

So looking through today's lens, I think it's unlikely you're going to see a mid-gen console this cycle.
This I agree with, but for a different reason: there's no target to hit with a mid-gen upgrade this time. Last time it was 4K TVs; this time 8K won't take off till 2025, and even then it will be way too expensive - and likely useless for an average gamer - to try and render anything in 8K on a console. So what would such an "upgrade" accomplish?
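Putting the density counter-argument in numbers, treating the quoted ~85% density gain for N7 to N5 as the working assumption (purely a sketch, ignoring that SRAM and analog/IO scale worse than logic):

```python
# If transistor density improves by 85%, a chip with double the transistors
# ends up only slightly larger than the original (toy model).
density_gain = 1.85      # assumed N7 -> N5 density improvement
transistor_scale = 2.0   # doubling the GPU's compute resources

relative_area = transistor_scale / density_gain
print(f"{relative_area:.2f}x the original die area")  # ~1.08x
```

That is essentially where the two posts disagree: whether the practical shrink is closer to a 30% reduction or to a near-halving of area.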
 

Mathieran

Member
Oct 25, 2017
12,853
Let's not forget that the Pro was more or less an experiment. We don't know if it met their expectations or not. I feel like that would be a large factor in determining if they want to do that again.
 

Bearly_There

Member
Mar 16, 2020
30
Also, in the context of 120 FPS gaming, the latency between cores is probably minor. The operating system usually lets you set core affinity for your worker threads; unless you deliberately want them to cause false sharing, most of the traffic should be for synchronization.

Twitter Matt later more or less admitted he has no direct knowledge of the SoC's architectural design:

I think that you are quite mistaken. It's in such demanding situations as real-time interactivity at 120 frames per second that issues like latency in caches and inter-core communication become crucial. It makes sense that Sony might try to squeeze every ounce of efficiency here, because high-FPS real-time interactive simulation is especially relevant to VR.

I would also expect that, unofficially, Matt is very familiar with the architectural details of the new consoles. He'd be too well connected not to. His technical knowledge is also incomparably superior to a layman enthusiast journalist, though of course that doesn't mean his word should be treated as unquestioned truth.
 

褲蓋Calo

Alt-Account
Banned
Jan 1, 2020
781
Shenzhen, China
I think that you are quite mistaken. It's in such demanding situations as real-time interactivity at 120 frames per second that issues like latency in caches and inter-core communication become crucial. It makes sense that Sony might try to squeeze every ounce of efficiency here, because high-FPS real-time interactive simulation is especially relevant to VR.

I would also expect that, unofficially, Matt is very familiar with the architectural details of the new consoles. He'd be too well connected not to. His technical knowledge is also incomparably superior to a layman enthusiast journalist, though of course that doesn't mean his word should be treated as unquestioned truth.
So, can I say that you think the PS5 has a single octacore CCX?
 
Jun 18, 2018
1,100
It may simply be less necessary as well. 4K was becoming a mainstream resolution for PCs and TVs, and the base consoles were designed around driving 1080p (or less) output. When you have a set that requires 4x the performance *just* to drive 4x the pixels, then you eat up all the performance just driving resolution. I think it's unlikely we'll see 8K TVs go mainstream in the same way we saw 4K go mainstream - we're more likely to see improvements in nits (to drive better HDR) or better framerates to support greater than 60fps on TVs. CPUs and GPUs in the next gen should easily support higher frame rates and wider colors.

So the mid-gen upgrades are not only less financially and technically viable, but also likely less necessary to keep up with display technologies.

I'm in agreement here. From a customer-facing perspective, what is going to be the thing that another mid-gen refresh can offer? More titles with 120fps and/or 8K? That'd be nice, but I can't see either technology growing at the rate of 4K and HDR. Better graphics? We're at the point where big changes to visuals require huge improvements to hardware - probably more than a node shrink or two provides.

If there was a refresh from Sony and MS, I could see each side taking a step towards what the other offers - straight up horsepower vs I/O speeds, rather than doubling down on what they currently do better.

Looking beyond that (and taking this a step further away from the topic), I can't help but conclude that a true next generation will focus heavily on hardware-driven machine learning to assist in or take over all parts of world simulation - rendering, lighting, physics, content generation, AI, audio, and streaming to secondary devices.
 

xeroyear

Member
Nov 8, 2018
199
What he said was he preferred clocking higher rather than adding more CUs to achieve the same theoretical TF, because a rising tide raises all boats. Meaning that if the GPU is clocked higher, everything inside the GPU gets clocked higher and not just the CUs.

52 CU at 1.6GHz = 10TF
36 CU at 2.23GHz = 10TF

The latter 36 CU GPU will have higher rasterization, pixel and texture fill rate, and higher cache bandwidth, even though both have the same theoretical 10TF. That is not downplaying CUs; it's more about the effects of clocking higher. The downside of clocking high is latency, but the benefits outweigh the downside, assuming you can cool said GPU appropriately to perform at that high clock.


Nods. From October 2019, at [27:14]:

"The easiest win, in improving graphics performance from a console SoC, is to improve the clock speed – every part of the GPU gets extra juice from that..."
 

Raven Prime

Member
Oct 31, 2017
174
This I agree with, but for a different reason: there's no target to hit with a mid-gen upgrade this time. Last time it was 4K TVs; this time 8K won't take off till 2025, and even then it will be way too expensive - and likely useless for an average gamer - to try and render anything in 8K on a console. So what would such an "upgrade" accomplish?

We are used to thinking of monitors and TVs as the only relevant display tech to consider when talking about consoles. But there is always the question of what PSVR2 will bring, and with over 5 million PSVR1 headsets sold, I think this is always something to consider as well. Perhaps catering to the 8K market won't offer a lot for the coming years, but if VR adoption keeps rising and more AAA games get developed for it, then improved VR experiences could be a reason as well, besides all the other Pro features a mid-gen upgrade could bring.
 

dgrdsv

Member
Oct 25, 2017
11,817
But there is always the question of what PSVR2 will bring, and with over 5 million PSVR1 headsets sold I think this is always something to consider as well.
Nah. The VR market is a complete commercial flop at the moment. It certainly won't be the driver behind something as expensive as producing new console hardware.
 

Raven Prime

Member
Oct 31, 2017
174
Nah. The VR market is a complete commercial flop at the moment. It certainly won't be the driver behind something as expensive as producing new console hardware.

Compared to non-VR gaming perhaps, but for Sony at least I wouldn't call it a flop at all. Especially if PSVR2 manages a decent implementation of wireless and foveated rendering tech, it could really take off. It's clear that Sony understands the potential of VR and isn't backing down.

So I guess it comes down to how you look at it: they won't release a PS5 VR edition, but when they make that PS5 Pro, VR will definitely be a contributing aspect.

 

AegonSnake

Banned
Oct 25, 2017
9,566
The latest DF video on the 60 fps Spider-Man and KZSF footage is very interesting. If we can use hardware interpolation to get 60 fps, we could have fake 60 fps like we did fake 4K on the Pro. Sure, there are some artifacts, but if you can't tell most of the time, what's the difference?

I wonder if Nvidia will have something like DLSS, but for framerates.
 

Carn

Member
Oct 27, 2017
11,908
The Netherlands
This I agree with, but for a different reason: there's no target to hit with a mid-gen upgrade this time. Last time it was 4K TVs; this time 8K won't take off till 2025, and even then it will be way too expensive - and likely useless for an average gamer - to try and render anything in 8K on a console. So what would such an "upgrade" accomplish?

I agree; and I never really believed Andrew House's arguments that they wanted to push out a mid-gen update so they could compete with PC (which in the end, they didn't, imho). To me, it all sounds like an experiment driven by the fact that 4K displays and 4K content were getting pushed heavily, and it would be kinda lame if you bought a console that couldn't power those displays. It is a clear proposition for the average consumer as well.

That said, and as you mentioned, the 'race to 8K' will take quite a few more years and would be a big technical hurdle if attempted natively, unless we get there by DLSS-like solutions. That said, maybe we'll see an update with more storage and more advanced raytracing support; but that would really speak to the hardcore crowd mostly.
 
Feb 8, 2018
2,570
The latest DF video on the 60 fps Spider-Man and KZSF footage is very interesting. If we can use hardware interpolation to get 60 fps, we could have fake 60 fps like we did fake 4K on the Pro. Sure, there are some artifacts, but if you can't tell most of the time, what's the difference?

I wonder if Nvidia will have something like DLSS, but for framerates.

The answer lies within your comment. Fake 60fps with artifacts versus another solution. It's like dynamic resolution scaling and other techniques. It isn't the real thing that reliably holds the highest resolution, framerate, and image clarity.
 

MrKlaw

Member
Oct 25, 2017
33,029
The answer lies within your comment. Fake 60fps with artifacts versus another solution. It's like dynamic resolution scaling and other techniques. It isn't the real thing that reliably holds the highest resolution, framerate, and image clarity.

It's a technique that seems to work well in VR - timewarp/spacewarp, etc. You can poll the controller at 60fps or higher and run other game logic at 60fps so you get proper responsiveness, and it's just the graphics that are interpolated.

E.g. if you're spinning around, you could almost get away with just scrolling the screen right to left (super fast) rather than fully calculating a brand new view.
 

DrKeo

Banned
Mar 3, 2019
2,600
Israel
It's a technique that seems to work well in VR - timewarp/spacewarp, etc. You can poll the controller at 60fps or higher and run other game logic at 60fps so you get proper responsiveness, and it's just the graphics that are interpolated.

E.g. if you're spinning around, you could almost get away with just scrolling the screen right to left (super fast) rather than fully calculating a brand new view.
It probably won't work well on a standard game. It works well for camera movement which is extremely important for VR because the camera is attached to the user's head movement but motion inside the frame itself won't be able to benefit much from it.

I really doubt frame interpolation will be a thing. It doesn't improve the response time and usually makes it even worse.
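A rough illustration of the response-time point: an interpolator needs the next real frame in hand before it can generate the in-between image, so it adds at least one source-frame interval of display latency. The figures below are just that arithmetic, not measurements of any real system:

```python
def added_latency_ms(source_fps: float) -> float:
    # Interpolating between frames N and N+1 requires already having N+1,
    # so the displayed image lags by at least one source-frame interval.
    return 1000.0 / source_fps

print(f"{added_latency_ms(30):.1f} ms")  # 33.3 ms extra on top of the game's own latency
print(f"{added_latency_ms(60):.1f} ms")  # 16.7 ms extra at a 60fps source
```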
 

gundamkyoukai

Member
Oct 25, 2017
21,077
I agree; and I never really believed Andrew House's arguments that they wanted to push out a mid-gen update so they could compete with PC (which in the end, they didn't, imho). To me, it all sounds like an experiment driven by the fact that 4K displays and 4K content were getting pushed heavily, and it would be kinda lame if you bought a console that couldn't power those displays. It is a clear proposition for the average consumer as well.

That said, and as you mentioned, the 'race to 8K' will take quite a few more years and would be a big technical hurdle if attempted natively, unless we get there by DLSS-like solutions. That said, maybe we'll see an update with more storage and more advanced raytracing support; but that would really speak to the hardcore crowd mostly.

It's not that they want to compete with PC per se, but they don't want to get left behind by a huge amount tech-wise, which ends up losing them money if the hardcore buy software somewhere else.
Still, this gen these systems are much closer to high-end PC than the PS4/XB1 were, so maybe they could wait the extra year if we get another 7 to 8 year gen.
 

MrKlaw

Member
Oct 25, 2017
33,029
It probably won't work well on a standard game. It works well for camera movement which is extremely important for VR because the camera is attached to the user's head movement but motion inside the frame itself won't be able to benefit much from it.

I really doubt frame interpolation will be a thing. It doesn't improve the response time and usually makes it even worse.

Probably better to run a lower resolution at a proper 60fps and upscale the resolution?
 

dgrdsv

Member
Oct 25, 2017
11,817
I agree; and I never really believed Andrew House's arguments that they wanted to push out a mid-gen update so they could compete with PC (which in the end, they didn't, imho). To me, it all sounds like an experiment driven by the fact that 4K displays and 4K content were getting pushed heavily, and it would be kinda lame if you bought a console that couldn't power those displays. It is a clear proposition for the average consumer as well.

That said, and as you mentioned, the 'race to 8K' will take quite a few more years and would be a big technical hurdle if attempted natively, unless we get there by DLSS-like solutions. That said, maybe we'll see an update with more storage and more advanced raytracing support; but that would really speak to the hardcore crowd mostly.
A mid-gen "lite" upgrade is a given, as it was with almost every console generation. It will run cooler, be smaller, have more storage, etc. But a performance upgrade in veins of XBX and Pro? Doubtful. Rendering above 4K seems like a waste even when we will have affordable 8K TVs on the market and I doubt that there's anything else which would be interesting to consumers - a 30->60 fps type "upgrade" would be great but for that we'd need CPUs with 2X of single threaded performance and this just won't happen over the course of 3 years.
 

anexanhume

Member
Oct 25, 2017
12,912
Maryland
I'd venture the 8-core CCX debate is mostly moot. If your die is monolithic, then you don't have to go off-die for inter-CCX communication like chiplet-based desktop CPUs do. Additionally, if they indeed only have 8MB of L3, it's most likely unified and has lower latencies as a result.
 

DrKeo

Banned
Mar 3, 2019
2,600
Israel
Probably better to run a lower resolution at a proper 60fps and upscale the resolution?
Probably, yes; inserting frames is usually almost all downside. It's up to the developer whether they want native or reconstructed 4K, and whether they want 30, 60, or 120 fps, but I doubt frame interpolation will be a thing unless some new technique becomes available. Warping in VR is something completely different; it's not a holistic solution for standard games. In theory you could use it just for right-stick camera movement in FPS games, but it would create a disparity between that and all the rest of the movement in the game.
 

褲蓋Calo

Alt-Account
Banned
Jan 1, 2020
781
Shenzhen, China
I'd venture the 8-core CCX debate is mostly moot. If your die is monolithic, then you don't have to go off-die for inter-CCX communication like chiplet-based desktop CPUs do. Additionally, if they indeed only have 8MB of L3, it's most likely unified and has lower latencies as a result.
[Slide: AMD Ryzen Mobile Tech Day - Architecture Deep Dive]

Renoir is monolithic and only has 8MB of L3$, yet it uses 2 CCXs. There may be some inherent limitations in scaling the CCX up, given that their goal is to maintain equal average core-to-core latency. See my previous post:

Now the elaboration: let's assume that within a CCX each core has a one-to-one bidirectional communication path to every other core. Then a quad-core CCX would have sum(3, 2, 1) == 6 such connections just between the cores; if it's an octa-core CCX, it needs sum(7...1) == 28 such connections. Not only do you increase the wiring and the amount of interface logic by more than 4 times, the multiplexing logic within each core gets more complicated too (not as drastically), which means more latency for each transaction. Now you've doubled your CCX size, but at the expense of a significant amount of die area and slightly worse latency. Remember that long wires can generate a lot of heat, and even when they aren't switching, there's leakage that, again, generates a lot of heat. That's why I said hierarchies (not just caches) are necessary for large systems; this is also why RDNA GPUs group CUs and fixed-function units into Shader Arrays and not otherwise.

[Image: AMD Zen diagram]

The cores don't have dedicated links (e.g. L2$ to L2$), but each core is paired with one L3$ slice and the slices are fully connected.
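The sum(3, 2, 1) and sum(7...1) connection counts quoted above are just the fully-connected-graph formula n(n-1)/2; a one-line check:

```python
def full_mesh_links(n: int) -> int:
    # Point-to-point links needed if every core (or L3 slice) connects to every other.
    return n * (n - 1) // 2

print(full_mesh_links(4))  # 6 links for a quad-core CCX
print(full_mesh_links(8))  # 28 links for a hypothetical octa-core CCX
```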
 

Andromeda

Member
Oct 27, 2017
4,841
I'd venture the 8-core CCX debate is mostly moot. If your die is monolithic, then you don't have to go off-die for inter-CCX communication like chiplet-based desktop CPUs do. Additionally, if they indeed only have 8MB of L3, it's most likely unified and has lower latencies as a result.
I don't believe the 8-core CCX theory for the PS5 CPU. Remember the Flute benchmark? It clearly showed 4MB of L3 per CCX, and I doubt they are going to use 4MB in total; it's going to be 8MB in total using 2 CCXs. But I think Sony could have customized their CPU in order to reduce cache latencies, which is going to be very important for PSVR2.


 

No_Style

Member
Oct 26, 2017
1,795
Ottawa, Canada
I tried to search but is there an Xbox Series X equivalent of this thread? (Couldn't find one) Apparently MS is going to hold a Hot Chips session about its architecture:

The final talk in a very long day is from Microsoft, about the Xbox Series X system architecture. This is likely going to focus on the infrastructure design on the console, the collaboration with AMD on the processor, perhaps some insight into new features we're going to find in the console, and how the chip is going to drive the next 4-8 years of console gaming. I'm actually a bit 50-50 on this talk, as we've had presentations like this at events before (e.g. Qualcomm's XR) which didn't actually say anything that wasn't already announced. There's the potential here for Microsoft to not say anything new, but I hope that they will go into more detail.
 

Lukas Taves

Banned
Oct 28, 2017
5,713
Brazil
You are mischaracterizing what Mark Cerny said. How can he downplay CUs when every GPU has them?

What he said was he preferred clocking higher rather than adding more CUs to achieve the same theoretical TF, because a rising tide raises all boats. Meaning that if the GPU is clocked higher, everything inside the GPU gets clocked higher and not just the CUs.

52 CU at 1.6GHz = 10TF
36 CU at 2.23GHz = 10TF

The latter 36 CU GPU will have higher rasterization, pixel and texture fill rate, and higher cache bandwidth, even though both have the same theoretical 10TF. That is not downplaying CUs; it's more about the effects of clocking higher. The downside of clocking high is latency, but the benefits outweigh the downside, assuming you can cool said GPU appropriately to perform at that high clock.

But overclocking the GPU core alone is not a rising tide that lifts all boats.

The amount of on-chip cache does not increase, nor does the memory bandwidth (unless you overclock the memory, which they aren't doing)...

Right there, those are two super important components for the target performance that see zero improvement from the higher clock, and that often even prevent the full effect of the clocks from being realized.

Pixel and texture fillrate are no longer a problem in any modern GPU; in fact, they have theoretical values that are usually impossible to reach in reality because there's simply no bandwidth to feed them (that was the case on PS4, and to an insane extent on Pro).

Rasterization is not an issue either, because what's the point in being able to draw billions of triangles if the entire GPU stalls when you render a small triangle? Hence the creation of mesh shaders, and earlier attempts such as primitive shaders, to make geometry processing more efficient and flexible instead of basically making a super outdated pipeline super fast. That's what Epic used in the UE5 tech demo, and it's why, despite achieving pixel-sized triangles, the time it took to process them was similar to Fortnite on current-gen consoles (a 60fps game, even).

And the importance of cache for overall performance cannot be overstated, to the point that Nvidia, instead of chasing AMD in the TFLOP battle, decided to focus on much bigger caches on their GPUs, and despite the lower TF simply destroyed them in performance. It was also one of the highlights from AMD for RDNA, which added a whole level of cache compared to GCN and again saw huge performance gains (so much so that the distinction between "Nvidia flops" and "AMD flops" when comparing GPU performance has all but withered since RDNA was introduced).

I mean, sure, compared to the exact same GPU at a lower clock, the overclocked one will perform better, but you'd achieve higher performance overall by actually raising all boats.
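One way to make the "bandwidth doesn't scale with core clock" point concrete is to hold memory bandwidth fixed and watch bytes-per-FLOP shrink as the clock rises. The 36 CU and 448 GB/s inputs below are illustrative assumptions for the sketch, not a claim about any particular console:

```python
def theoretical_tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000  # 64 ALUs per CU, 2 FP32 ops per clock

def bytes_per_flop(bandwidth_gbs: float, tflops: float) -> float:
    return bandwidth_gbs / (tflops * 1000)  # GB/s per GFLOP/s == bytes per FLOP

for clock in (1.6, 2.23):
    tf = theoretical_tflops(36, clock)
    print(f"{clock} GHz: {tf:.2f} TF, {bytes_per_flop(448, tf):.3f} bytes/FLOP")

# 1.6 GHz: 7.37 TF, 0.061 bytes/FLOP
# 2.23 GHz: 10.28 TF, 0.044 bytes/FLOP
```

The same GPU at the higher clock has more theoretical compute but less memory bandwidth per operation, which is the bottleneck being described.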
 

avaya

Member
Oct 25, 2017
2,140
London
I don't see a 20/24 TFLOP machine being affordable in a console form factor even in 3 years. The node change from 7nm to 5nm or 3nm is going to be cost prohibitive, and just mathematically, unless they hit 3nm you're only going to see a 30% reduction in size while you're doubling the TFLOPs, so the chip has to grow.

This is interesting, because the communication to the market from TSMC and ASML has been that the initial move to EUV was the bigger technical and cost-prohibitive shift, but once there, the node cadence will be very rapid through to 3nm. I believe 3nm is already in testing.
 

Albert Penello

Verified
Nov 2, 2017
320
Redmond, WA
This is interesting, because the communication to the market from TSMC and ASML has been that the initial move to EUV was the bigger technical and cost-prohibitive shift, but once there, the node cadence will be very rapid through to 3nm. I believe 3nm is already in testing.

My understanding (and keep in mind, this is from a few years back) was that there were technical hurdles, but the costs were going to go way up. IIRC the move from 16nm to 7nm was basically a wash (no real cost savings), and 5nm/3nm were going to be dramatically more expensive in process and yields. I haven't followed the industry closely, so it's possible there have been breakthroughs.

This is why I say *unlikely*, not *impossible*. The silicon industry finds a way. But my guess is that the push on the launch consoles' performance was designed to negate the need for a mid-gen refresh.
 