When will the first 'next gen' console arrive?

  • H2 2019: 638 votes (14.1%)
  • H1 2020: 724 votes (16.0%)
  • H2 2020: 2,813 votes (62.2%)
  • H1 2021: 141 votes (3.1%)
  • H2 2021: 208 votes (4.6%)

  • Total voters: 4,524 (poll closed)

RoboPlato

Member
Oct 25, 2017
6,805
All this talk of the GPU, but I'm more than delighted about what we'll get in the CPU dept! I mean, look at the games we have now and in the next 18 months... on 5-6 year old hardware!
Yeah, while I'm curious about the GPUs in next gen consoles, I think it's safe to say that it's probably the most predictable and least important part of what next gen will be. Very excited for the CPU bump. It'll almost feel like a two-gen leap for devs on that front.
 
Feb 10, 2018
17,534
Yeah, while I'm curious about the GPUs in next gen consoles, I think it's safe to say that it's probably the most predictable and least important part of what next gen will be. Very excited for the CPU bump. It'll almost feel like a two-gen leap for devs on that front.

Will it be as big a leap in the CPU department as Emotion Engine to Cell, or the Xbox CPU to the 360's Xenon CPU?
 

lukeskymac

Banned
Oct 30, 2017
992
Can checkerboard not help with that?
One of the demos Nvidia showed (the one that was twice as fast with RTX) was already using an advanced checkerboard-like neural network technique to get above 30fps. It's just not happening this gen.
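To illustrate the basic idea, here is a minimal Python sketch of checkerboard reconstruction: each frame renders only half the pixels on an alternating lattice, and the gaps are filled from the previous frame. This is a toy interleave only; real implementations (and the neural technique in Nvidia's demo) reproject with motion vectors and reject stale samples, which this sketch omits.

```python
import numpy as np

def checkerboard_reconstruct(curr, prev, parity):
    """Toy checkerboard merge for a single-channel image: keep this
    frame's lattice from `curr`, fill the complementary lattice from
    `prev`. Real CBR also reprojects `prev` with motion vectors and
    rejects stale samples; both are omitted here."""
    h, w = curr.shape
    yy, xx = np.mgrid[0:h, 0:w]
    rendered = (yy + xx) % 2 == parity   # pixels rendered this frame
    return np.where(rendered, curr, prev)

# Alternate parity every frame, feeding the previous output back in.
prev_frame = np.zeros((4, 4))
curr_frame = np.random.rand(4, 4)       # only its lattice is "fresh"
full = checkerboard_reconstruct(curr_frame, prev_frame, parity=0)
```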

Yes, but the question was about emulation. And games that use the Cell BE to perform all those GPU-type tasks are difficult to emulate with a GPU. You would have to rewrite the games. You're talking about millions of dollars of work per game.

A system with a modern CPU with more floating point performance than the Cell BE could potentially have an easier time emulating the PS3.

Well, I wasn't talking about that. I already stated that it makes no sense to waste die space on a Cell BE when modern CPUs can emulate it already (hi, RPCS3).
 

RoboPlato

Member
Oct 25, 2017
6,805
Hopefully someone here is.
I think it would be interesting to know how the potential next gen CPU upgrade stacks up compared to other gens.
PS2/Xbox to PS3/360 seemed huge in their CPU upgrades.
If I had to hazard a guess, I would say it's probably not quite that huge, but it is substantial, especially considering what devs have been able to do on the Jaguars.
 

Gemüsepizza

Member
Oct 26, 2017
2,541
I won't rule anything out just yet.

Interesting to compare what a prominent dev tweeted recently about RT and the next consoles:



And what he 'hoped' for in early 2011 re: RAM in the next consoles:

He got what he wished for last time round...


Interestingly, he is talking about "consoles" in his tweet, so probably no Nvidia/RTX in any next gen console? Not that it would surprise me, but some people were speculating about Xbox choosing Nvidia for next gen.

I'm not totally convinced that Nvidia's RTX solution is the right way going forward; it seems a bit inflexible from a quick glance. I mean, I'm sure that devs like Naughty Dog and DICE would create amazing games with this tech, but what about other companies? If they don't use this tech, afaik that means the silicon is wasted for them, right?

Maybe it makes more sense to create a raytracing solution that depends more on traditional general-purpose compute tech, which can also be used for traditional graphics.

I'm convinced next gen consoles will have some pseudo raytracing lite solution. Not the real thing, but somewhat close in limited areas.

That's what I believe, too. We already have "raytracing" in some games; I think Fortnite is using some form of raytracing for shadows. I think we will see more of this, and those raytracing implementations will be able to run on traditional hardware. Even if it isn't a full-blown implementation, it already helps make games look nicer, and that way devs can say "Oh yeah, we totally are using raytracing™ in our new game!"
 

anexanhume

Member
Oct 25, 2017
12,913
Maryland
Will it be as big a leap in the CPU department as Emotion Engine to Cell, or the Xbox CPU to the 360's Xenon CPU?
One benefit is the lack of ISA change. Devs don't have to completely change their tools to target a different ISA with differing strengths and weaknesses. Compared to Jaguar, Zen has only strengths.

IPC should be roughly 50-60% higher with Zen 2, and clock speeds are roughly double compared to base PS4/XB1. If consoles have hyperthreading, that's another potential speedup factor.

Xenon and Cell were such big CPU upgrades because in addition to huge clock jumps (4x-10x), they were also huge TDP jumps. Of course you'll get more performance with more power use.

These consoles' APUs will probably have roughly the same power budget, so gains mostly come from IPC, clocks, memory enhancements, and various other hierarchy changes and optimizations.
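A quick back-of-envelope using those figures (the IPC and clock numbers are from the post above; the SMT factor is a common rule-of-thumb assumption, not a measurement):

```python
# Rough per-core speedup of a hypothetical Zen 2 console CPU over the
# base PS4/XB1 Jaguar. IPC/clock figures are from the post; the SMT
# gain is an assumed rule of thumb, not a measured number.
ipc_gain   = 1.55  # midpoint of the ~50-60% IPC estimate
clock_gain = 2.0   # roughly double the base consoles' ~1.6 GHz
smt_gain   = 1.25  # ~20-30% typical SMT uplift (assumption)

per_core = ipc_gain * clock_gain
print(f"per-core: ~{per_core:.1f}x")             # ~3.1x
print(f"with SMT: ~{per_core * smt_gain:.1f}x")  # ~3.9x
```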
 

eathdemon

Member
Oct 27, 2017
9,631
Interestingly, he is talking about "consoles" in his tweet, so probably no Nvidia/RTX in any next gen console? Not that it would surprise me, but some people were speculating about Xbox choosing Nvidia for next gen.

I'm not totally convinced that Nvidia's RTX solution is the right way going forward; it seems a bit inflexible from a quick glance. I mean, I'm sure that devs like Naughty Dog and DICE would create amazing games with this tech, but what about other companies? If they don't use this tech, afaik that means the silicon is wasted for them, right?

Maybe it makes more sense to create a raytracing solution that depends more on traditional general-purpose compute tech, which can also be used for traditional graphics.



That's what I believe, too. We already have "raytracing" in some games; I think Fortnite is using some form of raytracing for shadows. I think we will see more of this, and those raytracing implementations will be able to run on traditional hardware. Even if it isn't a full-blown implementation, it already helps make games look nicer, and that way devs can say "Oh yeah, we totally are using raytracing™ in our new game!"
You can't do raytracing without it though; normal GPU tech is decades away from being good enough to do it. The special hardware is the only way.
 

Gemüsepizza

Member
Oct 26, 2017
2,541
You can't do raytracing without it though; normal GPU tech is decades away from being good enough to do it. The special hardware is the only way.

What I hope is that they might find a way to improve the traditional compute capabilities on GPUs to make them more fit for raytracing, while still allowing devs to use them for traditional stuff, if they want.

A big, dedicated amount of silicon that can only do specific raytracing operations seems risky imo, especially on consoles where you have a limited silicon budget.

I really wonder what AMD's answer to nvidia's RTX will look like, maybe we will get a few hints in the next few months.
 

Locuza

Member
Mar 6, 2018
380
What I hope is that they might find a way to improve the traditional compute capabilities on GPUs to make them more fit for raytracing, while still allowing devs to use them for traditional stuff, if they want.

A big, dedicated amount of silicon that can only do specific raytracing operations seems risky imo, especially on consoles where you have a limited silicon budget.

I really wonder what AMD's answer to nvidia's RTX will look like, maybe we will get a few hints in the next few months.
Nvidia did both: the common shader cores have lower latency per operation and a much better cache system, which also has lower latencies and higher capacities.
Judging from the quite blurry Videocardz picture, Nvidia might have halved the number of cores per SM, effectively doubling the register size per core and the cache sizes.
On top of that, they have implemented RT logic to speed up BVH computation.
As for defining "a big dedicated amount of silicon" for that: it wouldn't surprise me if it's under 10%, but it would if it's above 15%.

But looking at the RTX stuff, it doesn't seem like local compute power alone would be enough to enable raytracing as an integral part of the rendering pipeline.
Either it happens in the generation after, or they provide some enormous infrastructure for additional software and hardware acceleration.
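For a sense of what that RT logic accelerates: BVH traversal boils down to huge numbers of ray-vs-bounding-box tests. Below is a minimal Python sketch of the standard slab test; dedicated RT hardware exists to do many of these (plus ray-triangle tests) per clock instead of burning shader instructions on them.

```python
def ray_aabb_hit(origin, inv_dir, box_min, box_max):
    """Slab test: does a ray hit an axis-aligned bounding box?
    `inv_dir` holds precomputed reciprocals of the ray direction
    (zero components become +/-inf, which the test tolerates).
    This is the inner loop of BVH traversal that RT cores accelerate."""
    t_near, t_far = 0.0, float("inf")
    for o, inv, lo, hi in zip(origin, inv_dir, box_min, box_max):
        t0, t1 = (lo - o) * inv, (hi - o) * inv
        if t0 > t1:
            t0, t1 = t1, t0              # order the two slab planes
        t_near, t_far = max(t_near, t0), min(t_far, t1)
        if t_near > t_far:
            return False                 # slab intervals don't overlap
    return True

# A ray along +x from the origin hits a unit box centred at (5, 0, 0).
print(ray_aabb_hit((0, 0, 0), (1.0, float("inf"), float("inf")),
                   (4.5, -0.5, -0.5), (5.5, 0.5, 0.5)))  # True
```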

Are we expecting Ryzen 2 or Ryzen in PS5?
Whether Zen or Zen 2 goes into the PS5 will be easier to judge once we know what Zen 2 looks like.
EPYC Rome with Zen 2 should launch in Q1 2019.
 

KOHIPEET

Member
Oct 29, 2017
1,416
I hope that instead of focusing on ray tracing (which would no doubt be extremely resource-intensive), next gen consoles (and developers) will prioritize advanced particle-based simulations. While RT (in BFV, for example) looks damn good, I'm getting tired of seeing basic smoke and water effects. IMO, smoke like it was done in Batman: Arkham Knight should be standard next gen.
 
Jan 2, 2018
2,027
I wonder what ballpark the Zen 2 CPU that will likely be included in the APU of the PS5/next Xbox will be in... maybe comparable to a Ryzen 2700X, obviously underclocked from its base clock of 3.7 GHz?
 

Wordstar

Member
Oct 27, 2017
303
Germany
I doubt they will support raytracing, since AMD hasn't talked about raytracing hardware before. And they will probably take a couple of years to catch up to Nvidia on that point.
 
Jan 2, 2018
2,027
You might want to lower your expectations a little bit.
Hmm... I dunno. I mean, the Ryzen 2700X is on a 12nm (really a refined 14nm) process, and the jump to 7nm should be really substantial regarding die space. With AMD rumored to be doubling the number of cores per CCX to 8 from 4, it might be possible. And like I said, I'm talking about an underclocked Ryzen 2700X; it's not out of the realm of possibility, as it will be almost 3 years old by the time next gen releases.
 

Socky

Member
Oct 27, 2017
361
Manchester, UK
100% of games I play don't use the light bar. Why not just let me turn it off? Ridiculous design tbh.

The solution should be that the system can turn the lightbar/touchpad on or off on a per-game basis; if a game doesn't use it, don't turn it on (a toy sketch of the idea follows below). This may well require a redesign of the controller on PS5, but I think it would be worth it, maximising battery life while retaining full functionality and BC.

I'm not hopeful; Sony have always been conservative/uninspired in controller design/innovation, but they may respond to long-held complaints about the DS4 battery and lightbar. I'm not holding my breath for Move-like split-controllers as standard though, much as I would like to see them.
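A toy sketch of what that per-game policy could look like on the system side (everything here, names and title IDs included, is hypothetical; no real PlayStation API is being described):

```python
from dataclasses import dataclass

@dataclass
class Pad:
    lightbar_on: bool = True
    touchpad_on: bool = True

# Hypothetical per-title feature table; a real system would read this
# from game metadata rather than a hard-coded dict.
FEATURE_USAGE = {
    "GAME-USES-BOTH":    {"lightbar": True,  "touchpad": True},
    "GAME-USES-NEITHER": {"lightbar": False, "touchpad": False},
}

def apply_policy(title_id: str, pad: Pad) -> None:
    # Unknown titles keep everything on, preserving BC behaviour.
    usage = FEATURE_USAGE.get(title_id, {"lightbar": True, "touchpad": True})
    pad.lightbar_on = usage["lightbar"]
    pad.touchpad_on = usage["touchpad"]

pad = Pad()
apply_policy("GAME-USES-NEITHER", pad)
print(pad)  # Pad(lightbar_on=False, touchpad_on=False)
```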
 

Andromeda

Member
Oct 27, 2017
4,844
Hmm... I dunno. I mean, the Ryzen 2700X is on a 12nm (really a refined 14nm) process, and the jump to 7nm should be really substantial regarding die space. With AMD rumored to be doubling the number of cores per CCX to 8 from 4, it might be possible. And like I said, I'm talking about an underclocked Ryzen 2700X; it's not out of the realm of possibility, as it will be almost 3 years old by the time next gen releases.
Why would they use a 3-year-old CPU for their console? Even Jaguar was not a year old in 2013. They'll use a new design: Zen 3.
 
Jan 2, 2018
2,027
Why would they use a 3-year-old CPU for their console? Even Jaguar was not a year old in 2013. They'll use a new design: Zen 3.
I'm not saying they will USE a Ryzen 2700X; I was talking about the ballpark in terms of computational power that their CPU part of choice will be comparable to. And Zen 3 is really unlikely, because it's a 2020 design on a 7nm+ node, and I don't think either will be ready in time to meet Sony's production schedule.
 

VallenValiant

Banned
Oct 27, 2017
1,598
The solution should be that the system can turn the lightbar/touchpad on or off on a per-game basis; if a game doesn't use it, don't turn it on. This may well require a redesign of the controller on PS5, but I think it would be worth it, maximising battery life while retaining full functionality and BC.
As I keep reminding people, the main source of battery drain is the touchpad. To argue for the removal of the lightbar is misguided if battery life is what you are after.
 

Intersect

Banned
Nov 5, 2017
451
What I hope is that they might find a way to improve the traditional compute capabilities on GPUs to make them more fit for raytracing, while still allowing devs to use them for traditional stuff, if they want.

A big, dedicated amount of silicon that can only do specific raytracing operations seems risky imo, especially on consoles where you have a limited silicon budget.

I really wonder what AMD's answer to nvidia's RTX will look like, maybe we will get a few hints in the next few months.
Infinity Fabric would allow custom blocks without impacting BC. According to a whitepaper, a matrix math block supporting coarse ray tracing is tiny. That paper mentions GCN GPUs, and its publication date is 2014, which allows enough time for it to be in Navi or a PS4 iteration (still GCN, so still possibly called PS4).

Nvidia announced that all games should be written as if they have RTX support, to make way for coming GPUs with accelerators. If this is the coming standard, then AMD must support it also. Since Sony wants game developers to have an easy port to PCs, the game consoles must support it also. The PS4 Pro already has FP16 support from Vega, and Sony worked with AMD on Navi far enough back that features out of Navi could be in a 2019 PS4 iteration.
 

Intersect

Banned
Nov 5, 2017
451
As I keep reminding people, the main source of battery drain is the touchpad. To argue for the removal of the lightbar is misguided if battery life is what you are after.
It's an example of a feature in the PS4 that game developers do not support, so why is it in the PS4? ATSC 3.0 ramps up in 2020, and all PS4s are designed to support it, including the touchpad. For financial reasons (multiple channels are needed), 1080p will be the highest resolution on OTA ATSC 3.0, and all PS4s support 1080p + HDR and depth map.
 

Socky

Member
Oct 27, 2017
361
Manchester, UK
As I keep reminding people, the main source of battery drain is the touchpad. To argue for the removal of the lightbar is misguided if battery life is what you are after.

I mentioned touch as well (and I didn't argue for removal of the lightbar), but many people dislike the lightbar regardless of battery drain. The point was: if the game doesn't use a function, enable it to be turned off.
 

VallenValiant

Banned
Oct 27, 2017
1,598
I mentioned touch as well (and I didn't argue for removal of the lightbar), but many people dislike the lightbar regardless of battery drain. The point was: if the game doesn't use a function, enable it to be turned off.
Since the battery complaint is the main argument against the lightbar in every thread that mentions it, I am not sure what the point is otherwise.
 

TheRaidenPT

Editor-in-Chief, Hyped Pixels
Verified
Jun 11, 2018
5,945
Lisbon, Portugal
Infinity Fabric would allow custom blocks without impacting BC. According to a whitepaper, a matrix math block supporting coarse ray tracing is tiny. That paper mentions GCN GPUs, and its publication date is 2014, which allows enough time for it to be in Navi or a PS4 iteration (still GCN, so still possibly called PS4).

Nvidia announced that all games should be written as if they have RTX support, to make way for coming GPUs with accelerators. If this is the coming standard, then AMD must support it also. Since Sony wants game developers to have an easy port to PCs, the game consoles must support it also. The PS4 Pro already has FP16 support from Vega, and Sony worked with AMD on Navi far enough back that features out of Navi could be in a 2019 PS4 iteration.

It will only be "standard" once the main market, which is consoles, has it.

I'm waiting to buy a GTX 1080, but I would love to see this coming to consoles in the near future.
 

anexanhume

Member
Oct 25, 2017
12,913
Maryland
Why would they use a 3-year-old CPU for their console? Even Jaguar was not a year old in 2013. They'll use a new design: Zen 3.

Zen 3 is likely a 7nm+ design. That's pretty aggressive schedule-wise. An alternate solution might be to take Zen 2 and grab the juiciest bits from Zen 3 if possible.

I'm not saying they will USE a Ryzen 2700X; I was talking about the ballpark in terms of computational power that their CPU part of choice will be comparable to. And Zen 3 is really unlikely, because it's a 2020 design on a 7nm+ node, and I don't think either will be ready in time to meet Sony's production schedule.

We have a comparable: the Ryzen 2700E is 8C/16T @ 2.8 GHz. That's Zen+ on 12nm at a 45W TDP.

We know frequency doesn't scale much at 7nm, but power and area will, so 2700E is a safe lower bound for performance IMO.
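Putting that lower bound into rough numbers against the base PS4's Jaguar (the clocks are public specs; the IPC and SMT multipliers are assumptions consistent with the estimates earlier in the thread):

```python
# Rough multithreaded-throughput floor: Ryzen 2700E vs base PS4 Jaguar.
# Clock speeds are public specs; IPC and SMT factors are assumptions.
jaguar_ghz = 1.6    # base PS4 Jaguar
zen_e_ghz  = 2.8    # Ryzen 2700E, 8C/16T @ 45W
ipc_factor = 1.5    # assumed Zen+ IPC advantage over Jaguar
smt_factor = 1.25   # assumed uplift from 16 threads on 8 cores

floor = (zen_e_ghz / jaguar_ghz) * ipc_factor * smt_factor
print(f"~{floor:.1f}x")  # ~3.3x, before any 7nm clock headroom
```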
 
Jan 2, 2018
2,027
Zen 3 is likely a 7nm+ design. That's pretty aggressive schedule-wise. An alternate solution might be to take Zen 2 and grab the juiciest bits from Zen 3 if possible.



We have a comparable: the Ryzen 2700E is 8C/16T @ 2.8 GHz. That's Zen+ on 12nm at a 45W TDP.

We know frequency doesn't scale much at 7nm, but power and area will, so 2700E is a safe lower bound for performance IMO.
So the only major difference between a Ryzen 2700E and a 2700X is the clocks? I guess we could expect a theoretical Ryzen 3700E (?) to be substantially more powerful and still be a 45W TDP part?
 

anexanhume

Member
Oct 25, 2017
12,913
Maryland
So the only major difference between a Ryzen 2700E and a 2700X is the clocks? I guess we could expect a theoretical Ryzen 3700E (?) to be substantially more powerful and still be a 45W TDP part?
More or less. It's the same silicon, at least. They are binned differently, so one may not be capable of the other's performance parameters.

The 2700E runs at lower voltages to enable the lower TDP. Power ramps up more than linearly to enable the higher clocks you see on desktop parts.

Look at the freq-Vmin curve in this thread: https://forums.anandtech.com/threads/ryzen-strictly-technical.2500572/
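"More than linearly" follows from the classic dynamic-power relation P ≈ C·V²·f: pushing clocks usually also means pushing voltage, so power grows roughly with the square of voltage times frequency. A quick sketch with illustrative (not measured) numbers:

```python
# Dynamic power scales roughly as P ~ C * V^2 * f. Chasing desktop
# clocks costs voltage, so power climbs much faster than frequency.
# Voltage/frequency points below are illustrative, not measurements.
def relative_power(volts, ghz, base_volts=1.0, base_ghz=2.8):
    return (volts / base_volts) ** 2 * (ghz / base_ghz)

print(relative_power(1.00, 2.8))  # 1.0x: a 2700E-like low-voltage point
print(relative_power(1.35, 4.0))  # ~2.6x power for ~1.4x frequency
```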


Edit: Sony is developing some kind of interactive robot.

https://patents.google.com/patent/G...nt&oq=Sony+interactive+entertainment&sort=new
 
Jul 6, 2018
174
I doubt they will support raytracing, since AMD hasn't talked about raytracing hardware before. And they will probably take a couple of years to catch up to Nvidia on that point.

AMD has been talking about bringing hybrid ray tracing for several years. Who knows what they have in store.

AMD has three different GPU products in the pipeline right now, according to Lisa Su. It would be reasonable to bet that one of them will support ray tracing.

The knowledge of how to design a card capable of tracing exponentially more rays has been around for several years.

Folks who claim there is zero chance that such tech will make it into next gen consoles are overconfident unless they have some secret knowledge about AMD's next gen architecture.
 

anexanhume

Member
Oct 25, 2017
12,913
Maryland
AMD has been talking about bringing hybrid ray tracing for several years. Who knows what they have in store.

AMD has three different GPU products in the pipeline right now, according to Lisa Su. It would be reasonable to bet that one of them will support ray tracing.

The knowledge of how to design a card capable of tracing exponentially more rays has been around for several years.

Folks who claim there is zero chance that such tech will make it into next gen consoles are overconfident unless they have some secret knowledge about AMD's next gen architecture.

While there is a large element of the unknown, I would expect a few more indicators out of AMD. They've more or less been following Nvidia in terms of dedicated or specialized hardware to accelerate certain types of compute.

The software front could simply be RT running through plain old GCN 2.0 compute cores. One thing that's interesting is that AMD has been hiring ImgTec engineers as ImgTec bleeds talent after losing Apple's business. ImgTec bought Caustic hardware (Sony appears to have picked up a former Caustic engineer as well), which had the first hardware RT implementation, one that Carmack was impressed with. It's not full GI RT like Nvidia is showing, but it's still genuine RT tech. ImgTec could be an interesting acquisition for AMD, if for nothing else than their patent portfolio, but I feel that's probably very low likelihood.
 

VX1

Member
Oct 28, 2017
7,000
Europe
While there is a large element of the unknown, I would expect a few more indicators out of AMD. They've more or less been following Nvidia in terms of dedicated or specialized hardware to accelerate certain types of compute.

The software front could simply be RT running through plain old GCN 2.0 compute cores. One thing that's interesting is that AMD has been hiring ImgTec engineers as ImgTec bleeds talent after losing Apple's business. ImgTec bought Caustic hardware, which had the first hardware RT implementation, one that Carmack was impressed with. It's not full GI RT like Nvidia is showing, but it's still genuine RT tech. ImgTec could be an interesting acquisition for AMD, if for nothing else than their patent portfolio, but I feel that's probably very low likelihood.

ImgTec was bought last year by some obscure Chinese fund. The company was almost ruined after Apple decided to ditch them. Really sad what happened to them; they were a fine tech company.

I am frankly baffled that the UK government allowed the sale of ARM and ImgTec, probably the 2 biggest British tech companies, to Japanese and Chinese entities. I cannot imagine the US/France/Germany etc. would allow something like that.
 
Oct 27, 2017
3,893
ATL

Not sure about that; this might be more positive for AMD overall. AMD probably won't have to pay royalties to GlobalFoundries anymore for moving chip manufacturing over to TSMC. Also, all of AMD's designs were specifically created for TSMC's 7nm process.

Losing an executive could be a bad sign, but we really don't know. If Zen 2 measures up to the hype, the company will be more than fine. Intel having major issues with its 10nm process also helps them.

The only real problem I could see arising from this is if TSMC can't meet volume demands for AMD's products. I wonder if Samsung will pick up high-performance CPU and GPU capacity if that happens? 2019 is going to be an interesting year.
 

eathdemon

Member
Oct 27, 2017
9,631
Not sure about that; this might be more positive for AMD overall. AMD probably won't have to pay royalties to GlobalFoundries anymore for moving chip manufacturing over to TSMC. Also, all of AMD's designs were specifically created for TSMC's 7nm process.

Losing an executive could be a bad sign, but we really don't know. If Zen 2 measures up to the hype, the company will be more than fine. Intel having major issues with its 10nm process also helps them.

The only real problem I could see arising from this is if TSMC can't meet volume demands for AMD's products. I wonder if Samsung will pick up high-performance CPU and GPU capacity if that happens? 2019 is going to be an interesting year.
I should have been clearer: I meant trouble on the GPU side. Their CPU side is fine.
 

anexanhume

Member
Oct 25, 2017
12,913
Maryland
ImgTec was bought last year by some obscure Chinese fund. The company was almost ruined after Apple decided to ditch them. Really sad what happened to them; they were a fine tech company.

I am frankly baffled that the UK government allowed the sale of ARM and ImgTec, probably the 2 biggest British tech companies, to Japanese and Chinese entities. I cannot imagine the US/France/Germany etc. would allow something like that.

I believe they're likely still for sale for that reason. It's not as if they were bought by another tech company to forever boost their patent portfolio.

Not sure about that; this might be more positive for AMD overall. AMD probably won't have to pay royalties to GlobalFoundries anymore for moving chip manufacturing over to TSMC. Also, all of AMD's designs were specifically created for TSMC's 7nm process.

Losing an executive could be a bad sign, but we really don't know. If Zen 2 measures up to the hype, the company will be more than fine. Intel having major issues with its 10nm process also helps them.

The only real problem I could see arising from this is if TSMC can't meet volume demands for AMD's products. I wonder if Samsung will pick up high-performance CPU and GPU capacity if that happens? 2019 is going to be an interesting year.

I should have been clearer: I meant trouble on the GPU side. Their CPU side is fine.

The company is in good hands with Lisa Su. They'll have plenty of people to promote from within. Exiting executives can be a sign of success in a company just as they can be a sign of a company's demise.
 

msia2k75

Member
Nov 1, 2017
601
Pretty shocking development. I remember they did a pretty big tech media fab tour thing, including AnandTech, back in February, but reading the article here, AnandTech had big suspicions about GF's 7nm timeline even then.

GloFo is a mess... one to avoid at all costs as a chip manufacturer.

EDIT:

It's less detrimental to AMD than it is to IBM, though...