
How much money are you willing to pay for a next generation console?

  • Up to $199: 33 votes (1.5%)
  • Up to $299: 48 votes (2.2%)
  • Up to $399: 318 votes (14.4%)
  • Up to $499: 1,060 votes (48.0%)
  • Up to $599: 449 votes (20.3%)
  • Up to $699: 100 votes (4.5%)
  • I will pay anything!: 202 votes (9.1%)

  Total voters: 2,210

chris 1515

Member
Oct 27, 2017
7,074
Barcelona Spain
But it is still Polaris. What I am saying is that even if the consoles are using RDNA 2 features like ray tracing, you won't get some crazy efficiency boost that some here are hoping for, because the consoles won't be based on 7nm+.

But Polaris had some new GCN features (I bolded the part), and it released in 2016, the same year as the PS4 Pro, not the year before. I don't expect RDNA 2 to be a big leap because RDNA is already a big architecture shift, but saying it's impossible to release a PC GPU architecture and a console in the same year is false.

EDIT: If the consoles have future-proof features inside them, I'd expect some RDNA 3 features, for example, not RDNA 2.
 

modiz

Member
Oct 8, 2018
17,844
But Polaris had some new GCN features (I bolded the part), and it released in 2016, the same year as the PS4 Pro, not the year before. I don't expect RDNA 2 to be a big leap because RDNA is already a big architecture shift, but saying it's impossible to release a PC GPU architecture and a console in the same year is false.

EDIT: If the consoles have future-proof features inside them, I'd expect some RDNA 3 features, for example, not RDNA 2.
Oh, I'm dumb, I thought Polaris launched in 2015, not 2016. That makes me more hopeful, then. I apologize if I seemed too aggressive in my posts.
 

GhostTrick

Member
Oct 25, 2017
11,316
I don't understand why Tom's is only boosting to 1800 MHz. Is it because FurMark runs at 1600x900?

Real 4K gaming tests show clock rates over 1950 MHz when the GPU power draw goes over 200 W.

At 1800 MHz, GPU power draw is consistently under 160 W.



They're also doing it with Metro: Last Light at 1440p with SSAA: around 1800 MHz for the RX 5700 XT, averaging 214 W.
 

Rylen

Member
Feb 5, 2019
466
They're also doing it with Metro: Last Light at 1440p with SSAA: around 1800 MHz for the RX 5700 XT, averaging 214 W.

Well, my personal testing pretty much lines up with this graph.

[image: powerscaling5xjiu.png: clock/power scaling graph]
 
Oct 26, 2017
6,151
United Kingdom

Cheers for the update. It only strengthens the case.

Regarding volume discounts, there's always a limit after which no additional price breaks are granted. I'd be willing to bet Sony and MS would both have order quantities of sufficient size to receive the maximum volume discount. The manufacturers aren't going to sell below operating profit.

What high volume does get you is priority, which is just as important. And this may be where Sony has some leverage.

With current Xbox hardware sales, I can't see MS expecting to move over 10m units in the first year. It also depends on how many chips they're using per console SKU, since Scarlett is alleged to have 10 based on the initial teaser vid. E.g. let's say MS is projecting 8m sales in the first year; that's only 80m chips versus Sony's 160m, suggesting MS is more likely to pay closer to the market rate per chip than Sony, who is ordering double the number of chips.

The above is merely illustrative, but I don't agree that MS and Sony are on a level playing field in terms of negotiation power. And priority doesn't mean anything to Sony and MS when buying memory chips from a supplier; I struggle to see what benefit priority gives when the current DRAM suppliers aren't operating near full capacity (i.e. prices are on a downward trend).

To your point about suppliers not wanting to operate below market profit, that's obvious, but the extent to which they are willing to discount for volume orders is anyone's guess. However, even a small discount of tens of dollars per chip can make a big difference to a console BOM when Sony may or may not be planning for 16 chips per console SKU.
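To make the volume math above concrete, here is a minimal Python sketch of how order size and a per-chip discount could feed into a console's memory BOM. The $12 base price and the discount tiers are invented for illustration; only the 8m x 10 and 10m x 16 chip counts come from the post.

```python
# Illustrative only: all prices and tier thresholds below are made up.
def memory_bom_per_console(consoles, chips_per_console, base_price, tiers):
    """Memory cost per console after the best volume discount earned."""
    total_chips = consoles * chips_per_console
    # Take the largest discount whose volume threshold is met.
    discount = max((d for vol, d in tiers.items() if total_chips >= vol),
                   default=0.0)
    return chips_per_console * base_price * (1.0 - discount)

# Hypothetical tiers: 2% off at 50M chips, 5% at 100M, 8% at 150M.
tiers = {50_000_000: 0.02, 100_000_000: 0.05, 150_000_000: 0.08}

ms = memory_bom_per_console(8_000_000, 10, 12.0, tiers)     # 80M chips total
sony = memory_bom_per_console(10_000_000, 16, 12.0, tiers)  # 160M chips total
print(f"MS:   ${ms:.2f} per console")    # reaches a smaller discount tier
print(f"Sony: ${sony:.2f} per console")  # reaches the maximum discount tier
```

With these made-up tiers, Sony's larger order earns the top discount while MS's does not, which is the asymmetry the post is describing.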
 
Last edited:

GhostTrick

Member
Oct 25, 2017
11,316
Well, my personal testing pretty much lines up with this graph.

[image: powerscaling5xjiu.png: clock/power scaling graph]


So does their testing. 1800 MHz nets them around 212 W on average.

Now, as I said to the poster I quoted: yes, beyond a certain clock speed you lose efficiency, and higher clocks require far more power. But with that in mind, when the RX 5700 XT by itself draws over 200 W at 1800 MHz, I don't see how a 56 CU part could draw the same power at the same clock speed. And we're talking about a GPU alone here. An entire SoC with a 56 CU GPU at 1800 MHz plus a 3.2 GHz 8-core Zen 2, targeting 200 W? That seems really far-fetched.
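A rough sketch of the efficiency argument, assuming dynamic power scales roughly as CU count x frequency x voltage squared, with voltage rising with clock near the top of the curve. The only anchor point is the ~214 W at 40 CU / 1800 MHz figure from the thread; nothing here is real AMD data.

```python
# Toy model: dynamic power ~ CU_count * f * V^2, with voltage assumed to
# rise linearly with clock. Calibrated only to 40 CU @ 1800 MHz -> 214 W.
def gpu_power(cu, mhz, v_per_mhz=1.0 / 1800):
    volts = mhz * v_per_mhz           # normalized: 1.0 "V" at 1800 MHz
    k = 214 / (40 * 1800 * 1.0 ** 2)  # fit constant from the anchor point
    return k * cu * mhz * volts ** 2

print(f"40 CU @ 1800 MHz: {gpu_power(40, 1800):6.1f} W")  # ~214 W (by construction)
print(f"56 CU @ 1800 MHz: {gpu_power(56, 1800):6.1f} W")  # ~300 W: the objection
print(f"56 CU @ 1500 MHz: {gpu_power(56, 1500):6.1f} W")  # ~173 W: downclock helps
```

The point the model illustrates: at a fixed clock, power grows with CU count, so a wider GPU only fits a fixed budget if the clock comes down.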
 

PianoBlack

Member
May 24, 2018
6,645
United States
With current Xbox hardware sales, I can't see MS expecting to move over 10m units in the first year. It also depends on how many chips they're using per console SKU, since Scarlett is alleged to have 10 based on the initial teaser vid. E.g. let's say MS is projecting 8m sales in the first year; that's only 80m chips versus Sony's 160m, suggesting MS is more likely to pay closer to the market rate per chip than Sony, who is ordering double the number of chips.

The above is merely illustrative, but I don't agree that MS and Sony are on a level playing field in terms of negotiation power.

You're forgetting the millions of Scarlett chips MS will need for xCloud, and the fact that they are also a huge AMD customer for Azure compute. I doubt MS lacks for bargaining leverage with AMD.

An entire SoC with a 56 CU GPU at 1800 MHz plus a 3.2 GHz 8-core Zen 2, targeting 200 W? That seems really far-fetched.

Why do you assume a 200W limit? Believe :)
 

GhostTrick

Member
Oct 25, 2017
11,316
You're forgetting the millions of Scarlett chips MS will need for xCloud, and the fact that they are also a huge AMD customer for Azure compute. I doubt MS lacks for bargaining leverage with AMD.



Why do you assume a 200W limit? Believe :)


Even if I don't assume it, where do we end up? How can a 56 CU GPU consume as much as a 40 CU GPU at the same clock speed (210 W, solely for the GPU)?
 

Fredrik

Member
Oct 27, 2017
9,003
Yeah, I wouldn't expect 60fps regardless. It is up to what the dev/pub wants and they are going to go for what makes the game look better (whether that be resolution, AI, effects, etc...).

The percentage of 60fps games (or at least those with a 60fps option) should be higher than this gen, however.
Yup, 30 fps will be the norm, unfortunately; I even doubt that last bit, to be honest. The weak CPU is often blamed, but we've still seen 60 fps occasionally in just about every genre, so it's easy to see that it's not the real problem. Forza Motorsport 5 was 60 fps at 1080p on launch day on the weakest console; Driveclub was 30 fps on the most powerful one. And console games being down-ports from PC has been problematic this gen, and nothing says that won't continue next gen, considering how much is happening now with ray tracing and the intensifying battle between AMD, Nvidia and Intel.
Also, as a more solid example, look at Assassin's Creed Odyssey on Stadia and ignore the streaming: the game runs on 10.7 TF hardware, possibly not far from what we'll see on PS5 and XB2, and it's still 30 fps. Disappointing, but that's how it is.
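For reference, the 30 vs. 60 fps debate comes down to the per-frame time budget; this trivial calculation is all the arithmetic involved.

```python
# Going from 30 to 60 fps halves the per-frame budget for CPU *and* GPU
# work, which is why devs usually spend the extra power on visuals instead.
for fps in (30, 60, 120):
    budget_ms = 1000.0 / fps
    print(f"{fps:>3} fps -> {budget_ms:5.2f} ms per frame")
```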
 

chris 1515

Member
Oct 27, 2017
7,074
Barcelona Spain
You're forgetting the millions of Scarlett chips MS will need for xCloud, and the fact that they are also a huge AMD customer for Azure compute. I doubt MS lacks for bargaining leverage with AMD.



Why do you assume a 200W limit? Believe :)

There is more than the SoC inside the console, and per the last IHS teardown it seems some components were a bit cheaper for Sony.

[image: 141525.jpeg]
 
Last edited:

Morrowbie

Member
Oct 28, 2017
1,137
Ariel, Gonzalo and Flute are all Shakespeare characters, apparently!
Ariel and Gonzalo are both from The Tempest, and they are the characters that provide the most assistance to the powerful wizard Prospero. Has "Prospero" popped up anywhere at all? I'd be surprised if it wasn't being used for something internally.

Flute is from A Midsummer Night's Dream; that name is likely from a different scheme of code names (one connected with instruments).
 

sncvsrtoip

Banned
Apr 18, 2019
2,773
So does their testing. 1800 MHz nets them around 212 W on average.

Now, as I said to the poster I quoted: yes, beyond a certain clock speed you lose efficiency, and higher clocks require far more power. But with that in mind, when the RX 5700 XT by itself draws over 200 W at 1800 MHz, I don't see how a 56 CU part could draw the same power at the same clock speed. And we're talking about a GPU alone here. An entire SoC with a 56 CU GPU at 1800 MHz plus a 3.2 GHz 8-core Zen 2, targeting 200 W? That seems really far-fetched.
This image shows the power draw of the whole system, not just the 5700 XT.
 

Deleted member 12635

User requested account closure
Banned
Oct 27, 2017
6,198
Germany
OK, I looked it over. Again, I'm sorry for being difficult.

So, based on what AegonSnake said, RDNA has a 1.45x advantage over GCN/Polaris/Vega, correct?

To simplify, a 10 TFLOP RDNA GPU would equal 14.5 TFLOPs of GCN/Polaris/Vega?
The performance deltas for Navi vs. Polaris and Navi vs. Vega are not the same.

I already posted a diagram with the numbers somewhere in this or the last thread, and I will re-post it now so we don't need to address the same question all over again every 30 pages ...

[image: 8kKkt7K.png: Navi vs. Polaris/Vega relative performance chart]

Source: Computerbase.de
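The "10 TFLOP RDNA = 14.5 TFLOP GCN" question above is a single multiplication. A tiny sketch using the 1.45x figure quoted in the thread (not an official AMD number, and per the chart it differs between Polaris and Vega):

```python
# Converts RDNA TFLOPs into "GCN-equivalent" TFLOPs using the thread's
# quoted 1.45x perf-per-FLOP factor (an assumption, not AMD data).
def gcn_equivalent_tflops(rdna_tflops, perf_per_flop_ratio=1.45):
    return rdna_tflops * perf_per_flop_ratio

print(f"{gcn_equivalent_tflops(10.0):.1f} GCN-equivalent TFLOPs")  # 14.5
```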
 
Last edited:
Oct 26, 2017
6,151
United Kingdom
You're forgetting the millions of Scarlett chips MS will need for xCloud, and the fact that they are also a huge AMD customer for Azure compute. I doubt MS lacks for bargaining leverage with AMD.

You mean hundreds of thousands of chips. It won't be anywhere near the same level of volume as their console orders.

Equally, do you think Sony is going to run PS5 games for PSNow on PS4-based server racks?

So

PS4 platform = Orbis
PS4 APU = Liverpool

PS5 platform = Flute
PS5 SoC or iGPU = Gonzalo

Is that right?

For PS4 I would guess the platform is Liverpool and the GPU is Thebes (as per the PCI Database entry).

Orbis is/was allegedly Sony's internal codename; however, this was never officially acknowledged anywhere outside the pastebin leak (which, granted, proved to be 100% true).
 

sncvsrtoip

Banned
Apr 18, 2019
2,773
The performance deltas for Navi vs. Polaris and Navi vs. Vega are not the same.

I already posted a diagram with the numbers somewhere in this or the last thread, and I will re-post it now so we don't need to address the same question all over again every 30 pages ...

[image: 8kKkt7K.png: Navi vs. Polaris/Vega relative performance chart]

Source: Computerbase.de
Perf per FLOP depends on which Navi and which Vega (Vega 56 is much more efficient than Vega 64). A 7.5 TF Navi being on par with a 12.58 TF Vega (1.67x) doesn't mean that's a general rule, because it's not.
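For clarity, the 1.67x in this post is just the quotient of the two TFLOP figures (the post rounds down slightly):

```python
# Figures quoted in the post; the implied perf-per-FLOP ratio is their quotient.
navi_tf, vega_tf = 7.5, 12.58
print(f"Implied Navi perf/FLOP advantage: {vega_tf / navi_tf:.2f}x")  # ~1.68x
```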
 
Last edited:
DualShock 5 Patent

gofreak

Member
Oct 26, 2017
7,736
Hmm. DualShock 5 related? Two patents were filed this year. They seem to describe the use of an actuator to physically move the triggers around for haptic feedback to the user.


This pic seems to show the triggers tilting/twisting inwards towards each other, for example:

[image: 3hMxfKo.png: patent figure showing the triggers tilting/twisting inwards]


AFAIK Microsoft has been researching similar things for triggers - so perhaps that'll be something we can expect as standard in controllers next-gen.
 

VX1

Member
Oct 28, 2017
7,000
Europe
A bit off-topic, but it seems relations between Japan and South Korea are going from bad to worse:



This might have a big impact on memory and NAND flash prices/availability and whatnot... I wonder if it will influence next-gen consoles as well.
 

BitsandBytes

Member
Dec 16, 2017
4,576
A bit off-topic, but it seems relations between Japan and South Korea are going from bad to worse:



This might have a big impact on memory and NAND flash prices/availability and whatnot... I wonder if it will influence next-gen consoles as well.


At the end of the day, money and jobs talk. They will both puff out their chests but will ultimately figure things out. At least 9 times out of 10, that's how it goes.
 

Philippo

Developer
Verified
Oct 28, 2017
7,919
I missed the last bunch of pages, 10 or so; what did I miss?
Seems like the codenames and such for the PS5 dev kits and APUs are a lock?
And PS5/Gonzalo has a stronger GPU than expected but a "disappointing" weaker CPU, am I reading this right?
 

CosmicBolt

Self-Requested Ban
Member
Oct 28, 2017
884
I missed the last bunch of pages, 10 or so; what did I miss?
Seems like the codenames and such for the PS5 dev kits and APUs are a lock?
And PS5/Gonzalo has a stronger GPU than expected but a "disappointing" weaker CPU, am I reading this right?
The PS5's Zen 2 CPU will be a cut-down version of its desktop variant. This is expected and by no means "disappointing".
 
Nov 2, 2017
2,275
Every graph I see with 4K CPU scaling seems to look like this.

This is with a Titan Xp.


[images: TFiAKfs.png, SOgQCW0.png: 4K CPU-scaling benchmark graphs]
Duh, because the GPU is the bottleneck here at 4K ultra settings in these games... If you lowered the settings or used a more powerful GPU, it would be a different story. In theory there's no difference between 1080p and 4K if your GPU is powerful enough or the game isn't graphically intensive. I'm pretty sure a 60 fps console game, like Battlefield, on a 2080 Ti at 4K would show a difference.

And again, CPUs in consoles aren't going to spend most of their time running at <50% usage like they do on PC at 4K ultra. If that were the case, you might as well stick a quad-core in them, as your example shows.
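A minimal model of the bottleneck argument in this exchange: a frame ships only when both the CPU and GPU work for it is done, so frame time is roughly max(cpu_ms, gpu_ms). The millisecond figures below are invented for illustration.

```python
# Frame time is gated by the slower of the two processors each frame.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 8.0  # CPU work is roughly resolution-independent
for res, gpu_ms in [("1080p", 6.0), ("1440p", 11.0), ("4K", 24.0)]:
    limiter = "GPU" if gpu_ms > cpu_ms else "CPU"
    print(f"{res:>5}: {fps(cpu_ms, gpu_ms):5.1f} fps ({limiter}-bound)")
```

This is why the 4K graphs look flat across CPUs: the GPU term dominates the max, and CPU utilization sits well below 100%.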
 

chris 1515

Member
Oct 27, 2017
7,074
Barcelona Spain
Doubt has been cast over the benchmark that showed that, regarding whether it's actually related.

This is the only valid test we had, and it's OK: the performance looks like a 1700X, which is good...

[image: AMD-Flute-semicustom-APU-Zen-2-konzole-UserBenchmark-2.png: UserBenchmark result for the AMD Flute APU]




A good reminder of how the 3700X compares to a 1700X:


And an Intel i9-9900K; this is not bad, and far from the anemic Jaguar:


This is OK; we've never before had a console CPU whose performance wasn't lagging completely behind good PC gaming CPUs. I would say that for CPU-bound games it will be difficult to double the performance without better multithreading in the engine. Some engines are very good at this, like the AC engine and probably other Ubisoft engines, or CryEngine... I would not be surprised if one day some games need 12 cores at a better frequency to double the performance of the CPU inside the next-generation consoles.

www.gdcvault.com

Parallelizing the Naughty Dog Engine Using Fibers

This talk is a detailed walkthrough of the game engine modifications needed to make The Last of Us Remastered run at 60 fps on PlayStation 4. Topics covered will include the fiber-based job system Naughty Dog adopted for the game, the overall...

Like this methodology, for example, or the ones in other performant engines.
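The linked talk describes a fiber-based job system; Python has no fibers, so the sketch below shows only the core idea under that caveat: split frame work into many small independent jobs and drain them with a pool of workers. The systems and entity names are hypothetical.

```python
from concurrent.futures import ThreadPoolExecutor, wait

# Hypothetical per-entity "systems"; a real engine has many more.
def animate(entity):  return f"anim:{entity}"
def simulate(entity): return f"phys:{entity}"

def run_frame(entities, workers=8):
    # One job per entity per system; real engines batch far more finely,
    # and Python's GIL limits true parallelism; the structure is the point.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        jobs = [pool.submit(fn, e) for e in entities for fn in (animate, simulate)]
        wait(jobs)  # frame barrier: everything must finish before presenting
        return [j.result() for j in jobs]

print(run_frame(["player", "npc_1", "npc_2"]))
```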
 
Last edited:

sncvsrtoip

Banned
Apr 18, 2019
2,773
Duh, because the GPU is the bottleneck here at 4K ultra settings in these games... If you lowered the settings or used a more powerful GPU, it would be a different story. In theory there's no difference between 1080p and 4K if your GPU is powerful enough or the game isn't graphically intensive. I'm pretty sure a 60 fps console game, like Battlefield, on a 2080 Ti at 4K would show a difference.

And again, CPUs in consoles aren't going to spend most of their time running at <50% usage like they do on PC at 4K ultra. If that were the case, you might as well stick a quad-core in them, as your example shows.
But that's closer to a real-life game situation; usually we are GPU-limited.
 

DavidDesu

Banned
Oct 29, 2017
5,718
Glasgow, Scotland
Hmm. DualShock 5 related? Two patents were filed this year. They seem to describe the use of an actuator to physically move the triggers around for haptic feedback to the user.


This pic seems to show the triggers tilting/twisting inwards towards each other, for example:

[image: 3hMxfKo.png: patent figure showing the triggers tilting/twisting inwards]


AFAIK Microsoft has been researching similar things for triggers - so perhaps that'll be something we can expect as standard in controllers next-gen.
Oh please! Would be amazing for racing games!
 
Oct 25, 2017
17,904
Yup, 30 fps will be the norm, unfortunately; I even doubt that last bit, to be honest. The weak CPU is often blamed, but we've still seen 60 fps occasionally in just about every genre, so it's easy to see that it's not the real problem. Forza Motorsport 5 was 60 fps at 1080p on launch day on the weakest console; Driveclub was 30 fps on the most powerful one. And console games being down-ports from PC has been problematic this gen, and nothing says that won't continue next gen, considering how much is happening now with ray tracing and the intensifying battle between AMD, Nvidia and Intel.
Also, as a more solid example, look at Assassin's Creed Odyssey on Stadia and ignore the streaming: the game runs on 10.7 TF hardware, possibly not far from what we'll see on PS5 and XB2, and it's still 30 fps. Disappointing, but that's how it is.
I think that is because of the foundation they had to work with. Yeah, we've had 60 fps games where it was a priority for the genre, but I think doing it elsewhere would have been a lot of work and trouble for devs, so they didn't bother. They probably didn't even entertain the idea in a lot of cases.

With a much better foundation next generation, I think the potential for it will increase, and that will lead to a higher percentage of 60 fps games. Using arbitrary numbers: if it was 10% this gen, I could see a bump up to 15% or so.
 