Status
Not open for further replies.

Super Barrier

Member
Nov 20, 2017
1,337
I like the name Xbox 365 😂 I'm just so used to the Office 365 name at work already. MS can also use the number "5" as part of the name.
 

GhostTrick

Member
Oct 25, 2017
11,546
Maybe 4-5 times was an uber exaggeration, I get that. But again, this is the reason people always wonder years later how graphics like these are possible on such an old GPU, whose PC counterpart couldn't even start the game at the lowest settings.


But yes, a Navi chip like the one Klee is describing should in the end be faster than a 2080 Ti on PC.



Tell me which game doesn't start on an HD7870. I'm waiting.
 

JaggedSac

Member
Oct 25, 2017
2,990
Burbs of Atlanta
If I had to fill an assload of racks with Scarlett for some estimated peak load, I might also look into how to utilize those normally unused compute resources outside of their target audience.
 

Deleted member 1589

User requested account closure
Banned
Oct 25, 2017
8,576
All I know is I'll be extremely disappointed if they don't name the server blades XBlades.

Also, BladeStation for the other side.
 

CelestialAtom

Mambo Number PS5
Member
Oct 26, 2017
6,176
Because you think of the 2060 as a PC gfx card, not a console GPU.

A 2060 used in consoles is 4-5 times faster than a 2080 Ti on PC, but to understand this you have to be a CS/CE.

BTW, the Navi chip used in PS5/Scarlett is at a minimum on 2070 Super level when compared to a PC card. But again, in development for console titles it will be faster than a 2080 Ti by far, and really by far.

The 2080 Ti, and PC cards in general, have potential, but it can't be used on PCs due to the number of different configs.

This is why GPUs in consoles are much faster.

(I am about to graduate in EECS, so I don't speculate here with no background in this ;) )

Honestly, I believe it will be the equivalent of a 2080 (which is a huge achievement), but beating a 2080 Ti? I just can't see that. Who knows? Maybe my grounded perspective is wrong, but I just don't believe it to be possible. Also, I just can't allow myself to get hyped over technology, as that usually leads to disappointment when said technology is released.

I honestly think PS5/Scarlett will be more about the CPU/SSD improvements, rather than spending top dollar on GPU improvements.
 

gundamkyoukai

Member
Oct 25, 2017
21,593
Remedy devs on PS5 specs in the latest issue of OPM UK

On SSD: ...Faster hardware is always appreciated... But it's the new SSD that really stands out. Essentially, streaming will become something we don't really have to worry about, and it will free up some extra CPU bandwidth in the process...
For something like Control, that could translate to an even deeper destruction system; richer, more detailed worlds; and simple QOL improvements like instant reloading after dying.

Adding this to the next-gen thread, hope you don't mind.

It's really going to be interesting seeing the jump in destruction in next-gen games.
They now have a much better CPU and SSD to help with that, and Control is a good example of where things are going to go destruction-wise, even in open-world games.
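Rough back-of-envelope on why streaming stops being a worry, by the way. The chunk size and drive throughputs below are illustrative guesses for each drive class, not confirmed next-gen specs:

```python
# Back-of-envelope: seconds to pull one streaming chunk off each drive class.
# Throughput figures are ballpark assumptions, not confirmed console specs.
CHUNK_GB = 1.0  # hypothetical open-world scene chunk

drives_gb_per_s = {
    "5400rpm HDD (base PS4 class)": 0.1,   # ~100 MB/s sequential
    "SATA SSD": 0.5,                       # ~500 MB/s
    "NVMe SSD": 3.5,                       # the class of drive rumoured for next gen
}

for name, speed in drives_gb_per_s.items():
    print(f"{name:>30}: {CHUNK_GB / speed:5.1f} s per {CHUNK_GB:.0f} GB chunk")
```

Going from ten-ish seconds per chunk to a fraction of a second is what lets devs stop designing around loading corridors.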
 

GhostTrick

Member
Oct 25, 2017
11,546
Did not at all. PS4 runs things that look like The Order: 1886 and Horizon Zero Dawn, whereas if you had a PS4-spec PC back in 2013, you would have had a miserable last six years in comparison, at least graphically. If anything, 2x is an understatement.



An HD7870 from 2012 runs RDR2 at PS4 settings. Anything else?
Of course The Order and Horizon won't run on those PCs. Because those are exclusives. If they weren't, they'd run on those PCs.
 

Deleted member 5764

User requested account closure
Banned
Oct 25, 2017
6,574
Adding this to the next-gen thread, hope you don't mind.

It's really going to be interesting seeing the jump in destruction in next-gen games.
They now have a much better CPU and SSD to help with that, and Control is a good example of where things are going to go destruction-wise, even in open-world games.

I've been playing through Jedi Fallen Order recently and the idea of instant respawns sounds too good to be true. That game has me waiting upwards of a minute each death on Xbox One X.

If I wasn't already craving next-gen consoles, I sure as hell am now. It doesn't help that I've been playing a ton of games on my Switch with almost no loading times.
 

gundamkyoukai

Member
Oct 25, 2017
21,593
I've been playing through Jedi Fallen Order recently and the idea of instant respawns sounds too good to be true. That game has me waiting upwards of a minute each death on Xbox One X.

If I wasn't already craving next-gen consoles, I sure as hell am now. It doesn't help that I've been playing a ton of games on my Switch with almost no loading times.

Yes, loading getting cut down will be great.
As a person who loves fighting games, it takes way too long to play them now.
Hell, I stopped trying to get better ranks in DMCV because the loading was too fucking long.
Fast travel doesn't even seem like fast travel when you have to wait minutes to move around the map in some games.
It will be hard for people to go back, just because of the loading times.
 

mangochutney

Member
Jun 11, 2018
375
People are so wet behind the ears about things like inflation and hardware cost increases. It's the natural order of things, people.

PS2 was the best selling hardware of all time. Obviously $299 is the magic price Sony can never migrate from...

Or $249, to match the Wii. Second best.

$399 should never have happened using that logic. But it did, and it worked.

$499 will be fine as well, as we've seen with Xbox hardware sales. No one will bat an eye at $499 by the end of 2020.

Trying to say $399 is the magic number is blatantly false due to the PS2's success at $299.
It is much more complicated than that. A lot of people like to cite inflation or wage growth, but ultimately it boils down to consumer appetite. What are people prepared to pay for your product? That is your price point.

What someone is prepared to pay will also be linked to the value in the product, so things like storage capacity play a role in that. Without knowing what we're going to get, it is tough to give a decent guess.

Another thing to remember is that inflation is an average figure across all industries. I can only speak to my own country, but if you take a sector like clothing, inflation there has been non-existent. You'll pay a similar amount for a t-shirt now as you would decades ago - but according to inflation it should be double the price by now, except it isn't.

I don't think we'll see $399 again. But I don't see $499 either.

$449/€499/£449.
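For the people who do cite inflation, this is the quick arithmetic they usually run. The flat 2% average annual rate here is just an illustrative assumption, not an official CPI figure:

```python
# Compound-inflation adjustment: price * (1 + rate) ** years.
# The flat 2% average annual rate is an illustrative assumption.
def adjust(price_usd, years, rate=0.02):
    return price_usd * (1 + rate) ** years

print(f"PS2's $299 (2000) is roughly ${adjust(299, 20):.0f} in 2020 money")
print(f"PS4's $399 (2013) is roughly ${adjust(399, 7):.0f} in 2020 money")
```

Which lands in the $440-460 range either way - but as above, consumer appetite, not arithmetic, sets the actual price.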
 

degauss

Banned
Oct 28, 2017
4,631
An HD7870 from 2012 runs RDR2 at PS4 settings. Anything else?
Of course The Order and Horizon won't run on those PCs. Because those are exclusives. If they weren't, they'd run on those PCs.

An HD7870 PC with a PS4-level Jaguar CPU would not run RDR2 at even close to PS4 settings - and saying otherwise is completely, objectively untrue and misleading. Anything else?

Saying Horizon Zero Dawn would run on the same machine is just deranged.
 

GhostTrick

Member
Oct 25, 2017
11,546
An HD7870 PC with a PS4-level Jaguar CPU would not run RDR2 at even close to PS4 settings - and saying otherwise is completely, objectively untrue and misleading. Anything else?

Saying Horizon Zero Dawn would run on the same machine is just deranged.


An entry-level CPU would do the job just fine. But yeah, "objectively untrue, misleading", "Horizon Zero Dawn unthinkable".
Good thing video evidence exists to crush those fanboyish claims.
 

Doctor Avatar

Banned
Jan 10, 2019
2,666
Well, Xbox One X has settings under PC's low settings. So yeah, base PS4 might be worse.

 

gofreak

Member
Oct 26, 2017
7,850
If I had to fill an assload of racks with Scarlett for some estimated peak load, I might also look into how to utilize those normally unused compute resources outside of their target audience.

They could do that to some extent without adding a bunch of silicon that would be a waste for games. Spencer said very specifically that - in implicit contrast with the XB1 - Scarlett is being designed with games in mind, and games alone. Spending budget on an outsized level of double-precision or tensor-core performance for non-game workloads would leave the door wide open to another own-goal for game performance. It would also be quite antithetical to the approach taken in the XB1X, which honed away perceived 'extraneous' bits inside the GPU CUs based on game workloads alone, to pack in as much game performance as possible.
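To put rough numbers on that silicon trade-off: shipped GPUs budget very different fractions of their throughput for double precision. These are the commonly cited FP64:FP32 rates for a few parts - approximate figures, from memory:

```python
# Commonly cited FP64:FP32 throughput ratios (approximate figures).
# Gaming parts throttle double precision hard; compute parts pay
# silicon area and power for it.
fp64_fraction_of_fp32 = {
    "PS4 GPU (GCN, gaming)": 1 / 16,
    "GTX 1080 (gaming)": 1 / 32,
    "Tesla V100 (datacenter)": 1 / 2,
}

for part, frac in fp64_fraction_of_fp32.items():
    print(f"{part:>24}: FP64 at 1/{round(1 / frac)} of FP32 rate")
```

Closing that 1/16-to-1/2 gap for server workloads is exactly the kind of area spend that buys games nothing.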
 

Mitchman1411

Member
Jul 28, 2018
635
Oslo, Norway
I've been playing through Jedi Fallen Order recently and the idea of instant respawns sounds too good to be true. That game has me waiting upwards of a minute each death on Xbox One X.

If I wasn't already craving next-gen consoles, I sure as hell am now. It doesn't help that I've been playing a ton of games on my Switch with almost no loading times.
God of War respawns almost instantly, so this can be done on the current generation too.
 

JaggedSac

Member
Oct 25, 2017
2,990
Burbs of Atlanta
They could do that to some extent without adding a bunch of silicon that would be a waste for games. Spencer said very specifically that - in implicit contrast with the XB1 - Scarlett is being designed with games in mind, and games alone. Spending budget on an outsized level of double-precision or tensor-core performance for non-game workloads would leave the door wide open to another own-goal for game performance. It would also be quite antithetical to the approach taken in the XB1X, which honed away perceived 'extraneous' bits inside the GPU CUs based on game workloads alone, to pack in as much game performance as possible.

It would completely depend on what they would be offering clients. In my opinion, the concern would mainly be the OS and the bits needed to run generalized software, rather than the gaming-specific customizations they've made to the CPU and GPU. Having said that, you could probably spin up VMs on these fairly simply to get that going. I also didn't listen to whatever kicked off this discussion in this thread; I was just pointing out that using these idle resources is not outside the realm of possibility.
 

modiz

Member
Oct 8, 2018
18,205


Sorry to break it to you. Some settings are at low and even lower than low, which means you can't replicate them on PC. That is on the One X. So yeah, you can't get a 1:1 comparison. But that's based on the fact that, yeah, you can't go lower than low.
And what about all those settings that are medium or higher? Are we just going to ignore those?
 

GhostTrick

Member
Oct 25, 2017
11,546
And what about all those settings that are medium or higher? Are we just going to ignore those?


Of course not. I never said all the settings were under low.
I said "it has settings under low", which means you can't replicate them. You'll also notice that the "low/lower than low" settings are often the more demanding ones.
 

gofreak

Member
Oct 26, 2017
7,850
It would completely depend on what they would be offering clients. In my opinion, the concern would mainly be towards the OS and the bits needed to run generalized software rather than what customizations they've done to the CPU and GPU specific to gaming. Having said that, you could probably spin up VMs on these fairly simply to get that going. I also didn't listen to whatever thing kicked off this discussion in this thread, I was just pointing out that them using these idle resources is not outside the realm of possibilities.

It would make more sense to have a datacenter variant - one that can run Scarlett games or other AI/double-precision FP workloads - than to put the same silicon in the home units. That I could see for sure.
 

degauss

Banned
Oct 28, 2017
4,631
An entry-level CPU would do the job just fine. But yeah, "objectively untrue, misleading", "Horizon Zero Dawn unthinkable".
Good thing video evidence exists to crush those fanboyish claims.

Sure, dismiss the 2x spec-for-spec claim by Carmack as 'aging like milk' and then be so nebulous with CPU specs that this equivalent PC could actually have, what, 3x the CPU power?

And you still won't back down on HZD running on this shit-box PC - lol
 

GhostTrick

Member
Oct 25, 2017
11,546
Sure, dismiss the 2x spec-for-spec claim by Carmack as 'aging like milk' and then be so nebulous with CPU specs that this equivalent PC could actually have, what, 3x the CPU power?

And you still won't back down on HZD running on this shit-box PC - lol


Of course it aged like milk. Tell me why so many GPU-bound games perform more or less the same as their console counterparts?
If you believe the "2x spec for spec" claim despite this generation, you either haven't been paying attention or are in for a rude awakening.

And yeah, I won't back down on HZD. Why? Because we have no way to demonstrate that a console-exclusive game runs on a platform it isn't available on. We do know, though, that games as demanding, if not more so, run the same way as their console counterparts on equivalent hardware.
 

AegonSnake

Banned
Oct 25, 2017
9,566
An HD7870 PC with a PS4-level Jaguar CPU would not run RDR2 at even close to PS4 settings - and saying otherwise is completely, objectively untrue and misleading. Anything else?

Saying Horizon Zero Dawn would run on the same machine is just deranged.
An HD7870 from 2012 runs RDR2 at PS4 settings. Anything else?
Of course The Order and Horizon won't run on those PCs. Because those are exclusives. If they weren't, they'd run on those PCs.

The 7870 is a 2.5 TFLOPS GPU. The PS4 is 1.84 TFLOPS.

A 7850 (1.76 TFLOPS) or a GTX 570 (1.5 NVIDIA TFLOPS) would be a better comparison.
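Those numbers fall straight out of shader count × clock × 2 ops per cycle, if anyone wants to check the arithmetic:

```python
# Single-precision FLOPS = shaders * clock * 2 (one FMA = 2 ops per cycle).
def tflops(shaders, clock_ghz):
    return shaders * clock_ghz * 2 / 1000

print(f"HD 7870: {tflops(1280, 1.00):.2f} TFLOPS")  # 20 CUs x 64 lanes, 1000 MHz
print(f"PS4:     {tflops(1152, 0.80):.2f} TFLOPS")  # 18 CUs, 800 MHz
print(f"HD 7850: {tflops(1024, 0.86):.2f} TFLOPS")  # 16 CUs, 860 MHz
```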
 
Oct 26, 2017
6,151
United Kingdom
TDP is a made-up marketing number that has little to do with the power draw of a chip. Don't believe me? Go check out the Gamers Nexus video on how meaningless TDP is: https://www.youtube.com/watch?v=cVT1hydbBIE

Your Gamers Nexus video doesn't even support your conclusion.

They state that TDP is not a straightforward quantity and the calculation of it differs from vendor/IHV to vendor/IHV.

This does not mean that TDP is a useless BS number with no relation to electrical power consumption. The problem here is that you're reading something else into their statement.

TDP is "Thermal Design Power", i.e. a design metric used to inform a maximum thermal power to be dissipated from a given processor by a cooling solution. Of course it's a function of electrical power consumption, however, whether that is instantaneous peak power consumption or sustained max power consumption, always plus some design safety margin, will depend on who is determining it.

The key point for consoles is that, unlike on PC, the folks determining the formula for TDP are the same folks who will design the cooling solution. So for a console, TDP maps directly to power consumption.
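To make the vendor-to-vendor point concrete: AMD, for one, has defined TDP in thermal-resistance terms, roughly TDP = (t_case_max − t_ambient) / θca. A toy calculation assuming that form, with made-up illustrative values rather than any real chip's numbers:

```python
# AMD-style thermal definition: TDP (W) = (t_case_max - t_ambient) / theta_ca,
# where theta_ca is the cooler's assumed thermal resistance in degC per watt.
# All values below are made-up illustrative numbers, not any real chip's.
def tdp_watts(t_case_max_c, t_ambient_c, theta_ca_c_per_w):
    return (t_case_max_c - t_ambient_c) / theta_ca_c_per_w

# Same silicon, two different assumed coolers -> two different "TDPs",
# with zero change in actual electrical power draw:
print(f"{tdp_watts(70, 42, 0.28):.0f} W")  # modest cooler assumption -> 100 W
print(f"{tdp_watts(70, 42, 0.19):.0f} W")  # beefier cooler assumption -> ~147 W
```

That's why the number differs between vendors while still being tied to electrical power - the formula's inputs are design assumptions, and on a console the same team picks all of them.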
 

Deleted member 1589

User requested account closure
Banned
Oct 25, 2017
8,576
I can't wait to fly a drone to explore a world, have the battery die after 30 minutes, charge it up in a QTE, and fly it back again.
 

GhostTrick

Member
Oct 25, 2017
11,546
The 7870 is a 2.5 TFLOPS GPU. The PS4 is 1.84 TFLOPS.

A 7850 (1.76 TFLOPS) or a GTX 570 (1.5 NVIDIA TFLOPS) would be a better comparison.


The GTX 570 wouldn't cut it. It's a completely different architecture. You're right, the HD 7870 is a faster GPU. And it's still a bit faster in most scenarios. In any case, we're looking at a 30%-class difference in the worst cases. Not two times minimum. And certainly not "over two times".
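Quick sanity check on that, using the TFLOPS figures quoted above:

```python
# Raw throughput gap between the two cards in question:
hd7870_tflops, ps4_tflops = 2.56, 1.84
print(f"HD 7870 advantage: {(hd7870_tflops / ps4_tflops - 1) * 100:.0f}%")  # ~39%
```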
 
OP

Mecha Meister

Next-Gen Guru
Member
Oct 25, 2017
2,824
United Kingdom
Because you think of the 2060 as a PC gfx card, not a console GPU.

A 2060 used in consoles is 4-5 times faster than a 2080 Ti on PC, but to understand this you have to be a CS/CE.

BTW, the Navi chip used in PS5/Scarlett is at a minimum on 2070 Super level when compared to a PC card. But again, in development for console titles it will be faster than a 2080 Ti by far, and really by far.

The 2080 Ti, and PC cards in general, have potential, but it can't be used on PCs due to the number of different configs.

This is why GPUs in consoles are much faster.

(I am about to graduate in EECS, so I don't speculate here with no background in this ;) )



There's evidence that says otherwise when you look at the current-generation consoles and their closest PC-equivalent GPUs.

An HD7870 PC with a PS4-level Jaguar CPU would not run RDR2 at even close to PS4 settings - and saying otherwise is completely, objectively untrue and misleading. Anything else?

Saying Horizon Zero Dawn would run on the same machine is just deranged.

I'm sure GhostTrick was speaking specifically about the GPU.
An 8-core Jaguar CPU doesn't even exist outside the APUs used in the consoles, so we can't really do a comparison with comparable hardware there.

You know what, I'm going to do some wild shit out of curiosity.



My Ryzen 5 3600 at 1GHz... Ooooh boy.
 

Mitchman1411

Member
Jul 28, 2018
635
Oslo, Norway
Your Gamers Nexus video doesn't even support your conclusion.

They state that TDP is not a straightforward quantity and the calculation of it differs from vendor/IHV to vendor/IHV.
They call it mostly meaningless as a measurement of power draw, which does support my statement. The same TDP can correspond to wildly different actual power draw, which is the only argument I have made, and I believe I quite clearly said it was meaningless as a metric for power draw. I questioned people tying it to power draw, which it cannot be used as a metric for. Please don't expand on what I actually said to infer some meaning I didn't explicitly state.
This is what I wrote in the context of power usage:
TDP is, in most cases, a meaningless number as it doesn't actually say anything about the power draw
 
Oct 26, 2017
6,151
United Kingdom
They call it mostly meaningless as a measurement of power draw, which does support my statement. The same TDP can correspond to wildly different actual power draw, which is the only argument I have made, and I believe I quite clearly said it was meaningless as a metric for power draw. I questioned people tying it to power draw, which it cannot be used as a metric for. Please don't expand on what I actually said to infer some meaning I didn't explicitly state.

Which is only true on PC, where you're trying to determine the electrical power draw of different devices from different vendors, and the vendors calculate TDP differently. It IS related to electrical power draw even on PC; it's just that the relationship itself differs from device to device and vendor to vendor. So your comment is wrong.

All of this is irrelevant to the console thread we're in, where TDP directly informs, to a large extent, the cost impact of the main processor's electrical power consumption on the console project as a whole.

Ergo, your statement that TDP is a meaningless BS number, in a thread about consoles, is wrong.
 

AegonSnake

Banned
Oct 25, 2017
9,566
In regards to what the PS4 GPU is capable of, I think it's important to take into account the fact that nearly every game this gen has realtime cutscenes. Yes, they use souped-up versions of in-game character models; yes, the lighting is better and higher-quality DoF is used to improve graphical quality during cutscenes. However, the early-gen tech demos were also cutscene-only, and thus I think these realtime cutscenes are a good barometer for comparison with the tech demos from the start of this gen.

[Eight GIFs of realtime cutscenes from current-gen PS4 games.]

If next gen the in-game models look this good, I will be happy.
 