I like the name Xbox 365 😂 Just so used to Office 365 name at work already. MS can also use the number "5" as part of their name.
https://www.youtube.com/watch?v=yE_i9wn7hgk after his criticism. https://www.youtube.com/watch?v=IwczmQNHVfo is a stomp on an XFX card, which they later fixed.
You will learn soon enough that you know nothing when you finish school. The sooner you accept that, the better your career will be. (I am about to graduate in EECS, so I don't speculate here with no background in this ;) )
Maybe 4-5 times was an uber exaggeration, I get that. But again, this is the reason people always wonder years later how graphics like these are possible on such an old GPU, whose PC counterpart couldn't even start the game at the lowest settings.
But yes, a Navi like Klee is describing should, in the end, be faster than a 2080 Ti on PC.
Because you think of the 2060 as a PC gfx card, not a console GPU.
A 2060 used in consoles is 4-5 times faster than a 2080 Ti on PC, but to understand this you have to be a CS/CE.
BTW, the Navi chip used in PS5/Scarlett is at a minimum at 2070 Super level when compared to a PC card. But again, in development for console titles it will be faster than a 2080 Ti by far, and really by far.
The 2080 Ti, and PC cards in general, have potential, but it can't be fully used on PCs due to the amount of different configs.
This is why GPUs in consoles are much faster.
(I am about to graduate in EECS, so I don't speculate here with no background in this ;) )
Did not at all. PS4 runs things that look like The Order: 1886 and Horizon Zero Dawn, whereas if you had a PS4-spec PC back in 2013, you would have had a miserable last 6 years in comparison, at least graphically. If anything, 2x is an under-exaggeration.
That infamous Carmack tweet. It aged like milk indeed.
Remedy devs on PS5 specs in the latest issue of OPM UK
On SSD: ... Faster hardware is always appreciated... But it's the new SSD that really stands out. Essentially streaming will become something we don't really have to worry about and it will free up some extra CPU bandwidth in the process.....
For something like Control, that could translate to an even deeper destruction system, richer, more detailed worlds, and simple QOL improvements like instant reloading after dying.
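The size of that streaming jump is easy to sketch with rough numbers. Note these throughput figures are illustrative assumptions on my part, not official PS5/Scarlett specs:

```python
# Rough streaming comparison: time to pull a 2 GB chunk of level data.
# Throughput figures are assumed ballparks, not official console specs.
HDD_MB_PER_S = 100      # ballpark for a base-console 5400 rpm HDD
NVME_MB_PER_S = 5000    # ballpark for a fast NVMe SSD
CHUNK_MB = 2048         # 2 GB of assets

hdd_seconds = CHUNK_MB / HDD_MB_PER_S    # ~20 s
nvme_seconds = CHUNK_MB / NVME_MB_PER_S  # well under a second
print(f"HDD: {hdd_seconds:.1f} s, NVMe: {nvme_seconds:.2f} s")
```

At HDD speeds an engine has to prefetch tens of seconds ahead of the player; at NVMe speeds the same chunk arrives in a fraction of a second, which is why streaming "stops being something to worry about."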
Adding this to the next-gen thread, hope you don't mind.
It's really going to be interesting seeing the jump in destruction in next-gen games.
They now have a much better CPU and SSD to help with that, and Control is a good example of where things are going to go destruction-wise, even in open-world games.
I've been playing through Jedi Fallen Order recently and the idea of instant respawns sounds too good to be true. That game has me waiting upwards of a minute each death on Xbox One X.
If I wasn't already craving next-gen consoles, I sure as hell am now. It doesn't help that I've been playing a ton of games on my Switch with almost no loading times.
It is much more complicated than that. A lot of people like to cite inflation or wage growth, but ultimately it boils down to consumer appetite: what are people prepared to pay for your product? That is your price point. People are so wet behind the ears about things like inflation and hardware cost increases. It's the natural order of things, people.
PS2 was the best selling hardware of all time. Obviously $299 is the magic price Sony can never migrate from...
Or $249, to match the Wii. Second best.
$399 should never have happened using that logic. But it did, and it worked.
$499 will be fine as well, as we've seen with Xbox hardware sales. No one will bat an eye at $499 by the end of 2020.
Trying to say $399 is the magic number is blatantly false due to the PS2 success at $299.
It does? 1080p 30fps at console settings? An HD7870 from 2012 runs RDR2 at PS4 settings. Anything else?
Of course The Order and Horizon won't run on those PCs. Because those are exclusives. If they weren't, they'd run on those PCs.
Don't know the PS4 settings, but in Vulkan it is able to do 30 fps uncapped at low settings.
The console version is a combination of all settings, so it's more expensive than this.
An HD7870 PC with a PS4-level Jaguar CPU would not run RDR2 anywhere close to PS4 settings, and saying otherwise is completely, objectively untrue and misleading. Anything else?
Saying Horizon Zero Dawn would run on same machine is just deranged.
Well, Xbox One X has settings below low settings. So yeah, base PS4 might be worse.
Xbox Senor Fidelity
What a great name xD
That pastebin is one of the craziest of them all xD
If I had to fill an assload of racks with scarlett for some estimated peak load, I might also look into how to utilize those normally unused compute resources outside of their target audience.
God of War respawns almost instantly, so this can be done on current generation too.
the console war genie is out of the bottle
They could do that to some extent without adding a bunch of silicon that would be a waste for games. Spencer said very specifically that - in implicit contrast with the XB1 - Scarlett is being designed with games in mind, and games alone. Spending budget on an outsized level of double-precision performance or tensor core performance for non-game workloads would leave the door wide open to another own-goal for game performance. It would also be quite antithetical to the approach taken in the XB1X, which actually looked to hone down perceived 'extraneous' bits inside the GPU CUs based on game workloads alone, to pack in as much game performance as possible.
and what about all those settings that are medium or higher? are we just going to ignore those?
Sorry to break it to you. Some settings are at low and even lower than low, which means you can't replicate them on PC. That is on One X. So yeah, you can't get a 1:1 comparison. But that's based on the fact that, yeah, you can't achieve lower than low.
It would completely depend on what they would be offering clients. In my opinion, the concern would mainly be towards the OS and the bits needed to run generalized software rather than what customizations they've done to the CPU and GPU specific to gaming. Having said that, you could probably spin up VMs on these fairly simply to get that going. I also didn't listen to whatever thing kicked off this discussion in this thread, I was just pointing out that them using these idle resources is not outside the realm of possibilities.
An entry-level CPU would do the job just fine. But yeah, "objectively untrue, misleading", "Horizon Zero Dawn unthinkable".
Good thing video facts exist to crush those fanboyish claims.
It would make more sense to have a variant for datacenters that can run Scarlett games or other AI/double-precision FP workloads than for the same silicon to go in the home units. That I could see for sure.
Sure, dismiss the 2x spec-for-spec claim by Carmack as 'aging like milk' and then be so nebulous with the CPU specs this equivalent PC could actually have. What, 3x the CPU power?
And you still won't back down on HZD running on this shit-box PC - lol
TDP is a made-up marketing number that has little to do with the power draw of a chip. Don't believe me? Go check out Gamers Nexus' video on how meaningless TDP is: https://www.youtube.com/watch?v=cVT1hydbBIE
Sony Drone Simulator yesssssssss
Not sure what to make of this one. UAV for course-mapping, but it seems to support live events, not just capture for later use.
The 7870 is a 2.5 TFLOPS GPU. The PS4 is 1.84 TFLOPS.
A 7850 (1.76 TFLOPS) or a GTX 570 (1.5 NVIDIA TFLOPS) would be a better comparison.
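Those figures fall straight out of the standard peak-FP32 formula (shader count × 2 ops per clock for FMA × clock speed). A quick sketch, using the public shader counts and clocks for these parts:

```python
# Peak FP32 throughput in TFLOPS: shaders * 2 ops/clock (FMA) * clock in GHz / 1000
def peak_tflops(shaders: int, clock_ghz: float) -> float:
    return shaders * 2 * clock_ghz / 1000.0

hd7870 = peak_tflops(1280, 1.000)  # Pitcairn XT, 1280 shaders @ 1 GHz  -> 2.56
ps4    = peak_tflops(1152, 0.800)  # 18 CUs (1152 shaders) @ 800 MHz    -> ~1.84
hd7850 = peak_tflops(1024, 0.860)  # Pitcairn Pro, 1024 shaders @ 860 MHz -> ~1.76
print(hd7870, ps4, hd7850)
```

Note NVIDIA's Fermi-era numbers (like the GTX 570's) aren't directly comparable with this formula, since that architecture counts shader operations differently.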
They call it mostly meaningless as a measurement of power draw, which does support my statement. Same TDP can give wildly different actual power draw, which is the only argument I have made, and I believe I quite clearly said it was meaningless to be used as a metric for power draw. I questioned people tying it to power draw, which it cannot be used as a metric for. Please don't expand on what I actually said to infer some meaning I didn't explicitly state.
Your Gamers Nexus video doesn't even support your conclusion.
They state that TDP is not a straightforward quantity and the calculation of it differs from vendor/IHV to vendor/IHV.
TDP is, in most cases, a meaningless number as it doesn't actually say anything about the power draw