It's going to be all cross-gen games, anyway. Neither console should have any issues running current-gen games at 4K60.
Looking at this thread, imagine how it's gonna be when the first DF comparison comes out lol
You do realize MS upped the CPU and GPU clocks a bit 'last minute' before Xbox One launch right?
There is plenty stopping them. They haven't designed the system around higher clocks. Their power and thermal control is all done around their set clock speeds. They can't just raise those speeds on a whim. The entire reason Sony is using this unusual variable frequency method and expensive cooling is to allow such speeds to work safely.
Right so now we are at the point where Xbox is being open about their base clocks and power levels and Sony is projecting boost speeds and an SSD, but people are saying that now it's all about SSDs baby! Who cares about tflops/power/BC, bah! All I want is those transfer rates, daddy Cerny!
Let's time travel back to 2013.
MS announced the Xbox One, the multitasking console to rule the living room. Price? 500 dollars, bundled with a Kinect that nobody wanted. Features: no used games, always online, and if offline for more than 24h it would become a brick. "We have a product for people who aren't able to get some form of connectivity, it's called Xbox 360"
Is it powerful?
- It has secret sauce; it has DDR3 memory because GDDR5 was "uncomfortable"; "have you seen Titanfall?"; crushed blacks + upscaling ("You realize you will see every game in 1080p as your output right?"); "We created DirectX"; the Power of the Cloud.
Sony announced the PS4; it just plays games. Price: 400 dollars + nothing. Features: 100 dollars less than the Xbox One.
If you ignore context, then of course we have a bunch of hypocrites here, don't we?
But if you think just a little and remember all the other things surrounding the Xbox One launch, it makes sense that people talked about the 40% difference in TF.
You can't make a product that is more expensive with less power and expect that people won't complain about it.
We have no idea if they have thermal headroom, no one but MS knows. I'd be shocked if they upped their clocks at this point, they have the higher numbers and all raising the clocks will do is reduce chip yields and increase fan noise on the console.
They certainly have thermal headroom. MS is using an expensive cooling system too.
They raised the Xbox One's frequency a few months before launch, and that console had a standard heat sink.
Yeah, this is what I keep thinking... like there's really not much to stop MS from upping the clocks a bit before launch, considering how 'wide and slow' the architecture appears to be. Not saying they will... but it's certainly plausible if they feel like it.
I wish the X had variable clocks as well, just because every game should be using dynamic resolution anyway; the resolution could scale with the clocks, which means we could push even a little further at times. In games with fixed resolutions it would be terrible, but I hope we never see fixed resolutions again.
Are they even trying to rein it in? GAF had its problems, particularly the shithead who owned it, but FUD like the article in the OP was shut down pretty quickly. It's embarrassing this thread has lasted over 9 hours and 11 pages on Era.
Legit surprised we haven't seen a guideline or even a staff post yet, tbh.
You do realize MS upped the CPU and GPU clocks a bit 'last minute' before Xbox One launch right?
And even if it's about third party concerns, they would develop for the lowest specs as many here say, which could be the regular Xbone gen for a year or two. Now, are people saying third parties will base their games on the lowest common denominator or the higher specs? PS5 got like double the I/O and Xbox got 15% in whatever TF is gonna do. I'd want a poll on what people think.
It's going to be all cross-gen games, anyway. Neither console should have any issues running current-gen games at 4K60.
If it bothers you so much, I would recommend putting these people on the ignore list.
Right so now we are at the point where Xbox is being open about their base clocks and power levels and Sony is projecting boost speeds and an SSD, but people are saying that now it's all about SSDs baby! Who cares about tflops/power/BC, bah! All I want is those transfer rates, daddy Cerny!
You could see how some people are a little miffed at that pivot, especially when they were browbeat about power for 4 years before the One X; even PS4 execs were saying power makes you a better gamer, so don't come in with a holier than thou attitude to console wars now that the pendulum is swinging the other way.
NXGamer and Digital Foundry have already covered this in far more detail.
Essentially, it's only misleading if you believe Cerny is lying about the system spending the majority of its time at its maximum potential speeds. NXGamer does a good job of explaining why what Cerny said makes complete sense and appears most probable, hence why it really is 10.28 TFLOPs. Not to mention that only a very tiny frequency drop (around 2%) would be needed to claw back a lot of power in the rare worst-case power-load scenarios.
I suggest you watch the video, as it shows how, whilst running games, the CPU/GPU already downclock based on usage on a frame-by-frame basis, hence both rarely max out at the same time anyway.
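If you want to sanity-check the scale of that trade-off, here's a back-of-the-envelope calculation using the publicly stated PS5 figures (36 CUs, max 2.23 GHz); the compute lost to a 2% downclock is tiny, which is the crux of the argument:

```python
# Back-of-the-envelope numbers for the "2% clock drop" point above.
# Uses the publicly stated PS5 figures: 36 CUs at up to 2.23 GHz.
CUS = 36
ALUS_PER_CU = 64          # shader ALUs per CU on RDNA
OPS_PER_CLOCK = 2         # an FMA counts as two floating-point ops

def tflops(clock_ghz):
    return CUS * ALUS_PER_CU * OPS_PER_CLOCK * clock_ghz / 1000.0

peak = tflops(2.23)            # ~10.28 TFLOPs at the max clock
dropped = tflops(2.23 * 0.98)  # ~10.07 TFLOPs after a 2% downclock

print(f"peak: {peak:.2f} TF, after 2% drop: {dropped:.2f} TF "
      f"({dropped / peak:.1%} of peak)")
```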
It looks like a lot of people are really wanting this to be a reverse of 2013, even when it shares no resemblance, just so they can have their revenge for how their favorite console was treated.
I was under the impression the variable clocks were more like turbo on Intel CPUs, is it not that?
The largest and easiest issue to see is how well the box handles thermals at its current clocks, and whether, say, bumping the GPU to 2 GHz and the CPU to 4.0 GHz/3.8 GHz would affect the console's ability to handle the additional heat.
Having variable clocks means devs need to develop around where they want the system sending power (CPU vs GPU) and try to maximize that.
With a locked, sustained clock, devs have access to the highest raw output the box has without having to juggle the power shared between the CPU and GPU.
The question that will come up is where devs decide to cut back on CPU power on the PS5 to push more image quality, or where we see image quality drop because power is being pulled away from the GPU and sent to the CPU to handle more load there.
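To make the juggling concrete, here's a purely illustrative toy model (invented wattages and a crude power curve, not Sony's actual algorithm) of a fixed budget shared between CPU and GPU, where handing more of it to one side costs the other side clock speed:

```python
# Purely illustrative toy model (numbers invented, not Sony's algorithm) of
# the trade-off described above: CPU and GPU share one power budget, and the
# side that gets less of the budget has to run at a lower clock.

GPU_MAX_GHZ, CPU_MAX_GHZ = 2.23, 3.5
GPU_MAX_W, CPU_MAX_W = 150.0, 50.0     # hypothetical draw at max clock
BUDGET_W = 180.0                        # hypothetical shared budget

def clock_for_power(max_ghz, max_w, allotted_w):
    # Crude assumption: power scales roughly with the cube of frequency
    # (voltage rises with clock), so frequency scales with the cube root.
    return max_ghz * min(1.0, (allotted_w / max_w) ** (1 / 3))

for gpu_w in (120.0, 140.0, 150.0):
    cpu_w = BUDGET_W - gpu_w
    print(f"GPU {gpu_w:.0f} W -> {clock_for_power(GPU_MAX_GHZ, GPU_MAX_W, gpu_w):.2f} GHz, "
          f"CPU {cpu_w:.0f} W -> {clock_for_power(CPU_MAX_GHZ, CPU_MAX_W, cpu_w):.2f} GHz")
```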
They did, but only by a very small amount. The GPU clock was increased by only 6%, for example. In comparison, the PS5's GPU clock is 22% faster than the XSX's. So whilst they may be able to change it late in the game, I'm not sure by how much they could.
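For reference, those percentages fall straight out of the public clock figures:

```python
# Where the quoted percentages come from (public clock figures):
xb1_pre, xb1_final = 800, 853   # Xbox One GPU clock in MHz, before/after the late bump
xsx, ps5 = 1825, 2230           # Series X fixed GPU clock vs PS5 max GPU clock, in MHz

print(f"Xbox One late increase: {xb1_final / xb1_pre - 1:.1%}")   # ~6.6%
print(f"PS5 max clock over XSX: {ps5 / xsx - 1:.1%}")             # ~22%
```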
No, Cerny was very specific to point out their variable frequencies work nothing like how they do on PCs and smartphones (where frequencies drop when things get too hot). He was very clear to say that wouldn't work because then you'd run into situations where a game might perform worse just because someone was playing in a hot room.
I was under the impression the variable clocks were more like turbo on Intel CPUs, is it not that?
With Sony only publicizing max clocks on a variable-clock machine, they are not wrong at all to question it...
When was the Xbox One the target for any game?
Honestly, all of these numerical differences probably won't mean much for multiplat titles, as the lowest common denominator will be the target.
For every game that runs on it. If a game can run on it, the devs obviously targeted it, because otherwise it wouldn't run at all.
The crazy thing about all this is that no one is saying the SX is inferior. Yes, it's the superior console in most areas. MS has done a great job marketing it and showing it off.
However, the hyperbole and downplay of the PS5 specs has been exhausting asf since Tuesday. You have devs, DF and NXGamer all praising the console for being smartly designed yet all it takes is one article from people who aren't devs to convince some of you Mark Cerny has lied or is being purposely vague.
I'd take the opinion of devs over this article any day. The sooner this forum can accept that next gen will be incredible either way on both consoles the better.
The only way of knowing how a GPU performs is by actually running games on it. This has always been the case.
In my post I specifically compare a 5700 that is overclocked to 2,194 MHz, which is even closer to the PS5's clock and shows a correlating improvement in performance.
It just looks like the 5700 responds much better to overclocking than the 5700 XT does; perhaps that's due to the XT hitting power limitations. But as you only mentioned the 5700 in your original post, that's the one I looked into.
But that is interesting nonetheless. I guess now it's a case of seeing how RDNA2 responds to OC'ing, but presumably Sony is happy enough with the results that they did it.
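As a rough yardstick for that comparison (the reference clock here is AMD's rated boost for the RX 5700; any individual card's sustained clock will differ), the clock bump alone puts a ceiling on the possible gain:

```python
# Rough context for the overclock comparison above (reference clock is AMD's
# published boost figure; the sustained clock of a given card will vary).
STOCK_BOOST_MHZ = 1725      # RX 5700 rated boost clock
OC_MHZ = 2194               # the overclock cited above
print(f"Clock-rate ceiling on the gain: {OC_MHZ / STOCK_BOOST_MHZ - 1:.0%}")
# Real-game gains will land below this, since memory bandwidth and power
# limits don't scale with the core clock.
```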
I think this is conjecture until we see how 3rd party games are running on both. PS5's GPU cache scrubbers and coherency engines, combined with the custom I/O and any potential RDNA2 architecture improvements, may paint a different picture.
Conversely, I think PS5's GPU at 2.23 GHz is very bandwidth starved, especially on the render backend. Moving to 14 Gbps chips from the 16 Gbps chips shown in the GitHub tests was a big sacrifice for price.
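For anyone wondering what that chip choice costs in raw numbers, on the PS5's 256-bit GDDR6 bus the per-pin speed maps directly to bandwidth:

```python
# Bandwidth difference between 14 Gbps and 16 Gbps GDDR6 on a 256-bit bus
# (the bus width of the PS5's memory setup).
BUS_BITS = 256
for gbps_per_pin in (14, 16):
    gb_per_s = gbps_per_pin * BUS_BITS / 8
    print(f"{gbps_per_pin} Gbps chips -> {gb_per_s:.0f} GB/s")
# 14 Gbps -> 448 GB/s (the PS5's announced figure); 16 Gbps -> 512 GB/s.
```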
Man, why can't we all just be happy with both systems? Why are you all getting so worked up over this thing? PS4 with all its glory will not play Halo, and SX with its beasty TF won't play ND games. Let's just hope both systems are no more than $500.
He's splitting hairs. As power increases, so does the temperature. So if they lower a frequency to stay within a power envelope, they're also lowering frequency as temperatures increase.
Modern CPUs and GPUs have thermal sensing diodes all over them. Hundreds in some cases. They won't be ignored. You can bet that they are all being monitored. It's just that they feel their cooling system has enough headroom to run without the extra heat affecting their programmed frequency curves for a given power load. But I'm certain you could abuse the chip enough with a hot environment to trigger a panic signal to slow it down. They'd be stupid not to.
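On the "small frequency drop, big power saving" point that keeps coming up: dynamic power scales roughly with frequency times voltage squared, and because voltage can be reduced along with frequency, a tiny downclock can shed a disproportionate amount of power. Illustrative numbers only:

```python
# Why a small downclock buys a lot of power headroom: dynamic power scales
# roughly as frequency x voltage^2, and voltage can drop along with frequency.
# Illustrative only; the real voltage/frequency curve is part-specific.
def relative_power(freq_scale, volt_scale):
    return freq_scale * volt_scale ** 2

# e.g. a 2% clock drop that allows a 4% voltage drop:
print(f"{relative_power(0.98, 0.96):.3f}x power")   # ~0.90x, i.e. roughly 10% saved
```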
Why would you ignore the CU count? It's not like it's insignificant by any stretch.
Here's a transcript of what Cerny said:
"The simplest approach would be to look at the actual temperature of the silicone die and throttle the frequency on that basis, but that won't work. It fails to create a consistent Playstation 5 experience. It wouldn't do to run a console slower simply because it was in a hot room. So rather than look at the actual temperature of the silicone die, we look at the activities that the GPU and CPU are performing and set the frequencies on that basis, which makes everything deterministic and repeatable."
My post was in response to whether Microsoft could increase their clock frequency late in the design stages, as they've done in the past, to try and get closer to the PS5's clock speed. Hence, CUs were never mentioned.
Really good post. Thanks for this.
I'm not a game programmer, but I do develop GPU-accelerated scientific models, so I have a few years' experience in optimizing for different GPU architectures to accelerate computation. So I have a sense of what makes something fast or slow, as I've seen it play out on various hardware.
The notion of a 'sustained TF rate' doesn't really make sense. The TF number is a peak theoretical number that will in all likelihood never actually be hit on either console (possibly occasionally for a few milliseconds at a time, but generally they'll not be computing at that rate). It's not a benchmark, it's just a way of counting components and clocks within a single number. It's actually a lot like the way a business considers the number of 'man hours' it'll take to perform a task. The actual computational throughput will be determined by things like thread occupancy and how much shared and local memory each thread requires. Since both consoles use the same architecture, this in theory affects both equally, but it's not nonsense to suggest the higher clock rate gives a bit of help to the PS5 here, as it's able to utilize local data (the data sitting in the GPU cache) and shuffle it out for the next piece of data that it needs more readily. It is an effective bandwidth increase on the ram->cache->computation pipeline.
More technical version:
The biggest factor in speeding up a GPU is how much you can saturate the compute units (those CUs that we keep hearing about). If you can get a concurrent thread on each ALU within each CU, and have minimal or no requirement to reach back to VRAM within a kernel call to swap data in and out of the CU cache, then you can get pretty close to your peak throughput. This is, in practice, not common, as the available local storage within the CU is tiny, a few tens of KB shared between all the ALUs. For example, in the Nvidia Volta architecture (which I'm most familiar with), there is a single 256KB block of memory (arranged in 32-bit registers) for every thread running on that SM to use for data exclusive to that thread. In a perfect world, every one of the 64 CUDA cores in a single Volta SM would have its own thread, meaning each one gets ~4KB per thread to store useful data. (There is 96KB of shared memory as well, but I'll ignore this for the moment. It's extremely useful but somewhat immaterial for this explanation.) This is not generally practical, in my experience, so you're left with two options, not mutually exclusive: you can reduce the number of concurrent threads, or you can periodically swap data in and out of registers by calling back to VRAM. The former is what Cerny was alluding to when he said it's harder to fill more CUs than fewer, although I somewhat disagree with his characterization in the case where all CUs on both platforms have access to the same relative register and shared memory. I'm not familiar with RDNA2 though, so I don't want to comment on that too much. In the latter case, which is almost always necessary to some degree, you can think of an analogy to screen tearing as to what happens under the hood.
Several threads are going about their business, making computations; thread A says 'oh, I need something from VRAM'. Thread A then gets paused, and thread B gets moved into its place to keep computing while the data for thread A is fetched. Meanwhile, thread B finishes its work, and thread A either is ready to keep going or isn't. This is determined by the latency of access to VRAM. Now, if thread A isn't ready, most likely another thread gets moved into thread B's place and keeps going. If thread A is ready, then it'll get shuffled back in to pick up the computation where it left off with its new data. The analogy to screen tearing is this: if you have ever played with v-sync on a 60 Hz monitor, and then on a 144 Hz monitor, you've probably noticed that screen tearing is far less noticeable than on the slower refresh. This is because the gap between getting data and being able to use it is smaller. A similar analogy holds with clock speeds in a GPU. A faster clock speed will generally lead to less 'down time' in any given ALU, as it is more likely to be ready to go sooner when the requisite data is available.
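A very rough way to see the clock-speed side of this (made-up cycle counts, not real RDNA2 figures): on-chip caches and local memory run at the core clock, so the "shuffling local data" mentioned above gets faster in wall-clock terms as the clock rises, whereas an off-chip VRAM miss does not:

```python
# Toy numbers for the "local data moves faster at a higher clock" point above:
# on-chip caches and local memory are clocked with the core, so their latency
# in nanoseconds and their bandwidth scale with the GPU clock, while VRAM
# latency does not. (Illustrative cycle counts, not real RDNA2 figures.)
CACHE_HIT_CYCLES = 100       # hypothetical cache-hit latency in cycles
VRAM_LATENCY_NS = 400.0      # hypothetical, roughly fixed in wall-clock time

for ghz in (1.825, 2.23):
    cache_ns = CACHE_HIT_CYCLES / ghz
    vram_cycles = VRAM_LATENCY_NS * ghz
    print(f"{ghz:.3f} GHz: cache hit ~{cache_ns:.0f} ns "
          f"(faster at the higher clock), VRAM miss ~{vram_cycles:.0f} cycles")
```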
What I want to point out is that NONE of this shows up in a TF metric. The underlying reality of swapping data in and out, the various bottlenecks, tradeoffs, etc.: all of that is presumed to essentially not exist when discussing TF. However, this is one of the biggest considerations when doing optimization, as you have to take into account these facts of life about having threads 'stall', so to speak.
Will this make the PS5 faster in computations than the XSX? In a few cases, possibly, but in general, no it won't. However it does mean that the story on what that gap is isn't as simple as many here are claiming. I expect that the PS5 may generally run at a slightly lower resolution (some quick calculations would put the resolution of 3504x1971 at a hair more than 16% reduction in pixel count), but in many cases I think the gap will be closer than you'd expect from a raw TF count, because the higher clock speed does help 'in the real world' in a somewhat non-linear fashion as compared to raw TF numbers in that it makes the penalty of moving data in and out of local storage smaller. It's not a HUGE difference (at least not in most cases), but it's not nothing.
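Spelling out the quick calculation referenced above, using the commonly cited peak figures (10.28 TF vs 12.15 TF):

```python
# The pixel-count comparison referenced above, spelled out:
native = 3840 * 2160
reduced = 3504 * 1971
ps5_tf, xsx_tf = 10.28, 12.15
print(f"pixel reduction: {1 - reduced / native:.1%}")   # ~16.7%, "a hair more than 16%"
print(f"raw TF shortfall: {1 - ps5_tf / xsx_tf:.1%}")    # ~15.4%
```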
Sony have so much mindshare. I don't think anybody would be claiming the Series X was well thought out and smartly designed if the roles were reversed.
There's a war to be won. LOL. I think people underestimate the overall power upgrade both of these consoles will have compared to last gen. Both of these consoles should be awesome.
why is there so much keyboard sharpening in this thread, it's just video games
6 months? Aren't we optimistic.
6 more months of these articles, folks. Hang in there. I assume neither console will really disappoint, and as always, people will go where the games they want to play are, regardless of power differences realized or perceived.
I think the need for it is not really clear on consoles.
When you mess with my new best friend GB/s, there's bound to be problems here.
DF are biased Sony shills... don't chu kno nothin'?