
Monarch1501

Designer @ Dontnod
Verified
Nov 2, 2017
161
To chime in on the subject: as a designer, I rarely, rarely ever put any thought into the GPU side of things. IIRC, there was only one level in my 5-year career where I had to find solutions to a GPU problem (heavy FPS drops on base XB and PS4 in a particular scene in Life Is Strange 2), with the help of engineers, tech artists, and our lighting artists.

On the other hand, I know for a fact that every designer had to micro-manage pre-streaming assets and levels, and even unload stuff as a scene was unfolding, to free up RAM for the next level, since we had a RAM budget for every scene.

Having an SSD in the base model of both next-gen consoles could theoretically mean less micro-management of RAM/streaming on our end and more time to polish the core experience. Having said that, it doesn't mean peak GPU power isn't something other departments are paying close attention to.
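To make the RAM-budget point concrete, here is a minimal, hypothetical sketch of the kind of per-scene bookkeeping being described. The class, asset names, and sizes are all invented for illustration and are not from any real engine.

```cpp
// Illustrative sketch only: a toy per-scene RAM budget tracker of the kind the
// post describes. All names, sizes and the Asset type are made up for clarity.
#include <cstdint>
#include <iostream>
#include <string>
#include <vector>

struct Asset {
    std::string name;
    std::uint64_t bytes;   // resident size once streamed in
};

class SceneMemoryBudget {
public:
    explicit SceneMemoryBudget(std::uint64_t budgetBytes) : budget_(budgetBytes) {}

    // Try to pre-stream an asset for the upcoming scene; refuse if it would
    // blow the per-scene RAM budget (the designer then has to unload something).
    bool preStream(const Asset& asset) {
        if (used_ + asset.bytes > budget_) {
            std::cout << "Cannot pre-stream " << asset.name << ": over budget\n";
            return false;
        }
        used_ += asset.bytes;
        resident_.push_back(asset);
        return true;
    }

    // Unload an asset from the current scene to free RAM for the next one.
    void unload(const std::string& name) {
        for (auto it = resident_.begin(); it != resident_.end(); ++it) {
            if (it->name == name) {
                used_ -= it->bytes;
                resident_.erase(it);
                return;
            }
        }
    }

    std::uint64_t freeBytes() const { return budget_ - used_; }

private:
    std::uint64_t budget_;
    std::uint64_t used_ = 0;
    std::vector<Asset> resident_;
};

int main() {
    constexpr std::uint64_t MiB = 1024ull * 1024ull;
    SceneMemoryBudget scene(512 * MiB);                    // hypothetical per-scene budget

    scene.preStream({"diner_interior", 300 * MiB});
    scene.preStream({"next_level_geometry", 280 * MiB});   // fails: over budget
    scene.unload("diner_interior");                        // designer frees RAM first...
    scene.preStream({"next_level_geometry", 280 * MiB});   // ...then the next level fits

    std::cout << "Free: " << scene.freeBytes() / MiB << " MiB\n";
}
```

With a faster drive, more of this juggling could happen just-in-time instead of being hand-managed by designers, which is the post's point.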
 

endlessflood

Banned
Oct 28, 2017
8,693
Australia (GMT+10)
To chime in on the subject: as a designer, I rarely, rarely ever put any thought into the GPU side of things. IIRC, there was only one level in my 5-year career where I had to find solutions to a GPU problem (heavy FPS drops on base XB and PS4 in a particular scene in Life Is Strange 2), with the help of engineers, tech artists, and our lighting artists.

On the other hand, I know for a fact that every designer had to micro-manage pre-streaming assets and levels, and even unload stuff as a scene was unfolding, to free up RAM for the next level, since we had a RAM budget for every scene.

Having an SSD in the base model of both next-gen consoles could theoretically mean less micro-management of RAM/streaming on our end and more time to polish the core experience. Having said that, it doesn't mean peak GPU power isn't something other departments are paying close attention to.
Thank you for taking the time to share :)

Small developer insights like this make wading through pages of infantile bickering at least worth something.
 

Tiago Rodrigues

Attempted to circumvent ban with alt account
Banned
Nov 15, 2018
5,244
To chime in on the subject: as a designer, I rarely, rarely ever put any thought into the GPU side of things. IIRC, there was only one level in my 5-year career where I had to find solutions to a GPU problem (heavy FPS drops on base XB and PS4 in a particular scene in Life Is Strange 2), with the help of engineers, tech artists, and our lighting artists.

On the other hand, I know for a fact that every designer had to micro-manage pre-streaming assets and levels, and even unload stuff as a scene was unfolding, to free up RAM for the next level, since we had a RAM budget for every scene.

Having an SSD in the base model of both next-gen consoles could theoretically mean less micro-management of RAM/streaming on our end and more time to polish the core experience. Having said that, it doesn't mean peak GPU power isn't something other departments are paying close attention to.

This is what most developers have been saying. Everyone who actually knows this stuff laughs at how people look at the teraflops number and call it a day.
 

nib95

Contains No Misinformation on Philly Cheesesteaks
Banned
Oct 28, 2017
18,498
To chime in on the subject: as a designer, I rarely, rarely ever put any thought into the GPU side of things. IIRC, there was only one level in my 5-year career where I had to find solutions to a GPU problem (heavy FPS drops on base XB and PS4 in a particular scene in Life Is Strange 2), with the help of engineers, tech artists, and our lighting artists.

On the other hand, I know for a fact that every designer had to micro-manage pre-streaming assets and levels, and even unload stuff as a scene was unfolding, to free up RAM for the next level, since we had a RAM budget for every scene.

Having an SSD in the base model of both next-gen consoles could theoretically mean less micro-management of RAM/streaming on our end and more time to polish the core experience. Having said that, it doesn't mean peak GPU power isn't something other departments are paying close attention to.

Appreciate the insight. Always interesting to read/hear little background development titbits like this.
 
Oct 27, 2017
2,073
To chime in on the subject: as a designer, I rarely, rarely ever put any thought into the GPU side of things. IIRC, there was only one level in my 5-year career where I had to find solutions to a GPU problem (heavy FPS drops on base XB and PS4 in a particular scene in Life Is Strange 2), with the help of engineers, tech artists, and our lighting artists.

On the other hand, I know for a fact that every designer had to micro-manage pre-streaming assets and levels, and even unload stuff as a scene was unfolding, to free up RAM for the next level, since we had a RAM budget for every scene.

Having an SSD in the base model of both next-gen consoles could theoretically mean less micro-management of RAM/streaming on our end and more time to polish the core experience. Having said that, it doesn't mean peak GPU power isn't something other departments are paying close attention to.

I'm not a dev in the slightest, but your comment reminds me of a James Cooper tweet, which I think is a really interesting one.

 

Pasedo

Member
Oct 27, 2017
52
This is precisely the problem! This is what you believe.

You are willing to write off the opinion of seasoned veteran developers from some of the top studios in the world because you have an inherent belief that something Sony spent hundreds of millions of dollars customizing is a bad solution. You are also willing to write off Jason Schreier, who just last week trashed Sony's top studio.

I'm also afraid you haven't actually looked into the customizations Sony has done on the SSD side. It's not just about the SSD being twice as fast. The PS5's SSD has its own CPU, has a very large SRAM pool, has a novel indexing system, has a Kraken engine (equivalent in power to Zen 2 cores), has GPU scrubbers (to alleviate the burden of GPU stalls) - and I could go on and on and on. This isn't about 2.4 vs 5.5 GB/s. This is about a REVOLUTIONARY way in which the console's parts speak to each other.

So please - the XSX is a great machine and will most likely have better RT/resolution, but the PS5's SSD is a whole lot better and has a ton of extra hardware in the machine to support it. The PS5 will have better-quality assets per frame - and it was designed with that purpose in mind.
Better-quality assets per frame. I'm not a tech guy, but I like this clear statement. So I'm thinking: with conventional games today, if you want more FPS you basically have to lower the image quality. Are we saying that with the PS5 they can get that same 60 fps, but with its super-optimised hardware all the way through they can get more quality at 60 fps? So the conventional rules no longer apply? So basically, even though the XSX has more raw power, the PS5 is more efficient, so it can match or exceed its visuals at the same fps target?
 

BradGrenz

Banned
Oct 27, 2017
1,507
Better-quality assets per frame. I'm not a tech guy, but I like this clear statement. So I'm thinking: with conventional games today, if you want more FPS you basically have to lower the image quality. Are we saying that with the PS5 they can get that same 60 fps, but with its super-optimised hardware all the way through they can get more quality at 60 fps? So the conventional rules no longer apply? So basically, even though the XSX has more raw power, the PS5 is more efficient, so it can match or exceed its visuals at the same fps target?

It's just an increase in fidelity and/or variety of assets at whatever framerate you target. It doesn't make the frames draw faster; it just determines what can be in a given frame.
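As a rough back-of-envelope illustration of "what can be in a given frame": at a fixed frame rate, a faster drive only changes how much new data can arrive per frame, not how fast the frame itself is drawn. The rates below are the publicly quoted raw figures for the two consoles; compression and overhead are ignored.

```cpp
// Back-of-envelope only: data a drive can deliver per frame at a fixed frame
// rate, using the publicly quoted raw rates. Ignores compression and overhead.
#include <cstdio>

int main() {
    const double rates_gbps[] = {2.4, 5.5};   // raw GB/s
    const int    fps[]        = {30, 60};

    for (double rate : rates_gbps) {
        for (int f : fps) {
            double mb_per_frame = rate * 1000.0 / f;   // MB available per frame
            std::printf("%.1f GB/s @ %d fps -> ~%.0f MB per frame\n",
                        rate, f, mb_per_frame);
        }
    }
    // The frame still has to be drawn in 16.7 ms or 33.3 ms either way; the
    // extra bandwidth only changes how much *new* data can arrive each frame.
}
```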
 

Sub Boss

Banned
Nov 14, 2017
13,441
This is what most developers have been saying. Everyone who actually knows this stuff laughs at how people look at the teraflops number and call it a day.
I think the question is whether the speed advantage of the PS5 SSD is more important/convenient than the XSX's GPU advantages, and how third-party games would be impacted. Most of the high-speed-drive advantages developers talk about also apply to the Series X, only to a smaller degree.

PS5 will be faster, but XSX is no slouch, both systems are very well designed this gen.
 

Shyotl

Member
Oct 25, 2017
1,272
The PS5's SSD has its own CPU, has a very large SRAM pool, has a novel indexing system, has a Kraken engine (equivalent in power to Zen 2 cores), has GPU scrubbers (to alleviate the burden of GPU stalls) - and I could go on and on and on.
This is wrong. The Kraken engine allegedly decompresses at a rate equivalent to two Zen cores, but that doesn't mean it's as powerful. It's specialized silicon for a single task that general-purpose CPUs aren't particularly great at. That's not to say it isn't an interesting piece of hardware, but it's not providing more than its stated functionality, which is bumping up the I/O read throughput.
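A small sketch of why that shows up as extra read throughput rather than extra compute: the decompressor inflates the stream in flight, so the effective rate is roughly the raw read rate times the compression ratio, and no CPU time is spent on it. The ratio used here is purely illustrative.

```cpp
// Minimal sketch: why a fixed-function decompressor shows up as higher
// effective read throughput. The compression ratio here is illustrative.
#include <cstdio>

int main() {
    const double raw_read_gbps     = 5.5;  // drive's raw (compressed) read rate
    const double compression_ratio = 1.5;  // illustrative average ratio for game data

    // The decompressor inflates data as it streams through, so the rest of the
    // system sees the post-decompression rate without spending CPU time on it.
    double effective_gbps = raw_read_gbps * compression_ratio;
    std::printf("raw %.1f GB/s x ratio %.1f -> ~%.1f GB/s effective\n",
                raw_read_gbps, compression_ratio, effective_gbps);

    // Doing the same inflation on general-purpose cores would cost real CPU
    // time every frame; the dedicated silicon only removes that cost, it does
    // not add compute for anything else (the point made above).
}
```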
 

Pasedo

Member
Oct 27, 2017
52
It's just an increase in fidelity and/or variety of assets at whatever framerate you target. It doesn't make the frames draw faster; it just determines what can be in a given frame.
This is what I mean. So maybe an example: if two devs of equal skill and knowledge were working on the same game, one on PS5 and the other on XSX, and they were both told to target 60 fps, are we basically saying the PS5 developer, with access to greater fidelity and asset delivery per frame, will make the game look better at 60 fps on the PS5?
 

Deleted member 27551

User requested account closure
Banned
Oct 30, 2017
660
This is what I mean. So maybe an example: if two devs of equal skill and knowledge were working on the same game, one on PS5 and the other on XSX, and they were both told to target 60 fps, are we basically saying the PS5 developer, with access to greater fidelity and asset delivery per frame, will make the game look better at 60 fps on the PS5?
Potentially, some settings that are affected by the SSD will just be better, but I'm also guessing the high clocks may help somewhere as well; the same goes for Xbox with its advantages. It's going to be really interesting for DF and all of us who watch.
 

Darmik

Member
Oct 25, 2017
686
To me it sounds like both Xbox and PS5 had the same goal with what they wanted the SSD to do and took different methods to get there (PS5 with hardware speed and Kraken and Microsoft with Xbox Velocity). It'll be interesting to see how they pan out.

I think people are both overestimating the difference the SSD speed will make specifically for the PS5 and the power difference between the PS5 and XSX personally.

I doubt we'll hear anything specific from most developers. Do any of them even talk publicly about developing for both the PS4 Pro and Xbox One X?
 

Pasedo

Member
Oct 27, 2017
52
Potentially, some settings that are affected by the SSD will just be better, but I'm also guessing the high clocks may help somewhere as well; the same goes for Xbox with its advantages. It's going to be really interesting for DF and all of us who watch.
I agree. I get more excited about paradigm shifts in technology, which is more the direction the PS5 has taken by innovating the entire storage and memory architecture rather than just doing more of what we're already doing today. I just hope this isn't all just hype. I got excited about the Wii U and the talk of fp16 and eSRAM etc. and how it would punch above its theoretical numbers, but in the end we all woke up to the reality that it was just a weak-ass machine. Don't get me wrong, it was fun, but I just hope I don't get disappointed in the same way, as I'm swinging more towards getting the PS5.
 
Jun 12, 2018
492
Someone point me to a PS5 thread where it doesn't devolve into stupid console war bullshit in the first page. It's like the PS5 is not allowed to have anything positive about it.
 

Horned Reaper

Member
Nov 7, 2017
1,560
7 years of pent-up rage has taken its toll, it seems.

The differences will be negligible at best, with first-party games showcasing the different approaches of having more power vs. more speed.
 

Pottuvoi

Member
Oct 28, 2017
3,065
It might for specific roles. With Spider-Man they had to do a ton of work optimizing the intro because of the HDD speed; with an ultra-fast SSD they could have made it a lot faster. Not reducing crunch for everything, of course.
Indeed they did.
Stuff like not loading a couple of the texture mipmaps after the highest one during swinging, or not loading storefronts until Spider-Man's feet hit the ground, etc.
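For illustration, here is a hypothetical sketch of the kind of streaming policy being described. This is not Insomniac's code; every name and threshold is invented. The idea is simply to gate detail on player state: skip the top texture mips while traversing fast, and defer storefront loads until the player is grounded.

```cpp
// Hypothetical sketch of a state-gated streaming policy: skip the top texture
// mips while the player is moving fast, and defer storefront loads until they
// are on the ground. Not from any shipped game; all names are invented.
#include <cstdio>

struct PlayerState {
    float speed;     // m/s
    bool  grounded;
};

// How many of the highest-detail mip levels to skip for streamed textures.
int mipLevelsToSkip(const PlayerState& p) {
    if (!p.grounded && p.speed > 20.0f) return 2;  // fast swinging: drop two top mips
    if (!p.grounded)                    return 1;  // airborne but slower
    return 0;                                      // on foot: full detail
}

bool shouldLoadStorefronts(const PlayerState& p) {
    return p.grounded;                             // wait until feet hit the ground
}

int main() {
    PlayerState swinging{35.0f, false};
    PlayerState walking{2.0f, true};
    std::printf("swinging: skip %d mips, storefronts=%d\n",
                mipLevelsToSkip(swinging), shouldLoadStorefronts(swinging));
    std::printf("walking:  skip %d mips, storefronts=%d\n",
                mipLevelsToSkip(walking), shouldLoadStorefronts(walking));
}
```

A faster drive raises the thresholds at which this kind of hand-tuned gating becomes necessary at all, which is the point of the exchange above.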
 

Deleted member 27551

User requested account closure
Banned
Oct 30, 2017
660
I agree. I get more excited about paradigm shifts in technology, which is more the direction the PS5 has taken by innovating the entire storage and memory architecture rather than just doing more of what we're already doing today. I just hope this isn't all just hype. I got excited about the Wii U and the talk of fp16 and eSRAM etc. and how it would punch above its theoretical numbers, but in the end we all woke up to the reality that it was just a weak-ass machine. Don't get me wrong, it was fun, but I just hope I don't get disappointed in the same way, as I'm swinging more towards getting the PS5.
Yeah, I feel the same. It seems a smart way to go with what Sony has done, without all the crap they had with the PS3's unique hardware. I don't think everything they have said is PR bollocks; we know from PC hardware that it being faster in some ways will make a difference in some games.

Will it mean the PS5 will outperform the Series X overall? Well, I doubt it, but it will have an advantage in some settings if devs take proper advantage. I'm buying both either way; they both have their ups and downs with hardware and games, so I won't miss out on anything this way.
 

mordecaii83

Avenger
Oct 28, 2017
6,862
You know they talked to actual devs, thus proving what they state?
Their theory directly contradicts Cerny, so either Cerny misspoke or they are mistaken. Watch the presentation again where Cerny specifies both chips will spend the majority of their time at the max clock, which is impossible if one has to downclock to allow the other to hit max clock.

How do you know that it isn't true?
See my above response.
 

Alexandros

Member
Oct 26, 2017
17,811
Their theory directly contradicts Cerny, so either Cerny misspoke or they are mistaken. Watch the presentation again where Cerny specifies both chips will spend the majority of their time at the max clock, which is impossible if one has to downclock to allow the other to hit max clock.


See my above response.

Digital Foundry is an independent publication and they said that they talked to developers that develop for the PS5. I see no reason to question their information at this point.
 

mordecaii83

Avenger
Oct 28, 2017
6,862
Digital Foundry is an independent publication and they said that they talked to developers that develop for the PS5. I see no reason to question their information at this point.
Then you're stating that Cerny either lied or was somehow mistaken in his presentation. He left zero room for their interpretation to be correct if what he was saying is correct. I'm more inclined to believe the system architect.

Feel free to watch the video yourself, there's really no wiggle room with what he said. Starts at 46:50. https://www.youtube.com/watch?v=utQIXcXXMrM
 

Adum

Member
May 30, 2019
925
Yeah, they're repeating this pet theory a lot but it's not true and a direct contradiction to what Cerny has said.
I'd love for Sony to explain more clearly the GPU/CPU power draw and downclocking. I think it was clear that the CPU and GPU frequency clocks given would be the base frequencies (and not some sort of boost clock like PC graphics cards) but the way Mark Cerny stated it is open to interpretation. Does the GPU downclock if the CPU has to run at max frequency and vice versa? Or does it all depend on how much power the system is drawing? But I think he also said that the power draw was fixed and that the GPU frequencies would need to be lowered by a few % in worst case scenarios. I'm also pretty sure he said that power draw would be lowered by 10% if GPU frequency was lowered by a couple % so power drawn isn't fixed...

I'll go through the presentation again to clarify for myself later, but if any of yous peeps have a better memory than mine maybe you could help me out here.
 

endlessflood

Banned
Oct 28, 2017
8,693
Australia (GMT+10)
Their theory directly contradicts Cerny, so either Cerny misspoke or they are mistaken. Watch the presentation again where Cerny specifies both chips will spend the majority of their time at the max clock, which is impossible if one has to downclock to allow the other to hit max clock.
What I understood from the presentation was that there was a max power budget they were working with, and that reducing the clock slightly on either the CPU or GPU could be used to reduce the power load on one (freeing up power for the other) when hitting the limits.

I don't know how closely the clock is tied to power usage from moment to moment under load, but I'd assume that there's a lot of variance, where both can be at maximum clock but not exceeding the power budget, otherwise what Cerny said wouldn't make sense.
 

mordecaii83

Avenger
Oct 28, 2017
6,862
I'd love for Sony to explain more clearly the GPU/CPU power draw and downclocking. I think it was clear that the CPU and GPU frequency clocks given would be the base frequencies (and not some sort of boost clock like PC graphics cards) but the way Mark Cerny stated it is open to interpretation. Does the GPU downclock if the CPU has to run at max frequency and vice versa? Or does it all depend on how much power the system is drawing? But I think he also said that the power draw was fixed and that the GPU frequencies would need to be lowered by a few % in worst case scenarios. I'm also pretty sure he said that power draw would be lowered by 10% if GPU frequency was lowered by a couple % so power drawn isn't fixed...

I'll go through the presentation again to clarify for myself later, but if any of yous peeps have a better memory than mine maybe you could help me out here.
What he said was that both chips would operate at their max frequency most of the time. If the CPU isn't using much of the power budget, the GPU is able to take that power allotment. If the chip goes over its power budget it will clock down, but it should only need to drop a couple of percent to see a 10% reduction in power and thus stay at or under the power budget.
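A toy model of that trade-off: assuming the commonly used approximation that dynamic power scales roughly with frequency times voltage squared (so roughly cubically once voltage tracks frequency), a small clock trim buys back a much larger amount of power. The wattage numbers and the governor loop below are invented for illustration; this is not Sony's actual algorithm.

```cpp
// Toy model only: a fixed power budget with the clock trimmed in small steps
// when an activity spike pushes estimated power over the cap. The cubic
// power/frequency relation is a common approximation (P ~ f * V^2, with V
// scaling with f), not Sony's actual model; all numbers are invented.
#include <cmath>
#include <cstdio>

double estimatedPower(double maxPowerWatts, double clockFraction, double activity) {
    // activity in [0,1]: how power-hungry the current workload is at full clock.
    return maxPowerWatts * activity * std::pow(clockFraction, 3.0);
}

double clockForBudget(double budgetWatts, double maxPowerWatts, double activity) {
    double clock = 1.0;                                       // start at max clock
    while (estimatedPower(maxPowerWatts, clock, activity) > budgetWatts) {
        clock -= 0.01;                                        // trim ~1% at a time
    }
    return clock;
}

int main() {
    const double budget = 180.0;   // illustrative SoC budget in watts
    const double peak   = 200.0;   // illustrative worst-case draw at max clock

    // Typical scene: well under budget, so the clock stays at 100%.
    std::printf("typical: clock %.0f%%\n", 100 * clockForBudget(budget, peak, 0.85));

    // Pathological scene: a few percent of clock buys back the needed headroom,
    // because power falls much faster than frequency.
    std::printf("worst case: clock %.0f%%\n", 100 * clockForBudget(budget, peak, 1.0));
}
```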
 

mordecaii83

Avenger
Oct 28, 2017
6,862
What I understood from the presentation was that there was a max power budget they were working with, and that reducing the clock slightly on either the CPU or GPU could be used to reduce the power load on one (freeing up power for the other) when hitting the limits.

I don't know how closely the clock is tied to power usage from moment to moment under load, but I'd assume that there's a lot of variance, where both can be at maximum clock but not exceeding the power budget, otherwise what Cerny said wouldn't make sense.
That's exactly how it works: certain workloads stress the chip more at the same clock speed, and it would need to downclock slightly during those workloads. Otherwise it should be running at max clock speed.

Edit: Sorry for the double post.
 

Osaragi

Member
Oct 27, 2017
173
Germany
I am more inclined to believe Cerny right now. If someone is stating that he is spreading misinformation, there needs to be evidence to prove that statement. That is a big accusation, but time will tell.

Edit: I respect DF very much and love their content, but taking them at their word right now would mean Cerny was spreading false information. I find that very unlikely, because the presentation was very clear. It's more believable to me that there was a misunderstanding between them and their sources. No way in hell would they lie about having talked to devs working on PS5.
 

Alexandros

Member
Oct 26, 2017
17,811
Then you're stating that Cerny either lied or was somehow mistaken in his presentation. He left zero room for their interpretation to be correct if what he was saying is correct. I'm more inclined to believe the system architect.

Feel free to watch the video yourself, there's really no wiggle room with what he said. Starts at 46:50. https://www.youtube.com/watch?v=utQIXcXXMrM

I already watched the whole video multiple times. I don't have to call anyone anything at this point in time, I'd rather wait until actual benchmarks are out. I trust Digital Foundry and I believe that they wouldn't state something publicly if they had not verified it from multiple sources.
 

mordecaii83

Avenger
Oct 28, 2017
6,862
I already watched the whole video multiple times. I don't have to call anyone anything at this point in time, I'd rather wait until actual benchmarks are out. I trust Digital Foundry and I believe that they wouldn't state something publicly if they had not verified it from multiple sources.
OK, then I'll wait for you to explain how Cerny stating that both chips will be running at max frequency the majority of the time is possible if one chip has to downclock for the other to run at max speed. Please explain to me how that's possible, I'll wait. In that case the theoretical maximum for either chip would be 50% of the time at max speed, which is not "a majority of the time".

On a different note, I'm curious whether the power budget is a whole chip power budget, a separate CPU and GPU power budget, or a combination of the two.
 

endlessflood

Banned
Oct 28, 2017
8,693
Australia (GMT+10)
Lol, the SSD just does everything 😂
TBF, as someone who has worked at Insomniac, Naughty Dog, and Crytek, he probably knows what he's talking about, and he stressed that it's true "in theory." In this case, I'm sure we can all see that human nature, rather than tech, would be the potential impediment to that coming to pass.


I trust Digital Foundry and I believe that they wouldn't state something publicly if they had not verified it from multiple sources.
Perhaps they just misunderstood what they were told (I haven't seen their exact comment). It's quite a novel solution after all and one that we haven't really seen before (AFAIK).
 

mordecaii83

Avenger
Oct 28, 2017
6,862
TBF, as someone who has worked at Insomniac, Naughty Dog, and Crytek, he probably knows what he's talking about, and he stressed that it's true "in theory." In this case, I'm sure we can all see that human nature, rather than tech, would be the potential impediment to that coming to pass.


Perhaps they just misunderstood what they were told (I haven't seen their exact comment). It's quite a novel solution after all and one that we haven't really seen before (AFAIK).
That's my theory: they misunderstood the APU needing to balance power usage between the GPU and CPU as changing the clock speed.
 

OneBadMutha

Member
Nov 2, 2017
6,059
Because they feel the SSD will give the best overall benefit, not only for games but for the whole system.
An extra 4 GB of RAM isn't going to have a huge effect when it comes to 3rd-party stuff, nor is an extra 1 TF.
Even if 3rd parties don't use the SSD to the fullest, it will still help with certain things.
Now we have to see if they're right or wrong, and also the price.

Devs are less concerned with max power and more with ease of development. The I/O speeds, and Sony being ahead on tools, are likely making life very easy for devs. Plastic warriors get excited about who can push the most things on the screen. Not most devs. Most of that theoretical max they don't touch. What the SSD is helping is development. Time is money. Time wasted troubleshooting or working around throughput bottlenecks takes away from the other aspects of game development. People are misconstruing developer excitement, methinks. I think they're genuinely excited and people need to stop downplaying that. The excitement doesn't mean PS5 games will have exclusive features or exclusive textures. It may mean more exclusive games from indies who have limited resources. Cheapening development may make it easier to moneyhat exclusive games with bigger publishers as well.

That said, I believe the Series X sounds like one hell of a machine that will excel at ray tracing. Microsoft is behind on the tools. Until the full feature sets are out in the wild, it's all speculation... outside of the fact that devs like developing for the PS5. They aren't talking about the competition. Which is probably the #1 thing that matters as of today.
 

BradGrenz

Banned
Oct 27, 2017
1,507
You know they talked to actual devs, thus proving what they state?

What were they actually told? Saying they talked to someone isn't proof of anything.

Digital Foundry is an independent publication and they said that they talked to developers that develop for the PS5. I see no reason to question their information at this point.

They are not infallible. They have misinterpreted technical details in the past. I question the claim because it is in direct contradiction with Cerny's description of the system.

I think it was clear that the CPU and GPU frequency clocks given would be the base frequencies (and not some sort of boost clock like PC graphics cards) but the way Mark Cerny stated it is open to interpretation. Does the GPU downclock if the CPU has to run at max frequency and vice versa?

The examples Cerny gives for when downclocking happens are workload-driven: when the GPU experiences a stressful workload, the GPU downclocks; likewise, when the CPU runs stressful AVX code, the CPU downclocks. There is zero indication from him that the CPU has to downclock for the GPU to run faster, or the opposite.
 

foamdino

Banned
Oct 28, 2017
491
Devs are less concerned with max power and more with ease of development. The I/O speeds, and Sony being ahead on tools, are likely making life very easy for devs. Plastic warriors get excited about who can push the most things on the screen. Not most devs. Most of that theoretical max they don't touch. What the SSD is helping is development. Time is money. Time wasted troubleshooting or working around throughput bottlenecks takes away from the other aspects of game development. People are misconstruing developer excitement, methinks. I think they're genuinely excited and people need to stop downplaying that. The excitement doesn't mean PS5 games will have exclusive features or exclusive textures. It may mean more exclusive games from indies who have limited resources. Cheapening development may make it easier to moneyhat exclusive games with bigger publishers as well.

That said, I believe the Series X sounds like one hell of a machine that will excel at ray tracing. Microsoft is behind on the tools. Until the full feature sets are out in the wild, it's all speculation... outside of the fact that devs like developing for the PS5. They aren't talking about the competition. Which is probably the #1 thing that matters as of today.
This is how I read the room. The devs are excited and happy because Cerny has built a system that is super easy to work with and frees up time from the production process.

Happy devs == productive devs == faster Dev times == more games produced

In the presentation Cerny showed time to triangle for all PS consoles. PS5 is better than all previous ones. Which means it's easier to work with than PS1. Given how much more powerful it is than PS1, that shows how good the dev env must be and how good the tools are.

I have a software background and I can appreciate when a dev is happy about their tools, it does make a massive difference in team morale and productivity.

This is far more important to a healthy ecosystem for Sony WWS than having more CUs on the APU.
 

OneBadMutha

Member
Nov 2, 2017
6,059
This is how I read the room. The devs are excited and happy because Cerny has built a system that is super easy to work with and frees up time from the production process.

Happy devs == productive devs == faster Dev times == more games produced

In the presentation Cerny showed time to triangle for all PS consoles. PS5 is better than all previous ones. Which means it's easier to work with than PS1. Given how much more powerful it is than PS1, that shows how good the dev env must be and how good the tools are.

I have a software background and I can appreciate when a dev is happy about their tools, it does make a massive difference in team morale and productivity.

This is far more important to a healthy ecosystem for Sony WWS than having more CUs on the APU.

Yep... and as someone who doesn't work in gaming but has developed elsewhere, ease of development is far more important than max theoretical capabilities. As long as the platform can do the job, give me something that's agile, easy to troubleshoot, and works as expected... and I can focus more on my end product.

I've been saying since well before the GitHub leaks that the primary bottleneck in game development is time and money. Whoever empowers devs and allows smaller teams to do bigger things better will win the tech battle, IMO. Bigger devs usually have to spend the money and figure it out regardless. I want to see small and medium-sized devs doing bigger things and getting better support this gen. That's where risk-taking and new ideas happen more often.

With all that said, this forum seems consumed with the versus and plastic wars of it all. They should hope development is easier across the board. They should be hoping the winners are the developers and that both Microsoft and Sony continue to bolster the tools to help devs do things efficiently.
 

Alexandros

Member
Oct 26, 2017
17,811
OK, then I'll wait for you to explain how Cerny stating that both chips will be running at max frequency the majority of the time is possible if one chip has to downclock for the other to run at max speed. Please explain to me how that's possible, I'll wait.

I think the answer is pretty straightforward based on what we've heard. It isn't. So who is right? We'll know soon enough. Based on the usual workload of a modern game I expect the GPU to be running at full load and the CPU to be underclocking, since the system has a specific power budget that must not be exceeded. I can see scenarios in which both clocks will be at their maximum, in smaller games that don't tax the hardware that much, but not in big games. In any case we are only a few months away from actual next gen games coming to market so we will get definitive answers soon.

Perhaps they just misunderstood what they were told (I haven't seen their exact comment). It's quite a novel solution after all and one that we haven't really seen before (AFAIK).

Perhaps, but it isn't their own theory. They talked to informed sources. For the moment I will trust their info and I'll wait for benchmarks to reveal the facts.

They are not infallible. They have misinterpreted technical details in the past. I question the claim because it is in direct contradiction with Cerny's description of the system.

That's fair, you trust Cerny and I trust the DF crew. It is absolutely possible that Cerny is right.
 

Bunzy

Banned
Nov 1, 2018
2,205
I love how people think Cerny is a guy who likes to oversell or lie. If anything, the dude is way too upfront and honest about things; multiple times in the talk he undersold certain things.
 

Hoo-doo

Attempted to circumvent ban with alt account
Banned
Oct 25, 2017
4,292
The Netherlands
I love how people think Cerny is a guy who likes to oversell or lie. If anything, the dude is way too upfront and honest about things; multiple times in the talk he undersold certain things.

This. Cerny, to my knowledge, is not known for bullshitting or PR. He's an engineer and a developer, and is vastly more experienced and knowledgeable about the hardware aspects of the systems he architected than anyone else. To insinuate that he's a PR spokesperson hamming it up for optics is kind of silly, IMO.