At least the Slim has a UHD drive. The PS4 Pro, let alone the PS4 Slim, skipped UHD.
It's obvious 2020 has been the plan for a while, given Death Stranding, Ghost of Tsushima and TLOU 2 were all revealed/showcased at E3 2018 or sooner as PS4 titles.

Damn. Thanks for the info.
Team 2019 were right....
Probably no PS3/PS2/PS1 BC if it took them this much extra time just for PS4 BC.
Even if that was true, which it isn't (Ryzen 7 2700 takes that throne), the statement makes little sense.
To both of you...
- The PS4 was among the first (if not the first) mainstream products to use 8Gb (1GB) GDDR5 chips back in 2013. So just because something is not being made right now (it's scheduled to go into production between 2019/2020) doesn't mean it can't make it into a next-gen console.
- The issue with HBM isn't that there is more demand for it; it's actually that there isn't enough demand for it. That's why prices remain high.
- And something you both may not know about HBM3 (or HBM in general): what really makes it expensive is the fact that it uses an interposer. That's literally silicon on silicon. But it was necessary because it "was" the only way they saw to accommodate the 1024-bit/stack bus width. With HBM3, one of the game changers of that spec is that they can instead choose to use a smaller bus of 512-bit/stack.
- Why this is great is that it means you can do away with that (costly) silicon interposer entirely and use an organic interposer instead (already used in MCM designs; the stuff that chiplets are mounted on). These are much cheaper than silicon interposers while being able to handle significantly denser routing than PCBs. And that's just one of two ways HBM3 is going to be cheaper. It retains the same bandwidth throughput by increasing (doubling) the clock speed while halving the bus width.
- If Sony decided to go with HBM, that's a decision they could have made as late as last year, and that is time enough to tape out a chip that incorporates an HBM memory controller as opposed to a GDDR one. And I am almost certain Sony has a better bead on these things and their availability than we do, or at least enough to make an informed decision.
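To make the halve-the-bus, double-the-clock point above concrete, here's a quick back-of-the-envelope sketch; the per-pin rates are illustrative assumptions, not confirmed HBM2/HBM3 figures:

```python
# Rough sanity check of the halve-the-bus, double-the-clock idea above.
# Pin rates here are illustrative assumptions, not confirmed HBM2/HBM3 specs.

def stack_bandwidth_gbs(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth of one memory stack in GB/s (8 bits per byte)."""
    return bus_width_bits * pin_rate_gbps / 8

wide_slow = stack_bandwidth_gbs(1024, 2.0)   # HBM2-style: wide 1024-bit bus, modest clock
narrow_fast = stack_bandwidth_gbs(512, 4.0)  # HBM3-style: half the bus, double the clock

print(wide_slow, narrow_fast)  # 256.0 256.0 -- same throughput per stack
```

Same per-stack throughput either way, but the narrow bus needs far less routing density, which is what lets you swap the silicon interposer for a cheap organic substrate.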
UserBenchmark scores are usually the actual in-game results and not the theoretical max.
RAM scores on UserBenchmark usually come in ~8% below max theoretical bandwidth.
Flute's UserBenchmark 16GB RAM bandwidth score is 529.6 GB/s, which is ~8% below 576 GB/s.
GDDR6 at 18Gbps on a 256-bit bus is 576 GB/s.
So if Flute is using GDDR6 RAM, it seems to be the 18Gbps kind.
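A quick sanity check of that arithmetic (bus width in bits times per-pin rate in Gbps, divided by 8 bits per byte), using the Flute score from above:

```python
# Peak bandwidth = bus width (bits) * per-pin data rate (Gbps) / 8 bits per byte.

def peak_bandwidth_gbs(bus_width_bits: int, pin_rate_gbps: float) -> float:
    return bus_width_bits * pin_rate_gbps / 8

gddr6_peak = peak_bandwidth_gbs(256, 18)  # 576.0 GB/s for 18Gbps GDDR6 on a 256-bit bus
flute_score = 529.6                       # Flute's UserBenchmark RAM result

# The benchmark result lands roughly 8% below the theoretical peak.
shortfall_pct = (1 - flute_score / gddr6_peak) * 100
print(round(shortfall_pct, 1))  # 8.1
```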
InFO_MS helps because there's no interposer needed.
Yup.
If we are north of Vega64, I'm happy.
I'll admit it, I was wrong regarding RDR2 and the 750ti.
BUT - the AMD minimum required GPU for RDR2 is the R9 280, which is 28 CUs running at 933MHz, i.e. 3.3TF, basically closer to the PS4 Pro than the PS4, while the One S, a GPU with almost 1/3 of the R9 280's power, runs the game just fine. So yeah, even the One S runs RDR2 better than a 3TF+ GCN PC card, but that's the nature of console optimization vs. the PC's brute force. It doesn't mean that the R9 280 isn't almost 2x more powerful than the PS4, just that the PS4 version is much more optimized.
In the end, that was the original point that led to this nonsensical 750 Ti vs. PS4 discussion: the PS4 GPU on launch day was a low-end PC GPU at best (weaker than the $139 R9 260X), and Sony (and MS) went really cheap on us in 2013.
Cool. Sounds good! Really hope everything comes together as well as it sounds like it could.

I totally agree, but consoles always downclock the memory. Flute was using 18Gbps GDDR6, we know that for sure, but the PS5 will not have its 18Gbps GDDR6 modules running at 18Gbps, just like the PS4, the Pro and the X underclocked their GDDR5 chips. For example, look at the X: it uses 7Gbps modules, which should have given it 336GB/s, but instead MS runs them at 6.8Gbps, which gives them 326.4GB/s.
In the end, we don't really know how much Sony will downclock the 18Gbps GDDR6, but if that's the PS5's setup, then 576GB/s is the upper bound, which the PS5 probably won't reach.
It could be 540GB/s, it could also be 560GB/s, but not much higher than that if the Flute leak is true. The PS4 was very similar to the R9 270 (with 4 CUs turned off and lower clocks), but while the 270 had a 179GB/s bus, the PS4 had 176GB/s, which was also shared with the CPU. The 5700XT has 448GB/s, so having ~100GB/s more bandwidth sounds pretty good to me, considering the CPU will need much less than 100GB/s, which will leave the GPU with more bandwidth than the 5700XT has. If we get a ~5700XT in the PS5, then in terms of bandwidth-to-GPU-power ratio, I'd say the PS5 is in better shape than the PS4 was.
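For what it's worth, the same bandwidth formula reproduces the known One X downclock and the hypothetical PS5 figures mentioned above (the PS5 rates below are guesses, not known specs):

```python
def bandwidth_gbs(bus_width_bits: int, pin_rate_gbps: float) -> float:
    # Peak bandwidth = bus width (bits) * per-pin rate (Gbps) / 8 bits per byte.
    return bus_width_bits * pin_rate_gbps / 8

# Known precedent: the One X runs its 7Gbps GDDR5 at 6.8Gbps on a 384-bit bus.
one_x_rated = bandwidth_gbs(384, 7.0)   # 336.0 GB/s
one_x_actual = bandwidth_gbs(384, 6.8)  # ~326.4 GB/s

# Hypothetical PS5 downclocks of 18Gbps GDDR6 on a 256-bit bus (pure guesses):
for rate in (18.0, 17.5, 16.875):
    print(rate, "Gbps ->", bandwidth_gbs(256, rate), "GB/s")
# 18.0 -> 576.0, 17.5 -> 560.0, 16.875 -> 540.0
```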
Exactly that. They also seem to just be extremely paranoid about introducing any kind of incompatibility, no matter how small. Remember how the PS4 Pro didn't even have a boost mode at launch? Cerny seemed convinced that it would cause problems but once it was added only one game had issues and they were able to fix it relatively quickly.

What the hell did Sony do that makes backwards compatibility so difficult? I am guessing their API has fewer levels of abstraction?
Yup
Just like the last-minute 8GB of GDDR5 that even first-party devs didn't know about, having had to work with just 4GB on their devkits for their launch games, right?
That would be my guess.
It's a race between that and Team <8TF.
Team 2019 was right for a while, so it gets to live on forever. In a different timeline, a bright and beautiful one, we were champions.
...and what we came to realize was that with no backwards compatibility, we had no choice but to look forward.
Yes, however, I think the InFO_MS is not only a different method of doing it without an interposer... it also has its complications too.
The method I am talking about is based on what is already being employed in MCM (multi-chip module) designs. Basically what they are already using with their chiplet-based Zen CPUs.
The CCX cores are all placed on an organic substrate and connected to each other via Infinity Fabric (what AMD calls it). This is an all-round cheaper way to go about things. Orders of magnitude cheaper and easier to do than what HBM2 is doing with an interposer.
HBM3 would have you replace that interposer with the same organic substrate used in MCMs, and would support traces of up to 512-bit per stack.
Team 8TF still exists?
I wonder if Sony has decided to use some form of HBM, but they don't yet know which one due to the question of whether or not HBM3 will be available, hence why they haven't confirmed the RAM type yet.
Borderline impossible. At this point, they have to have a finished design and are probably signing long-term production contracts already.
Which game had an issue?
Exactly that. They also seem to just be extremely paranoid about introducing any kind of incompatibility, no matter how small. Remember how the PS4 Pro didn't even have a boost mode at launch? Cerny seemed convinced that it would cause problems but once it was added only one game had issues and they were able to fix it relatively quickly.
1. If you make a huge design change, your capital costs also change. You would have to pass that cost on to the consumer, or write it off. The latter is double the cost incurred. This is how accounting works in business, i.e. you need a dollar (preferably of profit) to write off money that is either a loss or badly allocated expenditure.

That's not true.
No matter what any console manufacturer does, their hardware becomes "outdated" the second a smaller manufacturing node becomes available. That's just how it goes. And if you look at the PS4 Pro in particular, the only thing Sony changed was the GPU. They didn't add more RAM or use a different CPU (they just clocked it up), and that tells you it wasn't something done because their hardware was outdated, but rather something done to at least remain relevant.
I think people need to understand that consoles are usually made the best way they can be made at the time they are being made, and to at least be good enough to last 5-6 years. It is literally impossible to make a future-proof console if by "future" you are talking about 6-7 years in the tech world.
Again, while more expensive than GDDR6, HBM3 can cost significantly less than HBM2, especially if applied the way it's being proposed to be applied. What that cost delta is isn't something that you or I know. But here are some advantages.
Let's say we are comparing 20GB of GDDR6 to 20GB of HBM3, and let's say the GDDR6 solution costs $100 and the HBM3 solution costs $130.
Now the question is, how much will the cost savings in other areas of the system offset the cost of using HBM3? And is the resultant cost worth the benefits it gives you? What if Sony has a 320mm2 chip that costs them $150 and MS has a 370mm2 chip that costs them $190? Or Sony's PCB costs them $14 to MS's $22? All these things add up.
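Totaling those made-up numbers shows the point; every figure below is hypothetical, taken straight from the example above, not a real bill of materials:

```python
# Totaling the hypothetical cost figures from the example above.
# None of these are real costs; they are the illustrative numbers in the post.

hbm3_system = {"memory": 130, "apu": 150, "pcb": 14}   # pricier RAM, smaller chip & PCB
gddr6_system = {"memory": 100, "apu": 190, "pcb": 22}  # cheaper RAM, bigger everything else

print(sum(hbm3_system.values()))   # 294
print(sum(gddr6_system.values()))  # 312
```

In this (entirely hypothetical) scenario, the HBM3 system comes out cheaper overall despite the more expensive memory.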
- With more adoption, the cost of HBM3 will fall faster than GDDR6's.
- Smaller and cheaper PCB required for HBM3.
- Less power draw, meaning more power can go to the APU.
- Less heat generated.
- Much higher bandwidth.
- Less space taken up by the memory controller on the chip, meaning you will need a smaller APU and will also be able to clock that APU higher since it will generate less heat.
And everything you are saying about HBM is based on what we know so far of it being very expensive. But none of that takes into account the radically different application methods or design of HBM3, which were all specifically designed to make it cheaper and easier to make.
Some more additions to better explain myself.
- HBM3 and these changes have been listed by Samsung since 2016, with a scheduled release of 2019/2020. So that's something both Sony and MS would have been fully aware of.
- Some more insight into the interposer-free design. Currently, HBM2 requires a silicon interposer: both it and the GPU die sit on that interposer, and the interposer is then connected to the PCB. The interposer-free design is akin to what you see with any Ryzen CCX-based CPU. But now imagine that instead of having 2 CCXs and an I/O die, you have an APU and two HBM stacks. The substrate that Ryzen CPUs sit on is significantly cheaper than what you typically have as the HBM interposer.
- However, using this cheaper substrate means you can't have traces as dense as you would find in an HBM2 interposer, which is why the bus width per stack has been dropped from 1024-bit to 512-bit.
- Another cost slashing initiative is to reduce the number of TSVs and the buffer die.
- All these things have been known and planned since 2016.
Fair points.

1. If you make a huge design change, your capital costs also change. You would have to pass that cost on to the consumer, or write it off. The latter is double the cost incurred. This is how accounting works in business, i.e. you need a dollar (preferably of profit) to write off money that is either a loss or badly allocated expenditure.
2. HBM was announced in 2013. The first application for it was AMD's Fiji GPUs. GDDR6 was announced in 2016 and was in production in 2018. HBM costs have really never come down enough to make it affordable for mass consumption. More telling, however, is the fact that console developers have generally used RAM types that are already in production, as opposed to going for something that is still not in use. Setting aside the spec talk, design changes, the lower voltage usage etc., Sony going HBM3 instead of HBM2 or even GDDR6 would be a huge deviation from what has traditionally happened in the console space. It is for this reason that I speculate we will not see them going for this as a solution.
They will also be looking at history. DDR is a platform that has evolved: power consumption has gone down with each iteration as speeds have gone up. Who is to say we will not see further evolution of the platform?
3. You are designing a console around what is needed. You have a GPU and a CPU; at peak, what is the total memory bandwidth they can use? There is no need to invest in excess capacity if it can be avoided. A $1200 Nvidia card uses 11GB of VRAM at 616GB/s. Going again by history, we will not be getting a card anywhere near that in performance, so why would anyone need 700GB/s of bandwidth for the entire SoC? It looks like overkill, especially when you consider that people are talking about a smaller chip. You could clock it higher, but how much higher? 15%?
To me, it seems like you would be jumping through a lot of hoops to try and make HBM3 work.
No, them not announcing it has nothing to do with whatever kind of RAM they are using. But it does lead me to believe that there will be something good/bad about it, which is why it's the one thing that has seemingly, purposefully, not been mentioned. However, whatever they are using is very well set in stone by now.
Everything I said about the MCM approach to HBM3 has been making the rounds from Samsung since 2016. It used to be called LCHBM. It may not even still be in development. But one thing for certain is that HBM3 has a lot of design initiatives prioritizing affordability.
Give PSP and Vita too pls
I still think we might have full PS4, PS3, PS2 & PS1 BC on the PS5 and not just PS4 BC, mainly because of Microsoft having full Xbox (OG, 360, One) BC on the Xbox Scarlett. They'd lose the BC marketing edge next year to MS, and from what we've heard Sony really cares about perfecting BC for next gen, so much so that they delayed the console by a year, from a decision made in 2017.
Now I'd be shocked if, given that three-year window from 2017 to 2020, Sony didn't consider full BC support going into the next-gen PlayStation console and not just PS4 BC. Something just doesn't sit right with needing three years just to get PS4 BC 101% spot on, especially given that both consoles are x86 machines and Sony has already dabbled in console-to-console compatibility with the PS4 OG to Pro.
I am 80-85% sure that we'll get full PlayStation BC support in the PS5.
I am not ignoring that HBM3 can be made cheaper. But what makes something cheaper? It is refinement of the process and scaling up of production. If there is high production/supply and high demand, what usually happens is that costs can be spread over a larger canvas, so you end up with a high-volume, lower-margin business. It also allows for a faster break-even point, and because it is something that is selling extremely well, you are likely to have more players coming in and investing in capacity. Competition is something that will then bring in even more cost reductions. This, at the end of the day, is what drives economies of scale.

Fair points.
I disagree with the whole them-not-doing-something-because-it-goes-against-precedent thing though.
And HBM3 doesn't "have" to be 700GB/s+; they can even clock it lower to use even less power, which is one of the major benefits it has over GDDR6. GDDR6 uses as much as 4.5x the power of HBM2, and HBM3 uses even less power than that. In a console, a 10-15W delta can make a world of difference. And we don't have to guess the power of GDDR6: we know what it is per chip right now and can calculate how much power it will draw. And that draw will remain constant over the course of the generation, as long as it's GDDR6 being used.
And you are blatantly ignoring that there are ways HBM3 can be made even cheaper and incorporated at a lower cost... why? Simply because no one else has done it yet?
And don't you see that rumors of Sony using a smaller chip actually point more to HBM than to GDDR6?
Hope so, that'd be amazing. Still need to finish Suikoden II...
lol I assume you're joking about the 120fps, but a 9x resolution boost across the board, similar to what Xbox has for OG titles, would be amazing for Vita and PSP

Remastered in 8K@120fps yes please.
I actually want to know how badly an old PSP game would scale on a 50-inch TV lol.
There are a lot of Metal Gear games that I've yet to play but really want to experience. I had a 360 last gen, so I only got to play MGS2, MGS3 and Peace Walker from the HD remaster, plus MGS:GZ and TPP on the PS4. In fact, the PS4 was my first PlayStation console; there are a lot of exclusives from the PS1/PS2/PS3 days that I really want to play with the added benefit of 4K/8K at uncapped frames.
Ah wow. Massive potential backlog for you then, haha. I was pretty much Sony only from PS2-PS4 and had both portables too. So I'm hoping to get access to my old PS3/PS1 classic digital purchases. Not really holding out any hope for PSP and Vita games but who knows?
Xbox doesn't have the deep back catalog Sony does (though still plenty of gems), but knowing MS respects those purchases and wants me to be able to go back and play Gears 1-4 before Gears 5 came out, for example, has really added some shine to the XB1 that I hope Sony matches. Like, I literally bought the Lost Odyssey DLC on a friend's 360 back in 2009 and there it was waiting for me in 2018 when I got my first Xbox. Hope that becomes standard. Seems we are finally getting enough power on the PS side to make it possible.
"In a console, A 10-15W delta can make a world of difference." LmaoNo, them not announcing it has nothing to do with whatever kinda RAM they are using. But it does lead me to believe that there will something good/bad about it which is why its the one thing that has seemingly purposefully not mentioned. However, whatever they are using? Its very well set in stone by now.
Everything I said about the MCM approach to HBM3 has been making the rounds from Samsung since 2016. It used to be called LCHBM. It may not even be still in development. But one thing for certain is that HBM3 has a lot of o design initiatives prioritizing affordability.
It's a mistake to think minimum requirements are consistent. I've seen games where the minimum requirements delivered way past console performance, and I've seen games where they gave you 20fps at 720p.
"In a console, A 10-15W delta can make a world of difference." Lmao
False equivalence.

Don't know why you think it is funny, as he is right.
112 watts for the Xbox One vs. 137 watts for the PlayStation 4 aka 900p vs. 1080p.
And that's a world of difference? Also, 25 > 10-15 ;)
What is the difference in wattage between the PS4 Pro and Xbox One X? Or do we just throw numbers when they suit?
It is just one loose example. There is more to it obviously, but if you can't see a difference, for example between 900p and 1080p, you shouldn't be discussing here anyway.
Why do you think I can't see the difference? You don't know it, but I know that you can't see the difference between 25W and 10-15W :d
Still thinking that a 10-15W delta is "a world of difference" when both consoles will probably be around 180-200W is a funny exaggeration; if that's not so funny for you, hmm, whatever ;)

Then just define the meaning behind "worlds" so I get the idea behind your comments. Phoenix was talking about the power consumption of only one part of the entire system, which has an even bigger impact than overall power consumption, since the cooling would be sized accordingly, and so on... To laugh about 10-15 watts less power consumption from only one component of the system, one that impacts other parts significantly, is just false.