
How much money are you willing to pay for a next generation console?

  • Up to $199

    Votes: 33 1.5%
  • Up to $299

    Votes: 48 2.2%
  • Up to $399

    Votes: 318 14.4%
  • Up to $499

    Votes: 1,060 48.0%
  • Up to $599

    Votes: 449 20.3%
  • Up to $699

    Votes: 100 4.5%
  • I will pay anything!

    Votes: 202 9.1%

  • Total voters
    2,210
Status
Not open for further replies.

BreakAtmo

Member
Nov 12, 2017
12,805
Australia
To both of you...

  • The PS4 was among the first (if not the first) mainstream products to use 8Gb (1GB) GDDR5 chips back in 2013. So just because something is not being made right now (it's scheduled to go into production in 2019/2020) doesn't mean it can't make it into a next-gen console.

  • The issue with HBM isn't that there is too much demand for it; it's actually that there isn't enough demand for it, and that's why prices remain high.

  • And something you both may not know about HBM3 (or HBM in general): what really makes it expensive is the fact that they use an interposer. That's literally silicon on silicon. But it's necessary because it "was" the only way they saw to accommodate the 1024-bit/stack bus width. With HBM3, one of the game changers of that spec is that they can instead choose to use a narrower 512-bit/stack bus.

  • Why this is great is that it means you can do away with that costly silicon interposer entirely and use an organic interposer (already used in MCM designs, the stuff that chiplets are put on) instead. These are much cheaper than silicon interposers while handling significantly denser routing than PCBs. And that's just one of two ways HBM3 is going to be cheaper. It retains the same bandwidth throughput by doubling the clock speed while halving the bus width (quick calc after this list).

  • If Sony had decided to go with HBM, it would have been a decision they could have made as late as last year, and that is time enough to tape out a chip that incorporates an HBM memory controller as opposed to a GDDR one. And I am almost certain Sony will have a better bead on these things and their availability than us, or at least enough to make an informed decision.
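(A quick back-of-the-envelope check of that half-width/double-clock claim; the per-pin rates are purely illustrative, not figures from any spec sheet:)

```python
# Per-stack bandwidth = bus width (bits) * per-pin data rate (Gbps) / 8 bits per byte
def stack_bandwidth_gbs(bus_width_bits, pin_rate_gbps):
    return bus_width_bits * pin_rate_gbps / 8

print(stack_bandwidth_gbs(1024, 2.0))  # 256.0 GB/s per stack (HBM2-style: wide bus, modest clock)
print(stack_bandwidth_gbs(512, 4.0))   # 256.0 GB/s per stack (half the bus, double the clock)
```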

What about InFO_MS? Would that provide further savings?
 

msia2k75

Member
Nov 1, 2017
601
UserBenchmark scores are usually actual measured results, not the theoretical maximum.

RAM scores on UserBenchmark usually come in roughly 8% below the maximum theoretical bandwidth.

Flute's UserBenchmark 16 GB RAM bandwidth score is 529.6 GB/s, which is about 8% below 576 GB/s.

GDDR6 at 18 Gbps on a 256-bit bus is 576 GB/s.

So if Flute is using GDDR6 RAM, it appears to be 18 Gbps.
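(The arithmetic behind that deduction, as a quick sketch; the "~8% below peak" ratio is the poster's rule of thumb, not an official figure:)

```python
# Theoretical bandwidth = per-pin rate (Gbps) * bus width (bits) / 8 bits per byte
theoretical = 18 * 256 / 8     # 576.0 GB/s for 18 Gbps GDDR6 on a 256-bit bus
measured = 529.6               # Flute's UserBenchmark RAM score in GB/s
print(measured / theoretical)  # ~0.92, i.e. roughly 8% below the theoretical peak
```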

Reading your post reminded me of this Reddit rumour:

 

Firmus_Anguis

Member
Oct 30, 2017
6,106
AMD's Joe Macri in 2015 talking about HBM (essentially a promotional video)



Makes it seem as if it's better suited for VR, which Sony's definitely invested in. Them still not confirming which type of memory they're using makes me suspect they'll actually go with HBM2.

Would be nice if someone in the know could debunk this if it isn't true. Matt's been good at setting our expectations straight, so... Debunk away, I say!
 
OP

Mecha Meister

Next-Gen Guru
Member
Oct 25, 2017
2,800
United Kingdom
Yup.


If we are north of Vega64, I'm happy.


I'll admit it, I was wrong regarding RDR2 and the 750ti.

BUT - the AMD minimum required GPU for RDR2 is the R9 280, which is 28 CUs running at 933MHz, i.e. 3.3TF, basically closer to PS4 Pro than PS4, while the One S, a GPU with almost 1/3 of the R9 280's power, runs the game just fine. So yeah, even the One S runs RDR2 better than a 3TF+ GCN PC card, but that's the nature of console optimization vs the PC's brute force. It doesn't mean that the R9 280 isn't almost 2x more powerful than the PS4, just that the PS4 version is much more optimized.

In the end, that was the original point which led to this nonsensical 750ti vs PS4 discussion: that the PS4 GPU on launch day was a low-end PC GPU at best (weaker than the $139 R9 260X) and Sony (and MS) went really cheap on us in 2013.
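(For reference, here's where the quoted 3.3TF figure comes from; it's just the standard GCN FLOPS formula, with the base PS4 added for comparison:)

```python
# Single-precision TFLOPS for a GCN GPU:
# CUs * 64 shaders per CU * 2 ops per clock (FMA) * clock in GHz, divided by 1000
def gcn_tflops(cus, clock_mhz):
    return cus * 64 * 2 * (clock_mhz / 1000) / 1000

print(round(gcn_tflops(28, 933), 2))  # 3.34 TF -- the R9 280 figure quoted above
print(round(gcn_tflops(18, 800), 2))  # 1.84 TF -- the base PS4, for comparison
```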

We don't really know how the R9 280 and lower GPUs will perform in this game until it's out. So saying the Xbox One S runs RDR2 better than a 3TF GCN PC GPU is just guesswork based on the system requirements until it has been tested.

We don't even know what the minimum system requirements are targeting in terms of graphical settings as they're quite vague to say the least.
Could it be Xbox One quality settings at 900p 30 fps? Or even PS4 quality settings at 1080p 30, or even 900p to 1080p 60 fps? We don't know.

Red Dead Redemption 2 is even rumoured to feature DX12 and Vulkan, based on information some people found in the Rockstar Games Launcher. This will certainly benefit PC gamers, as good implementations of these APIs can make a huge difference to performance.
The inclusion of these will certainly make it one of the most fascinating PC releases to date!

Edit - Expanded post and made it clearer.
 
Last edited:

pg2g

Member
Dec 18, 2018
4,779
What the hell did Sony do that makes backwards compatibility so difficult? I am guessing their API has fewer levels of abstraction?
 

RoboPlato

Member
Oct 25, 2017
6,802
I totally agree, but consoles always downclock the memory. Flute was using 18Gbps GDDR6, we know that for sure, but the PS5 will not have the 18Gbps GDDR6 modules running at 18Gbps, just like the PS4, the Pro and the X underclocked their GDDR5 chips. For example, look at the X, which uses 7Gbps modules that should have given it 336GB/s of bandwidth, but instead MS runs them at 6.8Gbps, which gives them 326.4GB/s.

In the end, we don't really know how much Sony will downclock the 18Gbps GDDR6, but if that's the PS5's setup then 576GB/s is the upper bound, which the PS5 probably won't reach.


It can be 540GB/s, it can also be 560GB/s, but not much higher than that if the Flute leak is true. The PS4 was very similar to the R9 270 (with some CUs turned off and lower clocks), but while the 270 had a 179GB/s bus, the PS4 had 176GB/s which was also shared with the CPU. The 5700XT has 448GB/s, so having 100GB/s more bandwidth sounds pretty good to me considering the CPU will need much less than 100GB/s, which will leave the GPU with more bandwidth than the 5700XT. If we get a ~5700XT-class GPU in the PS5 then I would say that, regarding bandwidth-to-GPU-power ratio, the PS5 is in better shape than the PS4.
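(Sanity-checking the quoted figures with plain GDDR bandwidth math; the bus widths are the commonly reported ones, and the 17 Gbps downclock at the end is only an illustration, not a leak:)

```python
def bandwidth_gbs(pin_rate_gbps, bus_width_bits):
    return pin_rate_gbps * bus_width_bits / 8

# Xbox One X: GDDR5 on a 384-bit bus
print(bandwidth_gbs(7.0, 384))   # 336.0 GB/s at the chips' rated 7 Gbps
print(bandwidth_gbs(6.8, 384))   # 326.4 GB/s at the 6.8 Gbps MS actually runs them at

# Hypothetical PS5 setup: 18 Gbps GDDR6 on a 256-bit bus
print(bandwidth_gbs(18.0, 256))  # 576.0 GB/s ceiling
print(bandwidth_gbs(17.0, 256))  # 544.0 GB/s with an illustrative downclock
```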
Cool. Sounds good! Really hope everything comes together as well as it sounds like it could be.


What the hell did Sony do that makes backwards compatibility so difficult? I am guessing their API has fewer levels of abstraction?
Exactly that. They also seem to just be extremely paranoid about introducing any kind of incompatibility, no matter how small. Remember how the PS4 Pro didn't even have a boost mode at launch? Cerny seemed convinced that it would cause problems but once it was added only one game had issues and they were able to fix it relatively quickly.
 

Pheonix

Banned
Dec 14, 2018
5,990
St Kitts
What about InFO_MS? Would that provide further savings?
Yes, however, I think InFO_MS is not only a different method of doing it without an interposer... it also has its own complications.

The method I am talking about is based on what is already being employed in MCM (multi-chip module) designs. Basically, what they are already using for their chiplet-based Zen CPUs.

The CCX dies are all placed on an organic substrate and connected to each other via Infinity Fabric (as AMD calls it). This is an all-round cheaper way to go about things, orders of magnitude cheaper and easier than what HBM2 is doing with an interposer.

HBM3 would have you replace that interposer with the same organic substrate used in MCMs, and it would support traces of up to 512-bit per stack.
 

Lady Gaia

Member
Oct 27, 2017
2,476
Seattle
Just like the last-minute 8 GB of GDDR5 that even first-party devs didn't know about, having had to work with just 4GB on their devkits for their launch games, right?

The dev kits used the same motherboard layout and were already populated with 8GB of GDDR5. The amount shipped in retail units was strictly a cost and availability question, not a change to a different memory controller. Not even a board layout change. Or even a tools change. It was an opportunistic shift with supply chain implications and little else.

Nothing is locked right now and everything can change.

Nonsense. Pick a topic on which you're qualified to speak when making an authoritative claim like that. There have been any number of posts from people here pointing out that silicon gets locked pretty early on in the process. As someone who has been personally involved in launching multi-million selling mass-market products around custom silicon, I can assure you once again that this is the case. Early dev kits fail to represent the final hardware not because it's in a constant state of flux, but because designs take so long to finalize and are so expensive to tape out and test that they're in a strict timeline that doesn't line up with game development needs - so you start on the best off-the-shelf approximation you can get until real hardware is available.
 

anexanhume

Member
Oct 25, 2017
12,912
Maryland

BreakAtmo

Member
Nov 12, 2017
12,805
Australia
Yes, however, I think InFO_MS is not only a different method of doing it without an interposer... it also has its own complications.

The method I am talking about is based on what is already being employed in MCM (multi-chip module) designs. Basically, what they are already using for their chiplet-based Zen CPUs.

The CCX dies are all placed on an organic substrate and connected to each other via Infinity Fabric (as AMD calls it). This is an all-round cheaper way to go about things, orders of magnitude cheaper and easier than what HBM2 is doing with an interposer.

HBM3 would have you replace that interposer with the same organic substrate used in MCMs, and it would support traces of up to 512-bit per stack.

I wonder if Sony has decided to use some form of HBM, but they don't yet know which one due to the question of whether or not HBM3 will be available, hence why they haven't confirmed the RAM type yet.
 

Lagspike_exe

Banned
Dec 15, 2017
1,974
It's a race between that and Team <8TF.




...and what we came to realize was that with no backwards compatibility, we had no choice but to look forward.
Team 8TF still exists?

I wonder if Sony has decided to use some form of HBM, but they don't yet know which one due to the question of whether or not HBM3 will be available, hence why they haven't confirmed the RAM type yet.

Borderline impossible. At this point, they have to have a finished design and are probably signing long-term production contracts already.
 

BreakAtmo

Member
Nov 12, 2017
12,805
Australia
Borderline impossible. At this point, they have to have a finished design and are probably signing long-term production contracts already.

That's fair. I was considering the idea that they might have two designs that were mostly identical apart from the RAM with plans to pick one once manufacturing started but I suppose it can't work like that. And it seems like they can't just switch HBM2 out for HBM3 in the design either.
 

Flandy

Community Resettler
Member
Oct 25, 2017
3,445
Cool. Sounds good! Really hope everything comes together as well as it sounds like it could be.



Exactly that. They also seem to just be extremely paranoid about introducing any kind of incompatibility, no matter how small. Remember how the PS4 Pro didn't even have a boost mode at launch? Cerny seemed convinced that it would cause problems but once it was added only one game had issues and they were able to fix it relatively quickly.
Which game had an issue?
 

gremlinz1982

Member
Aug 11, 2018
5,331
That's not true.

No matter what any console manufacturer does, their hardware becomes "outdated" the second a smaller manufacturing node becomes available. That's just how it goes. And if you look at the PS4 Pro in particular, the only thing Sony changed was the GPU. They didn't add more RAM or use a different CPU (just clocked it higher); that tells you it wasn't something done because their hardware was outdated, but rather something done to at least remain relevant.

I think people need to understand that consoles are usually made the best way they can be made at the time they are being made, and at least good enough to last 5-6 years. It is literally impossible to make a future-proof console if by "future" you are talking about 6-7 years in the tech world.


Again, while more expensive than GDDR6, HBM3 can cost significantly less than HBM2, especially if applied the way it's being proposed to be applied. What that cost delta is isn't something that you or I know. But here are some advantages.

Let's say we are comparing 20GB of GDDR6 to 20GB of HBM3. And let's say the GDDR6 solution costs $100 and the HBM3 solution costs $130.
  • With more adoption, the cost of HBM3 will fall faster than GDDR6.
  • A smaller and cheaper PCB is required for HBM3.
  • Less power draw, meaning more power can go to the APU.
  • Less heat generated.
  • Much higher bandwidth.
  • Less die space is taken up by the memory controller, meaning you need a smaller APU and can also clock that APU higher since it will generate less heat.
Now the question is: how much will the cost savings in other areas of the system offset the cost of using HBM3? And is the resulting cost worth the benefits it gives you? What if Sony has a 320mm² chip that costs them $150 and MS has a 370mm² chip that costs them $190? Or Sony's PCB costs them $14 to MS's $22 PCB? All these things add up (rough tally below).
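(Just to make the "it all adds up" point concrete: every number below is one of the hypotheticals from the paragraphs above, pairing the smaller-chip/cheaper-PCB figures with the HBM3 option as the argument implies; none of these are real BOM costs:)

```python
# All figures are the post's hypothetical examples, not real costs.
gddr6_build = {"memory": 100, "apu": 190, "pcb": 22}  # cheaper RAM, bigger die and PCB
hbm3_build  = {"memory": 130, "apu": 150, "pcb": 14}  # pricier RAM, smaller die and PCB

print(sum(gddr6_build.values()))  # 312
print(sum(hbm3_build.values()))   # 294 -- the RAM premium could be offset elsewhere
```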

And everything you are saying about HBM is based on what we know so far of it being very expensive. But none of that takes into account the radically different application methods or design of HBM3, which were all specifically designed to make it cheaper and easier to make.

Some more additions to better explain myself.

  • HBM3 and these changes have been listed by Samsung since 2016, with a scheduled release of 2019/2020. So that's something both Sony and MS would have been fully aware of.
  • Some more insight into the interposer-free design: currently, HBM2 requires a silicon interposer; both it and the GPU die sit on that interposer, and the interposer is then connected to the PCB. The interposer-free design is akin to what you see with any Ryzen CCX-based CPU, but now imagine that instead of having 2 CCXs and an I/O die, you have an APU and two HBM stacks. The substrate that Ryzen CPUs sit on is significantly cheaper than what you typically have as the HBM interposer.
  • However, using this cheaper substrate means you can't have traces as dense as you would find in an HBM2 interposer, which is why the bus width per stack has been dropped from 1024-bit to 512-bit.
  • Another cost-slashing initiative is to reduce the number of TSVs and the buffer die.
  • All these things have been known and planned since 2016.
1. Your capital costs also change if you make a huge design change. You would have to distribute that cost to the consumer, or write it off. The latter is double the cost incurred. This is how accounting in business works, i.e. you need a dollar (preferably profit) to write off money that is either a loss or badly allocated expenditure.

2. HBM was announced in 2013. The first application for it was AMD's Fiji GPUs. GDDR6 was announced in 2016 and was in production in 2018. HBM costs have really never come down enough to make it affordable for mass consumption. More telling, however, is the fact that console developers have generally used RAM types that are already in production as opposed to going for something that is still not in use. Setting aside the spec talk, design changes, the lower voltage and so on, Sony going with HBM3 instead of HBM2 or even GDDR6 would be a huge deviation from what has traditionally happened in the console space. It is for this reason that I speculate that we will not see them going for this as a solution.

They will also be looking at history. The DDR platform is something that has evolved: power consumption has gone down with each iteration as speeds have gone up. Who is to say that we will not see further evolution of the platform?

3. You are designing a console around what is needed. You have a GPU and a CPU. At peak, what is the total memory bandwidth they can use? There is no need to invest in excess capacity if it can be avoided. A $1200 Nvidia card uses 11GB of VRAM at 616GB/s. Going again by history, we will not be getting a card that is anywhere near that in performance, so why would anyone need 700GB/s of bandwidth for the entire SoC? It looks like overkill, especially when you consider that people are talking about a smaller chip. You could clock it higher, but how much higher? 15%?

To me, it seems like you would be jumping through a lot of hoops to try and make HBM3 work.
 

Pheonix

Banned
Dec 14, 2018
5,990
St Kitts
I wonder if Sony has decided to use some form of HBM, but they don't yet know which one due to the question of whether or not HBM3 will be available, hence why they haven't confirmed the RAM type yet.
No, them not announcing it has nothing to do with whatever kind of RAM they are using. But it does lead me to believe that there will be something good/bad about it, which is why it's the one thing that has seemingly been purposely not mentioned. However, whatever they are using, it's very well set in stone by now.

Everything I said about the MCM approach to HBM3 has been making the rounds from Samsung since 2016. It used to be called LCHBM. It may not even still be in development. But one thing is for certain: HBM3 has a lot of design initiatives prioritizing affordability.
1. Your capital costs also change if you make a huge design change. You would have to distribute that cost to the consumer, or write it off. The latter is double the cost incurred. This is how accounting in business works, i.e. you need a dollar (preferably profit) to write off money that is either a loss or badly allocated expenditure.

2. HBM was announced in 2013. The first application for it was AMD's Fiji GPUs. GDDR6 was announced in 2016 and was in production in 2018. HBM costs have really never come down enough to make it affordable for mass consumption. More telling, however, is the fact that console developers have generally used RAM types that are already in production as opposed to going for something that is still not in use. Setting aside the spec talk, design changes, the lower voltage and so on, Sony going with HBM3 instead of HBM2 or even GDDR6 would be a huge deviation from what has traditionally happened in the console space. It is for this reason that I speculate that we will not see them going for this as a solution.

They will also be looking at history. The DDR platform is something that has evolved: power consumption has gone down with each iteration as speeds have gone up. Who is to say that we will not see further evolution of the platform?

3. You are designing a console around what is needed. You have a GPU and a CPU. At peak, what is the total memory bandwidth they can use? There is no need to invest in excess capacity if it can be avoided. A $1200 Nvidia card uses 11GB of VRAM at 616GB/s. Going again by history, we will not be getting a card that is anywhere near that in performance, so why would anyone need 700GB/s of bandwidth for the entire SoC? It looks like overkill, especially when you consider that people are talking about a smaller chip. You could clock it higher, but how much higher? 15%?

To me, it seems like you would be jumping through a lot of hoops to try and make HBM3 work.
Fair points.

I disagree with the idea that they wouldn't do something just because it goes against precedent, though.

And HBM3 doesn't "have" to be 700GB/s+; they can even clock it lower to use even less power, which is one of the major benefits it has over GDDR6, which uses as much as 4.5x the power of HBM2 (and HBM3 uses even less power than that). In a console, a 10-15W delta can make a world of difference. And we don't have to guess the power of GDDR6: we know what it is per chip right now and can calculate how much power it will draw (rough sketch below). And that draw will remain constant over the course of the generation, as long as it's GDDR6 being used.

And you are blatantly ignoring that there are ways HBM3 can be made even cheaper and incorporated at a lower cost... why? Simply because no one else has done it yet?

And don't you see that rumors of Sony using a smaller chip actually point more to HBM than to GDDR6?
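(A rough sketch of the kind of calculation being described. The per-chip wattage is a placeholder I've assumed for illustration, not a datasheet figure; swap in the real number:)

```python
# Placeholder assumption: ~2.5 W per GDDR6 chip at full speed (not a datasheet value).
watts_per_chip = 2.5
chips = 8                        # e.g. eight chips on a 256-bit bus (32 bits per chip)
gddr6_power = watts_per_chip * chips
print(gddr6_power)               # 20.0 W for the memory alone under these assumptions

# If an HBM setup drew roughly half of that (again an assumption), the saving lands
# in the 10-15 W range being argued about.
print(gddr6_power / 2)           # 10.0 W saved
```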
 
Last edited:

BreakAtmo

Member
Nov 12, 2017
12,805
Australia
No, them not announcing it has nothing to do with whatever kind of RAM they are using. But it does lead me to believe that there will be something good/bad about it, which is why it's the one thing that has seemingly been purposely not mentioned. However, whatever they are using, it's very well set in stone by now.

Everything I said about the MCM approach to HBM3 has been making the rounds from Samsung since 2016. It used to be called LCHBM. It may not even still be in development. But one thing is for certain: HBM3 has a lot of design initiatives prioritizing affordability.

Well, fingers crossed, and hopefully we know in a few months. Still hoping the January 22/Sony Hall rumour was legit.

On that - how far in advance are daily and nightly events at places like Sony Hall booked? As in, normal events that don't need secrecy or anything? Is it possible that we could reach November or December and find that Sony Hall has booked every night up to February, with the sole exception of a suspiciously blank January 22?
 

Sekiro

Member
Jan 25, 2019
2,938
United Kingdom
I still think we might have full PS4, PS3, PS2 & PS1 BC on the PS5 and not just PS4 BC, mainly because Microsoft will have full Xbox (OG, 360, One) BC on Xbox Scarlett, so they'd lose the BC marketing edge to MS next year. And from what we've heard, Sony really cares about perfecting BC for next gen, so much so that they delayed the console by a year based on a decision made in 2017.

Now I'd be shocked if, given that 3-year window from 2017 to 2020, Sony didn't consider full BC support going into the next-gen PlayStation console and not just PS4 BC. Something just doesn't sit right with needing 3 years just to get PS4 BC 101% spot on, especially given that both consoles are x86 machines and Sony has already dabbled in console-to-console compatibility with the PS4 OG to Pro.

I am 80-85% sure that we'll get full PlayStation BC support in the PS5.
 

Flandy

Community Resettler
Member
Oct 25, 2017
3,445
I still think we might have full PS4, PS3, PS2 & PS1 BC on the PS5 and not just PS4 BC, mainly because Microsoft will have full Xbox (OG, 360, One) BC on Xbox Scarlett, so they'd lose the BC marketing edge to MS next year. And from what we've heard, Sony really cares about perfecting BC for next gen, so much so that they delayed the console by a year based on a decision made in 2017.

Now I'd be shocked if, given that 3-year window from 2017 to 2020, Sony didn't consider full BC support going into the next-gen PlayStation console and not just PS4 BC. Something just doesn't sit right with needing 3 years just to get PS4 BC 101% spot on, especially given that both consoles are x86 machines and Sony has already dabbled in console-to-console compatibility with the PS4 OG to Pro.

I am 80-85% sure that we'll get full PlayStation BC support in the PS5.
Give PSP and Vita too pls
 

PianoBlack

Member
May 24, 2018
6,626
United States
I still think we might have full PS4, PS3, PS2 & PS1 BC on the PS5 and not just PS4 BC, mainly because Microsoft will have full Xbox (OG, 360, One) BC on Xbox Scarlett, so they'd lose the BC marketing edge to MS next year. And from what we've heard, Sony really cares about perfecting BC for next gen, so much so that they delayed the console by a year based on a decision made in 2017.

Now I'd be shocked if, given that 3-year window from 2017 to 2020, Sony didn't consider full BC support going into the next-gen PlayStation console and not just PS4 BC. Something just doesn't sit right with needing 3 years just to get PS4 BC 101% spot on, especially given that both consoles are x86 machines and Sony has already dabbled in console-to-console compatibility with the PS4 OG to Pro.

I am 80-85% sure that we'll get full PlayStation BC support in the PS5.

Hope so, that'd be amazing. Still need to finish Suikoden II...
 

gremlinz1982

Member
Aug 11, 2018
5,331
Fair points.

I disagree with the idea that they wouldn't do something just because it goes against precedent, though.

And HBM3 doesn't "have" to be 700GB/s+; they can even clock it lower to use even less power, which is one of the major benefits it has over GDDR6, which uses as much as 4.5x the power of HBM2 (and HBM3 uses even less power than that). In a console, a 10-15W delta can make a world of difference. And we don't have to guess the power of GDDR6: we know what it is per chip right now and can calculate how much power it will draw. And that draw will remain constant over the course of the generation, as long as it's GDDR6 being used.

And you are blatantly ignoring that there are ways HBM3 can be made even cheaper and incorporated at a lower cost... why? Simply because no one else has done it yet?

And don't you see that rumors of Sony using a smaller chip actually point more to HBM than to GDDR6?
I am not ignoring that HBM3 can be made cheaper. But what makes something cheaper? It is refinement of the process and scaling up of production. If there is high production/supply and high demand, costs can be spread over a larger canvas, so you end up with a high-volume, lower-margin business. It also allows for a faster break-even point, and because it is something that is selling extremely well, you are likely to have more players coming in and investing in capacity. Competition then brings in even more cost reductions. This, at the end of the day, is what drives economies of scale.

When demand is weaker, you do not have the same economies of scale. We now live in a worldwide economy where people want the cheapest goods. Governments and central banks, which always want to have some inflation, have had a problem generating it because of just how interconnected the worldwide economy is. So you get some niche products that are high end and can charge a premium, but for the rest of the market it is a bloodbath where companies are sometimes simply trying to sell enough to stay in business.
This is why you never see new tech adopted quickly if the cost benefit is not there. This in turn means that there is not much demand to force prices down... it is a vicious cycle. So console manufacturers will go for a known entity, not competing for the fastest chips on the market either, because those also cost more.

You started with design A and pumped money into it. You have now abandoned that and are planning a new design, which also costs money, around a more expensive RAM type whose costs will likely not come down as fast, going by how HBM prices have behaved. There are people who sign off on expenditure, and they are more concerned about the bottom line than about innovation.

Sony, Microsoft and Nintendo will not be chasing new tech if the price is not right. Cost, at the end of the day, is what drives the console business. Sony/Microsoft could have gone for more capable Intel CPUs in 2013, or gone with SSD solutions or better GPUs. They did not, because they are building something for the masses.
 

Sekiro

Member
Jan 25, 2019
2,938
United Kingdom
Hope so, that'd be amazing. Still need to finish Suikoden II...

There are a lot of Metal Gear games that I've yet to play but really want to experience. I had a 360 last gen, so I only got to play MGS2, 3 and PW from the HD remaster and MGS: GZ and TPP on the PS4; in fact, the PS4 was my first PlayStation console. There are a lot of exclusives from the PS1, PS2 and PS3 days that I really want to play, with the added benefit of 4K/8K at uncapped frame rates.
 

PianoBlack

Member
May 24, 2018
6,626
United States
There are a lot of Metal Gear games that I've yet to play but really want to experience. I had a 360 last gen, so I only got to play MGS2, 3 and PW from the HD remaster and MGS: GZ and TPP on the PS4; in fact, the PS4 was my first PlayStation console. There are a lot of exclusives from the PS1, PS2 and PS3 days that I really want to play, with the added benefit of 4K/8K at uncapped frame rates.

Ah wow. Massive potential backlog for you then, haha. I was pretty much Sony only from PS2-PS4 and had both portables too. So I'm hoping to get access to my old PS3/PS1 classic digital purchases. Not really holding out any hope for PSP and Vita games but who knows?

Xbox doesn't have the deep back catalog Sony does (though still plenty of gems), but knowing MS respects those purchases and wants me to be able to go back and play Gears 1-4 before Gears 5 came out, for example, has really added some shine to the XB1 that I hope Sony matches. Like, I literally bought the Lost Odyssey DLC on a friend's 360 back in 2009 and there it was waiting for me in 2018 when I got my first Xbox. Hope that becomes standard. Seems we are finally getting enough power on the PS side to make it possible.
 

Sekiro

Member
Jan 25, 2019
2,938
United Kingdom
lol I assume you're joking about the 120fps but a 9x resolution boost across the board similar to what Xbox has for OG titles would be amazing for Vita and PSP

Mainly uncapped frames on previous gens, wanna see how crazy those old games look once you take off the technical leash, plus they need something to showcase that 8K/120fps marketing push they've been telling us lol.

Ah wow. Massive potential backlog for you then, haha. I was pretty much Sony only from PS2-PS4 and had both portables too. So I'm hoping to get access to my old PS3/PS1 classic digital purchases. Not really holding out any hope for PSP and Vita games but who knows?

Xbox doesn't have the deep back catalog Sony does (though still plenty of gems), but knowing MS respects those purchases and wants me to be able to go back and play Gears 1-4 before Gears 5 came out, for example, has really added some shine to the XB1 that I hope Sony matches. Like, I literally bought the Lost Odyssey DLC on a friend's 360 back in 2009 and there it was waiting for me in 2018 when I got my first Xbox. Hope that becomes standard. Seems we are finally getting enough power on the PS side to make it possible.

Yeah, Xbox's commitment to backwards compatibility has been nothing but stellar and a great love letter to their fans; I hope Sony follows suit. It is insane how big Sony's library is when you combine all the games released from the PS1 to the PS4. Imagine announcing FULL BC support for the PS5, both disc and digital? It would dominate global news next year come release.
 

sncvsrtoip

Banned
Apr 18, 2019
2,773
No, them not announcing it has nothing to do with whatever kind of RAM they are using. But it does lead me to believe that there will be something good/bad about it, which is why it's the one thing that has seemingly been purposely not mentioned. However, whatever they are using, it's very well set in stone by now.

Everything I said about the MCM approach to HBM3 has been making the rounds from Samsung since 2016. It used to be called LCHBM. It may not even still be in development. But one thing is for certain: HBM3 has a lot of design initiatives prioritizing affordability.

Fair points.

I disagree with the idea that they wouldn't do something just because it goes against precedent, though.

And HBM3 doesn't "have" to be 700GB/s+; they can even clock it lower to use even less power, which is one of the major benefits it has over GDDR6, which uses as much as 4.5x the power of HBM2 (and HBM3 uses even less power than that). In a console, a 10-15W delta can make a world of difference. And we don't have to guess the power of GDDR6: we know what it is per chip right now and can calculate how much power it will draw. And that draw will remain constant over the course of the generation, as long as it's GDDR6 being used.

And you are blatantly ignoring that there are ways HBM3 can be made even cheaper and incorporated at a lower cost... why? Simply because no one else has done it yet?

And don't you see that rumors of Sony using a smaller chip actually point more to HBM than to GDDR6?
"In a console, A 10-15W delta can make a world of difference." Lmao
 
Nov 2, 2017
2,275
BUT - the AMD minimum required GPU for RDR2 is the R9 280, which is 28 CUs running at 933MHz, i.e. 3.3TF, basically closer to PS4 Pro than PS4, while the One S, a GPU with almost 1/3 of the R9 280's power, runs the game just fine. So yeah, even the One S runs RDR2 better than a 3TF+ GCN PC card, but that's the nature of console optimization vs the PC's brute force. It doesn't mean that the R9 280 isn't almost 2x more powerful than the PS4, just that the PS4 version is much more optimized.
It's a mistake to think minimum requirements are consistent. I've seen games where the minimum requirements delivered way past console performance and I've seen games where the minimum requirements gave you 20fps at 720p.

You haven't actually lost the argument yet. It's still possible for the 750ti to match the PS4 in RDR2, but it's rather unlikely.
 
Sep 19, 2019
2,260
Hamburg- Germany
Why do you think I can't see the difference? You don't know it, but I know that you can't see the difference between 25W and 10-15W :d

Then just define the meaning behind "worlds" so I get the idea behind your comments. Pheonix was talking about the power consumption of only one part of the entire system, which has an even bigger impact than the overall figure suggests, since the cooling would be specced differently accordingly, and so on... Laughing about 10-15 watts less power consumption from just one component of the system, one that significantly impacts other parts, is just wrong.
 

sncvsrtoip

Banned
Apr 18, 2019
2,773
Then just define the meaning behind "worlds" so I get the idea behind your comments. Pheonix was talking about the power consumption of only one part of the entire system, which has an even bigger impact than the overall figure suggests, since the cooling would be specced differently accordingly, and so on... Laughing about 10-15 watts less power consumption from just one component of the system, one that significantly impacts other parts, is just wrong.
Still, thinking that a 10-15W difference is a "world of difference" when both consoles will probably be around 180-200W is a funny exaggeration; if that's not so funny for you, hmm, whatever ;)
 