
iceblade

Member
Oct 25, 2017
4,216
This gen was the wrong one to entirely cock up due to the digital lock in.

Agreed. It could work out well for MS though - "Buy our console with your digital games from all previous Xbox consoles, and you can get Game Pass too with day and date releases of new first party titles". They've also reiterated that they want you to be able to bring your games forward with you. Sony hasn't been nearly as aggressive with their BC initiative despite also having a large catalogue to tap into. I get that PS3 would be problematic to bring over, but PS1 and PS2 shouldn't be. PS4 BC is good, yes, but it's not as exhaustive as MS's offering.

If I was a neutral jumping in and deciding on an ecosystem to stick with that'd be a pitch that'd definitely make me look hard at picking up an Xbox Series X. TBH as it is, and as someone who plays on PC / Playstation / Nintendo, I'm still having a hard time not following the Series X closely.

Also, but unrelated: Blue is the colour :D.
 
Last edited:

Scently

Member
Oct 27, 2017
1,464
I have a question: is it better for a developer to have a fixed or variable clock speed? I've been seeing some responses that state variable is the better situation (which doesn't seem right). Also, if we have two consoles using the same RDNA2 architecture, why are people discrediting the comparison of TFs? (This is from the PlayStation 5 OT.)
I think from Sony's point of view it's a smart way of extracting more out of a fixed silicon and power budget, but I would assume developers would much rather go with a fixed system, as I imagine it's one less burden they have to take into account.
As for which is more powerful, that's undeniably the XSX in terms of CPU, GPU and RAM bandwidth. But the level to which this will manifest itself on-screen remains to be seen. I don't think it will amount to much. The SSD in the PS5 is twice as fast as the one in the XSX, but even the XSX's SSD is no slouch; it's anywhere between 48-96x what we had to deal with this current gen. I personally don't see how the SSD in the XSX is going to limit level/game design, yet apparently once you more or less double the speed, suddenly everything changes. Nobody has been able to qualify that for me.
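A quick back-of-the-envelope check of that 48-96x figure (just a rough sketch: the ~50MB/s HDD baseline is an assumption, the SSD numbers are the announced raw/compressed rates):

# Rough sanity check of the "48-96x" claim, in Python
hdd_mb_s = 50                       # assumed effective throughput of a current-gen HDD
xsx_raw_gb_s = 2.4                  # XSX SSD, raw
xsx_compressed_gb_s = 4.8           # XSX SSD, with hardware decompression
ps5_raw_gb_s = 5.5                  # PS5 SSD, raw

print(xsx_raw_gb_s * 1000 / hdd_mb_s)         # 48.0 -> the low end of "48-96x"
print(xsx_compressed_gb_s * 1000 / hdd_mb_s)  # 96.0 -> the high end
print(ps5_raw_gb_s / xsx_raw_gb_s)            # ~2.3 -> the "twice as fast" comparison (raw)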

Anyway I think both systems are on a strong footing. Both are really powerful and I can't wait to see the games.
 
Last edited:

bcatwilly

Member
Oct 27, 2017
2,483
I have a question: is it better for a developer to have a fixed or variable clock speed? I've been seeing some responses that state variable is the better situation (which doesn't seem right). Also, if we have two consoles using the same RDNA2 architecture, why are people discrediting the comparison of TFs? (This is from the PlayStation 5 OT.)

Of course a developer would always prefer the most powerful fixed/known performance target possible over a variable target where they have to compromise on whether the CPU or GPU gets full power at any given time instead of both. These are just fanboys trying to convince themselves of the benefits of less powerful hardware.
 

rokkerkory

Banned
Jun 14, 2018
14,128
I have a question: is it better for a developer to have a fixed or variable clock speed? I've been seeing some responses that state variable is the better situation (which doesn't seem right). Also, if we have two consoles using the same RDNA2 architecture, why are people discrediting the comparison of TFs? (This is from the PlayStation 5 OT.)

I think most devs would like one less thing to worry about and would want the most power they can get.
 

space_nut

Member
Oct 28, 2017
3,306
NJ
Fixed, constant, known power of the GPU/CPU is 100% always better than variable clocks for a GPU/CPU that will downclock for "workloads". It means that devs on PS5 will have to code/develop with those "workloads" in mind and limit themselves more than on the XSX. The XSX offers power that won't change. On top of that, the XSX offering a superior GPU/CPU/RAM with no variable changes makes it an ideal system for devs to make games on.
 

Isayas

Banned
Jun 10, 2018
2,729
Fixed, constant, known power of the GPU/CPU is 100% always better than variable clocks for a GPU/CPU that will downclock for "workloads". It means that devs on PS5 will have to code/develop with those "workloads" in mind and limit themselves more than on the XSX. The XSX offers power that won't change. On top of that, the XSX offering a superior GPU/CPU/RAM with no variable changes makes it an ideal system for devs to make games on.

Hehehehe. We will see. Cerny is a genius and I love the doubters of the PS5.
 

DrKeo

Banned
Mar 3, 2019
2,600
Israel
Some more Lockhart prediction:

3.0-3.5GHz. Lower than XSX due to less robust cooling and less demand to feed the GPU.
10GB GDDR6, 6GB at 288GB/s, 4GB at 192GB/s
26 CUs, 32 ROPs at ~1353MHz, ~4.5TF, VRS, RT. Half the XSX's GPU in size.
1TB SSD
$249.99



RAM bandwidth is maxed out on the chips they have. They'd need to upgrade to 16Gbps chips, and that might be too expensive; it might not even work if the SoC and mobo weren't designed with that headroom.
XSS predictions sound like fun. I'll give it a shot:
CPU - 8 cores / 16 threads @3.2GHz.
GPU - 22 CUs @1530MHz -> 4.3TF.
Memory - 192-bit interface, 13.4Gbps GDDR6, 12GB at 321.6GB/s.
SSD - 512GB @2.2GB/s.
Other - no BR drive, looks like an XSX cut in half (so basically a cube).
Price - $349.

Hmm... DrKeo, you agree with this? 😉
That guy sounds like a bozo 😬
 
Last edited:

DukeBlueBall

Banned
Oct 27, 2017
9,059
Seattle, WA
Is this for real? PS5 will be a great console regardless, with some cool games no doubt, but it is definitively less powerful than the Xbox Series X.

Outrun

Member
Oct 30, 2017
5,782
Is this for real? PS5 will be a great console regardless, with some cool games no doubt, but it is definitively less powerful than the Xbox Series X.

100%

Both systems are splendid. XSX is more powerful though. The amount of energy that some are spending to discount that fact is astounding.
 

Deleted member 224

Oct 25, 2017
5,629
Hehehehe. We will see. Cerny is a genius and I love the the doubter for the PS5.
We've had a dev on this site claim that variable clocks are "less than ideal". If Sony could have guaranteed those clocks 100% of the time, they would have.

I'm sure Sony sees it as preferable to lowering the clocks and having a weaker system, but it's still a compromise.
 

Micerider

Member
Nov 11, 2017
1,180
I can't help but see a Lockhart solution dying very fast on the market. It will draw in some penniless gamers for sure.
100%

Both systems are splendid. XSX is more powerful though. The amount of energy that some are spending to discount that fact is astounding.

The problem comes from how the term is perceived for videogames. Some people see "more powerful" and automatically go into defense mode because they feel the intent is to diss the other console or make it look bad, or to imply that games will be crappy on it in comparison. None of these statements is true; there will be a difference, but not one that will make games different in their core or general feel.

The other aspect is: there is a point to be made about the SSD, but that point can't be used in a "power" comparison. It will also impact games, probably in ways we don't fully grasp yet, but you can't really show how as long as there is no direct use of it.

A weird discussion, all in all.
 

rokkerkory

Banned
Jun 14, 2018
14,128
I hope Lockhart is the streaming / xCloud device. Some local power, i.e. 4TF of RDNA 2 goodness, to help with the new technology that perhaps isn't quite ready for prime time, plus the power of xCloud in the datacenters, would give us an almost-XSX experience but at a reduced cost. It'd be a streaming device with no disc drive, meant for those that have the bandwidth to enjoy cloud gaming.
 

Isayas

Banned
Jun 10, 2018
2,729
We've had a dev on this site claim that variable clocks are "less than ideal". If Sony could have guaranteed those clocks 100% of the time, they would have.

I'm sure Sony sees it as preferable to lowering the clocks and having a weaker system, but it's still a compromise.

I guess, but we will see why Cerny went this route instead of the traditional beefier CPU/GPU.
 

tapedeck

Member
Oct 28, 2017
7,977
Other than the magical SSD stuff, I think the biggest cause of discounting the XSX's power advantage over PS5 boils down to 'percentage of increase'. People see the ~16% GPU advantage and jump to 'that's even less than One X vs PS4 Pro and way less than PS4 vs Xbone'. Of course that's true, but that 1.875-teraflop gap is more than an entire extra PS4, and those are RDNA2 flops; that's probably 1.5 PS4s' worth of GCN, a significant power gap no matter how you slice it.
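For anyone who wants to check the arithmetic, here's a minimal sketch using the announced CU counts and clocks (64 shaders per CU and 2 ops per clock are the standard RDNA figures):

# TF = CUs * 64 shaders/CU * 2 ops/clock * clock, in Python
def tflops(cus, mhz):
    return cus * 64 * 2 * mhz / 1e6

xsx = tflops(52, 1825)               # ~12.15 TF
ps5 = tflops(36, 2230)               # ~10.28 TF (peak, variable clocks)
ps4 = 1.84                           # original PS4, GCN, for scale

print(round(xsx - ps5, 2))           # ~1.87 TF gap, i.e. more than a whole extra PS4
print(round((xsx / ps5 - 1) * 100))  # ~18% advantage one way...
print(round((1 - ps5 / xsx) * 100))  # ...~15% deficit the other; the 'percentage' depends on the baseline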

As for 1st party games, who knows; Sony's teams are amazing and I'm sure their stuff will be jaw-dropping. I say all this as someone buying both consoles who was really rooting for PS5 to be 13.3TF, but obviously that rumor turned out to be 1000% horseshit and I'm admittedly a little disappointed in the PS5 spec.
 

bcatwilly

Member
Oct 27, 2017
2,483
I guess, but we will see why Cerny went this route instead of the traditional beefier CPU/GPU.

The simplest answer is that they maintained the same 36 CU count on the GPU because they didn't know how to handle backwards compatibility any better than doing that (Cerny even hinted around the edges of that when mentioning BC), so they are clocking the heck out of it to attempt to close the performance gap a little with the 12 TF 52 CU Xbox Series X GPU.
 

rokkerkory

Banned
Jun 14, 2018
14,128
The simplest answer is that they maintained the same 36 CU count on the GPU because they didn't know how to handle backwards compatibility any better than doing that (Cerny even hinted around the edges of that when mentioning BC), so they are clocking the heck out of it to attempt to close the performance gap a little with the 12 TF 52 CU Xbox Series X GPU.

I hope they fix the BC issue by next gen, i.e. PS6, so they aren't bound by h/w limitations, if this is true.
 

Godzilla24

Member
Nov 12, 2017
3,371
I have a question: is it better for a developer to have a fixed or variable clock speed? I've been seeing some responses that state variable is the better situation (which doesn't seem right). Also, if we have two consoles using the same RDNA2 architecture, why are people discrediting the comparison of TFs? (This is from the PlayStation 5 OT.)
Sustained is always better. Don't take any spin too seriously. Lots of FUD going around.
 

solis74

Member
Jun 11, 2018
42,907
The simplest answer is that they maintained the same 36 CU count on the GPU because they didn't know how to handle backwards compatibility any better than doing that (Cerny even hinted around the edges of that when mentioning BC), so they are clocking the heck out of it to attempt to close the performance gap a little with the 12 TF 52 CU Xbox Series X GPU.

interesting
 

ArchedThunder

Uncle Beerus
Member
Oct 25, 2017
19,060
Which is why all PC GPUs and CPUs use variable clocks??

What if the XSX could have run at 2.0GHz most of the time if it was variable?

There are definitely some benefits to going with a variable solution.
Variable clocks in PCs exist because they aren't fixed systems: in the case of throttling, parts are automatically clocked down if they heat up too much, and in the case of overclocking the user can choose to overclock at the cost of higher power draw and more heat, which may require better cooling. Meanwhile, every XSX is the same and every PS5 is the same, so you know the exact cooling setup and max power draw. Putting the clock speeds on the devs is another element of optimization that devs have to deal with, instead of always having the same speeds all the time. There is a reason games can be much more heavily optimized on consoles compared to PCs: the devs always know exactly what they are working with. The variable clock speeds in the PS5 are a completely different thing from what you see in PC hardware; the system is not going to automatically throttle based on heat, nor will the user be able to choose their clock speeds. Clock speeds in PS5 games will be decided by devs.
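To illustrate the distinction, here is a toy model of a fixed, shared power budget with made-up numbers (not Sony's actual boost algorithm):

# Toy sketch: clocks derived deterministically from the workload under a fixed SoC power budget
POWER_BUDGET = 100.0            # arbitrary units for the whole SoC (assumed)
CPU_MAX, GPU_MAX = 3.5, 2.23    # GHz caps

def clocks_for(cpu_load, gpu_load):
    # cpu_load / gpu_load are 0..1 "activity" levels of the current workload
    cpu_power = 40.0 * cpu_load     # assumed power cost of the CPU at its cap
    gpu_power = 70.0 * gpu_load     # assumed power cost of the GPU at its cap
    total = cpu_power + gpu_power
    if total <= POWER_BUDGET:
        return CPU_MAX, GPU_MAX     # both sit at their caps
    scale = POWER_BUDGET / total    # otherwise trade frequency for power
    return CPU_MAX * scale, GPU_MAX * scale

print(clocks_for(0.6, 0.9))   # light CPU work -> (3.5, 2.23), no downclock
print(clocks_for(1.0, 1.0))   # worst-case workload -> both scale back to fit the budget

The point is the same as above: the result depends only on the workload, not on heat or on which particular console you bought, so it's predictable for devs even though it isn't fixed.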
 

Outrun

Member
Oct 30, 2017
5,782
I can't help but see a Lockhart solution dying very fast on the market. It will draw in some penniless gamers for sure.

The problem comes from how the term is perceived for videogames. Some people see "more powerful" and automatically go into defense mode because they feel the intent is to diss the other console or make it look bad, or to imply that games will be crappy on it in comparison. None of these statements is true; there will be a difference, but not one that will make games different in their core or general feel.

The other aspect is: there is a point to be made about the SSD, but that point can't be used in a "power" comparison. It will also impact games, probably in ways we don't fully grasp yet, but you can't really show how as long as there is no direct use of it.

A weird discussion, all in all.

I don't think that anyone is going to look at the next Sony masterpiece game and feel that it is missing something because of a silly TF number. :)
 

DrKeo

Banned
Mar 3, 2019
2,600
Israel
You're thinking they need to match the bw for the OS? ~336GB/s is serious overkill for a 4-5TF GPU.
Actually I'm reiterating, the memory is 13.4Gbps (Sparkman in the Github baby!).

XSX has a split speed memory because MS wanted to save money, not because they wanted to have two speeds. So I'm assuming XSS will have the simplest memory setup it can, which IMO is a 192-bit interface with the full 12GB, but it will run at 1675MHz, just like the Sparkman leak. So that means 12GB @321.6GB/s.
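The bandwidth figure falls straight out of the bus width and the per-pin data rate (a quick sketch; the 1675MHz x 8 = 13.4Gbps relation is the one implied above):

# GDDR6 bandwidth (GB/s) = bus width in bits / 8 * data rate per pin in Gbps
def bandwidth_gb_s(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gb_s(192, 13.4))   # 321.6 -> the XSS guess above
print(bandwidth_gb_s(192, 14.0))   # 336.0 -> XSX's slower 6GB pool
print(bandwidth_gb_s(320, 14.0))   # 560.0 -> XSX's fast 10GB pool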

No BR drive -> Game Pass machine :)
Half size SSD because no 4K textures mean smaller installs and there is an expansion slot if anyone needs more.
 

cyrribrae

Chicken Chaser
Member
Jan 21, 2019
12,723
I hope Lockhart is the streaming / xCloud device. Some local power, i.e. 4TF of RDNA 2 goodness, to help with the new technology that perhaps isn't quite ready for prime time, plus the power of xCloud in the datacenters, would give us an almost-XSX experience but at a reduced cost. It'd be a streaming device with no disc drive, meant for those that have the bandwidth to enjoy cloud gaming.
If that's the case, the Lockhart is the wrong form factor. If they're going to give us a streaming device, then it doesn't need 4TF RDNA2. It needs to be an Xbox One or it needs to be a stick you plug into the back of your TV. I mean DLI improvements maybe obviate the usefulness of Xbox One, but probably not.

Lockhart is fine as a concept *IF* MS successfully finds a way to make it fairly trivial to downport games from the XSX to Lockhart. That's not a given, but it's also not impossible. If MS delivers on that and convincingly makes the argument that the Lockhart is for 1080/60 on games that hit 4k/60 on XSX at a significant price reduction, then the marketing strikes me as being VERY GOOD and poised to do well.
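Rough arithmetic behind that 4K-to-1080p pitch (the ~4TF Lockhart figure is just the number floated in this thread, nothing confirmed):

# Pixels per frame vs. compute budget
pixels_4k = 3840 * 2160
pixels_1080p = 1920 * 1080
print(pixels_4k / pixels_1080p)             # 4.0 -- 4K pushes 4x the pixels

xsx_tf, lockhart_tf = 12.15, 4.0            # lockhart_tf = the rumoured ~4TF
print(xsx_tf / (pixels_4k / 1e6))           # ~1.47 TF per megapixel at 4K
print(lockhart_tf / (pixels_1080p / 1e6))   # ~1.93 TF per megapixel at 1080p

Purely on shader throughput per pixel the downport looks comfortable; the open question is everything that doesn't scale with resolution (CPU work, memory footprint, per-frame fixed costs), which is exactly the part MS would have to make trivial.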
 

Pancracio17

▲ Legend ▲
Avenger
Oct 29, 2017
18,754
If that's the case, the Lockhart is the wrong form factor. If they're going to give us a streaming device, then it doesn't need 4TF RDNA2. It needs to be an Xbox One or it needs to be a stick you plug into the back of your TV. I mean DLI improvements maybe obviate the usefulness of Xbox One, but probably not.

Lockhart is fine as a concept *IF* MS successfully finds a way to make it fairly trivial to downport games from the XSX to Lockhart. That's not a given, but it's also not impossible. If MS delivers on that and convincingly makes the argument that the Lockhart is for 1080/60 on games that hit 4k/60 on XSX at a significant price reduction, then the marketing strikes me as being VERY GOOD and poised to do well.
Well, there was some tech MS showed a while ago that combined running parts of the game and streaming the rest of the game, making it overall have less lag and need less bandwidth than streaming.


Though I doubt Lockhart is actually for streaming, despite this.
 

Deleted member 12635

User requested account closure
Banned
Oct 27, 2017
6,198
Germany
He is right. GB for storage is not the GiB used for RAM. But no one has been using GiB for storage for decades now. Colbert.
In the early days of PCs, a GB was a GiB (based on 1024). The shift to a GB based on 1000 instead of 1024 came later (1998), just after I finished my computer science studies and left the PC desktop arena for around 20 years (until 2019).

What's fascinating is that with memory we are still using 1024-based units, both for marketing the products and for the actual size of the memory.

XSX has a split speed memory because MS wanted to save money, not because they wanted to have two speeds. So I'm assuming XSS will have the simplest memory setup it can, which IMO is a 192-bit interface with the full 12GB, but it will run at 1675MHz, just like the Sparkman leak. So that means 12GB @321.6GB/s.
8 to 10 GB of memory. With the new render techniques you don't need more memory for a 1080p console. What you need is bandwidth.

No BR drive -> Game Pass machine :)
Half size SSD because no 4K textures mean smaller installs and there is an expansion slot if anyone needs more.
I see no BRD but I think 1 TB even for Lockhart.
 
Last edited:

Fredrik

Member
Oct 27, 2017
9,003
In the early days of PCs, a GB was a GiB (based on 1024). The shift to a GB based on 1000 instead of 1024 came later.

What's fascinating is that with memory we are still using 1024-based units, both for marketing the products and for the actual size of the memory.


8 to 10 GB of memory. With the new render techniques you don't need more memory for a 1080p console. What you need is bandwidth.


I see no BRD but I think 1 TB even for Lockhart.
Is this somehow related to bit and byte conversions? Never even heard about GiB before.
 

christocolus

Member
Oct 27, 2017
14,932
MS put a lot of thought into this hardware and I can't wait to see what developers can achieve with it. Next gen will be great.
 
Last edited:

Deleted member 12635

User requested account closure
Banned
Oct 27, 2017
6,198
Germany
Is this somehow related to bit and byte conversions? Never even heard about GiB before.
If you buy storage, the capacity is given in GB. That means the number of bytes is divided by 1000*1000*1000 for a GB, whereas your system uses 1024*1024*1024, for obvious reasons (2^x).

"The gibibyte (GiB) is a multiple of the unit byte for digital information. The binary prefix gibi means 2^30, therefore one gibibyte is equal to 1,073,741,824 bytes = 1024 mebibytes. The unit symbol for the gibibyte is GiB. It is one of the units with binary prefixes defined by the International Electrotechnical Commission (IEC) in 1998.[1][2]

The gibibyte is closely related to the gigabyte (GB), which is defined by the IEC as 10^9 bytes = 1,000,000,000 bytes, so 1 GiB ≈ 1.074 GB. 1024 gibibytes are equal to one tebibyte. In the context of computer memory, gigabyte and GB are customarily used to mean 1024^3 (2^30) bytes, although not in the context of data transmission and not necessarily for hard drive size.[3]

Hard drive and SSD manufacturers use the gigabyte to mean 1,000,000,000 bytes. Therefore, the capacity of a 128 GB SSD is 128,000,000,000 bytes. Expressed in gibibytes this is about 119.2 GiB. Operating systems, including Microsoft Windows, display such a drive capacity as 119 GB, using the SI prefix G with the binary meaning. No space is missing: the size is simply being expressed in a different unit, even though the same prefix (G) is used in both cases.

The use of gigabyte (GB) to refer to 1,000,000,000 bytes in some contexts and to 1,073,741,824 bytes in others, sometimes in reference to the same device, has led to claims of confusion, controversies, and lawsuits.[4][5][6][7] The IEC created the binary prefixes (kibi, mebi, gibi, etc.) in an attempt to reduce such confusion. They are increasingly used in technical literature and open-source software, and are a component of the International System of Quantities.[8]"
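A short worked example of the two units (the 128 GB figure matches the SSD example in the quote above; the 16 GB RAM line is just an extra illustration):

# GB (10^9 bytes, what storage boxes advertise) vs GiB (2^30 bytes, what the OS reports)
bytes_on_drive = 128 * 10**9
print(bytes_on_drive / 2**30)   # ~119.2 -> shown by Windows as "119 GB"

ram_bytes = 16 * 2**30          # RAM is still marketed in 1024-based units,
print(ram_bytes / 10**9)        # ~17.18 -> so "16 GB" of RAM is ~17.2 GB in SI terms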

What? LOL
 
Last edited:

christocolus

Member
Oct 27, 2017
14,932
If you buy storage, the capacity is given in GB. That means the number of bytes is divided by 1000*1000*1000 for a GB, whereas your system uses 1024*1024*1024, for obvious reasons (2^x).

"The gibibyte (GiB) is a multiple of the unit byte for digital information. The binary prefix gibi means 2^30, therefore one gibibyte is equal to 1,073,741,824 bytes = 1024 mebibytes. The unit symbol for the gibibyte is GiB. It is one of the units with binary prefixes defined by the International Electrotechnical Commission (IEC) in 1998.[1][2]

The gibibyte is closely related to the gigabyte (GB), which is defined by the IEC as 10^9 bytes = 1,000,000,000 bytes, so 1 GiB ≈ 1.074 GB. 1024 gibibytes are equal to one tebibyte. In the context of computer memory, gigabyte and GB are customarily used to mean 1024^3 (2^30) bytes, although not in the context of data transmission and not necessarily for hard drive size.[3]

Hard drive and SSD manufacturers use the gigabyte to mean 1,000,000,000 bytes. Therefore, the capacity of a 128 GB SSD is 128,000,000,000 bytes. Expressed in gibibytes this is about 119.2 GiB. Operating systems, including Microsoft Windows, display such a drive capacity as 119 GB, using the SI prefix G with the binary meaning. No space is missing: the size is simply being expressed in a different unit, even though the same prefix (G) is used in both cases.

The use of gigabyte (GB) to refer to 1,000,000,000 bytes in some contexts and to 1,073,741,824 bytes in others, sometimes in reference to the same device, has led to claims of confusion, controversies, and lawsuits.[4][5][6][7] The IEC created the binary prefixes (kibi, mebi, gibi, etc.) in an attempt to reduce such confusion. They are increasingly used in technical literature and open-source software, and are a component of the International System of Quantities.[8]"


What? LOL
Lmao xD.. error.. I was about to type but got distracted and must have mistakenly hit some buttons. Lol.
 

Fredrik

Member
Oct 27, 2017
9,003
If you buy storage, the capacity is given in GB. That means the number of bytes is divided by 1000*1000*1000 for a GB, whereas your system uses 1024*1024*1024, for obvious reasons (2^x).

"The gibibyte (GiB) is a multiple of the unit byte for digital information. The binary prefix gibi means 2^30, therefore one gibibyte is equal to 1,073,741,824 bytes = 1024 mebibytes. The unit symbol for the gibibyte is GiB. It is one of the units with binary prefixes defined by the International Electrotechnical Commission (IEC) in 1998.[1][2]

The gibibyte is closely related to the gigabyte (GB), which is defined by the IEC as 10^9 bytes = 1,000,000,000 bytes, so 1 GiB ≈ 1.074 GB. 1024 gibibytes are equal to one tebibyte. In the context of computer memory, gigabyte and GB are customarily used to mean 1024^3 (2^30) bytes, although not in the context of data transmission and not necessarily for hard drive size.[3]

Hard drive and SSD manufacturers use the gigabyte to mean 1,000,000,000 bytes. Therefore, the capacity of a 128 GB SSD is 128,000,000,000 bytes. Expressed in gibibytes this is about 119.2 GiB. Operating systems, including Microsoft Windows, display such a drive capacity as 119 GB, using the SI prefix G with the binary meaning. No space is missing: the size is simply being expressed in a different unit, even though the same prefix (G) is used in both cases.

The use of gigabyte (GB) to refer to 1,000,000,000 bytes in some contexts and to 1,073,741,824 bytes in others, sometimes in reference to the same device, has led to claims of confusion, controversies, and lawsuits.[4][5][6][7] The IEC created the binary prefixes (kibi, mebi, gibi, etc.) in an attempt to reduce such confusion. They are increasingly used in technical literature and open-source software, and are a component of the International System of Quantities.[8]"


What? LOL
o.O
Lol it's only morning and I already feel like I'm dumber than I was yesterday
 

DukeBlueBall

Banned
Oct 27, 2017
9,059
Seattle, WA
If there is no decently priced (<$2000) TV that can take advantage of all the XSX features at launch, I'll make do with a Lockhart + 1440p gaming monitor for the time being.