This might precisely be it. Sony with the slightest edge over Scarlett
> This might precisely be it. Sony with the slightest edge over Scarlett

Edge by what? 0.2 TF? Lol
What about the 0.4GHz on the CPU and the 48GB/s of bandwidth in favor of Scarlett?
If these specs are true, there is no edge to write home about.
> Executive editor of Game Informer straight up said that, in discussions on the show floor at a conference, devs were saying that PS5 is more powerful. It was unequivocal, and he even clarified it was based on the target specs each was aiming for, not the iteration of devkit they currently had. Add in kleegamefan saying the difference between the new consoles is less than the difference between the OG X1 and PS4, and here we are.

Oh ok, I haven't been in this thread much, so I didn't know about the GI editor's info. I did hear about the kleegamefan guy talking in here a while ago. Interesting.
Kleegamefan - he suggested that PS5 has the ever-so-slight edge
> MS had shown us in the CGI trailer a mixture of 1GB and 2GB chips; 10 of those means anything between 11GB and 19GB of GDDR6 in Scarlett, unless they decide to downgrade to just 1GB chips (which will never happen) or go with all 2GB chips (which might happen).
> Flute implies a 256-bit bus with 16GB of 18Gbps GDDR6; the MS CGI trailer from E3 implies a 320-bit bus with 10GB-20GB of 14Gbps GDDR6.
> 320-bit vs 256-bit should be ~20mm^2 on the die.

Thanks for the info. Funny thing is that I have still not seen this Flute rumor. I think I was either banned when it was making the rounds and just never bothered to look at it properly, or it's one of those things I dismissed, lol.
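The chip-count arithmetic behind the figures above can be sketched out. Note these are the rumored values (10 chips, 14Gbps/18Gbps speeds), not confirmed specs:

```python
# Sketch of the RAM math in the rumors above (rumored values, not confirmed).
# Each GDDR6 chip has a 32-bit interface, so chip count determines bus width.

def bandwidth_gbs(bus_bits: int, gbps: float) -> float:
    """Peak bandwidth: bytes per transfer (bus width / 8) times per-pin Gbps."""
    return bus_bits / 8 * gbps

chips = 10                              # chip count implied by the E3 CGI trailer
print(bandwidth_gbs(chips * 32, 14))    # 560.0 GB/s: 320-bit at 14Gbps
print(bandwidth_gbs(256, 18))           # 576.0 GB/s: Flute-style 256-bit at 18Gbps

# Mixing 1GB and 2GB chips: with at least one of each among 10 chips,
# total capacity lands anywhere from 11GB to 19GB.
mixes = sorted({n1 * 1 + (chips - n1) * 2 for n1 in range(1, chips)})
print(mixes)                            # [11, 12, 13, 14, 15, 16, 17, 18, 19]
```

This is why the mix reads as a 10GB-20GB range: all-1GB gives 10GB, all-2GB gives 20GB, and any mix lands in between.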
> It'll be most beneficial for devs and their development process. That will trickle down to us in other ways, like games potentially coming out sooner.

That's why I mentioned the move away from baking. It would make a lot of sense for devs to want it simply for that reason.
> Thanks for the info. Funny thing is that I have still not seen this Flute rumor. I think I was either banned when it was making the rounds and just never bothered to look at it properly, or it's one of those things I dismissed, lol.

Flute popped up in the userbench DB, found by Kemchi. It had:
- CPU score very similar to the 1700X.
- 8 cores / 16 threads at 1.6GHz base and 3.2GHz boost.
- 16 1GB GDDR6 chips in a clamshell that scored 530GB/s in the benchmark.
- GPU at 2GHz.
> As for the MS RAM thing, I am leaning more towards the RAM chip mixture they used being a misdirect, so as to not give an exact number for the RAM amount of their hardware, because the whole mixture thing just doesn't work based on how I understand GDDR6 to work.

I'm not sure what kind of misdirect they can do using that setup. If it's not mixed, it's 10GB of GDDR6 or 20GB of GDDR6, no other option, and I'm pretty sure it's not 10GB :)
> Edge by what? 0.2 TF? Lol
> What about the 0.4GHz on the CPU and the 48GB/s of bandwidth in favor of Scarlett?
> If these specs are true, there is no edge to write home about.

I think you use the word "edge" differently than everyone else.
> That's why I mentioned the move away from baking. It would make a lot of sense for devs to want it simply for that reason.

Yep.
> Flute popped up in the userbench DB, found by Kemchi. It had:
> - CPU score very similar to the 1700X.
> - 8 cores / 16 threads at 1.6GHz base and 3.2GHz boost.
> - 16 1GB GDDR6 chips in a clamshell that scored 530GB/s in the benchmark.
> - GPU at 2GHz.
> The benchmark was taken down, so there's no link to it anymore; maybe someone here has a screenshot. Flute is also a Shakespeare reference, so the leak seemed pretty sound, and it replaced the good old Gonzalo leak.
> So from that leak, we can assume that PS5 has a 256-bit bus with 16GB of 18Gbps GDDR6 running at an unknown speed, probably a bit under 18Gbps, which means a bit under 576GB/s.
> I'm not sure what kind of misdirect they can do using that setup. If it's not mixed, it's 10GB of GDDR6 or 20GB of GDDR6, no other option, and I'm pretty sure it's not 10GB :)

I don't know how I missed all that.
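The "a bit under 18Gbps" deduction in the Flute leak can be sanity-checked from the 530GB/s benchmark figure, assuming (as the leak implies) a 256-bit bus, i.e. 16 chips in clamshell pairs:

```python
# Back-of-the-envelope check of the Flute numbers (assumed: 256-bit bus,
# i.e. 16 GDDR6 chips in clamshell pairs sharing 8 x 32-bit channels).
measured_gbs = 530.0            # GB/s reported by the benchmark
bus_bytes = 256 / 8             # bytes moved per transfer on a 256-bit bus

# Effective per-pin data rate implied by the measurement.
implied_gbps = measured_gbs / bus_bytes
print(implied_gbps)             # 16.5625 -> chips running around 16.6Gbps

# Theoretical peak if 18Gbps-rated chips ran at their full rated speed.
print(bus_bytes * 18)           # 576.0 GB/s
```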
> Flute popped up in the userbench DB, found by Kemchi. […] I'm not sure what kind of misdirect they can do using that setup. If it's not mixed, it's 10GB of GDDR6 or 20GB of GDDR6, no other option, and I'm pretty sure it's not 10GB :)

They can mix 2s and 1s.
Predictions
PS5
GPU: 44 CUs @ 2000MHz = 11.26TF
CPU: 3.2GHz Zen 2
RAM: 16GB GDDR6 @ 16Gbps on a 256-bit memory bus = 512GB/s bandwidth
SSD: 1TB
$499
Scarlett
GPU: 48 CUs @ 1800MHz = 11.06TF
CPU: 3.6GHz Zen 2
RAM: 16GB GDDR6 @ 14Gbps on a 320-bit memory bus = 560GB/s bandwidth
SSD: 1TB
$499
Practically the same machine, just focusing on slightly different things.
Size wise, I think Scarlett will be the bigger box, PS5 smaller.
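The teraflop figures in these predictions follow from the standard GCN/RDNA-style formula (64 shader ALUs per CU, 2 FLOPs per ALU per clock); a quick check that the listed numbers are internally consistent:

```python
# Sanity check of the predicted numbers, using the standard GPU FLOPs formula
# (CUs x 64 shaders x 2 FLOPs per clock) and the GDDR bandwidth formula.

def teraflops(cus: int, clock_mhz: int) -> float:
    return cus * 64 * 2 * clock_mhz / 1_000_000

def bandwidth_gbs(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

print(round(teraflops(44, 2000), 2))   # 11.26 TF -- predicted PS5
print(round(teraflops(48, 1800), 2))   # 11.06 TF -- predicted Scarlett
print(bandwidth_gbs(256, 16))          # 512.0 GB/s -- PS5 prediction
print(bandwidth_gbs(320, 14))          # 560.0 GB/s -- Scarlett prediction
```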
> Flute popped up in the userbench DB, found by Kemchi. […] So from that leak, we can assume that PS5 has a 256-bit bus with 16GB of 18Gbps GDDR6 running at an unknown speed, probably a bit under 18Gbps, which means a bit under 576GB/s.
> Kleegamefan - he suggested that PS5 has the ever-so-slight edge

If true, that's all they need for marketing. Even if by 0.000…001 TF. 0.99999… < 1
RAM is based on the E3 video, which showed what looked to be a 10-chip setup with 1GB and 2GB chips at 14Gbps.
GPU is based on Scarlett being very close to PS5.
CPU is a shot in the dark.
What if the mixed RAM density thing is due to a non-unified RAM pool? We know that access contention between CPU and GPU can have a pretty disproportionate impact on available bandwidth and, with Zen 2 being considerably more powerful and sensitive to latency, it might make sense to give the CPU its own set of pipes. HBCC could be used to both virtually unify the pools and shuttle data between them, if necessary.
> Fair enough. Even though I think a proper leak would be exciting, it might be cool to experience a proper reveal (for once).

I think everyone finding out at the same time during a reveal would be cool. We're probably ~4 months away from it anyway. It's right around the corner.
I really don't know about this... I can't say for certain, but as far as I know there has never been a single GPU or system that uses different sizes of GDDR RAM. Even the RTX 2080 Ti, which has 11GB, simply left a space on the board blank, as opposed to using 10x 1GB modules + 2x 512MB modules, which would have meant a higher bus width and in turn higher bandwidth. There has to be a reason for that. Here's an image of the RTX 2080 Ti PCB.
> Even the RTX 2080 Ti that has 11GB simply left a space on the board blank […] There has to be a reason for that.

It is reduced like that for yield. The full version of this die only shows up in Titan products and has all 12 memory channels active.
Notice they opted to leave a space blank? If you could mix GDDR sizes, wouldn't it have been better for them to use 10x 1GB modules and 2x 512MB modules to not just get their 11GB total but also get higher bandwidth over a 384-bit bus as opposed to a 352-bit bus?
And from what I know about PCs and DDR RAM, if you have different sizes of RAM, all the sticks of identical size can work optimally in dual-channel mode, while the ones without a size match only work as single-channel RAM.
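For the 2080 Ti example, the bus-width difference works out as follows, assuming the card's 14Gbps GDDR6; the 12-chip mixed configuration is the hypothetical one from the post above, not a real product:

```python
# Bandwidth difference in the 2080 Ti example (14Gbps GDDR6 assumed).
# Each GDDR6 package is a 32-bit channel, so chip count sets the bus width.

def bandwidth_gbs(chips: int, gbps: float = 14.0) -> float:
    return chips * 32 / 8 * gbps

print(bandwidth_gbs(11))   # 616.0 GB/s -- actual 2080 Ti: 11 chips, 352-bit bus
print(bandwidth_gbs(12))   # 672.0 GB/s -- hypothetical 10x1GB + 2x512MB, 384-bit
```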
> It is reduced like that for yield. The full version of this die only shows up in Titan products and has all 12 memory channels active.

OK. Then can you answer this, because it's got me stumped.
GameSpot has a list comparing specs of Scarlett and PS5 based on public info... it says Scarlett has a 1.6GHz clock on the Zen 2. Where did they pull that BS number from? Lol
Edit: and it specifically says Scarlett has 16 gigs of RAM. What is going on?
Can they mix different sizes of RAM with GDDR? Because I don't think anyone has ever done it.
> GameSpot has a list comparing specs of Scarlett and PS5 based on public info... it says Scarlett has a 1.6GHz clock on the Zen 2. […]

Yikes.
> GameSpot has a list comparing specs of Scarlett and PS5 based on public info... […]

The GPU on Scarlett is also specifically listed as Arcturus, so obviously this is nothing.
> GameSpot has a list comparing specs of Scarlett and PS5 based on public info... […]

How can something like this be posted by GameSpot? Wtf
And I said they pulled the numbers from their ass; I thought it was worth a laugh.
You don't find the humor in saying that the article will cause misinformation but also being the person to post it?
*throws lit match into forest* "Damn, that's gonna burn so many trees 😥"
*Walks into a fire station* "hey guys, can you believe people actually think that cotton candy is an effective way to put out fire?"
I walked into a thread with posters that know better, to share something funny. I would agree with you if I did this with any other thread
Ok, I think you may have taken some kind of offense when I meant none. Also not everybody will read every post and lurkers take random posts from here and post them elsewhere, facts be damned. But I wasn't trying to knock you, I legitimately found it funny. Sorry if you feel some type of way.
My bad! No issue at all, it's easy to misinterpret things through text. My apologies!
An Instinct product that isn't even capable of displaying graphics.
> Is it expected that the majority of PS5 games will still run at 30fps as a standard?

Probably so.
It's a matter of what the devs want. Most of them will want to spend the extra power on graphical features rather than 60fps; the former sells games better.
> The specs listed below are not set in stone, and are based on best guesswork from leaks and the small amount of publicly available information. This article will be updated as new information becomes available.