Status
Not open for further replies.

Pheonix

Banned
Dec 14, 2018
5,990
St Kitts
Aight, thanks. So I'll get a new cable when I make a TV upgrade then.
You don't need to. The cable that comes with the PS5 can support the full bandwidth of HDMI 2.1 because it's a 2.1 cable. Their embedded HDMI chipset is just limited to 32Gb bandwidth (vs the standard 48Gb of HDMI 2.1). And 32Gb with 12-bit color depth would do either 4K@60Hz in 4:4:4 or 4K@120Hz in 4:2:2.

But what Vince may have missed, and what a lot of people are ignoring here... is that 32Gb also means you can do 1080p@120Hz at 4:4:4. Why that is very important is that most if not all of the games that support a 120fps mode would be doing that at 1080p.

Think about that: how many games do you think would be doing 4K@120fps on these consoles?
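Those bandwidth figures can be sanity-checked with a back-of-the-envelope calculation. This is a rough sketch (assuming standard CTA-861 timings with blanking and HDMI 2.1 FRL's 16b/18b line coding; the function and table names are my own), and the results line up with the ~32Gb, ~40Gb, and 48Gb figures discussed in this thread:

```python
# Rough HDMI bandwidth sketch. Assumptions: CTA-861 timings with
# standard blanking, and HDMI 2.1 FRL's 16b/18b line coding.

CHROMA_SAMPLES = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}  # samples/pixel

# Total pixels per frame (active + blanking) for common CTA-861 timings
TOTAL_PIXELS = {"2160p": 4400 * 2250, "1080p": 2200 * 1125}

def frl_link_rate_gbps(timing, hz, bit_depth, chroma):
    """Raw FRL link rate needed for an uncompressed video mode, in Gbit/s."""
    data_rate = TOTAL_PIXELS[timing] * hz * CHROMA_SAMPLES[chroma] * bit_depth
    return data_rate * 18 / 16 / 1e9  # add 16b/18b coding overhead

for mode in [("2160p", 60, 12, "4:4:4"),   # ~24 Gbps
             ("2160p", 120, 12, "4:2:2"),  # ~32 Gbps
             ("1080p", 120, 12, "4:4:4"),  # ~12 Gbps
             ("2160p", 120, 10, "4:4:4"),  # ~40 Gbps (the LG CX ceiling)
             ("2160p", 120, 12, "4:4:4")]: # ~48 Gbps (full HDMI 2.1)
    print(mode, f"-> {frl_link_rate_gbps(*mode):.1f} Gbps")
```

Note that 4K@120 in 4:4:4 only crosses the 40Gb line at 12-bit depth, which is why the "does 40 vs 48 matter" debate keeps coming back to whether 12-bit signals matter.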
 

Indy_Rex

Banned
Sep 20, 2020
759
Yeah, for sure. I have it on XSX, not PS5, so I have no experience with it on that platform (I also haven't seen any dropped frames, likely due to VRR support on Xbox), but I've been seeing all over the internet that it runs much better on PS5.

It consistently holds 60 more so than the XSX version, but frame tearing is still present, just not to the extent of the XSX version, or as noticeable.
 

RayCharlizard

Member
Nov 2, 2017
3,046
More important than the drop from RGB to YUV422 (which your PS4 Pro and Xbox One X have been doing with HDR content this entire time and you never noticed) is the drop from 12-bit to just 8-bit at 4K@120Hz instead of using 10-bit output.

Edit: Nevermind, later in the video he says his Denon receiver says the PS5 is outputting at 12-bit still and the LG readout is bugged.
 

Deleted member 5876

Big Seller
Banned
Oct 25, 2017
2,559
No it does not.

At 4K60Hz, the PS5 outputs the full 4:4:4 chroma with 12-bit color depth. When outputting at 120fps, it drops the chroma to 4:2:2, using roughly the same bandwidth but with less color information. Basically, Sony optimized their chipset specifically for a 4K60Hz output. It could have been worse; it could have been 4:2:0.

But this is a non-issue; it's one of those things that no one would probably notice unless they were told.

Lots of people, including me, aren't able to get full 4:4:4 with 4K/60Hz/HDR even though the TV is capable.
I have other devices that work fine with my setup.
Something's off with the PS5 here.
 

jfkgoblue

Banned
Oct 27, 2017
5,650
First they refuse to include Dolby Atmos support because "Tempest 3D", but it only works with headphones; then they fail to include VRR at launch (sorely needed in AC: Valhalla), and now this.

Lots of little compromises over the years from their hardware, and it's quite surprising and disappointing.
Valhalla doesn't have the frame drops nor nearly the tearing issues on PS5 that the XSX version has
 

halcali

Banned
Nov 7, 2017
6,317
Hong Kong SAR
Anyone wondering what the difference is between 4:4:4, 4:2:2, and 4:2:0 chroma subsampling should check out RTINGS' very informative page on the subject.

Bottom line is, unless you're using your display as a PC monitor, you will most likely never notice the difference in color accuracy between 4:4:4 and 4:2:2.

Yep!

Visual impact on video games is labelled as "minor" while PC is labelled as "major."

(movies = "none")
 

Xx 720

Member
Nov 3, 2017
3,920
The console will sell gangbusters either way; it would not shock me if Sony leaves it as is. Hope they do, though.
 

Pheonix

Banned
Dec 14, 2018
5,990
St Kitts
Well, their "PS5 TV" is 48Gbps, so hopefully they can match that
This is really a non-issue, as I have explained already. They don't have to do anything. 32Gb is enough to give you
  • 4K + 60Hz + HDR + 4:4:4
  • 4K + 120Hz + HDR + 4:2:2
  • 1080p + 120Hz + HDR + 4:4:4
How many games do you guys think will be running at 120Hz in 4K on these consoles? And even if a game comes along that is doing 4K@120fps and needs to drop the chroma to 4:2:2... I can bet 99% of the posters here wouldn't notice the difference.
 

Jedi2016

Member
Oct 27, 2017
15,965
I'm going to just file this under "who really cares" until we can find out definitively if it actually has any real effect on.... well, anything.

Especially seeing as my TV only supports HDMI 2.0, so it won't make a lick of difference to me for years.
 

JaseC64

Enlightened
Banned
Oct 25, 2017
11,008
Strong Island NY
The dream ain't dead!

Is that Kid Cerny? Lol
 

Spasm

Member
Nov 16, 2017
1,949
This is really a non-issue, as I have explained already. They don't have to do anything. 32Gb is enough to give you
  • 4K + 60Hz + HDR + 4:4:4
  • 4K + 120Hz + HDR + 4:2:2
  • 1080p + 120Hz + HDR + 4:4:4
How many games do you guys think will be running at 120Hz in 4K on these consoles?
And anyone fretting over 4:2:2 doesn't really need to. UHD Blu-rays are encoded on disc at YUV 4:2:0 and no one has ever raised a stink about their color accuracy.
 

KAMI-SAMA

Banned
Aug 25, 2020
5,496
Is this a hardware thing or a software thing?

Either way, I feel that for this launch, Sony seems a bit rushed on the hardware side of things (OS, packaging, etc.) for the PS5 but makes up for it with the software, while Microsoft is the complete opposite... their hardware & OS seem much more mature and stable, but then the software is not there.

Can't have it both ways, I guess.

There are "compromises" on both systems. The PS5 comes with high-speed USB ports, including a USB-C port. Also WiFi 6 and Bluetooth 5.1. Guess they both went after different things.
 

Ascenion

Prophet of Truth - One Winged Slayer
Member
Oct 25, 2017
10,281
Mecklenburg-Strelitz
Personally it ain't HDMI 2.1 if it's sub 48. So both basically screwed the pooch here. Never mind most TVs don't have full 48 ports.
 

Pheonix

Banned
Dec 14, 2018
5,990
St Kitts
And anyone fretting over 4:2:2 doesn't need to. UHD Blu-rays are encoded on disc at YUV 4:2:0 and no one has ever raised a stink about their color accuracy.
Exactly... this is such a non-issue that it's almost ridiculous if it causes concern to anyone.

What good would having the full 48Gb 2.1 bandwidth have done for the PS5?
 

Zafir

Member
Oct 25, 2017
7,204
And anyone fretting over 4:2:2 doesn't need to. UHD Blu-rays are encoded on disc at YUV 4:2:0 and no one has ever raised a stink about their color accuracy.
I mean, there are use cases where it matters, though. For example, if someone happens to be sitting rather close to the screen, it can be very noticeable on text.

For most users, probably not.
 

Pheonix

Banned
Dec 14, 2018
5,990
St Kitts
Personally it ain't HDMI 2.1 if it's sub 48. So both basically screwed the pooch here. Never mind most TVs don't have full 48 ports.
This isn't true.

Unless we also wanna say every LG CX OLED isn't HDMI 2.1 either, because they aren't 48Gb; they're 40Gb. Or all TVs that can't hit 1000 nits aren't HDR either...
 

CrispyGamer

Banned
Jan 4, 2020
2,774
They already mentioned 8K support in the future, so that would entail an HDMI software update and most likely VRR being added, but they haven't spoken about it yet. I don't understand why a journalist can't get an answer on that.
 

J-Wood

Member
Oct 25, 2017
5,887
So, question: if my TV and AV receiver are HDMI 2.0 (I think they are; the TV is a Sony X900E and I have a Yamaha receiver that's a couple of years old), that means the max I can output with HDR is 4:2:2, right? You can only do 4K HDR with 4:4:4 on an HDMI 2.1 setup?
 

Deleted member 1003

User requested account closure
Banned
Oct 25, 2017
10,638
We need to remember that the consoles we got at the beginning of last gen were not the same consoles we had at the end of the gen. At least as far as PlayStation is concerned, it changed quite a bit. So I doubt the PS5 we have now will be the same in a few years.
 

Deleted member 5876

Big Seller
Banned
Oct 25, 2017
2,559
This isn't true.

Unless we also wanna say every LG CX OLED isn't HDMI 2.1 either, because they aren't 48Gb; they're 40Gb. Or all TVs that can't hit 1000 nits aren't HDR either...

That's a bogus comparison.
There is a world of difference between a substandard "HDR" TV and one that isn't.
40 vs 48 is a different story.
 

Xx 720

Member
Nov 3, 2017
3,920
From what's been written, the Sony X900 was waiting for the HDMI VRR format to get finalized, so maybe that's why this hasn't been patched for PS5.
 

Ascenion

Prophet of Truth - One Winged Slayer
Member
Oct 25, 2017
10,281
Mecklenburg-Strelitz
This isn't true.

Unless we also wanna say every LG CX Oled also isn't HDMI 2.1 because they aren't 48Gb either. They are 40Gb. Or all TVs that can't hit 1000nits aren't HDR either...
I was saying that, in my opinion, HDMI 2.1 should comply with the full spec. So I don't consider either to really be 2.1, even if the chance of running into content requiring 48 is low to nonexistent. The sad thing here is that the C9 had 48Gb ports. LG downgraded the CX; they said they found most customers had little need for the full speed. I'm hoping this will be rectified with the C11.
 

Deleted member 5876

Big Seller
Banned
Oct 25, 2017
2,559
So, question: if my TV and AV receiver are HDMI 2.0 (I think they are; the TV is a Sony X900E and I have a Yamaha receiver that's a couple of years old), that means the max I can output with HDR is 4:2:2, right? You can only do 4K HDR with 4:4:4 on an HDMI 2.1 setup?

I have a similar setup.
I get 4K/60Hz/HDR/4:4:4 on Apple TV just fine.
The PS5, on the other hand... 4:2:2
 

Spasm

Member
Nov 16, 2017
1,949
So, question: if my TV and AV receiver are HDMI 2.0 (I think they are; the TV is a Sony X900E and I have a Yamaha receiver that's a couple of years old), that means the max I can output with HDR is 4:2:2, right? You can only do 4K HDR with 4:4:4 on an HDMI 2.1 setup?
I have a similar setup.
I get 4K/60Hz/HDR/4:4:4 on Apple TV just fine.
The PS5, on the other hand... 4:2:2
10-bit 4K60 over HDMI 2.0 is limited to 4:2:2 chroma subsampling, yes.
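The HDMI 2.0 side of this can be sketched with the TMDS clock limit. A minimal illustration (assuming the standard CTA-861 4K60 timing at 594 MHz and HDMI 2.0's 600 MHz TMDS ceiling; the names are my own, and this ignores EDID negotiation details):

```python
# Why 4K60 HDR over HDMI 2.0 falls back to 4:2:2. Assumes the standard
# CTA-861 4K60 timing (594 MHz pixel clock including blanking) and
# HDMI 2.0's 600 MHz TMDS character-rate ceiling.

TMDS_MAX_MHZ = 600
PIXEL_CLOCK_4K60_MHZ = 4400 * 2250 * 60 / 1e6  # 594.0 MHz

def tmds_clock_mhz(pixel_clock_mhz, bit_depth, chroma):
    if chroma == "4:2:2":
        # HDMI packs 4:2:2 (up to 12-bit) into the same 24 bits/pixel
        # as 8-bit RGB, so the TMDS clock doesn't scale with bit depth.
        return pixel_clock_mhz
    # RGB/4:4:4 deep color scales the TMDS clock by bit_depth / 8
    return pixel_clock_mhz * bit_depth / 8

print(tmds_clock_mhz(PIXEL_CLOCK_4K60_MHZ, 10, "4:4:4"))  # 742.5 -> too fast
print(tmds_clock_mhz(PIXEL_CLOCK_4K60_MHZ, 10, "4:2:2"))  # 594.0 -> fits
```

So on an HDMI 2.0 chain, 10-bit 4:4:4 would need a 742.5 MHz TMDS clock, which is over the limit, while 4:2:2 stays at 594 MHz and goes through.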
 
Dec 11, 2017
4,886
I was saying that, in my opinion, HDMI 2.1 should comply with the full spec. So I don't consider either to really be 2.1, even if the chance of running into content requiring 48 is low to nonexistent. The sad thing here is that the C9 had 48Gb ports. LG downgraded the CX; they said they found most customers had little need for the full speed. I'm hoping this will be rectified with the C11.
48Gb is needed for 12 bit panels, which aren't a thing yet.
 

Pheonix

Banned
Dec 14, 2018
5,990
St Kitts
That's a bogus comparison.
There is a world of difference between a substandard "HDR" TV and one that isn't.
40 vs 48 is a different story.
Please read the post I replied to.

Of course I know there is a world of difference in the quality of these things. But that's not what I was replying to. I was replying to the notion that something not meeting the full spec of something automatically means it's not that thing.

These are the two things Sony knows about the PS5: at 4K, most games will run at 30fps/60fps; at 120fps, pretty much all games will run at 1080p, with the exception of maybe some indie titles. The PS5 does not need 48Gb, even if it ends up getting it in an update down the road, since it does have the chipset for it.
 

Doctre81

Member
Oct 26, 2017
1,452
48Gb is needed for 12 bit panels, which aren't a thing yet.

True, but I think it might be something similar to how PS4 Pro games can still look slightly better than PS4 games even on a 1080p TV, and how "Mastered in 4K" Blu-rays look slightly better than the normal ones. TVs and devices that take advantage of the full 48Gbps could potentially result in better gradient handling despite still being on a 10-bit panel.
 

Pargon

Member
Oct 27, 2017
12,133
Called it last week:
Have Sony ever said that the PS5 is a 48 gbps device?
I'm starting to think it might be somewhere around 32–36 gbps, which should be enough for 4K120 4:2:2 but not 4:4:4.
My original estimate was only 32 gbps too, but I edited to "32–36 gbps" just to be on the safe side, as bandwidth calculators were producing results slightly above 32 gbps.
I'm guessing that's the result of those calculators being designed for TMDS rather than FRL, which is about 9% more efficient.
Hopefully this restriction is something that may change via a firmware update once VRR support is introduced.
Or perhaps they would be able to use display stream compression (DSC) to achieve 444 via 32 gbps if it's a hardware limitation rather than firmware.

Anyone wondering what the difference is between 4:4:4, 4:2:2, and 4:2:0 chroma subsampling should check out RTINGS' very informative page on the subject.
Bottom line is, unless you're using your display as a PC monitor, you will most likely never notice the difference in color accuracy between 4:4:4 and 4:2:2.
Personally I've always found the drop in chroma resolution noticeable in games; not only PC content.
It matters less the higher resolution you go, but is still noticeable to me. I've always chased down support for RGB/4:4:4 from the SNES onward.
It generally doesn't matter at all for video though.

This is why I've been frustrated that newer TVs started to ditch proper 4:4:4/RGB support in PC/Game Mode; e.g. LG's OLEDs suffering from color banding in the 4:4:4 PC mode, unless dropping chroma resolution to the 4:2:2 Game Mode.
That's also why the 40 gbps vs 48 gbps chipsets in the CX/C9 make no practical difference, if you can't use RGB/444 anyway.
But one of the main reasons I selected my current Sony TV is because it had no compromises to image quality in Game Mode, other than disabling motion interpolation to reduce latency.

First they refuse to include Dolby Atmos support because "Tempest 3D" but it only works with headphones, then they fail to include VRR at launch (sorely needed in AC:Valhalla), and now this.
Lots of little compromises over the years from their hardware, and its quite surprising and disappointing.
Yeah, it's very frustrating that they only have 3D Audio when you are connecting headphones to a controller or USB audio device, and cannot output 3D Audio over HDMI.
I expect they will add support for 3D Audio for headphones (2.0) over HDMI in an update, but not 3D Audio support for multichannel speaker setups (Atmos).

40 Gbps is theoretically just fine because it can do 4K 120Hz at 4:4:4 with 10-bit color. You need 48 Gbps to do all that at 12-bit color, but all the consumer TV panels on the market right now aren't 12-bit anyway, so it shouldn't make a difference. Now if you really get into the weeds of the subject, you'll find some enthusiasts out there who say color banding can be slightly reduced on some sets (depending on each set's color processing) by reducing 12-bit signals to 10-bit - so technically the full 48 Gbps might be better (or not) on a case-by-case basis, but that gets into real hair-splitting territory.
Essentially, a 10-bit signal is fine for a 10-bit panel if it's being displayed as-is.
But if you are applying processing to the signal (which all TVs must do) it has to be done at a higher bit-depth than the final output to minimize issues like color banding - so it's preferable to have a 12-bit signal for a 10-bit panel, and use greater than 12-bit processing; e.g. 16-bit.
The practical differences on the majority of today's displays are going to be negligible though.
Most of the 40 vs 48 discussion was related to LG OLEDs, and the 40 Gbps CX displays less color banding than the 48 Gbps C9 due to improved panels/processing anyway.
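The headroom argument above can be shown with a toy model (my own sketch, not any TV's actual pipeline; the tone curve is arbitrary): apply a nonlinear adjustment to an N-bit input, quantize to a 10-bit panel, and count how many of the panel's 1024 codes the gradient can actually hit.

```python
# Toy model of why extra signal bit depth helps when a TV applies
# processing: a monotone tone curve applied to an N-bit input, then
# quantized to a 10-bit panel. Curve and names are illustrative only.

def tone_curve(x):
    return x ** 0.9  # an arbitrary nonlinear adjustment on [0, 1]

def distinct_10bit_codes(input_bits):
    levels = 2 ** input_bits
    codes = {round(tone_curve(i / (levels - 1)) * 1023) for i in range(levels)}
    return len(codes)

# An 8-bit source can hit at most 256 of the 1024 panel codes, so a
# processed gradient shows steps; a 12-bit source covers nearly all
# of them, so the same gradient comes out smooth.
print(distinct_10bit_codes(8))
print(distinct_10bit_codes(12))
```

This is the same reason processing internally at 12-bit or 16-bit, as the post above describes, keeps banding out of the final 10-bit output.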

You don't need to. The cable that comes with the PS5 can support the full bandwidth of HDMI 2.1 because it's a 2.1 cable. Their embedded HDMI chipset is just limited to 32Gb bandwidth (vs the standard 48Gb of HDMI 2.1). And 32Gb with 12-bit color depth would do either 4K@60Hz in 4:4:4 or 4K@120Hz in 4:2:2.
But what Vince may have missed, and what a lot of people are ignoring here... is that 32Gb also means you can do 1080p@120Hz at 4:4:4. Why that is very important is that most if not all of the games that support a 120fps mode would be doing that at 1080p.
Think about that: how many games do you think would be doing 4K@120fps on these consoles?
VRR requires that you output 120Hz at all times for it to function correctly. So you would be outputting 120Hz even if the game is running at 4K30 - which might otherwise have frame-pacing issues without VRR.

This would explain why I've been having weird washed out/crushed blacks with 4K/120, right?
No.
It's possible that your TV has per-mode settings, though, and that 60Hz vs 120Hz (or, less likely, 4:2:2 vs 4:4:4) use different settings.
 