Called it last week:
Have Sony ever said that the PS5 is a 48 Gbps device?
I'm starting to think it might be somewhere around 32–36 Gbps, which should be enough for 4K120 4:2:2 but not 4:4:4.
My original estimate was only 32 Gbps too, but I edited it to "32–36 Gbps" just to be on the safe side, as bandwidth calculators were producing results slightly above 32 Gbps.
I'm guessing that's the result of those calculators being designed for TMDS rather than FRL, which is about 9% more efficient (8b/10b vs 16b/18b coding, i.e. roughly 80% vs 89% payload efficiency).
Hopefully this restriction is something that can change via a firmware update once VRR support is introduced.
Or perhaps they could use Display Stream Compression (DSC) to achieve 4:4:4 within 32 Gbps, if it's a hardware limitation rather than a firmware one.
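For what it's worth, here's a rough sanity check of those figures, a sketch in Python. It assumes standard CTA-861 total timings (including blanking), HDMI's fixed 24 bpp container for 4:2:2, and FRL's 16b/18b line coding; real links have some extra framing overhead beyond this, so treat the numbers as approximate:

```python
# Rough HDMI FRL bandwidth estimator -- an approximation, not the spec's
# exact accounting (FRL adds framing/FEC overhead beyond 16b/18b coding).

TIMINGS = {              # (h_total, v_total) including blanking, per CTA-861
    "4K":    (4400, 2250),
    "1080p": (2200, 1125),
}

def bits_per_pixel(bpc, chroma):
    if chroma == "4:4:4":
        return 3 * bpc
    if chroma == "4:2:2":
        return 24        # HDMI carries 4:2:2 in a fixed 12-bit container
    if chroma == "4:2:0":
        return bpc * 3 // 2
    raise ValueError(chroma)

def frl_link_gbps(res, fps, bpc, chroma):
    h, v = TIMINGS[res]
    payload = h * v * fps * bits_per_pixel(bpc, chroma) / 1e9
    return payload * 18 / 16      # add FRL 16b/18b coding overhead

print(round(frl_link_gbps("4K", 120, 10, "4:4:4"), 1))  # ~40.1 -> needs a 40 Gbps link
print(round(frl_link_gbps("4K", 120, 12, "4:4:4"), 1))  # ~48.1 -> needs the full 48 Gbps
print(round(frl_link_gbps("4K", 120, 12, "4:2:2"), 1))  # ~32.1 -> right in the 32-36 Gbps range
```

Note the 4:2:2 figure landing just above 32 Gbps, which matches what the online calculators were producing.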
Anyone wondering what the difference is between 4:4:4, 4:2:2, and 4:2:0 chroma subsampling should check out
Rtings' very informative page on the subject.
Bottom line is, unless you're using your display as a PC monitor, you will most likely never notice the difference in color accuracy between 4:4:4 and 4:2:2.
Personally, I've always found the drop in chroma resolution noticeable in games, not only PC content.
It matters less the higher the resolution, but it's still noticeable to me. I've always chased down support for RGB/4:4:4, from the SNES onward.
It generally doesn't matter at all for video though.
This is why I've been frustrated that newer TVs started to ditch proper 4:4:4/RGB support in PC/Game Mode; e.g. LG's OLEDs suffer from color banding in the 4:4:4 PC mode unless you drop chroma resolution by switching to the 4:2:2 Game Mode.
That's also why the 40 Gbps vs 48 Gbps chipsets in the CX/C9 make no practical difference, if you can't use RGB/4:4:4 anyway.
But one of the main reasons I selected my current Sony TV is because it had no compromises to image quality in Game Mode, other than disabling motion interpolation to reduce latency.
First they refuse to include Dolby Atmos support because of "Tempest 3D" (which only works with headphones), then they fail to include VRR at launch (sorely needed in AC: Valhalla), and now this.
Lots of little compromises over the years from their hardware, and it's quite surprising and disappointing.
Yeah, it's very frustrating that they only have 3D Audio when you are connecting headphones to a controller or USB audio device, and cannot output 3D Audio over HDMI.
I expect they will add support for 3D Audio for headphones (2.0) over HDMI in an update, but not 3D Audio support for multichannel speaker setups (Atmos).
40 Gbps is theoretically just fine, because it can do 4K 120Hz at 4:4:4 with 10-bit color. You need 48 Gbps to do all that at 12-bit color, but none of the consumer TV panels on the market right now are 12-bit anyway, so it shouldn't make a difference.

Now, if you really get into the weeds of the subject, you'll find some enthusiasts out there who say color banding can be slightly reduced on some sets (depending on each set's color processing) by sending a 12-bit signal and letting the set reduce it to 10-bit - so technically the full 48 Gbps might be better (or not) on a case-by-case basis, but that gets into real hair-splitting territory.
Essentially, a 10-bit signal is fine for a 10-bit panel if it's being displayed as-is.
But if you are applying processing to the signal (which all TVs must do), it has to be done at a higher bit depth than the final output to minimize issues like color banding - so it's preferable to have a 12-bit signal for a 10-bit panel, and to process at greater than 12-bit, e.g. 16-bit.
The practical differences on the majority of today's displays are going to be negligible though.
Most of the 40 vs 48 Gbps discussion was related to LG OLEDs, and the 40 Gbps CX displays less color banding than the 48 Gbps C9 due to improved panels/processing anyway.
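A toy illustration of that processing point (this is not any TV's actual pipeline, just a made-up gamma round trip to show why intermediate precision matters): push a 10-bit ramp through an adjustment and its inverse, quantizing the intermediate result at a given working bit depth, and count how many distinct levels survive.

```python
# Toy sketch: processing at the panel's own bit depth collapses distinct
# levels (visible as banding); working with extra headroom preserves them.
# The gamma round trip is an arbitrary stand-in for a TV's picture processing.

def surviving_levels(work_bits, in_bits=10, gamma=2.4):
    n = 2 ** in_bits
    out = set()
    for v in range(n):
        x = v / (n - 1)
        y = x ** gamma                  # first processing step
        q = 2 ** work_bits - 1
        y = round(y * q) / q            # quantize at the working bit depth
        z = y ** (1 / gamma)            # second processing step
        out.add(round(z * (n - 1)))     # back to the panel's 10-bit depth
    return len(out)

low = surviving_levels(10)    # process at the panel's own depth
high = surviving_levels(16)   # process with 6 bits of headroom
print(low < high)             # True: more gradation survives with headroom
```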
You don't need to. The cable that comes with the PS5 can support the full bandwidth of HDMI 2.1 because it's a 2.1 cable; their embedded HDMI chipset is just limited to 32 Gbps of bandwidth (vs the full 48 Gbps of HDMI 2.1). And 32 Gbps with 12-bit color depth would do either 4K@60Hz in 4:4:4 or 4K@120Hz in 4:2:2.
But what Vince may have missed, and what a lot of people are ignoring here... is that 32 Gbps also means you can do 1080p@120Hz at 4:4:4. Why that is very important is that most if not all of the games that support a 120fps mode would be doing so at 1080p.
Think about it: how many games do you think would be doing 4K@120fps on these consoles?
VRR requires that you output 120Hz at all times for it to function correctly. So you would be outputting 120Hz even if the game is running at 4K30 - which might otherwise have frame-pacing issues without VRR.
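A quick back-of-the-envelope check on those 32 Gbps claims (approximate, assuming standard CTA-861 total timings including blanking; a 32 Gbps FRL link carries roughly 32 × 16/18 ≈ 28.4 Gbps of payload after line coding, and HDMI carries 4:2:2 in a fixed 24 bpp container):

```python
# Payload rates (Gbps) for the modes mentioned above -- approximate,
# not the spec's exact accounting of FRL overhead.

def payload_gbps(h_total, v_total, fps, bits_per_pixel):
    return h_total * v_total * fps * bits_per_pixel / 1e9

print(round(payload_gbps(4400, 2250, 60, 36), 1))   # 4K60 4:4:4 12-bit -> ~21.4, fits
print(round(payload_gbps(4400, 2250, 120, 24), 1))  # 4K120 4:2:2 -> ~28.5, right at the limit
print(round(payload_gbps(2200, 1125, 120, 36), 1))  # 1080p120 4:4:4 12-bit -> ~10.7, easy
```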
This would explain why I've been getting weird washed-out/crushed blacks with 4K/120, right?
No.
It's possible, though, that your TV has per-mode picture settings, and that 60Hz vs 120Hz (or, less likely, 4:2:2 vs 4:4:4) end up using different settings.