I was discussing this a little bit in the LG C9 thread, and am curious about those gaming on PC and the displays and settings they're using. Main reason being that while consoles are generally plug-and-play and automate their output accordingly, Windows is notoriously finicky, and there's an argument to be had about 8-bit vs 10-bit and tweaks in GPU driver control panels.
The questions I ask are thus:
1) What display are you using for HDR gaming via PC?
2) What GPU are you using?
3) Do you allow Windows and GPU control panels to automate output, or are you customising your settings?
4) If the latter, what settings are you using? (eg: RGB vs YCbCr420 vs YCbCr422 vs YCbCr444, 8-bit vs 10-bit vs 12-bit, etc)
The reason I ask is that from my tinkering and research there doesn't appear to be a solid consensus on what PC gamers should be doing. Some people leave it totally automated, others opt for specific settings (eg: through the Nvidia Control Panel). There are also reports of Windows not always handling HDR appropriately, and whatnot.
On my end, there are immediate differences if I play with the settings. For example, if I plug my PC into my LG C8 and boot up Gears 5, leaving the NVIDIA Control Panel settings automated and turning on HDR via Windows, the display says it is running in HDR and receiving an HDR signal, but the image has what appear to be extremely crushed blacks, closer to what I'd expect from an RGB signal. Meanwhile, if I use the control panel to change the colour output to YCbCr422 10-bit, which is the same signal the PS4 Pro sends when outputting HDR, the image quality changes while the display still says it's using HDR.
There appears to be a lot of misinformation and misunderstanding going around (myself included!) pertaining to 8-bit vs 10-bit, RGB vs variations of YCbCr, and what exactly people should be aiming for to tailor their signal and display for the most accurate image quality.
Ergo, this thread.
EDIT: Two really good explanations.
Copy-pasting something I wrote some time ago.
You've probably been playing games or watching content in HDR via your PC while missing a critical component – 10-bit video.
Windows 10, unlike game consoles, does not auto-switch the bit depth of the outgoing video when launching HDR games.
By default, both NVIDIA and AMD GPUs are configured to output RGB 8-bit.
You might be wondering, "But my TV turns on its HDR mode and games look better" – this is indeed true. HDR is a collection of different pieces that, when working together, create the HDR effect. Your PC is sending the WCG (Wide Color Gamut)/BT.2020 metadata as well as other information to the TV, which triggers its HDR mode, but the PC is still only sending an 8-bit signal.
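To see why that last bit matters, here's a quick sketch (mine, not part of the original post) using the standard SMPTE ST 2084 (PQ) EOTF – the transfer function HDR10 uses – to compare the luminance jump between adjacent code values at 8-bit vs 10-bit. Coarser jumps near black are exactly the kind of thing that reads as banding or crushed shadows:

```python
# Sketch: luminance step between adjacent code values under the PQ (ST 2084) EOTF.
# Constants come straight from the ST 2084 spec; exact output values are illustrative.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(v):
    """Map a normalised code value v in [0, 1] to absolute luminance in cd/m^2."""
    p = v ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

for bits in (8, 10):
    levels = 2 ** bits
    # Luminance jump between two neighbouring codes around a dark grey (~5% signal)
    code = round(0.05 * levels)
    step = pq_eotf((code + 1) / (levels - 1)) - pq_eotf(code / (levels - 1))
    print(f"{bits}-bit: step near black = {step:.4f} cd/m^2")
```

The 8-bit steps are roughly four times coarser than the 10-bit ones, which is why dark gradients are where a "fake" 8-bit HDR signal gives itself away first.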
How to output 10-bit video on an NVIDIA GPU
NVIDIA GPUs have some quirks when it comes to which bit depths can be output with which formats. The list is as follows:
- RGB/YUV444:
  - 8-Bit
  - 12-Bit
- YUV422:
  - 8-Bit
  - 10-Bit
  - 12-Bit

What does this mean for you? Not much – 12-bit has the same bandwidth requirements as 10-bit. If you do require RGB/YUV444 and send a 12-bit signal to the TV, that signal still only contains 10-bit data in a 12-bit container. The TV will convert the signal back down to 10-bit.
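The bandwidth point is easy to sanity-check with some napkin maths. This sketch is my own (not from the quoted post) and assumes HDMI 2.0's 600 MHz TMDS clock ceiling, the standard 594 MHz pixel clock for 3840x2160@60, and the HDMI rule that 4:2:2 always travels in a 12-bit container at the base pixel clock:

```python
# Rough HDMI 2.0 sanity check: which 4K60 formats fit under the ~600 MHz TMDS limit?
# Assumptions: 594 MHz pixel clock for 3840x2160@60 (CTA-861 timing); deep-colour
# RGB/4:4:4 scales the TMDS clock by bpc/8; HDMI carries 4:2:2 in a fixed 12-bit
# container at the base pixel clock regardless of the source bit depth.
TMDS_LIMIT_MHZ = 600.0
PIXEL_CLOCK_4K60_MHZ = 594.0

def tmds_clock(fmt: str, bpc: int) -> float:
    if fmt in ("RGB", "YUV444"):
        return PIXEL_CLOCK_4K60_MHZ * bpc / 8   # deep colour raises the clock
    if fmt == "YUV422":
        return PIXEL_CLOCK_4K60_MHZ             # fixed container, no clock increase
    raise ValueError(fmt)

for fmt, bpc in [("RGB", 8), ("RGB", 10), ("RGB", 12), ("YUV422", 10), ("YUV422", 12)]:
    clk = tmds_clock(fmt, bpc)
    verdict = "OK" if clk <= TMDS_LIMIT_MHZ else "exceeds HDMI 2.0"
    print(f"{fmt} {bpc}-bit @ 4K60: {clk:.1f} MHz -> {verdict}")
```

So at 4K60 over HDMI 2.0, anything above 8-bit in RGB/4:4:4 blows the clock budget, while 4:2:2 never needs more than the base clock – which is why true 10-bit means dropping to YUV422, as the next paragraph says.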
However, if you want to output true 10-bit, you'll need to step down to a YUV422 signal. Again, not the end of the world: at normal TV viewing distances (and even on 4K monitors) it is very difficult to tell the difference between 4:4:4 and 4:2:2.
The recommended setting in this case is YUV422, at 10-bit, for both SDR and HDR. This makes the switch seamless and doesn't require you to do any extra work.
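If you're curious what 4:2:2 actually throws away, here's a toy numpy sketch (my own, using BT.709 coefficients) that round-trips an image through horizontal chroma halving. Luma keeps full resolution, which is why overall sharpness survives and only fine colour transitions soften:

```python
import numpy as np

def simulate_422(rgb: np.ndarray) -> np.ndarray:
    """Round-trip an RGB image (H x W x 3, floats in [0,1]) through 4:2:2 subsampling."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # RGB -> YCbCr with BT.709 luma coefficients
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    cb, cr = (b - y) / 1.8556, (r - y) / 1.5748
    # 4:2:2 keeps full-resolution luma but halves chroma horizontally:
    # average each horizontal pair of chroma samples, then repeat to restore width.
    w = rgb.shape[1] // 2 * 2  # even width for clean pairing
    for c in (cb, cr):
        pairs = c[:, :w].reshape(c.shape[0], -1, 2).mean(axis=2)
        c[:, :w] = np.repeat(pairs, 2, axis=1)
    # YCbCr -> RGB
    r2, b2 = y + 1.5748 * cr, y + 1.8556 * cb
    g2 = (y - 0.2126 * r2 - 0.0722 * b2) / 0.7152
    return np.clip(np.stack([r2, g2, b2], axis=-1), 0, 1)

# Worst case for 4:2:2: alternating red/blue columns (pure single-pixel chroma detail)
img = np.zeros((2, 8, 3))
img[:, ::2, 0] = 1.0
img[:, 1::2, 2] = 1.0
print(np.abs(simulate_422(img) - img).max())  # large error on this pathological pattern
```

The alternating red/blue pattern is the pathological case; on real content, where chroma rarely flips every pixel, the round-trip error is far smaller, which is the basis of the viewing-distance claim above.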
How to configure NVIDIA GPUs
- Right-click on the Windows desktop
- Open the NVIDIA Control Panel
- On the left side, click on Change resolution
- Click on the Output Color Format dropdown menu and select YUV422
- Click on Apply
- Now click on the Output Color Depth dropdown menu and select 10bpc (bits per color)
- Click on Apply

That's it. Your GPU is now outputting YUV422 10-bit video to your TV or monitor.
Now launch an HDR game and you'll see the full 10-bit color depth!
I'm hoping to do a full video on this. I've got some new equipment on its way, so I can do some extensive testing on setup options.
LtRoyalShrimp and I did some testing between us a few weeks ago to consolidate this info – see his excellent post above.
But as a rule of thumb, you want to aim for:
- 10-bit/12-bit
- YUV422

This should be your gold standard.
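If it helps to see that advice as logic, here's a tiny illustrative helper (entirely hypothetical names, just encoding the quirk table and rule of thumb above):

```python
# Hypothetical helper encoding the quirk table and rule of thumb above.
# NVIDIA's supported (format, bit depth) combinations, per the quoted list:
SUPPORTED = {("RGB/YUV444", 8), ("RGB/YUV444", 12),
             ("YUV422", 8), ("YUV422", 10), ("YUV422", 12)}

def pick_hdr_output() -> tuple[str, int]:
    """Prefer true 10-bit, stepping down chroma before stepping down bit depth."""
    for fmt, bits in [("RGB/YUV444", 10),  # ideal, but not offered by the driver
                      ("YUV422", 10),      # the gold standard from this thread
                      ("YUV422", 12)]:     # same bandwidth; 10-bit data, 12-bit box
        if (fmt, bits) in SUPPORTED:
            return fmt, bits
    return "RGB/YUV444", 8                 # SDR fallback

print(pick_hdr_output())  # -> ('YUV422', 10)
```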
Enable HDR in Windows before you boot a game (I leave mine on all the time).
Games typically behave more favourably in exclusive fullscreen, so use it for HDR where available.
In the NVIDIA Control Panel there are two sets of resolution options and timings: the UHD standards and the PC standards. Always try the UHD options first if you can; this will likely iron out some of the quirks that can occur.