
EatChildren

Wonder from Down Under
Member
Oct 27, 2017
7,038
I was discussing this a little bit in the LG C9 thread, and am curious about those gaming on PC and the displays and settings they're using. Main reason being that while consoles are generally plug-and-play and automate their output accordingly, Windows is notoriously finicky, and there's an argument to be had about 8-bit vs 10-bit and tweaks in GPU driver control panels.

The questions I ask are thus:
1) What display are you using for HDR gaming via PC?
2) What GPU are you using?
3) Do you allow Windows and GPU control panels to automate output, or are you customising your settings?
4) If the latter, what settings are you using? (eg: RGB vs YCbCr420 vs YCbCr422 vs YCbCr444, 8-bit vs 10-bit vs 12-bit, etc)

The reason I ask is that from my tinkering and research there doesn't appear to be a solid consensus on what PC gamers should be doing. Some people leave it totally automated, others opt for specific settings (eg: through the Nvidia Control Panel). There are also reports of Windows not always handling HDR appropriately and whatnot.

On my end there are immediate differences if I play with the settings. For example, if I plug my PC into my LG C8 and boot up Gears 5, while leaving the NVIDIA control panel settings automated, and turn on HDR via Windows, the display says it is running in HDR and receiving an HDR signal, but the image has what appear to be extremely crushed blacks, more like an RGB signal. Meanwhile, if I use the control panel to change the colour output to YCbCr422 10-bit, which is the same signal the PS4 Pro sends when outputting HDR, the image quality changes while the display still says it's using HDR.

There appears to be a lot of misinformation and misunderstanding going around (myself included!) pertaining to 8-bit vs 10-bit, RGB vs variations of YCbCr, and what exactly people should be aiming for to tailor their signal and display for the most accurate image quality.

Ergo, this thread.

EDIT: See the two really good explanations from Videophile and EvilBoris further down the thread.

 
Last edited:

GangWarily

Member
Oct 25, 2017
904
I use the following:

1. PG35VQ
2. 2080 TI
3. I don't...and afaik, I have to manually enable HDR when I'm playing 99% of HDR games on Windows. I think Doom Eternal may have been one of the rare ones where it turned HDR on automatically when it's enabled in the game and turned it off once you're out.
4. I use RGB 8-bit since 10-bit caps out at 144 Hz (the monitor goes up to 200 Hz). I've read that 8-bit can't do HDR / that it sacrifices quality somehow (something to do with dithering?) and I honestly could not tell the difference.

I've also played a fair bit of HDR games on my B7 but Game Mode in HDR makes everything super dim for some reason :S
 

scitek

Member
Oct 27, 2017
10,099
1) Vizio P50-C1
2) RTX 2080
3) Nvidia color settings
4) YCbCr422 12-bit, which my set downsamples to 10-bit.
 

collige

Member
Oct 31, 2017
12,772
I haven't really messed around too much with HDR gaming yet since support is kind of a clusterfuck. That said:

1) Razer Raptor
2) GTX 1080ti
3/4) I have it set to RGB, 10-bit, full dynamic range in the Nvidia control panel. YCbCr422 and YCbCr444 only give me the option for limited range.
 
Last edited:

Jayde Six

Member
Feb 2, 2019
375
1) Sony Bravia X950H
2) Radeon RX 580
3) Customizing. Also using a script to toggle Windows-level HDR on/off; Steam and movies engage HDR automatically, Windows Store games need the script.
4) YCbCr 4:2:0 10-bit
 

SliChillax

Member
Oct 30, 2017
2,148
Tirana, Albania
1) LG C9
2) RTX 2080ti
3) 422, 12 bit
I've set the TV to recognize my PC as a normal input, not as a PC input, as I hear that causes some issues with image quality. I do use game mode though.

From the tests that I've done comparing the same game on consoles, I haven't seen any difference with PC HDR, which is a good thing. Bring on the RTX 3xxx cards with HDMI 2.1 so we can end this headache.
 

craven68

Member
Jun 20, 2018
4,555
1) Samsung 55Q6FN
2) 1080TI
3) I use the Windows setting to manually turn HDR on or off; before, it came on automatically in some games. On the Nvidia side, I leave it on automatic too.
I'm going to try YCbCr 4:2:0 10-bit again. If I remember right, in a lot of games before it wasn't always better (but that was on my older TV, maybe it's different now).
 

maenckman

Member
Dec 3, 2018
222
1) Sony Bravia XE93
2) RTX 2080ti
3/4) 422, 10 bit
For me there is no big difference between RGB (which is 8 bit with dithering, according to Windows) and YCbCr422 except there is less banding with the latter. But I remember a discussion about this in the HDR games analyzed thread. If you want to go down that rabbit hole ;)
 

Deimos

Member
Oct 25, 2017
5,787
1) LG C9
2) GTX 1060
3) 4) YCbCr422 12-bit

Nvidia's default HDR settings are RGB if I'm remembering right. I tested every mode and 422 12 bit resulted in the best colors and least banding.
 

Edgar

User requested ban
Banned
Oct 29, 2017
7,180
1) Sony X900E
2) RTX 2080ti
3) 422, 12-bit, limited range. Nine times out of ten I enable HDR in Windows on my own. For SDR I use RGB, full range, 8-bit color.
I am still not sure what I'm doing is right, but it seems to work okay for me, unless there's anyone willing to share proper settings.
 

brain_stew

Member
Oct 30, 2017
4,737
Honestly, I tend not to bother with my LG B8. I prefer to take advantage of the better motion resolution I get from black frame insertion, and that doesn't mesh well with HDR due to the lowered peak brightness. HDR in Windows is a finicky mess, so I prefer to just leave it switched off for now; I may give it another go in future.

Never combine 8 bit and HDR though, you're going to introduce hideous banding if you do.
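To put some rough numbers on that, here's a quick sketch of the maths (my own figures, using the ST 2084 "PQ" curve that HDR10 is based on, so treat it as an illustration rather than a measurement):

Code:
# Why 8-bit + HDR bands so easily: HDR10 encodes absolute brightness on the
# PQ (SMPTE ST 2084) curve, so fewer code values means bigger jumps between
# adjacent shades. The constants below are the standard ST 2084 ones.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(v):
    """ST 2084 EOTF: normalised signal (0..1) -> brightness in nits."""
    p = v ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

for bits in (8, 10):
    step = 1 / (2 ** bits - 1)   # distance between adjacent code values
    v = 0.5                      # roughly a 90-100 nit mid-tone
    jump = pq_to_nits(v + step) - pq_to_nits(v)
    print(f"{bits}-bit: next shade above ~{pq_to_nits(v):.0f} nits is {jump:.2f} nits brighter")

The 8-bit step comes out around four times the size of the 10-bit one, which is exactly the kind of jump dithering is there to hide.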
 

Megasoum

Member
Oct 25, 2017
22,596
I use a TCL Series 6 tv with a GTX 1080.

I have never played around with the NVIDIA settings... should I?

The main thing I have noticed is that the vast majority of games I have tried so far will work in HDR without me having to first enable it in the global Windows settings for my TV.

The two current exceptions I have are Shadow of the Tomb Raider and F1 2020. For those I need to go into the Settings app and turn it on/off before and after I play the game, which is annoying... I looked but couldn't find a programmatic way to toggle that setting in Windows, so I can't seem to automate it.
 
Guide for enabling 10-bit HDR in Windows

Videophile

Tech Marketing at Elgato
Verified
Oct 27, 2017
63
San Francisco, CA
Copy pasting something I wrote some time ago.

You've probably been playing games or watching content in HDR via your PC while missing a critical component – 10-bit video.

Windows 10, unlike game consoles, does not auto-switch the bit depth of the outgoing video when launching HDR games.

By default, both NVIDIA and AMD GPUs are configured to output RGB 8-bit.

You might be wondering, "But my TV turns on its HDR mode and games look better" – this is indeed true. HDR is a collection of different pieces that, when working together, create the HDR effect. Your PC is sending the WCG (Wide Color Gamut)/BT.2020 metadata as well as other information to the TV, which triggers its HDR mode, but the PC is still only sending an 8-bit signal.


How to output 10-bit video on an NVIDIA GPU

NVIDIA GPUs have some quirks when it comes to which bit depths can be output with which color formats when connected to a display via HDMI.

HDMI supported formats:
  • RGB/YUV444:
    • 8-Bit
    • 12-Bit
  • YUV422:
    • 8-Bit
    • 10-Bit
    • 12-Bit
What does this mean for you? Not much – 12-bit has the same bandwidth requirements as 10-bit. If you do require RGB/YUV444 and send a 12-bit signal to the TV, that signal still only contains 10 bits of color information, sent in a 12-bit container. The TV will convert the signal back down to 10-bit.

However, if you want to output true 10-bit, then you'll need to step down to a YUV422 signal. Again, not the end of the world. At normal TV viewing distances (and even on 4K monitors) it is very difficult to tell the difference between 4:4:4 and 4:2:2.

The recommended setting in this case is YUV422 video, at 10-bit, for both SDR and HDR. This will make the switch seamless and does not require you to do any extra work.
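If you want to sanity-check the bandwidth side of this, here's a rough back-of-the-envelope sketch (my own numbers for the standard 4K60 HDMI timing, ignoring newer tricks like DSC):

Code:
# Rough HDMI 2.0 bandwidth check at 4K60. HDMI 2.0 carries 18 Gbps TMDS,
# which after 8b/10b encoding leaves ~14.4 Gbps for actual video data.
# The CTA 4K60 timing uses a 594 MHz pixel clock (blanking included).
PIXEL_CLOCK = 594e6        # Hz
HDMI20_LIMIT = 14.4e9      # usable bits per second

# Bits on the wire per pixel. Note that HDMI carries 4:2:2 in a fixed
# 24-bit-per-pixel container whether the source is 8-, 10- or 12-bit,
# which is why 10-bit and 12-bit cost the same here.
formats = {
    "RGB/4:4:4 8-bit": 24,
    "RGB/4:4:4 10-bit": 30,
    "RGB/4:4:4 12-bit": 36,
    "YUV 4:2:2 8/10/12-bit": 24,
    "YUV 4:2:0 10-bit": 15,
}

for name, bpp in formats.items():
    rate = PIXEL_CLOCK * bpp
    verdict = "fits" if rate <= HDMI20_LIMIT else "too much for HDMI 2.0"
    print(f"{name:<22} {rate / 1e9:5.2f} Gbps -> {verdict}")

In other words, at 4K60 over HDMI 2.0 the only way to get a true 10-bit signal through is with chroma subsampling.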

How to configure NVIDIA GPUs

  1. Right-click on the Windows desktop
  2. Open the NVIDIA Control Panel
  3. On the left side, click on Resolutions
  4. Click on the Output Color Format dropdown menu and select YUV422
  5. Click on Apply
  6. Now click on the Output Color Depth dropdown menu and select 10 bpc (bits per color)
  7. Click on Apply
That's it. Your GPU is now outputting YUV422 10-bit video to your TV or monitor.

Now launch an HDR game and you'll see the full 10-bit color depth!
 
Last edited:

chrisypoo

Member
Oct 27, 2017
3,457
I just wanted to sincerely thank you for this. This is literally exactly what I needed to solve my PC HDR issues, and you're the first person I've seen that explained everything so simply and succinctly.

Really, seriously, thanks.
 

Elven_Star

Member
Oct 27, 2017
3,983
So, no need to turn on the HDR thing in Windows display settings?
 

Mukrab

Member
Apr 19, 2020
7,556
I use the following:

1. PG35VQ
2. 2080 TI
3. I don't...and afaik, I have to manually enable HDR when I'm playing 99% of HDR games on Windows. I think Doom Eternal may have been one of the rare ones where it turned HDR on automatically when it's enabled in the game and turned it off once you're out.
4. I use RGB 8-bit since 10-bit caps out at 144 Hz (the monitor goes up to 200 Hz). I've read that 8-bit can't do HDR / that it sacrifices quality somehow (something to do with dithering?) and I honestly could not tell the difference.

I've also played a fair bit of HDR games on my B7 but Game Mode in HDR makes everything super dim for some reason :S
It is unbelievable to me that more games didn't do what Doom did and let you enable HDR without having it enabled in Windows. I think Wolfenstein did it too.
 

Edgar

User requested ban
Banned
Oct 29, 2017
7,180
I can select 12-bit 422. Is there any practical difference between 12-bit and 10-bit 422?
 

strife85

Member
Oct 30, 2017
1,476
I am using a VG27AQ and HDR looks washed out ;( I changed the settings to YUV422 and 10-bit. When I turn HDR on in Windows it's just washed out colors.
 
EvilBoris weighs in

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,694
I'm hoping to do a full video on this.
I've got some new equipment on its way, so I can do some extensive testing of setup options.

LtRoyalShrimp and I did some testing between us a few weeks ago to consolidate this info; see his excellent post above.

But as a rule of thumb:

You want to aim for:
10-bit/12-bit
YUV422
This should be your gold standard.

Enable HDR in Windows before you boot a game (I leave mine on all the time).

Games typically behave more favourably in fullscreen exclusive, so use this for HDR where available.

In the NVIDIA CP, there are two sets of resolution options and timings: the UHD standard options and the PC standards. Always try to use the UHD options first if you can; this will likely iron out some of the quirks that can occur.
 

strife85

Member
Oct 30, 2017
1,476
OK, I think I got HDR working on the VG27AQ. It doesn't look that much different, but the colors aren't as washed out in Destiny 2 now. They look a bit brighter is all. I changed the color settings in the NVIDIA control panel: basically put contrast to 100% and left the other stuff as is.
 
OP

EatChildren

Wonder from Down Under
Member
Oct 27, 2017
7,038
Really fantastic posts. Gonna bookmark a couple. Thanks guys. The main reason I made the thread is that it's really hard to find a consensus, and if we're getting some form of one here, that's great. Also thank you for the explanations as to why certain settings are best.
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
1. Pixio 277h 1440p
2. 5700XT
3. some games auto trigger and some you have to use the windows slider
4. 10 bit limited
 

tokkun

Member
Oct 27, 2017
5,420
However, if you want to output true 10-bit, then you'll need to step down to a YUV422 signal. Again, not the end of the world. At normal TV viewing distances (and even on 4K monitors) it is very difficult to tell the difference between 4:4:4 and 4:2:2.

The recommended setting in this case is YUV422 video, at 10-bit, for both SDR and HDR. This will make the switch seamless and does not require you to do any extra work.

For games and movies, yes. But I wouldn't want to leave my chroma settings at 4:2:2 for desktop applications like web browsing.
 

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,694
For games and movies, yes. But I wouldn't want to leave my chroma settings at 4:2:2 for desktop applications like web browsing.

With the C9/CX the 444 handling for HDR is super poor, and I would challenge anybody who says they can see the difference between 422 and 444 at 4K resolution at a typical viewing distance for a 48-inch+ screen.

Besides, simply running the content at 10-bit quadruples the number of shades per channel, before it is reduced by half in the chroma channel.
So the difference is far less pronounced than a straight 8-bit vs 8-bit comparison.
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
Does anyone know whether it makes sense to use full 444/RGB if you are running HDR at 1440p?

I believe the bandwidth is there to do so, but there is not much info on HDR at lower-than-4K resolutions.
 

plagiarize

It's not a loop. It's a spiral.
Moderator
Oct 25, 2017
27,610
Cape Cod, MA
I'll have to check when I get home. I'm using an HDR monitor that connects with display port, so I'm not sure that handles things the same way. When I connect to my 4K tv for hdr, I go the 422 route, but I'm pretty sure I don't have my PC set the same way with my monitor because 422 at the distance I sit from my monitor *is* readily apparent.

Are there any good test screens for banding, etc?
 
Oct 28, 2017
1,715
In the NVIDIA CP, there are two sets of resolution options and timings: the UHD standard options and the PC standards. Always try to use the UHD options first if you can; this will likely iron out some of the quirks that can occur.

Your and LtRoyalShrimp's posts are very useful, thanks.

Would you be able to clarify this though - which area is which in the NVCP?
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
I'll have to check when I get home. I'm using an HDR monitor that connects with display port, so I'm not sure that handles things the same way. When I connect to my 4K tv for hdr, I go the 422 route, but I'm pretty sure I don't have my PC set the same way with my monitor because 422 at the distance I sit from my monitor *is* readily apparent.

Are there any good test screens for banding, etc?

I recall a more general banding test I found just with a Google search, but that was more to test whether I had a good panel or not on my B7 (it's perfect).
 

Elven_Star

Member
Oct 27, 2017
3,983
Yeah, I tinkered around a bit with it and have decided it's not worth the trouble (2018 TCL). Back to RGB full.
 

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,694
Your and LtRoyalShrimp's posts are very useful, thanks.

Would you be able to clarify this though - which area is which in the NVCP?

Just in the regular resolution selection area.

It will say Ultra HD, HD, SD - these are the TV resolutions.

Then further down the list you'll see the sub-heading "PC".

These are the VESA standard PC resolutions.
 

Megasoum

Member
Oct 25, 2017
22,596
So is there a way to programmatically enable/disable HDR at the Windows level?

Or, I guess, side question... Is there a way to not make my TV look like crap when I enable HDR in Windows but am not watching actual HDR content?
 

Kyle Cross

Member
Oct 25, 2017
8,457
Just in the regular resolution selection area.

It will say Ultra HD, HD, SD - these are the TV resolutions.

Then further down the list you'll see the sub-heading "PC".

These are the VESA standard PC resolutions.
How can we be sure that a game in fullscreen is using the TV resolutions instead of the PC ones? For example, in my NVCP, 4K under TV tops out at 60 Hz and allows YUV422 10-bit, but 4K under PC only allows YUV420 8-bit at 100 Hz or 120 Hz. Needless to say I want the TV version, but in-game settings seem to have no differentiation.
 

Vasto

Member
May 26, 2019
342
Been waiting for a thread covering this.

1. Samsung MU6300 4K HDR
2. Radeon 5700XT

Can somebody tell me what I need to set my color depth / pixel format to in order to get HDR on PC? Right now I have it set to 8 Bit Color Depth and RGB 4:4:4 Pixel Format PC Standard ( Full RGB ).
 

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,694
How can we be sure that a game in fullscreen is using the TV resolutions instead of the PC ones? For example, in my NVCP, 4K under TV tops out at 60 Hz and allows YUV422 10-bit, but 4K under PC only allows YUV420 8-bit at 100 Hz or 120 Hz. Needless to say I want the TV version, but in-game settings seem to have no differentiation.

That's a really good question. I guess games can access the same list of resolutions too, and I absolutely bet that is why people get weirdness on PC with HDR.
I suppose one thing to do is edit the resolutions your display's EDID offers in something like CRU.
 

Vasto

Member
May 26, 2019
342
Been waiting for a thread covering this.

1. Samsung MU6300 4K HDR
2. Radeon 5700XT

Can somebody tell me what I need to set my color depth / pixel format to in order to get HDR on PC? Right now I have it set to 8 Bit Color Depth and RGB 4:4:4 Pixel Format PC Standard ( Full RGB ).


After trying the different pixel formats, it looks like in order to get 10-bit color I have to use either YCbCr 4:2:2 or YCbCr 4:2:0. Which one should I use?
 

Deleted member 17092

User requested account closure
Banned
Oct 27, 2017
20,360
So is there a way to programmatically enable/disable HDR at the Windows level?

Or, I guess, side question... Is there a way to not make my TV look like crap when I enable HDR in Windows but am not watching actual HDR content?

Afaik it's just kind of a crapshoot as far as which games automatically enable and disable it vs having to toggle the slider. Ultimately I don't think that flipping the toggle is that big of a deal.

After trying the different pixel formats, it looks like in order to get 10-bit color I have to use either YCbCr 4:2:2 or YCbCr 4:2:0. Which one should I use?

422 is half the chroma sampling of 444, while 420 is a quarter, so go with 422.
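If it helps to picture it, this is just the standard way those numbers are counted over a 4x2 block of pixels (nothing NVCP-specific, figures are mine):

Code:
# Chroma samples kept per 4x2 block of pixels (standard J:a:b subsampling).
# Every pixel always keeps its own luma (brightness) sample; only the colour
# information gets shared between neighbouring pixels.
SCHEMES = {"4:4:4": (4, 4), "4:2:2": (2, 2), "4:2:0": (2, 0)}

for name, (first_row, second_row) in SCHEMES.items():
    luma = 4 * 2                      # one luma sample per pixel
    chroma = first_row + second_row   # chroma samples across the two rows
    print(f"{name}: {chroma}/{luma} chroma samples -> {chroma / luma:.0%} of full colour resolution")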
 

Megasoum

Member
Oct 25, 2017
22,596
Maybe not directly HDR related, but are there any tricks for reducing color banding? I'm playing on a TCL 55R615 TV with a GTX 1080 and the games look great overall, but I'm still getting some color banding at times and this is killing me haha...

Some of it happens during pre-rendered videos, so for those I assume it's probably mostly a compression issue, but I'm also seeing it in gameplay.

I'm playing Shadow of the Tomb Raider and I am seeing it a lot when swimming underwater, for example.
 

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,694
Maybe not directly HDR related, but are there any tricks for reducing color banding? I'm playing on a TCL 55R615 TV with a GTX 1080 and the games look great overall, but I'm still getting some color banding at times and this is killing me haha...

Some of it happens during pre-rendered videos, so for those I assume it's probably mostly a compression issue, but I'm also seeing it in gameplay.

I'm playing Shadow of the Tomb Raider and I am seeing it a lot when swimming underwater, for example.

In videos it's usually just down to them being low bit depth and low bit rate, then converted to HDR, which reduces their effective bit depth.

In games, you are still likely to see it, especially when anything is volumetric.
If you have a film grain option, that will help to smooth it out.
 

Fries

Member
Oct 25, 2017
554
I'm stuck on a C8 with HDMI 2.0 I believe, so there's only enough bandwidth for 4K 60 Hz HDR 10-bit @ 4:2:2. I don't expect my 2080 Ti to hit past 60 on Ultra in a lot of games with my old i7 4790K anyway, so that is good enough for me. I do find it annoying to have to turn the Windows HDR setting on and off, because some games that do turn it on automatically do not turn it off ._.
 

Vasto

Member
May 26, 2019
342
When I go to YouTube and watch an HDR video, it does not say HDR in the corner of the video. I thought HDR was supposed to turn on automatically for HDR content?
 

Videophile

Tech Marketing at Elgato
Verified
Oct 27, 2017
63
San Francisco, CA
When I go to YouTube and watch an HDR video, it does not say HDR in the corner of the video. I thought HDR was supposed to turn on automatically for HDR content?
For YouTube to play HDR videos, the HDR toggle in the Windows 10 settings needs to be set to on. This will turn on HDR system-wide, and a browser like Chrome will detect that HDR is supported and let YouTube play in HDR.

It's really annoying to have to do this and I hope it changes in the future.

The reason games can turn on HDR is because they take over the entire display pipeline when launched in fullscreen. A browser can't do this currently, afaik, so if Windows is running in SDR and Chrome plays an HDR video, Chrome has no way to tell Windows/the attached display to switch to HDR.
 

Videophile

Tech Marketing at Elgato
Verified
Oct 27, 2017
63
San Francisco, CA
Maybe not directly HDR related, but are there any tricks for reducing color banding? I'm playing on a TCL 55R615 TV with a GTX 1080 and the games look great overall, but I'm still getting some color banding at times and this is killing me haha...

Some of it happens during pre-rendered videos, so for those I assume it's probably mostly a compression issue, but I'm also seeing it in gameplay.

I'm playing Shadow of the Tomb Raider and I am seeing it a lot when swimming underwater, for example.
You could try forcing the driver to do some dithering.

A person on the AVS forums created this really handy command line tool to change dithering and color format options.

Download link: http://www.mediafire.com/file/yxcecws1dwfwhqq/NvColorControl2.0.0.0.zip/file

Code:
Usage: NvColorControl <bit-depth> <color format> <dithering> <hdr>
  <bit-depth>   : 8, 10 or 12
  <color format>: RGB (full), RGBLM (limited), YUV444, YUV422 or YUV420
  <dithering>   : state: 0 = auto, 1 = enabled, 2 = disabled,
                  bits : 0 = 6 bit, 1 = 8 bit, 2 = 10 bit,
                  mode : 0 = none, 1, 2 or 3 = spacial, 4 = temporal
  <hdr>         : 0 or 1
Examples:
- NvColorControl 8 YUV444
- NvColorControl 10 YUV422
- NvColorControl 12 YUV420
- NvColorControl 8 RGB 1 1 4 1

NOTES:
- not all combinations are possible
- HDR can currently only be enabled and requires the application to stay open
- this application does not revert automatically to the previous settings after a timeout
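If you want to automate it around a game launch, here's a quick wrapper sketch. The paths and the game executable are placeholders, and the argument combos are just the examples from the usage text above, so adjust them to your own setup:

Code:
# Sketch: keep NvColorControl open while a game runs, then restore settings.
# Per the notes above, HDR only stays enabled while the tool is running and
# it doesn't revert anything by itself.
import subprocess

NVCC = r"C:\Tools\NvColorControl.exe"      # wherever you unzipped it (placeholder)
GAME = r"C:\Games\SomeHDRGame\game.exe"    # placeholder game executable

# Example combo straight from the usage text: 8-bit RGB, dithering on, HDR on.
hdr = subprocess.Popen([NVCC, "8", "RGB", "1", "1", "4", "1"])
try:
    subprocess.run([GAME])                 # block until the game exits
finally:
    hdr.terminate()                        # closing the tool drops HDR again
    subprocess.run([NVCC, "10", "YUV422"]) # put your usual colour format back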
 

Vasto

Member
May 26, 2019
342
For YouTube to play HDR videos, the HDR toggle in the Windows 10 settings needs to be set to on. This will turn on HDR system-wide, and a browser like Chrome will detect that HDR is supported and let YouTube play in HDR.

It's really annoying to have to do this and I hope it changes in the future.

The reason games can turn on HDR is because they take over the entire display pipeline when launched in fullscreen. A browser can't do this currently, afaik, so if Windows is running in SDR and Chrome plays an HDR video, Chrome has no way to tell Windows/the attached display to switch to HDR.


When I turn on the HDR toggle in Windows my entire screen becomes washed out and it does not look good at all. Any idea what is happening?