XxLeonV

Member
Nov 8, 2017
1,140

So I got 1440p working on the 1060 in HDR @120Hz

Also Ultra Widescreen


Runs shockingly badly on that card, but it was a fun thing to try for this picture :P


Didn't know ultrawide was possible on an OLED. About to try this on my C7.

That's true for most games, but there are definitely some games that require you to turn on the setting in Windows before they work in HDR. Shadow of the Tomb Raider is a current example of a game I'm playing where I need to turn the setting on manually every time.

Almost any game in the Xbox PC app that supports HDR as well. I have to do it manually for Forza every time. I really wish it would just activate automatically like on consoles.
 

brain_stew

Member
Oct 30, 2017
4,763
I have played about 10 HDR games on my PC since getting a 6 Series.

At first I would always enable Windows HDR mode and adjust my Nvidia settings before playing one, then disable all the changes when done. Every game had some issue, like the brightness being wrong or reds being completely blown out.

Then I realized I didn't need to do anything to get better results.

Now I don't turn on Windows HDR mode or mess with any Nvidia settings. The games switch to HDR just fine on their own and look great.

Couple problems with that:

You're almost certainly playing HDR games in 8-bit colour. You may be able to tolerate the inevitable banding that comes with that, but I certainly can't.

Some games such as Gears 5 won't even give you the HDR option in the menu unless you enable the Windows setting.
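To put a number on the banding point: each extra bit doubles the number of gradient steps, so 8-bit has a quarter of the steps of 10-bit and each visible band is four times wider. A quick sketch of the quantization arithmetic (plain Python, nothing HDR-specific; the function name is mine):

```python
def quantize(value, bits):
    """Quantize a normalized [0.0, 1.0] value to an integer code at a bit depth."""
    levels = (1 << bits) - 1          # 255 at 8-bit, 1023 at 10-bit
    return round(value * levels)

# Sample a smooth 0-to-1 gradient densely and count the distinct output codes.
samples = [i / 9999 for i in range(10000)]
codes_8bit = {quantize(v, 8) for v in samples}
codes_10bit = {quantize(v, 10) for v in samples}

print(len(codes_8bit))    # 256 steps
print(len(codes_10bit))   # 1024 steps -> each band is a quarter the width
```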
 

flyinj

Member
Oct 25, 2017
11,025
Didn't know ultrawide was possible on an OLED. About to try this on my C7.



Almost any game in the Xbox PC app that supports HDR as well. I have to do it manually for Forza every time. I really wish it would just activate automatically like on consoles.
Couple problems with that:

You're almost certainly playing HDR games in 8-bit colour. You may be able to tolerate the inevitable banding that comes with that, but I certainly can't.

Some games such as Gears 5 won't even give you the HDR option in the menu unless you enable the Windows setting.

Weird, I'm able to play both Gears 5 and Forza Horizon 4 fine in HDR without enabling the Windows setting.
 

brain_stew

Member
Oct 30, 2017
4,763
Didn't know ultrawide was possible on an OLED. About to try this on my C7.

It works great on my B8. I always assumed I would need to use GPU scaling, but my B8 natively accepts 2560x1080, 3440x1440 and 3840x1646 just fine and does a great job of scaling them. I was hoping I would be able to get it to support 2560x1080 at 120Hz, but unfortunately I've had no luck; 120Hz doesn't work at anything above 1080p.

It's why high-end ultrawide HDR monitors no longer make any sense to me. Just grab a 48" C8 and run it in ultrawide mode if 21:9 is your preferred aspect ratio.
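For reference, resolutions like 3840x1646 are just the panel's native width with the height cut down to the ultrawide aspect ratio and rounded to an even number. A quick sketch of that arithmetic (the helper name is mine, for illustration only):

```python
def letterbox_height(panel_width, aspect_w, aspect_h):
    """Height of an aspect_w:aspect_h image at full panel width,
    rounded to the nearest even number (video modes want even dimensions)."""
    h = panel_width * aspect_h / aspect_w
    return round(h / 2) * 2

print(letterbox_height(3840, 21, 9))   # 1646 -- the custom res used in this thread
```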
 

XxLeonV

Member
Nov 8, 2017
1,140
Weird, I'm able to play both Gears 5 and Forza Horizon 4 fine in HDR without enabling the Windows setting.

Damn, I would love to know how to do that for mine. Maybe I need to do some research and see if it's something with my settings.

It works great on my B8. I always assumed I would need to use GPU scaling, but my B8 natively accepts 2560x1080, 3440x1440 and 3840x1646 just fine and does a great job of scaling them. I was hoping I would be able to get it to support 2560x1080 at 120Hz, but unfortunately I've had no luck; 120Hz doesn't work at anything above 1080p.

It's why high-end ultrawide HDR monitors no longer make any sense to me. Just grab a 48" C8 and run it in ultrawide mode if 21:9 is your preferred aspect ratio.

Yeah, I did try 3840x1646 and it worked well for Forza. I had issues in GTA V and MW though... it popped an invalid format screen even though the res was accepted for the desktop. It could be something to do with the HDMI switch I'm using. I'll troubleshoot later.

I want to try this on my B6. Guessing I just have to make a custom resolution in NVCP?

I used CRU.
 

flyinj

Member
Oct 25, 2017
11,025
Damn, I would love to know how to do that for mine. Maybe I need to do some research and see if it's something with my settings.



Yeah, I did try 3840x1646 and it worked well for Forza. I had issues in GTA V and MW though... it popped an invalid format screen even though the res was accepted for the desktop. It could be something to do with the HDMI switch I'm using. I'll troubleshoot later.



I used CRU.

Yeah, I think if you run the game once with Windows HDR enabled, you can then run it again with it disabled and it will still go into HDR.

Try that out maybe?
 

Megasoum

Member
Oct 25, 2017
22,649
Video is usually just down to it being low bit depth and low bit rate, then converted to HDR, which reduces its effective bit depth.

In games, you are still likely to see it, especially when anything is volumetric.
If you have a film grain option, that will help to smooth it out.
So going back to this reply: I was playing Shadow of the Tomb Raider in HDR tonight and decided to take pictures of my TV. Granted, since these are off-screen pictures uploaded to the internet, they might not be 100% accurate to reality, but looking at them, it's not far off from what I'm seeing, and it's kiiiilllinngggg meee haha.

I feel like I don't have that issue when playing PS4 in HDR (I'm about to start GoT, so I'll confirm whether I'm crazy or not), but I feel like this is only happening when playing on my PC.
[attached off-screen photos of the banding]


EDIT: Wait... I just changed the settings in the NVIDIA Control Panel to 8bpc and 444 (instead of 422), and now my TV still says HDR and the color banding is almost completely gone... Isn't HDR supposed to use 10bpc?
I *feel* like the colors "pop" a lot less, but at this point I've tried so many settings that it could just be my brain playing tricks...

Edit 2: Or maybe everything I said in the first edit is wrong and it's not actually better? I don't know anymore... I think I'm gonna go to bed. I swear, this shit is making me go crazy.
 
Last edited:

Super Rookie

Member
Oct 25, 2017
276
London
I have noticed more banding in videos on my C9 in Y422 10-bit Limited than in RGB 8-bit Full in NVCP.

I thought I was going crazy, but then I put up a black-to-white gradient from the HD Video calibration and there is definitely noticeable banding between the two; RGB is way smoother.

HDMI 2.1 cards can't come soon enough.
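One sanity check worth adding here: Limited range does cost code values, but not nearly enough to explain that result. Counting usable levels per channel with standard video levels (16-235 at 8-bit, 64-940 at 10-bit; a sketch with my own helper name):

```python
def usable_levels(bits, limited):
    """Usable code values per channel. Limited ("video") range is 16-235 at
    8-bit, scaled by 2^(bits-8) at higher depths (64-940 at 10-bit)."""
    if limited:
        scale = 1 << (bits - 8)
        return (235 - 16) * scale + 1
    return 1 << bits

print(usable_levels(8, limited=False))   # 256 -- RGB 8-bit Full
print(usable_levels(8, limited=True))    # 220 -- 8-bit Limited
print(usable_levels(10, limited=True))   # 877 -- 10-bit Limited
```

So 10-bit Limited still has roughly four times the granularity of 8-bit Full; if a 10-bit Limited gradient looks worse, the banding is presumably being introduced somewhere else in the chain rather than by the range setting itself.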
 

Lump

One Winged Slayer
Member
Oct 25, 2017
16,259
I have noticed more banding in videos on my C9 in Y422 10-bit Limited than in RGB 8-bit Full in NVCP.

I thought I was going crazy, but then I put up a black-to-white gradient from the HD Video calibration and there is definitely noticeable banding between the two; RGB is way smoother.

HDMI 2.1 cards can't come soon enough.

I'm in the same boat of juggling different modes for different applications and am almost looking at my current GPU in a mild amount of disgust for not having HDMI 2.1 as if it's its fault. The moment I can upgrade, I'm absolutely upgrading. I need that port.
 

Dave_6

Member
Oct 25, 2017
1,526
I got 3840x1646 working easily enough on my B6 but HDR wouldn't work with it. Could be my display or I wasn't doing something right, but man Death Stranding looks so good in that resolution on an OLED.
 
RTX 2070, LG 55" with edge LED (I'm a moron for buying that)

Setting 3840x1646 was so damn easy, and HDR is working.
Download CRU and add the res there; CRU comes with a restart64 file, click on it. Now you are able to select the resolution.
If it's stretched out, go to NVCP and set scaling to Aspect Ratio or Integer Scaling. And now a question -

EvilBoris thanks for everything mate, the question to the above is: what should it be set on? Aspect or Integer, and on GPU or display, please? Ty
 

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,718
RTX 2070, LG 55" with edge LED (I'm a moron for buying that)

Setting 3840x1646 was so damn easy, and HDR is working.
Download CRU and add the res there; CRU comes with a restart64 file, click on it. Now you are able to select the resolution.
If it's stretched out, go to NVCP and set scaling to Aspect Ratio or Integer Scaling. And now a question -

EvilBoris thanks for everything mate, the question to the above is: what should it be set on? Aspect or Integer, and on GPU or display, please? Ty

If you set it on GPU, it will literally render the black bars on the GPU and send it to the TV as a full 4K image.
This is easier for the TV to understand (and might allow HDR more easily), but takes up more bandwidth than sending it sans black bars.
I think I have mine set on Aspect to ensure it doesn't stretch the image back out to full screen.
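To put rough numbers on the bandwidth difference: with display scaling only the letterboxed 3840x1646 image goes over the wire, while GPU scaling composites the black bars and sends a full 3840x2160 frame. A sketch assuming a 60Hz signal at 10-bit 4:2:2 (20 bits/pixel) and ignoring blanking intervals for simplicity, so real link rates are higher, but the relative gap is the point:

```python
def active_gbps(width, height, refresh_hz, bits_per_pixel):
    """Approximate data rate of the active picture only (blanking ignored)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# Display scaling: only the 3840x1646 letterboxed image is transmitted.
# GPU scaling: the GPU renders the black bars and sends a full 3840x2160 frame.
print(round(active_gbps(3840, 1646, 60, 20), 2))   # ~7.58 Gbps
print(round(active_gbps(3840, 2160, 60, 20), 2))   # ~9.95 Gbps
```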
 
If you set it on GPU, it will literally render the black bars on the GPU and send it to the TV as a full 4K image.
This is easier for the TV to understand (and might allow HDR more easily), but takes up more bandwidth than sending it sans black bars.
I think I have mine set on Aspect to ensure it doesn't stretch the image back out to full screen.
Thank you so much. I thought it would take more of a toll on the bandwidth. I need to have it on Aspect + GPU, as display scaling gives an invalid input.

Have a nice one!
 

TheTrain

Member
Oct 27, 2017
611
I have a question: why does the output dynamic range only show Limited with the YUV422 setting?
 

noomi

Member
Oct 25, 2017
3,695
New Jersey
Sorry to bump an old thread, but I'm curious whether any of this has changed with HDMI 2.1-capable displays and GPUs.

I have an LG CX and a 3080-series card, along with a 48Gbps HDMI cable.

Wondering what the best setting would be:

- RGB - 10-bit - Full
- YCbCr444 - 10-bit - Full
 

EvilBoris

Prophet of Truth - HDTVtest
Verified
Oct 29, 2017
16,718
Sorry to bump an old thread, but I'm curious whether any of this has changed with HDMI 2.1-capable displays and GPUs.

I have an LG CX and a 3080-series card, along with a 48Gbps HDMI cable.

Wondering what the best setting would be:

- RGB - 10-bit - Full
- YCbCr444 - 10-bit - Full

Either of those, but set the video range to limited.
 

Bruticis

Banned
Oct 27, 2017
53
I don't have an option for 10-bit, and when I pick YCbCr444 my dynamic range output is Limited. Should I stick with RGB, where the output is Full, versus what I'm using now? I'm using a 2080 Super on an Asus ROG PG27U.
 

Yarbskoo

Member
Oct 27, 2017
2,980
I've got an Acer X27 and an RTX 2080. I leave HDR on all the time: 10bpc, YCbCr422. Only minor issues: web browsers can't display HDR YouTube correctly, and the monitor goes black for a second when switching to SDR apps and back.
 
Oct 28, 2017
83
Copy pasting something I wrote some time ago.

You've probably been playing games or watching content in HDR via your PC, while missing a critical component – 10-bit video.

Windows 10, unlike game consoles, does not auto-switch the bit depth of the outgoing video when launching HDR games.

By default, both NVIDIA and AMD GPUs are configured to output RGB 8-bit.

You might be wondering: "But my TV turns on its HDR mode and games look better." This is indeed true. HDR is a collection of different pieces that, when working together, create the HDR effect. Your PC is sending the WCG (Wide Color Gamut)/BT.2020 metadata as well as other information to the TV, which triggers its HDR mode, but the PC is still only sending an 8-bit signal.


How to output 10-bit video on an NVIDIA GPU

NVIDIA GPUs have some quirks when it comes to which bit depths can be output with which formats when connected to a display via HDMI.

HDMI supported formats:
  • RGB/YUV444:
    • 8-Bit
    • 12-Bit
  • YUV422:
    • 8-Bit
    • 10-Bit
    • 12-Bit
What does this mean for you? Not much – 12-bit has the same bandwidth requirements as 10-bit. If you do require RGB/YUV444 and send a 12-bit signal to the TV, that signal still only contains 10-bit data in a 12-bit container. The TV will convert the signal back down to 10-bit.

However, if you want to output true 10-bit, then you'll need to step down to a YUV422 signal. Again, not the end of the world. At normal TV viewing distances (and even on 4K monitors) it is very difficult to tell the difference between 4:4:4 and 4:2:2.

The recommended setting in this case is YUV422 video at 10-bit, for both SDR and HDR. This makes the switch seamless and does not require you to do any extra work.
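A rough sketch of the HDMI 2.0 arithmetic behind that recommendation (the 600 MHz TMDS cap and the CTA-861 4K60 timing come from the specs; the depth-independent 4:2:2 container is the simplified HDMI rule described above):

```python
# Back-of-the-envelope check of why 4K60 needs 4:2:2 for 10-bit on HDMI 2.0.
HDMI20_MAX_MHZ = 600                         # HDMI 2.0 TMDS character-rate cap
PIXEL_CLOCK_4K60 = 4400 * 2250 * 60 / 1e6    # CTA-861 4K60 timing incl. blanking = 594.0 MHz

def tmds_clock_mhz(fmt, bits):
    """Required TMDS clock for a 4K60 signal. RGB/4:4:4 deep color scales the
    clock by bits/8; 4:2:2 travels in a fixed 24-bit container at any depth."""
    if fmt == "422":
        return PIXEL_CLOCK_4K60
    return PIXEL_CLOCK_4K60 * bits / 8

for fmt, bits in [("rgb", 8), ("rgb", 10), ("422", 10), ("422", 12)]:
    clock = tmds_clock_mhz(fmt, bits)
    verdict = "fits" if clock <= HDMI20_MAX_MHZ else "too fast"
    print(f"{fmt} {bits}-bit: {clock:.1f} MHz -> {verdict}")
# rgb 8-bit: 594.0 MHz -> fits
# rgb 10-bit: 742.5 MHz -> too fast
# 422 10-bit: 594.0 MHz -> fits
# 422 12-bit: 594.0 MHz -> fits
```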

How to configure NVIDIA GPUs

  1. Right-click on the Windows desktop
  2. Open the NVIDIA Control Panel
  3. On the left side, click on Change resolution
  4. Click on the Output color format dropdown menu and select YUV422
  5. Click on Apply
  6. Now click on the Output color depth dropdown menu and select 10bpc (bits per color)
  7. Click on Apply
That's it. Your GPU is now outputting YUV422 10-bit video to your TV or monitor.

Now launch an HDR game and you'll see the full 10-bit color depth!

Is this still relevant today with the 3070/3080 GPUs now released?

Also @EvilBoris, why would you have HDR in Windows 10 enabled all the time, when that maxes out your OLED light, which especially on the desktop could lead to burn-in?

Either of those, but set the video range to limited.

Reading this article here,

https://referencehometheater.com/2014/commentary/rgb-full-vs-limited/

It suggests the following,

"On a computer monitor you use the opposite approach. RGB Full will display video games and other 0-255 content at the correct 0-255 range. TV, Movies and other video range content expands to use the full 0-255 range of a computer display. If you use RGB limited instead, shadows will be gray instead of black and highlights will be dull. You will not take full advantage of the dynamic range of the PC monitor and content will have a washed-out look. The image below is the opposite of that above where now we are missing highlights, they are slightly gray instead of white, while blacks are a dark gray and not black."

That being the case, do you still recommend using Limited on an LG C9?
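The mismatch the article describes comes down to a linear remap. When one end of the chain sends limited-range video and the other expects full range (or vice versa), this expansion never happens and blacks/whites land on the wrong codes. A sketch at 8-bit, assuming standard video levels (16 = black, 235 = white; the function name is mine):

```python
def limited_to_full(v):
    """Expand an 8-bit limited-range (16-235) code to full range (0-255)."""
    out = round((v - 16) * 255 / 219)
    return max(0, min(255, out))   # clamp codes outside the nominal range

print(limited_to_full(16))    # 0   -- video black lands on display black
print(limited_to_full(235))   # 255 -- video white lands on display white

# If the display expects full range but receives limited-range video without
# this expansion, black stays at code 16 (dark gray) and white at 235 (dull),
# which is exactly the washed-out look the quoted article describes.
```

The practical takeaway is that the GPU's output range and the TV's black-level setting just have to agree; either pairing looks correct as long as both ends match.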
 
Last edited:
Oct 2, 2018
50
Question regarding 30Hz.

Firstly thanks to everyone for contributing to this topic, lots of helpful information.

When using 4K@60Hz I use YCbCr 4:2:2 10-bit to get proper HDR (as this thread suggests), since I'm limited by HDMI 2.0. However, when using 4K@30Hz I also have the option of YCbCr 4:4:4 12-bit (10-bit isn't shown in the dropdown). Would this be the one to use at 30Hz? Is there any reason to still use YCbCr 4:2:2 10-bit?
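Halving the refresh rate halves the pixel clock, which is why the 4:4:4 12-bit option shows up at 30Hz. A back-of-the-envelope check against the HDMI 2.0 limit (600 MHz TMDS cap, CTA-861 4K total timing; RGB/4:4:4 deep color scales the clock by bits/8, while 4:2:2 stays at the base rate):

```python
HDMI20_MAX_MHZ = 600   # HDMI 2.0 TMDS character-rate cap

def tmds_clock_mhz(refresh_hz, bits, subsampled=False):
    """Required TMDS clock for 4K at a given refresh rate (CTA-861 total
    timing, 4400x2250 incl. blanking). 4:2:2 keeps the base clock."""
    base = 4400 * 2250 * refresh_hz / 1e6
    return base if subsampled else base * bits / 8

print(tmds_clock_mhz(30, 12))   # 445.5 MHz -> 4:4:4 12-bit fits at 30Hz
print(tmds_clock_mhz(60, 12))   # 891.0 MHz -> too fast at 60Hz
```

So at 30Hz there is no bandwidth reason left to stay at 4:2:2; 4:4:4 12-bit fits comfortably, and per the earlier post a 12-bit signal simply carries the 10-bit data in a larger container.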
 

dgrdsv

Member
Oct 25, 2017
12,065
People are saying that setting the output range to Limited and chroma to 4:2:2 is a better choice for the 9/X-series OLED panels, especially with HDR.
You should try it out and see for yourself.
 
Jul 7, 2021
3,099
Copy pasting something I wrote some time ago.

You've probably been playing games or watching content in HDR via your PC, while missing a critical component – 10-bit video.

Windows 10, unlike game consoles, does not auto-switch the bit depth of the outgoing video when launching HDR games.

By default, both NVIDIA and AMD GPUs are configured to output RGB 8-bit.

You might be wondering: "But my TV turns on its HDR mode and games look better." This is indeed true. HDR is a collection of different pieces that, when working together, create the HDR effect. Your PC is sending the WCG (Wide Color Gamut)/BT.2020 metadata as well as other information to the TV, which triggers its HDR mode, but the PC is still only sending an 8-bit signal.


How to output 10-bit video on an NVIDIA GPU

NVIDIA GPUs have some quirks when it comes to which bit depths can be output with which formats when connected to a display via HDMI.

HDMI supported formats:
  • RGB/YUV444:
    • 8-Bit
    • 12-Bit
  • YUV422:
    • 8-Bit
    • 10-Bit
    • 12-Bit
What does this mean for you? Not much – 12-bit has the same bandwidth requirements as 10-bit. If you do require RGB/YUV444 and send a 12-bit signal to the TV, that signal still only contains 10-bit data in a 12-bit container. The TV will convert the signal back down to 10-bit.

However, if you want to output true 10-bit, then you'll need to step down to a YUV422 signal. Again, not the end of the world. At normal TV viewing distances (and even on 4K monitors) it is very difficult to tell the difference between 4:4:4 and 4:2:2.

The recommended setting in this case is YUV422 video at 10-bit, for both SDR and HDR. This makes the switch seamless and does not require you to do any extra work.

How to configure NVIDIA GPUs

  1. Right-click on the Windows desktop
  2. Open the NVIDIA Control Panel
  3. On the left side, click on Change resolution
  4. Click on the Output color format dropdown menu and select YUV422
  5. Click on Apply
  6. Now click on the Output color depth dropdown menu and select 10bpc (bits per color)
  7. Click on Apply
That's it. Your GPU is now outputting YUV422 10-bit video to your TV or monitor.

Now launch an HDR game and you'll see the full 10-bit color depth!

Mine says YCbCr422. Is that the same thing?

Also, is anyone running into an issue with Sea of Thieves where HDR works fine in windowed mode (unfortunately the game doesn't have a borderless window mode), but in fullscreen the game becomes super dark?