Dinjoralo

Member
Oct 25, 2017
9,300
SOTTR runs with newer versions just fine.
But if you're asking whether setting this global driver variable to the new preset E will affect games using older DLSS versions where such a profile exists but is "empty" (3.1.11-3.6.0), then that's an interesting question that needs to be checked.
I thiiiink older DLSS versions will just fall back on whatever the game sets as the default? Not too sure.
Either way, stoked to see if this can give better image quality in DD2 and LaD Infinite Wealth. Assuming DD2 won't pitch a fit from replacing the DLL...
 

dgrdsv

Member
Oct 25, 2017
12,062
Replacing the 3.7.0 .dll in all games and enabling Preset E globally via Inspector + the added XML does actually apply it, and everything looks and performs as well as it ever has on my side.
I've only done some small visual tests of E in HFW at night (lol) so can't really say much about its quality. Seems like a solid preset for SR though.
With DLAA I haven't noticed any apparent differences to the game's default C.
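If anyone wants to script the DLL part instead of copying by hand, something like this works. Rough sketch (my own, not from the post): it assumes the standard nvngx_dlss.dll filename, the paths are hypothetical, and the originals get backed up first.

```python
# Rough sketch: swap nvngx_dlss.dll in a list of game folders for a 3.7.0 copy.
# Assumptions (mine): games ship the standard nvngx_dlss.dll filename and
# NEW_DLL points at the 3.7.0 DLL you downloaded. Originals are kept as .bak.
import shutil
from pathlib import Path

NEW_DLL = Path(r"C:\DLSS\3.7.0\nvngx_dlss.dll")    # hypothetical path
GAME_DIRS = [
    Path(r"C:\Games\HorizonForbiddenWest"),         # hypothetical game folders
    Path(r"C:\Games\ShadowOfTheTombRaider"),
]

for game_dir in GAME_DIRS:
    for old_dll in game_dir.rglob("nvngx_dlss.dll"):
        backup = old_dll.with_name(old_dll.name + ".bak")
        if not backup.exists():
            shutil.copy2(old_dll, backup)           # keep the game's original DLL
        shutil.copy2(NEW_DLL, old_dll)              # drop in the 3.7.0 build
        print(f"Replaced {old_dll}")
```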

DLAA is almost useless now with how much DLSS Quality improved with latest update.
Yeah, well, that was kinda already the case previously. I was impressed by the detail DLSS P with preset E is able to resolve, though.

I thiiiink older DLSS versions will just fall back on whatever the game sets as the default? Not too sure.
Games pre-3.1 didn't really have any preset control, and whatever they default to in these versions is usually a DLSS-side choice based on app ID.
But if they are just falling back to D then that's fine; it means forcing E for SR modes globally shouldn't affect such games (much, at least).

The DLSS programming guide in the DLSS repo was updated too btw and has some additional information:
• Preset A (intended for Perf/Balanced/Quality modes):
o An older variant best suited to combat ghosting for elements with missing inputs (such as motion vectors)

• Preset B (intended for Ultra Perf mode):
o Similar to Preset A but for Ultra Performance mode

• Preset C (intended for Perf/Balanced/Quality modes):
o Preset which generally favors current frame information. Generally well-suited for fast-paced game content

• Preset D (intended for Perf/Balanced/Quality modes):
o Similar to Preset E. Preset E is generally recommended over Preset D.

• Preset E (intended for Perf/Balanced/Quality modes):
o The default preset for Perf/Balanced/Quality mode. Generally, favors image stability

• Preset F (intended for Ultra Perf/DLAA modes):
o The default preset for Ultra Perf and DLAA modes.

• Preset G (Unused)
So E is an updated version of D, and there's a new "unused" preset G now. Makes you wonder why use a new preset if it's just an update of D...

3.16 Alpha Upscaling Support
By default, DLSS is intended for 3-channel RGB images, only. Experimental support for upscaling 4-channel RGBA images can be enabled by setting the NVSDK_NGX_DLSS_Feature_Flags_AlphaUpscaling flag at creation time. For best results, the RGB color should be premultiplied by alpha in the color input.
Note: performance will be impacted by enabling this feature. Expect the overall execution time of DLSS to increase by 15-25% when alpha blending is enabled.
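For what it's worth, "premultiplied by alpha" here just means each RGB channel gets scaled by the pixel's alpha before the image goes into the upscaler. A minimal sketch of that math in Python/numpy, purely my own illustration and not SDK code:

```python
# Just the math the guide refers to: straight-alpha RGBA -> premultiplied RGBA.
# My own illustration, not DLSS SDK code.
import numpy as np

rgba = np.random.rand(1080, 1920, 4).astype(np.float32)   # straight-alpha input
premultiplied = rgba.copy()
premultiplied[..., :3] *= premultiplied[..., 3:4]          # RGB *= A; alpha stays as-is
```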
 
OP
P40L0

Member
Jun 12, 2018
7,711
Italy
The DLSS programming guide in the DLSS repo was updated too btw and has some additional information:

So E is an updated version of D, and there's a new "unused" preset G now. Makes you wonder why use a new preset if it's just an update of D...
Preset E seems like more of a new thing than just an update to D: it looks like C but it's stable like F, which is exactly what DLSS needed for SR.

The unused Preset G was also there before, so we'll see how they will use it in the future (even if there's not much left to improve for SR right now).

For example, the experimental Alpha Upscaling support seems very interesting... but that 15-25% perf penalty is definitely a no-go, as it goes against the main purpose of DLSS.
 

Dinjoralo

Member
Oct 25, 2017
9,300
Preset E seems like more of a new thing than just an update to D: it looks like C but it's stable like F, which is exactly what DLSS needed for SR.

The unused Preset G was also there before, so we'll see how they will use it in the future (even if there's not much left to improve for SR right now).

For example, the experimental Alpha Upscaling support seems very interesting... but that 15-25% perf penalty is definitely a no-go, as it goes against the main purpose of DLSS.
Oh thank God, alpha support...
I think this would be used in places like menus where you have a render of a character with the menu as a backdrop, so you don't have stuff like this. Since apparently the only alternative is using really crap upscaling, instead of just like, keeping those elements at native res...
 

dgrdsv

Member
Oct 25, 2017
12,062
I think this would be used in places like menus
All game graphics are blended prior to becoming the front buffer you see, which is why there's usually no need to handle the alpha channel, or even to have it in the front buffer color format (RGB10A2 leaves just 2 bits for alpha values, for example).
It's not really clear where you would get any benefit from using DLSS with an alpha channel. I can't even think of any examples.

where you have a render of a character with the menu as a backdrop, so you don't have stuff like this
That's just a badly implemented upscaling of this screen in particular.

Since apparently the only alternative is using really crap upscaling, instead of just like, keeping those elements at native res...
Anything which can be overlaid on top of the rendered image can be native res. The background and the model in the example above should just provide the data needed for the upscaler to work - which it apparently does not.
 
OP
P40L0

Member
Jun 12, 2018
7,711
Italy
This is Horizon: Forbidden West v1.1.47 with max settings at 4K + DLSS (3.7.0 - Preset E) Quality + FG + Reflex: On and all my PC Optimizations applied on my 7800X3D and 4080 (Undervolted + Overclocked):

Horizon-Forbidden-West-Complete-Edition-v1-1-47-0-07-04-2024-23-57-51.png


Frame time is basically flat, with very small fluctuations totally sorted out by G-Sync on my LG G3 OLED.
Only in cutscenes are there some very small stutters on some scene transitions, but often you don't even notice them.

I think it's among the best PS5/console ports on PC and one of the best-looking games ever so far. 🥹
 
OP
P40L0

Member
Jun 12, 2018
7,711
Italy
sdfoDd9.png


I'm glad to announce the opening of my new YouTube Playlist:
[4K/HDR] Optimized Gameplay


You will find Very High Quality (among the best seen on YT), direct-feed gameplay recordings in Native 4K, 60fps, color-accurate and Calibrated HDR, captured after applying all the optimizations in this guide to my new PC (Full Specs and Setup in each video description).

For this reason, I highly recommend watching them on a true HDR display!

YouTube will still decently tone map them to SDR automatically if you can't.

I will periodically add new gameplay recordings for select games over time, so if you like them please support my work by leaving a Like & Subscribe on the YouTube channel and on Patreon!


Enjoy :)

-P


NOTE:
In order to preserve the highest and smoothest recording quality, the actual performance you see is around 5-10% lower than normal. Keep that in mind as a reference if you want to compare it with similar setups.
 

AYZON

Member
Oct 29, 2017
912
Germany
Aloha,
I copied the settings from the OP for NVIDIA GPUs and I'm mostly happy with them.

The only issue I have (which maybe just highlights that my approach isn't ideal): when running games in windowed mode and limiting their max fps (so that my GPU has to work less), other windows (and my OS) start to flicker.
My monitor refresh rate is 144 Hz, and the game in question is limited via the NVIDIA Control Panel to 40 fps (as otherwise it would be unlimited and cook my GPU). The game itself offers no such option.
Any idea how I could approach this differently, or fix the flickering without changing my approach?
 
OP
P40L0

Member
Jun 12, 2018
7,711
Italy
Aloha,
I copied the settings from the OP for NVIDIA GPUs and I'm mostly happy with them.

The only issue I have (which maybe just highlights that my approach isn't ideal): when running games in windowed mode and limiting their max fps (so that my GPU has to work less), other windows (and my OS) start to flicker.
My monitor refresh rate is 144 Hz, and the game in question is limited via the NVIDIA Control Panel to 40 fps (as otherwise it would be unlimited and cook my GPU). The game itself offers no such option.
Any idea how I could approach this differently, or fix the flickering without changing my approach?
You don't have to add manual frame limiters anymore.

Just enable NVIDIA Ultra Low Latency in NVCP and set Reflex to "On" in-game whenever available; those will auto-cap everything to 3-4 fps below your refresh rate (so around 140 fps max), lowering input lag and ensuring the flattest frame times.

If your GPU runs too hot doing that, just install MSI Afterburner and undervolt it.
You will achieve much lower temps/noise with the same, if not better, performance than stock. There are many guides on YT for each specific GPU.
 

AYZON

Member
Oct 29, 2017
912
Germany
You don't have to add manual frame limiters anymore.

Just enable NVIDIA Ultra Low Latency in NVCP and set Reflex to "On" in-game whenever available; those will auto-cap everything to 3-4 fps below your refresh rate (so around 140 fps max), lowering input lag and ensuring the flattest frame times.

If your GPU runs too hot doing that, just install MSI Afterburner and undervolt it.
You will achieve much lower temps/noise with the same, if not better, performance than stock. There are many guides on YT for each specific GPU.
I probably worded this badly, but my issue is with things like idle games or Wallpaper Engine; having them run at 144 fps makes no sense and just puts stress on the GPU / consumes power.
For games in general you're probably right; in this case it's about reducing power consumption and taking load off the GPU for minor tasks that don't require 100% usage.
 
OP
P40L0

Member
Jun 12, 2018
7,711
Italy
I probably worded this badly, but my issue is with things like idle games or Wallpaper Engine; having them run at 144 fps makes no sense and just puts stress on the GPU / consumes power.
For games in general you're probably right; in this case it's about reducing power consumption and taking load off the GPU for minor tasks that don't require 100% usage.
Reaching 140 fps doesn't automatically mean the GPU will be at 100% usage and maximum power consumption.

For lighter games and 3D applications the power draw will still be lower even at that framerate, especially when you're also using the Windows "Balanced" power plan + power mode in Win11, coupled with the "Normal" GPU power mode in NVCP, as suggested.
 

AYZON

Member
Oct 29, 2017
912
Germany
Maybe it's Wallpaper Engine being weird, but if I set its fps limit to 144 (there is no unlocked option in the program itself), it definitely increases the usage significantly.
On a wallpaper that doesn't do much animation:
10 fps limit = ~2-6% usage
144 fps limit = 15-20% usage
Another wallpaper I tried had it go above 30% usage.

The game I tried is Summoners War, but it seems to run for the most part on the CPU. Still, unlocking the framerate doubles the GPU usage from 2-3% to around 5-6%.
Maybe I'm missing some settings; I'll check again if I forgot to change something.
Placing both in "efficiency mode" via Task Manager helps, but I guess that's not an automatic process.

Edit:
Checked the settings mentioned in the OP again and have no idea what's going wrong. Even went further and activated the Windows power saving mode instead of balanced.
 

d0x

Member
Oct 25, 2017
1
Maybe it's Wallpaper Engine being weird, but if I set its fps limit to 144 (there is no unlocked option in the program itself), it definitely increases the usage significantly.
On a wallpaper that doesn't do much animation:
10 fps limit = ~2-6% usage
144 fps limit = 15-20% usage
Another wallpaper I tried had it go above 30% usage.

The game I tried is Summoners War, but it seems to run for the most part on the CPU. Still, unlocking the framerate doubles the GPU usage from 2-3% to around 5-6%.
Maybe I'm missing some settings; I'll check again if I forgot to change something.
Placing both in "efficiency mode" via Task Manager helps, but I guess that's not an automatic process.

Edit:
Checked the settings mentioned in the OP again and have no idea what's going wrong. Even went further and activated the Windows power saving mode instead of balanced.

Wallpaper Engine should not be using that much GPU time for what it does... That's where your issue lies, well, that or maybe DWM.exe (Desktop Window Manager), which handles Explorer's visuals. Previously there was an NVIDIA driver bug that caused a memory leak; they finally fixed that a few months ago. It wouldn't explain temps, but it did explain lots of games in 2023 appearing to launch with leaks.

That said, sometimes DWM can get into an e-argument with other apps that try to do its job at the same time. If it's not a conflict between DWM and Wallpaper Engine then it's just Wallpaper Engine, because yeah... it shouldn't be chewing up that much GPU time unless you're running it on an AMD 270X lol
 

Gloomz

Member
Oct 27, 2017
2,435
I'm noticing the links to guides at the start of this OP cost $ to view, is that correct?
 

AYZON

Member
Oct 29, 2017
912
Germany
Wallpaper Engine should not be using that much GPU time for what it does... That's where your issue lies, well, that or maybe DWM.exe (Desktop Window Manager), which handles Explorer's visuals. Previously there was an NVIDIA driver bug that caused a memory leak; they finally fixed that a few months ago. It wouldn't explain temps, but it did explain lots of games in 2023 appearing to launch with leaks.

That said, sometimes DWM can get into an e-argument with other apps that try to do its job at the same time. If it's not a conflict between DWM and Wallpaper Engine then it's just Wallpaper Engine, because yeah... it shouldn't be chewing up that much GPU time unless you're running it on an AMD 270X lol
I think I may have figured it out. I monitored the GPU usage via the Windows Task Manager, which apparently doesn't take the GPU clock into account, so it sometimes looks like the GPU is under heavy load even though it's downclocked. Monitoring the usage and GPU clock with GPU-Z showed that while the load % seems to increase, the GPU clock stays almost the same, so the actual difference isn't that big.

Man that drove me crazy, thanks everyone!
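For anyone who wants to double-check the same thing without GPU-Z: polling utilization and the graphics clock together makes the difference obvious. A quick sketch, assuming nvidia-smi (which ships with the NVIDIA driver) is on PATH:

```python
# Poll GPU utilization, graphics clock and power draw once per second, so you can
# see that a high "usage %" at a low clock isn't the same thing as real load.
# Assumes nvidia-smi (installed with the NVIDIA driver) is on PATH.
import subprocess

subprocess.run([
    "nvidia-smi",
    "--query-gpu=utilization.gpu,clocks.current.graphics,power.draw",
    "--format=csv",
    "-l", "1",   # repeat every second; Ctrl+C to stop
])
```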
 
OP
P40L0

Member
Jun 12, 2018
7,711
Italy
The default "Balanced" power plan (in old Control Panel) + "Best Performance" power mode (in New Control Panel) when plugged to AC is now recommended.


n8HgJqo.png


khOwyAi.png



After testing 50+ games and benchmarks, this provided the best and most consistent results compared to the old High/Ultimate Performance power plans (with no power mode control) and to "Balanced/Balanced" (power plan + power mode), while still properly dropping into idle states (and lower temps) when needed, without consuming a lot more power and making a lot more noise like the old plans did (both for AMD and Intel CPUs).
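If you'd rather switch the plan from a script than dig through the Control Panel, the stock Balanced scheme can be activated with powercfg. The GUID below is the built-in Balanced plan's well-known ID; the Win11 power mode slider ("Best Performance") has no officially documented CLI that I know of, so set that one in Settings > System > Power:

```python
# Activate the stock "Balanced" power plan from a script and print the result.
# 381b4222-f694-41f0-9685-ff5bb260df2e is the well-known GUID of the built-in
# Balanced scheme. The Win11 power *mode* slider (Best Performance) still has to
# be set in Settings; as far as I know there is no documented CLI for it.
import subprocess

BALANCED_GUID = "381b4222-f694-41f0-9685-ff5bb260df2e"
subprocess.run(["powercfg", "/setactive", BALANCED_GUID], check=True)
subprocess.run(["powercfg", "/getactivescheme"], check=True)   # confirm the switch
```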

OP also updated.
 

El meso

Member
Oct 27, 2017
551
I bought a 4070 Super this week and it was performing pretty badly; frame gen was a stuttery mess in The Witcher 3 and even Dragon Quest XI was stuttering.
Working smoothly now, thanks for the tips OP
 

Rickyrozay2o9

Member
Dec 11, 2017
4,568
P40L0 question about your guide (perhaps I'm making a mistake here): when using NVCP I know to set HDR to 10-bit, but when viewing SDR content should I also set it to 10-bit or 8-bit?
 

dgrdsv

Member
Oct 25, 2017
12,062
P40L0 question about your guide (perhaps I'm making a mistake here): when using NVCP I know to set HDR to 10-bit, but when viewing SDR content should I also set it to 10-bit or 8-bit?
There is no real difference for SDR content (and even in HDR you won't really see much of one, but at least going with less than 10 bits puts it out of spec).
However, do note that you lose MPO support on Nvidia if you force anything but 8 bits for SDR mode. This could be a much bigger issue than the invisible lack of precision.
 

Rickyrozay2o9

Member
Dec 11, 2017
4,568
There is no real difference for SDR content (and even in HDR you won't really see much of one, but at least going with less than 10 bits puts it out of spec).
However, do note that you lose MPO support on Nvidia if you force anything but 8 bits for SDR mode. This could be a much bigger issue than the invisible lack of precision.
What's MPO and how would I know if it's an issue?
 
OP
P40L0

Member
Jun 12, 2018
7,711
Italy
There is no real difference for SDR content (and even in HDR you won't really see much of one, but at least going with less than 10 bits puts it out of spec).
However, do note that you lose MPO support on Nvidia if you force anything but 8 bits for SDR mode. This could be a much bigger issue than the invisible lack of precision.

What's MPO and how would I know if it's an issue?

Honestly, at least on my LG G3 OLED, I prefer leaving Windows 11 in HDR (so obviously with 10-bit color depth, along with all the other suggested settings in the OP) and managing SDR luminance through the dedicated Windows SDR slider, set to 35 (which should equal 250 nits, the standard for SDR contained in an HDR signal), so in the end everything looks great both in SDR and HDR without the need to manually switch between the two signals.
 

dgrdsv

Member
Oct 25, 2017
12,062
What's MPO and how would I know if it's an issue?
wiki.special-k.info

Presentation Model (D3D11-12)

For when you wonder why things are the way they are...
It's a thing which allows low-latency frame presentation that supports all fullscreen features, basically.
Two main advantages it provides in modern Win11 are:
1. Low-latency frame presentation inside a window (bypassing DWM s/w composition). Note that fullscreen borderless is less affected by this since modern Win11 has a DWM bypass mode even without MPO support.
2. Support for multiple overlays, which is used by Windows OSDs (Game Bar, system notifications, volume bar, etc.) and should either lessen or completely remove syncing issues when such OSD objects appear above a VRR'ed game window. Without MPOs, such OSD elements appearing leads to DWM intervention, which essentially disables VRR and forces vsynced, high-latency composition.
Basically, I would probably prefer MPO support over pointless 10-bit SDR color. But it depends on your typical gaming scenarios.
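If you want to check which presentation path a game actually gets (hardware independent flip / MPO vs. being composed through DWM), PresentMon logs a PresentMode column per frame. A rough sketch of reading that out; the command in the comment uses the classic 1.x CLI flags as I remember them and a hypothetical process name, so check -h on your build:

```python
# After logging a capture with something like:
#   PresentMon-x64.exe -process_name game.exe -output_file present.csv -timed 10
# (classic 1.x CLI flags as I remember them; check -h on your build), the CSV has
# a PresentMode column per frame. "Hardware Composed: Independent Flip" means the
# MPO path is in use; "Composed: Flip" means you're going through DWM composition.
import csv

with open("present.csv", newline="") as f:
    modes = {row["PresentMode"] for row in csv.DictReader(f)}
print(modes)
```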
 

Dec

Prophet of Truth
Member
Oct 26, 2017
3,587
What's MPO and how would I know if it's an issue?

I think the issue is related to HDR toggling. MPO works fine with 10-bit SDR, and also works when you enable HDR; however, if you then toggle back to SDR, MPO breaks in SDR and won't work again until a restart. I don't think there is a solution other than always using 8-bit, restarting after toggling HDR on and then back off, or always having HDR enabled.

Though I'm having trouble finding the thread about the issue I am remembering.

In my experience, if you only have 1 or 2 MPO planes and you go over your MPO plane count, you lose G-Sync until the overlay is gone (this can happen when something like an Xbox achievement pops up).

You can see if MPO is working properly using Special K.
 
OP
P40L0

Member
Jun 12, 2018
7,711
Italy
I think the issue is related to HDR toggling. MPO works fine with 10-bit SDR, and also works when you enable HDR; however, if you then toggle back to SDR, MPO breaks in SDR and won't work again until a restart. I don't think there is a solution other than always using 8-bit, restarting after toggling HDR on and then back off, or always having HDR enabled.

Though I'm having trouble finding the thread about the issue I am remembering.

In my experience, if you only have 1 or 2 MPO planes and you go over your MPO plane count, you lose G-Sync until the overlay is gone (this can happen when something like an Xbox achievement pops up).

You can see if MPO is working properly using Special K.
Always staying in HDR @ 10-bit (and then setting up SDR within HDR through Win11) solves it, even when using VRR, and both for borderless window and old exclusive fullscreen.
 

Rickyrozay2o9

Member
Dec 11, 2017
4,568
wiki.special-k.info

Presentation Model (D3D11-12)

For when you wonder why things are the way they are...
It's a thing which allows low-latency frame presentation that supports all fullscreen features, basically.
Two main advantages it provides in modern Win11 are:
1. Low-latency frame presentation inside a window (bypassing DWM s/w composition). Note that fullscreen borderless is less affected by this since modern Win11 has a DWM bypass mode even without MPO support.
2. Support for multiple overlays, which is used by Windows OSDs (Game Bar, system notifications, volume bar, etc.) and should either lessen or completely remove syncing issues when such OSD objects appear above a VRR'ed game window. Without MPOs, such OSD elements appearing leads to DWM intervention, which essentially disables VRR and forces vsynced, high-latency composition.
Basically, I would probably prefer MPO support over pointless 10-bit SDR color. But it depends on your typical gaming scenarios.
This sounds potentially terrible so I definitely don't want to have these issues.

I think the issue is related to HDR toggling. MPO works fine with 10-bit SDR, and also works when you enable HDR; however, if you then toggle back to SDR, MPO breaks in SDR and won't work again until a restart. I don't think there is a solution other than always using 8-bit, restarting after toggling HDR on and then back off, or always having HDR enabled.

Though I'm having trouble finding the thread about the issue I am remembering.

In my experience, if you only have 1 or 2 MPO planes and you go over your MPO plane count, you lose G-Sync until the overlay is gone (this can happen when something like an Xbox achievement pops up).

You can see if MPO is working properly using Special K.
If this is how it works, it's really not THAT bad, but I suppose it would get annoying down the line if you're having issues with a game using HDR and forget that this could be causing it.

Always staying in HDR @ 10-bit (and then setting up SDR within HDR through Win11) solves it, even when using VRR, and both for borderless window and old exclusive fullscreen.
Interesting. I'm not actually using my C9 with this but my QD-OLED MSI 32URX; I imagine it would be the same thing, though. Funnily enough, I have my SDR slider set to 40 within my calibrated W11 profile. You're saying dropping that down to 35 would essentially be the same thing as using SDR fully with 10-bit activated?
 
OP
P40L0

Member
Jun 12, 2018
7,711
Italy
Interesting. I'm not actually using my C9 with this but my QD-OLED MSI 32URX, but I imagine it would be the same thing. Funny enough I have my SDR slider set to 40 within my calibrated W11 profile. You're saying dropping that down to 35 would essentially be the same thing?
Yeah, same thing.
Also, between 35 and 40 the difference is small (35 = 250 nits, which is the standard, while 40 should be around 285-300 nits, which could be a bit too bright at night / in the dark but is still a good SDR-in-HDR value).
 

dgrdsv

Member
Oct 25, 2017
12,062
I think the issue is related to HDR toggling. MPO works fine with 10-bit SDR, and also works when you enable HDR; however, if you then toggle back to SDR, MPO breaks in SDR and won't work again until a restart. I don't think there is a solution other than always using 8-bit, restarting after toggling HDR on and then back off, or always having HDR enabled.
MPO doesn't work with 10 bit SDR mode on Nvidia at all, doesn't matter what you toggle.
Current MPO support is a) 8 bit SDR, b) HDR10 (so 10 bit HDR). You get 4 planes in these. Toggling between these on the same display won't disable MPO.
Also if you have more than 1 display then only one of them will be getting MPO support if >1 qualify. Which one depends on current weather over Gibraltar probably.
Also worth noting that MPOs are a bit finicky after a restart - you may need to wait for some time after that before they'll activate.
 

Dec

Prophet of Truth
Member
Oct 26, 2017
3,587
MPO doesn't work with 10 bit SDR mode on Nvidia at all, doesn't matter what you toggle.
Current MPO support is a) 8 bit SDR, b) HDR10 (so 10 bit HDR). You get 4 planes in these. Toggling between these on the same display won't disable MPO.
Also if you have more than 1 display then only one of them will be getting MPO support if >1 qualify. Which one depends on current weather over Gibraltar probably.
Also worth noting that MPOs are a bit finicky after a restart - you may need to wait for some time after that before they'll activate.

Yeah, you are correct. I tracked down the thread describing the issue; the one I was describing only happens if you're using the NVCP "default color settings" option:

https://forums.guru3d.com/threads/multiplane-overlay-issues.443121/page-4#post-6198351
 

Rickyrozay2o9

Member
Dec 11, 2017
4,568
Yeah, same thing.
Also, between 35 and 40 the difference is small (35 = 250 nits, which is the standard, while 40 should be around 285-300 nits, which could be a bit too bright at night / in the dark but is still a good SDR-in-HDR value).
Now if I do this, what about playing games that don't support HDR? I'd have to turn Auto HDR off, right?

MPO doesn't work with 10 bit SDR mode on Nvidia at all, doesn't matter what you toggle.
Current MPO support is a) 8 bit SDR, b) HDR10 (so 10 bit HDR). You get 4 planes in these. Toggling between these on the same display won't disable MPO.
Also if you have more than 1 display then only one of them will be getting MPO support if >1 qualify. Which one depends on current weather over Gibraltar probably.
Also worth noting that MPOs are a bit finicky after a restart - you may need to wait for some time after that before they'll activate.
Yeah, in that case I won't do it for sure.
 

Rickyrozay2o9

Member
Dec 11, 2017
4,568
AutoHDR will just work as usual for supported games (if you use it, but I vastly prefer NVIDIA RTX HDR over it + this will work with almost any SDR game)
Gotcha. So just to be clear: Auto HDR is really just for SDR titles with no HDR that support it, and RTX HDR can actually be used both for compatible HDR titles and for non-native-HDR titles that support Auto HDR?
 
OP
P40L0

Member
Jun 12, 2018
7,711
Italy
Gotcha. So just to be clear: Auto HDR is really just for SDR titles with no HDR that support it, and RTX HDR can actually be used both for compatible HDR titles and for non-native-HDR titles that support Auto HDR?
No.
AutoHDR is for selected SDR games (picked by Microsoft), which are auto-converted to HDR (but with raised black levels and color banding).
RTX HDR is NVIDIA's equivalent of AutoHDR; it works with almost all SDR games, which are auto-converted to HDR (with correct black levels, customizable luminance and less color banding). Disable AutoHDR if you want to use this.

Native HDR games won't be touched by either of them.
 

Rickyrozay2o9

Member
Dec 11, 2017
4,568
No.
AutoHDR is for selected SDR games (picked by Microsoft), which are auto-converted to HDR (but with raised black levels and color banding).
RTX HDR is NVIDIA's equivalent of AutoHDR; it works with almost all SDR games, which are auto-converted to HDR (with correct black levels, customizable luminance and less color banding). Disable AutoHDR if you want to use this.

Native HDR games won't be touched by either of them.
I see, thank you for the information. For RTX HDR to work, what exactly is the requirement: DX11/12 titles, or just anything?
 

Rickyrozay2o9

Member
Dec 11, 2017
4,568
Officially it will work for all DX9-12 titles but it actually works on almost anything (DX8, OpenGL, Vulkan) with some adjustments in Win11 and NVCP.

For more info read here and watch this:


View: https://youtu.be/BditFs3VR9c?t=877

Thank you. Sorry, but one more question: I was looking at the options for enabling RTX HDR and it looks like I already have the updated XML file for Inspector to do it. Now, this isn't for my LG, and since my peak brightness on this monitor is around 460-470 nits, what would I put in for peak brightness, since there's no real way to type a direct number in there without the app?
 
OP
P40L0

Member
Jun 12, 2018
7,711
Italy
Thank you. Sorry, but one more question: I was looking at the options for enabling RTX HDR and it looks like I already have the updated XML file for Inspector to do it. Now, this isn't for my LG, and since my peak brightness on this monitor is around 460-470 nits, what would I put in for peak brightness?
I would leave it at the default 1,000 nits, as 460 nits really isn't HDR, so the monitor for sure has some tone mapping going on when it accepts a standard HDR10 1,000-nit signal.
See the article for the exact values to put in Inspector for it.
 

Rickyrozay2o9

Member
Dec 11, 2017
4,568
I would leave it at the default 1,000 nits, as 460 nits really isn't HDR, so the monitor for sure has some tone mapping going on when it accepts a standard HDR10 1,000-nit signal.
See the article for the exact values to put in Inspector for it.
Leave it at 1,000 even with the new QD-OLEDs using HDR 400? I do have a 1000 mode, but the PQ tracking on these 2024 QD-OLEDs using it isn't great, and 400 in most situations looks brighter, for now anyway.
 
OP
P40L0

Member
Jun 12, 2018
7,711
Italy
Leave it at 1,000 even with the new QD-OLEDs using HDR 400? I do have a 1000 mode, but the PQ tracking on these 2024 QD-OLEDs using it isn't great, and 400 in most situations looks brighter, for now anyway.
I would use the standard 1,000 nits on monitors unless they have a (properly working) HGIG mode, which turns off any tone mapping altogether, so the nits you put there exactly match what the display will output (up to the maximum it's actually capable of).
In that second case, you can calculate the hexadecimal value to put in Inspector here (based on max nits).
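For what it's worth, my reading of that article is that the Inspector field just wants the peak-nits value written out in hexadecimal; if that reading is right, the conversion is a one-liner (using the ~460-nit monitor from the question above):

```python
# Assuming (my reading of the linked article) the Inspector field simply takes the
# display's peak nits encoded as a hexadecimal number:
peak_nits = 460                 # the ~460-nit monitor from the question above
print(hex(peak_nits))           # -> 0x1cc
print(f"0x{peak_nits:08X}")     # -> 0x000001CC, in case Inspector expects padding
```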
 

Ghostshark

One Winged Slayer
Member
Oct 25, 2017
112
The default "Balanced" power plan (in old Control Panel) + "Best Performance" power mode (in New Control Panel) when plugged to AC is now recommended.
Thanks for this tip. While I can't tell the difference on a VRR display, this really improved my Moonlight AV1 streaming. With the old settings I was getting 15 ms spikes running 1080p 120 fps; now it runs flawlessly at 1440p 120 fps with 4 ms decoder lag. I'm also no longer getting camera judder at all, even with mouse and keyboard.
 

Septimus

Member
Oct 27, 2017
1,609
Feels like I've tried everything and I cannot get G-SYNC / "Set Up G-SYNC" to show up in the NVIDIA settings :(

In case anyone has any thoughts or ideas, my hardware is a 4090 + LG CX (65"), certified HDMI cable, label set to PC, power cycled, Game Mode, Instant Game Response*, etc.

*Been thinking maybe this is it. When I use it on PS5, a banner comes up saying Instant Game Response is on. No such banner on PC. Not to mention the TV doesn't switch automatically to the PC input when turning it on.