
Dinjoralo

Member
Oct 25, 2017
9,197
SOTTR runs with newer versions just fine.
But if you're asking whether setting this global driver variable to the new Preset E will affect games using older DLSS versions where such a profile exists but is "empty" (3.1.11-3.6.0), then that's an interesting question which needs to be checked.
I thiiiink older DLSS versions will just fall back on whatever the game sets as the default? Not too sure.
Either way, stoked to see if this can give better image quality in DD2 and LaD Infinite Wealth. Assuming DD2 won't pitch a fit from replacing the DLL...
 

dgrdsv

Member
Oct 25, 2017
11,911
Replacing the .dll with 3.7.0 in all games and enabling Preset E globally via Inspector + the added XML does actually apply it, and everything looks and performs as well as it ever did on my side.
I've only done some small visual tests of E in HFW during night (lol) so I can't really say much about its quality. Seems like a solid preset for SR though.
With DLAA I haven't noticed any apparent differences from the game's default C.

DLAA is almost useless now with how much DLSS Quality improved with the latest update.
Yeah, well, that was kinda already the case previously. I was impressed by the detail DLSS P with Preset E is able to resolve, though.

I thiiiink older DLSS versions will just fall back on whatever the game sets as the default? Not too sure.
Games pre-3.1 didn't really have any preset control, and whatever they default to in these versions is usually a DLSS-side choice based on app ID.
But if they're just falling back to D then that's fine; it means forcing E for SR modes globally shouldn't affect such games (much, at least).

The DLSS programming guide in the DLSS repo was updated too btw and has some additional information:
• Preset A (intended for Perf/Balanced/Quality modes):
o An older variant best suited to combat ghosting for elements with missing inputs (such as motion vectors)

• Preset B (intended for Ultra Perf mode):
o Similar to Preset A but for Ultra Performance mode

• Preset C (intended for Perf/Balanced/Quality modes):
o Preset which generally favors current frame information. Generally well-suited for fast-paced game content

• Preset D (intended for Perf/Balanced/Quality modes):
o Similar to Preset E. Preset E is generally recommended over Preset D.

• Preset E (intended for Perf/Balanced/Quality modes):
o The default preset for Perf/Balanced/Quality mode. Generally favors image stability

• Preset F (intended for Ultra Perf/DLAA modes):
o The default preset for Ultra Perf and DLAA modes.

• Preset G (Unused)
So E is an updated version of D, and there's a new "unused" preset G now. Makes you wonder why use a new preset if it's just an update of D...
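
For anyone curious what this looks like on the developer side: a minimal sketch of how a game would hint per-mode presets through NGX parameters. The function and macro names follow the public DLSS SDK headers, but treat the exact pattern as an assumption rather than a drop-in implementation.

Code:
#include "nvsdk_ngx.h"
#include "nvsdk_ngx_defs.h"

// Hint Preset E for the three SR modes; DLAA is left on its default (F).
void HintPresetE(NVSDK_NGX_Parameter* params)
{
    NVSDK_NGX_Parameter_SetUI(params,
        NVSDK_NGX_Parameter_DLSS_Hint_Render_Preset_Quality,
        NVSDK_NGX_DLSS_Hint_Render_Preset_E);
    NVSDK_NGX_Parameter_SetUI(params,
        NVSDK_NGX_Parameter_DLSS_Hint_Render_Preset_Balanced,
        NVSDK_NGX_DLSS_Hint_Render_Preset_E);
    NVSDK_NGX_Parameter_SetUI(params,
        NVSDK_NGX_Parameter_DLSS_Hint_Render_Preset_Performance,
        NVSDK_NGX_DLSS_Hint_Render_Preset_E);
}

Tools like Inspector presumably flip the equivalent driver-side override, which is why a global force can win over whatever the game hints here.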

3.16 Alpha Upscaling Support
By default, DLSS is intended for 3-channel RGB images only. Experimental support for upscaling 4-channel RGBA images can be enabled by setting the NVSDK_NGX_DLSS_Feature_Flags_AlphaUpscaling flag at creation time. For best results, the RGB color should be premultiplied by alpha in the color input.
Note: performance will be impacted by enabling this feature. Expect the overall execution time of DLSS to increase by 15-25% when alpha blending is enabled.
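
To make that concrete, here is a minimal sketch of where that flag would be set, assuming the usual DLSS SDK feature-creation pattern. Only the flag name itself comes from the guide above; the surrounding struct and field names follow the public headers and should be treated as assumptions.

Code:
#include "nvsdk_ngx.h"
#include "nvsdk_ngx_helpers.h"

// Fill out create params for a DLSS feature that upscales RGBA instead of RGB.
NVSDK_NGX_DLSS_Create_Params MakeAlphaCreateParams(unsigned renderW, unsigned renderH,
                                                   unsigned outputW, unsigned outputH)
{
    NVSDK_NGX_DLSS_Create_Params p = {};
    p.Feature.InWidth        = renderW;   // internal render resolution
    p.Feature.InHeight       = renderH;
    p.Feature.InTargetWidth  = outputW;   // display/output resolution
    p.Feature.InTargetHeight = outputH;
    p.InFeatureCreateFlags   = NVSDK_NGX_DLSS_Feature_Flags_MVLowRes        // typical SR setup
                             | NVSDK_NGX_DLSS_Feature_Flags_AlphaUpscaling; // opt into 4-channel RGBA
    // Per the guide: premultiply RGB by alpha in the color input, and expect
    // DLSS execution time to rise by roughly 15-25% with this enabled.
    return p;
}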
 
Last edited:
OP
P40L0

Member
Jun 12, 2018
7,631
Italy
The DLSS programming guide in the DLSS repo was updated too btw and has some additional information:

So E is an updated version of D, and there's a new "unused" preset G now. Makes you wonder why use a new preset if it's just an update of D...
Preset E seems more like a new thing than just an update of D: it looks like C but it's stable like F, which is exactly what DLSS needed for SR.

The unused Preset G was also there before, so we'll see how they use it in the future (even if there's not much left to improve for SR right now).

For example, the experimental Alpha Upscaling support seems very interesting... but that 15-25% perf penalty is definitely a no-go, as it goes against the main purpose of DLSS.
 

Dinjoralo

Member
Oct 25, 2017
9,197
Preset E seems more like a new thing than just an update of D: it looks like C but it's stable like F, which is exactly what DLSS needed for SR.

The unused Preset G was also there before, so we'll see how they use it in the future (even if there's not much left to improve for SR right now).

For example, the experimental Alpha Upscaling support seems very interesting... but that 15-25% perf penalty is definitely a no-go, as it goes against the main purpose of DLSS.
Oh thank God, alpha support...
I think this would be used in places like menus where you have a render of a character with the menu as a backdrop, so you don't have stuff like this. Since apparently the only alternative is using really crap upscaling, instead of just like, keeping those elements at native res...
 

dgrdsv

Member
Oct 25, 2017
11,911
I think this would be used in places like menus
All game graphics are blended before becoming the front buffer which you see, which is why there's usually no need to handle the alpha channel, or even to have it in the front buffer color format (RGB10A2 leaves just 2 bits for alpha values, for example).
It's not really clear where you would get any benefit from using DLSS with an alpha channel. I can't even think of any examples.
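
A tiny standalone snippet makes the RGB10A2 point concrete (nothing DLSS-specific here, just the arithmetic of having 2 alpha bits):

Code:
#include <cstdint>
#include <cstdio>

int main()
{
    // R10G10B10A2: 10 bits per color channel leaves 2 bits for alpha,
    // i.e. only four representable alpha levels across [0, 1].
    const uint32_t alphaBits   = 2;
    const uint32_t alphaLevels = 1u << alphaBits; // 4
    for (uint32_t a = 0; a < alphaLevels; ++a)
        std::printf("alpha %u/%u = %.3f\n", a, alphaLevels - 1,
                    static_cast<double>(a) / (alphaLevels - 1));
    return 0;
}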

where you have a render of a character with the menu as a backdrop, so you don't have stuff like this
That's just a badly implemented upscaling of this screen in particular.

Since apparently the only alternative is using really crap upscaling, instead of just like, keeping those elements at native res...
Anything which can be overlaid on top of the rendered image can be native res. The background and the model in the example above should just provide the data needed for the upscaler to work - which they apparently do not.
 
OP
OP
P40L0

P40L0

Member
Jun 12, 2018
7,631
Italy
This is Horizon Forbidden West v1.1.47 with max settings at 4K + DLSS (3.7.0, Preset E) Quality + FG + Reflex: On, and all my PC Optimizations applied, on my 7800X3D and 4080 (undervolted + overclocked):

Horizon-Forbidden-West-Complete-Edition-v1-1-47-0-07-04-2024-23-57-51.png


Frametime is essentially flat, with very small fluctuations completely smoothed out by G-Sync on my LG G3 OLED.
Only in cutscenes are there some very small stutters on some scene transitions, but often you don't even notice them.

I think it's among the best PS5/console ports on PC and one of the best-looking games ever so far. 🥹
 
OP
P40L0

Member
Jun 12, 2018
7,631
Italy


I'm glad to announce the opening of my new YouTube Playlist:
[4K/HDR] Optimized Gameplay


You will find very high quality (among the best seen on YT), direct-feed gameplay recordings in native 4K, 60fps, with color-accurate and calibrated HDR, after applying all the optimizations in this guide to my new PC (full specs and setup in each video description).

For this reason, I highly recommend viewing them on a true HDR display!

YouTube will still decently tone map them to SDR automatically if you can't.

I will periodically add new gameplay recordings for select games over time, so if you like them please support my work by leaving a Like & Subscribe on the YouTube channel and on Patreon!


Enjoy :)

-P


NOTE:
In order to preserve the highest and smoothest recording quality, the actual performance you see is around 5-10% lower than normal. Just consider that as a reference if you want to compare it with similar setups.
 
Last edited:

AYZON

Member
Oct 29, 2017
908
Germany
Aloha,
I copied the settings from the OP for NVIDIA GPUs and I'm mostly happy with them.

The only issue I have (which maybe just highlights that my approach isn't ideal): when running games in windowed mode and limiting their max fps (so that my GPU has to work less), other windows (and my OS) start to flicker.
My monitor refresh rate is 144Hz, and the game in question is limited via NVIDIA Control Panel to 40fps (as otherwise it would be unlimited and cook my GPU). The game itself offers no such option.
Any idea how I could approach this differently, or fix the flickering without changing my approach?
 
OP
P40L0

Member
Jun 12, 2018
7,631
Italy
Aloha,
I copied the settings from the OP for NVIDIA GPUs and I'm mostly happy with them.

The only issue I have (which maybe just highlights that my approach isn't ideal): when running games in windowed mode and limiting their max fps (so that my GPU has to work less), other windows (and my OS) start to flicker.
My monitor refresh rate is 144Hz, and the game in question is limited via NVIDIA Control Panel to 40fps (as otherwise it would be unlimited and cook my GPU). The game itself offers no such option.
Any idea how I could approach this differently, or fix the flickering without changing my approach?
You don't have to add manual frame limiters anymore.

Just enable NVIDIA Ultra Low Latency in NVCP and set Reflex to "On" in-game whenever available; those will auto-cap everything to 3-4 fps below your refresh rate (so around 140fps max), lowering input lag and ensuring the flattest frametimes.
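
For reference, the auto cap is commonly approximated as refresh - refresh^2/3600; treat that as an assumption rather than an official NVIDIA formula, but it lands a few fps under refresh, as described above:

Code:
#include <cstdio>
#include <initializer_list>

int main()
{
    // Widely reported approximation of the Reflex/ULL auto frame cap
    // (an assumption, not an official NVIDIA formula): cap = hz - hz*hz/3600.
    for (double hz : {60.0, 120.0, 144.0, 240.0})
        std::printf("%.0f Hz -> ~%.1f fps cap\n", hz, hz - hz * hz / 3600.0);
    return 0;
}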

If your GPU runs too hot when doing this, just install MSI Afterburner and undervolt it.
You will achieve much lower temps/noise with the same, if not better, performance than stock. There are many guides on YT depending on your GPU.
 

AYZON

Member
Oct 29, 2017
908
Germany
You don't have to add manual frame limiters anymore.

Just enable NVIDIA Ultra Low Latency in NVCP and set Reflex to "On" in-game whenever available; those will auto-cap everything to 3-4 fps below your refresh rate (so around 140fps max), lowering input lag and ensuring the flattest frametimes.

If your GPU runs too hot when doing this, just install MSI Afterburner and undervolt it.
You will achieve much lower temps/noise with the same, if not better, performance than stock. There are many guides on YT depending on your GPU.
I probably worded this badly, but my issue is with things like idle games or Wallpaper Engine; having them run at 144fps makes no sense and just stresses the GPU/consumes power.
For games in general you're probably right; in this case it's about reducing power consumption and taking load off the GPU for minor tasks that don't require 100% usage.
 
OP
P40L0

Member
Jun 12, 2018
7,631
Italy
I probably worded this badly, but my issue is with things like idle games or Wallpaper Engine; having them run at 144fps makes no sense and just stresses the GPU/consumes power.
For games in general you're probably right; in this case it's about reducing power consumption and taking load off the GPU for minor tasks that don't require 100% usage.
Reaching 140fps doesn't automatically mean the GPU will be at 100% usage and maximum power consumption.

For lighter games and 3D applications the power draw will still be lower even at that framerate, especially when you're also using the Windows "Balanced" power plan + power mode in Win11, coupled with the "Normal" GPU power mode in NVCP, as suggested.
 

AYZON

Member
Oct 29, 2017
908
Germany
Maybe it's Wallpaper Engine being weird, but if I set its fps to 144 (there is no unlocked option in the program itself), it definitely increases the usage significantly.
On a wallpaper that doesn't do much animation:
10fps limit = ~2-6% usage
144fps limit = ~15-20% usage
Another wallpaper I tried had it go above 30% usage.

The game I tried is Summoners War, but it seems to run mostly on the CPU. Still, unlocking the framerate doubles the GPU usage from 2-3% to around 5-6%.
Maybe I'm missing some settings; I'll check again if I forgot to change something.
Placing both in "efficiency mode" via Task Manager helps, but I guess this is not an automatic process.

Edit:
Checked the settings mentioned in the OP again and have no idea what's going wrong. I even went further and activated the Windows power saving mode instead of balanced.
 
Last edited:

d0x

Member
Oct 25, 2017
1
Maybe it's Wallpaper Engine being weird, but if I set its fps to 144 (there is no unlocked option in the program itself), it definitely increases the usage significantly.
On a wallpaper that doesn't do much animation:
10fps limit = ~2-6% usage
144fps limit = ~15-20% usage
Another wallpaper I tried had it go above 30% usage.

The game I tried is Summoners War, but it seems to run mostly on the CPU. Still, unlocking the framerate doubles the GPU usage from 2-3% to around 5-6%.
Maybe I'm missing some settings; I'll check again if I forgot to change something.
Placing both in "efficiency mode" via Task Manager helps, but I guess this is not an automatic process.

Edit:
Checked the settings mentioned in the OP again and have no idea what's going wrong. I even went further and activated the Windows power saving mode instead of balanced.

Wallpaper Engine should not be using that much GPU time for what it does... That's where your issue lies; well, that or maybe dwm.exe (Desktop Window Manager), which handles Explorer's visuals. Previously there was an NVIDIA driver bug that caused a memory leak; they finally fixed that a few months ago. It wouldn't explain temps, but it did explain lots of games in 2023 appearing to launch with leaks.

That said, sometimes DWM can get into an e-argument with other apps that try to do its job at the same time. If it's not a conflict between DWM and Wallpaper Engine then it's just Wallpaper Engine, because yeah... it shouldn't be chewing up that much GPU time unless you're running it on an AMD 270X lol
 

AYZON

Member
Oct 29, 2017
908
Germany
Wallpaper Engine should not be using that much GPU time for what it does... That's where your issue lies; well, that or maybe dwm.exe (Desktop Window Manager), which handles Explorer's visuals. Previously there was an NVIDIA driver bug that caused a memory leak; they finally fixed that a few months ago. It wouldn't explain temps, but it did explain lots of games in 2023 appearing to launch with leaks.

That said, sometimes DWM can get into an e-argument with other apps that try to do its job at the same time. If it's not a conflict between DWM and Wallpaper Engine then it's just Wallpaper Engine, because yeah... it shouldn't be chewing up that much GPU time unless you're running it on an AMD 270X lol
I think I may have figured it out. I monitored the GPU usage via the Windows Task Manager, which apparently doesn't take the GPU clock into account, so it sometimes looks like the GPU is under heavy load even though it's downclocked. Monitoring the usage and GPU clock with GPU-Z showed that while the load % seems to increase, the GPU clock stays almost the same, so the actual difference isn't that big.
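
To put rough numbers on that (the clock and usage figures below are purely hypothetical, just to illustrate why the Task Manager percentage overstates load at low clocks):

Code:
#include <cstdio>

// Rough illustration (an approximation, not how the driver accounts for load):
// the utilization percentage is relative to the *current* clock, so scale it
// by clock to compare the actual work being done.
int main()
{
    const double maxClockMHz = 2700.0;             // hypothetical boost clock
    const double utilPct = 18.0, clockMHz = 300.0; // hypothetical idle-clock reading
    std::printf("~%.1f%% of peak throughput\n", utilPct * clockMHz / maxClockMHz);
    return 0;
}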

Man that drove me crazy, thanks everyone!