
ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987

Downsampling or super-sampling down from a higher resolution is a brute-force form of anti-aliasing. Nvidia's new DLDSR aims to do the same job with lower pixel counts, improved performance and maybe improved quality. What does it do? How does it work? Can a downsampled 1620p really look as good as a downsampled 2160p? And what happens if you combine DLSS with DLDSR? Alex Battaglia has answers!
  • DSR (from 2014) is ordered-grid super-sampling AA
  • 60% performance loss on a 2060 when rendering at 2160p
  • non-integer scaling factors can cause issues because the pixels don't divide evenly
  • DLDSR targets the filtering phase of DSR to correct the errors introduced by uneven scaling
  • there seems to be edge detection and edge anti-aliasing
  • DLDSR is 3% slower (on a 2060) than DSR at the same settings
  • 0% smoothing does create errors
  • smoothing filter is recommended
  • DLDSR has better temporal stability than DSR
  • competitive with DSR 4x
  • some resolution-dependent aspects can't match the higher internal res of DSR 4x, however: depth of field improves with higher internal res, and thin lines exhibit fewer holes because more pixels are shaded
  • DLDSR only offers 1.78x and 2.25x scaling, while DSR has a wider range of scaling factors (see the quick resolution sketch below)
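For reference, here's a quick sketch (my own arithmetic, not from the video) of how those pixel-count factors map to actual render resolutions. The per-axis scale is the square root of the factor, which is also why the non-integer factors don't divide the screen evenly:

```python
# Sketch: DSR/DLDSR factors are ratios of total pixel counts,
# so each axis scales by sqrt(factor).
import math

def internal_res(width: int, height: int, factor: float) -> tuple[int, int]:
    """Internal render resolution for a given pixel-count scaling factor."""
    per_axis = math.sqrt(factor)
    return round(width * per_axis), round(height * per_axis)

native = (1920, 1080)  # e.g. a 1080p display
for factor in (1.78, 2.25, 4.0):  # DLDSR offers 1.78x/2.25x; DSR goes up to 4x
    w, h = internal_res(*native, factor)
    print(f"{factor}x -> {w}x{h}")
# 1.78x -> ~2560x1440, 2.25x -> 2880x1620, 4x -> 3840x2160
# Only 4x divides evenly (exactly 2 px per axis); the others don't,
# which is the filtering problem DLDSR targets.
```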
 

Mocha Joe

Member
Jun 2, 2021
6,306
Can’t wait to watch this later.

From my experience of enabling it in God of War, I could not tell any difference between native 4K and DLDSR enabled (with either of the options on). So I am keeping it off for now.
 

Anatole

Member
Mar 25, 2020
1,312
The long and short of it, imo: this is a resampling problem.

DSR in its current form uses a 13 tap Gaussian filter to downsample.

DLDSR uses a neural network instead. It’s probably a convolutional neural network, which is basically a bunch of filters in series and parallel. When you train a convolutional neural network, you optimize the weights in each filter by minimizing a loss function, which is why it can get better results on image processing tasks like this.

Unlike DLSS, there’s no temporal information required. There’s no upscaling either. It’s just about better filtering.
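To make that concrete, here's a toy sketch of the difference - the 13-tap Gaussian matches what DSR reportedly uses, but the learned side is pure guesswork on my part, since Nvidia hasn't published the architecture:

```python
# Toy sketch only: the CNN layer sizes below are invented.
import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F

def gaussian_kernel(taps: int = 13, sigma: float = 2.0) -> np.ndarray:
    """The fixed filter: a separable Gaussian like classic DSR's."""
    ax = np.arange(taps) - taps // 2
    g = np.exp(-(ax ** 2) / (2 * sigma ** 2))
    kernel = np.outer(g, g)
    return kernel / kernel.sum()

class LearnedDownsampleFilter(nn.Module):
    """A DLDSR-like stand-in: filter weights are trained, not hand-picked."""
    def __init__(self):
        super().__init__()
        self.filters = nn.Sequential(      # "a bunch of filters in series"
            nn.Conv2d(3, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(16, 3, kernel_size=3, padding=1),
        )

    def forward(self, hi_res: torch.Tensor) -> torch.Tensor:
        filtered = self.filters(hi_res)    # (N, 3, H, W) high-res input
        # Resample a 2.25x image down to display res (1/1.5 per axis).
        return F.interpolate(filtered, scale_factor=1 / 1.5, mode="bilinear")
```

Training would then just mean minimizing a loss (say, MSE against some reference image) over those conv weights, which is exactly what a fixed Gaussian can't do.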
😎
 

Sullivan

Banned
Mar 4, 2021
23,353
Interesting. When I first heard of DLDSR, I thought it was upscaling with DLSS and then downsampling like regular DSR.
 

RayCharlizard

Member
Nov 2, 2017
2,019
Is there a recommended smoothing setting for general use?
When using DLDSR, Alex recommends around 50% smoothing to defeat some of the excess sharpening that the machine-learning filter applies. For standard DSR, use 33% (or less; personal preference, really).

On God of War, when using both 2.25x DLDSR and DLSS Quality, he set smoothing to 80%.
 

tokkun

Member
Oct 27, 2017
4,906
The video did a good job of explaining what the setting does; however, I came away from it not knowing when I should actually use it.

The critical piece that's missing is a comparison with anti-aliasing enabled. A large part of DSR's image quality advantage is that you're getting SSAA out of it. The natural question is how it compares, in performance and quality, with using a lower internal resolution plus some form of non-super-sampled AA.

As we start combining these techniques, it's getting incredibly difficult to figure out what even a reasonably good combination of settings is in PC gaming.
 

RayCharlizard

Member
Nov 2, 2017
2,019
The video did a good job of explaining what the setting does; however, I came away from it not knowing when I should actually use it.

The critical piece that's missing is a comparison with anti-aliasing enabled. A large part of DSR's image quality advantage is that you're getting SSAA out of it. The natural question is how it compares, in performance and quality, with using a lower internal resolution plus some form of non-super-sampled AA.

As we start combining these techniques, it's getting incredibly difficult to figure out what even a reasonably good combination of settings is in PC gaming.
There's no direct comparison because I imagine that's beyond the scope of the video, and I think Alex might already have done an anti-aliasing tech focus in the past? I'm not sure. But really this is just new information to add to whatever you already know.

Any form of super-sampled AA is going to compare pretty similarly against any form of AA that isn't just brute-forcing with more pixels. FXAA is the blurriest; TAA is a bit less so, but it can be more smeary, and quality is largely engine/implementation dependent. Older methods look worse than what you'll get in the latest UE4/UE5 games. The scale hasn't really changed: adding more rendered pixels will give better image quality than depending on a filter; the question is whether you have the performance headroom to do that.
 

Firefly

Member
Jul 10, 2018
7,567
The video did a good job of explaining what the setting does; however, I came away from it not knowing when I should actually use it.

The critical piece that's missing is a comparison with anti-aliasing enabled. A large part of DSR's image quality advantage is that you're getting SSAA out of it. The natural question is how it compares, in performance and quality, with using a lower internal resolution plus some form of non-super-sampled AA.

As we start combining these techniques, it's getting incredibly difficult to figure out what even a reasonably good combination of settings is in PC gaming.
The performance impact is just the cost of whatever resolution you choose to downsample from on your GPU. Since this is basically a technique to achieve AA, the idea is to see how it treats the image with no AA. You can think of it as "extra" AA on top of whatever the game is using. Which is to say, your performance headroom is still limited by the GPU you have.
 

fixing ranger

Member
Aug 24, 2021
409
The video did a good job of explaining what the setting does; however, I came away from it not knowing when I should actually use it.

The critical piece that's missing is a comparison with anti-aliasing enabled. A large part of DSR's image quality advantage is that you're getting SSAA out of it. The natural question is how it compares, in performance and quality, with using a lower internal resolution plus some form of non-super-sampled AA.

As we start combining these techniques, it's getting incredibly difficult to figure out what even a reasonably good combination of settings is in PC gaming.

From my understanding, you use it if you're on a 1080p monitor but want the increased IQ of higher resolutions; this algorithm is a bit more efficient than direct super-sampling.
 

Sambo

Member
Oct 28, 2017
149
UK
I just tried this in FF14 and boy does it make a good difference. I tried DSR before and the UI would get messed up, but not so with DLDSR; the UI looks great. I have smoothness at 20% and it looks fine to me. The best thing about this feature is that it fixes the shoddy FXAA the game has always had.
 

rashbeep

Banned
Oct 27, 2017
8,790
The video did a good job of explaining what the setting does; however, I came away from it not knowing when I should actually use it.

The critical piece that's missing is a comparison with anti-aliasing enabled. A large part of DSR's image quality advantage is that you're getting SSAA out of it. The natural question is how it compares, in performance and quality, with using a lower internal resolution plus some form of non-super-sampled AA.

As we start combining these techniques, it's getting incredibly difficult to figure out what even a reasonably good combination of settings is in PC gaming.

I use it in any game with bad AA settings and horrible forced TAA. This actually works well in Halo Infinite, and I'd imagine it would work in God of War as well.
 

tokkun

Member
Oct 27, 2017
4,906
There's no direct comparison because I imagine that's beyond the scope of the video, and I think Alex might already have done an anti-aliasing tech focus in the past? I'm not sure. But really this is just new information to add to whatever you already know.

Any form of super-sampled AA is going to compare pretty similarly against any form of AA that isn't just brute-forcing with more pixels. FXAA is the blurriest; TAA is a bit less so, but it can be more smeary, and quality is largely engine/implementation dependent. Older methods look worse than what you'll get in the latest UE4/UE5 games. The scale hasn't really changed: adding more rendered pixels will give better image quality than depending on a filter; the question is whether you have the performance headroom to do that.

Knowing what they do at a broad level isn't that helpful when it comes to deciding what to enable in a given game. As some of these features can be layered - for instance DLSS + DLDSR, as shown in the video - there's getting to be a combinatorial explosion of different options for achieving anti-aliasing.

I understand that also makes it infeasible to cover all options in a video. But it would have been nice to show at least one comparison against something like native resolution + engine-level AA. Digital Foundry has previously done TAA vs. DLSS comparisons, and those were pretty useful.
 
Oct 28, 2017
1,951
With DLDSR, you turn off in-game AA to make the game sharper and rely on DLDSR to do the AA for you. That's a valuable trick.

Quite a few games with TAA are less sharp than you'd expect.
 

Green

Member
Oct 27, 2017
7,198
Can confirm, this works wonders with DLSS as well.

Played some Control on my 1440p screen and used DLDSR to super-sample from 4K, while running DLSS in-game to render internally at 1440p.

Looks fantastic, with basically zero performance loss compared to running native 1440p.
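If you want to sanity-check why the combo is basically free, the resolution chain works out like this (assuming DLSS Quality's usual ~0.667 per-axis scale, which may differ per title):

```python
# Sanity check of the DLDSR + DLSS combo above (assumed ~0.667
# per-axis DLSS Quality scale; exact figures can vary per game).
display = (2560, 1440)                     # native 1440p panel
dldsr_axis = 2.25 ** 0.5                   # 2.25x pixels = 1.5x per axis
target = (round(display[0] * dldsr_axis),  # 3840x2160: the resolution the
          round(display[1] * dldsr_axis))  # game thinks it is outputting
internal = (round(target[0] * 0.667),      # what DLSS actually renders:
            round(target[1] * 0.667))      # ~2560x1440, i.e. native cost
print(target, internal)                    # (3840, 2160) (2561, 1441)
```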

There are other things you can do with it, as the video suggests, and it's probably less useful for gaming at 4K outside of really performance-optimized games. But for those on 1080p and 1440p screens it's bananas. Great feature update.

This driver version also includes a new "inject raytracing" sort of filter for a small handful of games. It's interesting. Extremely costly performance-wise, and unless you're extremely perceptive it's hard to notice. But it's cool they're starting to work on this either way and adding it to the built-in tools.

Edit: As a tip, this only works in games that have exclusive full screen. But you can set Windows' desktop resolution to your DLDSR resolution after enabling it to bypass that limitation - though it might cause weird issues in some games. This is how I got it working in God of War, for example, which currently doesn't have an exclusive full screen mode.
 

RayCharlizard

Member
Nov 2, 2017
2,019
Knowing what they do at a broad level isn't that helpful when it comes to making a decision about what to enable in a given game. As some of these features can be layered - for instance DLSS + DLDSR as shown in the video - there is getting to be a combinatorial explosion of different options to achieve anti-aliasing.

I understand that also makes it infeasible to cover all options in a video. But it would have been nice to show at least one comparison point to something like native resolution + engine-level AA as a point of comparison. Digital Foundry has previously done comparisons with TAA vs DLSS, and that was pretty useful.
It's a lot to consider for sure, but my general rule of thumb lately has just been: if the game has DLSS, use it lol. The latest version of DLSS is always going to be better than the game's native TAA, as DLSS is built on top of that functionality (unless that changed recently?), and it won't suffer from the same blurring or softening that TAA is known for. In the case of the games shown in the video, Witcher 3 doesn't support DLSS, so it's just TAA vs. SSAA, which you can already see in hundreds of comparisons of other games online. And for God of War, I believe they showed TAA vs. DLSS in the PC analysis video.
 

Green

Member
Oct 27, 2017
7,198
It's a lot to consider for sure, but my general rule of thumb lately has just been: if the game has DLSS, use it lol. The latest version of DLSS is always going to be better than the game's native TAA, as DLSS is built on top of that functionality (unless that changed recently?), and it won't suffer from the same blurring or softening that TAA is known for. In the case of the games shown in the video, Witcher 3 doesn't support DLSS, so it's just TAA vs. SSAA, which you can already see in hundreds of comparisons of other games online. And for God of War, I believe they showed TAA vs. DLSS in the PC analysis video.

Yes, for non-DLSS games the advantage of DLDSR is really that you can super-sample at 1.78x with results comparable to 2.25x, so you can get a decent performance boost with pretty close image quality, depending on the game. It's not super useful if you're on a 4K panel, due to the performance cost of super-sampling above 4K either way, but if you're on 1440p or 1080p or even ultrawide (though I hear there's a bug right now with ultrawide 1440p and 1.78x), it can be great. Especially if you have a G-Sync display, you can get a 10+ fps performance boost.
 

Terbinator

Member
Oct 29, 2017
7,791
The problem I have with this is that the driver detects my monitor's native res as 4K (not 1440p), so the lowest figure I can get is actually 2880p (at 1.78x), which is a massive penalty even on a 3080 Ti in any recent game.
 

Rickyrozay2o9

Member
Dec 11, 2017
2,928
The problem I have with this is that the driver detects my monitor's native res as 4K (not 1440p), so the lowest figure I can get is actually 2880p (at 1.78x), which is a massive penalty even on a 3080 Ti in any recent game.
I was having this issue when I had my second monitor, my LG C9, hooked up. Once I unhooked the TV, it showed the correct resolutions. Although even when it showed the wrong resolutions, I'm pretty sure it was outputting correctly.

EDIT: Actually, I take that back. Make sure it shows the correct resolutions in the NVCP, because if the game shows the right resolution but the NVCP tells you something different, you get funny results.
 

Hzoltan969

Member
Oct 26, 2017
185
I only have 23/24 Hz refresh rates available for the DLDSR resolutions - any ideas how I can fix it? I've tried everything I could think of, but no luck...
 

Conf

Member
Dec 4, 2021
418
Even 33% is oversharpened, going by the video.
I would start at 50%, personally.
 

Kyle Cross

Member
Oct 25, 2017
7,659
Any way to use this at 1080p on a native 4K display, so you can take advantage of 120 Hz HDR within the HDMI 2.0 bandwidth limitation?
 

EatChildren

Wonder from Down Under
Member
Oct 27, 2017
6,541
A semi-ideal middle ground for those on 1440p monitors with RTX grunt seems to be something like DLDSR 2.25x to downsample from 4K, but then also using DLSS Balanced or Performance. You get the benefits of efficient image reconstruction to 4K, and the downsampling should obscure some of the deficiencies of using Performance over Quality in DLSS. 4K Performance DLSS is, as far as I'm aware, working from a slightly higher base resolution than 1440p Quality DLSS anyway, so the math backs it up.
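For the curious, here's that math, using the commonly cited DLSS per-axis scale factors (an assumption on my part, not figures from the video):

```python
# Rough check of the claim above; per-axis scales are the commonly
# cited ones (Quality ~0.667, Balanced ~0.58, Performance 0.50).
DLSS_SCALES = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}

def dlss_internal(width: int, height: int, mode: str) -> tuple[int, int]:
    """Internal resolution DLSS reconstructs from for a given output."""
    s = DLSS_SCALES[mode]
    return round(width * s), round(height * s)

print(dlss_internal(3840, 2160, "Performance"))  # (1920, 1080)
print(dlss_internal(2560, 1440, "Quality"))      # (1708, 960)
# 4K Performance reconstructs from a full 1080p base - slightly higher
# than the ~960p base of 1440p Quality, as claimed above.
```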
 

Poison Jam

Member
Nov 6, 2017
2,932
A problem with this, and with regular DSR, is that it uses your screen's native resolution as a base. A lot of 4K televisions report the film-industry-standard 4K resolution (4096 × 2160), which makes games look squished and breaks Auto HDR.

Now, I know it's mainly a feature targeting monitors at 1440p or lower, but still. I've also heard of people with ultrawides having similar issues.
 

Terbinator

Member
Oct 29, 2017
7,791
So using a DP cable rather than HDMI locks my native res to 1440p, but the scaling from 2.25x (4K) in DLDSR is definitely worse than 4K DSR in FH5.

One step forward, one step back lol.
 

elenarie

Game Developer
Verified
Jun 10, 2018
8,299
The video did a good job of explaining what the setting does, however I came away from it not knowing when I should actually use it.

The critical piece that is missing is a comparison with anti aliasing enabled. A large part of the image quality advantage of DSR is that you are getting SSAA out of it. The natural question is going to be how it compares in performance and quality versus using a lower internal resolution with any sort of non-super-sampled AA.

As we start combining these techniques it is getting incredibly difficult to figure out what even a reasonably good combination of settings is in PC gaming.

Basically, you would use this if you have something like a 1080p or 1440p 60 Hz screen without any VRR, and you can spare the GPU resources needed to run it.

If you have a 4K 60 Hz screen, or something like a 1080p/1440p 120+ Hz screen, you would probably prefer the better performance. But to each their own, I guess.
 

RedHeat

Member
Oct 25, 2017
11,222
I think this is the perfect companion to DLSS: a noticeable increase in picture quality over just using DLSS, plus no performance loss.
 

craven68

Member
Jun 20, 2018
4,214
I'm sorry to bump this thread, but I didn't know where else to ask.
I wanted to try DLDSR, but the refresh rate is wrong: I can only set it to 24 Hz. Can't it do 60 Hz? I'm playing on a 4K TV.
In-game and in the Nvidia settings, the 1.78x and 2.25x multipliers are only available at low refresh rates.
 

Terbinator

Member
Oct 29, 2017
7,791
I'm sorry to bump this thread, but I didn't know where else to ask.
I wanted to try DLDSR, but the refresh rate is wrong: I can only set it to 24 Hz. Can't it do 60 Hz? I'm playing on a 4K TV.
In-game and in the Nvidia settings, the 1.78x and 2.25x multipliers are only available at low refresh rates.
Sounds like an EDID issue and/or bandwidth limitations of your TV?
 

craven68

Member
Jun 20, 2018
4,214
Are you in PC mode on the TV? Are you using GPU scaling in the NV driver?
My TV in PC mode or not doesn't change anything :/
GPU scaling is not enabled.


Sorry, my computer is set to French, but as you can see, the refresh rate is locked at 24 Hz (and when I try a game at this resolution, I can clearly see in-game that it's also at that frequency).
I only enabled the new DLDSR.
 

Dinjoralo

Member
Oct 25, 2017
6,735
DLDSR really is a godsend when playing games that don't need a lot of GPU horsepower. I do think you want some form of in-game AA going on when you're using it; I find the DL stuff can only do so much about shimmering on its own.

I do wish you could use DLDSR with higher resolution scales, all the way to 4x res. I mainly say this because El Shaddai is a weird game that doesn't render at native res, and you basically need 4x downsampling for it to look right.
 

Dictator

Digital Foundry
Verified
Oct 26, 2017
4,608
Berlin, 'SCHLAND
My TV in PC mode or not doesn't change anything :/
GPU scaling is not enabled.


Sorry, my computer is set to French, but as you can see, the refresh rate is locked at 24 Hz (and when I try a game at this resolution, I can clearly see in-game that it's also at that frequency).
I only enabled the new DLDSR.
Turn on GPU scaling and tell me what happens.
 

craven68

Member
Jun 20, 2018
4,214
Turn on GPU scaling and tell me what happens.
Still not working, but I think I have to delete the "4096x2160" resolution; for some reason it's only available at 24 Hz, and when I try a game with DLDSR, the TV shows that the computer is at this resolution. (Maybe this resolution is incompatible with my TV at higher refresh rates.)
 

Jazzem

Member
Feb 2, 2018
2,464
Still not working, but I think I have to delete the "4096x2160" resolution; for some reason it's only available at 24 Hz, and when I try a game with DLDSR, the TV shows that the computer is at this resolution. (Maybe this resolution is incompatible with my TV at higher refresh rates.)

Ugh, 4096x2160 is the bane of any PC+TV downsampling user D:

I used Custom Resolution Utility with this guide to get rid of it.