Yeah, it's pretty good. Just played a few hours of Control on my 5700 XT rig with RIS at 1440p; my other rig runs 4K DLSS on a 2070 Super. Sure, I'd say the RTX stuff is cool, but it's not really something I notice unless I'm actively looking for it. IQ-wise the 2070S with DLSS is a bit better, I guess? Still nothing truly mind-blowing, IMO. The game looks fucking awesome maxed out on a Radeon card too. You could also likely layer Nvidia's sharpening tool on top of DLSS 2.0 to improve the image further, if you like the look it provides. Sharpening is just a post-process thing, so it doesn't have the same performance cost you'd get from something like DLSS, which slots in right between the render and the post-processing.
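To be clear about why that is, here's a toy sketch of where each technique sits in a frame. Every function name below is made up just to mark the stages; none of it is a real engine or driver API.

```python
# Toy frame pipeline; all names invented to mark stages, not real APIs.
def render_scene(internal_res):
    return {"res": internal_res}   # game renders at a low internal resolution

def dlss_upscale(frame, output_res):
    frame["res"] = output_res      # DLSS slots in here, mid-pipeline,
    return frame                   # so it has a real per-frame cost

def post_process(frame):
    return frame                   # bloom, tone mapping, etc.

def sharpen(frame):
    return frame                   # RIS / Nvidia's filter: a cheap pass
                                   # tacked on at the very end

frame = render_scene((1280, 720))
frame = dlss_upscale(frame, (3840, 2160))
frame = post_process(frame)
frame = sharpen(frame)
print(frame["res"])                # (3840, 2160)
```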
I'd also say HDR in MW is far more impressive than any of the RTX stuff. So yeah, even having an RTX card for my TV setup, I don't really think DLSS or RTX is that big of a deal yet, IMO. I also have a feeling the console/AMD implementation of these features is going to be well behind Nvidia's for quite a long time, so it's possible a lot of cross-platform stuff still won't really be doing much with them even years from now.
Some of the articles on RIS compared a 1440p image with RIS to something close to a native 1800p image, though there is some weird over-sharpening going on, largely away from the edges. How well it comes across also depends on the size of the monitor, and it's a case-by-case thing, but generally it's worth having on.
DLSS 2.0 is pretty groundbreaking. Taking a 540p render, outputting 1080p, and having that 1080p look better than native is crazy; what's more, that's a 1:4 pixel ratio, which can greatly increase the card's performance. I think HDR can be cool, and sure, it can pop and look really great, but real-time ray tracing (the RTX stuff) is technically far more impressive. One day, when everyone is using ray tracing on everything, we'll appreciate both the feature itself and how difficult it was to pull off at this point in time.

I think it might take an exclusive game somewhere down the line, but yeah, the politics of gaming weigh heavily on the community (not just Era), and a lot of serious tech sites are overly optimistic about AMD and keep downplaying RTX, which has led to some pretty crazy bias IMO. In general AMD makes great cards at good prices, but they always seem to be following Nvidia's lead, and hopefully they can catch up quickly here in the RTX era. I like the XSX's ability to do ray tracing, but if it also had DLSS 2.0, it could actually push next gen into largely adopting ray tracing, since developers could focus on rendering at 1080p and outputting 4K while pushing ray tracing much harder than whatever they end up doing at native 4K.
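Quick back-of-envelope math on that 1:4 claim, assuming standard 16:9 dimensions (960x540 and 1920x1080; the exact widths are my assumption, not from any article):

```python
# Pixel counts for standard 16:9 resolutions
r540  = 960 * 540      # 518,400 rendered pixels
r1080 = 1920 * 1080    # 2,073,600 output pixels
print(r1080 / r540)    # 4.0 -> each rendered pixel becomes 4 output pixels
```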
If the AI is recreating detail from a 16K image, do you think that suggests there is more room for it to upscale an image further than the current 4x resolution max? Although maybe that is basically the original idea for DLSS 2x, but maybe it could be upgraded to go from 720p to 4K (a 9x pixel increase), then downsample to 1080p for a much better image.
There is a video pages back that explores using a 72p render to recreate a 1440p(?) image, which is just absolutely nuts, and it actually starts looking good as early as 288p(?). It also actually gets easier to do DLSS with higher render and output targets: since the network is analyzing a 16K reference image, rendering at 4K and outputting 8K would be far easier for the card to do.
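For a sense of how extreme those jumps are, here are the linear and pixel scale factors for the cases above (16:9 dimensions assumed throughout):

```python
# Linear and pixel scale factors for the upscales discussed above (16:9 assumed)
cases = {
    "72p -> 1440p":  ((128, 72),    (2560, 1440)),
    "288p -> 1440p": ((512, 288),   (2560, 1440)),
    "720p -> 4K":    ((1280, 720),  (3840, 2160)),
    "4K -> 8K":      ((3840, 2160), (7680, 4320)),
}
for name, ((sw, sh), (dw, dh)) in cases.items():
    print(f"{name}: {dw // sw}x per axis, {dw * dh / (sw * sh):.0f}x pixels")
# 72p -> 1440p is a 400x pixel jump; 4K -> 8K is only 4x, hence far easier.
```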
So the answer to your question is in the video: they just changed an .ini file in Control, which let the game render lower than the settings menu allows. You can give 720p to 4K a shot if you want; you might like the look, and the performance gains get ridiculous at that point too.
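If you want to try other ratios, here's a quick helper for working out what internal render resolution to type into the file. This is just something I knocked together, not anything from the game or the video:

```python
import math

# Given an output resolution and a desired pixel ratio, compute the
# internal render resolution (hypothetical helper, mine, not the game's)
def render_res(out_w, out_h, pixel_ratio):
    scale = math.sqrt(pixel_ratio)   # linear scale per axis
    return round(out_w / scale), round(out_h / scale)

print(render_res(3840, 2160, 9))   # (1280, 720)  -> the 720p-to-4K case
print(render_res(3840, 2160, 4))   # (1920, 1080) -> the standard 1:4 ratio
```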