
McFly

Member
Nov 26, 2017
2,742
The fact that on the PS5 Pro, Sony had to personally step in to handle the RT hardware and develop PSSR says it all about AMD's state when it comes to these technologies. They are simply a raster-only company at this point with no features.
Considering AMD is designing it for Sony using AMD IPs, no doubt with Sony input, it says AMD knows how to make the hardware. Sony is responsible for their own software platform, but they aren't doing it without input from AMD. It is a collaborative effort.
 

Shaz12567

Member
Jun 7, 2021
485
I wish Alex would wait until the FSR 3.1 release to do a 3-way comparison of the updates for all 3.
Well, for one, AMD will take ages to release it, and due to their insistence on trying to phase out DLSS and XeSS, AMD made FSR non-upgradable by the user. So after release, we will then have to wait for devs to implement it, which will take even more time.

No reason to postpone the video considering AMD's shortcomings here. Late to the party as usual.
 

Shaz12567

Member
Jun 7, 2021
485
Considering AMD is designing it for Sony using AMD IPs, no doubt with Sony input, it says AMD knows how to make the hardware. Sony is responsible for their own software platform, but they aren't doing it without input from AMD. It is a collaborative effort.
The point is, Sony was the one who came up with PSSR. It's likely they had a hand in the AI model being used in these technologies. Same with ray tracing. AMD is just a contractor here. The fact that even Sony was unhappy with FSR and devoted their own resources to develop PSSR says a lot.

I think AMD will just use these R&D funds provided by Sony and we will find FSR will become a derivative of PSSR with future RDNA GPUs.
 

Gitaroo

Member
Nov 3, 2017
8,025
Well, for one, AMD will take ages to release it, and due to their insistence on trying to phase out DLSS and XeSS, AMD made FSR non-upgradable by the user. So after release, we will then have to wait for devs to implement it, which will take even more time.

No reason to postpone the video considering AMD's shortcomings here. Late to the party as usual.

I think the games Alex picked will probably get updated, especially Ratchet, which was shown as an example by AMD, but I think Alex already planned on checking it out. At the moment, I think every game should implement XeSS by default if the dev can't get all 3 in their game. FSR2 really suffers with alpha effects; everything looks very, very pixelated.
 

Shaz12567

Member
Jun 7, 2021
485
I think the games Alex picked will probably get updated, especially Ratchet, which was shown as an example by AMD, but I think Alex already planned on checking it out. At the moment, I think every game should implement XeSS by default if the dev can't get all 3 in their game. FSR2 really suffers with alpha effects; everything looks very, very pixelated.
It's not as simple when the devs have to implement it. When we mod newer DLSS/XeSS versions into older games, there could be potential issues, but modders don't have to worry about QC. If devs have to do it, they will need to perform QC checks on the whole game, which takes months.

Even after release, I don't think FSR 3.1 will be adopted in any major game for at least a month or two, by which time DLSS and XeSS will already have advanced a couple of versions ahead.
 

McFly

Member
Nov 26, 2017
2,742
The point is, Sony was the one who came up with PSSR. It's likely they had a hand in the AI model being used in these technologies. Same with ray tracing. AMD is just a contractor here.
While Sony is responsible for their own software and development platforms (SDK, features, etc.), AMD is not just a contractor: they design the hardware and provide Sony with support in developing the drivers and software necessary to make the hardware work. Any look into their documentation would show you AMD is involved beyond just designing the hardware.

The fact that even Sony was unhappy with FSR and devoted their own resources to develop PSSR says a lot.
Where does it say Sony was unhappy with FSR? Sony has always created their own technologies and been doing cutting-edge research into new ones; they were heavily involved in pushing the adoption of checkerboard rendering last gen. Their studios have created and implemented various temporal upscaling techniques. AI is very good at image upscaling, so it makes sense that Sony would try to adopt it by creating their own solution for their development platform.

I think AMD will just use these R&D funds provided by Sony and we will find FSR will become a derivative of PSSR with future RDNA GPUs.
Collaboration is a good thing. That is what open source is meant to encourage; Sony is a big supporter of open source, and the PS5 is built on open-source software.
 

PLASTICA-MAN

Member
Oct 26, 2017
23,676
When is FSR 3.1 going to release? Immortals is supposed to include it. Also, isn't FSR 4.0 supposed to be released at the end of this year? I wonder if it coincides with the PS5 Pro release, so we can see how it compares to the current PSSR version.
 

b0uncyfr0

Member
Apr 2, 2018
949
Yep, FSR 2 is not doing great. Luckily XeSS is easily usable on most GPUs.

FSR 3.1 is out now though. Forbidden West will probably be the earliest implementation we get; hopefully that brings it up to spec.
 

Timu

Member
Oct 25, 2017
15,611
I personally don't rely on upscaling at all when getting a GPU... but man, AMD really needs to get FSR into a better state, as XeSS already surpassed it so quickly. Hopefully FSR 3.1 can do the trick (or FSR 4.0 when it comes out).
 

Ry.

Member
Oct 10, 2021
1,132
the planet Zebes
I switched my desktop card to Nvidia after multiple review copies of games in a row displayed game-breaking issues on AMD prior to release; and as much as it pains me to admit it, considering I've been AMD-only up until now, unless they can really turn things around I don't see myself giving up stuff like DLSS and the better RT performance going forward.

With all the AI income/investment money coming in now, I highly doubt anyone will even be able to catch up to them anytime soon. It's like the Photoshop or Unreal issue; trying to create a competitor to something that has decades of development time and billions of development dollars over the competition is just futile at this point.

As time goes on I could see AMD just focusing more on their mobile APU tech while Nvidia continues to pull ahead in the chunky boy GPU/TPU space. Intel is certainly a wild card in all this, as they have the ability and the resources to make a move in a way that AMD can't.
 

Zomba13

#1 Waluigi Fan! Current Status: Crying
Member
Oct 25, 2017
8,956
I've heard this. What does this mean? If the game has a multiplayer mode, don't use it at all? Or are we only talking about things like Destiny, Overwatch 2, Fortnite, etc., with heavy anti-cheat?

Don't use it with anything that has a form of anti-cheat (so, to be safe, multiplayer games). Modified .dll files are some of the first things anti-cheats look for, so even though this isn't doing anything malicious, it's not the DLSS .dll that the game shipped with and that the anti-cheat expects, which can lead to bans.
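To illustrate the mechanism (a toy sketch, not any real anti-cheat's code): an integrity check doesn't care whether the change is malicious, only that the file differs from what shipped. In Python, with a made-up placeholder hash:

```python
# Toy integrity check: compare a game file's hash against the shipped one.
# The baseline value below is a placeholder, not a real hash.
import hashlib
from pathlib import Path

SHIPPED_SHA256 = "placeholder_hash_of_the_shipped_dll"

def dll_matches(path: str) -> bool:
    digest = hashlib.sha256(Path(path).read_bytes()).hexdigest()
    return digest == SHIPPED_SHA256

if not dll_matches("nvngx_dlss.dll"):
    print("File differs from the shipped build -> flagged")
```

A swapped-in newer DLSS .dll fails a check like this even though it's a perfectly legitimate Nvidia file.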
 

Mango Pilot

Member
Apr 8, 2024
311
Don't use it with anything that has a form of anti-cheat (so, to be safe, multiplayer games). Modified .dll files are some of the first things anti-cheats look for, so even though this isn't doing anything malicious, it's not the DLSS .dll that the game shipped with and that the anti-cheat expects, which can lead to bans.
Is there a way to check if a game has anti-cheat? PCGamingWiki?

Edit:

Looks like PCGamingWiki does, but it lists stuff like Far Cry 5, which I'd only be playing in single-player. Should I still stay away from messing with it?

 

dgrdsv

Member
Oct 25, 2017
11,888
Biggest takeaway for me is that FSR has really fallen behind at this point. Like, by a surprisingly large amount.
Always has been behind.
The smallish improvements DLSS has had over these years and the XeSS 1.3 "side-grades" (which is what I'd honestly call them) do nothing to change that status quo.
The interesting comparison will be with FSR 3.1 SR when it releases and gets some traction in game integrations.

The fact that on the PS5 Pro, Sony had to personally step in to handle the RT hardware and develop PSSR says it all about AMD's state when it comes to these technologies. They are simply a raster-only company at this point with no features.
We really don't know who's stepped into what for PS5 Pro h/w just yet.
 

Folie

Member
Dec 16, 2017
644
Yep, FSR 2 is not doing great. Luckily XeSS is easily usable on most GPUs.

FSR 3.1 is out now though. Forbidden West will probably be the earliest implementation we get; hopefully that brings it up to spec.

FSR 3 is out (which is the frame-gen part); 3.1 isn't (AMD has said second quarter this year, but we saw how FSR 3's release dragged on and on...).
 

eathdemon

Banned
Oct 27, 2017
9,690
I'm not a software engineer, but would it be possible, maybe via an API that could use tensor cores or the AI accelerators on Arc cards? It's pretty clear hardware acceleration does matter.
 

Zomba13

#1 Waluigi Fan! Current Status: Crying
Member
Oct 25, 2017
8,956
Is there a way to check if a game has anti-cheat? PCGamingWiki?

Edit:

Looks like PCGamingWiki does, but it lists stuff like Far Cry 5, which I'd only be playing in single-player. Should I still stay away from messing with it?


Probably for the best. I only swap the stuff in games without anti-cheat, even if I wouldn't touch the multiplayer parts, just to be safe.
 

McFly

Member
Nov 26, 2017
2,742
That and the fact that Intel has way more money to spend on R&D. Intel spends more than 3 times as much on R&D as AMD.
Well, yes, but they also fabricate their own hardware, so they have to spend money to develop fabrication processes. Look at how much TSMC has to spend for each new node they develop. AMD sold all their foundries and technologies; GlobalFoundries stopped trying to compete.

The best comparison would be Nvidia, which spends roughly 50% more than AMD, but they can also afford to because they make huge profits. AMD's budget has increased significantly over the past few years, since their Zen architecture became competitive with Intel.
 

k0decraft

Member
Oct 27, 2017
2,197
Earth
I switched my desktop card to Nvidia after multiple review copies of games in a row displayed game-breaking issues on AMD prior to release; and as much as it pains me to admit it, considering I've been AMD-only up until now, unless they can really turn things around I don't see myself giving up stuff like DLSS and the better RT performance going forward.

You chose wisely. Nvidia been the truth!
 

professor_t

Member
Oct 27, 2017
1,340
I recommend DLSS Swapper; it detects all your DLSS games and you can replace versions with a click of a button: https://github.com/beeradmoore/dlss-swapper
(mind you, do not use it for multiplayer titles)
Oh man, I am so clueless. I was assuming that if I use DLSS in a game, such as Cyberpunk, I'm automatically getting whatever updates that Nvidia has made to DLSS (assuming I'm on the latest drivers).

So how do I know which version of DLSS a game is using, and if I'm wary of messing with these other utilities, will I eventually get updated versions of DLSS without doing anything specific? I'm so lost.
 

bitcloudrzr

Member
May 31, 2018
13,977
The point is, Sony was the one who came up with PSSR. It's likely they had a hand in the AI model being used in these technologies. Same with ray tracing. AMD is just a contractor here. The fact that even Sony was unhappy with FSR and devoted their own resources to develop PSSR says a lot.

I think AMD will just use these R&D funds provided by Sony and we will find FSR will become a derivative of PSSR with future RDNA GPUs.

While Sony is responsible for their own software and development platforms (SDK, features, etc.), AMD is not just a contractor: they design the hardware and provide Sony with support in developing the drivers and software necessary to make the hardware work. Any look into their documentation would show you AMD is involved beyond just designing the hardware.


Where does it say Sony was unhappy with FSR? Sony has always created their own technologies and been doing cutting-edge research into new ones; they were heavily involved in pushing the adoption of checkerboard rendering last gen. Their studios have created and implemented various temporal upscaling techniques. AI is very good at image upscaling, so it makes sense that Sony would try to adopt it by creating their own solution for their development platform.


Collaboration is a good thing. That is what open source is meant to encourage; Sony is a big supporter of open source, and the PS5 is built on open-source software.
Sony and AMD fully collaborate on the hardware they design and build; Cerny has said as much, and it makes sense that they continue to strengthen this relationship. No one should be surprised if hardware similar to what's in the Pro shows up in AMD cards in the near future.
 

mordecaii83

Avenger
Oct 28, 2017
6,862
It's good to see Intel and Nvidia continuing to improve image quality for their upscalers. I hope FSR 3.1 is also a good improvement, even though I expect it to still have the worst image quality of the three. It would be nice if more devs included XeSS support, either in place of or in addition to FSR, since both support the majority of video cards but XeSS looks better.
 

Theiea

Member
Oct 27, 2017
1,577
Oh man, I am so clueless. I was assuming that if I use DLSS in a game, such as Cyberpunk, I'm automatically getting whatever updates that Nvidia has made to DLSS (assuming I'm on the latest drivers).

So how do I know which version of DLSS a game is using, and if I'm wary of messing with these other utilities, will I eventually get updated versions of DLSS without doing anything specific? I'm so lost.

When you boot up DLSS Swapper, it shows the version of DLSS the game is using on the thumbnail. You can also locate the DLSS .dll file in the game's install directory and go to Properties -> Details to see which version of DLSS it is.

So, for example, for Cyberpunk 2077 the file (called nvngx_dlss.dll) is located in the \Steam\steamapps\common\Cyberpunk 2077\bin\x64 directory. Currently CP2077 uses 3.5.10.0.

Newer versions of DLSS don't get applied automatically; CD Projekt would have to send out a patch to update it, or you can use DLSS Swapper to update it yourself.
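If you'd rather script the check than click through Explorer, here's a minimal sketch (assuming Windows and the pywin32 package; the Cyberpunk path is just the example from above):

```python
# Read the DLSS version out of the DLL's embedded version resource.
# Requires: pip install pywin32 (Windows only).
import win32api

# Example path from above; point this at your own game's install directory.
DLL = r"C:\Program Files (x86)\Steam\steamapps\common\Cyberpunk 2077\bin\x64\nvngx_dlss.dll"

info = win32api.GetFileVersionInfo(DLL, "\\")
ms, ls = info["FileVersionMS"], info["FileVersionLS"]
print(f"{win32api.HIWORD(ms)}.{win32api.LOWORD(ms)}."
      f"{win32api.HIWORD(ls)}.{win32api.LOWORD(ls)}")  # e.g. 3.5.10.0
```

That reads the same "File version" field you'd see in the Properties -> Details tab.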
 

Zeliard

Member
Jun 21, 2019
10,954
A few days ago I tried RDR2 with DLSS 3.7 and Preset E (forced through Nvidia Inspector) and it was a marked difference to these old tired eyes.

Even better if you go with the ol' DLDSR + DLSS combo.

It's so goddamn beautiful looking.
 

Pargon

Member
Oct 27, 2017
12,038
It's not cheating lmao. They decided their upscaling technique is good enough now that they can render at a lower resolution for the same perceptible quality. I'm pretty sure NVIDIA and AMD changed their render scales as well between their 1.0 and 2.0 versions. They're not trying to slip one past the goalie.
It doesn't make it easy for people to compare when AMD and NVIDIA use the same scale for their quality settings, but Intel now uses one step lower resolution.
They're already advertising it as an "up to 28% performance improvement."
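(For what it's worth, that figure is roughly what the resolution drop alone predicts. A quick back-of-the-envelope check, assuming the Quality preset's per-axis upscale factor moved from 1.5x to 1.7x in XeSS 1.3:)

```python
# Pixels rendered scale with the inverse square of the upscale factor,
# so one step down in render resolution buys roughly this much speedup
# (only if the game is perfectly pixel-bound -- hence "up to"):
old_pixels = 1 / 1.5**2  # fraction of output pixels rendered before
new_pixels = 1 / 1.7**2  # fraction rendered in XeSS 1.3
print(old_pixels / new_pixels)  # ~1.28 -> up to ~28% faster
```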


A few days ago I tried RDR2 with DLSS 3.7 and Preset E (forced through Nvidia Inspector) and it was a marked difference to these old tired eyes.
Even better if you go with the ol' DLDSR + DLSS combo.
If you use Special K, you should be able to force DLAA rather than having to use DLDSR with DLSS.
 

RayCharlizard

Member
Nov 2, 2017
2,984
It doesn't make it easy for people to compare when AMD and NVIDIA use the same scale for their quality settings, but Intel now uses one step lower resolution.
They're already advertising it as an "up to 28% performance improvement."
None of these companies have honest marketing, but they're comparing the performance improvement to the previous version of their own software, not a competitor's. There is a performance improvement because they reduced the render scale. If that bore out as worse image quality, it wouldn't be a very useful metric. But according to this test from Digital Foundry, it doesn't result in worse image quality. So there is an increase in image quality and an improvement in performance. That is not a dishonest statement.
 

Pargon

Member
Oct 27, 2017
12,038
I was under the impression that DLDSR + DLSS was generally superior even to DLAA. Is that not the case?
I suppose it depends on what the actual rendering resolution ends up being, but DLAA is preferable in my opinion.
I thought people were doing the DLDSR + DLSS thing in games where DLAA was not an option.
Either way, using DLDSR also involves a display mode switch and kills off MPOs, so I try to avoid using it.

None of these companies have honest marketing, but they're comparing the performance improvement to the previous version of their own software, not a competitor's. There is a performance improvement because they reduced the render scale. If that bore out as worse image quality, it wouldn't be a very useful metric. But according to this test from Digital Foundry, it doesn't result in worse image quality. So there is an increase in image quality and an improvement in performance. That is not a dishonest statement.
I don't think it makes sense to use "image quality" as a metric here.
If that were the case you could start comparing DLSS Ultra Performance against FSR2 Quality.

Keep the rendering resolution the same and market improved image quality instead.
Or just display the resolution scale rather than an excessive number of performance options.
 

RayCharlizard

Member
Nov 2, 2017
2,984
I don't think it makes sense to use "image quality" as a metric here.
If that were the case you could start comparing DLSS Ultra Performance against FSR2 Quality.

Keep the rendering resolution the same and market improved image quality instead.
Or just display the resolution scale rather than an excessive number of performance options.
Why would image quality not make sense as a metric to compare image-quality descriptors for upscaling presets? The only reason the render scales are the way they are now is because NVIDIA led the way; it's not some set-in-stone rule that Quality means X and Performance means Y. Your argument is for standardization, which is a fine argument to make, but it doesn't have anything to do with whether or not Intel's quality preset descriptors hold water, and it is clear that the lower render scale + XeSS 1.3 is higher quality than the higher render scale + XeSS 1.2. The render scales were already arbitrary, set by NVIDIA's own marketing arm when DLSS was first introduced. No other vendor is beholden to those rules.
 

Pargon

Member
Oct 27, 2017
12,038
Why would image quality not make sense as a metric to compare image-quality descriptors for upscaling presets? The only reason the render scales are the way they are now is because NVIDIA led the way; it's not some set-in-stone rule that Quality means X and Performance means Y. Your argument is for standardization, which is a fine argument to make, but it doesn't have anything to do with whether or not Intel's quality preset descriptors hold water, and it is clear that the lower render scale + XeSS 1.3 is higher quality than the higher render scale + XeSS 1.2. The render scales were already arbitrary, set by NVIDIA's own marketing arm when DLSS was first introduced. No other vendor is beholden to those rules.
Not sure how to respond without repeating myself really.
I think having seven presets is absolutely ridiculous. Especially when that gives us names like "Ultra Quality Plus."
How is that useful to anyone?

Even if there was no official standardization, AMD and Intel using the same quality level names and scales as NVIDIA at least made comparisons easy.
Let's say that the following are comparable for image quality:
  • DLSS Performance (0.50x)
  • XeSS Balanced (0.59x)
  • FSR2 Quality (0.67x)
Would it be useful to a player if we renamed them all to be "Quality"?
What happens when FSR3.1 gets here if it's comparable to XeSS or DLSS? Do they rename all the presets again?
It just seems like obfuscation, and further erosion of image quality to me.

If I'm playing a game, I don't look at the presets to see if I'm playing on "Balanced" or "Quality" etc.
I look at the performance metrics to see if it's running acceptably well.
And then I'll pick the scaler based on which one looks best.

Now you have to juggle both. You can't just flip between DLSS/XeSS/FSR at your selected quality level.
This is why I think the presets don't do much to help players at all - especially after these changes - and it should just be a resolution slider/multiplier option at this point.
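For concreteness, here's what those per-axis scales work out to at a 4K output (a trivial sketch; the scale values are just the ones from the list above):

```python
# Internal render resolution = output resolution * per-axis scale factor.
presets = {
    "DLSS Performance": 0.50,
    "XeSS Balanced": 0.59,
    "FSR2 Quality": 0.67,
}
out_w, out_h = 3840, 2160  # 4K output
for name, scale in presets.items():
    print(f"{name}: {round(out_w * scale)}x{round(out_h * scale)}")
# DLSS Performance: 1920x1080
# XeSS Balanced: 2266x1274
# FSR2 Quality: 2573x1447
```

Three different internal resolutions, three different names, and yet a comparable final image - which is exactly why a plain resolution slider would be clearer.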
 
Last edited:

Mahonay

Member
Oct 25, 2017
33,321
Pencils Vania
Not sure how to respond without repeating myself really.
I think having seven presets is absolutely ridiculous. Especially when that gives us names like "Ultra Quality Plus."
How is that useful to anyone?

Even if there was no official standardization, AMD and Intel using the same quality level names and scales as NVIDIA at least made comparisons easy.
Let's say that the following are comparable for image quality:
  • DLSS Performance (0.50x)
  • XeSS Balanced (0.59x)
  • FSR2 Quality (0.67x)
Would it be useful to a player if we renamed them all to be "Quality"?
What happens when FSR3.1 gets here if it's comparable to XeSS or DLSS? Do they rename all the presets again?
It just seems like obfuscation, and further erosion of image quality to me.

If I'm playing a game, I don't look at the presets to see if I'm playing on "Balanced" or "Quality" etc.
I look at the performance metrics to see if it's running acceptably well.
And then I'll pick the scaler based on which one looks best.

Now you have to juggle both. You can't just flip between DLSS/XeSS/FSR at your selected quality level.
This is why I think the presets don't do much to help players at all - especially after these changes - and it should just be a resolution slider/multiplier option at this point.
No one is flipping between DLSS, XeSS, and FSR. If you have an Nvidia GPU you are using DLSS because it is the best reconstruction option 100 percent of the time.

Games that don't have it sometimes do have the FSR option. I just pick the highest-quality version every time because anything below that starts to look like a mess. Or sometimes I don't use FSR at all, because it looks like shit when you are used to using DLSS.

Additionally, you are probably never going to use XeSS on a non-Intel GPU. It needs to run on an Intel GPU to perform at its best.
 

maximumzero

Member
Oct 25, 2017
22,951
New Orleans, LA
This seems disingenuous at best toward AMD's product. Judging by casual Googling, it seems like DLSS 3.7.0 and XeSS 1.3 both released at the beginning of April 2024, while FSR 2.0 is from September 2022?

I know AMD would likely be behind even with FSR 3, but it seems kinda lopsided to show new tech versus something that's from eighteen months ago.
 

Pargon

Member
Oct 27, 2017
12,038
No one is flipping between DLSS, XeSS, and FSR. If you have an Nvidia GPU you are using DLSS because it is the best reconstruction option 100 percent of the time.

Games that don't have it sometimes do have the FSR option. I just pick the highest-quality version every time because anything below that starts to look like a mess. Or sometimes I don't use FSR at all, because it looks like shit when you are used to using DLSS.
I agree that you will probably use DLSS if you've got a supported NVIDIA GPU.
But that's not to say it will always be the best option.

And in some cases, DLSS has suffered worse ghosting/artifacts than XeSS or other upscalers (seems to be fixable if you change the preset, especially with this new Preset E).
I know a few people with 30-series GPUs who often do not select DLSS in a game because there are artifacts with it that bother them more - even if the overall image quality may be "worse" with the alternative.

Additionally, you are probably never going to use XeSS on a non-Intel GPU. It needs to run on an Intel GPU to perform at its best.
XeSS is usually a better choice than FSR even on AMD cards - or older NVIDIA GPUs.
The performance overhead is a bit higher though, so it's not always viable.

This seems disingenuous at best toward AMD's product. Judging by casual Googling, it seems like DLSS 3.7.0 and XeSS 1.3 both released at the beginning of April 2024, while FSR 2.0 is from September 2022?

I know AMD would likely be behind even with FSR 3, but it seems kinda lopsided to show new tech versus something that's from eighteen months ago.
You can't just drop in a newer FSR DLL to upgrade it, like you can with DLSS/XeSS.
But FSR 3 did not make any changes to the upscaling algorithm vs 2.x; it only introduced Frame Gen.
FSR 3.1 is going to be the first update in a long time.
 
Last edited:

Mahonay

Member
Oct 25, 2017
33,321
Pencils Vania
I agree that you will probably use DLSS if you've got a supported NVIDIA GPU.
But that's not to say it will always be the best option.

And in some cases, DLSS has suffered worse ghosting/artifacts than XeSS or other upscalers (seems to be fixable if you change the preset, especially with this new Preset E).
I know a few people with 30-series GPUs who often do not select DLSS in a game because there are artifacts with it that bother them more - even if the overall image quality may be "worse" with the alternative.
Huh. I have never heard of someone having a real issue with DLSS. What are they using if not DLSS? Just native?

DLSS more often than not enhances detail compared to native, no? It's also free anti-aliasing. The benefits for me at least outweigh any minor visual oddities that come along (usually trouble with things like people's hair, weird ghosting on a resting image, etc.).

XeSS is usually a better choice than FSR even on AMD cards - or older NVIDIA GPUs.
The performance overhead is a bit higher though, so it's not always viable.
Yeah, the performance overhead with XeSS is what makes me think it wouldn't be a sensible option for an AMD user. But yeah, I guess it definitely does reconstruct the image more accurately, even when it's not fully featured.
 

Pargon

Member
Oct 27, 2017
12,038
Huh. I have never heard of someone having a real issue with DLSS. What are they using if not DLSS? Just native?

DLSS more often than not enhances detail compared to native, no? It's also free anti-aliasing. The benefits for me at least outweigh any minor visual oddities that come along (usually trouble with things like people's hair, weird ghosting on a resting image, etc.).
The game's native TAA - or in some cases, they've even said they prefer FSR2 Quality.
I don't get it, but different people pick up on/are bothered by different things, I suppose.
It's possible that they'd be fine with DLSS if they updated it or changed the preset, but that then becomes "too much work."

Yeah, the performance overhead with XeSS is what makes me think it wouldn't be a sensible option for an AMD user. But yeah, I guess it definitely does reconstruct the image more accurately, even when it's not fully featured.
Well, FSR has fallen so far behind that you can often drop the preset a tier lower and still end up with a better image overall - despite the higher performance cost.
Or you just take the hit to performance.
 

RayCharlizard

Member
Nov 2, 2017
2,984
Not sure how to respond without repeating myself really.
I think having seven presets is absolutely ridiculous. Especially when that gives us names like "Ultra Quality Plus."
How is that useful to anyone?

Even if there was no official standardization, AMD and Intel using the same quality level names and scales as NVIDIA at least made comparisons easy.
Let's say that the following are comparable for image quality:
  • DLSS Performance (0.50x)
  • XeSS Balanced (0.59x)
  • FSR2 Quality (0.67x)
Would it be useful to a player if we renamed them all to be "Quality"?
What happens when FSR3.1 gets here if it's comparable to XeSS or DLSS? Do they rename all the presets again?
It just seems like obfuscation, and further erosion of image quality to me.

If I'm playing a game, I don't look at the presets to see if I'm playing on "Balanced" or "Quality" etc.
I look at the performance metrics to see if it's running acceptably well.
And then I'll pick the scaler based on which one looks best.

Now you have to juggle both. You can't just flip between DLSS/XeSS/FSR at your selected quality level.
This is why I think the presets don't do much to help players at all - especially after these changes - and it should just be a resolution slider/multiplier option at this point.
I only take umbrage with the claim that it is a dishonest tactic to lower the render scale and introduce more granular options for upscaling performance. Again, you made the claim that it was done to state a performance advantage, but the only evidence was that it showed an improvement versus their prior-version technology, which is a true claim. If Intel had said they were performing better than NVIDIA or AMD, that'd be an entirely different conversation.
 

Dictator

Digital Foundry
Verified
Oct 26, 2017
4,934
Berlin, 'SCHLAND
This seems disingenuous at best toward AMD's product. Judging by casual Googling, it seems like DLSS 3.7.0 and XeSS 1.3 both released at the beginning of April 2024, while FSR 2.0 is from September 2022?

I know AMD would likely be behind even with FSR 3, but it seems kinda lopsided to show new tech versus something that's from eighteen months ago.
Disingenuous? FSR 2.2 is the Same upscaler as FSR 3.0

AMD has Not updated it in a Long Long time
 

Shaz12567

Member
Jun 7, 2021
485
Always has been behind.
The smallish improvements DLSS has had over these years and the XeSS 1.3 "side-grades" (which is what I'd honestly call them) do nothing to change that status quo.
The interesting comparison will be with FSR 3.1 SR when it releases and gets some traction in game integrations.


We really don't know who's stepped into what for PS5 Pro h/w just yet.
I think it's pretty obvious. AMD's AI-enhanced FSR is talked about only when the PS5 Pro gets PSSR. AMD is rumored to go big on RT with RDNA 5 right after Sony focuses on RT with the PS5 Pro. Essentially, Sony is funding R&D for AMD's GPU division because they were unhappy with FSR and their RT performance (which stands to reason, because they wouldn't have developed PSSR if FSR was good enough).
 

Shaz12567

Member
Jun 7, 2021
485
Sony and AMD fully collaborate on the hardware they design and build; Cerny has said as much, and it makes sense that they continue to strengthen this relationship. No one should be surprised if hardware similar to what's in the Pro shows up in AMD cards in the near future.
But does that mean AMD does nothing until Sony asks for it to be done? Because that's how it looks. AI-enabled FSR was only talked about after PSSR; the RT performance leap with RDNA 5 only after Sony asked for better RT performance with the PS5 Pro.
 

bitcloudrzr

Member
May 31, 2018
13,977
But does that mean AMD does nothing until Sony asks for it to be done? Because that's how it looks. AI-enabled FSR was only talked about after PSSR; the RT performance leap with RDNA 5 only after Sony asked for better RT performance with the PS5 Pro.
The technology behind it has likely been in development for some time between both parties. Remember that we are not supposed to know about any of this, and implementations of it could be in the works for RDNA 4, since the PS5 Pro may be an RDNA 3/4 hybrid. Ever since the PS4, Sony and Cerny seem happy to strengthen ties and share design and technology much more than a typical business customer and contractor would. AMD knows that their software-only FSR and non-dedicated RT are not in a good position, and it would be illogical to assume they have not been working on their own improvements, including with Sony's PS hardware team. Factor in that Sony is AMD's largest customer, and that they are facing down GPU competition from the $2 trillion Nvidia: it is in their best interest to work together to advance technology goals.