Techno

Powered by Friendship™
The Fallen
Oct 27, 2017
6,517
It's not just w/r speed that makes the SSD solution super-fast in consoles, though.

True, but people were going on and on about read and write speeds before the new consoles came out, and within 2-3 months that gap was closed again - in fact, the Western Digital SN850 and Samsung 980 Pro M.2 drives are the fastest at the moment, I think.
 

dgrdsv

Member
Oct 25, 2017
12,260
It's not just standard RDNA acceleration; they added some custom machine learning acceleration hardware on the die
It's standard RDNA2 packed-math acceleration, meaning 2x throughput each time the precision format is halved, from FP32 down to INT4.
The h/w cost is very small precisely because it is a standard feature of RDNA2, one that was added back in GCN5 and was also present in some versions of RDNA1.
RPM (Rapid Packed Math) is "cheap", but it is far from ideal for ML acceleration. It will run ML better than plain FP32 SIMDs would, but I have doubts about that being enough to do something like DLSS effectively.
AMD's upcoming "super resolution" being apparently just another variation of temporal upscaling kinda proves this point.

There are some "tricks" that can be done, though, and it is possible that the Xbox Series consoles have them. There are matrix multiplication instructions which can be implemented to make this type of math run more efficiently even without tensor arrays. But they don't say that explicitly, so it is unknown whether this is what they mean. It is also something that would give you simplified coding and some tens of percent of acceleration, not the 10x and such you get with dedicated tensor cores.
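To make the precision ladder concrete, here is a minimal numpy sketch of the kind of reduced-precision dot products packed math accelerates (a conceptual illustration only, with a simplified symmetric quantization scheme - not RDNA2 code; the "ops per lane" figures simply restate the packing rates above):

```python
import numpy as np

def quantize_symmetric(x, bits):
    """Simplified symmetric quantization: map FP32 values onto signed
    integers of the given width and return the scale to undo it."""
    qmax = 2 ** (bits - 1) - 1
    scale = np.max(np.abs(x)) / qmax
    return np.round(x / scale).astype(np.int32), scale

rng = np.random.default_rng(0)
w = rng.standard_normal(256).astype(np.float32)   # toy "layer" weights
a = rng.standard_normal(256).astype(np.float32)   # toy activations

ref = float(w @ a)  # FP32 reference dot product

for bits in (16, 8, 4):
    if bits == 16:
        # FP16 needs no quantization step, just a narrower float format
        approx = float(w.astype(np.float16) @ a.astype(np.float16))
    else:
        qw, sw = quantize_symmetric(w, bits)
        qa, sa = quantize_symmetric(a, bits)
        approx = float(qw @ qa) * sw * sa         # integer MACs, rescaled
    lanes = 32 // bits                            # values packed per 32-bit lane
    print(f"{bits:2d}-bit: {approx:+.3f} vs FP32 {ref:+.3f} ({lanes}x ops per lane)")
```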
 

orava

Alt Account
Banned
Jun 10, 2019
1,316
It was not possible to have this exact tech in the new consoles. AMD does not have the dedicated hardware and software yet and is practically generations behind Nvidia on this.
 
Last edited:

Mifec

Member
Oct 25, 2017
17,949
People need to stop dreaming about DirectML or AMD's solution being anywhere close. DLSS came out of Nvidia investing billions in AI and being so far ahead of everyone involved here.
 

justiceiro

Banned
Oct 30, 2017
6,664
If not this technology, it would be some other technology or breakthrough missing a few years down the line. You can't really release a console only once it has every technology available.
 

The Bookerman

Banned
Oct 25, 2017
4,124
Doing ray tracing without DLSS in Cyberpunk would murder my RTX 2070.

Thank god it exists.

DLSS runs on the tensor cores on the die, something Radeon GPUs don't have.

Also, Radeon GPUs don't have great performance with ray tracing. Couple that with no DLSS equivalent and there's no way I'd buy a Radeon GPU for now.
 

dgrdsv

Member
Oct 25, 2017
12,260
People need to stop dreaming about DirectML or AMD's solution being anywhere close. DLSS came out of Nvidia investing billions in AI and being so far ahead of everyone involved here.
I mean, people were saying that to their eyes CAS is the equivalent of DLSS so ¯\_(ツ)_/¯

Also, don't write off a good temporal supersampler just because it doesn't have an ML component in it. AMD's super resolution should be pretty close to what DLSS 1.9 did in Control, if not a bit better. It wasn't on DLSS 2.0's level of quality and performance, of course, but it still did a nice job of upscaling the image.
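For a sense of what pure temporal reconstruction (no ML) actually does, here is a toy numpy sketch: a static scene rendered at half resolution with a different sub-pixel jitter each frame, accumulated into a full-resolution buffer. This is only an illustration of the principle - no motion vectors, no history rejection, nothing like a shipping upscaler, AMD's or otherwise:

```python
import numpy as np

# A static 64x64 "scene" with fine detail that a half-res render aliases over.
truth = (np.indices((64, 64)).sum(axis=0) % 4) / 3.0
scale = 2                                    # 2x upscale per axis (4x the pixels)
h, w = 64 // scale, 64 // scale

accum = np.zeros_like(truth)
count = np.zeros_like(truth)

for frame in range(scale * scale):
    jx, jy = frame % scale, frame // scale   # 2x2 sub-pixel jitter pattern
    ys = np.arange(h) * scale + jy           # high-res rows sampled this frame
    xs = np.arange(w) * scale + jx           # high-res cols sampled this frame
    lowres = truth[np.ix_(ys, xs)]           # the "rendered" low-res frame
    accum[np.ix_(ys, xs)] += lowres          # splat the jittered samples into history
    count[np.ix_(ys, xs)] += 1

reconstructed = accum / np.maximum(count, 1)
naive = np.repeat(np.repeat(truth[::scale, ::scale], scale, 0), scale, 1)

print("temporal reconstruction error:", abs(reconstructed - truth).mean())
print("single-frame upscale error:   ", abs(naive - truth).mean())
```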
 

GhostofWar

Member
Apr 5, 2019
512
DirectML on Xbox seems to be a good substitute. In fact, Microsoft have some extra hardware extensions to support AI upscaling and are currently working on a form of "super" DirectML for the XSX. My guess is that it will be used in future games. https://wccftech.com/xbox-series-di...n-area-of-very-active-research-for-microsoft/

Congratulations on posting yet another link to an article talking about Nvidia DLSS running on DirectML using an Nvidia Titan V
 

floridaguy954

Member
Oct 29, 2017
3,631
How can it be a mistake when AMD legitimately does not have a DLSS alternative in existence AT ALL at this time?

Nvidia is like 5 years ahead of AMD on that front.
 

orava

Alt Account
Banned
Jun 10, 2019
1,316
DirectML is not the same thing as DLSS. It is a software API like Direct3D and is part of the DirectX 12 family. It can be used to build GPU-accelerated machine learning applications and features. Without dedicated ML hardware it's not nearly as efficient: the inference work competes with rendering for the same compute resources, so the performance hit is much bigger.
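For what "using DirectML" tends to look like from the application side, here is a rough sketch using ONNX Runtime's DirectML execution provider (from the onnxruntime-directml package); the model file, its single output, and the input shape are placeholders, not a real upscaler:

```python
import numpy as np
import onnxruntime as ort  # pip install onnxruntime-directml

# Hypothetical upscaling network exported to ONNX; the path and the
# NCHW input shape below are placeholders, not a real model.
MODEL_PATH = "upscaler.onnx"

# Ask ONNX Runtime to schedule the network on the GPU through DirectML,
# falling back to the CPU provider if DirectML is unavailable.
session = ort.InferenceSession(
    MODEL_PATH,
    providers=["DmlExecutionProvider", "CPUExecutionProvider"],
)

low_res = np.random.rand(1, 3, 540, 960).astype(np.float32)  # fake 960x540 frame
input_name = session.get_inputs()[0].name
(high_res,) = session.run(None, {input_name: low_res})       # assumes one output
print("upscaled output shape:", high_res.shape)
```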
 

DanielG123

Member
Jul 14, 2020
2,490
I'm running Cyberpunk on a 2080 Super at 1440p ultra settings with DLSS set to Performance, and the tech is some black magic wizardry. I went from frames hovering around the mid 50s, then dropping suddenly to the 30s and even 20s in busier areas, to now having much more consistency and stability, and even a smooth 60fps in a number of cases. It's truly amazing technology that, in my case, doesn't deteriorate the graphical quality of games that have it.

That being said, I don't believe it was a "mistake" for either console manufacturer not to have an equivalent to DLSS. They, especially Microsoft, waited as long as they could for the most recent technology available, and that's enough. The hardware in both next-generation systems will be just fine; more than fine, really.
 

Dreamwriter

Member
Oct 27, 2017
7,461
Even if AMD had hardware ready for it, I'm not sure it would have been a good idea. I think it comes down to this question: at what point is it worth sacrificing higher-quality graphics to *simulate* higher-quality graphics? I mean, more money going into the AI "tensor math" cores DLSS relies on means less money going into the actual rendering cores, which increases the need for DLSS to make up for that. Consoles are only this generation able to consistently hit 4K/60 without ray tracing, but if they had spent some of their money on AI cores instead of rendering cores, maybe they *wouldn't* have been able to consistently hit 4K/60 without using DLSS to fake it.

It's an interesting balancing act; ray tracing does benefit from AI to allow fewer ray casts, which is why nVidia put tensor cores into their GPUs in the first place, but you also want the regular rasterized graphics to run great.
 

Deleted member 76797

Alt-Account
Banned
Aug 1, 2020
2,091
They didn't really have a choice since they are handcuffed to AMD tech. But after playing Cyberpunk and Death Stranding on a 3080 at 4k I am blown away by the DLSS implementation. It's only gonna get better too.
 
Oct 27, 2017
3,730
Everything about DLSS would benefit a hybrid device that uses lower power chips vs PS5 and Series X, so I'm sure that Nintendo are extremely interested...

Maybe. While it makes sense to dedicate die space to it on a 250W+ PC GPU (especially since they're also using these chips on non-gaming products), it might not be such a clear-cut win on a 5-10W mobile chip, where it might be more beneficial to add more general-purpose cores.

Having said that, Apple have had dedicated ML cores on their mobile devices for years now (though they're not really for gaming uses).
 

NineTailSage

Member
Jan 26, 2020
1,449
Hidden Leaf
Maybe. While it makes sense to dedicate die space to it on a 250W+ PC GPU (especially since they're also using these chips on non-gaming products), it might not be such a clear-cut win on a 5-10W mobile chip, where it might be more beneficial to add more general-purpose cores.

Having said that, Apple have had dedicated ML cores on their mobile devices for years now (though they're not really for gaming uses).

If potential future Switch games on a Pro or Switch 2 only need to render natively at 360p, 480p, 540p, or 720p to achieve great visual fidelity, then the die space dedicated to Tensor cores will be well worth it...

I keep sharing this video because it's just such an amazing projection of where DLSS can go from here.
 

mute

▲ Legend ▲
Member
Oct 25, 2017
25,754
I kinda like the idea of developers optimizing without the DLSS-like tech being there, makes me think that when it does show up things will look that much better at baseline.
 

starbuck2907

Member
Jan 29, 2018
96
Congratulations on posting yet another link to an article talking about Nvidia DLSS running on DirectML using an Nvidia Titan V

Congrats on ignoring everything else in the article, like the part where it says the XSX has hardware extensions made to support machine learning, which will likely allow some advanced form of DirectML in the future.

"In an interview with Wccftech, Quantic Dream CEO David Cage pointed out that one of the biggest hardware advantages for the Xbox Series S and X consoles over the competition (chiefly, Sony's PS5) could be in their shader cores, reportedly more suitable for Machine Learning tasks thanks to hardware extensions allowing for up to 49 TOPS for 8-bit integer operations and 97 TOPS for 4-bit integer operations."
 

Gitaroo

Member
Nov 3, 2017
8,302
I think people expected way too much from DLSS. From my personal experience, the minimum base resolution for DLSS to look good is actually 1080p, meaning games still have to render at a minimum of 1080p whether they're scaling to 1440p or 4K. I tested it out myself: any base resolution below 1080p produces tons of shader shimmer and can be very distracting. It's also a reason why Nvidia only recommends Ultra Performance DLSS for 8K screens, since the base resolution there is 1440p. If you use Ultra Performance mode to upscale to 4K, the base resolution is 720p, and it looks super nasty. Don't use YouTube-compressed videos or still frame captures to compare here; you won't see the nasty shimmering from low-resolution scaling the way you do with real hardware on an actual screen.
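For reference, the internal render resolutions behind those numbers, derived from the standard DLSS 2.x per-axis scale factors (simple arithmetic, not measured data):

```python
# Internal render resolutions for the standard DLSS 2.x modes, from the
# per-axis scale factors. The 8K Ultra Performance row is the 1440p base
# mentioned above; the 4K Ultra Performance row is the 720p base.
outputs = {"1440p": (2560, 1440), "4K": (3840, 2160), "8K": (7680, 4320)}
modes = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 1 / 2, "Ultra Performance": 1 / 3}

for out_name, (w, h) in outputs.items():
    for mode, s in modes.items():
        print(f"{out_name:>5} {mode:17s} -> {round(w * s)}x{round(h * s)} internal")
```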
 

Delusibeta

Prophet of Truth
Banned
Oct 26, 2017
5,648
Ultimately, I do think Nvidia's flexing of their raytracing prowess during the RTX 3000 series reveal has made a PS5 Pro and equivalent Xbox Series an inevitability, and I do think whatever DLSS equivalent AMD comes up with will probably be baked into these mid gen upgrades.
 

dgrdsv

Member
Oct 25, 2017
12,260
Ultimately, I do think Nvidia's flexing of their raytracing prowess during the RTX 3000 series reveal has made a PS5 Pro and equivalent Xbox Series an inevitability, and I do think whatever DLSS equivalent AMD comes up with will probably be baked into these mid gen upgrades.
AMD's DLSS equivalent will run on current gen h/w since it is expected to be just temporal reconstruction.
 

NineTailSage

Member
Jan 26, 2020
1,449
Hidden Leaf
I think people expected way too much from DLSS. From my personal experience, the minimum base resolution for DLSS to look good is actually 1080p, meaning games still have to render at a minimum of 1080p whether they're scaling to 1440p or 4K. I tested it out myself: any base resolution below 1080p produces tons of shader shimmer and can be very distracting. It's also a reason why Nvidia only recommends Ultra Performance DLSS for 8K screens, since the base resolution there is 1440p. If you use Ultra Performance mode to upscale to 4K, the base resolution is 720p, and it looks super nasty. Don't use YouTube-compressed videos or still frame captures to compare here; you won't see the nasty shimmering from low-resolution scaling the way you do with real hardware on an actual screen.

This was literally a few posts above, just to show you that's not the case...
If a future Switch could run PS5/Series X games at 360p, 480p, and 720p, that could make up dramatically for the lack of comparable hardware.
 

Gitaroo

Member
Nov 3, 2017
8,302
This was literally a few posts above, just to show you that's not the case...
If a future Switch could run PS5/Series X games at 360p, 480p, and 720p, that could make up dramatically for the lack of comparable hardware.

I literally said no YouTube videos or still pictures in my post. I know because I tested it on my 3080 with multiple games. Upscaling from anything lower than 1080p, shader shimmering becomes very bad; you won't see any of it in still shots, and it's much reduced in compressed YouTube videos. 720p scaling to 4K is pretty much shit, let alone 480p. Anything lower than 1080p is not enough for reconstruction.
 

NineTailSage

Member
Jan 26, 2020
1,449
Hidden Leaf
I literally said no YouTube videos or still pictures in my post. I know because I tested it on my 3080 with multiple games. Upscaling from anything lower than 1080p, shader shimmering becomes very bad; you won't see any of it in still shots, and it's much reduced in compressed YouTube videos. 720p scaling to 4K is pretty much shit, let alone 480p. Anything lower than 1080p is not enough for reconstruction.

But what you are describing are just trade-offs a gamer is willing to accept to have the game on the go...
Docked mode could have much better image quality than handheld mode; then again, on a much smaller screen most of these artifacts from lower-resolution DLSS upscaling probably won't be an issue for the average gamer.
 

Gitaroo

Member
Nov 3, 2017
8,302
But what you are describing are just trade-offs a gamer is willing to accept to have the game on the go...
Docked mode could have much better image quality than handheld mode; then again, on a much smaller screen most of these artifacts from lower-resolution DLSS upscaling probably won't be an issue for the average gamer.
The tech gets praised for having little to no trade-off, besides some weird obvious artifacts in games like Death Stranding, with overall better image quality plus a huge performance gain. Having shader shimmering and constant flickering all over the screen is almost as bad as running games at low res filled with jaggies. Of all the games and even tech demos I have tested, scaling from anything lower than a 1080p base resolution is pretty junk. If there is a Switch Pro with DLSS, it will still need to run everything at 1080p native to scale to higher res. Even 720p scaling to 1080p or 1440p is awful. Anything lower than 1080p is just not enough information to reconstruct from.
 

Beatle

Member
Dec 4, 2017
1,123
Sony and MS can't have that tech as they are using AMD hardware, BUT Nintendo can in its next system (Switch 2) as they are using Nvidia hardware. The next Nintendo system will have an advantage if it implements DLSS 2.0/3.0, beyond what we normally think of with Nintendo's mobile consoles.
 

Babadook

self-requested ban
Banned
Nov 11, 2017
192
It's not a mistake. It's coming to (most likely all) next gen consoles.

Edit: referring to ML upscaling in general.
 

NineTailSage

Member
Jan 26, 2020
1,449
Hidden Leaf
The tech gets praised for having little to no trade-off, besides some weird obvious artifacts in games like Death Stranding, with overall better image quality plus a huge performance gain. Having shader shimmering and constant flickering all over the screen is almost as bad as running games at low res filled with jaggies. Of all the games and even tech demos I have tested, scaling from anything lower than a 1080p base resolution is pretty junk. If there is a Switch Pro with DLSS, it will still need to run everything at 1080p native to scale to higher res. Even 720p scaling to 1080p or 1440p is awful. Anything lower than 1080p is just not enough information to reconstruct from.

I would much rather play games with higher, more stable framerates, good image quality, and some in-game artifacts than a muddy, low-native-resolution, jaggy mess. Each gamer is different, and this might just be a solution that's mostly needed to bridge handheld performance up to docked mode.

Also, you are one of the few people I've seen try the new DLSS version out and state that anything under 1080p is awful (when upscaling to higher resolutions). 3.0 or whatever is next could fix a lot of the current issues in DLSS, but the pros far outweigh the cons when it comes to future potential.
 

dgrdsv

Member
Oct 25, 2017
12,260
Also, you are one of the few people I've seen try the new DLSS version out and state that anything under 1080p is awful (when upscaling to higher resolutions).
It's not the native res that matters, it's the amount of upscaling. Ultra Performance mode, which upscales by 3x per axis (9x the pixel count, i.e. 1440p->8K), is pretty bad no matter what native res the upscaling is performed from. There's only so much DLSS can do with a lack of information.
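Quantifying "amount of upscaling" per mode as the ratio of output pixels to rendered pixels (same standard scale factors as above):

```python
# Ratio of reconstructed pixels to rendered pixels for each DLSS mode:
# the per-axis render scale squared. Ultra Performance asks the network
# to produce nine output pixels for every pixel actually rendered.
modes = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 1 / 2, "Ultra Performance": 1 / 3}
for name, axis_scale in modes.items():
    print(f"{name:17s}: {1 / axis_scale ** 2:.2f}x the pixels")
```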
 

Tora

The Enlightened Wise Ones
Member
Jun 17, 2018
8,658
There's definitely so much room for growth and I could see a lot of Nintendo's cleaner art styles benefit tremendously.
Fortnite is an example of a game that works really really well with DLSS, so I could imagine a mainline Zelda/Mario and even Yoshi etc benefiting from the tech. Again though, it's not magic, but in handheld mode at least it would be great.
 

NineTailSage

Member
Jan 26, 2020
1,449
Hidden Leaf
It's not the native res that matters, it's the amount of upscaling. Ultra Performance mode, which upscales by 3x per axis (9x the pixel count, i.e. 1440p->8K), is pretty bad no matter what native res the upscaling is performed from. There's only so much DLSS can do with a lack of information.

Sure, but I don't think a device like the Switch would need that kind of multiplier in docked mode; handheld, though, would definitely need the help in bridging the performance gap on more demanding games.
Then again, on a smaller screen more of these artifacts might be more forgivable than when blown up on a larger screen.