
MrKlaw

Member
Oct 25, 2017
33,203
It's cool, but it's still a trade-off, right? It's using the existing shaders to carry out the operations, so if you're doing ML you aren't doing shading in those CUs.

That works well for something like a DLSS equivalent, as long as the cost of the ML time is less than the time saved by the GPU shading less stuff. It might also be necessary if RT really needs to run at lower resolutions to be efficient (as we've already seen on PC).
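The break-even condition described above can be sketched as a simple frame-time budget check. All the numbers and the function name below are hypothetical, purely for illustration: the lower-res render plus the ML upscaling pass has to come in under the native-resolution render time for the approach to be a win.

```python
# Hypothetical frame-time budget (milliseconds) for the ML-upscaling trade-off.
# The upscale only pays off when its cost is less than the shading time saved
# by rendering at the lower base resolution. All figures are made up.

def upscaling_pays_off(native_ms: float, lowres_ms: float, ml_ms: float) -> bool:
    """True if (lower-res render + ML upscale) beats rendering at native res."""
    return lowres_ms + ml_ms < native_ms

# e.g. 14 ms to shade a frame at native 4K, vs 5 ms at 1080p plus a 3 ms ML pass:
print(upscaling_pays_off(14.0, 5.0, 3.0))   # True  (8 ms < 14 ms)

# If the ML pass is too expensive relative to the savings, it stops being a win:
print(upscaling_pays_off(14.0, 12.0, 3.0))  # False (15 ms > 14 ms)
```

Since the ML pass runs on the same CUs as shading in this design, `ml_ms` and `lowres_ms` add serially here; hardware with dedicated tensor units could overlap some of that cost.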

There are so many new tools for devs that it'll be fascinating to see how engines develop to leverage them. ML, VRS, and mesh shaders are really big ones that could have a huge impact.
 

ILikeFeet

DF Deet Master
Banned
Oct 25, 2017
61,987
cool but still a trade off right? It's using the existing shaders to carry out the operations. So if you're doing ML you aren't doing shading in those CUs
That's my takeaway, but to offset that, you're working with a lower base resolution. You might not get all the visual effects, though.
 

dgrdsv

Member
Oct 25, 2017
11,975
Wait, not even on RTX? I thought they touted their tensor cores for denoising.
No, not presently. Devs prefer to handle denoising with general shaders, possibly because this keeps their RT implementation fully DXR-compliant instead of relying on NV's proprietary NGX, or on DirectML with unknown performance on h/w without tensor arrays.
 
OP
bcatwilly

Member
Oct 27, 2017
2,483
Just an interesting tidbit: in the IGN Unlocked podcast interview with Ryan McCaffrey, Phil Spencer mentioned that DirectML and AI were something they thought about for the Project Scarlett (Xbox Series X) hardware, including its use in the cloud.