> With the state of DLSS 2.0 in 2021, an internet connection is required for DLSS to work during the game, so Nintendo would have to inform the consumer one way or another.

Do you have any sources?
Recent ARM cores are all big.LITTLE compatible. DynamIQ is the latest iteration of ARM's big.LITTLE implementation. Nintendo will retain the fast wake-from-sleep with any new ARM CPU.
The A53s are functional, especially on the Mariko TX1. It's just that Nintendo's scheduler needs the 4x A57 cluster to be running whenever the console is not asleep, and the TX1's big.LITTLE cluster-switching implementation then prevents the 4x A53 cluster from running.
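As a toy illustration of the cluster-switching constraint described above (purely illustrative, not how the Switch OS actually exposes cores): under TX1-style cluster switching only one cluster can be online at a time, whereas DynamIQ-style designs expose big and little cores to the scheduler simultaneously.

```python
# Toy model of TX1 cluster switching vs. a DynamIQ-style arrangement.
# Purely illustrative; the core names are the real TX1 cores, the function is not.
def online_cores(mode: str, asleep: bool = False) -> list:
    big = ["A57"] * 4
    little = ["A53"] * 4
    if mode == "cluster-switch":
        # TX1-style: only one cluster at a time. Awake -> big cluster,
        # asleep -> little cluster; A53s and A57s never run together.
        return little if asleep else big
    if mode == "dynamiq":
        # DynamIQ-style: all cores are visible to the scheduler at once.
        return big + little
    raise ValueError(f"unknown mode: {mode}")

awake = online_cores("cluster-switch")          # the four A57s only
sleeping = online_cores("cluster-switch", True)  # the four A53s only
hmp = online_cores("dynamiq")                    # all eight cores schedulable
```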
> Thank you, I recall now that you say it, that DynamIQ is supposed to emulate the big.LITTLE features.
> So that's one relief, as it's paramount to keep as good a user experience.
> The way big.LITTLE works is a bit more complicated than that, and what the Switch's OS task scheduler sees is another matter, but no need to discuss details that have no value for gamers.

DynamIQ is big.LITTLE for the ARM A55 and the A75-A78 CPU families. The main difference from the old implementation is the size of the CPU clusters.
I never heard of a requirement for an internet connection to use DLSS. If so, yeah, that's a major caveat for a handheld system.

I don't understand this DLSS belief for a Switch revision; the more time passes, the less I believe it makes any sense.
Has someone in this thread given an explanation of how to bypass the internet connection necessary for DLSS in a Nintendo-compatible way?
I thought that perhaps, just for Nintendo, Nvidia could make some game-specific data downloadable to the console, but that's GBs of data, which makes no sense for the Switch's storage and for starting games very fast; that would destroy the user experience.
I could see DLSS being doable in docked mode for some kinds of games, but it's not credible at all for handheld.
I also see lots of phone lobbyists who are apparently unable to understand that the gaming (toy) market has nothing to do with the smartphone market, and who keep making nonsensical comparisons between smartphones and the Switch, be it on hardware or business strategy. Apparently just because they don't understand tech and believe the two are the same because they both use ARM chips.
DLSS for Switch handheld mode is just not possible today with what I understand of the tech, and I'm convinced it will not appear on a hypothetical revision of the Switch.
For docked mode it's still possible, but I'm not sure about a revision; that makes no business sense (except to those who believe smartphones are in the entertainment market). It looks more like something for the 10th-gen Nintendo console, the Switch's successor. And that would require Nintendo to apply what they said before and change their strategy a bit, accepting an always-connected console to play games (with DLSS). They could still allow games to be played without a connection (or without a big [GBs] data download and a long wait, which makes little sense) at a lower resolution without DLSS, one that would still need to be of consumer quality (which is 720p; Netflix stats on the proportion of people watching SD content would be interesting to see).
We (at least I) always knew that hardware and specs have nothing to do with a game's performance; the programmers have to know how to use the hardware in the first place. I said in 2016 that any game could be made for the Switch. People who don't understand tech (like DF) started saying nonsense like "some games are impossible", and when competent programmers who actually understand tech made those games on the Switch, the same people called them nonsense like "impossible ports".
Nintendo has competent engineers, very proficient ones, who understand these things better than I do, as they work on games constantly, while I barely have signal-processing 101 knowledge from 25+ years ago (with a bit of updating). And still, I could understand something in 2016 that people to this day are unable to grasp: the "power" of the hardware has nothing to do with whether a game is doable on current hardware and engines; programmers' proficiency, given equal time and money, is what matters.
> The neural network used to train the DLSS model for each game uses a server, but the results are delivered locally via drivers & updates. There shouldn't be any technical requirement to have a persistent internet connection to use it, even if Nvidia is requiring it in the current implementations.

That's also how I thought it was managed: via a driver update.
This video is marketing for devs; it's not a white paper or anything like that, and it doesn't give any information on what is needed consumer-wise. Look at the Nvidia DLSS toolkit instead.
Why is nobody talking about the real technical details, only pretty presentations focused on everything but the details necessary for all this to even be usable by the end user?
The power of misinformation is strong when people don't understand tech at all; some talk about chips all day but to this day believe that the A53 cores in the Switch are disabled or unused. I don't even know if anyone has explained how Nintendo would replace, or just abandon, the Switch's ability to come out of sleep so fast; I don't know (I didn't look it up) if it's still possible with more recent ARM cores (which IIRC are no longer big.LITTLE).
Random question, but would RT cores help Nintendo's lighting work in any way? Could they achieve better results, or do certain things they couldn't before?
DLSS does not require an internet connection at all.
You're mixing up the invasive telemetry in the Nvidia driver with the actual technology behind DLSS.
The technology itself does not have any need for an internet connection, the model is pre-trained and shipped as part of the driver.
> The neural network used to train the DLSS model for each game uses a server, but the results are delivered locally via drivers & updates. There shouldn't be any technical requirement to have a persistent internet connection to use it, even if Nvidia is requiring it in the current implementations.

As mentioned earlier, DLSS no longer requires training per game. It's a more generalized approach post 2.0 update.
> As mentioned earlier, DLSS no longer requires training per game. It's a more generalized approach post 2.0 update.

Ah, right, of course. So the way this would likely work in practice is each game would require a certain Switch firmware version at a minimum.
> Ah, right, of course. So the way this would likely work in practice is each game would require a certain Switch firmware version at a minimum.

No, not really. As explained in the video, DLSS 2.0 (and above) now uses one general model for every DLSS implementation, a model which Nvidia will update to improve the quality of DLSS as time goes on (and as of DLSS 2.1, it supports variable-resolution internal rendering to a set target resolution). A Switch game does not require "its own firmware" for DLSS to work. Rather, the game's engine just needs to be updated to support DLSS as a sort of black box that it feeds data, such as motion vectors.
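A rough sketch of that "black box" contract (all names here are hypothetical and are not the real NVIDIA NGX API; this only shows the shape of the data the engine supplies):

```python
# Hypothetical sketch of an engine feeding DLSS as a black box.
# The engine renders at a low internal resolution, hands over the
# buffers DLSS needs, and receives a frame at the target resolution.
# None of these names are the real NGX API.
from dataclasses import dataclass
from typing import Any, Tuple

@dataclass
class DlssFrameInputs:
    color: Any                   # low-res color buffer
    depth: Any                   # low-res depth buffer
    motion_vectors: Any          # per-pixel motion vectors
    jitter: Tuple[float, float]  # sub-pixel camera jitter this frame
    exposure: float = 1.0

def evaluate_dlss(inputs: DlssFrameInputs, target: Tuple[int, int]) -> str:
    """Stand-in for the driver-side model evaluation.

    The game never touches the network weights; Nvidia updates the
    general model via driver/firmware updates, independently of games.
    """
    w, h = target
    return f"{w}x{h} frame"

# Example: 540p internal render, 1080p output.
out = evaluate_dlss(
    DlssFrameInputs(color="540p", depth="540p", motion_vectors="540p",
                    jitter=(0.25, -0.25)),
    target=(1920, 1080),
)
```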
> No, not really. As explained in the video, DLSS 2.0 (and above) now uses one general model for every DLSS implementation, a model which Nvidia will update to improve the quality of DLSS as time goes on (and as of DLSS 2.1, it supports variable-resolution internal rendering to a set target resolution). A Switch game does not require "its own firmware" for DLSS to work. Rather, the game's engine just needs to be updated to support DLSS as a sort of black box that it feeds data, such as motion vectors.

They wouldn't continue to update the drivers and (presumably) distribute those updates with new firmware updates?
> They wouldn't continue to update the drivers and (presumably) distribute those updates with new firmware updates?

General updates to the DLSS models would most likely be milestone updates that coincide with system firmware updates. The current model should be generalized enough that it works with any existing or future game that decides to use DLSS, without the need to retrain the neural network. It just needs to be fed the right sort of data, and mip biases/LODs for the target resolution also have to be taken into account.
I'm not talking about per-game updates; just general updates to the drivers.
> While 2.0 allows for generalized training, I wonder if there are some esoteric cases where a game would be specifically trained for its own use case.

It's likely the engineers would be taking into account all the different case-by-case scenarios, in which case it's down to how they train their algorithms to blend/interpret pixels in every one of them, like those particles in Death Stranding which inadvertently caused a streaking effect.
> I had some time to kill, so because I'd never tested DLSS on my laptop with an RTX 2080 Max-Q, I installed Deliver Us the Moon and turned on DLSS (without and with an internet connection lol). I made a short video:
> The settings are all on "Epic", resolution is 2560x1440, and RTX is activated (of course). The framerate was around 60 fps without OBS recording, but you can still see how the framerate increases almost 2x as soon as DLSS is activated. Obviously the internet connection makes no difference to DLSS.

Well, there you have it, Ookaze. DLSS does not require a constant internet connection to function.
While 2.0 allows for generalized training, I wonder if there are some esoteric cases where a game would be specifically trained for its own use case.
That was 1.0. 2.0 was generalized enough to work even at different target resolutions, and with 2.1 we started seeing a more universalized approach, implementing it engine-wide in Unreal and Unity.
Absolutely. The problem I see with something like RT cores is cost, die space, and power consumption. In a portable device of such a small size, it would be a challenge to include enough of them to use ray tracing efficiently without running into those limitations.
I know and understand that, I was just asking out of curiosity.

For me, DLSS is enough of a shock for Nintendo to be using, never mind RT. I just can't see it because of the limitations you listed. In an alternate world where the Switch was a home console, meaning more die space, it would be a lot more likely.
There are of course the development savings they would make on baking lighting when using RT, but at some point the limitations of the silicon would preclude the full use of RT for GI anyway, so they would be using both methods for lighting, which is spending more, not less.
Overall I expect DLSS for this new Switch chipset, with more cores for RT in the next major redesign.
Someone thought I was full of shit for asking a hypothetical question of how many games would take advantage of DLSS to go from as low as 360p to as high as 1080p in handheld mode if the new model has a 1080p screen.
> Someone thought I was full of shit for asking a hypothetical question of how many games would take advantage of DLSS to go from as low as 360p to as high as 1080p in handheld mode if the new model has a 1080p screen.

Hey, my time identifying traffic lights on CAPTCHAs has finally paid off!
(I don't deny that in general, I'm full of shit at times.)
But here's something that's potentially interesting.
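For what it's worth, the 360p-to-1080p jump in that hypothetical question is a 3x upscale per axis, i.e. 9x the pixel count, which happens to be the same ratio as DLSS 2.1's Ultra Performance mode (e.g. 1440p to 8K). A quick sanity check:

```python
# Pixel arithmetic for a hypothetical 360p -> 1080p handheld upscale.
def scale_factors(src_h: int, dst_h: int, aspect: float = 16 / 9):
    src_w = round(src_h * aspect)   # 360p -> 640 wide
    dst_w = round(dst_h * aspect)   # 1080p -> 1920 wide
    per_axis = dst_h / src_h
    pixel_ratio = (dst_w * dst_h) / (src_w * src_h)
    return per_axis, pixel_ratio

per_axis, pixels = scale_factors(360, 1080)
# per_axis == 3.0, pixels == 9.0: the most aggressive ratio DLSS currently
# offers (Ultra Performance), so this case sits at the edge of plausibility.
```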
> Someone thought I was full of shit for asking a hypothetical question of how many games would take advantage of DLSS to go from as low as 360p to as high as 1080p in handheld mode if the new model has a 1080p screen.

I'll give it a watch.
Just to elaborate on this.
If you decide to use baked lighting in your game, you wouldn't use the Switch hardware to do the baking, so it doesn't really matter whether it has hardware support for it in this case.
You can use any hardware or software combination, and would likely use high-end desktop cards for this.
That said, while in theory using hardware-accelerated RT to bake lighting sounds like a good time saver, the effort to switch to a baking pipeline exploiting the hardware is quite significant, and I'm not even sure there are many games, if any at all, actually doing that.
> How powerful would a full fat Orin chip be should Nintendo decide to use one in a stationary home console? Would it work as-is, or does it need to be reworked for a gaming console?

According to Anandtech, a single chip looks to be 32 SM, which is in between the 3060 and 3060 Ti (28 SM and 38 SM respectively).
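To put 32 SM in perspective: Ampere SMs have 128 FP32 lanes, each doing one fused multiply-add (2 FLOPs) per clock, so theoretical throughput is SMs x 128 x 2 x clock. The console clock below is a pure assumption for illustration; the 3060 figure uses its official boost clock.

```python
# Theoretical FP32 throughput of an Ampere-style GPU.
# TFLOPS = SMs * 128 FP32 lanes * 2 FLOPs per FMA * clock (GHz) / 1000
def ampere_tflops(sms: int, clock_ghz: float) -> float:
    return sms * 128 * 2 * clock_ghz / 1000

hypothetical_docked = ampere_tflops(32, 1.0)  # assumed 1.0 GHz console clock
rtx_3060 = ampere_tflops(28, 1.78)            # ~12.76 TFLOPS at boost clock
```

Paper TFLOPS ignore memory bandwidth and power limits, so this is only a rough upper bound, not a performance prediction.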
Here's the information Nvidia provided about the specs of Orin during GTC China 2019, courtesy of Forbes.

FYI, I've uploaded a link to the slides for that presentation:
SE3071 - DRIVE AGX Hardware Update with NVIDIA Orin.pdf
drive.google.com
Overview shot: [image not captured]
> 8k 30 Dec | 4k 60 Enc - H264/H265/VP9
> Sounds like that would exclude it being a HDMI 2.1 device? Those are HDMI 2.0 specs.

I would expect HDMI 2.1 for future-proofing. Heck, the Switch launched with USB-C in 2017 when the port wasn't even used that much.
> 8k 30 Dec | 4k 60 Enc - H264/H265/VP9

Sounds like that would exclude it being an HDMI 2.1 device? Those are HDMI 2.0 specs.
> Currently ready for that 8K Switch Pro! :P

Prepare for 4.5K greatness.
4K who? Don’t know her
Here's the information Nvidia provided about the specs of Orin during GTC China 2018, courtesy of Forbes.
[image: Orin spec comparison slide, via Forbes]
Note: 2021 Orin is referring to the specs Nvidia revealed about Orin during GTC 2021, whilst 2018 Orin refers to the specs Nvidia revealed about Orin during GTC China 2018.
So, 2021 Orin has ~19.05% more transistors compared to 2018 Orin, which results in a ~21.26% increase in INT8 TOPS for 2021 Orin in comparison to 2018 Orin. 2021 Orin also has ~2.44% more memory bandwidth in comparison to 2018 Orin.
Edit: I wonder what the max transistor density of Samsung's 8N+ process node is. And I guess there's still a possibility the new model's SoC might be fabricated on Samsung's 7 nm (7LPP) or 6 nm (6LPP) process nodes.
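The deltas above are plain (new - old)/old percentages. As a sketch (the 17-billion figure below is Nvidia's publicly stated Orin transistor count, used here only as an assumption for the newer number; it is not from the comparison above):

```python
# Percent-change arithmetic behind the comparisons above.
def pct_more(old: float, new: float) -> float:
    """How much bigger `new` is than `old`, in percent."""
    return (new - old) / old * 100

def back_solve_old(new: float, pct: float) -> float:
    """Recover the older figure from the newer one and a '% more' value."""
    return new / (1 + pct / 100)

# If the 2021 transistor count were 17 billion (an assumption), a ~19.05%
# increase would imply roughly 14.28 billion for the earlier figure.
old_estimate_billion = back_solve_old(17e9, 19.05) / 1e9
```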
They didn’t see it coming, 4.5K everything :P
> 8k 30 Dec | 4k 60 Enc - H264/H265/VP9
> Sounds like that would exclude it being a HDMI 2.1 device? Those are HDMI 2.0 specs.

Not necessarily, since those specs sound like the specs for the 7th-gen NVDEC (Nvidia Decoder) and NVENC (Nvidia Encoder), which are practically the same as the 6th-gen NVDEC and NVENC.
> I'm not sure on the max for 8N+, but considering the other Nvidia GPUs using 8N+, if I did the math right, it should be around 466.6 mm^2 as the die size.

I believe all of the consumer Ampere GPUs are fabricated using Samsung's 8N process, not Samsung's 8N+ process node.
> Here's the information Nvidia provided about the specs of Orin during GTC China 2018, courtesy of Forbes.

I've managed to find a slide from Nvidia from 2020 talking about Orin, and the specs for Orin in that slide are the exact same as the specs Nvidia disclosed during GTC China 2018. So the increase in transistors, which resulted in the increase in INT8 TOPS, and the increase in memory bandwidth are very recent.
> I believe all of the consumer Ampere GPUs are fabricated using Samsung's 8N process, not Samsung's 8N+ process node.

Oh I see, my bad. I will edit my post.
> According to Anandtech, a single chip looks to be 32 SM, which is in between the 3060 and 3060 Ti (28 SM and 38 SM respectively).
> It'd nearly compete with the PS5 in the number of shader cores, but not clock speed.

That sounds super impressive, and I assume such a console would have high compatibility with a hybrid Switch based on the assumed cut-down Orin S version.
> That sounds super impressive, and I assume such a console would have high compatibility with a hybrid Switch based on the assumed cut-down Orin S version.

Don't kid yourself.