How do you view the new Nintendo Switch model in terms of a hardware upgrade?

  • As a mid-gen refresh (e.g. Xbox One S → Xbox One X, etc.)

    Votes: 114 (48.7%)
  • As an iterative successor (e.g. iPhone 11 → iPhone 12, etc.)

    Votes: 120 (51.3%)

  • Total voters
    234
  • Poll closed.

Ookaze

Member
Nov 22, 2017
97
Recent ARM cores are all big.LITTLE compatible. DynamIQ is the latest iteration of ARM's big.LITTLE implementation. Nintendo will retain the fast wake-from-sleep with any new ARM CPU.

The A53s are functional, even on the Mariko TX1. It's just that Nintendo's scheduler needs the 4×A57 cluster to be running whenever the console is not asleep, and the TX1's big.LITTLE cluster-switching implementation then prevents the 4×A53 from running.

Thank you, now that you say it I recall that DynamIQ is supposed to emulate the big.LITTLE features.
So that's one relief, as it's paramount to keep as good a user experience.
The way big.LITTLE works is a bit more complicated than that, and what the Switch's OS task scheduler sees is another matter, but no need to discuss details that have no value for gamers.
 
Apr 11, 2020
1,235
Thank you, now that you say it I recall that DynamIQ is supposed to emulate the big.LITTLE features.
So that's one relief, as it's paramount to keep as good a user experience.
The way big.LITTLE works is a bit more complicated than that, and what the Switch's OS task scheduler sees is another matter, but no need to discuss details that have no value for gamers.
DynamIQ is big.LITTLE for the ARM A55 and A75-A78 CPU families. The main difference from the old implementation is the size of the CPU cluster.

A modern ARM CPU cluster contains 8 cores, where the old one could only contain 4. All cores share the same L3$ memory pool. Some of the cores can also use a dedicated power rail in order to adjust their voltage independently of the other cores (the E2100 has a power rail for its X1 core, whereas the S888's X1 shares the rail used by its three A78s). Moreover, clocks can be adjusted independently on a core-by-core basis, which under the old big.LITTLE could only be done with A73/A53.

That's why the next model will probably have 8 cores in a 4+4 configuration (4×A78 + 4×A55). The other possibility is the A78C, which can be used in a 6- or 8-core layout but without any A55 cores. While that could mean a higher price due to a much greater transistor budget, it could also be the only way to get an 8 MB L3$, which might be necessary to handle PS5 ports.
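A toy sketch (illustrative only; this is not how any kernel actually exposes topology) of the practical difference described above: under TX1-style cluster switching the OS sees only one 4-core cluster at a time, whereas a DynamIQ cluster keeps all eight cores schedulable at once.

```python
# Toy model contrasting TX1-style big.LITTLE cluster switching with a
# DynamIQ-style single 8-core cluster. Core names match the thread's examples.

def tx1_visible_cores(big_cluster_active: bool) -> list[str]:
    # Cluster switching: while the 4xA57s run, the 4xA53s cannot, and vice versa.
    return ["A57"] * 4 if big_cluster_active else ["A53"] * 4

def dynamiq_visible_cores() -> list[str]:
    # DynamIQ: big and little cores share one cluster (and its L3$), and every
    # core is schedulable, with per-core clocks.
    return ["A78"] * 4 + ["A55"] * 4

print(tx1_visible_cores(True))       # only the big cores are usable
print(len(dynamiq_visible_cores()))  # all 8 cores usable at once
```

This is why a DynamIQ design keeps the low-power little cores available for background work (e.g. sleep-adjacent housekeeping) without giving up the big cores.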
 

Hermii

Member
Oct 27, 2017
4,404
I don't understand this DLSS belief for a Switch revision; the more time passes, the less I believe it makes any sense.
Has anyone in this thread given an explanation of how to bypass the internet connection supposedly necessary for DLSS in a Nintendo-compatible way?

I thought that perhaps, just for Nintendo, Nvidia could make game-specific data downloadable to the console, but that's GBs of data, which makes no sense for the Switch's storage and its fast game start-up; it would destroy the user experience.
I could see DLSS being doable in docked mode for some kinds of games, but it's not credible at all for handheld.
I also see lots of phone lobbyists who are apparently unable to understand that the gaming (toy) market has nothing to do with the smartphone market, and who keep making nonsensical comparisons between smartphones and the Switch, be it on hardware or business strategy, apparently just because they don't understand tech and believe the two are the same because they both use ARM chips.

DLSS for the Switch's handheld mode is just not possible today with what I understand of the tech, and I'm convinced it will not appear on a hypothetical revision of the Switch.
For docked mode it's still possible, though I'm not sure about a revision, which makes no business sense (except to those who believe smartphones are in the entertainment market); it looks more like something for the 10th-gen Switch successor console. And that would require Nintendo to apply what they said before and change their strategy a bit, accepting an always-connected console to play games (with DLSS). They could still allow games to be played without a connection (or without a big [GBs] data download and a long wait, which makes little sense) at a lower resolution without DLSS, which would still need to be of consumer quality (that is, 720p; Netflix stats on the proportion of people watching SD content would be interesting to see).

We (at least I) always knew that hardware and specs alone have nothing to do with a game's performance; the programmers have to know how to use the hardware in the first place. I said in 2016 that any game could be made for the Switch. People who don't understand tech (like DF) started saying nonsense like "some games are impossible", and when competent programmers who actually understand tech brought those games to the Switch, those same people called them nonsense like "impossible ports".
Nintendo has competent, very proficient engineers who understand these things better than I do, since they work on games constantly, while I barely have signal-processing 101 knowledge from 25+ years ago (with a bit of updating). And still, in 2016 I could understand something that some people are unable to grasp even today: the "power" of the hardware is not what decides whether a game is doable on current hardware and engines; the programmers' proficiency, plus adequate time and money, is.
I never heard of a requirement for an internet connection to use DLSS. If so, yeah, that's a major caveat for a handheld system.
 

olobolger

Member
Oct 31, 2017
1,229
Andalusia
Wait, what? What is the internet connection used for in DLSS? I'd guess that if it's needed on PC it's purely for telemetry, but as far as I'm aware they never mentioned any need to check a server for the tech to work.
 

Hermii

Member
Oct 27, 2017
4,404
Someone with an RTX card needs to get to the bottom of this immediately.

Disable the internet on your PC and try to play a game with DLSS enabled. Report back.
 

LegendofLex

Member
Nov 20, 2017
4,457
The neural network used to train the DLSS model for each game runs on a server, but the results are delivered locally via drivers and updates. There shouldn't be any technical requirement for a persistent internet connection to use it, even if Nvidia requires one in the current implementations.
 

a11244

Banned
Nov 9, 2017
2,819
DLSS worked even when the game was cracked, and my Pi-hole didn't show anything suspicious related to Nvidia 🤔
 

RailWays

One Winged Slayer
Avenger
Oct 25, 2017
12,469
The neural network used to train the DLSS model for each game runs on a server, but the results are delivered locally via drivers and updates. There shouldn't be any technical requirement for a persistent internet connection to use it, even if Nvidia requires one in the current implementations.
That's also how I thought it was managed: via a driver update.
 

ILikeFeet

Member
Oct 25, 2017
50,249
What the fuck is going on here?

There is no online requirement for DLSS. The algorithm can be updated via driver updates or game updates to improve quality.
 

Corralx

Member
Aug 23, 2018
1,070
London, UK
Thank you, now that you say it I recall that DynamIQ is supposed to emulate the big.LITTLE features.
So that's one relief, as it's paramount to keep as good a user experience.
The way big.LITTLE works is a bit more complicated than that, and what the Switch's OS task scheduler sees is another matter, but no need to discuss details that have no value for gamers.

DynamIQ is not emulating big.LITTLE features.
It's a new, more flexible cluster design *for* big.LITTLE. It's the same concept, but with a new architecture that allows more flexibility in mixing different cores.
And even then, the little cores being there or not has *nothing* to do with the sleep functionality in the Nintendo Switch.

This video is marketing for devs; it's not a white paper or anything like that, and it doesn't give any information on what's needed consumer-wise. Look at the Nvidia DLSS toolkit instead.
Why is nobody talking about the real tech details, only pretty presentations focused on everything but the details necessary for all this to even be usable by the end user?
The power of misinformation is strong when people don't understand tech at all; some talk about chips all day but to this day believe the A53 cores in the Switch are disabled or unused. I don't even know if anyone has explained how Nintendo would replace, or simply abandon, the Switch's ability to wake from sleep so fast; I don't know (I didn't look it up) whether it's still possible with more recent ARM cores (which are no longer big.LITTLE, IIRC).

DLSS does not require an internet connection at all.
You're mixing up the invasive telemetry in the Nvidia driver with the actual technology behind DLSS.
The technology itself has no need for an internet connection; the model is pre-trained and shipped as part of the driver.

Babbling about the power of misinformation and how people don't understand tech, and then making all these claims about DLSS, the state of the A53 cores in the X1, or how new ARM cores are not big.LITTLE and how that's supposedly needed for the sleep functionality (it's not), is all very ironic.
 

brainchild

Independent Developer
Verified
Nov 25, 2017
9,213
Minnesota
random question, but would RT cores help Nintendo's lighting work in any way? Could they achieve better results or do certain things they couldn't before?

Absolutely. The problem I see with something like RT cores is cost, die-space, and power-consumption limitations. In a portable device of such a small size, it would be a challenge to include enough of them to use ray tracing efficiently without running into those limitations.

DLSS does not require an internet connection at all.

You're mixing up the invasive telemetry in the Nvidia driver with the actual technology behind DLSS.

The technology itself has no need for an internet connection; the model is pre-trained and shipped as part of the driver.



Yup
 

SiG

Member
Oct 25, 2017
6,468
The neural network used to train the DLSS model for each game runs on a server, but the results are delivered locally via drivers and updates. There shouldn't be any technical requirement for a persistent internet connection to use it, even if Nvidia requires one in the current implementations.
As mentioned earlier, DLSS no longer requires per-game training; it's a more generalized approach as of the 2.0 update.
 

SiG

Member
Oct 25, 2017
6,468
Ah, right, of course. So the way this would likely work in practice is that each game would require a minimum Switch firmware version.
No, not really. As explained in the video, DLSS 2.0 (and above) now uses one general model for every DLSS implementation, a model Nvidia will update to improve DLSS quality as time goes on (and as of DLSS 2.1, it supports variable-resolution internal rendering to a set target resolution). A Switch game would not need to "have its own firmware" for DLSS to work. Rather, the game's engine just needs to be updated to support DLSS as a sort of black box that it feeds data such as motion vectors.
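As a rough illustration of that "black box" contract, here is a hypothetical sketch of the per-frame data an engine hands to such an upscaler. The type and field names are invented for illustration; this is not Nvidia's actual NGX API.

```python
from dataclasses import dataclass

# Illustrative sketch of the per-frame inputs a game engine feeds a
# DLSS-style temporal upscaler. All names here are hypothetical.
@dataclass
class UpscalerFrameInputs:
    color: bytes                  # aliased, low-resolution color buffer
    depth: bytes                  # depth buffer
    motion_vectors: bytes         # per-pixel motion vectors
    jitter_x: float               # sub-pixel camera jitter for this frame
    jitter_y: float
    render_res: tuple[int, int]   # internal rendering resolution
    target_res: tuple[int, int]   # output resolution after upscaling

frame = UpscalerFrameInputs(b"", b"", b"", 0.25, -0.25, (1280, 720), (2560, 1440))
print(frame.target_res[0] // frame.render_res[0])  # 2: a 2x per-axis upscale
```

The point of the "black box" framing is that the engine's job reduces to producing these buffers correctly each frame; the model itself is opaque to the developer.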
 

brainchild

Independent Developer
Verified
Nov 25, 2017
9,213
Minnesota
Rather, the game's engine just needs to be updated to support DLSS as a sort of black box.

That word is doing a lot of heavy lifting, don't you think?

For example, Crytek talked about how problematic specular highlights along the edges of geometry could screw with the results. This would be highly scene-dependent (though it could be addressed by managing exposure levels). There's nothing about a DLSS black box that would make this a non-problem. I think it's best not to oversimplify how DLSS might be integrated into an engine that wasn't designed for it from the beginning.
 

lexony

Member
Oct 25, 2017
1,910
I had some time to kill, and since I'd never tested DLSS on my laptop with an RTX 2080 Max-Q, I installed Deliver Us the Moon and turned DLSS on (with and without an internet connection, lol). I made a short video:

The settings are all on "Epic", the resolution is 2560×1440, and RTX is enabled (of course). The framerate was around 60 fps without OBS recording, but you can still see the framerate increase almost 2× as soon as DLSS is activated. Obviously the internet connection makes no difference to DLSS.
 
Last edited:

LegendofLex

Member
Nov 20, 2017
4,457
No, not really. As explained in the video, DLSS 2.0 (and above) now uses one general model for every DLSS implementation, a model Nvidia will update to improve DLSS quality as time goes on (and as of DLSS 2.1, it supports variable-resolution internal rendering to a set target resolution). A Switch game would not need to "have its own firmware" for DLSS to work. Rather, the game's engine just needs to be updated to support DLSS as a sort of black box that it feeds data such as motion vectors.
They wouldn't continue to update the drivers and (presumably) distribute those updates with new firmware updates?

I'm not talking about per-game updates; just general updates to the drivers.
 

SiG

Member
Oct 25, 2017
6,468
They wouldn't continue to update the drivers and (presumably) distribute those updates with new firmware updates?

I'm not talking about per-game updates; just general updates to the drivers.
General updates to the DLSS model would most likely be milestone updates that coincide with system firmware updates. The current model should be generalized enough to work with any existing or future game that decides to use DLSS, without the need to retrain the neural network. It just needs to be fed the right sort of data, and mip biases/LODs for the target resolution also have to be taken into account.

(The Crytek presentation explains how not everything in their engine produces motion vectors, but the model still seems to be okay with that, without any noticeable negative impact. Of course, people with keen eyes will likely notice some mild ghosting, like the foliage in Sekiro, but it would still be miles clearer than TAA.)
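The mip-bias point can be made concrete with a commonly cited rule of thumb (treat the log2 ratio as an approximation, not an official Nvidia value): when rendering internally at a lower resolution for an upscaler, textures get a negative LOD bias so that texture detail survives the upscale.

```python
import math

# Rule-of-thumb texture LOD bias when rendering below the output resolution:
# bias = log2(render_width / target_width), i.e. negative when upscaling,
# which makes texture sampling correspondingly sharper.
def mip_bias(render_width: int, target_width: int) -> float:
    return math.log2(render_width / target_width)

print(mip_bias(1280, 2560))           # -1.0: one mip level sharper at a 2x upscale
print(round(mip_bias(720, 1080), 3))  # about -0.585 for a 1.5x upscale
```

Without this bias, a 720p-rendered frame upscaled to 1440p would sample textures as if the output were 720p, and the result would look blurrier than a native 1440p frame.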
 

ILikeFeet

Member
Oct 25, 2017
50,249
while 2.0 allows for generalized training, I wonder if there are some esoteric cases where a game would be specifically trained for its own use case
 

SiG

Member
Oct 25, 2017
6,468
while 2.0 allows for generalized training, I wonder if there are some esoteric cases where a game would be specifically trained for its own use case
It's likely the engineers take all the different case-by-case scenarios into account, in which case it comes down to how they train their algorithms to blend/interpret pixels in each of them, like those particles in Death Stranding which inadvertently caused a streaking effect.
I had some time to kill, and since I'd never tested DLSS on my laptop with an RTX 2080 Max-Q, I installed Deliver Us the Moon and turned DLSS on (with and without an internet connection, lol). I made a short video:

The settings are all on "Epic", the resolution is 2560×1440, and RTX is enabled (of course). The framerate was around 60 fps without OBS recording, but you can still see the framerate increase almost 2× as soon as DLSS is activated. Obviously the internet connection makes no difference to DLSS.
Well, there you have it, Ookaze. DLSS does not require a constant internet connection to function.
 

SiG

Member
Oct 25, 2017
6,468
DLSS certainly doesn't have a single one-size-fits-all model.
That was 1.0. 2.0 is generalized enough to work even at different target resolutions, and with 2.1 we started seeing a more universal approach, with engine-wide implementations in Unreal and Unity.

Sure, it's not going to be used for everything (pixel art, etc.), and it still requires data like motion vectors, but for most use cases it's a good replacement for TAA.
 

Corralx

Member
Aug 23, 2018
1,070
London, UK
That was 1.0. 2.0 is generalized enough to work even at different target resolutions, and with 2.1 we started seeing a more universal approach, with engine-wide implementations in Unreal and Unity.

What I'm saying is that even if it's not trained per game, that doesn't mean it doesn't internally have different specialized models aimed at optimally upscaling different scenarios.
I find it likely they're doing some sort of classification of content and then picking the appropriate model matching a predefined set of hyperparameters.
If you look at text, for example, the artifacts strongly indicate it's being treated differently from game content.
None of this matters for the developer, though; it's all handled transparently by Nvidia.
 

Neurotic

Member
Dec 2, 2020
1,244
Absolutely. The problem I see with something like RT cores is cost limitations, die space limitations, and power consumption limitations. In a portable device of such a small size, it would be a challenge to include enough of them to efficiently use ray tracing with little consequence to the limitations in question.

For me, DLSS is already shock enough for Nintendo to be using, never mind RT. I just can't see it because of the limitations you listed. In an alternate world where the Switch was a home console, meaning more die space, it would be a lot more likely.

There are of course the development savings they would make on baking lighting when using RT, but at some point the limitations of the silicon would preclude the full use of RT for GI anyway, so they would be using both methods for lighting, which is spending more, not less.

Overall I expect DLSS for this new Switch chipset, and more cores for RT in the next major redesign.
 

#Salt

Member
Feb 27, 2021
65
For me, DLSS is already shock enough for Nintendo to be using, never mind RT. I just can't see it because of the limitations you listed. In an alternate world where the Switch was a home console, meaning more die space, it would be a lot more likely.

There are of course the development savings they would make on baking lighting when using RT, but at some point the limitations of the silicon would preclude the full use of RT for GI anyway, so they would be using both methods for lighting, which is spending more, not less.

Overall I expect DLSS for this new Switch chipset, and more cores for RT in the next major redesign.
I know and understand that; I was just asking out of curiosity.
 

bmfrosty

Member
Oct 27, 2017
1,632
SF Bay Area
If it's generalized, then a model set comes with the driver built into the firmware. If it's per game, then the model or models come with the game. If the models make the game hard to fit on a cartridge, they can be downloaded from the eShop as free "4K DLC".

That part is easy enough.

What I wonder about is whether portable clocks will be high enough to use DLSS (it feels like there is some tensor-performance threshold that has to be met to do 360→720, 480→720, or 540→720), and if they're not, then maybe the tensor cores could be powered down in portable mode.
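For a sense of scale, the upscale factors for those handheld targets work out as follows (pure arithmetic on the resolutions quoted above):

```python
# Per-axis and total-pixel upscale factors for the handheld targets above.
# The pixel factor is roughly proportional to how much work the upscaler
# saves the shader cores, at the cost of tensor work per output pixel.
factors = {}
for src in (360, 480, 540):
    axis = 720 / src
    factors[src] = (axis, axis ** 2)
    print(f"{src}p -> 720p: {axis:.2f}x per axis, {axis ** 2:.2f}x pixels")
```

So 360p→720p is a 4× pixel-count jump, while 540p→720p is a much gentler ~1.78×; where the tensor-throughput threshold falls between those would decide which sources are viable in portable mode.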
 

Corralx

Member
Aug 23, 2018
1,070
London, UK
There are of course the development savings they would make on baking lighting when using RT, but at some point the limitations of the silicon would preclude the full use of RT for GI anyway, so they would be using both methods for lighting, which is spending more, not less.

Just to elaborate on this:
If you decide to use baked lighting in your game, you wouldn't use the Switch hardware to do the baking, so it doesn't really matter whether the Switch has hardware support for it.
You can use any hardware or software combination, and would likely use high-end desktop cards for this.
That said, while in theory using hw-accelerated RT to bake lighting sounds like a good time saver, the effort of switching to a baking pipeline that exploits the hardware is quite significant, and I'm not even sure many games, if any at all, are actually doing that.
 
Dec 21, 2020
4,993
I don't understand what on earth happened

To return to a more related discussion: given brainchild's information about how DLSS is implemented, it makes more sense why, way back in September, they mentioned getting games "4K ready". There is more to it than flicking a button, and if developers have known since back then about not only the asset quality needed for 4K but also the implementation of DLSS on the Switch, it makes sense that they would be notified that early.

That a light remaster plus a DLSS implementation can take some time seems like a fair conclusion.
 
Last edited:

Dekuman

Member
Oct 27, 2017
15,373
There's no news, so people are talking in circles again. The biggest recent news is Unity getting DLSS support.
 

JaggiBaggi

Member
Nov 4, 2017
399
...this is the weirdest concern-trolling attempt I've seen in this place so far.

DLSS does not require an internet connection to work.
 
OP
OP
Dakhil

Dakhil

Member
Mar 26, 2019
4,379
Orange County, CA
I don’t understand what on earth happened
Someone thought I was full of shit for asking a hypothetical question about how many games would take advantage of DLSS to go from as low as 360p to as high as 1080p in handheld mode, if the new model has a 1080p screen.

(I don't deny that in general, I'm full of shit at times.)

But here's something that's potentially interesting.
 

FernandoRocker

Avenger
Oct 25, 2017
7,766
México
Someone thought I was full of shit for asking a hypothetical question about how many games would take advantage of DLSS to go from as low as 360p to as high as 1080p in handheld mode, if the new model has a 1080p screen.

(I don't deny that in general, I'm full of shit at times.)

But here's something that's potentially interesting.
Hey, my time identifying traffic lights on CAPTCHAs has finally paid off!
 

ILikeFeet

Member
Oct 25, 2017
50,249
Someone thought I was full of shit for asking a hypothetical question about how many games would take advantage of DLSS to go from as low as 360p to as high as 1080p in handheld mode, if the new model has a 1080p screen.

(I don't deny that in general, I'm full of shit at times.)

But here's something that's potentially interesting.
I'll give it a watch
 

Neurotic

Member
Dec 2, 2020
1,244
Just to elaborate on this:
If you decide to use baked lighting in your game, you wouldn't use the Switch hardware to do the baking, so it doesn't really matter whether the Switch has hardware support for it.
You can use any hardware or software combination, and would likely use high-end desktop cards for this.
That said, while in theory using hw-accelerated RT to bake lighting sounds like a good time saver, the effort of switching to a baking pipeline that exploits the hardware is quite significant, and I'm not even sure many games, if any at all, are actually doing that.

Of course; my point was that financially it would be better for Nintendo to wait and go all-in on RT once their hardware can fully support it. I think it's pretty well accepted that RT for most things would be significantly cheaper development-wise than spending thousands of hours baking lighting into everything.
 

brainchild

Independent Developer
Verified
Nov 25, 2017
9,213
Minnesota
But here's something that's potentially interesting.

FYI, I've uploaded a link to the slides for that presentation:


Overview shot:

Screenshot2021042011.jpg
 

Dekuman

Member
Oct 27, 2017
15,373
How powerful would a full-fat Orin chip be, should Nintendo decide to use one in a stationary home console? Would it work as-is, or would it need to be reworked for a gaming console?
 

ILikeFeet

Member
Oct 25, 2017
50,249
How powerful would a full-fat Orin chip be, should Nintendo decide to use one in a stationary home console? Would it work as-is, or would it need to be reworked for a gaming console?
according to AnandTech, a single chip looks to be 32 SMs, which is in between the 3060 and the 3060 Ti (28 and 38 SMs respectively).

it'd nearly compete with the PS5 in the number of shader cores, but not in clock speed
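As a rough sanity check on that comparison (SM/CU counts as quoted above; note Ampere SMs can dual-issue FP32, so a "core" count depends on whether you count 64 or 128 lanes per SM):

```python
# Back-of-envelope FP32 lane counts. The PS5 figure uses its well-known
# 36 active CUs x 64 lanes; the Orin SM count is the one quoted upthread.
orin_sm = 32
ps5_cu = 36

print(orin_sm * 64)    # 2048: counting one FP32 path per Ampere SM
print(orin_sm * 128)   # 4096: counting Ampere's dual FP32 issue
print(ps5_cu * 64)     # 2304: PS5 shader lanes
```

On the conservative 64-lanes-per-SM count, 2048 vs 2304 is indeed "nearly competing"; the dual-issue count flatters Ampere but isn't directly comparable to RDNA2's figure.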
 
OP
OP
Dakhil

Dakhil

Member
Mar 26, 2019
4,379
Orange County, CA
FYI, I've uploaded a link to the slides for that presentation:


Overview shot:

Screenshot2021042011.jpg
Here's the information Nvidia provided about the specs of Orin during GTC China 2019, courtesy of Forbes.

Note: "2021 Orin" refers to the specs Nvidia revealed about Orin during GTC 2021, whilst "2019 Orin" refers to the specs Nvidia revealed about Orin during GTC China 2019.

So, 2021 Orin has ~19.05% more transistors than 2019 Orin, which results in a ~21.26% increase in INT8 TOPS; 2021 Orin also has ~2.44% more memory bandwidth.

Edit: I wonder what the max transistor density of Samsung's 8N+ process node is. And I guess there's still a possibility the new model's SoC might be fabricated on Samsung's 7 nm (7LPP) or 6 nm (6LPP) process node.
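A quick helper for sanity-checking percentage-increase claims like these. The example inputs below are placeholders back-solved purely to reproduce the quoted percentages; they are not official Orin datasheet figures.

```python
# Percentage increase from an old value to a new one.
def pct_increase(old: float, new: float) -> float:
    return (new - old) / old * 100.0

# Placeholder inputs chosen only to exercise the formula against the
# percentages quoted above (NOT Nvidia's actual numbers):
print(round(pct_increase(21.0, 25.0), 2))     # 19.05
print(round(pct_increase(200.0, 204.88), 2))  # 2.44
```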
 
Last edited:

phyl0x

Member
Nov 30, 2020
605
8k 30 Dec | 4k 60 Enc - H264/H265/VP9

Sounds like that would exclude it being an HDMI 2.1 device? Those are HDMI 2.0 specs.
 
Dec 21, 2020
4,993
Here's the information Nvidia provided about the specs of Orin during GTC China 2019, courtesy of Forbes.

Note: "2021 Orin" refers to the specs Nvidia revealed about Orin during GTC 2021, whilst "2019 Orin" refers to the specs Nvidia revealed about Orin during GTC China 2019.

So, 2021 Orin has ~19.05% more transistors than 2019 Orin, which results in a ~21.26% increase in INT8 TOPS; 2021 Orin also has ~2.44% more memory bandwidth.

Edit: I wonder what the max transistor density of Samsung's 8N+ process node is. And I guess there's still a possibility the new model's SoC might be fabricated on Samsung's 7 nm (7LPP) or 6 nm (6LPP) process node.
I'm not sure of the max for 8N+, but going by the other Nvidia GPUs using 8N+, if I did the math right, the die size should be around 466.6 mm^2.


It is a very large chip
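The die-size arithmetic above can be sketched as follows. Assumptions: the GA102 figures (~28.3 billion transistors in ~628.4 mm^2) are its published numbers for a Samsung 8nm-class node, and the ~21 billion Orin transistor count is inferred from the ~466.6 mm^2 estimate, not taken from an Nvidia datasheet.

```python
# Estimate a die size by scaling from a known chip's transistor density.
ga102_transistors = 28.3e9   # published GA102 figure
ga102_die_mm2 = 628.4        # published GA102 die size
density_per_mm2 = ga102_transistors / ga102_die_mm2  # ~45 million/mm^2

orin_transistors = 21.0e9    # assumed, back-solved from the estimate above
orin_die_mm2 = orin_transistors / density_per_mm2
print(f"{orin_die_mm2:.1f} mm^2")  # roughly 466 mm^2
```

The estimate is only as good as the density assumption: a denser 8N+ (or a move to 7LPP/6LPP) would shrink the die accordingly.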


disregard
prepare for 4.5K greatness
They didn’t see it coming, 4.5K everything :P
 
Last edited:
OP
OP
Dakhil

Dakhil

Member
Mar 26, 2019
4,379
Orange County, CA
8k 30 Dec | 4k 60 Enc - H264/H265/VP9

Sounds like that would exclude it being an HDMI 2.1 device? Those are HDMI 2.0 specs.
Not necessarily, since those specs sound like the specs for the 7th-gen NVDEC (Nvidia Decoder) and NVENC (Nvidia Encoder), which are practically the same as the 6th-gen NVDEC and NVENC.

I'm not sure of the max for 8N+, but going by the other Nvidia GPUs using 8N+, if I did the math right, the die size should be around 466.6 mm^2.
I believe all of the consumer Ampere GPUs are fabricated on Samsung's 8N process, not the 8N+ process.

Here's the information Nvidia provided about the specs of Orin during GTC China 2019, courtesy of Forbes.

Note: "2021 Orin" refers to the specs Nvidia revealed about Orin during GTC 2021, whilst "2019 Orin" refers to the specs Nvidia revealed about Orin during GTC China 2019.

So, 2021 Orin has ~19.05% more transistors than 2019 Orin, which results in a ~21.26% increase in INT8 TOPS; 2021 Orin also has ~2.44% more memory bandwidth.

Edit: I wonder what the max transistor density of Samsung's 8N+ process node is. And I guess there's still a possibility the new model's SoC might be fabricated on Samsung's 7 nm (7LPP) or 6 nm (6LPP) process node.
I've managed to find a slide from Nvidia from 2020 talking about Orin, and the specs for Orin in that slide are exactly the same as the specs Nvidia disclosed during GTC China 2019. So the increase in transistors (which resulted in the increase in INT8 TOPS) and the increase in memory bandwidth are very recent.
 

Dekuman

Member
Oct 27, 2017
15,373
according to AnandTech, a single chip looks to be 32 SMs, which is in between the 3060 and the 3060 Ti (28 and 38 SMs respectively).

it'd nearly compete with the PS5 in the number of shader cores, but not in clock speed
That sounds super impressive, and I assume such a console would have high compatibility with a hybrid Switch based on the assumed cut-down "Orin S" version.