How do you view the new Nintendo Switch model in terms of a hardware upgrade?

  • As a mid-gen refresh (e.g. Xbox One S → Xbox One X, etc.)

    Votes: 114 (48.7%)

  • As an iterative successor (e.g. iPhone 11 → iPhone 12, etc.)

    Votes: 120 (51.3%)

  • Total voters: 234
  • Poll closed.
Oct 27, 2017
744
Honestly, even with all the buzz about it, I don't really care about 4K visuals for the Switch. I feel the Switch needs more power to ensure stable framerates and help devs more than anything else. I'd rather have games run at the same resolution as Super Mario Odyssey, but smoothly. I actually put off playing Age of Calamity because of the framerate drops in the opening level, hoping that a Pro would come out and give it a little boost. The framerate drops in Link's Awakening, as much as I loved the game, made me a little queasy at times :/
 

Alovon11

Member
Jan 8, 2021
1,031
That would make sense for an automotive part, but for a tablet part, probably not. In Ampere terms, that's 1536 CUDA cores.
Even half of that would be a 6SM part, which is well beyond the PS4 GPU-wise, since 4 SMs of Lovelace at decent clocks would likely be roughly equal. So even an underclocked 6SM part would match or beat the PS4's GPU before DLSS.
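The CUDA-core math behind comparisons like this is easy to sketch. The sketch below assumes Ampere's 128 FP32 CUDA cores per SM and 2 FLOPs per core per clock (one fused multiply-add); the clock speeds are illustrative guesses, not leaked figures, and the PS4 number is its well-known 1.84 TFLOPS:

```python
# Rough theoretical-FLOPS comparison: hypothetical Ampere-style parts vs. the PS4 GPU.
# Assumptions (illustrative, not confirmed specs): 128 CUDA cores per SM,
# 2 FLOPs per core per clock (FMA), and guessed clock speeds.

CORES_PER_SM = 128            # Ampere: 128 FP32 CUDA cores per SM
FLOPS_PER_CORE_PER_CLOCK = 2  # one fused multiply-add = 2 FLOPs

def tflops(sm_count: int, clock_ghz: float) -> float:
    """Theoretical FP32 TFLOPS for a given SM count and clock."""
    return sm_count * CORES_PER_SM * FLOPS_PER_CORE_PER_CLOCK * clock_ghz / 1000

ps4_tflops = 1.84  # PS4 GPU: 18 GCN CUs at 800 MHz

for sms in (6, 8, 12):
    for clock in (0.8, 1.0, 1.2):
        verdict = "above" if tflops(sms, clock) > ps4_tflops else "below"
        print(f"{sms} SMs @ {clock} GHz: {tflops(sms, clock):.2f} TFLOPS ({verdict} PS4's {ps4_tflops})")
```

Note that on raw FLOPS alone a 6SM part only passes the PS4 above roughly 1.2 GHz; the posts above are also factoring in Ampere's per-FLOP efficiency advantage over GCN.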
 

Alovon11

Member
Jan 8, 2021
1,031
Would it be safe to say that 1012 CUDA cores is the absolute maximum amount of CUDA cores to expect for the T239, considering that Orin has 2024 CUDA cores? (I'm not saying it's likely, mind you.)
Yeah, 8 SMs is 1024 CUDA cores, so that is the upper bound of what we've been predicting.

But what Kopite just said means that even cutting the GPC's GPU resources in half would result in a 6SM part with 768 CUDA cores.

So GPU-wise, the T239 will in all likelihood beat the OG PS4's GPU in performance before DLSS, unless they set the clock speeds to sub-1 GHz.

And a 6SM GPU does potentially increase the chances of the CPU being better than the 4+4 A78/A55 config discussed at the lower end.

I could see a 6+2, 7+1, or 8-core config with this if it indeed has a 6 or 8SM GPU.

7+1 or 8 would give developers full access to port stuff over, especially since the Switch OLED Model makes it likely that whatever system the T239 ends up in will be marketed as a successor (à la iPhone 5 → 6), i.e. the kind of iterative successor Microsoft seems to be planning with the Xbox Series systems.

In other words, in 2024/25 we get the Series S2 and X2, in 2028/29 the S3 and X3, and so on, with old systems switching to cloud streaming or getting support dropped over time.

Nintendo could do the exact same thing, as Nvidia has its own streaming service in GeForce Now that old Switches could leverage for new releases when it's not possible to backport them.
 

4859

Banned
Oct 27, 2017
7,046
In the weak and the wounded

A GPC is Nvidia's way of organizing their architecture.

It's basically... a self-contained GPU. It has the raster engine and the SMs (which contain the CUDA cores, the shaders), the tensor cores, the RT cores, all the odds and ends; all that stuff is packed into each GPC. So looking at something like that Orin, which has 12 GPCs, you're basically looking at 12 GPUs strung together. The only thing I don't think they contain are the texture units, which are now organized in their own block. Nvidia can scale the number of GPCs in a product to meet the needs of the device, down to the Orin S that's going to be the SoC for the new generation of Drive internal camera systems, which looks to have 1 GPC and run at 15 watts. It's this product I think Kopite was talking about in that tweet.

GPCs are divided into halves called partitions. It looks like one partition from Orin is likely going to be the max bounds for fitting close enough to the current Switch's power draw with this new SoC.
 

BlueManifest

One Winged Slayer
Banned
Oct 25, 2017
11,678
Yeah, 8 SMs is 1024 CUDA cores, so that is the upper bound of what we've been predicting.

But what Kopite just said means that even cutting the GPC's GPU resources in half would result in a 6SM part with 768 CUDA cores.

So GPU-wise, the T239 will in all likelihood beat the OG PS4's GPU in performance before DLSS, unless they set the clock speeds to sub-1 GHz.

And a 6SM GPU does potentially increase the chances of the CPU being better than the 4+4 A78/A55 config discussed at the lower end.

I could see a 6+2, 7+1, or 8-core config with this if it indeed has a 6 or 8SM GPU.

7+1 or 8 would give developers full access to port stuff over, especially since the Switch OLED Model makes it likely that whatever system the T239 ends up in will be marketed as a successor (à la iPhone 5 → 6), i.e. the kind of iterative successor Microsoft seems to be planning with the Xbox Series systems.

In other words, in 2024/25 we get the Series S2 and X2, in 2028/29 the S3 and X3, and so on, with old systems switching to cloud streaming or getting support dropped over time.

Nintendo could do the exact same thing, as Nvidia has its own streaming service in GeForce Now that old Switches could leverage for new releases when it's not possible to backport them.
Well, the Switch was stronger than the PS3, so it wouldn't be a stretch for the Switch 2 to be stronger than a PS4.
 

JoshuaJSlone

Member
Dec 27, 2017
711
Indiana
When Nintendo does release a next-gen Switch 2, do you think they will cut back on some features so they can release better versions of the system later?

For example, will they go back to LCD so they can release a Switch 2 OLED?

Will they go back to a 6-inch screen so they can release a 7-inch Switch 2?
Looking at how things have worked on Nintendo portables this century, there really hasn't been regression in screen quality: GBA improved to GBA SP, improved to DS, improved to DS Lite, and so on. Size is another matter; once they started making XL models they didn't make ALL screens XL, they just continued to offer that as an option. But I don't think they'd arbitrarily decide, "Let's make a launch machine with giant bezels so we can sell a less-bezel update in 4 years."
If Nintendo wants to utilize the Switch for VR, like they did with Labo but with a real headset that leaves the hands free, then I hope for an even higher resolution in portable mode, like 1440p (in which case, run games at 720p and upscale to 1440p with DLSS).
If they want to do VR at all they might as well do it right and make it its own SKU rather than making a Labo-like solution that makes for an awkward front-heavy feel. When good $300 standalone VR exists, a $300+ flat game machine plus extra VR accessory that works out to a half-assed solution is a hard sell.
 
Dec 21, 2020
4,993
Hm, what are the odds that the difference between Orin and Orin S is mostly down to how many CUDA cores each has, and not how many CPU cores?

As in, they all have the same number of A78AE cores (12), but with a bigger variance in CUDA cores?
 
OP

Dakhil
Member
Mar 26, 2019
4,379
Orange County, CA
Looking at how things have worked on Nintendo portables this century, there really hasn't been regression in screen quality: GBA improved to GBA SP, improved to DS, improved to DS Lite, and so on. Size is another matter; once they started making XL models they didn't make ALL screens XL, they just continued to offer that as an option. But I don't think they'd arbitrarily decide, "Let's make a launch machine with giant bezels so we can sell a less-bezel update in 4 years."
The Nintendo 3DS did have the IPS vs TN screens controversy if that counts as an example of screen quality regression.
 

Hermii

Member
Oct 27, 2017
4,404
Well, the Switch was stronger than the PS3, so it wouldn't be a stretch for the Switch 2 to be stronger than a PS4.
You are comparing a 2006 console to a 2017 portable console (11 years), and a 2013 console to a 2022(?) portable console (9 years).

I mean, it may be correct anyway, but just comparing numbers is not entirely fair. It's far from safe to assume the Switch 3 will be a portable PS5.
 

Adulfzen

Member
Oct 29, 2017
2,880
You are comparing a 2006 console to a 2017 portable console (11 years), and a 2013 console to a 2022(?) portable console (9 years).

I mean, it may be correct anyway, but just comparing numbers is not entirely fair. It's far from safe to assume the Switch 3 will be a portable PS5.
While it may not be one to one, I thought at least some aspects of the DLSS Switch, like the CPU, would surpass the PS4; but obviously we won't know how powerful it really is until we have an ample number of ports to compare.
 

Hermii

Member
Oct 27, 2017
4,404
While it may not be one to one, I thought at least some aspects of the DLSS Switch, like the CPU, would surpass the PS4; but obviously we won't know how powerful it really is until we have an ample number of ports to compare.
Absolutely!

It was just the "Switch was stronger than the PS3, so the Switch 2 must be stronger than a PS4" logic I had an issue with.
 
Dec 21, 2020
4,993
While it may not be one to one, I thought at least some aspects of the DLSS Switch, like the CPU, would surpass the PS4; but obviously we won't know how powerful it really is until we have an ample number of ports to compare.
Architecturally speaking, Ampere (and by extension Lovelace) is much more advanced than the GCN 1.0 architecture the PS4 and XB1 had. While it can have the same flops, it's a completely different architecture that sets it a generation apart in feature set. In other words, an Ampere-based machine would be more performant than meets the eye.

If the PS4 were RDNA 2 and the Switch were Lovelace, with the same GPU paper specs in flops, then I would agree that we would have to wait and see.
 
OP

Dakhil

Member
Mar 26, 2019
4,379
Orange County, CA
I keep seeing 8096 and 1012. It feels like someone mistyped something and all the math has been off ever since. Should be 8192 and 1024.
It's also possible that the 8096 CUDA cores number from NIO is outdated. NIO mentions that there are 68 billion transistors in total across 4 Orin SoCs, which works out to 17 billion transistors per Orin SoC, consistent with what Nvidia initially said about Orin. But Nvidia more recently mentioned during GTC 2021 that Orin has 21 billion transistors. So there's a possibility that the number of CUDA cores Orin has might have changed.
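The transistor arithmetic is simple enough to check (a sketch; the figures are the ones quoted above, not confirmed specs):

```python
# Cross-checking the Orin transistor figures mentioned above.
nio_total = 68e9          # NIO: 68 billion transistors across 4 Orin SoCs
per_soc = nio_total / 4   # implied per-SoC count
gtc_2021 = 21e9           # Nvidia at GTC 2021: 21 billion per Orin SoC

print(per_soc / 1e9)                 # 17.0, matching Nvidia's initial Orin figure
print(round(gtc_2021 / per_soc, 2))  # 1.24, i.e. the newer figure is ~24% larger
```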

Hm, what are the odds that the difference between Orin and Orin S is mostly down to how many CUDA cores each has, and not how many CPU cores?

As in, they all have the same number of A78AE cores (12), but with a bigger variance in CUDA cores?
Personally, the odds seem pretty low to me, since the 12 Cortex-A78AE cores seem to take up a good chunk of space in the SoC, going by the pre-annotated picture of Orin.
 

japtor

Member
Jan 19, 2018
975
I'm not sure if it was already discussed, but are we expecting the "new" dock to have that 4K-capable chip?
If that's the case, I'd say it's proof that the "Pro" model was/is in the works, because I don't see Nintendo creating yet another dock version and breaking cross-compatibility between the different models.

It would make total sense to have the v2 discontinued in the near future and the OLED and 4K models sharing most of their components (dock, screen, case, memory, etc.) to consolidate production.

Then again, it would have made a lot more sense to launch them at the same time if they really were working on two new SKUs.
But I think the first teardown will give us some hints...
That feels pretty likely and basically ties up the last bit of the Aula firmware rumor. 4K output is normal enough at this point that I'd be surprised if there were big savings in making two sets of boards with different chips versus a single line of new docks that can handle everything. Plus it becomes one less thing to deal with when ramping up production of the new one.
Generally, I think Apple priced the standard iPad 32GB at $329 due to retailers discounting it down to $250 Pre-Covid. (Heck, I even managed to get the 128 GB post Covid at a $50 discount after X-mas) The street price from other than Apple was generally lower than the $329 figure. I don't think I have ever seen a sale on a brand new Nintendo Switch (other than a Black Friday bundle w/ Mario Kart and the $35 eShop gift card bundle)
It's been that way for a while. The low end iPad was always officially $329, but retailers regularly put it on sale (simultaneously even) for $250. Basically seems like a tacitly approved sale price by Apple.
If they want to do VR at all they might as well do it right and make it its own SKU rather than making a Labo-like solution that makes for an awkward front-heavy feel. When good $300 standalone VR exists, a $300+ flat game machine plus extra VR accessory that works out to a half-assed solution is a hard sell.
Yeah the desire for a 1080p screen upgrade mainly for VR has always been weird to me. It'd still be low res for VR and the form factor would still suck.
 

BlueManifest

One Winged Slayer
Banned
Oct 25, 2017
11,678
You are comparing a 2006 console to a 2017 portable console (11 years), and a 2013 console to a 2022(?) portable console (9 years).

I mean, it may be correct anyway, but just comparing numbers is not entirely fair. It's far from safe to assume the Switch 3 will be a portable PS5.
It will release in 2023 IMO, so that will be 10 years from the PS4; and I believe the Switch was originally supposed to release holiday 2016, which would have been 10 years from the PS3.
 
OP

Dakhil

Member
Mar 26, 2019
4,379
Orange County, CA
If the OLED model's dock does have a DisplayPort 1.4 to HDMI 2.1 converter chip, I wonder which USB version (e.g. USB 3.2 Gen 1, USB 3.2 Gen 2x2, USB4 Gen 3x2, etc.) is going to be used for the male USB-C port inside the OLED model's dock and the OLED model console by extension, assuming the DLSS model is going to reuse the OLED model's dock.
 

KlaxOnKlaxOff

Member
Mar 3, 2021
48
I was not aware that there was an issue with the existing dock USB port/Ethernet adaptor. Something about packet loss? Anyone know how severely that limits bandwidth or connectivity on the existing Switch? I wonder if the Ethernet port on the new dock is upgraded and won’t have that issue.
 
Dec 21, 2020
4,993
It's also possible that the 8096 CUDA cores number from NIO is outdated. NIO mentions that there are 68 billion transistors in total across 4 Orin SoCs, which works out to 17 billion transistors per Orin SoC, consistent with what Nvidia initially said about Orin. But Nvidia more recently mentioned during GTC 2021 that Orin has 21 billion transistors. So there's a possibility that the number of CUDA cores Orin has might have changed.


Personally, the odds seem pretty low to me, since the 12 Cortex-A78AE cores seem to take up a good chunk of space in the SoC, going by the pre-annotated picture of Orin.
While they do take a big amount of die space, the CUDA cores take a significant amount of die space as well. That said, it was only a hypothetical anyway. I wonder how much of what they take from Orin (if they take from Orin at all) will be modified.

And I wonder, too, if Orin S was actually scrapped...
 

prid13

Member
Mar 31, 2019
122
Honestly, even with all the buzz about it, I don't really care about 4K visuals for the Switch. I feel the Switch needs more power to ensure stable framerates and help devs more than anything else. I'd rather have games run at the same resolution as Super Mario Odyssey, but smoothly. I actually put off playing Age of Calamity because of the framerate drops in the opening level, hoping that a Pro would come out and give it a little boost. The framerate drops in Link's Awakening, as much as I loved the game, made me a little queasy at times :/

This ^ I've been holding off on Astral Chain and Link's Awakening in the hope that a Pro model would give me a smoother experience. Guess I can't wait any longer :/
 
OP

Dakhil

Member
Mar 26, 2019
4,379
Orange County, CA
I was not aware that there was an issue with the existing dock USB port/Ethernet adaptor. Something about packet loss? Anyone know how severely that limits bandwidth or connectivity on the existing Switch? I wonder if the Ethernet port on the new dock is upgraded and won’t have that issue.
I was talking about why the back USB port on the dock only runs at USB 2.0 speeds instead of USB 3.0 speeds, which is probably due to potential wireless interference caused by USB 3.0. As a result, when connecting a USB-to-gigabit-Ethernet adapter to the back USB port on the dock, the max theoretical data transfer rate is limited to USB 2.0's ceiling of 480 Mb/s (60 MB/s).

As for the built-in Ethernet port on the OLED model's dock, it depends on whether Nintendo limits the max theoretical data transfer rate to 480 Mb/s (60 MB/s), which is certainly a possibility, although I don't know if Nintendo's going to do so.
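The ceiling here is just unit conversion (a sketch; real-world USB 2.0 throughput lands meaningfully below the theoretical 480 Mb/s once protocol overhead is counted):

```python
# Why a gigabit Ethernet adapter behind a USB 2.0 link tops out at 480 Mb/s.
USB2_MBPS = 480    # USB 2.0 Hi-Speed signalling rate, megabits per second
GIGE_MBPS = 1000   # gigabit Ethernet line rate

def mbits_to_mbytes(mbits: float) -> float:
    """Megabits/s to megabytes/s (8 bits per byte)."""
    return mbits / 8

# The adapter is capped by the slower link in the chain:
ceiling = min(USB2_MBPS, GIGE_MBPS)
print(ceiling)                   # 480 (Mb/s)
print(mbits_to_mbytes(ceiling))  # 60.0 (MB/s)
```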
 

Neurotic

Member
Dec 2, 2020
1,244
Honestly, even with all the buzz about it, I don't really care about 4K visuals for the Switch. I feel the Switch needs more power to ensure stable framerates and help devs more than anything else. I'd rather have games run at the same resolution as Super Mario Odyssey, but smoothly. I actually put off playing Age of Calamity because of the framerate drops in the opening level, hoping that a Pro would come out and give it a little boost. The framerate drops in Link's Awakening, as much as I loved the game, made me a little queasy at times :/

Well, I completed AoC early this year, as I expected BotW 2 this winter and didn't want too much Zelda in one time period, but I totally agree. Some of the performance metrics in that game are truly embarrassing.

As great as the Switch is as a concept, and as successful as it is commercially, some of the technical performances in games have been awful, even in first-party titles or first-party IP handled by second/third-party developers. Most shocking of all is Super Mario Odyssey, which runs closer to 800p most of the time (720p in New Donk City) while also having many framerate drops, in a series that has been absolutely rock solid in that regard for the past 12+ years.

It irks me that they've now not once (Lite), not twice (red box with much better battery life) but three times (OLED with better battery life and a larger, higher-quality screen) catered to handheld players with a hybrid device that, by their own data, is used fairly evenly in each mode. The fact that they couldn't even boost the CPU and GPU clocks in the new OLED model while docked, to achieve higher resolutions in dynamic-resolution games on 4K displays and smoother framerates in the likes of AoC, is really annoying to me; unlike the red box Switch (whose dock was identical to the original's), they could have put a little extra fan or more space at the back of the newly redesigned dock to accommodate higher clocks (even just 20-30% higher).

The OLED is, to me, their first major misstep with the Switch brand when you consider you can buy a Series S and a year of Game Pass Ultimate for the same price (which lets you play games on the go). Of course it will sell out at launch due to being new hardware, along with the continued Covid bump gaming has had in general.

My general hope for the Switch 4K is that it's been much more of a collaborative effort between Nintendo and Nvidia hardware-wise, so that we get at least native 1080p games when docked, with rock-solid performance. Anything under 1920x1080 on a 4K screen is starting to look extremely dated.
 

bmfrosty

Member
Oct 27, 2017
1,632
SF Bay Area
I was talking about why the back USB port on the dock only runs at USB 2.0 speeds instead of USB 3.0 speeds, which is probably due to potential wireless interference caused by USB 3.0. As a result, when connecting a USB-to-gigabit-Ethernet adapter to the back USB port on the dock, the max theoretical data transfer rate is limited to USB 2.0's ceiling of 480 Mb/s (60 MB/s).

As for the built-in Ethernet port on the OLED model's dock, it depends on whether Nintendo limits the max theoretical data transfer rate to 480 Mb/s (60 MB/s), which is certainly a possibility, although I don't know if Nintendo's going to do so.
USB 2 and USB 3 on USB-C use entirely different wires. When you're using alternate modes, the USB 3 data lanes run the other protocol (DisplayPort) instead of USB. The USB 2 signal is unaffected by this and can still be used for data as normal.

There are of course a bunch of caveats to this, but generally, if you're using alternate modes, you don't get USB 3 data lanes to work with.
 

karmitt

Member
Oct 27, 2017
4,735
Honestly, even with all the buzz about it, I don't really care about 4K visuals for the Switch. I feel the Switch needs more power to ensure stable framerates and help devs more than anything else. I'd rather have games run at the same resolution as Super Mario Odyssey, but smoothly. I actually put off playing Age of Calamity because of the framerate drops in the opening level, hoping that a Pro would come out and give it a little boost. The framerate drops in Link's Awakening, as much as I loved the game, made me a little queasy at times :/

It’s two sides of the same coin for me.

Games that manage a stable framerate but do so at the expense of resolution are also extremely off-putting. I let myself say I want 4K because we have reporting that it's something Nintendo is aiming for, but what I really want is 1080p or better with a locked 30 fps, or even 60. I'm tired of blurry titles on the TV; it's enough to make me not want to dock the Switch at all. I don't expect these titles to be as visually dense or complex as XSX releases; I just want that same clean image quality on a 65" display.

It's probably the reason I've convinced myself to upgrade to this OLED device (despite bitching about the price bump). At least I can make the most of handheld play until they actually do launch the upgrade.
 
Dec 21, 2020
4,993
Edit (rephrased): how long do you reckon a small chip (120 mm² or less) takes from tape-out to release in a product?
 

4859

Banned
Oct 27, 2017
7,046
In the weak and the wounded


It’s two sides of the same coin for me.

Games that manage a stable framerate but do so at the expense of resolution are also extremely off-putting. I let myself say I want 4K because we have reporting that it's something Nintendo is aiming for, but what I really want is 1080p or better with a locked 30 fps, or even 60. I'm tired of blurry titles on the TV; it's enough to make me not want to dock the Switch at all. I don't expect these titles to be as visually dense or complex as XSX releases; I just want that same clean image quality on a 65" display.

It's probably the reason I've convinced myself to upgrade to this OLED device (despite bitching about the price bump). At least I can make the most of handheld play until they actually do launch the upgrade.

Then you both should be super stoked about a DLSS-capable Switch.
 

Lwill

Member
Oct 28, 2017
1,524
I can't deny this possibility but I believe kopite is right for one specific reason that I will not share publicly (though I've shared with people I trust like NateDrake in private); basically, based on something I heard happened, it made sense that someone would end up giving the kind of info that kopite eventually gave (though I didn't know he would be the one to do it, I just expected it to happen around the time that it did).

To be clear, I don't know for certain that that codename ties back to Nintendo's chip, but given how and when the info came about, I have no reason to think that kopite is wrong about the Nintendo association.

And that's all I'll say about that.
Hey Brainchild, I think at one point you implied (or at least others interpreted your response as implying) that the chipset had complications and that's why it hadn't been taped out yet. Given what we know now, can we assume that all things are normal and that this system wasn't scheduled for a 2021 release anyway?
 

BDGAME

Member
Oct 27, 2017
921
Brasília
If they want to do VR at all they might as well do it right and make it its own SKU rather than making a Labo-like solution that makes for an awkward front-heavy feel. When good $300 standalone VR exists, a $300+ flat game machine plus extra VR accessory that works out to a half-assed solution is a hard sell.
I agree. A 1080p Switch 2 and a wireless headset with technology similar to the Wii controls could be perfect, even if it's pricey.
 
Dec 21, 2020
4,993
Hey Brainchild, I think at one point you implied (or at least others interpreted your response as implying) that the chipset had complications and that's why it hadn't been taped out yet. Given what we know now, can we assume that all things are normal and that this system wasn't scheduled for a 2021 release anyway?
I don't think that was really Brainchild, but I could be mistaken.

That said, I do remember something about a chip being misinterpreted as a "complication".
 

JershJopstin

Member
Oct 25, 2017
5,323
I'd like to point out that he listed that specific chip as an example and not the actual one.
As for the built-in Ethernet port on the OLED model's dock, it depends on whether Nintendo limits the max theoretical data transfer rate to 480 Mb/s (60 MB/s), which is certainly a possibility, although I don't know if Nintendo's going to do so.
The built-in Ethernet port is almost certainly still going to be USB internally, as the only alternatives are to use USB4 (which might have the same noise issues?) and tunnel it as PCIe, or to use their own proprietary Alt Mode and send it that way. A built-in USB-to-Ethernet chip sounds like the cheapest solution to me, and they could even just integrate the same AX88179 they already support (it's the chip in the licensed adapter, and it does support USB 3.0).

I imagine they'll have a lot more maneuvering room to contain the interference with the absence of the type A connector. Additionally, it seems they were caught off-guard by the interference issues last time based on the broken promise for future support. This time the dock may have been built with these issues in mind, but ultimately we won't know until support for the port pops up in firmware.

If it never does, then it's likely just the AX88179 built into the dock over a USB 2.0 connection, which should theoretically work as-is.
USB 2 and USB 3 on USB-C use entirely different wires. When you're using alternate modes, the USB 3 data lanes run the other protocol (DisplayPort) instead of USB. The USB 2 signal is unaffected by this and can still be used for data as normal.

There are of course a bunch of caveats to this, but generally, if you're using alternate modes, you don't get USB 3 data lanes to work with.
It's unclear to me whether the dock uses the USB 2.0 lanes. The Switch as-is uses two DP lanes, leaving one bidirectional USB 3.0 lane open for use. Even with the older DP Alt Mode that the Switch supports, this is enough for 1080p at 60 fps.

Curiously, there's a relatively new firmware setting, not exposed in the user interface, for forgoing the USB 3.0 lane to get a full 4 DP lanes. The internal name for the setting is "usb!4kdp_preferred_over_usb30". This is another reason the data-mining crowd was under the impression the OLED model would allow for 4K, but the setting is now a mystery.
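The lane math checks out on paper. A rough sketch (HBR2 rates with 8b/10b coding; the blanking multiplier is a loose approximation of reduced-blanking timing overhead, not an exact CVT-RB calculation):

```python
# Why 2 DisplayPort lanes are enough for 1080p60 but 4K60 wants all 4.
HBR2_GBPS = 5.4    # DP HBR2: 5.4 Gb/s per lane, raw
CODING_EFF = 0.8   # 8b/10b line coding -> 80% of the raw rate carries data

def lanes_effective_gbps(lanes: int) -> float:
    """Usable bandwidth for a given number of HBR2 lanes."""
    return lanes * HBR2_GBPS * CODING_EFF

def video_gbps(w: int, h: int, fps: int, bpp: int = 24, blanking: float = 1.12) -> float:
    """Approximate video bandwidth; 'blanking' loosely models timing overhead."""
    return w * h * fps * bpp * blanking / 1e9

print(f"1080p60 needs ~{video_gbps(1920, 1080, 60):.1f} Gb/s; 2 lanes give {lanes_effective_gbps(2):.2f} Gb/s")
print(f"4K60 needs ~{video_gbps(3840, 2160, 60):.1f} Gb/s; 4 lanes give {lanes_effective_gbps(4):.2f} Gb/s")
```

So 1080p60 fits comfortably in the 2-lane configuration, while 4K60 at 8 bits per channel overflows 2 lanes and needs the full 4, which would explain a setting that trades the USB 3.0 lane for the extra DP lanes.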
 
Dec 21, 2020
4,993
I was having a look at the DSi, and the DSi and DSi XL released about a year apart.

The DSi released 4-5 years after the OG model, when the DS was in its 5th-6th year on the market and the DSi had little reason to really exist given how well the DS was selling.

So I have to ask: why are we trying to decide when the time for a successor is or isn't? Irrespective of hardware capabilities, let me remind you that the Wii was barely stronger than the GCN and was positioned and considered a whole new generation.

*ahem*

Anyway, why are we using timing to determine whether it is a successor or a Pro, rather than simply waiting to see how the platform holder positions it and how they treat it? Nothing says that timing necessarily determines when a successor or a Pro comes out; that is only an assumption, and one that isn't even typical of the platform holder in question, given their history of internal spec-increased revisions.

I feel as though the logic with respect to timing is an attempt to apply how one company does things to another, as though both follow the same ruleset, rigid and set in stone. It isn't as simple as that.

It isn't intrinsically tied to hardware potential ("How many cores?!? Flops???").

It's determined by how they position it and how they support it.

If a platform holder supports and treats said device the same way they treat the original device it looks and acts similar to, do you consider that a successor or a Pro revision?

If a platform holder supports and treats said device in a unique way that completely excludes the original device, do you consider that a successor or a Pro revision?

If a platform holder supports it neither uniquely/exclusively nor as the same thing as the original it looks and acts similar to in form, what is it? A successor? Or a Pro revision?

Trick question: it's both. You folks call that an "iterative successor" 🤭
 

Lwill

Member
Oct 28, 2017
1,524
It's difficult talking about this without getting into specifics, but I'll try...

There isn't anything inherently wrong with the SoC (as far as I know). The issue had more to do with Nintendo's goals not being met in some way, and until those goals are met or compromises are made, all characteristics of the SoC cannot be locked down and taped out.

I still do not know if this has happened, btw, but given that we now know that Switch 4k isn't bound to SwOLED's timeline, Nintendo/NVIDIA's IC design team should have plenty of time to figure that out.
Oh wow I see. I hope that it already worked out and that we will get a better product at the end of it. Thanks for answering this question to the best of your ability.
 
Dec 21, 2020
4,993
It's difficult talking about this without getting into specifics, but I'll try...

There isn't anything inherently wrong with the SoC (as far as I know). The issue had more to do with Nintendo's goals not being met in some way, and until those goals are met or compromises are made, all characteristics of the SoC cannot be locked down and taped out.

I still do not know if this has happened, btw, but given that we now know that Switch 4k isn't bound to SwOLED's timeline, Nintendo/NVIDIA's IC design team should have plenty of time to figure that out.
I would assume it would be how much it draws, that seems like a reasonable assumption. They would like a certain battery life for the device.
 

4859

Banned
Oct 27, 2017
7,046
In the weak and the wounded
It's difficult talking about this without getting into specifics, but I'll try...

There isn't anything inherently wrong with the SoC (as far as I know). The issue had more to do with Nintendo's goals not being met in some way, and until those goals are met or compromises are made, all characteristics of the SoC cannot be locked down and taped out.

I still do not know if this has happened, btw, but given that we now know that Switch 4k isn't bound to SwOLED's timeline, Nintendo/NVIDIA's IC design team should have plenty of time to figure that out.

Miyamoto wants the board logic to make a mosaic portrait of his face.
 
OP

Dakhil

Member
Mar 26, 2019
4,379
Orange County, CA
It's difficult talking about this without getting into specifics, but I'll try...

There isn't anything inherently wrong with the SoC (as far as I know). The issue had more to do with Nintendo's goals not being met in some way, and until those goals are met or compromises are made, all characteristics of the SoC cannot be locked down and taped out.

I still do not know if this has happened, btw, but given that we now know that Switch 4k isn't bound to SwOLED's timeline, Nintendo/NVIDIA's IC design team should have plenty of time to figure that out.
I still appreciate the insightful answer, even without the specifics.

Don't mind me since I'm simply thinking out loud, but I wonder if Nintendo found the power draw of one of Samsung's 8 nm process nodes to be higher than what Nintendo might have expected.
 

fwd-bwd

Member
Jul 14, 2019
712
But with the great illustration you posted, I can see a major role for the new Nintendo Account system being to push existing Nintendo accounts towards their hardware-software ecosystem. However, this doesn't necessarily have to hinge on an iterative hardware model, or does it? 😅
If we take Iwata's remarks at face value, you're correct that the NX strategy only calls for a cross-channel user account and a common software architecture, neither of which necessitates an iterative hardware model. As you know, however, quite a few folks in this thread, me included, have extrapolated those points as an indication of an iterative hardware model. It'd make too much sense not to update the Switch hardware iteratively, but as this week's event demonstrated, Nintendo may surprise us still.
 

Aether

Member
Jan 6, 2018
3,881
Which is not many compared to the times they kept almost full BC.
Correct me if I'm wrong, but wasn't the SNES supposed to be able to play NES games, until some cartridge issues prevented that?

and then the GCN/Wii/Wii U trifecta

GB > GBC > GBA was a thing, then GBA > DS, and DS > 3DS. I'm almost certain the 3DS could play the previous systems without emulation, but was locked out like the Wii U and GCN

so it's not a very good counterpoint
I totally misread what I was replying to, so yeah, disregard my earlier statement.
Arguably there would even be a better counterpoint: GB -> GBC (two Zelda games that, as far as I know, only run on GBC); DS -> DSi (DSiWare was not available on the DS, and it added a camera); 3DS -> New 3DS (Xenoblade).

All arguably "Pro" variants, and all had exclusives and added new features. I still feel this was underplayed in every case (exclusive New 3DS games: I think 5; exclusive GBC games: 113, compared to 1,046 GB games; and DSiWare never had a big release, I think). I still wouldn't compare those to current expectations of a Pro, but Nintendo doesn't seem opposed to the idea. And those were all handhelds; Nintendo never did that with a console. Judging from the hardware revisions, though, Nintendo seems to see the Switch more as a handheld than a console (the amount of special editions, the Lite, prioritizing battery life over more power (even docked...), having another revision where the main focus is the screen...).
 

ILikeFeet

Member
Oct 25, 2017
50,248
Arguably there would even be a better counterpoint: GB -> GBC (two Zelda games that, as far as I know, only run on GBC); DS -> DSi (DSiWare was not available on the DS, and it added a camera); 3DS -> New 3DS (Xenoblade).

All arguably "Pro" variants, and all had exclusives and added new features. I still feel this was underplayed in every case (exclusive New 3DS games: I think 5; exclusive GBC games: 113, compared to 1,046 GB games; and DSiWare never had a big release, I think). I still wouldn't compare those to current expectations of a Pro, but Nintendo doesn't seem opposed to the idea. And those were all handhelds; Nintendo never did that with a console. Judging from the hardware revisions, though, Nintendo seems to see the Switch more as a handheld than a console (the amount of special editions, the Lite, prioritizing battery life over more power (even docked...), having another revision where the main focus is the screen...).
I think the biggest reason we didn't see more exclusives on previous enhanced systems is due to the lack of need for scalable games. If you're making a handheld game, you've already committed to making a bespoke game and need that system's audience to justify it. You can't offload the work to other systems to make up for lost userbase by not supporting the main system. The closest third parties came to making scalable games was the 3DS and Vita.

Now with switch and modern mobile phones, games can be brought down from higher performance systems. Third parties can justify making exclusives because it's this or nothing.
 

Aether

Member
Jan 6, 2018
3,881
I think the biggest reason we didn't see more exclusives on previous enhanced systems is due to the lack of need for scalable games. If you're making a handheld game, you've already committed to making a bespoke game and need that system's audience to justify it. You can't offload the work to other systems to make up for lost userbase by not supporting the main system. The closest third parties came to making scalable games was the 3DS and Vita.

Now with switch and modern mobile phones, games can be brought down from higher performance systems. Third parties can justify making exclusives because it's this or nothing.
Yeah, that's where I'm at. Some asked why they would need to make a Pro now, and why such a thing wasn't needed for prior generations as they got older.
Simply put, developers back then knew exactly what they were aiming for. Third parties are just porting down to the Switch, and we feel it in resolution and framerates.

But all of that is okay... it's the first-party developers that don't hit native resolution and 30 FPS, two things that were far more dependable in previous generations.
Just imagine if there had been sub-native-resolution pixel art games on the GB through DS. And with the 3DS, people already started to be annoyed by the low resolution.
To me that's a sign that the developers themselves have higher ambitions for how their games should look than the Switch can deliver...
 

julian

Member
Oct 27, 2017
11,998
Arguably there would even be a better counterpoint: GB -> GBC (two Zelda games that, as far as I know, only run on GBC); DS -> DSi (DSiWare was not available on the DS, and it added a camera); 3DS -> New 3DS (Xenoblade).

All arguably "Pro" variants, and all had exclusives and added new features. I still feel this was underplayed in every case (exclusive New 3DS games: I think 5; exclusive GBC games: 113, compared to 1,046 GB games; and DSiWare never had a big release, I think). I still wouldn't compare those to current expectations of a Pro, but Nintendo doesn't seem opposed to the idea. And those were all handhelds; Nintendo never did that with a console. Judging from the hardware revisions, though, Nintendo seems to see the Switch more as a handheld than a console (the amount of special editions, the Lite, prioritizing battery life over more power (even docked...), having another revision where the main focus is the screen...).
New 3DS actually had dozens of exclusives between the couple of retail exclusives, the SNES Virtual Console, and a lot of Unity games. It had a whole section in the 3DS eShop.
 

SiG

Member
Oct 25, 2017
6,468
It's difficult talking about this without getting into specifics, but I'll try...

There isn't anything inherently wrong with the SoC (as far as I know). The issue had more to do with Nintendo's goals not being met in some way, and until those goals are met or compromises are made, all characteristics of the SoC cannot be locked down and taped out.
I wonder now if the new SoC not being taped out on time is how we ended up with a refresh. At this rate, I'm guessing they're preparing a suitably proper successor, although who knows at this point...

It could also have nothing to do with the chip itself per se: perhaps it's the cost of manufacturing, or just the price in general. Either way, I could see the SwOLED as a way to bring down the costs of some components.