
UltraMagnus

Banned
Oct 27, 2017
15,670
No, they claim it's "as powerful as Xbox One"... in FP16. So half that in FP32; that is, in FLOPS terms. It's hard to know the IPC/real-world performance of the PowerVR architecture it's using, but if they use the FLOPS rating to claim it's the same power, the IPC must be similar.
Modern Nvidia cards, on the other hand, have much higher IPC than the GCN Radeon cards, so all in all I'd say the iPad Pro has a slightly more powerful GPU than the docked Switch. The CPU comparison is nothing to write home about because we don't know how it throttles during gameplay, or the overhead iOS needs. All in all it's very impressive, however, for a passively cooled device to be that powerful.

I don't think they have all that much leeway. 4 A76 cores at 1.8GHz and 384 CUDA cores at ~1.2GHz with 58GB/s of bandwidth is what I think is possible within the Switch's 11W power usage. That is, using 7nm.

Even with just a rough estimate, the A12X could then be in the 600-700 GFLOPS range on battery power (undocked, basically).

If you put a chip of that caliber into a device like a Switch, IMO most devs would be able to port PS4/XB1 titles. The current Switch as is can handle a few ports, but quadruple+ the performance and increase the RAM and bandwidth, and I think most ports would be fairly reasonable to do at 720p undocked.

We do know NBA 2K19 runs at native screen res of the iPad Pro, which is waaaaay above 1080p and it looks pretty damn close to the PS4/XB1 version of the game to me anyway. Put a chip like that in a Switch model and let it focus on only 720p pixels portably and let devs code to the metal and I think you would see some jaw-dropping results for a portable device.

Unfortunately that type of a chip is kinda wasted on a device like an iPad Pro because not many devs are ever going to take full advantage of the chip.
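A quick sketch of where a 600-700 GFLOPS guess like that comes from, assuming the "as powerful as Xbox One" claim refers to FP16 throughput and taking the commonly cited ~1.3 TFLOPS FP32 figure for the Xbox One GPU (illustrative numbers, not measurements):

```python
# Back-of-envelope: if the A12X matches Xbox One only in FP16, its FP32
# rating is roughly half, since mobile GPUs typically run FP16 at twice
# the FP32 rate.
XBOX_ONE_FP32_TFLOPS = 1.31                # commonly cited Xbox One GPU figure
a12x_fp16_tflops = XBOX_ONE_FP32_TFLOPS    # the marketing claim, taken at face value
a12x_fp32_gflops = a12x_fp16_tflops / 2 * 1000

print(f"A12X FP32 estimate: ~{a12x_fp32_gflops:.0f} GFLOPS")  # ~655 GFLOPS, in the 600-700 range
```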
 

Moi_85

Member
Nov 26, 2018
68
Why would it be a problem to have a Switch that's too powerful in docked mode? (Of course, in handheld mode the priority is the battery.)

¿?

I'm not saying it's going to be like that (probably not), only that I don't see the problem with having a Switch theoretically designed for future cross-gen ports, because I don't understand why it would be a problem for forward compatibility. The problem is when you see missing ports or games on the cloud (like Assassin's Creed: Odyssey) because the Switch can't handle them.
 
Oct 27, 2017
5,618
Spain
Even with just a rough estimate, the A12X could then be in the 600-700 GFLOPS range on battery power (undocked, basically).

If you put a chip of that caliber into a device like a Switch, IMO most devs would be able to port PS4/XB1 titles. The current Switch as is can handle a few ports, but quadruple+ the performance and increase the RAM and bandwidth, and I think most ports would be fairly reasonable to do at 720p undocked.

We do know NBA 2K19 runs at native screen res of the iPad Pro, which is waaaaay above 1080p and it looks pretty damn close to the PS4/XB1 version of the game to me anyway. Put a chip like that in a Switch model and let it focus on only 720p pixels portably and let devs code to the metal and I think you would see some jaw-dropping results for a portable device.

Unfortunately that type of a chip is kinda wasted on a device like an iPad Pro because not many devs are ever going to take full advantage of the chip.
It's not wasted in the iPad Pro because it's not a gaming device. It's a tablet, with a focus on productivity. Gaming is a continuous load: every n milliseconds a new frame has to be processed and pushed out, for long periods of time. Most productivity work (say, editing a photo in Lightroom) is done in short bursts of activity followed by idleness. Which is why modern processors use turbo boost, especially in mobile chips.
But yeah, the change with a chip like that (what I described is similar, only in Tegra land) can be quite big.
 

Turrican3

Member
Oct 27, 2017
781
Italy
Yeah - I think one core thing for people to understand is that - for better or for worse - the Switch is a very intentional divorce from the hardware spec race.
From this very specific point of view I see little difference compared to what they have been doing since the GameCube, i.e. they're not competing directly with Sony and Microsoft home consoles anymore (although it could be argued the Switch is a closer competitor on paper than, say, the Wii or the Wii U).
 

3bdelilah

Attempted to circumvent ban with alt account
Banned
Oct 26, 2017
1,615
I'd be an idiot to get a Switch now, right? I think I can wait another year to play Mario Odyssey and Smash Ultimate.
 

fourfourfun

Member
Oct 27, 2017
7,680
England
I think one core thing for people to understand is that - for better or for worse - the Switch is a very intentional divorce from the hardware spec race.

I would say Nintendo set off on this foot from the Wii onwards. Same with removing themselves from the "everything in E3" approach to conferences. They were getting steamrolled by the other two consistently and had to find a new path.
 
Oct 27, 2017
5,618
Spain
I'd be an idiot to get a Switch now, right? I think I can wait another year to play Mario Odyssey and Smash Ultimate.
Honestly, it's up to you. If you can find a good price, go ahead and get one more year of games. I wouldn't be surprised if the Switch 2019 had an upgrade path where you only need to buy the new tablet for a cheaper price and keep everything else.
 

Adventureracing

The Fallen
Nov 7, 2017
8,027
I'd be an idiot to get a Switch now, right? I think I can wait another year to play Mario Odyssey and Smash Ultimate.

Nah, I don't think you'd be an idiot. This still isn't even confirmed, so I wouldn't wait a year based on rumours when we don't even know when it's happening or what the revision will be. Personally I'd probably just get a Switch now, get a year's worth of use out of it, and trade it in if there is a revision.

It really just depends how much you want a Switch.
 

z0m3le

Member
Oct 25, 2017
5,418
I want to play smash so badly, wish they would just announce whatever it is so I know if I should wait or just buy one now.
The answer is you should buy one if you are interested in the games; you can sell it used and buy a new model in a year. When you wait, you usually miss out on older games you might have wanted, things like Octopath Traveler. Remember Switch is packed for 2019, and you might not find the time to play these older greats.
Why would it be a problem to have a Switch that's too powerful in docked mode? (Of course, in handheld mode the priority is the battery.)

¿?

I'm not saying it's going to be like that (probably not), only that I don't see the problem with having a Switch theoretically designed for future cross-gen ports, because I don't understand why it would be a problem for forward compatibility. The problem is when you see missing ports or games on the cloud (like Assassin's Creed: Odyssey) because the Switch can't handle them.
A game that runs docked only is a game they shouldn't make. The relationship between the portable specs and the docked specs is important, and a 720p to 1080p scenario is the one that is easiest to maximize. Next-gen consoles won't even be out for 2 years and won't have exclusive (non-current-gen) games from yearly multiplats like Assassin's Creed for at least another 2. Nintendo putting out a 2nd iteration of the Switch that is again more powerful, with 7nm+ pushing 2TFLOPs+ when docked in 2022, would allow them to get cross-gen games then.
No, they claim it's "as powerful as Xbox One"... in FP16. So half that in FP32; that is, in FLOPS terms. It's hard to know the IPC/real-world performance of the PowerVR architecture it's using, but if they use the FLOPS rating to claim it's the same power, the IPC must be similar.
Modern Nvidia cards, on the other hand, have much higher IPC than the GCN Radeon cards, so all in all I'd say the iPad Pro has a slightly more powerful GPU than the docked Switch. The CPU comparison is nothing to write home about because we don't know how it throttles during gameplay, or the overhead iOS needs. All in all it's very impressive, however, for a passively cooled device to be that powerful.

I don't think they have all that much leeway. 4 A76 cores at 1.8GHz and 384 CUDA cores at ~1.2GHz with 58GB/s of bandwidth is what I think is possible within the Switch's 11W power usage. That is, using 7nm.
You think that a 7nm chip would only allow Nvidia to produce a 921GFLOPs GPU + quad-core ARM CPU? "TSMC called their process at this "node" 16nm to reflect relaxed pitches. The initial process was 16FF followed quickly by 16FF+ with a 15% performance boost. 16FFC is now available and is reported to have 8 to 10 less masks driving lower cost while offering 0.55 volt operation for low power (50% lower power)." 12nm uses 60% less energy than the 20nm planar process in Tegra X1, while offering a 20% increase in performance. A57 is a power hog: at 1.8GHz it consumes 6.5 watts, while an A73 at 1.8GHz consumes less than 1.5 watts. 7nm is over 60% smaller than 12nm, and "The 7nm process offers 35% to 40% performance gains over 16nm or a >65% power reduction."

You might be invested in the idea that the Switch currently uses 16nm; it's not likely, but whatever, those are your thoughts to think about. What you should remember, though, is that the Tegra X2 produces the Switch's docked performance in 7.5 watts, with a 20% faster CPU, a 10% faster GPU (437GFLOPs) and 100% more memory bandwidth, and that is on 16FF, not even 16FF+ or 12nm. 7nm would absolutely blow away those specs with TSMC's numbers. Remember the A73 uses 1/4 as much power as the A57 cores, so they could offer that CPU performance with a quad-core A73 at 2.5GHz, still have that extra 6 watts to use on the GPU side, and a portable 1TFLOPs GPU would not be outside the realm of possibilities; remember Nvidia is the leading GPU manufacturer on the planet when it comes to performance/power ratio. 7nm isn't what the next Switch will be though, it's 12nm. That is the most likely push because that is what they need the Switch to do: about 900GFLOPs docked.

And yes, the entire board for the TX2 consumes 7.5 watts in Max-Q and 15 watts in Max-P, and that includes RAM, fan, and other parts of the computer: "Meanwhile the board's Max-P mode is its maximum performance mode. In this mode NVIDIA sets the board TDP to 15W, allowing the TX2 to hit higher performance at the cost of some energy efficiency." I also like Anandtech as a website; they rarely get anything wrong and they are very careful, which is why I am certain they didn't make a mistake with the use of board TDP here. Max-Q allows the whole board to run at 7.5 watts, not just the SoC, much like Max-P sets that board TDP to 15 watts.

Nintendo's iterations will likely go 12nm in 2019 and 7nm in 2022. There is a pattern these devices would create, allowing for an expected 3nm Switch in 2025, again increasing docked performance by ~2.5x, so Switch in this scenario looks like this:
Switch Spring 2017: 393GFLOPs docked, 196GFLOPs portable
Switch Fall 2019: 944GFLOPs docked, 393GFLOPs portable.
Switch Fall 2022: 2.35TFLOPs docked, 944GFLOPs portable. (this is a year or two before a probable PS5 Pro arrives)
Switch Fall 2025: 5.8TFLOPs docked, 2.35TFLOPs portable. (this is a year or two after a probable PS5 Pro arrives)
It's important to remember that docked performance is targeting 1080p on the newest Switch units and 720p on the previous ones/portable modes, while PS5/PS5 Pro would be targeting 4K. Nintendo would likely offer a 4K dock after 4K becomes relevant enough for Nintendo to care about, likely sometime in or after 2022.
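If anyone wants to sanity-check those GFLOPS figures, they all follow from the usual cores x 2 ops (FMA) x clock arithmetic. Quick sketch below; the 512-core / 921MHz configuration for the 2019 model is an assumption from the rumors, not a confirmed spec:

```python
def gflops(cuda_cores: int, clock_mhz: float) -> float:
    """Peak FP32 throughput: cores x 2 ops per clock (FMA) x clock."""
    return cuda_cores * 2 * clock_mhz / 1000

# Current Switch: 256 Maxwell CUDA cores
print(gflops(256, 768))   # ~393 GFLOPS docked
print(gflops(256, 384))   # ~197 GFLOPS portable

# Hypothetical 2019 model: assumed 512 cores at 921MHz
print(gflops(512, 921))   # ~943 GFLOPS docked
print(gflops(512, 384))   # ~393 GFLOPS portable
```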
 
Last edited:
Oct 27, 2017
5,618
Spain
A game that runs docked only is a game they shouldn't make. The relationship between the portable specs and the docked specs is important, and a 720p to 1080p scenario is the one that is easiest to maximize. Next-gen consoles won't even be out for 2 years and won't have exclusive (non-current-gen) games from yearly multiplats like Assassin's Creed for at least another 2. Nintendo putting out a 2nd iteration of the Switch that is again more powerful, with 7nm+ pushing 2TFLOPs+ when docked in 2022, would allow them to get cross-gen games then.

You think that a 7nm chip would only allow Nvidia to produce a 921GFLOPs GPU + quad-core ARM CPU? "TSMC called their process at this "node" 16nm to reflect relaxed pitches. The initial process was 16FF followed quickly by 16FF+ with a 15% performance boost. 16FFC is now available and is reported to have 8 to 10 less masks driving lower cost while offering 0.55 volt operation for low power (50% lower power)." 12nm uses 60% less energy than the 20nm planar process in Tegra X1, while offering a 20% increase in performance. A57 is a power hog: at 1.8GHz it consumes 6.5 watts, while an A72 at 1.8GHz consumes less than 1.5 watts. 7nm is over 60% smaller than 12nm, and "The 7nm process offers 35% to 40% performance gains over 16nm or a >65% power reduction."

You might be invested in the idea that the Switch currently uses 16nm; it's not likely, but whatever, those are your thoughts to think about. What you should remember, though, is that the Tegra X2 produces the Switch's docked performance in 7.5 watts, with a 20% faster CPU, a 10% faster GPU (437GFLOPs) and 100% more memory bandwidth, and that is on 16FF, not even 16FF+ or 12nm. 7nm would absolutely blow away those specs with TSMC's numbers. Remember the A72 uses 1/4 as much power as the A57 cores, so they could offer that CPU performance with a quad-core A72 at 2.5GHz, still have that extra 5 watts to use on the GPU side, and a portable 1TFLOPs GPU would not be outside the realm of possibilities; remember Nvidia is the leading GPU manufacturer on the planet when it comes to performance/power ratio. 7nm isn't what the next Switch will be though, it's 12nm. That is the most likely push because that is what they need the Switch to do: about 900GFLOPs docked.

And yes, the entire board for the TX2 consumes 7.5 watts in Max-Q and 15 watts in Max-P, and that includes RAM, fan, and other parts of the computer: "Meanwhile the board's Max-P mode is its maximum performance mode. In this mode NVIDIA sets the board TDP to 15W, allowing the TX2 to hit higher performance at the cost of some energy efficiency." I also like Anandtech as a website; they rarely get anything wrong and they are very careful, which is why I am certain they didn't make a mistake with the use of board TDP here. Max-Q allows the whole board to run at 7.5 watts, not just the SoC, much like Max-P sets that board TDP to 15 watts.

Nintendo's iterations will likely go 12nm in 2019 and 7nm in 2022. There is a pattern these devices would create, allowing for an expected 3nm Switch in 2025, again increasing docked performance by ~2.5x, so Switch in this scenario looks like this:
Switch Spring 2017: 393GFLOPs docked, 196GFLOPs portable
Switch Fall 2019: 944GFLOPs docked, 393GFLOPs portable.
Switch Fall 2022: 2.35TFLOPs docked, 944GFLOPs portable. (this is a year or two before a probable PS5 Pro arrives)
Switch Fall 2025: 5.8TFLOPs docked, 2.35TFLOPs portable. (this is a year or two after a probable PS5 Pro arrives)
It's important to remember that docked performance is targeting 1080p on the newest Switch units and 720p on the previous ones/portable modes, while PS5/PS5 Pro would be targeting 4K. Nintendo would likely offer a 4K dock after 4K becomes relevant enough for Nintendo to care about, likely sometime in or after 2022.
I'm pretty sure the "Board TDP" of the Jetson modules is for the SoC. It certainly is for the Jetson TX1, because that has a "Board TDP" of 15W and commercial implementations of the Tegra X1 draw a total 19W under gaming loads. Nvidia also happens to claim "2X efficiency" for the Tegra X2, which very conveniently matches the kind of frequencies the Shield TV can produce.
I insist, the analysis I did for the Tegra X1 and the chip in the Switch is not from manufacturer-provided TDP or anything like that, it's real world tests. So for the figures of the Tegra X2 I'd take them with a huge grain of salt, just like the figures for the Jetson TX1 should be.
I think you are fudging together manufacturer-provided figures in undisclosed conditions with real power consumption and performance, and ignoring the most important information about the chip that's actually in the hands of the consumers, which is the Tegra X1.
And therefore, you are way overestimating how power efficient Nvidia's tech can be in 12nm, which is a glorified 16nm node, and how efficient the Tegra X2 can be.
I don't think you realize the Switch is already 50% of the PS4's CPU performance and 1/3 of the GPU, while drawing like a seventh of the power. And it's supposed to do so in 20nm? Nvidia's power efficiency advantage is real, but it's not magic. There's no way Nvidia or anyone can fit 4 big ARMv8 cores and a 900+GFLOPs GPU in the Switch's TDP, in 12nm. It's just fantasy, man.
 

z0m3le

Member
Oct 25, 2017
5,418
I'm pretty sure the "Board TDP" of the Jetson modules is for the SoC. It certainly is for the Jetson TX1, because that has a "Board TDP" of 15W and commercial implementations of the Tegra X1 draw a total 19W under gaming loads. Nvidia also happens to claim "2X efficiency" for the Tegra X2, which very conveniently matches the kind of frequencies the Shield TV can produce.
I insist, the analysis I did for the Tegra X1 and the chip in the Switch is not from manufacturer-provided TDP or anything like that, it's real world tests. So for the figures of the Tegra X2 I'd take them with a huge grain of salt, just like the figures for the Jetson TX1 should be.
I think you are fudging together manufacturer-provided figures in undisclosed conditions with real power consumption and performance, and ignoring the most important information about the chip that's actually in the hands of the consumers, which is the Tegra X1.
And therefore, you are way overestimating how power efficient Nvidia's tech can be in 12nm, which is a glorified 16nm node, and how efficient the Tegra X2 can be.
I don't think you realize the Switch is already 50% of the PS4's CPU performance and 1/3 of the GPU, while drawing like a seventh of the power. And it's supposed to do so in 20nm? Nvidia's power efficiency advantage is real, but it's not magic. There's no way Nvidia or anyone can fit 4 big ARMv8 cores and a 900+GFLOPs GPU in the Switch's TDP, in 12nm. It's just fantasy, man.
So, you think Anandtech used board TDP to refer to the SoC, and then specifically referred to SoC power consumption separately? Remember that 20nm leaks; a 19-watt PEAK (at the wall) reading isn't comparable to what the Switch is doing at 11 watts on average.

The Pixel C, in light gaming, offers 5 hours of gaming on its 34Wh battery, meaning under 7 watts used on a 20nm chip to run the entire unit with a screen.

If the Switch is using a 16nm SoC and the Pixel C is using a 20nm SoC, why do they have similar battery life / power consumption when streaming media, with the Switch lasting just over 5 hours in Hulu (16Wh battery) and the Pixel C lasting around 10 hours (34Wh battery)? Anandtech even found the Pixel C to run for 13.5 hours straight with local media at 720p. Hulu only lasting 5 hours on Switch, with half the battery of the Pixel C, is a pretty clear indication that they are on the same process node, as you'd expect a very large improvement in battery life from moving to FinFET 3D transistors over 20nm's planar 2D transistors. Switch is 20nm, so stop with the nonsense already. Fantasy is entirely on your end (you even asked us to put on a tinfoil hat in your thread): you believe that Nvidia redesigned the TX1 for 16nm, didn't use it in their Shield TV 2017 model (for binning), and that we don't see a performance increase in streaming video even though the Pixel C has a 10 inch screen. There is something missing here, and it's evidence that Nvidia and Nintendo went through all that trouble. Lastly, I know that Nintendo and Nvidia used the Tegra X2 for testing on this more powerful model; why would they do that if they already had an X1 on 16nm?
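For reference, the battery comparison above is just capacity divided by runtime; a minimal sketch with the figures cited in this thread:

```python
def avg_watts(battery_wh: float, hours: float) -> float:
    """Average system power draw implied by a battery-life figure."""
    return battery_wh / hours

print(avg_watts(16, 5))    # Switch streaming Hulu:  ~3.2 W
print(avg_watts(34, 10))   # Pixel C streaming:      ~3.4 W
print(avg_watts(34, 5))    # Pixel C light gaming:   ~6.8 W ("under 7 watts")
```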
 
Last edited:
Oct 27, 2017
5,618
Spain
So, you think Anandtech used board TDP to refer to the SoC, and then specifically referred to SoC power consumption separately? Remember that 20nm leaks; a 19-watt PEAK (at the wall) reading isn't comparable to what the Switch is doing at 11 watts on average.

The Pixel C, in light gaming, offers 5 hours of gaming on its 34Wh battery, meaning under 7 watts used on a 20nm chip to run the entire unit with a screen.

If the Switch is using a 16nm SoC and the Pixel C is using a 20nm SoC, why do they have similar battery life / power consumption when streaming media, with the Switch lasting just over 5 hours in Hulu (16Wh battery) and the Pixel C lasting around 10 hours (34Wh battery)? Anandtech even found the Pixel C to run for 13.5 hours straight with local media at 720p. Hulu only lasting 5 hours on Switch, with half the battery of the Pixel C, is a pretty clear indication that they are on the same process node, as you'd expect a very large improvement in battery life from moving to FinFET 3D transistors over 20nm's planar 2D transistors. Switch is 20nm, so stop with the nonsense already. Fantasy is entirely on your end (you even asked us to put on a tinfoil hat in your thread): you believe that Nvidia redesigned the TX1 for 16nm, didn't use it in their Shield TV 2017 model (for binning), and that we don't see a performance increase in streaming video even though the Pixel C has a 10 inch screen. There is something missing here, and it's evidence that Nvidia and Nintendo went through all that trouble. Lastly, I know that Nintendo and Nvidia used the Tegra X2 for testing on this more powerful model; why would they do that if they already had an X1 on 16nm?
The Pixel C is an Android tablet with dynamic frequencies and voltages. Meaning, it scales those accordingly to save power while keeping performance up. The Switch has a few frequency and voltage modes, and it runs in those modes, period. The CPU runs at 1020MHZ, period. The GPU runs at either 307, 384 or 768MHZ, period. So it's not surprising that it's not very efficient at tasks where the CPU and GPU are mostly idling. Perhaps they should add a mode with dynamic power scaling, but that's another story.

And if the rumor of the devkits with an overclocked Tegra X2 is true (it's very plausible), the reason is obvious. The SoC in the Switch is a Tegra X1, shrunk to run with 7W of power. It still has a 64-bit memory interface and 25.6GB/s of bandwidth. The Tegra X2 does the same, only if you overclock it you actually have a 128-bit interface and much more memory bandwidth, so you actually see improvements at, say, the frequencies that a TDP of 20W allows you.

And again, more than one site said "The Shield TV had a consumption of 19W while gaming". No peak consumption, sustained consumption.
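(As an aside, the 25.6GB/s figure is just bus width times transfer rate; a quick sketch, assuming the commonly cited 3200MT/s LPDDR4 in the Switch, which is an assumption on my part:)

```python
def bandwidth_gb_s(bus_bits: int, transfer_mt_s: int) -> float:
    """Peak memory bandwidth: bus width in bytes x mega-transfers per second."""
    return bus_bits / 8 * transfer_mt_s / 1000

print(bandwidth_gb_s(64, 3200))    # 64-bit interface (Switch): 25.6 GB/s
print(bandwidth_gb_s(128, 3200))   # full 128-bit TX2 interface: 51.2 GB/s
```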
 

z0m3le

Member
Oct 25, 2017
5,418
The Pixel C is an Android tablet with dynamic frequencies and voltages. Meaning, it scales those accordingly to save power while keeping performance up. The Switch has a few frequency and voltage modes, and it runs in those modes, period. The CPU runs at 1020MHZ, period. The GPU runs at either 307, 384 or 768MHZ, period. So it's not surprising that it's not very efficient at tasks where the CPU and GPU are mostly idling. Perhaps they should add a mode with dynamic power scaling, but that's another story.

And if the rumor of the devkits with an overclocked Tegra X2 is true (it's very plausible), the reason is obvious. The SoC in the Switch is a Tegra X1, shrunk to run with 7W of power. It still has a 64-bit memory interface and 25.6GB/s of bandwidth. The Tegra X2 does the same, only if you overclock it you actually have a 128-bit interface and much more memory bandwidth, so you actually see improvements at, say, the frequencies that a TDP of 20W allows you.

And again, more than one site said "The Shield TV had a consumption of 19W while gaming". No peak consumption, sustained consumption.
Quite convenient that its power consumption falls in line with the 20nm Pixel C doing the same task, running just as idle and thus not requiring more power to maintain clocks / create heat.

You also point out how one device with various frequencies can have different power draws, while ignoring that case with the Shield TV. At this point there is no reasoning with you; believe what you want, but with such a wild story you should hold it as a theory and not a fact.
 
Oct 27, 2017
5,618
Spain
Quite convenient that its power consumption falls in line with the 20nm Pixel C doing the same task, running just as idle and thus not requiring more power to maintain clocks / create heat.

You also point out how one device with various frequencies can have different power draws, while ignoring that case with the Shield TV. At this point there is no reasoning with you; believe what you want, but with such a wild story you should hold it as a theory and not a fact.
Nonsense. You don't know what frequencies the Pixel C is running at. Presumably, they are much lower than the ones the Switch runs at, because H.264 decoding is done on fixed-function hardware.
Idling at 1020MHZ is much more expensive than idling at, say, 200MHZ. If it weren't so, why does every mobile device and laptop scale frequency constantly? The Switch is not a tablet, it's a gaming device. It maintains high clocks at all time because it was not designed with YouTube in mind, but with games in mind. It didn't have YouTube until a month ago.
You are constantly comparing apples with oranges and ignoring the apples to apples evidence I already provided. You are mixing the fact that mobile devices scale their frequencies down to save on power with the fact they also scale frequency down under high loads due to TDP limitations. You keep using this confusion, intentional or not, to somehow try to discredit the only solid evidence we have about how the Shield TV throttles under gaming loads. We know how the Shield TV throttles under gaming loads, and we know how much power it draws in those conditions, period. Those are facts. You can throw around the figures of how much it draws while streaming Netflix, or internal power readings of the Tegra X2 doing a deep learning benchmark, or whatever, it doesn't change those facts, and they are moot comparisons because no Switch has ever performed deep learning tasks, nor has it existed running Android tasks with dynamic frequencies, and thus the comparisons don't even exist.
 

KanameYuuki

Member
Dec 23, 2017
2,649
Colombia
The answer is you should buy one if you are interested in the games; you can sell it used and buy a new model in a year. When you wait, you usually miss out on older games you might have wanted, things like Octopath Traveler. Remember Switch is packed for 2019, and you might not find the time to play these older greats.

That's a good point, and for what it is it may end up being cheaper, like getting a 2DS now and then a better model down the line.
 
Last edited:

z0m3le

Member
Oct 25, 2017
5,418
Nonsense. You don't know what frequencies the Pixel C is running at. Presumably, they are much lower than the ones the Switch runs at, because H.264 decoding is done on fixed-function hardware.
Idling at 1020MHZ is much more expensive than idling at, say, 200MHZ. If it weren't so, why does every mobile device and laptop scale frequency constantly? The Switch is not a tablet, it's a gaming device. It maintains high clocks at all time because it was not designed with YouTube in mind, but with games in mind. It didn't have YouTube until a month ago.
You are constantly comparing apples with oranges and ignoring the apples to apples evidence I already provided. You are mixing the fact that mobile devices scale their frequencies down to save on power with the fact they also scale frequency down under high loads due to TDP limitations. You keep using this confusion, intentional or not, to somehow try to discredit the only solid evidence we have about how the Shield TV throttles under gaming loads. We know how the Shield TV throttles under gaming loads, and we know how much power it draws in those conditions, period. Those are facts. You can throw around the figures of how much it draws while streaming Netflix, or internal power readings of the Tegra X2 doing a deep learning benchmark, or whatever, it doesn't change those facts, and they are moot comparisons because no Switch has ever performed deep learning tasks, nor has it existed running Android tasks with dynamic frequencies, and thus the comparisons don't even exist.

You are all over the place; the comparison was Netflix and ibbc streaming on the Pixel C, to Hulu on Switch. 10 hours to 5 hours. Larger screen on the Pixel C, and in your theory a much less efficient 20nm process, which even if clocked at 600MHz pulls slightly more power than the Switch at 1GHz on 16nm. The Switch runs the GPU at 307MHz while the Pixel C runs the task at 230MHz on the GPU. 16nm more than makes up for whatever clock differences the Pixel C might be running at, and it doesn't account for the larger screen drawing more power either.

It has nothing to do with YouTube or hardware encoding; that was a separate 13-hour local media battery test, and the Switch also has that hardware. This is as close to apples to apples as we can reasonably expect to get. This isn't string theory here; there is no elegance to your theory. You don't even know that the Shield TV is running Switch clocks, you just know that it can throttle to Switch clocks, if you force the CPU and GPU to 100% load, lock the CPU to 1GHz, and ignore the GPU when it jumps to 1GHz.

If the Switch were 16nm, there would be an obvious jump in battery life over the Pixel C doing the same tasks, especially with everything going against the Pixel C. We don't see that; we see basically identical power draw between the devices on the same task. Hiding behind frequencies with your theory just doesn't make sense in this application, as streaming is actually not a light task.
 
Last edited:

ShadowFox08

Banned
Nov 25, 2017
3,524
No real need to worry about throttling here. The X2 has Denver and A57 cores, but that leak said new CPU; my guess is A72/A73, which would allow the GPU side to take whatever it wanted in terms of power, so I'm thinking 884GFLOPs to 944GFLOPs is where they would go, and power consumption isn't their issue, as they are releasing the Switch Pro with a different chip than the X2 anyways.

The relationship is still 720p to 1080p, nothing else really makes sense; that's a small range of 2.25x to 2.5x. I think the Foxconn leak had a golden nugget: I think the Switch Pro, or a customized chip to emulate it, was there. 2x every spec plus the 921MHz GPU for 944GFLOPs.

From everything that I've heard from multiple insiders, this seems to line up. It also doesn't require a 7nm chip, although that is possible; it can be done on 12nm just fine.
Yes I agree it doesn't require 7nm, but it wouldn't have to go to waste, even if they were targeting 1TFLOP for the GPU docked. They could improve battery life even further or even upgrade to a 1080p screen.

Not expecting it in the slightest, but it would be awesome.
 

z0m3le

Member
Oct 25, 2017
5,418
Yes I agree it doesn't require 7nm, but it wouldn't have to go to waste, even if they were targeting 1TFLOP for the GPU docked. They could improve battery life even further or even upgrade to a 1080p screen.

Not expecting it in the slightest, but it would be awesome.
Yeah, them doing it because Nvidia wants to shop the chip around to other vendors, and for battery life, is a possibility. It would cost more though, and considering this was in prototype form a couple of years ago, it's more likely they will go with the cheaper 12nm option, which should offer better battery life than the current Switch anyway thanks to changing to a newer CPU.
 
Oct 27, 2017
5,618
Spain
You are all over the place; the comparison was Netflix and ibbc streaming on the Pixel C, to Hulu on Switch. 10 hours to 5 hours. Larger screen on the Pixel C, and in your theory a much less efficient 20nm process, which even if clocked at 600MHz pulls slightly more power than the Switch at 1GHz on 16nm. The Switch runs the GPU at 307MHz while the Pixel C runs the task at 230MHz on the GPU. 16nm more than makes up for whatever clock differences the Pixel C might be running at, and it doesn't account for the larger screen drawing more power either.

It has nothing to do with YouTube or hardware encoding; that was a separate 13-hour local media battery test, and the Switch also has that hardware. This is as close to apples to apples as we can reasonably expect to get. This isn't string theory here; there is no elegance to your theory. You don't even know that the Shield TV is running Switch clocks, you just know that it can throttle to Switch clocks, if you force the CPU and GPU to 100% load, lock the CPU to 1GHz, and ignore the GPU when it jumps to 1GHz.

If the Switch were 16nm, there would be an obvious jump in battery life over the Pixel C doing the same tasks, especially with everything going against the Pixel C. We don't see that; we see basically identical power draw between the devices on the same task. Hiding behind frequencies with your theory just doesn't make sense in this application, as streaming is actually not a light task.
You really are making weird comparisons. We don't know how the Switch decodes streaming apps, nor how the Pixel C decodes streaming apps, nor the frequencies the Pixel C runs at when performing those tasks. It's a useless comparison that goes nowhere. Just because both devices have the same hardware for video decoding it doesn't mean both are using it the same way. And again, the Switch has fixed clocks and voltages. How many times do I have to say that? It's not a valid comparison by any means. Even two implementations of the same chip can have wildly different performance while doing the same task. The Switch's power management is not optimized for the kind of use a tablet like the Pixel C sees. It's optimized for predictable performance at the most efficient point in the power curve, so that at full load it sees acceptable heat and battery life. It's designed with being at full load all the time in mind.

As for the Shield TV, MDave never said both CPU and GPU were under 100% load at all times, and even if they were, 100% load can have different power consumption depending on the kind of instructions you are using. The results of his test were very clear: under heavy loads the Shield TV ends up throttling to Switch frequencies. If the GPU spikes momentarily to 1GHZ that's great, but it can't sustain that frequency and on average it runs at the same frequencies as the Switch.
BTW, the Switch's 11W power draw is also a "peak" power draw. BOTW is one of the most demanding games on the system, most other games draw less power than that. In fact if you watch the actual video of the power draw when playing BOTW, the power can go down momentarily to 10.5W or below.
 

Dinobot

Member
Oct 25, 2017
5,126
Toronto, Ontario, Canada
The talks of this refresh and possible Pro model made me sell my $180 Facebook Marketplace Switch earlier today.

I don't mind waiting another year for Nintendo to iron out the online kinks, add online features, and solve QC issues in the revision. I'll also have a pretty big library of games to get to.

I don't know much about tech, but I'm pretty confident this new Switch will have 7nm in it as that seems to be the thing for 2019 tech.
Lol. You're gonna be waiting more than a year for that.

For the love of God I hope they differentiate this sku properly. No "New" Switch. Switch Advance, Super Switch, Switch Pro.

Slapping an XL won't help because people will think it's just bigger but not more powerful.
 

Skittzo

Member
Oct 25, 2017
41,037
Lol. You're gonna be waiting more than a year for that.

For the love of God I hope they differentiate this sku properly. No "New" Switch. Switch Advance, Super Switch, Switch Pro.

Slapping an XL won't help because people will think it's just bigger but not more powerful.

Nintendo SwitchH. The H is for handheld.

Or Nintendo SSwitch. The S is for Super.

Honestly if this is a pro they should just call it a Super Switch. Then they can come out with a Switch 2 later, then a Super Switch 2. Rinse and repeat.
 

Hogendaz85

Member
Dec 6, 2017
2,813
Nintendo SwitchH. The H is for handheld.

Or Nintendo SSwitch. The S is for Super.

Honestly if this is a pro they should just call it a Super Switch. Then they can come out with a Switch 2 later, then a Super Switch 2. Rinse and repeat.
I just hope Nintendo stays this course and does not do something dramatically different with their next console.
 

Pooroomoo

Member
Oct 28, 2017
4,972
Nintendo SwitchH. The H is for handheld.

Or Nintendo SSwitch. The S is for Super.

Honestly if this is a pro they should just call it a Super Switch. Then they can come out with a Switch 2 later, then a Super Switch 2. Rinse and repeat.
If this theoretical revision indeed ends up having the same relation to Switch as the PS4 Pro has to the PS4, it should just be called Switch Pro. It will be a clear distinction from "Switch", it will tell the target market (those who want a more powerful Switch revision) exactly what it is, and nobody will think this is the Switch 2 by another name (which I'm afraid some will think "Super Switch" is)
 

Acu

Member
Jan 2, 2018
365
The Nintendo SS. The preferred system of Pewdiepie

 

Hermii

Member
Oct 27, 2017
4,685
If this theoretical revision indeed ends up having the same relation to Switch as the PS4 Pro has to the PS4, it should just be called Switch Pro. It will be a clear distinction from "Switch", it will tell the target market (those who want a more powerful Switch revision) exactly what it is, and nobody will think this is the Switch 2 by another name (which I'm afraid some will think "Super Switch" is)
If they keep going with stock ARM CPUs and Nvidia GPUs they could maintain forward/backwards compatibility forever. There is no need to do a generational reset.

The relation would be more like the iPhone 8 to the iPhone 7 than a Pro or an X.
 
Last edited:

LightKiosk

One Winged Slayer
Member
Oct 27, 2017
11,479
Lol. You're gonna be waiting more than a year for that.

For the love of God I hope they differentiate this sku properly. No "New" Switch. Switch Advance, Super Switch, Switch Pro.

Slapping an XL won't help because people will think it's just bigger but not more powerful.

I know the whole "Nintendo does their own thing, not really competing with the others" yada yada yada stuff people spout. But if Nintendo waits until 2020 to release a potential Switch Pro, they're gonna be going up against next gen whether they want to or not, and depending on the overall pricing of the systems I don't think it'll work out well for them.

Their best time to release such a system is in 2019 when the current gen is coming to an end and games slow down to get ready for next gen. If they miss 2019, they'll have to do a 2021 or 2022 release to not directly compete with next gen consoles.
 

Skittzo

Member
Oct 25, 2017
41,037
If this theoretical revision indeed ends up having the same relation to Switch as the PS4 Pro has to the PS4, it should just be called Switch Pro. It will be a clear distinction from "Switch", it will tell the target market (those who want a more powerful Switch revision) exactly what it is, and nobody will think this is the Switch 2 by another name (which I'm afraid some will think "Super Switch" is)

Yeah you may be right about that name. People would probably think it's like the Super NES.
 

Rand a. Thor

Banned
Oct 31, 2017
10,213
Greece
If I had to guess, Switch Premium would be a good name. Premium power, premium screen, premium price. The regular Switch would be the Everyday Switch. Yes, I have no idea how they name products.
 

Deleted member 5764

User requested account closure
Banned
Oct 25, 2017
6,574
No, not at all. Remember, the track record of threads like this one at predicting the specs of the next Nintendo system tends to vastly overestimate what we actually get.

I have fond/horrible memories of the pre-launch Switch threads. There was an entire group of people who were convinced that the Switch was two different devices and that the "handheld" we have now was the weaker of the two. Still waiting for that beast of a home console with a Supplemental Computing Device to bump up the power of the handheld Switch.
 

z0m3le

Member
Oct 25, 2017
5,418
You really are making weird comparisons. We don't know how the Switch decodes streaming apps, nor how the Pixel C decodes streaming apps, nor the frequencies the Pixel C runs at when performing those tasks. It's a useless comparison that goes nowhere. Just because both devices have the same hardware for video decoding it doesn't mean both are using it the same way. And again, the Switch has fixed clocks and voltages. How many times do I have to say that? It's not a valid comparison by any means. Even two implementations of the same chip can have wildly different performance while doing the same task. The Switch's power management is not optimized for the kind of use a tablet like the Pixel C sees. It's optimized for predictable performance at the most efficient point in the power curve, so that at full load it sees acceptable heat and battery life. It's designed with being at full load all the time in mind.

As for the Shield TV, MDave never said both CPU and GPU were under 100% load at all times, and even if they were, 100% load can have different power consumption depending on the kind of instructions you are using. The results of his test were very clear: under heavy loads the Shield TV ends up throttling to Switch frequencies. If the GPU spikes momentarily to 1GHZ that's great, but it can't sustain that frequency and on average it runs at the same frequencies as the Switch.
BTW, the Switch's 11W power draw is also a "peak" power draw. BOTW is one of the most demanding games on the system, most other games draw less power than that. In fact if you watch the actual video of the power draw when playing BOTW, the power can go down momentarily to 10.5W or below.
Zelda's 11 watts is specifically described as an average by Anandtech, not a peak.

1. Shield TV power draw peaks at 19 watts when gaming. You need to show it above that to assume it's drawing 19 watts at Switch clocks, which mDave didn't report.

2. Time for hard facts: an A57 at 1GHz draws 1.83 watts on 20nm, while an A57 at 2GHz draws over 8 watts. mDave reported that he locked the CPU at 2GHz and that the GPU ran at 614MHz or higher, sometimes dropping to 537MHz, during his testing, which included Tomb Raider for Nvidia Shield. This is with an imposed 15 watts (really 19 watts read at the wall), meaning that to go from 614MHz to 768MHz you claim costs the GPU over 6 watts (the difference between running Switch CPU clocks and the 2GHz mDave tested).

TL;DR: The 20nm chip most likely runs at a lower power consumption when running at Switch clock speeds, because even with its TDP thermal throttling, the Shield TV is able to maintain quite a high GPU speed of 614MHz or better for the majority of a game test while the A57 CPU is clocked at 2GHz, which consumes over 8 watts: https://images.anandtech.com/doci/8718/A57-power-curve_575px.png

"Even two implementations of the same chip can have wildly different performance while doing the same task." this is worth applying to the shield TV vs the switch, truth is, switch will have tighter voltages, and never needs to run the cpu at a high voltage, so its at the low end of the power curve, Tegra X1 in the shield TV has a better heat sink and draws power from the wall at all times, so as long as it doesn't melt, it's fine. Switch has a TPD of 18 watts, which makes sense given Shield TV's 19 watts at the wall reading. They are almost certainly the same process node with aggressive power management applied to the Switch with locked clocks, while shield TV can change frequencies on the fly and push hotter temperatures and higher voltages to meet twice the cpu clock and a 25% gpu bump.
 
Last edited:

bmfrosty

Member
Oct 27, 2017
1,894
SF Bay Area
KISS principle applies here.
Switch 2.

Super Switch
Mega Switch
Ultra Switch
Switch 64
Giga Switch
Tera Switch
Switch Pro
Switch X
Switch Premium

If you go with those, you eventually have a lot of confusion.

Although, if you pick one and keep using it for successive generations...

Switch (2017)
Switch Pro (2020)
Switch 2 (2022)
Switch 2 Pro (2025)
Switch 3 (2027)
Switch 3 Pro (2030)

Just to establish half generations, it could work, but numbers are still better - even if they're only 3 years apart. It works for phones that are 1 year apart.

Switch (2017)
Switch 2 (2020)
Switch 3 (2023)
Switch 4 (2026)
Switch 5 (2029)

They could also add letters - M for mini. H for home (or P for Pro or S for Stationary or T for TV), and have the numbers indicate a power level.

Switch (2017)
Switch M (2019)
Switch H2 (2020)
Switch 2 (2021)
Switch M2 (2022)
Switch H3 (2024)
Switch 3 (2025)
Switch M3 (2027)

I like the last one the best.
 

Booga

Alt account
Banned
Sep 15, 2018
937
Ok well this fortifies the idea of the Switch revision being a clamshell, or otherwise joy-con-less model aimed at a lower point of entry price:

Reggie when asked about a new Switch model: "So right now, the current execution of Nintendo Switch with the Joy-Con and all of the capabilities, that's our focus right now."

Interesting that he even calls out the Joy-Con and the current capability set. Almost as though the next iteration of the Switch will lack those functions. My money is still on a budget clamshell Switch. You could still use it Joy-Con-less, but it'll come with its own built-in controls as well.

Here is the full statement from Reggie:


On rumors of new Switch hardware…

Right now, as we go into our second holiday, my focus is making sure the current grey and neon Switch continue to have momentum in the market place. You saw systems, whether it's our own or competitive home console systems, utilize that tactic a bit later in the life cycle. So right now, the current execution of Nintendo Switch with the Joy-Con and all of the capabilities, that's our focus right now.

https://nintendoeverything.com/regg...nce-of-dlc-no-news-about-new-switch-hardware/
 

Plankton2

Member
Dec 12, 2017
2,670
I think too many people are assuming a PlayStation-like strategy for what a future revision might end up being, but I don't think you can exclude them following Microsoft's strategy.

So if I'm understanding this right, the Tegra chip is a 20nm Maxwell chip, and general wisdom tells us Nintendo wants to get off that and onto a Pascal variant for the future. They also will likely want forward compatibility for, at the minimum, all first-party games.

So look at what Xbox did this generation

2013 Xbox One releases -> Summer 2016 Xbox One S launches -> 2017 original Xbox One production ends -> Fall 2017 Xbox One X premium product launches

Nintendo might end up doing the same thing but on an expedited timeline, because of the next gen looming.

2017 Switch launches -> Summer 2019 Switch (S) launches -> 2020 Original Switch ends -> Fall 2020 Switch Pro premium product launches

They can accomplish almost all their objectives by breaking the switch into 2 lines.

So keeping with the Switch (S) name for the first revision, they can give you a cost-oriented product catered towards families while also getting off of 20nm. It would be just like the current Switch with a minor upgrade (compared to current gen) at a similarly priced SKU, with access to all the first-party games, all the indies, and a few third-party games.

The original Switch gets a price cut and then eventual discontinuation the next year. By waiting till 2020, you give those owners a better sense of value for their investment; almost 4 years for a console is fine, rather than 3.

Then the Switch Pro would come, oriented towards the hardcore gamer crowd. Better graphics, more third parties, but at a more expensive (~$400) SKU. You get the option of competing with next gen, which is slated for March-Fall 2020. And if you just care about Nintendo games, well, they have a cheap version of the Switch just for you!
 
Last edited:

Moi_85

Member
Nov 26, 2018
68
Honestly if this is a pro they should just call it a Super Switch.

It's a problem for future names; a simple trick for the name would be "Switch 128" (in the case of 128GB).

With that, people easily know that the new Switch is superior, but you can still play any past Switch game.

But who knows, it's impossible to know what it will be called X-D
 

Surface

Member
Nov 6, 2017
650
KISS principle applies here.
Switch 2.

Super Switch
Mega Switch
Ultra Switch
Switch 64
Giga Switch
Tera Switch
Switch Pro
Switch X
Switch Premium

If you go with those, you eventually have a lot of confusion.

Although, if you pick one and keep using it for successive generations...

Switch (2017)
Switch Pro (2020)
Switch 2 (2022)
Switch 2 Pro (2025)
Switch 3 (2027)
Switch 3 Pro (2030)

Just to establish half generations, it could work, but numbers are still better - even if they're only 3 years apart. It works for phones that are 1 year apart.

Switch (2017)
Switch 2 (2020)
Switch 3 (2023)
Switch 4 (2026)
Switch 5 (2029)

They could also add letters - M for mini. H for home (or P for Pro or S for Stationary or T for TV), and have the numbers indicate a power level.

Switch (2017)
Switch M (2019)
Switch H2 (2020)
Switch 2 (2021)
Switch M2 (2022)
Switch H3 (2024)
Switch 3 (2025)
Switch M3 (2027)

I like the last one the best.

Like... Wii and Wii U? I'm sure they wanna go that path again.
 

TitanicFall

Member
Nov 12, 2017
8,263
Then the Switch Pro would come, oriented towards the hardcore gamer crowd. Better graphics, more third parties, but at a more expensive (~$400) SKU. You get the option of competing with next gen, which is slated for March-Fall 2020. And it also gives you an option: if you just care about Nintendo games, well, they have a cheap version of the Switch just for you!

I find it hard to believe a Switch Pro would drive more third party support. That would imply that you would have to upgrade to play those games. Even with PS4 Pro and Xbox One X, no base owners were getting left behind.
 

z0m3le

Member
Oct 25, 2017
5,418
I find it hard to believe a Switch Pro would drive more third party support. That would imply that you would have to upgrade to play those games. Even with PS4 Pro and Xbox One X, no base owners were getting left behind.
An upgrade model would meet the minimum requirements for the 2022 model's game library; that is sort of the way these things go. You get a premium model every 3 years, and the 6-year-old model becomes obsolete... It's like if, instead of the PS5, Sony released a PS4 Pro 2 in 2020, with games that required at least the Pro model.

If the Switch is selling 15 to 20 million a year on average, a Switch Pro could have a 50 million install base when they release the Switch Pro 2 in 2022. That means developers can take advantage of these new platforms right away.
 
Oct 27, 2017
5,618
Spain
Zelda's 11 watts is specifically described as an average by Anandtech, not a peak.

1. Shield TV power draw peaks at 19 watts when gaming. You need to show it above that to assume it's drawing 19 watts at Switch clocks, which mDave didn't report.

2. Time for hard facts: an A57 at 1GHz draws 1.83 watts on 20nm, while an A57 at 2GHz draws over 8 watts. mDave reported that he locked the CPU at 2GHz and that the GPU ran at 614MHz or higher, sometimes dropping to 537MHz, during his testing, which included Tomb Raider for Nvidia Shield. This is with an imposed 15 watts (really 19 watts read at the wall), meaning that to go from 614MHz to 768MHz you claim costs the GPU over 6 watts (the difference between running Switch CPU clocks and the 2GHz mDave tested).

TL;DR: The 20nm chip most likely runs at a lower power consumption when running at Switch clock speeds, because even with its TDP thermal throttling, the Shield TV is able to maintain quite a high GPU speed of 614MHz or better for the majority of a game test while the A57 CPU is clocked at 2GHz, which consumes over 8 watts: https://images.anandtech.com/doci/8718/A57-power-curve_575px.png

"Even two implementations of the same chip can have wildly different performance while doing the same task." this is worth applying to the shield TV vs the switch, truth is, switch will have tighter voltages, and never needs to run the cpu at a high voltage, so its at the low end of the power curve, Tegra X1 in the shield TV has a better heat sink and draws power from the wall at all times, so as long as it doesn't melt, it's fine. Switch has a TPD of 18 watts, which makes sense given Shield TV's 19 watts at the wall reading. They are almost certainly the same process node with aggressive power management applied to the Switch with locked clocks, while shield TV can change frequencies on the fly and push hotter temperatures and higher voltages to meet twice the cpu clock and a 25% gpu bump.
1-Anandtech also specifies 19.5W as an average for the Shield TV when playing games.

2-A57 draws those figures under 100% synthetic load on all cores. When MDave ran the 2GHZ test, you have no guarantee that the test was putting a 100% load on all cores, in fact it most certainly wasn't, because he said his best results on the test were obtained with the CPU at 1020MHZ. If his test running on Vulkan was not CPU-bound at 1020MHZ, it would hardly put a 100% load on the same CPU running at 2GHZ. Had the test he used been really CPU intensive, the GPU would have throttled even further. (We'll come back to this) When he did the tests at Switch CPU frequencies, it was a more balanced test that did a better job at really maximizing load while maintaining balanced clockspeeds. Given that he ran at a fixed resolution and the GPU was running at lower frequencies, the CPU at 2GHZ was doing less work while drawing more power.
His 1020MHZ test was a breakthrough because 1: It maximized performance, aka the combined load of CPU and GPU. 2: The Shield TV ended up throttling to Switch GPU frequencies when it maximized its performance, indicating that that specific clock ratio was the one that maximized performance for typical gaming loads on the Tegra X1. That's why everybody back then thought that if those clocks appeared in the final Switch it would have a stock Tegra X1. What nobody counted on is that the Shield TV throttles to stay at 19W power consumption, and that the Switch consumes 11W in the same conditions.

You are misinterpreting what I said. Two equal chips can, say, play a video, or idle, with different power consumption because under the hood they are handling those tasks differently, at different frequencies, etc. But if the two chips are running games on Vulkan, at the same frequencies, with similarly high loads on CPU and GPU, and there is a massive disparity in power usage, then they are not the same chip. You say the Switch has tighter voltages, and of course it does, because it's on a FinFET node and can do it. If the Tegra X1 could run like that at those voltages, the Shield TV would do it as well, and when consuming 19W it would be a beast.

And no, the Switch does not have an 18W TDP. It draws slightly below 18W when playing a game at full blast while charging the battery and the JoyCons. But only 11 of those go toward making the game run.

Finally, an observation on the 2GHZ mode and its performance. There are a few games on the Shield TV that are high-end last-gen games and run very, very poorly on it. For instance, Tomb Raider, Resi 5, Metal Gear Rising. What do all of those have in common? They are DirectX to OpenGL ports. The CPU code is probably very unoptimized and very poorly threaded. Which means there has to be a single thread with a ton of load running at very high frequency, and the CPU is left running at 2GHZ with one thread full to the brim, and the GPU left to throttle. So the games end up running at low framerates and low resolutions. Then you have games like DOOM 3, which are native to OpenGL, with better threading and most surely running the CPU at 1GHZ or so, which also run at higher resolutions because the GPU isn't throttled nearly as much. It also happens to run like a Switch port would. If you look for real game equivalents to MDave's tests, they exist.
 

z0m3le

Member
Oct 25, 2017
5,418
1-Anandtech also specifies 19.5W as an average for the Shield TV when playing games.

2-A57 draws those figures under 100% synthetic load on all cores. When MDave ran the 2GHZ test, you have no guarantee that the test was putting a 100% load on all cores, in fact it most certainly wasn't, because he said his best results on the test were obtained with the CPU at 1020MHZ. If his test running on Vulkan was not CPU-bound at 1020MHZ, it would hardly put a 100% load on the same CPU running at 2GHZ. Had the test he used been really CPU intensive, the GPU would have throttled even further. (We'll come back to this) When he did the tests at Switch CPU frequencies, it was a more balanced test that did a better job at really maximizing load while maintaining balanced clockspeeds. Given that he ran at a fixed resolution and the GPU was running at lower frequencies, the CPU at 2GHZ was doing less work while drawing more power.
His 1020MHZ test was a breakthrough because 1: It maximized performance, aka the combined load of CPU and GPU. 2: The Shield TV ended up throttling to Switch GPU frequencies when it maximized its performance, indicating that that specific clock ratio was the one that maximized performance for typical gaming loads on the Tegra X1. That's why everybody back then thought that if those clocks appeared in the final Switch it would have a stock Tegra X1. What nobody counted on is that the Shield TV throttles to stay at 19W power consumption, and that the Switch consumes 11W in the same conditions.

You are misinterpreting what I said. Two equal chips can, say, play a video, or idle, with different power consumption because under the hood they are handling those tasks differently, at different frequencies, etc. But if the two chips are running games on Vulkan, at the same frequencies, with similarly high loads on CPU and GPU, and there is a massive disparity in power usage, then they are not the same chip. You say the Switch has tighter voltages, and of course it does, because it's on a FinFET node and can do it. If the Tegra X1 could run like that at those voltages, the Shield TV would do it as well, and when consuming 19W it would be a beast.

And no, the Switch does not have an 18W TDP. It draws slightly below 18W when playing a game at full blast while charging the battery and the JoyCons. But only 11 of those go toward making the game run.

Finally, an observation on the 2GHZ mode and its performance. There are a few games on the Shield TV that are high-end last-gen games and run very, very poorly on it. For instance, Tomb Raider, Resi 5, Metal Gear Rising. What do all of those have in common? They are DirectX to OpenGL ports. The CPU code is probably very unoptimized and very poorly threaded. Which means there has to be a single thread with a ton of load running at very high frequency, and the CPU is left running at 2GHZ with one thread full to the brim, and the GPU left to throttle. So the games end up running at low framerates and low resolutions. Then you have games like DOOM 3, which are native to OpenGL, with better threading and most surely running the CPU at 1GHZ or so, which also run at higher resolutions because the GPU isn't throttled nearly as much. It also happens to run like a Switch port would. If you look for real game equivalents to MDave's tests, they exist.
I'm leaving the burden on you at this point. The Switch, for instance, only uses 3 of the 4 CPU cores for games, while the Shield TV uses 4.

There is an optimized mode for the Shield TV that has the system draw only between 10 and 13 watts during gaming, which is in line with what we are seeing on the Switch.

I've shown that under the same conditions, the Tegra X1 in the Pixel C and the Switch consume the same amount of power when streaming media. (You believe this is because of different clocks used; I don't.)

I've shown that with the CPU at 2GHz, the GPU ran at at least 614MHz while gaming on the Nvidia Shield. You claim that it isn't putting enough load on the CPU; well, Vulkan moves load off the CPU and onto the GPU, while OpenGL does the opposite.

What you have is a question: "Why does the Switch only use 11 watts when gaming?" You've gotten dozens of answers about voltage regulation, mobile design, lower clocks, and then everything I've shown you, but you believe your theory is ironclad. So whatever, the Switch is secretly a 16nm chip and Nvidia doesn't use it in new Nvidia Shield TVs. The Switch isn't as efficient as the X2, but that's okay because reasons.

I know you're going to respond with some new spin on clocks and how mDave's tests prove your theory; they don't, because they're missing the critical evidence of power draw. I'm going back to facts now, taking off my tinfoil hat here. You find the evidence and I'll accept it, but don't treat your theory like it's a known fact; it's just something you've proven to yourself and no one else.
 

Plankton2

Member
Dec 12, 2017
2,670
I find it hard to believe a Switch Pro would drive more third party support. That would imply that you would have to upgrade to play those games. Even with PS4 Pro and Xbox One X, no base owners were getting left behind.

The Switch itself drives up the third-party support. Like, third parties are waiting on Nintendo right now to put more games on the console, due to cart constraints.

But even with that, the base Switch can't run every third-party game out there, that's just a fact. And I don't think any minor upgrade (like a downclocked X2) will majorly change that. So a Pro/major console upgrade is an eventuality that allows them to remain competitive with the next gen of consoles.

And I think for Nintendo, as long as they don't leave base owners behind for first-party games, they will be fine. So in my scenario, a potential Mario Odyssey 2 coming to all 3 platforms in 2021 would keep everyone happy. But if a 3rd-party AAA game in 2021 is exclusive to the Switch Pro, that's understandable.
 