
Ryno23

Banned
Dec 13, 2017
1,097


Presentation starts at the 1 hour 9 minute mark, for anyone tuning in late who's interested


So no one has actually seen yet what the Tesla Hardware 3 custom chip + neural network, which is their solution for autonomous full self-driving, is capable of. It will be shown today, so this should be an interesting event.


Update: FSD Hardware 3 video:


Impressions from FSD demo rides:

 
Last edited:

Deleted member 11985

User requested account closure
Banned
Oct 27, 2017
4,168
Hmm, 3 years to go from nothing to full autonomy? I want fully autonomous cars more than anybody I know, but I have a sneaking suspicion that's a little too fast, considering how complex the problem is.

I guess they haven't revealed yet when full autonomy is going to be released to the general public, though.
 

SRG01

Member
Oct 25, 2017
7,008
Hmm, 3 years to go from nothing to full autonomy? I want fully autonomous cars more than anybody I know, but I have a sneaking suspicion that's a little too fast, considering how complex the problem is.

I guess they haven't revealed yet when full autonomy is going to be released to the general public, though.

It won't be. It needs to approach Waymo and Cruise/GM levels of intervention before they'll be taken seriously.
 

Deleted member 10612

User requested account closure
Banned
Oct 27, 2017
2,774
This chip seems underpowered. I mean, the iPhone X had 8 billion transistors and my Galaxy S10 has eight ARM cores. I really expected them to blow the lid off with 32-core madness etc.
 

pj-

Banned
Oct 25, 2017
1,659
Who is the audience for this? It's far too technical for anyone but computer/software engineers

Normal people should only care about the results
 

subrock

Member
Oct 27, 2017
1,958
Earth
This chip seems underpowered. I mean, the iPhone X had 8 billion transistors and my Galaxy S10 has eight ARM cores. I really expected them to blow the lid off with 32-core madness etc.
Cores don't really matter for things that aren't multi-threaded. Real-time is generally not multi-threaded because you need a single pipeline of instructions going out to the car's systems. If it were async, you could have two competing instructions given concurrently, which would be dangerous.
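To put the idea in concrete terms, here's a rough, purely hypothetical Python sketch (nothing to do with Tesla's actual stack): a single sense-plan-act loop runs once per tick, so there is exactly one place that ever writes commands to the car.

# Hypothetical illustration only, not Tesla's stack: one sense -> plan -> act
# loop per tick, so exactly one writer ever issues actuator commands.
import time

TICK_S = 0.01  # assume a 100 Hz control loop for illustration

def read_sensors():
    # Placeholder: a real system would poll cameras, radar, IMU, etc. here.
    return {"obstacle_ahead": False, "lane_offset_m": 0.1}

def plan(sensors):
    # A single planner produces one command; no competing async writers.
    if sensors["obstacle_ahead"]:
        return {"throttle": 0.0, "brake": 1.0, "steer": 0.0}
    return {"throttle": 0.3, "brake": 0.0,
            "steer": -0.5 * sensors["lane_offset_m"]}

def actuate(command):
    # Placeholder for the single write to the vehicle bus each tick.
    print(command)

def control_loop(ticks=3):
    for _ in range(ticks):
        start = time.monotonic()
        actuate(plan(read_sensors()))  # one pipeline, one command per tick
        time.sleep(max(0.0, TICK_S - (time.monotonic() - start)))

if __name__ == "__main__":
    control_loop()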
 

Deleted member 10612

User requested account closure
Banned
Oct 27, 2017
2,774
Cores don't really matter for things that aren't multi-threaded. Real-time is generally not multi-threaded because you need a single pipeline of instructions going out to the car's systems. If it were async, you could have two competing instructions given concurrently, which would be dangerous.
I don't think that's how it works.

I get that they made some other stuff that can scan video feeds extremely well. But you need to do a lot more than just checking video feeds fast for self-driving to work.
 

Keikaku

Member
Oct 27, 2017
4,768
What is this. Lol. "everyone using lidar is doomed" lol
 

2pac_71

Member
Oct 25, 2017
2,503
Elon is so rude to the dude who introduces the next speaker haha. Who cares about his PhD at Stanford, that's not important.
 

subrock

Member
Oct 27, 2017
1,958
Earth

Deleted member 10612

User requested account closure
Banned
Oct 27, 2017
2,774
If you know how it really works, please share

Here's some relevant discussion as to why multi-core processors are not suitable for time sensitive applications: https://electronics.stackexchange.com/a/296418
Because there is more to self-driving cars than sensor readouts. It's world modelling, driver prediction (of things that haven't happened yet), sync of GPS/HD map data, car-to-X communication, cloud sync over 5G, etc.

All of that happening at once.
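For what it's worth, those two things aren't mutually exclusive. Here's a purely illustrative, hypothetical Python sketch (made-up names, not any real stack): background subsystems can keep a shared world model fresh concurrently while a single control loop consumes the latest snapshot and remains the only thing issuing commands.

# Purely illustrative, hypothetical sketch (not any real stack): background
# subsystems update a shared world model while a single control loop reads
# the latest snapshot, so only one place ever issues commands.
import threading
import time

world_model = {"hd_map": None, "predictions": None, "v2x": None}
lock = threading.Lock()

def updater(key, period_s):
    # Stand-in for HD map sync, driver prediction, car-to-X messages, etc.
    while True:
        with lock:
            world_model[key] = time.monotonic()  # pretend this is fresh data
        time.sleep(period_s)

def control_tick():
    with lock:
        snapshot = dict(world_model)  # one consistent view per tick
    # ...plan from `snapshot` and issue exactly one command here...
    return snapshot

if __name__ == "__main__":
    for key, period in (("hd_map", 1.0), ("predictions", 0.1), ("v2x", 0.05)):
        threading.Thread(target=updater, args=(key, period), daemon=True).start()
    for _ in range(3):  # a few control ticks for demonstration
        print(control_tick())
        time.sleep(0.01)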
 

SRG01

Member
Oct 25, 2017
7,008
What is this. Lol. "everyone using lidar is doomed... DOOMED" lol

What a strange vibe this talk has.

Everyone using Lidar has a huge advantage over Tesla. The results from other autonomy projects speak for themselves.

I made a similar comparison with Night Sight: processing can only get you so far when there's a far superior, cost-effective sensor solution available.

Because there is more to self-driving cars than sensor readouts. It's world modelling, driver prediction (of things that haven't happened yet), sync of GPS/HD map data, car-to-X communication, cloud sync over 5G, etc.

All of that happening at once.

Yes, I agree, although I would also add that the robustness of the data set matters as well.
 

2pac_71

Member
Oct 25, 2017
2,503
From that tweet, does it mean the Tesla chip isn't that great compared to Nvidia and Mobileye?
 

SRG01

Member
Oct 25, 2017
7,008
From that tweet, does it mean the Tesla chip isn't that great compared to Nvidia and Mobileye?

It's a combination of hardware and software. At this point, it's obvious that Tesla is faaaaaaaaaaar behind their competition in both respects.

But yes, to answer your question: unless Tesla has baked in some secret sauce, and even then the raw performance is still behind.
 

2pac_71

Member
Oct 25, 2017
2,503
It's a combination of hardware and software. At this point, it's obvious that Tesla is faaaaaaaaaaar behind their competition in both respects.

But yes, to answer your question: unless Tesla has baked in some secret sauce, and even then the raw performance is still behind.

Thanks, so Elon's confidence etc. is just for show?
 
OP

Ryno23

Banned
Dec 13, 2017
1,097
Everyone using Lidar has a huge advantage over Tesla. The results from other autonomy projects speak for themselves.

I made a similar comparison with Night Sight: processing can only get you so far when there's a far superior, cost-effective sensor solution available.



Yes, I agree, although I would also add that the robustness of the data set matters as well.

Lidar hardware alone costs $60k+...how on earth is that cost effective? They will never have the fleet and therefore the data to compete with Tesla
 

Deleted member 2625

User requested account closure
Banned
Oct 25, 2017
4,596
Because there is more to self-driving cars than sensor readouts. It's world modelling, driver prediction (of things that haven't happened yet), sync of GPS/HD map data, car-to-X communication, cloud sync over 5G, etc.

All of that happening at once.

Pretty sure subrock is right. Real-time systems are not like generalized out-of-order PC instructions
 

Deleted member 11985

User requested account closure
Banned
Oct 27, 2017
4,168
He's predicting autonomous robotaxis for next year. I'll take that.

Edit: Well, this is sounding expensive, so I guess I won't be taking it anytime soon. But still cool.
 

Armaros

Member
Oct 25, 2017
4,901
Looks like someone hasn't been watching the presentation

They literally stated in previous legally required documentation that they had zero recorded miles in California for the year.

Presentations > Legal Documentation?

In previous announcements, Elon has stated that FSD was coming out within 2 years, which he has been saying for 4-5 years straight.
 
OP

Ryno23

Banned
Dec 13, 2017
1,097
They literally stated in previous legally required documentation that they had zero recorded miles in California for the year.

Presentations > Legal Documentation?

They have 500k cars on the road today running in shadow mode and sending data back to the neural network. This was all walked through in the presentation... That was my point; I don't know what you are trying to get at.
 

Argyle

Member
Oct 25, 2017
1,054
A comparison of the TOPS:


From that tweet, does it mean the Tesla chip isn't that great compared to Nvidia and Mobileye?
It's a combination of hardware and software. At this point, it's obvious that Tesla is faaaaaaaaaaar behind their competition in both respects.

But yes, to answer your question: unless Tesla has baked in some secret sauce, and even then the raw performance is still behind.

Is that really so, though? I only watched the hardware section and I wasn't taking notes, but that slide shows the design goals...and the text seems incorrect.

For example, the design goal was 50 TOPS. They said later in the presentation that they achieved 72 TOPS on the actual chip, so the tweet is already incorrect. The FSD computer has two chips on it (one for redundancy), so in less than 100 W they are actually achieving 144 TOPS. Now this guy has everything wrong so far, but if we take his numbers at face value, Mobileye seems competitive as far as performance per watt, but it looks like it might be a chip for next year on a different process node according to this: https://www.mobileye.com/our-technology/evolution-eyeq-chip/ ...Nvidia, not so much. Yeah, they have a little over double the performance if you're willing to throw 4x as much power and cooling at it, and in an electric car, taming power consumption is pretty important if you want to maximize range.
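To put rough numbers on that (using only the figures quoted above, and taking the "double the performance at 4x the power" characterization of Nvidia at face value; these are not measured figures), the performance-per-watt math works out like this:

# Back-of-the-envelope TOPS-per-watt comparison using only the rough numbers
# quoted above; real parts and real workloads will differ.
tesla_board_tops = 144      # two 72 TOPS chips per FSD computer
tesla_board_watts = 100     # "less than 100 W" per the presentation

nvidia_tops = 2 * tesla_board_tops    # "a little over double the performance"
nvidia_watts = 4 * tesla_board_watts  # "4x as much power and cooling"

print(f"Tesla:  {tesla_board_tops / tesla_board_watts:.2f} TOPS/W")  # ~1.44
print(f"Nvidia: {nvidia_tops / nvidia_watts:.2f} TOPS/W")            # ~0.72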

C'mon, man, SRG01 you're an engineer right? It's disappointing to see you misrepresent stuff like this.
 

Armaros

Member
Oct 25, 2017
4,901
They have 500k cars on the road today running in shadow mode and sending data back to the neural network. This was all walked through in the presentation... That was my point; I don't know what you are trying to get at.

You do realize that the data they record from those cars is not even close to the full suite of data you get from actual testing cars like Waymo's? You, like most other people, believe that any form of raw data is all you need for neural networks.

And you already completely misunderstand how neural networks use and need data. And here you are talking about Tesla being better at data collection and neural networks than a company affiliated with Google.
 
OP

Ryno23

Banned
Dec 13, 2017
1,097
You do realize that the data they record from those cars is not even close to the full suite of data you get from actual testing cars like Waymo's? You, like most other people, believe that any form of raw data is all you need for neural networks.

And you already completely misunderstand how neural networks use and need data. And here you are talking about Tesla being better at data collection and neural networks than a company affiliated with Google.

OK, well, again: when you have time, watch the presentation; it's literally what the thread is about, and they clearly walked through why what they're doing works and why they believe their strategy is far superior to Waymo's.
 

pj-

Banned
Oct 25, 2017
1,659
Real curious to see demo footage

If you take Elon at his word, they are 5 years ahead of everyone else vs. the 5 years behind some analysts were claiming

I don't take him at his word of course, but it is interesting stuff either way. We're mere months from self-driving cars that can pay for themselves. Or Tesla is going to burn down like a Shanghai parking garage
 

Luschient

Member
Oct 30, 2017
1,614
Wasn't able to watch this; did they mention to what extent (if at all) they are using radar/mmWave sensors for this?
 

Deleted member 10612

User requested account closure
Banned
Oct 27, 2017
2,774
Is that really so, though? I only watched the hardware section and I wasn't taking notes, but that slide shows the design goals...and the text seems incorrect.

For example, the design goal was 50 TOPS. They said later in the presentation that they achieved 72 TOPS on the actual chip, so the tweet is already incorrect. The FSD computer has two chips on it (one for redundancy), so in less than 100 W they are actually achieving 144 TOPS. Now this guy has everything wrong so far, but if we take his numbers at face value, Mobileye seems competitive as far as performance per watt, but it looks like it might be a chip for next year on a different process node according to this: https://www.mobileye.com/our-technology/evolution-eyeq-chip/ ...Nvidia, not so much. Yeah, they have a little over double the performance if you're willing to throw 4x as much power and cooling at it, and in an electric car, taming power consumption is pretty important if you want to maximize range.

C'mon, man, SRG01 you're an engineer right? It's disappointing to see you misrepresent stuff like this.
If one is for redundancy you can't add them together. Or else it would not be redundant.
 

Argyle

Member
Oct 25, 2017
1,054
If one is for redundancy you can't add them together. Or else it would not be redundant.

We're comparing performance per watt. Of course you can add them together. Redundancy is a design choice; if you want to say that redundancy doesn't matter, then let's say Tesla's system gives you 72 TOPS at less than 50 watts, and the performance per watt doesn't change.

But sure, let's talk about that. The Tesla chip has 72 TOPS of NN acceleration on board. Do you think that performance will scale linearly as you add more chips? It seems that a major design consideration is the 64 MB of on-chip SRAM cache for intermediate results; this is important because dumping things out to main memory and loading them back in is slow. It's probable that if you had two chips and wanted to use them together, you'd get less than 144 TOPS of real-world performance because of the overhead of communication between the two chips.

But that's even more true for Tesla's competitors: for, say, Mobileye to have the same raw throughput as one Tesla chip, you'd need three chips. In reality you might need even more chips to get 72 TOPS of net performance, because this overhead only gets worse as you add more chips, so there might not be a performance-per-watt advantage in reality.
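As a toy model of that scaling argument (the 10% per-extra-chip overhead below is completely made up, purely to show the shape of the effect, not a measurement of any real part):

# Toy model only: the overhead factor is invented to show why multi-chip
# throughput tends to scale sublinearly, not to describe real hardware.
def effective_tops(per_chip_tops, n_chips, overhead_per_extra_chip=0.10):
    # Each additional chip adds communication/synchronization overhead.
    penalty = (1.0 - overhead_per_extra_chip) ** (n_chips - 1)
    return per_chip_tops * n_chips * penalty

print(effective_tops(72, 1))  # 72.0   -> one Tesla-class chip on its own
print(effective_tops(72, 2))  # 129.6  -> less than the naive 144
print(effective_tops(24, 3))  # ~58.3  -> three ~24 TOPS chips (per-chip figure
                              #           implied by "three chips to match 72
                              #           TOPS") fall short of one 72 TOPS chip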
 

SRG01

Member
Oct 25, 2017
7,008
Lidar hardware alone costs $60k+...how on earth is that cost effective? They will never have the fleet and therefore the data to compete with Tesla
Who told you lidar costs $60k? lol.

Even with LiDAR being expensive, that's not the end game of autonomous driving...
Is that really so, though? I only watched the hardware section and I wasn't taking notes, but that slide shows the design goals...and the text seems incorrect.

For example, the design goal was 50 TOPS. They said later in the presentation that they achieved 72 TOPS on the actual chip, so the tweet is already incorrect. The FSD computer has two chips on it (one for redundancy), so in less than 100 W they are actually achieving 144 TOPS. Now this guy has everything wrong so far, but if we take his numbers at face value, Mobileye seems competitive as far as performance per watt, but it looks like it might be a chip for next year on a different process node according to this: https://www.mobileye.com/our-technology/evolution-eyeq-chip/ ...Nvidia, not so much. Yeah, they have a little over double the performance if you're willing to throw 4x as much power and cooling at it, and in an electric car, taming power consumption is pretty important if you want to maximize range.

C'mon, man, SRG01 you're an engineer right? It's disappointing to see you misrepresent stuff like this.

Ooops yeah, sorry bout that guys. I pulled it off Twitter in a hurry in the morning. I'll leave the evidence of shame, haha.
 

Gashprex

Member
Oct 25, 2017
1,029
I don't know why people are using traditional comparisons - this chip is specifically for FSD and real-time driving. The presentation went painstakingly through the difference between their chip and traditional chips and why they couldn't use them.

They claim a 21x improvement in frames per second over their previous solutions.
 
Last edited: