GM Volt Forum

· Registered · 3,237 Posts · Discussion Starter · #1
The recent fatal accident in California, where a Tesla on AP hit a lane divider, raises a question for me about this technology. I figured that with the connectivity of these cars, there would be some kind of learning protocol for the Autopilot system.

It was reported that this driver had driven past the accident area previously and noticed that his AP system didn't handle it correctly on multiple occasions. Presumably he had to make manual corrections. It seems to me that the AP system could be programmed to notice when manual corrections are needed, and then use that information to handle that location better next time. That information could also be shared across all vehicles in the Tesla fleet. I realize that would be a pretty advanced level of machine performance, but from a company that can land a hovering rocket on a raft, it should be within reach at least to some extent. It also seems like that would be essential to eventually progress to fully autonomous vehicles. Isn't Tesla doing anything like that at this point?
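
To make the idea concrete, here's a rough sketch of what that correction-logging loop might look like. This is purely hypothetical on my part; the names, tile size, and threshold are all invented, and I have no idea what Tesla actually does:

```python
# Hypothetical sketch of fleet learning from manual corrections.
# Nothing here reflects Tesla's actual implementation; the names,
# tile size, and threshold are invented for illustration.

from dataclasses import dataclass

@dataclass
class Disengagement:
    lat: float             # where the driver overrode AP
    lon: float
    steering_delta: float  # degrees the driver's input diverged from AP's

class FleetLearner:
    """Aggregates geotagged manual corrections reported by many cars."""

    GRID = 0.0005  # roughly 50 m tiles; buckets nearby events together

    def __init__(self):
        self.corrections: dict[tuple[int, int], int] = {}

    def _tile(self, lat: float, lon: float) -> tuple[int, int]:
        return (round(lat / self.GRID), round(lon / self.GRID))

    def report(self, event: Disengagement) -> None:
        """A car phones home each time the driver had to correct AP."""
        key = self._tile(event.lat, event.lon)
        self.corrections[key] = self.corrections.get(key, 0) + 1

    def needs_caution(self, lat: float, lon: float, min_reports: int = 5) -> bool:
        """Any fleet car approaching a flagged tile could slow down,
        widen its margins, or hand control back early."""
        return self.corrections.get(self._tile(lat, lon), 0) >= min_reports

# One car reports a correction; every other car can query the shared map.
learner = FleetLearner()
learner.report(Disengagement(lat=37.41, lon=-122.02, steering_delta=12.0))
print(learner.needs_caution(37.41, -122.02))  # False until enough cars report
```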
 

· Registered · 10,014 Posts
Last I checked, Tesla is using Nvidia GPU boards, which are designed to run neural networks. However, Tesla is not saying whether they actually use the boards in this manner.

Currently, AP is just driver assist. If drivers don't realize that by now, there will be more of these crashes. Radar-based driver-assist systems deliberately ignore stationary objects at speed. This is done on purpose to avoid false-positive reactions by the car's metal brain.
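
To illustrate the kind of filtering I mean, here's a simplified sketch. This is not any vendor's actual logic, and the numbers are made up:

```python
# Simplified sketch of why radar-based ACC drops stationary targets.
# Not any vendor's actual code; the threshold is invented.

def keep_target(ego_speed_mps: float, ever_seen_moving: bool) -> bool:
    """Radar gets strong echoes off bridges, signs, and barriers. A return
    that has never been observed moving looks just like roadside clutter,
    so at speed the tracker throws it away rather than slam the brakes
    for every overpass."""
    if ever_seen_moving:
        return True          # a car that slowed to a stop stays tracked
    if ego_speed_mps < 15:   # ~34 MPH: stopped traffic is plausible, keep it
        return True
    return False             # fast + never seen moving = assumed clutter

# A stopped car the radar only "met" after it had already stopped gets
# filtered out, which is exactly the failure mode behind these crashes.
print(keep_target(ego_speed_mps=30.0, ever_seen_moving=False))  # False
```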

In my opinion, there is no upgrade path from driver assist to true FSD cars. The technologies are too different, apart from a few sensors and actuators used in both. That's why GM has both Super Cruise and Cruise AVs: totally different efforts.
 

· Moderator · 8,616 Posts
There's a whole lot we don't know here. At various points in time, Tesla has given us hints that it is doing learning, but there's little evidence of it day to day.

There was a lot of talk about AP running in "shadow mode" last year, which we think means running the neural nets and comparing their steering output with the actual input from the driver. But we never saw any reports or data from Tesla on what they learned from it, if anything.
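
If that interpretation is right, shadow mode would conceptually be something like the sketch below. This is speculation on our part, with invented names; Tesla has published nothing like it:

```python
# Speculative sketch of "shadow mode": run the neural net in parallel
# with the human driver and log where they disagree. We have no
# confirmation this is how Tesla does it; every name here is invented.

DISAGREEMENT_DEG = 5.0  # hypothetical threshold for an "interesting" frame

def shadow_step(camera_frame, driver_steering_deg: float, net, uploader) -> float:
    """The net predicts but never touches the wheel; frames where it
    would have steered differently from the human are candidate
    training data to send back to the mothership."""
    predicted_deg = net.predict_steering(camera_frame)
    error = abs(predicted_deg - driver_steering_deg)
    if error > DISAGREEMENT_DEG:
        uploader.send(frame=camera_frame,
                      predicted=predicted_deg,
                      actual=driver_steering_deg)
    return error
```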

For a while there was an owner on TMC with a rooted AP2 car, and we learned a fair amount from him about what the system can see. His car was getting requests to upload photos from the AP cameras whenever the neural network recognized certain things; he caught and posted a group of photos it uploaded of construction work, correctly identified.

But the only reason we know about those bits is his rooted car, which he no longer has. Tesla could be hitting any of the cars with similar requests every day, or they might be doing no learning at all; we just don't know what the car is saying behind the scenes unless Tesla decides to tell us.

One of the things firmware 8 was supposed to bring was a radar whitelist. One of the big problems with all adaptive cruise systems is correctly recognizing and responding to stopped cars in the lane. Tesla's plan was to have the cars watch all the big stationary radar returns that drivers drive right past or through, and upload them to the mothership as geotagged data. They were then going to build a tiled whitelist of those objects and start braking for large stationary returns that aren't on it. But I've seen nothing new on that in a year or so, and we still seem to get phantom braking events from long-standing objects on major roads, so it seems like that effort may never have been finished.
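
For what it's worth, the scheme as described would amount to something like this sketch; the tile size and report threshold are my invention:

```python
# Sketch of the radar whitelist idea as described above. This is my own
# reconstruction of the announced plan, not Tesla's code.

GRID = 0.0005  # ~50 m tiles

def tile(lat: float, lon: float) -> tuple[int, int]:
    return (round(lat / GRID), round(lon / GRID))

class RadarWhitelist:
    MIN_REPORTS = 20  # hypothetical: enough drive-pasts to trust the tile

    def __init__(self):
        self.reports: dict[tuple[int, int], int] = {}

    def report_drive_past(self, lat: float, lon: float) -> None:
        """Fleet cars upload big stationary returns that drivers sail
        past; repeated reports mark the tile as a bridge, sign, gantry."""
        key = tile(lat, lon)
        self.reports[key] = self.reports.get(key, 0) + 1

    def should_brake(self, lat: float, lon: float) -> bool:
        """Brake for a large stationary return *unless* the fleet has
        already established that this spot always shows one."""
        return self.reports.get(tile(lat, lon), 0) < self.MIN_REPORTS
```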

At some point I expect Tesla to map the lanes as the cars drive and use that, together with precision GPS, as a double check on the camera-based AP steering guidance. Maybe if they get to Level 3, a mismatch between the two will be grounds for a driver alert.
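
Conceptually that cross-check could be as simple as the sketch below; the threshold is made up, and this is pure speculation:

```python
# Speculative sketch: lane map + precision GPS as a sanity check on
# camera-based steering. Threshold and names are invented.

MAX_DIVERGENCE_M = 0.5  # hypothetical tolerance between the two sources

def cross_check(camera_offset_m: float, mapped_offset_m: float) -> str:
    """Both arguments are the lateral offset of the lane center from
    the car: one estimated by vision, one from the stored lane map
    plus precision GPS. Disagreement means one of them is wrong."""
    if abs(camera_offset_m - mapped_offset_m) > MAX_DIVERGENCE_M:
        return "ALERT_DRIVER"  # at Level 3, hand control back
    return "OK"

print(cross_check(0.1, 0.9))  # ALERT_DRIVER: sources disagree by 0.8 m
```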
 

· Registered · 4,232 Posts
It was reported that this driver had driven past the accident area previously and noticed that his AP system didn't handle it correctly on multiple occasions. Presumably he had to make manual corrections. It seems to me that the AP system could be programmed to notice when manual corrections are needed, and then use that information to handle that location better next time. That information could also be shared across all vehicles in the Tesla fleet. I realize that would be a pretty advanced level of machine performance, but from a company that can land a hovering rocket on a raft, it should be within reach at least to some extent. It also seems like that would be essential to eventually progress to fully autonomous vehicles. Isn't Tesla doing anything like that at this point?
This would be a real case of Natural Stupidity defeating Artificial Intelligence.
 

· Super Moderator · 6,369 Posts
One of the things firmware 8 was supposed to bring was a radar whitelist. One of the big problems with all adaptive cruise systems is correctly recognizing and responding to stopped cars in the lane.
This aspect of recognizing stopped cars has HUGELY improved in the past several releases. I drive with AP a significant amount, even on 35-50 MPH roads here in the Chicagoland suburbs. My previous AP1 and current AP2 Tesla Model X required a lot of attention when coming up on cars that were already stopped at lights. If there were cars around me still moving and slowing down with me (in other lanes or my lane), it did great. Most thought that was simply because radar was mainly (only?) being used.

It is dramatically better now in my AP2 car, and has been for a few releases. Most think it is because they are blending the radar input with camera images run through the new neural network / machine learning stack. I rarely have an issue anymore; I don't recall the last time in the past 1,000 miles.

The major key point is that this does NOT require everything to be mapped *ahead* of time on every road. The car uses NN/ML smarts onboard, trained on shared fleet data. That means it works on more roads, on new roads, and even on changed roads; I've driven through construction zones several times in the past month (local and highway).
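
Blending in that sense might look something like this, at least conceptually; the weights here are made up, nothing official:

```python
# My guess at what "blending" radar with the camera NN might mean.
# The weights and threshold are invented, not anything Tesla published.

def stopped_car_confidence(radar_stationary_hit: bool,
                           camera_car_score: float) -> float:
    """camera_car_score: the vision net's 0-1 probability that the
    object is a car. Radar alone can't tell a stopped car from a sign,
    but radar agreement plus a confident vision classification can
    justify braking."""
    base = 0.3 if radar_stationary_hit else 0.0
    return min(1.0, base + 0.7 * camera_car_score)

BRAKE_THRESHOLD = 0.8  # hypothetical
print(stopped_car_confidence(True, 0.9) >= BRAKE_THRESHOLD)  # True: brake
print(stopped_car_confidence(True, 0.2) >= BRAKE_THRESHOLD)  # False: likely a sign
```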

There's an excellent recent video covering some of this AI/NN/ML: "TRAIN AI 2018 - Building the Software 2.0 Stack" by Andrej Karpathy.

 