
· Registered · 1,479 Posts · Discussion Starter · #1
I really don't dig the NTSB's reasoning here. It's like writing software, and then some user deliberately misuses it and accounts get messed up. Can you say that if I use the car to run somebody over, or intentionally crash it into a building, that it lacked protection against improper use? Is the manufacturer liable for drunk drivers because the car had no protection against improper use such as driving while drunk? Where do we draw the line for improper use of technology? If a safety feature isn't used properly and a crash happens despite plenty of warnings, is the manufacturer to blame?

https://www.nytimes.com/2017/09/12/business/self-driving-cars.html
 

· Registered · 752 Posts
I am not privy to all the details, but if I remember correctly that Tesla specifies you are not to let go of the steering wheel, and that is exactly what this guy was doing, then I agree with you.

I also think that Tesla made a mistake by calling it Autopilot, which it clearly is not, giving lunatics ammunition against them.
 

· Registered · 4,890 Posts
I am not privy to all the details, but if I remember correctly that Tesla specifies you are not to let go of the steering wheel, and that is exactly what this guy was doing, then I agree with you.

I also think that Tesla made a mistake by calling it Autopilot, which it clearly is not, giving lunatics ammunition against them.
Actually, Elon Musk encouraged that perception himself: "The probability of having an accident is 50 per cent lower if you have Autopilot on," he said, speaking at an energy conference in Oslo, Norway. "Even with our first version, it's almost twice as good as a person."

http://www.telegraph.co.uk/technolo...s-autopilot-makes-accidents-50pc-less-likely/

Musk set about creating this perception in 2013, when he said that Autopilot would be capable of handling “90 percent of miles driven” by 2016. By mid-2014, Musk was promising investors that the system could handle all freeway driving, “from onramp to exit,” within 12 months of deployment. Tesla still has yet to make good on those bold claims, and competitors argue that Tesla’s relatively simple sensor hardware will never be capable of safely performing at such a high level of autonomy.

Experts have understood Autopilot’s hardware limitations for some time, but Tesla owners and investors clearly believed that Autopilot was either an autonomous drive system, or something very close to it. Brown clearly believed that Autopilot was “autonomous” and described it as such in the description of a video that Musk shared on Twitter. [NOTE: Musk did NOT correct Brown's beliefs!!! when he shared the video...:(] So great was his apparent faith in Autopilot’s autonomous capabilities that he was reportedly watching a DVD at the time of his fatal crash. The extent of Autopilot’s true abilities, which wax and wane with each Over The Air software and firmware update Tesla pushes to the car, is hotly debated on Tesla forums where even Musk’s most devout acolytes waver between extolling its miraculous powers and blaming drivers for their inattentiveness depending on the circumstances.

http://www.thedailybeast.com/how-te...eraged-safety-claims-about-autopilot-and-cars
 

· Registered · 4,101 Posts
I really don't dig the NTSB's reasoning here. It's like writing software, and then some user deliberately misuses it and accounts get messed up. Can you say that if I use the car to run somebody over, or intentionally crash it into a building, that it lacked protection against improper use? Is the manufacturer liable for drunk drivers because the car had no protection against improper use such as driving while drunk? Where do we draw the line for improper use of technology? If a safety feature isn't used properly and a crash happens despite plenty of warnings, is the manufacturer to blame?

https://www.nytimes.com/2017/09/12/business/self-driving-cars.html
Normally a car won't stay in its lane for 7 seconds without someone steering it. Roads aren't flat; they are crowned to drain rainwater.

If his car had not had AP1, he could not have ignored the road for 7 seconds.
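
A rough back-of-the-envelope sketch in Python makes the point (all of the numbers here are illustrative assumptions, not figures from the NTSB report):

```python
# Back-of-envelope: how far can a car drift sideways in 7 seconds if
# nobody steers? All figures below are illustrative assumptions.

LANE_WIDTH_M = 3.7        # typical freeway lane width (assumed)
CAR_WIDTH_M = 1.9         # typical sedan width (assumed)
LATERAL_ACCEL_MS2 = 0.05  # tiny sideways pull from road crown/crosswind (assumed)
TIME_S = 7.0              # the hands-off interval discussed in the thread

# Margin between the side of a centered car and the lane line.
margin_m = (LANE_WIDTH_M - CAR_WIDTH_M) / 2

# Constant-acceleration drift: d = 1/2 * a * t^2
drift_m = 0.5 * LATERAL_ACCEL_MS2 * TIME_S ** 2

print(f"Margin to lane line: {margin_m:.2f} m")
print(f"Drift after {TIME_S:.0f} s with no steering: {drift_m:.2f} m")
print("Leaves the lane" if drift_m > margin_m else "Stays in the lane")
```

Even a very small uncorrected pull crosses the lane line well inside 7 seconds, which is the point: without AP1 holding the lane, the driver could not have looked away for that long.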
 

· Registered · 4,230 Posts
I read this as an indictment of the semi-autonomous systems like Tesla's. We either need full automation or none.
 

· Registered · 10,014 Posts
I read this as an indictment of the semi-autonomous systems like Tesla's. We either need full automation or none.
I disagree. ACC has made my commute much less stressful and less tiring. Constantly switching between throttle and brakes, as well as determining closing speeds, is much harder on the psyche than letting the car handle most of it.

TACC (Tesla) is apparently more capable than ACC (GM) and lulls the driver into relying more heavily on the system. With ACC I still need to pay a lot of attention to other traffic.
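
For what it's worth, the "determining closing speeds" part is roughly what ACC automates. A minimal sketch of the idea (Python; the gains, headway, and example numbers are my own illustrative assumptions, not GM's or Tesla's actual control logic):

```python
def acc_speed_command(own_speed, lead_speed, gap, set_speed,
                      desired_headway_s=2.0, gap_gain=0.2, closing_gain=0.5):
    """Very simplified adaptive-cruise idea: follow a lead car at a time
    gap, otherwise hold the driver's set speed. Gains and headway are
    illustrative assumptions only."""
    if lead_speed is None or gap is None:
        return set_speed                    # nothing ahead: cruise at set speed

    closing_speed = own_speed - lead_speed  # > 0 means we are gaining on the lead car
    desired_gap = own_speed * desired_headway_s
    gap_error = gap - desired_gap           # negative when we are too close

    # Slow down when too close or closing fast; never exceed the set speed.
    command = own_speed + gap_gain * gap_error - closing_gain * closing_speed
    return max(0.0, min(command, set_speed))

# Example: doing 30 m/s (~67 mph), lead car at 25 m/s and 40 m ahead, set speed 30 m/s.
print(acc_speed_command(own_speed=30.0, lead_speed=25.0, gap=40.0, set_speed=30.0))
```

The car just runs that comparison continuously instead of the driver doing it in their head, which is where the reduced fatigue comes from.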

Either way, being a computer guy, I don't put much faith in tech, although the last couple of years of AI look very good. AI programs itself (learns), so a lot of human hand-coding is eliminated.
 

· Registered · 3,237 Posts
The dilemma here is that even if a system is "twice as safe as a human" and truly does eliminate almost half of accidents (or even 90% or 99.9%), you are still left with the accidents that do happen, and who gets the blame for them? It likely falls on the shoulders of the car manufacturer. Or at least that is how the lawsuits and the media will spin it. I wonder how Tesla is managing the potential liability here. I hope they have figured out how to insure against it.
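
Just to put rough numbers on that (Python; the baseline crash count is an assumed round figure for illustration, not a cited statistic):

```python
# Rough arithmetic on "safer but not perfect": even large percentage
# reductions leave a large absolute number of crashes. The baseline is
# an assumed round number, not a citation.
BASELINE_CRASHES_PER_YEAR = 6_000_000

for reduction in (0.50, 0.90, 0.999):
    remaining = BASELINE_CRASHES_PER_YEAR * (1 - reduction)
    print(f"{reduction:.1%} reduction -> {remaining:,.0f} crashes still happen")
```

Every one of those remaining crashes is a potential lawsuit aimed at whoever, or whatever, was driving.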

And by the way, the "fix" for keeping drivers from using the system incorrectly is just to detect whether their hands are on the steering wheel. Can't someone watch a movie with their hands on the wheel? How does that solve the problem? If Tesla really is responsible for babysitting drivers, they need something like eye-tracking technology.
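
To make that concrete: a hands-on-wheel check is essentially just a timer on steering torque, something like the sketch below (Python, purely illustrative; the thresholds and timings are assumptions, not Tesla's actual logic). Note that it never asks where the driver is looking:

```python
import time

# Illustrative hands-on-wheel monitor: escalate warnings when no steering
# torque has been sensed for a while. All thresholds are assumptions, not
# any manufacturer's real values.
TORQUE_THRESHOLD_NM = 0.5   # minimum torque that counts as "hands on wheel"
WARN_AFTER_S = 15.0         # visual warning after this long with no torque
ALERT_AFTER_S = 30.0        # audible alert
DISENGAGE_AFTER_S = 60.0    # slow down / hand control back to the driver

def monitor_step(torque_nm, last_torque_time, now):
    """Return (action, updated_last_torque_time)."""
    if abs(torque_nm) >= TORQUE_THRESHOLD_NM:
        return "ok", now                      # any wheel torque resets the timer
    idle = now - last_torque_time
    if idle >= DISENGAGE_AFTER_S:
        return "disengage", last_torque_time
    if idle >= ALERT_AFTER_S:
        return "audible_alert", last_torque_time
    if idle >= WARN_AFTER_S:
        return "visual_warning", last_torque_time
    return "ok", last_torque_time

# A driver watching a movie with one hand resting on the wheel still produces
# occasional small torques, which reset the timer without proving attention.
print(monitor_step(torque_nm=0.8, last_torque_time=time.time() - 45, now=time.time()))
```

A camera-based gaze or head-pose check measures attention directly, which is what the eye-tracking suggestion is getting at.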
 

· Registered · 3,237 Posts
It is interesting that when the NTSB investigates aviation accidents, if an autopilot system fails they still hold the pilot fully responsible for the accident (the pilot is expected to be able to handle equipment failures). The NTSB seems to be applying a completely different standard in this situation.

The NTSB should have sent a very clear signal that drivers are responsible for operating their vehicles.
 

· Registered · 4,890 Posts
The NTSB report published this month sees things differently. They agreed that it was Brown’s responsibility to remain alert while the Tesla was in Autopilot mode. But they also placed some blame on Tesla for not putting sufficient processes in place to ensure that its drivers are still paying attention to the road while the Autopilot feature is engaged.

Additionally, the NTSB placed some blame on the truck driver. They found him partially responsible because, when he made a left turn across two lanes of the highway, he failed to yield the right of way at the intersection. They also noted that the driver tested positive for marijuana use, though "his level of impairment, if any, at the time of the crash could not be determined."

The crash was therefore deemed by the NTSB to be the fault of Brown, Tesla, and the driver of the truck that the Tesla drove into.

The NTSB also issued a series of recommendations to the DOT, the NHTSA, and to the manufacturers of all “vehicles equipped with Level 2 vehicle automation systems.” You can see the recommendations and the full report below.

https://www.ntsb.gov/news/events/Documents/2017-HWY16FH018-BMG-abstract.pdf
 