
1 - 20 of 61 Posts

·
Banned
Joined
·
7,289 Posts
Discussion Starter #1 (Edited)

·
Registered
Joined
·
105 Posts
It's beta software

Also, AP doesn't relieve the driver of the need to watch what is going on. It is a driver-assist program, not a level 5 autonomous driving program. I have seen a human do precisely that same maneuver in a construction zone. They were either confused by the poorly marked lanes, or simply weren't paying attention.

AI doesn't have to be perfect, just better than human drivers. That's a pretty low bar, actually, so I welcome any safety measure that lowers the chance of some yahoo busy texting plowing into me.
 

·
Registered
Joined
·
1,339 Posts
Ouch. I'm torn on stuff like this... on one hand, autonomy/assist can theoretically make things safer. But on the other, certain levels of autonomy/assist (at least below full autonomy) seem to lead to some people being careless. And even full autonomy will have to deal with unanticipated scenarios, and people will undoubtedly get hurt because of that. Even if the overall numbers show a safety improvement, I'd be furious if my car crashed and injured me or a pedestrian.
 

·
Registered
Joined
·
20 Posts
At least the warning flashers appear to come on automatically.

On my 2017 Volt with ACC, I find it unnerving to feel the car begin to accelerate because the gap to the vehicle immediately in front of me is opening up, even though I can see a car on my right that I know will try to cut in front of me.
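
For what it's worth, here's a toy sketch of why a gap-keeping controller speeds up when the lead car pulls away, and how a cut-in check could temper that. It's purely illustrative, not GM's actual ACC logic; the gains, the 1.8-second time gap, and the cut-in heuristic are all made-up assumptions.

```python
# Illustrative only: a toy gap-keeping cruise controller, NOT GM's ACC logic.
# All gains, thresholds, and the cut-in heuristic are assumptions for the sketch.

def acc_accel(gap_m, own_speed_mps, lead_speed_mps,
              adjacent_gap_m=None, adjacent_closing=False,
              time_gap_s=1.8, kp_gap=0.25, kp_speed=0.50,
              max_accel=1.5, max_decel=-3.0):
    """Return a commanded acceleration (m/s^2) from the current gap."""
    desired_gap = time_gap_s * own_speed_mps          # follow at a fixed time gap
    gap_error = gap_m - desired_gap                   # positive => gap too large
    speed_error = lead_speed_mps - own_speed_mps      # positive => lead pulling away

    accel = kp_gap * gap_error + kp_speed * speed_error

    # Hypothetical cut-in guard: if a car in the next lane is close and drifting
    # toward our lane, don't accelerate into the space it is about to take.
    if adjacent_closing and adjacent_gap_m is not None and adjacent_gap_m < desired_gap:
        accel = min(accel, 0.0)

    return max(max_decel, min(max_accel, accel))

# Lead car pulls away, but a car on the right is about to cut in:
print(acc_accel(gap_m=60, own_speed_mps=29, lead_speed_mps=31,
                adjacent_gap_m=20, adjacent_closing=True))   # held at 0.0
print(acc_accel(gap_m=60, own_speed_mps=29, lead_speed_mps=31))  # accelerates (1.5)
```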

Maybe when all of our cars are "talking" to each other, issues like the one I describe can be resolved between the autonomous cars.

I hear that, like me, autonomous cars have difficulty at a four-way stop!
 

·
Senior Member
Joined
·
2,356 Posts
Despite Tesla's marketing efforts, neither AP1 nor AP2 is anything like an autonomous driving system. It's driver assistance only. True autonomous cars, properly tested and certified, won't do things like this. There'll be enough redundancy with radar or lidar sensors to avoid a concrete barrier.
 

·
Premium Member
Joined
·
14,156 Posts
Despite Tesla's marketing efforts, neither AP1 nor AP2 is anything like an autonomous driving system. It's driver assistance only. True autonomous cars, properly tested and certified, won't do things like this. There'll be enough redundancy with radar or lidar sensors to avoid a concrete barrier.
+1 This.

Yeah, AP sucks. While it's marketed (hyped?) as being a lot more, it's basically a gussied-up version of ACC and LKA. The problem is that you need the driver to be involved, but there isn't a good way to let the driver know when they need to be. Something unexpected happens and you're in the ditch, so to speak. Things happen so fast that there is no good way to handle the transition from autopilot to human pilot.

My sense is that at the moment you're safer with ACC and LKA, since with those aids you have to stay engaged. I was excited about Super Cruise, but I wonder if that may not be ready for prime time either. Given that ACC and LKA take a lot of the driving burden off your shoulders, I'm thinking that at the moment less may be more.
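
To put rough numbers on the "things happen so fast" point: at highway speed, even a couple of seconds spent realizing you need to take over covers a lot of road. The 1-3 second takeover times below are assumed ballpark figures, not measurements from any particular car or study.

```python
# Back-of-the-envelope: distance covered while a disengaged driver "takes over".
# The reaction times here are assumed ballpark figures, not measured data.

MPH_TO_MPS = 0.44704

def takeover_distance_m(speed_mph, reaction_s):
    """Distance traveled (meters) during the hand-back from autopilot to human."""
    return speed_mph * MPH_TO_MPS * reaction_s

for reaction_s in (1.0, 2.0, 3.0):           # attentive vs. distracted driver
    d = takeover_distance_m(65, reaction_s)   # typical US highway speed
    print(f"{reaction_s:.0f} s to take over at 65 mph ≈ {d:.0f} m ({d * 3.28:.0f} ft)")
```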
 

·
Administrator
Joined
·
20,192 Posts
"The car is AP1 (first generation Autopilot) and I've never had any problems until today. Autopilot was on and didn't give me a warning," the author wrote. "It misread the road and hit the barrier."

A closer look at the footage shows that the construction site did not include clear road markings to guide traffic onto the median. The original lane markers are still visible, potentially confusing the Autopilot system.
Part of the issue is calling it "Autopilot", which leads some to think it has more capability than it does. "Lane keep assist" seems more accurate. Emphasis on assist.

But it's good to see owners willing to be pioneers for the rest of us, providing us with "don't do this" examples. I wonder what the driver was doing at the time. Reading? Texting? Playing Solitaire on the phone?
 

·
Banned
Joined
·
7,289 Posts
Discussion Starter #9
+1 This.

Yeah, AP sucks. While it's marketed (hyped?) as being a lot more, it's basically a gussied-up version of ACC and LKA. The problem is that you need the driver to be involved, but there isn't a good way to let the driver know when they need to be. Something unexpected happens and you're in the ditch, so to speak. Things happen so fast that there is no good way to handle the transition from autopilot to human pilot.

My sense is that at the moment you're safer with ACC and LKA, since with those aids you have to stay engaged. I was excited about Super Cruise, but I wonder if that may not be ready for prime time either. Given that ACC and LKA take a lot of the driving burden off your shoulders, I'm thinking that at the moment less may be more.
In a ditch... or worse, as we've already seen.

GM delayed the rollout of Super Cruise... probably to do internally the additional testing that Tesla is willing to do with actual customers on public roads. I'm sure Elon got some good data for future AP updates from the above crash, though! He should send a thank-you note to that Tesla owner for being a beta tester.
 

·
Registered
Joined
·
4,405 Posts
In a ditch... or worse, as we've already seen.

GM delayed the rollout of Super Cruise... probably to do internally the additional testing that Tesla is willing to do with actual customers on public roads. I'm sure Elon got some good data for future AP updates from the above crash, though! He should send a thank-you note to that Tesla owner for being a beta tester.
Two thoughts on this: If it were a GM product, there would be law firms fighting over the class action income (sometimes over 80% of the actual settlement dollars).

Second, videos like that are only going to harm self-driving and anti-collision rollouts. When a single company chooses to release beta software on public streets, it may actually harm its long-term goals more than it helps its cash flow.
 

·
Registered
Joined
·
2,784 Posts
It looks like Autopilot kept it in its lane during the hit and kept it from taking out the SUV in the right lane. It would have been more impressive for AP to avoid the obstruction altogether, but I was still impressed with the "after the hit" control.
 

·
Registered
Joined
·
2,130 Posts
Could be many things, I suppose, but the car looks to be "smoking" after bumping the wall. I would totally expect that from a combustion engine, but I have to wonder what exactly is going on here. I understand it's pure speculation, but the smoke does seem to be coming from both sides of the car, as there appear to be two distinct puffs rolling out the rear.
 

·
Registered
Joined
·
4,405 Posts
Could be many things, I suppose, but the car looks to be "smoking" after bumping the wall. I would totally expect that from a combustion engine, but I have to wonder what exactly is going on here. I understand it's pure speculation, but the smoke does seem to be coming from both sides of the car, as there appear to be two distinct puffs rolling out the rear.
The tire and the heat exchanger can both create smoke in a collision while the car is moving, and briefly afterwards. Some smoke might also have come from the airbags if their gas is vented outside.
 

·
Registered
Joined
·
442 Posts
Very impressed with the control the car had, given that the driver was not paying attention. That said, part of the problem is that the road was not marked safely (even for a human driver). You can read posts and details about this accident on other sites; it was a poorly executed construction zone in Dallas. Still, it comes down to driver error, as the car was only assisting.
 

·
Registered
Joined
·
4,405 Posts
Look forward at least 1/4 mile. Know which lane is the best lane. Know what the cars around you are doing. If that transition had been a broken-down car instead, the outcome would have been much worse: the pickup in front would have avoided it, and the Tesla would have hit it at full speed, since both the driver and the sensors ignored the lead vehicle and the right-side lane marking entirely.

This is what I see as the major issue with autonomous systems. They currently do not anticipate potential threats, only active threats. If all you do when you drive is react to active threats, you will eventually crash, because the number of active threats is related to the number of potential threats. The more potential threats you address, the fewer active threats you will encounter.

In this case, the potential threats were tailgating, speed, incorrect lane choice, and brightly colored warning signs. Reacting to any one of those risk factors would probably have stopped an active situation from developing. Reacting to all of them would have reduced the risk to almost zero.
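
As a toy illustration of that potential-vs-active-threat idea (not how any shipping driver-assist system actually works; the threat names, weights, and threshold are invented for the example), a planner could score the potential threats and back off before any of them turns into an active one:

```python
# Toy illustration of weighting *potential* threats, not just active ones.
# The threat names, weights, and threshold are invented for this example.

POTENTIAL_THREAT_WEIGHTS = {
    "tailgating":          0.3,   # too close to the vehicle ahead
    "excess_speed":        0.3,   # too fast for conditions (rain, construction)
    "poor_lane_choice":    0.2,   # staying in a lane that is about to shift or close
    "construction_signs":  0.2,   # bright warning signs / barrels ahead
}

def risk_score(observed_threats):
    """Sum the weights of the potential threats currently observed (0..1)."""
    return sum(POTENTIAL_THREAT_WEIGHTS.get(t, 0.0) for t in observed_threats)

def planned_response(observed_threats, caution_threshold=0.4):
    """Slow down or change lanes *before* anything becomes an active threat."""
    score = risk_score(observed_threats)
    if score >= caution_threshold:
        return f"risk {score:.1f}: reduce speed, increase gap, move out of the closing lane"
    return f"risk {score:.1f}: maintain course"

# The situation in the video: tailgating, speed, wrong lane, and warning signs all present.
print(planned_response(["tailgating", "excess_speed", "poor_lane_choice", "construction_signs"]))
print(planned_response(["excess_speed"]))
```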
 

·
Registered
Joined
·
3,533 Posts
Wait'll they become self-aware. The Teslanator!
 

·
Registered
Joined
·
4,405 Posts
Wait'll they become self-aware. The Teslanator!
What's cool is since Tesla controls all the service, if they need more traffic in the repair business, they simply send out an update to the AP to enhance service and parts income. :D
 

·
Registered
Joined
·
3,717 Posts
This really doesn't surprise me. "I told you so" is just too easy. Most drivers don't have to hit a construction barrier to learn they're not supposed to.
 

·
Registered
Joined
·
2,359 Posts
Here’s what I see in that video, and I’m trying to be as neutral as I can.

The vehicle strikes the “Jersey wall” at the lane shift. I do not know unequivocally whether the car was self-driving, manually piloted, or some combination. I do know that those lanes were not legally/correctly marked for the circumstances, and it was raining. So I’m not going to make a judgment about the cause of the collision. I am, however, going to make some observations about what happens AFTER the collision.

Upon impact, the vehicle is physically thrown to the right, its emergency flashers instantly activate, and the vehicle recovers, without over-correcting, and begins a completely controlled and gradual deceleration.

This is absolutely astounding for a number of reasons.

If you pay close attention you can see the uninterrupted skid mark, off center from the rear wheel. This tells us that, at a minimum, the driver’s-side front wheel was locked up by the collision and was just sliding. That means this car maintained control, and did not over-correct, fishtail, or strike other vehicles, with one of its steering/control wheels out of commission and dragging like an anchor. I can tell you for a fact that 90% of human drivers would have done one, or a combination, of the following things…
  • Spun out.
  • Crossed over in front of the Jeep (which probably would then have struck the Tesla).
  • Over-corrected back into the left barrier.
  • Wobbled all over the place.
  • Jammed on the brakes abruptly and been rear-ended by the car with the dash cam.

In addition, full telemetry from the car and “machine learning” models mean that the minute Tesla’s programmers and engineers figure out why this happened, ALL Tesla cars will know about it, and such a thing will probably never happen again.

So perhaps the Tesla is responsible for the collision in the first place, I don't know, but I do know that from the moment of impact on, the car handled things better than any human could have.
 

·
Banned
Joined
·
7,289 Posts
Discussion Starter #20 (Edited)
For those hoping Autopilot 2.0 will be an improvement... who knows, maybe it eventually will be, but right now it's nowhere near AP 1.0.

Frankly, I'm shocked Tesla even released "local Autosteer" to the public in its current state. That video makes a drunk driver look good!
In case you're wondering whether darkness was an issue, the same guy did the same run in daylight, and it wasn't any better.

 