This one happened July 1st in Pennsylvania. Driver said the "AutoPilot" Mode was engaged during crash. Thankfully everyone appears to be okay. Link to preliminary report.
The other thread kinda took a snarky downturn, but you can reference it here: discussion about the first self-driving fatality accident
I see 2-3 Teslas per day on my commute. I wonder how many of them are running AutoPilot?
T.J.
UltimaDork
7/6/16 8:28 a.m.
I don't mind being an early adopter for some things, but being a volunteer beta tester for a technology that has a high likelihood of injuring myself or others when it goes wrong seems just plain stupid to me. If this grand experiment wasn't tied to Elon Musk, I doubt it would be allowed to continue unabated.
I think a big hurdle for self-driving cars is construction zones: weird, and often confusing, lane shifting on multi-lane highways. Here in Indiana, they will often grind off (most of) the lane marking paint from the road and squeeze traffic tightly to make room for construction. How will the auto-pilots handle this without incident?
Summer of the Autopilot Car Accidents!!
All I can think of is that every Tesla crash from now on will be blamed on Autopilot, even if the car bursts into flames while parked.
PHeller
PowerDork
7/6/16 10:10 a.m.
How many Falcons blew up on the barge?
It's only a matter of time before they get it right.
T.J. wrote:
I don't mind being an early adopter for some things, but being a volunteer beta tester for a technology that has a high likelihood of injuring myself or others when it goes wrong seems just plain stupid to me. If this grand experiment wasn't tied to Elon Musk, I doubt it would be allowed to continue unabated.
It's a Level 2 autonomous vehicle. "The car can act autonomously but requires the full attention of the driver, who must be prepared to take control at a moment's notice"
It's not berkeleying autopilot. I think both sides are to blame, but the driver certainly isn't being experimented on. They are supposed to be 100% focused on driving, whether they are pushing the pedals and turning the wheel or the car is.
Our friends at Motor Trend just released a piece on the subject.
PHeller wrote:
How many Falcons blew up on the barge?
Small difference, just to play devil's advocate: When a Falcon malfunctions, I'm not a possible victim.
With all technology comes risk, however. How much breakage is acceptable?
David S. Wallens wrote:
...With all technology comes risk, however. How much breakage is acceptable?
Practically, as noted in the other thread, they only have to be better than people.
Realistically, because of the nature of perceived risk (e.g. why air travel is more feared than car travel despite being far safer), they have to be almost perfect. People would rather be in control and under risk than be out of control and under much less risk.
"Autopilot" makes me a bit nervous. I figure about 75% of the people on the road during my commute would use it and then text or whatever it is they are constantly doing with their phones instead of paying attention. How long will it be until one Autopiloted car crashes into another?
Hal
UltraDork
7/6/16 9:01 p.m.
David S. Wallens wrote:
With all technology comes risk, however. How much breakage is acceptable?
Florida crash killed the driver of the "autopilot car", no other injuries.
Pennsylvania crash seemed to have no serious injuries.
The first time an "autopilot car" crashes into another vehicle, killing or seriously injuring the occupants of that vehicle, it is going to set the technology back for years.
In reply to aircooled:
Yup, you will not see widespread acceptance (or perhaps even legalization) of autonomous vehicles until the average person can't recall the last time somebody got killed by one.
In reply to aircooled:
David S. Wallens wrote: ...With all technology comes risk, however. How much breakage is acceptable?
Practically, as noted in the other thread, they only have to be better than people.
Realistically, because of the nature of perceived risk (e.g. why air travel is more feared than car travel despite being far safer), they have to be almost perfect. People would rather be in control and under risk than be out of control and under much less risk.
Excellent point, well said. I wouldn't say that I have a fear of flying, but I do not enjoy the helpless feeling. My brain knows that the chances of anything bad happening are very small, but I also know that if something does go wrong, there is zero I can do about it.
To me, autonomous cars are similar. Even if they could reach airline levels of safety, I think I'd feel the same way if every once in a while a car would fail and cause a fatal accident.
The difference is, I really don't have a choice about flying. If I need to be somewhere within a certain amount of time, I have to fly. But I don't need an autonomous car.
I enjoy driving too much to let a computer do it for me.
BrokenYugo wrote:
In reply to aircooled:
Yup, you will not see widespread acceptance (or perhaps even legalization) of autonomous vehicles until the average person can't recall the last time somebody got killed by one.
They won't have to be that rare. They'll have to be about as uncommon as a plane crash or an elevator/escalator accident, which will be achievable within 10-15 years.
Also much of the unreasonable fear about flying comes from the incredibly high speeds and altitudes involved, which won't be a factor with autonomous cars. The safety of the ground is still just as close by as it is in manned cars. In a plane, what amounts to a climate control system failure could kill you.
In reply to GameboyRMH:
Don't forget about the snakes on the plane....