dyintorace
dyintorace GRM+ Member and PowerDork
7/1/16 7:29 a.m.

This article appeared in our paper today. Williston is a small town just southwest of Gainesville. Sad circumstances. Sounds like a terrible accident.

Wonder if this will have any impact on the autonomous craze.

Tesla, maker of high-tech electric cars, announced Thursday that a Williston crash in May that resulted in the death of the driver was the first known fatality involving its Autopilot self-driving technology.

Tesla notified the National Highway Traffic Safety Administration of the crash and the agency is now investigating, the company said on its website.

“What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor,” the website post said. “The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S.”

Had the car hit the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents, the company added.

The Florida Highway Patrol reported the accident occurred May 7 at 3:45 p.m. on U.S. 27A in Levy County. Joshua Brown, 45, of Canton, Ohio, died in the crash.

Brown was a former Navy SEAL who owned a technology company, according to an obituary posted online by the Murrysville Star in Pennsylvania.

A tractor-trailer driven by Frank Baressi, 62, of Tampa, was westbound on U.S. 27A in a left turn lane at 140th Court while Brown was eastbound on U.S. 27A.

As the truck was turning left, Brown’s Tesla Model S hit the underside of the trailer, left the road and hit two fences and a power pole. Brown, who was wearing a seat belt, died at the scene.

By the time firefighters arrived, the wreckage of the Tesla — with its roof sheared off completely — was hundreds of feet from the crash site where it had come to rest in a nearby yard, assistant chief Danny Wallace of the Williston Fire Department told The Associated Press.

Tesla said the death was the first known fatality in more than 130 million miles of travel in which Autopilot was in use. Tesla said Autopilot is not the default system in the cars and must be activated by the driver.

When it is activated, an acknowledgement box explains that Autopilot is an assist feature and that the driver must keep hands on the steering wheel at all times.

“We do this to ensure that every time the feature is used, it is used as safely as possible. As more real-world miles accumulate and the software logic accounts for increasingly rare events, the probability of injury will keep decreasing. Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert,” the company's statement read.

Tesla's post said company officials were saddened by Brown’s death.

“He was a friend to Tesla and the broader (electric vehicle) community, a person who spent his life focused on innovation and the promise of technology and who believed strongly in Tesla’s mission. We would like to extend our deepest sympathies to his family and friends,” the statement read.

Greg Voth
Greg Voth Dork
7/1/16 7:38 a.m.

Those are tough accidents. At my last job I did several on-scene investigations of what sounds like that exact scenario.

It's not like the trucks can just jump out of the way, although they bear part of the blame for failing to make sure they can clear the intersection. It seems to be weather (early morning fog, usually) or inattentiveness on the car's part.

I am very interested in full autopilot, as my work productivity could almost double. Last month I did just under 3,000 reimbursable miles plus my normal commuting.

alfadriver
alfadriver MegaDork
7/1/16 8:09 a.m.
dyintorace wrote: Wonder if this will have any impact on the autonomous craze.

Of course it will- how could it not?

The most important thing is that it will force OEMs to raise their game a lot to gain the trust of more consumers. And that is a good thing- it will make the process better and more robust.

But it's also fair to note that Tesla is taking a different path than most of the OEMs- Tesla and Apple (and maybe even Google) are taking the direct path to autonomous cars.

Whereas the rest of the industry is taking smaller steps- getting cars to communicate with each other and with ground stations, so that warning systems can be put in place to tell drivers that something is going on.

Theoretically, in this case, had the truck and car talked, the truck driver would have known the car was coming, and the truck would also have told the Tesla that it was pulling out into traffic. So both drivers would have had a warning of something happening.
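
To illustrate the idea, here's a minimal sketch of a hypothetical broadcast format. Real V2V work uses standardized safety messages; every name and field below is invented for the example:

```python
from dataclasses import dataclass

# Hypothetical vehicle-to-vehicle status broadcast, sent several times
# per second. The fields and names here are invented for illustration.
@dataclass
class SafetyMessage:
    vehicle_id: str
    heading_deg: float   # direction of travel, 0-360
    speed_mps: float
    intent: str          # e.g. "straight" or "turning_left"

def crossing_conflict(own: SafetyMessage, other: SafetyMessage) -> bool:
    """Warn when an oncoming vehicle declares a turn across our path."""
    diff = abs(own.heading_deg - other.heading_deg) % 360
    oncoming = 90 < diff < 270  # roughly opposing directions
    return oncoming and other.intent == "turning_left" and own.speed_mps > 0

# Eastbound car at speed, westbound truck declaring a left turn:
car = SafetyMessage("tesla", heading_deg=90, speed_mps=29, intent="straight")
truck = SafetyMessage("semi", heading_deg=270, speed_mps=2, intent="turning_left")
print(crossing_conflict(car, truck))  # True: both drivers get a warning
```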

When that system is in place widely enough, cars will become more self-driving. And even then it would come in stages- instead of any divided highway, it would start on limited-access freeways and then progress down to local roads.

This just reminds us of the scale of the issue, and of the safety standards that need to be met.

Sad for the guy, but I don't see how this could have been avoided at some point. If not this guy, it would have been someone else...

GameboyRMH
GameboyRMH GRM+ Member and MegaDork
7/1/16 8:17 a.m.

Sounds like the software assumes the car can fit under anything taller than the hood of the car...surely they had to set an "allowable height" so that the car wouldn't treat tunnel entrances and parking garages like solid walls, but they set it too low and didn't require the entrance to be stationary...big mistake. V2V comms are not necessary to prevent this.
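
To make that concrete, here's a toy version of the kind of clearance check I mean. This is purely a guess at the logic, not Tesla's actual code:

```python
VEHICLE_HEIGHT_M = 1.45   # roughly a Model S roof height
SAFETY_MARGIN_M = 0.3     # invented margin for the example

def overhead_gap_is_drivable(clearance_m: float, is_stationary: bool) -> bool:
    """Toy clearance check. A safe rule needs BOTH conditions:
    the gap must clear the whole car (not just the hood), and a
    moving 'overhead object' (like a crossing trailer) must never
    be treated as a tunnel or overpass."""
    return clearance_m > VEHICLE_HEIGHT_M + SAFETY_MARGIN_M and is_stationary

# Guessed failure mode: a threshold near hood height plus no stationary
# check would classify a crossing trailer as open road.
print(overhead_gap_is_drivable(1.0, is_stationary=False))  # False: blocked
print(overhead_gap_is_drivable(4.5, is_stationary=True))   # True: an overpass
```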

It will have no effect on the autonomous "craze" though. In the US alone, around 91 more people will die in car accidents today, probably no others will be in autonomous cars.

This is a hint of what's to come in the news though. When all cars are autonomous, a car accident will be as newsworthy as a plane crash (or an elevator accident).

Apexcarver
Apexcarver PowerDork
7/1/16 8:20 a.m.

http://electrek.co/2016/07/01/truck-driver-fatal-tesla-autopilot-crash-watching-movie/

In a development regarding the fatal accident in Florida where a Tesla Model S on Autopilot crashed into the trailer of a truck last month, the truck driver in question, Frank Baressi, now claims that the Tesla driver was watching a movie while traveling on the highway using the Autopilot.

Driver was trusting a public "beta" that cautions you to have your hands on the wheel and be ready to take over at any time...

T.J.
T.J. UltimaDork
7/1/16 8:24 a.m.

This is the type of thing I was worried about in the thread last week when I said either give me a fully autonomous car or a fully manual one. I don't want to share my driving with some code and sensors. It's a bad situation when two entities each think they are in control, or each thinks the other is in control and maybe neither actually is.

Sad story and more than likely a gruesome accident scene, but sadly predictable and preventable.

Lof8
Lof8 GRM+ Member and HalfDork
7/1/16 8:25 a.m.

There are more factors and dangers in everyday driving than vehicle sensors and a computer can account for. I will forever trust myself over autopilot in a car.

jimbob_racing
jimbob_racing Dork
7/1/16 8:39 a.m.
alfadriver wrote: Whereas the rest of the industry is taking smaller steps- getting cars to communicate with each other and with ground stations, so that warning systems can be put in place to tell drivers that something is going on. Theoretically, in this case, had the truck and car talked, the truck driver would have known the car was coming, and the truck would also have told the Tesla that it was pulling out into traffic. So both drivers would have had a warning of something happening. When that system is in place widely enough, cars will become more self-driving. And even then it would come in stages- instead of any divided highway, it would start on limited-access freeways and then progress down to local roads.

Nobody's going to give me a free install of a communication system on my '75 Datsun, my 2008 Honda, or the Amish guy's horse and buggy, for that matter. Self-driving cars need to be smart enough to not hit ANYTHING. I don't see that happening, because there are too many variables, and I doubt anybody is going to accept a supposedly autonomous car that regularly drops back into manual mode when it gets confused.

nderwater
nderwater PowerDork
7/1/16 8:39 a.m.
Lof8 wrote: There are more factors and dangers in everyday driving than vehicle sensors and a computer can account for. I will forever trust myself over autopilot in a car.

I hear you, but last year alone 38,300 people were killed and 4.4 million injured on U.S. roads. That averages 105 people killed and 12,000 people injured every single day while 'trusting themselves' and other human drivers. From a pure risk and statistics standpoint, we're all better off letting the computers make the mistakes instead of the drivers.
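
For anyone checking the math, the daily figures are just those annual totals spread across the year:

```python
killed_per_year = 38_300
injured_per_year = 4_400_000

print(round(killed_per_year / 365))   # 105 deaths per day
print(round(injured_per_year / 365))  # ~12,000 injuries per day
```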

KyAllroad
KyAllroad UltraDork
7/1/16 8:40 a.m.

Interesting that the crash occurred back in early May and we're just now hearing about it.

KyAllroad
KyAllroad UltraDork
7/1/16 8:41 a.m.
nderwater wrote:
Lof8 wrote: There are more factors and dangers in everyday driving than vehicle sensors and a computer can account for. I will forever trust myself over autopilot in a car.
I hear you, but last year alone 38,300 people were killed and 4.4 million injured on U.S. roads. That averages 105 people killed and 12,000 people injured *every single day* while 'trusting themselves' and other human drivers. From a pure risk and statistics standpoint, we're all better off letting the computers make the mistakes instead of the drivers.

sarcasm font: but when 90% of drivers are "above average" how can that be?? end sarcasm

alfadriver
alfadriver MegaDork
7/1/16 8:48 a.m.
jimbob_racing wrote:
alfadriver wrote: Whereas the rest of the industry is taking smaller steps- getting cars to communicate with each other and with ground stations, so that warning systems can be put in place to tell drivers that something is going on. Theoretically, in this case, had the truck and car talked, the truck driver would have known the car was coming, and the truck would also have told the Tesla that it was pulling out into traffic. So both drivers would have had a warning of something happening. When that system is in place widely enough, cars will become more self-driving. And even then it would come in stages- instead of any divided highway, it would start on limited-access freeways and then progress down to local roads.
Nobody's going to give me a free install of a communication system on my '75 Datsun, my 2008 Honda, or the Amish guy's horse and buggy, for that matter. Self-driving cars need to be smart enough to not hit ANYTHING. I don't see that happening, because there are too many variables, and I doubt anybody is going to accept a supposedly autonomous car that regularly drops back into manual mode when it gets confused.

Free, no. But the communication devices will be available to the general public.

But I'm just illustrating the steps most OEMs are taking for autonomous cars.

Like it or not, that's how it's progressing.

T.J.
T.J. UltimaDork
7/1/16 9:05 a.m.

So, if I were Tesla's lawyers, I would first blame the driver for not paying attention, and then I would try to make the case that it was just a terrible accident that would've ended with the same result if no Autopilot had been in use at all. Make the case that the computer did just as well as a human alone could have. Certainly there are enough cars driving under semi trailers already, with deaths resulting. Blame it on the lack of side lights along the length of the trailer, or something like that.

JoeTR6
JoeTR6 HalfDork
7/1/16 9:07 a.m.

On Wednesday I almost had a head-on offset crash with a huge paving truck at 35 MPH. We were on a narrow 2-lane road entering a corner where visibility was limited due to vegetation. I got about 30 degrees through the turn and saw the truck coming at me, straddling the line. He was in the middle of the road (overcooked the corner) and, if anything, was still swinging out wider. What saved my bacon was knowing this road, and knowing that the shoulder, though grassy, wasn't a steep drop and had 3 to 4 feet of clearance. Autocross skills engaged: I braked, dropped the right side into the grass while mostly letting off the brakes, and threaded the needle.

I'm curious how an autonomous car would have handled this situation. IMO, you'd be nuts to trust any AI system on twisty narrow roads, and unexpected situations occur frequently where knowledge/intelligence/intuition are more helpful than sensor input and fixed algorithms. I'm really worried that this type of system will make drivers even more lazy and ignorant. OTOH, it could be argued that the stupid meatsack driving the truck was the biggest part of the problem.

Duke
Duke MegaDork
7/1/16 9:07 a.m.

I'm curious about how the truck driver knew the Tesla guy was watching a movie.

mtn
mtn MegaDork
7/1/16 9:33 a.m.
JoeTR6 wrote: On Wednesday I almost had a head-on offset crash with a huge paving truck at 35 MPH. We were on a narrow 2-lane road entering a corner where visibility was limited due to vegetation. I got about 30 degrees through the turn and saw the truck coming at me, straddling the line. He was in the middle of the road (overcooked the corner) and, if anything, was still swinging out wider. What saved my bacon was knowing this road, and knowing that the shoulder, though grassy, wasn't a steep drop and had 3 to 4 feet of clearance. Autocross skills engaged: I braked, dropped the right side into the grass while mostly letting off the brakes, and threaded the needle. I'm curious how an autonomous car would have handled this situation. IMO, you'd be nuts to trust any AI system on twisty narrow roads, and unexpected situations occur frequently where knowledge/intelligence/intuition are more helpful than sensor input and fixed algorithms. I'm really worried that this type of system will make drivers even more lazy and ignorant. OTOH, it could be argued that the stupid meatsack driving the truck was the biggest part of the problem.

But if the cars could "talk" to each other, they'd each react appropriately.

rob_lewis
rob_lewis SuperDork
7/1/16 9:46 a.m.

My only concern with this, as the report kinda seems to indicate, is that drivers will stop paying attention as more autonomy becomes available. The problem is that autonomy will come in small batches. It started with backup cameras, then lane departure warnings, then automatic braking, etc.
How many commercials have you seen over the past year touting a car's ability to stop when you're not paying attention? Now compare that with how many car commercials focus more on entertainment than on the driving aspects of the car.

I know of several people (I refuse to ride with them, and I TRY to teach them) who will simply look at the backup camera when reversing. No mirrors, no looking out any windows. Just focus on the camera. "If I'm not supposed to use it, why is it there?" I'm sure this same belief and trust extends to lane departure ("I don't need to check my blind spot, the car will tell me") and automatic braking ("I can text on my phone, because if I miss something, the car will stop").

It sounds like (and I'll admit I don't know all the details) the driver of the Tesla perhaps trusted the Autopilot a little too much. I can also see how, if he'd had the car for some time and built trust in it, he would logically think it could handle any situation. Basically, a false sense of security.

I think driver aids are fantastic. I think they WILL save lives in due time. But today's drivers are already distracted, and the sheer act of driving is viewed as a chore rather than something to enjoy. As a result, drivers will want the car to do more for them and will trust those systems beyond their actual capabilities.

It's an interesting point to be at today. Do we add aids that we know will reduce accidents, and risk drivers trusting them beyond what they're capable of doing? Or do we remove all aids and trust drivers to pay full attention to driving?

-Rob

GameboyRMH
GameboyRMH GRM+ Member and MegaDork
7/1/16 9:59 a.m.
rob_lewis wrote: I know of several people (I refuse to ride with them, and I TRY to teach them) who will simply look at the backup camera when reversing. No mirrors, no looking out any windows. Just focus on the camera. "If I'm not supposed to use it, why is it there?"

Depending on the car, this might not be a bad idea; some have multiple wide-angle cameras that give you a better view than you can get from the mirrors.

GameboyRMH
GameboyRMH GRM+ Member and MegaDork
7/1/16 10:34 a.m.

There's a little more detail here. It looks like the fact that the trailer was white is relevant; Tesla says there was a contrast problem in identifying the trailer:

http://www.thetruthaboutcars.com/2016/07/tesla-autopilot-crash-victim-identified-ex-navy-seal-trucker-spotty-record-claims-victim-watching-movie/

There may not be a problem with the "height limit" after all; the problem may be that they're relying partially on optical sensors to detect how tall objects are. They'll need both LIDAR and optical sensing that can detect objects as tall as the vehicle, and maybe an infrared camera as well?
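
Crudely, the redundancy idea looks like this: any one sensor can veto "clear." This is just my own sketch; the names, thresholds, and sensor mix are invented, not how any shipping system works:

```python
from typing import Optional

VEHICLE_HEIGHT_M = 1.45  # invented figure for the example

def path_is_clear(camera_sees_obstacle: bool,
                  lidar_lowest_return_m: Optional[float]) -> bool:
    """Redundant clearance check: any one sensor can veto 'clear'.

    camera_sees_obstacle: optical detection, which can fail on low
        contrast (a white trailer against a bright sky).
    lidar_lowest_return_m: height of the lowest LIDAR return over the
        lane ahead, or None if nothing was detected.
    """
    if camera_sees_obstacle:
        return False
    if (lidar_lowest_return_m is not None
            and lidar_lowest_return_m < VEHICLE_HEIGHT_M + 0.3):
        return False
    # Clear only when no sensor objects: a washed-out camera frame can
    # no longer declare the road open on its own.
    return True

# Camera misses the white trailer, but LIDAR returns at ~1.2 m veto it:
print(path_is_clear(camera_sees_obstacle=False, lidar_lowest_return_m=1.2))  # False
```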

gearheadmb
gearheadmb HalfDork
7/1/16 10:35 a.m.

In reply to JoeTR6:

I've wondered about this also. I've been in situations where I've been forced off the road at speed and had to dodge poles/mailboxes/culverts. I've been lucky. But it comes down to assessing the environment, deciding on a plan, and making the vehicle conform to the plan. In typical driving situations a computer is going to be faster at this, but I don't know how well a computer could handle a situation like this. You just can't program for everything; human reason has to be used. You understand the difference between hitting a mailbox, an electric pole, or a telephone box. You can choose to hit the mailbox if the side ditch drops off farther to the right. You can try your luck with the dropoff if it's an electric pole instead of a mailbox. What is the autonomous car going to do?
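
One way a computer could mimic that judgment is a "least bad option" ranking. Here's a toy sketch with made-up harm scores, nothing like a real motion planner:

```python
# Invented harm scores: higher means a worse thing to hit.
OBSTACLE_COST = {
    "mailbox": 1,        # frangible, cheap
    "telephone_box": 3,
    "steep_dropoff": 6,
    "electric_pole": 9,  # rigid, likely fatal at speed
}

def pick_evasion(options):
    """Choose the escape path with the lowest expected harm."""
    return min(options, key=lambda o: OBSTACLE_COST.get(o, 10))  # unknown = assume bad

print(pick_evasion(["electric_pole", "mailbox"]))        # -> mailbox
print(pick_evasion(["electric_pole", "steep_dropoff"]))  # -> steep_dropoff
```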

Keith Tanner
Keith Tanner GRM+ Member and MegaDork
7/1/16 11:01 a.m.

As with every accident, hero drivers will always claim they could have done better. This will not change: the 98% of drivers who are better than 98% of other drivers will always say they could have avoided the accident. The goal of autonomous cars (and driving aids) is to improve the odds overall. Sure, it'll be possible to second-guess cherry-picked accidents, but overall I'll take my chances with a well-programmed system that's always paying attention over someone who is lurching from accident to accident while drunk and dicking with their phone.

Sounds like the driver of this car liked to test the limits of his system, and had posted a video to YouTube of his car ("Tessy") saving his life once before when a truck cut him off. "Autopilot" was an unfortunate name for the Tesla system, as it means different things to the general public (the car, it does everything!) and to pilots (cruise control for the sky). The owners interpret it as full autonomy.

Joe Gearin
Joe Gearin Associate Publisher
7/1/16 11:24 a.m.

I think we all agree that most drivers are incompetent, and autonomous cars will save many of those lives. But what about those of us who actually care about driving and pay attention to what we're doing?

I don't hate the idea of autonomous cars for the masses. What I am concerned about is the govt. restricting "manually driven" cars to improve safety. I'm no hero, but I pay attention to what I'm doing behind the wheel. I've driven myself for 30 years now, over hundreds of thousands of miles. I trust my competence behind the wheel far more than I'd trust any computer-driven car. Will my driving be restricted in the next 20 years? Will I be forced to ride in a "pod" for safety's sake? That's what concerns me.

Let the "non-car enthusiast" masses have their transportation pods. I want to drive, I love to drive, driving is one of my favorite things to do. I'm more and more concerned that the safety nannies are going to try to take this away from me. Am I being paranoid?

Keith Tanner
Keith Tanner GRM+ Member and MegaDork
7/1/16 11:30 a.m.

The best analogy I've seen is the relationship between the horse and the car. Riding horses used to be something you had to do to get around. The car took them out of the equation, so horse riding became recreational, with dedicated riding areas. You no longer had to ride a horse when you didn't want to; it became something you did because you wanted to.

GameboyRMH
GameboyRMH GRM+ Member and MegaDork
7/1/16 11:34 a.m.
Keith Tanner wrote: The best analogy I've seen is the relationship between the horse and the car. Riding horses used to be something you had to do to get around. The car took them out of the equation, so horse riding became recreational, with dedicated riding areas. You no longer had to ride a horse when you didn't want to; it became something you did because you wanted to.

Likewise, I think in most places, eventually driving manually-controlled cars in the middle of the street will be illegal, and they'll have to travel on the soft shoulder with a Slow Moving Vehicle sign on the back. I'd guess that's at least 25 years away though.

Javelin
Javelin GRM+ Member and MegaDork
7/1/16 11:39 a.m.

In reply to Joe Gearin:

Have you not seen the anime eX-Driver?
