Keith Tanner
Keith Tanner GRM+ Memberand MegaDork
9/1/20 11:25 a.m.

Devil's advocate: how is Tesla testing Autopilot any different from a young driver? Or an impaired or distracted one? Why do we need to alert the world that there is a silicon computer in charge of that car instead of a meat one?

If we get to the point where even a Level 3 autonomous car is safer (on average) than a human driver, do we need to start adding flashing lights to cars that are being manually driven?

mtn (Forum Supporter)
mtn (Forum Supporter) MegaDork
9/1/20 11:52 a.m.
Keith Tanner said:

Devil's advocate: how is Tesla testing Autopilot any different from a young driver? Or an impaired or distracted one? Why do we need to alert the world that there is a silicon computer in charge of that car instead of a meat one?

If we get to the point where even a Level 3 autonomous car is safer (on average) than a human driver, do we need to start adding flashing lights to cars that are being manually driven?

Here is the problem: there is no such thing as "good enough" when it comes to human lives. We can't get there for anything. But if you say that autonomous cars are not good enough and will never be good enough, we will never move forward with them. Which is silly, as they're already better than humans. 

And what about situations like the Uber autonomous vehicle? Here was our 9-page-long thread on it. In my non-expert opinion, in that situation, there is an extremely low chance a human driver would have been able to avoid the pedestrian who walked out into the middle of the road at night, nowhere near a sidewalk. However, again in my non-expert opinion, there is a high chance that a human driver would have made an attempt - and the autonomous vehicle did not, but should have. So who is at fault here? The city/municipality for not putting a crosswalk there? The pedestrian for crossing there? The vehicle for not catching the pedestrian? The driver for not seeing them either? 

 

Everyone needs to remember that multiple things can be true without contradicting each other. 

  1. Based on the data we have available, autonomous vehicles are safer than human-operated vehicles, at least in the US, where it is super easy to get a driver's license
  2. Autonomous vehicles are not perfect and need to be improved
  3. A single death from an autonomous car is too many from a moral standpoint
  4. A single death from a manned vehicle is too many from a moral standpoint
  5. We will never eliminate either (3) or (4), because it would be far too expensive and/or restrictive

And hey, if you don't like driving on the roads with them... Then don't. Your choice to drive on the roads, isn't it?

STM317
STM317 UberDork
9/1/20 12:35 p.m.
Keith Tanner said:

Devil's advocate: how is Tesla testing Autopilot any different from a young driver? Or an impaired or distracted one?

A young or distracted/impaired driver isn't being marketed as a safety feature? If you market your tech as a safety feature, and in reality it's like a more experienced driver turning the wheel over to a dangerous driver, that seems like a net loss to me. If all vehicles were suddenly using Autopilot, and that was equivalent to every driver suddenly becoming a teen or a drunk, that's bad news.

The goal should be reducing the number of dangerously operated vehicles on the road, not adding to them with tech that is under-developed or easily defeated and abused, right? My thinking is that if there is a baseline number of dangerous drivers on the road, and we then add Teslas under Autopilot to that, then the raw number of dangerous vehicles on the road just increased by however many Teslas are on Autopilot, didn't it?

 

Keith Tanner
Keith Tanner GRM+ Memberand MegaDork
9/1/20 12:55 p.m.

So as soon as autonomous vehicles become safer on average than manually driven ones, we should ban all manually driven cars.

I don't know if the current FSD (not Autopilot, that's different as noted) is safer or more dangerous than the average driver. There's likely quite a bit of debate about that, and I'll bet an interested party could come up with stats showing both yes and no. But those distracted and young drivers are part of the average today, so we're not comparing it to a hyper-vigilant, well-rested driver with above-average reflexes and skill like all of us.

RevRico
RevRico GRM+ Memberand PowerDork
9/1/20 12:57 p.m.

If they were so dangerous we would hear these stories every day. This is the first Tesla related fatality I've heard about in months.

What I do hear about every day are multiple accidents involving old, young, drunk, or distracted drivers. So much so that even when the police scanners report fatalities, they don't get a blurb in the paper unless a kid was involved or there was a great deal of property damage.

Clearly, no matter how much safer we make manually operated vehicles, our production of distractions and just plain stupid people continues to outpace everything else.

Since it's apparently illegal and "immoral" to let Darwinism weed out the useless and just plain stupid, no matter how good it would be for the species as a whole, we all need to accept that there will be a period of coexisting on everyone's part as the safer technology evolves. Meanwhile, people will cling to the belief that they are the ultimate driving machine, which enables them to text, drink coffee, watch a show, and shave for work while behind the wheel of a car. There will be bumps in the road, some caused by software, some caused by humans.

Death comes for everyone, much like E36 M3ting, it's one of the few equalisers of the planet. Nobody gets out of life alive, and when your time comes, it's your time, no matter your age, race, religion, intelligence, whatever. Everybody poops, everybody dies.

Or, just make licences harder to get and mandate recertification at renewal time, but that would just make sense, so we know that's impossible.

Keith Tanner
Keith Tanner GRM+ Memberand MegaDork
9/1/20 1:00 p.m.

I had a girlfriend in high school who felt that Darwinism was underutilized. She's a doctor now.

STM317
STM317 UberDork
9/1/20 1:06 p.m.

In reply to Keith Tanner :

I guess what I'm arguing is that right now, ability and willingness to make good choices behind the wheel varies. Software/AI isn't going to vary much, if at all. If Autopilot or whatever software brings everybody to the level just above "average" it would help the lower half who are poor drivers or those who make bad choices (distracted/impaired driving) and simultaneously be a reduction in safety for the top half of drivers that make better choices behind the wheel. You'll increase safety for some while reducing it for others if every vehicle is suddenly operated just slightly better than if an "average" driver were behind the wheel.
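That trade-off can be sketched numerically. A minimal simulation with entirely invented numbers (the crash-rate distribution and the "10% better than the mean" figure are placeholders for illustration, not real data):

```python
import random

random.seed(0)

# Hypothetical: 10,000 drivers with varying crash rates per million miles
# (all numbers invented for illustration, not real crash data).
drivers = [max(0.1, random.gauss(1.0, 0.5)) for _ in range(10_000)]
human_total = sum(drivers)

# Software fixed at "just above average": 10% better than the mean driver.
software_rate = 0.9 * (human_total / len(drivers))

# Fleet-wide, total expected crashes drop...
fleet_improves = software_rate * len(drivers) < human_total

# ...but most of the safest half of drivers would see their own risk go UP.
safest_half = sorted(drivers)[: len(drivers) // 2]
made_worse = sum(1 for rate in safest_half if rate < software_rate)

print(fleet_improves)                 # True: an overall win on average
print(made_worse / len(safest_half))  # a majority of the safest half lose out
```

The fleet total falls even though most of the best drivers individually get worse, which is exactly the tension being argued here.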

I do think that there are systems being used that I would trust to be much better than an average human. And to be fair, Tesla's setup may be as well when it's not being abused or used unsafely. The real issue is that it appears to be pretty easy to abuse or use unsafely when compared to other systems.

Keith Tanner
Keith Tanner GRM+ Memberand MegaDork
9/1/20 1:23 p.m.

That's the problem with risk assessment, and why I kept stressing "on average". If the autonomous car is better on average, then it's an overall win. But it will be at the cost of those at one end of the scale. Will we accept that, or will autonomous cars only be accepted when they are better than every single possible driver? From society's point of view, we win if it's just slightly better than average. But there will be many examples where, after lots of analysis, it could be determined that a human would have avoided the accident. And that's where our ability to judge risk falls down - never mind the fact that nobody considers themselves an average driver, let alone below average. 

As I noted earlier, if autonomous cars could cut traffic fatalities by 90%, we'd still have robot cars involved in 4,000 US deaths per year. Obviously, that's far better than 40,000. But could we as a society accept it? I think we all agree on that answer. 
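The arithmetic behind that estimate, as a quick sanity check (40,000 is the approximate annual US fatality figure used above; the 90% cut is hypothetical):

```python
us_traffic_deaths = 40_000   # approximate annual US traffic fatalities cited above
cut = 0.90                   # hypothetical 90% reduction from autonomy

remaining = round(us_traffic_deaths * (1 - cut))
print(remaining)  # 4000
```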

Tesla gets all the headlines because it's by far the most common solution out there and Tesla is a polarizing company (is that an EV joke? Sure, why not). They're also going after the harder use case of autonomy at any time, instead of working towards it by limiting the use of FSD to exhaustively mapped, limited corridors. I think anytime autonomy is what we need eventually, but it's definitely a much bigger step than the controlled environment. Maybe there should be more time spent watching how humans drive cars equipped with a full sensor suite instead of letting the cars try it with human oversight, because you can't trust the humans.

Adrian_Thompson (Forum Supporter)
Adrian_Thompson (Forum Supporter) MegaDork
9/1/20 1:35 p.m.

The reason everyone, well, half of us, dumps on Tesla every time this happens is that we (those doing the dumping) believe Tesla is more responsible for these deaths/accidents because of how they describe their system to the buying public.  They throw around terms like Autopilot and Full Self Driving, strongly implying that their cars are actually real-world autonomous vehicles.  Other manufacturers who offer similar capabilities take great pains, right up front in their adverts and online info, to let the public know their systems are not really fully autonomous.  Tesla states they are not fully autonomous, but only in the small print and disclaimers.  Ask the average person on the street, or even the average Tesla owner, and they are not aware of the Level 1-5 autonomy scale, let alone what each step means.  Tesla splashes Full Self Driving around and people believe it.  In a world where most built-in nav systems and remote/voice/Bluetooth audio systems require you to acknowledge the limitations, liabilities, and risks of these systems every time you start the car or use them for the first time, a simple click-to-accept warning isn't enough for Tesla to pass the buck onto the user.  

Observations of Tesla from 30 years in the industry:

  • Everyone I know who's interviewed there comes away believing it's a house of cards.
  • They have massive turnover in all departments, especially Engineering, as the place is a meat grinder to work for.
  • Their business ethics are, to be charitable, questionable.
  • Other than a handful of engineers, most of their staff come from outside the auto industry and are making/learning from mistakes the rest of the industry has known about for decades.
  • No one can figure out how the hell their legal team gets away with a lot of this stuff, like 'Full Self Driving'.  OGC at all other OEMs throws caveats all over this stuff.  No one can believe they haven't been hauled up and lost in court yet.
  • Their plan to do full Level 5 autonomous driving with cameras and no LIDAR is the laughing stock of the industry.
  • On the plus side, everyone I know agrees that they've given the whole industry a long-overdue wake-up call on what can be done, although no one is impressed with how they go about it.  

TL;DR: Yes, their systems are at industry par for what they actually are; the problem is they imply they are far more capable, and in a world where people have gotten used to skipping the fine print, the end users end up exposing themselves to stupid risks in the belief that they are safe.  Tesla has a moral responsibility for their misrepresentation.  

 

On a personal note, I've driven the S, X, and 3.  They look good - well, the S and X do; the 3 is a bit frumpy.  The straight-line performance is amazing, even for the base models.  The 'autopilot' works impressively for what it is, rather than what they imply it is, although I've only used it on the test track, not on the road.  NVH in the S is not bad, but not great in the rear; the X is so-so, and the Model 3 is simply a joke in the back seat.  Steering feel is, well, lacking.  Finally, their fit and finish is a joke.  The automotive press would have, rightly, torn the Big Three a new one for putting out anything that bad back in the '90s, let alone now.  

STM317
STM317 UberDork
9/1/20 1:42 p.m.

In reply to Keith Tanner :

So what's the motivation for the 50% of drivers who are safer than average (or anybody in the bottom half who thinks they are) to hand over control to software that's going to be "average"? How does the tech ever get to a mature point or widespread adoption unless it's way, way safer than "average"?

Slippery (Forum Supporter)
Slippery (Forum Supporter) GRM+ Memberand UltraDork
9/1/20 2:05 p.m.
Adrian_Thompson (Forum Supporter) said:

 Steering feel is, well, lacking.  Finally, their fit and finish is a joke.  The automotive press would have, rightly, torn the big three a new one for putting out anything that bad back in the 90's, let alone now.  


1988RedT2
1988RedT2 MegaDork
9/1/20 2:12 p.m.

In reply to Slippery (Forum Supporter) :

So even the Tesla fanboy admits the panel fit sucks. laugh

93EXCivic
93EXCivic MegaDork
9/1/20 2:37 p.m.
dean1484 said:

To me it is simple you should not mix autonomous vehicles with human controlled ones especially if the vehicles can not communicate with each other. 

If that is the case, then autonomous vehicles are never going to happen. An autonomous vehicle is always going to have to interact with non-autonomous inputs, whether that is a pedestrian, a bicycle rider, farm equipment, farm animals, deer, or horses. Plus, the government is realistically not going to be able to force everyone to suddenly switch to fully autonomous cars, due to all kinds of factors. 

I am all in favor of autonomous vehicles. I see the average idiot driving. There is no way autonomous vehicles won't save tons of lives. 

Keith Tanner
Keith Tanner GRM+ Memberand MegaDork
9/1/20 2:54 p.m.
STM317 said:

In reply to Keith Tanner :

So what's the motivation for the 50% of drivers who are safer than average (or anybody in the bottom half who thinks they are) to hand over control to software that's going to be "average"? How does the tech ever get to a mature point or widespread adoption unless it's way, way safer than "average"?

It's going to have to be convenience. People will (and do) choose the autonomous option because they don't want to be bothered with the driving. They want to sleep or watch movies or do unspeakable things to each other.

bobzilla
bobzilla MegaDork
9/1/20 3:09 p.m.

In reply to Keith Tanner :

So instead of "mile high club" it'll be "60mph club?" I might be able to get behin..... err.... under... um.... I might be ok with this. 

yupididit
yupididit PowerDork
9/1/20 4:31 p.m.
STM317 said:

In reply to Keith Tanner :

 The real issue is that it appears to be pretty easy to abuse or use unsafely when compared to other systems.

 

That can be said about every tech out there. 

Slippery (Forum Supporter)
Slippery (Forum Supporter) GRM+ Memberand UltraDork
9/1/20 7:44 p.m.
1988RedT2 said:

In reply to Slippery (Forum Supporter) :

So even the Tesla fanboy admits the panel fit sucks. laugh

Because you liked the previous one so much, here goes this one cheeky



1988RedT2
1988RedT2 MegaDork
9/1/20 7:51 p.m.

In reply to Slippery (Forum Supporter) :

Thank you.  That was positively nauseating. laugh

Slippery (Forum Supporter)
Slippery (Forum Supporter) GRM+ Memberand UltraDork
9/1/20 7:55 p.m.

In reply to 1988RedT2 :

Lol
