STM317
STM317 UltraDork
12/10/19 8:16 p.m.
Keith Tanner said:

How do you demonstrate "foolproof"? How do you determine the tech is mature?

That's my question. It's easy to just say "oh, when it's ready it'll be accepted". But how does it get ready? How do we prove that it's ready so that a majority of people will accept it?

You start with small-scale testing in controlled or near-ideal environments, like Waymo or GM's Cruise have done. Waymo has something like 10 million test miles driven on public roads and over 7 billion miles in simulation, with a fleet of 'around 600' vehicles, none in private hands.

Then you analyze the data from your small test for how frequently your safety driver had to intervene per mile driven. California requires companies that test there to make that information public. Waymo is comfortably ahead, but Tesla doesn't report. Then you slowly expand your test region, while opening access to the public in your original test location.
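To put numbers on that, here's a minimal Python sketch of the headline metric those public reports make possible. The log format, field names, and figures are all invented for illustration, not any company's actual reporting format:

```python
# Hypothetical disengagement log: (vehicle_id, miles_driven, disengagements).
# All numbers are invented for illustration, not real report data.
fleet_log = [
    ("car_001", 12_400, 2),
    ("car_002", 9_800, 1),
    ("car_003", 15_100, 4),
]

total_miles = sum(miles for _, miles, _ in fleet_log)
total_disengagements = sum(d for _, _, d in fleet_log)

# Miles per disengagement: higher is better, meaning the safety
# driver had to take over less often.
miles_per_disengagement = total_miles / total_disengagements
print(f"{total_miles} miles, {total_disengagements} disengagements")
print(f"{miles_per_disengagement:,.0f} miles per disengagement")
```

The interesting comparison is how that number trends as the test region grows, which is exactly why the companies that don't report are hard to evaluate.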

Full autonomy is the prize here. Waymo is leading the pack, but Tesla really doesn't share any of their data. Waymo has slowly tested and developed their tech in gradually larger areas. While they've been testing in public, they've done it with a small fleet of vehicles entirely owned/controlled by the company. Tesla has sold hundreds of thousands of vehicles with Autopilot to private individuals, and is using their experiences literally everywhere to build their data set. They've got people paying to beta test their product for them. It's amazing, really.

I think that's a risky (perhaps reckless) shortcut, just like their insistence that LiDAR isn't needed, in the hope that sheer miles driven and processing power will get them to full autonomy first. They're gambling far more than anybody else (besides maybe a desperate Uber) with public safety to try to be the first. It's all about money.

Keith Tanner
Keith Tanner GRM+ Member and MegaDork
12/10/19 8:32 p.m.
codrus said:
Keith Tanner said:

In 1996, I had a computer prof explain to me in great detail how full screen streaming video was impossible.

Bah. We knew how to do full-screen streaming video in 1996; we just didn't have networking hardware that was fast enough. That's a simple application of Moore's law -- more transistors made everything faster.

Hey, I stopped listening to anything he had to say at that point :) He "proved" that it was impossible. I'm just saying that declarative statements about the progress of technology can look laughable in retrospect.

I'm seriously interested to see how this plays out. I don't like seeing people getting complacent and misusing the limited autonomy tools that are out there, because every sensationalist news story about a car crashing under computer control hurts everyone working towards that goal. I don't trust them enough to put myself in one yet. I also get nervous when riding in the car when my sister is driving. She's a low bar for autonomy to reach.

I do think we're setting unrealistic standards for when autonomous vehicles will be "ready". Things like ethical questions - aka the trolley problem - get treated as showstoppers. Meanwhile, most human drivers, when faced with an emergency that involves an ethical choice, will react badly and probably just drive into whatever they're looking at, because their protein supercomputers have a tendency to lock up when faced with an unexpected situation. But we'll keep setting higher and higher standards for them, trying to reach an unattainable goal of zero fatalities. Even the space program couldn't manage that. At some point, we're going to have to let them loose.

ebonyandivory
ebonyandivory PowerDork
12/10/19 9:32 p.m.

When vehicles are eventually deemed "autonomous" and passengers are just along for the ride, there will be no one to hold responsible for the deaths that will be caused by the vehicle.

I personally like to have a human to punish for screwing up. 
 

PS: Cannonballers have a far better track record than autonomous vehicles! cheeky

Keith Tanner
Keith Tanner GRM+ Member and MegaDork
12/10/19 9:35 p.m.

Would you rather have someone to punish or fewer deaths overall?

mattm
mattm GRM+ Member and Reader
12/10/19 11:52 p.m.

In reply to Keith Tanner :

There are lots of well-meaning individuals in this thread worried at some level about autonomous driving. With good reason, I will add. That said, the vast majority commenting have no experience at all with the system in question here. I use the system almost daily in my own Tesla, and I can say that while it is an imperfect system, it is years beyond the other systems on the market. Some people, I think, mistake Tesla's lead in this market as a result of some Musk cult of personality, or just a complete disregard for human safety on the highway. Trust me, if any competitor could release a system equivalent to the flawed Tesla product, they would do it tomorrow. The fact that they have not is due to a gap that Tesla has exploited. GM, Ford et al are hugely skilled at designing and manufacturing ICE cars and trucks. Replacing the ICE with batteries does not equal Tesla, and OTA software updates are still only a part of the story. Tesla's lead is in software development. That will be the most important competitive advantage in the car market of tomorrow.

ebonyandivory
ebonyandivory PowerDork
12/11/19 5:23 a.m.
Keith Tanner said:

Would you rather have someone to punish or fewer deaths overall?

I'll address that extreme hypothetical sometime in the distant future 

alfadriver
alfadriver MegaDork
12/11/19 6:36 a.m.
mattm said:

In reply to Keith Tanner :

There are lots of well-meaning individuals in this thread worried at some level about autonomous driving. With good reason, I will add. That said, the vast majority commenting have no experience at all with the system in question here. I use the system almost daily in my own Tesla, and I can say that while it is an imperfect system, it is years beyond the other systems on the market. Some people, I think, mistake Tesla's lead in this market as a result of some Musk cult of personality, or just a complete disregard for human safety on the highway. Trust me, if any competitor could release a system equivalent to the flawed Tesla product, they would do it tomorrow. The fact that they have not is due to a gap that Tesla has exploited. GM, Ford et al are hugely skilled at designing and manufacturing ICE cars and trucks. Replacing the ICE with batteries does not equal Tesla, and OTA software updates are still only a part of the story. Tesla's lead is in software development. That will be the most important competitive advantage in the car market of tomorrow.

Trust me, no they would not. I'm not sure why people are so convinced that everyone else lags behind Tesla- you really don't know where anyone else actually is, because they are doing more work to make sure it's ready.

Edit- to add some context to that, I started working at Ford in 1992, and one of my first rotations was to join the electronic throttle development team, which was already working toward a solution. It was not until 2000 that a vehicle with electronic throttle was finally released. That's how long it took for the company to be comfortable releasing it, and it STILL has a parallel processor to make sure that it's doing the right thing. Just for the throttle. Autonomy adds so many layers on top of that, I'm not sure how to comprehend it.
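To give a flavor of what that parallel monitor buys you, here's a rough sketch of a dual-channel plausibility check. The names, thresholds, and structure are hypothetical, invented for illustration; real drive-by-wire monitors are far more involved (rationality checks, CPU self-tests, torque models):

```python
# Hypothetical plausibility check for an electronic throttle.
# In a real system this runs on a separate monitoring processor;
# all names and thresholds here are invented for illustration.

PLAUSIBILITY_TOLERANCE = 0.05  # max allowed disagreement (0..1 scale)
LIMP_HOME_THROTTLE = 0.10      # restricted opening once a fault is latched

def monitor_throttle(pedal_sensor_a: float, pedal_sensor_b: float,
                     commanded_throttle: float) -> float:
    """Return a safe throttle command, falling back to limp-home on fault."""
    # Check 1: the redundant pedal sensors must agree with each other.
    if abs(pedal_sensor_a - pedal_sensor_b) > PLAUSIBILITY_TOLERANCE:
        return LIMP_HOME_THROTTLE  # sensors disagree: latch a fault

    # Check 2: the main processor's command must track the pedal request.
    pedal_request = (pedal_sensor_a + pedal_sensor_b) / 2
    if abs(commanded_throttle - pedal_request) > PLAUSIBILITY_TOLERANCE:
        return LIMP_HOME_THROTTLE  # command implausible: override it

    return commanded_throttle  # everything plausible: pass it through

print(monitor_throttle(0.40, 0.41, 0.40))  # normal case -> 0.4
print(monitor_throttle(0.40, 0.90, 0.40))  # stuck sensor -> 0.1 (limp home)
```

Even this toy version needs an independent path that can veto the main computer, and validating the real thing took years of work for one actuator.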

There's a HUGE difference between flawed and imperfect. From what I can see in the kinds of accidents that are happening, Tesla is not imperfect, it's flawed- it's missing some of the very key parts of what autonomy is supposed to help with, and it makes YOU, the consumer, the development team for them. As a development engineer, that's something we are not allowed to do. And even when you are not allowed to release a product that risks the consumer, it still happens, and it always results in an immediate recall program if it's really dangerous. Given the number of products out there vs. the number of actual deaths- if it were GM, Honda, or VW that released the product, there would be not only a recall, but probably congressional hearings (which HAVE happened over simpler things).

So back to Keith- it's pretty easy to identify and prove "flawed"- and that is at least the standard that should be set.  If you can't see the nominal roads well, or a nominal person on the road well, or even a truck well- that's a fundamental flaw.  If you have still not decided on who is the right person to kill in the worst situation in the world- that's just imperfect, and that happens.

AngryCorvair
AngryCorvair GRM+ Member and MegaDork
12/11/19 9:08 a.m.

In reply to alfadriver :

+ eleventy.  i've been in electronic chassis controls engineering since 1993.  the difference between 'imperfect' and 'flawed' is enormous.

dculberson
dculberson MegaDork
12/11/19 1:54 p.m.
ebonyandivory said:
Keith Tanner said:

Would you rather have someone to punish or fewer deaths overall?

I'll address that extreme hypothetical sometime in the distant future 

It sounds to me like just a rephrasing of what you were saying previously, though. You're worried about punishing someone for mistakes, and Keith is saying maybe it's more important to worry about decreasing the 30,000 annual deaths on the road than to worry about who to punish for them.

Knurled.
Knurled. GRM+ Member and MegaDork
12/11/19 2:46 p.m.

In reply to dculberson :

Personally, I am more interested in accountability.

RevRico
RevRico GRM+ Member and PowerDork
12/11/19 2:53 p.m.

Ok, so how has accountability played out in the past? 

Any single person taken down for Takata airbags? How about Toyota and their sticky throttles? Porsches that randomly catch fire? The berkeleying Pinto?

To my knowledge, no single person has ever gotten in trouble for those fatalities; it's been the company, or a department at the company.

What would be any different if/when something happens involving an automated vehicle?

Also, how is testing autonomous vehicles in public any different from testing drugs on a population? There's still a lot of bullE36 M3 to get to market, but drug companies include lawsuit money in their budget for when side effects are found in the wild versus their small-sample testing for approval. Because no matter how great your simulation or small-sample testing is, you cannot possibly account for every variable.

ebonyandivory
ebonyandivory PowerDork
12/11/19 3:44 p.m.

I have no idea what the answer is to this very broad question we're wrestling with here, but I personally don't believe the answer lies in paying less attention to the road or the vehicle controls while traveling.

That's what trains, taxis, limousines and air travel are for.

I just can't get past my issues with the notion of "yes, we can expect x amount of collisions, injuries and deaths but man, it's gonna be worth it to be carted around and not have to pay attention."

 

ebonyandivory
ebonyandivory PowerDork
12/11/19 3:51 p.m.

In reply to RevRico :

Not sure I agree with some of your analogies. The drug companies bring medicines to the market but consumers are not forced to buy them, let alone use them. The "victim" is the consumer and the consumer alone. Caveat emptor and all that.

I DO NOT consent to Tesla's products being driven around me or at me in anything approaching autonomy mode. Where's my recourse?

I'm sure you can name many other scenarios similar to this that we face on a daily basis but we're dealing with Tesla and autonomous vehicles so I'm only referring to this one industry here.

 

ebonyandivory
ebonyandivory PowerDork
12/11/19 3:56 p.m.

In reply to RevRico :

Maybe having no accountability is the problem. 
 
I can name quite a few things I'd like to try but don't because of accountability. 

Harvey
Harvey GRM+ Member and SuperDork
12/11/19 4:00 p.m.

When they first came out with regular old cruise control for cars, which person spoke up and said, "What happens if the driver forgets to press the brakes and crashes into somebody?" Did anyone say that? Or did they instead say, "Well, the driver should be paying attention, it's not foolproof."

Why is it that when Tesla says, "Hey, the car drives itself, for the most part, but pay attention in case it messes up" no one takes them at their word?

The Tesla system is not a fully autonomous system and it's not marketed as such and doesn't even let you use it as such, so why is this even a concern? Says right on their site, "Current Autopilot features require active driver supervision and do not make the vehicle autonomous."

https://www.tesla.com/autopilot

RevRico
RevRico GRM+ Member and PowerDork
12/11/19 4:01 p.m.

In reply to ebonyandivory :

I don't consent to drugged-out soccer moms driving around me with their kids screaming in the back seat, a latte in one hand and a cell phone in the other, but I have to accept it every single time I get on the road. Personally, I'd rather let a computer whose main focus is driving be in control.

alfadriver
alfadriver MegaDork
12/11/19 4:07 p.m.

In reply to RevRico :

While nobody has personally faced criminal charges WRT vehicles, some have come VERY close. I think a deal between the prosecutor and the company prevented people from going to trial. Real people sign real papers when it comes to meeting engineering targets. And those signatures have actual consequences if things go wrong.

It's one of those things that Professional Engineers face.

And having seen vehicle testing in the wild, I would not say that it's like drug testing at all. Perhaps the worst-case outcome is the same, but the way the product is developed, as well as the intent of the product, are quite different.

STM317
STM317 UltraDork
12/11/19 4:20 p.m.
Harvey said:

When they first came out with regular old cruise control for cars, which person spoke up and said, "What happens if the driver forgets to press the brakes and crashes into somebody?" Did anyone say that? Or did they instead say, "Well, the driver should be paying attention, it's not foolproof."

Why is it that when Tesla says, "Hey, the car drives itself, for the most part, but pay attention in case it messes up" no one takes them at their word?

The Tesla system is not a fully autonomous system and it's not marketed as such and doesn't even let you use it as such, so why is this even a concern? Says right on their site, "Current Autopilot features require active driver supervision and do not make the vehicle autonomous."

https://www.tesla.com/autopilot

You do realize that the video you linked, on the Tesla site, begins by stating: "the person in the driver's seat is only there for legal reasons. The car is driving itself". No warnings or other notes about remaining alert. And then, for the entire length of the video, the driver's feet never get near the pedals and his hands remain on his knees. The video is 2 minutes long, but it's sped-up footage and is probably more like 8-10 minutes at actual speed. If your driver can go 10 minutes without having to interact with the vehicle in any way, they're not going to remain alert.

That can definitely be interpreted as needing no attention from the driver. At the very least, it minimizes the need for a driver to constantly monitor the system.

And this is exactly the "wink wink, nudge nudge" stuff Tesla is doing. They could easily send an OTA update out that required more constant attention/feedback from the driver (like GM's system does). They could easily state that the system is not intended to be used without hands on the wheel, and make that happen too. At a minimum, the company is sending mixed messages about Autopilot. I think Tesla fully understands both how they're marketing the system and how people are using/abusing the system in the real world. But customers abusing the system gets Tesla more data, and theoretically gets them to full autonomy sooner. They could fix both their marketing and the actual system pretty much instantly if they wanted to, and yet they don't...
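For reference, that kind of attention gate is simple in principle. Here's a toy sketch of the escalation logic; the timings and the idea of a single "seconds since driver input" signal are invented for illustration (real systems use steering torque sensors or driver-facing cameras):

```python
# Toy driver-attention watchdog: warn, then disengage.
# Timings are invented for illustration only.

WARN_AFTER_S = 10.0        # first nag if no driver input for this long
DISENGAGE_AFTER_S = 30.0   # hand control back if there's still no input

def watchdog_action(seconds_since_driver_input: float) -> str:
    """Map time since the last detected driver input to an action."""
    if seconds_since_driver_input >= DISENGAGE_AFTER_S:
        return "DISENGAGE"  # e.g. slow down, hazards on, require takeover
    if seconds_since_driver_input >= WARN_AFTER_S:
        return "WARN"       # visual/audible nag to touch the wheel
    return "OK"

for t in (5.0, 15.0, 45.0):
    print(t, watchdog_action(t))  # OK, WARN, DISENGAGE
```

Since logic like this is small and the cars already take OTA updates, tightening it is a policy choice rather than a technical hurdle.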

 

ebonyandivory
ebonyandivory PowerDork
12/11/19 4:24 p.m.
RevRico said:

In reply to ebonyandivory :

I don't consent to drugged-out soccer moms driving around me with their kids screaming in the back seat, a latte in one hand and a cell phone in the other, but I have to accept it every single time I get on the road. Personally, I'd rather let a computer whose main focus is driving be in control.

I don't know of any companies sending moms strung out on drugs into the public space. If these "drugged-out soccer moms" crash, or crash and injure or God forbid kill someone, they go to jail, lose their license, and face all the punishment that comes with committing a crime.

I like that.

ebonyandivory
ebonyandivory PowerDork
12/11/19 4:28 p.m.
RevRico said:

In reply to ebonyandivory :

I don't consent to drugged-out soccer moms driving around me with their kids screaming in the back seat, a latte in one hand and a cell phone in the other, but I have to accept it every single time I get on the road. Personally, I'd rather let a computer whose main focus is driving be in control.

I got E36 M3 for using that exact same reasoning regarding my non-hatred of Cannonball Runs vs. the general population. But somehow it's ok to use that logic now.

Kreb
Kreb GRM+ Member and UberDork
12/11/19 6:03 p.m.

While my tendency is to defend Tesla and cut it slack, some very good points have been made in this thread. It seems like Tesla's playing a duplicitous marketing game: on one hand they carefully tell you how careful you have to be and how to use the technology properly, while on the other hand they practically wink at you, hand you a beer, and give you a Coke label to wrap around it.

codrus
codrus GRM+ Member and UberDork
12/11/19 7:24 p.m.
alfadriver said:

While nobody has personally faced criminal charges WRT vehicles, some have come VERY close. 

While not strictly safety-related, Martin Winterkorn might disagree with you.

 

alfadriver
alfadriver MegaDork
12/11/19 8:15 p.m.
codrus said:
alfadriver said:

While nobody has personally faced criminal charges WRT vehicles, some have come VERY close. 

While not strictly safety-related, Martin Winterkorn might disagree with you.

 

There's always a first. Although relative to Tesla, that's Germany as opposed to the US... I was including emissions issues, too- I know of a few people at F who have almost gone to trial. It makes the company pretty conservative when that happens.

Knurled.
Knurled. GRM+ Member and MegaDork
12/11/19 8:23 p.m.

In reply to alfadriver :

Just for a little perspective, 1992 (the time you mention Ford working on throttle-by-wire) is about as far removed from Ford getting raked over the coals for the Pinto fuel tank situation as we are today from Toyota getting raked over the coals over their drive-by-wire systems.
 

People buy Toyotas again, but at the same time, I'm sure Toyota got extra-cautious as a result.

 

(It works in weird ways.  There is/was an Audi tuner going by the name Intended Acceleration...)

Harvey
Harvey GRM+ Member and SuperDork
12/11/19 9:50 p.m.
STM317 said:
Harvey said:

When they first came out with regular old cruise control for cars, which person spoke up and said, "What happens if the driver forgets to press the brakes and crashes into somebody?" Did anyone say that? Or did they instead say, "Well, the driver should be paying attention, it's not foolproof."

Why is it that when Tesla says, "Hey, the car drives itself, for the most part, but pay attention in case it messes up" no one takes them at their word?

The Tesla system is not a fully autonomous system and it's not marketed as such and doesn't even let you use it as such, so why is this even a concern? Says right on their site, "Current Autopilot features require active driver supervision and do not make the vehicle autonomous."

https://www.tesla.com/autopilot

You do realize that the video you linked, on the Tesla site, begins by stating: "the person in the driver's seat is only there for legal reasons. The car is driving itself". No warnings or other notes about remaining alert. And then, for the entire length of the video, the driver's feet never get near the pedals and his hands remain on his knees. The video is 2 minutes long, but it's sped-up footage and is probably more like 8-10 minutes at actual speed. If your driver can go 10 minutes without having to interact with the vehicle in any way, they're not going to remain alert.

That can definitely be interpreted as needing no attention from the driver. At the very least, it minimizes the need for a driver to constantly monitor the system.

And this is exactly the "wink wink, nudge nudge" stuff Tesla is doing. They could easily send an OTA update out that required more constant attention/feedback from the driver (like GM's system does). They could easily state that the system is not intended to be used without hands on the wheel, and make that happen too. At a minimum, the company is sending mixed messages about Autopilot. I think Tesla fully understands both how they're marketing the system and how people are using/abusing the system in the real world. But customers abusing the system gets Tesla more data, and theoretically gets them to full autonomy sooner. They could fix both their marketing and the actual system pretty much instantly if they wanted to, and yet they don't...

I didn't watch the video; I linked the page for the text I quoted. But you give a pretty biased account of it, so I'll assume it's typical marketing stuff that most people should ignore, but don't.

What are the facts, though? Do you know how often the Tesla Autopilot requires feedback from the driver before it turns off? How often it reminds the driver to remain alert? Exactly how much supervision does the car need to force on the driver before it's safe enough to use any sort of driver aid?
