alfadriver
alfadriver MegaDork
12/12/19 6:20 a.m.
Knurled. said:

In reply to alfadriver :

Just for a little perspective, 1992 (the time you mention as Ford working on throttle by wire) is not too far off from the time Ford was raked over the coals for the Pinto fuel tank situation, as we are today from Toyota getting raked over the coals over their drive by wire systems.
 

People buy Toyotas again,  but at the same time, I'm sure Toyota got extra-precautious as a result.

 

(It works in weird ways.  There is/was an Audi tuner going by the name Intended Acceleration...)

But the whole Pinto thing wasn't a criminal thing- it was a civil thing, which really was somewhat made up, IIRC.  And I have not seen any actual criminal charges brought against Toyota for the unintended acceleration, nor have I seen anything with GM and the ignition-switch thing.

Things like the Pinto, and even the Corvair, forced the regulatory system to put in actual safety requirements.  There are requirements for cruise control and for electronic throttle now.  And while I'm not sure about specific autonomy rules, I'm sure there are some, as well as the integration of previous safety rules.  On the civil front, though, all it would take is a number of qualified engineers with prior art showing that whatever caused the accident was a known, addressable issue to put Tesla in civil court.  The family of the pedestrian with the bike should have a pretty easy case- as that was one of the key pillars of what this technology was supposed to prevent.  And if this case is about being capable of ID'ing the road properly, that's a pretty easy one to claim, too.  Staying on the correct road is a very basic requirement of autonomy, and if the car knows it cannot do that, it should have stopped so the driver would know to take over.

I have no idea if Tesla will be criminally held responsible.  Heck, not even sure if any of their dedicated fans, I mean customers, will hold it against them.  But I'm pretty confident if they keep this path of letting the consumers do the final development for them, they will not last long.  It's never worked for any automotive company.

Harvey
Harvey GRM+ Memberand SuperDork
12/12/19 7:19 a.m.
alfadriver said:
Knurled. said:

In reply to alfadriver :

Just for a little perspective, 1992 (the time you mention as Ford working on throttle by wire) is not too far off from the time Ford was raked over the coals for the Pinto fuel tank situation, as we are today from Toyota getting raked over the coals over their drive by wire systems.
 

People buy Toyotas again,  but at the same time, I'm sure Toyota got extra-precautious as a result.

 

(It works in weird ways.  There is/was an Audi tuner going by the name Intended Acceleration...)

But the whole Pinto thing wasn't a criminal thing- it was a civil thing, which really was somewhat made up, IIRC.  And I have not seen any actual criminal charges brought against Toyota for the unintended acceleration, nor have I seen anything with GM and the ignition-switch thing.

Things like the Pinto, and even the Corvair, forced the regulatory system to put in actual safety requirements.  There are requirements for cruise control and for electronic throttle now.  And while I'm not sure about specific autonomy rules, I'm sure there are some, as well as the integration of previous safety rules.  On the civil front, though, all it would take is a number of qualified engineers with prior art showing that whatever caused the accident was a known, addressable issue to put Tesla in civil court.  The family of the pedestrian with the bike should have a pretty easy case- as that was one of the key pillars of what this technology was supposed to prevent.  And if this case is about being capable of ID'ing the road properly, that's a pretty easy one to claim, too.  Staying on the correct road is a very basic requirement of autonomy, and if the car knows it cannot do that, it should have stopped so the driver would know to take over.

I have no idea if Tesla will be criminally held responsible.  Heck, not even sure if any of their dedicated fans, I mean customers, will hold it against them.  But I'm pretty confident if they keep this path of letting the consumers do the final development for them, they will not last long.  It's never worked for any automotive company.

I was under the impression that the Pinto thing was real, but the Audi unintended acceleration thing was basically driver error.

Karacticus
Karacticus GRM+ Memberand Dork
12/12/19 7:24 a.m.

As a (not really) serious thought about the safety of the public sharing the road with Teslas running on Autopilot, what if they had to be covered in warning lights that activated whenever the system was engaged?

Would it negate the learning and data acquisition if other drivers (the ones paying attention) gave them a wider berth?  Would Tesla owners see it as a new way to strut their status, turning it into something like the lights on oligarchs' cars running around Moscow?

Harvey
Harvey GRM+ Memberand SuperDork
12/12/19 7:30 a.m.

How about having LEDs spell out I AM AUTOPILOTING on all sides of the car whenever people use the system?

Karacticus
Karacticus GRM+ Memberand Dork
12/12/19 7:43 a.m.

In reply to Harvey :

The opportunities for hacking the LED display to show other messages just kind of boggle the mind.  And, from a manufacturer that builds in "fireplace mode" and "emissions mode" and other easter eggs, who knows what else could be incorporated.

Karacticus
Karacticus GRM+ Memberand Dork
12/12/19 7:45 a.m.

So, maybe these vehicles just need to implement something equivalent to the red flag law.

alfadriver
alfadriver MegaDork
12/12/19 8:17 a.m.
Harvey said:
alfadriver said:
Knurled. said:

In reply to alfadriver :

Just for a little perspective, 1992 (the time you mention as Ford working on throttle by wire) is not too far off from the time Ford was raked over the coals for the Pinto fuel tank situation, as we are today from Toyota getting raked over the coals over their drive by wire systems.
 

People buy Toyotas again,  but at the same time, I'm sure Toyota got extra-precautious as a result.

 

(It works in weird ways.  There is/was an Audi tuner going by the name Intended Acceleration...)

But the whole Pinto thing wasn't a criminal thing- it was a civil thing, which really was somewhat made up, IIRC.  And I have not seen any actual criminal charges brought against Toyota for the unintended acceleration, nor have I seen anything with GM and the ignition-switch thing.

Things like the Pinto, and even the Corvair, forced the regulatory system to put in actual safety requirements.  There are requirements for cruise control and for electronic throttle now.  And while I'm not sure about specific autonomy rules, I'm sure there are some, as well as the integration of previous safety rules.  On the civil front, though, all it would take is a number of qualified engineers with prior art showing that whatever caused the accident was a known, addressable issue to put Tesla in civil court.  The family of the pedestrian with the bike should have a pretty easy case- as that was one of the key pillars of what this technology was supposed to prevent.  And if this case is about being capable of ID'ing the road properly, that's a pretty easy one to claim, too.  Staying on the correct road is a very basic requirement of autonomy, and if the car knows it cannot do that, it should have stopped so the driver would know to take over.

I have no idea if Tesla will be criminally held responsible.  Heck, not even sure if any of their dedicated fans, I mean customers, will hold it against them.  But I'm pretty confident if they keep this path of letting the consumers do the final development for them, they will not last long.  It's never worked for any automotive company.

I was under the impression that the Pinto thing was real, but the Audi unintended acceleration thing was basically driver error.

Given that the Pinto design was pretty common across a lot of cars, the "killer memo" was kind of interesting- the Pinto had accident and injury rates similar to most other cars on the market.  Nonetheless, the whole Pinto thing was never a criminal matter; it was always a civil case.  For the comparison to hold here, every other OEM would have to be putting out essentially the same autonomous-system development, and that isn't the case.

 

STM317
STM317 UltraDork
12/12/19 9:21 a.m.
Harvey said:

I didn't watch the video; I linked the page for the text I quoted. But you give a pretty biased account of it, so I'll assume it's typical marketing stuff that most people should ignore, but don't.

What are the facts, though? Do you know how often Tesla's Autopilot requires feedback from the driver before it turns off? How often it reminds the driver to remain alert? Exactly how much supervision does the car need to force on the driver before it's safe enough to use any sort of driver aid?

Please don't let your impression of my bias keep you from being informed. Go to the page you linked. See how long it takes you to find the text you quoted. The video of the car driving itself with zero input from the person in the driver's seat at any time is extremely prominent.

Here's Tesla's Autopilot support page.

Here are some quotes from that page:

" Autopilot and Full Self-Driving Capability are intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment. While these features are designed to become more capable over time, the currently enabled features do not make the vehicle autonomous. "

" While using Autopilot, it is your responsibility to stay alert, keep your hands on the steering wheel at all times and maintain control of your car. "

That's the legalese, cover-your-ass language. Now how does that jibe with the marketing video you linked, where the "driver" doesn't touch any of the vehicle controls for several minutes on public streets with intersections, etc.? It's not like they're driving on a desolate highway or something. Tesla's legal team says you have to have your hands on the wheel and remain attentive. But the much higher-profile marketing and sales material shows otherwise. It's contradictory info from the manufacturer.

YouTube is full of videos of people sleeping at the wheel while their Tesla drives down the road. Some of those videos might be fakes or stunts, but this one seems pretty real and was confirmed by law enforcement. If Autopilot really requires frequent input from the driver, then how does this happen?

Even the other Tesla owner that they interview is shown without his hands on the wheel as the car changes lanes, etc.

Bloomberg surveyed 5,000 Model 3 owners, and 13% said Autopilot has put them in a dangerous situation. That's an insanely high failure rate for tech like this that is supposed to be saving lives.

The NHTSA has said in crash reports that Tesla's design allows drivers to disengage too easily

The NTSB investigated a Tesla Autopilot crash and found that the driver didn't touch the steering wheel once in the last 3 min 41 sec before impact

Consumer Reports says that it's too easy to fool Autopilot into thinking you're paying attention

Autopilot only knows that you're paying attention if the steering wheel receives some feedback occasionally. There are tons of videos and forum threads where people defeat the Autopilot "nag" with something like a water bottle shoved into the steering wheel. Yeah, that's the owner going against the manufacturer's stated instructions, but other manufacturers of semi-autonomous systems actually watch the driver with in-car cameras to make sure they're following the rules and not abusing the tech. This is a serious oversight on Tesla's part, and people can be and have been hurt as a result.
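To make that difference concrete, here is a minimal sketch in Python of the two monitoring approaches described above. Everything in it (thresholds, intervals, function names) is made up for illustration; it is not Tesla's or GM's actual logic, and it assumes the wheel check is torque-based, which is what the water-bottle trick implies.

```python
from dataclasses import dataclass

@dataclass
class DriverState:
    wheel_torque_nm: float   # torque sensed at the steering wheel
    eyes_on_road: bool       # from an interior driver-monitoring camera

# Hypothetical torque-only check: anything that applies a small steady torque
# (a hand, or a water bottle wedged in the wheel) satisfies it.
def wheel_input_monitor(state: DriverState, seconds_since_input: float,
                        nag_after_s: float = 30.0,
                        torque_threshold_nm: float = 0.3) -> str:
    if state.wheel_torque_nm >= torque_threshold_nm:
        return "ok"                    # input detected, nag timer would reset
    return "nag" if seconds_since_input > nag_after_s else "ok"

# Hypothetical camera-based check: attention is judged by where the driver
# is actually looking, so a weight on the wheel doesn't help.
def camera_monitor(state: DriverState, seconds_eyes_off_road: float,
                   warn_after_s: float = 4.0) -> str:
    if state.eyes_on_road:
        return "ok"
    return "warn_then_disengage" if seconds_eyes_off_road > warn_after_s else "ok"

# A water bottle on the wheel fools the first check but not the second.
spoofed = DriverState(wheel_torque_nm=0.5, eyes_on_road=False)
print(wheel_input_monitor(spoofed, seconds_since_input=120.0))  # -> ok
print(camera_monitor(spoofed, seconds_eyes_off_road=120.0))     # -> warn_then_disengage
```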

dculberson
dculberson MegaDork
12/12/19 9:47 a.m.
STM317 said:

The NTSB investigated a fatal Tesla Autopilot crash and found that the driver didn't touch the steering wheel once in the last 3 min 41 sec before impact

That crash did not have any injuries. From the report: "When the crash occurred, the firetruck was unoccupied, and the driver of the sedan did not report any injuries." There were no injuries or fatalities from bystanders that I could find.

NormPeterson
NormPeterson New Reader
12/12/19 9:49 a.m.

In reply to Knurled. :

That establishes responsibility without making the auto-whatever any more able or willing to get better as a "driver".

 

Norm

STM317
STM317 UltraDork
12/12/19 9:53 a.m.
dculberson said:
STM317 said:

The NTSB investigated a fatal Tesla Autopilot crash and found that the driver didn't touch the steering wheel once in the last 3 min 41 sec before impact

That crash did not have any injuries. From the report: "When the crash occurred, the firetruck was unoccupied, and the driver of the sedan did not report any injuries." There were no injuries or fatalities from bystanders that I could find.

I need to start having you review everything I type up before I hit "post". Thanks for keeping me honest. You are correct. I was reading about another fatal accident at the same time, but decided not to post about it as it was older. Got the two mixed up as I was typing.

codrus
codrus GRM+ Memberand UberDork
12/12/19 10:04 a.m.
alfadriver said:
codrus said:
alfadriver said:

While nobody has personally faced criminal charges WRT vehicles, some have come VERY close. 

While not strictly safety-related, Martin Winterkorn might disagree with you.

 

There's always a first.  Although relative to the Tesla situation, that is Germany as opposed to the US...  I was including emissions issues, too- as I know of a few people at F who have almost gone to trial.  It makes the company pretty conservative when that happens.

FWIW, Winterkorn has been indicted in both the US and Germany.

NormPeterson
NormPeterson New Reader
12/12/19 10:38 a.m.
Knurled. said:

In reply to _ :

That is why I don't get the appeal.  It would seem to me that just sitting there, having to constantly mind what is going on and be prepared to take control, would be more fatiguing than just driving the car in the first place.

This ^^^

I agree 100% with others who mentioned that actually having to take over control costs more time (and distance) than if you were continuously providing all of the control inputs yourself.  Personally, I don't hand speed control (or even shifting) over to some form of automation, nor will my wife (those things really aren't that hard to do, and they have to at least subliminally keep you more in touch with conditions around you).

 

Norm

 

 

NormPeterson
NormPeterson New Reader
12/12/19 10:52 a.m.
Keith Tanner said:

I'm not asking from a technology standpoint, I'm reasonably solid there.

But testing and deployment - how do we decide that they're okay to release, to be on the roads? Humans, as we know, are atrocious at judging risk and somehow will accept Distracted Debbie or Drunk Uncle Bob behind the wheel long before they'll accept Skynet.

The difference is that people can accept what DD or DUB do from time to time as individual flaws that won't happen every time.  If Skynet were to do whatever, it'd be because of a systemic flaw or logical pattern that would work the same way every time.  I suppose it's the difference between perceived risk and perceived certainty.

 

Norm

Harvey
Harvey GRM+ Memberand SuperDork
12/12/19 10:57 a.m.

Please don't let your impression of my bias keep you from being informed. Go to the page you linked. See how long it takes you to find the text you quoted. The video of the car driving itself with zero input from the person in the driver's seat at any time is extremely prominent.

Here's Tesla's Autopilot support page.

Here are some quotes from that page:

" Autopilot and Full Self-Driving Capability are intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment. While these features are designed to become more capable over time, the currently enabled features do not make the vehicle autonomous. "

" While using Autopilot, it is your responsibility to stay alert, keep your hands on the steering wheel at all times and maintain control of your car. "

That's the legalese, cover-your-ass language. Now how does that jibe with the marketing video you linked, where the "driver" doesn't touch any of the vehicle controls for several minutes on public streets with intersections, etc.? It's not like they're driving on a desolate highway or something. Tesla's legal team says you have to have your hands on the wheel and remain attentive. But the much higher-profile marketing and sales material shows otherwise. It's contradictory info from the manufacturer.

YouTube is full of videos of people sleeping at the wheel while their Tesla drives down the road. Some of those videos might be fakes or stunts, but this one seems pretty real and was confirmed by law enforcement. If Autopilot really requires frequent input from the driver, then how does this happen?

Even the other Tesla owner that they interview is shown without his hands on the wheel as the car changes lanes, etc.

Bloomberg surveyed 5,000 Model 3 owners, and 13% said Autopilot has put them in a dangerous situation. That's an insanely high failure rate for tech like this that is supposed to be saving lives.

The NHTSA has said in crash reports that Tesla's design allows drivers to disengage too easily

The NTSB investigated a fatal Tesla Autopilot crash and found that the driver didn't touch the steering wheel once in the last 3 min 41 sec before impact

Consumer Reports says that it's too easy to fool Autopilot into thinking you're paying attention

Autopilot only knows that you're paying attention if the steering wheel receives some feedback occasionally. There are tons of videos and forum threads where people defeat the Autopilot "nag" with something like a water bottle shoved into the steering wheel. Yeah, that's the owner going against the manufacturer's stated instructions, but other manufacturers of semi-autonomous systems actually watch the driver with in-car cameras to make sure they're following the rules and not abusing the tech. This is a serious oversight on Tesla's part, and people can be and have been hurt as a result.

Right above the video...

"All new Tesla cars come standard with advanced hardware capable of providing Autopilot features today, and full self-driving capabilities in the future—through software updates designed to improve functionality over time."

Basically what you are saying is that you are smart enough to recognize their video as marketing and not the reality of how the car works, but the people who actually buy the cars are too stupid to understand that, and in addition will ignore all the instructions and warnings the car puts up, in favor of the premise that the car can drive itself autonomously.

Bloomberg article... 13% said Tesla Autopilot put them in a dangerous situation... 28% say Autopilot has saved them from a dangerous situation

And then we are down to: people are idiots who actively work to defeat the nag system that Tesla has in place, or just completely ignore anything the system does to tell them they are being idiots. Now, who are we going to blame for this, exactly? It's not as if the drivers are accidentally turning off the nag system, or as if the nag system is not visible or audible. If you read the Consumer Reports article, the driver comes off really badly.

  • Driver has the car set to a cruise speed of 80 mph, which is far above the speed limit just about anywhere in California.
  • Driver has coffee and a bagel and is using the radio.
  • Driver isn't even looking at the road or touching the wheel.

Despite all this, the collision occurred at 30.9 mph, which is obviously due to the autonomous system braking, since the driver had no clue there was even a fire truck there and likely didn't do any braking themselves. No one was injured.

I'm honestly not sure how to evaluate the Consumer Reports article where they review the various automated driving systems. They give Cadillac Super Cruise the nod even though it's far less capable than the Tesla system. They note that "Super Cruise is available only on limited-access highways that GM has already mapped." How is this even comparable to Tesla's system? If Tesla limited the places where you could use the system, obviously they would never have a problem. CR notes that the Tesla system can be used in places where it shouldn't be, like poorly marked back roads. Okay, but isn't it the person's choice to use it there? I don't use cruise control on twisty, winding roads because I know that's not the place to use it. They also don't mention that Super Cruise can't perform lane changes.

Here is a map of where you can use Super Cruise.



Those marked highways are the ONLY ones you can use Super Cruise on.
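For what it's worth, the geofencing CR describes is conceptually simple. Here is a rough sketch (hypothetical segment IDs and function names, not GM's actual implementation) of an engagement check that only allows the system on pre-mapped, limited-access highway segments:

```python
# Hypothetical whitelist of pre-mapped, limited-access highway segments.
PRE_MAPPED_SEGMENTS = {"I-75_mm_10_20", "I-80_mm_300_310"}

def can_engage(current_segment: str, is_limited_access_highway: bool) -> bool:
    """Allow engagement only on mapped, limited-access highway segments."""
    return is_limited_access_highway and current_segment in PRE_MAPPED_SEGMENTS

print(can_engage("I-75_mm_10_20", True))            # True: mapped interstate
print(can_engage("poorly_marked_back_road", False)) # False: outside the map
```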

CR notes that the Tesla system is far better at driving than any of the other systems, and that in the case of the Nissan and Volvo systems, the reason drivers maintain attentiveness is that the systems suck and will drive you off a cliff if you don't pay attention. The Volvo does nothing to stop the car if you are not paying attention and ignore its warnings; it just disengages the system entirely.

I just don't know how they rate Super Cruise higher overall.

It seems as if these articles are saying people are so stupid they shouldn't be using any autonomous system because they can't be trusted with it, but then how do we justify trusting them to drive a car in the first place?

ebonyandivory
ebonyandivory PowerDork
12/12/19 11:57 a.m.

I can't wait for a Tesla to injure me so Elon can invent a robot and pay for the robotic surgery to reattach my arm (with bionic assist technology installed and hardwired to my spinal column, obviously).

mattm
mattm GRM+ Memberand Reader
12/12/19 1:13 p.m.
STM317 said:
Harvey said:

I didn't watch the video; I linked the page for the text I quoted. But you give a pretty biased account of it, so I'll assume it's typical marketing stuff that most people should ignore, but don't.

What are the facts, though? Do you know how often Tesla's Autopilot requires feedback from the driver before it turns off? How often it reminds the driver to remain alert? Exactly how much supervision does the car need to force on the driver before it's safe enough to use any sort of driver aid?

Please don't let your impression of my bias keep you from being informed. Go to the page you linked. See how long it takes you to find the text you quoted. The video of the car driving itself with zero input from the person in the driver's seat at any time is extremely prominent.

Here's Tesla's Autopilot support page.

Here are some quotes from that page:

" Autopilot and Full Self-Driving Capability are intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment. While these features are designed to become more capable over time, the currently enabled features do not make the vehicle autonomous. "

" While using Autopilot, it is your responsibility to stay alert, keep your hands on the steering wheel at all times and maintain control of your car. "

That's the legalese, cover-your-ass language. Now how does that jibe with the marketing video you linked, where the "driver" doesn't touch any of the vehicle controls for several minutes on public streets with intersections, etc.? It's not like they're driving on a desolate highway or something. Tesla's legal team says you have to have your hands on the wheel and remain attentive. But the much higher-profile marketing and sales material shows otherwise. It's contradictory info from the manufacturer.

YouTube is full of videos of people sleeping at the wheel while their Tesla drives down the road. Some of those videos might be fakes or stunts, but this one seems pretty real and was confirmed by law enforcement. If Autopilot really requires frequent input from the driver, then how does this happen?

Even the other Tesla owner that they interview is shown without his hands on the wheel as the car changes lanes, etc.

Bloomberg surveyed 5,000 Model 3 owners, and 13% said Autopilot has put them in a dangerous situation. That's an insanely high failure rate for tech like this that is supposed to be saving lives.

The NHTSA has said in crash reports that Tesla's design allows drivers to disengage too easily

The NTSB investigated a fatal Tesla Autopilot crash and found that the driver didn't touch the steering wheel once in the last 3 min 41 sec before impact

Consumer Reports says that it's too easy to fool Autopilot into thinking you're paying attention

Autopilot only knows that you're paying attention if the steering wheel receives some feedback occasionally. There are tons of videos and forum threads where people defeat the Autopilot "nag" with something like a water bottle shoved into the steering wheel. Yeah, that's the owner going against the manufacturer's stated instructions, but other manufacturers of semi-autonomous systems actually watch the driver with in-car cameras to make sure they're following the rules and not abusing the tech. This is a serious oversight on Tesla's part, and people can be and have been hurt as a result.

As someone who uses the Autopilot in my Model S on almost every drive, I can answer you with direct experience, but we both know you aren't interested in any experience I may have. I would suggest that the vast majority of those opining on the Autopilot system in this thread have never used it, or used it once three years ago. Every Tesla showroom has a car with the FSD package on it for you to test drive. If you are really interested in how it works, I suggest you go actually drive the car instead of watching YouTube.

STM317
STM317 UltraDork
12/12/19 1:34 p.m.

In reply to Harvey :

The text above the video says nothing about how much input from a driver is needed to use Autopilot, or when/where it should be used. It says nothing about the system's limitations. It's ambiguous about the tech's capabilities, which system is actually installed, etc. The only time it mentions anything about Autopilot needing driver attention or focus is two-thirds of the way down the page, in the smallest font on the page. They downplay it tremendously while distracting people with a big video that shows the car driving with zero input or attention needed. I'm just looking for some honesty and consistency from them. If they're going to stipulate in the owner's manual that a driver must keep their hands on the wheel the entire time Autopilot is in use, then they should probably have the driver in their video following their own rules, no? It's intentionally misleading, which always looks bad for a company, but especially when it comes to tech where lives are on the line.

I've already made multiple Super Cruise comments in this thread, so I won't belabor the point any more than I have to. I even posted a link multiple times that I felt was an honest comparison of the two systems and where they each excel. You've got the gist. While Super Cruise does similar things to Autopilot, it's more limited in its scope. It's called Super Cruise because it's basically fancy cruise control for the times and places where you'd already be using cruise control. It's a driver aid, not a full replacement to be used any place or any time you want. It requires the driver to be more involved than Autopilot, and that's ultimately safer. It's only usable on limited-access roads where traffic is mostly moving in the same speed and direction, and that's ultimately safer. The whole point of these systems is to be safer, right?

The driver in the wreck absolutely looks bad, as does the drunk guy who fell asleep in the video, and the person messing with their dog who started this discussion. Every driver in an Autopilot wreck that I've read about has done something wrong. But every other company selling a similar product has designed their systems so that they wouldn't have allowed those failures from the drivers. They do that because they understand that some people will push the limits or go outside the rules. The guy set the cruise control to 80 mph, which is probably above the speed limit. You say that makes the driver look bad, and I don't disagree, but I think it makes Tesla look bad too. Why would a semi-autonomous safety system that knows exactly where it is and what the speed limit is even allow illegal speeds or maneuvers? What if the speed limit had been 35 mph?

Consumer Reports had issues with their Navigate on Autopilot upgrade breaking traffic laws and acting erratically too: https://www.consumerreports.org/autonomous-driving/tesla-navigate-on-autopilot-automatic-lane-change-requires-significant-driver-intervention/

Again, the whole point of these systems is to be safer, right? How is a safety system even allowed to break traffic laws?

Tesla says on their site that Navigate on Autopilot is currently beta tech. Why is beta-level tech being tested by consumers on public roads? If you have to test it on public roads, that's fine, but do it in company vehicles with employees behind the wheel until it's ready. It's irresponsible to do anything different.

They're allowing all of this because it generates more data for the company for less money, and theoretically gets them closer to their goal of full autonomy faster. And it does all of that with minimal liability for the company, until they get sued for something. People are paying them to beta test their product in public instead of the company having to pay employees to do it. The consumer takes all of the risk while the company gets the reward. They're putting the general public more at risk than any other autonomous or semi-autonomous operation, all because they're chasing profits. That goes against the grain of a system designed and intended to be a safety device, doesn't it?

STM317
STM317 UltraDork
12/12/19 1:53 p.m.
mattm said:

As someone who uses the Autopilot in my Model S on almost every drive, I can answer you with direct experience, but we both know you aren't interested in any experience I may have. I would suggest that the vast majority of those opining on the Autopilot system in this thread have never used it, or used it once three years ago. Every Tesla showroom has a car with the FSD package on it for you to test drive. If you are really interested in how it works, I suggest you go actually drive the car instead of watching YouTube.

Take your condescension somewhere else. As one of the two Tesla owners in this thread, you have an opportunity to explain, inform, and educate those of us who might have it wrong, and in both of your posts you've basically declined to do anything helpful, just talked about having used it, and then acted like anybody with concerns just doesn't get it. I'd love to hear your contribution to the discussion if you've got something productive to add. But your experience is also anecdotal. I'm not making up the issues that have come up with law enforcement, federal agencies, or auto/consumer magazines. I'm not lying about 13% of polled Model 3 owners saying that Autopilot has put them in a dangerous situation. It's easy to find ways to defeat Autopilot, and those methods won't work on other similar systems. Just because you've had good luck doesn't mean that Autopilot is statistically safer than any other system on the market.

I won't be getting in a Tesla on Autopilot anytime soon. The tech is not at a point where I'm comfortable trusting it with my safety. I would consider a Cadillac with Super Cruise, or a Waymo cab with no safety driver, because I trust that they've taken their time, developed the tech responsibly, and generally done their homework. And I think that's a big issue for Tesla. There are high-profile incidents where Autopilot has been involved, and that's negatively tainting the public image of their tech, and probably all autonomous tech. Waymo, Cruise, GM's Super Cruise, and the other similar tech are all being developed more responsibly. That takes more time, but you don't hear about their incidents on the news either. If this tech is ever to become mainstream, it will happen slowly and with as little negative PR as possible. Once lawyers and politicians get involved, the whole thing is in jeopardy and gets massively more expensive.

mattm
mattm GRM+ Memberand Reader
12/12/19 2:49 p.m.
STM317 said:

In reply to Harvey :

The text above the video says nothing about how much input from a driver is needed to use Autopilot, or when/where it should be used. It says nothing about the system's limitations. It's ambiguous about the tech's capabilities, which system is actually installed, etc. The only time it mentions anything about Autopilot needing driver attention or focus is two-thirds of the way down the page, in the smallest font on the page. They downplay it tremendously while distracting people with a big video that shows the car driving with zero input or attention needed. I'm just looking for some honesty and consistency from them. If they're going to stipulate in the owner's manual that a driver must keep their hands on the wheel the entire time Autopilot is in use, then they should probably have the driver in their video following their own rules, no? It's intentionally misleading, which always looks bad for a company, but especially when it comes to tech where lives are on the line.

I've already made multiple Super Cruise comments in this thread, so I won't belabor the point any more than I have to. I even posted a link multiple times that I felt was an honest comparison of the two systems and where they each excel. You've got the gist. While Super Cruise does similar things to Autopilot, it's more limited in its scope. It's called Super Cruise because it's basically fancy cruise control for the times and places where you'd already be using cruise control. It's a driver aid, not a full replacement to be used any place or any time you want. It requires the driver to be more involved than Autopilot, and that's ultimately safer. It's only usable on limited-access roads where traffic is mostly moving in the same speed and direction, and that's ultimately safer. The whole point of these systems is to be safer, right?

The driver in the wreck absolutely looks bad, as does the drunk guy who fell asleep in the video, and the person messing with their dog who started this discussion. Every driver in an Autopilot wreck that I've read about has done something wrong. But every other company selling a similar product has designed their systems so that they wouldn't have allowed those failures from the drivers. They do that because they understand that some people will push the limits or go outside the rules. The guy set the cruise control to 80 mph, which is probably above the speed limit. You say that makes the driver look bad, and I don't disagree, but I think it makes Tesla look bad too. Why would a semi-autonomous safety system that knows exactly where it is and what the speed limit is even allow illegal speeds or maneuvers? What if the speed limit had been 35 mph?

Consumer Reports had issues with their Navigate on Autopilot upgrade breaking traffic laws and acting erratically too: https://www.consumerreports.org/autonomous-driving/tesla-navigate-on-autopilot-automatic-lane-change-requires-significant-driver-intervention/

Again, the whole point of these systems is to be safer, right? How is a safety system even allowed to break traffic laws?

Tesla says on their site that Navigate on Autopilot is currently beta tech. Why is beta-level tech being tested by consumers on public roads? If you have to test it on public roads, that's fine, but do it in company vehicles with employees behind the wheel until it's ready. It's irresponsible to do anything different.

They're allowing all of this because it generates more data for the company for less money, and theoretically gets them closer to their goal of full autonomy faster. And it does all of that with minimal liability for the company, until they get sued for something. People are paying them to beta test their product in public instead of the company having to pay employees to do it. The consumer takes all of the risk while the company gets the reward. They're putting the general public more at risk than any other autonomous or semi-autonomous operation, all because they're chasing profits. That goes against the grain of a system designed and intended to be a safety device, doesn't it?

On secondary roads, Autopilot limits the driver to a maximum of 5 mph over the speed limit.  On the highway, if you exceed 90 mph, Autopilot shuts off and will not engage again until the car is brought to a complete stop and put in park.  At the bottom of your response I bolded your comment about profits, which makes it difficult to take you seriously.  You are speculating about what Tesla does vs. the competition.  As far as I know, every self-driving initiative from every startup or established manufacturer is chasing profits.  Clutching your pearls about the safety of one evil profit chaser vs. the saintly profit chasers that aren't spelled Tesla makes it appear as though safety is not the main concern of the post.
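To put that speed policy in concrete terms, here is a rough sketch (illustrative Python only; the names and exact behavior are assumptions restating the description above, not Tesla's actual code):

```python
def allowed_set_speed(posted_limit_mph: float, requested_mph: float,
                      on_secondary_road: bool) -> float:
    """On secondary roads, cap the set speed at 5 mph over the posted limit."""
    if on_secondary_road:
        return min(requested_mph, posted_limit_mph + 5)
    return requested_mph

class HighwayLockout:
    """Exceeding 90 mph disengages the system and locks it out until the car
    has been brought to a stop and put in park."""
    def __init__(self) -> None:
        self.locked_out = False

    def may_engage(self, speed_mph: float, in_park: bool) -> bool:
        if speed_mph > 90:
            self.locked_out = True       # over 90 mph: disengage and lock out
        if self.locked_out and in_park and speed_mph == 0:
            self.locked_out = False      # stopped and in park: lockout clears
        return not self.locked_out

print(allowed_set_speed(45, 80, on_secondary_road=True))  # -> 50
lockout = HighwayLockout()
print(lockout.may_engage(95, in_park=False))               # -> False (locked out)
print(lockout.may_engage(0, in_park=True))                 # -> True (cleared)
```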

In your follow-up post, you volunteer that most other systems from companies not spelled Tesla would receive your endorsement.  So much so that you would gladly ride in those vehicles and engage their self-driving feature.  In the case of Cadillac, calling Super Cruise a competitor to Autopilot is a huge stretch, but it does illuminate where your real concern lies.  I get it if you dislike Tesla for reasons, but don't expect me to believe that your concern is genuine and legitimate.  I have thousands of miles using Autopilot, and it isn't perfect, but it far surpasses all of the other commercially available products.  Autopilot, unlike GM's system, can successfully and safely navigate all of the highway interchanges in the state of Ohio automatically, as I have used it on all of them.  I have used it to travel between all of the major cities from western PA to Ohio, Michigan, Indiana, and northern Kentucky, both during the day and at night.  In general, Autopilot is more conservative than I would be on the road, but then again I used to drive BMWs.  I have even let the vehicle change lanes without my hands on the wheel!  That said, my hands are usually on the wheel and I am certainly engaged with my surroundings, but for those long drives on the highway, Autopilot has me arriving much less fatigued than in my wife's vehicle.

STM317
STM317 UltraDork
12/12/19 3:04 p.m.

In reply to mattm :

Profit is not a dirty word in my house, but the way they chase it, and how hard, matters. If they're taking shortcuts, overlooking things, or pressing too hard in order to get those profits, that's a problem. Ask Uber how that works out. You get desperate, and you make forced decisions rather than well-vetted ones. I'm not mad that Tesla is chasing profits; I just think they're going about it irresponsibly and putting the public more at risk than others do.

Tesla is absolutely using unconventional testing means by allowing paying customers to test their product for them. That's not something that anybody else is doing. The only reason to do that is money, because it opens them up to serious litigation, so they obviously think the ends justify the means.

I think that Autopilot is probably the most capable of the semi-autonomous systems around. Clearly it has fewer limitations, but that doesn't mean it's the safest. I get that it's super convenient. Nobody is arguing that. The whole concern from detractors is whether it's safe enough to be used at the scale and in the ways it's being used, not only for the Tesla driver but for those around them.

ShawnG
ShawnG PowerDork
12/12/19 3:54 p.m.

Super Cruise on the highway from Calgary to Edmonton???

You can fall asleep in any car and not hit anything on that highway.

Slow down at Leduc; that's where the cop hides.

Harvey
Harvey GRM+ Memberand SuperDork
12/12/19 4:14 p.m.
mattm said:
STM317 said:
Harvey said:

I didn't watch the video; I linked the page for the text I quoted. But you give a pretty biased account of it, so I'll assume it's typical marketing stuff that most people should ignore, but don't.

What are the facts, though? Do you know how often Tesla's Autopilot requires feedback from the driver before it turns off? How often it reminds the driver to remain alert? Exactly how much supervision does the car need to force on the driver before it's safe enough to use any sort of driver aid?

Please don't let your impression of my bias keep you from being informed. Go to the page you linked. See how long it takes you to find the text you quoted. The video of the car driving itself with zero input from the person in the driver's seat at any time is extremely prominent.

Here's Tesla's Autopilot support page.

Here are some quotes from that page:

" Autopilot and Full Self-Driving Capability are intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment. While these features are designed to become more capable over time, the currently enabled features do not make the vehicle autonomous. "

" While using Autopilot, it is your responsibility to stay alert, keep your hands on the steering wheel at all times and maintain control of your car. "

That's the legalese, cover-your-ass language. Now how does that jibe with the marketing video you linked, where the "driver" doesn't touch any of the vehicle controls for several minutes on public streets with intersections, etc.? It's not like they're driving on a desolate highway or something. Tesla's legal team says you have to have your hands on the wheel and remain attentive. But the much higher-profile marketing and sales material shows otherwise. It's contradictory info from the manufacturer.

YouTube is full of videos of people sleeping at the wheel while their Tesla drives down the road. Some of those videos might be fakes or stunts, but this one seems pretty real and was confirmed by law enforcement. If Autopilot really requires frequent input from the driver, then how does this happen?

Even the other Tesla owner that they interview is shown without his hands on the wheel as the car changes lanes, etc.

Bloomberg surveyed 5,000 Model 3 owners, and 13% said Autopilot has put them in a dangerous situation. That's an insanely high failure rate for tech like this that is supposed to be saving lives.

The NHTSA has said in crash reports that Tesla's design allows drivers to disengage too easily

The NTSB investigated a fatal Tesla Autopilot crash and found that the driver didn't touch the steering wheel once in the last 3 min 41 sec before impact

Consumer Reports says that it's too easy to fool Autopilot into thinking you're paying attention

Autopilot only knows that you're paying attention if the steering wheel receives some feedback occasionally. There are tons of videos and forum threads where people defeat the Autopilot "nag" with something like a water bottle shoved into the steering wheel. Yeah, that's the owner going against the manufacturer's stated instructions, but other manufacturers of semi-autonomous systems actually watch the driver with in-car cameras to make sure they're following the rules and not abusing the tech. This is a serious oversight on Tesla's part, and people can be and have been hurt as a result.

As someone who uses the Autopilot in my Model S on almost every drive, I can answer you with direct experience, but we both know you aren't interested in any experience I may have. I would suggest that the vast majority of those opining on the Autopilot system in this thread have never used it, or used it once three years ago. Every Tesla showroom has a car with the FSD package on it for you to test drive. If you are really interested in how it works, I suggest you go actually drive the car instead of watching YouTube.

I'm interested in your experience.

Streetwiseguy
Streetwiseguy MegaDork
12/12/19 4:37 p.m.
alfadriver said:
Harvey said:
alfadriver said:
Knurled. said:

In reply to alfadriver :

Just for a little perspective, 1992 (the time you mention as Ford working on throttle by wire) is not too far off from the time Ford was raked over the coals for the Pinto fuel tank situation, as we are today from Toyota getting raked over the coals over their drive by wire systems.
 

People buy Toyotas again,  but at the same time, I'm sure Toyota got extra-precautious as a result.

 

(It works in weird ways.  There is/was an Audi tuner going by the name Intended Acceleration...)

But the whole Pinto thing wasn't a criminal thing- it was a civil thing, which really was somewhat made up, IIRC.  And I have not seen any actual criminal charges brought against Toyota for the unintended acceleration, nor have I seen anything with GM and the ignition-switch thing.

Things like the Pinto, and even the Corvair, forced the regulatory system to put in actual safety requirements.  There are requirements for cruise control and for electronic throttle now.  And while I'm not sure about specific autonomy rules, I'm sure there are some, as well as the integration of previous safety rules.  On the civil front, though, all it would take is a number of qualified engineers with prior art showing that whatever caused the accident was a known, addressable issue to put Tesla in civil court.  The family of the pedestrian with the bike should have a pretty easy case- as that was one of the key pillars of what this technology was supposed to prevent.  And if this case is about being capable of ID'ing the road properly, that's a pretty easy one to claim, too.  Staying on the correct road is a very basic requirement of autonomy, and if the car knows it cannot do that, it should have stopped so the driver would know to take over.

I have no idea if Tesla will be criminally held responsible.  Heck, not even sure if any of their dedicated fans, I mean customers, will hold it against them.  But I'm pretty confident if they keep this path of letting the consumers do the final development for them, they will not last long.  It's never worked for any automotive company.

I was under the impression that the Pinto thing was real, but the Audi unintended acceleration thing was basically driver error.

Given that the Pinto design was pretty common across a lot of cars, the "killer memo" was kind of interesting- the Pinto had accident and injury rates similar to most other cars on the market.  Nonetheless, the whole Pinto thing was never a criminal matter; it was always a civil case.  For the comparison to hold here, every other OEM would have to be putting out essentially the same autonomous-system development, and that isn't the case.

 

Lots more people burned in early Mustangs and Falcons than ever did in Pintos.  Following that, way fewer people burned in Crown Victorias.  Saddle tank Chev square body trucks had fuel tanks outside the frame rails.  The previous generation of truck had them behind the seat, under a layer of cardboard.  Ford trucks built at the same time as the square Chevs had the second tank directly in front of the rear bumper, and had about the same number of fires as the Chevs did.  People died in Cobalts or Cavaliers or whatever because the power steering stopped working.  Those people were probably going to die from trying to open a heavy door anyway.

There is a natural progression, and we've come very close to the safest we can be while still moving. The death rate per mile has stopped improving.  All the ABS and airbags and stability control haven't made a noticeable difference, because peoples is peoples.

