STM317
UberDork
8/31/20 4:55 a.m.
Keith Tanner said:
STM317 said:
Sure would be interesting if Tesla would record and report driver interventions per mile the same way that all other companies developing autonomous driving are required to. You never hear about accidents with GM's Super Cruise, or autonomous taxis from Waymo or Cruise. That may simply be due to fewer miles being driven by those technologies, or it may be because they're being developed and released in a more responsible way. If Tesla were held to the same standard, we might actually have an answer.
Well, there was the small Uber problem.
Yep. That one time (How many Autopilot wrecks are we up to now?). And we know that their rate of driver intervention was ridiculously high relative to the better tech from Waymo, because they all report that stuff.
We have no idea what Tesla's rate of intervention is because they choose not to share it. But we've certainly heard about numerous Autopilot related accidents up to and including fatalities.
STM317
UberDork
8/31/20 5:21 a.m.
frenchyd said:
More cars driven more miles equal more exposure. So that's your answer.
New guys in a whole new area will get more things wrong than those who follow. That's just the way things develop.
I'm happy that Tesla is made and developed here in America first. At least we still have that edge. Sure China's chipping in but only because Tesla was first.
I'm not sure that Tesla is a trailblazer here. At least in any good way. Perhaps the true leaders in the industry are the ones that you don't hear about on the news every few months?
Google/Waymo have been doing autonomous driving tests on public roads since 2008. They're currently successfully using Level 4 Autonomy on public roads (Tesla is Level 3 at best). They had the first completely autonomous drive on public roads back in 2015 (no input of any kind from the driver, who was blind). By then, they'd logged over 1 million miles of autonomous driving. They've had a fleet of driverless vans on the road since 2017. Their vehicles on average travel over 5,600 miles without a driver intervening in any way. They have over 5 billion miles driven virtually, and over 20 million miles on public roads as of early 2020. They've done this entirely on their own without using customers as beta testers, and they've been transparent with their rate of success/failure. When they've had accidents for which their tech was at least partially responsible, they've owned up to it too. The highest-profile incident was probably this one, where the car came into the path of a moving bus:
In response, this is what they said: "In this case, we clearly bear some responsibility because if our car hadn't moved there wouldn't have been a collision." When has Tesla even come close to taking any ownership of an accident with their tech?
Tesla first released their "Autopilot" driver assistance package in September of 2014. It included Autosteer, Autopark, and Traffic Aware Cruise Control. Tesla's failure to add safety overrides/checks to Autopilot to avoid driver misuse (checks that other semi-autonomous tech from competitors already use) is precisely because they're not leading. They're desperate to keep costs down and be the first to full autonomy, and they're letting the public do the legwork and take on the risk for them. As a result, you end up with stuff like this a couple of times per year:
If you want to gain public acceptance of this tech, then you need to thoroughly test it, share your data with the public like the rest of the companies, and avoid having highly visible accidents every few months. Stuff like that video above only builds apprehension and distrust (perhaps rightly so in this case) and will slow acceptance through legal concerns and lawsuits.
In reply to STM317 :
So which Tesla would you buy?
Only thing I have to add... My parents just bought a new Honda Passport. It has some sort of auto-steering. I tested it on the highway at 55. I would not personally be comfortable letting it take full control, but I'll be damned if it isn't excellent. I didn't have to touch the steering wheel for 3 miles of slightly meandering two-lane road. I didn't even know it was a thing on such plebeian vehicles.
STM317
UberDork
8/31/20 8:29 a.m.
yupididit said:
In reply to STM317 :
So which Tesla would you buy?
I know you're joking but honestly, the original Roadster seems kind of cool. I'd at least test drive one.
tuna55
MegaDork
8/31/20 8:40 a.m.
We've had this thread every time, and the same people fall into the same lines.
I guess I have to say my lines.
Tesla acts irresponsibly then blames it on the user. Same as any phone or tech company. If an OEM automaker tried this they'd be sued out of existence. I don't know how Tesla found their way into this position, where they can just do whatever they want and their company just climbs in value no matter what. Apple had it this way for a time, but that seems to be eroding a bit. I won't be buying one. Any company that can "oops, we accidentally ran your car into a firetruck" or "oops, we accidentally shipped the car with half of its normal braking capacity" deserves to have people going to jail.
Carry on with your lines now. You know the script by now.
STM317 said:
Keith Tanner said:
STM317 said:
Sure would be interesting if Tesla would record and report driver interventions per mile the same way that all other companies developing autonomous driving are required to. You never hear about accidents with GM's Super Cruise, or autonomous taxis from Waymo or Cruise. That may simply be due to fewer miles being driven by those technologies, or it may be because they're being developed and released in a more responsible way. If Tesla were held to the same standard, we might actually have an answer.
Well, there was the small Uber problem.
Yep. That one time (How many Autopilot wrecks are we up to now?). And we know that their rate of driver intervention was ridiculously high relative to the better tech from Waymo, because they all report that stuff.
We have no idea what Tesla's rate of intervention is because they choose not to share it. But we've certainly heard about numerous Autopilot related accidents up to and including fatalities.
Given that Tesla's tech is - at best - Level 3, it would be expected that they'd have a higher level of driver intervention than the Level 4 systems being tested by the others. That's pretty much in the definition, that Level 3 requires driver monitoring and intervention. That's my concern regardless of who is rolling it out, that Level 3 is inherently flawed. You have to be at full Level 4 or you have to keep the driver involved at all times. You can't split the difference.
STM317 said:
yupididit said:
In reply to STM317 :
So which Tesla would you buy?
I know you're joking but honestly, the original Roadster seems kind of cool. I'd at least test drive one.
Wasn't a joke, to be honest. I only like the Model X. The rest really aren't my flavor.
STM317
UberDork
8/31/20 9:00 a.m.
Keith Tanner said:
STM317 said:
Keith Tanner said:
STM317 said:
Sure would be interesting if Tesla would record and report driver interventions per mile the same way that all other companies developing autonomous driving are required to. You never hear about accidents with GM's Super Cruise, or autonomous taxis from Waymo or Cruise. That may simply be due to fewer miles being driven by those technologies, or it may be because they're being developed and released in a more responsible way. If Tesla were held to the same standard, we might actually have an answer.
Well, there was the small Uber problem.
Yep. That one time (How many Autopilot wrecks are we up to now?). And we know that their rate of driver intervention was ridiculously high relative to the better tech from Waymo, because they all report that stuff.
We have no idea what Tesla's rate of intervention is because they choose not to share it. But we've certainly heard about numerous Autopilot related accidents up to and including fatalities.
Given that Tesla's tech is - at best - Level 3, it would be expected that they'd have a higher level of driver intervention than the Level 4 systems being tested by the others. That's pretty much in the definition, that Level 3 requires driver monitoring and intervention. That's my concern regardless of who is rolling it out, that Level 3 is inherently flawed. You have to be at full Level 4 or you have to keep the driver involved at all times. You can't split the difference.
Valid point on the difference between levels. Completely agree. But I will say that other brands that are doing Level 3 seem to have more systems in place to ensure the driver stays more involved and alert. And we don't hear about high-profile accidents from something like SuperCruise. Seems like something Tesla could probably fix pretty easily, perhaps even with one of the over-the-air updates they're famous for.
tuna55 said:
We've had this thread every time, and the same people fall into the same lines.
I guess I have to say my lines.
Tesla acts irresponsibly then blames it on the user. Same as any phone or tech company. If an OEM automaker tried this they'd be sued out of existence. I don't know how Tesla found their way into this position, where they can just do whatever they want and their company just climbs in value no matter what. Apple had it this way for a time, but that seems to be eroding a bit. I won't be buying one. Any company that can "oops, we accidentally ran your car into a firetruck" or "oops, we accidentally shipped the car with half of its normal braking capacity" deserves to have people going to jail.
Carry on with your lines now. You know the script by now.
Henry Ford's cars were known for fracturing thumbs if the wrong starting procedure was used and the engine kicked back during a hand crank. Electric starters reduced that, and people learned.
By the late 1920s, fatal auto accidents were at the levels they are today, even though cars now travel billions more miles and the population has tripled.
Improvements are developed over time, not instantly.
A decent share of the responsibility lies with our own stupidity.
People do stupid things; just read the news!
Finally, the reason Tesla's stock price keeps rising is that most investors are looking to the future. Tesla is out ahead of the others and has conquered a lot of the development issues. And continues to do so.
Investors like growth. At this point there is no real competitor driving prices down to cut-throat levels the way there is among the ICE manufacturers.
STM317 said:
Keith Tanner said:
STM317 said:
Keith Tanner said:
STM317 said:
Sure would be interesting if Tesla would record and report driver interventions per mile the same way that all other companies developing autonomous driving are required to. You never hear about accidents with GM's Super Cruise, or autonomous taxis from Waymo or Cruise. That may simply be due to fewer miles being driven by those technologies, or it may be because they're being developed and released in a more responsible way. If Tesla were held to the same standard, we might actually have an answer.
Well, there was the small Uber problem.
Yep. That one time (How many Autopilot wrecks are we up to now?). And we know that their rate of driver intervention was ridiculously high relative to the better tech from Waymo, because they all report that stuff.
We have no idea what Tesla's rate of intervention is because they choose not to share it. But we've certainly heard about numerous Autopilot related accidents up to and including fatalities.
Given that Tesla's tech is - at best - Level 3, it would be expected that they'd have a higher level of driver intervention than the Level 4 systems being tested by the others. That's pretty much in the definition, that Level 3 requires driver monitoring and intervention. That's my concern regardless of who is rolling it out, that Level 3 is inherently flawed. You have to be at full Level 4 or you have to keep the driver involved at all times. You can't split the difference.
Valid point on the difference between levels. Completely agree. But I will say that other brands that are doing Level 3 seem to have more systems in place to ensure the driver stays more involved and alert. And we don't hear about high-profile accidents from something like SuperCruise. Seems like something Tesla could probably fix pretty easily, perhaps even with one of the over-the-air updates they're famous for.
Tesla's software is evolving and improving with OTA updates, but the lack of lidar is not so easily solved. IIRC, the car that caused the crash that started this thread was an early one with a less comprehensive sensor suite than current Teslas. That's going to be a problem for everyone who's rolling this sort of stuff out: the hardware will continue to evolve and improve, and older systems will start to show limitations compared to the state of the art.
It would be interesting to know how many miles have been driven on SuperCruise. It's far more limited as to when it can be engaged, and there are a lot fewer cars with it. The latter is a downside when it comes to gathering data, but the former does seem like a good plan.
In reply to Mr_Asa :
Yes, but...
we registered one accident for every 4.53 million miles driven in which drivers had Autopilot engaged. For those driving without Autopilot but with our active safety features, we registered one accident for every 2.27 million miles driven. For those driving without Autopilot and without our active safety features, we registered one accident for every 1.56 million miles driven. By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 479,000 miles.
1 per 4,530,000 miles (Autopilot) vs 1 per 479,000 (Human)
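Just for scale, here's a quick back-of-the-envelope on those published figures (a hypothetical Python sketch using only the numbers quoted above; it says nothing about where or when each system was actually driving):

```python
# Rough comparison of the crash rates Tesla publishes against the NHTSA baseline.
# The figures come straight from the quote above; this ignores the fact that
# Autopilot tends to be engaged only in the easiest conditions.
miles_per_crash = {
    "Autopilot engaged": 4_530_000,
    "Active safety only": 2_270_000,
    "No Tesla aids": 1_560_000,
    "NHTSA US average": 479_000,
}

baseline = miles_per_crash["NHTSA US average"]
for label, miles in miles_per_crash.items():
    print(f"{label}: {miles:,} miles per crash ({miles / baseline:.1f}x the US average)")
```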
In reply to Keith Tanner :
So this is actually a serious question since you are a Tesla owner.
Do they do recalls? What's warranty like, since they don't have a traditional dealer network?
I can't imagine it being profitable, but could they "recall" older models and just swap out the sensors or add new ones? Since they can seemingly change all the software OTA, and at least from where I'm sitting the software is model-blind, it sounds like they could IF they had a shop/dealer network/repair and service center available.
Actual live action video of Tesla drivers on autopilot:
STM317
UberDork
8/31/20 12:15 p.m.
Keith Tanner said:
It would be interesting to know how many miles have been driven on SuperCruise. It's far more limited as to when it can be engaged, and there are a lot fewer cars with it. The latter is a downside when it comes to gathering data, but the former does seem like a good plan.
GM says 5.2 million miles "incident free" as of a couple of months ago. That number should start to climb faster as SuperCruise becomes an option on Cadillacs other than the stillborn CT6, as well as other premium GM models.
I'd think that those same maps are being compiled and shared with GM's Cruise division, which has higher-level autonomy. At least that would make sense from an external perspective.
I believe Waymo also chooses to map their surroundings beforehand, and then lets the onboard tech navigate the vehicle with some known parameters and information. That approach seems to be working well for them.
So we've discussed this before, but it seems like Tesla's approach gives you more breadth but less depth. The other companies seem to give you more depth and capability (in some cases Level 4 autonomy), but it's available in fewer places thanks to needing to map the environment beforehand.
STM317
UberDork
8/31/20 12:21 p.m.
In reply to RevRico :
I think they're selling "Full Self Driving" as an $8k option right now. The cars aren't currently capable of it, but vehicles that come so equipped should be able to have it switched on when/if it's actually supported. Not sure what kind of hardware changes might be needed on vehicles that don't have that suite installed from the factory, or if it even makes financial sense vs. just selecting that option on a new one at the point in the future when it becomes available.
BradLTL said:
In reply to Mr_Asa :
Yes, but...
we registered one accident for every 4.53 million miles driven in which drivers had Autopilot engaged. For those driving without Autopilot but with our active safety features, we registered one accident for every 2.27 million miles driven. For those driving without Autopilot and without our active safety features, we registered one accident for every 1.56 million miles driven. By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 479,000 miles.
1 per 4,530,000 miles (Autopilot) vs 1 per 479,000 (Human)
Autopilot also doesn’t work in a lot of situations that are higher risk for human drivers:
Many factors can impact the performance of Autopilot, causing the system to be unable to function as intended. These include, but are not limited to: poor visibility (due to heavy rain, snow, fog, etc.), bright light (due to oncoming headlights, direct sunlight, etc.), mud, ice, snow, interference or obstruction by objects mounted onto the vehicle (such as a bike rack), obstruction caused by applying excessive paint or adhesive products (such as wraps, stickers, rubber coating, etc.) onto the vehicle; narrow, high curvature or winding roads, a damaged or misaligned bumper, interference from other equipment that generates ultrasonic waves, extremely hot or cold temperatures.
That came straight from Tesla’s website. It would be nice if we could get a direct comparison, but I doubt general accident data is broken down well enough to see how good or bad autopilot really is.
STM317 said:
In reply to RevRico :
I think they're selling "Full Self Driving" as an $8k option right now. The cars aren't currently capable of it, but vehicles that come so equipped should be able to have it switched on when/if it's actually supported. Not sure what kind of hardware changes might be needed on vehicles that don't have that suite installed from the factory, or if it even makes financial sense vs. just selecting that option on a new one at the point in the future when it becomes available.
All Teslas sold today are equipped with the FSD hardware, but it's not turned on unless you pay for the upgrade. I suspect all the non-FSD cars are functioning as data gathering platforms regardless. I could upgrade our car to beta FSD status via the Tesla app using Apple Pay, which is a ridiculous concept.
Tesla does offer a hardware upgrade for cars with the older FSD hardware suite. They're on version 3.0 now, with a version 2.0, 2.5 and I don't know what else. There's also talk about FSD being subscription-based, which is (to some extent) what GM is doing with Super Cruise. Service is primarily performed by roving service techs who will come to a location of your choice and do the work on the premises. I haven't been through the recall process with this car (don't get me started on my Dodge experiences there) but I did have a service tech upgrade the HomeLink module and it was pretty painless once my appointment rolled around. If you live where they have a service center you can also do a more traditional visit.
eastsideTim said:
BradLTL said:
In reply to Mr_Asa :
Yes, but...
we registered one accident for every 4.53 million miles driven in which drivers had Autopilot engaged. For those driving without Autopilot but with our active safety features, we registered one accident for every 2.27 million miles driven. For those driving without Autopilot and without our active safety features, we registered one accident for every 1.56 million miles driven. By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 479,000 miles.
1 per 4,530,000 miles (Autopilot) vs 1 per 479,000 (Human)
Autopilot also doesn’t work in a lot of situations that are higher risk for human drivers:
Many factors can impact the performance of Autopilot, causing the system to be unable to function as intended. These include, but are not limited to: poor visibility (due to heavy rain, snow, fog, etc.), bright light (due to oncoming headlights, direct sunlight, etc.), mud, ice, snow, interference or obstruction by objects mounted onto the vehicle (such as a bike rack), obstruction caused by applying excessive paint or adhesive products (such as wraps, stickers, rubber coating, etc.) onto the vehicle; narrow, high curvature or winding roads, a damaged or misaligned bumper, interference from other equipment that generates ultrasonic waves, extremely hot or cold temperatures.
That came straight from Tesla’s website. It would be nice if we could get a direct comparison, but I doubt general accident data is broken down well enough to see how good or bad autopilot really is.
I have had our car warn us a few times that it had degraded capabilities, mostly due to crap on the cameras (heavy snow) or blinding sun (as I squinted through the windshield).
I did try Autopilot™ (Autosteer + cruise) on the interstate today, just to see if it's possible that people are using it and watching movies. It doesn't have the authority to leave the lane, so on the stretch of road I used it on it was fairly frustrating, as there was probably a 5-10 mph speed range even in light traffic. I had to re-engage it about 5 times on my short interstate hop because I kept passing trucks and then pulling back into the right lane. It's definitely just a more effective version of what's on my mom's VW Sportwagen: the ability to stay in a lane and not drive up the backside of slower-moving traffic. I doubt this is the Autopilot that is being used when people are watching movies and having bad crashes; I think that has to be FSD, which the general public calls Autopilot. But regardless, I turned Autosteer off again and it'll stay that way.
My thought is that I did not sign a waiver with Tesla. As such, why do they have the right to experiment with their technology on the same roads that I use? I am waiting for the mom and three kids to be killed by this system and the resulting inattentive driver it creates. It always seems to take this kind of event to stop stupid ideas that people somehow justify in the name of progress. To me it is simple: you should not mix autonomous vehicles with human-controlled ones, especially if the vehicles cannot communicate with each other. But that will cause a whole new set of issues involving big brother and personal freedoms.
In reply to dean1484 :
You stand a 100 times greater chance of being killed by a regular car than by a Tesla.
Oh, and you do realize every plane flying has an autopilot. Want something scary? Look up the percentage of pilots who fall asleep in the middle of their flight. Then look at the millions of passengers flying.
Just to clarify: an autopilot on a plane is way, way simpler than one for a car. Heck, there is something called a wing leveler (which does what it sounds like it does), and just that (keeping the wings level) is 75% of an autopilot. Turn on the wing leveler, trim the elevator, and you will maintain a course and altitude rather effectively.
Add in following airways and programmed turns and it certainly gets more complicated, but there are still nowhere near as many potential issues and unknowns as on a road. Very little to run into (just be aware of mountains!)
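To make the wing leveler idea concrete, here's a toy sketch (Python, with made-up gains and sensor values, purely illustrative and not any real avionics code) of roughly what that kind of controller does: push roll back toward zero and hold a trimmed pitch, and you more or less hold course and altitude.

```python
# Toy illustration of the point above: a wing leveler plus pitch trim gets you
# most of a basic autopilot. Gains, sensor values, and names are invented for
# illustration only.

def wing_leveler(roll_deg: float, gain: float = 0.8) -> float:
    """Aileron command that pushes the bank angle back toward zero (wings level)."""
    return -gain * roll_deg

def pitch_hold(pitch_deg: float, trim_deg: float = 2.0, gain: float = 0.5) -> float:
    """Elevator command that holds the trimmed pitch attitude."""
    return gain * (trim_deg - pitch_deg)

# One step of the control loop: read attitude, command the control surfaces.
roll, pitch = 3.0, 1.2                # degrees, pretend sensor readings
aileron = wing_leveler(roll)          # rolls the wings back toward level
elevator = pitch_hold(pitch)          # nudges pitch back to the trimmed value
print(f"aileron={aileron:+.2f}  elevator={elevator:+.2f}")
```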
As far as falling asleep, that is a somewhat common thing, and a number of planes have been found flying out to sea with a sleeping pilot, to be woken up as the engine runs out of fuel...
For commercial planes, that is not much of a concern, since a co-pilot is required. Of course, with modern airliners the pilots are almost not needed. Many can even land autonomously, which is far more complicated, but there is still not a lot to run into.
In reply to aircooled :
How many flight hours do you have? I have 1,024, most of it multi-engine.
The things that go wrong in the air are a lot more serious than driving.
The point is you're 1,000 times more likely to get hit by a drunk driver, a soccer mom disciplining her kid, or a trucker falling asleep than by anything a Tesla driver might do.
frenchyd said:
In reply to aircooled :
The things that go wrong in the air are a lot more serious than driving.
This statement and how many flying hours you have do not change the fact that air autopilots are simpler systems than car systems. There is zero chance they will fly into the back of another plane with its flashers on, or into a police plane. They don't have to deal with 100 other planes within a mile. Pedestrians that step off the sidewalk. Bikes. Red lights. 500 planes in front of them slamming on the brakes. Children chasing a ball. They pretty much maintain course and altitude, or fly a course and altitude. Simple.
Unfortunately, self-driving cars do have to deal with all of those things and so many more that it's unimaginable. Hence the crashes. Also unfortunately, the only place to test these systems is on the road, because you can't imagine all the situations in advance. They are going to have to deal with them as they come. The good (bad) news is Tesla is collecting data from every car they sell.
If I were king of the world, though, a Tesla crash while on Autopilot would be an instant loss of license for at least 6 months. Much like drunk driving, there is no excuse.
I can't argue flying and its risks, as I don't have any first-hand knowledge one way or another. Until we have flying cars it is not really relevant here.
My point is I don't like them testing stuff on the public/me without my consent. How many Teslas have driven by me on Autopilot without me knowing? I have not been given the option to agree to/accept that risk. I think there should be flashing lights on them, something akin to hazard lights (but maybe green), so the public knows when the driver of a Tesla is in Autopilot mode. It lets other drivers be more aware, as obviously the driver of the Tesla will be less aware of what is going on.