(Replying to PARENT post)
I have autopilot on my car, and it definitely makes me a better and safer driver. It maintains my distance from the car in front and my speed while keeping me in my lane, so my brain no longer has to worry about those mundane things. Instead of splitting time between lane keeping/following and looking for emergencies, I can spend all my brainpower watching for potential emergencies.
I no longer have to look at my speedometer or the lane markers, I can take a much broader view of the traffic and conditions around me.
Before you say it's impossible to be safe driving with an assistive product, I suggest trying one out.
(Replying to PARENT post)
Risk compensation is fascinating; riding with a bike helmet causes both the cyclist and the drivers around them to behave more dangerously.
Is society sophisticated enough to deal with advanced driver assistance? Is it possible to gather enough data to create self driving ML systems?
(Replying to PARENT post)
No mention of the deceptive marketing name "Full Self Driving" in the article.
(Replying to PARENT post)
If we assume the number of Tesla Autopilot deaths doubles this year to 8 (from 4 at the time the probe launched), then with about 900 thousand Teslas on the road in the USA, that's 8.9 Autopilot deaths/million Teslas/year.
The ratio between the overall US fatality rate and that figure is 14.4.
Tesla's reporting says that in Q1 2021 there was one crash on Autopilot per 4.19 million miles, versus one crash per 484 thousand miles for all vehicles.
The ratio between those numbers is 8.7.
All these numbers are full of biases and the ratios probably aren't that meaningful, but they do end up in the same order of magnitude.
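As a sanity check on those ratios, here's a quick sketch. The US fleet-wide inputs (~36,000 road deaths/year, ~280 million registered vehicles) are my own rough assumptions, not figures from the comment:

```python
# Back-of-the-envelope check of the ratios above. The US fleet-wide
# figures (~36,000 road deaths/year, ~280 million registered vehicles)
# are rough assumptions, not numbers stated in the comment.
autopilot_deaths = 8                 # assumed doubling from 4
teslas_millions = 0.9                # ~900 thousand Teslas on US roads
tesla_rate = autopilot_deaths / teslas_millions
print(f"Autopilot deaths/million Teslas/year: {tesla_rate:.1f}")  # ~8.9

us_rate = 36_000 / 280               # deaths/million vehicles/year (assumed)
print(f"Overall-vs-Autopilot ratio: {us_rate / tesla_rate:.1f}")  # ~14, near the 14.4 above

# Tesla's Q1 2021 crash-rate comparison, straight from the comment.
print(f"Crash-rate ratio: {4.19e6 / 484e3:.1f}")  # ~8.7
```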
There's interesting data in "Fatality Facts 2019: Urban/rural comparison":
https://www.iihs.org/topics/fatality-statistics/detail/urban...
"Although 19 percent of people in the U.S. live in rural areas and 30 percent of the vehicle miles traveled occur in rural areas, almost half of crash deaths occur there. "
I was shocked that in the USA in 2019 about 40-46% of people killed on the road were unbelted, while observational studies find that about 90% of front-seat occupants wear seat belts.
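Those two percentages imply a large relative risk for unbelted occupants. A rough sketch, assuming ~43% of deaths were unbelted, 90% belt use, and (my own simplification) equal exposure per group:

```python
# Rough relative-risk estimate from the figures above. Assumes ~43% of
# road deaths were unbelted, 90% of occupants wear belts, and equal
# exposure per group (a simplification: road deaths also include
# pedestrians and other non-occupants).
unbelted_deaths_share = 0.43
belt_use_rate = 0.90

risk_unbelted = unbelted_deaths_share / (1 - belt_use_rate)
risk_belted = (1 - unbelted_deaths_share) / belt_use_rate
print(f"Unbelted occupants die at ~{risk_unbelted / risk_belted:.1f}x the belted rate")  # ~6.8x
```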
Incidentally, a Tesla will beep to no end if weight is detected on a seat and the seat belt isn't buckled: I have to buckle the passenger belt when I put my (not so heavy) bag on the passenger seat, since there's no software option to disable the beeping.
(Replying to PARENT post)
As Teslas get better at driving, drivers will inevitably pay less attention. Tesla needs to start taking responsibility at some point.
(Replying to PARENT post)
The differentiating issue with Tesla's system is the way it is sold and marketed. Important operational safety information shouldn't be hidden in fine print. Subtly misleading marketing has unfortunately become acceptable in our culture, but this idea needs to stay out of safety-critical systems.
We need a mandate for clear and standardized labelling for these features, à la the Monroney sticker. All manufacturers should have to label and market their cars with something like SAE J3016. https://www.sae.org/binaries/content/gallery/cm/articles/pre...
(Replying to PARENT post)
To me, driving requires paying constant attention to the road and being ready to act swiftly at all times: I just don't understand how you can have a "self-driving car, but you must be ready to put your hands back on the steering wheel and your foot on the pedal(s)".
I have nothing against many "recent" safety features, like the steering wheel shaking a bit if the car detects you're getting out of your lane without having activated your blinker. Or the car beginning to brake if it detects an obstacle. Or the car giving you a warning if there's a risk when you change lane, etc.
But how can you react promptly if you're not ready? I just don't get this.
Unless it's a fully self-driving car, without even a steering wheel, a car should help you focus more, not less.
(Replying to PARENT post)
I still think that Tesla's approach is the right one; I just think they need to gather more data before letting this product be used unsupervised in the wild.
(Replying to PARENT post)
(Replying to PARENT post)
The question should be: how many lives were saved by this system versus how many would have died if the cars were driven "normally"?
(Replying to PARENT post)
I have no idea how self-driving fits into this. I don't have a feel how self-driving responds to emergencies. I'd have to experience an emergency in one. For that reason, I don't see myself ever trusting self-driving.
(Replying to PARENT post)
Does anyone know if the FSD Beta has this ability?
(Replying to PARENT post)
(1) https://www.tesladeaths.com/miles.html (2) https://www-fars.nhtsa.dot.gov/Main/index.aspx
(Replying to PARENT post)
(Replying to PARENT post)
Or I could tell my car, "Hey Tesla, go pick up my kid from soccer practice," and it would know what to do.
(Replying to PARENT post)
Tesla perhaps isn't being loud enough about the fact that Autopilot isn't self-driving, and shouldn't even be relied on to hit the brakes when something is in front of you.
(Replying to PARENT post)
(Replying to PARENT post)
Doesn't autopilot require you to put your hands on the wheel fairly regularly? Are these incidents just a matter of people using this feature outside of its intended use case?
(Replying to PARENT post)
"The involved subject vehicles were all confirmed to have been engaged in either Autopilot or Traffic Aware Cruise Control during the approach to the crashes," NHTSA said in a document opening the investigation.
TACC is very different from Autopilot.
(Replying to PARENT post)
I think the best situation would be to have 'automated' stretches of highway specially designed to 'help' self driving systems.
Only self driving vehicles would be allowed on such special highways, and everything would be built around such systems.
(Replying to PARENT post)
Perhaps this is for the best.
[0] https://news.ycombinator.com/item?id=27996321
(Replying to PARENT post)
(Replying to PARENT post)
Teslas crash 40% less often than other cars, and 1/3 as many people are killed in Teslas as in other cars.
Indeed, once a common failure mode like this is identified, it needs to be investigated and fixed. Something similar happened a few years ago when someone driving a Tesla while watching a movie (not paying attention) died when they crashed into a light-colored tractor trailer crossing directly across the road. So an investigation makes sense. But much of the general criticism of self-driving and Autopilot here seems misplaced. Teslas and other self-driving vehicle technologies are saving lives. They will continue to save lives compared to human drivers, as long as we let them.
(Replying to PARENT post)
It's clear what Tesla really has - a good lane follower and cruise control that slows down for cars ahead. That's a level 2 system. That's useful, but, despite all the hype about "full self driving", it seems that's all they've got.
"Full self driving" just adds some lane-changing assistance and hints from the nav system.