(Replying to PARENT post)
> Should Riad be found guilty, “it’s going to send shivers up and down everybody’s spine who has one of these vehicles and realizes, ‘Hey, I’m the one that’s responsible,’”
Duh. There are plenty of warnings; you are still responsible, and your attention is required at all times so you can intervene when needed. Of course, there are going to be plenty of situations where it will be too late for a human to intervene. Those cases are going to be harder to judge, but purely from a legal standpoint you are not (yet) allowed to hand off responsibility to some algorithm just because it's baked into the car you buy.
(Replying to PARENT post)
At best, the convicted drivers could sue Tesla for damages over its advertising after conviction. Tesla puts a lot of effort into marketing materials that convince people the car drives for you, but the paperwork you sign reminds you that it doesn't, and the software features that force you to keep your hands on the wheel at all times are a constant reminder of that.
(Replying to PARENT post)
This is going to be the interesting part. To my mind, this is where partially automated driving systems are basically incompatible with human drivers. There is just no way a person who is behind the wheel but who isn't driving can be in control and highly attentive in the same way as a person who is actually driving.
(Replying to PARENT post)
I'm not saying this guy isn't guilty; all I'm saying is that maybe Tesla should shoulder some of the blame here. I'm probably in a very small minority in thinking this, however.
(Replying to PARENT post)
Even if it is a software error, even if the brakes stop working, even if x, y, or z (etc.), it is always the driver's responsibility to accept the consequences of any foreseen or unforeseen outcome.
Depending on the unfortunate event, any mitigating circumstances might be taken into account.
(Replying to PARENT post)
PS: I don't own any car, and I have nothing against Tesla.
(Replying to PARENT post)
I also don't think Tesla is necessarily liable here, an unpopular opinion I also came around to over time, considering that one needs to prove the car misbehaved. But even then, assuming the car truly misbehaved, making the case that the person is not liable is a big stretch. The title is still clickbait.
(Replying to PARENT post)
Yes, drivers must still pay attention in self-driving cars. There will probably be a future where that is no longer necessary, but that's not today.
(Replying to PARENT post)
The idea that a software bug could land me in jail for manslaughter is not one I'm a fan of.
(Replying to PARENT post)
Um, no shit the driver's responsible. This seems like a weird question to be confused about prior to Level 5, which Tesla's 2022 "Full Self-Driving" isn't, and its 2019 version wasn't either.
(Replying to PARENT post)
Doesn't send shivers up my spine. I know I'm responsible; that's why I keep a constant eye on the road and am prepared to take control at any time.
Though the more recent software is very good at stopping at red lights, I never trusted Autopilot anywhere but the highway, where Tesla itself says it should be used. Now that I'm in the "Full Self-Driving" beta I trust it on regular roads a bit more, but I still 100% understand that I need to be watching the road. It augments my driving rather than replacing it (in its current state).
(Replying to PARENT post)
In both cases the driver is responsible and expected to monitor the road. Why would someone crashing while using e.g. a lane-keeping system be in a different position legally than someone using cruise control? Level 1 driving assistance [1] has been in use for decades, and there's surely legal precedent to draw from.
1. https://www.synopsys.com/automotive/autonomous-driving-level...
(Replying to PARENT post)
Well, that is the thing here. There is no proper driver monitoring on these cars, which leaves both the Autopilot and FSD systems open to abuse and to crashes like this one: drivers have become too complacent, convinced that these cars drive themselves, which is what Tesla falsely advertises.
So yes, guilty as charged. Both Autopilot and FSD allow drivers like this one to become complacent with the system.
(Replying to PARENT post)
What about the person who assembled a sensor that has some kind of defect? Or the steering/braking mechanism?
Remember Toyota's accelerator-by-wire spaghetti code? There was no third-party oversight of that code, and what has changed today? Nothing, except even more complex code doing more complex and dangerous things.
At some point people are going to start victim-blaming pedestrians and cyclists for just "being there" when a car mows them over. I guess it already happens, but it will become far more common.
(Replying to PARENT post)
The driver was on Autopilot, which explicitly warns you upon activation that it doesn't stop for red lights.
Clickbait from a once-great, now awful newspaper.
(Replying to PARENT post)
It's easy to blame the driver, until you consider whether the premise of a "partially automated self-driving car" is fit for the road at all. It is not possible to both "stay alert" to anywhere near the same degree required for manual driving and let an automated navigation system take control.
If you sell something inherently dangerous to people under the premise that it is not dangerous provided they do something that is impossible (but not obviously impossible to most people), bad things are significantly more likely to happen. In other words, it is not only dangerous but misleading.
(Replying to PARENT post)
Well played.
(Replying to PARENT post)
The ultimate responsibility lies with the person behind the wheel.
That said, I’d argue Tesla is also liable. To what extent I don’t know, but it was clearly a contributing factor.