(Replying to PARENT post)

Short answer is “yes”. It’s the same way a homeowner can be held responsible if they negligently leave hazards on their property and someone falls to their death.

The ultimate responsibility is the person behind the wheel.

That said, I’d argue Tesla is also liable. To what extent I don’t know, but it was clearly a contributing factor.

👤lettergram🕑3y🔼0🗨️0

(Replying to PARENT post)

Yes.

> Should Riad be found guilty, “it’s going to send shivers up and down everybody’s spine who has one of these vehicles and realizes, ‘Hey, I’m the one that’s responsible,’”

Duh. There are plenty of warnings: you are still responsible, and your attention is required at all times so you can intervene when needed. Of course, there are going to be plenty of situations where it will be too late for a human to intervene. Those cases are going to be harder, but purely from a legal standpoint you are not (yet) allowed to hand off responsibility to some algorithm just because it's baked into the car you buy.

👤breakingcups🕑3y🔼0🗨️0

(Replying to PARENT post)

Yes. Tesla cars are not self-driving cars, despite their marketing. They probably won't be for years to come. I don't think anyone who has read up on the current state of self driving would come to any other conclusion.

At best, the convicted driver could sue Tesla for damages over its advertising after the conviction. Tesla puts a lot of effort into marketing materials that convince people the car drives for you, but the paperwork you sign reminds you that it doesn't, and the software features that force you to keep your hands on the wheel at all times are a constant reminder of that.

👤jeroenhd🕑3y🔼0🗨️0

(Replying to PARENT post)

>“Whether a [Level 2] automated driving system is engaged or not, every available vehicle requires the human driver to be in control at all times, and all state laws hold the human driver responsible for the operation of their vehicles,” a NHTSA spokesperson said.

This is going to be the interesting part. To my mind, this is where partially automated driving systems are basically incompatible with people and cars: there is just no way a person who is behind the wheel but isn't driving can be as in control and as attentive as a person who is actually driving.

👤frgtpsswrdlame🕑3y🔼0🗨️0

(Replying to PARENT post)

This might be a hot take, but I think some of the blame should be shifted onto Tesla. This guy killed people, and it should not have happened on his watch; however, the car should not have blown through a red light. For instance, my car has lane keep assist and you can drive while pretty much not touching the wheel but still paying attention. It has almost pulled me into the car next to me because a construction lane line had faded on the highway.

I'm not saying this guy isn't guilty; all I'm saying is maybe Tesla should shoulder some of the blame here. I'm probably in a very small minority in this thinking, however.

👤post_break🕑3y🔼0🗨️0

(Replying to PARENT post)

Ultimately yes. You are the driver of the car. It is involuntary manslaughter.

Even if it is a software error, even if the brakes stop working, even if x, y, or z (etc.).

It is always the driver's responsibility to accept the consequences of any foreseen or unforeseen outcome of any kind.

Depending on the unfortunate event, any mitigating circumstances might be taken into account.

👤makach🕑3y🔼0🗨️0

(Replying to PARENT post)

If the driver must always be in control of the vehicle (according to the manual and the Autopilot activation prompt), what good is this feature? Couldn't the driver sue Tesla for misleading marketing? They are selling a feature which is a lie, aren't they?

PS: I don't own any car and I have nothing against Tesla.

👤101008🕑3y🔼0🗨️0

(Replying to PARENT post)

I think we should immediately prohibit the use of the term Autopilot for Level 2 autonomous vehicles like current Teslas. The name is terribly misleading, well past gray-area marketing and into outright lying. Pilot assist would be a better term.
👤giorgioz🕑3y🔼0🗨️0

(Replying to PARENT post)

Yes. At the end of the day, the car is the tool, and just like with guns or knives, whoever uses the tool as an instrument for harm is at fault. I used to think one could argue "but X made guarantees, the software might be guilty, etc.", yet that argument is not logical, and ironically the legal framework explains it best. We almost never give a pass to criminals even when they're not lucid and are under the influence of factors that impair their judgement, so how come we debate whether or not the human who used the car is at fault?

I also don't think Tesla is necessarily liable here, an unpopular opinion which I also changed over time, considering that one needs to prove the car misbehaved. But even then, assuming the car truly misbehaved, making the case that the person is not liable is a big stretch. The title is still clickbait.

👤sebow🕑3y🔼0🗨️0

(Replying to PARENT post)

Does Tesla’s Autopilot normally stop at red lights? My Subaru’s equivalent doesn’t (unless there’s a car in front of me that also stops at the light).
👤learc83🕑3y🔼0🗨️0

(Replying to PARENT post)

Here is an interesting question I recently came across: does the driver of a Tesla need consent from their passengers before turning on FSD? I would argue yes, because I don’t want to unknowingly beta test software with my life, but I am curious what others think.
👤kamranjon🕑3y🔼0🗨️0

(Replying to PARENT post)

In my experience driving Teslas on Autopilot (a few dozen hours), you need to keep watching the road. The whole time. Especially on on/off ramps and around traffic lights. I once nearly watched a Tesla I was operating drive into a median on a highway off-ramp because the concrete was approximately the same color as the road. I slowed down to see how long it would take to detect the coming collision ... and if I hadn't been watching carefully, it would have been too late.

Yes, drivers must still pay attention in self-driving cars. Probably a future exists where that is no longer necessary, but that's not today.

👤vngzs🕑3y🔼0🗨️0

(Replying to PARENT post)

Of course he's liable. He should have been driving. Fully autonomous cars aren't yet legal for anyone to operate autonomously; the driver is still responsible.
👤Mikeb85🕑3y🔼0🗨️0

(Replying to PARENT post)

I think the real problem is the driver monitoring system. Openpilot nailed it with its camera-based eye monitoring. There is no need to touch the wheel, which makes it more relaxing, but you cannot take your eyes off the road, which is far more important than keeping your hands on the wheel. Both together would be even better. I'm still trying to figure out why this is not mandatory for all lane keeping assistants.
👤razemio🕑3y🔼0🗨️0

(Replying to PARENT post)

Yes. Also, I think Tesla should be on the hook for some of that blame, given their misleading marketing around the "autopilot" feature.
👤nemacol🕑3y🔼0🗨️0

(Replying to PARENT post)

I think the answer's yes, and it's something that would make me leery of relying on a car's automated systems for anything safety-related. Ultimately, if I'm going to be criminally liable in a crash caused by my car, I'd like to retain control of what's going on.

The idea that a software bug could land me in jail for manslaughter is not one I'm a fan of.

👤raesene9🕑3y🔼0🗨️0

(Replying to PARENT post)

Legislators who allow these things on the road are culpable too.
👤uptown🕑3y🔼0🗨️0

(Replying to PARENT post)

I feel like, with all the tech advantage Tesla claims to have, it should have some way to turn off Autopilot in situations where it's not applicable, instead of always saying "you're not supposed to use it like that." Just start beeping or something when you're not on the highway, and then shut Autopilot off.
👤sidibe🕑3y🔼0🗨️0

(Replying to PARENT post)

> realizes, ‘Hey, I’m the one that’s responsible,’” Kornhauser said. “Just like when I was driving a ’55 Chevy”

Um, no shit the driver’s responsible. This seems like a weird question to be confused about prior to Level 5, which Tesla’s 2022 “full self-driving” isn’t, and its 2019 version wasn’t either.

👤sokoloff🕑3y🔼0🗨️0

(Replying to PARENT post)

> Should Riad be found guilty, “it’s going to send shivers up and down everybody’s spine who has one of these vehicles and realizes, ‘Hey, I’m the one that’s responsible,’”

Doesn't send shivers up my spine. I know I'm responsible; that's why I keep a constant eye on the road and am prepared to take control at any time.

Though the more recent software is very good at stopping at red lights, I never trusted Autopilot anywhere but the highway, where Tesla itself says it should be used. Now that I'm in the "Full Self Driving" beta I trust it on regular roads a bit more, but I still 100% understand that I need to be watching the road. It augments my driving rather than replacing it (in its current state).

👤throwaway2016a🕑3y🔼0🗨️0

(Replying to PARENT post)

The premise of this article is that a trial involving a crash with a Level 2 system[1] is fundamentally different from one involving a Level 1 system, without ever making the case for that.

In both cases the driver is responsible and expected to monitor the road. Why would someone crashing while using e.g. a lane-keeping system be in a different position legally than someone using cruise control? Level 1 driving assistance has been in use for decades, and there's surely legal precedent to draw from.

1. https://www.synopsys.com/automotive/autonomous-driving-level...

👤avar🕑3y🔼0🗨️0

(Replying to PARENT post)

> “It certainly makes us, all of a sudden, not become so complacent in the use of these things that we forget about the fact that we’re the ones that are responsible — not only for our own safety but for the safety of others.”

Well, that is the thing here. There is no proper driver monitoring on these cars, which leaves both the Autopilot and FSD systems open to abuse and to crashes like this one, given that drivers have become too complacent and convinced that these cars drive themselves, which Tesla falsely advertises.

So yes. Guilty as charged. Both Autopilot and FSD allow drivers like this one to become complacent with the system.

👤rvz🕑3y🔼0🗨️0

(Replying to PARENT post)

How about the coder? The person who reviewed the code?

The person who assembled the sensor that has some kind of defect? Or the steering/braking mechanism?

Remember Toyota's accelerator-by-wire spaghetti code? There was no third-party oversight of that code, and what has changed today? Nothing, except even more complex code doing more complex and dangerous things.

At some point people are going to start victim-blaming pedestrians/cyclists for just "being there" when their car mows them over. I guess it already happens, but it will become far more common.

👤ck2🕑3y🔼0🗨️0

(Replying to PARENT post)

Autopilot is a free feature on Teslas, and is a simple lane assist system, akin to cruise control. It is not Full Self Driving. FSD is an expensive upgrade, and is supposed to do things like stop at lights.

The driver was on Autopilot, which explicitly warns you upon activation that it doesn't stop at lights.

Clickbait from a once great, now awful newspaper.

👤JPKab🕑3y🔼0🗨️0

(Replying to PARENT post)

No.

It's easy to blame the driver, until you consider whether the premise of a "partially automated self-driving car" is fit for the road at all. It is not possible to both "stay alert" to anywhere near the degree required for manual driving and let an automated navigation system take control.

If you sell something inherently dangerous to people under the premise that it is not dangerous provided they do something that is impossible (but not obviously impossible to most people), bad things are significantly more likely to happen. In other words, it is not only dangerous but misleading.

👤tomxor🕑3y🔼0🗨️0

(Replying to PARENT post)

If it isn't regulated, the politicians are guilty. They have to create laws and regulations in time. (Sorry for my poor English.)
👤jose-cl🕑3y🔼0🗨️0

(Replying to PARENT post)

“It’s a wake-up call for drivers,”

Well played.

👤Zigurd🕑3y🔼0🗨️0