ado__dev · 2 years ago
Driver's fault 100%.

But Tesla is not blameless: marketing it as "Autopilot" and "Full Self-Driving", blah blah, gives people a false sense of security. I can't think of a worse problem to try to solve with AI. GPT hallucinates and gives a wrong fact? No biggie, just annoying. Tesla FSD hallucinates and runs over a child? A biggie.

GGO · 2 years ago
Exactly. Tesla shares responsibility for deceptively marketing something that can cause serious incidents. We require toy manufacturers to put clear labels saying a toy is not appropriate for certain ages, but we allow Tesla to market something as dangerous as Autopilot and FSD in a way that implies the driver is not needed.
JumpCrisscross · 2 years ago
> Tesla shares responsibility for marketing something deceptively that can cause serious incidents

The driver is criminally liable to the state, and civilly liable to the estate of the young man he killed. Tesla may be liable for money damages to the driver they misled. (Maybe the estate also has a civil claim against Tesla if it can show the marketing was grossly negligent.)

ahahahahah · 2 years ago
I would additionally apportion some blame to the YouTubers who cherry-pick FSD footage and make ridiculous claims about how good it is, especially those who also use defeat devices so they can do it with their hands off the wheel.
Brybry · 2 years ago
What if "Autopilot"/"FSD"/computer-assisted driving hallucinates and drivers using it run over children but at a lower rate than people who are not using computer-assisted driving run over children?

If the bar for progress in automobiles is perfection (and not just improvement), then the only answer is to use automobiles less or to have stricter training standards for drivers.

I do think there needs to be strong proof that it actually is better before liability can be avoided. And I agree Tesla's marketing is bad and should make Tesla liable.

MadnessASAP · 2 years ago
It comes down to the simple fact that somebody is going to hold the liability bag.

Tesla and similar manufacturers obviously are not pushing to take on that liability, and courts/legislatures aren't particularly interested in putting it on the manufacturers either. So that leaves the drivers to try to offload the liability from themselves.

Unfortunately drivers are not particularly well equipped to push that liability off themselves, so there it stays. Insurance companies are better equipped for that sort of thing, but they're only interested in the financial liability. The criminal liability is definitely not going anywhere anytime soon.

ado__dev · 2 years ago
I think it depends on how the car ran over the children (or anyone, for that matter). If it was an unavoidable freak accident, then you can't really blame the car.

But if the car literally didn't see a person in front of it (where a human reasonably would 99.9999% of the time) because its cameras malfunctioned or the vision model read it as something else, then those cars should not be on the road.

notyourwork · 2 years ago
It's not called "almost autonomous driving"; it's named incorrectly and, as a result, falsely marketed.
Zambyte · 2 years ago
> GPT hallucinates and gives a wrong fact. No biggie, but annoying.

Sometimes also a biggie: https://www.vice.com/en/article/pkadgm/man-dies-by-suicide-a...

resource_waste · 2 years ago
In a few hundred years, we are going to decide that people are allowed to have agency over their own lives.

We are just too primitive/traditional to understand.

Anyway, after a 6-month health issue, I will literally laugh at people who think there is some value in enduring pain. Children.

breadwinner · 2 years ago
If Tesla is not blameless, then it is not the driver's fault 100%. Total blame cannot exceed 100%.
throwup238 · 2 years ago
It's not like the juries of two different cases get together and decide how to apportion the penalty.
phrop · 2 years ago
But it can, and often does, exceed 100%. They do not "share" the blame; both are fully at fault (IMHO).
mjw1007 · 2 years ago
There is no law of conservation of blame. Why would there be?
LZ_Khan · 2 years ago
I feel like 100% is not meant to be taken literally here.
kube-system · 2 years ago
If there was more than one tort or crime that happened, then yes, there could be.
nix0n · 2 years ago
There are plenty of situations where total blame adds up to less than 100%, where multiple people make honest mistakes.

This situation is the opposite of that, because multiple people were knowingly reckless.

Edit: By "multiple people" I mean the driver and Elon.

stop50 · 2 years ago
Ask some people on death row. 3-4 people were arrested and sentenced to death for the exact same crime.
tantalor · 2 years ago
<ianal> Clearly FSD is defective if it correctly avoids plowing into motorcycles most of the time but fails in some low % of cases. That's a products-liability lawsuit, and some attribution of the fault in this case (idk, 1-10%?).
csa · 2 years ago
> Clearly FSD is defective

FSD is not Autopilot.

FSD does a pretty good job of pedestrian and cyclist identification. It’s supposed to.

Autopilot (which is what I use most)… I dunno… I’ve never had a problem in SF, LA, or Vegas.

switch007 · 2 years ago
Does the US not regulate marketing? They’ve been lying for years now
belter · 2 years ago
How many times did Andrej Karpathy sit smiling uncomfortably at Elon Musk's side, offering no comment, while Musk did his loony dance about FSD? I wonder if he feels any responsibility.

https://youtu.be/BFdWsJs6z4c

https://youtu.be/i5tjTACY_3Q

FeloniousHam · 2 years ago
No one who has driven a Tesla with "Autopilot" for 5 minutes is under the delusion that it is a level 5 autonomous driving system.

It is amazing and useful, and assuredly makes me a better driver than I am without it.

I am boggled at the scolds who think the "deceptive" branding contained _in a single word_ is sufficient to override the lived experience of driving with this fantastic (if flawed?) tool.

belter · 2 years ago
Clearly an issue they will solve in the next 4 months... /s

"Tesla will unveil a robotaxi on August 8, according to Musk" - https://www.engadget.com/tesla-will-unveil-a-robotaxi-on-apr...

frognumber · 2 years ago
I'd prefer the article were titled "manslaughter" rather than "homicide." Although in legal jargon homicide encompasses manslaughter, murder, and a few other things, to many people homicide is synonymous with murder.
spicybbq · 2 years ago
Journalists normally report the charge as written in court documents. Washington state appears to have "vehicular homicide" defined as a specific offense, but not "vehicular manslaughter". It would not be more precise to report an incorrect name for the charged offense.
fasthands9 · 2 years ago
I was confused by this too.

In the article they do clarify that it is "vehicular homicide," which is legally the same thing as "vehicular manslaughter."

A very serious crime, though oftentimes people walk away from this with little or no jail time.

frognumber · 2 years ago
It's not an unreasonable outcome. The goal of the criminal system is to reduce crime, not random vengeance.

I can certainly think of a time when I was driving when I missed another car in my blind spot and almost caused an accident, or when I saw someone in a car distracted by a toddler, or similar. Those didn't lead to accidents, but they might have if someone were less lucky.

The ways a car can kill someone range from driving through a red light at 100 MPH, to an illegal U-turn, to a stupid mistake, to a completely random fluke of circumstance.

On one end of the spectrum, there should be prison time, regardless of whether an accident actually happens. On the other end, insurance should pay damages, but I'm not sure what good the criminal system can do in terms of deterrence.

If a driver is already doing their best to be safe, but slips up, or even isn't doing their best but isn't being unreasonable, criminal penalties don't seem like the right outcome.

mattbillenstein · 2 years ago
Elon's refusal to adopt lidar - statistically probably fine, but anecdotally it could turn out very badly for any one person, which is a hard thing to swallow if it's you...
Hamuko · 2 years ago
Tesla's direction on sensors seems quite ass backwards to me. Didn't they even get rid of parking sensors and replace them with AI vision?
lb4r · 2 years ago
I'm not saying what the better option would be (because I don't know), but many people approach the problem from a very myopic point of view.

Adopting Lidar would of course provide Tesla with higher-quality input for their self-driving model. But the quality of the input isn't the whole equation; you need to process it as well. In other words, adopting Lidar would incur costs not only on the hardware side but also on the software side, which of course would result in more expensive cars. More expensive cars means fewer cars sold, and fewer cars sold means less data, which in turn means less input.

Does this result in a worse model? Again, I don't know, but I do know that the issue is more complicated (and not only because of the reasons I mentioned here) than many people seem to think.

mattbillenstein · 2 years ago
It makes a lot of sense if you're trying to churn out more profit per unit - lower costs - but they're at the mercy of a sour market atm on the other side of it.
mensetmanusman · 2 years ago
Lidar is blinded by reflective signage.
joshspankit · 2 years ago
And humidity
kibitzor · 2 years ago
Related video "Tesla Autopilot Crashes into Motorcycle Riders - Why?"[0], summarized to: vision used by Tesla seems to process motorcycles differently, and may be incorrectly "assuming" the closer spaced brake lights on a motorcycle is actually a far away car.

More details on the homicide here[1], which show the crash happened during daylight hours and that the bike resembles a sport bike. These are different conditions from my referenced video (night collisions with cruiser-style motorcycles), but I suspect similar incorrect assumptions by Tesla vision happened.

[0]https://www.youtube.com/watch?v=yRdzIs4FJJg

[1] https://www.king5.com/article/traffic/traffic-news/tesla-on-...
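
To make the lamp-spacing idea concrete: if a camera-only system inferred distance from the angular separation of a pair of tail lights while assuming car-width spacing, a motorcycle's narrow lights would look much farther away than they are. This is only an illustrative sketch of the geometry, not Tesla's actual algorithm, and all the numbers are made up:

    def implied_distance(assumed_lamp_spacing_m, angular_separation_rad):
        # Small-angle approximation: distance ~= physical lamp spacing / angle subtended.
        return assumed_lamp_spacing_m / angular_separation_rad

    # Illustrative numbers only (not from any Tesla spec):
    moto_lamp_spacing = 0.25   # metres between a motorcycle's tail lights
    true_distance = 20.0       # metres away
    angle = moto_lamp_spacing / true_distance  # angle the lights actually subtend

    # If the system assumes car-width lamp spacing (~1.5 m), the same angle
    # implies the vehicle is ~120 m away instead of 20 m.
    print(implied_distance(1.5, angle))

Under that (assumed) failure mode, the car would think it has far more following distance than it really does.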

bryanlarsen · 2 years ago
Another theory I've heard is that the driver was holding down the accelerator to prevent phantom braking. If this is true, Tesla will likely respond fairly quickly to prove it wasn't them. So the longer they don't, the less likely this theory is.
AlexandrB · 2 years ago
> Another theory I've heard is that the driver was holding down the accelerator to prevent phantom braking.

Would be interesting to know how commonly this workaround is applied by Tesla owners. If it's common enough, it seems like a case where a feature that's merely unreliable becomes a safety issue due to second-order effects. Echoes of Therac-25[1].

[1] https://en.wikipedia.org/wiki/Therac-25

lamontcg · 2 years ago
Keep in mind that so far we only have the driver's assertion that they were using Autopilot at the time. The driver may be attempting to shift blame.
lnsru · 2 years ago
A brand new Model Y with the latest software did 4 really dangerous phantom-braking stunts. I engaged the system 5 times in total. It's called Enhanced Autopilot. I can't understand how people trust this kind of system with their lives. Maybe in the USA it works much better than elsewhere, but I will never ever turn it on again. For the record, I didn't buy it; I got a 3-month trial for using a referral link.
jupp0r · 2 years ago
Independent of the incident itself, the article made it sound like the driver would have benefited from hiring a lawyer before making statements to police.
londons_explore · 2 years ago
Sounds like he passed the breathalyzer, yet still admitted to 'having a drink' before driving...

What a fool...

AlexandrB · 2 years ago
If he's under the legal limit, does it matter if he had a drink?
molticrystal · 2 years ago
There have been various discussions over the years of adopting and modernizing the model of Equine Law, which dealt with injuries from horses & carriages, another type of autonomous / semi-autonomous vehicle.

In this case, that means resolving whether the people behind the vehicle share some of the blame.

An excerpt from: https://www.forbes.com/sites/rahulrazdan/2020/01/07/horses-e...

>How does the legal system adapt to new technologies? Generally, this is done by constructing new legal theories that should not conflict with older models and also have characteristics of stability and rationality. What might be the potential legal theories for Autonomous Vehicles? Here are the current candidates:

>Negligence: Today, a typical example includes impaired driving. An impaired AV?

>Negligent Entrustment of Vehicles: Here the driver was negligent, but the owner is liable because they should not have trusted the driver. Can you be found negligent if you trust your Tesla AutoDrive?

>Res Ipsa Loquitur: In this theory, (“the thing that speaks for itself”) the accident would not have occurred if not for some action from the plaintiff. By applying this logic, the plaintiff caused the accident because they became startled by an AV homing features because it was surprising.

>Product Liability and Warranty: Are there implied warranties associated when you buy an AV? Can it be proven that some AV vendors are safer than others? If so, do all AV vendors have to come to some standard?

>At this point, it is not clear which theory may apply. However, we may gain insight from a very old body of law — Equine Law. Horses were the original autonomous vehicles and for many centuries, the court system had to deal with horse-related accidents.

And while probably not applicable to a guy texting, an earlier paper from 2012 explores an interesting aspect of horses in a frightened state, which is akin to the vehicle making its own decision in a crisis scenario:

"Of Frightened Horses and Autonomous Vehicles: Tort Law and its Assimilation of Innovations"

https://digitalcommons.law.scu.edu/cgi/viewcontent.cgi?artic...
