I wouldn't be surprised if Elon considers it just. Most people consider it just that the owner of a revolutionary technology company is being held liable for their actions, even if it comes nearly a decade afterwards.
Regulations for big corporations are more like suggestions these days, so this may even feel "fair enough" to Elon in retrospect.
Or the tweet was bullshit, and Tesla will do what every company does and determine whether it is better to settle or not based on the financial cost/benefit and not the truth or validity of the claim.
> the defective design of the door latch system entrapping him in the vehicle
This was discussed furiously in the HN articles of the time; the latch apparently has a manual actuator, but how to use it is not obvious at all, especially in an emergency.
I also noticed the subtitle is "Settlement Comes After Automaker in April Struck Confidential Accord". That had to be deliberate.
Maybe ideally they should have mixed the two: a light pull does the electronic thing (rolls windows clear of weather stripping, etc), a hard pull does the manual thing.
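Very roughly, the behavior being proposed might look like the sketch below (hypothetical Python; the force thresholds, names, and the idea of reading handle force are invented for illustration, and a real fallback would be purely mechanical):

    # Hypothetical sketch of the two-stage handle idea; thresholds and names are made up.
    LIGHT_PULL_N = 10.0   # light-pull threshold, in newtons (assumed)
    HARD_PULL_N = 40.0    # hard-pull threshold, in newtons (assumed)

    def handle_pull(force_newtons: float, power_available: bool) -> str:
        """Pick a release path based on how hard the handle is pulled."""
        if force_newtons >= HARD_PULL_N:
            # Hard pull: skip the electronics and trip the mechanical latch directly.
            return "mechanical_release"
        if force_newtons >= LIGHT_PULL_N and power_available:
            # Light pull with power: normal electronic release
            # (roll the window clear of the weather stripping first, etc.).
            return "electronic_release"
        return "no_action"  # too light a pull, or no power on a light pull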
Tesla driver assist is just fine, thanks. Tesla surely followed all relevant software best practices, like MISRA, ISO-26262, etc and is in no way liable for poorly designed software that has been enabling fully self-driving vehicles since 2015 as was promised by the CEO.
It’s a little confusing, but this incident is not about self-driving or software (unless the latching system is software-controlled). If anything, it’s probably about the latching system, or how vulnerable the vehicle is to catching fire.
We may never know the truth but I’m not sure what Tesla is at fault here for or why they would settle. Twice the legal limit for alcohol (alleged) by the driver is very bad for the plaintiff.
Did you not read the article? They included info about an old case for background but this was about the Apple engineer who in 2018 was killed when his Tesla drove itself off the edge of a freeway and into a barrier at 71 mph.
This has nothing to do with driver assist as far as I can tell? It was a drunk driver that had her foot on the gas the whole time and made no attempt to brake.
They have the best self driving of any company in the world. What other car can you buy and send off to work as a robotaxi on your behalf? That is truly incredible and I never hear anyone talking about it.
You never hear anyone talking about it because it's not yet possible to send a Tesla off to work as a robotaxi on your behalf. It will be incredible if/when they're able to do that, assuming they're able to do it without any major incidents. And people will be talking about it endlessly, and rightfully so, once/if it comes to pass. I sure hope it happens because I'd be stoked to ride in one.
Lol. I know Musk eventually did produce FSD (for America), but the unmanned robotaxi thing just seems to have so many fundamental problems I can't imagine it ever working (with people's owned cars, as opposed to the Waymo model)
Exactly. If he hadn’t been under the influence then the door latch system wouldn’t have had a defective design
>The suit blamed the “propensity of the vehicle to catch fire, as well as the defective design of the door latch system entrapping him in the vehicle.”
> If he hadn’t been under the influence then the door latch system wouldn’t have had a defective design
So... Poe's Law here. I can't tell if this is a sarcastic comment pointing out that a defective design remains defective even if someone is drunk, or whether this is a serious comment implying it wasn't really defective in normal circumstances.
In any case, the person who was trapped-and-died was the passenger, and we don't know if or how-much they were drunk.
The driver survived... or else they're making a lawsuit from beyond the grave.
That wasn't what the case was about. It was about the propensity of the car to catch on fire and faulty door latch design that prevented the passenger who survived the impact from getting out.
> Tesla maintains there was nothing wrong with the car. It said the data event recorder showed that Speckman kept her foot on the accelerator pedal before the crash and never attempted to brake.
The family settling could be seen as support for this claim. If it really did accelerate randomly, that seems more like an NHTSA (or whoever) sort of thing than something that could be settled.
Nah: If anybody was actually holding Musk "ultimately at fault regardless of anything else", we wouldn't be talking about settling a product-safety lawsuit, but instead about a criminal trial for manslaughter.
Since that's not the case--and I don't think anyone has even suggested it needs to be--we can safely infer that there's already a high degree of nuance and splitting of different levels and layers of responsibility going on.
At the end of the day, "operator error"--even drunken operator error--is not enough to automatically negate all safety flaws.
I wonder how many of these suits they will have to settle (and how many people will die or get injured in the process) just to avoid judgments being used against them in their other proceedings over falsely advertising their cars as FSD (and charging extra specifically for it).
" My commitment:
- We will never seek victory in a just case against us, even if we will probably win.
- We will never surrender/settle an unjust case against us, even if we will probably lose. "
Just to do what he is asking, you would need a team like OJ had, but for each simultaneous case. He should start a law firm next.
And yet Elon Musk filed harassing lawsuits against his critics Media Matters for America and the Center for Countering Digital Hate.
Musk's SLAPP suits are contrary to his purported love of free speech. They are manifestly unjust.
The result of prioritizing design over function — and in this case, safety!
I love my Tesla but that company is horrible with making it apparent how to open doors. My passengers never know how to get in or out.
Worse, look at the Cybertruck... I swear their next car will only open the doors from the app.
That's the point. They're trying to hide something.
So it could be seen as an engineering shortcoming, even though the cause was someone driving drunk.
https://x.com/elonmusk/status/1593769469741932544
Sounds like there's something Tesla would not like being in news headlines if the case went to trial.
This company is a joke
Driving under the influence, what more is there to say?
I've never seen a company do this so incredibly quickly.