guywithahat · 22 days ago
> The company must pay $329 million in damages to victims and survivor, including compensatory and punitive damages.

> A Tesla owner named George McGee was driving his Model S electric sedan while using the company’s Enhanced Autopilot, a partially automated driving system.

> While driving, McGee dropped his mobile phone that he was using and scrambled to pick it up. He said during the trial that he believed Enhanced Autopilot would brake if an obstacle was in the way. His Model S accelerated through an intersection at just over 60 miles per hour, hitting a nearby empty parked car and its owners, who were standing on the other side of their vehicle.

On one hand I don't think you can apply a price to a human life, but on the other 329 million feels too high, especially since Tesla is only partially to blame, it wasn't FSD, and the driver wasn't using the system correctly. Had the system been used correctly and Tesla been assigned more of the blame, would this be a 1 billion dollar case? This doesn't hold up logically unless I'm missing something; certainly the victim wouldn't be getting fined 329 million if it was decided to be his fault for not looking at the road

furyofantares · 22 days ago
> This doesn't hold up logically unless I'm missing something; certainly the victim wouldn't be getting fined 329 million if it was decided to be his fault for not looking at the road

I hope we haven't internalized the idea that corporations should be treated the same as people.

There's essentially no difference between a $3M and a $300M fine against most individuals, but $3M means very little to Tesla. If you want Tesla's behavior to change - and other automakers take notice and not repeat the behavior - then the fine must be meaningful to them.

That's another difference - fining an individual is not going to change risks much; one individual's behavior changing is not that meaningful compared to Tesla's behavior changing. And it's not like a huge fine is gonna make a difference in other drivers deciding to be better, whereas other automakers will notice a huge fine.

ratelimitsteve · 22 days ago
>I hope we haven't internalized the idea that corporations should be treated the same as people.

Only when it comes to rights. When it comes to responsibilities, corporations stop being people and go back to being amorphous, abstract things that are impossible to punish.

stouset · 22 days ago
I agree that Tesla should pay punitive damages. And the size of the punitive damages must be enough to discourage bad behavior.

I'm not necessarily sure the victim(s) should get all of the punitive damages. $329 million is a gargantuan sum of money; it "feels" wrong to give a corporation-sized punishment to a small group of individuals. I could certainly see some proportion going toward funding regulatory agencies, but I fear the government getting the bulk of punitive damages would set up some perverse incentives.

I think in the absence of another alternative, giving it to the victim(s) is probably the best option. But is there an even better possible place to disburse the funds from these types of fines?

thelastgallon · 22 days ago
$300M means very little to Tesla. The stock didn't even drop a bit (other than the usual market fluctuations today). Perhaps $4.20B or $6.90B would've been meaningful. Elon notices these numbers.
ornornor · 21 days ago
To add, this is also punitive for endangering countless other road users that aren’t suing in this particular instance.

Deleted Comment

bko · 22 days ago
> If you want Tesla's behavior to change - and other automakers take notice and not repeat the behavior - then the fine must be meaningful to them.

What behavior do you want them to change? Remove FSD from their cars? It's been nearly 10 years since it was released, with over 3bn miles driven. There's one case where someone died while the driver was fetching his cell phone. You would think that if it was really dangerous, people would be dying in droves.

This is obviously targeted and the court system should not be playing favorites or going after political opponents

colingauvin · 22 days ago
Tesla was found partially liable for this crash. The reason they were liable was they sold something claiming (practically speaking) that it could do something. The customer believed that claim. It failed to do that thing and killed people.

So the question then is - how much did Tesla benefit from claiming they could do this thing? That seems like a reasonable starting point for damages.

onlyrealcuzzo · 22 days ago
And the fine needs to be high enough to prevent them from just saying - oh, well, we can make money if we keep doing it.

If you could only fine a person for committing murder, you wouldn't fine a billionaire $5m and then hope he wouldn't go on killing everyone he'd rather have dead than $5m.

marcosdumay · 22 days ago
The US justice system uses punitive damages very heavily. And Tesla should absolutely get some punishment here.

In most other places you'd see it paying hundreds of millions in fines and a few million in damages.

IdSayThatllDoIt · 22 days ago
I imagine the jury heard "autopilot" and then assigned blame to the company that called it that.

"[Plaintiffs] claimed Tesla’s Autopilot technology was flawed and deceptively marketed."

close04 · 22 days ago
> I imagine the jury heard "autopilot" and then assigned blame to the company that called it that.

It's only fair. If the name was fine when it was attracting buyers who were misled about the real capabilities, it must be fine when it's doing the same to jurors.

There's another, similar argument to be made about the massive amount awarded as damages, which may be lowered on appeal. If people (Tesla included) can make the argument that when one car learns something or gets an "IQ" improvement they all do, then it stands to reason that when one car is dangerous they all are (or were, even if only for a time). There are millions of Teslas on the road today, so proportionally it's a low amount per unsafe car.

Hamuko · 22 days ago
"Autopilot" isn't even the most egregious Tesla marketing term since that honour goes to "Full Self-Driving", which according to the fine text "[does] not make the vehicle autonomous".

Tesla's self-driving advertising is all fucking garbage, and then some George McGee browses Facebook while believing that his car is driving itself.

ratelimitsteve · 22 days ago
do you think they heard "autopilot" or "full self driving"?
ajross · 22 days ago
As gets pointed out ad nauseam, the very first "cruise control" product in cars was in fact called "Auto-Pilot". Also, real "autopilot" systems in aircraft (where the term of art comes from!) aren't remotely supervision-free.

This is a fake argument (post hoc rationalization): it invents a meaning for a phrase that seems reasonable but has never been rigorously applied, and demands that one speaker, and only that one speaker, adhere to the ad hoc standard.

bko · 22 days ago
> On one hand I don't think you can apply a price to a human life

Yes, although courts do this all the time. Even if you believe this was solely manufacturer error, there are precedents. Consider the General Motors ignition switch recalls, which affected 800k vehicles and resulted in 124 deaths.

> As part of the Deferred Prosecution Agreement, GM agreed to forfeit $900 million to the United States.[4][51] GM gave $600 million in compensation to surviving victims of accidents caused by faulty ignition switches

So about $5m per death, plus 300m to the government. By that benchmark, 329m seems excessive for one death, even if you believe Tesla was completely at fault. And the fact that this is the only such case (?) since 2019 suggests the fault isn't really on the manufacturer's side.

https://en.wikipedia.org/wiki/General_Motors_ignition_switch...

panarky · 22 days ago
If you make a manufacturing error without intentionally deceiving your customers through deceptive naming of features, you have to pay millions per death.

If you intentionally give the feature a deceptive name like "autopilot", and then customers rely on that deceptive name to take their eyes off the road, then you have to pay hundreds of millions per death.

Makes sense to me.

thorum · 22 days ago
The product simply should not be called Autopilot. Anyone with any common sense could predict that many people will (quite reasonably) assume that a feature called Autopilot functions as a true autopilot, and that misunderstanding will lead to fatalities.
bangaladore · 22 days ago
> feature called Autopilot functions as a true autopilot

What's a "true autopilot"? In airplanes, autopilot systems traditionally keep heading, altitude, and speed, but pilots are still required to monitor and take over when necessary. It's not hands-off or fully autonomous.

I would argue you are creating a definition of "autopilot" that most people do not agree with.

dzhiurgis · 22 days ago
Anyone who has used it knows its limitations. IDK, maybe in 2019 it was different tho; now it's full of warnings that make it barely usable when distracted. Ironically, you are better off disabling it and staring into your phone, which seems to be what regulators actually want.

And by the way, what is a true autopilot? Is the average Joe a 787 pilot who's also an autopilot master?

Funny that pretty much every car ships with autosteer now. The ones I've used didn't seem to have many of the warnings, explanations, disclaimers or agreements that pundits here assume they should.

nullc · 22 days ago
A true autopilot is a system on a boat or aircraft that keeps it on a set heading. ISTM in this case that's what the autopilot did.
realusername · 22 days ago
There are two conflicting goals here: Tesla's marketing department would really like to make you think the car is fully autonomous, for financial reasons (hence "Autopilot" and "Full Self-Driving"), and then there's Tesla's legal department, which would prefer to blame somebody else for their poor software.
motorest · 22 days ago
> On one hand I don't think you can apply a price to a human life, but on the other 329 million feels too high, (...)

Let me stop you right there. That's not how damages work.

Damages have two goals: compensate victims, and dissuade offenders from repeating the same mistakes. The latter involves punishments severe enough to discourage repeat offenses.

That's where these high values come from. They are intended to force the likes of Tesla not to ignore the lives they are ending through their failures.

If damages were low, the likes of Tesla would do absolutely nothing, absorb them as an operational expense, and continue to cause deaths while claiming they are unavoidable.

Once the likes of Tesla are forced to pay significant sums in damages, they suddenly find the motivation to take their design problems seriously.

lblume · 22 days ago
I tend to agree; however, the government is not an unincentivized incentivizer. By being able to impose such fines, the government is potentially itself incentivized not to prevent these accidents, because they generate this kind of revenue.

There are ways to mitigate this, such as forcing the government to use these revenues in a way that is relevant to the issue at hand, e.g. creating safety jobs, strengthening regulatory authorities, or something else.

You could also say that the amount is insignificant, but that could of course change with every lawsuit, and it accumulates. Or one could speculate that the interests are not really monetarily aligned at all (e.g. prisons), or that the judicial system is independent enough to stop these incentives from propagating. I think it is still necessary to consider these motives and to try to align them, in a controlled way, between the relevant actors.

Dylan16807 · 22 days ago
> Damages have two goals: compensate victims, and dissuade offenders

Let me stop you right there. The compensatory damages alone were 129 million, and most of that was charged to the driver; no corporate boost there.

blargey · 22 days ago
The fault of an individual can reasonably be constrained to the one death they caused and were prosecuted for. The fault of "Autopilot by Tesla", a system that was marketed and deployed at scale, cannot.

And if you want to draw parallels with individuals: an individual's driver's license would be automatically suspended or revoked when they were found at fault for manslaughter. Would you propose a minimum 1-3 year ban on Autopilot-by-Tesla within the US, instead?

HWR_14 · 22 days ago
329 million is not just compensatory damages (the value of the human life) but also punitive damages. That number floats up to whatever it takes to disincentivize Tesla in the future.
recursivecaveat · 22 days ago
*Punitive damages*. From another article: "They claimed Tesla either hid or lost key evidence, including data and video recorded seconds before the accident." If Tesla is destroying evidence then yeah they ought to feel the pain, and those personally responsible should be charged as well. If you make it cheaper to evade the law than comply, what good is the court at all?
somerandomqaguy · 22 days ago
I just did some googling around:

> The case also included startling charges by lawyers for the family of the deceased, 22-year-old, Naibel Benavides Leon, and for her injured boyfriend, Dillon Angulo. They claimed Tesla either hid or lost key evidence, including data and video recorded seconds before the accident.

> Tesla has previously faced criticism that it is slow to cough up crucial data by relatives of other victims in Tesla crashes, accusations that the car company has denied. In this case, the plaintiffs showed Tesla had the evidence all along, despite its repeated denials, by hiring a forensic data expert who dug it up. Tesla said it made a mistake after being shown the evidence and honestly hadn’t thought it was there.

-https://lasvegassun.com/news/2025/aug/01/jury-orders-tesla-t...

Nothing enrages a judge faster than an attempt to conceal evidence that a court has ordered to be turned over during discovery. If that's what happened here, I suspect the punitive damages have as much to do with disregard for the legal process as with the case itself.

polotics · 22 days ago
329 million too high? If you had the money and handing it over would save your life, would you rather keep the money as a corpse?
thfuran · 22 days ago
So wrongful death liability should be infinite, or maybe just equal to the money supply (pick one, I guess)?
abeppu · 22 days ago
I think the conceptually messed-up part is that when such an award includes separate components for compensatory damages and punitive damages, the plaintiff receives the punitive damages even if they're part of a much broader class that was impacted by the conduct in question. E.g. how many people's choice to purchase a Tesla was influenced by the deceptive marketing? How many other people had accidents or suffered some damages? I think there ought to be a mechanism where the punitive portion rolls into the beginning of a fund for the whole class, which could be used to defray some costs of bringing a larger class action, or disbursed directly to other affected parties.
MBCook · 22 days ago
So if I file a lawsuit and prove there is a small possibility my toaster can cut my arm off, because that's what it did to me, and win $400,000,000, I should only get $400 if it turns out they sold 1 million units?

It’s not a class action lawsuit. If they want their cash they should sue too. That’s how our system works.

bryant · 21 days ago
If there's an argument to be made that the damages are too high (I'm not making this argument, to be clear), it might be with the compensatory damages set by the jury at $129 million, but that raises the question of what the cost of a human life is. Money won't bring someone back, but if you're in a courtroom and forced to calculate this figure, it's better to just overestimate IMO.

But the punitive damages at $200 million are appropriate: it's the amount the jury thought was needed to discourage Tesla's behavior.

fuoqi · 22 days ago
>I don't think you can apply a price to a human life

Not only can we, it's also done routinely. For example, see this Practical Engineering video: https://www.youtube.com/watch?v=xQbaVdge7kU

pengaru · 22 days ago
If anyone's confused about what to expect of autopilot and/or FSD, it's Tesla's doing and they should be getting fined into oblivion for the confusion and risks they're creating.
beambot · 22 days ago
On the flip side: penalties should be scaled relative to one's means so that the wealthy (whether people or corporations) actually feel the pain & learn from their mistakes. Otherwise penalties for the wealthy are like the cost of a cup of coffee for the average Joe -- just a "cost of doing business."

I'm also a big proponent of exponential backoff for repeat offenders.
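A toy sketch of how those two ideas could combine, with completely made-up numbers (the 1% base rate and 2x backoff aren't from any actual statute or proposal):

    # Hypothetical fine: a fixed fraction of net worth, doubling per prior offense.
    def penalty(net_worth, prior_offenses, base_rate=0.01, backoff=2.0):
        return net_worth * base_rate * backoff ** prior_offenses

    # A $1T company's first three offenses: $10B, $20B, $40B.
    for n in range(3):
        print(f"offense {n + 1}: ${penalty(1e12, n):,.0f}")

The point being that the base scales with means and the multiplier scales with recidivism.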

dzhiurgis · 22 days ago
Let's fine Apple too, since they allow smartphones to be used while driving.
scoofy · 22 days ago
It's the sum of compensatory damages and punitive damages.

Deleted Comment

1vuio0pswjnm7 · 19 days ago
"This doesn't hold up logically unless I'm missing something, certainly the victim wouldn't be getting fined 329 millioon if it was decided to be his fault for not loooking at the road"

There was no "329 million fine"

There was (a) a 59 million compensatory damages award to the representative of the estate of the deceased and (b) a 70 million compensatory damages award to her boyfriend, who survived

The punitive damages were likely the result of Tesla's misconduct in deliberately concealing evidence, not its percentage of fault in causing the accident

HN front page: https://news.ycombinator.com/item?id=44787780

Why would Tesla conceal evidence? That question is left as an exercise for the reader

Indeed, the HN commenter missed several things

Dead Comment

Simulacra · 22 days ago
It's sort of like the Prius acceleration debacle we had: people always want to blame the car and not their own actions.
SilverElfin · 22 days ago
> While driving, McGee dropped his mobile phone that he was using and scrambled to pick it up. He said during the trial that he believed Enhanced Autopilot would brake if an obstacle was in the way. His Model S accelerated through an intersection at just over 60 miles per hour, hitting a nearby empty parked car and its owners, who were standing on the other side of their vehicle.

Hard for me to see this as anything but the driver’s fault. If you drop your phone, pull over and pick it up or just leave it on the floor. Everyone knows, and the car tells you, to pay attention and remain ready to take over.

naet · 22 days ago
The argument is that if the driver had been in a different vehicle he would have done just that, pulled over and picked it up, but because he believed the Tesla was capable of driving safely on its own, he didn't.

Normally I turn the steering wheel when I want to turn my car. If you sold me a car and told me it had technology to make turns automatically without my input then I might let go of the wheel instead of turning it, something I would never have done otherwise. If I then don't turn and slam straight into a wall, am I at fault for trusting what I was sold to be true?

If the driver has heard that their Tesla is capable of autonomous driving, and therefore trusts it to drive itself, there may be a fair argument that Tesla shares in that blame. If it's a completely unreasonable belief (like me believing my 1998 Toyota is capable of self driving) then that argument falls apart. But if Tesla has promoted their self driving feature as being fully functional, used confusing descriptions like "Full Self-Driving", etc, it might become a pretty reasonable argument.

ninalanyon · 22 days ago
Every time I engage Autopilot in my Model S it admonishes me with a notice in the instrument cluster that I am to keep my hands on the wheel. If I don't make it clear to the car that I am there and holding on, by applying a little rotational force to the wheel at least every fifteen seconds, the car will remind me.

So how does one conclude that the car is capable of driving itself? Or is the version of Autopilot in the car in question different in this respect?

Autopilot is not autonomous driving and isn't marketed as such; Full Self Driving (FSD) is an extra cost option.

ghusto · 21 days ago
They also said that by putting his foot down on the accelerator, he overrode the feature. He might say he didn't know that's how it worked, but then he's even more at fault for performing such a dangerous action without knowing how the thing he thought would save him was supposed to be operated.

Deleted Comment

dzhiurgis · 22 days ago
Except the driver already accepted liability.

Also, this doesn't hold water today, as all new cars have some basic autosteer.

m463 · 22 days ago
The Model S has terrible phone docks. Don't get me started on the cupholders; I'll bet people have drink mishaps all the time that affect driving.

I'm actually kind of serious about this - keeping people's stuff secure and organized is important in a moving car.

I'm surprised the touchscreen controls and the removal of stalks aren't coming under more safety scrutiny.

With the new cars without a PRND stalk, how can you quickly reverse the car if you nose out too far and someone is coming from the side? Will the car reverse, or go forward into danger?

dzhiurgis · 22 days ago
What Tesla touchscreen controls are crucial for driving? FWIW, climate is a quick little swipe, arguably easier than turning a dial.
SoftTalker · 22 days ago
And why was his mobile phone in his hand to drop, if he was driving? Most states have laws against mobile device usage while driving, and it was never a responsible thing to do even before the laws were enacted.
MBCook · 22 days ago
Perhaps he thought it was safe. After all, he had autopilot.
nielsbot · 22 days ago
Sure--dangerous and wrong. Despite that, Autopilot was driving at the time.
izacus · 22 days ago
The driver at that time was Tesla Autopilot. So yeah, driver's fault, as the jury said.
tantalor · 22 days ago
1/3 Tesla's fault, 2/3 the operator's
SilverElfin · 22 days ago
That sounds indistinguishable from “The driver at the time was the motor”
tr81 · 22 days ago
Why would you pull over when you paid top dollar for Autopilot?
ineedasername · 22 days ago
Maybe he would have pulled over if the car's capabilities hadn't been oversold. Two entities did stupid things: 1) the person, by not waiting to pull over because of Elon Musk's false claims, and 2) Tesla, via Elon Musk making those false claims.

It passes a classic “but for…” test in causality.

v5v3 · 22 days ago
The article has been updated (correction) to say Tesla is liable for a portion of the damages, not all of them.

33% liable

https://www.cnbc.com/2025/08/01/tesla-must-pay-329-million-i...

hshdhdhj4444 · 22 days ago
33% of the compensatory damages and 100% of the punitive damages, according to the article. So a total of about $242 million.
treetalker · 22 days ago
The case number appears to be 1:21-cv-21940-BB (S.D. Fla.).

I practice in that court regularly. Beth Bloom has a reputation as a good trial judge, so I'm somewhat skeptical of Tesla's claims that the trial was rife with errors. That said, both the Southern District and the Eleventh Circuit are known to frequently and readily lop off sizable chunks of punitive damages awards.

The real battle is just beginning: post-trial motions (including for remittitur) will be due in about a month. Then an appeal will likely follow.

numitus · 21 days ago
AI system manufacturers want to sell their products by advertising superhuman reliability, but they don't want to take responsibility for any mistakes. I should mention that I don't have any Teslas around me, but a friend of mine claimed back in 2017 that the car was capable of fully autonomous, safe driving without human intervention. It's interesting how advertising can distort the essence of a technology in people's eyes.
jjaksic · 21 days ago
My wife and I have two Teslas, a HW3 and a HW4. Even late last year, FSD 12.5 was nowhere close to being able to drive safely. Any non-straightforward situation (like merging during rush hour) would throw it off, so critical interventions were required at least daily.

Starting with FSD 13 on HW4, which came out last December, it's improved dramatically, and since then in my case it hasn't needed a single critical intervention. I think 12.6 on HW3 is also quite good.

The caveat is that we live in the Bay Area, which has an abundance of Tesla training data. Elsewhere I've heard the experience isn't as good. And of course, even in the Bay Area the reliability needs to get a few orders of magnitude higher to be suitable for fully unsupervised self-driving.

BonoboIO · 22 days ago
Tesla tried to hide evidence … the jury probably did not like being lied to by Tesla.

F** around and find out

> The case also included startling charges by lawyers for the family of the deceased, 22-year-old, Naibel Benavides Leon, and for her injured boyfriend, Dillon Angulo. They claimed Tesla either hid or lost key evidence, including data and video recorded seconds before the accident.

> Tesla has previously faced criticism that it is slow to cough up crucial data by relatives of other victims in Tesla crashes, accusations that the car company has denied. In this case, the plaintiffs showed Tesla had the evidence all along, despite its repeated denials, by hiring a forensic data expert who dug it up. Tesla said it made a mistake after being shown the evidence and honestly hadn’t thought it was there.

https://lasvegassun.com/news/2025/aug/01/jury-orders-tesla-t...

jayess · 22 days ago
"Schreiber acknowledged that the driver, George McGee, was negligent when he blew through flashing lights, a stop sign and a T-intersection at 62 miles an hour before slamming into a Chevrolet Tahoe that the couple had parked to get a look at the stars."
mort96 · 22 days ago
Probably the reason why the driver was assigned 66% of the blame. Did you have a point?
BonoboIO · 22 days ago
Does that change the shitty behavior of Tesla?
mike-cardwell · 22 days ago
> Tesla Model S in Autopilot mode allegedly came to a T-intersection and, failing to see that the roadway was ending, kept his foot on the accelerator

Autopilot requires you to have your foot on the accelerator? That seems weird to me.

joshuanapoli · 22 days ago
Normally, Autopilot does not require your foot on the accelerator. Pressing the accelerator overrides it, making the car move ahead where Autopilot decided to stop. At the time, Autopilot would always stop at an intersection, and the driver was supposed to move ahead when it was clear.
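In other words, something like this arbitration (purely my guess at the semantics being described, not Tesla's actual logic):

    # Illustrative only: driver pedal input takes priority over Autopilot's
    # decision to slow or stop, e.g. at an intersection.
    def commanded_accel(autopilot_accel, driver_pedal_accel):
        if driver_pedal_accel > 0:
            return driver_pedal_accel  # pedal override wins
        return autopilot_accel         # otherwise follow Autopilot

So with a foot resting on the pedal, the car keeps accelerating even where Autopilot would have stopped.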
aziaziazi · 21 days ago
For those not aware of the circumstances (like me), here's Tesla's defence:

> this driver was solely at fault because he was speeding, with his foot on the accelerator—which overrode Autopilot—as he rummaged for his dropped phone without his eyes on the road

There’s lots to blame in auto makers « security marketing » and phone addiction but it seems obvious that driving a 1ton+ vehicle while not constantly looking at the road can lead to bad outcomes.

I’m all in for (mass surveillance) onboard eye tracking. Make it optional with 50% bonus on your car insurance and driving state tax. I see many, many drivers every day that are looking at their phone in very inappropriate moments like intersections and line changes.

drewbeck · 21 days ago
If someone wants the circumstances they should read the article, not Tesla’s press release. Here’s what the jury said:

… while McGee was two-thirds responsible for the crash, Tesla also bore a third of the responsibility for selling a vehicle "with a defect that was a legal cause of damage"

ghusto · 21 days ago
That doesn't give the circumstances, but the judgement. The original commenter gave the circumstances.
bryant · 21 days ago
Do cars normally allow people to prevent emergency braking with the throttle depressed? I haven't actually tried this for obvious reasons, but if their defense is that the safety mechanisms were disengaged with the throttle being fully depressed...

(Clarified my comment to "prevent" from "override" since overrides broadly exist - per jeroenhd's comment - but it seems in this case the argument was that the feature never engaged)

jeroenhd · 21 days ago
Many automatic safety features do allow user overrides, either by braking (hard) or by accelerating (fast). You may find that your accelerator pedal is harder to press than normal, or that full throttle doesn't do what it normally does. If a normal car does a Tesla and starts emergency braking in the middle of the freeway for no reason, you want the driver to be able to intervene.
Lionga · 21 days ago
Maybe don't call it Autopilot if it is, well, not autopiloting through dangerous situations?
thoroughburro · 21 days ago
Do you usually learn about the circumstances of legal cases by reading the arguments of only one side?
scarab92 · 21 days ago
Do you have an actual critique of the argument?

I was grateful for it, and at first glance, assuming Tesla’s argument is true, it’s hard to see how they are even partially responsible.