photoGrant · 3 years ago
> For example, more than 35% of all Autopilot crashes occur when the Tesla vehicle is rear-ended by another vehicle.

Well, if you want to know why some people are rear-ending a Tesla, it's because it's far more unpredictable than a human driver.

https://theintercept.com/2023/01/10/tesla-crash-footage-auto...

Also, it seems to contradict itself on data collection. First:

> In practice, this correlates to nearly any crash at about 12 mph (20 kph) or above, depending on the crash forces generated

But then they've decided to ignore crashes that don't deploy an airbag or similar...

> recent analysis led us to identify and implement upgrades to our data reporting. Specifically, we discovered reports of certain events where no airbag or other active restraint deployed [...]

So now we're only counting major accidents?

bagels · 3 years ago
Autopilot does emergency braking for nonexistent obstacles. I'm sure it's contributing to rear-end collision risk. I stopped using it for this reason.
CamperBob2 · 3 years ago
Every autonomous braking system has that problem, including the one in my 2020 Macan. They are dangerously flaky hacks. As of 2025 they are mandated by law.
killjoywashere · 3 years ago
Keep in mind, you can reasonably assume many minor accidents don't get reported to the government database. But Tesla is afflicted by perfect (or near perfect) knowledge of every vehicle in the fleet. A line has to be drawn somewhere. 0.1 mph? 2 mph? 25 mph? Half that? Where would you draw the line to be comparable to the federal data?
FireBeyond · 3 years ago
Sure. But we're very much in the data-gathering phase of autonomous/semi-autonomous/enhanced driver assist vehicles, so I have a hard time accepting the argument that "we don't really need to report this incident (not accident) to a database".
throwawayacc3 · 3 years ago
>Well, if you want to know why some people are rear ending a Tesla, it's because it's far more unpredictable than a human driver.

How unpredictable is a solid 5 seconds of the Tesla signaling its lane change, then slowing down? I blame the asshole in the far left lane who was going way too fast and not slowing down when someone was trying to enter (just letting off the accelerator when the Tesla signalled would have prevented the accident), and who then braked way too late. Everyone behind that first vehicle is also to blame for following too closely at too high a speed.

bdcravens · 3 years ago
Does the Tesla always activate its brake lights when slowing down? (I have an EV6, and I don't even know if the brake lights come on when I'm using regen as opposed to the brake pedal.) When I'm driving and traffic is stopped ahead, I used to always brake not because I had to, but as an indication to the car behind me.
chollida1 · 3 years ago
https://www.tesla.com/ownersmanual/modely/is_is/GUID-3DFFB07....

Yes, but it depends on the speed you are going

> If regenerative braking is aggressively slowing Model Y (such as when your foot is completely off the accelerator pedal at highway speeds), the brake lights turn on to alert others that you are slowing down.

throwthroyaboat · 3 years ago
Yes, brake lights are illuminated under any deceleration above a certain threshold, or application of the brake pedal.
jacquesm · 3 years ago
Yes, as soon as you hit regen your brake lights come on. I tested this with a Honda Hybrid a long time ago to make sure. If your car has a diagram somewhere that you can display while driving you will actually see it on the display.
cptskippy · 3 years ago
Yes, even the little CGI model of your car in the UI shows your brake lights illuminate. AFAIK the CGI representation of the car accurately shows tire position, blinkers, headlights, doors, etc.

In comparison, my 2015 Leaf doesn't illuminate the brake lights during regen, but its regen is significantly weaker than my Model 3's.

fortylove · 3 years ago
Thanks for asking this. I've never thought of the implications of re-gen braking when slowing down!
gnicholas · 3 years ago
According to this article [1], the brake lights do come on. But the article is from 2021, so who knows if that is still true (or if it was ever true for all Teslas, since they only tested one).

1: https://www.greencarfuture.com/electric/does-regen-braking-t...

enslavedrobot · 3 years ago
Given that, per the NHTSA, 29% of all accidents are rear-end collisions, I don't think the 35% number is meaningful with such a small sample size.

Also: Tesla has always been clear about their definition of "crash", it hasn't changed since they started reporting years ago.

Deleted Comment

photoGrant · 3 years ago
> Also: Tesla has always been clear about their definition of "crash", it hasn't changed since they started reporting years ago.

But the report specifically says it has changed as of January 2023, and they've retroactively adjusted their previous reporting...

culi · 3 years ago
ajross · 3 years ago
Almost as upsetting as it is to see a Gish gallop link list rise to the top of that topic. Needless to say not one of those corroborates "history of lying about their cars' safety", and you surely know that. But you'll challenge us all to figure it out on our own anyway?

It's absolutely amazing that we have to go through this every time someone gets video of a Tesla event. There are millions of these cars on the road now, surely the fact that this happens only monthly sits as evidence against your position, no?

Edit: FWIW: my money is on this story being quietly retracted at some point anyway. FSD doesn't behave like that. It just doesn't. Almost certainly the driver got confused, panicked, or otherwise overrode the automation and then blamed it when talking to police. The police report is almost silent on the subject: https://www.documentcloud.org/documents/23569059-9335-2022-0...

This entire kerfuffle is based on this one sentence: "P-1 stated V-1 was in Full Self Driving mode at the time of the crash, I am unable to verify if V-1’s Full Self-Driving Capability was active at the time of the crash."

FireBeyond · 3 years ago
> Needless to say not one of those corroborates "history of lying about their cars' safety", and you surely know that.

From the first link:

> Federal safety regulators accused Elon Musk of issuing “misleading statements” on his company’s Tesla Model 3 last year, sending a cease-and-desist letter

Federal regulators don't exactly send cease and desists casually. Misleading and misrepresenting is dishonest, i.e. lying.

culi · 3 years ago
The National Highway Traffic Safety Administration, the New York Times, and even published research [0] have all been calling Elon Musk out for years now, yet his fans still believe him despite Tesla withholding data.

[0] https://engrxiv.org/preprint/view/1973/3986

oneoff786 · 3 years ago
I read through all these links. They’re not great aside from the worker safety one.

The first one states that Tesla is being misleading for claiming that its cars score better for safety while not mentioning that weight class affects the scores. The original claim is still true, though.

Most of the others deal with the claim that the cars' accidents-per-mile ratings are lower because they're highway miles or because they're driven by rich people. But… they're still low.

You can argue at best that there is no rigorous proof that Tesla cars are safer than other cars and that these are cherry picked stats for marketing. But there’s no compelling evidence that they’re more dangerous. Calling them lies is disingenuous. Regardless of one’s opinion of musk.

FireBeyond · 3 years ago
[Deleted for my poor reading comprehension].
culi · 3 years ago
Musk has repeatedly been called out by regulators for his misinformation. Some claims, like the safety of Autopilot, have been directly debunked by research. This article has a good literature review if you're interested:

[0] https://engrxiv.org/preprint/view/1973/3986

> In independent research, Templeton (2020) compared Tesla’s stated crash rates with Autopilot enabled and not enabled by attempting to control for increased use of Autopilot on relatively safer freeways. To compare human-driven crash rates of freeways and non-freeways, Templeton used fatality rates, which may overestimate crash rates on freeways as higher speeds increase crash severity according to a fourth power law (Evans, 1994). When controlling for road type, the crash rate benefits of Autopilot narrowed significantly. Templeton was unable to fully assess their comparison of Autopilot crash rates with national estimates due to their different definitions of crashes.

> Goodall (2021) investigated struck-from-behind crashes of automated vehicles using age-weighted crash rates from the SHRP 2 NDS database as a baseline. Automated vehicles were struck from behind at five times the rate of human-driven vehicles, although much of the difference could be attributed to higher rates of urban driving experienced in automated vehicle testing.

I'm also not sure what percentage of Level 2 ADAS vehicles are Teslas versus other brands, but they're by far the most common vehicles in crashes where driver assist was involved:

[1] https://www.theverge.com/2022/6/15/23168088/nhtsa-adas-self-...

Google's Waymo was light-years ahead of Tesla. The reason they didn't go to market was that they knew it wasn't really ready. Elon Musk has lied (as in, he knew the truth and lied) about their driver-assist technology's safety and capabilities and has been sued for it [2]. There's no other way to sugarcoat it.

[2] https://www.theverge.com/2022/9/14/23353787/elon-musk-tesla-...

femiagbabiaka · 3 years ago
True, although it is indicative of some level of community sentiment, and it will certainly have a comment section full of rebuttals such as your own. That's a good advantage of a mostly free-speech platform.
Imnimo · 3 years ago
I do believe that Teslas are quite safe despite the high-profile failures, but I'm very unconvinced by a stat that compares accidents per mile using Autopilot to accidents per mile not using Autopilot. Those are not comparable miles - people don't uniformly use Autopilot across all driving conditions. Imagine you had a subsystem that did absolutely nothing, but people turned it on when driving on empty highways. Its accidents-per-mile number would look great!
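To make that concrete, here is a minimal sketch with made-up numbers (none of these figures are real Tesla or NHTSA data) showing how a feature that does nothing, but is only switched on during easy highway miles, ends up with a much better-looking miles-per-crash figure than the fleet as a whole:

    # Hypothetical illustration: a "do-nothing" assist feature enabled only on easy miles.
    # All figures below are invented for the example, not real Tesla/NHTSA data.

    highway_miles = 600_000_000   # miles driven with the feature on (easy conditions)
    highway_crashes = 120         # crashes in those miles

    city_miles = 400_000_000      # miles driven with the feature off (hard conditions)
    city_crashes = 400            # crashes in those miles

    feature_rate = highway_miles / highway_crashes                                   # ~5.0M miles per crash
    overall_rate = (highway_miles + city_miles) / (highway_crashes + city_crashes)   # ~1.9M miles per crash

    print(f"Feature-on:  one crash every {feature_rate:,.0f} miles")
    print(f"Whole fleet: one crash every {overall_rate:,.0f} miles")
    # The feature looks ~2.6x "safer" even though it did nothing at all,
    # purely because it was only active where crashes are rare.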
FireBeyond · 3 years ago
What's also egregious about this is that Tesla has now "decided" that if active restraints or airbags are not deployed, that incident "does not count going forward".

And I cannot believe in good conscience that Tesla, as an automobile manufacturer, is unaware that the active restraint and airbag systems of today are far more nuanced and weigh multiple criteria when deciding to deploy, versus the "speed exceeds X mph, deploy" of old. You can have a very significant incident (two that I've witnessed recently involved vehicles going into stationary objects at ~30 mph) without airbag deployment. But if that was a Tesla in FSD that hit something at 30 mph and didn't deploy airbags, well, that's not an accident according to Tesla.

That also doesn't account for "incident was so catastrophic that restraint systems could not deploy", also "not counted" by Tesla. Or just as egregious, "systems failed to deploy for any reason up to and including poor assembly line quality control", also not an accident and "not counted".

seeekr · 3 years ago
I don't disagree that it's not the best of comparisons (and I wonder if a better one could be imagined + implemented...). But still, it's not like we turn on Autopilot/FSD only "on empty highways", far from it! Certainly it's a tool where the user needs to learn its strengths and weaknesses and use it accordingly, but it is useful in so many more situations than not, that it's also not a terrible or meaningless comparison to make!

Anecdata: Almost all (95%?) of my highway driving (Europe) is on Autopilot. I don't even enjoy doing the driving myself any more in those situations where I know that Autopilot is doing a pretty good job. In particular, Autopilot does a better job than I can in conditions of heavy snow / rain / otherwise poor visibility conditions. I feel a lot safer being the operator than the driver in those instances! (The alternative would often be to slow down by a significant amount, and/or use up more of my focus/attention, leading to either less safe driving or forced breaks.)

enragedcacti · 3 years ago
> and I wonder if a better one could be imagined + implemented...

There is a ton of data using industry-defined conditions, and a ton of research has gone into determining the types of conditions that can affect accident rates. The only reason we can't compare is that Tesla only releases what we see in the link above. [1]

Best-guess estimates of normalizing Autopilot data against the average mix of highway/city driving find that AP's safety advantage effectively disappears. Of course this is rough, and Tesla likely has much better data that they are choosing not to share. [2]

[1] https://www.iihs.org/topics/fatality-statistics/detail/urban...

[2] https://twitter.com/NoahGoodall/status/1489291552845357058
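A rough sketch of the kind of road-mix normalization described above, with the Autopilot figure, the highway share, and the highway risk multiplier all explicit assumptions rather than published data (the real splits are exactly what Tesla doesn't release):

    # Rough sketch of road-mix normalization. The Autopilot figure, highway share,
    # and risk multiplier below are placeholders, not published data.

    ap_miles_per_crash = 4_850_000    # assumed example "miles per crash with Autopilot" figure
    us_miles_per_crash = 652_000      # national all-roads figure quoted in Tesla's report

    highway_share_of_us_miles = 0.30  # assumption: ~30% of national miles are highway miles
    highway_risk_multiplier = 0.25    # assumption: highway miles produce 1/4 the crashes per mile

    us_crash_rate = 1 / us_miles_per_crash  # crashes per mile, all roads
    # Split the national rate into highway vs non-highway components consistent with the assumptions:
    #   rate = share_hw * rate_hw + (1 - share_hw) * rate_other, where rate_hw = multiplier * rate_other
    rate_other = us_crash_rate / (highway_share_of_us_miles * highway_risk_multiplier
                                  + (1 - highway_share_of_us_miles))
    rate_hw = highway_risk_multiplier * rate_other

    print(f"Implied highway-only baseline: one crash every {1 / rate_hw:,.0f} miles")
    print(f"Autopilot figure:              one crash every {ap_miles_per_crash:,.0f} miles")
    # With these assumptions the highway-only baseline is roughly one crash per ~2M miles,
    # so the apparent Autopilot advantage shrinks from ~7x to ~2.4x. Different assumed
    # splits move that number a lot, which is why the raw per-mile comparison proves little.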

femto113 · 3 years ago
It's not at all difficult to imagine many other measurements: minutes driven, type of road driven on, speed at time of crash, whether another vehicle was involved, etc. As a data analyst, what I really want, though, is more granular data so we can figure out whether the published metrics are being cherry-picked.
alphabettsy · 3 years ago
> The alternative would often be to slow down by a significant amount

You should probably slow down.

FireBeyond · 3 years ago
This needs to be emphasized so much more.

Short of "not driving", human drivers don't have the option of "well, this is less than optimal, I can't do well, so I won't try".

FSD absolutely does: it disengages, or is not even engaged in the first place.

I wonder what Tesla's FSD safety/disengagement stats look like for December through February in Pittsburgh, for example.

gumby · 3 years ago
> Short of "not driving", human drivers don't have the option of "well, this is less than optimal, I can't do well, so I won't try".

They sure do, and will stop when things are too complex for them (e.g. bad weather).

ok_dad · 3 years ago
Yes, how about a stat for "accidents/incidents which were preceded by an Autopilot disengagement within the previous ~5 seconds."
AtlasBarfed · 3 years ago
And I assume that Autopilot is used primarily on highways, in either stop-and-go or steady-flow situations.

Really they should know this, because the autopilot system should be identifying it based on speed limits and detected type of road.

Of course that assumes some ability to look into a black box, but location tracking should provide it too.

oneoff786 · 3 years ago
This is true, but there’s a counterpoint here too. If Autopilot takes all of the “easy miles”, then you should expect to see a spike of accidents concentrated in the non-Autopilot “hard miles”.

Since we don’t see that spike, I’m inclined to believe that the effect size of the highway driving preference is relatively small.

lokimedes · 3 years ago
Comparing Tesla to the US incident average is a neat way of ignoring that high-end electric vehicles are not driven by the US-average driver, in average situations. The driver (or should we call them executives, in self-driving cars?) is likely to have much more to do with the statistics than the car.
gnicholas · 3 years ago
I wrote an article [1] along these lines a few years ago. There are lots of non-comparables between Teslas and the US fleet average, although this has been getting less true over time (as Tesla released less expensive cars, and as its vehicles aged).

1: https://www.thedailybeast.com/how-tesla-and-elon-musk-exagge...

sidibe · 3 years ago
It's not even comparing Tesla to US incident average, that would be an improvement. It's comparing Tesla in situations where people are comfortable activating Autopilot to US incident average.

The absolute easiest miles for a luxury vehicle vs all miles for all cars.

hinkley · 3 years ago
Public policy in the town I grew up in was that intersections only got lights after a certain number of reported accidents had happened there.

There was a particularly nasty intersection near the most popular bike shop that the bike club kept going to the city about, but since nobody had died yet, they weren't going to put one in.

People know which intersections are accident-prone well before the city does something about it. There are routes I don't take because there are others that are safer or easier to follow at all hours of the day, not just outside of rush hour, or when I'm tired/distracted. But there are also magnets that pull certain demographics to them, like that bike shop, or the shitshow outside the Costco where I live now.

bearjaws · 3 years ago
Calling Tesla drivers executives seems a bit... out of touch? Maybe the most premium model is an executive car. The vast majority are $50-$65k, with a household income of $133k. This means they are top 25% earners, not top 5%.

More important is the fact that they are driven by older people (median age 48).

Model 3 Demographics: https://hedgescompany.com/blog/2019/03/tesla-model-3-demogra...

InitialLastName · 3 years ago
I think GP's "Executives" remark was less directly about the employment state of the subject than an attempt to distinguish them ("people in the driving seat of self-driving cars") from "drivers" (people who actually drive cars). IMO "operators" is probably a term with more fitting analogies.
postmeta · 3 years ago
Nothing is stopping you from looking at the Q3 2022 Autopilot numbers versus Q3 2021; the others are nice for reference.
bdcravens · 3 years ago
The reason for that difference should be evaluated, then. Is it an income thing? A car-cost thing? (Compare stats to non-Teslas in the same group.) Does Tesla's marketing create a false sense of confidence in the car's abilities? Does the UI distract? Numbers can lie, but in general, I think the data is out there to support or reject the "fake news" narrative.

Deleted Comment

olliej · 3 years ago
This is absurdly, and at this point clearly intentionally, disingenuous.

You would absolutely expect Autopilot/FSD to have a lower crash rate than human-in-control driving, because AP/FSD isn't available in the conditions that result in more crashes: bad/variable roads, poor conditions, etc. Anything else would imply AP/FSD is absurdly dangerous, as it would be producing higher crash rates in good conditions than human drivers in bad conditions.

If AP/FSD were actually better and/or safer than human drivers you would expect Tesla to aggressively demonstrate that. Given that they are choosing not to compare crash rates by road condition it is reasonable to assume that the statistics do not support the claim.

This is before we get into whether AP/FSD disconnecting moments before crashing counts as crashing under AP/FSD. I recall articles claiming that Tesla didn't include such crashes as being caused by AP/FSD, but I have no idea if that was ever confirmed. Certainly I know Tesla has gone out of its way to blame drivers for AP/FSD crashes, so my faith in any kind of honesty from Tesla in these matters is minimal.

advisedwang · 3 years ago
I was curious about what stats were used for the following:

> By comparison, the most recent data available from NHTSA and FHWA (from 2021) shows that in the United States there was an automobile crash approximately every 652,000 miles.

The figures they are using for this are 4,954,323 vehicles in crashes (i.e. a single crash with 3 vehicles adds 3 to this total) from the NHTSA data, which includes large trucks, motorcycles, buses, other, and unknown vehicle types, and 3,228.8 billion miles from the FHWA data.
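For reference, the quoted ~652,000 figure falls straight out of those two numbers:

    # Reproducing the "crash approximately every 652,000 miles" figure from the
    # FHWA vehicle-miles and NHTSA crash-involved-vehicle counts cited above.

    vehicles_in_crashes = 4_954_323      # NHTSA: vehicles involved in crashes, 2021
    vehicle_miles_traveled = 3_228.8e9   # FHWA: total vehicle miles traveled, 2021

    miles_per_crash_involved_vehicle = vehicle_miles_traveled / vehicles_in_crashes
    print(f"{miles_per_crash_involved_vehicle:,.0f} miles")  # -> about 651,700, i.e. the ~652,000 quoted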

kelseyfrog · 3 years ago
Tesla is discovering it's created a lightning rod for blame.

Previously, accident culpability was diffused across individual drivers, so motivating any group of individuals to action was inherently a _social_ problem, and one that is often dismissed as such.

The development and deployment of a singular identifiable causal agent casts off that shield of diffuse social responsibility in a way that leaves Tesla vulnerable to intense scrutiny. It doesn't help that, with Musk as Tesla's figurehead, the collective schadenfreude we gain from his failures is converted into a cathartic group joy.

Nevertheless, the dynamics of blame have changed because the terrain has changed. The closest analogy would be a human driver who drove all the miles Autopilot drove in 2022 and was responsible for the same number of crashes. Presented outside the frame of this debate, I can't help but imagine we would want them off the road regardless of whether their miles per crash were better than the population average.

etchalon · 3 years ago
My main question is whether the miles equated are in fact equal.

Autopilot is used far more often on the highway, I believe, than in residential areas – where I also believe crashes are more frequent.

It'd be interesting to see the miles broken down by road type, speed, etc.