One of my biggest criticisms of Elon is that he rarely takes accountability for his words and nobody ever holds him accountable by asking directly. This isn't a high bar to clear, but some people continuously excuse him as being "overly-eager" in his predictions. If that were the case, he could still provide reasonable updates when a predicted date is missed, along with an explanation, even if it's just: "Hey, it turns out this problem was much more difficult than we initially expected and it'll take longer." A lot of the problems that he's trying to solve are actually quite difficult, so it's understandable that predictions will be imprecise... But when you realize that your predictions are going to be wrong, you should have the basic decency to update people.
When you're wielding immense amounts of money, power, and influence I think it's worth trying to do the bare-minimum to hold people accountable for their words and claims. Otherwise your words are meaningless.
In America, you fail the second you apologize or take accountability. Ignoring criticism and constantly deflecting gets you further; it's part of the game. Unfortunately, this is just an accepted social-science finding at this point, and very much a cultural artifact of the past couple of decades.
It's too bad, because "I'm sorry; this is my fault" is the biggest defuser of anger and the best way to appease mad customers. Try it sometime; the other party goes from ready to kill you to apologetic themselves (if you're genuine). Unfortunately it's seen as a sign of weakness by people like Elon and his cult of impersonators, and as an admission of liability by the litigious crowd. If you can be strong, confident, and ready to admit it when you're wrong, you'll not only be successful in confrontational situations but also not a giant dick.
That's an interesting take. What I heard from a very old friend of my father is the opposite:
> Knowing when to say thanks and when to say sorry is the key for success.
...and I have used this piece of advice ever since; it has paid me handsomely. Of course, this doesn't allow you to be shameless; on the contrary, it requires you to stick to your values as a prerequisite.
I think what allows Elon to behave like that is that he can retaliate without any repercussions, since he has tons of money and influence in some circles.
> One of my biggest criticisms of Elon is that he rarely takes accountability for his words and nobody ever holds him accountable by asking directly. This isn't a high bar to clear, but some people continuously excuse him as being "overly-eager" in his predictions.
I've written off pretty much everything he says since sometime before 2020, too many lies about self-driving to count.
But I'm not someone with any influence (nor do I really want that kind of attention).
> when you realize that your predictions are going to be wrong, you should have the basic decency to update people
Not to get too political, but the last I've heard of Elon Musk is that he was speaking to/mobilizing right wing extremists at a big protest in London. I am also pretty sure he has been trying to do similar things in other European nations (for whatever reason).
It seems to me that it is a bit late to plead for "basic decency" at this moment.
But at the same time let me ask you, what has he got to lose? What financial or reputational risk is he taking by not taking any accountability?
Society needs a "no assholes" policy in order to stay high-trust. Elon not being a pariah despite his grifting is a sign the US is becoming a lower and lower trust society. And it's its billionaires making it so.
He lies relentlessly even to customers who paid for the product.
I know because I’m one of them. FSD paid in full almost seven years ago, still does absolutely nothing in Europe. A five-year-old would do a better job at driving because the Tesla can’t even see speed limit signs correctly.
Tesla takes no responsibility for their misleading marketing and years of lies. Most recently Musk promised in early 2025 that these old cars would get a hardware update that will finally enable the paid-for FSD (as if…) The company itself pretends to know nothing about this latest promise made by its CEO.
It’s insane that a business with a trillion-dollar market cap operates like this. It seems to be more of a cult than a real company.
> because the Tesla can’t even see speed limit signs correctly.
This is sad and atrocious. Not only can a Ford Puma (an econobox compared to a Tesla) read almost all speed limit signs correctly, it can also pull speed limit data correctly from its onboard maps when there are no signs. These maps can be updated via WiFi or an onboard modem, too.
Tesla mocked the "big auto industry", but that elephant proved it can run when it needs to.
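To make the fallback concrete, here is a minimal sketch of a sign-first, map-second speed limit lookup, in the spirit of what the Puma comment describes. Everything here (`read_sign_from_camera`, `OFFLINE_SPEED_MAP`, the segment IDs) is a hypothetical stand-in, not any manufacturer's actual API:

```python
from typing import Optional

# Hypothetical offline map: road segment id -> speed limit in km/h.
OFFLINE_SPEED_MAP = {"E40-seg-112": 120, "N3-seg-07": 70}

def read_sign_from_camera(frame) -> Optional[int]:
    """Stand-in for an OCR/classifier; returns a limit, or None if no sign is visible."""
    return None  # placeholder: pretend no sign was seen in this frame

def current_speed_limit(frame, segment_id: str) -> Optional[int]:
    # Prefer a freshly observed sign; fall back to onboard map data.
    sign = read_sign_from_camera(frame)
    if sign is not None:
        return sign
    return OFFLINE_SPEED_MAP.get(segment_id)

print(current_speed_limit(None, "E40-seg-112"))  # -> 120, from the map fallback
```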
Interestingly, with SpaceX he is much more willing to change plans; there, he and the company seem to be genuinely searching for the right solution.
For self-driving, he simply decided X is right and talked about exponentials, and no matter how many times it fails, there is no reflection whatsoever.
He's a psychopath, in the sense that he doesn't feel normal emotions like remorse and empathy. He will lie to your face to get you to buy his product, and when it fails to deliver on its promises he will lie again.
When the self-driving car killed a pedestrian several years ago, the initial sentiment on this site for the first few hours was essentially "those dastardly pedestrians, darting into traffic at the last second, how are you supposed to avoid them?" It took several hours for enough information to percolate through for people to realize that the pedestrian had been slowly and quite visibly crossing the road, and that neither the self-driving car nor the safety driver did a thing to react.
Another thing to keep in mind is that video footage is much lower quality than what we can see with our human eyeballs. At no point in the video can I clearly identify what the debris is, but it's clearly evident that the humans in the car can, because they're reacting to it seconds before it's even visible to us in the dash-cam-quality footage. I will freely accept that many drivers are in fact bad drivers, but a carcass (I think?) in a lane, visible from >10 seconds away, is something that anyone who can't avoid needs to have their license revoked.
(Assuming I know which accident you're referring to) The car that killed the pedestrian in Florida wasn't using supervised full self driving, he was using autopilot (which was basically adaptive cruise control at the time).
Well, I have to admit that your friend has a point. Humans are bad at reacting quickly and correctly to unexpected situations, and some debris large enough to damage your car showing up from out of nowhere after several hours of boring driving along a largely straight highway with little traffic is definitely one of these situations. But a self-driving system worth its salt should always be alert, scanning the road ahead, able to identify dangerous debris, and react accordingly. So, different pair of shoes...
I'm not convinced. The debris is clearly visible to the humans a long way off and the adjacent lane is wide open. Avoiding road debris is extremely common even in more congested and treacherous driving conditions. Certainly it's possible that someone texting on their phone or something might miss it, but under normal circumstances it could have been easily avoided.
Do they? "Many humans" would hit that? The humans in the car spotted the debris at least 8s before the impact. I don't think any humans would hit that in broad daylight unless they were asleep, completely drunk, or somehow managed to not look at the road for a full 10s. These are the worst drivers, and there aren't that many because the punishment can go up to criminal charges.
The argument that "a human would have made that mistake" backfires, showing that every Tesla equipped with the "safer than a human driver" FSD is in fact at best at "worst human driver" level. But if we still like the "humans also..." argument, then the FSD should face the same punishment a human would in these situations and have its rights to drive any car revoked.
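For what it's worth, the arithmetic backs this up. A rough calculation (speeds assumed for illustration, not measured from the video) shows how much margin 8 seconds of visibility buys at highway speed:

```python
# Back-of-the-envelope margin check; all figures are assumptions.
mph_to_ms = 0.44704
speed_ms = 70 * mph_to_ms       # ~31.3 m/s at an assumed 70 mph

sight_distance = speed_ms * 8   # ~250 m of warning with 8 s of visibility

# Generous allowances: 1.5 s to react, 4 s to complete a lane change.
needed = speed_ms * (1.5 + 4)   # ~172 m actually required

print(f"available: {sight_distance:.0f} m, needed: {needed:.0f} m")
```

Even with slow reactions there are roughly 80 meters to spare, which is why hitting it reads as a total failure rather than bad luck.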
> Well, I have to admit that your friend has a point. Humans are bad at reacting quickly and correctly to unexpected situations, and some debris large enough to damage your car showing up from out of nowhere
I read this comment before seeing the video and thought maybe the debris flies in with the wind and falls on the road a second before impact, or something like that.
But no, here we have bright daylight, perfect visibility, the debris is sitting there on the road visible from very far away, the person in the car doing commentary sees it with plenty of time to leisurely avoid it (had he been driving).
Nothing unexpected showed up out of nowhere, it was sitting right there all along. No quick reaction needed, there was plenty of time to switch lanes. And yet Tesla managed to hit it, against all odds! Wow.
My impression of Tesla's self driving is not very high, but this shows it's actually far worse than I thought.
> Well, I have to admit that your friend has a point. Humans are bad at reacting quickly and correctly to unexpected situations,
This was not one of those situations.
> and some debris large enough to damage your car showing up from out of nowhere after several hours of boring driving along a largely straight highway with little traffic is definitely one of these situations.
Again, this was definitely not one of those situations. It was large, it was in their lane, and they were even yapping about it for 10 seconds.
> But a self-driving system worth its salt should always be alert, scanning the road ahead, able to identify dangerous debris, and react accordingly. So, different pair of shoes...
This is what humans already do (and if we didn't, we'd be driving off the road). Based on what you're saying, I question whether you're familiar with driving a car, or at least driving on a highway between cities.
Literally, the passenger saw it and leaned in, and the driver grabbed the steering wheel, seemingly to brace himself. That object on the road was massive, absolutely huge as far as on-road obstacles go. The camera does not do it any justice: it looks like it's 3 feet long, over a foot wide, and about 6 or 7 inches high lying on the road. Unless a human driver really isn't paying attention, they're not hitting that thing.
Bullshit. Some humans might hit that because they weren't paying attention, but most people would see it, slow down, and change lanes. This is a relatively common scenario that humans deal with. Even the passenger here saw it in time. The driver was relying on FSD and missed it.
I don't think FSD has the intelligence to navigate this.
I don't love Tesla (though I would like an electric car). I don't think it's unlikely that someone driving could have hit that or caused an even worse accident trying to avoid it.
However, I'm sure a human driver in the loop would have reacted. The driver sitting there watching a machine do its thing 99% of the time, while being expected to step in to save that one situation, is a horrific misreading of human nature.
Those humans saw the debris. When a human is actively at the wheel, what happens next is that the driver checks all mirrors, decides whether to change lanes or brake, and executes. Anything else could lead to a movie-like multi-car pileup. Hitting the debris is the least dangerous course of action if there are cars all around. That looked like an empty road, but who knows.
By the way, playing a lot of racing videogames is great training for dealing with that sort of stuff, except maybe for getting good with mirrors. I've been in a few dangerous situations, and each one was just the ten-thousandth averted crash: no thinking, only reflexes.
I highly recommend people take a high performance/race driving course if they can. I did a single day one which involved high speed maneuverability trials designed to be useful in emergency scenarios (swerving, braking, hard turns) followed by a few laps around a racetrack.
It's one of the best ways to figure out what it feels like to drive at the limits of your car and how you and it react in a very safe and controlled environment.
Did your friend make any mention that the passenger saw it hundreds of feet away and even leaned in as they headed directly towards it? The driver also recognized it and grabbed the wheel as if to say "brace for impact!"
Obviously, in this particular case the humans wouldn't be hitting that. The people in the video have clearly seen the object, but they didn't want to react because that would have ruined their video.
Even if they did not understand what it was, in the real world when you see something on the road you slow down or maneuver to avoid it, no matter if it is a harmless piece of cloth or something dangerous like this. People are very good at telling when something is off; you can see it in the video.
Yes. Humans would. Which is why the car should be able to handle the impact. My Honda Civic has had worse without issue. The suspension should be beefy enough to absorb the impact with, at worst, a blown tire. That the car has broken suspension says to me that Teslas are still too fragile, built more like performance cars than everyday drivers.
With millions of Teslas on the road, one would think that if that were true we would have heard something by now. My absolute worst car quality-wise was a Honda Accord, and I have owned shitty cars, including a Fiat. My most reliable car was a Honda Civic, before I "upgraded" to a brand new Accord. I abuse my Tesla, and so far no issues driving on some of the worst roads in the country. I must hit 100 potholes per month, and I've blown a tire already. It's not a fun car to drive like a GTI (which I also own), but it's definitely a solid car.
> A friend of mine who loves Tesla watched this video and said "many humans would have hit that".
The very same video demonstrates this is not true, since the human in the video sees the debris from far away and talks about it, as the self-driving Tesla continues obliviously towards it. Had that same human been driving, it would've been a non-issue to switch to the adjacent lane (completely empty).
But as you said, the friend loves Tesla. The fanboys will always have an excuse, even if the same video contradicts that.
As a counterpoint, my little cheap Civic has hit things like that and hasn't broken a thing. HEH.
The best time was a very badly designed speed bump that I didn't even hit at high speed, but one side was ridiculously inclined vs. the other, so the entire Civic's front end just smashed right into the pavement and dragged for 3 feet. I wouldn't be surprised if I went into a body shop and found the front end tilted upwards by a few centimeters. lol
Timestamp 8:00-8:30. Your Civic is not hitting that and surviving any better than the Tesla. It just got luckier. It may be easier to get lucky in certain vehicles, but still luck based.
That's laughable. Any human who couldn't avoid a large, clearly-visible object in the middle of an empty, well-lit road should not be operating a vehicle.
That's not to say that there aren't many drivers who shouldn't be driving, so both can be true at once, but this is certainly not a bar against which to gauge autonomous driving.
Question: isn't P(Hitting | Human Driving) still less than P(Hitting | Tesla FSD) in this particular case, given that if this particular situation comes up, Tesla will always fail, whereas some/many humans would not?
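Plugging in made-up counts shows what the question is getting at. The numbers below are pure illustration, with the FSD hit rate set to 1 per the "fails always" assumption in the comment:

```python
# Toy counts, invented for illustration only.
human_hits, human_trials = 2, 100   # assume a couple of inattentive drivers out of 100
fsd_hits, fsd_trials = 100, 100     # "Tesla will always fail" in this scenario

p_hit_given_human = human_hits / human_trials   # 0.02
p_hit_given_fsd = fsd_hits / fsd_trials         # 1.0

assert p_hit_given_human < p_hit_given_fsd
```

Under that assumption, yes: for this specific situation the human driver is strictly safer, whatever the averages say elsewhere.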
The question is whether avoiding the obstacle or braking was the safest thing to do. I did not watch the entire test, but there are definitely cases where a human will suddenly brake or change lanes and create a very unsafe situation for other drivers. Not saying that was the case here, but sometimes what a human would do is not a good rule for what the autonomous system should do.
An enormous part of safe driving is maintaining a mental map of the vehicles around you and what your options are if you need to make sudden changes. If you are not able to react to changing conditions without being unsafe, you are driving unsafely already.
Yes many human drivers would hit it. The bad ones. But we should want driverless cars to be better than bad drivers. Personally, I expect driverless cars to be better than good drivers. And no, good drivers would not hit that thing.
Anecdotal: I am surprised at how often the basic Tesla Autopilot cannot even read speed limit signs correctly, in perfect lighting conditions. It just misses a lot of them. And it does not understand the traffic rules well enough to know when a speed limit ends.
I know that the basic autopilot is a completely different system than the so-called FSD.
But equipped with that experience from the basic autopilot, it does not surprise me that a large piece of debris on the road was completely missed by the FSD.
In FSD, there's an annoying bug where Georgia minimum speed signs are misinterpreted as speed limit signs. It's because most states that do minimum speed signage combine it with the speed limit into a single sign, but Georgia has separate, freestanding minimum speed signs. Thankfully, recent versions of FSD don't immediately start slowing down anymore when a misinterpreted sign makes the car think that the speed limit on a highway is 40; but the sign comprehension bug has remained unresolved for years.
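A toy guess at the failure mode (this is speculation about the bug's shape, not Tesla's actual pipeline): if the sign reader keys on the number plus a generic "speed sign" template, a freestanding "MINIMUM SPEED 40" parses the same as "SPEED LIMIT 40":

```python
# Hypothetical illustration of the described misclassification.
def naive_parse(sign_text):
    # Keys on the digits and ignores the wording entirely.
    digits = [t for t in sign_text.split() if t.isdigit()]
    return int(digits[0]) if digits else None

def careful_parse(sign_text):
    # Disambiguates by checking the wording before trusting the number.
    digits = [t for t in sign_text.split() if t.isdigit()]
    if not digits:
        return None
    kind = "minimum" if "MINIMUM" in sign_text.upper() else "limit"
    return kind, int(digits[0])

print(naive_parse("MINIMUM SPEED 40"))    # 40 -- read as the limit
print(careful_parse("MINIMUM SPEED 40"))  # ('minimum', 40) -- disambiguated
```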
It is unclear whether it can read signs at all. FSD (not Autopilot) on the current version cannot read "Do Not Enter" and "Road Closed" signs and will entirely ignore them. It would be reasonable to assume it cannot read any signs at all. But it will be safer than a human driver in just negative 9 years.
I use Autopilot for local driving (city and suburbs) and I pay for FSD when on long road trips (>300 miles). You are correct, they are completely different things, so one doesn't correlate with the other.
That they are different things is really disappointing. If you want people to trust the system enough to buy FSD, Autopilot should use the same underlying system, just with limited functions. There is no reason why the vision/detection systems should be different, especially if the proper hardware is already installed…
Tangentially: if you as a European happen to drive on US highways, you will notice that they are heavily littered with fallen cargo, aluminum ladders, huge amounts of shredded tires, and occasionally a trailer without a towing car... It has been so bizarre for me to observe this. Is nobody cleaning that up?
I just got back from a trip to the USA where I spent about five days driving around Michigan, Illinois, and Wisconsin and the number of shredded truck tires on the highways was flabbergasting.
From what I understand, in the USA when a truck tire wears down they put a new layer of tread on it. But it doesn't seem to work very well as it eventually peels off.
Those are called retreads and they are not uncommon worldwide. If you're seeing anything other than long thin strips of tread on the road it's not a retread related failure.
Every now and then the Karens get to screeching about it, it reaches a critical mass, the NHTSA does a study, and they find that most of the debris you're seeing has nothing to do with retreads. Here's the summary of a recent one:

https://www.moderntiredealer.com/industry-news/commercial-bu...
I hit a retread as it became detached from the lorry I was following on the M25, UK. Scary moment, similar to the video in TFA, + an expensive repair job.
I'm from Norway, and visited a buddy in Florida back in the early 2000s. It was my first time to the US.
I recall I was completely flabbergasted by all the cars just ditched along the highway. There were lots of them, just off the road into the grass on the side or whatever.
I asked my buddy about it and he said it was usually tires, as it was cheaper to buy another car than get new tires... Well that didn't help my blown mind one bit.
Mind you, on the way to his house I passed a Kia dealer which literally had a huge "buy one car - get one free" sign outside...
When I was a boy in Florida in the 1970s, there was an annual inspection for automobiles. Some other states still do this. It would certainly improve overall safety on the roads if we still had any minimum requirements.
In the linked video, highway patrol comes out to remove the debris just a few minutes after they hit it (highway patrol had been called out prior to them hitting it).
It depends on where in the US you drive. It's a big country with independent state governments. It's like saying, "I was driving in Romania and I was shocked by how bad European highways are." I lived in Texas and the stuff I saw on the highway was crazy: vacuum cleaners, decorated Christmas trees, refrigerators. In most parts of the country the interstate and highway systems are pretty clean.
It's been years since I've seen anything you couldn't drive over that wasn't in the travel lane.
Around here anything big enough to matter gets reported, the cops fling it to the side of the road and it gets picked up on a schedule because they always pick things up a few weeks before they mow (presumably because hitting garbage isn't great for the mowers).
Fuel tax is a big factor, but the US has a lot of road: 3x the amount of paved surface vs. Germany. European winters are also milder than much of the US. I'm not sure how many European roads would survive going from -10 to 140 like they do in the Midwest (a rough calculation below shows what that swing does to pavement).
Not just that but also tolls. There are way more toll roads, at least where I've lived in Europe, compared to where I've lived in the US (Spain being the one very noticeable exception between France and Portugal).
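On the temperature point a couple of comments up: a rough expansion estimate (reading the figures as Fahrenheit, with a textbook coefficient for concrete and an arbitrary slab length) shows why a -10 to 140 swing is brutal on pavement:

```python
# Rough thermal-expansion estimate; coefficient and slab length are assumptions.
alpha_concrete = 1.0e-5           # per deg C, typical textbook value

def f_to_c(f):
    return (f - 32) * 5 / 9

delta_c = f_to_c(140) - f_to_c(-10)   # ~83 C swing
slab_m = 20.0                          # one 20 m stretch between joints

growth_mm = alpha_concrete * delta_c * slab_m * 1000
print(f"~{growth_mm:.0f} mm of seasonal movement per 20 m slab")  # ~17 mm
```

Seventeen millimeters of movement per slab, every year, is a lot of joint sealant and crack repair.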
You don't need to drive far; just get on a US highway and there is dangerous litter every few hundred meters. In extreme cases it's down to a few dozen meters. Sometimes it felt like driving in a Mad Max movie.
> That's kind of irrelevant, this technology is meant to be safer and held to a higher standard
I don't think that is the case. We will judge FSD on whether it causes more or fewer accidents than humans, not necessarily in the same situations. The computer is allowed to make mistakes that a human wouldn't, if in reverse the computer makes a lot fewer mistakes in situations where humans would.
Given that >90% of accidents are easily avoidable (speeding, not keeping enough safety distance, drunk/tired driving, distraction due to smartphone usage), I think we will see FSD become safer on average very quickly.
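The trade-off can be made concrete with invented numbers (these rates are illustrative only, not real statistics): a system can be strictly worse on edge cases and still far better on average, because the dominant human failure modes simply vanish:

```python
# Hypothetical crashes per million miles, by cause. Invented numbers.
human = {"distraction": 4.0, "drunk_tired": 3.0, "speeding": 2.0, "edge_cases": 0.5}
fsd   = {"distraction": 0.0, "drunk_tired": 0.0, "speeding": 0.0, "edge_cases": 2.0}

print(sum(human.values()))  # 9.5 per million miles
print(sum(fsd.values()))    # 2.0 -- 4x worse on edge cases, ~5x better overall
```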
> The computer is allowed to make mistakes that a human wouldn't, if in reverse the computer makes a lot fewer mistakes in situations where humans would.
This subverts all of the accumulated experience other road users have about what a car will do. Everyone is used to the potential issues caused by humans; on top of that, other road users would now have to learn the quirks of FSD and keep an eye out for abnormal behaviour.
That's just unrealistic: not only will people have to deal with whatever other drivers throw at them (e.g. veering out of lane due to inattention), they'll also have to be careful around Teslas, which can phantom-brake out of nowhere, fail to avoid debris (shooting it along unpredictable paths), etc.
I don't think we should accept new failure modes on the road for FSD, requiring everyone else to learn them and stay on alert; it's just a lot more cognitive load...
That's the main advantage self-driving has over humans now.
A self-driving car of today still underperforms a top-of-the-line human driver, but it sure outperforms the "0.1% worst case": the dumbest, most inebriated, sleep-deprived, and distracted reckless driver who is responsible for the vast majority of severe road accidents.
The statistics show it plainly: self-driving cars already get into fewer accidents than humans, and the accidents they do get into are much less severe. Their performance is consistently mediocre, and being unable to drink and drive is a big part of where their safety edge comes from.
> That's kind of irrelevant, this technology is meant to be safer and held to a higher standard.
True, but not even relevant to this specific example, since the humans clearly saw it and would not have hit it. So we have a very clear example where Tesla is far inferior to humans.
Indeed... You can see the driver reaching for the wheel; presumably he saw it coming and would have hit the brakes. He left the car to do its thing, thinking it knows better than him... maybe.
Personally if the road was empty as here, I'd have steered around it.
These were really the best possible driving conditions: bright day, straight dry road, no other cars around. And still it either failed to see the debris, or chose to run over it rather than steering around it or stopping. Of all the random things that can happen on the road, encountering a bit of debris under ideal driving conditions seems like exactly the sort of thing it should handle.
And yet Tesla is rolling out robo taxis with issues like this still present.
Yeah! Just add more sensors! We're only 992 more sensors away from full self-driving! It totally works that way!
The debris? The very visible piece of debris? The piece of debris that a third party camera inside the car did in fact see? Adding 2 radars and 5 LIDARs would totally solve that!
For fuck's sake, I am tired of this worn-out argument. The bottleneck of self-driving isn't sensors. It was never sensors. The bottleneck of self-driving always was, and still is: AI.
Every time a self-driving car crashes due to a self-driving fault, you pull the blackbox, and what do you see? The sensors received all the data they needed to make the right call. The system had enough time to make the right call. The system did not make the right call. The issue is always AI.
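The comment's blackbox logic can be sketched as a toy fault-attribution check (field names are invented; real crash logs look nothing like this tidy): if the obstacle shows up in the sensor returns with time to spare and no avoidance was ever commanded, the failure sits in the driving policy, not the sensors:

```python
from dataclasses import dataclass

@dataclass
class LogFrame:
    t_before_impact: float     # seconds before impact
    obstacle_detected: bool    # object present in sensor returns?
    avoidance_commanded: bool  # did the planner act?

def blame(frames: list[LogFrame], reaction_budget_s: float = 2.0) -> str:
    sensed_early = any(f.obstacle_detected and f.t_before_impact >= reaction_budget_s
                       for f in frames)
    acted = any(f.avoidance_commanded for f in frames)
    if not sensed_early:
        return "sensing"                       # the data never made it in
    return "execution" if acted else "policy"  # saw it; did it ever act?

frames = [LogFrame(8.0, True, False), LogFrame(2.0, True, False)]
print(blame(frames))  # -> "policy": visible for 8 s, never acted on
```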
Respect to these guys for committing to the bit and letting the Tesla hit it. This is real journalism, in stark contrast to so much of the staged engagement-bait on YouTube.
I noticed that, too. They seemed to tense up and notice it with enough time to spare to manually intervene and either slam on the brakes or swerve to avoid it.
I rant about Elon a lot, but can someone just explain to me how this keeps going on? FSD is almost completely a solved problem by the likes of Waymo etc., so why does anyone care what Tesla is failing to do with FSD? Is this all about "how can we invent FSD without lidar"? Why are we bothering? Because Cybertruck owners don't want a dorky box on top of their truck? Does their truck not already look ridiculous?
FSD isn't just about the lack of lidar; it's the only system that can be employed everywhere, without any prior special mapping, by private owners. So far, no other manufacturer of vehicles available in the US is meaningfully competing in this space.
It's the only system whose manufacturer is willing to let it be employed everywhere. I bet Waymo would work better everywhere too, but they are safety-conscious and care about liability.
It's almost entirely a product/economic gamble. Basically:
"How do we get self-driving into millions of regular consumer cars without doubling the price by adding expensive sensors and redesigning the vehicle around huge chunky sensors"
Waymo is focused on taxis for a reason: the hardware is likely going to be unaffordable except to people driving Lambos. But that may also be fine for a big chunk of the public who don't want to own cars (or want to rent their own as a service).
Some consumer car companies are experimenting with adding smaller LIDAR sensors, like the recent Volvo EX90, which costs ~$100k. But they aren't as sophisticated as Waymo's.
Is LIDAR full of unobtainium? Is there some fundamental first-principles reason that the cost of LIDAR can't go down by orders of magnitude? Isn't that kind of reasoning from first principles what Elon's genius was supposed to be about, like with cost of accessing space? But apparently he thinks a spinning laser is always going to cost six figures?
> I've written off pretty much everything he says since sometime before 2020, too many lies about self-driving to count.
On one hand, I agree with you that this is maddening.
On the other hand this is, as the kids would say, "the meta" in 202X for business and politics.
Lying at scale and never taking responsibility for anything obviously works exceptionally well, and not just for Elon. I wish it didn't, but it does.
> if you as a European happen to drive on US highways, you will notice that they are heavily littered with fallen cargo

Your rake is much less likely to fall out of a van than out of the bed of a pickup. If stuff can't fall out, there's nothing to clean up.
Second, that debris was a metal ramp used to load vehicles onto a trailer (think bobcat-like). Towing a trailer like that in Europe requires additional licenses, which come with training around tying down EVERYTHING and double-checking. In the USA you are allowed to tow it with the same license you need to drive a Smart Car.
> It depends on where in the US you drive.

I honestly can't recall ever feeling like I was going through a markedly different area, in either the better or worse direction.
Have you compared distances? Germany is also the country with the most traffic, because it's at the center of everything and pays for roads with income tax instead of tolls.
The French charge really high tolls and have very little traffic compared to Germany. They really don't have an excuse.
> they are heavily littered with fallen cargo, aluminum ladders, huge amounts of shredded tires

They are not, in my experience. But yes, there are folks whose job it is to clean that stuff up.
That's kind of irrelevant, this technology is meant to be safer and held to a higher standard.
Comparing to a human is not a valid excuse...
It's the standard Tesla set for themselves.
In 2016 Tesla claimed every Tesla car being produced had "the hardware needed for full self-driving capability at a safety level substantially greater than that of a human driver": https://web.archive.org/web/20161020091022/https://tesla.com...
Wasn't true then, still isn't true now.
This is what Musk has been claiming for almost a decade at this point, and yet here we are.
> You can see the driver reaching for the wheel; presumably he saw it coming

Not presumably; we know for sure, since they were talking about it for a long time before impact.
The point of the experiment was to let the car drive, so they let it drive and crash. But we know the humans saw it.