I think one of Tesla's biggest problems with any of their autonomous systems is the names they give those systems. They confuse regulators, reporters, and consumers.
Most reporters seem to confuse "autopilot" with "full self driving" or vice versa. It's not uncommon to see sentences like "Autopilot, a feature that costs $10k, is still in beta."
Autopilot is cruise control that keeps you within a lane when it can. That's it. Other cars have it, and they're equally effective or ineffective.
Full self driving, aside from being terrifying, does it all, and I suspect it's usually what regulators and reporters mean when they say "Autopilot."
I think if Tesla had used "lane aware cruise control" as the name for "Autopilot", it'd have helped reduce the number of times people confuse the two.
Frankly, as an owner of a Tesla that has both features, I wouldn't mind at all if both were pulled from the software. FSD is absolutely terrifying. I'd rather let my teenage son drive me than have FSD do it. Autopilot feels like a novelty. Given how frequently the car yells at you when you're not holding the wheel, it's hardly better than simple cruise control.
It doesn't help that Tesla's own marketing confuses the two features. For example, their marketing material for Autopilot sure sounds like it's describing a feature called "Full Self-Driving"[1]:
> The person in the driver’s seat is only there for legal reasons. He is not doing anything. The car is driving itself.
That very same embedded video from Tesla has the title "Autopilot Full Self-Driving Hardware"[2], as well, further confusing the two.
> Given how frequently the car yells at you when you're not holding the wheel, it's hardly better than simple cruise control.
I tried that in a Hyundai Sonata traveling at highway speeds. The car stayed in the lane, turning the wheel on its own as it was supposed to. My hands were off the wheel, but at no point did I feel comfortable enough to look elsewhere, or move my hands further than a couple of inches away from the wheel. It's completely pointless to me, and actually worse than that, it gives me one more thing to worry about. I suppose if all of the vehicles were 100% autonomous, and the entire highway infrastructure was set up with that expectation, I could go ahead and fully relax while the car drives itself completely. But this 10% automation is like an uncanny valley. Well, valley of terror is more like it.
Cruise control on the other hand seems fine, because my foot is basically in the same position with or without it. The automation just saves me from having to modulate the thing with my foot, and it's not making changes that have very fast, significant consequences.
I have a car with lanekeeping and distance-keeping (Mercedes) and it is extremely nice for long distances on U.S. interstates. After many hours of driving, I feel much less fatigued, and more able to keep up "command" tasks like looking ahead and scanning mirrors, compared to no lane assist. I can't recall a single time when it made a dangerous move, either; it's been extremely good at following lane markings even in difficult conditions like driving into the sun, with patchy concrete. I've been really impressed.
> because my foot is basically in the same position with or without it.
Do you think that's true in general? I've never left my foot hovering over the pedals, back to when cruise control first became popular. I rest my foot fully on the floor, which is much more comfortable, but requires that I move my foot to apply either the brake or the accelerator.
> Cruise control on the other hand seems fine, because my foot is basically in the same position with or without it. The automation just saves me from having to modulate the thing with my foot, and it's not making changes that have very fast, significant consequences.
Maybe I am just living in the past with my 2007 Toyota Camry but cars have come a long way in just fifteen years. Just look at adaptive cruise control or lane assist.
Maybe I am just in awe because I don't own a new car and don't feel the pain of things like replacing a broken windshield (holy smokes, these new things are expensive to calibrate), but it is nice when these things work. When I drive someone else's new car I can totally see myself going for a new one if I could somehow justify it financially.
I agree with what you say, but part of the reason Tesla is even around to have these problems is Musk's dubious talents as a relentless hype man. Musk himself says they were at the brink of bankruptcy at least twice: https://www.businessinsider.com/elon-musk-tesla-bankruptcy-m...
Personally, I think if we had better laws Musk would be looking at charges of criminal negligence for encouraging people to think they have things like actual "autopilot" and "full self driving". But as it is, he lied his way to the top of the world's richest list. So Tesla's problems here are just the consequences of his actions.
He strikes me more as a physicist arguing from first principles. Once you have drive-by-wire and a sensor array, self-driving is a pure software problem. You should be able to get 99% of the way there in simulation with a team of four really smart coders, an endless supply of Red Bull, and a year, tops. Also, the world's energy problems are trivial when you consider how cheap and efficient solar power is, and how little surface area is required. Clean water is just a matter of filtration at best or desalination at worst, which devolves into an energy problem. And so on.
I do it myself sometimes, but at least I recognize how annoying and inaccurate it is. It is kind of nice to have someone arguing these positions and even better to act on them, and I for one wish that these analyses held (the world would be a better place if they did). But the real world just doesn't care about your first principles, and even simple ideas present obstacles you never imagined. Musk knows this, but willfully leaves it out during his pep talks, which is dishonest. But sometimes I get the uncomfortable feeling he's deceiving himself, which is quite chilling considering the amount of real power he's amassed (being able to launch large things into space is about as real as real power gets.)
"he lied his way to the top of the world's richest list" ... why do midwits always choose AP threads to gather? I guess you don't need to start a rocket company, an electric car company, or try to create a fintech company in 1999 ... why work 100-hour weeks when you can just "lie"? If Biden goes ahead with this he will only shoot his own supporters in the foot, like GM (half of its current value is tied to Cruise), and blow up the VC M&A pipeline for all the lidar-based startups.
>>Personally, I think if we had better laws Musk would be looking at charges of criminal negligence for encouraging people to think they have things like actual "autopilot" and "full self driving". But as it is, he lied his way to the top of the world's richest list. So Tesla's problems here are just the consequences of his actions.
Your idea of better laws would mean no Tesla and no SpaceX.
Tesla has replaced two million gasoline cars with electric cars, and given its current growth rate, and Musk's long standing plan to release progressively more affordable cars, this number will likely be massively larger in a few years.
Beyond Tesla's own sales, its success has sparked massive investment by other carmakers to push their electric vehicle manufacturing timetables forward. All told, Tesla has had a massive impact in pushing the world to replace gasoline vehicles with electric ones.
SpaceX, for its part, is responsible for reducing the cost of launching material to orbit tenfold, with another hundredfold reduction possible with Starship. The spike at the end of this graph is almost solely due to SpaceX:
Tesla “Autopilot” is otherwise known as the combination of “lane keeping assistance” and “adaptive cruise control” for virtually every other manufacturer. Sure, Tesla’s version is better than many of their competitors, but it still doesn’t make the marketing Tesla insists on doing any less misleading.
No way, are you for real? Autopilot is incredible. For me it substantially reduces fatigue on long highway drives and decreases my stress when commuting in stop-and-go traffic. The difference between Autopilot and simple cruise control is substantial. I wouldn't buy a car that didn't have Autopilot at this point.
You couldn't pay me to use it in stop and go traffic. My stress would be through the roof. I was using Autopilot once on one of the straightest stretches of highway in America (I-80 as it passes through Bonneville Salt Flats in Utah). With no cars on the road, and perfectly clear conditions, Autopilot decided that the lane I was in was ending (it wasn't) and began merging on to the shoulder.
I've never used Autopilot without it doing something clearly stupid. That is not a recipe for "stress free."
Are you sure we're talking about the same thing here?
I agree 100%. It definitely makes me a safer driver because I can pay attention to other things that may be going on 7 cars away when I don’t have to micromanage the job of staying in the lane and making sure I’m at the correct distance between the car ahead of me.
I recently drove a BMW 750i with lane-following on a long trip, and the lane-following feature was terrifying. It seemed like it was doing the most brain-dead line-following algorithm, one that made it "bounce" back and forth between the left and right sides of the lane. I couldn't stop worrying that it was freaking out the other cars around me by careening towards them and yanking back in the other direction at the last moment. I sure hope Tesla Autopilot is better than that.
My Mercedes E-class has a setting where you can change between strict and relaxed (whatever the names were.)
In strict, it very closely keeps to the middle of the lane. That's not how I normally drive, so it feels scary. I usually bias away from oncoming traffic. It's probably fine on a wide highway, so this is more about me not feeling in control.
However, in relaxed mode, it seems to have the issue you're describing where there's too much slack, and it looks like it's playing pong between the lane dividers. Have the exact same worry about what this looks like to others...
It feels like different systems are better and worse at that. On my 2019 Honda Insight, it's actually pretty smooth and does a good job of staying in the center of the lane, but it won't turn the wheel too much -- if you're taking all but the gentlest curves, you're going to be turning the wheel. On my mom's 2018 Subaru Outback, it's exactly what you're describing -- it's fine in a straight line, but it doesn't really seem to judge the curve of the lanes as much as simulate a pinball. (I don't think I ever tried to let it take a turn on its own!)
I've only ridden in Teslas, not driven one, but it seems like it's probably got the best following/lane-keep system on the market, although the last time I rode in one -- admittedly about two years ago -- it was markedly jankier when the driver engaged more "self-driving" features, e.g., have it change lanes, take exits based on its own GPS guidance, etc. It was impressive that it could do it, to be sure, but it was the self-driving version of your reckless friend in high school you do your best never to ride with.
It's usually very good at staying perfectly centered between painted lane lines, even when following curves. This is fine usually, except when a lane on the right merges into your lane. In the United States, interstate highways don't always have any sort of markers establishing the non-merging lane. Because of that, about halfway into where the lanes merge, Autopilot considers that portion just one very wide lane and aggressively tries to center itself, rather than just keeping a relative distance from the lane markings on the left.
VW/Audi lane assist also "bounces" you off the side of the lane, but it doesn't send you so far to the other side that you bounce back. Overall the combination of lane assist + adaptive cruise does an okay approximation of FSD on highways.
Tesla fans will say "Autopilot in planes isn't much more than glorified cruise control," but they miss the point. What matters is what people think when they hear "autopilot," and Tesla AP or even NoA is nowhere close to that perception.
No one is confusing anything. It's obvious that they intend these things to do much more than they are currently capable of. Why are you making excuses for a company that some 8 years in has only ever doubled down on the naming and the connotations?
Software in general has a naming problem: the names all tend to be aspirational metaphors. From 'AI', 'ML', and 'Cryptocurrency' to 'Agile process' and 'self-driving', the confusion they create is really frustrating, and harmful at times.
A Big Mac isn’t a big burger. I’m sure many shoe names don’t reflect well on the increased abilities they sound like. Naming is a big part of marketing hype. It always has been, it’s not just software.
For instance, there's this paragraph: "Autopilot and Full Self-Driving Capability are intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment. While these features are designed to become more capable over time, the currently enabled features do not make the vehicle autonomous."
The problem is that this technical writing runs counter to all the flashy marketing, and the flashy marketing sticks in people's minds more thoroughly.
> Autopilot is cruise control that keeps you within a lane when it can. That's it. Other cars have it, and they're equally effective or ineffective.
But "Autopilot" obviously sounds cooler. So you can pay $10k and feel good, while that feature costs €1,300 in my Hyundai (where it also comes with heated seats and a heated steering wheel in the package).
The more I read and see from Tesla, the more it looks like some kind of A/B test of how much money people are willing to spend on a beta software/car test.
Tesla's Autopilot feature is standard. The thing that costs $10k is Full Self Driving.
Which is an example of what I mean when I say "Tesla's biggest problems with any of their autonomous systems is the names they give those systems. They confuse regulators, reporters, and consumers."
I have a Nissan Leaf with lane following and adaptive cruise. It’s nice on road trips but it’s not something you can set and ignore. It’s just better cruise control that lets you relax a bit more.
If FSD is anything like the many videos I’ve seen I don’t want it and don’t think it’s ready for general use.
FSD seems like one of those problems where getting 80% of the way there takes 20% of the effort and getting the remaining 20% takes 80% of the effort. There is a truly massive chasm between adaptive cruise with lane following and FSD.
I drove a rented Nissan Murano and I found the lane keeping and adaptive cruise control to be excellent. If the auto supply was in better shape, I’d have seriously considered buying one.
I think the number of comments in this thread where people confuse Autopilot with FSD is reason enough to at least rebrand the names of the two distinctly different features.
If your Tesla is "yelling at you when you're not holding the wheel" "frequently" you are admittedly misusing the level 2 assist.
On the contrary, I use Autopilot responsibly and appreciate that, statistically, it is reducing my odds of death. As such I WOULD mind that a political decision, like Buttigieg's appointment, is being made to endanger me.
Am I downvoted for using level 2 with my hands on the wheel? Or for recognizing its safety value? Or for agreeing with the author's assessment of Buttigieg's appointment?
My use case was Autopilot steering me off the left-hand side of the road at 80 MPH during a sunset [1]. My guess (a WAG) is that the ML corpus was based on a Model 3, while the Model Y I was driving had the sensors placed slightly higher? I dunno; again, 'tis a WAG.
I am a huge fan of Tesla's premise. My Dad sold Vanguard Citicars [2], and I worked on GM's interactive ad for the EV-1. I put a down payment on Aptera's original EV and again on the resurrected version [3].
But I backed out of buying a Tesla. Even though I disabled Autopilot and ran purely on cruise control, the Model Y would brake for phantom obstacles. Such as: low rises in the road. Or: passing an 18-wheeler on the left. Driving from AZ to CA, the phantom braking was occurring every 5 miles or so. So I had to drive 800 miles with a manual accelerator. Bummer!
When my dad was teaching me how to drive, he told me that cruise control is only for when you have at least a hundred feet between you and the next car, and that you fully turn it off to pass other cars. I treat Autopilot the same way when I use it. I treat it as cruise control, if I use it at all.
Problem is that, with no cars around for hundreds of feet, with dozens of seconds of reaction time, when you hit a low rise on the AZ Interstate, the Tesla Y will abruptly brake. This is with Autopilot OR with simple cruise control. Pretty stressful. Bonus impact on range prediction. So, I couldn't use either.
I know this is a controversial take, but I'll say it anyway: anyone who doesn't feel safe driving with cruise control is a dangerous driver. Cruise control is a tool. So long as you understand how it works, and that it doesn't paralyse your right leg, you really can use it in any driving context. It is an alternative input mechanism to the accelerator pedal, nothing more.
> the Model Y would brake for phantom obstacles. Such as: low rises in the road. Or: passing a 18-wheeler on the left.
Fwiw, I had similar phantom braking experiences with my Model Y.
One of the recent updates fixed it such that phantom braking rarely happens (maybe once per 1000 miles driven or less now versus almost every time I passed an 18-wheeler on a two-lane highway before).
Autopilot cannot be enabled until the position of the cameras is calibrated. That's why it requires you drive on well-marked roads for several miles when you first receive your car, before it works.
It's anything but trivial to make a neural net properly abstract over some "camera position parameter".
Moreover, it's nearly impossible to be sure it properly abstracted in all cases. It might generalize in every case except some arbitrary edge case that looks in no way special to a human, but is to the NN for some arbitrary reason.
Anyway, this is highly speculative and might very well be unrelated to the given Tesla's behaviour.
One thing I don't quite understand is that many cars have autopilot-like software. Unlike Tesla's, which constantly requires putting pressure on the wheel to show you're there and paying attention, Ford's lets you drive indefinitely without any hands on the wheel. Wouldn't this same investigation be put onto other manufacturers as a giant audit? Hitting emergency vehicles is obviously bad, but 1) it happens in cars without Autopilot, and 2) if you're hitting one you're clearly not paying attention. It's not like they just appear out of nowhere.
What it comes down to for me is the marketing. The other manufacturers are very careful in how they market the software, Tesla is not.
If you look at Mercedes, for example, their marketing page describing their driver assistance technology[0] (with a very similar feature set to Tesla's) uses the word "assist" more than 30 times and in practically every header. Few people would come away from that marketing thinking that their car is going to drive itself without them paying attention.
Tesla, in contrast, advertises their "autopilot" and "full self driving" capabilities. The word "assist" is used exactly once on the Autopilot landing page[1]. The rest of the words and names are carefully chosen to convey a sense of total autonomy.
I find it really interesting that for Mercedes L3 self driving they are even willing to put their money where their mouth is and take liability for the car in self driving mode.
The page literally talks about "full self-driving capabilities [...] through software updates designed to improve functionality over time". Which to me sounds like FSD is not actually being advertised as currently existing, but rather as something you'll eventually get.
> Unlike tesla's which constantly requires putting pressure on the wheel to show you're there and paying attention, Ford's lets you drive indefinitely without any hands on the wheel
That’s not Ford’s version of Autopilot, it’s one step further. It’s actually hands off (despite how many treat autopilot). Named BlueCruise.
It’s comparable to GM SuperCruise. It ONLY works on specially mapped divided highways that Ford has approved. It will disengage for strong turns and anything it’s not ready for. You MUST watch the road, it keeps track with a camera on the wheel.
Basically the way it treats the driver is far more conservative. Instead of telling the driver they need to pay attention, it actively monitors them. Instead of saying “you should only use it on these kind of roads“ it actively prevents you.
It’s a fundamentally different approach. Ford’s ACC (not hands-free, Co-Pilot 360) constantly monitors for steering wheel torque to ensure your hands are on the wheel, and disengages pretty quickly if they’re off and you ignore the warning.
That said, I have it on my car. It’s freaky as hell to use, kind of scary. Maybe I would use it on long drives in the country, but I just don’t want something else that's in charge in even medium traffic.
That is kind of similar to Tesla still though. At least with FSD Beta there is a camera actively monitoring you and making sure your eyes are on the road. I've also been on roads where autopilot either won't activate at all or will activate but at a reduced speed. I was hoping that once Tesla enabled that internal camera they would stop relying on weight on the steering wheel and just use eye tracking.
As far as I know there has been an unusually high number of "unusual" accidents associated with Tesla's Autopilot, and there is no such observation for other car manufacturers.
This doesn't mean those other systems are more advanced; it could actually mean they bail out earlier due to being less advanced, and in turn luckily avoid these problems.
Or that they are much much less used.
But then Tesla is not really known for good QA.
And in the past there have been multiple unrelated tests of emergency braking systems in which Tesla cars failed really hard, behaving worse than many much "simpler", less advanced systems. Sometimes to the point of only braking after hitting the pedestrian... (a mechanized test-dummy puppet that the Tesla system, by its own feedback, recognized as human).
If your most advanced self-driving system can't even compete with emergency braking systems, and by such a large margin, I would not be surprised if Tesla's system has major faults, tbh.
Ford, like GM, has a literal camera monitoring the driver's face so they can see if they are paying attention. Seeing what your face is doing is a much more reliable system of measuring attention than whether or not the driver is touching the steering wheel while reading their book.
I don’t think anyone reads Ford’s (for example) manual and thinks “oh wow, I don’t even have to pay attention or have my hands on the wheel!”.
The fact that it’s a big enough problem for Tesla that they have to monitor it, and still have issues, points to the main difference being user expectations and marketing around these features.
I don't know anything about other manufacturer's systems but I've seen video of Tesla's on autopilot doing unsafe things and read a lot of anecdotes of this as well. Elon Musk has made ridiculously optimistic statements about when Teslas will be self-driving and that by itself can influence people's behavior - perhaps fatally.
Edit: also, Tesla removed Lidar from their system. And there have been well-publicized deaths of people using autopilot.
Tesla has not removed Lidar from their system. You are mixing things up: they removed radar. And most of the cases in the report are about the system with radar.
I bought FSD version of Model 3 in the spring of 2020. I'm still waiting for what I purchased to be turned on. Frankly, I'd be psyched if they would expand on the little things instead of the big things. "Park the car in that spot". "Exit the parking spot". These would be worth the price I paid.
I bought FSD version of Model 3 in the spring of 2020
I don't understand how people keep falling for this. Sure, it seemed realistic enough at first but how many cracks in the facade are too many to ignore? In 2015 Musk said two years. In 2016 he said two years. In 2017 he said two years. Tesla did a 180 on the fundamental requirements of FSD and decided it doesn't need lidar just because they had a falling-out with their lidar supplier. That level of ego-driven horseshit is dangerous.
It's still not actually available to people who paid for it, and it doesn't actually work* (at least not to a degree where it can be trusted not to crash in to street signs or parked cars). I have no idea why anyone pays a $10k premium for vaporware.
> I don't understand how people keep falling for this.
Every successful fraud has people it's tuned for. For example, consider how terribly written most spam is. That selects for people who are not fussy about writing. Conversely, a lot of high-end financial fraud is done by people who are very polished, very good at presenting the impression of success. Or some years back I knew of a US gang running the Pigeon Drop [1] on young East Asian women in a way that was tuned to take advantage of how they are often raised.
Tesla only has ~3% of the US car market, so they're definitely in the "fool some of the people all of the time" bucket. Musk's fan base seems to be early adopters and starry-eyed techno-utopians [2]. He's not selling transportation. He's selling a dream. They don't care that experts can spot him as a liar [3] because listening to experts would, like, totally harsh their mellow.
Although it's much closer to legal fraud, I don't think that's otherwise hugely different from how many cars are marketed. E.g., all of the people who are buying associations of wealth when they sign up for a BMW they can't afford. Or the ocean of people buying rugged, cowboy-associated vehicles who never use them for anything more challenging than suburban cul-de-sacs.
There's got to be a sizeable number of people who have paid for this feature and never received it before selling the car. And of course Tesla is willing to re-sell the feature to the next driver. Are they willing to pay back customers for a function they haven't shipped?
They had a falling-out with Mobileye, who provided the AP 1.0 hardware. It never used LIDAR. Tesla doesn’t use LIDAR because you need to solve vision anyway. And once you solve that, LIDAR makes no more sense.
(You need to solve vision anyway, because for that object, which LIDAR tells you is exactly 12.327 ft away, you still need to figure out whether it is a trashcan or a child. And if it is a child, whether it is about to jump onto the road or walking away from the road. LIDAR does not tell you these things. It can only tell you how far away things are. It is not some magical sensor which Tesla is just too cheap to employ.)
The ability to use the phone or remote to move the car forward or back in a straight line is super useful and a cool, novel feature by itself. It’s also a buggy piece of shit that a few engineers could probably greatly improve in a month. Doesn’t seem like Tesla cares, it’s been stagnant for years.
Meanwhile Tesla is still charging people $10,000 for an FSD function that doesn’t exist.
> The ability to use the phone or remote to move the car forward or back in a straight line is super useful and a cool, novel feature by itself.
Is it? It’s hard to think of a situation where moving a car I’m at most a couple of hundred feet from backward or forwards in a straight line by fiddling with my phone is superior to just getting into the car and moving it myself. Maybe I’m not finding myself and my car on opposite sides of a gorge often enough?
I made the same mistake in 2019. $6000 for FSD. The car is an enigma. It is simultaneously wonderful and a total scam. My next one will be some other EV from another manufacturer. I would never buy another Tesla or anything else from Elon. Like the car, he is an enigma. I oscillate between believing he is brilliant and seeing him as a lying snake oil salesman. I can’t exactly say I’m disappointed with my purchase overall because it is a damn good EV, but I can’t help but feel like I was conned.
I got it this week. It's not really worth using in its current state. It makes minor mistakes every minute or two, major ones every five minutes or so.
I bought it years ago mostly for the guaranteed computer upgrade and the novelty of testing it as it develops. It's great as a novelty, really cool! But it is dangerous to use right now, and I don't care what anyone says, it's not going to be truly ready for years. In fact I think it needs another computer upgrade and probably a camera upgrade too.
I agree that they should have nailed autopark and summon before moving on to FSD. As it stands they are both useless. But if I could record and play back summon paths in known locations, that would be actually useful.
> I bought it years ago mostly for the guaranteed computer upgrade and the novelty of testing it as it develops.
I did the same, and it was significantly less expensive. I wouldn't buy it for $1 today.
Also, your idea of novelty is my idea of a nightmare. There has never been a time when I used it where it didn't do something completely insane. The last time I used it (which will truly be the last time), it waited patiently to make a left-hand turn. It waited far longer than I would have, and it was clear of oncoming cars for ages. When it did decide to turn, it did it when there were several cars coming, though it was still safe. Except then, halfway through its turn, it literally stopped, then turned a bit to the right and centered itself in a lane going the opposite direction of traffic flow, right into oncoming cars. Thankfully I was able to take over, and two of the three oncoming cars stopped. Had I done nothing, we would have all been in a head-on collision.
The fact that we don't have reliable, fully automated parking yet is bizarre. I'd love a solution that automatically parallel parks for me, and a computer with sensors around the vehicle should be able to do a better job. Plus, the problem is of limited scope, and low speed, so you don't have to deal with most of the potential issues and dangers with full self driving on a road.
BMWs have had that for quite a while. My 2016 5 series does it perfectly 100% of the time. It even gets into some spots that I consider way too short, but it surprises me.
My 2015 Model S does parallel parking between two other cars quite well. It will also reverse into the space between two cars parked perpendicular to the road.
If you have been following their updates, it looks like that they have finally found the correct approach for FSD and it is improving very quickly. At this rate, it seems that it'll be close to level 5 this year or next year.
> If you have been following their updates, it looks like that they have finally found the correct approach for FSD and it is improving very quickly. At this rate, it seems that it’ll be close to level 5 this year or next year.
Yeah, their perpetual 2-year estimate dropped to 1 year around a year and a half ago, after being at 2 years for at least 6 years. So we’re probably four and a half years from it either being ready…or dropping to 6 months off for the next several years.
They could turn this around if they gave up on the futurologist software bullshit and strived to simply sell battery-powered cars. Stop selling dreams of robotaxis, just sell cars.
This is actually what excited me about Tesla, I thought I’d just get a great car that was electric.
I didn’t really ask for a smart car.
I also drove my friend's Volkswagen the other day, and the lane assist was just ridiculous. I was driving pretty fast on a remote road when the line markings went missing; it was a dark night. It immediately disengaged, and had I not been paying attention we would have gone straight into a barrier or the other lane. It doesn't claim to be autopilot, but either I'm driving the car or I'm not, imo.
Yes, become a boring company nobody cares about, like the rest of the car industry. Nobody should try to innovate with futuristic products; better to become a bureaucratic behemoth like the government, which has an excellent track record of success.
Tesla succeeded in showing the world that an electric car can be something more than a glorified golf cart. That is frankly a huge achievement, compared to what came before. EVs are now the centerpiece of every manufacturer's plans for the coming decades, and that is because of Tesla. Whether they succeed as a brand is not really important anymore.
"EVs are now the centerpiece of every manufacturer's plans for the coming decades, and that is because of Tesla."
100% because of Tesla? I remember in college, before any EV existed, being asked to sign a petition to car manufacturers: "please build EVs so I can buy one."
I agree Tesla has shifted the perspective on EVs. But the absolutism, that nothing was going on before Tesla and nothing since, seems perhaps myopic.
TSLA will succeed if you believe the hypothesis that oil prices will continue to climb to the point that many buyers are priced out of that market and switch to electric vehicles. It’s screwed if the alternate hypothesis is true (that is, we continue to use oil forever).
Oh man you’re going to be really upset when you hear that autopilot is just another name for advanced cruise control. A huge percentage of the cars on the road have it.
This is fear mongering disguised as concern. Tesla publishes data about incident rates for drivers both on and off autopilot and they are markedly lower than other drivers.
>And yet my life is/may be in danger because of it.
??? While neither is perfect, your life is more in danger from a regular driver, who statistically may be drunk, distracted, going too fast for their skill level, etc., versus that exact person in a Tesla using FSD.
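On those statistics: raw on-Autopilot vs. off-Autopilot crash rates are hard to interpret, because driver-assist systems are engaged mostly on highways, where crash rates are lower for everyone. A toy sketch with made-up numbers (NOT Tesla's actual data) shows how the mileage mix alone can flip a comparison:

```python
# Toy illustration of mileage-mix confounding (Simpson's paradox) in
# crash-rate comparisons. All numbers are hypothetical, NOT Tesla's data.

def crashes_per_million_miles(crashes, miles):
    return crashes / miles * 1_000_000

# Hypothetical fleet data: road type -> (crashes, miles driven)
assisted = {"highway": (15, 50_000_000), "city": (3, 2_000_000)}
manual   = {"highway": (5, 20_000_000), "city": (36, 30_000_000)}

def overall(data):
    # Pool crashes and miles across road types, then compute one rate.
    crashes = sum(c for c, _ in data.values())
    miles = sum(m for _, m in data.values())
    return crashes_per_million_miles(crashes, miles)

print(f"overall: assisted {overall(assisted):.2f} vs manual {overall(manual):.2f}")
for road in ("highway", "city"):
    a = crashes_per_million_miles(*assisted[road])
    m = crashes_per_million_miles(*manual[road])
    print(f"{road}: assisted {a:.2f} vs manual {m:.2f}")
```

In this hypothetical, the assisted fleet looks more than twice as safe overall, yet is worse on both highway miles and city miles taken separately; like-for-like comparisons per road type are the honest ones.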
It doesn't help that Musk tries to create confusion by naming things something they're not. Next thing he'll implement might be the Flying Car Update... which will turn lights green as you drive by using the 14hz infrared signal.
To other commenters: this is a technical audit of autopilot software, not Tesla FSD.
And the actual context is much less of a big deal than it seems: the biggest plausible consequence would be forcing Tesla to push an over-the-air update with better driver attention monitoring or alerting.
The worst case outcome would require Tesla to disable the autopilot suite entirely, for an indeterminate amount of time, perhaps permanently on the existing fleet of vehicles.
The NHTSA is tired of Tesla hand-waving away its safety investigations into Autopilot by pushing stealth updates that fix the specific scenarios in the specific places being investigated. NHTSA wised up to that and independently purchased its own Tesla vehicles, and disabled software updates, so that it can reproduce those scenarios itself.
If NHTSA asks Tesla to provide system validation tests showing that an updated version of their software meets the design intent of the system, Tesla would not be able to do so. If they can't prove the new Autopilot software corrects the safety-related defects identified in the current version, then it's not a valid recall remedy.
All evidence from their own AI/AP team and presentations is that there is no real design and system validation going on over there. They're flying by the seat of their pants, introducing potentially lethal regressions in every update.
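For context, "system validation" here would mean something like a scenario-based regression suite: every previously identified failure case must pass on the new build, and no previously passing case may newly fail. A toy sketch of that acceptance logic (all scenario names and the harness itself are hypothetical, not Tesla's or NHTSA's actual tooling):

```python
# Toy sketch of scenario-based regression testing for a driver-assist
# update. A valid recall remedy must fix the known defect scenarios AND
# introduce no regressions. Scenario names are made up for illustration.

# (scenario_id, passed_on_old_build, passed_on_new_build)
results = [
    ("stationary-emergency-vehicle", False, True),
    ("crossing-semi-trailer",        False, True),
    ("faded-lane-markings",          True,  True),
    ("construction-zone-cones",      True,  False),  # a regression!
]

fixed = [s for s, old, new in results if not old and new]
regressions = [s for s, old, new in results if old and not new]

# Every scenario must pass on the new build, with no regressions.
is_valid_remedy = not regressions and all(new for _, _, new in results)
print("fixed:", fixed)
print("regressions:", regressions)
print("valid remedy:", is_valid_remedy)
```

The point is that a single regression, like the hypothetical one above, invalidates the whole remedy no matter how many scenarios improved.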
> All evidence from their own AI/AP team and presentations is that there is no real design and system validation going on over there. They're flying by the seat of their pants, introducing potentially lethal regressions in every update.
What is this evidence?
I've seen a few talks from Andrej Karpathy that indicate to me a more deliberate approach.[0] "Software 2.0" itself seems like an approach meant to systematize the development, validation & testing of AI systems, hardly a seat-of-your-pants approach to releases. I have my own criticisms of their approach, but it seems there is pretty deliberate care taken when developing models.
[0] https://youtu.be/hx7BXih7zx8
> The NHTSA is tired of Tesla's hand-waving away their safety investigations into Autopilot by pushing stealth updates that fix specific scenarios in specific places being investigated.
Why isn't Tesla prosecuted for that? It's lawless!
If you read the report, you will realize that NHTSA is considering requiring Tesla to do better driver attention monitoring, or to improve alerting. They are not considering banning autopilot.
> introducing potentially lethal regressions in every update.
Meh. I mean, I understand the emotional content of that argument, and the propriety angle is real enough. I really do get what you're saying. And if your prior is "Tesla is bad", that's going to be really convincing. But if it's not...
The bottom line is that they're getting close to 3M of these vehicles on the roads now. You can't spit without hitting one in the south bay. And the accident statistics just aren't there. They're not. There's a small handful of verifiable accidents, all on significantly older versions of the software. Bugs are real. They've happened before and they'll no doubt happen again, just like they do with every other product.
But the Simple Truth remains that these are very safe cars. They are. So... what exactly are people getting upset about? Because it doesn't seem to be what people claim they're getting upset about.
Wow. It seems like this sort of turn of phrase has sadly become a boilerplate way to dismiss a host of comments on HN. What "actual report" are you referring to? The headline article is about an investigation that has been opened but not yet closed, as the linked text shows (it's a PDF, but it's essentially a press release laying out the discoveries so far and why they've escalated; it's not closed, and it's not friendly to Tesla):
"Accordingly, PE21-020 is upgraded to an Engineering Analysis to extend the existing crash analysis, evaluate additional data sets, perform vehicle evaluations, and to explore the degree to which Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks by undermining the effectiveness of the driver’s supervision"
https://static.nhtsa.gov/odi/inv/2022/INOA-EA22002-3184.PDF
> the biggest plausible consequence would be forcing Tesla to push an over-the-air update with better driver attention monitoring or alerting.
I see no basis for this conclusion, which appears to be pure speculation about what NHTSA might decide is necessary and sufficient to address the potential problems if confirmed by the deeper analysis.
I encourage reading the actual report.
> I encourage reading the actual report.
I did, and the conclusion to which you appeal does not appear to be well supported by it.
Tesla has no hardware for proper driver monitoring. Most Model S cars have no internal camera, and the Model 3's internal camera wasn't designed for it (it doesn't work in the dark, cannot see through sunglasses, cannot see where your eyes are actually pointing, etc.).
You cannot OTA HW deficiencies.
Now, will they be forced to have monitoring like that, to be on par with their competitors? That’s a different story, and given how weak USA regulatory agencies are, and how reckless Tesla is at disregarding them - I’m pretty sure Tesla won’t be hurt by it.
> Tesla has no hardware for proper driver monitoring. Most of model S have no internal camera. And model 3 internal camera wasn’t designed for it
See, that has been my whole point for months: both Autopilot and FSD are still unproven safety-critical software, and in circumstances like night driving they become far more dangerous at the worst time to drive, as I have said before [0][1][2][3]. Even worse, they lack proper driver monitoring.
I guess Tesla should be required to install that driver-monitoring hardware in their cars, as well as night-vision cameras, to avoid the crashes I have listed below. If the regulators enforce this, perhaps they might save another Tesla driver from losing control or prevent another crash.
Or, if Tesla still finds it difficult to prevent further crashes using night-vision cameras, perhaps they will have to admit that they should have used LIDAR instead.
Because that’s a stupid system. I will never accept being under constant camera surveillance by the car, no matter what stupid name people come up with for it (“attention monitoring”).
NHTSA formally investigated Tesla Autopilot after incidents in 2016, 2017 and 2018. There was also a formal NHTSA investigation into Tesla battery fires in 2013. These investigations span 3 presidencies under both parties.
Would this mean that Tesla would have to provide refunds to customers who purchased FSD? If so, would they pro-rate the refunds, as is customary under lemon-law refunds? On the flip side, could they be required to inflation-adjust the refunds, so that customers get back the same purchasing power they spent?
Separately, would this prevent Tesla from having a beta tester group that tries out FSD features at no cost?
[1] https://www.tesla.com/videos/autopilot-self-driving-hardware...
[2] https://vimeo.com/192179727?embedded=true&source=vimeo_logo&...
If Musk's moonshot had been successful, we would not be having this discussion now.
But it was not, and now Tesla is in trouble because they can’t deliver what people think they were promised.
Cruise control on the other hand seems fine, because my foot is basically in the same position with or without it. The automation just saves me from having to modulate the thing with my foot, and it's not making changes that have very fast, significant consequences.
Do you think that's true in general? I've never left my foot hovering over the pedals, back to when cruise control first became popular. I rest my foot fully on the floor, which is much more comfortable, but requires that I move my foot to apply either the brake or the accelerator.
Maybe I am just living in the past with my 2007 Toyota Camry but cars have come a long way in just fifteen years. Just look at adaptive cruise control or lane assist.
https://www.youtube.com/watch?v=smf1uop7HoY
Maybe I am just in awe because I don't own a new car and don't feel the pain of things like replacing a broken windshield (holy smokes, these new things are expensive to calibrate), but it is nice when these things work. When I drive someone else's new car, I can totally see myself going for a new car if I could somehow justify it financially.
Personally, I think if we had better laws Musk would be looking at charges of criminal negligence for encouraging people to think they have things like actual "autopilot" and "full self driving". But as it is, he lied his way to the top of the world's richest list. So Tesla's problems here are just the consequences of his actions.
I do it myself sometimes, but at least I recognize how annoying and inaccurate it is. It is kind of nice to have someone arguing these positions and even better to act on them, and I for one wish that these analyses held (the world would be a better place if they did). But the real world just doesn't care about your first principles, and even simple ideas present obstacles you never imagined. Musk knows this, but willfully leaves it out during his pep talks, which is dishonest. But sometimes I get the uncomfortable feeling he's deceiving himself, which is quite chilling considering the amount of real power he's amassed (being able to launch large things into space is about as real as real power gets.)
Your idea of better laws would mean no Tesla and no SpaceX.
Tesla has replaced two million gasoline cars with electric cars, and given its current growth rate, and Musk's long standing plan to release progressively more affordable cars, this number will likely be massively larger in a few years.
Beyond Tesla's own sales, its success has sparked massive investment by other carmakers to push their electric vehicle manufacturing timetables forward. All told, Tesla has had a massive impact in pushing the world to replace gasoline vehicles with electric ones.
SpaceX, for its part, is responsible for a tenfold reduction in the cost of launching material to orbit, with another 100-fold reduction possible with Starship. The spike at the end of this graph is almost solely due to SpaceX:
https://ourworldindata.org/grapher/yearly-number-of-objects-...
I see laws that prevent the emergence and flourishing of Tesla and SpaceX as far worse than current laws.
> Many of our Autopilot features, like Autosteer, Navigate on Autopilot and Summon, are disabled by default.
So I don’t think it’s fair to say that Autopilot just refers to cruise control - that’s certainly not how Tesla use the word.
No way, are you for real? Autopilot is incredible; for me it substantially reduces fatigue on long highway drives and decreases my stress when commuting in stop-and-go traffic. The difference between Autopilot and simple cruise control is substantial. I wouldn’t buy a car that didn’t have Autopilot at this point.
I've never used Autopilot without it doing something clearly stupid. That is not a recipe for "stress free."
Are you sure we're talking about the same thing here?
In strict mode, it keeps very close to the middle of the lane. That's not how I normally drive, so it feels scary; I usually bias away from oncoming traffic. It's probably fine on a wide highway, so this is more about me not feeling in control.
In relaxed mode, however, it seems to have the issue you're describing, where there's too much slack and it looks like it's playing Pong between the lane dividers. I have the exact same worry about what this looks like to others...
I've only ridden in Teslas, not driven one, but it seems like it's probably got the best following/lane-keep system on the market. Although, the last time I rode in one -- admittedly about two years ago -- it was markedly jankier when the driver engaged more "self-driving" features, e.g., having it change lanes or take exits based on its own GPS guidance. It was impressive that it could do it, to be sure, but it was the self-driving version of your reckless friend in high school you do your best never to ride with.
If anyone is confused, it's intentional.
I guess you're right. I've never read a more confusing comment on HN.
Do you think Tesla's support page describing Autopilot and Autopilot with Full Self Driving is clear enough? https://www.tesla.com/support/autopilot
For instance, there's this paragraph: "Autopilot and Full Self-Driving Capability are intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment. While these features are designed to become more capable over time, the currently enabled features do not make the vehicle autonomous."
But "Autopilot" obviously sounds cooler. So you can pay $10k and feel good, while that feature costs €1,300 in my Hyundai (where it also comes with heated seats and a heated steering wheel in the package).
The more I read and see from Tesla, the more it looks like some kind of A/B test of how much money people are willing to spend on beta software in a car.
Which is an example of what I mean when I say "Tesla's biggest problems with any of their autonomous systems is the names they give those systems. They confuse regulators, reporters, and consumers."
If FSD is anything like the many videos I’ve seen I don’t want it and don’t think it’s ready for general use.
FSD seems like one of those problems where getting 80% of the way there takes 20% of the effort and getting the remaining 20% takes 80% of the effort. There is a truly massive chasm between adaptive cruise with lane following and FSD.
What? You claim to own a Tesla but don't know that you must keep your hand on the wheel at all times?
On the contrary, I use Autopilot responsibly and appreciate that it statistically reduces my odds of death. As such, I WOULD mind that a political decision, like Buttigieg's appointment, is being made to endanger me.
I don't think it really does, does it? I thought that "Autopilot" basically only worked on highways in fairly unsurprising conditions?
I am a huge fan of Tesla's premise. My Dad sold Vanguard Citicars [2], and I worked on GM's interactive ad for the EV-1. I put a down payment on Aptera's original EV and again on the resurrected version [3].
But I backed out of buying a Tesla. Even though I disabled Autopilot and ran pure cruise control, the Model Y would brake for phantom obstacles, such as low rises in the road, or while passing an 18-wheeler on the left. Driving from AZ to CA, the phantom braking was occurring every 5 miles or so, so I had to drive 800 miles with a manual accelerator. Bummer!
[1] https://news.ycombinator.com/item?id=31504583
[2] https://en.wikipedia.org/wiki/Citicar
[3] https://aptera.us
Wives-and-girlfriends?
Fwiw, I had similar phantom braking experiences with my Model Y.
One of the recent updates fixed it such that phantom braking rarely happens (maybe once per 1000 miles driven or less now versus almost every time I passed an 18-wheeler on a two-lane highway before).
It's anything but trivial to make a neural net properly abstract over some "camera position parameter".
Moreover, it's nearly impossible to be sure it abstracted properly in all cases. I.e., it might abstract correctly in every case except some arbitrary edge case that looks in no way special to a human, but is special to the NN for some arbitrary reason.
Anyway, this is highly speculative and might very well be unrelated to the given Tesla's behaviour.
If you look at Mercedes, for example, their marketing page describing their driver assistance technology[0] (with a very similar feature set to Tesla's) uses the word "assist" more than 30 times and in practically every header. Few people would come away from that marketing thinking that their car is going to drive itself without them paying attention.
Tesla, in contrast, advertises their "autopilot" and "full self driving" capabilities. The word "assist" is used exactly once on the Autopilot landing page[1]. The rest of the words and names are carefully chosen to convey a sense of total autonomy.
[0] https://www.mercedes-benz.com/en/innovation/autonomous/the-n...
[1] https://www.tesla.com/autopilot
> The person in the driver’s seat is only there for legal reasons. He is not doing anything. The car is driving itself.
[1] https://www.tesla.com/videos/autopilot-self-driving-hardware...
That’s not Ford’s version of Autopilot; it’s one step further. It’s actually hands-off (despite how many people treat Autopilot). It's named BlueCruise.
It’s comparable to GM Super Cruise. It ONLY works on specially mapped divided highways that Ford has approved. It will disengage for sharp turns and anything it’s not ready for. You MUST watch the road; it keeps track with a camera on the wheel.
Basically, the way it treats the driver is far more conservative. Instead of telling the driver they need to pay attention, it actively monitors them. Instead of saying “you should only use it on these kinds of roads”, it actively prevents you.
It’s a fundamentally different approach. Ford’s ACC (not hands-free, Co-Pilot 360) constantly monitors for steering wheel torque to ensure your hands are on the wheel and disengages pretty quickly if they’re not and you ignore the warning.
That said, I have it on my car. It’s freaky as hell to use, kind of scary. Maybe I would use it on long drives in the country, but I just don’t want something else that's in charge in even medium traffic.
Ah, thanks! I thought the dealer was saying "Blue's Clues"
This doesn't mean their system is more advanced; it could actually mean their system bails out earlier because it is less advanced, and in turn luckily avoids these problems.
Or that they are much, much less used.
But then, Tesla is not really known for good QA.
And in the past there have been multiple unrelated tests of emergency braking systems in which Tesla cars failed really hard, behaving worse than many much "simpler", less advanced systems. Sometimes to the point of only braking when hitting the pedestrian (a mechanized test dummy that the Tesla system, by its own feedback, recognized as human).
If your most advanced self-driving system can't even compete with emergency braking systems, by such a large margin, I would not be surprised if Tesla's system has major faults tbh.
The fact that it’s a big enough problem for Tesla that they have to monitor it, and still have issues, points to the main difference being user expectations and marketing around these features.
I don't know anything about other manufacturer's systems but I've seen video of Tesla's on autopilot doing unsafe things and read a lot of anecdotes of this as well. Elon Musk has made ridiculously optimistic statements about when Teslas will be self-driving and that by itself can influence people's behavior - perhaps fatally.
Edit: also, Tesla removed radar from their system (and never used lidar). And there have been well-publicized deaths of people using Autopilot.
I don't understand how people keep falling for this. Sure, it seemed realistic enough at first, but how many cracks in the facade are too many to ignore? In 2015 Musk said two years. In 2016 he said two years. In 2017 he said two years. Tesla did a 180 on the fundamental requirements of FSD and decided it doesn't need radar just because they had a falling-out with their supplier. That level of ego-driven horseshit is dangerous.
Look, the message is clear enough already: FSD is two years away, and always will be! You've got to admire that kind of consistency.
Every successful fraud has people it's tuned for. For example, consider how terribly written most spam is: that selects for people who are not fussy about writing. Conversely, a lot of high-end financial fraud is done by people who are very polished, very good at presenting the impression of success. Or, some years back, I knew of a US gang running the Pigeon Drop [1] on young East Asian women in a way that was tuned to take advantage of how they are often raised.
Tesla only has ~3% of the US car market, so they're definitely in the "fool some of the people all of the time" bucket. Musk's fan base seems to be early adopters and starry-eyed techno-utopians [2]. He's not selling transportation. He's selling a dream. They don't care that experts can spot him as a liar [3] because listening to experts would, like, totally harsh their mellow.
Although it's much closer to legal fraud, I don't think that's otherwise hugely different from how many cars are marketed. E.g., all of the people who are buying associations of wealth when they sign up for a BMW they can't afford. Or the ocean of people buying rugged, cowboy-associated vehicles and never using them for anything more challenging than suburban cul-de-sacs.
[1] https://en.wikipedia.org/wiki/Pigeon_drop
[2] https://www.theverge.com/2018/6/26/17505744/elon-musk-fans-t...
[3] e.g.: https://twitter.com/kaifulee/status/1126238951960993792
I am cautiously optimistic about future of FSD in general now.
(You need to solve vision anyway because, for that object which LIDAR tells you is exactly 12.327 ft away, you still need to figure out whether it is a trashcan or a child. And if it is a child, whether it is about to jump onto the road or is walking away from it. LIDAR does not tell you these things; it can only tell you how far away things are. It is not some magical sensor that Tesla is just too cheap to employ.)
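The point can be made concrete with a toy sketch: even with a perfect range measurement, a braking decision still depends on classification and intent, which range alone cannot supply. (The fields and threshold below are hypothetical stand-ins for illustration, not any real perception stack.)

```python
# Toy sketch: range alone is not enough for a driving decision.
# The label and intent fields are hypothetical stand-ins that would
# have to come from vision and object tracking, not from LIDAR range.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    range_ft: float                   # what a LIDAR-style sensor provides
    label: Optional[str] = None       # needs vision: "child", "trashcan", ...
    moving_toward_road: bool = False  # needs tracking over time

def should_brake(d: Detection) -> bool:
    # A trashcan 12.3 ft away on the curb is fine; a child 40 ft away
    # stepping toward the road is not. Range can't distinguish these.
    if d.label is None:
        raise ValueError("cannot decide from range alone")
    if d.label == "child" and d.moving_toward_road:
        return True
    return d.range_ft < 10.0  # hypothetical last-resort proximity threshold

print(should_brake(Detection(40.0, "child", True)))      # brake
print(should_brake(Detection(12.3, "trashcan", False)))  # don't
```

Note that the decision flips between the two calls even though the nearer object is the harmless one, which is exactly the parent comment's point.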
The ability to use the phone or remote to move the car forward or back in a straight line is super useful and a cool, novel feature by itself. It’s also a buggy piece of shit that a few engineers could probably greatly improve in a month. Doesn’t seem like Tesla cares, it’s been stagnant for years.
Meanwhile Tesla is still charging people $10,000 for an FSD function that doesn’t exist.
Is it? It’s hard to think of a situation where moving a car that I’m at most a couple of hundred feet from backward or forward in a straight line, by fiddling with my phone, is superior to just getting into the car and moving it myself. Maybe I’m not finding myself and my car on opposite sides of a gorge often enough?
https://www.tesla.com/model3/design#overview
I bought it years ago mostly for the guaranteed computer upgrade and the novelty of testing it as it develops. It's great as a novelty, really cool! But it is dangerous to use right now, and I don't care what anyone says, it's not going to be truly ready for years. In fact I think it needs another computer upgrade and probably a camera upgrade too.
I agree that they should have nailed autopark and summon before moving on to FSD. As it stands they are both useless. But if I could record and play back summon paths in known locations, that would be actually useful.
I did the same, and it was significantly less expensive. I wouldn't buy it for $1 today.
Also, your idea of novelty is my idea of a nightmare. There has never been a time when I used it where it didn't do something completely insane. The last time I used it (which will truly be the last time), it waited patiently to make a left hand turn. It waited far longer than I would have, and it was clear of oncoming cars for ages. When it did decide to turn, it did it when there were several cars coming, though it was still safe. Except then half way through its turn, it literally stopped, then turned a bit to the right and centered itself in a lane going the opposite direction of traffic flow, right into oncoming cars. Thankfully I was able to take over and two of the three oncoming cars stopped. Had I done nothing, we would have all been in a head on collision.
Never again.
I use it all the time, it’s quick and easy.
Then again I haven't used it again since it backed into the side of a parking garage and scraped a body panel :-)
Video of the latest version: https://www.youtube.com/watch?v=fwduh2kRj3M
But as time goes on, their products become less impressive and their CEO is not helping the brand.
No other car manufacturer has ever done anything like this as far as I know.
Even “normal” cars like the Civic or CRV.
https://www.tesla.com/VehicleSafetyReport
It's actually amazing.
[0] https://www.bbc.com/news/technology-53418069
And the actual context is much less of a big deal than it seems: the biggest plausible consequence would be forcing Tesla to push an over-the-air update with better driver attention monitoring or alerting.
I encourage reading the actual report.
The NHTSA is tired of Tesla's hand-waving away their safety investigations into Autopilot by pushing stealth updates that fix specific scenarios in specific places being investigated. NHTSA wisened up to that and independently purchased their own Tesla vehicles, and disabled software updates, so that they can reproduce those scenarios themselves.
If NHTSA asks Tesla to provide system validation tests showing that an updated version of their software meets the design intent of the system, Tesla would not be able to do so. If they can't prove the new Autopilot software corrects the safety-related defects identified in the current version, then it's not a valid recall remedy.
All evidence from their own AI/AP team and presentations is that there is no real design and system validation going on over there. They're flying by the seat of their pants, introducing potentially lethal regressions in every update.
What is this evidence?
I've seen a few talks from Andrej Karpathy that indicate to me a more deliberate approach.[0] "Software 2.0" itself seems like an approach meant to systematize the development, validation & testing of AI systems, hardly a seat-of-your-pants approach to releases. I have my own criticisms of their approach, but it seems there is pretty deliberate care taken when developing models.
[0] https://youtu.be/hx7BXih7zx8
Why isn't Tesla prosecuted for that? It's lawless!
Meh. I mean, I understand the emotional content of that argument, and the propriety angle is real enough. I really do get what you're saying. And if your prior is "Tesla is bad", that's going to be really convincing. But if it's not...
The bottom line is that they're getting close to 3M of these vehicles on the roads now. You can't spit without hitting one in the south bay. And the accident statistics just aren't there. They're not. There's a small handful of verifiable accidents, all on significantly older versions of the software. Bugs are real. They've happened before and they'll no doubt happen again, just like they do with every other product.
But the Simple Truth remains that these are very safe cars. They are. So... what exactly are people getting upset about? Because it doesn't seem to be what people claim they're getting upset about.
Wow. It seems like this sort of turn of phrase has sadly become a boilerplate way to dismiss a host of comments on HN. What "actual report" are you referring to? The headline article is about an investigation that has been opened but not yet closed, as the linked text shows. (It's a PDF, but it's essentially a press release showing the discoveries so far and why they've escalated. It's not closed, and it's not friendly to Tesla.)
"Accordingly, PE21-020 is upgraded to an Engineering Analysis to extend the existing crash analysis, evaluate additional data sets, perform vehicle evaluations, and to explore the degree to which Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks by undermining the effectiveness of the driver’s supervision"
https://static.nhtsa.gov/odi/inv/2022/INOA-EA22002-3184.PDF
I see no basis for this conclusion, which appears to be pure speculation about what NHTSA might decide is necessary and sufficient to address the potential problems if confirmed by the deeper analysis. I encourage reading the actual report.
> I encourage reading the actual report.
I did, and the conclusion to which you appeal does not appear to be well supported by it.
You cannot fix hardware deficiencies with an OTA update.
Now, will they be forced to add monitoring like that, to be on par with their competitors? That's a different story, and given how weak US regulatory agencies are, and how reckless Tesla is at disregarding them, I'm pretty sure Tesla won't be hurt by it.
See, that has been my whole point for months: both Autopilot and FSD are still unproven safety-critical software, and this goes to show that under certain conditions, say at night, they become even more dangerous at the worst time to drive, as I have said before [0][1][2][3]. Even worse, they lack proper driver monitoring.
I guess Tesla should be required to install driver-monitoring hardware in their cars, as well as night-vision cameras, to avoid crashes like the ones I have listed below. If the regulators had enforced this, perhaps they might have saved another Tesla driver from losing control, or prevented another crash.
Or, if Tesla still finds it difficult to prevent further crashes even with night-vision cameras, perhaps they have to admit that they should have used LIDAR instead.
[0] https://news.ycombinator.com/item?id=29639080
[1] https://news.ycombinator.com/item?id=30267710
[2] https://news.ycombinator.com/item?id=28732866
[3] https://news.ycombinator.com/item?id=29516199
It's pretty much not worth paying attention to.
Separately, would this prevent Tesla from having a beta tester group that tries out FSD features at no cost?