I may buy a car (new or used) soon and am a little worried about all the software in cars these days. Software can control pretty much every aspect of a modern vehicle, and so the idea of bugs in a vehicle's software scares me from a safety perspective. Poor software engineering has been implicated in automobile safety incidents in the past[1].
I'm aware of the NASA/JPL rules for developing safety-critical software[2] but I'm not sure if any car manufacturers follow anything similar.
Does anyone here have any knowledge of the software development practices of any automakers and what they do to ensure safety and reliability? And is there anything else I can do to mitigate this risk (short of buying a very old car, which would have other safety downsides)?
[1] https://en.wikipedia.org/wiki/Sudden_unintended_acceleration [2] http://spinroot.com/gerard/pdf/P10.pdf
Either way, if you've had a fuel-injected car you were still exposed to these issues. You would have to go buy a carbureted engine from the 80s or before to get away from these "unintended acceleration" issues, since in the end a car with EFI probably has a computer actually controlling the injection. I'd be way more wary of daily driving an 80s or older car from a general safety standpoint than of a software issue. You're way more likely to be t-boned at an intersection than to have a software glitch cause an accident; a much more modern car will do far more for you from a crash-safety standpoint than a carburetor will.
There's a ton of things that can go wrong in a car which can cause an accident. The software stack is surely one of those things, but even a 100% mechanical car can have a lot of failures as well. Ever have vacuum hoses fail on an old car? Carburetors get stuck or clogged? Personally, I'd prefer a computer controlling components directly instead of tons of vacuum lines and springs trying to keep things tuned right. On top of that I'll also get much better efficiency and reduce harmful emissions which hurt my family and my neighbors.
And modern cars are much better at handling these types of scenarios. For example, in my late model car, if you apply the accelerator and brake at the same time, the vehicle will ignore the accelerator input. This solves two potential problems from the past: someone accidentally stomping on both pedals when they meant to hit the brake, and a foreign object wedging the accelerator pedal down.
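That brake-override is conceptually just pedal arbitration before any torque request is computed. A minimal sketch of the idea in C (made-up signal names and thresholds for illustration, not any manufacturer's actual logic):

    #include <stdint.h>

    /* Hypothetical pedal positions, 0-100 %, already debounced/filtered. */
    typedef struct {
        uint8_t accel_pct;
        uint8_t brake_pct;
    } pedal_inputs_t;

    #define BRAKE_OVERRIDE_THRESHOLD_PCT 5u  /* illustrative value */

    /* Returns the accelerator demand the torque-request path is allowed to
       see. If the driver is braking meaningfully, accelerator input is
       ignored, which covers both the "both feet" mistake and a floor mat or
       other object jamming the pedal down. */
    uint8_t arbitrate_accel_demand(const pedal_inputs_t *in)
    {
        if (in->brake_pct >= BRAKE_OVERRIDE_THRESHOLD_PCT) {
            return 0u;  /* brake wins */
        }
        return in->accel_pct;
    }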
This happens to me (sometimes) in parking lots where pedestrians walk between cars. Mine (2019 Mazda Miata) does not do this, instead I get an engine rev while standing half on brakes, half on accelerator, full on clutch. I end up feeling embarrassed as I tend to get glared at by the pedestrian (no, I did not intentionally rev it to scare you).
I don't think unintended acceleration with cable-operated throttles was ever much of an issue. The simpler EFI systems of the 80s and 90s were very robust, with predictable failure modes. We've certainly gained a lot in safety and efficiency with the newer designs, but their complexity also means problems can be more obscure and more likely to sneak their way to market.
Anecdote: My '01 Volvo had weird/dangerous intermittent acceleration, and it had a fully computerized throttle. The software got confused by a failing throttle position sensor. The best fix is to replace it with a hall effect sensor that doesn't wear out.
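For what it's worth, electronic throttles normally carry two sensor tracks exactly so the software can detect a drifting TPS instead of getting confused by it. A rough sketch of that kind of cross-check (illustrative names and tolerances only, not Volvo's code):

    #include <stdbool.h>

    #define TPS_DISAGREE_LIMIT_PCT 8   /* illustrative tolerance */
    #define LIMP_HOME_THROTTLE_PCT 15  /* illustrative reduced-power value */

    /* Two throttle position tracks, 0-100 %. Track 2 often runs inverted in
       real hardware; assume it has already been normalized here. */
    int throttle_request_pct(int tps1_pct, int tps2_pct, bool *fault_out)
    {
        int diff = tps1_pct - tps2_pct;
        if (diff < 0) {
            diff = -diff;
        }

        if (diff > TPS_DISAGREE_LIMIT_PCT) {
            /* The tracks disagree: don't guess which one is right, fall back
               to a reduced-power "limp home" value and latch a fault for the
               diagnostics layer. */
            *fault_out = true;
            return LIMP_HOME_THROTTLE_PCT;
        }

        *fault_out = false;
        return (tps1_pct + tps2_pct) / 2;  /* average the agreeing tracks */
    }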
> reduce harmful emissions which hurt my family and my neighbors.
If there were a true problem it could be solved by having fewer kids. Why do you end your sentence with this dipshit way of arguing? Nobody falls for that. Of course we all know the game here is for someone to call you out for being a passive-aggressive dipshit so you can play the victim once that happens.
And pretty much none of them ever do if the driver doesn't react exceptionally poorly. Even the spectacular stuff that the internet absolutely loves to hand-wring about, like a wheel falling off for whatever reason, almost always results in the car coming to a controlled stop on the side of the road. The conversion rate between "failures" and "meaningful harm to anyone or anything" is abysmal.
Even with EFI, if the throttle is mechanical and the EFI continues to ask for more fuel for whatever reason (or a fuel injector gets stuck open), all that will happen is the engine will stall due to the excessively rich mixture.
> Ever have vacuum hoses fail on an old car? Carburetors get stuck or clogged?
The normal failure mode of a carburetor leads to an engine that doesn't run, and not the opposite. Before complete failure, you will notice a performance decline.
Personally, I prefer no computer control.
> On top of that I'll also get much better efficiency and reduce harmful emissions which hurt my family and my neighbors
You can get a lot better efficiency from a carbureted engine than most people think.
As for safety, I'd rather have freedom.
> if the throttle is mechanical

So yeah, I guess there's a window of time there where EFI became the norm but before throttles were also electronic, so late 80s to early 2000s. I imagine the majority of cars on the road today in the US are fully electronic.
> The normal failure mode of a carburetor leads to an engine that doesn't run, and not the opposite.
I've personally experienced carburetors getting stuck open, usually on abused/unmaintained lawn equipment. I do agree the usual failure is that it gets gummed up and inefficient in its atomization, but a stuck-open carb isn't impossible. Either way, a carb failure that suddenly cuts your power can also cause problems when it's unexpected.
> You can get a lot better efficiency from a carbureted engine than most people think.
Yeah, a well-tuned and well-maintained carburetor isn't absolutely horrific in efficiency. But it'll still pale in comparison to the combustion efficiency that can be had in a GDI engine.
> As for safety, I'd rather have freedom.
Cool, and feel free to drive that freedom car in your freedom yard. Please keep your freedom emissions in your freedom air though instead of polluting your neighbors. When you're driving on the public streets there's more than just you out there.
To answer your last question first: buy a car that hasn't been launched within the last 12 to 18 months. That's not software specific, that's general vehicle safety across the board, as they will be working through the initial warranty issues. So if you are looking at second hand and you know model ABC was launched in 2016, don't buy one made in the 2016/2017 period.
ISO 26262 rates every system with a criticality rating; systems with an ASIL rating of C or D have multiple backup systems in place. This falls under functional safety, which is a newer (5 years or so) area recognizing that cars are now highly complex interconnected systems linked by software - the idea being that you target specific subsystems to make sure their function isn't totally taken out by some failure or error in the wider system.
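To give a flavor of what a "backup" means at the code level for an ASIL C/D function, here is a toy alive-monitoring/watchdog sketch (purely illustrative; real implementations follow AUTOSAR-style watchdog management and are far more involved):

    #include <stdbool.h>
    #include <stdint.h>

    #define NUM_MONITORED_TASKS 3u

    /* Each safety-relevant task bumps its counter every cycle it runs. */
    static volatile uint16_t task_alive[NUM_MONITORED_TASKS];
    static uint16_t          task_alive_prev[NUM_MONITORED_TASKS];

    void task_checkin(uint8_t task_id)
    {
        task_alive[task_id]++;
    }

    /* Hypothetical hardware hooks. */
    extern void watchdog_kick(void);
    extern void enter_safe_state(void);  /* e.g. cut torque request, warn driver */

    /* Run periodically from a supervisor context: the hardware watchdog is
       only serviced if every monitored task has made progress; otherwise the
       system goes to its defined safe state instead of limping along. */
    void supervision_cycle(void)
    {
        bool all_alive = true;

        for (uint8_t i = 0u; i < NUM_MONITORED_TASKS; i++) {
            if (task_alive[i] == task_alive_prev[i]) {
                all_alive = false;
            }
            task_alive_prev[i] = task_alive[i];
        }

        if (all_alive) {
            watchdog_kick();
        } else {
            enter_safe_state();
        }
    }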
Cybersecurity-wise, there is an EU regulation coming in from 2024 to make sure that OTA updates are safe, reduce hacking attack vectors, and the like. This is being introduced for new cars and designs as a result of the issues cited above.
As far as people hacking in via the infotainment to access the car control systems - there are firewalls between infotainment and primary car control to mitigate that issue. There are multiple networks in a single vehicle to isolate systems, so that no single non-critical system (infotainment, for example) can take out the whole vehicle.
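The "firewall" in this context is usually a gateway ECU sitting between the buses that only forwards an allow-listed set of messages. A toy sketch of the idea (made-up CAN IDs, not any real vehicle's message set):

    #include <stdbool.h>
    #include <stddef.h>
    #include <stdint.h>

    /* Hypothetical CAN IDs the infotainment bus is allowed to send toward
       the powertrain/chassis bus (e.g. time sync, display status). */
    static const uint32_t allowed_ids[] = { 0x3A0u, 0x3A1u, 0x47Fu };

    bool gateway_forward_allowed(uint32_t can_id)
    {
        for (size_t i = 0; i < sizeof(allowed_ids) / sizeof(allowed_ids[0]); i++) {
            if (can_id == allowed_ids[i]) {
                return true;
            }
        }
        /* Anything else coming from the infotainment side - including
           torque, brake or steering commands - is simply never forwarded. */
        return false;
    }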
Software in cars to this level is new, it's evolving, and it takes 7 or so years to create a new platform. This means there is a lag in the system, especially during this transitional period.
However car makers take this stuff incredibly seriously and their software teams are absolutely not run in the same way as a lower consequence dev situation. Lives are on the line and the type of devs who work in this field know that.
Nothing is perfect but the safety downsides of an old car are widely considered to be far greater than the threat of hacking or bad code in a new car.
The one thing that could cause a lot of problems for cars and software is Agile/Scrum.
The projects that are being run in this new-for-the-industry way are always late, and people hate working on them.
CEOs and other C-suite people see the massively shorter lead times that software can offer and are getting greedy. They saved a year or more of time on a feature thanks to code and over-the-air updates, and then they decide they want it made in 4 weeks when 3 months would be prudent.
There’s something about the intangibility of software that makes traditional automotive people’s brains break.
Thankfully many rank and file engineers and PMs in OEMs are pushing back against Scrum etc so a more pragmatic layer of management will come up in the coming years. Sadly Agile/Scrum will cause some preventable issues in the meantime.
Unlikely to be safety critical stuff due to the rounds of QA and safety council sign offs and gateways they need to go through. But less safety critical stuff may slip through.
Actually I see this break most managers' brains. In my experience it's been constant pressure to reduce scope, such that the plans of the incompetent tend to be selected over those of people who know how to build great software with all the non-functional requirements in place (security, reliability, operability, modularity/flexibility, etc.).
Nobody in the industry is doing Agile for safety critical systems. The development standards are getting such that writing automotive software is not fun any more, but that is the correct way to go.
Want to electronically open the frunk on an EV? That piece of hardware and software has a surprising level of safety concern. Because inadvertently opening the latch can kill someone.
You are correct to be concerned, but the industry is very much on top of things.
ISO 21434 came out a few years later. https://www.iso.org/standard/70918.html
This was all kicked off after the Jeep Hack. https://www.wired.com/2015/07/hackers-remotely-kill-jeep-hig...
Overall the people in the field working on security these days seem to be excellent to me. They have crypto experts, kernel experts, and pretty good standards.
Before the Jeep Hack, they still took it seriously, but it was a lot of roll your own crypto types, and they didn't really know what they were doing.
Since then all the automotive companies have hired people and purchased companies from the traditional cybersecurity world, and have trained up hybrid automotive/cybersecurity experts.
They still aren't perfect - but nobody really is - and cars these days have pretty cool tech in them.
If you are worried I'd recommend trying to hack your own car. You can learn a lot from it, and there are a lot of cool things you can do. In my experience, nothing alleviates fear better than a deep dive into a subject.
comma.ai for example have built an open source self-driving platform from hacking on the internals of vehicles. https://comma.ai/
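If you are on Linux with a cheap CAN adapter wired to the OBD-II port, SocketCAN makes the first step easy. A minimal frame dumper in C (the interface name "can0" and the bitrate are assumptions; what you will actually see depends entirely on the car):

    /* Bring the interface up first, e.g.:
         sudo ip link set can0 up type can bitrate 500000   */
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <net/if.h>
    #include <sys/ioctl.h>
    #include <sys/socket.h>
    #include <linux/can.h>
    #include <linux/can/raw.h>

    int main(void)
    {
        int s = socket(PF_CAN, SOCK_RAW, CAN_RAW);
        if (s < 0) { perror("socket"); return 1; }

        struct ifreq ifr;
        strcpy(ifr.ifr_name, "can0");  /* assumed interface name */
        if (ioctl(s, SIOCGIFINDEX, &ifr) < 0) { perror("ioctl"); return 1; }

        struct sockaddr_can addr = { .can_family  = AF_CAN,
                                     .can_ifindex = ifr.ifr_ifindex };
        if (bind(s, (struct sockaddr *)&addr, sizeof(addr)) < 0) {
            perror("bind");
            return 1;
        }

        struct can_frame frame;
        while (read(s, &frame, sizeof(frame)) == (ssize_t)sizeof(frame)) {
            printf("ID 0x%03X  [%d] ", (unsigned)frame.can_id, (int)frame.can_dlc);
            for (int i = 0; i < frame.can_dlc; i++) {
                printf("%02X ", frame.data[i]);
            }
            printf("\n");
        }
        return 0;
    }

Watching which IDs change when you press a pedal or flick a switch is a surprisingly effective way to demystify what is actually on the bus.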
The industry has also recently seen the introduction of ISO 21434, the cybersecurity engineering standard for road vehicles.
"widely considered" by the same industry who would love to sell you a new car...
That wasn't enough to prevent the Uconnect disaster, a bug that only existed because they sold out on two occasions: when ECUs were invented (green and performance marketing), and when "smart" crap was bundled into cars ("smart" being a word that universally means ostensibly convenient, but in practice even layman consumers hate it).
The reality is that this is the current state of affairs. Most of the people doing software for cars do not have the foggiest idea what software is really about.
All the software I read is just impossible to understand. And no standard helps, in many cases.
Some examples I've seen in code:
- Use of a kind of Hungarian notation, to the point that a loop variable was named something like "uibe32bb_i_lns"
- Comments in human languages other than English
- Use of recursion
- Calls like name1::name2::name3::name4::name5::name6::name7::name8::name9::name10::name11::name12::name13::name14, where the names were some kind of Hungarian notation; those calls were everywhere in the code
- Lines more than 1000 characters wide, as a rule
- Files north of 100kB of code

I can go on and on and on...
Some examples of exchanges with people:
1) Software architect of an ECU: a programmer asks for the memory and CPU budget for a function. The reply was "I'm the architect, I've no idea what you are talking about".
2) System chief architect, for a very important project of a big automaker: an engineer says something about software errors. The architect interrupts and explains that the software never makes an error, because a computer only does what it is told to do. -- That is terrible enough (it ignores, for example, the possibility of an SEU), but he went further, saying that no kind of testing is necessary because, as stated, the SW makes no errors.
Some general points:
- 99% of people in "SW" do not know what gdb is. They debug with "cout <<"
- I found nobody who knows what tail recursion is
- 90% are only able to program, to some extent, in one of C++ or Python, but no other language
- Mentioning Ada, Lisp or Forth will trigger a waterfall of insults saying those are old and should never be used

I keep buying the most basic cars. I'm genuinely terrified to think of anything automatic in my car.
I meant, they do not know what a debugger is. As stated, they use cout << "At line 26, after call to xx" as a debugging tool. For gdb there are plenty of Python extensions, GUIs, and even web front-ends that are not bad... but it may be difficult for some people, I understand that.
> Ada is useful for automotive. Lisp and Forth, not so much (especially since Lisp isn't usually used in hard real-time applications). This isn't the 1980s, MCUs aren't that memory constrained.
Well, first, they do not have the foggiest idea what Ada is. That is my problem. Once somebody suggested we should look into it for L4 autonomous driving. He was laughed at, and people said "it is a dead language from the 60s, like Cobol or Fortran, nobody has used it in 50 years, there are no compilers for it!!!". I've seen Forth being used in some 8-bit uCs in the automotive industry as well. Now it is 99% gone, but it was very much used. Lisp can be used in hard real-time. BTW, another thing always hanging around is "hard real-time" for automotive. It is interesting, because other than for airbags, ignition and injection, you are talking about 100ms response times, which can be achieved very easily.
> Knowledge of obscure programming languages doesn't necessarily make you a better software engineer.
I'm not talking proficiency in the languages, I'm talking about knowing they exist, having an idea of what is possible. I mean, I know no good C programmer who is not at least aware of the existence of Rust. And no, 90% of the programmers writing safety-critical SW have no idea that a language called Rust is available.
> I want my automotive embedded engineer to have a solid grasp of computer architecture, real-time safety protocols, and defensive programming.
Well, again, another example, with a chief software architect on an ECU, so embedded: I ask "do we have some kind of stack monitoring?" Reply: "What?! We have no stack, a stack is a data structure, ..." and he goes on with a long explanation of what a stack, a queue and a tree are, and when they are used... My personal opinion is, if you search for people who "know" only C++ and have no idea of asm, that is what you get. That is my experience at least.
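For reference, "stack monitoring" on a small ECU usually just means painting the stack region with a known pattern at startup and periodically checking how much of it has been overwritten. A minimal sketch (the linker symbol names are made up for illustration):

    #include <stddef.h>
    #include <stdint.h>

    /* Hypothetical linker symbols marking the task stack region
       (stack assumed to grow downward, toward __stack_start). */
    extern uint8_t __stack_start[];
    extern uint8_t __stack_end[];

    #define STACK_FILL_PATTERN 0xA5u

    /* Paint the stack very early at startup. In practice you would only
       paint the part below the current stack pointer, i.e. not yet in use. */
    void stack_monitor_init(void)
    {
        for (uint8_t *p = __stack_start; p < __stack_end; ++p) {
            *p = STACK_FILL_PATTERN;
        }
    }

    /* Run periodically: returns how many bytes at the far end of the stack
       still hold the pattern. If this approaches zero, the system is close
       to a stack overflow and should react before it actually happens. */
    size_t stack_monitor_headroom(void)
    {
        const uint8_t *p = __stack_start;
        while (p < __stack_end && *p == STACK_FILL_PATTERN) {
            ++p;
        }
        return (size_t)(p - __stack_start);
    }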
> I don't see a problem with a Korean/Japanese car manufacturer having their documentation in a non-English language. As long as they do everything in-house and don't outsource to India like Boeing I have no problem with it.
I'm talking about in-code comments, not documentation. But anyway, honestly, thinking you can do everything in-house today, and that you will be able to keep it that way for the next 10 or 20 years when you have to maintain the code, sounds optimistic to me, at least. But again, I'm talking about code I had to read and maintain... so... yes... I'm talking about a case where it was BAD to have non-English comments.
You are not reading it correctly. It is not code as everyone knows it. It's like an electrical circuit with variable names attached to each conductor, and the code propagates information like electricity would.
There are tools dedicated to this, able to draw pictures of such code circuits (e.g. Simulink, ASCET). And such pictures can be automatically translated into C code that looks even worse than anything translated manually.
In the end, of course the tests prove that the code works like the picture of the circuit shows, and therefore the car must work correctly! This avoids the need for anyone working on only the code to understand a car.
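To illustrate the "code as circuit" flavor (hand-written here for illustration, not the output of any real generator): every wire becomes a global variable, every block becomes an assignment, and a step function propagates the signals once per cycle.

    /* Illustrative only. Made-up signal and block names. */
    float In1_VehSpd_kmh;        /* input wire    */
    float In2_AccrPedl_Perc;     /* input wire    */
    float B_012_Gain_Out;        /* internal wire */
    float B_013_Sum_Out;         /* internal wire */
    float Out1_EngTqReq_Nm;      /* output wire   */

    void step_10ms(void)
    {
        B_012_Gain_Out   = In2_AccrPedl_Perc * 3.2f;        /* Gain block       */
        B_013_Sum_Out    = B_012_Gain_Out - In1_VehSpd_kmh; /* Sum block        */
        Out1_EngTqReq_Nm = (B_013_Sum_Out < 0.0f)           /* Saturation block */
                           ? 0.0f : B_013_Sum_Out;
    }

Multiply that by a few thousand wires and you get the kind of generated C that nobody reads directly.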
In reality, things usually work in the end only because of how simple everything is and the high number of iterations.
I’m talking about human written code, meant to be read, maintained, debugged and tested by other humans.
I trusted Volkswagen because of their reputation. Then the news broke about them systematically lying and breaking the law with respect to engine emissions. Shortly after this came to light, other "reputable car companies" turned out to have been not trustworthy at all.
Yes there are good standards in place and some companies claim to adhere to them but no company should be trusted on their word or reputation alone. The better question is what kind of regulatory oversight is in place to make sure those claiming to adhere to certain standards are actually doing so? Also, how much power do the regulatory organizations have in addressing violators?
Ford, as one I can speak about with knowledge, took seriously the cost of recalls versus catching issues in testing. It's massively cheaper to spend money up front on a full process and catch every bug you can than to cover recall costs for an update later, not even considering the liabilities if anything does go pop.
Mistakes of course happen. But they're also rarely working from scratch.
It makes working in modern ways horrific, seeing the shoddy shit tossed out to meet consumer gadget deadlines.
Then a few years later they got hit again through one of their suppliers: Takata's killer airbags.
There's a separate standard (ISO 21448) trying to address safety of the intended functionality, i.e. maintaining safety when there's no actual fault in the system. (Like the misclassification example.) This one's newer, much less effort has been spent developing it, and even less has been spent trying to follow it. Frankly it doesn't have as much to say. (And how could it? Nobody knows how to solve general classification problems, especially not with something running on some 20 W max control unit.) This part of the problem space is basically the wild west. Some auto makers make a good effort at creating safe solutions. Others, not so much.
In summary, some of the electronics solutions in the car can probably be trusted to do what they're meant to (e.g. airbags). Others (e.g. lane keeping assist, emergency braking) will still mostly be safe but certainly warrant keeping your hands on the wheel. Anything approaching fully self-driving is at best quite dubious at this point, though.
Both are acceptable standards, but ISO 26262 is a behemoth of a standard that most people have never read. Many companies don't even make the full standard available to their development teams, let alone educate people to employ it effectively. Similarly, MISRA is fine in theory, but the practical usage often ends with running code through an automatic checker that can only detect half the rules.
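To be fair to the tools, part of that is inherent: much of MISRA is mechanically checkable, but the rules closest to actual correctness are not fully decidable. A contrived illustration:

    /* A static checker flags this trivially - MISRA C forbids recursion: */
    unsigned int fact(unsigned int n)
    {
        return (n < 2u) ? 1u : n * fact(n - 1u);
    }

    /* But "no undefined behaviour" style rules cannot be fully decided by a
       tool: whether this multiplication overflows depends on runtime values
       the checker never sees. */
    int scale(int raw, int gain)
    {
        return raw * gain;  /* may exceed INT_MAX for some inputs */
    }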
The API itself is decent but the configuration and the ecosystem are a nightmare.
https://www.motorauthority.com/news/1121372_why-mazda-is-pur...
You can make a modern electric vehicle with actual buttons and dials. There is nothing about a car not having a gas motor that requires every tiny bit of functionality being controlled by a touch tablet. If anything it just seems like laziness in car design.
I'm with you and hope all the idiot touch screen crap is ditched.
That means you can get OTA upgrades that 99% of the time will work flawlessly, but one day one may not - the day you are in a rush early in the morning.
Since most connected cars are de facto owned by their vendor, a potential breach or deliberate sabotage might brick ALL of them at once, across the globe or in some specific areas/countries.
...
A modern car is a car co-piloted by a human and a computer. A local airgapped computer might have bugs, a connected one might have vulnerabilities. Be more scared about them.
In purely local safety terms, I can say most cars I know are still partially mechanical. That means, for instance, your steering wheel can auto-steer BUT with (more than) a bit of force you can steer it mechanically even if the automation completely fails. Similarly, the brake pedal has some servo systems but still partially works mechanically, so it might become very hard to push but you can still brake a bit.
The most dangerous common designs I know are:
- impossibility of turning off certain ADAS that might act really badly in certain weather conditions, like the classic ABS on icy roads;
- automatic door locks when the car moves, with NO DAMN WAY to unlock them while the car is still moving;
- the manual parking brake has disappeared, so a kind of emergency braking ALSO usable by a passenger (for instance if the driver suddenly falls ill) is ABSENT, with no electronic replacement either, since the electronic one, if present, refuses to engage while the car is moving;
- cockpit designs that make it very hard/slow for a passenger to push the driver's feet off the accelerator etc. if he/she suddenly falls ill.
I consider the above a sign of VERY BAD design, so I doubt those who made it can be trusted with anything else in safety terms...
2. The code, in many cases, is probably an unmaintainable mess. Embedded programming is not always modern programming, for good and bad.
3. Today, the computers in cars are doing more, and the systems are more complex. It's reasonable to expect more serious problems as a result.
4. Companies do safety testing, of course, but there's no such thing as "100%" test coverage for complex physical machines running outside of a lab.
5. The best way to judge the safety of cars is the best way to judge safety for airplanes: let other people test them out for a while and then check whether or not they report problems.
Now the companies are migrating to real programming in C++, and it is a terrible mess. There are just not enough people with software competence to drive it.
I've seen people trying to do L4 automated systems with these blocks. Pages and pages and pages of boxes (which can only be the basic logical functions and the four basic arithmetic operations!!!). Of course the project didn't go anywhere!
Optimize this problem by buying a car with the best safety rating. This is something that can be objectively measured, both in crash testing/labs and from reviews of real-world crash results. Expect that a crash could be inevitable as it is totally out of your control. Optimize for the best odds of surviving a crash without issues.