> NHTSA added that "open access to vehicle manufacturers’ telematics offerings with the ability to remotely send commands allows for manipulation of systems on a vehicle, including safety-critical functions such as steering, acceleration, or braking."
If this is accurate:
1. Manufacturers have created systems wherein they can murder their customers remotely. (I expect this to be true, at least to a degree, for Tesla, but other manufacturers should know better than to introduce something as insane as OTA brake control.)
2. What the fucking FUCK is NHTSA doing allowing manufacturers to create systems for remote steering, braking, and acceleration?
> (I expect this to be true, at least to a degree, for Tesla, but other manufacturers should know better than to introduce something as insane as OTA brake control.)
> Notwithstanding anything in the preceding paragraph, motor vehicle owners’ and independent repair facilities’ access to vehicle on-board diagnostic systems shall be standardized and not require any authorization by the manufacturer, directly or indirectly, unless the authorization system for access to vehicle networks and their on-board diagnostic systems is standardized across all makes and models sold in the Commonwealth and is administered by an entity unaffiliated with a manufacturer.
So the manufacturer cannot use their own software to manage authn/authz to their cars?
What would stop the unaffiliated entity from giving anyone access to anyone's car? They wouldn't even have a financial stake in making sure they implement proper authn/authz and internal access controls to stop employees/contractors from accessing this information improperly.
It sounds like the actual software that gives third parties remote access to car diagnostics can't be run by the car manufacturer. So while they likely envision the user controlling authorization for their car, my point is that this third party has little reason or obligation to safeguard that access and ensure only the user can authorize access to their car.
If the automakers themselves ran the software for this, they'd have a financial stake in getting it right, since "Hyundais being remotely controlled due to bad Hyundai 2FA" is not a good headline. But a third party would have to be a de facto monopoly over access to a manufacturer's cars, so once they have the contract and are years past the initial rollout, they might cut costs and leave the authentication/authorization system to rot, or have support agents improperly "recovering" user accounts for themselves or being phished into doing so.
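To make concrete what that third party would be on the hook for, here is a minimal sketch (all names invented, stdlib only, not any real automaker's API) of the kind of signed, expiring access grant such a steward would have to issue and verify correctly:

```python
import hashlib
import hmac
import json
import time

# Custody of this key is the whole ballgame: whoever holds it can
# mint access to any car. A real steward would use an HSM, not a constant.
SECRET = b"server-side signing key"

def issue_grant(vin, scope, ttl_s, now=None):
    """Issue an HMAC-signed grant letting one party access one car's data."""
    now = time.time() if now is None else now
    payload = json.dumps({"vin": vin, "scope": scope, "exp": now + ttl_s},
                         sort_keys=True)
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return payload + "." + sig

def check_grant(token, now=None):
    """Return the grant dict if signature and expiry check out, else None."""
    now = time.time() if now is None else now
    payload, _, sig = token.rpartition(".")  # hex signature contains no dots
    want = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, want):
        return None
    grant = json.loads(payload)
    return grant if grant["exp"] > now else None
```

The token itself is the easy part; the comment's worry is everything around it: key custody, account recovery, and support-agent workflows, which are exactly the pieces that rot when the operator has no stake in the outcome.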
It's been shown that people are able to rip headlights out of vehicles, plug into diagnostic ports behind them, unlock the doors and start the car. [1]
If criminals are able to figure out and exploit a relatively simple compromise, what other exploitable secrets are auto manufacturers hiding using obscurity that right to repair would expose to the public?
Any US residents know if the NHTSA is captured by corporate interests, or are they relatively free of corruption?
Physically plugging into a vehicle's CANBUS is not different from physically plugging into a computer's USB ports.
We accept that computer security has limits when the attacker has physical access to a machine. Why should this be different with a vehicle? In this example, attackers need to physically rip apart a vehicle to trigger a vulnerability.
It's not as though the headlight wiring harness is any more accessible than core components like the engine control unit.
> Physically plugging into a vehicle's CANBUS is not different from physically plugging into a computer's USB ports.
That's a bad example. If a computer can be taken over by a USB device, we would say there is a security vulnerability in its USB stack. FireWire is widely criticized for building its protocol around DMA, whose naive implementations are vulnerable in exactly this way.
This is hardly a new problem though. Thieves used to be able to pull the wires from under the steering column and start a car.
For the more serious attacks mentioned, the automakers are essentially saying that they build extremely vulnerable systems and are afraid to disclose that fact.
I suspect we're in violent agreement though, that the correct outcome is to fix the vulnerabilities AND to document the repair methods.
Extremely vulnerable is a mischaracterization. Read up on how the CANBUS attack works; it's actually quite sophisticated.
It’s the “evil maid” problem, and it’s extremely difficult to protect against, such that you pay huge premiums to get equipment hardened against these types of attacks.
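For concreteness about what that attack is working with: a classic CAN frame carries only an arbitration ID, a length, and up to eight data bytes; there is no sender-authentication field anywhere on the wire, so any node with bus access can speak as any other. A minimal sketch using the 16-byte Linux SocketCAN wire layout:

```python
import struct

def socketcan_frame(arbitration_id, data):
    """Pack a classic CAN frame in the 16-byte Linux SocketCAN layout:
    4-byte CAN ID, 1-byte DLC, 3 pad bytes, 8 data bytes (zero-padded)."""
    if len(data) > 8:
        raise ValueError("classic CAN carries at most 8 data bytes")
    return struct.pack("<IB3x8s", arbitration_id, len(data), data)

# Nothing in these 16 bytes identifies the sender; receivers trust the
# arbitration ID alone. 0x0C1 is an arbitrary example ID, not a real one.
frame = socketcan_frame(0x0C1, b"\x01\x02")
```

This is why the headlight attack works at all: once a thief is physically on the bus, crafting the frame is the trivial part, and the sophistication is in knowing which IDs and payloads the target ECUs act on.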
> It's been shown that people are able to rip headlights out of vehicles, plug into diagnostic ports behind them, unlock the doors and start the car. [1]
Question about the U.S. legal system:
If that's true, and if NHTSA's position is based on rule rather than law, then can NHTSA be (successfully) sued because the rule is capricious?
IIRC courts have set a really low bar for the rationality of administrative rule-making, but I may not know what I'm talking about.
No shocker here; car manufacturers have typically been trash at software. They don’t want to open up, so they lean on security/their own incompetence to say they couldn’t possibly open up telematics safely.
"It's so difficult for legacy car companies to get software right. You'd be surprised. Let me explain it quickly. To save probably 500 per vehicle, or let's say 350 quid a vehicle, we farmed out all the modules that control the vehicles to our suppliers because we could bid them against each other. So, Bosch would do the body control module, someone else would do the seat control module, someone else would do the engine control module, right? And we have about 150 of these modules with semiconductors all through the car.

The problem is, the software is all written by 150 different companies, and they don't talk to each other. So, even though it says Ford on the front, I actually have to go to Bosch to get permission to change their seat control software. So, even if I had a high-speed modem in the vehicle and had the ability to write their software, it's actually their IP. We have 150, what we call, a loose confederation of software providers. 150 completely different software programming languages. All the structure of the software is different. It's millions of lines of code, and we can't even understand it all.

That's why, at Ford, we've decided in the second-generation product to completely insource electric architecture. To do that, you need to write all the software yourself. But just remember, car companies have never written software like this. They've never written software. So, we're literally writing how the vehicle operates, the software to operate the vehicle, for the first time ever."
$TSLA, which NHTSA is also in bed with. Just search "tesla nhtsa probes" and look for the ones that concluded in actionable items, especially any regulating their deadly FSD.
NTSB, OTOH, takes its responsibilities more seriously (or at all).
If I am understanding the article correctly, NHTSA asserts that to open up telematics to 3rd parties would allow remote attacks on multiple vehicles' safety-critical systems simultaneously. This implies telematics has remote control on those safety critical systems.
This raises a question for me: Why the actual fuck are safety-critical systems able to receive commands from anywhere other than the driver's controls or the diagnostic port? Perhaps I am old-school, naive, ignorant, etc., but given what "safety-critical" means, that strikes me as egregiously unacceptable.
I can sympathize with the idea of convenient remote diagnostics and repair, but in my opinion this is a case where the safety risk, not just to the driver and passengers but to anyone else nearby, outweighs the convenience of logging into Ford's/BMW's/Honda's website, clicking a button, and having the car work again.
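There is a well-known middle ground the comment is implicitly asking for: a gateway ECU between the telematics unit and the safety-critical bus that forwards only an allowlist of diagnostic message IDs. A toy policy sketch (0x7DF/0x7E8 are the standard OBD-II functional request and typical response IDs; the spoofed ID and frame contents are invented for illustration):

```python
# Toy CAN gateway policy: telematics-originated frames reach the
# powertrain bus only if their arbitration ID is on a read-only allowlist.
ALLOWED_DIAG_IDS = {0x7DF, 0x7E8}  # OBD-II functional request / typical response

def gateway_filter(frames):
    """Drop any (can_id, data) frame whose ID is not allowlisted, so
    actuation traffic (steering, braking, throttle) never crosses over."""
    return [(can_id, data) for can_id, data in frames
            if can_id in ALLOWED_DIAG_IDS]

inbound = [(0x7DF, b"\x02\x01\x0c"),   # legit request: mode 01, PID 0C (RPM)
           (0x0C1, b"\xff\xff")]       # spoofed actuation frame: dropped
forwarded = gateway_filter(inbound)
```

Under a design like this, remote telematics could still read diagnostics while having no path at all to steering, braking, or acceleration, which is the separation the commenter is asking why we don't have.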
Watch the episode of PBS’s Frontline that dropped last week about the danger of trucks on US roadways because of the lack of underride prevention devices. They interview several former officials at NHTSA who talk about the agency being under a chokehold of regulatory capture.
It's a little weird right now, because Biden issued an executive order in 2021 backing existing right-to-repair rules. And last year he doubled down, making public statements in favor of enforcement. https://www.extremetech.com/extreme/330906-biden-formally-ba... The NHTSA is part of the administration, but it's also unlikely they totally went rogue here. Not sure what's up.
> They warned complying with the Data Access Law would require an automaker "to remove essential cybersecurity protections from their vehicles." The group declined to comment Tuesday on NHTSA's letter.
> "Vehicle manufacturers appear to recognize that vehicles with the open remote access telematics required by the Data Access Law would contain a safety defect," NHTSA said in its letter to General Motors, Tesla, Ford, Toyota, Rivian, Volkswagen and others.
I guess they’re just taking it at the manufacturers’ word that there’s just no way to implement this feature securely.
”We've tried nothing and we are all out of ideas!”
Most of us are in software and so are well aware how hard it is to make things robust against future attacks. Either there is no remote connection at all, or security is hard. Given that, I trust Ford more than a random mechanic: even if the majority of mechanics are more honest than Ford, there is at least one bad-actor mechanic who will abuse telematics to do something they shouldn't.
> I guess they’re just taking it at the manufacturers’ word that there’s just no way to implement this feature securely
I think it is more that the manufacturers say, and NHTSA agrees, that there is no way to comply with the state law securely in the time before it goes into effect. At least that's what I recall from another article on this.
The 2015 Wired story on the remote Jeep hack: https://www.wired.com/2015/07/hackers-remotely-kill-jeep-hig...
Text of the Massachusetts Data Access Law: https://web.archive.org/web/20230510205950/https://malegisla...
The NHTSA complaint: https://s3.documentcloud.org/documents/23846284/nhtsa-letter... (bumped from a comment further down by Simulacra)
[1] https://www.thedrive.com/news/shadetree-hackers-are-stealing...
Suing NHTSA over the rule as arbitrary and capricious is almost certainly what the Massachusetts AG will do next.
- Jim Farley, CEO of Ford, on the long quote above: https://youtu.be/8IhSWsQlaG8?t=1476