Frankly I feel that for this kind of investigation (car crashes and accidents) the manufacturer should be required to hand over all data relevant to the investigation. On-device logs are probably the most relevant data you could get in these cases.
That this lab decided to try to decipher the data, and found troves of it that weren't known to exist before, speaks to what the relationship must be like.
They didn't trust Tesla to hand everything over.
In my humble opinion, any region should force all manufacturers of cars sold in that region to hand over data (or methods for extracting the data) in cases like these, lest their license to sell their product be rescinded.
I’d go further, and say that the data should be exportable in a predefined format (ideally an open standard), so that investigators can use 3rd party tools to investigate any make of ‘smart car’.
I'd go further still: the telemetry data should be standardised the same way OBD-II interfaces are. Open-source the tech, or at least disclose it to researchers upon request.
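As a sketch of what a standardised, vendor-neutral export could look like, here is a hypothetical record type in Python. Every field name is invented for illustration, loosely echoing the signals the article says are logged (speed, accelerator pedal position, steering wheel angle, brake usage):

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical open telemetry record; field names are illustrative only,
# loosely based on the signals investigators reportedly found in the logs.
@dataclass
class TelemetryRecord:
    timestamp_ms: int          # milliseconds since epoch
    speed_kph: float           # vehicle speed
    accel_pedal_pct: float     # accelerator pedal position, 0-100
    steering_angle_deg: float  # steering wheel angle
    brake_applied: bool        # whether the brake was in use
    assist_mode: str           # e.g. "off", "cruise", "autosteer"

def export_records(records):
    """Serialize records to a vendor-neutral JSON-lines format."""
    return "\n".join(json.dumps(asdict(r)) for r in records)

rec = TelemetryRecord(1634567890000, 103.4, 12.5, -1.8, False, "cruise")
print(export_records([rec]))
```

The point isn't the exact schema, it's that any investigator's third-party tooling could parse the same format regardless of make.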
So, you think it’s OK if the car you own spies on you, using electricity you pay for?
I’m rephrasing it that way to show that, depending on the details, this might be close to what Apple intended to do w.r.t. child pornography, something ‘the internet’ wasn’t a fan of, to put it mildly.
There’s a difference here in that there’s an ‘investigation’, but you’re also talking of ‘car crashes and accidents’, so this could get reasonably close (we already have cars that call 911 on a crash; should they also report it when they hear accident-like sounds that co-occur with unexpected car movements?). Who’s going to decide whether an accident is serious enough to warrant this? Should car owners have the right to disable this kind of recording?
The car is already spying on you. I think the question is: if data is available to the manufacturer, should it be available to investigators?
I think it is reasonable. Tesla uses these data to protect themselves (for example, by showing that Autopilot wasn't in use during an accident). It is not really about user privacy anymore; it is about corporate interests, and I think road safety is more important. That's why we have all these regulations.
This is not what Apple suggested.
Apple suggested preemptively notifying the government of "illegal" material on a user's device.
This is about facilitating an ongoing specific investigation after a car accident.
> this might be close to what Apple intended to do w.r.t. child pornography, something ‘the internet’ wasn’t a fan of, to say it mildly.
Not to completely disregard your point, but it's not the same. After seeing a few brain bits splattered across windows and chairs/dashboards, stray limbs... road accidents are in another league entirely. We sort of need accidents to happen so that regulation of technology will use blood as ink/toner.
> In my humble opinion, any region should force all manufacturer of cars that will be sold in that region to hand over data (or methods by extracting the data)
This is precisely what is required from Tesla in China.
I just read some Dutch news sources to get some more context.
Some things to know:
Tesla hands over all requested data in case of an investigation, but only the requested data. Because we now know what data is stored, more data can and will be requested in the future.
The crash was caused by the driver because he thought lane assistance was turned on, when only adaptive cruise control was turned on. This caused the vehicle to move into another lane.
So the question about this accident is: how obvious is it for the driver what is going on and what assistance to expect.
> So the question about this accident is: how obvious is it for the driver what is going on and what assistance to expect.
- you have to deliberately turn the full assistance on/off
- there are audio cues when it turns on, and whilst it’s on, and when it turns off
- the visuals on the screen are different when it’s on vs. off
- the steering wheel feels very different when it’s on
- when it’s on, there are regular cues to generate torque through the wheel to show you’re paying attention
- the overall concept is that even when it’s on, you should keep your hands on the wheel and pay attention - though this is obviously subject to human failings
TBF, I’ve done something similar once or twice (only turned on cruise control when I thought I’d turned on full assist), but if you’ve got your hands on the wheel it’s pretty natural and quick to correct, and then realise your mistake.
In the 2.5 years I have been driving my Model 3, it has happened to me several times that Autosteer got disabled (likely due to steering wheel resistance) and I only noticed it after a while. The visual cues are extremely subtle, the auditory cues easily get lost in road noise or music, and when you are driving on straight stretches of smooth highway you don’t constantly feel feedback through the steering wheel. Of course the person in the driver’s seat always remains responsible, but it’s not hard to imagine how someone might drift out of their lane like this.
It wasn't a request for "any and all data". From the article:
> Tesla however only supplies a specific subset of signals, only the ones requested, for a specific timeframe, whereas the log files contain all the recorded signals
Tesla provided exactly what was requested and nothing more. Now that the government knows what more is in the data, they can request that too, even if it's irrelevant to the investigation.
I am not sure. Maybe the law states you can only investigate what you need to investigate. Climate control and music volume settings might be irrelevant.
I wish there were more information about how they were able to "decrypt" the data. I put that in quotes because journalists and laypeople often don't differentiate between a file format that is merely compressed or obfuscated in some way, and true encryption.
After all, it would be trivial for Tesla to provide essentially unbreakable encryption: just encode all the stored data with a public key where only Tesla has the corresponding private key. But if NFI was able to decrypt, they obviously don't do that, so I'm curious what manner of "encryption" is used in the car.
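The scheme described above (encrypt on-device with a public key, so only the manufacturer's private key can recover the data) is standard hybrid encryption. A minimal sketch using the Python `cryptography` library; the key size and the log payload are illustrative, not anything Tesla actually does:

```python
import os
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# In this sketch the manufacturer holds the private key; the car only
# ever stores the public key, so nothing on the vehicle can decrypt logs.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

def encrypt_log(log: bytes, pub):
    """Hybrid encryption: random AES key for the data, RSA-OAEP wraps the key."""
    aes_key = AESGCM.generate_key(bit_length=256)
    nonce = os.urandom(12)
    ciphertext = AESGCM(aes_key).encrypt(nonce, log, None)
    wrapped_key = pub.encrypt(aes_key, OAEP)  # only the private key can unwrap
    return wrapped_key, nonce, ciphertext

def decrypt_log(wrapped_key, nonce, ciphertext, priv) -> bytes:
    aes_key = priv.decrypt(wrapped_key, OAEP)
    return AESGCM(aes_key).decrypt(nonce, ciphertext, None)
```

If logs were stored this way, no third-party lab could recover them without the private key, which is exactly why the NFI's success suggests nothing of the sort is in use.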
The word “decrypt” means “make (a coded or unclear message) intelligible”, which is more broad than the IT-specific jargon of using cipher algorithms. It’s perfectly fine to use “decrypt” in the general sense in an article meant for general people to read.
The problem with this, then, is it confuses topics that are important for lay people to differentiate.
Take the current idiocy coming from the governor of Missouri, trying to prosecute a journalist for "hacking" for the crime of ... reading HTML. While the governor used the word "decoding", not decrypting, he accused the journalist of going through some complicated intentional procedure to access private data, which is of course bullshit.
Even in the general sense, "decrypting" something strongly implies the original author had the intention of keeping it private and secure. Was this the case with Tesla's data? It seems like not, because it would have been trivial for them to implement unbreakable encryption. Or was this just a case where someone else figured out their file format?
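For contrast, "figuring out a file format" usually just means working out how fields are packed, with no keys involved at all. A toy sketch with Python's `struct` module; the record layout below is invented for illustration and is not Tesla's actual format:

```python
import struct

# Invented record layout, purely illustrative (NOT Tesla's format):
# little-endian: uint32 timestamp, float speed, float steering angle, uint8 brake flag
RECORD = struct.Struct("<IffB")

def parse_record(blob: bytes) -> dict:
    """Decode one fixed-size binary record into named fields."""
    ts, speed, angle, brake = RECORD.unpack(blob)
    return {"timestamp": ts, "speed": speed, "angle": angle, "brake": bool(brake)}

blob = RECORD.pack(1000, 27.5, -3.0, 1)  # simulate a logged record
print(parse_record(blob))
```

Reverse engineering amounts to guessing layouts like this until the decoded values make physical sense; no cryptography is ever defeated.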
I’m assuming Tesla don’t give you the source code to their cars. Does anyone? I’d love to have a car that can be taken apart and put back together — the engine and the UI. Not because I have the skill or shop needed to do maintenance, but because at least I stand a chance of paying someone else to.
Right now it feels like I’m dealing with a bunch of pirates. I either pay $800 to hack an HDMI cable into my Audi’s virtual cockpit just so I can use a map that’s not 6 years out of date, or $500 to the dealership to install the latest map update pack. Argh!
I don't see an issue with source code, intellectual property, or the like; this is all about recorded data. The whole point is that Tesla, required by law to provide the recorded data (decrypted), provided only a (small) subset of it, and the people at the NFI managed to decrypt the whole lot, finding that Tesla had omitted data useful for establishing the causes of the crash.
It also seems that, coincidentally, the omitted data led to the belief that the root cause was Tesla's Autopilot making the car tailgate the one in front of it.
Imagine that an airplane crashes and the decoding of the flight recorder is carried out by the airplane manufacturer, which omits the part where (say) the tail rudder didn't work as it should have.
No, sadly there is no open-source car available (I think some designs exist, coming from universities, but nothing you could buy anywhere). I suppose consumers, tuners and repair shops would love it, but I do not see any big company ever going in that direction.
It would be interesting to see how it could avoid the patent madness. I assume EVERYTHING about cars is patented.
I've actually had the intention of building a company focused on manufacturing and selling open-source household appliances, like washing machines, fridges etc. Cars would be the endgame for a company like this.
Still in the early planning stages, just creating a good toaster that's repairable, open etc. is an interesting undertaking if you want everything above board.
> The NFI said the decrypted data showed Tesla vehicles store information about the operation of its driver assistance system, known as Autopilot. The vehicles also record speed, accelerator pedal position, steering wheel angle and brake usage
Aren’t these the same things twitter.com/greentheonly publishes very often?
A lot of questions about responsibility come from this case.
If autopilot was following the car too close would the driver be responsible for not taking action? How much trust is a driver allowed to put in autopilot?
I also wonder how much data other cars collect. As far as I know most cars collect a lot of data you don't know about.
"According to Tesla, each follow distance setting corresponds directly to a time-based distance representing the time it takes the Model 3 from its current location to the location for the car in front of the driver.
For cars manufactured after April 27, 2021, Tesla set a minimum follow distance of 3 or greater. But with the 2021.4.21.3 update, this number has been reduced to 2, bringing Autopilot ‘Pure Vision’ close to parity with its previous radar-based counterpart."
Note they say "time-based" and not seconds, so a setting of 2 might be something like a second.
It would be interesting to know if the issue here is that the driver selected 1 (or 2 or 3) car-distances, and the Tesla followed at 0.25 or 0.7 or something.
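For a sense of scale: if the setting really is a time gap, the implied physical gap is just time multiplied by speed. A quick back-of-the-envelope sketch (the idea that one setting unit is roughly a second is the guess above, not something Tesla has confirmed):

```python
def follow_distance_m(time_gap_s: float, speed_kph: float) -> float:
    """Physical gap implied by a time-based follow setting at a given speed."""
    return time_gap_s * speed_kph / 3.6  # km/h -> m/s

# At 100 km/h, a 1-second gap is under 28 metres; a 2-second gap about 56.
print(round(follow_distance_m(1.0, 100.0), 1))  # 27.8
print(round(follow_distance_m(2.0, 100.0), 1))  # 55.6
```

So the difference between a setting meaning "2 seconds" and one meaning "0.7 seconds" is tens of metres of stopping room at highway speed.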
What if not all sensor data is accurate or reliable? For example, their new cars have no radar, so the distance is just an estimate from the AI model and can be wrong. It would be pretty bad for the justice system to rely on unreliable data. If they want to use all the data, I assume some form of interpretation should be done by Tesla to avoid false positives or false negatives, especially in crashes where FSD was driving.
That comment doesn't make sense at all. If anything, you want to know what the Tesla AI was "thinking" so you can compare it to reality (traffic cameras etc.), which might find fault with the driver-assist system. Letting Tesla "interpret" the data before handing it over is completely backwards, as they are an involved party in the accident.
Radar data is also an estimate, and a much weaker one than camera data.
Car radar sensors are not the F-22 kind; they are very simple and limited, and the road is not the sky, where metallic objects are foreign and there is empty space everywhere.
The world at ground level is full of metallic objects with very complex shapes and reflections everywhere. (And consider military safety standards: my uncle, an engineer, got cancer and died after working close to military radars. He could not prove the connection, but if thousands of people start getting cancer once these devices are common, it will be much easier to prove.)
With cameras you get an array of millions of points for cheap, which lets you differentiate things much better than simple radars do.
Imagine that you integrate all the colours that go into a camera down to a single value. That is roughly what you have with radar sensors.
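To put that analogy in numbers: collapsing a camera frame into one aggregate reading discards nearly all spatial information. A toy NumPy illustration (random data stands in for a frame, and this deliberately ignores the range/velocity information real automotive radar does provide):

```python
import numpy as np

# A 720p monochrome "frame": random data, purely to show the scale involved.
frame = np.random.randint(0, 256, size=(720, 1280), dtype=np.uint8)

camera_measurements = frame.size  # ~0.9 million independent spatial samples
radar_like = frame.mean()         # one aggregate value: no shape, edges, or lanes

print(camera_measurements)  # 921600
```

The comparison is unfair to radar in one respect: as noted in the reply below-thread, radar also returns range, phase, and Doppler velocity, which a camera has to infer.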
Did you mean a single pixel? Even with a single colour, a camera provides a wealth of information. Radar is not quite a single pixel (there is speed, phase, and timing information), but it's much closer to a single pixel than to a monochrome camera. The driving radar, I mean. The kit used in aerospace sweeps out an area, so it's more like a camera in its actual output.
Side note: sorry about your uncle, but I do wish to point out that radar is non-ionizing and extremely unlikely to cause mutations. What is more likely carcinogenic is stuff like degreasers, paints, fluids, and the open burn pits the military likes to use.
https://ico.org.uk/your-data-matters/your-right-to-data-port...
Is it because you must request specific data? Is making a request for all data from the manufacturer not allowed?
Am I missing something here?
* what personal information the organisation holds about them;
* how they are using it;
* who they are sharing it with; and
* where they got their data from.
https://ico.org.uk/your-data-matters/your-right-to-get-copie...
https://github.com/NetherlandsForensicInstitute/teslalogs
1. https://github.com/NetherlandsForensicInstitute/teslalogs/tr...
[0]: https://www.researchgate.net/publication/355202701_Reverse_e...
IBM did that with the PC architecture, and I believe it had a profound impact on technology, business, and the world as a whole.
> Aren’t these the same things twitter.com/greentheonly publishes very often?

Example: https://twitter.com/greentheonly/status/1431094320296235008