There are ways to bypass any of these restrictions imposed by the Android system, even if they were real. Android ships with eBPF, so you just need root.
"Just" is doing a lot of work there. Getting root isn't always possible or easy, depending on your device manufacturer. And if you do manage to get root, your phone will likely stop passing SafetyNet, and you'll lose access to a bunch of apps that you may care about. SafetyNet can be spoofed in some situations, but not all, and even when spoofing does work, it all seems very brittle to me.
Yes, of course, you can do this, but let's not pretend there aren't trade-offs.
Yeah. Android has become as hostile as Apple devices due to hardware remote attestation. Might as well buy an iPhone at this point. Only reason I didn't is I discovered the existence of Termux which turned into something of an Android killer app for me. Who knows what Google's gonna kill next though? Termux is already incompatible with the official store due to system call restrictions. Maybe it'll be straight up impossible to run it in a future version of Android.
In the context of "changes to Android 14", "just" is right. Android has required root access for modifying random apps since before Android 6, and that's only because many apps didn't bother implementing certificate pinning (which was already known advice at that point).
Alternatively, you can use ADB + Frida to pull an APK from the device, inject a binary, and hook code at runtime using JavaScript or Python. That's much easier for intercepting traffic than messing with certificate stores or eBPF ever was, in my opinion.
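For the curious, the runtime-hooking part is a few lines of Frida script. This is only a sketch: it assumes the target app uses OkHttp (whose `CertificatePinner.check` overload is a real, commonly hooked API), and the package name below is a placeholder.

```javascript
// bypass-pinning.js - minimal Frida sketch for defeating OkHttp pinning.
// Assumes the target app uses okhttp3; "com.example.app" is a placeholder.
Java.perform(function () {
  var CertificatePinner = Java.use('okhttp3.CertificatePinner');

  // Replace the pin check with a no-op so an interception proxy's
  // certificate is accepted even though it doesn't match the pins.
  CertificatePinner.check.overload('java.lang.String', 'java.util.List')
    .implementation = function (hostname, peerCertificates) {
      console.log('[*] Pinning check skipped for ' + hostname);
    };
});
```

Run against a USB-connected device with something like `frida -U -f com.example.app -l bypass-pinning.js`; apps using other HTTP stacks need different hooks.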
"Yes, of course, you can do this, but let's not pretend there aren't trade offs."
Are there some words in the parent comment that "pretend there aren't tradeoffs"? Is it that he did not include a warning about "SafetyNet"? What would this "pretending" look like?
Losing access to "a bunch of apps you may care about" seems to be dependent on an assumption: that the reader cares about certain unnamed apps. Yet we cannot even name these apps. We cannot know what apps a user cares about unless the user tells us. I know Android users that do not use any apps that rely on SafetyNet.
Nor do we know what device manufacturer the reader may be dealing with. It might be one where it's relatively easy for the computer owner to have root privileges.
Perhaps we can refrain from making assumptions about readers.
Everything in the category "the user owns the device" is tricky.
For a lot of users, "It's really hard to break" is a value-add. Every capacity the user has to modify permissions is an opportunity for an attacker to compromise a device. You can see an example of this in web browsers these days, where sites have to `log` a big scary "Don't paste anything someone tells you to paste into here" message into the built-in developer tools because no matter how many safety features get added to the browser security model, the dev tools can bypass them.
It is definitely important that the purchaser knows what kind of phone they're getting (whether it's easy or hard to crack open all the layers of its protection model), but "The phone's protection model is easily broken by the owner" as a universal absolute applied to all devices should be considered harmful.
"Easily, but there's a big scary warning that the person asking you to do this might be trying to hack you" is still "easily". That obviously seems more consumer friendly than either extreme.
I don't think you deserve the downvotes; this is exactly right, and it's incredibly frustrating. Programmers, and increasingly folks who don't consider themselves to be writing malware, have no concept of an action that should only ever be performed by the end user, never by an application. If it's possible to do, I should be allowed to do it! If they didn't want me doing it, they should have stopped me! The number of guides on the internet, not targeted toward developers, that teach users to go through the "create an application" flow and grant some random app more privileged access to their account than the app itself would be allowed is embarrassing for our industry. "Just add this configuration profile!" "Just paste in your API key."
The moment you allow users to add custom root certs, ad-blocker apps are going to ask users to add one for "advanced" network-level ad blocking. You can't win with this crap. Nobody considers themselves a "not advanced" user, so no amount of warning will ever work. Have you ever tried to do something, seen a warning in your way saying it was for advanced users, and stopped? Me neither.
> sites have to `log` a big scary "Don't paste anything someone tells you to paste into here" message into the built-in developer tools because no matter how many safety features get added to the browser security model, the dev tools can bypass them.
That doesn't seem like all that much work. Hardening things like that should be a dedicated job, but of course it doesn't "create value", so it's mostly left to rot until it's a source of bad PR.
I haven't needed it on recent Androids due to WFH and spending more time on my laptop, but back when I was flying more for work, I was much more into my phone.
Can't remember if it was my Motorola or Nexus, but I felt like I had a full-fledged laptop in my pocket back then.
Meanwhile, one of the straws that broke the camel's back for Windows was the insane difficulty/impossibility of removing the bloatware/malware that comes preinstalled with Windows 11. In 2023, it's mind-boggling to think you have easier access to modify a cellphone OS than a desktop OS.
I recently got tired of trying to hack together a sane workflow on the Windows computers in the lab at my university, so I installed nix-on-droid and gotty on my phone.
Now I just open a tab to my phone's IP address and benefit from the big screen and full sized keyboard while still having exactly the tools I'm used to having elsewhere. When I get home and want to resume work on beefier hardware, I just push from my phone, pull from my desktop, and I'm just where I left off, except now with more resources.
You have to be a bit austere about your tool choices to make the similarity happen (sorry VSCode), but it feels like a bit of a superpower just the same.
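The moving parts of a setup like this are mundane; roughly (the branch name is arbitrary, and gotty's default port is 8080):

```shell
# On the phone, inside nix-on-droid: expose a writable shell over HTTP.
# -w (--permit-write) allows keyboard input from the browser.
gotty -w bash        # then browse to http://<phone-ip>:8080 from the lab machine

# Handing off to beefier hardware later is plain git:
git push origin wip    # on the phone, before leaving
git pull origin wip    # on the desktop, to pick up where you left off
```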
That's not necessarily specific to iOS. Certificate pinning is usually done inside an app, not at the OS level. An app can choose to ignore the system certificate store and, for example, pin the cert used to talk to its private API. This is possible both on iOS and Android.
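A sketch of what in-app pinning amounts to, independent of the OS trust store. The helper name is made up, and hashing the whole certificate is a simplification (real pinning libraries like OkHttp hash the SubjectPublicKeyInfo), but the shape is the same: the app ships a hash at build time and checks it on every handshake, so swapping the OS certificate store doesn't help an interceptor.

```python
import base64
import hashlib

def pin_matches(der_cert: bytes, expected_pin: str) -> bool:
    """Compare a server certificate against a hash baked into the app.

    Simplified: real pinning hashes the SPKI, not the whole DER blob.
    """
    digest = hashlib.sha256(der_cert).digest()
    return base64.b64encode(digest).decode("ascii") == expected_pin

# Stand-in bytes for a DER-encoded certificate.
cert = b"dummy-der-bytes"
pin = base64.b64encode(hashlib.sha256(cert).digest()).decode("ascii")

assert pin_matches(cert, pin)                  # the pinned cert passes
assert not pin_matches(b"mitm-proxy-cert", pin)  # anything else fails
```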
Adding system certificates is actually easier on iOS than on Android, and it can be done without rooting or jailbreaking the device, unlike Android. Cert pinning is done by the apps, not the system.
You can, but that's not the system certificate store.
Android has two certificate stores (the user store and the system store). The user store can be altered through the method you linked. The system store used to be part of the system image (you could always disable certificates, of course) and will now be moved to an APEX location that Google can update (to prevent the Let's Encrypt issue in the future).
To alter the system store, you need root access. At the moment it's just a matter of dropping a file with the right name and encoding into /system/etc/security/cacerts (through Magisk-style overlays, or by modifying the system image), but that will change soon.
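For reference, the current root-required dance looks roughly like this (a sketch, assuming a rooted device with a remountable /system; `myca.pem` is a placeholder for your CA certificate):

```shell
# Android names store entries by the old-style OpenSSL subject hash, suffix ".0".
hash=$(openssl x509 -inform PEM -subject_hash_old -noout -in myca.pem)
cp myca.pem "$hash.0"

# Push into the system store on a rooted device with a writable /system
# (or ship the same file as a Magisk module overlaying this path).
adb root
adb remount
adb push "$hash.0" /system/etc/security/cacerts/
```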
What's the practical difference between system store and user store? Do some apps or system operations only trust the system store and not the user store?
https://github.com/gojue/ecapture
Anyone know why there is an aarch64 nocore but not an x86_64 nocore?
"It's really hard to break" should not be conducive to the dumbing down of the populace. Enabling power users is therefore more desirable.
And importantly, the two do not have to be mutually exclusive.
Isn't installing your own OS on your general-purpose computer a trivial out? Shall we likewise disable that ability on all general-purpose computers?
EDIT: thanks to the people clarifying that pinning is done by apps and not by iOS.
It's a good feature for security (stalkerware remains a huge problem) but it does suck from a reverse engineering standpoint.
or https://github.com/barre/privaxy
Not rhetorical questions.
Being able to MITM your own traffic by manipulating the keys was a godsend
Invaluable debugging tools
We need to take a new tack, pro-user AND pro-security - those are often seen as in conflict with each other.
or can you add your own for every domain or something?