- Requires physical device access.
- Cannot decrypt FileVault without you entering the decryption key post-exploitation.
- Cannot access secrets in the Secure Enclave.
This isn't great, but is it really a huge security issue? The main attack vector here is stolen Macs being resold, since they can be "un-bricked".
I know part of the point of the T2 is to make physical access a less interesting attack vector, but it's typically doing so after a machine has been stolen. This makes tampering without stealing more attractive, but that's an attack style really only carried out by state-level threats, and one that was possible anyway, because hardware access is essentially game-over at that point.
I feel like this is overblown? Am I missing something? Is there another attack vector here that's worse for "regular users"?
For what it's worth, we're talking about mostly personal laptops, tablets, and smartphones here - not corporate servers. I wouldn't be so quick to dismiss the "physical access" requirement, although that does at least make it difficult for attackers to exploit the vulnerability at a large scale.
Sure, but as mentioned there are basically two versions: steal, or modify in place.
With the former, the device is already stolen. The only difference here is that the thief can re-sell it. That's not a huge win for the user who it was stolen from.
With the latter, you need ongoing physical access in order to install and maintain a presence on the device. That's a sophisticated attack, but sophisticated attackers already have ways of doing this.
Physically target the MacBooks of employees that have access to those servers, especially the enterprise device-management servers, and you can probably exploit this at a larger scale. You could do a pretty trivial amount of LinkedIn research to figure out your targets.
What I gathered from previous articles is that this helps with brute-forcing passwords. Note that I don't know this for sure.
My thought was that the T2 chip is responsible for rate-limiting access to the Secure Enclave. Specifically, for rate-limiting attempts to derive a key from a user-supplied password and the secret in the Secure Enclave.
This allows weaker passwords to be used, because brute-forcing requires the Secure Enclave in the loop, and access to that enclave is rate-limited. Drop that rate-limiting, and a brute-force attack becomes a lot more viable.
As far as I know, this was at stake when the FBI wanted Apple's help a while ago to decrypt an iPhone. However, I have no idea whether the same scheme is used for macOS devices, nor whether the T2 chip is actually responsible for the rate-limiting.
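For a rough sense of why enclave-enforced rate-limiting matters, here's a back-of-envelope sketch. All the numbers (a 6-digit numeric passcode, one guess per 10 seconds when throttled, 1,000 key derivations per second when not) are made-up illustrative assumptions, not Apple's actual parameters:

```python
# Back-of-envelope: why hardware rate-limiting matters for weak passwords.
# All rates and keyspace sizes below are illustrative assumptions only.

SECONDS_PER_DAY = 86_400

def worst_case_days(keyspace: int, guesses_per_second: float) -> float:
    """Days needed to exhaust the keyspace at a given guess rate."""
    return keyspace / guesses_per_second / SECONDS_PER_DAY

keyspace = 10 ** 6  # hypothetical 6-digit numeric passcode

# Enclave enforcing, say, one attempt per 10 seconds:
limited = worst_case_days(keyspace, 0.1)

# Rate-limiting bypassed, guesses bounded only by the KDF speed,
# say 1,000 derivations per second:
unthrottled = worst_case_days(keyspace, 1_000)

print(f"rate-limited:  {limited:.1f} days")          # ~115.7 days
print(f"unthrottled:   {unthrottled * 1_440:.0f} minutes")  # ~17 minutes
```

Same passcode, same keyspace; only the guess rate changes, which is exactly why stripping the throttle (if that's what this enables) would be the interesting part.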
True that requiring physical possession raises the bar considerably, but this threat is still problematic in a couple of ways.
a) An unattended laptop can be compromised in under five minutes, without any obvious indications. There are also loaner situations, at schools etc., where machines are shared.
b) No specialized hardware is necessary, so it's really cheap to execute.
c) This is probably the worst part: power cycling the laptop does not reset the T2 ("The T2 chip is fully booted and stays on, even if your Mac device is shutdown.") - so there's no practical way to ensure that your machine is not compromised after a brief interlude. For extra fun, if it turns out the attack can be executed by a malicious wall port, then things start to get really interesting.
Every time an Apple device is taken from you at the US border, the border patrol could use this vector to capture your password later and send it over the internet without your knowledge. Before, you could just assume they couldn't infect you. You aren't safe until you reset the T2 - every time someone takes it from you.
But the same is true for any laptop - if someone untrustworthy takes your laptop and you’re at that level of concern then your next step is necessarily erasing the machine.
Since this is a tethered exploit, simply rebooting the device should prevent that.
They could also install a hardware keylogger, so a non-exploitable T2 doesn't really help here in the general case. We just have to hope that the TSA aren't at the level of your average state actor.
Right on the spot, but this also raises an ethical question! Let's say you identified a vulnerability or a hack, or found something that will nullify a product's security. What should one's first step be?
1) Send it to Apple (or the respective company) and, if they don't respond after follow-ups, put it on the internet.
2) Put it on the internet so it gains popularity and makes the aforesaid company do something!
In some cases like these (ones that require physical access) this is fine, but what if someone uncovers a potential remote hack for every Android or Apple phone out there? Which option would be ethical?
I'd say it depends on what that "security" is. If it is remotely exploitable and can harm user freedom, tell the companies quietly. On the other hand, if it can enable user freedom, keep it to yourself and those who you trust to be on your side.
Regular users don't care about security - that's just a matter of fact.
As for security for "other users" - users with high security requirements can be targeted, even with a locked system.
As an example - if someone messes with my laptop while I'm away, the TPM check will fail and Windows will demand I re-enter my BitLocker key upon start. If you can suppress that, that's a big issue.
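The BitLocker behaviour described above comes from "sealing" the disk key against boot measurements: each boot component is hashed into a running measurement, and the key is only released if the final value matches what it was sealed against. Here's a toy Python model of that idea - a simplification of TPM PCR mechanics, with made-up component names, not the real protocol:

```python
import hashlib

def extend(measurement: bytes, component: bytes) -> bytes:
    # TPM-style "extend": new = H(old || H(component))
    return hashlib.sha256(measurement + hashlib.sha256(component).digest()).digest()

def measure_boot(components: list) -> bytes:
    # Fold every boot-chain component into one running measurement.
    m = b"\x00" * 32
    for c in components:
        m = extend(m, c)
    return m

trusted_chain = [b"firmware-v1", b"bootloader-v1", b"kernel-v1"]
sealed_against = measure_boot(trusted_chain)  # value the key is sealed to

def key_released(components: list) -> bool:
    # The disk key is released only if measurements match the sealed value.
    return measure_boot(components) == sealed_against

assert key_released(trusted_chain)  # normal boot: key released silently
tampered = [b"firmware-v1", b"evil-bootloader", b"kernel-v1"]
assert not key_released(tampered)   # tampered boot: user must type recovery key
```

The point of the parent comment is the inverse: if an attacker can make the tampered chain measure the same as the trusted one (or suppress the check entirely), the tamper-evidence disappears.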
They don't care enough to be inconvenienced for it, but they do care when their identity is stolen, when they are blackmailed with photos, etc.
This is what's good about the T2, it secures those sorts of things for regular users in a very convenient way. This vulnerability does not appear to really change that for regular users (although it likely does for orgs with high security requirements, i.e. those considering an APT part of their threat model).
When you send your Mac to Apple Support, it is assumed that no technician is able to peek at your data, as the T2 provides encrypted storage on the SSD. With this loophole, your data is wide open, e.g. when your Mac dies before you can wipe it. Apple would often replace the whole Mac for you, including a new SSD. Those who skip FileVault might want to reconsider.
It's a good idea to assume that if physical access is compromised, you are screwed. This just confirms this continues to be the case.
Even without this, many people could be owned just by installing a keylogger on their external keyboard, which would likely be easier to access and hack than the T2 / laptop.
I think to persist the attack you need a persistent hardware device attached, so a reboot doesn't reset things?
The thing is, can't you just stick a keylogger onto the machine at this point and get an equivalent compromise? I log your login and then use that to authenticate?
I agree with you. Question for the group, since hardware security isn't my area: will Apple be able to patch this vulnerability in a T2.1 or T3? Gives me a new reason for yet another MBP, I guess (just purchased the 16").
T2 is basically a bridge (EDIT: by this term I meant a temporary fix for non-ARM Macs, as others have pointed out) between Intel Macs and ARM Macs to allow macOS 11 to work on both platforms with feature parity, such as Siri.
It's probably a good idea to get a new ARM Mac as your next MacBook Pro, because the T2 chip doesn't even need to be there anymore.
It's the same Checkm8 exploit that was hyped up as the end of the world for iPhones late last year (before it turned out that it was really only useful to allow sideloading software on your own device).
I'm continually reminded that a HN "HUGE!!" security issue is so rarely that.
I can think of so many more significant security issues: remote root access to Windows machines, potentially intentionally flawed VPN security, TPMs on Intel machines that are remotely exploitable and provide persistent access to the machine - and the list goes on and on.
There is something about Apple that makes folks blow up. Perhaps it's just they are one of the few with a bit of a reasonable reputation here.
Steal someone's phone? You realize iPhones since the XS forward use A12+ chips; they don't even have the Intel / T2 combo being talked about, as far as I know. And then to get access you need to install a keylogger PHYSICALLY into the MacBook. All this is "possible", but you can probably do a keylogger without breaking the T2 and get equivalent access at the end of the day.
Which exist on likely 99% of people's setups - Mac, Windows, whatever. Keyloggers on external keyboards, keyboard hacks, BIOS hacks...
> Stolen phones, people incarcerated
This hack doesn't decrypt the drive, so it won't help if you just steal someone's laptop. You have to get physical access, hack the T2, then get it back into their hands so they put their password in to decrypt the drive.
> How difficult would it be to steal someone's phone, put illegal stuff on it...
This hack affects MacBooks and Macs with the T2 security chip installed; it's not about phones.
You can do exactly the same on a machine without a T2 chip though, so I’m not sure what makes this special vs any other time you give someone untrusted physical access to a device
Nothing very interesting, but there are 3 replies I can see; you have to click on the body of the tweet to expand the replies, whereas clicking on the comments button starts your own reply.
While as always, I love reading about hacking efforts on every front, this part:
>> This could be used to e.g. circumvent activation lock, allowing stolen iPhones or macOS devices to be reset and sold on the black market.
is making me very sad. As someone who has had things stolen in the past - thieves are the scum of the earth - I was hoping that the implementation of activation lock on Apple devices would be a major roadblock to theft.
Which doesn't really matter because carriers don't share those blacklists internationally - so stolen devices just end up in another country. Also, I believe this was specifically about MacBooks?
During initial setup the iPhone checks in with Apple (presumably providing its serial number, etc) and gets some client certificates back for things like iMessage, push notifications, etc.
So even if you circumvent activation lock client-side and get to the phone's home screen, you shouldn't be able to really use it for much (beyond a simple phone and web browser).
But this is specifically about the MacBooks, no? Meaning that if you can get the machine to boot it will work as normal, maybe App Store won't work but it shouldn't have much impact on the usability of the machine, right?
It is one of the manipulation techniques used to convince people they should trade the freedom to use their devices however they want for a notion of security.
Apple is known for opposing right-to-repair initiatives and prefers that faulty devices be destroyed rather than repaired.
It's an interesting balance though. On one hand I don't want a thief (or anyone) to be able to use my device after it's stolen; it should be literally impossible for anyone to unlock that device, ever. On the other hand, I'm not happy with Apple positioning themselves where they are, aggressively going after independent repair shops and not selling spare parts. Can we have it both ways? I don't know.
Now this poses the question: if they made such security claims regarding the T2 chip, and the T2 is effectively insecure and unpatchable, are people who have devices with the Apple T2 entitled to a refund, or at least compensation?
>It is one of the manipulation techniques used to convince people they should trade freedom to use their devices however they want for notion of security.
No, it absolutely is not. It is a 100% genuine trade off, and you are being deceptive out of your own personal interests if you lie and claim it isn't. The ability to do anything necessarily means the ability to do bad things, and the more power is available the more work, knowledge, and metaknowledge is needed to both make full use of it and avoid pitfalls.
Software lockdowns prevent power users like the typical HNer from doing useful and valuable things. But they also help ensure that non-power users (who, remember, may be anything from doctors to engineers to diplomats to farmers - brilliant experts in their own fields, key to society, just not in computers) cannot even be socially engineered into getting themselves too deep in trouble. And this is obviously, objectively a real problem, and one frankly the tech community brought in part on ourselves with constant "the user is at fault, the user is stupid" stuff for decades. But why SHOULDN'T users just be able to go easily find anything they think looks interesting and give it a whirl, and have a reasonable expectation that it'll meet certain isolation/privacy/payment standards, and if not that it'll get automatically removed and the developer banned? And even for power users, lockdowns can help shift the power balance away from developers and pool user buying power through a single point, so that devs are forced to obey certain basic standards whether they like it or not.
Of course, software lockdowns also destroy valuable innovation, and they create a single point of pressure that is in turn prone to abuse or (likely worse in theory and more prevalent in practice) pressure from even more powerful entities. The likes of China, the EU, or if things go bad enough the US can force censorship onto a vast array of people and devices via iOS in a way that they can't with traditional systems.
On the hardware side, lockdowns make repair more difficult and centralized. But they also prevent hardware hacks, and have proven to make theft vastly less economic.
IMO, I'd like to see a single, buy-time option for an owner to load their own root keys for software, hardware, both, or neither. Personally, I'd go for software and not hardware: the devices are very reliable, I'm not a hardware hacker myself, and in my own model I'd prefer to run the risk of buying replacement kit rather than even think about hardware subversion. For my parents and grandparents I'd strongly push them to continue with "neither", which should be the default. I'm sure many on HN would like both. Maybe some (particularly in countries where official repairs are much harder) would even prefer to keep the software side of things locked down for malware but as a practical matter allow repairs.
But it's intensely frustrating to still see techies who apparently never dealt with family support or Help Desk or whatever in their lives blithely repeat 1990s/early-00s memes about PEBKAC/PICNIC/lusers etc. The BOFH was a ton of fun to read, but its real-world application has limits. I don't feel bad about not being skilled at small engine work, or pharmaceuticals, or whatever.
Indeed, it is always interesting to see these articles that never mention the positive side of these exploits, i.e. enabling freedom. I guess it shouldn't be that much of a surprise, given the authoritarian nature of the security industry (and most big corporations in general, including Apple.)
I had the opposite reaction. I thought -- great! We can return all this hardware that people have irresponsibly abandoned, and bring it back to successful reuse.
The important part about this is that even if 100% true, it's only exploitable with physical access or a compromised USB device of some kind—it's not remotely exploitable. (Exploits also would not last beyond a clean boot.)
So while the headline is strictly true based on the contents of the article, I would call it excessively alarmist, at best. This is not something that the average user ever needs to worry about.
> Good news is that if you are using FileVault2 as disk encryption, they do not have access to your data on disk immediately. They can however inject a keylogger
Overall I’m pretty happy here. Someone stealing my laptop can’t get my secrets from either the Secure Enclave or the disk, and this exploit prevents us from amassing iBricks because the hardware can always be recovered.
https://www.bugcrowd.com/resource/what-is-responsible-disclo...
https://arstechnica.com/information-technology/2019/09/devel...
How difficult would it be to steal someone's phone, put illegal stuff on it, and call the police? Could you figure it out for 10k USD?
https://twitter.com/axi0mX/status/1313665047768391680
https://twitter.com/su_rickmark/status/1313733383453732864
Yeah, Twitter is terrible.
Sigh
Didn't this become false advertising?
IMO you're better off just using an encrypted filesystem and typing in a passphrase on boot than dealing with hardware this locked down.