nova22033 · 4 days ago
Remember...they can make you use touch id...they can't make you give them your password.

https://x.com/runasand/status/2017659019251343763?s=20

The FBI was able to access Washington Post reporter Hannah Natanson's Signal messages because she used Signal on her work laptop. The laptop accepted Touch ID for authentication, meaning the agents were allowed to require her to unlock it.

wackget · 4 days ago
Link which doesn't directly support a website owned by an unscrupulous trillionaire: https://xcancel.com/runasand/status/2017659019251343763?s=20
throwawayfour · 4 days ago
Good reminder to also set up something that does this automatically for you:

https://news.ycombinator.com/item?id=46526010

forgotTheLast · 4 days ago
I actually think it's fitting to read about a government agency, weaponized by an unscrupulous billionaire, going after journalists who work for an unscrupulous billionaire, on a platform owned by an unscrupulous trillionaire.
apparent · 3 days ago
There are trillionaires?

Dead Comment

b8 · 4 days ago
They can hold you in contempt for 18 months for not giving up your password: https://arstechnica.com/tech-policy/2020/02/man-who-refused-....
ElevenLathe · 4 days ago
Being held in contempt at least means you got a day in court first. A judge telling me to give up my password is different than a dozen armed, masked secret police telling me to.
noident · 4 days ago
That's a very unusual and narrow exception involving "foregone conclusion doctrine", an important fact missed by Ars Technica but elaborated on by AP: https://apnews.com/general-news-49da3a1e71f74e1c98012611aedc...
teejmya · 4 days ago
I previously commented a solution to another problem, but it assists here too:

https://news.ycombinator.com/item?id=44746992

This command will make your MacBook hibernate when the lid is closed or the laptop sleeps, so RAM is written to disk and the system powers down (a rough sketch follows below). The downside is that it increases the time it takes to resume.
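
Purely as an illustration - and assuming the linked comment describes pmset's hibernatemode setting, which is my guess rather than a quote of it - a minimal sketch of the kind of change involved:

    # Rough sketch (assumption: the linked comment uses pmset's hibernatemode).
    # hibernatemode 25 tells macOS to write RAM to disk and power memory off on
    # sleep, so keys cached in RAM aren't kept live while the lid is closed.
    import subprocess

    # Show the current power-management settings before changing anything.
    subprocess.run(["pmset", "-g"], check=True)

    # Switch to full hibernation (write RAM to disk, power down) on sleep.
    # Needs admin rights; revert with hibernatemode 3, the Apple laptop default.
    subprocess.run(["sudo", "pmset", "-a", "hibernatemode", "25"], check=True)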

A nice side benefit, though, is that a fingerprint is not accepted on first unlock; I believe secrets are still encrypted at this stage, similar to a cold boot. A fingerprint still unlocks from the screensaver normally, as long as the system does not sleep (and therefore hibernate).

jakobdabo · 4 days ago
> I believe secrets are still encrypted at this stage similar to cold boot.

Does this mean that the Signal desktop application doesn't lock/unlock its (presumably encrypted) database with a secret when locking/unlocking the laptop?

patrickmay · 4 days ago
Is the knowledge of which finger to use protected as much as a passcode? Law enforcement might have the authority to physically hold the owner's finger to the device, but it seems that the owner has the right to refuse to disclose which finger is the right one. If law enforcement doesn't guess correctly in a few tries, the device could lock itself and require the passcode.

Another reason to use my dog's nose instead of a fingerprint.

parl_match · 4 days ago
I really wish Apple would offer a PIN option on macOS, for this reason precisely. Either that, or an option to automatically disable Touch ID after a short amount of time (e.g. an hour, or if my phone doesn't connect to the laptop).
thecapybara · 4 days ago
There are only ten possible guesses, and most people use their thumb and/or index finger, leaving four much likelier guesses.

Also, IANAL, but I'm pretty sure that if law enforcement has a warrant to seize property from you, they're not obligated to do so immediately the instant they see you - they could have someone follow you and watch to see how you unlock your phone before seizing it.

z3phyr · 4 days ago
A 1-in-10 chance is already good odds by itself, and multiplied across n tries it's even more laughable. Also, most people enroll two fingers for Touch ID, which pushes the real-world number closer to one in two.
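
A quick back-of-the-envelope sketch of those odds (illustrative assumptions only: two enrolled fingers, guesses drawn from the four "likely" fingers, and a lockout after a few attempts):

    # Toy model: probability that at least one of `tries` distinct guesses from a
    # pool of `candidates` equally likely fingers hits one of `enrolled` fingers.
    from math import comb

    def p_hit(enrolled: int, candidates: int, tries: int) -> float:
        tries = min(tries, candidates)
        # P(all guesses miss), counted hypergeometrically, then take the complement.
        return 1 - comb(candidates - enrolled, tries) / comb(candidates, tries)

    print(p_hit(enrolled=2, candidates=4, tries=2))   # ~0.83 under these assumptions
    print(p_hit(enrolled=1, candidates=10, tries=3))  # 0.3 with a naive ten-finger pool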
goda90 · 4 days ago
Remember that our rights aren't laws of nature. They have to be fought for to be respected by the government.
joecool1029 · 4 days ago
> they can't make you give them your password.

Except when they can: https://harvardlawreview.org/print/vol-134/state-v-andrews/

tedd4u · 3 days ago
75 footnotes for 89 sentences, nice! I guess that's how they roll over at the HLR.
notyourwork · 4 days ago
I don't get why I can be forced to use my biometrics to unlock but I cannot be forced to give a PIN. Doesn't jibe in my brain.
deltastone · 4 days ago
It's something you know vs. something you are (or have). That's how the legal system sees it. You might not have to tell anyone the PIN to your safe, but if police find the key to it, or hire a locksmith to drill it out, its contents are theirs with a warrant.

It's interesting in the case of social media companies. Technically the data held is the company's data (Google, Meta, etc.); however, courts have ruled that a person still has an expectation of privacy, and therefore police need a warrant.

direwolf20 · 4 days ago
When they arrest you, they have physical control of your body. You're in handcuffs. They can put your fingers against the unlock button. You can make a fist, but they have the strength and leverage to pry it open.

There's no known technique to force you to input a password.

soneil · 2 days ago
You're protected against compelled speech; fingerprints aren't protected.

Imagine it's 1926 and none of this tech is an issue yet. The police can fingerprint and photograph you at intake, but they can't compel speech or violate the Fifth.

That's exactly what's being applied here. It's not that the police can do more or less than they could in 1926; it's that your biometrics can do more than they did in 1926. They're just fingerprinting you / photographing you... using your phone.

wan23 · 4 days ago
The Fifth Amendment gives you the right to remain silent, but they didn't write in anything about biometrics.
sejje · 4 days ago
"technicality" or "loophole" is probably the word.

I fully agree, forced biometrics is bullshit.

I say the same about forced blood draws for BAC testing. They can get a warrant for your blood; that's crazy to me.

Dead Comment

deltastone · 4 days ago
Also, if biometrics are enabled on a device and your biometrics unlock it, that does wonders for proving to a jury that you owned and operated the device. So you're doubly screwed in that regard.
direwolf20 · 4 days ago
Remember, this isn't how it works in every country.
mbil · 4 days ago
Reminder that you can press the iPhone power button five times to require passcode for the next unlock.
rawgabbit · 4 days ago
Serious question. If I am re-entering the US after traveling abroad, can customs legally ask me to turn the phone back on and/or seize my phone? I am a US citizen.

Out of habit, I keep my phone off during the flight and turn it on after clearing customs.

thecapybara · 4 days ago
Did you know that on most models of iPhone, saying "Hey Siri, whose iPhone is this?" will disable biometric authentication until the passcode is entered?
fogzen · 4 days ago
In case anyone is wondering: in newer versions of macOS, the user must log out to require a password. Locking the screen no longer requires a password if Touch ID is enabled.
qingcharles · 3 days ago
Everyone makes this same comment on each of these threads, but it's important to remember this only works if you have some sort of advance warning. If you have the iPhone in your hand and a loaded gun is pointed at your head with orders not to move, you probably won't want to move.
kstrauser · 4 days ago
Or squeeze the power and volume buttons for a couple of seconds. It’s good to practice both these gestures so that they become reflex, rather than trying to remember them when they’re needed.
paulsmith · 4 days ago
Alternately, hold the power button and either volume button together for a few seconds.
tim333 · 3 days ago
One thing I miss from Windows (on Mac now) is an encrypted vault program that you could hide so it wasn't on the desktop or in the program list but could still be launched. That way you could have private stuff that attackers would likely not even know was there.
innagadadavida · 3 days ago
Is there a way to set up a Mac to disable Touch ID if the linked phone goes into Lockdown Mode or Face ID starts requiring the passcode? Apple could probably add that.
rustyhancock · 4 days ago
As far as I know, Lockdown Mode and BFU (before first unlock) prevent Touch ID unlocking.

At least a password or PIN is something you choose to give over.

raw_anon_1111 · 4 days ago
As if the government is not above breaking the law and using rubber-hose decryption. The current administration's Justice Department has been caught lying left and right.
qingcharles · 3 days ago
And threats aren't illegal. They can put a gun to your wife's head and say they're going to shoot. It's up to you then to call their bluff.
direwolf20 · 4 days ago
Plausible deniability would still work: you enter your duress code and the system boots to a secondary partition with nothing but Facebook and Snapchat. Unfortunately, no such OS exists.
p0w3n3d · 4 days ago
"Allowed to require" - a very mildly worded phrase for something that could include torture or abuse of force...

https://xkcd.com/538/

neves · 4 days ago
I just searched the case. I'm appalled. It looks like the USA doesn't have legal protection for reporters' sources. Or rather, Biden created some, but it was revoked by the current administration.

The real news here isn't privacy controls in a consumer OS or the right to privacy, but the USA, the leader of the free world, becoming an autocracy.

TheDong · 4 days ago
I find it so frustrating that Lockdown Mode is so all-or-nothing.

I want some of the lockdown stuff (no FaceTime or message attachments from strangers, no link previews, no device connections), but about half of the other restrictions I don't want.

Why can't I just toggle an iMessage setting for "no link previews, no attachments", or a general setting for "no automatic device connection to untrusted computers while locked"? Why can't I turn off "random dick pics from strangers on iMessage" without also turning off my browser's JavaScript JIT and a bunch of other random crap?

Sure, leave the "Lockdown mode" toggle so people who just want "give me all the security" can get it, but split out individual options too.

Just to go through the features I don't want:

* Lockdown Mode disables javascript JIT in the browser - I want fast javascript, I use some websites and apps that cannot function without it, and non-JIT js drains battery more

* Shared photo albums - I'm okay viewing shared photo albums from friends, but lockdown mode prevents you from even viewing them

* Configuration profiles - I need this to install custom fonts

Apple's refusal to split out more granular options here hurts my security.

Terretta · 4 days ago
The profiles language may be confusing -- what you can't do is change them while in Lockdown mode.
quizzical8432 · 4 days ago
I’m with you on the shared photo albums. I’d been using lockdown mode for quite a while before I discovered this limitation, though. For me, this is one I’d like to be able to selectively enable (like the per-website/app settings). In my case, it was a one-off need, so I disabled lockdown mode, shared photos, then enabled it again.

The other feature I miss is Screen Time requests. This one is kinda weird - I'm sure there's a reason they're blocked, but it's a message from Apple (or directly from a trusted family member? I'm not 100% sure how they work). I still _receive_ the notification, but it's not actionable.

While I share your frustration, though, I do understand why Apple might want to keep it "all-or-nothing". If they allow users to re-enable even one "dangerous" setting, that ultimately compromises the entire security model. An attacker doesn't care which way they compromise your device; if there's _one_ way in, that's all they need.

Ultimately, for me the biggest PITA with Lockdown Mode is not knowing whether it's to blame for a problem I'm having. I couldn't tell you how many times I've disabled and re-enabled it just to test something that should work, or to see if it's the reason a feature/setting is not showing up. To be fair, most of the time it's not the issue, but sometimes I just need to rule it out.

ectospheno · 4 days ago
Family albums work with lockdown mode. You can also disable web restrictions per app and website.
ethepax · 3 days ago
Agreed. If I know my threat model, I don’t need unnecessary restrictions.
everdrive · 4 days ago
>* Lockdown Mode disables javascript JIT in the browser - I want fast javascript, I use some websites and apps that cannot function without it, and non-JIT js drains battery more

This feature has the benefit of teaching users (correctly) that browsing the internet on a phone has always been a terrible idea.

rantingdemon · 4 days ago
I'll bite. Why is it so terrible? I'm browsing this site right now on my phone and don't see the horror.
jgwil2 · 4 days ago
I think that ship has sailed.
nxobject · 4 days ago
Sadly, they still got to her Signal on her desktop - her sources might still be compromised. It's inherent to desktop applications, but I'm sad that a lot more people don't know that Signal for Desktop is much, much less secure against adversaries with your laptop.
tadzikpk · 4 days ago
> I'm sad that a lot more people don't know that Signal for Desktop is much, much less secure against adversaries with your laptop

Educate us. What makes it less secure?

armadyl · 4 days ago
In addition to what the other person who replied said, and ignoring that iOS/Android/iPadOS are far more secure than macOS, laptops have significantly fewer hardware-based protections than Pixel/Samsung/Apple mobile devices do. So really the only way a laptop in this situation would be truly secure from LEO is if it's fully powered off when it's seized.
digiown · 4 days ago
My assumption is that the key in the desktop version is not always stored in the Secure Enclave (it definitely supports plaintext storage). Theoretically this makes it possible to extract the key for the message database, and a different malicious program can read it. But this is moot anyway if the FBI can simply browse through the chats; that isn't what failed here. A rough sketch of the legacy layout is below.
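
For what it's worth, a minimal sketch of what I mean - assuming the legacy layout where Signal Desktop kept a plaintext hex "key" for its SQLCipher database in config.json (newer builds wrap it with the OS keystore instead). The path and field name here are that assumption, not something from the article:

    # Check whether a Signal Desktop install still has a plaintext database key.
    # Assumed legacy layout: config.json in the profile dir with a hex "key" field.
    import json
    from pathlib import Path

    config = Path.home() / "Library/Application Support/Signal/config.json"  # macOS path
    if config.exists():
        data = json.loads(config.read_text())
        if "key" in data:
            print("Plaintext SQLCipher key on disk: anyone with your user files has your messages.")
        else:
            print("No plaintext 'key' field; presumably wrapped by the OS keystore.")
    else:
        print("No Signal Desktop profile found at the assumed path.")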
stronglikedan · 4 days ago
If people don't have Signal set to delete sensitive messages quickly, then they may as well just be texting.
AdamN · 4 days ago
That's a strong statement. Also imho it's important that we use Signal for normal stuff like discussing where to get coffee tomorrow - no need for disappearing messages there.
NewsaHackO · 4 days ago
Yeah, I'd also question the conclusions in the article. Was the issue that they couldn't unlock the iPhone, or that they had no reason to pursue that thread? To my understanding, the Apple ecosystem syncs everything together. If they already got into her laptop, wouldn't all of the iMessages, call history, and iCloud material already be synced there? What would be the gain of going after the phone, other than making the case slightly more watertight?
NetMageSCW · 4 days ago
Not if she’s smart.
mrandish · 4 days ago
I would have thought reporters with confidential sources at that level would already exercise basic security hygiene. Hopefully, this incident is a wake-up call for the rest.
pbhjpbhj · 4 days ago
Did she have Bitlocker or FileVault or other disk encryption that was breeched? (Or did they seize the system while it was booted, as TLAs seek to do?)
bmicraft · 4 days ago
There was a story here the other day: Bitlocker keys stored in your Microsoft account will be handed over.
deltastone · 4 days ago
Bitlocker isn't secure, for several reasons that I won't get into here.

Deleted Comment

MoonWalk · 4 days ago
breached
827a · 4 days ago
Is there an implication here that they could get into an iPhone with lower security settings enabled? There's Advanced Data Protection, which E2EEs more of your data in iCloud. There's the FaceID unlock state, which US law enforcement can compel you to unlock; but penta-click the power button and you go into PIN unlock state, which they cannot compel you to unlock.

My understanding of Lockdown Mode was that it babyifies the device to reduce the attack surface against unknown zero-days. Does the government saying that Lockdown Mode barred them from entering imply that they've got an unknown zero-day that would work in the PIN-unlock state, but not Lockdown Mode?

kingnothing · 4 days ago
It's relatively well known that the NSO Group / Pegasus is what governments use to access locked phones.
827a · 4 days ago
This was known in the past, but if it's relying on zero-days that Apple and Google are adversarially trying to keep up with and patch, my assumption would not be that Pegasus can always breach a fully updated iPhone at any given time. Rather, it's a situation where there are maybe periods of a few months at a time when they have a working exploit, until Apple discovers and patches it; repeat indefinitely.
halJordan · 3 days ago
The NSO Group is on the Entity List, so no Western government is using it. And it was never used to gain access to devices they already had physical control of.
zymhan · 4 days ago
Yes
macintux · 4 days ago
> Natanson said she does not use biometrics for her devices, but after investigators told her to try, “when she applied her index finger to the fingerprint reader, the laptop unlocked.”

Curious.

QuantumNomad_ · 4 days ago
Probably enabled it at some point and forgot. Perhaps even during setup when the computer was new.
intrasight · 4 days ago
My recollection is that the computers do, by default, ask the user to set up biometrics.
NewsaHackO · 4 days ago
I want to say that was generous of her, but one thing that's weird: if I didn't want someone to get into my laptop and they tried to force me to use my fingerprint to unlock it, I definitely wouldn't use the enrolled finger on the first try. Hopefully, Apple locks it out and forces a password if you use the wrong finger "accidentally" a couple of times.
dyauspitr · 4 days ago
She has to have set it up before; there's no way to divine a fingerprint otherwise. I guess the only other explanation would be a faulty fingerprint sensor, but that should default to denying entry.
giraffe_lady · 4 days ago
Could be a parallel construction type thing. They already have access but they need to document a legal action by which they could have acquired it so it doesn't get thrown out of court.

I think this is pretty unlikely here but it's within the realm of possibility.

quesera · 4 days ago
> faulty fingerprint sensor

The fingerprint sensor does not make access control decisions, so the fault would have to be somewhere else (e.g. the software code branch structure that decides what to do with the response from the secure enclave).

nozzlegear · 3 days ago
My read on this is that she tried to bluff, even though the odds were astronomically high that they'd call her on it. She didn't have anything to lose by trying a little white lie. It's what I would have done in the same situation, anyway.
b112 · 4 days ago
Very much so, because the question is... did she set it up in the past?

How did it even know her print?

ezfe · 4 days ago
Why is this curious?
macintux · 4 days ago
There appear to be relatively few possibilities.

* The reporter lied.

* The reporter forgot.

* Apple devices share fingerprint matching details and another device had her details (this is supposed to be impossible, and I have no reason to believe it isn't).

* The government hacked the computer such that it would unlock this way (probably impossible as well).

* The fingerprint security is much worse than years of evidence suggests.

Mainly it was buried at the very end of the article, and I thought it worth mentioning here in case people missed it.

mmooss · 4 days ago
Don't be idiots. The FBI may say that whether or not they can get in:

1. If they actually can get in, people - including high-value targets like journalists - will keep using security that doesn't work.

2. If the FBI (or another agency) has an unknown capability, the FBI must say they can't get in, or it would reveal their capabilities to all adversaries, including even higher-profile targets such as counterintelligence targets. Saying nothing also risks revealing the capability.

3. Similarly, if Apple helped them, Apple might insist that not be revealed. The same applies to any third party with the capability. (Also, less significantly, saying they can't get in puts more pressure on Apple and on creating backdoors, even if HN readers will see it the other way.)

Also, the target might think they are safe, which could be a tactical advantage. It may also keep recovered data outside evidence-handling rules, even if it's unusable in court. And at best, they simply haven't gotten in yet - there may be an exploit for this OS version someday, and the FBI can try again then.

coppsilgold · 4 days ago
I would not recommend that one trust a secure enclave with full disk encryption (FDE). This is what you are doing when your password/PIN/fingerprint can't contain sufficient entropy to derive a secure encryption key.

The problem with low-entropy security measures is that the low-entropy secret is only used to instruct the secure enclave (TEE) to release or use the actual high-entropy key. So that key must be stored physically (e.g. as voltage levels) somewhere in the device.

It's a similar story when the device is locked: on most computers the RAM isn't even encrypted, so a locked computer is no major obstacle to an adversary. On devices where RAM is encrypted, the encryption key is also stored somewhere - if only while the device is powered on.

pregnenolone · 4 days ago
RAM encryption doesn’t prevent DMA attacks and perofming a DMA attack is quite trivial as long as the machine is running. Secure enclaves do prevent those and they're a good solution. If implemented correctly, they have no downsides. I'm not referring to TPMs due to their inherent flaws; I’m talking about SoC crypto engines like those found in Apple’s M series or Intel's latest Panther Lake lineup. They prevent DMA attacks and side-channel vulnerabilities. True, I wouldn’t trust any secure enclave never to be breached – that’s an impossible promise to make even though it would require a nation-state level attack – but even this concern can be easily addressed by making the final encryption key depend on both software key derivation and the secret stored within the enclave.

Deleted Comment

Deleted Comment

QuiEgo · 3 days ago
I recommend reading the AES-XTS spec, in particular the “tweak”. Or for AES-GCM look at how IV works.

I also recommend looking up PUFs and how modern systems use them in conjunction with user-provided secrets to derive keys - a password or fingerprint is one of many inputs into a KDF to get the final keys.

The high-level idea is that the key being used for encryption is derived from a very well randomized and protected device-unique secret set up at manufacturing time. Your password/fingerprint/whatever is just adding a little extra entropy to that already cryptographically sound seed.

Tl;dr: this is a well-solved problem in modern security designs.
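
To make that concrete, a toy sketch of the idea (illustrative only - real designs run this inside the secure enclave, and the "device secret" below is just a random stand-in for a fused/PUF-derived value):

    # Toy KDF chain: the volume key depends on both a device-unique secret and the
    # user's low-entropy secret; neither input alone is enough to derive it.
    import hashlib, hmac, os

    device_secret = os.urandom(32)   # stand-in for the per-device secret set at manufacturing
    user_secret = b"1234"            # low-entropy PIN/password (or a biometric-gated secret)

    # Stretch the user secret, salted with (bound to) the device secret.
    stretched = hashlib.pbkdf2_hmac("sha256", user_secret, device_secret, 600_000)

    # The final key mixes both inputs again; this is what would wrap the disk keys.
    volume_key = hmac.new(device_secret, stretched, hashlib.sha256).digest()
    print(volume_key.hex())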

throwmeaway820 · 4 days ago
It seems unfortunate that enhanced protection against physically attached devices requires enabling a mode that is much broader, and sounds like it has a noticeable impact on device functionality.

I never attach my iPhone to anything that's not a power source. I would totally enable an "enhanced protection for external accessories" mode. But I'm not going to enable a general "Lockdown mode" that Apple tells me means my "device won’t function like it typically does"

jonpalmisc · 4 days ago
There is a setting as of iOS 26 under "Privacy & Security > Wired Accessories" in which you can make data connections always prompt for access. Not that there haven't been bypasses for this before, but perhaps still of interest to you.
H8crilA · 4 days ago
GrapheneOS does this by default - only power delivery when locked. Also it's a hardware block, not software. Seems to be completely immune to these USB exploit tools.
aaronmdjones · 4 days ago
It also has various options to adjust the behaviour, from no blocks at all, to not even being able to charge the phone (or use the phone to charge something else) -- even when unlocked. Changing the mode of operation requires the device PIN, just as changing the device PIN does.

Note that it behaves subtly differently from how you described if it was connected to something before being locked. In that case data access will remain -- even though the phone is now locked -- until the device is disconnected.

UltraSane · 4 days ago
Computer security is generally inversely proportional to convenience. The best opsec is usually to have multiple devices.
Terretta · 4 days ago
> I would totally enable an "enhanced protection for external accessories" mode.

Anyone has been able to do this for over a decade now, and it's fairly straightforward:

- 2014: https://www.zdziarski.com/blog/?p=2589

- recent: https://reincubate.com/support/how-to/pair-lock-supervise-ip...

This goes beyond the "wired accessories" toggle.

pkteison · 4 days ago
It isn’t. Settings > Privacy & Security > Wired Accessories

Set it to ask for new accessories, or to always ask.

sodality2 · 4 days ago
I have to warn you, it does get annoying when you plug in your power-only cable and it still nags you with the question. But it does work as intended!
mrandish · 4 days ago
> it has a noticeable impact on device functionality.

The lack of optional granularity on security settings is super frustrating because it leads to many users just opting out of any heightened security.

ur-whale · 4 days ago
> I never attach my iPhone to anything that's not a power source.

It's "attached" to the wifi and to the cell network. Pretty much the same thing.