Syonyk · 4 years ago
Don't shit in my hand and call it chocolate ice cream.

"We're going to scan your photos, on your encrypted device, to look for badness. Right now, we're going to claim that's only for the really icky people that nobody is going to defend, but, hey, once the tech is in place, who's to say we can't scan for dank memes and stuff?"

I think I'm done with Apple. Sad, really. I was hoping that their bowing to China with iCloud wasn't a sign of what's to come, but apparently it was. They had done such nice stuff with privacy too.

Demote my phone to a house phone and go without, I suppose.

voldacar · 4 years ago
These processes seem to only move in one direction. In 5 years this exact comment will probably make you sound like an "extremist" if you say it to a random person. "What, why wouldn't you be okay with apple searching through your photos?"

I see it everywhere and it literally seems like some kind of one-way entropic process. I can't think of anything that would reverse it. It would be like turning an omelet into an egg.

Is there something about modern life that just inescapably creates this complacent, servile temperament in most of the population? Or has it always been there and I'm just overthinking it? It's really depressing either way so I try not to think about it

throwaway0a5e · 4 years ago
>I see it everywhere and it literally seems like some kind of one-way entropic process. I can't think of anything that would reverse it. It would be like turning an omelet into an egg.

Thomas Jefferson had some arboriculture advice that seems relevant to this kind of thing.

bonestamp2 · 4 years ago
> Is there something about modern life that just inescapably creates this complacent, servile temperament in most of the population? Or has it always been there and I'm just overthinking it?

I only have anecdotal evidence, but it seems like most people don't care about problems unless they affect them personally right now. Also, most of us (HN aside) don't tend to think seriously about how things will go wrong; we're generally optimists who give more weight to the good something will do.

nradov · 4 years ago
I appreciate that Apple is trying to find technical means to reduce child porn. That's a more important problem to solve than optimizing ad click rates. But my concern is that once the tools for local image scanning exist, Apple will come under pressure from authoritarian regimes like China, Australia, and Saudi Arabia to also search for images associated with lesser crimes or even just criticism of the government. It's tough for a company to refuse. Either comply with the orders, or risk having your business shut down and your local employees punished.
ogurechny · 4 years ago
What makes you think that child porn is such an important problem, apart from rag articles on the vague “Darknet”? Shouldn't all phones get scanned for evidence of tax evasion, a much more widespread crime? All those rich people dealing with offshore businesses have iPhones™, so it's only natural to make Apple do their part as an honest member of society… I mean, an honest corporation of society.

If you allow me to guesstimate wildly, most “child porn” these days, in a technical sense, is made by kids themselves, who have access to an internet-connected device with a camera. Sometimes it is extorted by despicable abusers, sometimes it is done for no one in particular, just for the perceived fame/popularity/likes on social services/etc. Big services have an army of moderating grunts to keep the Victorian purity of a blissfully ignorant public intact; things are a bit different in poorer parts of the Net and of the globe. Should we expect the naked selfie of a teenager sent to their significant other to automatically trigger a police interrogation of that same teenager? What if the device is shared with older members of the family? Another man-made dystopia, enabled by people who enjoy expressing outrage over racy stories in the media, and people who are too afraid to speak up.

In essence, Apple has introduced a software agent that signals whether you have files that belong to a list someone provides. If I recall the scandal correctly, this is what Kaspersky allegedly (ab)used, and what other antivirus tools (including Microsoft's built-in and enabled-by-default Windows scanner, which for some seemingly important reason nags all the time if uploading of files to Microsoft is disabled) surely enjoy offering to various agencies around the world.

I don't think you should worry about “China, Australia, and Saudi Arabia” so much; there's an elephant in the room you'd rather not notice.

option_greek · 4 years ago
You are stating it in reverse. "Think about the children" is always the excuse used to get the next big thing started (like all the other things you mentioned). So it's step 1 of a grand plan along the lines of 1984 (probably to get some more markets to open up for them by cozying up to governments).
hapless · 4 years ago
There is something like a 0% chance this program wasn't co-developed with representatives of the government entities you least want rifling through your data.
dimmke · 4 years ago
I find discussions around this kind of stuff frustrating because what's actually happening often gets muddled by the hysteria, even here on Hacker News.

From the link: "Before an image is stored in iCloud Photos..."

This leads me to believe that only data that is going to get uploaded to their servers is going to be scanned. If anyone has a different interpretation or thinks I'm wrong, feel free to reply.

It's my understanding that all cloud services do this type of scanning, when they are technically able to.

But all data on iCloud is encrypted by default, so Apple can't scan for this kind of material once it is on their servers. On device, before the data gets encrypted and uploaded, is the only place they could even do a scan like this.

Additionally, they make it clear in the article that there has to be more than one hit (they don't say the actual number), which would mitigate the risk of hash-collision false positives.
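
To make that concrete with completely made-up numbers (Apple doesn't publish theirs), here's a rough sketch of why a match threshold crushes the collision false-positive rate:

    from math import comb

    # All three numbers below are invented for illustration only.
    p = 1e-6      # assumed per-photo perceptual-hash collision probability
    n = 20_000    # assumed photos in a library
    t = 30        # assumed matches required before an account is flagged

    # P(at least t false matches); terms beyond t+50 are vanishingly small
    tail = sum(comb(n, k) * (p ** k) * ((1 - p) ** (n - k))
               for k in range(t, t + 50))
    print(f"chance of at least one collision: {1 - (1 - p) ** n:.3f}")  # ~0.02
    print(f"chance of {t}+ collisions: {tail:.2e}")                     # ~1e-84

With these assumptions a single stray collision is entirely plausible, but dozens at once essentially never happen by accident.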

If this type of scanning makes you uncomfortable, you can just not use their cloud services.

I do agree that this is still not a good direction to go, even with all the precautions they've taken. But I had to do some digging to figure out what was actually going on; the comments/commentary made it seem like Apple is now routinely scanning all your photos/videos if you have an iPhone.

Once the code is there to do local scanning, it might make it easier for a zero-day exploit to scan the phone and grab data it might not otherwise have access to, or for governments to force Apple to scan content on a phone whenever they ask.

aaomidi · 4 years ago
The US is one of the main countries that spies on its citizens. I do wonder if Apple developed this to keep the FBI happy, the same way they stopped work on E2EE cloud backups to keep the FBI happy.
Zababa · 4 years ago
I'm always a bit wary about child porn arguments. For example, the Australian police ran a child porn website for 11 months https://www.vg.no/spesial/2017/undercover-darkweb/?lang=en. A child protection agency spread pictures of child pornography http://saucenao.blogspot.com/2021/04/recent-events.html. At some point, I start to wonder about the intentions of the people behind these.
erhk · 4 years ago
This tool has nothing to do with child pornography. You're reading the marketing paper.
babyshake · 4 years ago
They need to at least have a canary clause that can be tracked to determine whether they have complied with any such requests.
AniseAbyss · 4 years ago
You are right to worry. Once the tech is employed out in the field you literally cannot go back.
antpol · 4 years ago
Forget about authoritarian regimes committing genocide; think about the children sharing Pepe memes with OK signs. There are lots of pitchforks ready to go, and some of the people holding them already work on the Apple campus, writing design docs on how to make the Apple ecosystem even more "safe".
narrator · 4 years ago
Who has access to update what are considered bad images and when to alert the authorities? This could very easily be made into an anti-wikileaks feature.
nulld3v · 4 years ago
Exactly. If some whistleblower is going to leak a picture of some classified documents, the intelligence agency could simply upload a picture of the documents to the database. And bam, the journalist is arrested.

Moreover, if the database doesn't store actual images and instead stores only the perceptual hashes, it would be impossible to audit, even if the auditor has access to the database.

All the auditor would see is a bunch of hashes, they wouldn't be able to tell which hash actually represents CP and which hashes represent pictures of confidential documents.
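
To illustrate how little an auditor has to go on (using a cryptographic hash purely to show the opacity, with made-up placeholder inputs; the real database uses perceptual hashes, but the auditing problem is the same):

    import hashlib

    # Two completely different kinds of input produce equally opaque digests;
    # nothing in the list itself tells an auditor which entry is which.
    for label, blob in [("entry A", b"...abusive image bytes..."),
                        ("entry B", b"...leaked document scan bytes...")]:
        print(label, hashlib.sha256(blob).hexdigest())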

stefan_ · 4 years ago
The next question: since this database contains not exact binary hashes but an ill-defined "perceptual hash", how many false-positive images has this system surreptitiously extracted from customer hardware and shown to (presumably human) control personnel? Were those customers notified? What is the recourse for them?

PhotoDNA has been around for a while, and you can think of it what you want, but those very pertinent questions have never been answered.
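
For anyone unfamiliar, a perceptual hash is nothing like an exact cryptographic hash. A toy average-hash sketch (not Apple's NeuralHash, and the file names are placeholders; this is only to show the general idea) makes clear why near-misses and collisions are a legitimate question:

    from PIL import Image  # pip install Pillow

    def average_hash(path, size=8):
        # shrink to size x size, grayscale, then one bit per pixel:
        # above or below the mean brightness
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        return sum(1 << i for i, px in enumerate(pixels) if px > mean)

    def hamming(a, b):
        return bin(a ^ b).count("1")

    # Recompressed, resized, or lightly edited copies land a few bits apart,
    # so matching uses a distance threshold -- and occasionally an unrelated
    # image lands inside that threshold too, i.e. a false positive.
    print(hamming(average_hash("original.jpg"), average_hash("edited_copy.jpg")))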

alexfromapex · 4 years ago
This is all a few months after touting themselves as a company focused on privacy. Decentralization is the only way to truly know you have privacy today I suppose.
tmn · 4 years ago
You’re already completely trusting them if you’re using their device? I mean I get the privacy concern, but it’s odd that this is the line if you were fine with things before. That said I’m all for ditching the smartphone and best of luck to you
Syonyk · 4 years ago
Historically, Apple has come down, hard, on the side of "Your device is your device, and where that assumption is violated, we will work to protect it." They have added increasing layers of hardware security around device encryption keys, have added the various mitigations for password guessing attacks, etc.

And when the FBI said, "Hey, can you write and sign a custom bootloader for this phone to bypass that stuff?" they told the FBI to pound sand and made the hardware security features stronger so even Apple couldn't break them.

And then they bowed to China regarding iCloud and in-country servers, which clearly are accessible to the government. And then this. Whatever claims they've been making about privacy are obviously now crumbling under some external pressure.

tomjen3 · 4 years ago
At the limit you are trusting the sum of the people who are creating the software you use - and that includes drivers, firmware, etc. Linux is not exempt, but it may be easier to find issues in open source code.

Given this, Apple was not a bad person to trust. Yes they wanted you to pay them money, but were a) quite happy to deliver value in return, b) prepared to do lots of hard work to fight on your behalf and c) it was quite clear that you were their customer, not their product. Google is an advertising company, even if you pay them money you are always also the product.

I am seriously wondering what happened to Apple that they came up with this idea, it doesn't seem to be in their interests at all.

erhk · 4 years ago
I'm worried about the millions of people who don't read Hacker News.
wyager · 4 years ago
If they outright lie about their device spying on you, the repercussions are much worse than them openly spying on you.
MisterTea · 4 years ago
> I was hoping that their bowing to China with iCloud wasn't a sign of what's to come, but apparently it was.

It's likely that this tech originated from Apple's capitulation to China and has probably been in place for quite some time.

0x000000001 · 4 years ago
They're already scanning your photos to classify the images. That's how you can search for photos by descriptions. They also can OCR the text in images. Seems like a weird time to start getting angry about your privacy being invaded
DSingularity · 4 years ago
Would you please describe a solution to finding CSAM which you would find acceptable? It seems to me that on-device scanning for known CSAM content is on the lower end of the privacy-violating spectrum.
hapless · 4 years ago
There is no solution to searching private data that I am going to find acceptable.

* If the "target" database is secret, then government entities are free to use a "CSAM" database to monitor dissidents by adding their own items of interest to the target list.

* If the "target" database is public, the only way to validate its contents is to ... traffic in child sex abuse material. That's not great.

Basically, there is no way to create and validate this database in public view. I don't need yet another flavor of secret policing in America.

This doesn't seem like a problem that can be solved without compromising my privacy.

I would rather the problem went unsolved than allow the state to rifle through all of my private files to "prevent the distribution of child sex abuse material," knowing full well the state will define that "material" however they like, then use parallel construction to prosecute dissidents whenever they have obtained information illegally.

gpm · 4 years ago
Traditional police work that respects the principle behind the 4th Amendment: no searches of private information, algorithmic or otherwise, directly by the government or by proxy, where there is no probable cause.
Syonyk · 4 years ago
Find the websites distributing it, infiltrate them, generally do the legwork to find what's going on - which is exactly what they've been doing.

"But the children!" is not a skeleton key for privacy, as far as I'm concerned.

I reject on-device scanning for anything in terms of personal content as a thing that should be done, so, no, I don't have a suggested way to securely accomplish privacy invasions of this nature.

I'm aware that they claim it will only be applied to iCloud based uploads, but I'm also aware that those limits rarely stand the test of governments with gag orders behind them, so if Apple is willing to deploy this functionality, I have to assume that, at some point, it will be used to scan all images on a device, against an ever growing database of "known badness" that cannot be evaluated to find out what's actually in it.

If there existed some way to independently have the database of hashes audited for what was in it, which is a nasty set of problems for images that are illegal to store, and to verify that the database on device only contained things in the canonical store, I might object slightly less, but... even then, the concept of scanning things on my private, encrypted device to identify badness is still incredibly objectionable.

In the battle between privacy and "We can catch all the criminals if we just know more," the government has been collecting huge amounts of data endlessly (see Snowden leaks for details), and yet hasn't proved that this is useful to prevent crimes. Given that, I am absolutely opposed to giving them more data to work with.

I would rather have 10 criminals go free than one innocent person go to prison, and I trust black box algorithms with that as far as I can throw the building they were written in.

WindyLakeReturn · 4 years ago
>on the lower end of the privacy-violating spectrum

I think that's the thing, privacy minded people want solutions that aren't on the spectrum at all.

I can come up with some solutions for greatly reducing CSA that are really low on violating parental rights but they have little chance of gaining support because the only solutions people want are the ones that don't violate parental rights at all (except when parental rights have been suspended due to evidence of CSA being found).

diebeforei485 · 4 years ago
Yeah. They can scan photos that are actually passed around, instead of something that's just in a photo library. That will likely result in 2-3 orders of magnitude fewer false positives.
jszymborski · 4 years ago
No, the lower-end is infiltrating CSAM distribution networks the old-fashioned way.
gorgoiler · 4 years ago
Boosting the amount of influence schools have in children's lives is something we could do better at.

More teachers, smaller classes, stronger relationships, better discipline, more love. I see too many teachers where I work who just don’t care any more.

Educating school leaders to educate their teachers to help children root out the family member who is abusing them. That would be a far neater solution — and I really don't mean to be inflammatory about this — than catching villains based on photo evidence of abuse that has become systematic enough to be shared online.

It is very expensive and time-consuming though. Teachers are hard to recruit (low pay, low-quality job), which exacerbates the problem, which in turn makes it even harder to recruit them.

tomjen3 · 4 years ago
Not OP, but a warrant, upon probable cause, supported by oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.

In other words, you don't get to yell "think of the children" and ransack people's homes. If you don't have specific evidence suggesting that I am a criminal, you assume that I am not _and_ let me be.

r00fus · 4 years ago
How about not doing it.

The "privacy" of a on-device scanning is meaningless when you upload to iCloud where they will send to law enforcement and an unaccountable nonprofit as soon as it matches something.

aaomidi · 4 years ago
Yes. Find the ones producing it and use proper infiltration tactics.

Empower children to report abusers and protect them when they do.

throw1998149898 · 4 years ago
Anything which

- directly prevents actual abuse of children in the physical world (obviously)

- fights the sale of images created through abuse (as this incentivizes abuse).

Those are the main underlying reasons to fight CSAM, besides the obvious horrificness of the stuff in itself.

Please also consider the second-order effects of an increasingly totalitarian society on the wellbeing of children.

alexfromapex · 4 years ago
Them having to get a warrant or some form of fair adjudication instead of a carte blanche of privacy invasion?
wyager · 4 years ago
Don't spy on my device. I literally don't care if that doesn't help some pet cause.
Blikkentrekker · 4 years ago
Child pornography is the boogeyman used to compromise privacy.

There is significantly more illegal material of greater concern, such as communications plotting terrorist attacks, or companies ignoring safety regulations and causing many deaths in doing so, and none of that is scanned for either on the mere theory that it might exist, absent due cause for a search warrant or some similar mechanism and burden.

But child pornography is the one thing Anglo-Saxon culture is notoriously emotional about, and willing to surrender all its freedom and privacy for. For a while terrorism counted among this too, but that seems to have fallen off.

baybal2 · 4 years ago
Lawsuit incoming? How is this possible when it seems to go 180° against their previously putting what was pretty much a privacy guarantee in the EULA?

Will they announce an EULA change reversing the guarantee, and freeze out anybody who hasn't clicked through it?

I thought Apple was one of a kind company brave enough to put such a guarantee in writing.

jimbob45 · 4 years ago
Based on the press release, I don't think you're right. It doesn't appear to be all photos, just ones uploaded to iCloud.

It looks like it only triggers when you go to upload a photo to iCloud. In that case, it (maybe?) scans only that photo on your device and then decides if it's CSAM.

I guess Apple really really doesn't want CSAM on their servers? I don't know why they wouldn't just perform the scanning on their own server otherwise.

zionic · 4 years ago
Apple’s press release is a lie.

iCloud stuff already isn’t E2E and they already scan.

This system is built exactly to surveil the entire contents of your mobile device, cloud enabled or not.

Apple has burned years of social capital in 2 days.

etchalon · 4 years ago
Ah yes, the "slippery slope" argument, where evidence is never required, pessimism always wins over realism, and any counter-argument can be dismissed with "but they COULD!"
revolvingocelot · 4 years ago
I remember when they put a fucking U2 album on everyone's phone. Remember that? Many found that suspicious, but I recall people like you saying "oh Apple can write, who cares, but they'd never read, you're being ridiculous".

I remember when an Apple auth server went down and no one could launch non-Apple applications because Apple needed to see the hashes of the things people ran on their computers.[0]

When will the slope be inclined enough for you?!

[0] https://mobile.twitter.com/llanga/status/1326989724704268289

colordrops · 4 years ago
The "slippery slope" argument is not a fallacy. If you were in 2001 and told people what the state of data collection and privacy would look like in 20 years, they'd call you an absolute conspiracy nutjob. It was a national news story then that a printer driver phoned home. Now look at where we are at, and every loss of privacy has had some seemingly reasonable excuse.
Syonyk · 4 years ago
Do you trust your government to not abuse "But they could?" types of capabilities?

I don't. I've seen the FBI claim they can't break "impenetrable encryption" and then magically find ways around it, and I don't trust them not to require Apple to add hashes of "things they find problematic," complete with a gag order so Apple can't report it.

I've worked computer security long enough to know that it's always worse than advertised, always will be abused, etc.

spockz · 4 years ago
Just the other day in the Netherlands it came out that cameras placed with the strictest promise that they would only record license plates, and only trigger on a match with a known license plate, now also capture the persons in the car. And before that the systems were changed to store pictures for 28 days in order to look back. It also came out that the deletion doesn't always work and that the obfuscation that is supposed to happen also doesn't always happen.

The reaction of the police seems to be “why do you care about this? We have the data let’s use it. And we could use it to find those ~killers~ these people that allegedly just killed the most popular crime news reporter. So let us do our work.”

Just like software expands to eat CPU/memory/disk resources and people's spending expands to meet their salaries: whatever is possible will happen.

To me the slippery slope argument is perfectly valid.

rsgrn · 4 years ago
You're right in one sense. In another, isn't this exactly what people said about intelligence agencies ("they could but they don't") and then... Snowden.
RIMR · 4 years ago
I think we've learned from experience that tech companies will happily exploit anything they’re capable of exploiting, and this new capability has plenty of potential for exploitation.

Searching everyone's media for evidence of criminal wrongdoing, while citing only one example of the kind of wrongdoing you're looking for, is very susceptible to an actual slippery slope, given that there are plenty of other criminal activities they could start looking for should they decide that it's part of their mission.

The only thing that makes CSAM so attractive to go after is how disgusting society feels that it is. Up next could be reporting drug-related texts of known convicts to parole officers. Next could be drug-related texts of everyone to police officers. Next could be letting the police officers make their own searches, where they find everyone talking about a certain kind of political organizing.

We should refuse to accept even the first instance of this kind of thing.

Zak · 4 years ago
Prior to this change, they couldn't. Now they could.

It's much easier to change the policy than to add a new mechanism. Governments, including the US government, have previously attempted to force Apple to break into users' devices, and often apply gag orders to such attempts.

Now they could, and that's enough for me to be unwilling to keep sensitive information on an iOS device.

randyrand · 4 years ago
Slippery slope? What they already announced is abhorrent. Scanning all of our photos against our wishes?

Who cares what it's for? The fact that they are even doing this at all is a huge infringement to their customer's autonomy.

hncurious · 4 years ago
You're right, but isn't the rapid expansion of censorship and surveillance over the past year evidence that we are currently slipping down the slope?
gouggoug · 4 years ago
It is a slippery slope argument, but wouldn't you agree that it's been proven by the past decade of events (e.g. Snowden) that we can expect governments to try (and succeed) to abuse originally well intentioned technologies to spy on anyone?
Hamuko · 4 years ago
Considering that Chinese iPhone users already do not enjoy the same privacy features as non-Chinese iPhone users, I would not be at all surprised if Chinese iPhones are going to have a lot more "child porn" hashes to check against.
longtom · 4 years ago
Edward Snowden's revelations provide very solid evidence that what can be done is being done.
wyager · 4 years ago
Slippery slope is a shockingly predictive heuristic.
hapless · 4 years ago
Well, there's not really any slope here. Apple plans to have a target database of BAD FILES!!1! that will get your name handed to the government.

It's not subtle. No expansion to the program is necessary to violate your privacy or endanger dissidents.

Jaxkr · 4 years ago
This is incredibly disappointing. The sick criminals that run child pornography rings are not storing their material on iCloud.

The "This could be sensitive to view" screen is downright Orwellian. This technology could be used to scan for ANYTHING, completely undermining user privacy. It might just be CP today, but tomorrow it could be screenshots of protest material, whistleblower content, or anti-government memes.

I cannot express how sad I am Apple has decided to do this. It doesn't protect children, it won't catch any pedophiles, but it certainly WILL be misused in the future and create a chilling effect on what (politically dissident) content people are willing to store on their phones.

arriu · 4 years ago
Lol there is no way they will be scanning for just one type of thing. That would be a waste of computation. They will tag everything and decide if they want to use it later.

Scary times; for exactly the reasons you've called out.

cyral · 4 years ago
They already tag everything; just search for any object in your photos library (whether you are on iOS or even Android with Google Photos) and it will find it.
erhk · 4 years ago
I can't wait until photos of marvel movies get deleted and you get a warning for breaking copyright.
userbinator · 4 years ago
"The road to hell is paved with good intentions."

The gradual but steadily accelerating rise of authoritarianism scares me far more than terrorists, drugs, child abuse, and the pandemic.

Unless we push back mightily, it will be a question of when, not if, owning a general-purpose computer that's not controlled by the government or a company becomes discouraged, suspicious, and eventually illegal.

MonadIsPronad · 4 years ago
This scares the heck out of me too. It just seems like such an obvious next step.

"Oh you run linux? Do you not trust the government, or are you trying to hide something?"

hypertele-Xii · 4 years ago
Do you honestly think this, what I'm doing right now - posting on HN on a general-purpose computer, on an open source browser and server stack - will become limited to only licenced professionals?
EvanAnderson · 4 years ago
Yes. I absolutely do think that.

Companies holding interests in copyrighted works would love to see general purpose computers go away, replaced by "trusted" media players that can't make unauthorized copies or be used to make unsanctioned independent creative works.

Totalitarian governments would love to take away general purpose computers to prevent end users from removing surveillance and anonymity.

Companies who want to control software markets would love it if all software licensing transactions ran thru their "marketplace".

a1369209993 · 4 years ago
There are no good intentions; they're just lying.
judge2020 · 4 years ago
Apple is not an autonomous hive mind. The hundreds of people that built this and coordinated the deployment across multiple teams including management, deployment, QA, beta testing, machine learning, and SWE aren't lying and the majority of them certainly have good intentions.
ComputerGuru · 4 years ago
Remember, Apple doesn't actually control the CSAM database (NCMEC). They almost certainly don't (and wouldn't want to) even have access to a reverse mapping between hashes and original images. A/The government(s) could (yes, theoretically - this is cryptography we are purportedly talking about, the onus is on them to prove they can't) easily slip in pictures/screenshots of political materials to have your account flagged. Even if it requires manual human review on Apple's end, the government could still (and has in the past) serve them with a warrant + gag order for "false" matches.

Yes, it's all "they could" but with current technical solutions already providing some measure of protection against a corrupt or malicious government from cracking down on its citizens, anything that erodes from that freedom deserves to be held to such a high standard.

zionic · 4 years ago
In the press release Apple says they are using hashes of hashes.

Apple has no visibility into the original image that generated the hash, so the gov can compromise the list at the source and Apple would have plausible deniability.

r00fus · 4 years ago
The NCMEC isn't even "government". It's just a nonprofit. Here's the guy who founded it, and his controversies:

https://en.wikipedia.org/wiki/John_Walsh_(television_host)#C...

There is literally no oversight here. It's a misfeature and will be misused.

MrBuddyCasino · 4 years ago
„ In his book Tears of Rage, Walsh openly admits being in a relationship with 16-year-old Revé when Walsh was in his early 20s and aware of the age of consent being 17 in New York.[30] Critics of the Adam Walsh Act have pointed out that, had he been convicted, Walsh himself would have been subject to sex offender registration under the law which he aggressively promoted.“

Can’t say I‘m surprised.

rdtwo · 4 years ago
I wonder if the Tiananmen Square photo matches one of the Chinese hashes. I bet a bunch of lesser-known Uyghur photo hashes pop against the database. Is it a publicly available hash database? I wonder if it's possible to run known controversial stuff through it and see if it flags?
judge2020 · 4 years ago
Such images of Tiananmen Square would have been wiped off the face of social media if so - every big social media company uses PhotoDNA or similar to catch likely matches before someone in the real world sees the illegal images on their platform.
brightball · 4 years ago
After reading the post, and as a parent of two kids who are in middle school now... I'm pretty happy with what I see. I didn't expect to be, based on the comments I read here before reading the article though.

I know a local family who has a daughter who's been in therapy for the last 3 years because she fell victim to the type of thing Apple is discussing in this post. They are firm advocates for better parent education and oversight, sharing their experience so that other people can hopefully never have to deal with the same thing. They told us about an app called Bark[1] that's supposed to really help with a lot of this stuff and seems in line with what Apple is talking about here. I'm pretty happy to see it will be built in.

> The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple.

All the parental controls in the world don't change the fact that getting your kids a phone in this day and age is a pretty terrifying experience if you know what type of things are out there.

> When receiving this type of content, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view this photo. As an additional precaution, the child can also be told that, to make sure they are safe, their parents will get a message if they do view it. Similar protections are available if a child attempts to send sexually explicit photos. The child will be warned before the photo is sent, and the parents can receive a message if the child chooses to send it.

1 - https://www.bark.us/

zug_zug · 4 years ago
I think the account owner being able to turn that type of capability on or off is generally okay (though god knows if I were a 17-year-old I'd certainly be switching to Droid for that reason alone).

It sounds like there's a second, unrelated thing going on, which cannot be turned off, that reports images to Apple when they set off its warnings. The larger concern there is that this type of technology could obviously someday be used to, say, report or delete all photos of police brutality. Since 20% of the world lives in China alone, I think the question of "How do we ensure malicious authorities cannot use this technology against good users?" is not really an afterthought but must be addressed head-on before people will buy in.

r00fus · 4 years ago
This iMessage feature is good, it's the hashing all of your photos for comparison against CSAM hashes that's the problem. How do you know if you've tripped the wire? What about false positives?

Given how free US Law enforcement is with violence, any potential threat of involvement with them makes me very very nervous.

Going to need to reconsider hosting my photos with Apple devices.

Zelphyr · 4 years ago
My kids' school recommends Bark but, after looking into it, I felt I couldn't trust it. Unlike Apple, Bark appears to (at the time I reviewed them) transmit nearly everything the kid does on the phone to the Bark servers for review. That's a pretty terrible way of solving this problem, in my opinion.
webdood90 · 4 years ago
> getting your kids a phone in this day and age is a pretty terrifying experience

there is a really easy solution to this problem

threeseed · 4 years ago
There is nothing easy about it.

a) Many places are still going in/out of lockdown or are schooling remotely and so their phone is the only way for them to communicate with their friends. Depriving them of social contact is incredibly unhealthy and harms their development.

b) For better or worse apps like Tiktok are a huge part of their culture and the popular dances etc are often known by everyone. Being the only child who is out of the loop can cause serious isolation.

Children are growing and making them not feel like they are part of a social group is incredibly harmful and can have permanent effects in adulthood. Giving them a phone but monitoring their activities is likely to be the least harmful approach.

brightball · 4 years ago
Oh I know. I'm the parent whose kids constantly complain "everybody else has one" and who keeps holding out.
morpheuskafka · 4 years ago
The Messages app auto-blurring seems useful and respectful of the user, which is nice.

Additionally, the client side scanning seems very well-designed, but if iCloud Photos are not end-to-end encrypted, why are they going to such an effort to do this when they already have access to any image they want server-side?

ducadveritatem · 4 years ago
The explanation is complicated but really fascinating. I think I understand it, but not well enough to explain it. Read the section entitled "What cryptographic tools are used in the implementation of the system?" in this write up about Apple's methodology.

https://www.apple.com/child-safety/pdf/Technical_Assessment_...

Also look at their full whitepaper here: https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...

My take is that performing the initial hash matching and encrypting the results in two separate layers on device prevents Apple from having meaningful knowledge of low (under the set threshold for being flagged) quantities of matches on a user account. This protects the use of the threshold as a way to further reduce false positives. For example they couldn't comply with a subpoena that said "Hey, we know you set a threshold of only flagging + reporting accounts with 50 image matches, but we want to see a list of all accounts with 10 or more matches because we think that's good enough."

This method lets them set and enforce a threshold to maintain their target false positive rate which they say is ~1 in 1 trillion accounts incorrectly flagged.
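
A rough way to picture the threshold mechanism is plain Shamir secret sharing (this is not Apple's exact construction, which layers it with private set intersection; it's only a sketch of the "learn nothing below t" property): the key that unlocks the flagged material is split so the server can reconstruct it only once it holds at least t shares, one revealed per matching image.

    import random

    P = 2**127 - 1  # a prime field large enough for a toy secret

    def make_shares(secret, t, n):
        # random polynomial of degree t-1 with constant term = secret
        coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
        return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
                for x in range(1, n + 1)]

    def reconstruct(shares):
        # Lagrange interpolation evaluated at x = 0
        secret = 0
        for xi, yi in shares:
            num, den = 1, 1
            for xj, _ in shares:
                if xj != xi:
                    num = num * (-xj) % P
                    den = den * (xi - xj) % P
            secret = (secret + yi * num * pow(den, -1, P)) % P
        return secret

    key = 123456789
    shares = make_shares(key, t=5, n=30)      # one share per potential match
    print(reconstruct(shares[:5]) == key)     # True: threshold reached
    print(reconstruct(shares[:4]) == key)     # False: below threshold, garbage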

Disclaimer: I'm not a cryptographer and could be misunderstanding this.

walkedaway · 4 years ago
My kids are older teenagers now, but I wish Apple would have had some of this 5-8 years ago. Good on Apple for investing in helping real world problems and issues instead of investing in silencing opinions they disagree with in the name of "misinformation."
crackercrews · 4 years ago
Is the blurred-photo feature available to adults as well? I can imagine that some people who get unsolicited photos sent to them might want it.

Also, it looks like the blurring feature is limited to the Messages app. That's pretty easy to work around.

tptacek · 4 years ago
They got Mihir Bellare to review (and write a proof for) the private set intersection cryptography pieces of this.

https://www.apple.com/child-safety/pdf/Technical_Assessment_...

Apple's official proof was done in part by Dan Boneh:

https://www.apple.com/child-safety/pdf/Apple_PSI_System_Secu...
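
For the curious, the "private set intersection" flavor can be sketched with commutative blinding - a toy Diffie-Hellman-style PSI over plain integers. Apple's actual protocol uses elliptic curves plus the threshold machinery described in those papers, and the exchange happens between two parties rather than in one script, so treat this purely as an illustration of the core trick:

    import hashlib

    P = 2**127 - 1               # toy prime modulus; real systems use EC groups
    a, b = 0x1234567, 0x89abcde  # each party's secret exponent (toy values)

    def to_group(item: str) -> int:
        h = int.from_bytes(hashlib.sha256(item.encode()).digest(), "big")
        return pow(h % P, 2, P)  # square so the value lands in a subgroup

    client_items = {"img1", "img2", "img3"}   # placeholder identifiers
    server_items = {"img2", "img4"}

    # client blinds its items with exponent a, server re-blinds with b
    client_blinded = {pow(to_group(x), a, P) for x in client_items}
    client_double = {pow(y, b, P) for y in client_blinded}

    # server's items under both exponents (exponentiation commutes)
    server_double = {pow(pow(to_group(x), b, P), a, P) for x in server_items}

    print(len(client_double & server_double))  # 1 -> one item in common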

skinkestek · 4 years ago
I think Moxie came up with the end-to-end encryption technology that WhatsApp and Signal use.

It is great.

But it didn't prevent Facebook from trying to abuse WhatsApp users in every conceivable other way including some rather innovative ones as seen recently.

Also it didn't prevent Signal from releasing a desktop client with a really nasty XSS bug and the phone client hasn't exactly been without faults either.

You and Moxie are both much smarter than me, but I foresaw and maybe even predicted (possibly so early it is under another handle) some of Facebook's actions while everyone else was just talking about how E2E encryption was the big difference.

This feels more or less the same: the intent is honorable, the core code is clever and almost unbreakable - but you cannot really trust all of the actors you need to trust for the system to work.

We cannot read WhatsApp messages in transit. And, as specified, this system sounds good. But WhatsApp happily uploads unencrypted copies to Google Cloud and/or iCloud without your consent if another participant enables backups, and I'm fairly sure we'll see this system containing huge opportunities for abuse by power-hungry regimes, as well as one or two nasty bugs with the potential to utterly destroy a few innocent people's lives.

tptacek · 4 years ago
I'm not sure I really follow this (Signal Protocol is Trevor and Moxie; in particular, I think the triple-DH AKE is Trevor's invention) --- but I'm just here to point out the interesting cryptography angle. I wouldn't touch the policy argument here with a 10 foot pole.
pranau · 4 years ago
So what? When you don’t trust the actors running the system, it doesn’t matter if the crypto is perfect.

And right now, I don’t trust the maintainers of the CSAM database and Apple to do the right thing.

tptacek · 4 years ago
So nothing? It's interesting that Bellare and Boneh are involved. That's the extent of my commentary.
zionic · 4 years ago
Even then, say you trusted Apple.

Nobody signed up/purchased an iPhone trusting random 3rd parties/CSAM aggregators.

drenvuk · 4 years ago
Thank you for the purely informational post. I'm currently reading through the second one just to understand this more thoroughly, but I can already say: screw the proofs. Apple needs to stay off my devices with their crime-scanning BS. It doesn't matter if this whole scheme ensures that Apple cryptographically keeps their hands clean. The effect is the same. They're on my device looking for things that can send me to jail.

These brilliant academics just don't care so long as they can publish an interesting paper I guess. Whatever, still reading.


tptacek · 4 years ago
I don't think Bellare or Boneh are hurting for cites.
TravisHusky · 4 years ago
Wow, I am surprised Apple is taking this route. It's not like iCloud was a haven for sharing illegal content. I love the security features of iOS but honestly this may have pushed me to move to an Android device running GrapheneOS.

Honestly, any time there is a new policy to "protect children" it is almost always incredibly invasive and it always feels like there is some other motive and "protecting children" is used to scare anyone who tries to question it.

PragmaticPulp · 4 years ago
I really don’t understand why Apple is doing this. The vast majority of their customers aren’t involved in any of these illegal activities, so it only provides potential downside through false positives.

I’m also struggling to imagine scenarios where a child predator is clever enough to acquire illegal photos without triggering any number of internet monitoring mechanisms (e.g. honeypots, server logs with their IP address), yet would then turn around and upload those photos to their iCloud account. It doesn’t make sense.

This is a really strange move.

Shank · 4 years ago
> potential downside through false positives

I just spent the last hour of my life digging through the material. It seems like they've calibrated the system to have an expected false classification rate of one account in one trillion, per year. Based on the threshold secret sharing math, they won't know anything until a user has a significant collection of CSAM and the secret key is recovered.

> I’m also struggling to imagine scenarios where a child predator is clever enough to acquire illegal photos

I don't want to be the bearer of bad news, but CSAM has been shared openly on the clearnet on places like 4chan for years. The internet is a pretty wide open search space. Many of the people who download CSAM don't do so from some sketchy underground website. This is why Tumblr, for instance, burned their platform to the ground: lots of people sharing CSAM with no way to detect and stop it (without costing Verizon a ton of money to work on the problem).

jareklupinski · 4 years ago
I'm scratching my head wondering why they bothered announcing it ahead of time.

Why not just run the scans in the background... (honestly, I'm surprised they aren't already). It's not like it would be hard to omit this avenue as your lead during prosecution.

q3k · 4 years ago
> I really don’t understand why Apple is doing this.

My guess is that this is a PR campaign to show that they're doing self-regulation. Perhaps related to the EU push against end-to-end encryption.
