tharne · 5 years ago
I think the problem Apple ran into was that there was no confusion at all. Apple announced they were going to scan users' devices after years of marketing themselves as a "privacy-focused" company. Shockingly, customers were pretty mad about the whole thing.
hypothesis · 5 years ago
There was no confusion at all.

There is no way Apple released their initial PR piece without thinking it through and deliberately fusing all those new features together as one big unassailable initiative. It was the typical "my way or the highway."

Which also makes it funny now that they attempt to distinguish between them and fall into the same hole that they dug for other people.

[1] https://www.apple.com/child-safety/

slg · 5 years ago
>There was no confusion at all.

I don't know what you and tharne are talking about here. There was definitely confusion. HN is a tech forum and I still saw plenty of people here worried about how they would get in trouble for having innocent photos of their own children on their phone. You are allowed to be against Apple's plan while still recognizing that many people didn't understand what exactly was part of that plan.

echelon · 5 years ago
It's good because now Apple employees have a ton of reasons to question their employer and quit.

Apple:

- Isn't going to be remote work friendly.

- Shut down internal polls on compensation.

- Bows to the FBI, CIA, FSB, CCP.

- Treats its customers as criminals.

- Treats its employees as criminals.

- (Spies on both!)

- Doesn't let customers repair their devices or use them as they'd like.

- Closes up (not opens up) the world of computing. Great synergy with the spy dragnet.

Take your time and talent elsewhere. This bloated whale is bad for the world. There are a lot of good jobs out there that pay well and help society.

ksec · 5 years ago
>There is no way Apple released their initial PR piece without thinking it through and deliberately fusing all those new features together as one big unassailable initiative.

Something I bet wouldn't have happened when Katie Cotton was in charge. But yeah, Tim Cook thought he needed a new PR direction, and that is what we got: the new Apple PR machine since 2014.

ISO-morphism · 5 years ago
> without fusing all those new features

Industry is learning from omnibus bills

Bud · 5 years ago
This is inaccurate by definition, of course. Obviously. "My way or the highway" implies there is no alternative.

But in this case, of course, if you're an adult, the Messages part of this doesn't apply to you at all, and the photos part can be completely avoided by not using iCloud Photos.

gentleman11 · 5 years ago
They are gaslighting people who are upset about what is really happening, and what will happen in 5-10 years, and portraying them as confused and ignorant instead. It’s the standard “you’re holding it wrong” Apple play
ksec · 5 years ago
Well, there is a huge difference from AntennaGate; the two aren't really comparable. Not to mention Apple did in a way admit to the mistake and gave out a bumper within weeks of the complaints.

Compare that to their keyboard, which took nearly 3 years before they had a programme for free repairs.

ultimoo · 5 years ago
Can we stop using gaslighting when it's not applicable? This is an instance of deceiving or lying, not gaslighting.
politelemon · 5 years ago
Five years ago it was: "you’re holding it wrong"

Five years from now it'll be: "you're wrong"

HN will as usual agree and take pride in being wrong.

OrvalWintermute · 5 years ago
The confusion was about the pushback. They expected a 2 foot wave, and they are getting a tsunami.

We drank the Apple Privacy Kool-aid, and now we are holding them to it.

This is totally a battle worth fighting!

pico303 · 5 years ago
I didn't realize at first how much this was going to change my view of Apple. Used to be when I saw the "Verifying xyz..." popup on my laptop, I felt a little more secure. It popped up tonight, and I found myself wondering, "Am I in trouble?"

I guess what I'm saying is, at least for me, this backlash is blurring their entire privacy and security pitch. Apple built a walled garden, told me it was for my own protection, then come to find out the cameras are all pointing to the inside.

jjcon · 5 years ago
Agreed - even if Apple doesn’t back down, giving them hell would make other companies less likely to follow suit. This is a very important line in the sand that they have crossed
tiahura · 5 years ago
“They expected a 2 foot wave, and they are getting a tsunami.”

Are you sure? My local Apple store is just as crowded as it was two weeks ago.

mortenjorck · 5 years ago
The "confusion" is splitting hairs. Federighi is trying to draw an artificial distinction between client-side scanning and in-transit scanning where the code performing that in-transit scanning merely happens to be running... on the client.
willcipriano · 5 years ago
User story for this feature: "As a user if the phone I spent $1200 on is going to spy on me, I want it to also use my electricity."
rendaw · 5 years ago
While still being controlled remotely.
notJim · 5 years ago
I was definitely confused, for the record. I had the impression that Apple would scan all photos on device, but that is not true. I was also confused because several changes were announced at once, and the conversations sometimes blended them.
acchow · 5 years ago
It’s my understanding that your iPhone would be checking ALL your photos on your device. Where did you come to understand otherwise?
zionic · 5 years ago
Disabling iCloud does not remove the scanner from your device though, it is always there waiting for any other API to call it and begin scanning.

Apple promising not to use the scanner is a weak promise they know they can’t keep (NSLs)

dehrmann · 5 years ago
You're also a toggle away from Apple scanning all your photos...if it isn't already enabled.
1vuio0pswjnm7 · 5 years ago
"On 5 August, the company revealed new image detection software that can alert Apple if known illegal images are uploaded to its iCloud storage."

People assumed this opens the door for Apple to be alerted of any known file uploaded to its iCloud storage.

IOW, they assumed Apple can check what someone is uploading,1 despite alleged "end-to-end encryption" and a gazillion promises of "privacy".

No one except the people managing the "detection software" know what files the hashes represent.

There's no way for the owner of an Apple computer to verify what files Apple is actually checking for.

Is this confusion? It sounds more like lack of trust.

1 Mind you, for a majority of computer owners the uploading is likely occurring by default, automatically, outside of the owner's awareness, as opposed to the owner consciously deciding to upload a particular file to a computer in an Apple datacenter. Tech companies know that users rarely change defaults.

Remember how Apple had zero accountability:

https://www.newscientist.com/article/dn26133-jennifer-lawren...

https://arstechnica.com/information-technology/2014/09/what-...

Ricky Gervais' advice made sense. Wonder why he deleted it.

pcurve · 5 years ago
They knew they were being hypocritical, so they were reluctant to even divulge the fact that other cloud providers have already been doing it; they wanted to position themselves as the pioneer.

I can't imagine how they thought this would go well.

It's another example of Apple being stuck in an echo chamber and not being able to objectively assess how their actions will be perceived.

How many times have they made product and PR blunders like this?

tungah · 5 years ago
It was pure hubris on their part.
philipov · 5 years ago
It's a typical "We're sorry you got mad" non-apology that deftly avoids admitting fault for the thing people are actually mad about.

Deleted Comment

hughrr · 5 years ago
If Craig tells me I’m misunderstanding this I distrust them further because I completely understand the full arena of possibilities and not just the narrow intent.
zionic · 5 years ago
Because of this interview I’ve added Craig to my mental list of executives who need to lose their job over this little stunt of theirs.

Apple has spent billions in engineering and marketing to establish themselves as the privacy leader, all wiped away by this idiotic system so full of holes you could serve it on crackers.

xibalba · 5 years ago
A true story...

Me (Last month): "Apple is taking privacy very seriously. I'm going to vote with my dollars and switch from Android."

Me (This month): "..."

godelski · 5 years ago
Honestly, I was going to make the switch with the next gen of phones (I've been on Android since the get-go). Glad I waited. At least a Google phone I can flash.
DesiLurker · 5 years ago
This was me, except I made the switch after a decade of being away from the Apple ecosystem. Now I am seriously considering going back.
mightybyte · 5 years ago
Vote with your money. If you own AAPL stock, sell it. And loudly refuse to buy Apple devices. That is the only language organizations like this will understand.
jes · 5 years ago
I agree. I do have AAPL and will liquidate it. I have also been an Apple loyalist for 25 years. They burned their bridges with me.
zionic · 5 years ago
I only had a few thousand AAPL, so I’m a small fry in the grand scheme of things.

I sold every share the day they announced this.

karmakaze · 5 years ago
The confusion was that many were previously taking Apple at their word when past actions should make that a questionable premise.
bsder · 5 years ago
The problem is that Apple regrets the negative PR but doesn't actually regret the scanning at all.

And that's the crux of the problem.

katbyte · 5 years ago
You say no confusion, yet you are saying "scan users' devices," which is incorrect, as it's just photos being uploaded to iCloud.
zionic · 5 years ago
Removing iCloud does not remove the scanner or its database full of hashes.

Limiting the scanner to iCloud is a policy decision one NSL away from changing.

steve_adams_86 · 5 years ago
I’d personally switch that “just” to “specifically”; while you’re correct, “just” connotes insignificance but the scanning is arguably significant.
gauravjain13 · 5 years ago
Thank you.
innagadadavida · 5 years ago
This is limited to users of iCloud photos. If you want to store your photos on Apple servers, shouldn’t they have the right to exclude CSAM content? Apple owns those servers and is legally liable. Why is this such a big issue?
psyc · 5 years ago
> If you want to store your photos on Apple servers, shouldn’t they have the right to exclude CSAM content?

This seems worded to get a Yes answer. So, yes.

It's a big deal because it's unprecedented (to my knowledge) outside of the domain of malware*. Other cloud providers run checks of their own property, on their own property. This runs a check of your property, on your property. That's why people care now. The fact that this occurs because of an intention to upload to their server doesn't really change the problem, not unless you're only looking at this like an architectural diagram. Which I fear many people are.

A techie might look at this and see a simple architectural choice. Client-side code instead of server-side. Ok, neat. A more sophisticated techie might see a master plan to pave the way for E2EE. A net-win for privacy. Cool. But the problem doesn't go away. My phone, in my pocket, is now checking itself for evidence of a heinous crime.

*I hope the comparison isn't too extra. I was thinking, the idea of code running on my device, that I don't want to run, that can gather criminal evidence against me, and report it over the internet... yeah I can't get around it, that really reminds me of malware. Not from society's perspective. From society's perspective maybe it's verygoodware. But from the traditional user's perspective, code that runs on your device, that hurts you, is at least vigilante malware, even if you are terrible.

hackinthebochs · 5 years ago
Personally I don't see on device scanning as significantly different than cloud scanning. I think the widespread acceptance of scanning personal data stored on the cloud is a serious mistake. Cloud storage services are acting as agents of the user and so should not be doing any scanning or interpreting of data not explicitly for providing the service to the end user. Scanning/interpreting should only happen when data is shared or disseminated, as that is a non-personal action.

If I own my data, someone processing this data on my behalf has no right or obligation to scan it for illegal content. The fact that this data sometimes sits on hard drives owned by another party just isn't a relevant factor. Presumably I still own my car when it sits in the garage at the shop. They have no right or obligation to rummage around looking for evidence of a crime. I don't see abstract data as any different.

gambiting · 5 years ago
Because if the content is entirely encrypted (like Apple says it is), they aren't legally liable, and it's entirely voluntary that they do this.

Also, no one (well, most people) has any issue with photos being scanned in iCloud. Photos in Google Photos have been scanned for years and no one cares. The problem is that Apple said that photos are encrypted on your device and in the cloud, but now your phone will scan the pictures, and if they fail some magical test that you can’t inspect, your pictures will be sent unencrypted for verification without telling you. So you think you're sending pictures to secure storage, but nope: their algorithm decided that the picture is dodgy in some way, so in fact it's sent for viewing by some unknown person. But hey, don't worry, you can trust Apple; they will definitely only verify it and do nothing else. Because a big American corporation is totally trustworthy.

btkramer9 · 5 years ago
The issue is that the scanning happens on your device just before upload. So now your own device is scanning for illegal activity _on_ your phone not the servers.

The second issue is that it will alert authorities.

In regard to CSAM content, those issues may not sound terrible. But the second it is expanded to texts, things you say, websites you visit, or apps you use, it's a lot scarier. And what if, instead of CSAM content, it is extended to alert authorities for _any_ activity deemed undesirable by your government?

short_sells_poo · 5 years ago
I’d expect a secure and privacy focused cloud data storage provider to not know what I’m storing _at all_.

Let’s not beat about the bush, if someone wants to store information in a form that can’t be decrypted by Apple, they can. This is a stupid dragnet policy that won’t catch anyone sophisticated.

Apple has spent the last few years pitching themselves as the tech giant who actually cares about privacy. They seemed to be consciously building this image.

To now implement scanning of private information, and then try to sell this obvious 180-degree slippery-slope turnaround with the most weasel-worded “but think of the children” trope, is an insult to the customers’ intelligence.

I was a keen Apple consumer because I felt that even if their motivation was profit, this was a company who focused on privacy. It was a distinct selling point.

I certainly won’t be buying more Apple products.

For me, Apple lost the main reason to buy their stuff. If they are going to do the same thing everyone else is doing, I refuse to pay the premium they charge.

frosted-flakes · 5 years ago
If they were scanning images uploaded to iCloud on Apple's servers, no one would care. iCloud is not end-to-end encrypted and Apple provides governments access to iCloud data; everyone knows that, and other cloud providers already scan content for CSAM material. The difference is that Apple is doing this scanning on your phone/computer. Right now, they say that only images uploaded to iCloud will be scanned, but what's to stop them from scanning other files too? There's been a lot of pushback because this is essentially a back door into the device that governments can abuse.
hypothesis · 5 years ago
Note how they use your device to do the dirty work for them, instead of doing what everyone else is doing and scanning stuff on their servers.
chrismcb · 5 years ago
While Apple owns the servers, they shouldn't be legally liable, any more than a self-storage facility is liable for the items individuals store in their units.
zepto · 5 years ago
They aren’t scanning users devices. If you think this, there is definitely confusion in the information getting out.
david_shaw · 5 years ago
> They aren’t scanning users devices.

They are scanning images on iPhones and iPads prior to uploading those images to iCloud. If you're not uploading images to iCloud, your photos won't be scanned -- but if you are using iCloud, Apple will absolutely check images on your device.

From Apple's Child Safety page:

> Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.

> Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.

Source: https://www.apple.com/child-safety/
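The flow in those two quoted paragraphs can be sketched very roughly in Python. This is a toy illustration only: the real system blinds the hash set and uses private set intersection, so the device never learns the match result, and `neural_hash` and `KNOWN_HASHES` here are made-up stand-ins.

```python
# Toy sketch of the on-device step Apple's page describes: hash the image,
# compare against a known-hash set, attach a "safety voucher" to the upload.
# All names are hypothetical; the real PSI protocol hides the match bit.

KNOWN_HASHES = {"a1b2", "c3d4"}  # stand-in for the NCMEC-derived hash database

def neural_hash(image_bytes: bytes) -> str:
    # Placeholder for NeuralHash; a real perceptual hash survives resizing etc.
    return format(sum(image_bytes) % 65536, "04x")

def make_safety_voucher(image_bytes: bytes) -> dict:
    h = neural_hash(image_bytes)
    return {
        "hash": h,
        # In the real protocol this bit is cryptographically hidden on-device.
        "match": h in KNOWN_HASHES,
    }

voucher = make_safety_voucher(b"holiday photo")
print(voucher)  # {'hash': '0534', 'match': False}
```

The point of contention in the thread is exactly where `make_safety_voucher` runs: on hardware the user owns, rather than on Apple's servers.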

RussianCow · 5 years ago
You're splitting hairs unnecessarily. Apple is scanning users' photos on their devices. To say that they are not "scanning devices" because they are (currently) only targeting photos and not every single other part of the phone is unhelpful at best, and detracts from the point that this is a massive violation of their users' privacy. The exact wording here really doesn't matter as much as you think it does.
zug_zug · 5 years ago
This is one of those failure apologies, that's just making us dislike them even more.

I have no idea why they haven't done a 180 yet, this is a bigger failure than the butterfly keyboard. They are letting themselves become the symbol of technological dystopia in the public consciousness. Even an acquaintance who does construction was venting to me about how bad Apple's policy is and why she is getting a Pixel.

After entirely removing that feature and making a commitment to fight against that kind of future I feel like they owe two more apologies to get on my good side - one for screwing up this bad in the first place and one for insulting my intelligence with their handling of the outcry. This isn't 1990, you don't handwave a mistake this big.

jjcon · 5 years ago
> Even an acquaintance who does construction was venting to me about how bad apple's policy is

I overheard a group of women on the marketing team at my company talking about how creepy it is, and I’ve started having a lot of people ask me about it. It doesn’t seem contained to just techies at this point, but it is concentrated there. I do think it will continue to grow though; Apple has lost control of the narrative around their brand.

spideymans · 5 years ago
On TikTok there are plenty of videos now going around saying “Apple is scanning your phone to report you to the authorities”, with little to no nuance.

This is really, really bad for their brand.

kragen · 5 years ago
> I have no idea why they haven't done a 180 yet, this is a bigger failure than the butterfly keyboard.

Pressure from governments.

Fordec · 5 years ago
Bingo. They're not deliberately pushing this, they're just the public face on the initiative. You can complain about Apple all you like, but you're not given the choice to boycott the CIA.

The only reason we were even told this was being introduced in the first place is because it's being run on edge hardware (i.e., phones). One talk at DEFCON on weird resource/energy spikes on Apple devices and its existence leaks to the public domain, which is even worse PR. The only difference is that historically such government-level analysis has been conducted behind data center black boxes.

jjcon · 5 years ago
I would suggest that it is cooperation with governments in exchange for easement (watering down) of antitrust pushes
honksillet · 5 years ago
Why haven’t they done a 180?

I speculate their hand is being forced by one or more governments, and rather than admit that, they tried to sell it as best they could. Just speculation.

zsmi · 5 years ago
Regret is not an apology. It means Apple is stating they are disappointed by the confusion, and I am pretty sure that's true.
whatever_dude · 5 years ago
It's the typical "I'm so sorry you feel that way".
ffritz · 5 years ago
> I have no idea why they haven't done a 180 yet, this is a bigger failure than the butterfly keyboard.

Look at the stock. It barely moved (up).

d6e · 5 years ago
The stock isn't a like/dislike button. Apparently, the stockholders think that, regardless of what happens, Apple will still be here tomorrow. And to be fair, it's not like a significant portion of Apple customers will throw away their phones.
TheDudeMan · 5 years ago
Pixel will do the same, but they won't make the mistake of announcing the policy.
arvinsim · 5 years ago
Which is why this is so bad. Apple sets the trend for the mobile industry.

Expect everyone else to follow suit and not apologize for it because "Apple is doing it".

xeromal · 5 years ago
What's Pixel?
Valakas_ · 5 years ago
It's a typical narcissistic non-apology: "I'm sorry you're feeling that way and that I couldn't hurt you without getting away with it more easily."
xtat · 5 years ago
its not even an apology, its manipulation
ballenf · 5 years ago
I think the "confusion" was 100% intentional. That the two features (iMessage scanning & on-device spying pre-upload to iCloud) were intentionally released at the same time to make the whole thing harder to criticize in a soundbite.

Confusion is the best-case scenario for Apple because people will tune it out. If they had released just the on-device spying, public outcry and backlash would have been laser targeted on a single issue.

jchw · 5 years ago
Fanatics also have a tendency to try to latch onto whatever details may offer a respite from the narrative. The core problem here is that Apple is effectively putting code designed to inform the government of criminal activity on the device. It’s a bad precedent.

Apple gave its legendary fan base a fair few facts to latch onto, the first being that it’s a measure against child abuse, which can be used to equate detractors to pedophile apologists or simply pedophiles (these days, more likely directly to the latter). Thankfully this seems cliché enough to have not been a dominant take. Then there’s the fact that right now, it only runs in certain situations where the data would currently be unencrypted anyways. This is extremely interesting because if they start using E2EE for these things in the future, it will basically be uncharted territory; what they’re doing now merely lines up the capability to do that without actually doing it. Not to mention, these features have a tendency to expand in scope in the longer term. I wouldn’t call it a slippery slope; it’s more like an Overton window of how much surveillance state people are OK with. I’d say Americans on the whole are actually pretty strongly averse to this, despite everything, and it seems like this was too creepy for many people. Then there’s definitely the confusion, because of course Apple isn’t doing anything wrong; everyone is just confusing what these features do and their long-term implications.

Here’s where I think it backfired: because it runs on the device, psychologically it feels like the phone doesn't trust you. And because of that, using anti-CSAM measures as a starting point was a terrible misfire, because to users it just feels like your phone is constantly assuming you could be a pedophile and need to be monitored. It feels much more impersonal when a cloud service does it off in the distance for all content.

In practice, the current short-term outcome doesn’t matter so much as the precedent of what can be done with features like this. And it feels like pure hypocrisy coming from a company whose CEO once claimed they couldn’t build surveillance features into their phones because of pressures for it to be abused. It was only around 5 years ago. Did something change?

I feel like to Apple it is really important that their employees and fans believe they are actually a principled company who makes tough decisions with disregard for “haters” and luddites. In reality, though, I think it’s only fair to recognize that this is just too idealistic. Between this, the situation with iCloud in China, and the juxtaposition of their fight with the U.S. government, one can only conclude that Apple is, after all, just another company, though one whose direction and public relations resonated with a lot of consumers.

A PR misfire from Apple of this size is rare, but I think what it means for Apple is big, as it shatters even some of the company’s most faithful. For Google, this kind of misfire would’ve just been another Tuesday. And I gotta say, between this and Safari, I’m definitely not planning on my next phone being from Cupertino.

Krasnol · 5 years ago
> I’d say Americans on the whole are actually pretty strongly averse to this, despite everything, and it seems like this was too creepy for many people.

You mean the country which doesn't give a damn about privacy at all, because all those fancy corps are giving them toys to play with? You know, those companies whose business model is feeding on the world population's data? The country which puts a camera on its front door to film the neighbourhood 24/7? The country which has listening devices all over its homes in useless gadgets?

You have to be joking, or the scale you're using here is useless.

This whole thing will go by fast and there won't be much damage on the sales side. Apple is the luxury brand. People don't buy it for privacy. Most customers probably won't even understand the problem here.

The only thing we might be rid of are those songs of glory in technical spheres.

danudey · 5 years ago
> The core problem here is that Apple is effectively putting code designed to inform the government of criminal activity on the device. It’s a bad precedent.

This is wildly disingenuous.

Apple is putting code on the device which generates a hash, compares hashes, and creates a token out of that comparison. That is 100% of what happens on the device.

Once the images and tokens are uploaded to iCloud Photos, iCloud will alert Apple's team if 30+ of those security tokens show a match, and they will get access to only those 30+ photos. They will manually review those photos, and if they then discover that you are indeed hoarding known child pornography, they report you to the authorities.

Thus, it would be more accurate to say that Apple is putting on your device code which can detect known child pornographic images.

> And it feels like pure hypocrisy coming from a company whose CEO once claimed they couldn’t build surveillance features into their phones because of pressures for it to be abused.

This isn't a surveillance feature. If you don't like it, disable iCloud Photos. Yes, it could theoretically be abused if Apple went to the dark side, but we'll have to see what this 'auditability' that he was talking about is all about.

Honestly, with all of the hoops that Apple has jumped through to promote privacy, and to call out people who are violating privacy, it feels as though we should give Apple the benefit of the doubt at least until we have all the facts. At the moment, we have very few of the facts.
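The hash-compare-then-threshold flow this comment describes can be sketched as follows. Purely illustrative: in the real design the threshold is reportedly enforced cryptographically (threshold secret sharing), so Apple cannot inspect anything below it; the names and structure here are invented.

```python
# Illustrative sketch of the server-side review threshold: nothing goes to a
# human reviewer until 30 or more uploaded vouchers indicate a hash match.

REVIEW_THRESHOLD = 30  # figure cited in Apple's announcement

def should_trigger_review(vouchers):
    matches = [v for v in vouchers if v["match"]]
    return len(matches) >= REVIEW_THRESHOLD

print(should_trigger_review([{"match": True}] * 29))  # False: below threshold
print(should_trigger_review([{"match": True}] * 30))  # True: review triggered
```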

arvinsim · 5 years ago
With enough eyeballs, all disinformation/bugs are shallow.
jimbob45 · 5 years ago
Do you have a source on the iMessage thing? I don’t remember seeing anything about iMessage but maybe I failed to adequately read the press release.
kemayo · 5 years ago
It's a feature that only applies to kids under 18 who're in a family group, whose parents turn it on. It warns the kid before letting them see an image which machine-learning thinks is nudity. If the kid is 12 or under, their parents can be notified if they choose to see it. It apparently does no reporting to anyone apart from that parental notification.

Check the section "WHAT IS APPLE DOING WITH MESSAGES?" in this article: https://www.theverge.com/2021/8/10/22613225/apple-csam-scann...
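The rules in that description can be written out as a small decision function. All names here are hypothetical; the code only encodes the behavior as described (warn opted-in under-18s, notify parents only for kids 12 and under who choose to view).

```python
# Hypothetical encoding of the described Messages feature. Not Apple's code;
# it just makes the described rules explicit.

def handle_incoming_image(age, in_family_group, feature_enabled,
                          flagged_as_nudity, kid_chooses_to_view):
    result = {"warn_kid": False, "notify_parents": False}
    if not (feature_enabled and in_family_group and age < 18 and flagged_as_nudity):
        return result
    result["warn_kid"] = True  # every opted-in minor gets the warning
    if age <= 12 and kid_chooses_to_view:
        result["notify_parents"] = True  # no reporting beyond the parents
    return result

print(handle_incoming_image(10, True, True, True, True))
# {'warn_kid': True, 'notify_parents': True}
print(handle_incoming_image(15, True, True, True, True))
# {'warn_kid': True, 'notify_parents': False}
```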

throw7 · 5 years ago
"The system could only match "exact fingerprints" of specific known child sexual abuse images, he said."

This disinfo really angers me. That is the exact opposite of what I've read up till now: people talking about "NeuralHash" and it being able to detect if the image is cropped/edited/"similar". So what is the truth?

LeifCarrotson · 5 years ago
He carefully avoided saying that the image itself is the same. The exact fingerprint is the same, yes, but the fingerprint is just a hash of the actual image. Disinformation indeed!

The whole point of the system is that you get a matching hash after mirroring/rotating/distorting/cropping/compressing/transforming/watermarking the source image. The system would be pretty useless if it couldn't match an image after someone, say, added a watermark. And if the algorithm was public, it would be easy to bypass.

The concern, of course, is that all of this many-to-one hashing might also cause another unrelated image to generate the same fingerprint, and thereby throw an innocent person to an unyielding blankface bureaucracy who believes their black-box system without question.
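A toy "average hash" makes the point concrete: the hash survives a uniform edit (here, brightening) while an unrelated image hashes differently, and nothing prevents an unrelated image from colliding. This is not NeuralHash, just the simplest member of the same perceptual-hash family.

```python
# Toy average hash: bit i is 1 if pixel i is brighter than the image's mean.
# Uniform edits (brightness shifts, for example) leave the bit pattern intact.

def average_hash(pixels):
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

original   = [[10, 200], [220, 30]]
brightened = [[40, 230], [250, 60]]   # same image, uniformly brighter
different  = [[200, 10], [30, 220]]   # unrelated image

print(average_hash(original) == average_hash(brightened))  # True: survives the edit
print(average_hash(original) == average_hash(different))   # False
```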

734129837261 · 5 years ago
It simply means that they can have whatever the hell kind of method they use to identify specific images, and the scary part is: there IS an error-margin built-in because otherwise, as you said, this tech would be pretty useless.

"Find all images and tag them if they look like this fingerprint" doesn't mean that. It means: "Find all images and tag them if they look 80% like this fingerprint".

Which also means that it will allow governments to upload photographs of people's faces and say: "Tag anyone who looks like this".

Worse, this will allow China to track down more Uyghurs, find people based on guides in the form of images that are spread around to stay safe from the Chinese government, and countries like Saudi Arabia can start looking for phones with a significant amount of atheist-related images, tracking down atheists, and killing them. Because that's what that country does.

intricatedetail · 5 years ago
These perceptual hashes do have a high number of false positives. That's why they employ AI to discard images that don't have certain features from the pool, to minimise the risk. But that method, without an actual human checking manually, is in general a recipe for disaster.
mLuby · 5 years ago
> Apple decided to implement a similar process, but said it would do the image-matching on a user's iPhone or iPad, before it was uploaded to iCloud.

Is this list of hashes already public? If not, it seems like adding it to every iPhone and iPad will make it public. I get the "privacy" angle of doing the checks client-side, but it's a little like verifying your password client-side. I guess they aren't concerned about the bogeymen knowing with certainty which images will escape detection.

btown · 5 years ago
It's all on pages 4 and 5 of https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...

> The main purpose of the hash is to ensure that identical and visually similar images result in the same hash, and images that are different from one another result in different hashes. For example, an image that has been slightly cropped or resized should be considered identical to its original and have the same hash. The system generates NeuralHash in two steps. First, an image is passed into a convolutional neural network to generate an N-dimensional, floating-point descriptor. Second, the descriptor is passed through a hashing scheme to convert the N floating-point numbers to M bits. Here, M is much smaller than the number of bits needed to represent the N floating-point numbers. NeuralHash achieves this level of compression and preserves sufficient information about the image so that matches and lookups on image sets are still successful, and the compression meets the storage and transmission requirements

Just like a human fingerprint is a lower-dimensional representation of all the atoms in your body that's invariant to how old you are or the exact stance you're in when you're fingerprinted... technically Federighi is being accurate about the "exact fingerprint" part. The thing that has me and others concerned isn't necessarily the hash algorithm per se, but rather: how can Apple promise to the world that the data source for "specific known child sexual abuse images" will actually be just that over time?

There are two attacks of note:

(1) a sophisticated actor compromising the hash list handoff from NCMEC to Apple to insert hashes of non-CSAM material, something Apple cannot independently verify since it does not have access to the raw images. At minimum, this could be a denial-of-service attack causing e.g. journalists' or dissidents' accounts to be frozen temporarily by Apple's systems pending appeal

(2) Apple no longer being able to have a "we don't think we can do this technically due to our encryption" leg to stand on when asked by foreign governments "hey we have a list of hashes, just create a CSAM-like system for us"

That Apple must have considered these possibilities and built this system anyways is a tremendously significant breach of trust.

mzs · 5 years ago
Thanks. Here's how one system works; it calculates dozens of floating-point moments:

http://www.fmwconcepts.com/misc_tests/perceptual_hash_test_r...

JohnCurran · 5 years ago
That "exact fingerprint" is, in my opinion, intentionally confusing.

This DaringFireball[0] article states the goal of the system is to "generate the same fingerprint identifier if the same image is cropped, resized, or even changed from color to grayscale."

So while the fingerprint may be "exact", it's still capable of detecting images that have been altered in some way.

[0] https://daringfireball.net/2021/08/apple_child_safety_initia...
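A toy version of that invariance, using a plain "average hash" (far simpler than NeuralHash, and only an illustration of the general idea):

```python
def average_hash(img, size=4):
    # img: a square 2-D list of grayscale values. Downsample to
    # size×size by block-averaging, then emit one bit per cell:
    # 1 if the cell is at or above the overall mean, else 0.
    n = len(img)
    block = n // size
    cells = []
    for by in range(size):
        for bx in range(size):
            total = sum(img[by * block + y][bx * block + x]
                        for y in range(block) for x in range(block))
            cells.append(total / (block * block))
    mean = sum(cells) / len(cells)
    return tuple(1 if c >= mean else 0 for c in cells)

def upscale2x(img):
    # Nearest-neighbour 2x upscale: a "resized" copy of the image.
    out = []
    for row in img:
        wide = [v for v in row for _ in (0, 1)]
        out.append(wide)
        out.append(list(wide))
    return out

img = [[(7 * x + 13 * y) % 256 for x in range(8)] for y in range(8)]

# The resized copy block-averages back to the identical 4×4 grid,
# so it produces the exact same "fingerprint" as the original.
assert average_hash(upscale2x(img)) == average_hash(img)
```

So "same fingerprint" and "same image" are deliberately not the same claim.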

foobiekr · 5 years ago
An exact match of a perceptual hash is basically deliberately misleading. The entire point of a perceptual hash is that there are an almost unlimited number of images which it will match "exactly."

But hey, I'm just one of the screeching voices of the minority.
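The pigeonhole arithmetic makes the point starkly. Assuming a 96-bit output (a figure that has been reported for NeuralHash, but treat it as an assumption here), even tiny grayscale images vastly outnumber the hash values:

```python
# "Exact match" of a 96-bit perceptual hash still leaves an astronomical
# number of distinct images sharing each hash value, by pigeonhole alone.
HASH_BITS = 96                # assumed hash output size
IMAGE_BITS = 64 * 64 * 8      # even a tiny 64x64 8-bit grayscale image

collisions_per_bucket = 2 ** (IMAGE_BITS - HASH_BITS)
print(f"~10^{len(str(collisions_per_bucket)) - 1} images per hash value")
```

And that is before the hash deliberately maps visually similar images together.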

intricatedetail · 5 years ago
It will also match completely different images; that's why there's a "neural" bit, to discard images from the pool of matches that e.g. don't contain nudity.
gizdan · 5 years ago
The truth is they're rephrasing what was already known. They're going to match fingerprints of pictures against a database. Every picture. This was widely reported. What confusion they're referring to I don't know, because they're saying exactly what has been reported.
notJim · 5 years ago
> Every picture

This is the confusion, it's only photos being uploaded to iCloud.

blintz · 5 years ago
The simple summary is: NeuralHash is not a cryptographic hash function. It's a private neural network trained on some images. We have no guarantees of its difficulty to reverse, find collisions for, etc. The naming of it as a 'hash' has confused people (John Gruber's post comes to mind) into thinking this is a cryptographic hash. It simply is not.
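The contrast is easy to show on the cryptographic side, where any change avalanches into an unrelated digest, exactly the property a perceptual hash deliberately gives up:

```python
import hashlib

a = b"the same photo, byte for byte"
b = b"the same photo, byte for byte."   # one trailing byte changed

# A cryptographic hash gives totally unrelated digests for near-identical
# inputs; a perceptual hash is built to do the exact opposite, which is
# why the usual collision/preimage guarantees don't carry over.
digest_a = hashlib.sha256(a).hexdigest()
digest_b = hashlib.sha256(b).hexdigest()
assert digest_a != digest_b
```

No such guarantee exists, or is even wanted, for a similarity-preserving hash like NeuralHash.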
patrickthebold · 5 years ago
If I had to guess, cropping and other transformations result in the same (exact) fingerprint. So different images but same fingerprints.

Of course, that's just a nasty way to imply that the images match exactly.

totetsu · 5 years ago
Just wait till you download some meme image to upload to your reaction-meme folder on iCloud, where some troll has kept the background of a CSAM image and edited meme text over the bits that might have made you aware of its origins. Will that match?
laurent92 · 5 years ago
Who cares, the NCMEC database is certainly full of unreviewed material, given even their employees can’t automatically have access to it. For any dystopian state, the goal is to have as many false positives as possible in the NCMEC database, to be able to legitimately have your photos uploaded to their headquarters.
elliekelly · 5 years ago
Does it make a difference? My iPhone shouldn’t do anything to or with my photos unless and until I direct it to. Scanning, hashing, whatevering — Apple doesn’t get to decide to do any of it. I do. And only I do.
ddlutz · 5 years ago
And we all know software never has bugs. Somebody is going to get arrested over this feature for some benign photo one day, I guarantee it.
nomel · 5 years ago
How so? It would require passing the threshold to get human review, so actual material + false flags > threshold. This should probably result in the person getting in trouble. The case of false flags alone exceeding the threshold should not result in any trouble, since it would then go through human review.
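The threshold is doing a lot of work in that argument. A back-of-the-envelope sketch with purely assumed numbers (Apple never published a per-image false-match rate, and the threshold value here is illustrative too):

```python
import math

def prob_at_least(k_min, n, p, k_max=60):
    # Binomial tail: probability of seeing k_min or more false matches
    # among n independent photos, each falsely matching with rate p.
    # (Terms past k_max are negligible at these parameter values.)
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(k_min, k_max))

# Purely illustrative figures; none of these are confirmed by Apple.
p_false = 1e-6       # assumed per-image false-match rate
photos = 10_000      # assumed size of a photo library
threshold = 30       # assumed review threshold

# Under these assumptions, the chance of an innocent library of false
# flags even reaching human review is vanishingly small.
print(prob_at_least(threshold, photos, p_false))
```

Of course, that calculation assumes matches are independent and the rate estimate is honest; a troll seeding near-collision images breaks the independence assumption entirely.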
pcmoney · 5 years ago
WTF is he talking about? There is no "confusion": they are scanning your phone for data they have decided is bad. Yes, today it is allegedly CP; tomorrow it is anything.

“The system could only match "exact fingerprints" of specific known child sexual abuse images, he said.”

Or like whatever they, the US govt, or any govt where they want to make money (such as China) wants. Is anyone auditing the blacklist? Is it publicly reviewable? (Since it contains CP of course not)

nowherebeen · 5 years ago
And when a government wants to scan for “illegal” images, they will just fall back to the argument that it’s the law there. It’s a terribly slippery slope.
734129837261 · 5 years ago
What's worse than child pornography in, say, Saudi Arabia? Atheism is. They can force Apple to tag accounts that have images that are popular in atheist circles (memes, information, etc.) and track these people down. The penalty for that in Saudi Arabia is death.

China can start finding Uyghurs based on the images they tend to share. If we're unlucky (as a world), they can even start searching for particular individuals.

"Save the children" is just the classic political ploy to get a ruling through that's just a precursor for evil things to come.

I'm absolutely disgusted by Apple.

kemayo · 5 years ago
I don't see how this is any different from what Apple could already have been forced to do. If the argument is that they're going to knuckle under to an abusive request involving this system, then they'd presumably have done so under the prior status quo which was no more secure.

They already were storing the photos unencrypted (or with keys available, at least) on their servers, so any government that was able to push them to add a hash to this scanning system could have gotten them to scan for something in iCloud.

China, in particular, could definitely already be doing that, since China made Apple host all iCloud data for Chinese users on servers inside China that're operated by a Chinese company. See: https://support.apple.com/en-us/HT208351

kemayo · 5 years ago
It's worth bearing in mind that the human review step does mean that a government can't just slip stuff in without securing Apple's cooperation (including training their review staff about all the political content they have to look for). Otherwise the reviewers would presumably just go "huh, that's weird, this Winnie the Pooh meme definitely isn't child porn" and move on.

Can a government secure Apple's cooperation in that? I have no idea. But it does make a useful subversion of the hash database a more complicated thing to accomplish.

elliekelly · 5 years ago
In some ways I think human review is even creepier. I don’t want an algorithm looking at my private photos, but I _definitely_ don’t want some rando “reviewer” looking at them! But I guess it all comes down to the same thing: I don’t want anyone looking at my photos unless I’ve deliberately shared my photos with them.
makeitdouble · 5 years ago
It’s been a while since this was discussed, but aren’t there laws giving government special agencies the power to force private entities to cooperate, potentially with a ban on ever officially acknowledging that the request happened in the first place?

And if memory serves, it doesn’t need to be Apple as a whole cooperating; a single employee with oversight power could be enough.

zionic · 5 years ago
Due to existing laws, Apple will be forced to make their human reviewers be active LEOs.

Their approval/report rate will make a FISA court judge blush.

farmerstan · 5 years ago
Unless some exec loses their job over this, this entire sequence of events was already playbooked by Apple. They knew to wrap the feature with CSAM to hopefully quell the protests, and also to add two features at the same time, so they could backpedal in case pushback was strong, and then they could blame “misunderstanding”. Even though they are being purposefully obtuse about the “misunderstanding” because there is none.

It’s a perfectly planned PR response but no one except the biggest sheep is buying it.

tgsovlerkhgsel · 5 years ago
I highly doubt it.

The other feature they're packaging with this (nudity warnings for children/teenagers) should be relatively uncontroversial. It seems well designed and respects the user's privacy: It shows a bypassable warning on the device, only sends a warning to parents for children up to 12 years old, and only if the child chooses to view it, and only after disclosing that the parents will be notified. I don't think there is much criticism they'd catch for that, no protests to quell.

On the other hand, the proposal that they're (rightfully) under fire for now is something that they can't easily back out of (they will immediately be accused of supporting pedophiles), and it's basically a "do or don't" proposal, not something that they can partially back out of. The press is also incredibly damaging to the "iPhones respect your privacy" mantra that's at the core of their current PR campaign.

I don't think they expected this level of pushback.

benhurmarcel · 5 years ago
> the proposal that they're (rightfully) under fire for now is something that they can't easily back out of

They could just say “Following the unexpected pushback, we will scan iCloud content on server like all other major cloud providers. Unfortunately this also forces us to shelve plans for end to end encryption for photos”.

Clubber · 5 years ago
>The other feature they're packaging with this (nudity warnings for children/teenagers) should be relatively uncontroversial.

To me that's the worst of the two. From what I understand, it uses AI trained to detect nude photos. This is much more likely to produce false positives than a hash compare.

Not only is it less accurate by nature, it's a backdoor that can be later used to scan the entire device for whatever photos it's trained to find and report to whoever it's coded to report to.

tinalumfoil · 5 years ago
The issue with things like this is that it's often a tradeoff between making the public happy and making the government happy. If the initiative is partisan, it might make sense to make the public happy. If it's bipartisan, you make the government happy, and if you're lucky the government/political complex will eventually alter public opinion until you're not really fighting the public anymore.

The PR show is kind of beside the point.

vouchmeplox · 5 years ago
>The issue with things like this is, it's often a tradeoff of making the public happy vs making the government.

The company should only be concerned with following the law, not earning brownie points for extralegal behavior. Making the government happy shouldn't be a thing in a country ruled by law.

rcfaj7obqrkayhn · 5 years ago
Of course it is planned; they even dropped this news on Friday evening, no less.
kemayo · 5 years ago
They announced the whole thing back on Monday, though. If they were trying to hide it, the initial announcement would have been buried. Burying the "huh, we didn't expect this backlash" comment makes no sense.
wilg · 5 years ago
So many people in this thread are convinced this whole thing is intentionally malicious, that Apple is doing this because they want to enable government spying, and they are intentionally using child sex abuse as a way of trying to make it palatable in a PR battle.

I don't think that is the most likely situation at all.

Apple has been, as part of their privacy initiatives, trying to do as much as possible on the device. That's how they have been defining privacy to themselves internally. Then someone said "can we do something about CSAM" and they came up with a pretty good technical solution that operates on device and therefore, to them, seemed like it would not be particularly controversial. They've been talking about doing ML and photo scanning object recognition on device for years, they're moving much of Siri to on-device in iOS 15, all as part of their privacy initiatives.

It seems to have backfired in that people actually seem to prefer scanning in the cloud to on-device scanning for things like this, because it feels less like a violation of your ownership of the device.

I think the security arguments about how this system can be misused are compelling and it's a fine position to be strongly against this, but I don't know that there's good justification that Apple has some ulterior motive and is faking caring about privacy. I think they were operating with a particular set of assumptions about how people view privacy that turned out to be wrong and they are genuinely surprised by the blowback.

firebaze · 5 years ago
That's one of the few good aspects of a legendary fuck-up like this: you learn about people defending it.

People defending CSAM should go to hell, fast. But have we already picked all the low-hanging fruit? Did we stop Jimmy Savile? Did we put all the clerical actors behind bars? Did we extinguish the child porn network behind Marc Dutroux (https://en.wikipedia.org/wiki/Marc_Dutroux)?

And even if we did, would that be enough of an excuse to implicitly accuse everyone? My spouse's family (well-off, so using iDevices) took photos of their young kids playing, partially naked, at the sea. They are now frightened that their photos could be stolen by someone and marketed as child porn.

So unbelievable.

notJim · 5 years ago
> They are now frightened that their photos could be stolen by someone and marketed as child porn.

That sounds bad, someone high up at Apple should do an interview clarifying that that's not what's happening!

system2 · 5 years ago
It doesn't matter how technically well done this is. I do not want my device poking through my files and sending them to AI software to make a decision. It makes me uncomfortable.

This is malicious. I do not want them to touch my photos or anything personal. I paid for this device, now it is doing things against my will.

jachee · 5 years ago
It doesn’t happen against your will. You still have full control over whether scanning happens.

Simply disable iCloud Photo Library, and nothing gets scanned.

ipv6ipv4 · 5 years ago
I agree it is likely not malicious at all. It’s the result of koolaid in an echo chamber. And inertia at this point.

However, I also think this is the poster child of the proverb that the road to hell is paved with good intentions.

Now Apple needs to cancel this misguided initiative and never speak of it again, if they want to salvage some of their reputation.

pcurve · 5 years ago
I don't think Apple has an insidious reason, but their implementation can come across as a betrayal.

Unlike an Android phone, the iPhone will know beforehand that you're uploading a potentially problematic image. Yet it will not alert you or block you from doing so.

craftinator · 5 years ago
> because it feels less like a violation of your ownership of the device.

It doesn't just feel like a violation, it is one. This opens doors for a number of future abuses of power, and abuses of rights.

The average user has no way of knowing what's in the database of hashes being compared against, and we have no way of controlling what goes into those databases. If some party in the US government decided that they didn't like the Quran, put a million pictures of it in the database, and served the Apple reviewers with a gag order or some other nonsense, we'd have an '80s dystopian blockbuster as everyday life in our country.

My device, my property, my rules. Apple doesn't get to do a stop and frisk, even if it's "for the good of the children".

Spivak · 5 years ago
My device my property my rules is pretty rich talking about a platform where you can’t run your own apps without Apple’s approval, you have no control over how the OS behaves, and there is built-in DRM.

Your phone has been Apple’s device since the moment you bought it and turned it on.

insulanus · 5 years ago
Actually, I'm with you in that I think Apple's motives are different than people think, but I think yours are incorrect as well.

> Apple has been, as part of their privacy initiatives, trying to do as much as possible on the device. That's how they have been defining privacy to themselves internally. Then someone said "can we do something about CSAM" [...]

As a separate issue, many people in the company certainly do care about privacy, and that may go all the way to to Tim Cook. Who knows.

What is much more important to Apple the company, though, is making money. Governments have been hounding them for years about letting them spy on users. And they have painted themselves into a bit of a corner, by having the most secure phones.

Now the government comes to them with an offer they can't refuse, cloaked in child porn motivations. I believe many (most?) of the people involved are sincere. It's clear they have tried to make the least invasive system that still does what the government wants.

But that's not good enough in the crazy connected cyber-world we find ourselves in today.

Apple doesn't have a motivation to do this themselves. But they will do what they calculate they need to do.

cwkoss · 5 years ago
Apple has a huge potential profit motive. Once they roll this out for US users, they can sell the exact same capabilities to authoritarian regimes for detecting subversive images, images of warcrimes, etc.

China would happily pay billions of dollars per year for this capability.

sagarm · 5 years ago
> because it feels less like a violation of your ownership of the device.

Agreed that this was not intended to be malicious. Apple has always been pretty clear that they should decide what happens on their devices. This sort of on-device scanning that doesn't serve the user is just the latest example of it, and one that people who would never be affected by the code-signing restrictions can relate to.

neop1x · 5 years ago
It may not help against child abuse enough if it does not detect newly-taken child pornography. Fast forward a couple of years and machine learning will be scanning all the photos you take and view. A little further, and it will be forbidden to take photos of many things, and there will be no-photo places. Apple and governments will dictate what is right and wrong to photograph and view. They are doing it for the safety of everyone, right?
gerash · 5 years ago
I doubt most think this is malicious but Apple did preach "what happens on an iphone stays on the iphone" which aged terribly given this feature.

Apple has also benefited from the perception that Google (their smartphone rival) spies on people by collecting location information, whereas I am sure Apple also crowdsources iPhone locations to improve its Maps and road-traffic data.

willio58 · 5 years ago
I agree with you. I’m all for privacy but have no issues whatsoever with companies scanning for CSAM. I do not feel this is an invasion of my privacy, because I know how hashing works and I know I do not have CSAM on my device.
blintz · 5 years ago
Do you know how NeuralHash works? NeuralHash is not a cryptographic hash. Unless you're an Apple employee, you can't know; the model is private and not inspectable.
thephyber · 5 years ago
I think some people here have similar feelings, but as other replies point out, this particular hash is not cryptographic, and we have to take Apple’s word for it that false positives are extremely rare.

Also worth mentioning that the original corpus of data used to build the hashes isn’t inspectable, so we don’t know how much garbage exists on the input side.

A recent blog post suggests that the corpus of CSAM isn’t even very comprehensive, so the value of this system is questionable.

gigel82 · 5 years ago
> I’m all for privacy but have no issues whatsoever with companies scanning

That's an oxymoron.

jachee · 5 years ago
Exactly! There has been so much FUD, conspiracy theory, and fear-mongering.

None of the usual anti-regulation apologists have pointed out that Apple shouldn’t be forced to download and host potentially-illegal material just to determine whether or not it’s actually illegal.

This whole program is their intelligent solution to protecting as much user privacy as possible while still being compliant with the law. On-device hashing is actually pro privacy compared to in-the-cloud scanning (which all other cloud hosting providers are also required to do).

m-ee · 5 years ago
This is not about compliance, the relevant law specifically says that companies are not required to proactively scan for CSAM.
jmull · 5 years ago
I believe I understand the distinction between the two features perfectly well.

Personally, I don't have a problem with the parental control feature in Messages (it's pretty clear what it does and the user can decide whether to use it or not, or the parent for younger kids -- that's exactly as it should be).

I do have a problem with the feature where they scan images on my phone to match against a database of images. To be clear, here's a list of things that don't make me feel better about it: that it scans only a certain subset of images on my phone; that the technology parts of it are probably good; that NCMEC maintains the database of images (is there any particular reason to believe the database is near perfect and has all appropriate quality controls in place to ensure it remains so?)

There are several issues about this that Apple does not address. A big one for me is the indignity and humiliation of them force scanning my phone for CP.

Here's a hypothetical for Craig Federighi and Tim Cook to consider:

Suppose we know there are people who smuggle drugs on airplanes, on their person, for some terrible purpose like addicting children or poisoning people. If I ran an airport, I could say: to stop this, I'm going to subject everyone who flies out of my airport to a body-cavity search. Tim and Craig, are you OK with this? If I can say, "Don't worry! We have created these great robots that ensure the body-cavity searches are gentle and the minimum needed to check for illegal drugs," does it really change anything or make it more acceptable to you?

kemayo · 5 years ago
> There are several issues about this that Apple does not address. A big one for me is the indignity and humiliation of them force scanning my phone for CP.

Are you okay (conceptually, assuming a perfect database and hashing function) with them scanning pictures uploaded to iCloud for this material if the scanning happens on their servers? Or is this a complete "these pictures should never be scanned, regardless of where it happens" position?

If the former, I personally don't feel a distinction between "a photo is scanned immediately before upload" and "a photo is scanned immediately after upload" is very meaningful. I'd be more concerned if there wasn't a clear way to opt-out. I acknowledge that there's room to disagree on this, and maybe I'm unusual in drawing my boundaries where I do.

If the latter... I think that ship has sailed. Near as I can tell, all the major cloud platforms are scanning for this stuff post-upload, and Apple was a bit of an outlier in how little they were doing before this.

OrvalWintermute · 5 years ago
I see this as fraught with constitutional issues.

If Congress has to obey the Constitution, then they cannot create an organization which they control, and then push for that organization to execute functions they cannot perform by getting in cahoots with industry.