My opinion:
1. Every pedophile knows this system exists, so I don't think it will be useful in fighting those monsters, or only marginally;
2. Anyway, is that even legal? If some lunatic stores material on his Apple hardware, isn't that an illegal search, unusable in a court of law?
3. Child abuse is often used as a Trojan horse to introduce questionable practices. What if:
- the system is used to look for dissidents: I search for people who have a photo of the Tiananmen Square protests on their PC, for example;
- for espionage: I have the hash of some documents of interest, so every PC holding that kind of document becomes a valuable target;
- profiling people: you have a computer virus sample on your PC -> security researcher/hacker;
I think the system is prone to all kinds of privacy abuse.
4. This could be part of the previous point, but since I think it's the final and real reason for this system's existence, I'll give it its own section: the fight against piracy. I think one of the real reasons is to discourage the exchange of illegal multimedia material and enforce copyrights.
For the reasons listed, I think it's a bad idea. Let me know what you think.
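The worry in point 3 is easy to make concrete: a hash-matching scanner is content-agnostic, so the same few lines of code serve whatever list they are fed. A minimal sketch (the watchlist and paths are made up, and real CSAM systems use perceptual hashes like PhotoDNA or NeuralHash rather than exact cryptographic hashes):

```python
import hashlib
from pathlib import Path

def scan(root, watchlist):
    """Flag every file under `root` whose SHA-256 digest is in `watchlist`.

    Nothing here knows what the hashes represent: the same loop flags
    abuse imagery, protest photos, or leaked documents, depending only
    on the list it is fed.
    """
    hits = []
    for path in sorted(Path(root).rglob("*")):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            if digest in watchlist:
                hits.append(path)
    return hits
```

Swapping the list of hashes is the only change needed to repurpose the scanner for any of the scenarios above.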
5. The system can easily be abused by governments or malicious actors to frame innocent people. People merely suspected of keeping such images are de facto punished and stripped of rights without ever standing before a judge or getting a conviction.
This is my primary concern. It will become a weapon to destroy the life of anyone targeted by someone with middling or better hacking skills. A sort of digital "swatting" that makes using Apple products a no-go for anyone with cyber-enemies (one can't opt out of Apple ID and still get security updates).
Google has been scanning your entire account for kiddie porn for the past decade.
>a man [was] arrested on child pornography charges, after Google tipped off authorities about illegal images found in the Houston suspect's Gmail account
I can't reply to or upvote ~stalkersyndrome's response to add my applause, so I'll do it here instead. He's right. It's inconvenient, and people don't like reading it, which is why it's downvoted to dead. Alas, this is the same reason people avoid counseling: they haven't worked up the courage to be honest with themselves yet. We'll get there, I hope.
Perhaps we should also raise again the question of who controls one's own computer, because any abuse starts there.
Only full control over one's own device can prevent abuse, especially when the device comes anywhere close to the definition of being personal. You should be able to install your own software on a personal device, including the OS and the BIOS/firmware.
Not only would this be marginal, it also wouldn't necessarily catch the real "monsters". Finding someone with old, already-known images doesn't necessarily mean you've found someone who actually abuses children. I think about this in a similar way (not exactly) to drugs: just because a person gets busted with drugs doesn't mean they're a dealer or a manufacturer.
This is not to say there isn't more active, real-time material in these databases that, with enough searching, could lead back to a perpetrator and maybe even to a victim. It just seems that would be far more marginal, and it's generally what concerns me about these issues. For me it's more important to protect children than to bust some weirdos for looking at the wrong porn (the two can be related, and I understand it's not as cut-and-dried as we like to believe); further, if it keeps said weirdo from actually harming a child, let them have it. We allow these databases to exist, presumably, for the same reason: the idea that we can prevent future victims.
> 2. Anyway, is that even legal? If some lunatic stores material on his Apple hardware, isn't that an illegal search, unusable in a court of law?
Yes, it's considered legal. Apple reviews the content first. Courts say this means it is not an illegal government search. It's a search by a private party, who then manually decides to notify the government.
No, it's not. At least not here in Germany. By law, even police officers are not allowed to look at child porn; the only institution explicitly allowed to do so is the BSI.
The rest of the population implicitly incriminates itself by looking at (not owning) child porn, and that includes Apple's legal entity and its employees.
See [1] for 184b Strafgesetzbuch
I'm trying to point out that with this move Apple has bluntly decided to ignore a whole lot of countries and their federal laws, which is not something I would embrace, even if they had good intentions.
I think this is a bad move by Apple even if the point is to set up E2EE later. However, one thing everyone seems to forget is that all these pictures were already being sent unencrypted to iCloud. ALL of the same issues already exist today; the photos were already being scanned, and we heard no outcry. ALL of the same loopholes and unreasonable warrants can be used against you today with all of the unencrypted data they have on their servers right now.
The one thing that occurred to me is that this almost seems like CYA, Section 230 protection in disguise. There have been more discussions about Big Tech and 230, and this is one way to say "Look, we are compliant on our platform. Don't remove our protections or break us up, we are your friend!" It also shouldn't be too surprising given how Apple has behaved in China. They will only push back against the government up to the point where it starts to affect profits.
Of course it would be possible to implement content search, profiling and reporting mechanisms for such content, but this seems to be a singularly bad platform for that sort of search.
The image profiles are part of the OS, so there's no mechanism to deliver different image profiles per country. Also, when the threshold number of matching images is reached, the matches are reported to a manual reviewer at Apple, not to a government. And it only checks images on upload to iCloud Photos storage.
So of course each of these limitations could be changed, but you'd really need to change all of them, and at that point you've created a completely different system. There's no simple change to this system that would suddenly turn it into a snitch for, say, China or Saudi Arabia.
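The threshold gating described above can be sketched as follows. This is a toy model under stated assumptions: the published design uses NeuralHash perceptual hashes plus threshold secret sharing, so the server cryptographically learns nothing below the threshold, whereas here a plain counter and set lookup stand in for all of that:

```python
class ThresholdMatcher:
    """Toy model of threshold-gated match reporting.

    The real system hides matches cryptographically (threshold secret
    sharing); here a plain counter stands in for that property.
    """

    def __init__(self, known_hashes, threshold):
        self.known = set(known_hashes)
        self.threshold = threshold
        self.matched_ids = []

    def upload(self, image_hash, image_id):
        """Return the match set for human review once the threshold is
        crossed; return None (nothing revealed) before that."""
        if image_hash in self.known:
            self.matched_ids.append(image_id)
        if len(self.matched_ids) >= self.threshold:
            return list(self.matched_ids)
        return None
```

The point the sketch tries to capture is that no single match is ever surfaced on its own; only crossing the per-account threshold triggers human review.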
I've seen exactly the same objections raised every time any kind of device content search has become mainstream. Back in the 90s it was virus checking (do you trust the AV company? what if they were bribed by the content companies?), then full-device indexing and search (do you trust the OS vendor? what if they're in league with the government?). I'm very surprised this didn't blow up when Apple implemented ubiquitous image text recognition; maybe it did. AV and device-indexing mechanisms, which are ubiquitous, seem like a far more vulnerable target for such requirements.
So I don't really buy the slippery-slope argument. In theory any government could pass a law requiring any company operating in its jurisdiction to do anything, with an implementation suited to that actual purpose. Of course, this mechanism is motivated by laws in the US, so it's a perfect example of exactly that: a completely new system, not a slippery-slope subversion of an existing one. The real slippery slope here is legislative, not technical, and I think that should be far, far more concerning.
I do think the legal and moral questions about this mechanism are legitimate. It would make more sense for Apple to scan photos in their cloud storage on the server rather than on upload. I understand there are theoretical privacy benefits to users in this implementation, but the optics of having users' devices snitch on them are all wrong.
>Back in the 90s it was virus checking (do you trust the AV company? what if they were bribed by the content companies?), then full-device indexing and search (do you trust the OS vendor? what if they're in league with the government?)
Those are examples of companies choosing to do something as a selling point of their software, for the benefit of the end user, with people worrying it could aid the government down the line if the company changed its mind.
Apple's content review change is explicitly FOR reporting people to the police, in a way that can later be expanded beyond its currently stated purpose (child porn).
>I'm very surprised this didn't blow up when Apple implemented ubiquitous image text recognition.
I'm not a fan of that stuff anyway, but if it stays on my local device I don't tend to care about image recognition; it's only when it involves communicating information from MY hardware to THEIR servers that I get antsy.
Apple should just scan the pictures that are in iCloud (their servers). They simply assumed that having the iCloud option enabled on your device gave them the right to do the scan on your phone/computer.
I want to also point out that A/V companies never said they were going to scan for child abuse images on your computer and report you if they found any.
>The image profiles are part of the OS so there's no mechanism to deliver image profiles separately for different countries
Haven't Apple already said it WILL be country specific?
>Apple’s new feature for detection of Child Sexual Abuse Material (CSAM) content in iCloud Photos will launch first in the United States, as 9to5Mac reported yesterday. Apple confirmed today, however, that any expansion outside of the United States will occur on a country-by-country basis depending on local laws and regulations.
I think they'd need to be country-aware at least, otherwise the FBI or whoever will get reports for all people on earth when they presumably don't need them for anyone outside the US?
An Apple recruiter recently reached out to me. I am in a fortunate position to turn down opportunities, so I made sure to explain that I am not interested in working for a company that is at the forefront of enabling further infringement on people's privacy. If you are able to push back, do it in any small way that you can.
In many ways Apple is also the world leader on consumer privacy, pushing for changes when the rest of the industry is walking in the opposite direction. Paying with Apple Pay makes you safer because it gives out minimal payment information; the Target fiasco would've been avoided.
Sign in with Apple lets users provide minimal information when signing up for accounts; the idea that casual users should know how to set up email aliases is a joke. Apple Private Relay is the closest thing to getting grandma onto Tor. Apple is working on stopping pixel tracking in email.
Apple is also leading on user permissions, which is otherwise a broken model where you blame users for accepting all the snooping in their lives, for not reading the ToS, and for failing to negotiate against Walmart.
As always when talking about security and privacy, you need to understand the threat model.
Apple protects users from some threats while also becoming itself the biggest threat to users. And this is exactly what Apple wants. This is how you use Stockholm syndrome to entrench a feudal system.
The relationship is not 3-way as Apple wants users to believe (Apple the defender, users the victim, third-parties the aggressor). The map of the territory is a lot more complex.
As a long-time Apple user and developer (since the black-and-white Apple II), I have witnessed the rise/fall/rise again/and now fall again of Apple. My future computers WILL be Linux/FreeBSD for all important stuff.
Well done, I'm with you. I've been doing something similar in the games industry for years. If we, the actual makers, stand against immoral action, it will build both pressure and incentive for alternative ways of doing business.
Also, after you're hired, a group of Apple employees can sign a petition saying something you wrote 10 years ago (of which they saw an excerpt) bothers them... and Apple will fire you.
> I am not interested in working for a company that is at the forefront of enabling further infringement on people's privacy
Conducting scans on device instead of on server is your idea of infringement of privacy?
Apple's system keeps everything off their servers until there is an instance where many images on device match known examples of child porn and a human review is triggered.
Google's system scans everything on server, so a single false positive is open to misuse by anyone who can get a subpoena.
We've seen Google data misused to persecute the innocent before.
>Innocent man, 23, sues Arizona police for $1.5million after being arrested for murder and jailed for six days when Google's GPS tracker wrongly placed him at the scene of the 2018 crime
>Conducting scans on device instead of on server is your idea of infringement of privacy?
Why are you asking if the poster still beats their wife?
(More specifically, you're pre-supposing scanning must happen, which by itself is a highly debatable assertion)
Your point with Google is absolutely sound, but you seem to stop short of actually accepting that actual privacy (no peeking damnit) is dead on arrival. This is a case of rhetorical stealth goalpost moving whether you intended that or not.
>Conducting scans on device instead of on server is your idea of infringement of privacy?
It's an infringement of my right to freedom of speech. Client-side scanning opens the door for my device to censor me from sending any message of my choosing and impacts my ability to communicate freely. What is child abuse today becomes health information tomorrow, and then descends to political and religious memes, or whatever other content is deemed problematic.
I don't want to be that guy, but 300 more people were lining up for this job.
Nobody except a tiny group of nerdy guys (including myself, of course) is against this Apple CSAM move.
Just ask your parents or your non-tech friends whether it's "ok" to scan people's phones to find those "bad pedophiles" and lock them up for the rest of their lives. You will be surprised how much support Apple's initiative has among the broad public.
And that's why Apple made this move. They don't really care about the 3% of people we belong to. They do it because they know they will have public and political support.
> Just ask your parents or your non-tech friends whether it's "ok" to scan people's phones to find those "bad pedophiles" and lock them up for the rest of their lives. You will be surprised how much support Apple's initiative has among the broad public.
My dad is an old teacher today and was formerly a farmer.
My view is that he clearly understands these issues and has done so since I was a teenager, sometime in the last millennium, when I followed him around the farm and we talked about stuff.
Maybe your parents are like what you describe, but don't underestimate other people's parents. They might not agree immediately, but if one is careful, many are actually not unreasonable.
Also everyone: stop this defeatist attitude. Instead of asking leading questions, talk about it calmly and politely.
Just explain that once this system is in place it will be used for anything, not just photos (otherwise the bad guys could simply zip the files). And once everything is scanned, some people will add "terrorist" material (i.e. history and chemistry books), others extremist material (religious writings), blasphemous material (Christian or atheist teachings in Saudi Arabia), and other illegal content (Winnie the Pooh, the man against the tank, etc. in China).
In the paragraph above there should be something to make everyone, from atheists through Christians, Muslims, nerds, art lovers, and Winnie the Pooh fans, see why this is a bad idea.
I think you oversimplify this by a lot. No, Apple's reputation won't be severely damaged by this move immediately. But I do believe those "nerdy guys" did a lot to push the Apple brand, and a big part of that push was due to security and privacy. Until recently Apple was always the "privacy brand", and it was hard to argue against that without going the full FSF route of argumentation.
This is no longer the case and I'm sure this will deal some damage over time, even if it only starts with the "screeching voices" of the (nerdy) minority. Maybe not directly to their revenue, but certainly to their reputation. Nothing wrong with shaving off a bit of the prestige of working at Apple ;)
> I don't want to be that guy, but 300 more people were lining up for this job.
This is the case for many jobs that don't come close to the holy "working at Apple".
If you think that, you're completely wrong. I have been asked about it by four non-technical people so far, after it made national news in the UK. There is a lot of anti-surveillance sentiment here, and it's appearing in the general public regularly.
I regularly go out with groups of random people on Meetup with no shared technical interest as well and I’m surprised at how much anti tracking and surveillance sentiment there is. It got to the point that out of 25 people on a trip out no one used NHS track and trace because they don’t trust it or don’t own a smartphone. This is across the 20-50 age group.
> but 300 more people were lining up for this job
let them take it then. I try to minimize the blood on my hands.
>Just ask your parents or your non-tech friends
My parents were unhappy with it. They're non-technical and not particularly concerned with privacy. I don't think they'll switch, but they did ask how to mitigate it. I'm currently scrambling for a (friendly) alternative to iCloud Photos.
> They don't really care for the 3% of people who we belong to
Welcome to cyberpunk dystopia! Grab a devterm by clockwork (no affiliation), and log in, cowboy.
I agree with you, the general public doesn't give a shit. There will be headlines for a while, some people will change their phones, and that'll be it. The biggest of these movements, I think, is the "de-Googling" one. There's a myriad of articles, subreddits, guides, even entire websites listing alternatives. And look what happened to Google. Nothing.
When the alternative to Apple's surveillance is to smash the phone against a wall and buy something much less convenient, suddenly surveillance is not that big of a problem. And this is very important to note, because many of the world's powerful entities are moving in this direction.
> You will be surprised how much support Apple's initiative has in the broad public.
I wouldn't be, but that's not the issue, the broad public is gullible, the overwhelming majority probably still believe that Iraq had WMDs before invasion.
> I don't want to be that guy, but for this job there were lining up 300 more people.
All of whom could decide to stand up for individual rights, but won't, with excuses similar to the one you formulated.
I know that in US culture some see selfishness as a strength, and yet they complain about the society and the politics that this kind of mentality necessarily leads to. If all the others are selfish, why should I be the sucker who pays for having principles?
Because suckers with principles shape a society until they don't.
Suckers with principles were the ones who stood against Hitler, against the USSR (from inside the USSR), against unlimited royal power, religious fanaticism, witch hunts, and so on. In the beginning, at least.
Today it's surveillance, and attempts by Apple to legalize such abuses using a BS cover story intended to create an emotional response and thereby fog the real issue: the installation and legalization of a spyware engine.
I am beginning to wonder if this was the plan all along. Back in 2013 via the Snowden leaks, it was revealed Apple was associated with the NSA domestic surveillance program, PRISM. It appears they (NSA and Apple, et al) pulled out due to the level of negative PR.
After 8 years, the intelligence community and tech companies figured out they could sell their surveillance through a thinly veiled effort to “protect X group” (in this case it was children).
Sorry, but I do not believe that is what the leak revealed.
There was a slide that indicated that data from Apple and other companies was now part of the PRISM program.
I am not trying to deny or refute Snowden's whistleblowing. I think it is highly likely that PRISM exists. What I dispute are the speculations that the companies listed are complicit.
The 2012 date is quite suspicious - it is precisely the same year that a new Apple datacenter in Prineville came online. Facebook also has a datacenter. Literally next door. Facebook also appears on those slides. I am not sure who else is also now in the area.
I wonder where all of the network cables go?
I personally think that PRISM works by externally intercepting data communication lines running to these facilities. Similar to the rumors that international comms links have been tapped. The companies themselves have not participated, but the data path has been compromised.
The NSA has previously tapped lines (AT&T), but they made the mistake of doing it inside the AT&T building. Google "Room 641A at 611 Folsom Street, SF". That is where "beam splitting" was done. This eventually leaked out. The NSA isn't stupid, I doubt they wanted to repeat that sort of discovery. The best way to keep something from being discovered is to not let people know. This is why I think it is believable and likely that the companies listed on the slides have no idea what has been done.
I will also note that PRISM and "beam splitting" are a rather cosy coincidence.
I think it is most likely that PRISM is implemented without the knowledge of anyone except the NSA and in Prineville there is some "diversion" of network cabling to a private facility that is tapping the lines.
> I personally think that PRISM works by externally intercepting data communication lines running to these facilities. Similar to the rumors that international comms links have been tapped. The companies themselves have not participated, but the data path has been compromised.
That wouldn't work without the company being at least passively complicit. Links between datacenters are encrypted. If you want even basic PCI-DSS compliance then links between racks must be encrypted (and a rack that uses unencrypted links must be physically secured). And properly implemented TLS or equivalent (which is table stakes for a company that takes this stuff at all seriously) can't be broken by the NSA directly (and if it could be then everything would be hopeless). Thus the MUSCULAR programme where the NSA put their own equipment in Google's datacenters - that's really the only way you can do it.
Remember how the legal regime in the US works with National Security Letters. Companies can be, and are, required to install these backdoors and required to keep their existence, and the existence of the letter itself, secret. Of course Google, Apple, Facebook, every other company with a significant US presence is in receipt of one of those letters and has installed backdoors - the NSA aren't stupid, what else would those laws and their funding be for?
They didn't pull out. Apple discloses over 30,000 customers' data each year without a warrant under PRISM (aka FISA 702) as disclosed in their own transparency report (listed under "FISA orders").
PRISM is just the internal NSA name for it. It continues unabated.
FISA orders are written by a Judge. Only judges can write these, this is the literal definition of a warrant. Warrants require specifics - Person X, person Y. These are enumerable. There is paperwork.
PRISM, based on the data available, is all about consuming data WITHOUT a warrant -- vacuuming data associated with identities that are not associated with ANY identities subject to a court order. Violating laws and possibly (USA) constitutional rights in quite a few ways. PRISM likely exists.
I ask "sneak" to confirm their assertion that "PRISM == FISA orders" is true. Please present this "evidence" and the evidence of the connection. If you cannot, you are, by default, distributing misinformation or bad logic, or at worst trying to mislead.
(my naive searching suggests that "sneak" is definitely not in a position to make these claims)
Let me tell you my perspective as a long-time, screeching-minority Apple user. You may be right. We, the professionals who evangelized a lot of people for Apple, don't have power or influence over Apple's core target audience of today. Yep.
But I can assure you that I personally, as will my colleagues, will do everything in our power to hurt Apple's public image and brand, and to give real information to our clients, friends, and families. To educate people about why smartphone convenience is slavery to the Tech Lords, and how, in the future, all this data will shape a digital ID in a social credit system that will render people's freedom obsolete.
To all the apologists, Apple employees, and shareholders who hold their stock after this, I have a simple message: F*ck you. No. Seriously. Go to hell.
You created and supported the monster that will eat you in the end.
Didn't iMusic or whatever upload users' personal high-quality files to the cloud, stream them back in lower quality, and then delete the originals from users' devices? I remember something like that making the news.
Imagine being a musician and Apple deletes your originals to stream your own music back to you in low quality.
Yes, it did something like that. If and only if the user signed up for, paid for, and enabled the iTunes Match service, the whole point of which is to replace your local files with cloud music. (I don’t find this desirable myself, but I can see how some people might have.)
Apple screwed up big time in the functionality and messaging around it and some people found their original files deleted when they weren’t expecting it. Big problem.
But it was hardly some plot to scan users’ hard drives for copyrighted content and delete it. On the contrary, iTunes Match would happily launder a whole library full of pirated low-quality MP3s into legal, high quality, DRM-free AAC files.
For many years, anti-virus vendors have been able to do exactly that. Why haven't those vendors already been co-opted by governments (Kaspersky on the Russian side, Microsoft on the US side) into scanning for illegal, copyrighted, or secret material and reporting on it?
Even open-source products like ClamAV rely on an opaque database of virus strings.
Kaspersky is blacklisted as a security vendor for government systems handling anything remotely resembling classified or sensitive material. Also, virus databases are open to having their definitions perused by the user: you can actually dissect what is being scanned for. Apple's system is not, and it goes to great pains to be as opaque as possible. Understandable as that may be, from the point of view of a rational free agent it is still a threat at scale.
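For contrast, ClamAV's hash databases really are inspectable: an `.hdb` line is plain text in the form `MD5:filesize:signature-name`, so a user can see exactly which byte sequences are flagged. A small sketch (the entry is fabricated for the demo, not a real signature):

```python
import hashlib

def parse_hdb_line(line):
    """A ClamAV .hdb entry is plain text: MD5:filesize:signature-name."""
    md5, size, name = line.strip().split(":")
    return md5.lower(), int(size), name

def matches(data, entry):
    """Check a blob against one parsed .hdb entry (size, then MD5)."""
    md5, size, _name = entry
    return len(data) == size and hashlib.md5(data).hexdigest() == md5

# Fabricated entry built here for the demo -- not a real signature.
payload = b"not actually malware"
entry = parse_hdb_line(
    f"{hashlib.md5(payload).hexdigest()}:{len(payload)}:Demo.Sig.Example"
)
```

Because the database is just lines like this, anyone can audit what an AV scanner of this kind is looking for, which is exactly the transparency Apple's system lacks.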
On top of that, there are many other tools that could do the same. People talk as if the hard work was adding this scanning just now, and that future exploitation becomes easier because of it. The hard part was building the system that locks Apple out of your pictures. Scanning your files and sending off some metadata is literally a few lines of code, and it could have been pushed in a week at any point in the past.
Pushing a spyware engine is literally abuse of all people, including children, and it's much worse than any problem they claim to fight. Even if you believe them.
With this move, people are indoctrinated with the idea that being watched by someone big and powerful is OK. They learn to accept such abuse, and what can be worse for anyone's safety than learning that? If one is serious about safety, one should learn to walk away from such abuse first, just like with any other abuse.
It is an attempt to legalize the installation of such a spyware engine. Nothing more. The story is just there to sell the move through the emotional response of naive people, because high emotion is when people think most poorly about long-term consequences. Think about vendettas and their consequences.
Those people should be educated about what the real abuse is, and they should teach their children to recognize it, because the abuse by Apple is already here and is much worse than the problem they claim to be trying to solve. People need to understand that it will only get worse with time.
So this one time I was tasked to verify a complaint of child pornography and image the infrastructure for evidentiary purposes, if necessary. It was the first time I’d ever been exposed to it as a naïve operations kid at a hosting provider.
Imagine my surprise and horror to find that not only was the complaint accurate, it led to a completely polished thumbnail site on par with PornHub. Boom, right there, no login. No nothing. Five high, seven wide thumbnails. No two of the same child. A complete search engine based on Solr that could filter the thousands of images by age of the victim. By the number of adults participating in the rape. A threaded comment section on each image where people discussed children in their neighborhood and their fantasies of abducting them. An erotic literature section where parents wrote about how they’ve been sexually attracted to their children since changing their first diaper.
I’ll never forget a photo of two men brutally raping a girl of about 9 or 10, because it was one of the highest voted on the site. One of the comments, which I still remember when I close my eyes at night, simply said “its better when they cry”. It’s been eleven years and I’ve seen and dealt with much more of it since then, and I still weep to this day thinking about the pain inflicted on those children, the pure evil of those who enjoy it, and even the design and engineering team who bafflingly put their skills toward building that nadir of human achievement.
Tell me again what “the real abuse” is and educate me, please, because you sound pretty confident that the frighteningly common story I just told isn’t that big of a deal. I can’t believe anyone sane would compare going through your photo collection, even egregiously, to the rape and exploitation of children and think, yeah, you know, based on my value system door number one is the “much worse” injustice. Your opinion is fucking sickening and the exact type of detached inhumanity that is poisoning this industry top to bottom.
Interesting that you chose to so thoroughly explore such a heinous site when all you had to do was image it and provide a copy to the authorities. More interesting that you then depict its graphic content in such detail here.
Perhaps a thorough search of your hard drives and NAS are in order citizen. No need to report to your local precinct, we've already pushed the updated scan list to your devices for analysis.
It is worse, though. The surveillance apparatus doesn't care much about your actual words and images, but about your associations and relations. This makes finding needles much more efficient, and pre-encryption exfiltration circumvents user-added measures like third-party iCloud encryption. And I am pretty sure this will be baked into the OS deeper than your VPN/DNS can reach. Opening up this side channel isn't undone by trusting some "iCloud deactivation". Much less in your mind.
This was the last straw for me. I've started looking at alternatives (for iCloud Photos there are literally no decent alternatives so far for syncing 400 GB of photos) and have swapped iCloud files for a simple NAS at home with Tailscale.
2. Anyway, is it even legal? If some madman stores such material on his Apple hardware, wouldn't that constitute an illegal search, unusable in a court of law?
3. Child abuse is often used as a Trojan horse to introduce questionable practices. What if:
- the system is used to look for dissidents: for example, searching for people who have a photo of the Tiananmen Square protests on their PC;
- it is used for espionage: given the hash of some documents of interest, every PC holding those documents becomes a valuable target;
- it is used for profiling people: you have a computer virus sample on your PC -> you must be a security researcher/hacker;
I think the system is prone to all kinds of privacy abuse.
4. This could be part of the previous point, but because I think it's the final and real reason for this system's existence, I'm giving it its own section: the fight against piracy. I think one of the real reasons is to discourage the exchange of illegal multimedia material in order to enforce copyrights.
For the reasons listed, I think this is a bad idea. Let me know what you think.
>a man [was] arrested on child pornography charges, after Google tipped off authorities about illegal images found in the Houston suspect's Gmail account
https://techcrunch.com/2014/08/06/why-the-gmail-scan-that-le...
Their system can easily be abused by governments or malicious actors to frame innocent people.
https://news.ycombinator.com/item?id=28156934
Only full control over your own device can prevent abuses, especially when the device comes anywhere close to the definition of being personal. You should be able to install your own software on a personal device, including the OS and BIOS/firmware.
Not only would this be marginal, it also wouldn't necessarily catch the real "monsters". I don't think finding someone with old, already-known images necessarily equates to finding someone who actually abuses children. I think about this in a similar way (not exactly) as I do with drugs: just because a person gets busted with drugs doesn't mean they are a drug dealer or a maker of drugs.
This is not to say that there isn't some more active, real-time material in these databases that, with enough searching, could make its way back to a perpetrator and maybe even identify a victim. It just seems that that would be far more marginal, and it's generally what I'm concerned about when it comes to these issues. For me it's more important to protect children than it is to bust some weirdos for looking at the wrong porn (these can both be related, and I do understand that; I just think it's not as cut and dried as we believe), and further, if it keeps said weirdo from actually harming a child, then let them have it. We allow these databases to exist for, presumably, the same reason: the idea that we can prevent future victims.
Yes, it's considered legal. Apple reviews the content first. Courts say this means it is not an illegal government search. It's a search by a private party, who then manually decides to notify the government.
No, it's not. At least not here in Germany. By law, even police officers are not allowed to look at child porn. The only institution explicitly allowed to do so is the BSI.
The rest of the population implicitly incriminates themselves when they look at (not merely own) child porn, and that includes Apple's legal entity and its employees.
See [1] for §184b Strafgesetzbuch.
I'm trying to point out that with this move Apple bluntly decided to ignore a whole lot of countries and their federal laws, which is not something I would embrace, even if they had good intentions.
[1] https://www.gesetze-im-internet.de/stgb/__184b.html
Not if. When.
Using this system to look for unlicensed content will be irresistible to them.
The one thing that occurred to me is that this almost seems like a CYA move, Section 230 protection in disguise. There has been more discussion about Big Tech and Section 230, and this is one way to say "Look, we are compliant on our platform. Don't remove our protections or break us up, we are your friend!" It also shouldn't be too surprising given how Apple has behaved in China. They will only push back against a government up until the point it starts to affect profits.
When will people finally say no? These are all just small steps.
The image profiles are part of the OS so there's no mechanism to deliver image profiles separately for different countries. Also when the threshold number of matching images is reached, the matches are reported to a manual reviewer at Apple not a government. It only checks images on upload to iCloud photo storage.
So of course each of these limitations of the system could be changed, but you'd really need to change all of them and at that point you've created a completely different system. There's no simple change to this system that would suddenly turn it into a snitch for e.g. China or Saudi Arabia.
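The reporting flow described above can be sketched in a few lines. This is a hypothetical toy model, not Apple's actual implementation: the real system uses perceptual NeuralHash matching with cryptographic private set intersection, and the threshold value here is made up for illustration.

```python
# Toy sketch of "match against a bundled hash list, surface for human
# review only after a threshold of matches" -- all names are hypothetical.
REPORT_THRESHOLD = 30  # assumed value; the real threshold is not stated here


def scan_on_upload(upload_queue, known_hashes, threshold=REPORT_THRESHOLD):
    """Return matched items only once the match count reaches the threshold.

    upload_queue: list of (item_id, content_hash) pairs queued for upload.
    known_hashes: set of hashes shipped with the OS image (same list worldwide).
    """
    matches = [item for item, h in upload_queue if h in known_hashes]
    # Below the threshold, nothing leaves the device for review.
    return matches if len(matches) >= threshold else []
```

The point the comment makes maps onto the sketch directly: to repurpose this, you would have to change the bundled list, the threshold, the reviewer, and the upload-only trigger, i.e. build a different system.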
I've seen exactly the same objections raised every time any kind of device content search has become mainstream. Back in the 90s it was virus checking (Do you trust the AV company? What if they were bribed by the content companies?), full device indexing and search (Do you trust the OS vendor? What if they're in league with the government?). I'm very surprised this didn't blow up when Apple implemented ubiquitous image text recognition. Maybe it did. AV and device indexing mechanisms, which are ubiquitous, seem like a far more vulnerable target for such requirements.
So I don't really buy the slippery slope argument. In theory any government could pass a law requiring any company operating in its jurisdiction to do anything, with an implementation suitable to that actual purpose. Of course this mechanism is motivated by laws in the US, so it's a perfect example of exactly that, and it's a completely new system, not a slippery-slope subversion of an existing one. The real slippery slope here is legislative, not technical, and I think that should be far, far more concerning.
I do think the legal and moral questions about this mechanism are legitimate. I think it would make more sense for Apple to scan photos on their cloud storage servers rather than on upload. I understand there are theoretical privacy benefits to users from this implementation, but the optics of having users' devices snitch on them are all wrong.
These are examples of companies choosing to do something as a selling point of their software as a benefit to the end user, and people worrying that it could aid the government down the line if they change their mind.
Apple's content review change is explicitly FOR reporting people to police, in a way that can be expanded beyond its currently set purpose (child porn) later.
>I'm very surprised this didn't blow up when Apple implemented ubiquitous image text recognition.
I'm personally not a fan of that stuff anyway, but if it's only on my local device I don't tend to care about image recognition; it's only when it involves communicating information from MY hardware to THEIR servers that I get antsy.
I want to also point out that A/V companies never said they were going to scan for child abuse images on your computer and report you if they found any.
Like you said, the optics are terrible.
Haven't Apple already said it WILL be country specific?
>Apple’s new feature for detection of Child Sexual Abuse Material (CSAM) content in iCloud Photos will launch first in the United States, as 9to5Mac reported yesterday. Apple confirmed today, however, that any expansion outside of the United States will occur on a country-by-country basis depending on local laws and regulations.
https://9to5mac.com/2021/08/06/apple-says-any-expansion-of-c...
I think they'd need to be country-aware at least, otherwise the FBI or whoever will get reports for all people on earth when they presumably don't need them for anyone outside the US?
Sign in with Apple allows users to provide minimal information when signing up for accounts; the idea that casual users should know how to set up email aliases is a joke. Apple Private Relay is the closest thing to getting grandma onto Tor. Apple is working on stopping pixel tracking in email.
Apple is also leading on the story of user permissions, which is a broken model where you blame users for accepting all the snooping in their lives, for not reading the TOS, and for their failure to negotiate against Walmart.
The relationship is not 3-way as Apple wants users to believe (Apple the defender, users the victim, third-parties the aggressor). The map of the territory is a lot more complex.
Apple is also an early PRISM adopter and well known for cooperation with totalitarian regimes. The censoring of Belarus protesters happened just several months ago.
The leading narrative on /r/apple is now that "oh this invasive spyware has to exist so that apple can do E2EE iCloud", which is nonsense.
If they had done their job and deployed E2EE iCloud we wouldn't "need" this system in the first place.
It's a classic government pattern:
1. Create the problem (blocking E2EE such that providers have unencrypted copies of your content)
2. Screech and complain about this
3. Demand they do the thing you really wanted in the first place to solve the problem you created
Conducting scans on device instead of on server is your idea of infringement of privacy?
Apple's system keeps everything off their servers until there is an instance where many images on device match known examples of child porn and a human review is triggered.
Google's system scans everything on server, so a single false positive is open to misuse by anyone who can get a subpoena.
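The contrast between "one server-side match flags you" and "report only after many on-device matches" can be made concrete with a toy binomial model. The numbers below are purely illustrative assumptions: neither the per-image false-positive rate nor Apple's actual threshold is stated in this thread.

```python
from math import comb


def false_report_probability(n_images, p_false, threshold):
    """P(at least `threshold` independent false positives among n_images),
    assuming each image trips a false match with probability p_false."""
    return sum(
        comb(n_images, k) * p_false**k * (1 - p_false) ** (n_images - k)
        for k in range(threshold, n_images + 1)
    )


# Assumed numbers: a library of 1,000 photos, a 1-in-a-million false hit rate.
single = false_report_probability(1_000, 1e-6, 1)   # server-side: one hit flags you
thresh = false_report_probability(1_000, 1e-6, 30)  # on-device: need 30 hits
```

Under these assumptions, a single-match policy flags roughly 1 in 1,000 such users, while the 30-match threshold makes a purely accidental report astronomically unlikely; the remaining risk is planted matches, not random ones.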
We've seen Google data misused to persecute the innocent before.
>Innocent man, 23, sues Arizona police for $1.5million after being arrested for murder and jailed for six days when Google's GPS tracker wrongly placed him at the scene of the 2018 crime
https://www.dailymail.co.uk/news/article-7897319/Police-arre...
Why are you asking if the poster still beats their wife?
(More specifically, you're pre-supposing scanning must happen, which by itself is a highly debatable assertion)
Your point with Google is absolutely sound, but you seem to stop short of actually accepting that actual privacy (no peeking damnit) is dead on arrival. This is a case of rhetorical stealth goalpost moving whether you intended that or not.
It's an infringement on my right to freedom of speech. Client-side scanning merely opens the door for my device to censor me from sending any message of my choosing and impacts my ability to freely communicate. What is today child abuse, tomorrow is health information and further descends to political and religious memes, or whatever other content is deemed problematic.
Nobody except a tiny group of nerdy guys (including myself, of course) is against this Apple CSAM move.
Just ask your parents or your non-tech friends if it's "OK" to scan people's phones to find those "bad pedophiles" in order to lock them up for the rest of their lives. You will be surprised how much support Apple's initiative has among the broad public.
And that's why Apple made this move. They don't really care about the 3% of people we belong to. They do it because they know they will have public and political support.
My Dad is an old teacher today and was formerly a farmer.
My view is that he clearly understands these issues and has done since I was a teenager sometime in the last millennium, when I followed him around the farm and we talked about stuff.
Maybe your parents are like what you describe, but don't underestimate other people's parents. They might not agree immediately, but if one is careful, many actually aren't unreasonable.
Also everyone: stop this defeatist attitude. Instead of asking leading questions, talk about it calmly and politely.
Just explain that once this system is in place it will be used for anything, not just photos (or otherwise bad guys could just zip the files). And when everything is scanned some people will add terrorist material (i.e. history and chemistry books), other will add extremist material (religious writings), blasphemous material (Christian or Atheist teachings in Saudi Arabia), and other illegal content (Winnie the Pooh, man against tank etc in China).
In the paragraph above there should be something to make everyone, from Atheists through Christians, Muslims, nerds, art lovers and Winnie the Pooh fans, see why this is a bad idea.
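The parenthetical "or otherwise bad guys could just zip the files" is easy to demonstrate: wrapping a file in an archive produces entirely different bytes, so any exact-hash match on the stored file fails. This is a minimal illustration with a cryptographic hash; perceptual hashes on decoded images are harder to evade this way, but only for files the system actually decodes, which is exactly why pressure to scan everything (including inside archives) follows.

```python
import hashlib
import io
import zipfile


def sha256(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()


original = b"some image bytes"  # stand-in for a flagged file's contents

# Wrap the same bytes in an in-memory zip archive.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("photo.jpg", original)
zipped = buf.getvalue()

# The archive's hash bears no relation to the original file's hash,
# so an exact-hash scan of files at rest never sees a match.
assert sha256(original) != sha256(zipped)
```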
I think you oversimplify this by a lot. No, Apple's reputation won't be severely damaged by this move immediately. But I do believe that those "nerdy guys" did a lot to push the Apple brand, and a big part of that push was due to security and privacy. Until recently Apple was always the "privacy brand", and it was hard to argue against that without going the full FSF route of argumentation.
This is no longer the case and I'm sure this will deal some damage over time, even if it only starts with the "screeching voices" of the (nerdy) minority. Maybe not directly to their revenue, but certainly to their reputation. Nothing wrong with shaving off a bit of the prestige of working at Apple ;)
> I don't want to be that guy, but for this job there were lining up 300 more people.
This is the case for many jobs that don't come close to the holy "working at Apple".
I regularly go out with groups of random people on Meetup with no shared technical interest as well and I’m surprised at how much anti tracking and surveillance sentiment there is. It got to the point that out of 25 people on a trip out no one used NHS track and trace because they don’t trust it or don’t own a smartphone. This is across the 20-50 age group.
let them take it then. I try to minimize the blood on my hands.
>Just ask your parents or your non-tech friends
My parents were unhappy with it - they're non technical and not particularly concerned with privacy. I don't think they'll switch but they did ask how to mitigate it. I'm currently scrambling for a (friendly) alternative to icloud photos.
> They don't really care for the 3% of people who we belong to
Welcome to cyberpunk dystopia! Grab a devterm by clockwork (no affiliation), and log in, cowboy.
When the alternative to Apple's surveillance is to smash the phone against a wall and buy something much less convenient, suddenly surveillance is not that big of a problem. And this is very important to note, because many of the world's powerful entities are moving in this direction.
I tell them that they can have a reasonable backup and network storage for a small amount of money and I believe their data is much safer on there.
Yes, it won't make a dent in Apple's finances, but at least that person can sleep better not supporting a company they find immoral.
I wouldn't be, but that's not the issue: the broad public is gullible. The overwhelming majority probably still believe that Iraq had WMDs before the invasion.
All of which could decide to stand up for individual rights, but won't with similar excuses to the one you formulated.
I know that in US culture some see it as a strength to be selfish, and yet they complain about the society and the politics this kind of mentality necessarily leads to. If all the others are selfish, why should I be the sucker who pays for having principles?
Because suckers with principles shape a society until they don't.
A tiny group were against Hitler, the USSR (inside the USSR), unlimited kings' powers, religious fanaticism, witch hunting, etc. ... in the beginning.
Today it's surveillance, and Apple's attempt to legalize such abuses using a BS cover story intended to create an emotional response and thereby fog the real issue: the installation and legalization of a spyware engine.
After 8 years, the intelligence community and tech companies figured out they could sell their surveillance through a thinly veiled effort to “protect X group” (in this case it was children).
There was a slide that indicated that data from Apple and other companies was now part of the PRISM program.
I am not trying to deny or refute Snowden's whistleblowing. I think it is highly likely that PRISM exists. What I dispute are the speculations that the companies listed are complicit.
The 2012 date is quite suspicious - it is precisely the same year that a new Apple datacenter in Prineville came online. Facebook also has a datacenter. Literally next door. Facebook also appears on those slides. I am not sure who else is also now in the area.
I wonder where all of the network cables go?
I personally think that PRISM works by externally intercepting data communication lines running to these facilities. Similar to the rumors that international comms links have been tapped. The companies themselves have not participated, but the data path has been compromised.
The NSA has previously tapped lines (AT&T), but they made the mistake of doing it inside the AT&T building. Google "Room 641A at 611 Folsom Street, SF". That is where "beam splitting" was done. This eventually leaked out. The NSA isn't stupid, I doubt they wanted to repeat that sort of discovery. The best way to keep something from being discovered is to not let people know. This is why I think it is believable and likely that the companies listed on the slides have no idea what has been done.
I will also note that PRISM and "beam splitting" are a rather cosy coincidence.
I think it is most likely that PRISM is implemented without the knowledge of anyone except the NSA and in Prineville there is some "diversion" of network cabling to a private facility that is tapping the lines.
That wouldn't work without the company being at least passively complicit. Links between datacenters are encrypted. If you want even basic PCI-DSS compliance then links between racks must be encrypted (and a rack that uses unencrypted links must be physically secured). And properly implemented TLS or equivalent (which is table stakes for a company that takes this stuff at all seriously) can't be broken by the NSA directly (and if it could be then everything would be hopeless). Thus the MUSCULAR programme where the NSA put their own equipment in Google's datacenters - that's really the only way you can do it.
Remember how the legal regime in the US works with National Security Letters. Companies can be, and are, required to install these backdoors and required to keep their existence, and the existence of the letter itself, secret. Of course Google, Apple, Facebook, every other company with a significant US presence is in receipt of one of those letters and has installed backdoors - the NSA aren't stupid, what else would those laws and their funding be for?
Remember the smiley face in the slide deck?
PRISM is just the internal NSA name for it. It continues unabated.
PRISM, based on the data available, is all about consuming data WITHOUT a warrant: vacuuming up data associated with identities that are not subject to ANY court order, violating laws and possibly (US) constitutional rights in quite a few ways. PRISM likely exists.
I ask "sneak" to confirm their assertion that "PRISM == FISA orders" is true. Please present this "evidence" and the evidence of the connection. If you cannot, you are by default distributing misinformation or bad logic, or at worst trying to mislead.
(my naive searching suggests that "sneak" is definitely not in a position to make these claims)
To all apologists, Apple employees, and shareholders who will hold their stock after this, I have a simple message: F*ck you. No. Seriously. Go to hell. You created and supported the monster which will eat you in the end.
Imagine being a musician and Apple deletes your originals to stream your own music back to you in low quality.
Apple screwed up big time in the functionality and messaging around it and some people found their original files deleted when they weren’t expecting it. Big problem.
But it was hardly some plot to scan users’ hard drives for copyrighted content and delete it. On the contrary, iTunes Match would happily launder a whole library full of pirated low-quality MP3s into legal, high quality, DRM-free AAC files.
Even open source products like ClamAV rely on an opaque database of virus strings.
By this move, people are indoctrinated with the idea that being watched by someone big and powerful is OK. They learn to accept such abuse, and what can be worse for anyone's safety than learning that? If one is serious about safety, one should learn to walk away from such abuse first, just like with any other abuse.
It is an attempt to legalize the installation of such a spyware engine. Nothing more. The story is just there to sell this move using an emotional response from naive people, because when emotions run high, people do poor thinking about long-term consequences. Think about vendetta and its consequences.
Those people should be educated about what real abuse is, and they should teach their children to recognize it, because abuse by Apple is already there and it is much worse than the problem they claim to be trying to solve. People need to understand that it will only get worse with time.
Imagine my surprise and horror to find that not only was the complaint accurate, it led to a completely polished thumbnail site on par with PornHub. Boom, right there, no login. No nothing. Five high, seven wide thumbnails. No two of the same child. A complete search engine based on Solr that could filter the thousands of images by age of the victim. By the number of adults participating in the rape. A threaded comment section on each image where people discussed children in their neighborhood and their fantasies of abducting them. An erotic literature section where parents wrote about how they’ve been sexually attracted to their children since changing their first diaper.
I’ll never forget a photo of two men brutally raping a girl of about 9 or 10, because it was one of the highest voted on the site. One of the comments, which I still remember when I close my eyes at night, simply said “its better when they cry”. It’s been eleven years and I’ve seen and dealt with much more of it since then, and I still weep to this day thinking about the pain inflicted on those children, the pure evil of those who enjoy it, and even the design and engineering team who bafflingly put their skills toward building that nadir of human achievement.
Tell me again what “the real abuse” is and educate me, please, because you sound pretty confident that the frighteningly common story I just told isn’t that big of a deal. I can’t believe anyone sane would compare going through your photo collection, even egregiously, to the rape and exploitation of children and think, yeah, you know, based on my value system door number one is the “much worse” injustice. Your opinion is fucking sickening and the exact type of detached inhumanity that is poisoning this industry top to bottom.
Perhaps a thorough search of your hard drives and NAS are in order citizen. No need to report to your local precinct, we've already pushed the updated scan list to your devices for analysis.
Remember, every scan just renews your innocence!
/r/apple has a large number of robots suggesting "on device scanning is more private!"
I doubt I’ll buy a new iPhone next time.