threatofrain · 4 years ago
dang · 4 years ago
Thanks! Macroexpanded:

Expanded Protections for Children - https://news.ycombinator.com/item?id=28078115 - Aug 2021 (291 comments)

Apple plans to scan US iPhones for child abuse imagery - https://news.ycombinator.com/item?id=28075021 - Aug 2021 (349 comments)

Apple enabling client-side CSAM scanning on iPhone tomorrow - https://news.ycombinator.com/item?id=28068741 - Aug 2021 (680 comments)

trangus_1985 · 4 years ago
I've been maintaining a spare phone running LineageOS precisely in case something like this happened - I love the Apple Watch and Apple ecosystem, but this is such a flagrant abuse of their position as Maintainers Of The Device that I have no choice but to switch.

Fortunately, my email is on a paid provider (Fastmail), my photos are on a NAS, and I've worked hard to get all of my friends on Signal. While I still use google maps, I've been trialing out OSM alternatives for a minute.

The things they've described are, in general, reasonable and probably good in the moral sense. However, I'm not sure that I support what they are implementing for child accounts (as a queer kid, I was terrified of my parents finding out). On the surface, it seems good - but I am concerned about other snooping features that this portends.

However, with icloud photos csam, it is also a horrifying precedent that the device I put my life into is scanning my photos and reporting on bad behavior (even if the initial dataset is the most reprehensible behavior).

I'm saddened by Apple's decision, and I hope they recant, because it's the only way I will continue to use their platform.

JumpCrisscross · 4 years ago
> with icloud photos csam, it is also a horrifying precedent

I'm not so bugged by this. Uploading data to iCloud has always been a trade of convenience at the expense of privacy. Adding a client-side filter isn't great, but it's not categorically unprecedented--Apple executes search warrants against iCloud data--and can be turned off by turning off iCloud back-ups.

The scanning of childrens' iMessages, on the other hand, is a subversion of trust. Apple spent the last decade telling everyone their phones were secure. Creating this side channel opens up all kinds of problems. Having trouble as a controlling spouse? No problem--designate your partner as a child. Concerned your not-a-tech-whiz kid isn't adhering to your house's sexual mores? Solved. Bonus points if your kid's phone outs them as LGBT. To say nothing of most sexual abuse of minors happening at the hands of someone they trust. Will their phone, when they attempt to share evidence, tattle on them to their abuser?

Also, can't wait for Dads' photos of their kids landing them on a national kiddie porn watch list.

js2 · 4 years ago
> designate your partner as a child.

That's not how it works, unless you control your partner's Apple ID and you lie about their DOB when you create their account.

I created my kids' Apple IDs when they were minors and enrolled them in Family Sharing. They are now both over 18 and I cannot just designate them as minors. Apple automatically removed my ability to control any aspect of their phones when they turned 18.

> Dads' photos of their kids landing them on a national kiddie porn watch list.

Indeed, false positives are much more worrying. The idea that my phone is spying on my pictures... like, what the hell.

suizi · 4 years ago
Moving the scanning to the client side is clearly an attempt to move towards scanning content which is about to be posted on encrypted services; otherwise they could do it on the server side, which is "not categorically unprecedented".
NotPractical · 4 years ago
> can be turned off by turning off iCloud back-ups

Until they push a small change to the codebase...

  @@ -7637,3 +7637,3 @@
  -if (photo.isCloudSynced && scanForIllegalContent(photo)) {
  +if (scanForIllegalContent(photo)) {
       reportUserToPolice();
   }

walterbell · 4 years ago
Photosync can automatically move photos from iDevices into consolidated NAS, SFTP, cloud or iXpand USB storage, https://photosync-app.com

GoodReader has optional app-level file encryption with a password that is not stored in the iOS keychain. In theory, those encrypted files should be opaque to device backups or local filesystem scanning, unless iOS or malware harvests the key from runtime memory, https://goodreader.com/
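
A generic sketch of what app-level, password-based file encryption like that can look like, using Python's "cryptography" package; this only illustrates the approach (derive the key from the password at unlock time and never persist it), not GoodReader's actual implementation, and the paths and iteration count are made up:

  import base64, os
  from cryptography.fernet import Fernet
  from cryptography.hazmat.primitives import hashes
  from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

  def key_from_password(password: bytes, salt: bytes) -> bytes:
      # The key exists only in memory; neither it nor the password is written anywhere.
      kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt, iterations=600_000)
      return base64.urlsafe_b64encode(kdf.derive(password))

  def encrypt_file(path: str, password: bytes) -> None:
      salt = os.urandom(16)
      token = Fernet(key_from_password(password, salt)).encrypt(open(path, "rb").read())
      with open(path + ".enc", "wb") as f:
          f.write(salt + token)  # only the salt and ciphertext are persisted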

mojzu · 4 years ago
If The Verge's article is accurate about how and when the CSAM scanning occurs, then I don't have a problem with that; it sounds like they're moving the scanning from server side to client side. The concerns about false positives seem valid to me, but I'm not sure the chance of one occurring has increased over the existing iCloud scanning. Scope creep towards other content scanning is definitely a possibility though, so I hope people keep an eye on that.

I'm not a parent, but the other child protection features seem like they could definitely be abused by some parents to exert control over, or pry into, their kids' private lives. It's a shame that systems have to be designed to prevent abuse by bad people, but at Apple's scale it seems like they should have better answers for the concerns being raised.

chinchilla2020 · 4 years ago
The reality is that actual child abusers know who they are. They realize that society is after them. They are already paranoid, secretive people. They are not going to be uploading pictures of their child abuse to the cloud.

And let's not forget the minor detail that this is now public knowledge. It's like telling your teenage son you're going to be searching his closet for marijuana in the future.

donkeyd · 4 years ago
> designate your partner as a child

This is way too much work for hardly any gain. It's easier to just log into another device with their iCloud password and literally read everything they send. Less work, more result.

selykg · 4 years ago
I feel like you’re sensationalizing this a lot.

There’s two functions here. Both client side.

First, machine learning to detect potentially inappropriate pictures for children to view. This seems to require parental controls to be on. Optionally it can send a message to the parent when a child purposefully views the image. The image itself is not shared with Apple so this is notification to parents only.

The second part is a list of hashes. So the Photos app will hash images and compare to the list in the database. If it matches then presumably they do something about that. The database is only a list of KNOWN child abuse images circulating.

Now, not to say I like the second part but the first one seems fine. The second is sketchy in that what happens if there’s a hash collision. But either way it seems easy enough to clear that one up.

No father is going to be added to some list for their children’s photos. Stop with that hyperbole.
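
To make the matching step concrete, here is a minimal sketch of hashing-and-comparing against a known-hash list; the 64-bit values, the Hamming-distance threshold, and the function names are made up for illustration, and real systems (PhotoDNA, Apple's NeuralHash) use perceptual hashes rather than ordinary cryptographic ones:

  # Hypothetical database of known-image hashes (values are placeholders).
  KNOWN_HASHES = {0x9F3A5C7E12D48B60, 0x0123456789ABCDEF}

  def hamming(a: int, b: int) -> int:
      # Number of differing bits between two 64-bit hashes.
      return bin(a ^ b).count("1")

  def matches_known(photo_hash: int, max_distance: int = 4) -> bool:
      # Perceptual hashes of near-identical images differ by only a few bits,
      # so "close enough" to any database entry counts as a match.
      return any(hamming(photo_hash, h) <= max_distance for h in KNOWN_HASHES)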

vineyardmike · 4 years ago
> as a queer kid, I was terrified of my parents finding out

I think many queer people have a completely different idea of the concept of "why do you want to hide if you're not doing anything wrong" and the desire to stay private. Especially since anything sexual and related to queerness is way more aggressively policed than hetero-normative counterparts.

Anything "think of children" always has a second order affect of damaging queer people because lots of people still think of queerness as dangerous to children.

It is beyond likely that lots of this monitoring will catch legal/safe queer content - especially the parental-controls focused monitoring (as opposed to the gov'ment db of illegal content)

heavyset_go · 4 years ago
> Anything "think of children" always has a second-order effect of damaging queer people because lots of people still think of queerness as dangerous to children.

For example, YouTube does this with some LGBT content. YouTube has demonetized LGBT content and placed it in restricted mode, which screens for "potentially mature" content[1][2].

YouTube also shadowbans the content[1], preventing it from showing up in search results at all.

From here[1]:

> Filmmaker Sal Bardo started noticing something strange: the views for his short film Sam, which tells the story of a transgender child, had started dipping. Confused, he looked at the other videos on his channel. All but one of them had been placed in restricted mode — an optional mode that screens “potentially mature” content — without YouTube informing him. In July of that year, most of them were also demonetized. One of the videos that had been restricted was a trailer for one of his short films; another was an It Gets Better video aimed at LGBTQ youth. Sam had been shadow-banned, meaning that users couldn’t search for it on YouTube. None of the videos were sexually explicit or profane.

There are more examples like that here[2].

[1] https://www.rollingstone.com/culture/culture-features/lgbtq-...

[2] https://www.washingtonpost.com/technology/2019/08/14/youtube...

userbinator · 4 years ago
Especially since anything sexual and related to queerness is way more aggressively policed than hetero-normative counterparts.

I find it intensely ironic that Apple's CEO is openly gay.

bambax · 4 years ago
> probably good in the moral sense

How, how is it even morally good?? Will they start taking pictures of your house to see if you store drugs under your couch? Or cook meth in your kitchen??

What is moral is for society to be in charge of laws and law enforcement. This vigilante behavior by private companies who answer to no one is unjust, tyrannical and just plain crazy.

tekknik · 4 years ago
> Will they start taking pictures of your house to see if you store drugs under your couch? Or cook meth in your kitchen??

How many people have homepods? When will they start listening for illegal activity?

cle · 4 years ago
Unfortunately, with SafetyNet, I feel like an investment in Android is also a losing proposition... I can only anticipate being slowly cut off from the Android app ecosystem as more apps onboard with attestation.

We've collectively handed control of our personal computing devices over to Apple and Google. I fear the long-term consequences of that will not be positive...

techrat · 4 years ago
Losing sight of the forest for this one tree.

1) Google doesn't release devices without unlockable bootloaders. They have always been transparent in allowing people to unlock their Nexus and Pixels. Nexus was for developers, Pixels are geared towards the end user. Nothing changed with regards to the bootloaders.

2) Google uses Coreboot for their ChromeOS devices. Again, you couldn't get more open than that if you wanted to buy a Chromebook and install something else on it.

3) To this day, app sideloading on Android remains an option. They've even made it easier for third-party app stores to automatically update apps with Android 12.

4) AOSP. Sure, it doesn't have all the bells and whistles of the latest and greatest packaged-up skin and OS release, but all of the features that matter within Android, especially if you're going to de-Google yourself, are still there.

Take any one of those points, let alone all four, and I have trouble understanding why people think "REEEEEEEE Google."

So you can't play with one ball in the garden (SafetyNet); you've still got the rest of the toys. That's a compromise I'm willing to accept in order to be able to do what I want to do, and how I want to do it (e.g., rooting or third-party ROMs).

If you don't like what they do on their mobile OS, there's nothing Google is doing to lock you into a walled garden where the only option you have is to completely give up what you're used to...

...Unlike Apple. Not one iOS device has been granted an unlockable bootloader. Ever.

heavyset_go · 4 years ago
> We've collectively handed control of our personal computing devices over to Apple and Google

Hey now, the operating system and app distribution cartels include Microsoft, too.

trangus_1985 · 4 years ago
I don't think it's implausible that I carry around a phone that has mail, contacts, calendars, photos, and private chat on it. And then, have a second, older phone that has like Instagram and mobile games. It's tragic.
_red · 4 years ago
Yes, my history was Linux 95-04, Mac 04-15, and now back to Linux from 2015 onwards.

It's been clear Tim Cook was going to slowly harm the brand. He was a wonderful COO under a visionary CEO type, but he holds no particular "Tech Originalist" vision. He's happy to be part of the BigTech aristocracy, and probably feels really at home with the powers it affords him.

Anyone who believes this is "just about the children" is naive. His Chinese partners will use this to crack down on "Winnie the Pooh" cartoons and the like... before long, questioning any Big Pharma product will result in being flagged. Give it five years at most.

Dead Comment

forgingahead · 4 years ago
This can happen only because whenever any slippery-slope action has been taken previously, an army of apologists and "explainers" rushes to "correct" your instinctive aversion to these changes. It's always the same - the initial comment is seemingly kind, yet with an underlying menace, and if you continue to express opposition, they change tack to being extremely aggressive and rude.

See the comment threads around this topic, and look back to other related events (notably the tech giants censoring people "for the betterment of society" in the past 12 months).

Boiling a frog may happen slowly, but the water continues to heat up even if we pretend it doesn't. Very disappointed with this action by Apple.

raxxorrax · 4 years ago
This is typical obedient behavior. Some abused spouses go to great lengths to come up with excuses for their partners. Since I don't own an iOS device, I don't really care about this specific instance.

But I don't want these people normalizing deep surveillance, and I fear that I'll have to get rid of my OS X devices if this trend continues.

Andrew_nenakhov · 4 years ago
Signal is still a centralised data silo where, by default, you trust the CA to verify your contacts' identities.
trangus_1985 · 4 years ago
Yeah, but it's also useful for getting my friends on board. I think it's likely that I eventually start hosting Matrix or some alternative, but my goal is to be practical here, yet still have a privacy-protecting posture.
chimeracoder · 4 years ago
> Signal is still a centralised data silo where, by default, you trust the CA to verify your contacts' identities.

You can verify the security number out-of-band, and the process is straightforward enough that even nontechnical users can do it.

That's as much as can possibly be done, short of an app that literally prevents you from communicating with anyone without manually providing their security number.

taurath · 4 years ago
If my parents had had the feature to be alerted about porn on their kid's device while I was a teen, they would have sent me to a conversion camp, and that is not an exaggeration.

Apple thinks the appropriate time for queer kids to find themselves is after they turn 18.

neop1x · 4 years ago
Maybe Apple will decrease child abuse cases but increase cases of child suicide...
mrtranscendence · 4 years ago
If you're just downloading and looking at porn, no problem. It only becomes an issue if you're sharing porn via Messages or storing it in iCloud. And to be fair, I don't think they're alerted to the nature of the pornography, so you might be able to avoid being outed even if you're sharing porn (or having porn shared with you).

Edit: I'm wrong in one respect: if the kid under 13 chooses to send a message with an explicit image despite being warned via notification, the image will be saved to a parental controls section. This won't happen for children >= 13.

LazyR0B0T · 4 years ago
Organic Maps on F-Droid is a really clean OSM-based map.
Sunspark · 4 years ago
I'm impressed; it actually has smooth scrolling, unlike OsmAnd, which is very slow at loading tiles.

Critical points I'd make about Organic Maps: I'd want a lower inertia setting so it scrolls faster, and a different color palette... they are using muddy tones of green and brown.

crocodiletears · 4 years ago
Does it let you select from multiple routes? I've been using Pocketmaps, but it only gives you a single option for routing, which can lead to issues in certain contexts
m-p-3 · 4 years ago
And I also invite everyone to contribute to OSM through StreetComplete; it's quite intuitive and it gives you something to look for when taking a walk.
JackGreyhat · 4 years ago
Nearly the same as MagicEarth...I use it all the time.
2OEH8eoCRo0 · 4 years ago
>While I still use google maps

You can still use Google Maps without an account and "incognito". I wish they'd allow app store usage without an account though - similar to how any Linux package manager works.

trangus_1985 · 4 years ago
That's not really the issue. The issue is that for Google Maps to work properly, it requires that Play Services be installed. Play Services is a massive, semi-monolithic blob that requires tight integration with Google's backend and deep, system-level permissions to operate correctly.

I'm not worried about my search history.

opan · 4 years ago
In addition to F-Droid, you can get Aurora Store (which is on F-Droid) which lets you use an anonymous login to get at the Play Store. I use it for a couple free software apps that aren't on F-Droid for some reason.
artimaeis · 4 years ago
It's not the device that's less secure or private in this context, it's the services. There's no reason you couldn't just continue using your NAS for photo backup and Signal for encrypted-communications completely unaffected by this.

Apple seems to not have interest in users' devices, which makes sense -- they're not liable for them. They _do_ seem interested in protecting the data that they house, which makes sense, because they're liable for it and have a responsibility to remove/report CSAM that they're hosting.

adriancr · 4 years ago
So they should do that scanning server-side at their boundary instead of pushing software to run on phones, with the potential to extend its scope later if there is no pushback.
trangus_1985 · 4 years ago
That's not the issue. The issue is that they have shipped spyware to my device. That's a massive breach of trust.

I suspect that this time next year, I'll still be on iOS, despite my posturing. I'm certainly going to address iCloud in the next few weeks - specifically, by no longer using it. However, I would be surprised if I'm still on iOS a year or two after that.

What Apple has done here isn't horrible in the absolute sense. Instead, it's a massive betrayal of trust with minimal immediate intrusiveness, and yet a giant klaxon that their platform dominance in terms of privacy is coming to an end.

Deleted Comment

GeekyBear · 4 years ago
> with icloud photos csam, it is also a horrifying precedent

That precedent was set many years ago.

>a man [was] arrested on child pornography charges, after Google tipped off authorities about illegal images found in the Houston suspect’s Gmail account.

Microsoft’s “PhotoDNA” technology is all about making it so that these specific types of illegal images can be automatically identified by computer programs, not people.

PhotoDNA converts an image into a common black-and-white format and sizes the image to a uniform size, Microsoft explained last year while announcing its increased efforts at collaborating with Google to combat online child abuse.

https://techcrunch.com/2014/08/06/why-the-gmail-scan-that-le...
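
For a sense of what that kind of normalization looks like in practice, here is a rough sketch of the common "difference hash" (dHash) technique: grayscale, resize to a fixed small size, then derive bits from relative pixel brightness. PhotoDNA itself is proprietary, so this is only an analogous illustration, and the size parameter is arbitrary:

  from PIL import Image  # pip install Pillow

  def dhash(path: str, size: int = 8) -> int:
      # Normalize: grayscale ("L") and a uniform (size+1) x size resolution.
      img = Image.open(path).convert("L").resize((size + 1, size))
      pixels = list(img.getdata())
      bits = 0
      for row in range(size):
          for col in range(size):
              left = pixels[row * (size + 1) + col]
              right = pixels[row * (size + 1) + col + 1]
              bits = (bits << 1) | (1 if left > right else 0)
      return bits  # a 64-bit hash for size=8; similar images give similar bits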

trangus_1985 · 4 years ago
cloud versus local device is a massive distinction imo. or maybe im a dinosaur ;)
Saris · 4 years ago
I think no matter what devices you use, you've nailed down the most important part, which is using apps and services that are flexible and can easily be used on another platform.
trangus_1985 · 4 years ago
I knew that eventually it'd probably matter what devices I used, I just didn't expect it to be so soon.

But yeah, I could reasonably use an iphone without impact for the foreseeable future with some small changes.

samstave · 4 years ago
What I am reminded of is all of the now seemingly prophetic writing and storytelling in a lot of cyberpunk-dystopian anime about the future of the corporate state, and how megacorps rule EVERYTHING.

What I always thought was interesting was that the Police Security Services in Singapore were called "CISCO" -- and you used to see these SWAT-APV-type vans driving around, and armed men with CISCO emblazoned on their gear/equipment/vehicles...

I always was reminded of Cyberpunk Anime around that.

m4rtink · 4 years ago
Interesting! But actually this is not the only thing with an "interesting" name in Singapore - well, at least as long as you speak Czech. ;-)

You see, mass transit in Singapore is handled by the Singapore Mass Rapid Transit company, abbreviated SMRT. There is also an SMRT Corporation (https://en.wikipedia.org/wiki/SMRT_Corporation) and SMRT buses, and the SMRT abbreviation is heavily used on trains, stations, basically everywhere.

Well, in Czech "smrt" literally means death. So let's say that for Czech speakers, riding public transport in Singapore can be a bit unnerving - you stand on a station platform and then a train with "DEATH" written on it in big letters pulls into the station. ;-)

biztos · 4 years ago
I’ve been thinking about switching my main email to Fastmail from Apple, for portability in case the anti-power-user trend crosses my personal pain threshold.

But if your worry is governments reading your mail, is an email company any safer? I’m sure FM doesn’t want to scan your mail for the NSA or its Australian proxy, but do they have a choice? And if they were compelled, would they not be prevented from telling you?

“We respect your privacy” is exactly what Apple has been saying.

dustyharddrive · 4 years ago
I think self-hosting email has too many downsides (spam filtering, for example) to be worth it; I’m more concerned about losing my messages (easily solved with POP or mbox exports while still using a cloud account) than government data sharing. Email is unencrypted in transit anyway, and it’s “industry standard” to store it in clear text at each end.
trangus_1985 · 4 years ago
> if your worry is governments reading your mail

It's complicated. As long as they require a reasonable warrant (ha!), I'm fine. Email is an inherently insecure protocol and ecosystem anyway.

I haven't used email for communication that I consider to be private for a while - I've moved most, if not all, casual conversation to Signal and iMessage. Soon, I hope to add something like Matrix or Mattermost into the mix.

My goal was never to be perfect. My goal is to be able to easily remove myself from an invasive spyware ecosystem, and bring my friends along, with minimal impact.

neop1x · 4 years ago
I have been self-hosting email for 7 years successfully. But it required a physical server in a reputable datacenter and setting up Dovecot, Exim, SpamAssassin, reverse DNS, SPF, and DKIM (example records sketched below). It took a bit of time to gain IP reputation, but then it has worked flawlessly since. Occasionally some legit mail is flagged as spam or vice versa, but it is not worse than any other mail provider. So it can be done! But my first attempts to do it on a VPS failed, as the IP blocks of VPS providers are often hopelessly blacklisted by major email providers.
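
The DNS side of that setup boils down to a few records; the domain, selector, IP address, and truncated key below are hypothetical placeholders, just to show the shape of the SPF, DKIM, and reverse-DNS entries:

  ; SPF: only this host may send mail for the domain
  example.org.                  IN TXT "v=spf1 ip4:203.0.113.25 -all"
  ; DKIM: public key for the "mail" selector (key truncated)
  mail._domainkey.example.org.  IN TXT "v=DKIM1; k=rsa; p=MIIBIjANBgkq..."
  ; Reverse DNS: the sending IP must resolve back to the mail host
  25.113.0.203.in-addr.arpa.    IN PTR mail.example.org.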
mackrevinack · 4 years ago
There's always ProtonMail, which is supposedly E2E, so they shouldn't be able to scan your mail.
vineyardmike · 4 years ago
unfortunately, self hosting is a pretty clear alternative. Not much else seems to be.
alksjdalkj · 4 years ago
Have you found any decent Google Maps alternatives? I'd love to find something, but nothing comes close as far as I've found. Directions that take traffic into account are the big thing that I feel nobody (other than Apple, MS, etc.) will be able to replicate.

Have you tried using the website? I've had some luck with that on postmarketOS, and it means you don't need to install Play services to use it.

nickexyz · 4 years ago
Organic maps is pretty good: https://github.com/organicmaps/organicmaps
manuelmagic · 4 years ago
I've been using HERE Maps for many years: https://wego.here.com/
krobbn · 4 years ago
I really like HERE WeGo, and it also allows you to download maps for specific countries to have them available offline.
beermonster · 4 years ago
OsmAnd
OJFord · 4 years ago
> While I still use google maps

I use Citymapper simply because I find it better (for the city-based journeys that are my usual call for a map app) - but it not being a Google ~data collection device~ service is no disadvantage.

At least, depending why you dislike having everything locked up with Google or whoever I suppose. Personally it's more having everything somewhere that troubles me, I'm reasonably happy with spreading things about. I like self-hosting things too, just needs a value-add I suppose, that's not a reason in itself for me.

qwerty456127 · 4 years ago
> While I still use google maps, I've been trialing out OSM alternatives for a minute.

Is there a way to set up Android to handle shared locations without Google Maps?

Every time someone shares location with me (in Telegram) it displays as a tiny picture and once I click it it says I have to install Google Maps (I use an alternative for actual maps and don't have Google Maps installed). So I end up zooming the picture and then finding the location on the map manually.

rStar · 4 years ago
> it is also a horrifying precedent that the device I put my life into is scanning my photos and reporting on bad behavior

Apple's new customers are the various autocratic regimes that populate the earth. Apple's customers used to be human beings. There exist many profiteers in Mountain View, Cupertino, Menlo Park, and Atherton in the service of making our monopolies more capable of subjugating humanity.

peakaboo · 4 years ago
I also use Fastmail, but I'm fully aware that Australia, where it's hosted, is part of the Five Eyes spy network, and is also one of the countries acting extremely oppressively towards its citizens when it comes to COVID restrictions.

So I don't actually expect my mail to be private. But at least it's not Google.

robjan · 4 years ago
Fastmail is hosted in New Jersey. If it were hosted in Australia, the user experience would be pretty bad for most of its users.
ekianjo · 4 years ago
Signal is next on the list since it's a centralized solution - you can expect they will come for it next.
trangus_1985 · 4 years ago
I'm just trying to buy time until open source and secure alternatives have addressed these problems. Apple doing this has moved my timeframes up by a few years (unexpectedly).

Deleted Comment

TheRealDunkirk · 4 years ago
> I hope they recant

This is very much like driving a car through a crowd of protestors. They will slowly, inexorably, eventually push through.

paulcarroty · 4 years ago
> Fortunately, my email is on a paid provider

Paid doesn't mean more secure; that's a popular mistake.

_arvin · 4 years ago
I'm really loving fastmail. Thanks for the heads up!
threatofrain · 4 years ago
What is your home NAS setup like?
trangus_1985 · 4 years ago
FreeNAS, with a self-signed, tightly-scoped CA installed on all of my devices. 4 x 1 TB drives in a small case shoved under the stairs.

tbh, i would vastly prefer to use a cloud based service with local encryption - I'm not super paranoid, just overly principled

SOMA_BOFH · 4 years ago
How does Apple protect against hash collisions?
jeromegv · 4 years ago
It doesn't trigger on a single match; I guess that's the first line of defence.

Dead Comment

Deleted Comment

Dead Comment

Dead Comment

Dead Comment

gowld · 4 years ago
> I'm not sure that I support what they are implementing for child accounts (as a queer kid, I was terrified of my parents finding out)

If you don't want your parents to look at your phone, you shouldn't be using a phone owned by your parent's account. The new feature doesn't change this calculus.

As a queer kid, would you enjoy being blackmailed by someone who tricked you into not telling your parents?

triska · 4 years ago
I remember an Apple conference where Tim Cook personally assured us that Apple is fully committed to privacy, that everything is so secure because the iPhone is so powerful that all necessary calculations can happen on the device itself, and that we are "not the product". I think the Apple CEO said some of this in the specific context of speech processing, yet it seemed a specific case of a general principle upheld by Apple.

I bought an iPhone because the CEO seemed to be sincere in his commitment to privacy.

What Apple has announced here seems to be a complete reversal from what I understood the CEO saying at the conference only a few years ago.

Klonoar · 4 years ago
I think the EFF is probably doing good by calling attention to the issue, but let's... actually look at the feature before passing judgement, e.g:

https://twitter.com/josephfcox/status/1423382200880439298/ph...

- It's run for Messages in cases where a child is potentially viewing material that's bad.

- It's run _before upload to iCloud Photos_ - where it would've already been scanned anyway, as they've done for years (and as all other major companies do).

To me this really doesn't seem that bad. Feels like a way to actually reach encrypted data all around while still meeting the expectations of lawmakers/regulators. Expansion of the tech would be something I'd be more concerned about, but considering the transparency of it I feel like there's some safety.

https://www.apple.com/child-safety/ more info here as well.

ElFitz · 4 years ago
True. But, first, it also means anyone, anywhere, as long as they use iOS, is vulnerable to what the US considers to be proper. Which, I will agree, likely won't be an issue in the case of child pornography. But there's no way to predict how that will evolve (see Facebook's ever-expanding imposition of American cultural norms and puritanism).

Next, it also means they can do it. And if it can be done for child pornography, why not terrorism? And if it can be done for the US' definition of terrorism, why not China's, Russia's, or Saudi Arabia's? And if terrorism and child pornography, why not drug consumption? Tax evasion? Social security fraud? Unknowingly talking with the wrong person?

Third, there apparently is transparency on it today. But who is to say it's possible expansion won't be forcibly silenced in the same way Prism's requests were?

Fourth, and that's only because I'm slightly a maniac: how can anyone unilaterally decide to waste the computing power, battery life, and data plan of a device I paid for, without my say-so? (Probably one of my main gripes with ads.)

All in all, it means I am incorporating into my everyday life a device that can and will actively snoop on me and potentially snitch on me. Now, while I am not worried today, it definitely paves the way for many other things. And I don't see why I should trust anyone involved to stop here or let me know when they don’t.

randcraw · 4 years ago
So your argument is, if you've done nothing wrong, you have nothing to worry about. Really? Will you feel the same when Apple later decides to include dozens more crimes that they will screen for, surreptitiously? All of which are searches without warrants or legal oversight?

Let me introduce you to someone you should know better. His name is Edward Snowden. Or Louis Brandeis, who is spinning in his grave right about now.

The US Fourth Amendment exists for a damned good reason.

vimy · 4 years ago
Teens are also children. Apple has no business checking if they send or receive nude pics. Let alone tell their parents. This is very creepy behavior from Apple.

Edit: I'm talking about this https://pbs.twimg.com/media/E8DYv9hWUAksPO8?format=jpg&name=...

karaterobot · 4 years ago
Since nobody would ever object to it, protecting against child abuse gets used as a wedge. As the article points out, the way this story ends is with this very backdoor getting used for other things besides preventing child abuse: anything the government asks Apple to give them. It's an almost inevitable consequence of creating a backdoor in the first place, which is why you have to have a zero-tolerance policy against it.
lights0123 · 4 years ago
My big issue is what it opens up. As the EFF points out, it's really not a big leap for oppressive governments to ask Apple to use the same tech (as demoed by using MS's tech to scan for "terrorist" content) to remove content they don't like from their citizens' devices.
mapgrep · 4 years ago
> It's run _before upload to iCloud Photos_ - where it would've already been scanned anyway

Right, so ask yourself, why is it on the device? Why not just scan on the server?

To me (agreeing with much of the commentary I've seen) the likeliest answer is that they are confining the scan to pre-upload content now not for any technical reason, but to make the rollout palatable to the public. Then they're one update away from quietly changing the rules. There's absolutely no reason to do the scan on your private device if they plan to confine this only to stuff they could scan away from your device.

aaomidi · 4 years ago
> - It's run _before upload to iCloud Photos_ - where it would've already been scanned anyway, as they've done for years (and as all other major companies do).

Then why build this functionality at all? Why not wait until it's uploaded and check it on their servers and not run any client side code? This is how literally every other non-encrypted cloud service operates.

thesimon · 4 years ago
"Feels like a way to actually reach encrypted data all around while still meeting the expectations of lawmakers/regulators"

And isn't that a problem? Encrypted data should be secure, even if lawmakers don't want math to exist.

kps · 4 years ago
> considering the transparency of it

What transparency? Apple doesn't publish iOS source.

achow · 4 years ago
There are always scenarios that one cannot catch. EFF highlights one such.

It sounds like it could be quite common. And it could be an absolute nightmare scenario for the kid who does not have the feature turned on.

This means that if—for instance—a minor using an iPhone without these features turned on sends a photo to another minor who does have the features enabled, they do not receive a notification that iMessage considers their image to be “explicit” or that the recipient’s parent will be notified. The recipient’s parents will be informed of the content without the sender consenting to their involvement. Additionally, once sent or received, the “sexually explicit image” cannot be deleted from the under-13 user’s device.

therealmarv · 4 years ago
Now it will be "before upload". In 1-2 years it will be "scan all local photos" in the name of "making the world a better place". It's such a small technical step for Apple to change this scanning behaviour in the future and scan even offline photos. All the necessary software will already be on all Apple i-devices by then.

Everybody with photos on their phone is a potential criminal unless they prove otherwise by being scanned. This is the future we are heading towards. Doing the scanning on-device is actually the weakest point of their implementation, IMHO.

mulmen · 4 years ago
This seems even worse. If the images are only scanned before upload to iCloud then Apple has opened a backdoor that doesn’t even give them any new capability. If I am understanding this right an iPhone can still be used to distribute CSAM as long as the user is logged out of iCloud? So it’s an overreach and ineffective?

Deleted Comment

dabbledash · 4 years ago
The point of encrypted data is not to be “reached.”
xienze · 4 years ago
> Expansion of the tech would be something I'd be more concerned about

Yeah, and that’s precisely what will happen. It always starts with child porn, then they move on to “extremist content”, of which the term expands to capture more things on a daily basis. Hope you didn’t save that “sad Pepe” meme on your phone.

wayneftw · 4 years ago
It runs on my device and uses my CPU, battery time and my network bandwidth (to download/upload the hashes and other necessary artifacts).

I'd be fine with them scanning stuff I uploaded to them with their own computers, because I don't have any real expectation of privacy from huge corporations.

suizi · 4 years ago
As many, many people have pointed out, building a mechanism to scan things client-side is something which could easily be extended to encrypted content, and perhaps, is intended to be extended at a moment's notice to encrypted content, if they see an opportunity to do so.

It's like having hundreds of nukes ready for launch, as opposed to having the first launch being a year away.

If they wanted to "do it as all major companies do", then they could have done it on the server-side, and there wouldn't have been a debate about it at all, although it is still extremely questionable, as far as privacy is concerned.

nerdponx · 4 years ago
The cynical take is that Apple was never committed to privacy in and of itself, but they are committed to privacy as long as it improves their competitive advantage, whether by marketing or by making sure that only Apple can extract value from its customers' data.

Hanlon's razor does not apply to megacorporations that have enormous piles of cash and employ a large number of very smart people, who are either entirely unscrupulous or for whom scruples are worth less than their salaries. We probably aren't cynical enough.

I am not arguing that we should always assume every change is always malicious towards users. But our index of suspicion should be high.

hpen · 4 years ago
I've always been convinced that Apple cared about privacy as a way of competitive advantage. I don't need them to be committed morally or ethically, I just need them to be serious about it because I will give them my money if they are.
duped · 4 years ago
What competitive advantage does performing semantic hashing of my photos for law enforcement give Apple?
withinboredom · 4 years ago
I’d say you’re spot on, but I can’t say why.
cronix · 4 years ago
As soon as Cook became CEO, he let the NSA's Prism program into Apple. Everything since then has been a fucking lie.

> Andrew Stone, who worked with Jobs for nearly 25 years, told the site Cult of Mac last week that Steve Jobs resisted letting Apple be part of PRISM, a surveillance program that gives the NSA access to records of major Internet companies. His comments come amid speculation that Jobs resisted cooperating. “Steve Jobs would’ve rather died than give into that,” Stone told the site.

> According to leaked NSA slides about PRISM, Apple was the last tech behemoth to join the secret program — in October 2012, a year after Jobs died. Apple has said that it first heard about PRISM on June 6 of this year, when asked about it by reporters.

https://www.huffpost.com/entry/apple-nsa-steve-jobs_n_346132...

I mean, maybe they didn't call it "PRISM" when talking about it with Cook, so it could technically be true that they didn't hear of PRISM until media stories. Everyone knows the spy agency goes around telling all of their project code names to companies they're trying to compromise. Hello, sir. We're here to talk to you about our top secret surveillance program we like to call PRISM where we intercept and store communications of everyone. Would you like to join? MS did. So did Google. Don't you want to be in our select cool club?

ksec · 4 years ago
Tim Cook doesn't lie; I think he has convinced himself that what he said wasn't lying, that Apple and he himself are so righteous. Which is actually worse, because that mentality filters down from top to bottom, and it shows in their marketing and PR messages. He is also following exactly Steve Jobs's last advice to him: do the right thing. Except "the right thing" is so ambiguous it may turn out to be some of the worst advice.

My biggest turning point was Tim Cook flat out lying in the Apple case against Qualcomm. Double dipping? Qualcomm's patents being worth more than double all of the other six combined? And the tactics they used in court, which were vastly different from the Apple vs. Samsung case. And yes, they lost (or settled).

It is the same with privacy. They simplify their PR message to tracking = evil; tracking is invading your privacy. Which is all good. But at the same time Apple is tracking you: everything you do on Apple Music, Apple TV+, the App Store, and even Apple Card. (They only promise not to sell your data to third parties; they still hold some of that data.) What that means is that only Apple is allowed to track you, but anyone else doing it is against privacy? What Apple really means by the word "privacy", then, is that data should not be sold to third parties. But no, they intentionally keep it unclear and have created a war on data collection while doing it themselves. And you now have people flat-out claiming Apple doesn't collect any data.

Then there is the war on ads. Which got so bad the ad industry pushed back and Tim Cook had to issue a mild statement saying they are not against ads, only targeted ads. What?

Once you start questioning all of his motives, and find concrete evidence that he is lying, along with all the facts from court cases about Apple's long-term plans to destroy other companies, it all lines up and shapes how you view Tim Cook's Apple. And it isn't pretty.

And that is coming from an Apple fan of more than two decades.

nonbirithm · 4 years ago
What I want to know is why they decided to implement this. Are Apple just trying to appear virtuous and took action independently? Or was this done at someone else's request?

For all the rhetoric about privacy coming from Apple, I feel that such an extreme measure would surely cause complaints from anyone deeply invested in privacy. And maybe they're just using words like "significant privacy benefits compared to previous techniques" to make it sound reasonable to the average user who's not that invested in privacy.

JohnFen · 4 years ago
> because the CEO seemed to be sincere in his commitment to privacy.

The sincerity of a company officer, even the CEO, should not factor into your assessment. Officers change over time (and individuals can change their stance over time), after all.

dilap · 4 years ago
There was a funny, tiny thing that happened a few years back that made me think Tim Cook is a liar.

It was back when Apple had just introduced the now-abandoned Force Touch feature (i.e., pressure-sensitive touch, since it turns out pushing hard on an unyielding surface is not very pleasant or useful).

To showcase the capability, Apple had updated many of its apps with new force-touch features. One of these was Mail: if you pushed just right on the subject line of a message, you'd get a tiny, unscrollable popout preview of its contents.

It was totally useless: it took just as much time to force touch to see the preview as just normally tapping to view the message, and the results were less useful. It was also fairly fiddly: if you didn't press hard enough, you didn't get the preview; if you pressed too hard, it would open into the full email anyway.

So Tim Cook, demoing the feature, said a funny thing. He said, "It's great, I use it all the time."

Which maybe, just maybe, is true, but personally I don't believe, not for a second.

So since then, I've had Tim down in my book as basically a big liar.

boardwaalk · 4 years ago
If that's your bar for labeling someone a big liar, I surely don't wanna know ya.

I actually use this feature pretty regularly in Safari, even if it's a long press rather than force touch now.

avnigo · 4 years ago
I’m still waiting on iCloud backup encryption they promised a while back. There were reports that they scrapped those plans because the FBI told them to, but nothing official announced since 2019 on this.
minsc__and__boo · 4 years ago
Yet Apple gave the Chinese government access to all Chinese users' iCloud data, including messages, emails, pictures, etc.

NYT's The Daily had an episode where they talked about how the CCP is getting Apple to bend its commitment to privacy:

https://www.nytimes.com/2021/06/14/podcasts/the-daily/apple-...

rudian · 4 years ago
Last I heard on HN was that it was scrapped entirely as a consequence of some event I don’t remember. I hope someone has a better memory than I do.
BiteCode_dev · 4 years ago
So you mean the company that was part of PRISM, that has unfair business practices and a bully as a founder, was not really the world savior their marketing speech said they were?

I'm in shock. Multi-billion-dollar companies usually never lie to make money! And power-grabbing entities have such a neat track record in human history.

Not to mention nobody saw this coming or said repeatedly that one should not get locked into such a closed and proprietary ecosystem in the first place.

I mean, dang, this serial killer was such a nice guy. The dead babies in the basement were weird, but apart from that he was a stellar neighbour.

anonuser123456 · 4 years ago
Not really. This only applies to photos uploaded to iCloud. And photos uploaded to iCloud (and Google Drive, etc.) are already scanned on the server for CP.

Apple is moving that process from the server to the phone in a way that protects your privacy better than current standards.

In the current system, all your photos are available to Apple unencrypted. In the new system, nothing will be visible to Apple unless you upload N images with database hits. Only from those N tokens is Apple then able to decrypt your content (the threshold idea is sketched below).

So when this feature lands, it improves your privacy relative to today.
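
A toy sketch of that threshold property, using plain Shamir secret sharing in Python: each matching image releases one "share", and only once at least N shares are available can the per-account secret be reconstructed. Apple's actual design uses private set intersection plus threshold secret sharing with safety vouchers, so the numbers and structure here are purely illustrative:

  import random

  PRIME = 2**127 - 1  # a large prime field for the toy example

  def split_secret(secret: int, threshold: int, num_shares: int) -> list:
      # Shamir-style split: any `threshold` shares reconstruct the secret,
      # fewer reveal (essentially) nothing about it.
      coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
      def poly(x: int) -> int:
          return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
      return [(x, poly(x)) for x in range(1, num_shares + 1)]

  def reconstruct(points: list) -> int:
      # Lagrange interpolation at x = 0 over the prime field.
      secret = 0
      for i, (xi, yi) in enumerate(points):
          num, den = 1, 1
          for j, (xj, _) in enumerate(points):
              if i != j:
                  num = (num * -xj) % PRIME
                  den = (den * (xi - xj)) % PRIME
          secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
      return secret

  account_key = 123456789
  shares = split_secret(account_key, threshold=30, num_shares=1000)
  assert reconstruct(shares[:30]) == account_key  # 30 "matches": key recovered
  assert reconstruct(shares[:29]) != account_key  # 29: almost surely still hidden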

rantwasp · 4 years ago
Nah, this is not how trust works. If Apple does stuff like this, I stop trusting Apple. Binary. Trust or no trust.

Who is to say once they start doing this they will not extend their capabilities and monitor everything on the device? This is the direction we’re heading in.

For me, this is my last iPhone. And probably my last Mac. The hardware is nice, shiny, and usable, but you cannot do shit like this after you sell everyone on privacy.

What would a company that cares about privacy do? You don't scan any of my things without explaining why and getting my consent. That's privacy.

taxyovio · 4 years ago
Do you have any references for your remarks on the current situation where all iCloud photos are scanned on the server side?
nyolfen · 4 years ago
What happens when a government legally forces them to look for politically dissident content? They have already lost this fight; it is an inevitability.
samstave · 4 years ago
Don't worry - you can trust ALL of these guys:

https://i.imgur.com/z3JeRgk.jpg

blakeinate · 4 years ago
This year I purchased my first iPhone since the 3G; after today I am starting to regret that decision. At this point, I can only hope Linux on mobile picks up steam.
robertoandred · 4 years ago
Except the hashing and hash comparison are happening on the device itself.
zionic · 4 years ago
That’s even worse
dylan604 · 4 years ago
It is secure, as long as you have nothing to hide. If you have no offending photos, then the data won't be uploaded! See, it's not nefarious at all! /s
mtgx · 4 years ago
It's all been downhill since we heard that they stopped developing the e2e encrypted iCloud solution because it might upset the FBI even more.
c7DJTLrn · 4 years ago
Catching child pornographers should not involve subjecting innocent people to scans and searches. Frankly, I don't care if this "CSAM" system is effective - I paid for the phone, it should operate for ME, not for the government or law enforcement. Besides, the imagery already exists by the time it's been found - the damage has been done. I'd say the authorities should prioritise tracking down the creators but I'm sure their statistics look much more impressive by cracking down on small fry.

I've had enough of the "think of the children" arguments.

burself · 4 years ago
The algorithms and data involved are too sensitive to be discussed publicly, and the reasoning is acceptable enough to even the most knowledgeable people. They can't even be pressured to prove that the system is effective at its primary purpose.

This is the perfect way to begin opening the backend doors.

kccqzy · 4 years ago
The algorithm is actually public: https://www.apple.com/child-safety/pdf/Apple_PSI_System_Secu... From an intellectual point of view it's interesting to learn about.

I agree with the rest of your points. The problem is that we don't know if Apple implemented this algorithm correctly, or even this algorithm at all, because the source code isn't subject to review and, even if it were, the binary cannot be proved to have been built from that source code. We also don't have proof that the only images being searched for are child abuse images, as they claim.

suizi · 4 years ago
Security by obscurity has never been particularly effective, and there are some articles which allege that detection algorithms can be defeated fairly easily.
zionic · 4 years ago
I’m furious. My top app has 250,000 uniques a day.

I’m considering a 24h black out with a protest link to apple’s support email explaining what they’ve done.

I wonder if anyone else would join me?

collaborative · 4 years ago
We need to get organized first. We need a support platform where we can coordinate these types of actions. It's on my todo list, but if anyone can get this started, please do so.
mrits · 4 years ago
There isn't any reason to believe the CSAM hash list is only images. The government now has the ability to search for anything in your iCloud account with this.
2OEH8eoCRo0 · 4 years ago
Why is it always "think of the children"? It gets people emotional? What about terrorism, murder, or a litany of other heinous violent crimes?
falcolas · 4 years ago
I invite you to look up "The Four Horsemen of the Infocalypse". Child pornography is but one of the well-trodden paths to removing privacy and security.
vineyardmike · 4 years ago
"CSAM" is an easy target because people can't see it - it would be wrong for you to audit the db because then you'd need the illicit content. So its invisible to the average law-abiders.
suizi · 4 years ago
France has been pushing terrorism as a justification for mass-surveillance in the E.U.

Deleted Comment

bambax · 4 years ago
Yes. I'm not interested in catching pedophiles, or drug dealers, or terrorists. It's the job of the police. I'm not the police.
adolph · 4 years ago
Yes, if you act as the police you are a vigilante.
anthk · 4 years ago
> the damage has been done. I'd say the authorities should prioritise tracking down the creators

Russian and ex-Soviet countries with human trafficking mafias host several fucked up people who produce this crap.

Deleted Comment

tamrix · 4 years ago
You know it can be used to get the geolocation in the metadata of pictures from people who took photos at protests, etc.
NoPicklez · 4 years ago
I do agree with your points, but I think it's obvious to see that this feature is trying to allow authorities to catch the creators.

Deleted Comment

Dead Comment

geraneum · 4 years ago
Didn’t they [Apple] make the same points that EFF is making now, to avoid giving FBI a key to unlock an iOS device that belonged to a terrorist?

“ Compromising the security of our personal information can ultimately put our personal safety at risk. That is why encryption has become so important to all of us.”

“… We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.”

“ The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.”

Tim Cook, 2016

rubatuga · 4 years ago
Think of the children!!!
Shank · 4 years ago
I really love the EFF, but I also believe the immediate backlash is (relatively) daft. There is a potential for abuse of this system, but consider the following too:

1. PhotoDNA is already scanning content from Google Photos and a whole host of other service providers.

2. Apple is obviously under pressure to follow suit, but they developed an on-device system, recruited mathematicians to analyze it, and published the results, as well as one in-house proof and one independent proof showing the cryptographic integrity of the system.

3. Nobody, and I mean nobody, is going to successfully convince the general public that a tool designed to stop the spread of CSAM is a "bad thing" unless they can show concrete examples of the abuse.

For one and two: given the two options, would you rather that Apple implement serverside scanning, in the clear, or go with the on-device route? If we assume a law was passed to require serverside scanning (which could very well happen), what would that do to privacy?

For three: It's an extremely common trope to say that people do things to "save the children." Well, that's still true. Arguing against a CSAM scanning tool, which is technically more privacy preserving than alternatives from other cloud providers, is an extremely uphill battle. The biggest claim here is that the detection tool could be abused against people. And that very well may be possible! But the whole existence of NCMEC is predicated on stopping the active and real danger of child sex exploitation. We know with certainty this is a problem. Compared to a certainty of child sex abuse, the hypothetical risk from such a system is practically laughable to most people.

So, I think again, the backlash is daft. It's been about two days since the announcement became public (via leaks). The underlying mathematics behind the system has barely been published [0]. It looks like the EFF rushed to make a statement here, and in doing so, it doesn't look like they took the time to analyze the cryptographic system, to consider the attacks against it, or to consider possible motivations and outcomes. Maybe they did, and they had advance access to the material. But it doesn't look like it, and in the court of public opinion, optics are everything.

[0]: https://www.apple.com/child-safety/pdf/Alternative_Security_...

feanaro · 4 years ago
> that a tool designed to stop the spread of CSAM is a "bad thing"

It's certainly said to be designed to do it, but have you seen concerns raised in the other thread (https://news.ycombinator.com/item?id=28068741)? There have been reports from some commenters of the NCMEC database containing unobjectionable photos because they were merely found in a context alongside some CSAM.

Who audits these databases? Where is the oversight to guarantee only appropriate content is included? They are famously opaque because the very viewing of the content is illegal. So how can we know that they contain what they are purported to contain?

This is overreach.

Shank · 4 years ago
> Who audits these databases? Where is the oversight to guarantee only appropriate content is included? They are famously opaque because the very viewing of the content is illegal. So how can we know that they contain what they are purported to contain?

I wholeheartedly agree: there is an audit question here too. The contents of the database are by far the most dangerous part of this equation, malicious or not, targeted or not. I don't like the privacy implications about this, nor the potential for abuse. I would love to see some kind of way to audit the database, or ensure that it's only used "for good." I just don't know what that system is, and I know that PhotoDNA is already in use on other cloud providers.

Matthew Green's ongoing analysis [0] is really worth keeping an eye on. For example, there's a good question: can you just scan against a different database for different people? These are the right questions given what we have right now.

[0]: https://twitter.com/matthew_d_green/status/14233782854682091...

shuckles · 4 years ago
That’s a problem with NCMEC, not Apple’s proposal today. Furthermore, if it were an actual problem, it would’ve already manifested with the numerous current users of PhotoDNA which includes Facebook and Google. I don’t think the database of known CSAM content includes photos that cannot be visually recognized as child abuse.
neop1x · 4 years ago
>> Who audits these databases?

Maybe pedophiles working for those companies.

oh_sigh · 4 years ago
"Reports from commenters" = unsubstantiated speculation. Weird how no one was able to specifically state any information about these unobjectionable photos except for a theoretical mechanism for them to find their way into the database.
randcraw · 4 years ago
You presume Apple and the DoJ will implement this with human beings at each step. They won't. Both parties will automate as much of this clandestine search as possible. With time, the external visibility and oversight of this practice will fade, and with it, any motivation to confirm fair and accurate matches. Welcome to the sloppiness inherent in clandestine law enforcement intel gathering.

As with all politically motivated initiatives that boldly violate the Constitution (consider the FISA Court and its rubber-stamp approval of 100% of the secret warrants put before it), the use and abuse of this system will go largely underground, like FISA, and its utility will slowly degrade due to lack of oversight. In time, even bad matches will log the IDs of both parties in databases that label them as potential sexual predators.

Believe it. That's how modern computer-based gov't intel works. Like most law enforcement policy recommendation systems, Apple's initial match algorithm will never be assessed for accuracy, nor held accountable for being wrong at least 10% of the time. In time it will be replaced by other third-party screening software that will be even more poorly written and overseen. That's just what law enforcement does.

I've personally seen people suffer this kind of gov't abuse and neglect as a result of clueless automated law enforcement initiatives after 9/11. I don't welcome more, nor the gradual and willful tossing aside of everyone's basic Constitutional rights that Apple's practice portends.

The damage to personal liberty inherent in conducting secret searches without cause or oversight is exactly why the Fourth Amendment requires a warrant before conducting a search. NOW is the time to disabuse your sense of 'daftness'; not years from now, after the Fourth and Fifth Amendments become irreversibly passé. Or should I say, 'daft'?

shivak · 4 years ago
> recruited mathematicians to analyze it, and published the results, as well as one in-house proof and one independent proof showing the cryptographic integrity of the system.

Apple employs cryptographers, but they are not necessarily acting in your interest. Case in point: their use of private set intersection to preserve the privacy... of law enforcement, not users. Their less technical summary:

> Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.

> Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection...

The matching is performed on device, so the user’s privacy isn’t at stake. But, thanks to PSI and the hash preprocessing, the user doesn’t know what law enforcement is looking for.
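
For anyone unfamiliar with PSI, here's a toy, purely illustrative Python sketch of the classic Diffie-Hellman-style private set intersection primitive. To be clear, this is not Apple's protocol (which, as I understand it, adds threshold secret sharing, safety vouchers, and server-side processing, and is arranged so the server rather than the device learns about matches); it only shows the core property that two parties can learn which elements they have in common without revealing the rest of their sets to each other. All names and values below are hypothetical.

    import hashlib
    import secrets

    # Toy DH-based private set intersection (illustrative only; NOT Apple's protocol).
    P = 2**127 - 1  # a Mersenne prime; far too small for real-world use

    def hash_to_group(item):
        """Map an item (a stand-in for an image hash) to a group element mod P."""
        return int.from_bytes(hashlib.sha256(item.encode()).digest(), "big") % P

    # Each party holds a private exponent it never shares.
    server_secret = secrets.randbelow(P - 2) + 1   # e.g. the database holder
    client_secret = secrets.randbelow(P - 2) + 1   # e.g. the device

    server_set = {"hashA", "hashB", "hashC"}        # hypothetical database entries
    client_set = {"hashC", "hashD"}                 # hypothetical photo hashes

    # Round 1: each side blinds its own elements with its own secret exponent.
    server_blinded = [pow(hash_to_group(x), server_secret, P) for x in server_set]
    client_blinded = {x: pow(hash_to_group(x), client_secret, P) for x in client_set}

    # Round 2: blinded values are exchanged and re-blinded with the *other* party's
    # secret. Because exponentiation commutes, matching elements end up equal.
    client_double = {x: pow(v, server_secret, P) for x, v in client_blinded.items()}
    server_double = {pow(v, client_secret, P) for v in server_blinded}

    # The intersection is now visible, but neither side ever saw the other's raw set.
    print([x for x, v in client_double.items() if v in server_double])  # ['hashC']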

xondono · 4 years ago
Well, it’d be kind of dumb to make the mistake of building a system to stop child pornography only to have it become the biggest distributor of CP photos in history.
echelon · 4 years ago
> There is a potential for abuse of this system, but consider the following too

> I think again, the backlash is daft.

Don't apologize for this bullshit! Don't let your love of brand trump the reality of what's going on here.

Machinery is being put in place to detect what files are on your supposedly secure device. Someone has the reins and promises not to use it for anything other than "protecting the children".

How many election cycles or generations will it take for the climate to change, turning this into a tool of great asymmetric power to be used against the public?

What happens when the powers that be see that you downloaded labor union materials, documents from Wikileaks, or other files that implicate you as a risk?

Perhaps a content hash on your phone puts you in a flagged bucket where you get pat downs at the airport, increased surveillance, etc.

The only position to take here is a full rebuke of Apple.

edit: Apple apologists are taking a downright scary position now. I suppose the company has taken a full 180 from their 1984 ad centerpiece. But that's okay, right, because Apple is a part of your identity and it's beyond reproach?

edit 2: It's nominally iCloud-only (a key feature of the device/ecosystem), but opting out means turning off a lot of settings. One foot in the door...

edit 3: Please don't be complicit in allowing this to happen. Don't apologize or rationalize. This is only a first step. We warned that adtech and monitoring and abuse of open source were coming for years, and we were right. We're telling you - loudly - that this will begin a trend of further erosion of privacy and liberty.

artimaeis · 4 years ago
It's not doing any sort of scanning of your photos while they're just sitting on your device. The CSAM scanning only occurs when photos are uploaded to iCloud, and only on the photos being uploaded.

Source (pdf): https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...

throwaway888abc · 4 years ago
1. was new to me.

TIL - (2014) PhotoDNA Lets Google, FB and Others Hunt Down Child Pornography Without Looking at Your Photos

https://petapixel.com/2014/08/08/photodna-lets-google-facebo...

indymike · 4 years ago
> backlash is daft

Fighting to preserve a freedom is not daft, even if it is David vs. Goliath's bigger, meaner brother and his friends.

Deleted Comment

vorpalhex · 4 years ago
Who verifies the CSAM databases? Is there a way to verify that the CSAM hash list hasn't been tampered with and that additional hashes haven't been inserted?

Would it be ok to use this approach to stop "terrorism"? Are you ok with both Biden and Trump defining that list?

avnigo · 4 years ago
I’d be interested to see how Apple executives would respond to these concerns in interviews, but I don’t expect Apple to issue a press release addressing them.
cblconfederate · 4 years ago
What is the point of E2EE vs. TLS/SSL-based encryption?
wayneftw · 4 years ago
This is an abuse of my property rights. The device is my property, and this activity will be using my CPU, my battery, and my network bandwidth. That's the abuse right there.

They should just use their own computers to do this stuff.

jdavis703 · 4 years ago
Then you have two choices: disable iCloud photo backups, or don’t upgrade to iOS 15. There are plenty of arguments against Apple’s scheme, but this isn’t one of them.
samatman · 4 years ago
Photos is just an app.

You can use another photo app, link it to another cloud provider, and be free of the burden.

If you use Photos, you're along for the ride, and you've consented to whatever it does.

You don't get a line-item veto on code you choose to run; that's never been how it works.

For what it's worth, I'm basically with the EFF on this: it looks like the thin end of a wedge, it sucks and I'm not happy about it.

But being histrionic doesn't help anything.

8note · 4 years ago
You chose an Apple device because Apple knows what's best for you.

This is part of the integrated experience

api · 4 years ago
(2) is important. Apple put effort into making this at least somewhat privacy-respecting, while the other players just scan everything with no limit at all. They also scan everything for any purpose, including marketing, political profiling, etc.

Apple remains the most privacy respecting major vendor. The only way to do better is fully open software and open hardware.

Wowfunhappy · 4 years ago
This isn't the biggest issue at play, but one detail I can't stop thinking about:

> If an account held by a child under 13 wishes to send an image that the on-device machine learning classifier determines is a sexually explicit image, a notification will pop up, telling the under-13 child that their parent will be notified of this content. [...] For users between the ages of 13 and 17, a similar warning notification will pop up, though without the parental notification.

Why is it different for children under 13, specifically? The 18-year cutoff makes sense, because turning 18 carries legal weight in the US (as decided via a democratic process), but 13?

13 is an age when many parents start granting their children more freedom, but that's very much rooted in one's individual culture—and the individual child. By giving parents fewer options for 13-year-olds, Apple—a private company—is pushing their views about parenting onto everyone else. I find that a little disturbing.

---

Note: I'm not (necessarily) arguing for greater restrictions on 13-year-olds. Privacy for children is a tricky thing, and I have mixed feelings about this whole scheme. What I know for sure, however, is that I don't feel comfortable with Apple being the one to decide "this thing we've declared an appropriate invasion of privacy for a 12-year-old is not appropriate for a 13-year-old."

BluSyn · 4 years ago
13 isn't an arbitrary cutoff. It's established by law in the US under COPPA, similar to how 18 is the cutoff for the other features. Other countries may have different age ranges according to local laws.
Wowfunhappy · 4 years ago
Is COPPA relevant to this feature, though?

18, to me, is different, because it's the point when your parents have no legal authority, so of course they shouldn't have any say over how you use your phone.

Ajedi32 · 4 years ago
Yeah, the "your phone will check your personal files against an opaque, unauditable, government-provided database and rat you out if it gets a match" part of this is very concerning, but I don't buy the EFF's arguments against the new parental control features. End-to-end encrypted or not, if you're sending messages to a minor you should expect that their parents can read those messages.
websites2023 · 4 years ago
The feature is opt-in. So, Apple isn't forcing anyone to do anything.
Wowfunhappy · 4 years ago
But you have fewer options if your child is 13 years old. Or am I misunderstanding the article?
jtsiskin · 4 years ago
COPPA. Apple necessarily already has special under-13 settings. There’s also PG-13, etc.
strogonoff · 4 years ago
If Mallory gets a law-abiding citizen, Bob, to download a completely innocuous-looking but perceptual-CSAM-hash-matching image to his phone, what happens to Bob? I imagine the following options:

- Apple sends Bob’s info to law enforcement; Bob is swatted or his life is destroyed in some other way. Worst, but most likely outcome.

- An Apple employee (or an outsourced contractor) reviews the photo, comparing it to the CSAM source image used to generate the hash. Only if the image matches according to human vision is Bob swatted. This requires there to be some sort of database of CSAM source images, which strikes me as unlikely.

- An Apple employee or a contractor reviews the image for abuse without comparing it to the CSAM source, using their own subjective judgement. Better, but it implies Apple employees could technically SWAT Apple users.

bitexploder · 4 years ago
Do we know that they are using perceptual hashing? I am curious about the details of the hash database they are comparing against, but I assumed perceptual hashing would be pretty fraught with edge cases and false positives.

e: It is definitely not a strict/cryptographic hash algorithm: "Apple says NeuralHash tries to ensure that identical and visually similar images — such as cropped or edited images — result in the same hash." They are calling it "NeuralHash" -- https://techcrunch.com/2021/08/05/apple-icloud-photos-scanni...
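
To illustrate why a perceptual hash behaves so differently from a cryptographic one, here is a minimal sketch of a classic "difference hash" (dHash) in Python. This is not NeuralHash (which, per the article, uses a neural-network embedding), and the file names are hypothetical; it just shows how resized or lightly edited copies of an image tend to land within a few bits of each other, whereas a single changed pixel flips roughly half the bits of a SHA-256 digest.

    from PIL import Image  # pip install Pillow

    def dhash(path, hash_size=8):
        """Shrink to grayscale, compare each pixel to its right neighbour,
        and pack the comparisons into an integer. Visually similar images
        tend to produce similar bit patterns."""
        img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
        px = list(img.getdata())
        bits = 0
        for row in range(hash_size):
            for col in range(hash_size):
                left = px[row * (hash_size + 1) + col]
                right = px[row * (hash_size + 1) + col + 1]
                bits = (bits << 1) | (left > right)
        return bits

    def hamming(a, b):
        """Number of differing bits between two hashes."""
        return bin(a ^ b).count("1")

    # Hypothetical usage: a cropped or re-compressed copy should differ by
    # only a few bits, unlike with a cryptographic hash.
    # print(hamming(dhash("original.jpg"), dhash("edited_copy.jpg")))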

anonuser123456 · 4 years ago
Downloading an image to your phone is different than uploading it to iCloud.

Downloaded images are not uploaded to iCloud without user intervention.

strogonoff · 4 years ago
Presuming Bob, an unsuspecting citizen, has iCloud Photos enabled, any downloaded image that lands in his photo library is synced to iCloud either right away or the next time he's on Wi-Fi, depending on settings.
8note · 4 years ago
Today, sure. That's technically very easy to change.

Deleted Comment

strogonoff · 4 years ago
From BBC’s article:

> Apple says that it will manually review each report to confirm there is a match. It can then take steps to disable a user's account and report to law enforcement.

So at least it’s the last option.