1. It is objectively true that Apple and Google accounts are extremely important to many people.
2. It is also objectively true that most users will only need one of each, a few at most. Fraudsters have no such limitations, and may want to create thousands of them per day if the possibility arises.
3. Therefore, it's likely that a significant percentage of all accounts ever created are fraudulent, even though actual fraudsters are a small fraction of all users. This is the crucial observation many people miss in this debate.
4. Real users do not want constant iMessage spam and other problems resulting from fraudulent accounts remaining open. Therefore, normal users care deeply about fraudulent accounts being closed promptly (and so do money-laundering regulators, but that's another discussion).
5. Normal users also care about their accounts remaining open. Apple has to balance these two problems.
6. If we force Apple (by regulation, PR crisis or any other method) to be softer on closures, the only way to do that without exacerbating #4 is to make opening fraudulent accounts harder.
7. The only reliable way of preventing fraudsters from opening accounts is strict and invasive identity verification.
8. Therefore, if we're asking Apple / Google to keep more accounts open, we're also asking for more surveillance.
This may actually be the right tradeoff to make, but it is important to point out that there is a tradeoff here, and that no decision in this regard goes without consequences.
The main thrust of the video is that these days these tools are used predominantly for post-production convenience and cost-cutting, at the expense of immersion and storytelling.
The central claim here is illogical.
The way I see it, if you believe that AGI is imminent, and if your personal efforts are not entirely crucial to bringing AGI about (just about all engineers are in this category), and if you believe that AGI will obviate most forms of computer-related work, your best move is to do whatever is most profitable in the near-term.
If you make $500k/year, and Meta is offering you $10M/year, then you ought to take the new job. Hoard money, true believer. Then, when AGI hits, you'll be in a better personal position.
Essentially, the author's core assumption is that working for a lower salary at a company that may develop AGI is preferable to working for a much higher salary at a company that may develop AGI. I don't see how that makes any sense.
Also, $10M would be a drop in the bucket compared to being a shareholder of a company that has achieved AGI; you could also imagine the influence and fame that would come with it.
When I started therapy, I felt the same way. But now I realize that there are no easy solutions on offer in therapy; the therapist cannot just hand you an argument or trick that will resolve all your troubles. They are there to guide you through figuring it out yourself and to help build the habits needed to sustain the new state. That is why rapport between a therapist and their patient is crucial to success, and why you are usually advised to try several therapists before settling on one.
I wish this article (or Meta) were a bit clearer about the specific connection between the device settings and use and when humans get access to the images.
My settings are:
- [OFF] "Share additional data" - Share data about your Meta devices to help improve Meta products.
- [OFF] "Cloud media" - Allow your photos and videos to be sent to Meta's cloud for processing and temporary storage.
I'm not sure whether my settings would prevent my media from being used as described in the article.
Also, it's not clear which data is being used for training:
- random photos / videos taken
- only use of "Meta AI" (e.g., "Hey Meta, can you translate this sign")
As much as I've liked my Meta Ray-Bans, I'm going to need clarity here before I continue using them.
TBH, if it were only use of Meta AI, I'd "get it" but probably turn that feature off (I barely use it as-is).
It just continues to prove that if you solve a bit of inconvenience for people, they will let you exploit them and their families.