In case it is not abundantly clear by now: Apple's AI strategy is to put inference (and, longer term, even learning) on edge devices. This is completely coherent with their privacy-first strategy, which would be at odds with sending data to the cloud for processing.
Processing data at the edge also makes for the best possible user experience: it is completely independent of network connectivity and therefore has minimal latency.
If (and that's a big if) they keep their APIs open to running any kind of AI workload on their chips, it's a strategy I personally very much welcome, as I don't want the AI future to be centralised in the hands of a few powerful cloud providers.
You mean... with their stated privacy-first strategy?
This alone would have been enough of an announcement; the rest of it is just sugar-coating.
Which might be fine! If that's the kind of system that works for you.
~$ firefox -v
Mozilla Firefox 102.5.0esr
... which is from November 15, 2022.
Can you please give an example of an application you needed that was available only as a snap?
> The proxy is extremely lightweight. An inexpensive and tiny VPS can easily handle hundreds of concurrent users. Here’s how to make it work:
> SSH into the server.
> Install Docker, Docker Compose, and git:
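For context, the quoted steps would look roughly like this on a Debian/Ubuntu VPS. This is only a sketch: the article doesn't give the exact commands, and the convenience script and package names below are the commonly documented route, assumed here rather than taken from the original.

```shell
# Sketch of the quoted setup steps, assuming a Debian/Ubuntu VPS
# with root or sudo access; adjust for your distribution.

# SSH into the server (replace user/host with your own):
# ssh user@your-vps.example.com

# Install git:
sudo apt-get update
sudo apt-get install -y git

# Install Docker via Docker's official convenience script:
curl -fsSL https://get.docker.com | sudo sh

# Install the Docker Compose plugin:
sudo apt-get install -y docker-compose-plugin

# Sanity-check the installs:
docker --version
docker compose version
```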
I'm sorry, but the last time I checked, installing Docker on a tiny VPS wasn't lightweight at all.
Truly admirable on their part, and a great example for others. The reasons for it don't really matter to me, but I can't help wondering whether they were somehow obliged or otherwise indebted to follow this route.