In many ways, the social web led to our isolation and amplified the loneliness epidemic.
Now, these Web 2.0 / Social Web companies are the leaders in building the AI that may artificially treat the epidemic they created.
There's something quite cynically sad about that, and I would love it if we'd move away from these services and back into the "real world."
However, the 50GB figure was just a starting point for emails. A true "local Jarvis" would need to index everything: all your code repositories, documents, notes, and chat histories. That raw data can easily be hundreds of gigabytes.
For a 200GB text corpus, a traditional vector index can swell to >500GB. At that point, it's no longer a "meager" requirement. It becomes a heavy "tax" on your primary drive, which is often non-upgradable on modern laptops.
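To see why the index outgrows the corpus, here's a back-of-envelope sketch. The chunk size (1 KB), embedding dimension (768), and float32 storage are my illustrative assumptions, not figures from the post; real indexes add further graph and metadata overhead on top of the raw vectors.

```python
# Rough estimate of raw embedding storage for a chunked text corpus.
# Assumptions (illustrative): 1 KB text chunks, 768-dim float32 embeddings.
# Real-world indexes (e.g. HNSW) add graph links and metadata on top.

def embedding_size_gb(corpus_gb: float, chunk_bytes: int = 1024,
                      dim: int = 768, bytes_per_float: int = 4) -> float:
    num_chunks = corpus_gb * 1e9 / chunk_bytes
    return num_chunks * dim * bytes_per_float / 1e9

print(round(embedding_size_gb(200)))  # 600 -> vectors alone already exceed 500 GB
```

Under these assumptions, the embeddings alone come to roughly 3x the source text, which is why quantization or a smarter on-disk layout matters so much here.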
The goal for practical local AI shouldn't just be that it's possible, but that it's also lightweight and sustainable. That's the problem we focused on: making a comprehensive local knowledge base feasible without forcing users to dedicate half their SSD to a single index.