Yes of course everyone should check and unit test that every object is owned by the user or account loading it, but demanding more sophistication from an attacker than taking "/my_things/23" and loading "/my_things/24" is a big win.
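A minimal sketch of the idea, assuming a Python backend; the names here (`make_public_id`, `create_thing`, the `THINGS` store) are illustrative, not from any particular framework:

```python
import secrets

# Instead of exposing a sequential row id like /my_things/23, hand out a
# random, unguessable public identifier. 16 bytes gives ~128 bits of
# entropy, so an attacker can't enumerate records just by incrementing
# the way /my_things/24 can be guessed from /my_things/23.

def make_public_id() -> str:
    return secrets.token_urlsafe(16)

THINGS: dict = {}  # public_id -> record (stand-in for a real datastore)

def create_thing(owner: str, payload: dict) -> str:
    public_id = make_public_id()
    THINGS[public_id] = {"owner": owner, "payload": payload}
    return public_id
```

This is raising the bar, not replacing the ownership check; the check on load still has to happen.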
Or they could keep selling the broken design, and people would just buy more units as they broke. They wouldn't care if Costco was eating the cost with its in-house warranty.
The fundamental problem, though, is the same as with all "household gadget" products. They look cool and appear to solve a problem, but that perception rests entirely on novelty. They don't actually work very well, they aren't built very well, and they don't last very long. There's no point in improving them because the concept is something people didn't need in the first place.
Just buy a good canister vacuum and you're set for a decade or more. It will cost more than the latest gadget from Shark or Dyson or iRobot but it won't frustrate you and it will just reliably do what it is supposed to do without uploading anything to an IP address.
AI-generated code still requires software engineers to build, test, debug, deploy, secure, monitor, be on-call, handle incidents, and so on. That's very expensive. It is much cheaper to pay a small monthly fee to a SaaS company.
Not to mention the author appears to run a 1-2 person company, so ... yeah. AI thought leadership ahoy.
Yes, you can still mostly do Takeout, but it's garbage: it's not incremental; it requires me to remember to run it; it duplicates files for every album (total incompetence); downloads regularly fail; and decompressing takes more room than I have on my Mac, so I have to put it on an external drive.
I think it would be better if Signal more loudly communicated the drawbacks of its encryption approach up-front, warning away casual users before they get a nasty surprise after storing a lot of important data in Signal.
I’ve heard Signal lovers say the opposite—that getting burned by data loss is somehow educational for, or deserved by, casual users—and I think that’s asinine and misguided. It’s the equivalent of saying “ha! See? You were trading away privacy for convenience and relying on service-provider-readable message history as a record all along, don’t you feel dumb?”, to which most users will respond “no, now that you’ve explained the tradeoffs… that is exactly how I want it to work; you can use Signal, but I want iMessage”.
It shouldn’t take data loss to make that understood.
We'll see it intentionally backdoored this decade. Signal can afford to, eg, tell the UK or EU to go fuck themselves. Meta won't.
Google is doing exactly what Apple was previously doing, mandatory from the end of next month: January 28, 2026.
Their new requirements: https://support.google.com/googleplay/android-developer/answ...
Don't hate the player, hate the game. I hate the game too, fwiw.
Including, of course, that the way many popular chain restaurants got there is by making food a lot of people like.
Also, if most of your endpoints require auth, this is not typically a problem.
It really depends on your application. But yes, that's something to be aware of. If you need some ids to be unguessable, make sure they are not predictable :-)
Many systems are not sparse, and separately, that's simply wrong. Unguessable names are not a primary security measure, but a passive remediation for bugs or bad code. Broken access control remains in the OWASP Top 10, and IDOR is a piece of that. Companies still get popped for this.
See, e.g., Google having a bug in 2019 that was made significantly less impactful by unguessable names: https://infosecwriteups.com/google-did-an-oopsie-a-simple-id...
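To make the "primary measure vs. passive remediation" distinction concrete, here is a hedged Python sketch (the `load_thing`, `Forbidden`, and `THINGS` names are made up for illustration): the explicit ownership check is the primary access control, and the random public id is only the second, passive layer that limits blast radius when that check has a bug.

```python
# Stand-in datastore keyed by random public ids rather than
# sequential integers.
THINGS = {
    "abc123": {"owner": "alice", "payload": "alice's data"},
}

class Forbidden(Exception):
    pass

def load_thing(public_id: str, current_user: str) -> str:
    record = THINGS.get(public_id)
    # Primary control: verify ownership on every load. Return the same
    # error for "missing" and "not yours" so an attacker can't use the
    # response to confirm which ids exist.
    if record is None or record["owner"] != current_user:
        raise Forbidden(public_id)
    return record["payload"]
```

If the ownership check is accidentally dropped in a refactor, the unguessable ids are what keep the resulting IDOR from being trivially enumerable, which is exactly the Google case above.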